State of progress:
Assisting in the production of ultrasound images in complete autonomy and without medical expertise
CNES, Sonoscanner, SFR, Université Paris Descartes, Vermon, CERCOM-Université de Tours
Since 1982, ultrasonography has been the only imaging modality available aboard crewed spaceflight. Ultrasonography is broadly used to support scientific research and medical surveillance of physiological phenomena induced by microgravity.
On board the ISS, ultrasound sessions are supervised by a trained sonographer teleguiding the astronaut (Ultrasound 2 – NASA) or teleoperating the probe (ECHO – CNES/CSA). The international effort to establish a sustainable human presence on the Moon will lead to more frequent use of ultrasonography.
Indeed, typical mission durations will increase while the environment becomes riskier and harsher. This will open a new field of scientific investigation and amplify the need to adequately monitor astronaut health.
However, constraints on real-time communications severely impact telemedicine-based technologies, requiring new medical operations and systems to support astronauts during ultrasound sessions.
To enable Earth-independent imaging sessions, we developed several technologies to embed the expertise of an experienced sonographer into a tool.
We presented augmented-reality software that provides real-time guidance toward a previously saved ideal probe position.
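The guidance principle can be sketched as comparing the current probe pose to the saved target pose. This is a hypothetical illustration, not the actual flight software: the pose representation (4x4 homogeneous matrices) and the function name are assumptions.

```python
import numpy as np

def pose_error(current, target):
    """Translation and rotation error between two 4x4 homogeneous probe poses.

    Hypothetical sketch: the real guidance software and its internal pose
    representation are not detailed in the abstract.
    """
    # Translation error: vector pointing from the current to the target position.
    t_err = target[:3, 3] - current[:3, 3]
    # Rotation error: angle of the relative rotation, in radians.
    r_rel = target[:3, :3] @ current[:3, :3].T
    cos_angle = (np.trace(r_rel) - 1.0) / 2.0
    angle = np.arccos(np.clip(cos_angle, -1.0, 1.0))
    return t_err, angle
```

The translation vector and rotation angle could then drive on-screen arrows guiding the operator toward the saved position.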
In parallel, we worked on a computer-vision pipeline combining physics-aware image analysis with machine-learning methods to detect and segment targeted organs.
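As one example of what a physics-aware pre-filter might look like, the sketch below flags acoustic shadows, columns where the far-field signal collapses relative to the near field. The function name and `drop_ratio` value are illustrative assumptions, not details from the study.

```python
import numpy as np

def shadow_columns(image, drop_ratio=0.3):
    """Flag image columns whose far-field brightness collapses relative to
    the near field, a crude acoustic-shadow cue.

    Hypothetical sketch of a physics-aware pre-filter; `drop_ratio` is an
    assumed tuning parameter.
    """
    half = image.shape[0] // 2
    near = image[:half].mean(axis=0)   # mean intensity close to the probe
    far = image[half:].mean(axis=0)    # mean intensity deep in the image
    return far < drop_ratio * np.maximum(near, 1e-6)
```

Masking such columns before running a learned segmentation model is a common way to keep physically implausible regions out of the training signal.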
The augmented-reality software has been tested on the ground and during a parabolic flight campaign. It allows a novice operator to position the probe correctly on the subject's body in 88% of cases.
We can now combine both techniques into a complete, functional solution running on a tablet. While the astronaut searches for the ideal probe position using augmented reality, the organ-detection algorithm provides real-time indications.
This makes it possible to find and automatically save an organ image, so that an optimal 2D view of the desired organ can be sent back to the ground without any action by the astronaut on the ultrasound device. The astronaut will also be able to perform specific ultrasound examinations such as Doppler or time-motion. A version of this software also runs on an augmented-reality headset, offering a more intuitive and immersive experience.
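The combined acquisition loop described above can be sketched as follows. This is a simplified illustration under assumptions: `detect_organ` stands in for the organ-detection algorithm and is assumed to return a confidence in [0, 1]; the threshold value is invented for the example.

```python
def auto_save_frame(frames, detect_organ, quality_threshold=0.9):
    """Return the first frame whose detection score reaches the threshold,
    otherwise the best frame seen.

    Sketch only: names, signature, and threshold are illustrative, not
    taken from the flight software.
    """
    best_score, best_frame = -1.0, None
    for frame in frames:
        score = detect_organ(frame)
        if score > best_score:
            best_score, best_frame = score, frame
        if best_score >= quality_threshold:
            break  # confident view found: save it and stop searching
    return best_frame, best_score
```

Saving the frame automatically, rather than asking the operator to press a button, is what removes the need for any action on the ultrasound device itself.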