Ultrasound... It's A Whole New World
By Cindy Owen, Director of Clinical Insights and Development for Point of Care Ultrasound at GE Healthcare
Picture this: A patient arrives at the Emergency Department with symptoms characteristic of shock: fever, chills, and difficulty breathing. Time is of the essence, and the attending physician needs to act quickly to make an accurate diagnosis and accelerate treatment.
She pulls around a portable ultrasound system to get information about the heart, lungs, and inferior vena cava. A decade ago, this may have taken several minutes, time the patient couldn't spare, to fumble with the machine and ensure the measurements were precise. But today, these tools are automated with the help of Artificial Intelligence, streamlining the hard and tedious parts of the acquisition and allowing the clinician to focus more on the patient and less on the machine.
Intelligent machines are changing health care
The AI-powered Venue ultrasound system from GE Healthcare, released last year, introduced three automated tools for the critically ill patient:
• The Auto VTI tool locates and traces the optimal Doppler waveform (in a five- or three-chamber view) and calculates multiple parameters at once: Velocity Time Integral (VTI), Cardiac Output (if the LVOT is measured or entered), Cardiac Index, Stroke Volume, Heart Rate and Cardiac Output Flux.
• The Auto B-lines tool detects and counts B-lines – a clinical marker of extravascular lung water – within a lung space, simplifying the lung exam in patients with pulmonary edema.
• Auto IVC tracks the Inferior Vena Cava, measuring the Collapsibility or Distensibility Index (depending on whether the patient is breathing spontaneously or ventilated), reducing complexity in evaluation of a patient’s fluid status.
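The parameters these tools report are derived from standard textbook formulas. The sketch below shows those conventional calculations (not GE's implementation, whose details are not public): stroke volume from the LVOT diameter and VTI, cardiac output and index from stroke volume, and the two IVC-based fluid-status indices. All values and units here are illustrative.

```python
import math

def stroke_volume_ml(lvot_diameter_cm: float, vti_cm: float) -> float:
    """Stroke volume (mL): LVOT cross-sectional area (cm^2) times VTI (cm)."""
    lvot_area = math.pi * (lvot_diameter_cm / 2) ** 2
    return lvot_area * vti_cm

def cardiac_output_l_min(sv_ml: float, heart_rate_bpm: float) -> float:
    """Cardiac output (L/min): stroke volume times heart rate, converted from mL."""
    return sv_ml * heart_rate_bpm / 1000.0

def cardiac_index(co_l_min: float, bsa_m2: float) -> float:
    """Cardiac index (L/min/m^2): cardiac output normalized to body surface area."""
    return co_l_min / bsa_m2

def ivc_collapsibility_index(ivc_max_cm: float, ivc_min_cm: float) -> float:
    """For a spontaneously breathing patient: (max - min) / max."""
    return (ivc_max_cm - ivc_min_cm) / ivc_max_cm

def ivc_distensibility_index(ivc_max_cm: float, ivc_min_cm: float) -> float:
    """For a mechanically ventilated patient: (max - min) / min."""
    return (ivc_max_cm - ivc_min_cm) / ivc_min_cm

# Illustrative numbers: LVOT 2.0 cm, VTI 20 cm, HR 75 bpm
sv = stroke_volume_ml(2.0, 20.0)       # ~62.8 mL
co = cardiac_output_l_min(sv, 75.0)    # ~4.7 L/min
```

The value of automation is not the arithmetic, which is trivial, but the upstream steps the AI handles: finding the view, placing the sample gate, and tracing the waveform consistently.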
It’s incumbent upon the user to obtain an adequate view of the anatomy for each tool. A quality indicator displays green, yellow or red to show whether image quality is sufficient for the tools to work appropriately. This helps drive consistency, whether a different user scans 10 minutes later or the same user returns the next day.
And when it comes to AI, we’re only scratching the surface. As more algorithms emerge, consequent automation techniques will continue to augment the clinician’s work. That’s not to say there won’t be challenges. To be effective and accurate, these tools require many curated datasets. Depending on the frequency of occurrence and complexity of the problem to be solved, the number of datasets needed to develop an AI-based algorithm can reach well into the thousands.
It’s a worthy goal, however. Physicians and sonographers using diagnostic ultrasound need help to keep up with the demands for fast turnaround of exams and patient disposition. After all, they are evaluated on quality metrics and patient satisfaction. That’s where AI-based tools can help.
Thirty years ago, as a new sonographer, I never could have imagined where ultrasound would be today.
At the time, I used a B-scanner ultrasound system. It built an image step-by-step as I meticulously rubbed the transducer across the patient’s abdomen, fanning it between ribs to “paint” a still representation of their internal organs. Each “painting” was repeated in a stepwise pattern from the tip of the sternum to the pelvis. While the technology sounds crude, it served as a powerful and even fun tool that produced cross-sectional images to visualize the internal organs. But there were limitations: It was time-consuming, required significant expertise and demanded patience from all involved. Real-time ultrasound was a significant breakthrough, giving us the ability to observe motion inside the patient: gallstones rolled, vessels pulsated, the heart pumped and, perhaps most fascinating of all, the fetus kicked and stretched. Who could stop watching?
Soon after, color Doppler arrived and provided appreciation for blood flow direction, speed and turbulence. Gradually, ultrasound systems became more powerful and complex. They required increased proficiency to optimize the controls. Many of us took a great deal of pride in being able to finesse these large and intricate ultrasound systems, but the complexity of acquisition diverted our attention from the patient in front of us and led to significant variations in the quality of ultrasound exams.
Some of the basic functionality soon became automated: fetal biometry, Doppler scale, image brightness and more. As a result, the time it took to acquire an ultrasound image was significantly reduced.
While some users were skeptical of the need for automated steps, others embraced the trend and advocated for more, which led us to where we are today.
Ultrasound reaches beyond a hospital’s walls
A new wave in diagnostic ultrasound has enormous potential to improve quality and to increase access around the world.
Ultrasound is powerful for its many uses, inside and outside the hospital. In fact, there are many areas of the globe where ultrasound is the only feasible medical imaging technology available because of its portability, low cost, and accuracy.
In remote areas, it can be particularly beneficial for patients who may not have access to treatment centers. And while training novice users can sometimes be difficult, this is where algorithms can offer a solution.
The number and complexity of algorithms that simplify, speed up, and increase accuracy will only grow. The next wave could be wearable ultrasound with AI-based tools that automatically detect and transmit data in patients with chronic disease such as heart failure, renal failure, hemophilia or even remote monitoring of the healing process from a sports injury. Wearable ultrasound devices could allow patient conditions to be monitored remotely, freeing them of the need for frequent visits to the hospital or doctor except when truly indicated.
Ultimately, these “waves” of progress and development should help us, as clinicians, carry out what we’re called to do: Improve quality of life.
* Technology in development that represents ongoing research and development efforts. These technologies are not products and may never become products. Not for sale. Not cleared or approved by the FDA or any other global regulator for commercial availability.