Making ultrasound more accessible with AI guidance

“I would love to see a future where looking inside the body becomes as routine as a blood pressure cuff measurement,” says Charles Cadieu ’04, MEng ’05. As president of the medical technology startup Caption Health, he sees that future in reach—with the help of artificial intelligence.

Cadieu still remembers the “lightbulb moment” during his postdoctoral research at MIT when he realized that the field of AI would never be the same. He was working in the lab of James DiCarlo (now the Peter de Florez Professor of Neuroscience) on neural networks—AI systems made up of deep-learning algorithms that emulate the dense networks of neurons in the brain. Until then, neural networks had been unable to perform even simple visual tasks that the brain handles with ease. However, in 2012, sitting at his computer in the McGovern Institute for Brain Research, Cadieu saw something remarkable: the first evidence that one of the neural networks he was studying could recognize objects as well as neurons in the primate brain can.

The next year, Cadieu tapped his expertise in deep learning—developed while studying electrical engineering and computer science at MIT and neuroscience at the University of California, Berkeley—to cofound Caption Health. Cadieu says he and his cofounders “fell in love with this idea of democratizing medical imaging”—expanding patient access to high-quality diagnostic technology. Specifically, they decided to focus on ultrasound, an affordable, portable, and effective imaging technology that relies on a probe and high-frequency sound waves. “Ultrasound required very experienced and skilled people to operate it,” Cadieu says. “If our software could emulate those people, we could give that skill to a whole new set of medical providers.”

Enter the company’s flagship product, Caption Guidance, an ultrasound “copilot” with deep-learning algorithms trained on ultrasound images captured by experts. The software guides users through positioning and moving the ultrasound probe, continuously gathering images and assessing their quality. The FDA approved Caption Guidance in February after a study showed that nurses without ultrasound training could use it to obtain diagnostic-quality images of the heart. Cadieu expects that the technology will first be used by emergency room physicians to look at heart function. However, he hopes it will eventually be used everywhere from big hospitals to rural villages to examine people for a wide range of medical conditions.
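The article describes Caption Guidance as continuously gathering frames and assessing their quality until a diagnostic-quality image is found. As a purely illustrative sketch of that kind of capture loop (the scoring function, threshold, and frame format here are hypothetical stand-ins, not Caption Health's actual model or code):

```python
def quality_score(frame):
    """Hypothetical stand-in for a deep-learning quality model:
    here, just mean pixel intensity normalized to [0, 1]."""
    return sum(frame) / (255 * len(frame))

def capture_best_frame(frames, threshold=0.6):
    """Scan a stream of frames, keep the best-scoring one so far,
    and stop early once a frame clears the quality threshold."""
    best_frame, best_score = None, -1.0
    for frame in frames:
        score = quality_score(frame)
        if score > best_score:
            best_frame, best_score = frame, score
        if score >= threshold:
            break  # good enough for diagnostic use; stop scanning
    return best_frame, best_score

# Toy "frames": flat lists of 8-bit pixel intensities
stream = [[40, 60, 50], [120, 140, 160], [200, 220, 210]]
frame, score = capture_best_frame(stream)
```

In the real product the scoring would come from the deep-learning model trained on expert-captured images, and the system would also steer the operator's probe movements rather than just filter frames.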


