The tool, dubbed “Live Surface” by its creators, Professor William Barrett and graduate student Chris Armstrong of Brigham Young University (BYU), also has special-effects applications: in much the same way, it can be used to extract a single actor’s performance or inanimate objects from video clips.

“A program like this has to be incredibly fast and very interactive, or else it’s very frustrating for the user, who currently has to go get a sandwich and come back before he has what he wants,” Barrett explained. Live Surface has the additional benefit of letting users easily isolate “tricky” anatomy such as soft tissue (blood vessels, hearts and muscles) that many other techniques cannot readily extract, Barrett said. “The hard stuff, bones, is easy to see using traditional methods, but even there, the simpler techniques sometimes overestimate, underestimate, or fuse joints, whereas Live Surface neatly and accurately separates them,” he said. “Our program also provides more robust isolation of soft tissue, which is quite a breakthrough.”

The BYU software works by extracting objects from data collected as 3-D volumes, such as CT scans, MRIs or 3-D ultrasounds. With a click and drag of the mouse, the user marks the object to be extracted, then marks the portions of the data that surround it. The desired object is then extracted from the data immediately.
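The workflow described above amounts to seed-based segmentation of a voxel volume: the user supplies a few “object” marks and a few “surroundings” marks, and the algorithm assigns every remaining voxel to one side or the other. The sketch below illustrates that general idea only; it is not the BYU code, and it substitutes scikit-image’s random-walker segmenter for Live Surface’s own method. The volume, seed arrays and parameters are illustrative assumptions.

# Illustrative sketch only: not the BYU "Live Surface" implementation.
# It mimics the same seed-based workflow (mark the object, mark its
# surroundings, separate the two) using scikit-image's random-walker
# segmenter as a stand-in. All names and parameters here are assumptions.
import numpy as np
from skimage.segmentation import random_walker

def extract_object(volume, object_seeds, background_seeds):
    """Return a boolean mask of the voxels belonging to the seeded object.

    volume           -- 3-D array of intensities (e.g. a CT or MRI volume)
    object_seeds     -- boolean array, True where the user marked the object
    background_seeds -- boolean array, True where the user marked the surroundings
    """
    labels = np.zeros(volume.shape, dtype=np.int32)
    labels[object_seeds] = 1      # label 1: the object to extract
    labels[background_seeds] = 2  # label 2: everything around it
    # Unlabeled voxels (0) are assigned to whichever seed region they are
    # most strongly connected to, intensity-wise.
    result = random_walker(volume, labels, beta=130, mode='bf')
    return result == 1

if __name__ == "__main__":
    # Synthetic stand-in for a scan: a bright cube inside a noisy volume.
    rng = np.random.default_rng(0)
    volume = rng.normal(0.0, 0.05, size=(40, 40, 40))
    volume[15:25, 15:25, 15:25] += 0.5
    volume = (volume - volume.min()) / (volume.max() - volume.min())  # scale to [0, 1]

    object_seeds = np.zeros(volume.shape, dtype=bool)
    background_seeds = np.zeros(volume.shape, dtype=bool)
    object_seeds[20, 20, 20] = True      # one "click" inside the object
    background_seeds[2, 2, 2] = True     # one "click" in the surroundings

    mask = extract_object(volume, object_seeds, background_seeds)
    print("voxels extracted:", int(mask.sum()))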

After a surgeon had extracted a 3-D image of a patient’s heart or brain, for example, the image could be projected onto the patient’s body and fitted to create a road map for the surgeon during the operation. Doctors could also use the tool to make better diagnoses by visualising a patient’s organs from multiple angles, or to locate cancerous tumours more precisely.

MEDICA.de; Source: Brigham Young University