Visualizing the invisible: Using augmented reality to reveal the inner ear

“The inner ear is one of the toughest topics to teach because it is anatomically very complex, very small and buried in the dense petrous bone,” said Professor Emerita Pat Stewart.

Andrea Zariwny, a Biomedical Communications graduate student, tackled this teaching problem with an augmented reality iPad app and 3D printing to visualize the anatomy of the inner ear.

Here’s how it works: Launch Zariwny’s app. Hold the iPad over her illustration of the petrous bone. The illustration acts as a glyph, a high-contrast image recognized by the tablet’s camera, and triggers a three-dimensional digital model on top of the illustration. Look through the digital model to the illustration below and watch the cochlea of the inner ear or the spiral ganglion lift right off the page.
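Under the hood, this kind of glyph recognition is what AR toolkits call marker- or image-based tracking: each camera frame is searched for features of a known reference image, and a confident match yields the transform used to anchor the digital model to the page. The sketch below, in Python with OpenCV, is a hypothetical illustration of that matching step only; it is not code from Zariwny’s iPad app, whose AR framework the article does not specify, and the file name and thresholds are placeholders.

    # Hypothetical sketch: detect a printed "glyph" (a known reference image)
    # in a camera frame and recover the planar transform an AR engine would
    # use to place a 3D model over it.
    import cv2
    import numpy as np

    def find_glyph(glyph_path, frame, min_matches=15):
        """Return the homography mapping the glyph image into the camera frame,
        or None if the glyph is not confidently detected."""
        glyph = cv2.imread(glyph_path, cv2.IMREAD_GRAYSCALE)
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Detect and describe high-contrast features in both images.
        orb = cv2.ORB_create(nfeatures=2000)
        kp_g, des_g = orb.detectAndCompute(glyph, None)
        kp_f, des_f = orb.detectAndCompute(gray, None)
        if des_g is None or des_f is None:
            return None

        # Match descriptors and keep only distinctive matches (Lowe ratio test).
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
        pairs = matcher.knnMatch(des_g, des_f, k=2)
        good = []
        for pair in pairs:
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
                good.append(pair[0])
        if len(good) < min_matches:
            return None

        # Robustly estimate the glyph-to-frame homography; an AR engine
        # decomposes this (with the camera intrinsics) into a 3D pose.
        src = np.float32([kp_g[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        return H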

“The illustration on its own is an informative, didactic tool,” said Professor Marc Dryer, Zariwny’s research supervisor. “But augmented with another layer of information, it becomes a strong educational feature that enhances the learning experience and engages student attention.”

Zariwny and her content advisor, Stewart, used micro-CT scanning to create medical images from real anatomical specimens. Zariwny took the data into a medical image viewer and extracted 3D digital models of the petrous bone and of the cochlea and spiral ganglion, internal structures of the inner ear. She retopologized, or ‘cleaned up’, the models and took them into game-development software to create the three-dimensional digital models.
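The scan-to-model step described here, pulling a surface out of a micro-CT volume for later cleanup and use in a game engine or 3D printer, can be sketched as an isosurface (marching cubes) extraction. The snippet below is an illustrative sketch in Python with scikit-image, not Zariwny’s actual workflow; the threshold, voxel spacing, and file names are made-up placeholders.

    # Hypothetical sketch: extract a surface mesh for one structure from a
    # micro-CT volume by choosing an intensity level and running marching
    # cubes, then export it as an OBJ for retopology and further modelling.
    import numpy as np
    from skimage import measure

    def volume_to_obj(volume, iso_level, spacing, out_path):
        """Extract an isosurface from a 3D CT volume and write a Wavefront OBJ."""
        verts, faces, _, _ = measure.marching_cubes(volume, level=iso_level,
                                                    spacing=spacing)
        with open(out_path, "w") as f:
            for v in verts:
                f.write(f"v {v[0]:.4f} {v[1]:.4f} {v[2]:.4f}\n")
            for tri in faces:
                # OBJ face indices are 1-based.
                f.write(f"f {tri[0] + 1} {tri[1] + 1} {tri[2] + 1}\n")

    # Example usage with invented numbers: a volume loaded elsewhere, a
    # bone-level intensity threshold, and isotropic 20-micron voxels.
    # volume_to_obj(ct_volume, iso_level=1200.0, spacing=(0.02, 0.02, 0.02),
    #               out_path="petrous_bone_raw.obj")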

After Zariwny, who has a background in industrial design, created the augmented textbook illustration, she wanted to work with 3D printing. “I thought augmented reality could be an even more useful tool combined with a physical model,” said Zariwny.

Zariwny 3D-printed a model of the petrous bone. The model rests on a separate platform, a plinth, which she wrapped in a glyph. She printed another petrous bone, this time sliced in half, with the glyph printed onto the cross-sections. She created modules for her app that “ghost” the internal structure of the inner ear over the physical models. The app recognizes and tracks the glyphs, and matches the digital models to the position of the physical models. Move the models, or the iPad, and the digital overlays follow, giving viewers access to different orientations and perspectives.
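Keeping the ghosted anatomy registered with a physical model comes down to recovering the glyph’s pose (rotation and translation) relative to the camera on every frame and applying the same transform to the rendered model. Below is a minimal sketch of that registration step, assuming a flat glyph of known physical size and known camera intrinsics; none of these values come from the article, and this is not the app’s actual code.

    # Hypothetical sketch: from the glyph's detected corners in a camera frame
    # and its known physical size, recover the rigid pose a renderer would use
    # to keep the digital anatomy aligned with the 3D-printed model.
    import cv2
    import numpy as np

    def glyph_pose(corners_2d, glyph_width_m, glyph_height_m,
                   camera_matrix, dist_coeffs):
        """corners_2d: 4x2 pixel coordinates of the glyph's corners, ordered
        top-left, top-right, bottom-right, bottom-left."""
        # The same corners in the glyph's own coordinate system (metres),
        # origin at its centre, so the model sits centred on the plinth.
        w, h = glyph_width_m / 2.0, glyph_height_m / 2.0
        corners_3d = np.float32([[-w, -h, 0], [w, -h, 0],
                                 [w, h, 0], [-w, h, 0]])

        ok, rvec, tvec = cv2.solvePnP(corners_3d, np.float32(corners_2d),
                                      camera_matrix, dist_coeffs)
        if not ok:
            return None
        R, _ = cv2.Rodrigues(rvec)

        # 4x4 model matrix: move the physical model or the camera, and this
        # transform is recomputed each frame so the overlay follows.
        pose = np.eye(4)
        pose[:3, :3] = R
        pose[:3, 3] = tvec.ravel()
        return pose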

Zariwny’s tool accurately conveys the size and position of what is essentially space buried in bone, said Stewart. “The first time I saw the models, I thought, that’s it! Exactly what we need.”
