
Extended Reality in Action at Duke: Immersive Technology in Real-World Medicine

[Image: Example of a Harvis visualization]

Extended reality is already reshaping industries of all kinds — but its most life-changing impact may be in healthcare. 

Pilots don’t fly new routes blind. They train in immersive cockpit simulators, learning how to respond in real time. 
 
Imagine if physicians could do the same — stepping inside patient-specific models to better understand, plan, and personalize care.
 
At the Duke Center for Computational and Digital Health Innovations, we’re using extended reality technologies to turn complex patient data into interactive models. It’s a leap forward in how we visualize, plan, and personalize care — and it’s made possible by the unique convergence of clinical insight and technical expertise across the university.

What Is Extended Reality?

Extended reality (XR) is an umbrella term that encompasses three core technologies, each of which plays a distinct role in reshaping the healthcare landscape:

  • Virtual reality (VR): A fully digital environment that replaces the real world, used for applications such as surgical simulation, training, and therapy
  • Augmented reality (AR): A technology that overlays digital information, such as 3D models or real-time data, onto the real world
  • Spatial computing: Technology that lets computers understand and interact with physical space, enabling smarter, more responsive AR and VR experiences

These technologies allow researchers and physicians to move beyond flat screens and static images, interacting instead with rich, three-dimensional data in real time.

At the Center, we’re using XR to make healthcare more intuitive, personalized, and precise. Whether it’s visualizing a patient’s blood flow from inside the vessel wall, overlaying surgical guidance directly onto a physician’s field of view, or building virtual training environments where surgeons can hone their skills, XR enables experiences that are simply not possible through traditional means.

These immersive tools align closely with the Center’s mission to find, track, and treat. XR helps us find subtle signs of disease through better visualization, track patient-specific changes over time, and provide effective treatment by simulating procedures before they happen. By combining engineering ingenuity with clinical insight, we’re shaping a future where care is not only smarter, but more human.

How We’re Using Extended Reality to Reshape Healthcare

XR isn’t just a novel interface — it’s a fundamental shift in how we engage with patient data, plan procedures, and train clinicians. Across our labs, we’re developing tools that make healthcare more precise, personalized, and proactive.

The Randles Lab: Immersive Cardiovascular Planning

In my lab, we’re combining high-fidelity simulation with immersive interfaces to help physicians better understand and treat cardiovascular disease. Two of our key tools — Harvis and HarVI — are reshaping how we visualize and plan interventions for conditions like coronary artery disease.

Harvis is a VR platform that allows users to explore 3D vascular geometries and blood flow patterns in an intuitive, immersive environment. Instead of interpreting 2D slices, clinicians can step inside a patient’s anatomy to see how blood moves through vessels and where issues may arise.

HarVI (HARVEY Virtual Intervention) builds on this by enabling real-time simulation of post-treatment blood flow. Using AI trained on thousands of fluid dynamics simulations, it provides immediate feedback when users test different intervention strategies, such as placing a stent or adjusting vessel geometry, without needing to re-run complex computations.
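HarVI’s actual models and training data aren’t detailed here, but the underlying pattern, swapping a costly physics simulation for a fast learned surrogate, can be sketched in a few lines. In the toy Python example below, every feature, target, and number is an illustrative stand-in: a small neural network is fit offline to precomputed simulation results, so that each candidate stent placement at planning time costs one forward pass instead of a fresh fluid dynamics run.

```python
# Minimal surrogate-model sketch. Features, targets, and data are
# synthetic stand-ins, not HarVI's actual implementation.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Pretend training set: each row is one intervention scenario
# (stent position along the vessel [mm], stent diameter [mm]);
# the target is a hemodynamic index (e.g., FFR) that would normally
# require a full fluid dynamics simulation to compute.
X = rng.uniform([0.0, 2.0], [40.0, 5.0], size=(5000, 2))
y = 0.6 + 0.08 * X[:, 1] - 0.002 * X[:, 0] + rng.normal(0.0, 0.01, 5000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Offline: fit the surrogate once on the expensive simulation results.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X_train, y_train)

# Online: in the VR session, each candidate placement is one fast prediction.
candidate = np.array([[12.5, 3.0]])  # stent at 12.5 mm, 3.0 mm diameter
print(f"Predicted FFR: {surrogate.predict(candidate)[0]:.3f}")
print(f"Held-out R^2:  {surrogate.score(X_test, y_test):.3f}")
```

The trade-off is the classic one for surrogates: accuracy is bounded by how well the precomputed scenarios cover the space of plausible interventions, which is exactly why training on thousands of simulations matters.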

Harvis and HarVI are tightly integrated with digital twin technology, incorporating wearable sensor data to track a patient’s evolving condition. Together, they offer a powerful and personalized platform for diagnosis, planning, and treatment — with the potential to catch complications early and improve long-term outcomes.
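The digital twin loop can be illustrated just as simply. The sketch below is a deliberately tiny stand-in for a much richer pipeline: it folds a stream of wearable heart-rate readings into a smoothed patient state and flags sustained drift from baseline. The field names, smoothing factor, and alert threshold are all assumptions for illustration, not the Center’s actual models.

```python
# Illustrative sketch only: a toy "digital twin" state updated from a
# wearable stream. Field names and thresholds are assumptions.
from dataclasses import dataclass, field

@dataclass
class PatientTwin:
    baseline_hr: float            # resting heart rate from enrollment [bpm]
    alpha: float = 0.1            # smoothing factor for the moving average
    smoothed_hr: float = field(init=False)

    def __post_init__(self):
        self.smoothed_hr = self.baseline_hr

    def update(self, hr_sample: float) -> bool:
        """Fold one wearable reading into the twin; return True if the
        smoothed value has drifted >15% from baseline (worth a closer look)."""
        self.smoothed_hr = self.alpha * hr_sample + (1 - self.alpha) * self.smoothed_hr
        return abs(self.smoothed_hr - self.baseline_hr) / self.baseline_hr > 0.15

twin = PatientTwin(baseline_hr=62.0)
for reading in [64, 66, 70, 78, 85, 90, 95, 98]:   # simulated stream
    if twin.update(reading):
        print(f"Flag: smoothed HR {twin.smoothed_hr:.1f} bpm vs baseline 62.0")
```

A real twin would track many more signals and feed them back into the simulation models above; the point is the continuous update loop.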

The McIntyre Lab: Sharpening Neurosurgical Precision

Led by Cameron McIntyre, Ph.D., the McIntyre Lab is using VR and holographic visualization to support more accurate planning in deep brain stimulation (DBS), a treatment used for Parkinson’s disease, epilepsy, OCD, and depression.

Using advanced head-mounted displays, Dr. McIntyre’s team reconstructs the brain’s intricate network of axonal pathways in 3D. Their approach, known as Connectomic DBS, combines high-resolution imaging and simulation to guide precise electrode placement that’s tailored to the unique structure of each patient’s brain.
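The lab’s actual pipeline estimates axonal activation with detailed biophysical models; as a deliberately crude illustration of why pathway geometry matters, the toy sketch below scores candidate electrode positions by how much of a target pathway, versus a side-effect-associated pathway, falls inside an assumed spherical activation volume. All coordinates, radii, and pathways here are synthetic.

```python
# Toy illustration of pathway-aware electrode scoring. The spherical
# "activation radius" is a gross simplification of real volume-of-
# tissue-activated models; all coordinates are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 3D points sampled along a target axonal pathway [mm]...
target_pathway = rng.normal(loc=[10.0, -5.0, 2.0], scale=1.5, size=(500, 3))
# ...and along a pathway associated with side effects.
avoid_pathway = rng.normal(loc=[13.0, -5.0, 2.0], scale=1.5, size=(500, 3))

def coverage(points: np.ndarray, electrode: np.ndarray, radius: float = 2.5) -> float:
    """Fraction of pathway points inside an assumed spherical activation volume."""
    return float(np.mean(np.linalg.norm(points - electrode, axis=1) < radius))

candidates = np.array([[10.0, -5.0, 2.0], [11.5, -5.0, 2.0], [13.0, -5.0, 2.0]])
for pos in candidates:
    score = coverage(target_pathway, pos) - coverage(avoid_pathway, pos)
    print(f"electrode at {pos}: target-minus-side-effect score = {score:+.2f}")
```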

Tools like these can give neurosurgeons greater confidence in their preoperative plans and hold the potential to significantly improve outcomes by increasing targeting accuracy and minimizing side effects.

The Gorlatova Lab: AR and VR for Surgical Training

Maria Gorlatova, Ph.D., and her team focus on applying AR and VR to support surgical navigation and training. One of their standout innovations is Neurolens, an AR tool that overlays real-time imaging data onto the surgeon’s field of view, allowing for more accurate instrument tracking and navigation during complex procedures.
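Under the hood, an overlay like this depends on coordinate transforms: a tracked instrument is reported in the tracking system’s frame and must be mapped into the headset’s frame before it can be drawn in the right place. The minimal sketch below shows that core step with a made-up calibration matrix; the actual Neurolens calibration and rendering pipeline is, of course, far more involved.

```python
# Sketch of the coordinate math behind an AR overlay: mapping a tracked
# instrument tip from the tracker's frame into the headset's frame with
# homogeneous transforms. The matrix here is made up for illustration.
import numpy as np

def transform(T: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    return (T @ np.append(p, 1.0))[:3]

# Assumed calibration: tracker frame -> headset frame (rotation + translation).
T_headset_from_tracker = np.array([
    [0.0, -1.0, 0.0,  0.10],
    [1.0,  0.0, 0.0,  0.02],
    [0.0,  0.0, 1.0, -0.05],
    [0.0,  0.0, 0.0,  1.00],
])

tip_in_tracker = np.array([0.25, 0.10, 0.40])   # meters, from the tracking system
tip_in_headset = transform(T_headset_from_tracker, tip_in_tracker)
print(f"Instrument tip in headset frame: {tip_in_headset}")
```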

Beyond the operating room, the lab creates highly realistic VR training environments where surgeons can practice procedures on mannequins or digital phantoms. These systems capture detailed performance data, including gaze patterns and gestures, to provide feedback and improve skill development over time.
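What might that performance data look like in practice? One common pattern is to log gaze samples during a session and reduce them to summary metrics afterward. The sketch below computes two such metrics, total scan-path length and dwell time in a region of interest, from a simulated gaze log; the logging format and region are assumptions for illustration.

```python
# Minimal sketch of post-session gaze analysis. The logging format and
# region-of-interest definition are assumptions for illustration.
import numpy as np

# Simulated gaze log: one (x, y) screen coordinate per 10 ms sample.
rng = np.random.default_rng(2)
gaze = np.cumsum(rng.normal(0, 3.0, size=(1000, 2)), axis=0) + [640, 360]
dt = 0.010  # seconds between samples

# Scan-path length: total distance the gaze point traveled [px].
path_len = np.sum(np.linalg.norm(np.diff(gaze, axis=0), axis=1))

# Dwell time inside a rectangular region of interest (e.g., the incision site).
x0, y0, x1, y1 = 600, 320, 700, 420
in_roi = (gaze[:, 0] >= x0) & (gaze[:, 0] <= x1) & (gaze[:, 1] >= y0) & (gaze[:, 1] <= y1)
dwell = in_roi.sum() * dt

print(f"Scan-path length: {path_len:.0f} px over {len(gaze) * dt:.1f} s")
print(f"ROI dwell time:   {dwell:.2f} s")
```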

This approach not only accelerates learning but also creates new opportunities for behavioral research, helping us understand how surgeons think and move as they operate. It’s a powerful example of how immersive technologies can improve both education and clinical performance.

Building the Future of Extended Reality in Healthcare

Extended reality is more than a set of tools — it’s a catalyst for connection across disciplines. At Duke, we’re building these technologies at the intersection of engineering, medicine, and computer science, ensuring that innovation is grounded in real clinical need and driven by deep technical expertise.

Looking ahead, our focus is on expanding the reach and usability of these solutions. To do that, we’re eager to partner with physicians, researchers, developers, and industry leaders. Together, we can accelerate progress and reimagine what’s possible in patient care. 

Learn more about collaborating with us to drive progress in computational health innovation.

Amanda Randles, Ph.D., is Director of the Duke Center for Computational and Digital Health Innovation and Alfred Winborne Mordecai and Victoria Stover Mordecai Associate Professor of Biomedical Engineering at the Pratt School of Engineering.
