Extended Reality
Extended reality (XR), which includes virtual reality, augmented reality, and spatial computing, is revolutionizing healthcare by providing immersive and interactive environments for medical research and clinical applications.
At the Duke Center for Computational and Digital Health Innovation, we use XR technologies to empower healthcare professionals to provide more personalized, precise, and effective healthcare solutions. XR allows our researchers to visualize and interact with complex 3D data in intuitive and immersive ways.
What Is Extended Reality?
XR encompasses a spectrum of immersive technologies that blend physical and digital environments, including:
- Augmented reality (AR): AR overlays digital information onto the real world, providing real-time insights and guidance during medical procedures.
- Virtual reality (VR): VR immerses users in fully virtual environments for advanced simulation, training, and visualization.
- Spatial computing: Spatial computing integrates AR and VR with the physical environment, enabling dynamic interaction with 3D models and digital twins.
These technologies offer unique options for visualizing the human body, allowing for more precise and effective medical interventions. XR also lets researchers explore their data through direct interaction, and it enhances education and training.
What Are the Advantages of Extended Reality?
The integration of extended reality into healthcare offers many benefits:
- Enhanced precision: We can visualize patient-specific digital twins to simulate treatments and predict outcomes with exceptional accuracy.
- Improved training: We can create realistic VR environments for medical education, allowing trainees to practice complex procedures in a risk-free setting.
- Collaborative planning: AR and spatial computing provide shared immersive environments in which teams can visualize 3D models and treatment pathways together, facilitating collaboration.
- Personalized care: By visualizing patient data in immersive 3D and simulating potential outcomes virtually, we can tailor interventions to individual patients.
These advantages make XR a transformative tool for healthcare innovation.
How We’re Using Extended Reality
By harnessing the power of XR, the Center for Computational and Digital Health Innovation is pushing the boundaries of precision, personalization, and collaboration in medicine.
Advancing Neurosurgical Precision
Neurosurgical procedures, particularly those involving deep brain stimulation (DBS), require highly accurate visualization and planning. At the Center, a team led by Prof. Cameron McIntyre has developed advanced holographic visualization techniques to reconstruct axonal pathways in the brain.
The method, known as Connectomic DBS, couples patient-specific DBS modeling with high-resolution MRI data. It provides clinicians with detailed insights, helping to improve the precision of electrode implantation in the brain. This technology is critical in treating neurological conditions like Parkinson’s disease and depression, enhancing the effectiveness of these interventions.

Enhancing Surgical Navigation with Augmented Reality
The use of AR in surgical settings is another groundbreaking application at our Center. Research led by Prof. Maria Gorlatova includes the development of Neurolens, an AR tool designed to track surgical instruments in real time during neurosurgery.
The tool overlays real-time imaging data onto the surgeon’s field of view, facilitating precise navigation and improving surgical outcomes. Beyond the operating room, our AR work extends to applications in industries such as gaming, retail, and education, demonstrating the technology’s versatility across contexts.
Making Cardiovascular Care More Intuitive
In the realm of cardiovascular care, XR technologies are being used to create highly detailed simulations of blood flow and disease progression. Prof. Amanda Randles and team have developed Harvis and HarVI, platforms that integrate VR and machine learning to facilitate detailed exploration and surgical planning.
HarVI uses AI to simulate post-intervention changes in blood flow, providing real-time feedback on candidate interventions, while Harvis offers an intuitive Unity-based framework that supports all forms of XR for interacting with patient-specific anatomy. Together, these tools support the development of personalized treatment strategies and lay the groundwork for clinicians to better understand and manage cardiovascular conditions. A complementary arm of this research conducts user studies to formally assess which aspects of immersive interaction improve the experience.