[The following notes were generated by Mark Slivkoff.]
IAMSE Webinar Series, Fall 2018
Speaker: Douglas Danforth, PhD
The Ohio State University College of Medicine, Columbus, Ohio
Title: Virtual Reality and Augmented Reality in Medical Education
Series: Evolution and Revolution in Medical Education: Technology in the 21st Century
- The objectives of Dr. Danforth’s discussion included:
- Define and describe virtual reality (VR), augmented reality (AR), and mixed reality (MR)
- Go through a history of VR, AR, and MR in medical education
- Describe current technologies
- Opportunities and examples
- Describe challenges and future directions
- VR is the computer-generated simulation of a three-dimensional image or environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a helmet with a screen inside or gloves fitted with sensors.
- Advantage is that you can create simulations that are impossible in the real world
- VR content can be created with software, or captured using special 360-degree video cameras
- In contrast to VR, AR is a technology that superimposes a computer-generated image on a user’s view of the real world
- Advantage is that it augments experiences in the physical world
- MR, also referred to as hybrid reality, is the merging of real and virtual worlds to produce new environments and visualizations where physical and digital objects co-exist and interact in real time.
- The history of VR and AR dates back decades:
- Morton Heilig created the Sensorama in 1957
- Before this there was the View-Master in 1939
- Flight simulators came out in 1966
- The term VR was popularized by Jaron Lanier of VPL Research in 1987
- In medical education, VR and AR have their roots at the University of Illinois at Chicago with the CAVE (Cave Automatic Virtual Environment)
- Use of rear projection screens that surround the user
- User wears 3D glasses
- The University of Toledo and The Ohio State University both have CAVE facilities
- CAVE environments are expensive
- Second Life (software), also used in medical education, was started in the early 2000s.
- Used by many academic institutions
- Danforth used the software to design a virtual testis through which students could fly and learn about all the relevant anatomy and physiology
- 2007 was the heyday of Second Life’s use, with over 1,000 new users added per month
- No longer used at many institutions
- Current VR technologies include:
- Google Cardboard
- Advantages: inexpensive ($10), uses your smartphone, and it’s entry level
- Disadvantages: low resolution; you can’t walk around in the environment, only turn your head; limited to no interaction
- It’s a great way to expose folks to VR, especially the rollercoaster ride
- Use with YouTube’s Virtual Reality channel
- Samsung Gear VR and Google Daydream
- Advantages: inexpensive, use your smartphone, some interaction
- Disadvantages: low resolution, stationary VR
- Oculus Go (and the recently announced Quest)
- Advantages: relatively inexpensive ($200), built-in display and sound, allows for interaction
- Allows for six degrees of freedom (movement along three axes plus rotation: pitch, yaw, and roll)
- HTC Vive, PlayStation VR, Oculus Rift and Samsung Odyssey
- Advantages: high resolution, smooth video, allows interaction, room-scale VR (you can move around the space)
- Disadvantages: expensive ($400 to $800), requires a powerful computer that you have to be tethered to, can be challenging to set up
- Danforth uses an HTC Vive
- Current AR technologies: Google Glass, Microsoft HoloLens
- Advantages: not isolated from surroundings, good for training, access to real-time schematics
- Disadvantages: expensive ($3,000); requires a powerful computer (though those are not too expensive these days, about $1,500)
- Microsoft’s HoloAnatomy put to use
- Opportunities and Examples
- Anatomy education is the most obvious discipline to target
- Surgery applications
- Microsoft HoloLens, HoloAnatomy: https://www.youtube.com/watch?rel=0&start=35&v=SKpKlh1-en0
- Case Western is extensively using HoloLens
- Voxel Bay at Nationwide Children’s Hospital: https://www.youtube.com/watch?rel=0&showinfo=0&start=35&v=uVRilk_6UWI
- The Ohio State University collaborates with them
- Used by pain researchers to distract kids from pain
- Mass Casualty Training at the Ohio State University
- The university previously used a physical subway set, which bordered on immersive but was, in Danforth’s words, “lame”: https://www.youtube.com/watch?rel=0&v=JNdKo1uSRto
- They have now recreated the set virtually
- Proof-of-Concept Disaster Sim – Subway bombing (featuring Dr. Danforth): https://www.youtube.com/watch?v=nvuxMnSHHtg
- Conan Visits YouTube’s VR Lab (hilarious, if you enjoy Conan’s humor): https://www.youtube.com/c126d7c3-4fa3-4c75-a3dc-d35ea8ad155a
- Virtual Standardized Patients: https://youtu.be/mvXIruMt9Ek
- High fidelity, simulated “real” standardized patient
- Conversational; can understand and respond to student questions (a toy sketch of this kind of question matching appears after this list)
- Easy to use, require little or no training
- Soon students will wear VR goggles
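The webinar did not describe how the virtual standardized patient actually interprets questions, so the following is only an illustrative sketch, not Danforth’s system: a minimal keyword-based matcher in TypeScript that maps a student’s free-text question to a scripted patient response. All names, keywords, and replies here are hypothetical.

```ts
// Hypothetical sketch: keyword-based question matching for a virtual standardized patient.
// Real virtual patients use far more sophisticated natural-language understanding.

interface PatientResponse {
  keywords: string[]; // words that suggest the student is asking about this topic
  reply: string;      // the scripted line the virtual patient speaks
}

const script: PatientResponse[] = [
  { keywords: ["bring", "today", "problem"],        reply: "I've had a sharp pain in my belly since last night." },
  { keywords: ["pain", "scale", "rate"],            reply: "It's about a 7 out of 10." },
  { keywords: ["medication", "medicine", "taking"], reply: "Just ibuprofen, but it hasn't helped." },
];

const fallback = "I'm sorry, could you ask that another way?";

// Return the scripted reply whose keywords overlap most with the student's question.
function respond(question: string): string {
  const words = new Set(question.toLowerCase().split(/\W+/));
  let best: PatientResponse | null = null;
  let bestScore = 0;
  for (const entry of script) {
    const score = entry.keywords.filter((k) => words.has(k)).length;
    if (score > bestScore) {
      best = entry;
      bestScore = score;
    }
  }
  return best ? best.reply : fallback;
}

// Example: a student asks about pain severity.
console.log(respond("Can you rate your pain on a scale of 1 to 10?"));
// -> "It's about a 7 out of 10."
```

A production system would replace this keyword match with a trained dialogue model, which is closer to what the webinar describes (a patient that “can understand and respond to student questions”).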
- Challenges of VR and AR
- Fatigue, disorientation and vertigo
- Headsets are heavy, and there is often display lag
- Difficult to scale (e.g., to multiplayer VR)
- Movement in virtual space
- Lack of haptic tools
- Limited interoperability, but software is allowing for some cross-platform compatibility
- Future of VR and AR
- Portability
- Untethered systems
- Smaller more comfortable headsets
- Fidelity
- Higher resolution displays/increased framerates
- Haptic feedback
- Multiplayer
- Team based simulations
- Interoperability
- Build once – deploy everywhere
- Upcoming Content and Applications
- Surgical simulation
- Patient specific simulations
- Remote surgery
- Virtual Patients
- Practice history taking, physical exam skills, differential diagnoses
- Automated assessment
- Team training
- Emergency medicine, surgical simulation
- Google Cardboard
Questions asked after seminar:
(Note that some questions and/or answers have been reworded for clarity)
Do you suggest any starter tools to get your feet wet in the technology?
Recommendation: collaborate with gaming and software folks. It’s simple to get started with a 360-degree camera. Higher-end creation requires software such as Second Life, and you’ll need a programmer who knows Unity or Unreal Engine.
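As a concrete illustration of the lowest-cost end of that spectrum, below is a minimal browser-based VR scene written with the three.js library and WebXR. This approach was not mentioned in the webinar (which recommends Unity or Unreal Engine for serious work); it is only a sketch, under the assumption that a WebXR-capable browser and the three.js package are available, showing how little code a first interactive prototype can require.

```ts
// Hypothetical starter example (not from the webinar): a minimal WebXR scene using three.js.
// Assumes a bundler that resolves the "three" package and a WebXR-capable browser.
import * as THREE from "three";
import { VRButton } from "three/examples/jsm/webxr/VRButton.js";

const scene = new THREE.Scene();
scene.background = new THREE.Color(0x202040);

const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true;                                  // turn on WebXR rendering
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer));  // adds an "Enter VR" button

// A stand-in object at roughly eye height, one meter in front of the viewer.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(0.3, 0.3, 0.3),
  new THREE.MeshStandardMaterial({ color: 0x2194ce })
);
cube.position.set(0, 1.5, -1);
scene.add(cube);

scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));

// WebXR requires setAnimationLoop rather than requestAnimationFrame for the render loop.
renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```

Content built this way can be viewed on a phone in a Cardboard-style holder or on standalone headsets, making it a reasonable “get your feet wet” step before committing to Unity or Unreal.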
How do everyday operating systems (OSs) figure into all this?
Apple has some VR labs but they don’t compare to what I’ve discussed. Some material can be ported to operating systems such as iOS.
Where is the market going?
Toward single software that can be used for all purposes. Hardware includes the newly announced Oculus Quest.
How are your students at The Ohio State University evaluated?
Pre- and post-tests are administered. They are working on building the mass casualty training system to be like a game, with various difficulty levels; students must pass one level before moving to the next.
Have you received any pushback from certain populations (e.g. students who get sick)?
There has been very little pushback, but we have to have alternatives. Advances in technology should mitigate the sickness factor.
Has VR or AR been mapped to high fidelity mannequins?
Not yet, but someday it is bound to happen. Companies are trying to merge the two; an example would be different users seeing different things inside the same mannequin.
What about procedural skills such as suturing or phlebotomy?
This is much further down the road. The main problem is that there are no commercially available gloves yet that allow for haptic feedback.
You showed a couple VR anatomy simulations. What were they?
3D Organon VR Anatomy and Microsoft HoloLens Anatomy.
You showed the Google Cardboard. What’s the entry point for medical education?
Invest in an Oculus Rift or HTC Vive, plus the computer. But Google Cardboard is a great starting point. We recently attended a conference and had a bunch of headsets at our poster.
Related links supplied by audience:
https://link.springer.com/article/10.1007%2Fs10916-016-0459-8