Stoma VR

Role: Lead Designer
Genre: VR Simulation / Educational
Development Time: 4 Months
Team Size: 7
Engine: Unity

Stoma VR is a virtual reality medical simulation and teaching module designed to teach medical students proper patient care for laryngectomy and tracheostomy patients. Players simulate routine in-office care and interact with educational games to learn material from various teaching modules.

Created in partnership with Michigan State University’s speech pathology department and awarded $1.2 million in grant funding from the National Cancer Institute.

Responsibilities:

  • Meeting with speech pathology experts to accurately represent stoma patient care in VR

  • Daily coordination with programmers and artists to create assets and scripts, alongside weekly correspondence with a panel of experts

  • Converting teaching material for medical students into interactive VR game modules

  • Extensive design documentation to keep communication flowing smoothly between the development team and researchers

TEACHING VR

Stoma VR will be circulated and used within Michigan State University’s medical program, tracking student progress and results and uploading them for professors to grade. It was therefore imperative that the simulation reach a wide audience of users, with the assumption that most users would not be well versed in VR technology. Users are guided through the simulation by a step-by-step hovering UI guide toggled to their hand, and every interactable is controlled through a single button press or area detection. To keep users concentrated on completing medical tasks rather than wrestling with extensive VR controls, I split longer, more involved tasks into small steps that each require only one motion or one button press. This control scheme is taught to the user in the tutorial learning module and reinforced throughout the procedure simulation.
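As a rough sketch of the one-action-per-step guidance described above (written in Python purely for illustration; the shipped project was built in Unity, and every name here is hypothetical):

```python
# Illustrative sketch of a one-action-per-step procedure guide.
# Hypothetical names; the actual project was implemented in Unity, not Python.
class ProcedureStep:
    def __init__(self, prompt, action):
        self.prompt = prompt    # text shown on the hovering hand UI
        self.action = action    # the single input that completes this step

class ProcedureGuide:
    def __init__(self, steps):
        self.steps = steps
        self.index = 0

    def current_prompt(self):
        """What the hovering UI guide displays right now."""
        if self.index >= len(self.steps):
            return "Procedure complete"
        return self.steps[self.index].prompt

    def handle_input(self, action):
        """Advance only when the player performs the step's single action."""
        if self.index < len(self.steps) and action == self.steps[self.index].action:
            self.index += 1

guide = ProcedureGuide([
    ProcedureStep("Reach into the glove box", "grab_gloves"),
    ProcedureStep("Press the button to sanitize", "press_button"),
])
guide.handle_input("grab_gloves")
print(guide.current_prompt())  # → Press the button to sanitize
```

Because each step accepts exactly one input, any stray motion or wrong button simply does nothing, which is what keeps the simulation approachable for users new to VR.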

TEACHING MATERIAL

Along with designing the moment-to-moment actions of the two laryngectomy and two tracheostomy patient care procedures, I was also in charge of converting extensive teaching material into VR games. I parsed four different sets of teaching materials, covering anatomy, equipment, cancer, and other topics, to create an alpha teaching module for users. The module consists of a hub with unlockable submodules; each submodule focuses on a different set of teaching material, which I converted into physical games such as darts, skee-ball, and sorting games. The educational material is taught entirely through games, and I set up a dynamic system that registers when a room, a sub-set within a submodule, or an entire submodule is completed, with visual indicators showing the player their progress through the overall education module.
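The hierarchical completion tracking described above (rooms rolling up into sub-sets, sub-sets into submodules, submodules into the module) can be sketched as a simple tree. This is an illustrative Python sketch with hypothetical names; the shipped system was implemented in Unity:

```python
# Illustrative sketch of hierarchical completion tracking.
# Hypothetical names; the actual system was built in Unity, not Python.
class Node:
    """A completable unit: a room, a sub-set of rooms, or a submodule."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self._done = False  # only leaf rooms are marked directly

    def mark_done(self):
        self._done = True

    def is_complete(self):
        # A parent registers as complete once all of its children do.
        if self.children:
            return all(c.is_complete() for c in self.children)
        return self._done

    def progress(self):
        """Fraction complete, used to drive the player's visual indicators."""
        if not self.children:
            return 1.0 if self._done else 0.0
        return sum(c.progress() for c in self.children) / len(self.children)

anatomy = Node("Anatomy", [Node("Darts room"), Node("Sorting room")])
module = Node("Education module",
              [anatomy, Node("Equipment", [Node("Skee-ball room")])])
anatomy.children[0].mark_done()
print(module.progress())  # → 0.25
```

Keeping completion state only on the leaves and deriving everything above them means the visual indicators at every level stay consistent automatically as rooms are finished.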

UNIQUE SOLUTIONS

Finding the intersection of accurate patient care and intuitive virtual reality controls was a balance I worked daily to strike, and each step in the procedure required a different solution. While a procedure step like “Put on sanitary gloves” could be simplified to the player reaching into the glove box, the prosthetic insertion step could not be simplified in the same way and demanded a more precise approach. I added a magnified screen behind the patient’s seat so the player could keep their hands in a comfortable position while making accurate movements during the prosthetic insertion step. I reused this system in one of the teaching modules, where the player places a ball labeled with an anatomy term onto the correct position on the brain, letting players move through the VR space while retaining precise clarity of vision.