Fall Research Expo 2022

Developing a Surgical Skin Cutting System For Virtual Reality Training Didactics

Virtual Reality (VR) has been utilized as an effective educational tool that places the learner directly inside their subject matter. Because of this immersion, VR is being explored in the medical, nursing, and surgical fields as a way to accurately teach and assess both the intellectual and physical aspects of each occupation. The Hospital of the University of Pennsylvania is currently developing a VR simulation to give surgeons, residents, and medical students experience with a procedure known as a surgical airway. To properly simulate surgical technique in virtual space, however, cutting the patient's skin and other organs must be replicated in a way that intuitively follows how cutting skin with a scalpel behaves in the real world. Existing libraries can split an object along a line or curve into two separate meshes, but no library allows an object to be cut into in a surgical fashion.

The objective of this project was to find an efficient way to build a skin-cutting system that behaves like real-world surgical skin cutting while maintaining a minimum of 60 frames per second on the Oculus Quest 2. The key technique used to create a cut was a custom renderer that displays objects underneath another object, creating the visual illusion of seeing into the object being cut. Combined with a spline library, a custom hashing function, and the use of surface normals, this produced the full cutting system.

Across 12 tests (3 tests on each of 4 cuttable objects: a cube, a sphere, a cylinder, and a human patient model), the simulation averaged 64 frames per second. In conclusion, this cutting system creates accurate cuts at an acceptable frame rate. However, the program can slow down if too many cuts are created, and on sharp edges the vertices of the cut are not accurately marked, which can produce inaccurate cuts. In the future, adding control over the depth of the cut, merging adjacent cuts together, and changing the width of the cut using the user's fingers are features that will be implemented.
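The abstract mentions a custom hashing function used alongside surface normals to mark the mesh vertices that lie along a cut. As a minimal illustrative sketch (not the project's actual code), one common approach is to quantize vertex positions into grid cells and hash the cell coordinates, so that vertices near a sampled point on the cut spline can be looked up quickly; the grid resolution below is a hypothetical value:

```python
# Illustrative sketch of a spatial-hash lookup for cut vertices.
# CELL is a hypothetical grid resolution (meters), not from the project.
CELL = 0.01

def cell_key(x, y, z, cell=CELL):
    """Quantize a 3-D position into an integer grid-cell key."""
    return (round(x / cell), round(y / cell), round(z / cell))

def build_vertex_hash(vertices):
    """Map each occupied grid cell to the indices of vertices inside it."""
    table = {}
    for i, (x, y, z) in enumerate(vertices):
        table.setdefault(cell_key(x, y, z), []).append(i)
    return table

def vertices_near(table, point):
    """Find vertices sharing the grid cell of a sampled cut-spline point."""
    return table.get(cell_key(*point), [])

# Example: two vertices fall in the same cell as a sampled cut point.
verts = [(0.0, 0.0, 0.0), (0.004, 0.0, 0.0), (0.1, 0.0, 0.0)]
table = build_vertex_hash(verts)
print(vertices_near(table, (0.002, 0.0, 0.0)))  # → [0, 1]
```

A lookup like this avoids scanning every vertex for each sample along the cut; the reported sharp-edge inaccuracy is consistent with cell-based marking, since vertices on opposite faces of an edge can land in the same cell.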

PRESENTED BY
PURM - Penn Undergraduate Research Mentoring Program
Engineering & Applied Sciences 2025
Advised By
William Yi, General Surgeon
Dr. Kristoffel Dumon, Bariatric Surgeon
Dr. Daniel Weber, Technical Lead
