Work
  • Unity3D Programming
  • Interaction Design
  • UI/UX
Date

01/2025

Overview

The VR Medical Training Simulation is an advanced immersive learning platform developed in partnership with Lemons4Training and Careagi Hospital (Florence). This initiative redefines traditional medical education by introducing a high-fidelity, risk-free virtual training environment. By leveraging Unity3D and Oculus Quest 3 headsets, the system enables healthcare professionals and trainees to practice complex procedures, hone decision-making skills, and reinforce safety protocols without the limitations of physical facilities or traditional classroom settings.

Immersive Training Environment

Participants enter a fully realized virtual hospital ward equipped with patient beds, diagnostic equipment, and interactive medical tools. Each scenario is built to mirror real-world hospital workflows—covering routine procedures, patient care, emergency interventions, and hygiene protocols. Dynamic environmental responses ensure a high level of realism: for example, patient reactions, environmental alerts, and feedback adapt to the trainee’s actions. This level of fidelity allows professionals to build confidence and procedural accuracy before entering real clinical settings.

Advanced Hand Tracking Integration

A defining feature of the platform is its cutting-edge hand tracking. By eliminating the need for controllers, trainees can practice authentic hand hygiene, patient handling, and equipment manipulation using natural gestures. The system simulates both surface-based handwashing and alcohol-based gel sanitizing procedures, reinforcing infection control measures—one of the most critical aspects of healthcare training. The hand tracking module also supports delicate interactions such as needle handling, intravenous setup, and tool manipulation.

Intuitive Tool Interaction

Realistic tool interaction was a key development priority. Trainees can:

  • Place and position equipment with realistic physics.
  • Wear and remove personal protective equipment (gowns, gloves, masks) using natural gestures.
  • Interact with syringes, thermometers, monitors, and other diagnostic tools.
  • Complete virtual charts, update medical records, and manage patient data inside the VR environment.

These elements replicate real-world tactile and procedural complexity, improving knowledge retention and skill transfer.

User Experience and Learning Progression

The simulation includes a progressive tutorial system to onboard new users, ensuring familiarity with VR controls, gestures, and procedures before entering advanced scenarios. As trainees progress, the system introduces more complex challenges, immediate feedback loops, and scenario branching that mirrors real-world medical decision-making. This adaptive approach boosts confidence, accelerates learning, and creates a measurable improvement in procedural performance.
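The branching idea can be illustrated with a minimal sketch. Note that `ScenarioNode` and the sample actions are hypothetical names for illustration, not the simulation's actual classes:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of scenario branching: each node maps a trainee action
# to the next node, and unrecognized actions loop back for a retry.
@dataclass
class ScenarioNode:
    prompt: str
    branches: dict = field(default_factory=dict)  # trainee action -> next node

    def next(self, action):
        # Stay on the current step (retry) when the action is not recognized.
        return self.branches.get(action, self)

# A tiny two-step branch: correct hand hygiene unlocks patient contact.
exam = ScenarioNode("Examine the patient")
wash = ScenarioNode("Perform hand hygiene", {"wash_hands": exam})
```

In the real system, branch selection would also feed the feedback loop, so a wrong action can trigger immediate corrective guidance rather than a silent retry.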

Assessment and Performance Analytics

At the core of the platform is an advanced assessment engine that tracks procedural accuracy, efficiency, safety compliance, and decision-making under pressure. Each training session produces a comprehensive performance report including:

  • Overall performance scores with detailed breakdowns.
  • Areas of strength and opportunities for improvement.
  • Specific, actionable feedback and recommendations.
  • Progress tracking across multiple sessions and comparative benchmarking.
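A report along these lines could be aggregated from per-step scores. This is a minimal sketch; `build_report` and its field names are illustrative assumptions, not the platform's actual analytics schema:

```python
# Illustrative aggregation of a session report from scored steps.
# Each step is (step_name, score_0_to_100, safety_ok).
def build_report(steps):
    overall = sum(score for _, score, _ in steps) / len(steps)
    weakest = min(steps, key=lambda step: step[1])
    return {
        "overall_score": round(overall, 1),
        "safety_compliance": all(ok for _, _, ok in steps),
        "focus_area": weakest[0],  # the step flagged for improvement
    }

report = build_report([
    ("hand_hygiene", 92, True),
    ("iv_setup", 68, True),
    ("ppe_removal", 85, True),
])
```

Tracking such reports across sessions gives supervisors the longitudinal view described above: the same fields, compared over time, become the progress and benchmarking data.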

Supervisors and training coordinators can access these reports to make data-driven decisions about individual progress and program-wide training needs, thereby improving workforce readiness and clinical outcomes.

Why it matters

This project bridges the gap between traditional medical education and the future of healthcare training. By combining immersive VR environments, advanced interaction techniques, and real-time analytics, the VR Medical Training Simulation helps hospitals, universities, and training centers deliver safer, more effective, and more scalable medical education.

Accelerating VR Environment Design: From Prototype to Production

ShapesXR proved to be an invaluable tool for environment prototyping. By leveraging its capabilities, we were able to:

  • Accelerate prototyping cycles: rapidly create and iterate on VR layouts, reducing time between concept and validation.

  • Align early with clients: provide immersive previews of object placement and MVP configurations before full production.

  • Validate spatial experience: assess distances, arrangements, and interaction flows directly in VR – something that traditional scripts and 2D storyboards cannot fully capture.

  • Facilitate real-time collaboration: share the evolving virtual environment seamlessly with team members and clients through the online viewer.

  • Streamline Unity integration: import the approved mock-up directly into Unity as a production-ready starting scene, minimizing redundant build iterations.

Using ShapesXR in this way allowed us to bridge the gap between client specifications and practical VR implementation, improving decision-making, enhancing user-experience design, and saving significant development time.

Tech stuff: Action Sequencing and Material Optimization Frameworks

To design and control the complex sequence of actions within the VR Medical Training Simulation, I used SessionActionFramework, a custom Python tool developed in-house. This framework provides an intuitive way to script the narrative flow of events in the simulation, defining not only the precise order of actions but also which steps can run in parallel. By modeling dependencies between actions, the system enables highly dynamic, non-linear experiences tailored to the user’s interactions.
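The core idea of ordering plus parallelism can be sketched as a dependency graph resolved into "waves". This is a minimal sketch in the spirit of the framework; the function and action names are illustrative, not SessionActionFramework's real API:

```python
# Resolve a dependency graph into waves of actions; actions in the same
# wave have all prerequisites met and may run in parallel.
def schedule(deps):
    """deps: action name -> set of prerequisite action names."""
    done, waves = set(), []
    while len(done) < len(deps):
        wave = [a for a, pre in deps.items() if a not in done and pre <= done]
        if not wave:
            raise ValueError("cyclic dependency between actions")
        done.update(wave)
        waves.append(sorted(wave))
    return waves

waves = schedule({
    "greet_patient": set(),
    "wash_hands": set(),
    "don_gloves": {"wash_hands"},
    "draw_blood": {"don_gloves", "greet_patient"},
})
```

Here greeting the patient and hand hygiene can happen in either order (or concurrently), while drawing blood is gated behind both gloving and the greeting, which is exactly the kind of non-linear flow the framework models.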

The power of SessionActionFramework lies in its ability to describe and manage relationships between actions, objects, users, NPCs, animations, audio cues, and environmental elements. Because it is built in Python, the tool is engine-agnostic and can be generalized for multiple use cases:

  • Cross-domain applicability: The same logic can be used to define action sequences for other immersive experiences—such as escape rooms, interactive museum exhibits, or industrial simulations—where branching narratives or complex workflows are essential.
  • Cross-engine integration: Although implemented here with Unity3D, the framework’s output can also be adapted to Unreal Engine, Godot, or any other engine, providing a universal language for describing simulation behaviors.

Once authored, the SessionActionFramework output was imported directly into Unity via a custom C# importer specifically developed for this project. This importer ensures accurate offline and runtime interpretation of the action sequences, allowing trainers and developers to make rapid iterations and deploy updates seamlessly.
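The handoff to the C# importer implies a serialized interchange format. As a hedged illustration only, assuming a JSON payload (the actual format and schema used by the importer are not shown here):

```python
import json

# Hypothetical export step: the authored sequence is serialized so an
# engine-side importer can rebuild the dependency graph at load time.
sequence = {
    "actions": [
        {"id": "wash_hands", "after": []},
        {"id": "don_gloves", "after": ["wash_hands"]},
    ]
}
payload = json.dumps(sequence, indent=2)
```

A plain-data interchange like this is what makes the framework engine-agnostic: any engine that can parse the payload can interpret the same sequence.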

Additionally, MaterialManagerOptimizer, another proprietary Unity3D plugin I developed, played a crucial role in performance optimization. This tool:

  • Analyzes all materials in the scene to identify batching bottlenecks.
  • Facilitates the creation of material variants and highlights shader keywords, enabling a more unified and optimized material setup.
  • Maximizes the efficiency of the Unity SRP Batcher, significantly reducing draw calls and improving runtime performance.
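The analysis pass can be illustrated engine-agnostically: materials sharing a shader and shader-keyword set can batch together under the SRP Batcher, so grouping by that key exposes bottlenecks. This sketch uses assumed material records, not the plugin's actual API:

```python
from collections import defaultdict

# Group materials by (shader, keyword set); each group can batch together,
# so many single-member groups indicate a batching bottleneck.
def batching_groups(materials):
    """materials: list of dicts with 'name', 'shader', 'keywords'."""
    groups = defaultdict(list)
    for m in materials:
        key = (m["shader"], frozenset(m["keywords"]))
        groups[key].append(m["name"])
    return groups

groups = batching_groups([
    {"name": "BedSheet", "shader": "Lit", "keywords": ["_NORMALMAP"]},
    {"name": "Pillow",   "shader": "Lit", "keywords": ["_NORMALMAP"]},
    {"name": "Monitor",  "shader": "Lit", "keywords": ["_EMISSION"]},
])
```

In this toy scene, the bed sheet and pillow fall into one batchable group, while the emissive monitor material breaks out into its own variant, which is the kind of split the tool surfaces and helps consolidate.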

Together, these two tools—SessionActionFramework and MaterialManagerOptimizer—provided a robust backbone for scripting, maintaining, and optimizing a high-performance VR medical training simulation.