Work
  • Unity3D Programming
  • Multiplayer (Photon)
  • Interaction Design
  • UI/UX
Date

01/01/2019

Overview

Amazon VR Experience is an innovative multiplayer virtual reality project designed to give participants a shared immersive environment. Unlike classic multiplayer experiences, where players connect from different locations, this solution allows multiple users to be physically present in the same room while interacting simultaneously in both the real and virtual worlds. Wearing a headset, each participant perceives the others as astronaut avatars and can actually touch the other players, with real and virtual positions kept perfectly in sync.

This type of experience is ideally suited for the promotion of brands and products at major events: participants enjoy a unique, collective moment that strengthens their connection with the brand. The project has been conceived to be fully customizable according to client needs: logos placed in strategic locations, themed environments, interactive elements, and the ability to expand the interaction area depending on the available space.

To celebrate the fiftieth anniversary of the Moon landing, the chosen theme was the lunar mission. Players, as astronauts, had to cooperate to solve puzzles and complete the mission on the Moon's surface. The virtual environment and script were co-designed with the client, carefully defining the gameplay dynamics, challenges, and interaction modes best suited to the context.

The project was developed in collaboration with OSC Innovation over just two months and presented as a temporary installation during the Amazon Xmas San Babila event (from November 27 to December 12, 2019, in Milan, Porta Venezia). The initiative demonstrated how virtual reality can transform physical events into powerful, shared experiences.

Tracking and technology infrastructure

One of the main challenges of the project was achieving absolute tracking for the headsets. At the time, the Oculus Quest 2 offered no official API for absolute positional tracking. The solution was to integrate Antilatency technology: external infrared sensors and an array of floor emitters enabled precise headset localization. This approach achieved a calibration accuracy of approximately 2 cm, allowing a seamless overlap of the real and virtual worlds.
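Overlapping the real and virtual worlds boils down to a calibration step: finding the rigid transform (a yaw rotation plus a floor-plane translation) that maps the tracking system's coordinates onto the virtual scene. A minimal sketch of that idea, assuming two corresponding reference points measured in each frame (function and variable names are illustrative, not from the actual project):

```python
import math

def calibrate(ref_real, ref_virtual):
    """Compute a 2D rigid transform (yaw + translation) mapping the
    tracking system's floor coordinates onto the virtual scene.
    ref_real / ref_virtual: two corresponding (x, z) reference points
    measured in each coordinate frame."""
    (r0, r1), (v0, v1) = ref_real, ref_virtual
    # Yaw offset between the frames, from the direction of the
    # segment joining the two reference points.
    yaw = (math.atan2(v1[1] - v0[1], v1[0] - v0[0])
           - math.atan2(r1[1] - r0[1], r1[0] - r0[0]))
    c, s = math.cos(yaw), math.sin(yaw)
    # Translation carrying the first rotated reference point onto
    # its virtual counterpart.
    tx = v0[0] - (c * r0[0] - s * r0[1])
    tz = v0[1] - (s * r0[0] + c * r0[1])

    def to_virtual(p):
        x, z = p
        return (c * x - s * z + tx, s * x + c * z + tz)
    return to_virtual

# Example: tracking frame shifted by (2, 3) relative to the scene.
to_virtual = calibrate([(0, 0), (1, 0)], [(2, 3), (3, 3)])
```

In practice the residual error of this mapping is what the ~2 cm calibration accuracy refers to: once the transform is applied, a player's real hand and its virtual counterpart coincide closely enough for physical contact to feel natural.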

Supporting this system was a centralized desktop application, which received the real-time positions of all users, processed them, and redistributed the data to every VR client running on the headsets. This client–server architecture ensured stable, synchronized, and scalable data flow.
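The relay logic described above can be sketched as a small server that keeps only the latest pose per player and periodically pushes a snapshot to every connected client. This is a conceptual sketch, not the project's actual code; the class name, wire format (JSON), and callback-based clients are assumptions:

```python
import json
import time

class PoseServer:
    """Minimal sketch of the central relay: store the latest pose per
    player and broadcast a snapshot to every VR client."""

    def __init__(self):
        self.poses = {}    # player_id -> (x, y, z, yaw)
        self.clients = []  # send callables standing in for sockets

    def on_pose(self, player_id, pose):
        # Last-write-wins: only the freshest sample matters for sync,
        # so late or out-of-order packets can simply overwrite.
        self.poses[player_id] = pose

    def broadcast(self):
        # One snapshot fan-out keeps every headset's view consistent.
        snapshot = json.dumps({"t": time.time(), "poses": self.poses})
        for send in self.clients:
            send(snapshot)
        return snapshot

# Usage: register a client, feed poses, broadcast a snapshot.
server = PoseServer()
received = []
server.clients.append(received.append)
server.on_pose("p1", (0.0, 1.7, 0.0, 90.0))
server.broadcast()
```

The star topology also explains the scalability claim: adding a headset adds one inbound pose stream and one outbound snapshot, without any peer-to-peer coordination between clients.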

Players’ avatars featured full-body tracking with movements handled by inverse kinematics, delivering fluid, natural motion. This significantly enhanced the sense of presence, fostering spontaneous collaboration among participants.
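The inverse kinematics mentioned above is the step that turns a few tracked points (head, hands) into plausible limb poses. A minimal sketch of the standard analytic two-bone solver (e.g. shoulder–elbow–wrist) in 2D, using the law of cosines; this illustrates the technique generically rather than the project's implementation:

```python
import math

def two_bone_ik(l1, l2, target):
    """Analytic two-bone IK in 2D. Given upper/lower limb lengths and
    a target point relative to the root joint, return the root and
    middle joint angles in radians."""
    tx, ty = target
    # Clamp the target distance to the limb's reach so acos stays valid.
    d = min(math.hypot(tx, ty), l1 + l2 - 1e-6)
    # Law of cosines gives the bend at the middle joint (elbow/knee).
    cos_mid = (l1 ** 2 + l2 ** 2 - d ** 2) / (2 * l1 * l2)
    mid = math.pi - math.acos(max(-1.0, min(1.0, cos_mid)))
    # Root joint: aim at the target, corrected by the inner triangle angle.
    cos_inner = (l1 ** 2 + d ** 2 - l2 ** 2) / (2 * l1 * d)
    root = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return root, mid
```

Because the solver is analytic, it runs in constant time per limb per frame, which is what makes it practical for animating several full-body avatars at VR frame rates.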

Usability and user experience

Significant effort went into usability and interaction design to create an engaging, entertaining, and visually striking experience. A guiding voice-over instructed players, suggesting how to collaborate and solve the challenges that appeared in the environment. This approach made the experience accessible even to users who had never tried virtual reality before.