Making of the game
Goal and Motivation
Explore interactions in VR, including the hand tracking of Quest 2
Deliver an immersive Sound Interaction in VR
Experiment on sound visualization
Technologies
Unity
Bluetooth
FMOD
Meta Quest 2
React
Heroku
Vercel
Arduino
Justification of Graphics/Interactions
We decided to use the Meta Quest 2 for this project because it is wireless and one of the most widely used VR headsets. Developing for a widespread headset would support any future business goal of reaching out to players. Furthermore, since the interactions may require the player to move around a lot, we reasoned that a wireless headset would be more appropriate. We also chose the Quest 2 because we aimed to explore and utilize its hand tracking capabilities. However, the hand tracking could not reliably follow the fast hand movements required to play our percussion instruments, so we chose to design the interactions with the Quest's VR controllers in mind instead.
We chose the Unity game engine mainly because of the team's familiarity with it and anecdotal online statements that Unity has better support for VR development. We also felt there were far more tutorials available for VR work in Unity than for other engines. The Oculus API for Unity offers three main interaction techniques: using the VR controllers, using hand tracking, or a mix of the two in which the VR controllers are visualized as hands. Since the hand tracking did not work satisfactorily and we wanted a more immersive experience, we chose the third option: the interactions are driven by the VR controllers, but in-game the player still sees hands. We found this to work well.
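As an illustration of that third option, here is a minimal sketch of how controller input can drive a hand model in Unity. It assumes the Oculus Integration's OVRInput API and an Animator on the hand model with float parameters "Flex" and "Pinch"; those parameter names are placeholders of our own, and the official Oculus sample hand prefabs ship with a similar setup ready-made.

```csharp
using UnityEngine;

// Sketch: drive a hand model's pose from the Touch controller's grip/trigger axes,
// so the player sees a hand in-game while input still comes from the controller.
// Assumes the Oculus Integration package (OVRInput); "Flex" and "Pinch" are
// placeholder Animator parameter names.
public class ControllerDrivenHand : MonoBehaviour
{
    [SerializeField] private Animator handAnimator;
    [SerializeField] private OVRInput.Controller controller = OVRInput.Controller.RTouch;

    private void Update()
    {
        // Grip button -> how much the whole hand closes.
        float flex = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);
        // Index trigger -> pinching pose for the index finger.
        float pinch = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, controller);

        handAnimator.SetFloat("Flex", flex);
        handAnimator.SetFloat("Pinch", pinch);
    }
}
```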
Simple cartoonish/low-poly graphics were used, as the focus was on creating good immersion through feedback and intuitive interactions. We had aimed to implement dedicated sound visualization elements but ran out of time for them. However, the majority of the graphical elements are still in alignment with the music triggered and played by the musician: we perform a Fourier transform to obtain frequency and magnitude values and map those values to graphical elements. This ensures that the visuals are connected to the music and seemed to increase immersion; many people testing the application went "wow" when the graphical effects activated during a song.
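A minimal sketch of this kind of spectrum-to-visuals mapping is shown below. For simplicity it uses Unity's built-in AudioSource.GetSpectrumData; in an FMOD-based setup the same magnitude values would come from an FFT DSP instead, and the band index and sensitivity values are illustrative only.

```csharp
using UnityEngine;

// Sketch: sample the spectrum of the currently playing audio and map the
// magnitude of a chosen frequency band to the scale of a visual element.
public class SpectrumScaler : MonoBehaviour
{
    [SerializeField] private AudioSource source;
    [SerializeField] private Transform visual;     // e.g. a light beam or speaker cone
    [SerializeField] private int band = 8;         // which FFT bin to react to (placeholder)
    [SerializeField] private float sensitivity = 40f;

    private readonly float[] spectrum = new float[256];

    private void Update()
    {
        // Fills 'spectrum' with per-bin magnitudes for channel 0.
        source.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Smoothly scale the visual with the magnitude of the chosen band.
        float target = 1f + spectrum[band] * sensitivity;
        visual.localScale = Vector3.Lerp(visual.localScale,
                                         Vector3.one * target,
                                         Time.deltaTime * 10f);
    }
}
```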
Early in the project, playtesting made us realize that people really want to operate the virtual drum pedals with their feet. We therefore put a large effort into building a set of physical pedals that interface with the application. We aimed to make these pedals "plug and play" for future users, with the requirement that they be wireless. This led us to discard ideas requiring networked setups and to instead use Bluetooth, which the Quest 2 supports. An Arduino enabled a HID-compliant Bluetooth connection to the Quest, meaning we could program the physical pedals connected to it to be recognized as a simple gamepad, joystick, keyboard, etc. (see the sketch below).
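Because the pedals appear to the Quest as a standard HID gamepad, the Unity side only needs to listen for ordinary joystick button presses. The sketch below illustrates this, assuming Unity's legacy Input Manager; the button indices are placeholders that depend on how the pedal firmware reports each switch.

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch: react to the Bluetooth pedals as regular gamepad buttons.
// JoystickButton0/1 are placeholder indices for the kick and hi-hat switches.
public class PedalInput : MonoBehaviour
{
    [SerializeField] private KeyCode kickPedal = KeyCode.JoystickButton0;
    [SerializeField] private KeyCode hiHatPedal = KeyCode.JoystickButton1;

    public UnityEvent onKick;    // hook up e.g. the kick drum sound in the Inspector
    public UnityEvent onHiHat;

    private void Update()
    {
        if (Input.GetKeyDown(kickPedal))
            onKick.Invoke();

        if (Input.GetKeyDown(hiHatPedal))
            onHiHat.Invoke();
    }
}
```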
Gallery
Challenges
Firstly, a big challenge was the latency of the interactions, both audio and visual. Low latency is especially important for instruments in order to play in sync with a given beat. Both outputs have an inherent delay that we could not fully get around, but we made sure that the game does not run at a low frame rate.
Secondly, the "physically based sonification", i.e., mapping the player's interaction with objects to sounds based on physical elements, was another big challenge for us.
For example, when hitting a drum, the sound generated should be affected by several parameters such as different velocities, different angles hits, or what part of the stick hits the drum.
But eventually, we managed to adapt the sound to the velocity of the controllers, which turned out to be an unexpected challenge since returning the parameters of the VR controllers was not easy. We did not manage to implement a full "physically based sonification" due to time constraints.
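A minimal sketch of this velocity mapping is shown below, assuming the Oculus Integration's OVRInput and the FMOD Studio Unity integration; the FMOD event path, the "Velocity" parameter name, and the "Drumstick" tag are placeholders for whatever is set up in the actual project.

```csharp
using UnityEngine;

// Sketch: read the Touch controller's velocity at the moment a drum is hit and
// pass it to the triggered sound so FMOD can scale volume/timbre with hit strength.
public class DrumHit : MonoBehaviour
{
    [SerializeField] private string drumEvent = "event:/Drums/Snare"; // placeholder path

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Drumstick"))
            return;

        // Velocity of the right Touch controller in tracking space (Oculus Integration).
        float speed = OVRInput.GetLocalControllerVelocity(OVRInput.Controller.RTouch).magnitude;

        // Create a one-shot FMOD event instance and pass the speed as a parameter.
        var instance = FMODUnity.RuntimeManager.CreateInstance(drumEvent);
        instance.setParameterByName("Velocity", speed);
        instance.start();
        instance.release();
    }
}
```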
Obstacles
Initially, we wanted to use hand tracking as the main control interface. Due to its underperformance (i.e., losing track of the hands during fast movements, high latency, and a limited tracking frustum within which the hands are detected), we switched to using the Oculus controllers.
During exhibitions we used speakers and displays so the audience could watch a stream of the VR player's view. However, the stream had a latency, which caused an echo effect for the VR player sitting nearby: the streamed sound was louder than the headset's sound, making the audio feedback feel delayed for the player.
Lessons Learned
A key to our success was to distribute the workload early, hold regular project meetings (we had one every Friday, discussing each member's progress and challenges), and keep short, dated notes on the progress.
Create a shared workspace for your whole group. We used Discord with several channels, each dedicated to a single topic: troubleshooting, feedback, useful links, and research. That was very helpful, as we could discuss each issue in a single thread.
Keep track of issues and turn received feedback into action points in order to improve productivity and results. Try to always stay connected and communicate as much as possible within the group.
Credits
• Pizzaslize records @ www.youtube.com/pizzaslizecrew for the songs [Kall - Frej Larsson & BenG] and [Vini, weedi, whiskey - Frej Larsson & Young Earth Sauce]
• "Low Poly Recording Studio" by Justiplay and "PBR Stage Equipment" by Tirgames assets for 3D models and textures (from the Unity Asset Store)
• Freesound (www.freesound.org/) and Ableton Live (www.ableton.com/en/) for various digital instruments, sounds and sound effects
The Team
Copyright © 2022 - Room for Sound?