Night Hunter
An AR Audio Exploration Game
Skills: AR development, Unity, C#, design iteration, playtesting
If you’ve ever walked outside in complete darkness, you know the feeling of relying on your hearing more than your sight to guide you. To explore how well humans can localize sound with little to no visual input, we (Sophie Clyde and Pablo Villalobos) created an audio augmented reality game in which the player uses head movements and the sounds coming through the speakers of the Bose Frames to navigate toward individual sounds and earn points. In designing and building this game we wanted to probe the design challenges and limits of audio-first game design, and to test people’s ability to recognize and locate a source of sound with limited or no use of their vision.
After several rounds of playtesting to refine our original ideas, we built a robust prototype in Unity. The final result was Night Hunter, a game in which your ears are more important than your eyes. You play as a ship sailing through the darkness of space, looking for fuel before it runs out. The Bose Frames play a sound when a fuel pod appears, and you must turn your head toward the sound to navigate to it; once the pod is close enough, a white dot appears onscreen so you can use both visuals and audio to course-correct and pick it up. The visuals occasionally vanish as your radar cuts out, leaving you to find the fuel by ear alone. Can you survive when the lights go out?
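To make that handoff from ears to eyes concrete, here is a minimal Unity sketch of how such a proximity reveal could work. The component and its names (FuelPodReveal, revealDistance) are our own illustrations, not the shipped code:

```csharp
using UnityEngine;

// Illustrative sketch only: show the fuel pod's white dot once the pod
// is within a reveal radius of the player, so audio-only steering can
// hand off to combined audio and visual steering.
public class FuelPodReveal : MonoBehaviour
{
    public Transform player;           // the player's ship (or camera)
    public Renderer dotRenderer;       // renders the white onscreen dot
    public float revealDistance = 5f;  // assumed threshold, in world units

    void Update()
    {
        float distance = Vector3.Distance(transform.position, player.position);
        // Keep the dot hidden until the pod is close enough; a radar
        // blackout system could force it off regardless of distance.
        dotRenderer.enabled = distance <= revealDistance;
    }
}
```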
Watch our final presentation on the project here: https://ocw.mit.edu/courses/cms-s63-playful-augmented-reality-audio-design-exploration-fall-2019/resources/student-project-night-hunter/
Read the final paper for the project here: https://18193375-aae5-45e2-bfae-30bfcdc62078.usrfiles.com/ugd/181933_376f3a00597547cab041904f2ecc145b.pdf
Read more about the course here: https://ocw.mit.edu/courses/cms-s63-playful-augmented-reality-audio-design-exploration-fall-2019/pages/assignments/
Demo Video
Project Description & Goals
Our project sprang from a simple question: how can we use audio to augment some part of our lives? We wanted to use audio to create a playful experience that would allow us to investigate our own auditory abilities and those of the people around us. Our solution was to develop an audio augmented reality game that allowed for research through game design, development, and play.
From the beginning of the project we intended to use the Bose Frames (provided by our class sponsor, Bose) as the base technology for the game: the frames have built-in audio AR sensors that let them render sounds that appear to come from different directions around the wearer, and they can detect gestures such as a double tap or a nod or shake of the head. With a design direction in mind and the technology firmly set, we began shaping the initial concept for the game.
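We won't reproduce the Bose AR SDK's exact API here; instead, the sketch below uses a hypothetical OnHeadRotation callback to show the general shape of how rotation updates from the frames could drive Unity's audio listener. Everything named here is an assumption, not the SDK's real interface:

```csharp
using UnityEngine;

// Hypothetical sketch: keeps the in-game AudioListener aligned with the
// wearer's head so spatialized sounds pan correctly as the head turns.
// OnHeadRotation stands in for whatever callback the real Bose AR SDK
// raises when its rotation sensor updates.
public class HeadTrackedListener : MonoBehaviour
{
    private Quaternion latestRotation = Quaternion.identity;

    // Imagine the SDK invoking this on every rotation-sensor update.
    public void OnHeadRotation(Quaternion headRotation)
    {
        latestRotation = headRotation;
    }

    void Update()
    {
        // Attach this component (and the AudioListener) to one object;
        // rotating it steers the player's "ears" in the game world.
        transform.rotation = latestRotation;
    }
}
```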
Initial Concepting and Ideation
Our original concept for the game drew on the idea of superpowers, particularly the enhanced senses that many comic book heroes sport. The player would pilot a submarine through the deepest parts of the ocean, using sonar at regular intervals to locate obstacles and steer around them. Every obstacle would make a sound when the “sonar” wave hit it, allowing the player to approximate its location by ear and steer the ship between obstacles. In this early version of the game, obstacles could be as simple as a box blocking the path or as complex as a hoop the ship had to pass through.
However, this early game had some big hurdles to overcome and many iterations to go through. Deciding which obstacles to include led us to our driving research question: how good are people at localizing objects in space with limited or no visual input? To win this early game, the answer would have to be “very good”; otherwise, players would quickly run into trouble when they couldn’t locate objects fast or accurately enough. The first design relied too heavily on the assumption that people are good at sound localization; we needed to iterate on the idea until the game could actually test that ability.
First Iteration & Playtesting
After some initial feedback that the game would be easier to play if players were searching for the sounds they could hear instead of trying to avoid them, we reworked our design and game concept. The player now took the role of a bat hunting at night, using echolocation to find and collect all the juicy flies around it. Switching from avoiding objects to seeking them made the game more forgiving, since a player homing in on a noise source needs far less localization precision than one trying to dodge it.
To refine our idea further and determine how good players were at sound localization, we built a physical version of the game and playtested it with our classmates. We found that the instructions were somewhat unclear to our participants, and that the blindfolded main player had a hard time finding sounds at first but improved drastically once they focused on each sound individually and eliminated them one by one. Providing a visual example to confused players proved helpful, an idea that would reappear in later prototypes. Playtesting also answered an important question: we had been concerned that the loss of sight might be uncomfortable for our players, but they reported that having the sounds approach them while blindfolded was neither frightening nor uncomfortable, because they could hear them coming.
Final Prototype & User Testing
Armed with actionable insights from the playtest, we dove into making a final prototype in the Unity game engine, using the Bose Audio AR SDK to connect our game to the Bose Frames. In the new prototype the player could hear sounds coming closer and had to turn their head to face each one; when the player and a sound collided, the player collected that sound. To make the game easily playable while sitting, we restricted the sounds’ spawn locations to the direction the player was facing when the game started, so they never needed to turn all the way around to chase a new spawn.
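As an illustration of that spawn restriction, here is a short Unity sketch that confines new pods to an arc around the player's starting orientation. The class and parameters (FuelPodSpawner, maxAngle, spawnDistance) are hypothetical stand-ins, not our actual implementation:

```csharp
using UnityEngine;

// Illustrative sketch: spawn fuel pods only within a frontal arc locked
// to the direction the player faced when the game began, so a seated
// player never has to turn fully around.
public class FuelPodSpawner : MonoBehaviour
{
    public GameObject fuelPodPrefab;  // pod prefab with a spatialized AudioSource
    public float spawnDistance = 20f; // assumed distance from the player
    public float maxAngle = 60f;      // assumed half-width of the arc, in degrees

    private Quaternion startRotation;

    void Start()
    {
        // Lock the arc to the player's orientation at game start.
        startRotation = transform.rotation;
    }

    public void SpawnPod()
    {
        // Pick a random yaw within the arc, relative to the start heading.
        float yaw = Random.Range(-maxAngle, maxAngle);
        Vector3 direction = startRotation * Quaternion.Euler(0f, yaw, 0f) * Vector3.forward;
        Instantiate(fuelPodPrefab,
                    transform.position + direction * spawnDistance,
                    Quaternion.identity);
    }
}
```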
Another round of user tests showed that this version of the game was incredibly difficult to play, so after some thought we decided to add a user interface to aid the players. Players still needed to use their ears to locate sounds, but once a sound was close enough to the ship it appeared onscreen, and players could then easily steer to collect it. Because we worried that players might become reliant on the visuals, we implemented a periodic screen blackout: occasionally the screen and UI disappear, forcing players to steer using their ears alone, a happy medium between absolute blindness and a full visual interface.
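A periodic blackout like this is easy to express as a Unity coroutine. The sketch below is illustrative; the names and timings (RadarBlackout, visibleSeconds, blackoutSeconds) are assumptions rather than values from the shipped game:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch: periodically fade out the whole visual layer
// (dots, stars, and UI) so the player must steer by ear alone.
public class RadarBlackout : MonoBehaviour
{
    public CanvasGroup visuals;        // group holding all onscreen visuals
    public float visibleSeconds = 10f; // assumed time with the radar on
    public float blackoutSeconds = 4f; // assumed time with the radar off

    void Start()
    {
        StartCoroutine(BlackoutLoop());
    }

    IEnumerator BlackoutLoop()
    {
        while (true)
        {
            visuals.alpha = 1f;  // radar on: visuals available
            yield return new WaitForSeconds(visibleSeconds);
            visuals.alpha = 0f;  // radar cuts out: audio only
            yield return new WaitForSeconds(blackoutSeconds);
        }
    }
}
```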
Findings, Takeaways, & What's Next
The development of Night Hunter helped answer our questions about people’s ability to localize spatial sound, and it helped us develop valuable research, prototyping, coding, user testing, and AR skills. Throughout the project we found that pinpointing the source of a sound in space when given only audio to go on is difficult for most people, and becomes nearly impossible when two or more similar sounds play at once. The fewer and more distinct the sounds in a particular soundscape, the easier the task becomes. Additionally, people find it easiest to locate sounds that are separated horizontally rather than vertically; it is much harder to locate sounds above or below the listener.
Ultimately, we think that for precision games like ours, a hybrid approach that couples audio signals with some visual tools and feedback allows players to explore audio-first gaming without too much difficulty or too steep a learning curve. Our findings about people’s ability to localize sound using audio alone, and in particular a mix of audio and visuals, could lead to more games and experiences successfully using this style of interaction in the future. This playstyle may also allow for greater accessibility, as the game can be played by those with either visual or auditory impairments. Looking ahead, we see many opportunities to modulate difficulty, such as varying screen blackout duration, sound speed, or even adding audio blackouts, to make a longer and more enjoyable game while further exploring how people balance hearing and sight to locate the sounds around them.
Figure 1: The Bose Frames Rondo used to run Night Hunter. They are equipped with sound spatialization technology and audio augmented reality sensors used to make Night Hunter an enjoyable game experience.
Figure 2: Students playtesting an early physical prototype of Night Hunter
Figure 3: The Night Hunter GUI. The small arrows represent the player, who must turn their head toward the sources of the sounds to make the small white balls representing fuel move toward them. In the background, tiny white dots representing stars drift past, showing the direction the ship is moving.
Figure 5: The instruction menu on an Android phone. This menu is accessible from the main page the player sees when opening the game.