BeatMaP
A downloadable game for Android
Overview of experience
- You’ve seen AR Google Maps: it’s literally just an arrow in your space, and that’s basically it. We saw a lot of potential to make this experience more engaging and interactive. Since a lot of people already listen to music as they walk, we thought it would be fun to turn those beats into navigation/direction visuals in world space that sync to the music. That led us to BeatMaP, a mixed reality map that uses audio visualizer indicators for directions.
Development process
- Audio visualization
- Using an AI stem-separation tool (MVSEP) to split a song into multitracks: bass, drums, guitar, instrumental, other, piano, vocals, then downloading all of these tracks into the same directory.
- Using Python with the Librosa library to analyze the audio track files and produce a single JSON file detailing the amplitude and frequency of each track at every second.
- Exporting the JSON into Unity and using Newtonsoft.Json to parse it into a format Unity can read.
- Using the parsed JSON data, building a per-second ranking of the tracks by their dominance, i.e. how much each one is carrying the song at that moment. This ranking determines which visuals appear at a given second: the top 3 most dominant tracks are shown, with a 30% chance of promoting a non-dominant track to the front so that there is variety and the same dominant visuals don’t appear constantly.
- Making a base class for prefab movement, then seven subclasses, one per track, that inherit from it. Each subclass has its own movement behavior and is attached to its prefab.
- We used VFX Graph (installed via the Visual Effect Graph package in the Package Manager) to create particle strip trails.
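The per-second dominance ranking described above can be sketched in Python. This is a minimal illustration rather than our actual Unity C# code; the shape of the `amplitudes` dict, the `rank_tracks` name, and the promotion details are assumptions:

```python
import random

def rank_tracks(amplitudes, second, rng=random):
    """Rank stem tracks by amplitude ("dominance") at a given second,
    keep the top 3, and with 30% probability promote a non-dominant
    track to the front for visual variety.

    amplitudes: hypothetical {track_name: [amplitude_per_second]} dict
    built from the exported JSON.
    """
    ranked = sorted(amplitudes, key=lambda t: amplitudes[t][second], reverse=True)
    top3, rest = ranked[:3], ranked[3:]
    if rest and rng.random() < 0.30:
        # Swap a randomly chosen non-dominant track into the front slot
        top3[0] = rng.choice(rest)
    return top3

# Example with made-up amplitude data for second 0
amps = {"bass": [0.9], "drums": [0.8], "vocals": [0.7],
        "piano": [0.2], "other": [0.1]}
```

Passing a seeded `random.Random` instead of the module makes the 30% promotion reproducible when testing.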
- Navigation
- Using Google Directions API to request walking route steps from current GPS location to a destination address.
- Storing API keys in a config asset and loading them at runtime.
- Receiving user location from a companion app built in Unity that streams latitude and longitude coordinates and heading to Firebase.
- Parsing Directions API JSON response into route legs and steps.
- Extracting each step’s end coordinate as a waypoint and generating instruction text for the user.
- Creating a live GPS-to-Unity coordinate map at route start using current camera position and compass heading.
- Converting latitude/longitude to world-space positions with north/east meter offsets.
- Using periodic recalibration to correct GPS/VR drift while preserving route alignment.
- Initializing route orientation from compass heading and camera forward direction so world navigation aligns with real-world facing direction.
- Rendering a live path with Line Renderer in world space.
- Updating path points every frame so the line follows user movement and remains aligned to current route progress.
- Using a standard threshold for intermediate waypoints and a larger destination radius for final arrival detection.
- Updating messages on the hand with current location, next waypoint coordinates, step distance, and turn/continue instruction.
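The coordinate conversion and arrival checks above can be roughly sketched like this (in Python rather than our Unity C#; the function names and the 5 m / 10 m radii are illustrative placeholders, not our tuned values):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def latlon_to_offset(origin_lat, origin_lon, lat, lon):
    """Equirectangular approximation: convert a lat/lon pair into
    (north, east) meter offsets from the route-start origin.
    Plenty accurate over walking distances."""
    north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    east = (math.radians(lon - origin_lon) * EARTH_RADIUS_M
            * math.cos(math.radians(origin_lat)))
    return north, east

def reached(pos, waypoint, is_final, step_radius=5.0, dest_radius=10.0):
    """Arrival test: a tighter radius for intermediate waypoints,
    a larger one for the final destination."""
    dist = math.hypot(pos[0] - waypoint[0], pos[1] - waypoint[1])
    return dist <= (dest_radius if is_final else step_radius)
```

Each step's end coordinate from the Directions response would be run through `latlon_to_offset` once at route start, and `reached` checked every frame against the user's current offset.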
Successes
- Navigation worked in the end!!! (mostly)
- We were successful in extracting amplitude and frequency data from the songs as JSON files, then using that data to display visuals that synced with the music.
- VFX Graph, specifically particle strips, worked really well for creating trail visualizations that were dynamic and optimized for the headset. In a previous iteration of this project, we tried using shaders to create the trails, which proved too performance-intensive for the headset, so we had to scrap it. We also researched creating trails with Trail Renderer, but it would not have been as responsive to the fluctuating movement we wanted to drive with the frequencies of the beats.
- The trails and prefabs were able to follow the navigation path smoothly, as well as fade out nicely.
Challenges
- At first, we thought the assignment had to include all of passthrough, hand tracking, and plane detection, so we were determined to include wall anchors where visualizations would spawn to the beat. Our experience is meant to be used while roaming around rather than staying stationary in a room, so we had to turn off physical space features in settings. However, doing so also turned off plane detection, which caused a lot of trouble for us. The headset cannot analyze planes on the fly, so we considered pivoting to a roomscale experience. Since navigation/geolocation features become obsolete in a roomscale experience, we tried switching gears to an experience where animals walked on surfaces and spawned to the beat of the music. This was not ideal because there was not much the user could do to interact with the experience besides picking up the animals and placing them on various surfaces. Upon learning that the assignment did not require all three MR features at once, we pivoted back to our original idea of audio visualizers that serve as navigational directions.
- Aligning real-world GPS to Unity space was challenging because GPS, headset tracking, and phone compass data each have different noise profiles and update rates. Over time, GPS drift caused the rendered route to gradually misalign with the physical environment, so we added periodic recalibration to keep the path stable. In the initial implementation, positioning relied on multiple offset layers tied to Unity world orientation, camera orientation, and phone orientation. We then replaced that approach with a geo-to-world mapping system, which used a real-world coordinate origin as an anchor to establish the Unity world orientation. GPS coordinates were then projected directly into that mapped Unity coordinate frame. This made world placement and route orientation much more consistent.
- Integrating the phone app was also a challenge. Initially we used an Expo Go app that relied on WebSocket connections, which was inconsistent, inaccurate, and required both devices to be on the same network (with no firewall) to work. We ended up using another Unity app made by the goat Professor Bruneau to get more accurate location data that didn't rely on IP connectivity.
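The heading-based part of that geo-to-world mapping can be sketched as a single rotation (again in Python rather than Unity C#; the +z-forward/+x-right axis convention and the function name are assumptions for illustration):

```python
import math

def world_from_offset(north, east, heading_deg):
    """Rotate a (north, east) meter offset into a Unity-style local
    frame whose +z axis points along the compass heading captured at
    route start. heading_deg is degrees clockwise from north.
    Returns (x, z): x is meters to the user's right, z is meters ahead."""
    h = math.radians(heading_deg)
    # Project the offset onto the right and forward axes of the
    # calibrated frame
    x = east * math.cos(h) - north * math.sin(h)
    z = north * math.cos(h) + east * math.sin(h)
    return x, z
```

Periodic recalibration then amounts to re-capturing the origin coordinate and heading while the route geometry stays fixed, which is what keeps GPS drift from slowly rotating the rendered path away from the real streets.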
What you learned
- If there’s a will there’s a way :D We really wanted to do this project and didn’t really want to pivot despite running into obstacle after obstacle. Pushing through these obstacles and coming up with new ways to problem solve, we did not give up on our idea and as a result, created a cool experience we’re proud of.
- We learned a lot about the workflow of collaborating together on a Unity project using version control with GitHub.
- We also learned to work with JSON files using Python, then turn that data into something usable in Unity C#.
Possible future revisions
- Given more time on this project, we would add more songs of different genres so that there could be a wider variety of music visualizations.
- We would also add onboarding UI to the experience, including start and end points, so that users know when they’ve arrived at their destination.
- It would also be cool to integrate the Spotify API so that users can choose any song they want to play. However, we would need to figure out a way to analyze those songs on the fly, as opposed to relying on pre-generated JSON files for each track.
Repo link
Walkthrough demo https://drive.google.com/file/d/1sJ8HTt2Saf6c9eJexzum-gd6k7FZON-0/view?usp=drive_link
Download
BeatMaP.apk 68 MB
Install instructions
Download and upload to Quest :3
