My flagship VR development project is a fully functional harmonica simulator built for the Meta Quest 3, deployed as a standalone .apk application for the Android-based Quest platform. This project synthesized everything I’ve learned across music, 3D modeling, and VR development into a cohesive, interactive experience. Users can hold a virtual harmonica, play notes by blowing and drawing, simulated through head movement and controller input, and get a sense of the tactile feedback of playing a real instrument in virtual space. Building this simulator required solving complex technical challenges across multiple domains: asset pipeline integration, real-time audio synthesis, physics-based interaction design, and Android deployment, all while maintaining the performance standards critical for comfortable VR experiences.

The development process began with creating and optimizing 3D assets in Blender, then importing them into Unity while preserving materials, textures, and proper scale. I learned to navigate Unity’s component-based architecture, attaching scripts to GameObjects, managing parent-child hierarchies for the harmonica’s individual note holes, and configuring colliders for spatial interaction. The core technical challenge was implementing MIDI-based audio playback through C# scripting specifically optimized for Android devices. I had to manage real-time MIDI note triggering based on user input, handle polyphony when multiple notes play simultaneously, and ensure audio latency remained imperceptible—critical for maintaining the illusion of playing a real instrument. Each note hole required precise box collider placement and scripting to detect controller proximity and trigger appropriate audio responses, balancing realistic spatial constraints with forgiving interaction volumes that feel natural in VR.
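To make that interaction concrete, here is a minimal sketch of what a single note hole’s script could look like, assuming a trigger BoxCollider on each hole, an AudioSource per hole, and a controller object tagged "PlayerController". Unity has no built-in MIDI synthesizer, so the sketch substitutes pre-rendered note samples for the project’s actual MIDI triggering; all class, field, and tag names are illustrative rather than taken from the original code.

```csharp
using UnityEngine;

// Hypothetical sketch of a single note hole on the harmonica.
// Assumes the hole has a BoxCollider marked "Is Trigger" and an AudioSource,
// and that the controller carries a kinematic Rigidbody so trigger events fire.
[RequireComponent(typeof(BoxCollider), typeof(AudioSource))]
public class NoteHole : MonoBehaviour
{
    [SerializeField] private AudioClip blowNote;   // pre-rendered sample for the blow pitch
    [SerializeField] private AudioClip drawNote;   // pre-rendered sample for the draw pitch
    [SerializeField] private string controllerTag = "PlayerController";

    private AudioSource source;
    private bool controllerInside;

    private void Awake()
    {
        source = GetComponent<AudioSource>();
        source.playOnAwake = false;
        source.spatialBlend = 1f;  // fully 3D, so the note appears to come from the hole
    }

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag(controllerTag))
            controllerInside = true;
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag(controllerTag))
            controllerInside = false;
    }

    // Called by a higher-level input script when the player "blows" or "draws".
    public void Play(bool isBlow)
    {
        if (!controllerInside) return;
        // PlayOneShot lets notes from neighbouring holes overlap,
        // which is a simple way to get polyphony.
        source.PlayOneShot(isBlow ? blowNote : drawNote);
    }
}
```

A parent script on the harmonica can then loop over its child NoteHole components each frame and call Play on whichever holes the breath input maps to, keeping the per-hole logic small and the trigger volumes easy to tune individually.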

Beyond the core functionality, I implemented shader systems for visual feedback—highlighting active note holes and providing visual cues for proper hand positioning. Optimization became paramount: reducing draw calls, managing audio resource loading, and ensuring consistent frame rates since any performance dip in VR causes discomfort. Once development was complete, I learned the entire Android deployment pipeline for Quest: building the .apk through Unity, sideloading via SideQuest, testing on actual hardware, and iterating based on real-world VR usage. I then marketed the application through SideQuest’s platform, writing compelling descriptions, creating promotional materials, and gathering user feedback to inform updates. This end-to-end experience—from concept to deployed, user-tested application—demonstrated that I can take a VR project through every stage of development, not just prototype in the editor but deliver a polished, distributable product that real users can experience.
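As a rough illustration of the highlight behaviour, the sketch below brightens a hole’s emission colour while it is active. It assumes the material uses Unity’s Standard shader (the "_EmissionColor" property), and the class and field names are hypothetical; the actual shader work in the project may have been more elaborate.

```csharp
using UnityEngine;

// Hypothetical visual-feedback sketch: brightens a note hole's emission
// while it is active. Assumes a Standard-shader material with emission enabled.
[RequireComponent(typeof(Renderer))]
public class NoteHoleHighlight : MonoBehaviour
{
    [SerializeField] private Color activeEmission = Color.cyan;

    private Material material;   // per-renderer instance, so holes highlight independently
    private Color idleEmission;
    private static readonly int EmissionColor = Shader.PropertyToID("_EmissionColor");

    private void Awake()
    {
        material = GetComponent<Renderer>().material;
        material.EnableKeyword("_EMISSION");
        idleEmission = material.GetColor(EmissionColor);
    }

    public void SetActive(bool isActive)
    {
        material.SetColor(EmissionColor, isActive ? activeEmission : idleEmission);
    }
}
```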
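For the build step itself, an editor script along these lines can produce the Android .apk without opening the Build Settings window each time; the scene path and output name below are placeholders, and the project may well have been built through the standard Unity UI instead.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Hypothetical editor-side build helper. The scene path and output file name
// are placeholders, not taken from the original project.
public static class QuestBuild
{
    [MenuItem("Build/Quest APK")]
    public static void BuildApk()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Harmonica.unity" },  // placeholder scene
            locationPathName = "Builds/Harmonica.apk",
            target = BuildTarget.Android,
            options = BuildOptions.None
        };

        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log($"Build finished: {report.summary.result}, size {report.summary.totalSize} bytes");
    }
}
#endif
```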
