Between the Mountain and the Sea
People have always tried to visualise music, whether through a music video or a computer-generated visualiser. Through this project I combined these concepts to produce a new way of experiencing music: one that joins the audio and visual senses into an experience only made possible through Virtual Reality.
Between the Mountain and the Sea is a Virtual Reality music experience: the viewer listens to music while the world moves around them. The animations and themes of the world are uniquely linked to the song and its lyrics, and the experience is fully passive, requiring no input from the viewer.
I began by making a music visualiser in Unity. This let me tie animations and events to the frequency content of the music.
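In Unity this kind of visualiser is typically fed by `AudioSource.GetSpectrumData`, which fills a buffer with FFT magnitudes each frame. As a language-agnostic sketch of the idea (Python purely for illustration; the window size, band count, and test tone are invented, not taken from the demo):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive DFT; returns the magnitude of each frequency bin
    (first half of the spectrum, as for real-valued audio)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

def band_energies(magnitudes, num_bands):
    """Average the fine-grained bins into a few coarse bands,
    one per animation driver."""
    size = len(magnitudes) // num_bands
    return [sum(magnitudes[b * size:(b + 1) * size]) / size
            for b in range(num_bands)]

# A 64-sample window holding a strong low tone and a quieter high tone.
window = [math.sin(2 * math.pi * 3 * t / 64)
          + 0.5 * math.sin(2 * math.pi * 25 * t / 64)
          for t in range(64)]
bands = band_energies(dft_magnitudes(window), 8)  # the low band dominates
```

Each value in `bands` can then drive one animated property, which is how scene events get tied to specific parts of the music.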
I then used this data to make a moving street that only moved when the guitar strummed. This was done using an FFT: once the energy in the guitar's frequency band rose above a specified threshold, the street moved position.
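That trigger logic can be sketched as follows (Python for illustration; the class name, threshold, and step size are invented). A rising-edge check is one way to make a sustained loud note count as a single strum rather than moving the street every frame:

```python
class StrumMover:
    """Advances a position one step each time the watched band's
    energy rises above the threshold (rising-edge trigger)."""

    def __init__(self, threshold, step):
        self.threshold = threshold
        self.step = step
        self.position = 0.0
        self._was_above = False

    def update(self, band_energy):
        above = band_energy > self.threshold
        if above and not self._was_above:
            self.position += self.step  # strum detected: move the street
        self._was_above = above
        return self.position

mover = StrumMover(threshold=0.5, step=2.0)
# Two distinct rises above the threshold -> two moves, ending at 4.0.
positions = [mover.update(e) for e in [0.1, 0.9, 0.8, 0.2, 0.95]]
```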
After doing this, I began working on the transitions between the different scenes. I split the experience across multiple scenes to make them easier to work on, with a master scene, always loaded, holding the music and the transition timers to keep the experience cohesive.
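In Unity terms, the master scene stays loaded while the others are loaded and unloaded additively (`SceneManager.LoadSceneAsync` with `LoadSceneMode.Additive`). The timing side can be sketched like this (Python for illustration; the cue times and scene names are placeholders, not the real ones):

```python
class SceneTimeline:
    """Maps the song's playback time to the scene that should be
    active. The master scene (music + timers) is assumed to stay
    loaded throughout and is not listed here."""

    def __init__(self, cues):
        # cues: (start_time_in_seconds, scene_name) pairs
        self.cues = sorted(cues)
        self.active = None
        self.log = []  # records the load/unload order

    def update(self, song_time):
        target = None
        for start, name in self.cues:
            if song_time >= start:
                target = name
        if target != self.active:
            if self.active is not None:
                self.log.append(("unload", self.active))
            self.log.append(("load", target))
            self.active = target
        return self.active

timeline = SceneTimeline([(0.0, "street"), (45.0, "mountain")])
```

Calling `update` each frame with the song's playback time keeps the loaded scene in lockstep with the music, which is the point of keeping the timers in the always-loaded master scene.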
During development I noticed that the transition between skyboxes was very harsh between scenes, so to rectify this I replaced them with giant inverted spheres with the skybox material applied. This allowed me to scale the skybox as I wished, as well as swap skyboxes within the same scene.
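One way to soften the swap between two such sky spheres (an illustrative approach, not necessarily what the demo does) is a timed crossfade of their material alphas; a minimal sketch:

```python
def sky_crossfade(t, duration):
    """Alpha for the outgoing and incoming sky spheres at time t
    seconds into a crossfade of the given duration."""
    a = min(max(t / duration, 0.0), 1.0)  # clamp progress to [0, 1]
    return 1.0 - a, a  # (outgoing alpha, incoming alpha)
```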
The music for the experience was provided by my friend Archie Johnson, who was studying Music at university. I began by listening to the lyrics and tone of the song to visualise the different scenes and events that would happen within the experience, then plotted these out in a storyboard.
The main purpose of this project was to test how to smoothly transition the viewer between different scenes in Virtual Reality while limiting VR sickness. I found that keeping fixed reference points that don't move during a transition, e.g. the floor or walls remaining still while other components move, greatly reduced reports of nausea during user testing.
Originally, in the street scene, I had the floor moving in time with the houses to simulate movement; however, many users felt unwell or lost their balance during the experience, so I ended up making the floor stationary.
If I were to do this again, I'd either choose a shorter song or split the song up before starting: to test the timing of certain sections, I had to play through the entire song to reach the point I wanted to test, which greatly slowed production.
Something I would do again, however, is the use of FFTs to automate the animation of certain elements so they are perfectly synced to the music. I even split the music into different levels, i.e. the bass and treble on separate values, allowing further customisation.
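One common way to turn a raw band energy into a usable per-band driver value (an illustrative technique, not taken from the demo) is peak-hold with decay: the value jumps to new peaks instantly, then falls off smoothly, so bass and treble can each drive their own property without jitter:

```python
class BandValue:
    """Smoothed driver for one frequency band (e.g. bass or treble):
    rises instantly to a new peak, then decays by a fixed factor
    per update, keeping animations punchy but stable."""

    def __init__(self, decay=0.5):
        self.decay = decay
        self.value = 0.0

    def update(self, energy):
        self.value = max(energy, self.value * self.decay)
        return self.value

bass, treble = BandValue(), BandValue()  # one independent driver per band
```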
Overall, I am very happy with the outcome of this demo, as it demonstrates that music videos can be translated into VR, and how effective the result can be.