Jasz Pike
Sound for Game & Apps

Bespoke Middleware Blog

Research, Design and Development Blog for Bespoke Middleware Project

Hybrid Interactive Music

Part 1 - HIM

Audiokinetic blog post on interactive music in the game Get Even, developed by The Farm 51.

Is Hybrid Interactive Music the Future? PART I - How I used Get Even as an R&D platform for Interactive Music

The term “Hybrid Interactive Music” is used to describe this process of game composition.

Get Even is real R&D for interactive music. The rule was that any music should have a diegetic sound source as its starting point. It pushed me to create some unique systems and real-time generated music. In this video you can witness some of the systems in-game.
“For instance, anything you are doing with Interactive Music can be connected to sound emitters in your own game. This is just amazing! I can play a music sequence using HIM into the 3D world of the game, synced with a 2D prerecorded track. It opened so many possibilities and such a meaningful approach for Get Even.”
“This led me to create 17 different ways of using interactive music throughout the whole game, such as the use of diegetic sounds synced with prerecorded music, drones processed in real-time by the console, drums sequenced in MIDI, going from 3D to 2D layers, even better visuals links with the music, and so on.”

Talks about the loss of the characteristics of game music when the CD came into the industry. Now it has come back around with tools like Wwise, enabling composers to use synthesis and MIDI controllers. Mentions that vertical and horizontal layering in Wwise give users more creativity and flexibility when composing.

Talks about how they approached the implementation of music in a VR experience. The challenges they faced and how they overcame them:

The final result combined a variety of traditional interactive game audio design approaches.

Part two goes into great detail explaining the techniques and tools used in Wwise to create an intense interactive game audio environment, such as using the Wwise Meter effect to increase and pulse sounds based on player progression through the level (demonstrated partway through the accompanying video).

Details music production techniques such as side-chaining and synth manipulation driven by RTPCs, along with tempo and reverb changes.
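As a rough illustration of the side-chaining idea mentioned above, here is a Python sketch of my own (not the blog author's actual Wwise setup): a control signal, such as drum hits, drives an envelope follower, and the music level is attenuated in proportion to that envelope. This is the behaviour an RTPC-driven ducking chain reproduces. All names and coefficients here are illustrative.

```python
import math

def envelope_follower(signal, attack=0.5, release=0.05):
    """One-pole envelope follower over a list of sample values."""
    env = 0.0
    out = []
    for s in signal:
        # Rise quickly when the input exceeds the envelope, fall slowly otherwise.
        coeff = attack if abs(s) > env else release
        env += coeff * (abs(s) - env)
        out.append(env)
    return out

def duck(music, sidechain, depth=0.8):
    """Attenuate `music` by up to `depth` wherever `sidechain` is loud."""
    env = envelope_follower(sidechain)
    return [m * (1.0 - depth * e) for m, e in zip(music, env)]

# A burst in the sidechain pulls the music level down; it then recovers.
music = [1.0] * 8
kick = [0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0]
ducked = duck(music, kick)
```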

"Switch container controlled by Tempo_Prologue RTPC that swaps switch tracks and makes the clock in-sync with the downbeat."

“In the end, with just one RTPC (Tempo_Prologue) and 1 bar at 60 bpm, I was able to make the musical experience completely organic for the whole level. This is also great because if the developer were to change the scale of the level, the music would continue to follow the progression as the RTPC is dependent on the distance of the player to the girl.”
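The idea in the quote above can be sketched in a few lines: a single RTPC derived from the player's distance to the character, which then drives the tempo. The distance range and BPM bounds below are hypothetical values of mine, not the blog author's, but they show why rescaling the level would not break the system, since the mapping depends only on the measured distance.

```python
def distance_to_rtpc(distance, max_distance=50.0):
    """Normalise distance into a 0..1 RTPC value (0 = far, 1 = close)."""
    clamped = max(0.0, min(distance, max_distance))
    return 1.0 - clamped / max_distance

def rtpc_to_bpm(rtpc, base_bpm=60.0, max_bpm=120.0):
    """Map the RTPC linearly onto a tempo range, starting from 60 bpm."""
    return base_bpm + rtpc * (max_bpm - base_bpm)

# Far away: slow base tempo. Close to the girl: tempo at its maximum.
far_bpm = rtpc_to_bpm(distance_to_rtpc(50.0))
close_bpm = rtpc_to_bpm(distance_to_rtpc(0.0))
```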

A video is provided demonstrating the Wwise audio systems.

Industry Research - Audio Plugins


Found a procedural SFX generator that has been turned into a Unity plugin, usfxr. It was originally a standalone application, and it is completely free.

A GitHub repository is available with information on the developers and the background of the project's development stages.

“usfxr is a port of Thomas Vian's as3sfxr, which itself is an ActionScript 3 port of Tomas Pettersson's sfxr.”


Weather Report

Edited the scripts to respond to OnTriggerEnter and OnTriggerExit. I need to adjust the Pure Data patch objects to smooth the transition of parameter changes on trigger enter and exit, and I am also looking into integrating thunder into the patch. Issues include certain objects not being supported by the Heavy compiler ("vline~" and "switch~"). Although I have swapped out all of the "vline~" objects for "hv vine~" objects, there is no substitute for "switch~". Building a version without these objects is one option, although the sound quality would most likely be degraded significantly.
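For context on the "vline~" problem, that object essentially outputs sample-accurate linear ramps toward a target value over a given time, which is what makes parameter changes click-free. The same behaviour can in principle be rebuilt from simpler supported primitives. Here is a Python model of my own of that ramp (not Heavy or Pd code, just the maths), to make clear what any replacement object has to reproduce:

```python
def linear_ramp(start, target, ramp_ms, sample_rate=44100):
    """Per-sample values ramping linearly from start to target.

    Models what vline~ does for a single (target, time) pair: the
    parameter moves smoothly instead of jumping, avoiding zipper noise.
    """
    n = max(1, int(sample_rate * ramp_ms / 1000.0))
    step = (target - start) / n
    return [start + step * (i + 1) for i in range(n)]

# Ramp a gain from 0 to 1 over 50 ms at 44.1 kHz: 2205 intermediate steps.
gain_ramp = linear_ramp(0.0, 1.0, 50)
```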

In addition to thunder, I would like to try integrating insects and birds and attaching them to a time-of-day parameter. This is the same idea as the Unity FMOD integration demo project, except using procedural audio rather than samples, and running directly in Unity rather than switching between Unity and a middleware application.

RTPC-style control from player distance, with adjustment of the weather element parameters, is the next hurdle to overcome. I looked into measuring player distance and manipulating object properties through scripting, and found resources on audio proximity triggers and dynamic audio systems, but these all use Unity's native audio mixer. Since the Heavy-compiled patch uses the native Unity audio source anyway, I can route the signal through Unity's mixer and control parameters through scripting.
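The per-frame logic I have in mind can be sketched as follows. This is a Python illustration of the maths rather than the actual Unity C# script, and the parameter names and distance thresholds are hypothetical: map player distance to a weather-intensity value, then smooth it each frame so the compiled Pd patch never receives abrupt jumps.

```python
import math

def target_intensity(distance, near=5.0, far=30.0):
    """Full intensity inside `near`, fading linearly to zero at `far`."""
    if distance <= near:
        return 1.0
    if distance >= far:
        return 0.0
    return 1.0 - (distance - near) / (far - near)

def smooth_update(current, target, dt, time_constant=0.25):
    """Exponential smoothing toward `target`; call once per frame.

    `dt` is the frame time in seconds; a larger `time_constant`
    gives a slower, smoother parameter glide.
    """
    alpha = 1.0 - math.exp(-dt / time_constant)
    return current + alpha * (target - current)
```

In Unity the smoothed value would then be pushed into the mixer (or straight at the patch's exposed parameter) from an Update loop; the smoothing plays the same role as the ramping I am trying to recover in the Pd patch itself.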