Hybrid Interactive Music
Part 1 - HIM
The term “Hybrid Interactive Music” is used in the talk to describe this approach to game composition.
The talk covers how game music lost its interactive characteristics when the CD came into the industry, and how things have now come back around with tools like Wwise, which let composers use synthesis and MIDI controllers again. Vertical and horizontal layering in Wwise gives composers more creativity and flexibility when composing (see the sketch below).
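As a rough illustration of those two layering approaches, here is a minimal sketch using the Wwise Unity API: an RTPC crossfades vertical stems, while a State drives horizontal section changes. The names "MusicIntensity", "MusicSection", and "Combat" are hypothetical stand-ins for whatever is authored in the Wwise project.

```csharp
using UnityEngine;

// Sketch of the two layering approaches via the Wwise Unity API.
// The RTPC, State group, and State names below are hypothetical.
public class MusicLayerController : MonoBehaviour
{
    [Range(0f, 100f)] public float intensity; // driven by gameplay

    private void Update()
    {
        // Vertical layering: one RTPC crossfades stacked music stems.
        AkSoundEngine.SetRTPCValue("MusicIntensity", intensity, gameObject);
    }

    public void EnterCombat()
    {
        // Horizontal re-sequencing: a State change moves the music to a
        // new section at the next transition point authored in Wwise.
        AkSoundEngine.SetState("MusicSection", "Combat");
    }
}
```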
The talk then covers how they approached the implementation of music in a VR experience, the challenges they faced, and how they overcame them. The final results combined a number of traditional interactive game audio design approaches.
Part 2 - Technical Demonstrations
Part two goes into great detail explaining the techniques and tools used in Wwise to create an intense interactive game audio environment, such as using the Wwise Meter effect to swell and pulse sounds based on player progression through the level.
It also details music production techniques such as side-chaining and synth manipulation using RTPCs, along with tempo and reverb changes (sketched below).
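A minimal sketch of the game-side half of that setup: a single progression RTPC is sent to Wwise, where it can be mapped to side-chain amount, synth parameters, tempo, and reverb send. The "LevelProgression" name and the checkpoint-based progression measure are hypothetical.

```csharp
using UnityEngine;

// Sketch: one progression RTPC drives the authored responses (side-chain
// amount, synth cutoff, tempo, reverb send) inside the Wwise project.
// "LevelProgression" and the checkpoint counting are hypothetical.
public class ProgressionRtpc : MonoBehaviour
{
    public int checkpointsReached;
    public int totalCheckpoints = 10;

    private void Update()
    {
        // Normalise to the 0-100 range the RTPC is assumed to use.
        float progression = 100f * checkpointsReached / totalCheckpoints;
        AkSoundEngine.SetRTPCValue("LevelProgression", progression, gameObject);
    }
}
```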
A video demonstrating the Wwise audio systems is provided.
Industry Research - Audio Plugins
Found a procedural SFX generator that has been turned into a Unity plugin: USFXR. It was originally a standalone application, and it is completely free.
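A minimal usage sketch, based on the SfxrSynth API shown in the plugin's examples (exact method names may differ between versions):

```csharp
using UnityEngine;

// Sketch: generate and play a randomised explosion preset with USFXR.
public class UsfxrExample : MonoBehaviour
{
    private SfxrSynth synth;

    private void Start()
    {
        synth = new SfxrSynth();
        synth.parameters.GenerateExplosion(); // one of the built-in presets
    }

    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space)) synth.Play();
    }
}
```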
A GitHub repository is available with information on the developers and the background of the project's development stages.
Edited the script to respond to trigger enter and exit. The Pure Data patch objects need adjusting to smooth the transition of parameter changes on trigger enter and exit; a scripted ramp on the Unity side is sketched below as a fallback. I am also looking into integrating thunder into the patch. One issue is that certain objects are not supported by the Heavy compiler ("vline~" and "switch~"). Although I have swapped out all of the "vline~" objects for "hv.vline~", there is no substitution for "switch~". Attempting a version without these objects is one option, although the sound quality would most likely be degraded significantly.
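In the meantime, the smoothing can be approximated from the Unity side rather than inside the patch. A minimal sketch, assuming the Heavy-generated wrapper component exposes a SetFloatParameter method; "Hv_Weather_AudioLib" and the "Rain_Intensity" parameter are hypothetical names that depend on the patch and Heavy version:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: ramps a patch parameter over time when the player enters or
// leaves a trigger zone, instead of jumping it instantly.
public class WeatherZoneTrigger : MonoBehaviour
{
    public Hv_Weather_AudioLib patch; // hypothetical Heavy-generated wrapper
    public float rampSeconds = 2f;    // fade time for the transition

    private float current;            // last value sent to the patch
    private Coroutine ramp;

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) StartRamp(1f);
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) StartRamp(0f);
    }

    private void StartRamp(float target)
    {
        if (ramp != null) StopCoroutine(ramp);
        ramp = StartCoroutine(RampTo(target));
    }

    private IEnumerator RampTo(float target)
    {
        while (!Mathf.Approximately(current, target))
        {
            current = Mathf.MoveTowards(current, target, Time.deltaTime / rampSeconds);
            patch.SetFloatParameter(Hv_Weather_AudioLib.Parameter.Rain_Intensity, current);
            yield return null; // step once per frame until the target is reached
        }
    }
}
```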
In addition to thunder, I would like to try to integrate insects and birds and attach these to a time-of-day parameter (see the sketch below). This is the same idea as the Unity FMOD integration demo project, except using procedural audio rather than samples, and going straight into Unity rather than switching between middleware applications.
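A minimal sketch of that time mapping, reusing the hypothetical Heavy wrapper from above; the "Insects" and "Birds" parameters and the curve shapes are assumptions:

```csharp
using UnityEngine;

// Sketch: map a 0-24 time-of-day value to insect/bird activity through
// inspector-assigned curves (e.g. insects peak at night, birds at dawn).
public class AmbienceClock : MonoBehaviour
{
    public Hv_Weather_AudioLib patch;              // hypothetical Heavy wrapper
    [Range(0f, 24f)] public float timeOfDay = 12f;
    public AnimationCurve insectActivity;
    public AnimationCurve birdActivity;

    private void Update()
    {
        float t = timeOfDay / 24f; // normalise for the curves
        patch.SetFloatParameter(Hv_Weather_AudioLib.Parameter.Insects, insectActivity.Evaluate(t));
        patch.SetFloatParameter(Hv_Weather_AudioLib.Parameter.Birds, birdActivity.Evaluate(t));
    }
}
```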
Adjusting weather element parameters from player distance, RTPC-style, is the next hurdle to overcome. I have looked into measuring player distance and manipulating object properties through scripting, and found resources on audio proximity triggers and dynamic audio systems, but these all use Unity's native audio mixer. As the Heavy-compiled patch plays through a native Unity audio source anyway, I can route the signal through Unity's mixer and control parameters through scripting (sketched below).
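A minimal sketch of that routing, assuming the Heavy-compiled audio source is assigned to a mixer group with an exposed volume parameter; "WeatherVolume" and the distance values are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.Audio;

// Sketch: drive an exposed AudioMixer parameter from player distance,
// since the Heavy patch plays through a normal Unity AudioSource.
public class DistanceMixerControl : MonoBehaviour
{
    public AudioMixer mixer;          // mixer the Heavy-compiled source routes into
    public Transform player;
    public Transform emitter;         // e.g. the storm/ambience source object
    public float nearDistance = 5f;   // full volume at or inside this range
    public float farDistance = 40f;   // silent at or beyond this range

    private void Update()
    {
        float d = Vector3.Distance(player.position, emitter.position);
        float t = Mathf.InverseLerp(farDistance, nearDistance, d); // 0 far, 1 near
        // Exposed volume parameters are in decibels, so map 0-1 to a dB range.
        mixer.SetFloat("WeatherVolume", Mathf.Lerp(-80f, 0f, t));
    }
}
```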