ABSTRACT
MuX is a modular synthesizer audio-visual environment for Virtual Reality (VR). In this paper, after describing MuX and its components, we present new elements developed for the environment, focusing on lumped and distributed physically-inspired models for sound synthesis. A simple interface was developed to control the physical models with gestures, expanding the interaction possibilities within MuX. A preliminary evaluation of MuX shows that as the number and complexity of the components increase, it becomes important to provide users with ready-made machines instead of expecting them to build everything from scratch.
1 Introduction
The rapid development of low-cost Virtual Reality (VR) head-mounted displays has expanded the possibilities for new ways of creating interactive sonic experiences. In a VR experience, sound can direct the users' attention, enhance the sense of presence and create time-varying sonic experiences by monitoring the location of the user. Virtual Reality Musical Instruments (VRMIs) are a new category of digital instruments that can be played in a virtual space. These instruments afford new playing techniques and ways of interacting, since they can be played with different kinds of controllers and mappings. Several suggestions and guidelines for creating VRMIs are described in , and .
One type of VRMI that recently appeared on the market is MuX, a modular synthesis and programming environment in which the user builds instruments in the VR space. The user is immersed in a vast abstract space and can explore, tweak and alter the sound processing structures floating around them. Other popular digital synthesis environments such as Pure Data, Max/MSP and VCVRack allow for advanced patching and sonic exploration, but the user interacts merely with a keyboard and a mouse unless an alternative controller is used. In MuX, one can build similar sound processing algorithms, but the experience is completely different because of the immersive nature of VR: patches take the form of "machines" distributed throughout the 3D space, which the user can navigate and interact with. Herein lies the novelty of MuX, since the software can be seen as an immersive audio programming environment where both the interaction and the synthesis can be designed from scratch. MuX is currently available for purchase on the Steam VR platform with basic DSP components such as classical waveforms, a one-pole filter, and a one-sample delay.
This paper expands MuX's sound processing components with physics-based models. The new components include strings, plates and bars implemented as alternative sound generators , . Furthermore, a nonlinear virtual analog (VA) model of the voltage-controlled Moog filter and the Serge Middle Wave Multiplier were implemented in the environment to modify and alter the sound , . The models were prototyped in MATLAB and in the open-source C++ frameworks JUCE and VCVRack for experimenting with the algorithms. Afterwards, the models were ported to MuX and Unity using Decochon's C++ audio framework. The structure of the paper is as follows. Section 2 provides background on existing MuX blocks and on creating simple music machines. Section 3 is devoted to the implementation of the new components. We then conclude the paper, indicating further work, in Section 6.
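To give a flavor of the distributed models mentioned above, the following is a minimal explicit finite-difference sketch of a lossy ideal string with fixed ends, in the spirit of the schemes in Bilbao's work. It is written in Python for illustration (MuX's components are implemented in C++), and all parameter names, the damping form and the excitation are our own simplifications, not MuX internals.

```python
import numpy as np

def string_step(u, u_prev, lam2, damp):
    """One explicit finite-difference update of an ideal string with
    simple frequency-independent damping; end points stay fixed at 0."""
    u_next = np.zeros_like(u)
    u_next[1:-1] = ((2.0 - damp) * u[1:-1] - (1.0 - damp) * u_prev[1:-1]
                    + lam2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
    return u_next

# Example: a 100 Hz unit-length string sampled at 44.1 kHz.
sr, N = 44100.0, 50                # sample rate, number of grid intervals
k, h = 1.0 / sr, 1.0 / N           # time step and grid spacing
c = 200.0                          # wave speed -> fundamental c/2 = 100 Hz
lam = c * k / h                    # Courant number, must satisfy lam <= 1
assert lam <= 1.0                  # CFL stability condition

u = np.zeros(N + 1)
u_prev = np.zeros(N + 1)
u[N // 5] = 1.0                    # crude "pluck" near one end,
u_prev[N // 5] = 1.0               # released with zero initial velocity

out = []
for _ in range(200):
    u_next = string_step(u, u_prev, lam * lam, damp=0.001)
    out.append(float(u_next[N // 2]))  # read output at the middle
    u_prev, u = u, u_next
```

In a MuX-style machine this update loop would run per audio sample inside a stream component, with the excitation point and damping exposed as parameters.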
Fig. 1: MuX blocks of different types. Only the cycle logic is shown as an event component.
In MuX it is possible to create "machines", representing instruments or sound processing structures, from simple building blocks. By combining components into a signal processing chain, users can form their own creations in real time. The blocks fall into three categories (see Fig. 1):
• Stream components (components that deal with sound),
• Event components (components that deal with triggering events),
• I/O components (components for altering parameter values with motion).
The stream components consist of ordinary signal processing blocks such as oscillators, envelopes and filters. Basic arithmetic blocks such as multiply, add and subtract fall under this category as well. These components are linked together by placing the output of one component into the input of the next. To obtain sound output, there is a speaker component that feeds the output of the sound chain to the digital-to-analog converter (DAC). Figure 2 depicts a simple machine that adds two oscillators, multiplies the add block's output by 0.5 and feeds it to the DAC.
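The signal flow of such a machine can be sketched in plain Python as follows. This is only an analogy for the patch in Figure 2 (MuX itself runs these blocks as connected C++ components); the function names and the choice of sine oscillators are ours.

```python
import math

def oscillator(freq, sr, n):
    """Sine oscillator block: n samples at the given frequency."""
    return [math.sin(2.0 * math.pi * freq * i / sr) for i in range(n)]

sr, n = 44100, 64
osc1 = oscillator(440.0, sr, n)
osc2 = oscillator(660.0, sr, n)
added = [a + b for a, b in zip(osc1, osc2)]   # add block
scaled = [0.5 * s for s in added]             # multiply block (gain 0.5)
# 'scaled' is what the speaker component would hand to the DAC
```

The 0.5 gain keeps the summed signal inside [-1, 1], which is why the multiply block follows the add block in the machine.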
The event components can trigger and alter the values of the stream components. The event components are connected through small red wires that transmit the events or information they carry. By linking these components together, one can create sequences and trigger-based variations, as well as if-statements and for-loops. The I/O components consist of buttons, knobs and sliders that require an input action from the user. When a slider component is moved, the new value is sent to the attached stream component. Other interesting components in the MuX environment include marble generators and actuators that can trigger components and move them around in space.
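As a rough analogy for the event side, the sketch below mimics a cycle component driving a row of step outputs, where only the steps with a wire attached fire an event. This is plain Python pseudologic, not MuX's actual event system; the function and its arguments are hypothetical.

```python
def run_sequencer(steps, connections, cycles=1):
    """Emit a trigger event for every step that has a wire connected,
    mimicking a cycle component driving a row of step outputs."""
    events = []
    for c in range(cycles):
        for step in range(steps):
            if step in connections:
                events.append((c, step))   # record (cycle, step) that fired
    return events

# An 8-step pattern with wires on steps 0, 3 and 6, run for two cycles
events = run_sequencer(8, {0, 3, 6}, cycles=2)
```

Each emitted event would, in MuX, travel along a red wire to trigger a sound or alter a stream component's value.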
Fig. 2: Simple additive synthesis machine with two oscillators, an add, a multiply and a speaker (DAC).
3 Interaction with physical models
To interact with the physical models in a more intuitive way, a drum pad interface was created. This serves as the first prototype for motion-based interaction in MuX. The user can play the drum pad by using the VR controllers to hit the virtual object, currently represented by a grey square. When the drum pad is hit, events containing the velocity and strike position are triggered and can update the parameters of a physics-based model. Figure 6 shows the drum pad interface being excited by marbles, which can also be used as input. The figure also shows the event outputs, located underneath the drum pad, that send this information.
The position is normalized to the range [0, 1] and mapped internally to a fixed grid point in the physical model component. The velocity, on the other hand, is output directly, and the user needs to define its mapping. The goal of developing the drum pad interface is to extend it to other types of interaction, investigating the possibilities for VRMIs described in . Furthermore, there are several ways to improve upon this design: expanding the drum pad with haptic feedback and visualizing the user's motion are possible extensions. Developing new interfaces and using "magical" interactions that would not be achievable outside of VR are of interest in MuX. Designing and developing the physical model and the interface simultaneously would allow for interesting customization. However, these implementation and design considerations are left as further development.
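The position-to-grid mapping described above can be sketched as follows. The grid size, the rounding choice and the clamping away from the fixed ends are our assumptions for illustration, not MuX internals.

```python
def position_to_grid(pos, grid_points):
    """Map a normalized strike position in [0, 1] to the nearest
    interior grid index of a 1-D physical model."""
    pos = min(max(pos, 0.0), 1.0)              # clamp to [0, 1]
    idx = round(pos * (grid_points - 1))
    return min(max(idx, 1), grid_points - 2)   # keep off the fixed ends

# Examples on a hypothetical 50-point grid
i_mid = position_to_grid(0.5, 50)    # a point near the middle
i_edge = position_to_grid(0.0, 50)   # clamped to the first interior point
```

Clamping to interior points matters because striking exactly at a fixed boundary would inject no energy into the model.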
Fig. 6: Prototype drum pad being excited with a rotating marble generator.
Fig. 7: Simple machine with two drum pads controlling two plates with wave-folding.
Figure 7 shows a simple performance setup with two drum pads controlling two plates. Each plate is connected to three Serge VCMs and a simple reverb. The velocity is mapped to the input gain of the Serge VCMs, and the position is used to control pitch, impact position and reverb parameters.
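The velocity-to-gain mapping in this patch can be sketched with a crude memoryless wave folder. Note that this toy folder only stands in for the actual VA model of the Serge VCM used in MuX; the gain curve and function names are our own assumptions.

```python
import math

def fold(x):
    """Crude memoryless wave folder: reflect the signal back about +/-1
    whenever it leaves [-1, 1], a stand-in for the VCM's folding stages."""
    while abs(x) > 1.0:
        x = math.copysign(2.0, x) - x    # reflect about the nearest bound
    return x

def drum_hit_to_folder(velocity, sample):
    """Map strike velocity (0..1) to the folder's input gain,
    as in the Fig. 7 patch: harder hits drive more folds."""
    gain = 1.0 + 4.0 * velocity
    return fold(gain * sample)

soft = drum_hit_to_folder(0.0, 0.9)   # below the fold threshold
hard = drum_hit_to_folder(1.0, 0.9)   # driven well past it
```

Because the folder is driven harder with velocity, louder strikes add upper harmonics rather than merely raising the level, which is what makes this mapping musically useful.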
This section describes a preliminary evaluation based on our experience from working with MuX. The evaluation is formed by informal user testing and exploration of the MuX environment; its goal was to obtain observations that can assist the development of a formal test. The informal testers can roughly be divided into three groups: musicians with no DSP experience, engineers with DSP and musical experience, and users with no musical experience. None had prior experience with MuX.
It was observed that most new users enjoyed the experience of playing in MuX; however, understanding how to build machines from scratch is difficult. One strategy used to introduce new users was to provide a pre-made machine and show how to alter values and sequence beats. With the sounds and the sequencer logic prepared in advance, a new user only needs to connect wires to trigger the sounds at certain steps. These actions are nontrivial without guidance, but after repeating the motion a couple of times, users could interact without external help. For making simple tweaks and connections, MuX worked well for all of the groups, since understanding the controls and interacting with machines is easy to learn for most, even for people with no previous VR experience. Here, MuX provides a platform for quick experimentation and performance with a ready-made machine.
Creating machines from scratch is a more demanding task in terms of both time and understanding the logic behind MuX. Experience with DSP and programming eases the learning process; even so, the many different component types take time to learn.
6 Conclusion and further development
MuX has the potential to be used for performances, learning and artistic VR experiences. Furthermore, it is an excellent tool for sonic exploration and prototyping. In this paper we have outlined how we extended the MuX library with lumped and distributed components, and how these extensions shaped the learning experience of MuX in informal settings. We have observed that as the number and complexity of the components increase, it becomes important to onboard users with ready-made (but sometimes incomplete) machines instead of having them build everything from scratch.
There are still many open areas to explore, such as usability and interaction models for audio programming in VR. Further research might go in the direction of extending the physical models and allowing for connections between models, such as described in . Panning and spatial audio are a future direction for the project as well: using location, distance and movement to alter the perceived sound opens up new directions for use and interaction inside MuX. As of now, MuX is still under development; a beta version can be bought through the Steam VR store.
@AES Conference on Immersive and Interactive Audio, York, UK, 2019 March 27 – 29
 Serafin, S., Geronazzo, M., Erkut, C., Nilsson, N. C., and Nordahl, R., “Sonic Interactions in Virtual Reality: State of the Art, Current Challenges, and Future Directions,” IEEE Computer Graphics and Applications, 38(2), pp. 31–43, 2018, doi:10.1109/MCG.2018.193142628.
 Serafin, S., Erkut, C., Kojs, J., Nordahl, R., and Nilsson, N. C., “Virtual Reality Musical Instruments: Guidelines for Multisensory Interaction Design,” in Proceedings of the Audio Mostly 2016, AM ’16, pp. 266–271, ACM, Norrköping, Sweden, 2016, doi:10.1145/2986416.2986431.
 Serafin, S., Erkut, C., Kojs, J., Nilsson, N. C., and Nordahl, R., “Virtual Reality Musical Instruments: State of the Art, Design Principles, and Future Directions,” Computer Music Journal, 40(3), pp. 22–40, 2016, doi:10.1162/COMJ_a_00372.
 Bilbao, S., “A Modular Percussion Synthesis Environment,” in Proceedings of the 12th International Conference on Digital Audio Effects (DAFx-09), pp. 456–466, 2009.
 Bilbao, S., Numerical Sound Synthesis: Finite Difference Schemes and Simulation in Musical Acoustics, John Wiley and Sons, Ltd, 2009.
 Paschou, E., Esqueda, F., Välimäki, V., and Mourjopoulos, J., “Modeling and Measuring a Moog Voltage-Controlled Filter,” in Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), pp. 1641–1647, IEEE, 2017, doi:10.1109/APSIPA.2017.8282295.
 Esqueda, F., Pöntynen, H., Parker, J., and Bilbao, S., “Virtual Analog Models of the Lockhart and Serge Wavefolders,” Applied Sciences, 7, p. 1328, 2017.