Creating Sci-fi User Interface Sound Effects: the Creative Process

Hi! My name is Antoine and I take care of sound design and interactive music on GalaCollider. I live in Belgium, birthplace of the Big Bang theory (no kidding), and I’ve always been fascinated by outer space.

In this post I’ll tell you about the different aspects of my creative process for the user interface (UI) sound design, which has been my main focus so far on the game.

Before working on specific sounds, I thought about the general sonic aesthetics I wanted to achieve. My goal is that GalaCollider’s UI sounds like a slick and advanced high-tech device. It should sound smooth, pristine and satisfying to use. Typically, UI sounds have to support visual aesthetics and give informative feedback to players. In a strategy game like GalaCollider, the UI is really the interface between players and the core gameplay. So what we hear when clicking a button should tie in with what that action means within the game’s wider strategic framework.

For every sound effect, I start by playing the game and paying close attention to visual elements. At that point, there is often a gut feeling of how the effect should sound. I then ask myself the following questions:

  • What do we want players to feel?
  • What is the meaning of the action in relation to gameplay?
  • Is it related to other sound effects in the game? (for example, all sounds related to a specific resource)
  • What information should the sound carry?
  • Is it part of a sequence of actions, meaning the effect will need to work well with other sounds, whether in sequence or simultaneously?
  • Do I need to create a single sound or several variations?

Based on this I start creating different layers which I synchronize to video in my digital audio workstation. Here is a picture of the “draw a card” sound effect, which is made of seven layers.

As a starting point for UI sounds I often use a nice little synth called Galactic Assistant by SoundMorph. I also use a bunch of other software synths, a Moog Sub 37, and some sample-based instruments.

Once I have a good basis for a layer, I start processing it with different effects. Often the processed sound is very different from what I recorded initially. In most cases I know in advance that recording a certain source and applying specific effects will get me the result I need. But sometimes I just experiment with different effects and see how they work together with the game. As in most creative fields, you get happy accidents, which is always nice!

Here are some examples of sounds from the game.

Drawing a card (as shown in the above image):

Opening Tech Research:

Confirming Resource transfer:

To implement sound and music in GalaCollider, I use DarkTonic’s MasterAudio plugin (shown in the picture below) in Unity. Once sounds are imported into MasterAudio, they can be triggered in C# from one of the game’s scripts. So I dive into the code and find where to trigger the sound. Implementation can vary from a single line in the right place to more elaborate logic. Working mostly in C# has been great because it allows me to learn how the game really works under the hood. It helps me stay self-sufficient and takes some weight off the programmers’ workload.
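To give a rough idea, here is a minimal sketch of what triggering a sound through Master Audio can look like in a Unity script. The class name, method hook, and the sound group name “DrawCard” are hypothetical examples, not GalaCollider’s actual code:

```csharp
using UnityEngine;
using DarkTonic.MasterAudio; // Master Audio's namespace

// Hypothetical example: play a UI sound when a card is drawn.
public class CardDrawSound : MonoBehaviour
{
    // Imagined hook point, called by the game's card-draw logic.
    public void OnCardDrawn()
    {
        // "DrawCard" would need to match a Sound Group
        // configured in the Master Audio inspector.
        MasterAudio.PlaySound("DrawCard");
    }
}
```

In the simplest cases the implementation really is that one `PlaySound` call in the right place; the trickier part is finding where in the game’s scripts that place is.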

Then it’s time for the first verdict. I test the sound in the game and see how I respond to it as a player. Does it really work with the visuals? What does it make me feel? Does it sound good in context with the music? I can then go back to my audio session, adjust the sound, and repeat the process until I have a first version that works. A few parameters can also be adjusted directly in MasterAudio. Over time, I also revisit the sound with a fresh set of ears and make revisions based on further impressions and feedback from the team.

If you have any questions about creating UI sounds or working with MasterAudio, don’t hesitate to get in touch on Twitter (@Antoine_VL) or through my website, where you can also find samples of my work.