January 2022

Returning my focus to the audio experience game I have been creating, I want to look at the specific sounds used across the experience. At the moment many of the sounds are placeholders cut from other games and experiences. I want to investigate what kinds of sounds create a neutral ambience, so that for initial testing the game is dictated by how it plays rather than how it sounds.

Above is a video sound experience that does quite the opposite, using tone to evoke a scene without visuals. However, listening to experiences like this will help me find which spaces work best spatially.

After I have found more appropriate sounds, I want to begin testing how menus and interfaces work best in a non-visual experience. Tactile interfaces may be the most appropriate option, but I want to avoid bespoke interfacing options for now so that testing can remain as accessible as possible with devices people already own.

First, I am thinking about the visual element and how to keep it as simple as possible. I want some form of indicator that you are interacting and that key presses are registering, while also creating a visual identity to go along with the game.

Using the keys themselves as a graphic works really well. They are currently the only tactile part of the game, so their presence isn't purely visual. The keys will turn green when pressed to help the user confirm they are interacting correctly, and a small six-key keyboard will be acquired as the game's interface for now. If the keys are numbered 1-6, with 1, 2 and 3 on the top row and 4, 5 and 6 on the bottom row, then keys 2, 4, 5 and 6 will control the movement of the character. Key 1 will trigger a voice announcement of the current level and play time, and key 3 will be held for 5 seconds to restart the game, with audio indicators playing during the 5-second hold. These controls will be explained during the tutorial segment.
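The control scheme above can be summarised as a simple lookup. A minimal Python sketch, separate from the actual Blueprint implementation; the action names and the exact direction assigned to each movement key are my assumptions, not confirmed choices:

```python
# Hypothetical mapping of the six-key pad to game actions.
# Keys 1-3 are the top row, keys 4-6 the bottom row.
# Direction assignments for keys 2, 4, 5 and 6 are illustrative guesses.
KEY_ACTIONS = {
    1: "announce_level_and_time",  # speak current level and play time
    2: "move_forward",
    3: "hold_to_restart",          # hold for 5 seconds to restart
    4: "move_left",
    5: "move_backward",
    6: "move_right",
}

def action_for_key(key: int) -> str:
    """Return the action bound to a physical key, or 'unbound'."""
    return KEY_ACTIONS.get(key, "unbound")
```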

After playing through the game, the spoken voice cues given to the player also need to be revised alongside the specific sounds. They will be changed as follows for the various actions:

Hello and welcome to here, an audio-only game; any visuals are only there to indicate the game is running. Use the R, D, F and G keys to move. The T key will tell you the level and current play time, and the E key will reset the playthrough. Your aim is to navigate through a series of maze levels using spatial sound, so headphones are required. Two sounds will now be played for you, please listen carefully… This sound indicates that you are near a wall… This sound indicates the end of the level, which you are aiming for. Try to move towards it now.

This sound indicates that you have found the end of the level. Level 1 is about to begin. Hold T to skip the tutorial next time, and good luck.

This next level contains obstacles which you must avoid; they are indicated by this sound… If you touch one, the level will restart and you will hear this sound… The game is about to continue.

Hello, you haven’t moved in a while, press F to restart the game and hear the tutorial.


Current Level: (TXT TO SPEECH), Play Time: (TXT TO SPEECH).
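The status announcement above is built from two values before being handed to text-to-speech. A rough sketch of how that line might be assembled; the function name and spoken format are illustrative, not taken from the project:

```python
def status_announcement(level: int, play_seconds: int) -> str:
    """Build the spoken status line from the level number and play time."""
    minutes, seconds = divmod(play_seconds, 60)
    return f"Current Level: {level}, Play Time: {minutes} minutes {seconds} seconds."

print(status_announcement(3, 125))
# "Current Level: 3, Play Time: 2 minutes 5 seconds."
```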

The following nodes within the game enable the overlay I described above. They display a black screen with six greyed-out keys on top. This lets the player see which keys are being pressed, providing just enough visual feedback to assure them the game is running, while a W toggle shows or hides the overlay entirely. The toggle is useful for me when I want to check the game is working properly.

This first set of nodes sits within the widget (the overlay HUD of the game's blueprint). It establishes a series of custom events, one per key, which can be called from anywhere in the project, and wires each event to the visibility of the corresponding green key overlay.
This set of nodes sits within the player pawn blueprint, which lets the overlay work across all levels of the game as a global blueprint. The top section runs on play: it sets the audio listener (which hears and then outputs the spatial audio) to the yellow character rather than the camera, then creates the overlay, which would otherwise not be loaded, and makes it visible. The second sequence flip-flops the overlay between visible and invisible when W is pressed, which lets me debug and fix development issues.
The third set of nodes is also within the player pawn blueprint. Two examples are shown, with four very similar ones for the E, R, D and F keys. They activate on key press and send a message to the widget to change that key's visibility: activated on press, deactivated on release.
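The overlay logic described above boils down to a small piece of state: a highlight flag per key driven by press/release events, plus a flip-flop toggle on W. A Python sketch of the equivalent logic, purely illustrative of the Blueprint behaviour; class, key and method names are mine:

```python
class KeyOverlay:
    """Tracks which key highlights are lit and whether the overlay is shown."""

    def __init__(self, keys=("T", "E", "R", "D", "F", "G")):
        self.pressed = {k: False for k in keys}  # green highlight per key
        self.visible = True                      # overlay is shown on play

    def on_key_down(self, key):
        if key == "W":
            # flip-flop: toggle the whole overlay for debugging
            self.visible = not self.visible
        elif key in self.pressed:
            self.pressed[key] = True   # light the green key graphic

    def on_key_up(self, key):
        if key in self.pressed:
            self.pressed[key] = False  # grey the key out again on release
```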

The next series of nodes all add button functions, either for the debugging and development process or for the game itself.

This series creates a counter when the T key is held during a tutorial section. When the key has been held for 7 seconds in total, the tutorial is skipped and a print string explains this. Once the game is fully functional in terms of the code itself, I will turn this and the other print strings into audio cues.
This node is developer-only and allows me to skip through the levels quickly to fix issues and make sure play flows smoothly. It has the same effect as completing a level, jumping straight to the next one with a print string alongside.
This node handles a T press during a non-tutorial level. It first checks that the current level is a map level and, if so, prints the level you are on, which will later become an audio cue.
The final button-based node for now works on an E press. It is similar to the tutorial-skip node but works on any level: when E is held for 7 seconds, the whole game resets back to the start level and the level-count string is reset so the levels function properly.
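Both hold-to-activate buttons (T to skip the tutorial, E to reset the run) share the same counter pattern: accumulate held time each tick, reset on release, fire once at the threshold. A Python sketch of that logic under those assumptions; the class and callback names are mine, and the tick-based timing is simplified compared to the Blueprint version:

```python
class HoldButton:
    """Fires a callback once a key has been held for `threshold` seconds."""

    def __init__(self, threshold=7.0, on_complete=None):
        self.threshold = threshold
        self.on_complete = on_complete
        self.held_for = 0.0
        self.fired = False

    def tick(self, delta_seconds, is_held):
        if not is_held:
            # releasing the key resets the counter, allowing a fresh hold
            self.held_for = 0.0
            self.fired = False
            return
        self.held_for += delta_seconds
        if self.held_for >= self.threshold and not self.fired:
            self.fired = True          # fire only once per continuous hold
            if self.on_complete:
                self.on_complete()
```

The same class could back the tutorial-skip (T) and full reset (E) behaviours by passing different callbacks.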