Since 2008, I’ve been passionate about exploring the intersections of music composition, sound design and game design. Studying Composition & Sound Design for Adaptive Systems at the HKU University of the Arts brought me into contact with a lot of game designers.
Back in 2011, I was part of an interdisciplinary team that designed the multi-player music installation Thresholder. On a 10×10 m floor-projected network of lines, players could engage in a mixture of social improvisation, individual expressivity, physical movement and architectural spatiality. Now, in 2019, I’m trying to find ways of reviving this project. Simultaneously, I’m thinking up new ideas for music games, and I’d like to share some of my thoughts about this process. One of my key questions about music game innovation is: could improvisation be the key?
I penned a basic overview of:
1. non-linear (improvisation-related) game design
2. music game design
3. musical improvisation.
From those rough analyses, I proposed a set of ideas for (improvisation-based) music games – listed at the end of this blog post.
Skills or Tactics
In game design, non-linearity can be employed in various (intertwined) ways – each with various levels of complexity. Even a game as basic as Pac-Man (1980) finds a lot of its strength in the fact that players can roam through the maze in directions of their choice, triggering the enemy ghosts to find various paths as well. This game mechanic gives players an incentive to keep replaying the game, motivated not only to improve their hand-eye coordination, but also to come up with new, more clever tactics.
Of course, various kinds of games combine agility with tactics, e.g. sports games and first person shooters. Yet, few games revolve specifically around emergent gameplay: the concept of giving players the ability to discover, develop and improvise tactics that were unforeseen by the designers of the game.
Emergent gameplay results from game design that lets objects and environments interact with each other through highly variable / complex properties. This is usually achieved through implementations of physics and chemistry (more accurately: playful interpretations thereof). Think of gravity, viscosity, fragility, flexibility, reactivity to water, fire, wind, and so forth.
Examples of such games are: puzzle games The Incredible Machine (1993), Scribblenauts (2009) and Infinifactory (2015), role-playing action game Deus Ex (2000) and action-adventure Zelda – Breath of the Wild (2017).
In 2013, thirteen years after he designed Deus Ex, Warren Spector said:
“Plans must be devised by the player; it’s not about how clever you are [as a game designer]. (…) Games that don’t offer quote-unquote ‘real choices’ might be just fine,” he suggests. “Some people may want a game that’s all about squeezing a virtual trigger, or moving forward like a shark, or solving a puzzle that shows more about how clever the designer is, rather than how clever they are. I’m just more interested in emergence than in scripted adventures… and I believe once players get a taste of that kind of game, it’s very hard for them to go back”(via Gamasutra)
Games like Garry’s Mod (2004) and Minecraft (2011) don’t even rely on specific problems designed to be solved by the players. Instead, their sandbox design stimulates creativity and artistry by allowing players to express themselves (socially) through inventive, improvisatory play with objects, physics, camera perspectives and NPCs (non-playable characters) – a sort of on-the-fly level-editing experience.
Randomisation offers other forms of non-linearity. It can add an element of surprise to a game, thereby increasing its replay value. Algorithms (sets of instructions for the computer to follow) are called procedural when they are employed to generate endlessly evolving patterns. Some games tap into a random moment of the procedural pattern’s evolution and use it to generate, for example, random landscapes.
Procedural level generation was first used in the Apple II game Beneath Apple Manor (1978), which also happened to be the first commercially available role-playing game (RPG). Cutting-edge games like Horizon Zero Dawn (2017) for the PlayStation 4 use procedural placement to organically simulate nature.
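The core trick behind most procedural generation is surprisingly small: feed a pseudo-random generator a seed, and the same seed reproduces the same world every time. Here’s a minimal sketch in Python – the random-walk “landscape” and its parameters are invented for illustration, not taken from any of the games above:

```python
import random

def generate_heightmap(seed, length=16, step=2):
    """Generate a reproducible 1D terrain profile via a seeded random walk.

    The same seed always yields the same landscape, which is how many
    procedural games can recreate an entire world from a single number.
    """
    rng = random.Random(seed)               # independent generator; the seed IS the world
    height, profile = 0, []
    for _ in range(length):
        height += rng.randint(-step, step)  # wander up or down a little each step
        profile.append(height)
    return profile

# The same seed reproduces the same terrain; a different seed gives a new one.
assert generate_heightmap(42) == generate_heightmap(42)
```

Real games layer smoothing and noise functions on top of this, but the deterministic-seed principle is the same.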
Open world games like The Legend of Zelda (1986) and Red Dead Redemption 2 (2018) allow players to wander around freely and tackle quests in various (non-linear) orders. Yet there isn’t a clear consensus on the definition of an open world. I believe this is because there’s a spectrum: non-linearity and linearity can be combined in various ways and degrees:
▫️ Non-linear storytelling: the amount of variation a game offers in terms of completing challenges in different orders, and how much effect this has on changes in the story: in how many directions can the story branch out?
▫️ Bonus play: the amount of extra variation a game offers through optional gameplay, for example by rewarding players with extra points for finding hidden objects or executing moves that are more difficult to perform.
▫️ Procedural generation: which elements are randomised and how random does the outcome feel?
▫️ Sandbox elements: in what ways are the players able to leave their artistic mark?
Adaptive Music
In many contemporary high-end games, the music seamlessly adapts itself to the player’s current situation, in terms of: health, location, activities within that location (relaxed vs threatening), room acoustics (small toilet vs big church), etc. This kind of adaptive music can work in (at least) three ways:
A. Horizontally: by chopping a composition into loops that can be re-sequenced.
B. Vertically: by adding or removing certain instruments from the aforementioned loops, also known as re-orchestration.
C. Modulation: by adding direct flexibility to musical parameters (timbre, intensity, speed, etc.) of certain instruments.
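The vertical (re-orchestration) approach can be sketched in a few lines: given the player’s current state, decide which instrument loops are audible. The layer names, state variables and thresholds below are all invented for illustration – a real game would drive an audio middleware mixer with values like these:

```python
def orchestrate(threat, health):
    """Pick which instrument loops play for the current game state
    (vertical re-orchestration). Both inputs range from 0.0 to 1.0;
    layer names and thresholds are hypothetical.
    """
    layers = ["ambient_pad"]             # base layer: always present
    if threat > 0.3:
        layers.append("percussion")      # rising tension adds rhythm
    if threat > 0.7:
        layers.append("brass_stabs")     # real danger adds intensity
    if health < 0.25:
        layers.append("heartbeat")       # low health adds urgency
    return layers

# Exploring calmly vs. fighting at low health:
assert orchestrate(threat=0.1, health=1.0) == ["ambient_pad"]
assert "heartbeat" in orchestrate(threat=0.8, health=0.2)
```

Because every layer loops over the same underlying composition, layers can fade in and out at any moment without breaking the music.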
The first game to use a combined horizontal-vertical adaptive music system with smooth transitions was the point-and-click adventure Monkey Island 2 (1991).
The sound design of Tetris Effect (2018, with VR support) pairs every single action (movement, rotation, etc.) with a corresponding sound effect, thereby blurring the lines between sound effects and music. GlitchHiker (2011), a game I co-created at the Global Game Jam, also used interactive music in such a way (instant glitching) that additional sound effects weren’t necessary.
Musical Improvisation as a Byproduct
Otocky (1987) was a very musical game for the Nintendo Famicom (NES) game console. In this side-scrolling shoot-’em-up, the player was able to shoot in 8 directions, each of which represented one of the 7 notes of a (western music) minor/major scale, plus the next octave’s root note (e.g. DO-re-mi-fa-sol-la-si-DO).
This meant that you would simultaneously shoot goals and improvise melodies on top of the background music. Your melodies would automatically be quantised: the timing of each note would be slightly nudged to align it with the rhythmic grid of the background music.
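Quantisation itself is a one-line operation: divide the note’s timestamp by the grid size, round to the nearest whole step, and multiply back. A minimal sketch (the grid size and beat values are illustrative, not Otocky’s actual ones):

```python
def quantize(note_time, grid=0.25):
    """Snap a note's timestamp (in beats) to the nearest point on the
    rhythmic grid, like Otocky does with the player's shots.
    """
    return round(note_time / grid) * grid

# A note played at 1.13 beats is nudged onto the sixteenth-note grid at 1.25.
assert quantize(1.13) == 1.25
```

Because the nudge is at most half a grid step, the melody always lands in time with the background music while still reflecting when the player actually fired.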
Procedural Music and A.I.
Portal 2 (2011) took it a step further by using real-time procedurally generated music: every action (such as shooting) instantaneously generated a unique bit of music.
Recently the first tech demos of an AI-driven music system for the real-time generation of adaptive music were created by Melodrive (2018).
Music Games – Pre-History
What about music games? A beautiful example is Mozart’s Dice Game from 1787 (play online), for which he wrote 272 measures to be arranged in any random order (non-linearly). In the video game world, most music games are neither non-linear nor improvisatory; yet there are exceptions and innovations – we’ll get to them soon.
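The mechanics of the dice game are easy to simulate: for each bar of the minuet, roll two dice and look up the corresponding pre-composed measure in a table. The sketch below uses a stand-in lookup table (a simple formula) rather than Mozart’s actual one:

```python
import random

def roll_minuet(table, bars=16, rng=random):
    """Roll two dice per bar and pick a measure from the lookup table,
    in the spirit of Mozart's dice game. `table` maps (bar, dice_total)
    to a measure number.
    """
    minuet = []
    for bar in range(bars):
        total = rng.randint(1, 6) + rng.randint(1, 6)  # two dice: 2..12
        minuet.append(table[(bar, total)])
    return minuet

# Stand-in table: 11 possible measures (dice totals 2-12) for each of 16 bars.
table = {(bar, total): bar * 11 + (total - 2)
         for bar in range(16) for total in range(2, 13)}

sequence = roll_minuet(table)
assert len(sequence) == 16
```

With 11 choices per bar over 16 bars, even this toy version yields an astronomical number of possible minuets – non-linear composition, two centuries before video games.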
First, I have to mention an invention that might have kickstarted the whole music video game genre.
Karaoke – a Japanese invention from 1971. Initially it didn’t include a television screen — see Daisuke Inoue’s Juke 8 — but as karaoke spread across Asia in the 1980s, it transformed into the karaoke box, also known as KTV (karaoke television): private, sound-proofed karaoke rooms with on-screen lyrics.
Although karaoke binds its players to a relatively linear ruleset — singing the songs as accurately as possible — it does give players room to express themselves creatively; after all, a little improvisation or weirdness can add to the fun and social interaction.
In relatively recent Japanese karaoke systems, players can score skill points based on various elements of their singing. A spin-off of this scoring system appeared in Karaoke Joysound (2008) for the Nintendo Wii. PlayStation game SingStar (2004) visualised the melody as it approached and gave its players a live indication of their accuracy.
Music Games: Call-and-Response Rhythm Games
After karaoke, Japan once again gave birth to an interactive music idea that would become hugely popular: rhythm games. Most commonly, the objective of these 1990s games was to follow rhythmic cues that scrolled across the screen.
Peculiarly, the first rhythm game, PaRappa the Rapper (1996) for the PlayStation, didn’t exactly work like that. It did show a stream (sequence) of notes, but it mainly challenged players to repeat the teacher’s rhythms – to copy him in a call-and-response way.
Another thing that made PaRappa’s game mechanic quite unique was its emphasis on improvisation: by playing extra notes in the rhythmic grid, the player could “beat” the teacher and get a chance to perform on the “cool mode” freestyle stage.
In 2011, fifteen years after he created PaRappa, designer Masaya Matsuura explained to gaming website Kotaku how much room there still was (and is) for innovation in music game design:
“Strictly speaking I do not believe that ‘Music Games’ as a genre really exists yet. We just have ‘Rhythm Games’. We can’t really grow the genre until we have some games that explore areas of music other than just rhythm. I really want us to help overcome this deficiency. I’d like to for example do something that features extremely accurate musical performance animation. As an example, if we could zoom in and see lifelike fingerwork in an animated pianist, the opportunities afforded to uncover and develop new methods of playing beyond the capabilities of humans would be a crucial step in musical evolution. The ability of ‘games’ to allow us to take part in such advances is crucial.“
Jazz musician Bobby McFerrin plays a melodic game with his audience in the video below. He appeals to our intuitive understanding of the pentatonic (ancient 5-note) musical scale. Could we translate this to a video game? Why not, right?
In 1997, just one year after PaRappa came out, the rhythm virus spread from the PlayStation to the arcade halls, thanks to Konami’s DJ simulator Beatmania. Players now had to bash buttons in sync with scrolling notes. 1998 marked another revolution: Dance Dance Revolution (DDR), the first rhythm game with a dance controller.
The game didn’t award its players any extra points for improvising, yet that didn’t stop a ‘DDR freestyle’ scene from emerging. Freestylers are not focused on the game’s expert levels or extreme scores; they prefer to come up with new choreographies and/or improvisations. Thus DDR reached cult status.
Later games, like Dance Central (2010) and Dance Evolution / Dance Masters (2010), challenged players to dance with their full bodies: hand and foot gestures appear on screen right before they have to be executed, tracked by the Kinect 3D sensor. DANCERUSH (2018) continued in that tradition.
Other Rhythm Games
Many other rhythm games would follow: rail shooter Rez (2001), “traditional” Japanese drum game Taiko no Tatsujin (2001), puzzle game Lumines (2004), Guitar Hero (2005), countless mobile games, motion-controlled games like Wii Music (2008), which included a Custom Jam mode for improv, “piano” game Chunithm (2015) and VR games like Beat Saber (2018).
Frequency (2001), for the PlayStation 2, featured a remix mode that allowed up to 4 players to improvise collaborative compositions online. Vib-Ribbon (1999) was the first rhythm game to feature an audio-based level generator: it could create new levels from any audio CD you inserted.
Music Games controlled by Real Instruments
In recent years we’ve seen innovative games that can be controlled by improvising on real (acoustic or electronic) instruments, like Cello Fortress (2012), an action game in which a cellist defends a fortress from 4 gamepad players who try to destroy it, and SoundSelf (2012), a psychedelic, therapeutic VR experience controlled by chanting (singing).
Music Education Games
Still focused on hitting the right note on cue, but somewhat less rhythm-oriented, are music-education games, which let you practice on real instruments. Examples are Rocksmith (2011) for electric guitar and Synthesia (2006) for MIDI keyboard. Robert Nichols hacked Rocksmith to make it work with the violin as well.
Sound Toys
Sound toys are arguably similar to sandbox games – or they’re not games at all. Examples are SimTunes (1996), Electroplankton (2005), Björk’s Biophilia (2011), the physics-based Musyc (2013), as well as remix apps ninjaJAMM (2013) and Playground (2017). The table-top-inspired ReacTable (2005) can be scaled large enough for multi-player improvisation.
Sentris (2014) is possibly the first real/advanced hybrid of a puzzle game and a sequencer (software to write and produce music with). Non-game-influenced sequencers have also become more playful and non-linear; see for example the stochastic sequencer SECTOR (2014), the multi-playhead sequencer Fugue Machine (2015) and the complex but playful grid sequencers JR Hexatone (2009) and New Path (2017). The just-released iPad app Gestrument (2018) promises to become a highly customisable interactive music engine/instrument that can be controlled by eye tracking, movements, audio tracking and more.
Augmented Music Apps
RJDJ (2008) is a fascinating augmented-reality music app / sound toy: it can make music out of any sound it picks up around the user, and it can use smartphone sensors to change parameters of the music. RJDJ’s interactive compositions (scenes) are programmed in the visual programming language Pure Data.
Interactive Music Installations
When it comes to large (public) interactive installations, there are many examples of interactive music or sound, like the simple but funny Piano Stairs (2009), the body-tracking avant-garde noises of Device Unknown (2018), or the immersive, brainwave (EEG) controlled lasers and sounds of On Your Wavelength (2015). Visual art can be created by groups of people on floor-projected installations like Healing Pool (2008). But it’s rare to find one that combines collaborative interaction (social game design) with music/sound improvisation. Something that comes close is The Music Box Village (2018), a collection of architectural sound toys that can be used by artists and audiences to jam with otherworldly (texture-based) sounds.
How Musical Improvisation Works
In musical improvisation it’s crucial for all players to be aware of each other; only then can one choose how to respond in one of the following ways:
▫️ Silence: giving space to other musicians to place the focus on their output.
▫️ Synchronisation: complementing one or more of the currently playing elements, moving the music forward in its current direction through variations or dynamic changes.
▫️ Recontextualization: transforming one or more of the currently playing elements by playing something around them that gives them a completely different meaning.
▫️ Disruption: playing something that completely disturbs the current atmosphere, thereby giving space for a drastic change of collective course.
In many traditions, group improvisation takes place on a micro level: a direct (physical) change in dynamics, melody, harmony, rhythms or choice of instrumentation (timbre). Improvisation occurs when an instrumentalist or a collective spontaneously alters any of those five elements.
But that does not mean that none of those elements may be pre-set. To give an example, jazz improv is traditionally based on the harmonies and rhythms of its well-known repertoire, known as jazz standards. And in the case of a drum circle, there is of course no melody or harmony present whatsoever.
Thanks to music technologies – such as recording media, sequencers, samplers, loopers and algorithmic music generators – improvisation can also be done on a more macro level: by manipulating larger chunks of music ahead of their playback. In any case, all music, improvised or composed, can move along the two core axes of music: “relaxed vs stressed” and “stable vs chaotic”.
The momentary state of the individual instruments can be measured in the same way. Some examples:
▫️ Relaxed + Stable = a church choir singing a lengthy harmonic chord
▫️ Relaxed + Chaotic = a hectic jazz drum solo played with very soft brushes
▫️ Stressed + Stable = a demonic sounding organ chord droning on
▫️ Stressed + Chaotic = a fast death metal guitar solo at its climax
Apple’s GarageBand uses a similar concept for intuitively composing drum parts:
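One way to make these two axes playable is to map a point in the “mood space” onto concrete musical parameters, loosely in the spirit of GarageBand’s Drummer XY pad. All parameter names and mappings below are invented for illustration:

```python
def mood_to_params(tension, chaos):
    """Map a point on the two axes (0.0 = relaxed/stable, 1.0 = stressed/chaotic)
    to a few illustrative musical parameters. The mappings are hypothetical.
    """
    return {
        "tempo_bpm": 70 + tension * 100,   # stressed music pushes the tempo up
        "velocity": 0.3 + tension * 0.7,   # ...and plays louder
        "swing": chaos * 0.5,              # chaotic music drifts off the grid
        "fill_probability": chaos,         # ...and throws in more fills
    }

# Two corners of the mood space, matching the examples above:
church_choir = mood_to_params(tension=0.1, chaos=0.0)   # relaxed + stable
death_metal = mood_to_params(tension=1.0, chaos=0.9)    # stressed + chaotic
assert church_choir["tempo_bpm"] < death_metal["tempo_bpm"]
```

A game could then move this point around in real time – following the player’s health or threat level – and let a generative drummer render the result.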
Challenges for Music Game Design in 2019
From all the knowledge I’ve gathered so far, I defined a set of challenges for designing music games in 2019 (under construction):
▫️ Melody/harmony games inspired by rhythm games: moving away from on-screen rhythmic cues, towards actually listening (to randomly generated sequences), with the challenge to respond with suitable notes and timings.
▫️ Rhythm games with a stronger focus on groove and swing, achieving neural rhythmic synchronisation through repetitive patterns – research has shown that none of the current rhythm games achieve this.
▫️ Music games that are equally suitable for beginners, hardcore (high-score) players, and a third group: freestylers (DDR is a successful example of this).
▫️ Entertainment-educational crossover games that teach players how musical elements like rhythm, harmony and melody actually work, so they can use this to become more musical and learn how to improvise.
▫️ Masaya Matsuura’s idea: exploring post-human virtuosity, for example by giving supernatural influence over a virtual pianist. Perhaps this also relates to inventing ways to conduct expressively, instead of performing every single note.
▫️ Music games that stimulate the players to improvise more, for example through an AI system that knows how well a player is improvising.
▫️ Music games with endless variation of music by implementing procedural (generative/algorithmic) music systems.
▫️ Music games revolving around emergent gameplay, for example by letting players build their own instruments in physics-based playgrounds, or through playful sequencers (combinations of games and composition software).
▫️ Music games to improve social contact (incl. games for the elderly, like the Tovertafel).
▫️ Music games crossed over with sports games, and rhythm games connected with fitness, for example by using weights as controllers.
▫️ Music game elements in serious electronic instruments.
▫️ Music games integrated in architecture / public space.
▫️ Music games suitable for large groups of people.
▫️ Open source music games allowing for co-design.
▫️ ….what are your ideas and opinions?