We’re living in a historic era for music creation, and the essence of the era is summed up in the laptop computer. It’s an incredibly powerful and liberating compositional tool. For the first time, we have easy access to digital tools that let us create dense and complicated arrangements with a few touches and clicks. We can build arrangements out of sounds that are unimaginably rich and subtle — and we can reproduce them with perfect consistency.
Yet translating “computer music” to a performance setting presents all kinds of challenges. The very things that make electronic music so exciting also make performing it so difficult.
As anyone who has attended electronic music events has noticed, there have been many attempts to surmount these obstacles. Some work, and some fall flat. Performing artists like Bonobo almost entirely eschew reproducing the electronic parts of their recorded music, opting instead to reinterpret most parts with live instruments. Floating Points has toured with an eleven-piece orchestra to reimagine his pointillistic electronica in a live setting — often to the surprise of his audience.
Others take the more popular approach: using controllers to trigger clips, samples and stems of their tracks to create something reminiscent of (but not quite) a DJ set. It's easy to see why this approach is attractive. Lots of sounds can be triggered with total consistency, and it’s much cheaper than touring with a band. The performer can also decide how hands-on to be: sets range from Madeon’s Launchpad virtuosity to Four Tet’s more languid layering of longer loops.
Triggering controllers on stage ticks many boxes from a performer’s point of view. But how does this approach translate for audiences? Most electronic music fans have probably experienced at least a modicum of disappointment at attending a gig only to discover their favorite artist hunched over a laptop making little effort to engage the audience. While this arguably shouldn’t detract from the music, there’s an almost complete disconnect between what the audience sees and what they hear.
This connection — or lack thereof — is important in distinguishing live music from its recorded counterpart. In live music there’s a one-to-one relationship between an instrumentalist’s physical movements and the sound you hear, and that provides a way into the performance. When we watch a skilled musician perform, we feel as if we’re sharing a moment with them, momentarily peeking at the inner workings of their musical mind and understanding the emotions that they bare right in front of us. So when that connection is missing, we notice. It can feel like instead of watching a live performance, we’re listening to an interactive backing track.
Most people performing electronic music are on some level compensating for this deficiency, with varying degrees of success. The caricature of a DJ unnecessarily tweaking EQ knobs with flamboyant gestures is familiar to most dance music fans. There are many less vexing and more honest ways that artists attempt to create engagement. Performers like Shigeto incorporate a live instrument into an otherwise electronic setup, while others make use of modern electronic instruments such as drum sample pads or a Seaboard. By triggering loops on high-powered digital controllers and playing live instruments on the same tracks, artists like Jack Garratt, Mura Masa and Binkbeats take multitasking to a new level.
FKA Twigs’ live show offers something else entirely: spectacle. She intelligently recreates productions by Arca and Clams Casino with a live ensemble, but the ensemble is not the star of the show. Twigs’ fragile vocals, provocative choreography, and flamboyant costume design take centre stage. The performance is mesmerizing. The relationship between music and instrument is intact, but it seems irrelevant next to the audiovisual power of the show.
If visuals can distract from this music-instrument relationship, could they replace it? The number of extravagant audiovisual shows that have toured in recent years suggests that they absolutely can. Amon Tobin’s ISAM is one of the most famous and ambitious of this sort. Tobin stands inside a towering custom-built structure upon which intricate and otherworldly visuals are projected in synchronicity with the music. The visual cacophony on all sides of him obscures the technical act of his musical performance. His show is not about watching a musician perform. Nor does it try to compensate for the deficiencies of electronic music performance. Instead, it embraces and brutally exaggerates them: it’s alienating, imposing and impersonal, and deliberately so.
These are extremes on the spectrum. The majority fall somewhere in the middle. Flying Lotus, for example, has toured with a team of visual artists who improvise with prepared materials projected onto a structure of screens. This creates a three-dimensional effect in the musical performance. Their approach differs from Tobin’s ISAM in a few ways. First is the improvisation. Flying Lotus and his visual artists leave a lot up to chance night by night in terms of their visual and musical contributions and how these interact. Secondly, he often stops the music and addresses the audience directly through the microphone, sometimes even stepping outside of the structure. In doing so Flying Lotus loses some of the high-priestly mystique that surrounds ISAM, but he compensates through a more personal connection with the audience — which is, after all, the reason so many of us want to hear music live.
Aphex Twin’s 2017 shows took another approach to the immersive live experience. His sets created sensory overload by combining a relentlessly cerebral sound with a frenetic light show. At certain moments a camera picked out faces in the audience and superimposed the disturbing Aphex Twin face on them. Whether they wanted to or not, audience members were pulled into the center of the audiovisual performance. For some, the connection with the music and musician was intensely personal.
However vast the potential of electronic music, bringing it into a live venue is an exercise in compromise. The electronic musician engages in a precarious balancing act between musical artistry and audience engagement. Thankfully, we are now surrounded by tools that let people be much more creative in their approach to this balancing act. Live performance software, MIDI controllers, audio interfaces, and even entirely new expressive instruments are now more flexible, effective and affordable than ever. In this respect, electronic music performance is like so many lower-tech areas of the arts: we use the tools available to us to pursue our creativity and share it with others.
Ben Hayes is a London-based composer, music producer, and graduate of the Guildhall School of Music and Drama.