First things first, what is an augmented trumpet and where did the idea come from?
The Augmented Trumpet is just a regular trumpet with a sensor attached to the valves, used to perform electroacoustic music. The sensor follows the movement of each valve up and down, and that data is sent to the computer. Then, with the software I'm using, I program the electronic effects to be controlled and synchronised with the normal movements of playing the trumpet.
The idea was inspired by Japanese/American violinist Mari Kimura's Augmented Violin project. She developed a glove that the performer wears on their bowing hand, which can pick up whether the performer is playing a downbow or upbow, which string is being bowed, and how long each bow stroke is. When I first saw it I was captivated by how it transformed the way you could perform electroacoustic music, and immediately thought about how I could apply that idea to the trumpet.
You've commissioned a number of local composers to write for this exciting new brass instrument - what were some of the challenges when commissioning for a completely new sound, and how did you work around them?
The first challenge is that very few people in Australia have seen any 'Augmented' instruments, so just explaining and demonstrating what it does and why that's useful was an important place to start. It really surprises people how precisely the electronics can follow along with the live sound, and also just how many different sound options you have! I think every composer came back to me asking 'can it do this?' and 99 times out of 100 my answer was yes!
Also, to help the composers I wrote a little guidebook (available here for free!) that explains how the instrument works and gives some examples of what I've attempted so far.
I think one of the other challenges was that it really adds another layer of thinking for the composer when they're writing. You're already thinking about normal compositional things like structure, pacing, colour, as well as what sounds you want to create through the electronics; but then on top of that, you've got to think about what interaction is going on between the sensor and electronics. It's a lot to get your head around!
How collaborative have each of the commissions been? What is the process for you in creating a new work with a composer, from beginning to end?
So, after I invited each composer to be part of the project, I sent around some recordings of what I'd already done to try to show the breadth of sounds you could create. We each met one-on-one as well so they could ask questions and talk through their ideas. After that, each composer just started writing their piece and sent me a score when it was ready! Once I'd read through the works, I started programming the electronics for each piece and then we met up again, this time to get really into the shape of the piece together, because the electronics I'm building need to fit closely with how the composer imagined it would sound. We also went through the normal stuff when you're playing a new piece: how's this tempo, how's this phrasing, dynamics, articulation, etc. This was actually quite a new experience for me because usually I'm on the other side as the composer presenting a new work, rather than the performer learning the new work!
You have already performed improvisations on the augmented trumpet but never a notated work. Moving forward, do you see possibilities for other instrumentalists to perform your commissions and play on the new instrument?
Yes absolutely! I'd love to see this idea spread and these works be performed by other performers. This year I've actually been developing a 3D printed model of my sensor that will fit onto any trumpet (the original sensor is made out of PVC pipe, velcro and gaffer tape...). So ideally, anyone with a trumpet, the sensor, a microphone and a computer could perform these works. That's a little way off yet, but we'll get there!
Each piece will be accompanied by live visuals: can you tell us about these projections and why offering a multisensory audience experience is important to you?
I've always loved live visuals at electroacoustic music concerts - I see it as a challenge to find the relationship between one and the other. Electroacoustic music performance can also be quite difficult to watch as an audience member if the performer is really focussing on a computer while they're playing - it can almost become a barrier, and you just can't see what's happening on the screen, so maybe you feel a little left out? Whatever the reason, I feel having live visuals can make the performance more engaging for the audience.
The other reason I'm creating the visuals for this gig was more of a personal challenge. The software I use to create the Augmented Trumpet's sound, Max/MSP, has a whole other side that creates visuals. I've used this software for six or seven years now and never really looked into it, and so I just thought, why not?!
Finally, why is it crucial that we keep pushing the boundaries of what our traditional orchestral instruments can do in a performance setting?
I think it's always been happening; instruments have always been changing and developing alongside the music of the day. I mean, trumpets and horns wouldn't have any valves if someone hadn't thought it'd be pretty cool if those instruments could play a chromatic scale! Perhaps now this is just trying to incorporate today's technology with our centuries-old instruments in meaningful ways. I really believe that using electronics with instruments broadens our range of expression, and just lets us create so many more colours! And also, it's probably my composer brain, but I'm constantly looking for another cool new sound. And then another, and another, and another...
Hear Elliott Hughes' Augmented Trumpet at The Burrow from 8pm on August 4, 2018.