In a town known for its strong work ethic and an overabundance of world-class musicians, Melissa Mattey may be one of the hardest-working creative talents in Nashville. A graduate of Berklee College of Music, with a focus on synthesis and recording, Melissa deftly juggles three careers without missing a beat. By day she’s an in-demand mix and recording engineer for a broad array of artists including Van Hunt and Cam. By night she’s synthesist and composer for the electro-ambient trio Everyone Moves Away. And the rest of the time she’s handling keyboard chores and producing records for indie favorites Elliot Root. Did we mention that she tours with both of the aforementioned bands, as well?

We caught up with Melissa in a rare free moment.

So how do you juggle three different careers? How do you manage to keep all of that going?

Well, so far it hasn’t been a problem. (laughs) Most of the time the shows are on weekends and most of my studio sessions are during the week. So as long as I’m home by Sunday night, I can make that session on Monday! I try to tell my clients: “Okay, weekends are off for the most part. I’m a Monday-to-Friday girl, so if you want me to mix or track something for you, that’s when you’ll get me.” (laughs)

Is working studio hours in Nashville a 9 to 5 gig or is it “anytime” studio hours?

It depends. Most of the independent freelance studio stuff that I do is basically 24/7. It can happen anytime. Especially if you have your own place, where you’re available whether it’s 9 in the morning or 9 at night. But with studio sessions, when it comes to booking players in one of the bigger studios, it’s usual studio hours, where you start at 10 and you work in 3 hour blocks. So it’s 10 to 1 with lunch at 1 o’clock, then 2 to 5. Then, if you offer a third session, it’s like 6 to 9. If it’s a union-scheduled session, that’s how it goes. We try to stay with that as much as possible. Otherwise you’re paying overtime at the studio and you’re paying overtime for the players so it can get complicated if you go around those hours. Most of the time the guys are pretty flexible, so sometimes it can be like, “Let’s just work from 10 to 7 and do a late dinner.” It’s not always a straight, to-the-minute kind of thing.

You’re working as an engineer in Nashville and your two groups are also based there. Did you grow up there? Or did you move there because of the music?

I definitely moved here for the music. I moved here at the end of 1998 after college. (Berklee College of Music) I moved here because I wanted to be an engineer. I thought, “Okay, there are only three places you can go: New York, LA, or Nashville.” And probably 90% of my classmates were going to LA. So I figured, “I’m not going to fight for an internship with a hundred of you guys. I’m going to Nashville where there’s an internship waiting for me.” And I also wanted to be able to record live instruments, which is the big thing in Nashville. You record bands. Whether it’s country music, rock, gospel… you record real people.

That didn’t seem to be the kind of thing that was happening in New York or LA at the time — unless you hooked up with a band. Otherwise it was mainly just recording vocals or programmed instruments. You weren’t really getting a live band recording experience. It becomes like a living, breathing thing when you have anywhere from 5 players to… well, sometimes in these Nashville sessions you have 8 or 9 players on the floor at the same time and it’s amazing to hear songs come to life all at once.

So what’s your musical background? What got you here?

When I was 5 or 6, I started with organ lessons. I was living in Brunei at the time. I was born in Malaysia but my parents and I moved to Brunei when I was 5 or 6. I lived there until I was 15, so most of my musical training was in Brunei. My parents enrolled me in a music program that was actually curated by the Royal School of Music in London. So we were following their curriculum. It was almost like a Masters program, where you get to Grade 1 and you learn to play scales in C major. Then in Grade 2 you take theory lessons and counterpoint. That kind of thing.

It was a very strict program. I enrolled in that and studied until I was 14 or 15. But I started with organ lessons. Not like Hammond B3 organ. Almost like “pop” organ. At the time there were these Yamaha and Hammond organs that had a little tape player that had songs in them. And you kind of accompanied these pop tracks. Remember those? (laughs) So I was part of that schooling for a while and I didn’t really take proper classical piano lessons until I was about nine. Then I started the program and did theory at the same time. I was in a couple of school bands, and marching band, choir… anything that had to do with music, I did.

Do you remember your first keyboard setup back then?

I had a Clavinova for all my piano sounds, which was a pain in the ass to drag around! But I had my parents as my roadies, which was great. (laughs) There was that and I also had these little Casio keyboards. I went through so many of those because they kept breaking! I was also into melodica then. So melodica, as well. We really need to bring the melodica back! (laughs)

From there, how did you decide to become a recording engineer? That’s a different thing altogether.

To be honest, it never occurred to me to get into recording until I got into college. So when I attended Berklee my first year, the program at the time was that you couldn’t really get into your major until the third semester. So the first couple of semesters you had to take all of these general courses like writing, arranging, and ear training before you decided which lane you wanted to go into — whether it was writing, arranging or film scoring. But at the time, I was really interested in the film scoring program. So I kind of did a little bit of that but was also interested in the synthesis department. I had to apply to be in the music synthesis department because there were only a few labs and only a few spots that you could get into.

I got into that program my third semester. I had never programmed anything before. Never did any sort of synth programming or sequencing or anything until I started Berklee. My advantage was that I learned it the right way. I had some really great programmers teaching me. At the time we were sequencing things using Vision (software by Opcode Systems), using these little Mac SE computers and a Roland JD-800 synth. That was my first experience with programming, so I got the best of the best training from the very beginning.

Not a lot of people gravitate to programming. You must have a real technical bent if you’re talking about synthesis and engineering.

I guess so. It was something that I had always been interested in when I was watching other people playing keyboards and programming and I thought, “That’s interesting that you could do that,” but never really thought about it until I started a couple of classes and got into it really quickly. And at the time, there weren’t really any software-based instruments. Everything you had to do was MIDI-based and you had to sample everything and carry around floppy disks to load your samples. (laughs) So I think I got really great basic training when it came to making music with synthesizers. And I knew that I never really wanted to be like a really pro live performer. What I wanted to do was to learn to create music on my own. And because I was thinking about doing the whole film scoring side of things, I was thinking it would be amazing if I could score a whole movie by myself with synthesizers and keyboards because I don’t really play guitar or anything. Keyboards are my main instrument. So I was thinking: “How do I go about doing that?” That’s how I got into it.

That’s interesting because listening to your work with Everyone Moves Away and Elliot Root, your music is very cinematic and atmospheric. There are many interesting things going on, but it’s never cluttered. Do you think that’s an outgrowth of your interest in film scoring?

Oh absolutely. It was something that I always wanted to do and never had the opportunity to get into. The Everyone Moves Away thing was actually an accident. It was really just an experiment with my friend, Chris Small. We met up at an M83 concert and while we were there, we looked at each other and said: “This is amazing. Let’s do something like this.” So we talked and sent demos back and forth and finally decided, “Hey, let’s give this project a shot, just for fun. Let’s see what happens.”

So when we started Everyone Moves Away, we just said “Let’s score an imaginary movie in our heads. Let’s think about all of our favorite movies and keep that in the back of our minds and create the score for them.” And that’s how the project came about.

When you say you score an imaginary movie in your head, does someone come up with a theme? Or a texture? What’s the process?

It kind of happens organically. It’s not like a real process for us like: “Okay, let’s do this, and then let’s do this.” It’s a thing where one of us will start with either a small drumbeat or a couple of chords on the synth, or something in Logic, because we record everything in Logic so there are a lot of software instruments that we have. Or maybe a piano riff. And then we’ll vibe on it for a little bit.

At this point we’re not even sitting in the same room together. The lead singer/keyboard player, Chris Small, has his own little room at home so he’ll send us a 16-bar piano riff over Dropbox. And then I’ll plug it in and put a little drumbeat on there and I’ll layer a couple of synths and then we’ll just go back and forth until part of the arrangement is there. Then we get together in a room and think about vocals. Vocals are almost always the last thing that’s on there. The textures are there, the arrangement is there, and a lot of the sounds are already in place before we put vocals on. Because we never really wanted to make it into a pop project where the vocals are at the forefront. We wanted to make the synth and the other instruments tell the story. And the vocals just sort of help that.

What are some of your go-to tools and why do you choose a given tool in a given situation?

It kind of depends on how much patience I have that day. If it’s an idea that we’re trying to work through quickly, I’ll use a soft synth, whether it’s just a basic pad layer or a basic piano sound, just to get the ideas down — the motif down. And then we almost always go back and replace those sounds later on.

A lot of these sounds are still in MIDI before I put hardware stuff on there. Once we get the arrangement and make sure the key is working, then we start replacing all of those basic sounds with more layered sounds — more thought-through Prophet 12 sounds or something like that. It just depends. And then sometimes I’ll start a track with a hardware synth. Not MIDI at all. Something that’s free-flowing. Something where I feel the texture works and the sound works and it’s something that I want to keep until the very end. So sometimes I’ll work with a hardware synth and work around that.

What’s your current setup with Everyone Moves Away?

We have three different setups. In my setup I have the Prophet 12, and an Akai 261 controller that controls Apple MainStage and also some samples that I trigger. I’m also running Logic in the background for backup tracks. And then Chris Small, our singer, who also plays guitar, plays the Mopho x4 and he also has a MainStage setup. Then Tony High has a Maschine drum setup and plays that and samples, and a Mopho. We use the samples for textural stuff and vocal blips because we have a lot of that throughout our songs. We try to be as live and improvisational as synth music can get. (laughs) We like to be able to track those and trigger those at any point during our performance.

How would you compare your role in Elliot Root with Everyone Moves Away?

Well, with Everyone Moves Away, the synths are definitely forefront. With Elliot Root, I’m the producer on those records, so most of the time we start with just the band, minus the keys. The songs are usually written on guitar, so the textures are already very different from the beginning. Everything is very organic and everything is acoustic and my role is basically to create the soundscape around those songs. I think that’s what makes the music more interesting because if you take the keyboards away it just sounds like a four-piece guitar band. But I’m trying to add this extra layer. Something that I would do naturally in my other project, Everyone Moves Away. I do these more synthetic and ethereal sounds around a rock band.

How did you get together with Elliot Root?

I was actually introduced to Elliot Root first as a producer. They were working on their first official EP and we had a mutual friend that introduced us. He said, “I have a couple of friends in this sort of folk/indie band and they’re looking for some direction. They’re looking for a collaborator. Not necessarily just someone who can push buttons, but someone who can give input creatively to their songs and to their sound — because they’re chasing something but not exactly sure what it is they’re looking for. So why don’t you meet with these guys? They’re ready to record their first EP. They have some songs ready but would love some outside input.” So that’s where I came in.

I came in to record their first EP as an engineer, to mix it, and also to provide some production direction with the band. So it started there and at that time it was just a three-piece: drummer, bass player, and singer/guitar player. But they were interested in what I was doing with Everyone Moves Away and they said, “It would be great if you could provide some keyboard work also — whether it’s just wurly or synth layers or sampling, or that kind of stuff.” So it started there. And then, in my usual fashion, I try to push those guys as far into the synth world as I would love them to be. I was like, “If you want to keep me around, this is what I do.” (laughs)

Did you get into the Prophet 12 because of its depth and ability to do those more “out there” sounds?

Well, initially I was into the four-voice Mopho x4 thing because I felt that the limitations made me more creative with sounds. Also because I’m always coming from a production standpoint first, over being a performer. That’s how I create and that’s how I am in a band. I never feel like I’m out there just to be out there. I always feel like I want to fill in spaces and create moods. So I really embraced the limitations of the Mopho x4, but then decided, when I started to play in these two bands, that I needed a little more ability to stretch the sounds. Especially in Elliot Root, where I’m handling all of the keyboard work. I needed something that I could layer. Something that I could play one patch, but make it sound like 4 sounds.

The layering capabilities of the Prophet 12 were basically what sold it for me. I thought, “Okay, I have 12 voices. Awesome! I can make 4-8 sounds out of that and make it work for a whole song, and never let it feel like it’s the same sound playing throughout the whole song.” I’ve also integrated a few pedals, too, so I can kick in the reverb pedal and all of a sudden the layers can just bloom.

When you’re looking for synth sounds to fill a space, how do you sculpt your parts? Are you programming sounds from scratch?

Most of the time, it’s like the way I program the Everyone Moves Away stuff, where I start with a very basic sound, say on the Prophet 12, where it’s a preset layer. Then once we get into learning the song and playing the song, I’ll start tweaking it depending on what the guitar tones are. I’ll switch a couple of oscillators to different shapes and so on. I do start with presets a lot of the time, but there are a few patches I’ve made completely from scratch just to kind of keep in practice with it and I’ll end up using those for our sets. But it all depends. A lot of the time I’ll use the same pad sound for four different songs, but they’re all slightly different from each other, mainly so I can keep consistency in the tone so that it’s not too buzzy or flat. So that it just rides underneath the guitars.

Since you’re trained as an engineer and producer, I imagine when you’re doing this you’re really conscious of —

The frequencies and the balances. Yes. Absolutely. We’re using in-ear systems when we play, so I can hear things really directly and really clearly — especially when we’re creating songs or playing as a band. I hear all of the guitars clearly. I can hear what kind of tones they’re creating. And I try to fit in there without getting in the way. I’m just sort of filling the spots where maybe a third guitar player would be playing. A lot of the tones I have are almost guitar-esque with a lot of distortion and delay and reverb like the guys have on their guitars, so I’m sort of like the third guitar player.

The Character effects on the Prophet 12 are good for that.

Yes. Definitely. I love it. (laughs)

I assume you played on Elliot Root’s cover of “Pure Imagination” from Willy Wonka? That’s a great cover.

Thank you. We did that a while ago and we’ve been performing it live for about a year now. We had dropped it from our set for a while but now we have plans to re-release it because of Gene Wilder’s passing. Those are all Prophet 12 sounds on there. Those sort of single-note tones are all from the Prophet 12.

What kind of things inspire you these days? Where do you go for inspiration? Is it visual things? Is it other music?

I listen to music all of the time. Basically 24/7. Whether it’s something I’m working on, or something that I’m just checking out for the first time, or music in a movie. I’m trying to find new music all the time. I definitely have my favorites — like Nine Inch Nails. I love everything they do. Especially when it comes to synth sounds.

But definitely movies. The visual part. I get inspiration from stories and textures that way. I always try to keep movies running in the back of my head when I create.
Instagram: @macmattey