Technology in Music Education Task 2: Update #3

Okay! So much to report here all of a sudden. Yesterday was both a discouraging and a victorious day for this assignment.

I spent a large portion of the morning troubleshooting on my computer, only to find that I had somehow managed to delete Logic Pro (how does that even happen?). A lot of time vanished while I was trying to fix this, which left me with only an hour to try things out before going to work in the afternoon, at which point I discovered another weird problem. Anyway, the point of the story is that not much happened on the computer, but my brain has been hard at work!

But! After work in the evening, I made a lot of progress in a relatively short amount of time with very little (possibly no) troubleshooting at all. In fact, I was able to set up most of my installation. As you will have read, I will be using Genesis (a recent composition of mine) as the experimental platform to demonstrate the skills I have taught myself and the creative potential of this interactive installation. I’ve decided to use Mainstage as the host for the installation. If you are not familiar with Mainstage, it is essentially a live version of Logic, designed for performing.

Here’s a photo of all the controllers (including the Wii Balance Board and my iPad) that I have access to and am considering using. I have a feeling I won’t be able to use all of them, but I will let you know how many I can use without cluttering the interactivity.


The first thing I achieved was muting a large number of tracks in the final mix of Genesis (in Logic) to make a backing track onto which I can layer interactive instruments. I then opened the Playback plugin in Mainstage and inserted the newly bounced audio file.


The next step was to open each instrument (that I had muted in the final mix) in new channel strips within Mainstage. The interactivity will come from mapping the controllers that I have to the various instruments and their effects.

In my composition Genesis there are a couple of sounds I made using The Mangle, a granular synth plugin. I sampled my mum washing up one night (thankfully I didn’t damage my mic!) and opened the audio sample within The Mangle. Taking little (editable) grain-sized chunks of the gurgles and sloshes of washing up, The Mangle can play them back at any pitch or speed, reversed, panned, etc. The crosshairs you can see show where the grain is playing back from in the audio file. I also randomised the section the grains play back from, whilst randomising the volume and reversibility of the grain playback. This creates a pretty unpredictable sound, especially when the source sound is water.
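For anyone curious what that randomisation boils down to, here is a minimal Python sketch of picking randomised grain parameters. This is purely an illustration of the idea, not The Mangle’s actual code, and the sample and grain lengths are arbitrary numbers of mine.

```python
import random

def random_grain(sample_len, grain_len, rng=random):
    # Pick a random start point so the grain can come from anywhere in the file
    start = rng.randrange(0, sample_len - grain_len)
    return {
        "start": start,                   # playback position in samples
        "volume": rng.uniform(0.0, 1.0),  # randomised grain volume
        "reversed": rng.random() < 0.5,   # randomised playback direction
    }

grain = random_grain(44100, 2048)
```

Each call produces a fresh grain with a new position, level and direction, which is exactly why the result sounds so unpredictable.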

[Screenshot: The Mangle]

It’s pretty limitless. From there I mapped the rate of the grain playback, as well as the pitch of the playback, to the Mod Wheel MIDI CC, which is CC 1.

There are standard MIDI CC messages to which most software conforms; here’s a link to a handy site that says what each one typically does.
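As a quick illustration of what a CC message actually is on the wire: a Control Change is three bytes, a status byte (0xB0 plus the channel number) followed by the controller number and the value. A minimal sketch (the function name is my own):

```python
def control_change(cc, value, channel=0):
    # Status byte 0xB0 marks a Control Change; the low nibble is the channel.
    # Controller numbers and values are both 7-bit (0-127).
    assert 0 <= cc <= 127 and 0 <= value <= 127 and 0 <= channel <= 15
    return bytes([0xB0 | channel, cc, value])

# CC 1 is the Mod Wheel by convention
msg = control_change(1, 64)
```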

The next picture shows the MIDI mapping from within The Mangle.

[Screenshot: MIDI mapping in The Mangle]

This is all pretty cool if you have a keyboard that is already sending MIDI CC 1 from its Mod Wheel. But I am only halfway there, as I want to control this with a Wii Remote. This is where OSCulator comes into play. When you open up OSCulator for the first time it looks like this. Pretty blank.

[Screenshot: OSCulator on first launch]

Having automatically found the Wii Remote via the sync button, OSCulator offers a number of ‘events’ that each button or motion control can be used for. If you’re using OSCulator for the first time, make sure you press every button on your remote so that the program knows about them. It will automatically know about the motion controls, however, as they are always sending data. In the picture below, you can see that I have two Wii devices connected. One is in fact the Wii Balance Board and the other is a standard Wii Remote connected to a Guitar Hero Guitar attachment.

The reason behind the Wii Balance Board is that I want participants in this installation to have a designated place from which to experience the music. Looking at the picture below, you can see up the top that I have the ‘SUM’ message mapped to the MIDI note D1. This means that whenever someone steps on the board, it will sense their weight and play a D1 on an instrument. I then mapped the top-left and top-right regions of the board to two other notes, so that when the person shifts their weight on the board, different notes/pitches will play. However, these pitches aren’t really going to be heard as pitches, since the sound they trigger is going to be water sloshes.
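To make that mapping concrete, here is a rough Python sketch of the logic. The note numbers and the weight threshold are placeholders I made up; OSCulator handles all of this internally.

```python
# Placeholder note numbers for each board region (not the real mapping)
REGION_NOTES = {"sum": 38, "top_left": 40, "top_right": 43}

def board_to_notes(weights, threshold=5.0):
    # weights: sensor readings per region, e.g. in kilograms
    notes = []
    if sum(weights.values()) > threshold:         # someone stepped on the board
        notes.append(REGION_NOTES["sum"])
    for region in ("top_left", "top_right"):
        if weights.get(region, 0.0) > threshold:  # weight shifted to a corner
            notes.append(REGION_NOTES[region])
    return notes
```

Stepping on the board fires the ‘SUM’ note, and leaning toward a corner fires that corner’s note as well.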

[Screenshot: Wii Balance Board mappings in OSCulator]

I got really excited when I started playing with the actual Wii Remote events, as I mapped the motion ‘Pitch’ event to the MIDI CC 1 value. This means the Wii Remote effectively became the Mod Wheel of a keyboard and instantly started sending value messages to The Mangle’s grain pitch and rate playback. The graph below shows the values (up to 127) of the Pitch event being changed by the Wii Remote motion within OSCulator (angling the remote up and down).
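Under the hood that mapping is just a rescaling of the remote’s tilt into the 0–127 MIDI range. A sketch of the idea (the ±90° range is my assumption; OSCulator’s actual scaling may differ):

```python
def angle_to_cc(angle_deg, lo=-90.0, hi=90.0):
    # Clamp the tilt, then scale it linearly onto the 7-bit CC range
    clamped = max(lo, min(hi, angle_deg))
    return round((clamped - lo) / (hi - lo) * 127)
```

Pointing the remote fully down gives 0, fully up gives 127, and everything in between sweeps the ‘Mod Wheel’ smoothly.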


All this together creates a morphing, ambient sloshing and gurgling of water that can only be controlled by standing on the Wii Balance Board and angling the Wii Remote up and down – which means it can be played by anyone. Totally cool. At this point I bounded around the house in victory.

But I wasn’t finished yet! I also had plans for the pad sound that I had created. Sampling my own voice, I was able to use The Mangle to create a very smooth synth pad sound by editing the attack, sustain and release of the grain playback. As I used only four chords for the pad sound, I decided to map each chord to a different button on the Guitar Hero Guitar.

[Screenshot]


Using the Chord Trigger MIDI FX in Mainstage, I was able to map a different key to play each of the four chords. It looks like this (below). Hit the Learn button and choose the note you want to trigger the chord with on the blue keyboard (top). Then choose the notes you want in your chord on the orange keyboard below. Don’t forget to hit the Learn button again before moving on to the next chord.
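In effect the Chord Trigger is just a lookup table from one incoming note to a list of outgoing notes. A small sketch of the idea (these note numbers and chords are placeholders, not my actual mapping):

```python
# Trigger notes high on the keyboard, each expanding to one chord
CHORD_MAP = {
    96: [48, 52, 55],   # C major
    98: [50, 53, 57],   # D minor
    100: [52, 55, 59],  # E minor
    101: [53, 57, 60],  # F major
}

def trigger(note):
    # Unmapped notes pass through unchanged
    return CHORD_MAP.get(note, [note])
```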

[Screenshot: Chord Trigger MIDI FX in Mainstage]

The reason I chose notes so high on the blue keyboard is that I haven’t yet found a way to make OSCulator send to a separate MIDI input. This meant that as I played the pad sound, the water gurgles were also being triggered by the Guitar Hero notes. I went back to The Mangle and adjusted which keys are allowed to play the gurgles so that I didn’t get any overlaps. This is my quick fix, but if you happen to know a solution please comment below, as it will only get messier to program as I add more sounds.

So now I have the Wii Remote, the Balance Board and the Guitar Hero Guitar all working together! The really cool part is that all of these are playable by a single person at once, as the Wii Remote sits inside the Guitar Hero Guitar. So to adjust the ‘Pitch’ event in OSCulator, all you have to do is pretend to be a rock star with the ‘Guitar’ and angle it up and down. Combine that with standing on the Board and you can control the sloshes and gurgles whilst playing the pad chords with the coloured buttons under your left hand.

I want as much of this installation as possible to be ‘computer free’, so I came up with the idea of mapping a single button on the Wii Remote to begin playing the backing track. That way the entire performance can be controlled without touching the computer. So I mapped the Home button on the Wii Remote to the Playback plugin’s Play/Stop button in the assignments/mappings menu in Mainstage. Below is a picture. See how on the left there’s a column with ‘OSCulator Out’? Under channel 1, I have set the MIDI CC number to 21. Following on to the right, you can see it’s mapped to the Play/Stop (switchable) button.

[Screenshot: assignments/mappings in Mainstage]

In OSCulator, the event I chose was a MIDI CC Toggle event, and I also had to choose CC 21. However, you have to be careful when mapping it within Mainstage, as it will automatically assume you’re trying to map a motion control, since those are always sending messages (remember?). So you have to enter it manually within Mainstage so that it only looks for MIDI CC 21 and nothing else. To do that, click on ‘Ctrl 21 (21)’ and edit it in the little menu that pops up.
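The idea of pinning the assignment to CC 21 can be sketched like this: react only to that controller number and let everything else (including the constantly streaming motion CCs) fall through. Purely illustrative, of course; Mainstage does this for you once the assignment is edited.

```python
class PlayToggle:
    def __init__(self):
        self.playing = False

    def handle_cc(self, cc, value):
        # Only CC 21 flips the Play/Stop state; all other CCs are ignored
        if cc == 21 and value > 0:
            self.playing = not self.playing
        return self.playing
```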

I’m really excited to share this! The next things I plan to do are to add in some of the other effects I have in Genesis, such as the Thunder (door slams), Tongue Clicks and Dishwasher Hits. These could prove difficult, as I’ll have to choose notes that aren’t being used by any other instrument, unless I find a way to send another MIDI input from OSCulator into Mainstage.

I also plan to make a poster or something similar to accompany the installation showing how it works so people aren’t so lost when they have a go.

I think that’s everything for now, I’ll let you know how the rest is going shortly!

Peace – Jonno


Composition in Music Education Task 3: Update #6

I am really pleased to be able to say that I have finished the scoring of Homecoming! Apart from drums, guitar notation is probably up there as one of the most interesting/difficult kinds of notation to write. I settled on a kind of hybrid between a full music score and a lead sheet.

Here’s what I mean:

[Image: guitar notation sample]

It looks a little cluttered here, as I wanted to show you both of the main strumming patterns in the piece. At the start of the score I’ve created a little key stating what the arrows mean, but essentially it’s this: down arrows are down strums and up arrows are up strums. Having notated the rhythm and provided the chord shapes, there should be enough information there to play what I intend on a normal acoustic guitar.

Here’s a sample of what the final score looks like with all the parts together (just before the final chorus):

[Image: Homecoming score sample]

In both the piano and guitar parts, I’ve provided the chords above the music. This is so that if I ever use this piece in a classroom setting, students who cannot read music well can follow the chords. Also, the instrumental lines could be transposed for any melodic instrument that might be in the class.

I think it is safe to say that, on the face of it, there is little resemblance between Pastance’s Skye Boat Song (which was the basis of the composition task in my iBook) and Homecoming. Of course the chords, structure and key are the same (that was the task), but it is now an original composition in its own right. I hope you enjoy it.

If you have any questions or anything you’d like to add, feel free to comment below.

Peace – Jonno

Composition in Music Education Task 3: Update #5

People compose in very diverse ways.

Up until this year, I would never have considered myself a composer. Not really… a great lover of many kinds of music, yes, but not really a composer.

I remember when I was in my last year of school and had to compose my Core Composition for Music 2. Over the preceding six months I had toyed with quite a few different ideas, but nothing really settled until very close to the draft deadline. Thankfully, on that occasion everything fell together, and I quite enjoyed both the process and the finished product.

It’s interesting to observe how differently different people compose. It was my observation that most people in my class began with manuscript and pencil, or with Sibelius/Finale. Most recently I have found it more helpful to compose through a process of jamming, either by myself or with others (see my earlier update for this assignment).

Having finished the recording, I find myself at the notating process. I’ve been using Sibelius to transcribe all the parts.

Here’s part of what I have so far:

[Image: score annotation]

I have the basic structure of the song mapped out, but still have a bit more to go. Just the other day I consulted someone I know who has plenty of experience with drum notation. He confirmed that what I had written was both readable and playable, which was a relief. Drum notation is hard. I’ve been using this handy little resource as a guide (equally, you can just type ‘how to notate drums’ into Google and it’s pretty much the first link).

Currently I have the chords and chord shapes in the guitar part, but have yet to find a proper way of showing the correct strumming pattern. None of the backing vocal parts have been charted yet; however, the lead vocals (with lyrics), instrumental lines, bass and drums are all in.

Unless something comes up, next blog post I should have it all scored! Listen to the recording here.

Composition in Music Education Task 3: Update #4

I have officially finished the recording process! Hooray! I’ve uploaded it to Soundcloud so feel free to have a listen and let me know what you think.

I should clarify that the piece is exactly 3:00 minutes in length, but when I bounced the final track from Logic it added an extra second of silence which is not part of my composition. So I am still within the requirements!

So how did I finish? Well, I had already decided on the structure (see last post) and had a rough idea of what I needed to do to complete it. I knew I needed some backing vocals and some instrumental lines, so I asked my mate Phil over to record the BV lines, as he has a great range and I also preferred a different vocal quality for those parts. I am really pleased with them, as they fill out the overall sound and add lots of presence. Once we finished tracking those, we recorded the flugelhorn and trombone lines. I decided against adding a saxophone line because, after hearing all the parts together, I felt there was no need for the extra layer and I was concerned about cluttering the arrangement.

Here’s a pic of the latest tracks and where they appear in the comp. On the left you can see the label for each part:

[Screenshot: track list in Logic]

You can see that there are three extra vocal harmony lines and an additional three ‘Ahhh’ vocal lines. I was inspired by ELO’s amazing backing vocals for the ‘ahhhs’.

As it turns out, I was able to subtly slip in the melody of Skye Boat Song. During the last chorus, the backing-vocal harmony part follows the same outline as the main chorus melody of Skye Boat Song. The words are ‘Thank-ful-ly I’m Fin-’lly Home…’. But it’s different enough that I don’t feel as though I’m directly ripping it off – which is good.

There will be an update soon on the notation process.

Peace – Jonno

Composition in Music Education Task 3: Update #3

[Screenshot: CME Task 3]

As you can see (above), I have begun the arranging process using Logic. And if you look carefully, I have also now tracked some vocals (many takes later; I’m not really a singer). Underneath the main bulk of tracks, you can see the audio and MIDI regions I made for the iBook task. I used these to construct a basic structure (but will not include them in the final mix). This is what I’ve decided on:

Intro – Chorus – Verse 1 – Chorus (Instrumental) – Verse (Instrumental) – Verse 2 – Chorus

So far I have not recorded any instrumental lines, so the middle of the piece sounds pretty bare… however, I plan to include three wind instruments: one to play the melody, another a harmony line, and the third a countermelody. My lecturer suggested I try reversing, flipping or retrograding the melody of Skye Boat Song and slipping it in somewhere. I’ll see what I can do. For the instrumental lines I am thinking of writing for flugelhorn, trombone and either alto or tenor saxophone. When the vocals come back in for the final verse (2) and chorus, I will try to continue these lines.
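For reference, retrograding (reversing) and inverting (flipping) a melody are simple list operations once the notes are MIDI numbers. A quick sketch; the example notes are made up, not the Skye Boat Song melody:

```python
def retrograde(melody):
    # Play the melody backwards
    return melody[::-1]

def inversion(melody):
    # Flip each interval around the first note
    pivot = melody[0]
    return [pivot - (note - pivot) for note in melody]

line = [64, 62, 60, 62, 64, 64]  # hypothetical opening notes
backwards = retrograde(line)
flipped = inversion(line)
```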

I’ve also already started notating what I have and will post an update shortly.

Here’s the lyrics I promised:

[Image: CME Task 3 lyrics]

Technology in Music Education Task 2: Update #2

So a lot has happened since my last post, though not so much on this particular project. To be fair, I’ve been doing my other assignments, but now I can start devoting more of my brain’s CPU usage to this, and I am itching to get started. I have, however, spent some good time brainstorming things I could potentially do for the interactive installation I am creating, and I’m looking forward to attempting to realise those ideas soon.

I went into Uni early on Tuesday morning to meet up with someone who has plenty of advice and experience in this area. I shared with him my idea of connecting various game controllers/gear to my computer to communicate with the music programs I have. After showing him OSCulator (which he hadn’t seen before), we spent the next hour or so ‘fiddling’ with the various parameters in both Logic and OSCulator, getting my hardware to communicate with them properly.

— Cue amazing artwork of a diagram that I drew on the train to aid those who learn visually —

[Image: hand-drawn diagram]

As a result I have a better understanding of MIDI CC messages and how I can map them to the various parameters and plugins within Logic.

Having just completed an electroacoustic/soundscape composition, I’ve decided to use that work in the interactive installation. I plan to remove two main instruments from the mix and let those who are interested have a turn with the game controllers. They will have the opportunity to control and create a new version of the piece. Sound like fun? You can hear the piece here. I am yet to do this, but so far I am thinking that the sounds you can play will be the Pad Sound, the ‘Water Sloshes’ (not sure of a good title for that part) and the Thunder (possibly more). Alongside the installation will be a small key showing what each button does on each MIDI controller, so participants can make their choices musically.

By the way if you are in Sydney and wish to come and see my installation and all the wonderful variety of projects that my classmates have undertaken, visit the Cafe in the Sydney Conservatorium of Music on Friday the 13th of November from 6pm to 9pm.

Please comment below if you plan on coming and come and say hi!


Yesterday I finished a composition! Check it out here:

It is in three movements and is inspired by water.
1. Waves
2. Waterology
3. Storm

An inspiration for this piece came from the creation story in Genesis in the Bible, where we are told that the Spirit of God hovered over the waters. In my mind, this piece aims to capture that ‘moment’, in which the calmness of God’s control and his creative genius are simultaneously ordered and terrifying.

As water is a reflective substance, the second half of the piece is largely a reversed version of the first half (it begins just after the storm).

I recorded, arranged and edited all the sounds that you can hear. Apart from my voice, no ‘normal’ acoustic instruments were used.
Some of the sounds included are (in order of appearance):
– foam packing (to create waves – first sound you hear)
– my mum washing up (used in conjunction with The Mangle)
– my voice (pad sound – again with The Mangle)
– hand sifter
– my dishwasher closing (thuds)
– body percussion (clicks, chest beats and tongue clicks)
– door slams (thunder – I am most pleased with this sound!)

The artwork was created by my Dad on his Surface Pro.

Let me know what you think in the comments!
Peace – Jonno
