Interactive Music Installation Final Reflections: Update #6

Some final reflections…

Last night in the Con cafe, our class had a great night presenting all of our music-technology-related endeavours. It was a really cool evening to be a part of (thanks to those who came!). The night was filmed, so I’m hoping I’ll be able to upload a video of my demonstration for you to see in the near future.


In light of last night, I’ve been thinking about my installation and how the music is actually created and played with it. What I enjoy about it is that anyone, musical or not, can have a go and instantly create something that sounds cool. Another thing is that the buttons and controllers I’ve used are large enough to suit people who might struggle with fine motor skills, people with a disability, small children, or people in rehabilitation.

Here are a few articles on Nintendo Wiis in rehab (I used Nintendo controllers for my installation):

http://www.physicaltherapytoolbox.com/expanded/wii.html

http://www.physicaltherapytoolbox.com/pdfs/wii07.pdf

http://www.centerforphysicaltherapy.com/wii.html

http://www.apta.org/PTCareers/Profiles/HesGotGame/

What if you could combine Music Therapy with this kind of rehab, using the controllers to create the music? I’m unsure how compatible the two are, but I think it could be a very exciting and interesting area to experiment with and develop.

If you’re reading this and haven’t a clue what I am on about, I recently finished a project that you can catch up on in my previous blog posts.

Peace – Jonno


Interactive Music Installation Tech in Music Ed Task 2: Update #5

Really excited to share that I have finished working on my installation and have also completed the visuals in Mainstage. Here’s a look!

[Screenshot: the finished Mainstage layout]

Using the Layout menu in Mainstage, I was able to make the page you see above. As I play the different sounds from the Guitar Hero Guitar and Wii Balance Board, the on-screen controls move too – pretty cool! The squares along the top represent the coloured buttons on the Guitar Hero Guitar and are coloured to match. On the right are the volume levels of the dishwasher and thunder-strike sounds. Below those are a pitch-shift gauge and the backing-track playback (blue). The squares resembling the Wii Balance Board light up when you stand on the corresponding parts of the actual board, and the X/Y pad on the left shows the left and right delays as I control them with the joystick.

I had a practice run yesterday and all was running well so that’s good. Hoping I don’t run out of batteries on the night but I’ve bought some extras just in case.

Accompanying my installation, I’ve made some instructions so people aren’t completely lost when they have a go.

[Screenshot: the instruction sheet]

Hope to see you there!

Peace – Jonno

Interactive Music Installation Tech in Music Ed Task 2: Update #4

This afternoon I made the decision to focus only on using the Guitar Hero Guitar and the Balance Board as MIDI controllers for my installation. There were plenty of buttons and features on the guitar that I hadn’t yet mapped, and I also wanted to simplify the interaction experience for participants.

[Screenshot: the mapped instrument channel strips]

The picture above shows a selection of the instruments (with effects) from Genesis that I have mapped for control from the Guitar Hero Guitar and Wii Balance Board. As you can see, I have added a Tremolo effect and a Pitch Shifter on the Vox channel, and I have also added a limiter to the ‘Gurgles’ to prevent them from clipping. Both the Tremolo and Pitch Shift effects can be bypassed by pressing their corresponding buttons on the Wii Remote inside the Guitar: Button ‘1’ toggles the pitch shift (which can then be controlled by the Whammy Bar) and Button ‘2’ toggles the Tremolo on and off. This creates some extra textures to experiment with.

The sound that I am most pleased with is the Thunder. In the recording of Genesis, the thunder rolls in and out with many varying strikes. I still wanted quite a selection of strikes at my disposal for the installation, so I learnt from Music Tech Help Guy on YouTube (by the way, his videos are really great) how to alternate different audio samples within Logic’s EXS24 Sampler instrument. Now I have 5 different lightning strikes mapped to each of the two black buttons (+ and -) on the Guitar Hero Guitar. I unchecked the ‘Pitch’ option so that no matter what note I map them to, they will still sound the same. From there I moved them up high on the keyboard so I wouldn’t trigger them with any of the buttons intended to play other sounds.

[Screenshot: the EXS24 thunder samples]
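For anyone curious, the alternation logic is simple enough to sketch in a few lines of Python. This is just an illustration of the round-robin idea, not how EXS24 implements it, and the sample file names are made up:

```python
from itertools import cycle

# Round-robin sample playback: each trigger of the same button returns
# the next sample in the list, wrapping around at the end.
class RoundRobinSampler:
    def __init__(self, samples):
        self._cycle = cycle(samples)

    def trigger(self):
        """Return the file name of the next sample to play."""
        return next(self._cycle)

# Hypothetical file names for the five thunder strikes on the '+' button:
plus_button = RoundRobinSampler(
    ["strike_1.wav", "strike_2.wav", "strike_3.wav", "strike_4.wav", "strike_5.wav"]
)
```

Five presses of the ‘+’ button cycle through all five strikes before repeating, which is what keeps the thunder from sounding machine-gunned.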

So this next picture (below) is of the Sample Delay effect for the instrument that I’ve named ‘Dad’. It’s called that because I sampled my Dad’s impressive tongue-clicking skills, and they sound really cool with a delay effect on them. By mapping the Right and Left delays to the X and Y axes of the joystick on the Guitar Hero Guitar (through OSCulator), I am able to control this delay on the go. It’s really cool (I also secretly love just watching the little sliders move).

[Screenshot: the Sample Delay settings for ‘Dad’]

I then linked this up neatly with an X/Y pad on the Workspace in Mainstage to make it look pretty. There isn’t really any function here, but the visual cue helps you understand what effect it’s having as you change it in real time. The X/Y pad is below on the left, and the little dot moves around as the joystick does.

[Screenshot: the X/Y pad in the Mainstage Workspace]

The big blue strip along the bottom shows the Playback audio (the start is fairly quiet, which is why it looks bland). I plan to edit and add more features to the Workspace soon.

And here’s a picture showing all the mapping that I have now completed.

[Screenshot: the completed mapping]

Since I last posted, I’ve decided to sign up and do a demonstration performance of my installation, performing Genesis live – so that should be pretty fun!

And here’s a video of how it was working before I added in all these extra instruments! Hope you enjoy!

As always, if you have any questions or comments please go for it.

Blessings – Jonno

Technology in Music Education Task 2: Update #3

Okay! So much to report here all of a sudden. Yesterday I had both a discouraging and victorious day regarding this assignment.

I spent a large portion of the morning troubleshooting on my computer only to find that somehow I had managed to delete Logic Pro (how does that even happen?). A lot of time vanished while I was trying to fix this, which only left me with an hour to try things out before going to work in the afternoon – at which point I discovered another weird problem. Anywho, the point of the story is that not much happened on the computer, but my brain has been hard at work!

But! After work in the evening, I made a lot of progress in a relatively short amount of time, with very little (possibly no) troubleshooting at all. In fact, I was able to set up most of my installation. As you will have read, I will be using Genesis (a recent composition of mine) as the experimental platform to demonstrate the skills I have taught myself and the creative potential of this interactive installation. I’ve decided to use Mainstage as the host for the installation. If you’re not familiar with Mainstage, it is basically a live version of Logic, designed for performing.

Here’s a photo of all the controllers (including the Wii Balance Board and my iPad) that I have access to and am considering using. I have a feeling I won’t be able to use all of them, but I’ll let you know how many I can use without cluttering the interactivity.

[Photo: the available controllers]

The first thing I achieved was muting a large number of tracks in the final mix of Genesis (in Logic) to make a backing track that I can layer interactive instruments on. I then opened the Playback plugin in Mainstage and inserted the newly bounced audio file.

[Screenshot: the Playback plugin]

The next step was to open each instrument (that I had muted in the final mix) in new channel strips within Mainstage. The interactivity will come from mapping the controllers that I have to the various instruments and their effects.

In my composition Genesis there are a couple of sounds that I made using The Mangle, a granular synth plugin. I sampled my mum washing up one night (thankfully I didn’t damage my mic!) and opened the audio sample within The Mangle. Taking little (editable) grain-sized chunks of the gurgles and sloshes of washing up, The Mangle can play them back at any pitch or speed, reversed, panned, etc. The crosshairs you can see mark where the grain is playing back from in the audio file. I also randomised where in the file the grains play back from, as well as the volume and reversibility of the grain playback. This creates a pretty unpredictable sound, especially when the source sound is water.

[Screenshot: The Mangle]
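To give a rough idea of what’s happening under the hood, here’s a toy Python/NumPy sketch of randomised grain playback. The grain length, gain range and reversal probability are all made-up values, and the real plugin does far more (pitch shifting, panning, envelopes):

```python
import numpy as np

def granulate(source, grain_len, n_grains, rng=None):
    """Naive granular resynthesis: concatenate grains taken from random
    positions in `source`, each with random gain and random direction."""
    if rng is None:
        rng = np.random.default_rng()
    grains = []
    for _ in range(n_grains):
        start = rng.integers(0, len(source) - grain_len)  # random playhead position
        grain = source[start:start + grain_len].copy()
        if rng.random() < 0.5:              # random reversal
            grain = grain[::-1]
        grain *= rng.uniform(0.3, 1.0)      # random volume
        grains.append(grain)
    return np.concatenate(grains)

# Stand-in for the washing-up recording: one second of noise at 44.1 kHz.
sloshes = np.random.default_rng(0).standard_normal(44_100)
texture = granulate(sloshes, grain_len=2_048, n_grains=32)
```

Because every grain’s position, level and direction are rolled fresh, no two passes over the same recording sound alike, which is exactly the unpredictability I was after with the water sounds.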

It’s pretty limitless. From there I mapped the rate of the grain playback, as well as its pitch, to the Mod Wheel MIDI CC – which is CC 1.

There are standard MIDI CC messages which most software conforms to – here’s a link to a handy site that explains what each one typically does.
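If you’re wondering what a CC message actually is, it’s just three bytes on the wire. Here’s a quick Python sketch that builds one (the helper function is my own, purely for illustration):

```python
# A MIDI Control Change message is three bytes: a status byte (0xB0 plus
# the channel number), the controller number, and the value (0-127).
# CC 1 is the Mod Wheel by convention, which is why The Mangle listens to it.
def control_change(channel, controller, value):
    if not (0 <= channel < 16 and 0 <= controller < 128 and 0 <= value < 128):
        raise ValueError("channel/controller/value out of range")
    return bytes([0xB0 | channel, controller, value])

# Mod Wheel (CC 1) on channel 1 at value 96:
mod_wheel = control_change(channel=0, controller=1, value=96)
```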

The next picture shows the MIDI mapping from within The Mangle.

[Screenshot: MIDI mapping within The Mangle]

This is all pretty cool if you have a keyboard that is already sending MIDI CC 1 from its Mod Wheel. But I am only halfway there, as I want to control this with a Wii Remote. This is where OSCulator comes into play. When you open OSCulator for the first time it looks like this. Pretty blank.

[Screenshot: OSCulator on first launch]

Having automatically found the Wii Remote via the sync button, OSCulator offers a number of ‘events’ that each button or motion control can trigger. If you’re using OSCulator for the first time, make sure you press every button on your remote so that the program knows about them. It will automatically know about the motion controls, however, as they are always sending data. In the picture below, you can see that I have two Wii devices connected: one is in fact the Wii Balance Board, and the other is a standard Wii Remote connected to a Guitar Hero Guitar attachment.

The reason for the Wii Balance Board is that I want participants in this installation to have a designated place to experience the music from. Looking at the picture below, you can see up the top that I have the ‘SUM’ message mapped to the MIDI note D1. This means that whenever someone steps on the board, it will sense their weight and play a D1 on an instrument. I then mapped the top-left and top-right regions of the board to two other notes, so when the person shifts their weight on the board, different notes/pitches will play. However, these pitches aren’t really going to be heard, as the sound being triggered is going to be water sloshes.

[Screenshot: Balance Board mappings in OSCulator]
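In plain terms, the board logic boils down to a threshold test on the weight sum. Here’s a hypothetical Python sketch (the threshold is illustrative, and the note number 38 for D1 assumes the convention where C1 = 36 – note numbering conventions vary):

```python
D1 = 38  # MIDI note number for D1, assuming the convention where C1 = 36

def board_event(sum_kg, prev_sum_kg, threshold=5.0):
    """Fire a note-on for D1 when the board's total weight ('SUM') rises
    past the threshold, and a note-off when it drops back below."""
    if prev_sum_kg < threshold <= sum_kg:
        return ("note_on", D1)
    if sum_kg < threshold <= prev_sum_kg:
        return ("note_off", D1)
    return None  # no change: nobody stepped on or off
```

The corner regions work the same way, just with their own pressure readings and their own notes.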

I got really excited when I started playing with the actual Wii Remote events, as I mapped the motion ‘Pitch’ event to the MIDI CC 1 value. This means the Wii Remote effectively just became the Mod Wheel of a keyboard and instantly started sending value messages to The Mangle’s grain pitch and rate playback. The graph below shows the values (up to 127) of the Pitch event being changed by the Wii Remote’s motion within OSCulator (angling the remote up and down).

[Graph: Pitch event values in OSCulator]
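The scaling OSCulator does here is just a linear map from the remote’s tilt angle onto the 0–127 CC range. A sketch of the idea, assuming the angle is normalised to ±90 degrees (OSCulator’s actual normalisation may differ):

```python
def angle_to_cc(angle_deg, lo=-90.0, hi=90.0):
    """Map a Wii Remote pitch angle (degrees) onto the 0-127 MIDI CC range,
    clamping anything outside the lo..hi span."""
    clamped = max(lo, min(hi, angle_deg))
    return round((clamped - lo) / (hi - lo) * 127)
```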

All this together creates a morphing, ambient sloshing and gurgling of water that is controlled entirely by standing on the Wii Balance Board and angling the Wii Remote up and down – which means it can be played by anyone. Totally cool. At this point I bounded around the house in victory.

But I wasn’t finished yet! I also had plans for the Pad sound that I had created. Sampling my own voice, I was able to use The Mangle to create a very smooth synth pad sound by editing the Attack, Sustain and Release of the grain playback. As I used only 4 chords for the Pad sound, I decided to map each chord to a different button on the Guitar Hero Guitar.

[Screenshot: the Pad sound in The Mangle]

[Photo: the Guitar Hero Guitar]

Using the Chord Trigger MIDI FX in Mainstage, I was able to map a different key to play each of the four chords. It looks like this (below). Hit the Learn button, choose the note you want to trigger the chord with on the blue keyboard (top), then choose the notes that you want in your chord on the orange keyboard below. Don’t forget to hit the Learn button again before moving on to the next chord.

[Screenshot: the Chord Trigger MIDI FX]
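Conceptually, the Chord Trigger is just a lookup table from a single trigger note to a list of chord notes. A hypothetical Python sketch (the trigger notes and chord voicings here are illustrative, not my actual mapping):

```python
# One trigger note in, a whole chord out. Trigger notes sit high on the
# keyboard, as in my setup; the chords are made-up examples.
CHORD_MAP = {
    96: [48, 52, 55],  # C major
    97: [45, 48, 52],  # A minor
    98: [41, 45, 48],  # F major
    99: [43, 47, 50],  # G major
}

def chord_trigger(note):
    """Replace a trigger note with its chord; pass other notes through."""
    return CHORD_MAP.get(note, [note])
```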

The reason I chose notes so high on the blue keyboard is that I haven’t yet found a way to make OSCulator send to a different MIDI input. This meant that as I played the pad sound, the water gurgles were also being triggered by the Guitar Hero notes. I went back to The Mangle and adjusted which keys are allowed to play the gurgles so that I didn’t get any overlaps. This is my quick fix, but if you happen to know a solution please comment below, as it will only get messier to program as I add more sounds.
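My quick fix amounts to giving each instrument its own slice of the keyboard, something like this sketch (the ranges are illustrative):

```python
# Everything arrives on one MIDI input, so each instrument only listens
# to its own slice of the keyboard. These ranges are made-up examples.
KEY_RANGES = {
    "gurgles": (36, 59),   # water sloshes kept low
    "pad":     (96, 108),  # chord triggers pushed up high
}

def instruments_for(note):
    """Return the instruments that should respond to an incoming note."""
    return [name for name, (lo, hi) in KEY_RANGES.items() if lo <= note <= hi]
```

As long as no two ranges overlap, a note can only ever reach one instrument, which is exactly the behaviour I hacked together in The Mangle.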

So now I have the Wii Remote, the Balance Board and the Guitar Hero Guitar all working together! The really cool part is that all of these are playable by a single person at once, as the Wii Remote sits inside the Guitar Hero Guitar. So to adjust the ‘Pitch’ event in OSCulator, all you have to do is pretend to be a rock star with the ‘Guitar’ and angle it up and down. Combine that with standing on the Board and you can control the sloshes and gurgles whilst playing the Pad chords with the coloured buttons under your left hand.

I want as much of this installation as possible to be ‘computer free’, so I came up with the idea of mapping a single button on the Wii Remote to start the backing track. That way the entire performance can be controlled without touching the computer. So I mapped the Home button on the Wii Remote to the Playback plugin’s Play/Stop button in the assignments/mappings menu in Mainstage. Below is a picture. See how on the left there’s a column with ‘OSCulator Out’? Under channel 1, I have set the MIDI CC number to 21. Following on to the right, you can see it’s mapped to the Play/Stop Switchable button.

[Screenshot: the assignments and mappings menu in Mainstage]

In OSCulator, the event I chose was a MIDI CC Toggle event, and I also had to choose CC 21. However, you have to be careful when mapping it within Mainstage, as it will automatically assume you’re trying to map the motion controls – they are always sending messages, remember? So you have to enter it manually within Mainstage to only look for MIDI CC 21 and nothing else. To do that, click on ‘Ctrl 21 (21)’ and edit it in the little menu that pops up.
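The toggle behaviour itself is easy to picture: each press of the Home button flips CC 21 between the usual off/on values of 0 and 127. A sketch of the idea:

```python
class CCToggle:
    """Each button press flips between CC value 127 (on) and 0 (off) -
    the pattern a switchable Play/Stop mapping responds to."""
    def __init__(self, controller=21):
        self.controller = controller
        self.on = False

    def press(self):
        """Flip state and return the (controller, value) pair to send."""
        self.on = not self.on
        return (self.controller, 127 if self.on else 0)
```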

I’m really excited to share this! The next things I plan to do are to add in some of the other effects that I have in Genesis, such as the Thunder (door slams), Tongue Clicks, and Dishwasher Hits. These could prove difficult, as I’ll have to choose notes that aren’t being used by any other instrument, unless I find a way to send another MIDI input from OSCulator into Mainstage.

I also plan to make a poster or something similar to accompany the installation showing how it works so people aren’t so lost when they have a go.

I think that’s everything for now, I’ll let you know how the rest is going shortly!

Peace – Jonno

Composition in Music Education Task 3: Update #6

I am really pleased to be able to say that I have finished the scoring of Homecoming! Apart from drums, guitar is probably up there as one of the most interesting/difficult instruments to notate for. I settled on a kind of hybrid between a full music score and a lead sheet.

Here’s what I mean:

Guitar Notation Sample

It looks a little cluttered here because I wanted to show you both of the two main strumming patterns in the piece. At the start of the score I’ve created a little key stating what the arrows mean, but essentially it’s this: down arrows are down strums and up arrows are up strums. Having notated the rhythm and provided the chord shapes, there should be enough information to play what I intend on a normal acoustic guitar.

Here’s a sample of what the final score looks like with all the parts together (just before the final chorus):

Homecoming - Score Sample

In both the piano and the guitar parts, I’ve provided the chords above the music. This is so that if I ever use this piece in a classroom setting, students who cannot read music well can follow the chords. Also, the instrumental lines could be transposed for any melodic instrument that might be in the class.

I think it is safe to say that, on the face of it, there is little resemblance between Pastance’s Skye Boat Song (the basis of the composition task that is in my iBook) and Homecoming. Of course the chords, structure and key are the same (that was the task), but it is now an original composition in its own right. I hope you enjoy it.

If you have any questions or anything you’d like to add, feel free to comment below.

Peace – Jonno

Composition in Music Education Task 3: Update #5

People compose in very diverse ways.

Up until this year, I would never have considered myself a composer. Not really… A great lover of many kinds of music, yes, but not really a composer.

I remember when I was in my last year of school and had to compose my Core Composition for Music 2. Over the preceding six months I had toyed with quite a few different ideas, but nothing had really settled until very close to the draft deadline. Thankfully, on that occasion everything fell together and I quite enjoyed the process and the finished product.

It’s interesting to observe how differently different people compose. It was my observation that most people in my class began with manuscript and pencil, or with Sibelius/Finale. More recently I have found it more helpful to compose through a process of jamming, either by myself or with others (see my earlier update for this assignment).

Having finished the recording, I find myself at the notating process. I’ve been using Sibelius to transcribe all the parts.

Here’s part of what I have so far:

Score Annotation 1

I have the basic structure of the song mapped out but still have a bit more to go. Just the other day I consulted someone I know who has plenty of experience with drum notation. He confirmed that what I had written was both readable and playable, so that was a relief. Drum notation is hard. I’ve been using this handy little resource as a guide (equally, you can just type ‘how to notate drums’ into Google and it’s pretty much the first link).

Currently I have the chords and chord shapes in the guitar part, but have yet to find a proper way of showing the correct strumming pattern. None of the backing vocal parts have been charted yet; however, the lead vocals (with lyrics), instrumental lines, bass and drums are all in.

Unless something comes up, by my next blog post I should have it all scored! Listen to the recording here.

Composition in Music Education Task 3: Update #4

I have officially finished the recording process! Hooray! I’ve uploaded it to Soundcloud so feel free to have a listen and let me know what you think.

I should clarify that the piece is exactly 3:00 minutes in length, but when I bounced the final track from Logic it added an extra second of silence, which is not part of my composition. So I am still within the requirements!

So how did I finish? Well, I had already decided on the structure (see last post) and had a rough idea of what I needed to do to complete it. I knew I needed some backing vocals and some instrumental lines, so I asked my mate Phil over to record the BV lines, as he has a great range and I also wanted a different vocal quality for those parts. I am really pleased with them, as they fill out the overall sound and add lots of presence. Once we finished tracking those, we recorded the Flugelhorn and Trombone lines. I decided against adding a saxophone line because, after hearing all the parts together, I felt there was no need for the extra layer and I was concerned about cluttering the arrangement.

Here’s a pic of the latest tracks and where they appear in the comp. On the left you can see the label for each part:

[Screenshot: the latest tracks in Logic]

You can see that there are 3 extra vocal harmony lines and an additional 3 ‘Ahhh’ vocal lines as well. I was inspired by ELO’s amazing backing vocals for the ‘ahhhs’.

As it turns out, I was able to subtly slip in the melody of Skye Boat Song. During the last chorus, the backing-vocal harmony part follows the same outline as the main chorus melody of Skye Boat Song. The words are ‘Thank-ful-ly I’m Fin-‘lly Home…‘ But it’s different enough that I don’t feel as though I’m directly ripping it off – which is good.

There will be an update soon on the notation process.

Peace – Jonno