
Sunday, 23 November 2014

Finding a solution

Recently, I was asked to be the sound recordist for a shoot in east London, for a filmmaker I have worked with previously. My standard procedure is to find out as much as I can about the shoot and base my approach on what I gather. Normally, I will use the standard boom/shotgun microphone arrangement, as the supercardioid directionality of the microphone allows for a more focused pickup and good rejection of ambient sound...not to mention it is the industry standard, widely used in production sound at all levels.


However, whilst testing the equipment prior to the shoot I discovered, to my frustration, that my newly acquired shotgun microphone was faulty. I went through the appropriate checks to determine whether it was in fact a technical fault or simple stupidity on my part:

  • XLR connected properly? 
  • Phantom power on? 
  • Positioned close to sound source? 
  • Gain adjustments? 
  • Audio interface fault? 

Sadly, it seems that there was an internal fault: the only achievable output was a lot of hissing and buzzing with the faintest hint of dialogue, whereas all the other microphones I checked produced crystal-clear audio.

This left me with a decision to make, as the shoot was an 8am start and there was next to no time to replace the microphone, having only just got it. There wasn't really any choice but to go with the closest possible alternative. I decided to use a large-diaphragm condenser microphone for two simple reasons. First, dialogue is well suited to a condenser, as they typically have a better response in the higher part of the frequency spectrum. The other deciding factor, which was out of my control, was that it was the only freestanding condenser available to me - the others being embedded in portable recorders.



This isn't the most desirable arrangement, as that microphone doesn't offer as much ambient rejection as a shotgun microphone will, but as it is normally used for vocal recording it was the logical choice. To add to the potential issues, the location chosen for recording was a very large space with little in the way of acoustic treatment; it would give most echo chambers a run for their money.


The fortune in this situation was the choice of camera shot*; it was quite a long shot that gave a basic perspective of the overall room size. Thus any natural reverberation picked up in the audio should seemingly match the environment on a psychological level. It wouldn't sound out of place, but it does limit the creative flexibility when editing. My preference is to get as clean a signal as possible - clear, crisp and devoid of any unwanted artefacts - regardless of location, and then add to it in the edit. This is obviously best achieved in an acoustically treated environment such as a recording studio, but it doesn't mean a clean signal can't be captured in an environment with more reflective surfaces. This is where shotgun microphones perform well, giving you that extra bit of ambient rejection that can make all the difference when editing.

*This also required a further compromise as the microphone had an area that it couldn't be placed in as it would have been visible to the camera. It ultimately ended up being further away from the performers than would normally be desirable, especially considering its lack of ambient rejection when compared to a shotgun microphone.

The moral of the story here is an obvious one: check your equipment regularly for faults. An equally important lesson, though, is knowing your alternatives, workarounds and every conceivable combination of the equipment available to you. It's always better to be prepared and have time on your side, but knowing what your options are when agreeing to a job should be at the forefront of your mind in the event of any faults, issues or breakages in the time leading up to the recording.

Thursday, 25 September 2014

Dynamics: Compressing the Issue

This post is entirely based on personal observation. 
There are no facts or figures just my own views and opinions. 

I have often enjoyed singing choral music because of the simply breathtaking sonic experiences it can produce. As anyone who has ever performed a piece of music should know, there is a lot to take in and comprehend on an artistic level; of course that is providing it has been notated properly. Whether it is the rhythm, tempo, or articulation - it all exists in the form presented in front of you because the composer wants to create an impression. 

This is his/her impression, but that is not to say it should be the one and only interpretation; after all, conductors make a living out of this very notion. One person's musical opinions are never any more right than the next person's, and we shouldn't forget that music is a subjective entity. However, there is a line between an interpretation and a reworking, albeit a subtle one.

Dynamics! - The markings that indicate loud, quiet and everything in-between...

On a basic level, yes, they are, but the word is also used to refer to the "stylistic and/or functional aspects of the execution of a given piece", as Wikipedia would have it known. This in itself is the very nature of artistic interpretation: demonstrating, through the contrasting use of techniques both "stylistic and functional", how to communicate the composer's work in your own unique manner. As a soloist this can be more apparent, but when performing as part of a collective - especially when there is more than one person delivering the same part - the artistic statement tends to be at the mercy of the conductor or leader.

Deep-rooted Issue

So what? Where are you going with this? Well, considering the importance of dynamic control in an artistically driven interpretation, why does it take so much effort to get performers to adhere to the directions set out by the composer? Obviously the more experienced and concert-savvy performers will be better at this, but I am getting at those who perform for the love of music, regardless of how adept they are. Surely 'note-bashing' does not bring the same sense of satisfaction that a well-crafted interpretation might. I have also felt this to be a deep-rooted issue - to the point that observing dynamic control is not as natural as it once was. So what is the problem?

The well-documented 'Loudness War' points an accusative finger. In music and other forms of media consumed by the mainstream of society, there has been an ever-growing battle to get the individual noticed amongst their competitors. As a result of mastering practices - not to mention the powerhouse music labels that demand them - a lot of mainstream music has become dynamically 'boring'. Loud and boring, what could be worse!

For those with well-trained ears, and for knowledgeable performers, this will have little impact, but it is for the more impressionable musical minds that the issue really hits home. If one listens to music over a lengthy period of time, it is only natural to assume that you will inevitably pick up traits and characteristics exhibited by that music. These may be either deliberate or involuntary, and could be down to a range of different things, but the point still stands: we are influenced by the music we listen to.

Intrinsic Mechanism

So what happens when you have a collective set of ears that are being influenced by this dynamically stunted music? The very concept of an artistic performance begins to suffer, and the music that relies upon this type of interpretation loses out. Some of the most emotive pieces of music would come across as empty, devoid of any intensity...there is more to dynamics than simply "sing/play louder than the last bit". Dynamics are a tool that can turn even the most simply constructed phrase or selection of words into a powerful mechanism that transforms the experience of the listener. Imagine the climactic build-up towards the end of a piece of music in which it gradually builds to a loud finale, but then take away that crescendo so that it merely stays at the same volume/intensity. The impact would be greatly diminished, if not completely destroyed.

I remember learning to play/sing and being constantly told to emphasise the music by acknowledging the dynamic markings instead of playing it over and over at the same monotonous level, and it used to bore me senseless. Now, being a little wiser and more artistically minded, I have come to appreciate that it often takes time to develop a respect for, or perhaps an instinct for, dynamics that allows one to utilise the powerful tool they are. Whilst I hold slightly depressed views of the state of music - probably unfairly so - I am aware that it isn't a natural instinct for everyone, and that even with training it still takes time to develop.

I am merely making an observation based on conversations that I have had over recent years and it dawned on me that there might be a link...

If you agree or not, it would be good to hear other opinions and thoughts on the matter.

Friday, 25 July 2014

MusicLab - Hi5! - Arduino and Pure Data

This post is a direct follow-on from previous posts covering MusicLab and 
might be better understood having first read the preceding articles.

What do Arduino and Pure Data have to do with Hi5!?

Hi5! is dependent on a MaKey MaKey and Pure Data in order to function properly; without them, it would merely be a large piece of wood, acrylic and metal - more decoration than exploration. To simplify the connection between the three, try to visualise it as follows:

(c) JAllen_SD
  • Computer 
    • running Pure Data 
    • [USB 2.0  to USB Mini]

  • MaKey MaKey 
    • identified as Human Interface Device
    • [MaKey MaKey inputs/ ground to...]

  • Hi5! board



The computer is the nerve centre of Hi5!, as it hosts the Pure Data visual programming software - responsible for interpreting the participants' interactions - and all of the audio that is triggered as a result of these interactions and interpretations.


The MaKey MaKey is the device that forms the conceptual backbone of Hi5! This clever piece of hardware works on the principle of 'completing the circuit' - very much like turning a light on and off with a switch - but with the added benefit of being programmable to output any ASCII character (or practically any symbol you can generate with your computer keyboard).


(c) makeymakey.com, 2014

The Hi5! board itself is 'the face', 'the UI' or the graphical interface that the public is able to interact with, whilst simplifying the experience and providing a larger, more accessible piece of hardware - the MaKey MaKey is small and mighty, but delicate all the same and may not withstand being continually manhandled by 30+ people an hour.


Arduino and MaKey MaKey

Having established that the MaKey MaKey is identified as a HID by the computer and that it is also programmable, you may be wondering what, if anything, was actually programmed into it.

It all relies upon the features of the MaKey MaKey and the potential that they can offer.


(c) sparkfun.com, 2014
Underside view
The device used has 18 inputs:

  • 12 on the back
    • two rows of 6 inputs on female header strips  
  • 6 on the front
    • represented by the arrow pad, space and click

(c) sparkfun.com, 2014
Topside view
There are various Ground inputs too:
  • 6 on the back
    • a single row of 6 inputs on a female header strip
  •  6 inputs on the front via alligator/crocodile clip connection

By default - if you were to connect the MaKey MaKey to the computer and use it to control an application, e.g. word-processing software - the device would output the following values:

Underside-Left inputs:
  • w, a, s, d, f, g
Underside-Right inputs:
  • 'up', 'down', 'left', 'right', 'left-click', 'right-click'
Topside inputs:
  • 'up', 'down', 'left', 'right', 'space', 'left-click'

These default values can be changed, as mentioned, to any ASCII character. The choice of values is important for Pure Data in that, for the concept to work, there could be no repeated characters - they all had to be unique. This is where Arduino comes into play.

You can download the Arduino software, sketch and settings files that will allow you to edit the output values for the MaKey MaKey. The files should look something like the following images:

[settings.h] - output values
[makey_makey_1_4_1] - default sketch
Under the [settings.h] tab you can view the default values assigned to each pin, and changing them is simpler than you might think. To change a value, all you need to do is highlight the character, or click just after it, then type in the new character you wish to be output upon triggering.

Once you have finished assigning new characters, the device can be updated. Checking that the sketch is accurate - by clicking the green tick in the Arduino software - will confirm that there aren't any mistakes; clicking the green arrow will then upload the new values to the MaKey MaKey. Note: the MaKey MaKey must be connected to the computer via USB and selected via the correct port in Arduino before any updates can be applied.

For reference, the choice of values was as follows:
  • 4 of the inputs were assigned the values of '1', '2', '3' and '4' respectively 
    • These were the 'technique' identifiers - bow, wind, hand, mallet
  • The other 14 inputs were assigned alphabetical values starting from 'a' and continued until 'n'
    • These were the 'instrument' identifiers
This kept it all fairly simple, but crucially ensured that each character was unique and non-repeating.
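As a sketch of this assignment scheme, the mapping could be represented as follows. The pin labels here are illustrative placeholders, not the actual identifiers from the MaKey MaKey's settings.h file:

```python
# Hypothetical sketch of the Hi5! character assignments described above.
techniques = {
    "bow": "1",
    "wind": "2",
    "hand": "3",
    "mallet": "4",
}

# 14 instrument identifiers, 'a' through 'n'.
instruments = {f"instrument_{i + 1}": chr(ord("a") + i) for i in range(14)}

all_values = list(techniques.values()) + list(instruments.values())

# The crucial property: all 18 output characters are unique.
assert len(all_values) == len(set(all_values)) == 18
```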


Pure Data

(c) fernadoquiros, 2014
Pure Data is visual programming software developed by Miller Puckette; alongside Max/MSP, it is another example of dataflow programming within a graphical environment (Pd, 2014).

It is the interpreter and brain of Hi5!, translating the values it receives via the MaKey MaKey (HID) and applying a set of logic-based rules before triggering an audio file that relates to the combination of 'technique' and 'instrument' chosen by the participant.

The Pure Data patch was quite simple in its conception and was organised in a logical manner: columns of 'techniques' and rows of 'instruments'. Each corresponding sub-patch was identical to the rest, with exceptions occurring where a particular sub-patch had more audio files.

Despite the incoming values from the MaKey MaKey being alphanumeric ASCII characters, Pure Data requires its own interpretation of these values. Each ASCII character - case sensitive - has its own unique number within Pure Data - for example 'a' = '97', whereas 'A' = '65'. To determine the Pure Data equivalent values, I used the [key] object connected to a number box, which displays the number associated with a particular ASCII key when triggered.

Pd - working out the ASCII character equivalent 
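These key numbers are simply the ASCII codes of the characters, so they can be checked outside Pure Data too. For example, in Python:

```python
# ASCII codes are case sensitive, matching the values Pd's [key] object reports.
print(ord("a"))  # 97
print(ord("A"))  # 65
print(ord("1"))  # 49 - the value tested by a [sel 49] object, for instance
```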

Once these numbers were known they could then be used to establish the rules for triggering the audio files. 

Each sub-patch took on the following form:
  • Combination 'gate'
    • Random selection sub-patch
      • Audio sub-patch

The signal chain for all of these patches is linear, in that the logic principles behind each sub-patch must be satisfied before the next sub-patch can trigger...right up until the audio plays.

Pure Data logic

To aid the visualisation of how Pure Data works, consider this example of the most common data-comparison object used in the Hi5! patch:

  • Once triggered, the signal reaches an object - i.e. [sel] - that compares the values it receives. 
  • It operates based on what data you give the [sel] object to begin with:
    • If left blank, it will allow for two inputs of data to be compared.
    • Similarly, if only one number is entered - i.e. [sel 49] - you will also get two inputs, but in the case of Hi5! only one input was needed in this scenario.
    • If more than one number is entered - i.e. [sel 1 2 3] - there will only be one signal input, but there will now be four signal outputs, each triggering (if desired) when the incoming data matches the stored data - the 1, 2 or 3 - associated with that output.
  • Once the object conditions are met, it can be designed to output a trigger, also known as a 'bang'.
    • This can be altered to trigger a different output, if needed, by adding a number object that is inserted into the signal chain before being triggered by the preceding 'bang'.
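As a rough Python analogue of the [sel] behaviour described above (this is a simulation of the comparison logic, not real Pure Data code; the function name is illustrative):

```python
def sel(stored, incoming):
    """Simulate Pd's [sel] object: return the index of the outlet that
    would fire. Outlets 0..n-1 correspond to the stored values; the
    rightmost outlet handles any non-matching input."""
    for i, value in enumerate(stored):
        if incoming == value:
            return i          # matching outlet bangs
    return len(stored)        # rightmost outlet: no match

# [sel 1 2 3] has four outlets: one per stored value plus a reject outlet.
print(sel([1, 2, 3], 2))  # 1 (second outlet bangs)
print(sel([1, 2, 3], 9))  # 3 (rightmost outlet: no match)
```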

The Combination 'gate'

This compares the incoming data - from 'technique' and 'instrument' - against the predefined values in order to 'open' the gate, thus triggering the next stage of the chain.

Combination 'gate'

In order for Hi5! to operate correctly, the Ground nodes are housed as part of the 'technique' nodes and control whether the 'gate' opens or closes. Using the [key] and [keyup] objects, the [sel] object outputs a '1' when the correct values have been triggered, continuing the signal flow. If the 'technique'/Ground node is released, the [keyup] function 'closes' the 'gate' by triggering a '0' output that conflicts with the conditions needed to satisfy the [sel] object. This feature also stops multiple techniques from triggering audio when released, as it requires the node to be reactivated to open the gate again.
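A minimal Python sketch of this gate behaviour (a simulation under my own naming, not the actual patch): pressing a technique node opens the gate, releasing it closes the gate, and an instrument value only passes through while it is open.

```python
class CombinationGate:
    """Simulates the [key]/[keyup] gate: a technique node must be held
    down for an instrument value to pass through."""

    def __init__(self, technique_char):
        self.technique_char = technique_char
        self.open = False

    def key_down(self, char):   # analogous to [key] -> [sel] outputting a '1'
        if char == self.technique_char:
            self.open = True

    def key_up(self, char):     # analogous to [keyup] 'closing' the gate
        if char == self.technique_char:
            self.open = False

    def trigger(self, instrument_char):
        return instrument_char if self.open else None

gate = CombinationGate("1")     # '1' = one of the technique identifiers
gate.key_down("1")
print(gate.trigger("a"))        # 'a' passes: the gate is open
gate.key_up("1")
print(gate.trigger("a"))        # None: closed until the node is pressed again
```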

Random selection sub-patch

Once the conditions have been satisfied, the path is then randomised for the benefit of the audience to ensure that the choice of audio clip does not appear sequential or repetitive due to a cyclical progression. 

Randomiser
The [shuffle 1 6] object randomly selects a number in the range 1 to 6; if this number satisfies the [sel] object to which it has been assigned, the signal flow continues through that path. Each path leads to a different audio sample - hence the desire to randomise in the first place.
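The exact behaviour of [shuffle] depends on the external library providing it; a common reading, consistent with the aim of avoiding repetitive clips, is that it draws each value once per cycle, like a shuffled deck. A rough Python analogue under that assumption:

```python
import random

def shuffle_range(lo, hi):
    """Rough analogue of a no-repeat randomiser such as [shuffle lo hi]:
    every value in the range appears once, in random order, before any
    value repeats."""
    while True:
        deck = list(range(lo, hi + 1))
        random.shuffle(deck)
        yield from deck

picker = shuffle_range(1, 6)
first_cycle = [next(picker) for _ in range(6)]
print(sorted(first_cycle))  # [1, 2, 3, 4, 5, 6] - each value drawn exactly once
```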

Audio sub-patch

When the inlet of this sub-patch is triggered, it simultaneously opens and plays the associated audio file. The volume and, consequently, the duration of this audio file are controlled by another sub-patch that triggers the fade-out. Each fade-out lasts 4 seconds.

Audio sub-patch
Fade out sub-patch
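The 4-second fade-out can be pictured as a simple linear envelope from full volume down to silence. This is an illustration only (the function and its parameters are my own, not objects from the patch):

```python
def fade_out(duration_s=4.0, steps=8, start_gain=1.0):
    """Sample a linear fade from start_gain to silence over duration_s
    seconds, at a handful of control points."""
    return [
        (round(i * duration_s / steps, 2),
         round(start_gain * (1 - i / steps), 3))
        for i in range(steps + 1)
    ]

for t, gain in fade_out():
    print(f"t={t:>4}s  gain={gain}")
# gain falls from 1.0 at t=0.0 to 0.0 at t=4.0
```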

Each main sub-patch follows the same structure, varying only in the number of audio files used. I hope this outline helps to convey the concept of the programming, despite not going through and explaining every last function, object by object.




Easter Egg

Blame popular culture for this...once the notion of including such a feature was deemed possible, it was an inevitability. It is a fun addition to the Hi5! board, but not an obvious one as it is based on the methodology for triggering existing sounds. Whilst I will not explain exactly HOW to trigger the easter egg, I will leave the following images of the sub-patches so that the more observant of you can piece together how it is achieved. That said, anyone who has a good knowledge of Pure Data could probably work it out just from the data within the images and the corresponding MaKey MaKey assigned values. You may also fluke it...in which case, good for you! :-)

Easter egg main sub-patch

Easter egg combination selector

Easter egg randomiser and file selection/fade control

I hope this has been an informative read and, whilst the Pure Data programming wasn't the most beautifully constructed or streamlined it could have been, I hope you can appreciate Hi5! for what it is. I enjoyed undertaking this project and have learnt more than I could have hoped. I should finish by thanking the Philharmonia Orchestra for providing the opportunity to be part of the overall MusicLab installation, but principally for the trust and support that allowed Hi5! to become what it has.

Thanks for reading,

Friday, 20 June 2014

MusicLab - Hi5!

What is Hi5!

'Hi5!' is one of the interactive features within the MusicLab trailer that will be touring the south-west of England as of April 2014. For more about MusicLab itself, follow the link below:


...or if you would like to know more about my own personal involvement, you can read my previous post, in which I provide an overview of MusicLab, here.

In its basic form, Hi5! is an exploration of the instruments of the orchestra through the medium of sound, but with the added twist that the participant must choose the instrument and how it should be played. There are 14 instruments accompanied by 4 methods of playing them - referred to as 'techniques', for the sake of simplicity - all represented by two-dimensional stencil images.


…but why is it called Hi5! then?

As it is an interactive feature, there must be some way for the participant to engage with it…put simply, you literally trigger the sound. Each instrument and technique image has a corresponding metal 'node' that connects to an Arduino-based human interface device (HID) that goes by the name of MaKey MaKey.

The MaKey MaKey HID - (c) makeymakey.com, 2014
This device is recognised by computers in the same way a keyboard would be, and as a result it can be used to input letters, numbers or symbols that you might find on any standard computer keyboard. The MaKey MaKey is designed to output values (via its USB connection) as a result of closing a low-voltage circuit. The circuit can be completed using metal wires or any conductive material, but what makes it more fun is that you can be a part of this conductive chain. In fact, a chain of 10 people - probably more - will work, and it is completely safe too.


As a result of this, the name Hi5! was derived from the suggested means of closing the circuit when used by two people:

"Choose an instrument, Decide how to play it…and high five!"


(c) makeymakey.com, 2014

Design

Hi5! is, in its most basic form, a wooden board with an acrylic surface that is embedded with conductive metal nodes. It is roughly 1.5 metres tall by 1 metre wide and can be mounted onto a wall with the use of brackets at the back. 

Whilst I had a high level of involvement in this part of MusicLab, the design was ultimately decided upon by the production manager and finalised by a freelance designer. The board itself was constructed by 3FD, and I found myself in an advisory role in which suggestions and critical features were put forward to ensure that the board would work as intended. The main risk with Hi5! was that it was largely based on theory until we could get our hands on the board and prepare it for MusicLab. I did extensive testing, which provided the basis of all the suggestions put forward to the designers, but the success of the board was dependent on how it would turn out once constructed.

As mentioned above, the instruments and techniques are represented as two-dimensional stencils, each with a corresponding conductive metal node. The nodes all look the same, except that the technique nodes have an extra inner element that is critical to the operation of Hi5! I will explain its role in more detail in a future post, as it is software related, although it should be noted that it is an independent connection that has to remain insulated from the surrounding node in order to operate correctly.

The Hi5! board

The Instruments

The more observant readers might have realised that there are in fact more instruments in a typical orchestral setup than the fourteen featured on the Hi5! board. The number of instruments was limited by several factors, such as the technology - the MaKey MaKey only allows for 18 inputs - and the physical dimensions - considerations had to be made to keep the board accessible to people of all ages and abilities - but crucially, not all instruments lend themselves well to the concept of Hi5!

For an instrument to be considered for the board, it had to be possible to play it, or generate sound from it, using the following methods or techniques:
  • Hand
  • Sticks/ Mallets/ Beaters
  • Bow
  • Wind
To give you a better idea of how this concept lends itself well to some instruments, but less so to others, consider this: 

A member of the percussion section, such as a metallophone, can be played using mallets - the traditional method - or, if like the Vibraphone it has a motor and resonating chambers, can produce an effect similar to the vibrato of a wind instrument - hence the link to the wind technique. There is also a unique method of playing an instrument like the Vibraphone: using a bow - a lovely sound achieved by bowing one end of a tuned bar - and perhaps the slightly less conventional practice of using the hand, whether to flick a tuned bar or strike it.

However...

A woodwind instrument, such as the Clarinet, only works well with the 'wind' technique, for the simple reason that wind/air/breath is the core element driving the sound of the instrument. For the other three techniques - mallets, bow and hand - I had to be a little more creative, or basic in some cases. As a result, the sound produced by a clarinet combined with the hand is the sound of keywork - something simple, but heard when sat in close proximity to a clarinettist - while the mallet and bow produce the similar sound of the Clarinet's wood being struck - not really a technique, but one that should tell the listener a small amount about the materials and density of the instrument.

The final choice of instruments also took into consideration the other features of MusicLab and what else was on offer in terms of instrument exploration. 'The Instruments' feature was the prime consideration here, as it includes a French Horn, Cello, Clarinet and Timpani. Hi5! includes none of these, with the exception of the Timpani, as 'The Instruments' offers a hands-on experience of each instrument, and I felt it would be best to leave them out to avoid minor repetition between features.

The full list of instruments included on Hi5! is as follows:

  • Tam-Tam
  • Tuba
  • Harp
  • Flute
  • Double Bass
  • Cymbals
  • Metallophone
  • Trumpet
  • Snare Drum
  • Bassoon
  • Timpani
  • Violin
  • Crotales
  • Oboe

Audio

Just a quick mention about the audio: the audio samples triggered by an interaction were either sourced or recorded by me.

A list of the instruments that I recorded:


  • Bassoon, Crotales, Cymbals, Double Bass, Harp, Vibraphone, Tam-Tam
The instruments that were sourced or sampled:

  • Flute, Oboe, Snare, Timpani, Trumpet, Tuba, Violin
    • Minor sampling: Crotales, Cymbals, Tam-Tam

N.B. Special thanks to Havering Music School for allowing me access to record a selection of percussion instruments.


Easter Egg

Inspired by various forms of media that have chosen to include an 'easter egg' or hidden feature, I have added a simple 'extra' of my own. If you would like to find out more about it then feel free to read the next post in which I will be discussing the back-end design of Hi5! That said, I won't be disclosing the exact details of the easter egg...that would be making things too easy!



Monday, 19 May 2014

MusicLab

What is MusicLab?

"MusicLab is a new mobile interactive digital installation. Housed in a 13.5m purpose-built trailer, the Philharmonia Orchestra is developing a flexible, pop-up orchestral experience that can reach targeted communities in Plymouth, Torbay and Cornwall. Using cutting-edge digital technology and innovative interactive design, MusicLab will offer a series of hands-on musical experiences and interactions, designed to put you in the shoes of a composer, performer, producer and conductor" - MusicLab, 2014



During the first four months of 2014, I am proud to say that I played a part in the development of this project. This was a critical period, as various features of the design were starting to be implemented. I managed to see the early stages of the refurbishment process in Leicestershire and was also present during the final days in Kent.

N.B. There are many aspects - such as funding, planning and other critical requirements - that I am skipping over due to their complexity (and limited relevance to this and future posts).

Over the course of the next few posts I will go into more detail about some work that I was personally responsible for, but for now I shall outline my basic involvement.

So, to work...

Refurbishment


Before the trailer reached its current state, the cabling needed to be run. This was particularly important, but it wasn't an easy task given the design of the trailer, the route the cables had to follow, and the number of cables involved. While I was assisting in Leicestershire, the majority of the cables were VGA or XLR, and these would run between the digital nerve centre and the various screens situated throughout the installation.

To aid your understanding of how this task was easier said than done, have a look at this photo of the trailer's interior during the re-fit: 



The cables had to run underneath the floor - as can be seen in the previous image - and this meant threading them through gaps or cavities that would allow them to run without damage. Unfortunately, this became tougher as more and more cables fought for the limited space, although it should be said that this hindrance was mainly down to the square shape of the connector heads.

Main Console



One aspect that will remain constant for all of the following sections was my involvement with the development of the concept. On a basic level this meant that I was someone to bounce ideas off, a different perspective, or a critical mind to assist in the process. This development would range from the technological elements to the visual design and layout, but ultimately the Production Manager would have the final say.

In terms of the main console, I was responsible for: 
  • Software upgrades and installations on the iPads
  • Sourcing a flexible iPad stand that would allow universal access regardless of age and ability
  • Sourcing 10inch LED monitors to accompany the iPads
On a more creative note, I was tasked with editing and exporting footage from a previous installation - Universe of Sound - so that it could be used as one of the modes in this section. As there are eight iPads on the main console, this meant editing different sections of the orchestra into one video per group, with the groups based on the section of the orchestra and whether the instruments were high or low pitched. For example: 1st Violins, 2nd Violins and Violas were grouped into 'High Strings', whereas the Cello and Double Bass sections formed the 'Low Strings'. 

As there are seven movements in 'The Planets' suite by Holst, this meant 56 different video edits were required prior to export using Apple ProRes 422 compression. A further eight edits were required due to the inclusion of Joby Talbot's 'Worlds, Stars, Systems, Infinity' as a direct follow-on to the final movement of 'The Planets'. 

The edits included synchronisation to an adapted audio track comprising the instruments featured in that video. As a result, the final video edits were organised as follows:

  • High Strings
    • 1st Violin, 2nd Violin and Viola
  • Low Strings
    • Cello and Double Bass
  • High Brass
    • Trumpet and French Horn
  • Low Brass
    • Trombone, Euphonium and Tuba
  • High Woodwind
    • Flute and Oboe
  • Low Woodwind
    • Clarinet (a compromise had to be met), Bassoon and Contra Bassoon
  • Percussion 
    • Percussion (various) and Timpani
  • Extra
    • Harp, Celeste, Organ and Choir (no visual edit)
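The edit count follows directly from this grouping: eight groups across the seven movements of 'The Planets', plus one extra edit per group for the Talbot piece. A quick sketch of the arithmetic:

```python
# The eight video groups listed above.
groups = ["High Strings", "Low Strings", "High Brass", "Low Brass",
          "High Woodwind", "Low Woodwind", "Percussion", "Extra"]
planets_movements = 7  # Holst, 'The Planets'

planets_edits = len(groups) * planets_movements
print(planets_edits)                  # 56 edits for 'The Planets'

talbot_edits = len(groups)            # one further edit per group for the Talbot piece
print(planets_edits + talbot_edits)   # 64 edits in total
```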
During the final few weeks in Kent, I assisted with the testing of the Main Console, acting as a willing 'would-be' member of the public.


The Instruments



This is by far the most interactive of all the sections, as it requires the participant to use a modified instrument to control the gameplay. Many challenges had to be tackled to achieve the desired end result, ranging from sourcing instruments to coding the gameplay. 

Initially I was supporting the search for instruments that would then be sent away to be fitted with the appropriate sensors thus enabling the instruments to interact with the game. However, I became more involved with the filming of the instrument tutorials that would form the backbone of the entire experience. The four instruments chosen were:

  • Cello
  • Clarinet
  • French Horn
  • Timpani


The tutorials for each respective instrument were shot at Waterloo Film Studios in London and it was my responsibility to capture the audio. This required two different sources that would then be synchronised and mixed, although the two would never be heard simultaneously:


  1. Dialogue (Røde NTG-1)
  2. Instrument audio (XY-Stereo pair: AKG C391 B)
The Røde NTG-1 went straight into one of the two HD cameras, whilst the stereo audio was captured on a portable Marantz recorder.


Video Bench



As with the Main Console, a mounting solution was required, and the same flexible iPad stand provided by Bouncepad was used for the Video Bench. For this section of the trailer the participants would be seated, and this was another area I assisted in developing: researching the layout and the cushions that would be used.

Chorus Booth



I used my knowledge of audio equipment and classical music to help progress this concept. The choice of microphone was important, as its properties would be crucial to capturing clear and crisp sound. As it happens, Chorus Booth incorporates a broadcast-standard dynamic microphone from Electro-Voice with a cardioid polar pattern. 

The choice of music and the structure of the interface were all debated and reasoned through to give the participant the best possible experience. The music had to be judged against various criteria, such as: 
  • How melodic is it?
  • What is the level of complexity?
  • Is it memorable?
  • Is it of a sufficient duration? (neither too long nor too short)
Of course, there were other questions, but those listed above outline the type of considerations relevant to the conceptual development.

Hi5!



I devoted most of my time to this section of the trailer, and as such I feel I must dedicate further posts to cover it in more detail. I will discuss the nature of Hi5! and how I took it from its original concept through to the finished article.

In future posts I shall provide an overview of Hi5!, a technical insight into the way it functions and all the factors that influenced the final design.

Wednesday, 20 November 2013

Analysis - 'A Minor Incident'

Title - A Minor Incident


Composer - Badly Drawn Boy (Damon Gough)




Structure and Lyrics


Introduction (4 chord outline)

Intro Progression (6-bars)

Verse 1 (8-bars)

There's nothing I could say
To make you try to feel ok,
And nothing you could do
To stop me feeling the way I do,
And if the chance should happen
That I never see you again
Just remember that I'll always love you

Intro Progression (6-bars)

Verse 2 (8-bars)

I'd be a better person
On the other side I'm sure,
You'd find a way to help yourself
And find another door,
To shrug off minor incidents
Make us both feel proud,
I just wish I could be there
To see you through

Intro Progression (6-bars)

Instrumental Middle 8 (8-bars)

Verse 3 (8-bars)

You always were the one
To make us stand out in a crowd,
Though every once upon a while
Your head was in a cloud,
There's nothing you could never do
To ever let me down,
And remember that I'll always love you

Intro Progression (6-bars)


Outro (4-bars + 4 off-tempo bars)


Lyric Analysis

The song was written as part of the soundtrack to the film ‘About a Boy’ and can be heard during a scene in which the young boy (Marcus) is reading a suicide letter from his mother. Gough stated that he “chose to write the words that the mother may have written in the note” (Peggah, 2007), and the most obvious emotion conveyed is unconditional love. This can be seen in the lyric “remember that I'll always love you”, which is repeated at the end of the first and last verses.

The lyrics are full of imagery and metaphor, much of which can be understood when watching the film, but in essence they depict a person who is the sole cause of happiness in life (“There's nothing you could never do, To ever let me down”) but who is ultimately helpless in preventing this tragedy (“And nothing you could do, To stop me feeling the way I do”). However, these words (out of the context of the film) could mean entirely different things to whoever is reading: the first verse could be about a person in love or, conversely, about knowing that they can never be with the person they are writing to.

Gough also mentions that when writing the song he was influenced by Bob Dylan and the song ‘Don’t Think Twice, It’s Alright’ (Peggah, 2007). When comparing both sets of lyrics, it becomes apparent just how similar they are in style: “…Where I’m bound, I can’t tell. But goodbye’s too good a word, gal. So I’ll just say, fare thee well…” (Dylan, 1963). Dylan’s lyrics were renowned for their powerful imagery, which is one of the reasons he has been so successful as a songwriter: “Come you masters of war, You that build all the guns, You that build the death planes, You that build the big bombs, You that hide behind walls, You that hide behind desks, I just want you to know, I can see through your masks” (Dylan, 1963).

There isn’t much in the way of a rhyme scheme, but there are occasions where the pattern incorporates those akin to Dylan’s style: Verse 1) AABBCDB; Verse 2) ABCBDEFG; Verse 3) ABCBDED. Compare these to the verse structure of ‘Don’t Think Twice, It’s Alright’ – ABABCCCD (Dylan, 1963) – and the similarities are there, though not as rigid as Dylan’s.


Melody and Harmony

The melody is simple but effective in that it creates tension through the use of an ‘inverted pedal-point’ (Walker, n.d.) before resolving with the rest of the harmony on the root note of ‘G’. Although around half of the melody is sung on the note ‘D’, it incorporates variation in the melodic phrasing by using a “common technique of arcing scales” (Fryer, 2012) that exploits the vocal range. The melody line is fairly syncopated, but with only one discernible rhythmic motif, which comes in the form of a repeated quaver-two-semiquaver pattern (with a tie between the last semiquaver and the next quaver). This motif develops over the course of the song, though only in a minor way, due to Gough having to fit the words to the melody.

Interestingly, the melody can be seen to feature contrapuntal motion techniques: at times when the ‘D’ is repeated in the melody, the ‘false’* bass line rises in ‘oblique motion’ (Drabkin, n.d.), and during the variation of the phrase the melody and the bass line can be heard in ‘contrary motion’ (Drabkin, n.d.). This has long been an effective tool in harmonic and melodic writing, such as in the numerous Chorales written by J. S. Bach (Marshall & Leaver, n.d.).

The song is in the key of G major and only incorporates the diatonic chords I, II, IV, V and VI, which again is very similar to the style of Dylan (Griffiths, 2009), although there are harmonic alterations to some of the chords, such as the addition of a 7th to the A minor chord and a major 7th to the C major chords.

* The bass line is described as ‘false’ to distinguish it from the general case in which a bass line is played on a separate instrument.
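As a quick illustration (standard theory, not taken from the song itself), the diatonic triads of G major can be derived by stacking alternate notes of the scale; the chords on degrees I, II, IV, V and VI used in the song (G, A minor, C, D and E minor) fall straight out of this:

```python
# G major scale; a diatonic triad on degree n stacks scale degrees n, n+2, n+4.
scale = ["G", "A", "B", "C", "D", "E", "F#"]

def triad(degree):
    """Return the diatonic triad on the given 1-based scale degree."""
    i = degree - 1
    return [scale[(i + step) % 7] for step in (0, 2, 4)]

# The degrees used in the song: I, II, IV, V and VI.
for degree in (1, 2, 4, 5, 6):
    print(degree, triad(degree))
# 1 -> G B D (G major), 2 -> A C E (A minor), 4 -> C E G (C major),
# 5 -> D F# A (D major), 6 -> E G B (E minor)
```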


Structure and Arrangement

Again there are similarities to ‘Don’t Think Twice, It’s Alright’ in particular (Dylan, 1963), as both include Guitar, Harmonica and Vocals. The song structure, outlined in the lyrics above, is simple but does feature four 6-bar guitar fills (first heard in the introduction) which help break the song into different sections. Apart from being filler between the sections, there are sonic markers that ease the transitions into the next section: in bars 4-5 the strum pattern changes to accentuate the chord before settling on the impending verse’s strum pattern in bar 6. Markers are common in songwriting; again referring to Dylan, a rhythmic marker can be heard in the drum roll that joins the instrumental section and verse in ‘All Along The Watchtower’ (RustyShackleford965, 2011).  

Keeping the arrangement simple, with little variation, allows for greater concentration on the lyrical content and what the songwriter is trying to communicate to the listener. This is accentuated further by the melody and harmony following the same structures, leaving only the lyrics to change. In essence, it’s as close as a song gets to being a Bob Dylan piece [example: Mr Tambourine Man (BobDylanTV, 2012)], barring Dylan actually writing it himself, which clearly doesn’t say much for its originality. 


Production

There are only three different timbres in the song: Voice, Acoustic Guitar and Harmonica. The Guitar is a constant backing throughout and provides the harmonic and rhythmic base for the entire song, whereas the Voice is the storyteller and the part that makes the composition a song (Flattery, 2012). The Harmonica is an instrument that, when heard, often evokes folk connotations, and it is the key link between the song sounding like just another acoustic piece and one that harks back to the traditional days of Folk. The author of the book on which the film is based seems to agree, mentioning the “wheezy Dylanesque (sic.) harmonica solo” (Hornby, n.d.).   

Dylan defined simplicity in his production: “Voice for words, guitar for chords…and harmonica for formal breaks” (Griffiths, 2009). ‘A Minor Incident’ is a confirmation of this style, especially given that the Harmonica features only in the middle 8 of the song. Gough has clearly kept to the conventions Dylan followed, at least as a solo artist, as the use of only one voice limits the potential for the vocal harmonies that were big features when artists like Dylan performed with folk singers such as Joan Baez (peacebreizh, 2012).

In terms of the recording, the voice and guitar occupy the same (predominant) frequencies, whereas the harmonica is the only timbre that appears in the higher registers, thereby creating a contrast. It also appears to be quite a dry mix with little noticeable reverb [compared to the songs that got him noticed and subsequently hired to compose the soundtrack (userfr, 2006)]; this creates a more intimate mix, as it gives the impression that the performer is close up, as was most likely intended. Surprisingly, it was the first time Gough had worked with a producer (Murphy, 2002), so it can be assumed that although the song may have been recorded in one ‘live’ take, modern techniques mean overdubs might be present.



References

BobDylanTV (2012) Mr. Tambourine Man (Live at the Newport Folk Festival. 1964). [Online Video] 11 September. Available from: <http://www.youtube.com/watch?v=OeP4FFr88SQ> [Accessed 16th October 2012].

Drabkin, W. (n.d.) Part-Writing. [Online]. Grove Music Online, Oxford Music Online. Available from: <http://www.oxfordmusiconline.com/subscriber/article/grove/music/20989> [Accessed 12th October 2012].

Dylan, B. (1963). Don’t Think Twice, It’s Alright. [Online]. bobdylan.com. Available from: <http://www.bobdylan.com/us/songs/dont-think-twice-its-all-right> [Accessed 8th October 2012]. [See sidebar for audio sample]

Dylan, B. (1963). Masters of War. [Online]. bobdylan.com. Available from: <http://www.bobdylan.com/us/songs/masters-war> [Accessed 18th October 2012]. 

Flattery, C. (2012) Song Writing Tutorial. [Personal Communication]. Leeds. Leeds Metropolitan University. [Date: 2nd October 2012].

Fryer, A. (2012) Song Writing Tutorial. [Personal Communication]. Leeds. Leeds Metropolitan University. [Date: 11th October 2012].

Griffiths, D. (2009) Dylan, Bob. [Online]. Grove Music Online, Oxford Music Online. Available from <http://www.oxfordmusiconline.com/subscriber/article/grove/music/08456> [Accessed 11th October 2012].

Hornby, N. (n.d.) The Books: 31 Songs: Extract. [Online]. Penguin Books. Available from: <http://www.penguin.co.uk/static/cs/uk/0/minisites/nickhornby/books/31s_extract.html> [Accessed 4th October 2012]

Marshall, R.L., Leaver, R.A (n.d.) Chorale. [Online]. Grove Music Online, Oxford Music Online. Available from: <http://www.oxfordmusiconline.com/subscriber/article/grove/music/05652> [Accessed 13th October 2012]. 

Murphy, J. (2002) Badly Drawn Boy – About A Boy. [Online]. MusicOMH. Available from: <http://www.musicomh.com/albums/badly-drawn-boy-2.htm> [Accessed 18th October 2012].

peacebreizh (2012) Bob Dylan and Joan Baez - It ain't me babe. [Online Video]. 13 July. Available from: <http://www.youtube.com/watch?v=O8A3BZAXMrQ> [Accessed 14th October 2012].

Peggah (2007) Badly Drawn Boy – A Minor Incident. [Online Video]. 10 March. Available from: <http://www.youtube.com/watch?v=vCW1QGpIy9g> [Accessed 28th September 2012].

RustyShackleford965 (2011) Bob Dylan – All Along The Watchtower. [Online Video]. 22 June. Available from: <https://www.youtube.com/watch?v=YanjY9CsPDQ> [Accessed 19th October 2012].

userfr (2006) Badly Drawn Boy - Disillusion (directed by Garth Jennings). [Online Video]. 25 September. Available from: <https://www.youtube.com/watch?v=B11msns6wPU> [Accessed 19th October 2012].

Walker, P.M. (n.d.) Pedal point. [Online]. Grove Music Online, Oxford Music Online. Available from: <http://www.oxfordmusiconline.com/subscriber/article/grove/music/21181> [Accessed 13th October 2012].