Posts Tagged ‘Network Performance’

RCM Laptop Orchestra

This year is my second and final year of the MMus in Composition & Composition for Screen at the Royal College of Music.  I’ve recently set up a group for the purpose of teaching and encouraging improvisation through experimentation and interactive graphic scores for performance on laptops.  The RCM Laptop Orchestra has now had two meetings and one rehearsal, and is off to a great start!  Keep up with our progress at:

http://www.MajorC.co.uk/

http://www.MajorC.co.uk/MajorC/RCM-Laptop_Orchestra/RCM-Laptop_Orchestra.html

Puppet Whispers – Communal Areas

Ok, so I haven’t got a lot of time to really explain this one as I’m currently fixing it for upcoming rehearsals, but here’s a sneak peek at the latest version of what has come to be known as ‘Communal Areas’.  The piece is composed in real time, but performance is controlled by an ‘active time bar’ which moves left and right according to the position of the musician to your right.  If this movement is proving too erratic for your poor classical mind, you may stomp your foot to ‘pause’ the active time bar for a maximum of 4 seconds.

Communal Areas: Graphic real-time notation

The bag, which appears to be full of little balls, gets heavier throughout the performance whenever a musician uses the ‘pause’ facility.  The dynamics of notes played during the piece should follow the position of the bag on the ‘dynamic gradient’, which forms the main method of structuring material during the piece.  Pitch content is the most improvisatory element, whilst dynamics are the most static.  Performers may choose never to stomp their feet, but then the piece would never end, and that wouldn’t be very musical now, would it?
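
For anyone curious about the mechanics, here’s a rough Processing sketch of that control logic as described above. It’s not the actual performance patch: the neighbouring musician’s position and the foot stomp are simulated with the mouse and keyboard, and the bag weights and dynamic markings are placeholder values.

```processing
// Sketch of the 'Communal Areas' control logic (a reconstruction, not the
// real patch). mouseX stands in for the neighbouring musician's position
// and any key press stands in for the foot stomp.

float barX;                   // position of the active time bar
int bagWeight = 0;            // grows every time a pause is requested
boolean paused = false;
int pauseStart = 0;
final int MAX_PAUSE = 4000;   // a pause lasts at most 4 seconds

void setup() {
  size(800, 200);
}

void draw() {
  background(255);

  // release the pause automatically after 4 seconds
  if (paused && millis() - pauseStart > MAX_PAUSE) {
    paused = false;
  }

  // the bar follows the neighbouring musician (smoothed a little)
  if (!paused) {
    barX = lerp(barX, mouseX, 0.1);
  }

  // draw the active time bar
  stroke(200, 0, 0);
  line(barX, 0, barX, height);

  // the 'bag of balls' sinks down the dynamic gradient as it gets heavier
  float bagY = map(constrain(bagWeight, 0, 20), 0, 20, 20, height - 20);
  noStroke();
  fill(0);
  ellipse(width - 40, bagY, 30, 30);
  text(dynamicForBag(bagY), width - 80, bagY + 4);
}

// map the bag's vertical position onto a simple dynamic marking
String dynamicForBag(float y) {
  String[] dynamics = { "pp", "p", "mp", "mf", "f", "ff" };
  int i = int(map(y, 20, height - 20, 0, dynamics.length - 1));
  return dynamics[constrain(i, 0, dynamics.length - 1)];
}

// a key press stands in for the foot stomp
void keyPressed() {
  if (!paused) {
    paused = true;
    pauseStart = millis();
    bagWeight++;              // every pause makes the bag heavier
  }
}
```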

The system being used for the composition of pitch streams will be facilitated by a faderfox micromodul LC2 (kindly provided by Chris McClelland).  The system allows me to use the fingers of one hand to control the pitch values whilst my other hand is free to press any of 4 buttons which input the notes into the real-time stream.

real-time pitch stream input using a faderfox LC2
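
In case it’s useful to anyone building something similar, here’s a rough sketch of that mapping in Processing using The MidiBus library. I’m assuming the LC2 faders arrive as ordinary MIDI controller changes and the four buttons as note-ons; the controller and note numbers below are placeholders rather than the real mapping.

```processing
// Rough sketch of the pitch-stream input: one fader shapes the current
// pitch, four buttons commit it into the real-time stream.

import themidibus.*;

MidiBus bus;
int currentPitch = 60;                 // value shaped by the fader hand
ArrayList<Integer> pitchStream = new ArrayList<Integer>();

void setup() {
  size(400, 200);
  MidiBus.list();                      // print the available MIDI devices
  bus = new MidiBus(this, 0, -1);      // input device 0, no output (indices will vary)
}

void draw() {
  background(0);
  fill(255);
  text("current pitch: " + currentPitch, 20, 40);
  text("stream length: " + pitchStream.size(), 20, 60);
}

// a fader move updates the pitch the fingers are currently 'holding'
void controllerChange(int channel, int number, int value) {
  if (number == 1) {                           // placeholder CC number
    currentPitch = int(map(value, 0, 127, 36, 96));
  }
}

// any of the four buttons commits the current pitch into the stream
void noteOn(int channel, int pitch, int velocity) {
  if (pitch >= 36 && pitch <= 39) {            // placeholder button notes
    pitchStream.add(currentPitch);
  }
}
```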

We’ll see how well it works!

Tantrum – Puppet Whispers

Here’s the latest version of ‘Tantrum’.  The piece features a musical ‘toy’ which up to 3 performers are allowed to manipulate at any one time.  The 4th musician is granted permission to play with the toy by having a ‘tantrum’ and stomping their feet, which in turn knocks out one of the original 3 performers.
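
For the curious, here’s a toy model of that access rule in Processing, as I understand it: at most three performers hold the toy at once, and a stomp from the waiting performer bumps out whoever has held it longest (the piece doesn’t strictly fix which performer gets knocked out, so that part is my guess). Keys 1–4 stand in for the four performers’ foot sensors; the real piece gets this data from the sensor rig via Max/MSP.

```processing
// Toy model of the 'Tantrum' access rule: three performers hold the toy,
// a stomp from the fourth bumps out the longest-standing holder.

ArrayList<Integer> holders = new ArrayList<Integer>();  // performers with the toy, oldest first

void setup() {
  size(400, 200);
  // performers 1-3 start with the toy
  holders.add(1);
  holders.add(2);
  holders.add(3);
}

void draw() {
  background(255);
  fill(0);
  text("holding the toy: " + holders, 20, 40);
}

void keyPressed() {
  int performer = key - '0';
  if (performer < 1 || performer > 4) return;
  if (holders.contains(performer)) return;   // already playing with the toy

  // a tantrum: the waiting performer stomps in and the longest-standing
  // holder is knocked out
  holders.remove(0);
  holders.add(performer);
}
```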

Puppet Whispers – Progress report summary

For anyone interested in the project, I’ve uploaded my progress report, which I had to hand in today.  It gives a fairly brief summary of the project and what you can expect at the concert on Sept 8th/9th 2008 at SARC in Belfast.

link:

http://www.MajorC.co.uk/InterimReport.pdf

Puppet Whispers – ‘Tug of Score’

I will not begin this post with a rant about the lack of a ‘#’ button on my Mac keyboard…

Puppet Whispers is in phase 2 after an initial rehearsal/experimenting session.  I had some excellent feedback from the musicians regarding what worked and what was quite definitely impossible!  We discovered a fairly large issue: traditional notation is time-based, and my dynamically shifting score made performing from it headache-inducing.  The idea we were testing looks like this:

An Idea based on John Cage and Tug of War

Essentially, in the spirit of John Cage’s Music for Piano, where imperfections in the paper inform the placement of notes, I’m designing a system that superimposes clefs and staves over a page full of notes.  The score itself is shifted left and right based on performers leaning backwards and forwards.  The problem is that playing music from a traditional staff makes no sense when there is no sense of time, so to fix this I will try taking snapshots of the music which remain static for short periods, along with a moving caret/progress-bar-style indicator.  Below is the graphic score in action (clef graphic courtesy of Chris McClelland).

Notes are also somewhat limited to short percussive attacks at the moment, so I’ll be updating them to introduce some variety of note durations.  The whole thing is being done, as usual, in Processing and Max/MSP.
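
Here’s a quick Processing sketch of the snapshot idea, to make it concrete. The ensemble’s leaning is simulated with mouseX, the snapshot length is a guess, and the ‘notes’ are just a random scatter of note heads, but the freeze-and-caret behaviour is the bit I’ll be testing.

```processing
// Sketch of the snapshot fix: the page still drifts with the ensemble's
// leaning (mouseX here), but performers read from a snapshot that is
// refrozen every few seconds, with a caret sweeping across it.

final int SNAPSHOT_MS = 5000;   // how long each frozen snapshot lasts (a guess)
float liveOffset = 0;           // where the tug-of-war has pulled the score
float frozenOffset = 0;         // offset captured at the last snapshot
int snapshotStart = 0;

void setup() {
  size(800, 300);
}

void draw() {
  background(255);

  // the live offset keeps following the lean data the whole time
  liveOffset = map(mouseX, 0, width, -200, 200);

  // refreeze the snapshot every few seconds
  if (millis() - snapshotStart > SNAPSHOT_MS) {
    frozenOffset = liveOffset;
    snapshotStart = millis();
  }

  // draw a stand-in page of notes at the frozen offset
  drawNotes(frozenOffset);

  // a caret sweeps left to right over the lifetime of the snapshot
  float caretX = map(millis() - snapshotStart, 0, SNAPSHOT_MS, 0, width);
  stroke(200, 0, 0);
  line(caretX, 0, caretX, height);
}

void drawNotes(float offset) {
  stroke(0);
  fill(0);
  // five staff lines
  for (int i = 0; i < 5; i++) {
    line(0, 100 + i * 10, width, 100 + i * 10);
  }
  // an arbitrary scatter of note heads, shifted by the tug-of-war offset
  randomSeed(1);
  for (int i = 0; i < 40; i++) {
    ellipse(random(width) + offset, 95 + random(50), 8, 6);
  }
}
```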

The performance will be free entry at the Sonic Arts Research Centre (SARC), Queen’s University Belfast, on Sept. 8th 2008.  Feel free to come along if animated graphic notation and ideas from network performance interest you.

Graphic Notation – Puppet Whispers

The latest project! This summer I’ve set myself the task of designing a piece of animated music notation software that converts ensemble gestures into data which affects a musical score. The idea was initially to design a performance framework where performers could be separated visually, allowing a composer/mediator to control the level of interactivity between instrumentalists.

Chinese Puppet Whispers

The interaction found in the traditional game of ‘Chinese Whispers’ or ‘telephone’ will be used as a central theme for the pieces performed in concert: the transmission of gestural and aural information, and its transformation as it is constantly recycled alongside composed materials.

Each performer will have a computer-generated graphic score. I’ve been prototyping the score as a program in Processing, which is a great tool for visual/sonic artists who understand programming concepts. I will be using pressure-sensitive resistors on the performers’ feet to collect time-lapsed data, and accelerometers attached to the performers to control the drawn images in the ‘shared space’ half of the graphic score.  The software in its current state looks like this:

Puppet Whispers - software by Christopher Chong

I will be keeping this blog updated as the system evolves. The current shadow screens are too small and the pieces to be performed are still being developed. At the moment the software divides the screen into individual and shared spaces. The individual space presents each performer with different information dependent on the movement gestures captured by an adjacent performer. The shared space allows for communication between all 4 performers simultaneously, but as with the other gestural information, its visibility is controlled by a mediator.
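
As a rough illustration of that layout (not the actual software, which is considerably more involved), here’s a minimal Processing mock-up where the adjacent performer’s gesture and the mediator’s visibility control are simulated with the mouse; in the real system these values arrive from the foot sensors and accelerometers via Max/MSP.

```processing
// Mock-up of the score layout: an individual space driven by the adjacent
// performer's gesture, and a shared space faded in and out by the mediator.

void setup() {
  size(800, 400);
}

void draw() {
  background(0);

  // left half: the individual space, reacting to the adjacent performer's gesture
  float gesture = map(mouseY, 0, height, 0, 1);      // stand-in for accelerometer data
  fill(255);
  rect(50, height - gesture * (height - 60), 100, gesture * (height - 60));

  // right half: the shared space, faded in and out by the mediator
  float visibility = map(mouseX, 0, width, 0, 255);  // stand-in for the mediator's control
  fill(255, visibility);
  ellipse(width * 0.75, height * 0.5, 150, 150);

  // divider and labels
  stroke(80);
  line(width / 2, 0, width / 2, height);
  fill(255);
  text("individual space", 20, 20);
  text("shared space", width / 2 + 20, 20);
}
```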

Any comments and suggestions are welcome at this early stage of development!!
-Major C

SL’etude – Sax, Max and Second Life

We’ve finally had a chance to do a test performance of Franziska Schroeder’s DRHA submission piece SL’etude, based on the SLProxy application we’ve been developing to send data between Second Life and Max/MSP. The testing took place on Wednesday June 18, 2008 in the Sonic Lab at the Sonic Arts Research Centre, Queen’s University Belfast.

What you’ll see in the video is a combination of about 4 different programming languages and applications used to send audio in and out of Second Life from a concert venue: 3D objects with sounds attached are thrown around in the virtual world while their sound is diffused three-dimensionally in the Sonic Lab. The Sonic Lab at SARC is a room capable of total surround sound across 48 channels above and below the listeners/performers.
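
To give a flavour of the kind of message passing involved (this is not the actual SLProxy code, and the OSC address and port numbers are made up for the example), here’s a minimal Processing sketch using the oscP5 library that forwards a moving object’s position to a Max/MSP patch.

```processing
// Illustrative only: a 3D object's position is packed into an OSC message
// and sent to a Max/MSP patch, which could then map it onto the
// loudspeaker array. Address pattern and ports are invented for the example.

import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress maxPatch;

void setup() {
  size(400, 400, P3D);
  osc = new OscP5(this, 12000);                    // listen on port 12000
  maxPatch = new NetAddress("127.0.0.1", 7400);    // Max/MSP's udpreceive port (assumed)
}

void draw() {
  background(0);

  // stand-in for an object being thrown around the virtual world
  float x = noise(frameCount * 0.01) * width;
  float y = noise(frameCount * 0.01 + 100) * height;
  float z = noise(frameCount * 0.01 + 200) * 200 - 100;

  translate(x, y, z);
  sphere(10);

  // forward the position to Max/MSP as an OSC message
  OscMessage m = new OscMessage("/sl/object/position");   // hypothetical address
  m.add(x);
  m.add(y);
  m.add(z);
  osc.send(m, maxPatch);
}
```

On the Max/MSP side, a [udpreceive 7400] feeding a [route /sl/object/position] would unpack the coordinates for the diffusion patch.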

Christopher Chong – Proxy dev, Visuals, Max/MSP patching, Framework design
Pedro Rebelo – Independent Study supervisor, Network Performance research
Franziska Schroeder – Saxophonist, SL’etude composition
Hunter McCurry – LED gesture visuals

The piece is to be performed in concert, along with a paper session, at the Digital Resources for the Humanities and Arts (DRHA) 2008 conference.