Posts Tagged ‘Belfast’

Tantrum – Puppet Whispers

Here’s the latest version of ‘Tantrum’. The piece features a musical ‘toy’ that up to 3 performers may manipulate at any one time. The 4th musician gains permission to play with the toy by having a ‘tantrum’ and stomping their feet, which in turn knocks out one of the original 3 performers.
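For anyone curious about the mechanics, the swapping rule is simple enough to sketch. Here’s a rough Processing-style outline (the sensor callback and stomp threshold are made up for illustration; the real piece wires this to the foot sensors):

// Rough sketch of the 'Tantrum' swapping rule (names are illustrative only).
// Up to 3 of the 4 performers hold the toy; a stomp lets the 4th barge in.
int[] active = {0, 1, 2};  // indices of the 3 performers currently on the toy
int waiting = 3;           // the performer left out

final int STOMP_THRESHOLD = 900;  // hypothetical foot-pressure cutoff (0-1023)

void onFootPressure(int performer, int reading) {
  if (performer == waiting && reading > STOMP_THRESHOLD) {
    // Tantrum! Swap the stomper in and knock out one current player at random.
    int slot = (int) random(active.length);
    int knockedOut = active[slot];
    active[slot] = waiting;
    waiting = knockedOut;
  }
}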

Graphic Notation – Puppet Whispers

The latest project! This summer I’ve set myself the task of designing a piece of animated music notation software that converts ensemble gestures into data that affects a musical score. The initial idea was to design a performance framework where performers could be separated visually, allowing a composer/mediator to control the level of interactivity between the instrumentalists.

Chinese Puppet Whispers

The interaction found in the traditional game of ‘Chinese Whispers’ or ‘telephone’ will be used as a central theme for the pieces performed in concert: the transmission of gestural and aural information, and its transformation as it is constantly recycled alongside composed materials.

Each performer will have a computer-generated graphic score. I’ve been prototyping the score as a program in Processing, which is a great tool for visual/sonic artists who understand programming concepts. I will be using pressure-sensitive resistors on the performers’ feet to collect time-lapsed data, and accelerometers attached to each performer to control the drawn images in the ‘shared space’ half of the graphic score. The software in its current state looks like this.

Puppet Whispers - software by Christopher Chong

I will be keeping this blog updated as the system evolves. The current shadow screens are too small and the pieces to be performed are still being developed. At the moment the software divides the screen into individual and shared spaces. The individual space presents each performer with different information depending on the movement gestures captured by an adjacent performer. The shared space allows for communication between all 4 performers simultaneously, but as with the other gestural information, its visibility is controlled by a mediator.
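To give a feel for the layout logic, here’s a stripped-down Processing sketch of the screen split. The gesture values here are faked with noise(); in the real system they come from the accelerometers, and the visibility flag would be set by the mediator rather than hard-coded:

// Stripped-down sketch of one performer's score layout:
// top half = individual space, bottom half = shared space.
boolean sharedVisible = true;  // toggled by the mediator in the real system

void setup() {
  size(800, 600);
}

void draw() {
  background(0);

  // Individual space: information driven by an adjacent performer's gestures.
  // noise() stands in for the neighbour's accelerometer stream.
  float gx = noise(frameCount * 0.01) * width;
  float gy = noise(frameCount * 0.01 + 100) * (height / 2);
  stroke(255);
  line(width / 2, height / 4, gx, gy);

  // Shared space: drawn only while the mediator has it switched on.
  if (sharedVisible) {
    stroke(120, 200, 255);
    ellipse(gx, height / 2 + gy / 2, 20, 20);
  }

  stroke(80);
  line(0, height / 2, width, height / 2);  // divider between the two spaces
}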

Any comments and suggestions are welcome at this early stage of development!!
-Major C

SL’etude – Sax, Max and Second Life

We’ve finally had a chance to do a test performance of Franziska Schroeder’s DRHA submission piece ‘SL’etude’, based on the SLProxy application we’ve been developing to send data between Second Life and Max/MSP. The testing took place on Wednesday, June 18, 2008 in the Sonic Lab at the Sonic Arts Research Centre (SARC) at Queen’s University Belfast.

What you’ll see in the video is a combination of about 4 different programming languages and applications sending audio in and out of Second Life from a concert venue: 3D objects with sounds attached are thrown around in the virtual world while their sound is diffused 3-dimensionally in the Sonic Lab. The Sonic Lab at SARC is a room capable of total surround sound across 48 channels, above and below the listeners/performers.
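As a rough illustration of the kind of mapping involved (this is not the actual Max patch, just the geometry), turning an object’s position in the virtual world into parameters for the diffusion system amounts to a Cartesian-to-spherical conversion:

// Illustrative only: map an object's position (relative to the listening
// position) to spatialization parameters. The actual routing to the
// 48-channel Sonic Lab happens inside Max/MSP.
float[] spatialParams(float x, float y, float z) {
  float azimuth   = atan2(y, x);                // horizontal angle (radians)
  float elevation = atan2(z, sqrt(x*x + y*y));  // above/below the listener
  float distance  = sqrt(x*x + y*y + z*z);      // drives amplitude/reverb
  return new float[] { azimuth, elevation, distance };
}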

Christopher Chong – Proxy dev, Visuals, Max/MSP patching, Framework design
Pedro Rebelo – Independent Study supervisor, Network Performance research
Franziska Schroeder – Saxophonist, SL’etude composition
Hunter McCurry – LED gesture visuals

The piece is to be performed in concert, along with a paper session, at the Digital Resources for the Humanities and Arts (DRHA) 2008 conference.

Max/MSP + Second Life: Spatialization

Ok, so here is a better example of what we’ve been trying to do as part of the Second Life performance project. I’ve tried to make the video as clear as possible. The site is the European University II, which is where virtual SARC is based. The patch is running in Max/MSP 5 with the SLProxy running in the background. There is practically no delay in getting information out of Max and into an object; the speed is limited only by the LSL scripting system in Second Life (soon to be replaced by Mono).

So here it is: sound objects, each with 6 specifically programmed locations, and trajectories driven by random selection in Max/MSP. The fellow walking into SARC at the beginning is my little avatar, DjChongy. Look me up if you are about in SL!
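If you’re wondering what ‘random selection’ amounts to, it’s nothing fancy. Here’s a stand-in sketch of the idea (the coordinates are placeholders; in reality the message goes over UDP through the SLProxy to the object’s LSL script):

// Stand-in for the Max/MSP selection logic: pick one of the 6 stored
// target locations at random and send it off to the object.
float[][] targets = {
  {10, 0, 25}, {4, 12, 25}, {-8, 6, 30},    // placeholder coordinates
  {-10, -4, 25}, {2, -12, 28}, {8, 8, 32}
};

void pickNextTarget() {
  float[] t = targets[(int) random(targets.length)];
  // In the real setup this is formatted as a network message for the
  // SLProxy, which relays it to the object's LSL script in Second Life.
  println("move to " + t[0] + ", " + t[1] + ", " + t[2]);
}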

Impact. React. Respond.

This is a project I worked on last term for my MA in Sonic Arts at SARC in Belfast. It’s a ‘live performance system’ for making music using drawing gestures, physics-based movement, and motion capture. Not sure what else I can say about it! Basically, you could draw shapes using an infrared light, whose motion I tracked using Max/MSP/Jitter, and I programmed the interaction and motion capture using JMyron for blob tracking in Processing.
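For anyone wanting to try the Processing side, the JMyron part is only a few lines. This is roughly the shape of it, tuned here to pick out a bright spot like the IR light (the colour and tolerance values are guesses you’d tune for your own camera):

import JMyron.*;

// Rough shape of the blob-tracking side: track the bright IR light
// and treat each glob centre as a drawing position.
JMyron cam;

void setup() {
  size(640, 480);
  cam = new JMyron();
  cam.start(width, height);
  cam.findGlobs(1);                    // enable blob detection
  cam.trackColor(255, 255, 255, 200);  // bright spot; tolerance is a guess
  cam.minDensity(20);                  // ignore tiny specks of noise
}

void draw() {
  background(0);
  cam.update();
  int[][] centers = cam.globCenters();  // one [x, y] pair per blob
  stroke(255, 0, 0);
  strokeWeight(6);
  for (int i = 0; i < centers.length; i++) {
    point(centers[i][0], centers[i][1]);
  }
}

public void stop() {
  cam.stop();
  super.stop();
}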

enjoy!

Max/MSP + Second Life gesture control

Details to follow, but basically this is a demo of Max/MSP controlling object movement in 3D in Second Life!

Max/MSP having a chat with Second Life

More Second Life updates!!  Yes, it appears that we now have TWO-WAY communication between Max/MSP and Second Life.  After a long, painstaking day with C# code, I figured out where to put the UDP client so that we can send information from an mxj netsend object in Max/MSP to the SLProxy, which in turn fires off a little packet full of communicatory goodness.  It’s time to do a little victory dance!!
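The plumbing itself is ordinary UDP. Stripped of all the Second Life specifics (the real SLProxy is C#), a Java equivalent of what the mxj object talks to looks roughly like this (the port number is arbitrary here):

import java.net.DatagramPacket;
import java.net.DatagramSocket;

// Stripped-down Java analogue of the proxy's UDP plumbing: listen for
// packets from Max/MSP's mxj sender and fire a reply straight back,
// giving the two-way link described above.
public class MaxUdpBridge {
  public static void main(String[] args) throws Exception {
    DatagramSocket socket = new DatagramSocket(7500);  // arbitrary port
    byte[] buf = new byte[1024];
    while (true) {
      DatagramPacket in = new DatagramPacket(buf, buf.length);
      socket.receive(in);  // message arriving from Max/MSP
      String msg = new String(in.getData(), 0, in.getLength());
      System.out.println("from Max: " + msg);

      // Reply to the sender (in the real proxy this carries Second Life data).
      byte[] reply = ("echo: " + msg).getBytes();
      socket.send(new DatagramPacket(reply, reply.length,
                                     in.getAddress(), in.getPort()));
    }
  }
}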

Here’s a little screenshot!

Max/MSP having a little chat with Second Life