[-empyre-] Participatory Art: New Media and the Archival Trace

Anna Munster A.Munster at unsw.edu.au
Thu Jun 4 12:36:55 EST 2009


Hi Hana and all,

I found what you had to say about the trace-body-media relation
interesting, both generally and specifically in relation to the Sonic
Interface piece you mentioned below:
> A project that I think very specifically engages both sets of body  
> functions in very interesting ways is Akitsugu Maebayashi’s Sonic  
> Interface, a portable hearing device that is made from headphones,  
> microphones, and a laptop computer. The participant is invited to  
> walk around the city, and experiences modified sonic environments  
> processed in real time (with a 3-second delay) from the sounds it picks
> up. The experience of the altered environment generated by the  
> software program influences and questions the sense of space and  
> time. Mayebayashi has focused on the auditory sense as an interface  
> between the body and the environment, in a different way than an  
> audio walk of any kind – locative or pre-recorded.
>
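
Before going on: the 3-second delay at the heart of that piece is,
mechanically, just a delay line, so that what reaches the ears is always
the environment as it sounded a moment ago. A minimal sketch of that
mechanism in Python (my own illustration, not Maebayashi's code; the
sample rate is an assumption):

import collections

SAMPLE_RATE = 44100                  # assumed sample rate
DELAY_SAMPLES = SAMPLE_RATE * 3      # the 3-second delay

# Ring buffer pre-filled with silence: what you hear "now" is what
# the microphone picked up three seconds ago.
delay_line = collections.deque([0.0] * DELAY_SAMPLES, maxlen=DELAY_SAMPLES)

def process_sample(mic_sample):
    """Feed the live mic sample in, return the 3-second-old one."""
    delayed = delay_line[0]          # oldest sample in the buffer
    delay_line.append(mic_sample)    # appending drops that oldest sample
    return delayed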

What I think is really interesting in the context of participatory art
right now is the way this is moving into a much broader sphere of newer
forms of participatory culture. So, for example, see the new iPhone app
RJDJ (http://more.rjdj.me/what/), where you can use incoming sensorially
activated data (movement, environmental sound) to affect pre-recorded
sonic data and tracks. Essentially what you are doing is in/remixing
environmental data with prerecorded data on an iPhone/iPod device and
listening to it as it gets in/remixed. The app is free and is being used
to generate RJ/DJ events in the same way people were using iPods for
live podcasting events a few years ago.
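
To make that in/remixing concrete: at the level of a single audio block
it comes down to blending a live input against a prerecorded one, with
some sensor value steering the balance. A toy sketch in Python (my own
illustration of the idea, not RJDJ's actual implementation; the 'motion'
parameter is a hypothetical stand-in for whatever sensor value, say
accelerometer or mic level, the scene exposes):

import numpy as np

def inremix_block(live_block, prerecorded_block, motion):
    """Blend one block of live environmental sound with one block of a
    prerecorded track; motion in [0, 1] lets more of the environment in."""
    mixed = motion * live_block + (1.0 - motion) * prerecorded_block
    return np.clip(mixed, -1.0, 1.0)   # keep float audio within [-1, 1]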

The RJ stands for 'real jockey', with an overt reference to 'realtime'
processing and mixing. But what is really interesting here is that if we
start inflecting this with a Bergsonian-Deleuzian understanding, we come
up with a kind of music-memory-machine that is about generating sonic
space-time in-between the present-processed (realtime) and the
past-retentive (prerecorded), such that one is continually producing a
kind of sonic rendering of the temporal that cannot settle between the
present and the past (or the 'to come' - protentive)...

This has implications for your comment below:

> By uncoupling sound from vision, this project questions what we  
> assume as "real".  "Presence" requires the constant stabilizing and  
> synchronizing of vision and sound; an uncoupling of the two opens up  
> the possibility for other presences, other experiences of "self."  
> This separation also importantly has the effect of destabilizing the  
> experience of "place."


The trace, then, of both the machine and of matter (sonic, environment,
participant) in the RJDJ app is really an inmixing rather than a
remixing... I think this has consequences for all the fairly boring and
banal notions of remix/participatory culture around (Shirky, Jenkins et
al.) and opens up, instead, something much more novel about how one
creates a platform for participating in a temporality that is occurring
but has not yet happened, or has only partly happened, and that part
will be open to re-happening (TOL, so don't hold me to this ;-)...

cheers
Anna


A/Prof. Anna Munster
Assistant Dean, Grant Support
Acting Director Centre for Contemporary Art and Politics
School of Art History and Art Education
College of Fine Arts
UNSW
P.O. Box 259
Paddington
NSW 2021
612 9385 0741 (tel)
612 9385 0615 (fax)
a.munster at unsw.edu.au



