Hi Roger

Interesting commentary now, expanding on what you do and are interested in. I 
am wondering how a (physical) body performance, such as Stelarc's, that 
transposes data to avatar motion/behaviour in Second Life (or another 3D 
immersive environment), compares to external environmental data (environment, 
nature, urban space, light, forces, etc.) used to influence and actuate sound 
and object motion in a dislocated/networked space? And how do you work with 
"music" or sound in this way?
Your comment brought back a memory. It's over 10 years ago now: I was in 
Toronto at Subtle Technologies, witnessing a beautiful and most curious 
installation, "Wind Array Cascade Machine", by Canadian sound artist Steve 
Heimbecker.

We were in the gallery and watched a delicate field of slender rods, swaying 
lightly, as a corn field would. They were luminous, or light-moving (light-
emitting diodes were placed on long vertical rods in the exhibition space), 
and I did not know what made them move or sound so (and I have forgotten the 
sound).

What slowly dawned on us (we were informed) was that the light movement and 
the slight swaying were caused by "data" of wind movement captured in another 
city in Canada: the sensing device was placed on the roof of the Méduse 
complex in Quebec City, and the data were transmitted over the network and 
used to control a series of corresponding lights in the Toronto gallery 
space. It was quite magical and impressed me.

see:  http://www.fondation-langlois.org/html/e/page.php?NumPage=369
see: http://subtletechnologiesarchive.com/2003/heimbecker.html 

with regards
Johannes Birringer


________________________________________
From: netbehaviour-boun...@netbehaviour.org 
[netbehaviour-boun...@netbehaviour.org] on behalf of Roger Mills 
[ro...@eartrumpet.org]
Sent: Sunday, August 31, 2014 1:30 AM
To: netbehaviour@netbehaviour.org
Subject: [NetBehaviour] Cross-Reality

Thanks for your response and links, Aharon; I will have a look at these.

Cross-reality, sometimes written as x-reality, is a fusion of the 3D immersive 
environments typically seen in gaming with networked virtual environments such 
as Second Life, augmented by sensors/actuators that bring data from the real 
world into the networked virtual environment in some way. A kind of augmented 
virtual reality, I guess.

Here are some MIT articles in IEEE Pervasive Computing, going back to 2009, 
that outline the way I am using the term: 
http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=5165555. They are 
primarily talking about technological projects, as opposed to creative or 
performative interaction within them, which is my interest.

Many of the descriptions you provide suggest elements of these in various 
ways, although for what I am researching I am primarily interested in this mix 
of bringing data generated by dispersed light, temperature, humidity and 
movement sensors into networked virtual environments. Data from dispersed 
locations might be used to trigger light, colour or sound, etc., or to guide 
musicians through a virtual environment.
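(A toy sketch of the kind of mapping described above — not the actual system 
from any of the projects mentioned; the sensor ranges and target parameters 
are all my own assumptions:)

```python
# Hypothetical mapping from dispersed sensor readings to control values
# that a networked virtual environment might consume as light, colour,
# or sound parameters. All ranges and parameter names are assumptions.

def normalize(value, lo, hi):
    """Clamp and scale a raw sensor reading into the 0.0-1.0 range."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def map_sensors(readings):
    """Translate raw readings from dispersed sensors into control values.

    readings: dict like {"light": lux, "temperature": celsius,
                         "humidity": percent, "movement": magnitude}
    """
    brightness = normalize(readings["light"], 0, 1000)          # lux -> 0..1
    hue = normalize(readings["temperature"], -10, 40)           # cold..warm
    reverb = normalize(readings["humidity"], 0, 100)            # damp -> wet
    pitch = 220 + 440 * normalize(readings["movement"], 0, 10)  # Hz

    return {"brightness": brightness, "hue": hue,
            "reverb": reverb, "pitch_hz": pitch}

controls = map_sensors({"light": 300, "temperature": 18,
                        "humidity": 65, "movement": 2.5})
print(controls)
```

In practice the control dict would be serialized and sent over the network 
(e.g. as OSC messages) to the environment doing the actuation.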

Stelarc has touched on some of this, with distributed movement sensors moving 
him in a located space.

I am looking at this from a networked music perspective, and at the ways in 
which data from these elements might contribute to an increased awareness of 
presence and perception in tele-musical interaction, based on a recent 
collaboration I was involved in: http://eartrumpet.org/projects.html#Seeschwalbe

Thanks
Roger

--
Roger Mills
http://www.eartrumpet.org
http://roger.netpraxis.net

"Knowledge is only rumour until it is in the muscle" - Asaro Mudmen, Papua New 
Guinea.

_______________________________________________
NetBehaviour mailing list
NetBehaviour@netbehaviour.org
http://www.netbehaviour.org/mailman/listinfo/netbehaviour