Re: [-empyre-] sidebar - continued discussion between John Klima and Bill Seaman - part 2
I have an unresearched hunch that an output of a very generic kind of 'noise' between objects should be encouraged when attempting to emulate meaningful relations between objects. This lets each object do something different with each 'snap-shot' of noise it receives. Like looking at clouds, each object can see something different in the noise coming from numerous sensors, but in turn filters that noise and then outputs it again. The idea is to resample the first-stage input noise in a reciprocal fashion until hard patterns emerge. Just as face recognition reduces a face to features such as the distance between the eyes, or from the eyes to the edges of the lips, it strikes me that this relation is a similar kind of state.
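A rough sketch of what I mean, in Python. The class names, the fixed biases and the simple averaging rule are all mine, just to make the reciprocal-resampling loop concrete; it is a toy, not a claim about any real system:

import random

class NoisyObject:
    def __init__(self, size):
        # each object "sees" the shared noise through its own fixed bias,
        # like finding its own shapes in the same cloud
        self.bias = [random.uniform(-1, 1) for _ in range(size)]

    def filter(self, noise):
        # pull the input halfway toward this object's bias, then re-emit it
        return [0.5 * n + 0.5 * b for n, b in zip(noise, self.bias)]

def resample(objects, noise, rounds=50):
    for _ in range(rounds):
        # every object filters the shared signal; the outputs are averaged
        # and fed back in as the next round's "noise"
        outputs = [obj.filter(noise) for obj in objects]
        noise = [sum(vals) / len(vals) for vals in zip(*outputs)]
    return noise  # the "hard pattern": the state the group settles into

objects = [NoisyObject(8) for _ in range(5)]
print(resample(objects, [random.uniform(-1, 1) for _ in range(8)]))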
Behind all these concepts is the theory that consciousness is a state of relations ('the matrix', 'hard patterns', etc.) between numerous clusters of specific and oft-used brain functions. The state of relations is essentially just a composition, a diplomacy between different clusters.
The word 'conduit' comes from using a PDA with a desktop PC. A conduit is desktop software that sits between the PC and the PDA and follows strict protocols to ensure data is passed safely to and from both systems. It seems appropriate that, with so many objects dealing with different kinds of data, a conduit is an inevitable concept when we need objects to recombine in ways that polymorphism cannot. For the system to recombine itself in order to deal with input 'outside its existing logic' would indeed be a step closer to a kind-of-consciousness.
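By analogy with the PDA conduit, here is a minimal sketch in Python of two objects that only talk their own format (apples and pears, as in the earlier post) with a conduit object enforcing the protocol in between. The names and the translation rule are hypothetical, nothing from an actual sync API:

class AppleSpeaker:
    def send(self):
        return {"apples": 3}
    def receive(self, msg):
        print("AppleSpeaker got:", msg)

class PearSpeaker:
    def send(self):
        return {"pears": 7}
    def receive(self, msg):
        print("PearSpeaker got:", msg)

class Conduit:
    # sits between the two objects, like the desktop software between a
    # PC and a PDA, and refuses anything outside its protocol
    def pass_through(self, source, target, translate):
        msg = source.send()
        translated = translate(msg)
        if translated is None:
            raise ValueError("message violates the conduit protocol")
        target.receive(translated)

conduit = Conduit()
to_generic = lambda m: {"fruit": sum(m.values())}  # the shared, 'iconic' form
conduit.pass_through(AppleSpeaker(), PearSpeaker(), to_generic)
conduit.pass_through(PearSpeaker(), AppleSpeaker(), to_generic)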
Chris Poole
On Sat, 15 Oct 2005 07:05:34 -0400
Bill Seaman <bseaman@risd.edu> wrote:
> Chris Poole said:
>
> I keep thinking about how objects, or collections of objects in
> programming could represent 'States of Consciousness' that is itself
> a meta-object. Could one write conduits in the gaps between objects
> that are 'universal', that distribute something iconic and meaningful
> to all objects in that family? Perhaps, as a way of creating
> patterns, reinforcing patterns, death of neglected patterns, that
> informs how the objects relate to each other.
>
> Even if it were merely a 'throttle' for possible relations between
> objects. This object talks apples, this one pears, but can interact
> through a third object, etc.
>
> Seaman:
> first steps toward a "throttle" system...
> I am very much interested in the potentials of object-based
> recombinant systems that might be written to enable very high level
> system authorship --- and in turn enable one to "get at" particular
> states of consciousness.
>
> In a work in progress entitled The Poly-sensing Environment,
> Seaman, Verbauwhede (EE at UCLA), and Mark Hansen (Statistical CS, UCLA)
> have been working on such a system.
>
> The system includes a set of bundled sensors that "talk to" a matrix
> that enables behavior picked up by the sensors to trigger particular
> media-oriented responses.
>
> Let me try to say this in a different language - sensors in a space
> send signals to a system that
> enables an author to define a particular media relation (or secondary
> robotic behavior) to that behavior.
> Chunks of code enable media-element configurations with particular
> behaviors to be interacted with or conjoined. The triggered media
> element configurations elicit a response from the interactant. Your
> idea of conduits [I have used the term Conjunction Codes in another
> of my works called The Hybrid Invention Generator] is an
> excellent concept.
>
> I also called this a "grammar of attention" as the subtitle of the
> Poly-sensing environment.
>
> Part of the system is called the Emergent Intention Matrix ---
> see
> http://homestudio.thing.net/database/video/Poly-Sensing_Environment/pse_Experimental.dcr
>
> This is just a sketch of the system (thanks Fabian Winkler)
>
> This matrix enables one to author a series of relationships between
> behaviors in physical space and related media objects (or media
> behaviors / affector relations [robotics] in virtual/physical space).
>
> I am hopeful that this will become a kind of universal physical
> interface generator...
> at the moment we are stalled (when you try to make a "universal
> system" it is hard to argue its singular potential, especially to the
> NSF). The Langlois Foundation did fund one year of research and we are
> thankful to them.
> http://www.fondation-langlois.org/html/e/page.php?NumPage=49
>
> Imagine that you could have a series of objects in a room (as one
> example). When the objects were explored a text could appear on a
> screen or be spoken in relation to the object (or a particular
> behavior) --- or video clips could play or an alife set of characters
> might appear etc. etc.
> b
> --
> Professor Bill Seaman, Ph.D.
> Department Head
> Digital+ Media Department (Graduate Division)
> Rhode Island School of Design
> Two College St.
> Providence, R.I. 02903-4956
> 401 277 4956
> fax 401 277 4966
> bseaman@risd.edu
>
> http://billseaman.com
> http://www.art.235media.de/index.php?show=2
> http://digitalmedia.risd.edu
> _______________________________________________
> empyre forum
> empyre@lists.cofa.unsw.edu.au
> http://www.subtle.net/empyre