[-empyre-] A question concerning the electrification of digital objects
Dragan Espenschied
dragan.espenschied at rhizome.org
Thu Oct 9 13:29:03 EST 2014
> Whenever discussing "digital objects" to undergraduates I find that it is
> helpful to relate the well-worn etymology of "digital": that it is about the
> finger, or more specifically, the width of the finger which came to mean the
> gaps between. Immediately, this helps students to recognize that the
> electrification of digital objects is a purely contingent matter, which arose
> only after many non-electrical digital apparatuses. In fact, the computer, our
> zenith of digital apparatuses, can be fashioned out of many different material
> substrates---I then tell the undergrads about how I was once tasked with making
> a computer out of Meccano, an old children's toy that uses connecting pins to
> connect flat rods that have been punched with holes. I failed at the task, but
> learned first-hand about the importance of these holes. That the holes are
> *discrete* (separated, like the fingers) is vitally important for digitality.
Dear Quinn, everybody,
I very much appreciate tracing the roots of the digital back to times before
electrified computers existed, or to the early days of computing, when concepts
can still be clearly studied. Working at this level, with simple, ascertainable
computers, is very satisfying and enlightening, as I have experienced myself
when programming 8-bit machines. (I never got to build a mechanical computer,
though; that is something special, and I can only guess what it reveals.)
All this being said, the usage and whole culture of computational processes
have changed drastically, mostly because of the abundance of digital symbols
and operations available now. Early 'computing', whether executed by a human
architect, a Meccano machine, or a redstone circuit built in Minecraft, has to
be much more resourceful and efficient with respect to its grammar, the number
of steps a desired performance can be broken down into, and so on.
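To make that resourcefulness concrete (this is only my own small sketch,
written in Python for readability, not anything from the discussion above): on
a processor without a multiply instruction, such as the 6502 found in many
8-bit machines, even multiplying two numbers has to be broken down into a
chain of shifts and additions:

    def multiply(a, b):
        # shift-and-add multiplication of two non-negative integers,
        # the way a CPU without a hardware multiplier has to do it:
        # only addition, doubling and halving are used
        result = 0
        while b > 0:
            if b & 1:           # lowest bit of b set?
                result += a     # then the current a contributes
            a <<= 1             # double a
            b >>= 1             # halve b
        return result

    print(multiply(13, 11))     # prints 143

Every higher-level 'performance' ultimately rests on such chains of tiny
steps, and with scarce resources their number matters a great deal.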
Quite naturally, this led to 'computers' being used for 'computation' and tied
them to an engineering ideology of planning, systematization, generalization,
purposefulness and efficiency. Computers were projected as machines for math,
calculation and science because of the group of people using them. But
contemporary computers are largely everybody's Rorschach blots: their
applications are generally not concerned with efficient or 'elegant' use of
their own grammars, or with any kind of efficiency at all. This view of
computers has become just one of many possible views.
In the highly complex environments of today, stating a purpose for a process
conducted inside a computer is not much more than putting a cartoon figure on a
cereal box. I also believe that writing bits as zeros and ones, as *numbers*,
is not much more than a quirky anecdote. :) But since there is no other convincing
and widely respected narrative available, stories about computers are still told
around these ideals, and, to return to our topic, definitions of objects are
tied to them.
Ange's work clearly shows how the purpose assigned to a process can vary even
at the level of something as 'basic' as encryption or compression, or, at an
even lower level, adding two numbers together in floating-point arithmetic.
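To give the floating-point case a concrete shape (my own illustration, not
taken from Ange's work; standard Python floats, i.e. IEEE 754 doubles, are
assumed): even the result of adding three numbers depends on how the additions
are grouped, so 'what the process does' is less self-evident than the ideal
suggests:

    # floating-point addition is not associative
    a = (0.1 + 0.2) + 0.3
    b = 0.1 + (0.2 + 0.3)
    print(a)         # 0.6000000000000001
    print(b)         # 0.6
    print(a == b)    # False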
Best wishes,
Dragan