[-empyre-] Digital Objects // PROCESS : What is a digital process?

Alexander Wilson contact at alexanderwilson.net
Fri Oct 24 07:54:53 EST 2014


Thanks Ben and Quinn for your responses.

I would like to address certain points raised.

Ben is interested in the link I was alluding to between discretization and
the digital. It seems to me the movement from the "prediscrete world"--that
of our archaic ancestors, for example, who did not have a word for each
object around them--to the progressively grammatized world, can be modelled
almost identically to a simple analog-digital converter, where a continuous
(unified) flow is sampled and transformed into a series of discontinuities.
I agree with Ben that discretization opens up new possibilities; these are
related to the "mobility" afforded by the discrete. The atomists believed
that the void, the space between atoms, explained movement. In much the
same way, the breaks between the discrete bits of the digital allow them
to become mobile within their space of possibility in a way a continuous
flux simply cannot. Digital information can be copied precisely, whereas
analog information can only be copied at a loss. These abilities of the
digital realm stem from the following fact: as “individualized” objects,
bits are disconnected from their surroundings; they are mobile. I say
"individualized" to reiterate what I think Ben is noting about Stiegler,
which is perhaps more attributable to Simondon, namely that once
individualized, a process has exhausted its "preindividuality". Its
potential to become, to change, for Simondon, is predicated on this
preindividuality. Thus there is a sense in which, once an object is
perfectly objectified, that is, once it can be specified (copied) with
absolute precision (as with a collection of digital bits), for Simondon it
elides process, change, becoming, individuation.
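
To make the converter analogy concrete, here is a minimal toy sketch in
Python (my own illustration; the function names are invented for the
example). It samples a continuous signal into discrete steps, then shows
that the discrete version can be copied exactly while each analog copy
adds noise and drifts from the original:

import math
import random

# A toy sketch of the analog-digital converter analogy: a continuous flow
# is sampled into a series of discrete values, the discrete version can be
# copied exactly, and every "analog" copy adds noise.

def sample_and_quantize(signal, n_samples=16, levels=256):
    """Analog-to-digital conversion: discretize time (sampling) and
    value (quantization), turning a continuous flow into integers."""
    digital = []
    for i in range(n_samples):
        x = signal(i / n_samples)               # sample the continuous flow
        q = round((x + 1) / 2 * (levels - 1))   # quantize to one of `levels` steps
        digital.append(q)
    return digital

def copy_digital(samples):
    """Digital copying: the same discrete values, reproduced exactly."""
    return list(samples)

def copy_analog(values, noise=0.5, rng=random):
    """Analog copying: every generation adds a little noise."""
    return [v + rng.gauss(0, noise) for v in values]

if __name__ == "__main__":
    original = sample_and_quantize(lambda t: math.sin(2 * math.pi * t))
    print("digital copy of copy identical:",
          copy_digital(copy_digital(original)) == original)
    analog = [float(v) for v in original]
    for _ in range(10):                          # ten analog generations
        analog = copy_analog(analog)
    print("max drift after 10 analog copies:",
          max(abs(a - b) for a, b in zip(analog, original)))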

This comes back to what I was alluding to: process is something outside the
digital. I find myself continually coming back to this idea when I think
about the digital. This is the point Ben is not convinced by. It just seems
to me that when we have digital bits, or when we have purely defined
objects, we still don't have an account of change or time. This is a big
problem with object-oriented ontology, for example.
​ ​
It is also related to what Quentin Meillassoux describes as hyperchaos:
interestingly, he arrives at the conclusion that contingency (ungrounded,
unmotivated, indeterminate change) is necessary, because of the
non-totalizability of the cosmos (the whole), as per the paradoxes of set
theory. If the cosmos itself cannot "contain" itself, or if there can be no
absolute whole of everything, he reasons, then we must posit the necessity of
a kind of ungrounded change that escapes the determination of any holistic
set of possibilities or laws.
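
For anyone who wants the set-theoretic step spelled out, the arguments
usually invoked here (my gloss, not Meillassoux's own wording) are
Russell's paradox and Cantor's theorem, which together rule out an
absolute totality:

% Russell: the collection of all sets that do not contain themselves
% cannot itself be a set.
$$ R \;=\; \{\, x \mid x \notin x \,\} \;\;\Longrightarrow\;\; \bigl( R \in R \iff R \notin R \bigr) $$
% Cantor: every set has strictly more subsets than elements, so nothing
% can be a set of "everything" that already contains all its possibilities.
$$ \forall S : \; |\mathcal{P}(S)| \;>\; |S| $$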


In Deleuze, I find further support for this idea. Pure difference,
difference in itself, is a positive concept for Deleuze; it is not
subordinated to representation, identity, negation, and basically all the
features required for the digital domain’s ability to be specified to
absolute precision (or precisely copied). This is what I was getting at
with the term “analogue”. At issue is more than just a relation between digital
and analogue or discrete and continuous. The pivotal relation is rather
between “difference” and “difference that makes a difference”. In Deleuze,
difference is never given; rather, it is that which gives the given.
Difference has an ontological necessity to give us discrepancies and
identifiable distinctions. What is given is discrepancies, distinctions,
relations of identity and contiguity: scientists will call this
“information”. Deleuze insists that as things individuate, difference never
gets “used up”. Only *diversity*, he says, is truly reduced in the
process. Difference stays intact, he insists, surviving all possible
transformations. Process, change, and time stem from what we might call the
“ontological incontinence” of difference in itself, that is, before it
gives the given or “makes a difference”. This is also related to his
concept of “quasi-causality”, which conditions events through “surface
effects”, yet which is “unproductive” in a way analogous to how difference
does not get used up in causal processes. Notably, he claims, after Valéry,
that the deepest level of being is the skin, implying that such surface
effects, which are “causally unproductive”, are what give us causal
relations in the first place.



Which brings me to my point about Chaitin’s Omega. Thanks to Quinn for
underlining its main aspects. Omega is really a formal mathematical object,
constructed to express the “probability” that a randomly chosen program of N
bits will halt. This is in the spirit of Turing and the
decision problem, formulated as a question of whether a device churning
through a computation would ever come to halt on a definitive “yes or no”
result. We have known since Gödel that mathematical systems are “incomplete”,
because they allow expressions that are true yet cannot be proved within them.
This responded to Hilbert’s first and second questions: mathematics is
incomplete if consistent. Hilbert’s third question remained: is mathematics
decidable? Turing’s machine was a thought experiment for responding to
this. (It is very telling that his thought experiment actually paved the
way for modern computing: the “general purpose computer” is what is
mathematically called “Turing universal”.) As it turns out, using a
technique similar to Gödel’s, but inscribed into the procedure of the
machine, he showed that the decision problem is unsolvable, that is, there
is no algorithm or shortcut procedure for knowing in advance whether a
certain computation will halt, or whether it will keep on computing
forever without ever settling on a result. Chaitin sees this as
fundamental, and extends it with Omega: what is the probability that a
given program will halt? It turns out that this number, taken over programs
of arbitrary length, is an infinite, algorithmically random stream of digits:
even if you knew the number to the Nth digit, you would still have no way
of deducing what the next digit would be. It has maximal “algorithmic
information content”: each digit in the expansion is a singular,
irreducible event, which is “independent” in the mathematical sense,
uncorrelated with the other digits in the sequence. The probability of a
given computation being decidable is thus random in this sense.
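
To give a feel for what such a halting probability involves, here is a
minimal toy sketch in Python (my own illustration, not Chaitin's actual
construction, which requires a prefix-free universal machine). It draws
random programs for a tiny counter machine, runs each with a step budget,
and tallies how many halt; since halting is undecidable, the budget can
only ever confirm halting, so the estimate is at best a lower bound:

import random

# Toy "machine": a program is a list of instructions acting on one counter.
INSTRUCTIONS = ("INC", "DEC", "JNZ", "HALT")

def random_program(length, rng):
    """Draw a random program of `length` instructions."""
    prog = []
    for _ in range(length):
        op = rng.choice(INSTRUCTIONS)
        arg = rng.randrange(length) if op == "JNZ" else None  # jump target
        prog.append((op, arg))
    return prog

def halts(prog, max_steps=1_000):
    """Run with a step budget; True if the program halts within it.
    A finite budget can confirm halting but never non-halting."""
    counter, pc, steps = 0, 0, 0
    while pc < len(prog) and steps < max_steps:
        op, arg = prog[pc]
        if op == "HALT":
            return True
        elif op == "INC":
            counter += 1
        elif op == "DEC":
            counter = max(0, counter - 1)
        elif op == "JNZ" and counter != 0:
            pc = arg
            steps += 1
            continue
        pc += 1
        steps += 1
    return pc >= len(prog)  # fell off the end = halted

def estimate_halting_fraction(length=8, samples=5_000, seed=0):
    """Monte Carlo estimate of the fraction of random programs that halt."""
    rng = random.Random(seed)
    halted = sum(halts(random_program(length, rng)) for _ in range(samples))
    return halted / samples

if __name__ == "__main__":
    print("estimated halting fraction:", estimate_halting_fraction())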



This is what I was trying to suggest with the “thumbprint of god” analogy.
It is as though, not only physics, but also mathematics, is based on an
ungrounded randomness. I see this as equivalent to Deleuze’s idea of
quasi-causality: it is this unproductive conditioning of everything, from
an absolute contingency or difference outside of the system. It seems
everywhere we dig, we are confronted with this randomness. Chaitin, as
Quinn notes, is discouraged by this, claiming that it denies Leibniz’s
principle of sufficient reason. Yet in another sense I think it gives
substance to Leibniz’s cosmological argument. But we should change his
question from “Why is there something rather than nothing?” to “Why is
there non-randomness rather than randomness?”, or “Why is it that there seems
to be stability, locality, law-like repeatability, rather than absolute
contingency all the time?”. I think it has something to do with observation
itself. Meillassoux denies this postulate and steadfastly wants to avoid
the “correlationist” stance, and so ends up denying stabilities and
non-randomness altogether: he expects anything to happen at any moment.



Randomness, to me, is that which requires no explanation: instinctively, no
one ever looks at a randomly distributed array of objects and asks: why are
these randomly distributed? What requires explanation is that which
diverges from randomness, because statistically, a non-random distribution
is less probable than a random one. We are very good at identifying
patterns. Humans, and indeed living organisms generally, are pattern
recognition systems. In fact we are so good at it we see patterns
(coincidence) even when things are random (see Kahneman and Tversky). But I
think this non-randomness has everything to do with observation. As we
sample the “chaotic outside”, as we cut off a bit from its preindividual
potential, and endow it with mobility and nameability, it becomes a
determinant in time, an anchor, an irreversible fact, a point of
equivalence between potential and actual, and is encoded in our very
structure as pattern-recognizing agents. In quantum physics, this will be
called the “collapse” of the wave function, or better, “decoherence” (check
out Zurek’s work), in which the observer, the act of measurement, or the
event of interaction is central to the transition from randomness and
non-locality to predictability and locality. Observation imposes a bias on
reality: a “predictability sieve”, as Zurek calls it. Once a particle is
“entangled” with another, their futures are interdependent and correlated.
On the quantum level a thing is here AND there at the same time. On the
classical level of reality (where we carry out our lives), a thing is here
OR there. Decoherence is the transition from the “and, and, and…” to the
“or, or, or…”. Similarly, irreversibility and process emerge in the
moment we measure. The event is the Aion: it cuts the reversibility of the
Chronos and gives us the irreversibility of the before and after. As in
Hölderlin’s caesura, the beginning and end no longer rhyme. The “and and
and” is difference in itself. The “or or or” is difference that makes a
difference for something or someone.
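
To make the statistical point concrete, here is a minimal sketch in Python
(my own toy example, not drawn from Zurek or from Kahneman and Tversky). It
uses the Wald-Wolfowitz runs test, one simple way of quantifying how far a
sequence of bits diverges from what pure randomness would predict:

import math
import random

def runs_test_z(bits):
    """Wald-Wolfowitz runs test: z-score for how far the number of runs
    deviates from what pure randomness predicts. |z| much larger than 2
    suggests a "patterned" (non-random) sequence."""
    n1 = sum(bits)
    n0 = len(bits) - n1
    n = n0 + n1
    if n0 == 0 or n1 == 0:
        return float("inf")  # a constant sequence is maximally patterned
    runs = 1 + sum(1 for a, b in zip(bits, bits[1:]) if a != b)
    mu = 2 * n0 * n1 / n + 1
    var = 2 * n0 * n1 * (2 * n0 * n1 - n) / (n * n * (n - 1))
    if var == 0:  # degenerate two-element case
        return 0.0
    return (runs - mu) / math.sqrt(var)

if __name__ == "__main__":
    rng = random.Random(0)
    random_bits = [rng.randint(0, 1) for _ in range(1000)]
    patterned_bits = [i % 2 for i in range(1000)]   # strict alternation
    print("random sequence    z =", round(runs_test_z(random_bits), 2))
    print("patterned sequence z =", round(runs_test_z(patterned_bits), 2))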



To me all this suggests that process is necessarily “outside” the category
of the digital.


Thanks for reading.

Best,
Alexander