[-empyre-] Search, privacy, data - the abuse of encapsulation
magnus lawrie
magnus at ditch.org.uk
Wed Feb 29 10:17:00 EST 2012
Hi Tero and all,
Thanks for the recent thought-provoking input.
On Tue, Feb 28, 2012 at 02:43:19PM +0000, Tero Karppi wrote:
> I think Rob makes a wonderful list of points and I can follow up on
> only a few here. There is an important notion in points 5 and 6:
> that we are constantly being ideologically, politically and
> economically exploited by different platforms, web sites and
> services, and that while we know it we do nothing about it. This is
> also reflected by Ana, who points out that living without leaving
> digital traces is practically impossible. Not only do we use social
> media, but our data also travels through banks, health care
> institutions and schools. What is implied here is that our
> information and data are potentially being gathered in all areas of
> our daily lives.
> Recently we have seen two different takes on this subject by new
> media artists. The first take is making the visible invisible, or
> tactics of non-existence (in the spirit of Galloway & Thacker). For
> example, Web 2.0 Suicide Machine by moddr_ & Fresco Gamba was built
> to help users close their Facebook and Twitter accounts. In
> contrast, the other take makes the invisible visible. An example of
> this could be the Transparency Grenade by Julian Oliver, which, when
> detonated, captures network traffic and audio data and presents it
> online.
>
>
> On a more general level, I think both of these works show how
> network culture works behind the interfaces and platforms. Web 2.0
> Suicide Machine makes it an issue of biopolitics; Transparency
> Grenade shows explicitly what happens to our data. Making these
> processes visible helps us to consider privacy & privatization
> issues on a more concrete level.
Following up this information (very interesting, thank you!) with some
online searches, I came to this post from Zach Blas. Though not
directly involved in the discussion here, he was part of the workshop
and was also involved in other aspects of reSource. I think the post
touches on some themes that have been discussed during this month. I
offer the link as a sort of informational feedback, to possibly enter
back into the empyre discussion object - an empyre 'throwback' if you
like:
http://lists.cofa.unsw.edu.au/pipermail/empyre/2009-November/002175.html
(some keywords I noted: nonrepresentable identity, creative
destruction, tactics of non-existence).
I provide this link also in response to Gabriel's original discussion
statement, which asks "how similar can the results of a further
in/compatible research debate be, in a different environment and time
frame?". I am then also wondering about aspects of search in an
institutional setting where the market rules (or is this purely
dystopian?), and, in a more general sense, about the possible
application of encapsulating protocols to research and institutions.
Best wishes,
Magnus
>
> Best,
> Tero
> __________________________________________________________________
>
> From: empyre-bounces at lists.cofa.unsw.edu.au
> [empyre-bounces at lists.cofa.unsw.edu.au] on behalf of Rob Jackson
> [robertjackson3900 at gmail.com]
> Sent: Monday, February 27, 2012 11:08 PM
> To: soft_skinned_space
> Subject: Re: [-empyre-] Search, privacy, data - the abuse of
> encapsulation
> Hi All,
> It's a sincere pleasure to be having this discussion with like-minded
> people. A thousand thanks to Gabriel for the invitation.
> I'll follow on from Tero's wonderful introduction and Andy's fantastic
> follow-up with ten quick points (technical, historical and theoretical)
> of my own concerning the unsettling privatisation of platforms.
> 1.) Platform applications have become the primary mode of accessing
> online information and communication in recent years. However, they
> are increasingly characterised by a forced removal of complexity,
> implemented through a logic of encapsulation that closes off access
> to source code. This is an old story.
> 2.) In Object Oriented Programming, 'encapsulation' is defined as a
> paradigmatic logic which programmers use to conceal functions and
> methods, which in turn restricts a user's access to the program's
> units. Although it didn't originate with OOP, its original purpose
> was to protect a computer program from bugs, external abuse, and its
> environment. As computers (especially personal ones) became
> increasingly complex, encapsulation methods were required to
> 'encapsulate' that complexity so the user need not be concerned with
> the inner workings of the program in question. Think of a cashpoint
> machine: when we wish to take our money out of the machine, we're
> not expecting to witness the nefarious complexities of someone
> transferring numbers, hardwiring physical money to our hand,
> understanding the interest gained on that account, etc.; the
> interface closes off certain functions that do not need to be made
> public, so that the user has a simple experience and saves time in
> the use of that program. This is why the rise of OOP is linked with
> GUIs.
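> A minimal sketch of this in Java (all names invented for
> illustration; this is no real banking system): one public method is
> the whole interface, while the balance and bookkeeping behind it are
> declared private and are unreachable from outside the class.
>
>     // Encapsulation: one public entry point, internals hidden.
>     public class CashMachine {
>         private long balancePence; // internal state, never exposed
>
>         public CashMachine(long openingBalancePence) {
>             this.balancePence = openingBalancePence;
>         }
>
>         // The 'interface' of the cashpoint: all a user may do.
>         public long withdraw(long amountPence) {
>             if (amountPence <= 0 || amountPence > balancePence) {
>                 throw new IllegalArgumentException("invalid amount");
>             }
>             applyLedgerEntry(-amountPence); // hidden bookkeeping
>             return amountPence;
>         }
>
>         // Private: the complexity the user need not witness.
>         private void applyLedgerEntry(long deltaPence) {
>             balancePence += deltaPence;
>         }
>     }
>
> Calling machine.withdraw(2000) compiles; reading
> machine.balancePence from outside does not. Nothing sinister yet -
> this is encapsulation doing its intended job.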
> 3.) In the last 25 years or so the logic of encapsulation has been
> fundamentally and consistently abused for the sake of proprietary
> benefit. This is a major problem.
> 4.) The problem here is not encapsulation per se (even open source
> software is encapsulated) but the abuse to which it is subjected.
> Paraphrasing Dmytri Kleiner, an artist many of you may know, the
> issue isn't technical but political. Whilst others disagree, I am of
> the opinion that computing is an independent real process: it is not
> the logic of encapsulation which is the issue, but its proprietary
> use and abuse which should worry the masses. Don't blame the
> algorithms themselves!
> 5.) Tero's introduction highlights a major update of this abuse.
> Proprietary interfaces are incredible ideological pieces of
> machinery, designed to conceal necessary methods and functions from
> the user. It doesn't matter if Google start spouting off
> self-congratulatory slogans like "This stuff matters"; the abuse of
> encapsulation for proprietary benefit already puts the user on lower
> ground in the technical sense, and it has been going on for years.
> Making the interface more personable and user-friendly, like most
> other functions of encapsulation, is designed to save the user time
> and direct attention away from this abuse.
> 6.) Following Zizek, the worst part about this entire level of abuse
> is that most of the target market already know they are being
> abused. Human animals in the Western world are very sanguine
> creatures. We must never forget that, nor start beating ourselves up
> about it. There is an even more fundamental theoretical reason why
> ideology works so well in this forced removal of complexity, but
> suffice it to say this is a philosophical conversation best left
> elsewhere (unless someone wants to know - in essence, the interface
> isn't just a technical feature of human existence).
> 7.) The forced removal of complexity works for the proprietor (and
> never the consumer), and it occurs through three main principles
> which need attention (I'll end on these three issues).
> 8.) Data mining: The first principle concerns the production of
> public data; the public 'waste', so to speak. Private data is purged
> from the ideological interfaces we deal with day in, day out,
> because the logic of encapsulation is to conceal the private as per
> the programmer's intention. One way of making the consequences of
> this abuse visible to users is to highlight what will happen when
> such large private databases are made public. There is always some
> danger attached to large companies holding our private data, not so
> much in the proprietary abuse as in the heightened fallout when that
> data is unexpectedly released in public, or is lost. Private data
> (and its sheer volume) is becoming more and more insecure, and this
> should worry us. But again, this is linked to our sanguine nature.
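> To make that fallout concrete, here is a toy sketch in the spirit of
> the widely reported 2006 AOL search-log release (every record below
> is invented): even with names stripped and IDs pseudonymised, the
> queries grouped under one ID re-identify the person behind it.
>
>     import java.util.List;
>     import java.util.Map;
>
>     // Why a leaked 'anonymised' log is still private data:
>     // the queries themselves are the identifier.
>     public class LeakSketch {
>         public static void main(String[] args) {
>             // pseudonymous ID -> that user's search queries
>             Map<String, List<String>> leakedLog = Map.of(
>                 "user-0001", List.of(
>                     "landscapers in smalltown",
>                     "homes sold in shadow lake subdivision",
>                     "genealogy of the surname X"));
>             // No decryption needed: a location plus a surname
>             // narrows one pseudonym to a handful of real people.
>             leakedLog.forEach((id, queries) ->
>                 System.out.println(id + " -> " + queries));
>         }
>     }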
> 9.) Infrastructure, complexity and use: The problem with iPhones is
> that they aren't shitty enough. Again, this is linked to the logic
> of encapsulation and its ability to save us time, as per the Western
> infrastructure of career enforcement and the obsession with social
> attention 'sharing'. Platforms are part and parcel of this
> simplification; I believe that the expected success of iPhones,
> iPads, Androids, even the ubiquity of cloud computing, all work on
> this logic of forced removal of complexity. Witness the steady
> demise of IT departments all across the working world, as businesses
> and start-up companies remove the complex production of IT
> management and leave it to a hosting company of some sort. Platforms
> purchased today are always-already limited right out of the box
> (consider Google's Chrome OS platform). The functional aspect of
> simplified use need only be turned on 'somewhere' for someone.
> Platforms embody an almost obscene level of private encapsulation.
> I'm being grossly unjust to the technical details here, but in
> summary: complexity is worth fighting for. (This ties into Andy's
> point on the oppositional character of glitch - my view would be
> that this is a genuine form of computational complexity, where
> information is irreducible to a user's 'understanding' - i.e.
> corruption or overload.)
> 10.) Rentier profit: By far the biggest profit gain in the last ten
> or so years has arisen from the use of rentier capital in software
> itself. Most private software companies make enormous profits not
> from consumer purchases, platform devices, nor patents (although
> this is increasing too, especially for Apple), but from the rentier
> logic of purchasing licenses to use software in business. Microsoft,
> for instance, make most of their money from enforcing concurrent CAL
> business licenses to use Windows Server 2003 or SQL Server 2005.
> Other software companies follow suit, abusing encapsulation with
> license agreements for the sake of profit. But this also extends to
> other 'forced choice' factors of software and 'web apps' which
> require license keys or T&Cs as a precondition of user agreement.
> Software rentier capital is the biggest form of profit generation we
> have seen outside of investment banking. And it's arguably the worst
> sort, because powerful technology is blocked from use simply because
> one does not have access to the funds required to purchase the
> licenses (which means you never own the thing anyway).
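> To put the rentier logic in plain arithmetic (the prices below are
> invented placeholders, not Microsoft's): the proprietor's revenue
> scales with seats, not with copies of the software sold.
>
>     // Per-seat (CAL-style) licensing: a sketch with made-up prices.
>     public class RentierSketch {
>         public static void main(String[] args) {
>             long serverLicence = 800; // one-off cost, per server
>             long perSeatCal    = 30;  // charged per concurrent user
>             for (int seats : new int[] {10, 100, 1000}) {
>                 long total = serverLicence + (long) seats * perSeatCal;
>                 System.out.println(seats + " seats -> " + total);
>             }
>             // The software is identical in every case; only the
>             // permission to use it is sold, seat by seat.
>         }
>     }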
> all the best
> Rob
> On 27 Feb 2012, at 13:19, Tero Karppi wrote:
>
> Hi all,
> I'll start with a theme that is loosely related to the privatization
> of the web & related platforms.
> On March 1st, Google will implement its new, unified privacy policy.
> This policy will affect the data Google has collected on you as well
> as the data it collects on you in the future. Until now, Google Web
> History has been isolated from Google's other services. However,
> with the new privacy policy in action, Google will begin to combine
> information across its products. According to the Electronic
> Frontier Foundation, Google search data is particularly sensitive
> since it can reveal "information about you, including facts about
> your location, interests, age, sexual orientation, religion, health
> concerns, and more."[1] Hence they have urged people to remove their
> Google Search History before the policy takes effect.
> Google, however, sees the new privacy policy as an improvement to
> its search: "Our search box now gives you great answers not just
> from the web, but your personal stuff too. So if I search for
> restaurants in Munich, I might see Google+ posts or photos that
> people have shared with me, or that are in my albums."[2] In
> addition, the search will be able to better predict what you
> 'really' are looking for and target ads more accurately.
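> A deliberately trivial sketch of what 'combining information across
> products' permits (nothing Google-specific; all service names and
> records here are invented): once separate silos may be joined on one
> account key, a unified profile falls out with no further mining
> needed.
>
>     import java.util.ArrayList;
>     import java.util.HashMap;
>     import java.util.List;
>     import java.util.Map;
>
>     public class UnifiedProfile {
>         public static void main(String[] args) {
>             // Per-service silos: account -> one stored datum each.
>             Map<String, Map<String, String>> silos = Map.of(
>                 "search", Map.of("alice@example.com", "restaurants in Munich"),
>                 "photos", Map.of("alice@example.com", "album: Munich 2011"),
>                 "social", Map.of("alice@example.com", "post: dinner plans"));
>             // The 'unification' is a plain join on the account key.
>             Map<String, List<String>> profile = new HashMap<>();
>             silos.forEach((service, records) ->
>                 records.forEach((account, datum) ->
>                     profile.computeIfAbsent(account, k -> new ArrayList<>())
>                            .add(service + ": " + datum)));
>             System.out.println(profile);
>         }
>     }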
> Now, what interests me here, at a more abstract level, is the change
> we are witnessing in relation to data mining. Until now, more or
> less, the data we share on various platforms (browser, search,
> social media, iOS/Android, etc.) has been mined, combined into
> statistics and potentially sold onwards, but we haven't really seen
> it in action except in some more or less accurately targeted ads.
> Now, however, we are witnessing a throwback of our own data: Google
> is beginning to make search more personal, and Facebook has
> frictionless sharing, to name a few examples.
> What are the implications of this change? Is 'social' media now
> becoming more 'individual' and 'personal'? What should we think of
> these algorithms that predict what we want?
> References
> [1] https://www.eff.org/deeplinks/2012/02/how-remove-your-google-search-history-googles-new-privacy-policy-takes-effect
> [2] http://googleblog.blogspot.com/2012/01/updating-our-privacy-policies-and-terms.html
> Best,
> Tero
> --
> Tero Karppi (MA)
> Doctoral Student | Media Studies | University of Turku
> http://www.hum.utu.fi/oppiaineet/mediatutkimus/en/tero_en.html
> _______________________________________________
> empyre forum
> empyre at lists.cofa.unsw.edu.au
> http://www.subtle.net/empyre