[-empyre-] More tactics for engaging with social media algorithms
domenicobarra.db at protonmail.com
Wed Feb 17 23:43:08 AEDT 2021
Hello, and sorry for catching up on this so late.
I must say, all of the projects and stories I have read and seen are blowing my mind, from Going Viral to WarTweets to Modems sans Frontiers. Quality work.
I want to share with you a journey I took for some months on Facebook. More than investigating algorithms, I wanted to trigger them, and to discover human conditions and behaviors in the realm of algorithms. The best way to do so was to create a fake profile and be someone else. I created a character, Addle Eratat, an anagram of Altered_Data, the artist name I use in some of my art projects. I became Addle Eratat on Facebook.
I didn't know what to expect or what I was going to go through. All I knew was that I would do a human performance for algorithms: be the best user FB had ever had, use all of its functions, from check-ins to tagging, everything, creating tons and tons of data. I even ran a kind of time-raced Like Marathon: putting as many likes as possible, in the shortest time possible, on all of the pages suggested by the algorithms, until I got blocked. I wanted to do with FB everything I did not feel comfortable doing with my own FB, my "authentic" FB persona: take stereotypes to the extreme, see what it felt like for other people to be on FB, be in someone else's shoes on Facebook.
The rules were simple. Every week, change character, change persona. During the week, do the best, the most, and the worst that persona would do. I disguised myself as someone else in groups to see how far people would believe I was real, engaging with groups, events, and fan pages, everything to the extreme. I must say, nobody ever believed I was a fake. This is absurd, especially considering that I never deleted the previous characters' posts; it was all left there to be seen. I found Addle Eratat in some bizarre and obscure corners of the internet, the FB we would never experience. I was a middle-aged American woman, divorced and working at Walmart. A Venezuelan man just released from prison. A young student from Lebanon. A gay man from France. An Italian boy, a wannabe influencer obsessed with beauty and fitness. And many other fake personas.

At some point I gave the user ID and password to other people and asked them to become Addle Eratat, to become someone else, as a performance, a theater, a stage. As Addle Eratat I experienced a FB that was totally out of my reach. There is an underground world happening within FB, away from our circles and communities, that I could not even have imagined. When things went a little too far off the limit, I had to tell people it was a fake. I often reported abusive men sending photos of their penis over and over again, or young boys from Pakistan asking to marry me so they could become US citizens and leave Pakistan.

I stopped the project because at some point it was mentally and emotionally too much to handle. I started to feel bad, as I was cheating people and their trust, or even creating opportunities for their abusive behaviors, triggering those acts. I realized the project was going in a direction I could no longer control, far away from what I expected. And it felt really wrong. I won't say I regret it, but I felt uncomfortable with my actions; even just logging in felt wrong. After a few months I stopped the performance.
It was becoming too engaging, too real.
In the end I had a FB profile loaded with data. This was back in 2016. I found a project online, applymagicsauce.com/test.html , and fed it all of the data from Addle Eratat's FB. This project was, in a more advanced form, what Cambridge Analytica later used.
You can read more about Addle Eratat here: http://www.dombarra.art/addleeratat
Thank you so much, and I look forward to reading more about your projects.
Email : rockmyworld at dombarra.art
Website : www.dombarra.art
White Page Gallery/s : http://www.whitepagegallery.network
‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
On Monday, 8 February 2021 23:34, Gradecki, Jennifer <j.gradecki at northeastern.edu> wrote:
> Thanks to Renate for the warm welcome and for putting together such an interesting group for this month’s topic. It’s been great to listen to the discussions so far on issues that I’ve also been reflecting on for several years, including social media dataveillance, mechanisms of power and control as they pertain to data and information, and of course, surveillance capitalism.
> I wanted to introduce a few projects I’ve done in collaboration with Derek Curry as a way to start discussing how we’ve been engaging with these issues through artistic research.
> Our first investigation into social media monitoring was with the Crowd-Sourced Intelligence Agency, beginning in 2015. The CSIA was a partial replication of an open-source intelligence (OSINT) processing system that allowed participants to see how their Twitter posts may be interpreted in the context of OSINT systems used by law enforcement and intelligence agencies. The app had multiple components, but one of them in particular resonates with Ben’s discussion of how obfuscation tactics can produce tension for the participant. We allowed users to check the content of their tweet against our predictive policing algorithms and social media monitoring keywords (taken from DHS’s FOIA’d list of search terms). Once they saw how their tweet might look to an intelligence analyst, they could choose to create information overload through false positives, obfuscate their message through code words, or opt to delete their tweet and not engage at all. One of our main tactics with the CSIA was reverse-engineering dataveillance technologies in order to provide a practice-based understanding of these secretive techniques. We’ve described this as providing a ‘visceral heuristic’ that allows participants to see how biases come into play during decision-making processes that are framed through dataveillance technology. We found it impactful for people to see their own messages decontextualized and recontextualized within a surveillance interface.
> I share Ben’s concern for the “surveillance-based engagement- and profit-motivated monopolistic-platform-enabled algorithmic feed”; this is a good way to phrase it. Through two recent projects (from 2020), Derek Curry and I have been exploring how the spread of coronavirus misinformation and disinformation on social media has been driven by celebrity, scientific uncertainty, and the targeted advertising business model used by these platforms. Infodemic is a neural-network-generated video that questions the mediated narratives created by social media influencers and celebrities about the coronavirus. The project uses conditional generative adversarial networks, which are sometimes used in deepfakes, quite differently: rather than creating a convincing likeness of one person, we blended multiple people together in each corpus, so the celebrity talking heads mutate, much like the virus, and like misinformation online. Likewise, Going Viral uses the same algorithms and techniques as Infodemic, but it invites people to share COVID-19 informational videos featuring algorithmically generated influencers delivering public service announcements or presenting news stories that counter the misinformation they have spread. With these projects, we are using the guise of celebrity to incite the viewer to question the veracity of information that they encounter online.
> Reflecting on Renate’s excellent framing of this month’s topic, my position, as a new media artist engaging with these complex topics, has been focused on developing and democratizing a technical and experiential understanding of the current virtual landscape, with the aim of opening up conversations about how these socio-technical systems are designed and used, beyond just insular groups of surveillance capitalist technocrats, to anyone who may be impacted by them. I think this can be an effective way to begin formulating responses and alternatives to things like predictive policing and social media echo chambers.
> I’d love to hear from others in the group and on the listserv about tactics they’ve been using to investigate social media algorithms, surveillance capitalism, and other related topics.
> - Crowd-Sourced Intelligence Agency: http://www.crowdsourcedintel.org/
> - Infodemic: http://www.jennifergradecki.com/InfodemicVideo.html
> - Going Viral: https://www.goingviral.art/
> Jennifer Gradecki, PhD
> Assistant Professor, Art + Design
> Lake Hall 213A