[-empyre-] More tactics for engaging with social media algorithms
Gradecki, Jennifer
j.gradecki at northeastern.edu
Tue Feb 9 09:34:13 AEDT 2021
Thanks to Renate for the warm welcome and for putting together such an interesting group for this month’s topic. It’s been fascinating to follow the discussions so far on issues I’ve also been reflecting on for several years, including social media dataveillance, the mechanisms of power and control that pertain to data and information, and, of course, surveillance capitalism.
I wanted to introduce a few projects I’ve done in collaboration with Derek Curry as a way to start discussing how we’ve been engaging with these issues through artistic research.
Our first investigation into social media monitoring was with the Crowd-Sourced Intelligence Agency, beginning in 2015. The CSIA [1] was a partial replication of an open-source intelligence (OSINT) processing system that allowed participants to see how their Twitter posts might be interpreted within the OSINT systems used by law enforcement and intelligence agencies. The app had multiple components, but one in particular resonates with Ben’s discussion of how obfuscation tactics can produce tension for the participant. Users could check the content of their tweet against our predictive policing algorithms and social media monitoring keywords (taken from DHS’s FOIA’d list of search terms), and once they saw how their tweet might look to an intelligence analyst, they could choose to create information overload through false positives, obfuscate their message through code words, or delete their tweet and not engage at all. One of our main tactics with the CSIA was reverse-engineering dataveillance technologies to provide a practice-based understanding of these secretive techniques. We’ve described this as providing a ‘visceral heuristic’ that allows participants to see how biases come into play during decision-making processes framed through dataveillance technology. We found it impactful for people to see their own messages decontextualized and then recontextualized within a surveillance interface. A rough sketch of the kind of keyword check at issue is below.
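To make the mechanism concrete, here is a minimal Python sketch of the kind of keyword matching an automated OSINT monitor might run against a tweet. This is only an illustration, not the CSIA’s actual implementation, and the WATCHLIST below is a tiny placeholder excerpt rather than the full DHS list:

import re

# Placeholder excerpt standing in for the FOIA'd DHS search-term list.
WATCHLIST = {"outbreak", "exercise", "cloud", "plume", "pork"}

def flag_tweet(text):
    """Return any watchlist terms an automated monitor might flag."""
    tokens = set(re.findall(r"[a-z']+", text.lower()))
    return sorted(tokens & WATCHLIST)

tweet = "Headed to the pork roast, hope the plume of smoke clears by noon."
hits = flag_tweet(tweet)
print("Flagged terms:", hits if hits else "none")

Seeing ordinary words like these flagged is exactly the decontextualization the CSIA interface made visible.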
I share Ben’s concern about the “surveillance-based engagement- and profit-motivated monopolistic-platform-enabled algorithmic feed” (a good way to phrase it). Through two recent projects, both from 2020, Derek Curry and I have been exploring how the spread of coronavirus misinformation and disinformation on social media has been driven by celebrity, scientific uncertainty, and the targeted advertising business model these platforms rely on. Infodemic [2] is a neural-network-generated video that questions the mediated narratives social media influencers and celebrities have created about the coronavirus. The project uses conditional generative adversarial networks, the kind sometimes used to make deepfakes, but puts them to quite different use: rather than creating a convincing likeness of one person, we blended multiple people together in each training corpus, so the celebrity talking heads mutate, much like the virus and the misinformation circulating about it. Going Viral [3] uses the same algorithms and techniques as Infodemic, but invites people to share COVID-19 informational videos featuring algorithmically generated influencers delivering public service announcements or presenting news stories that counter the misinformation they have spread. With both projects, we are using the guise of celebrity to incite viewers to question the veracity of the information they encounter online.
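For anyone curious about what “conditional” means here: in a conditional GAN, the generator receives a conditioning vector alongside the random noise. The sketch below is a deliberately minimal, hypothetical example in Python/PyTorch, not the Infodemic architecture; the layer sizes, identity encoding, and the idea of blending by mixing condition vectors are my illustrative assumptions. It shows how a soft mixture over identities, rather than a one-hot label, lets a generated face sit between several people:

import torch
import torch.nn as nn

NUM_IDENTITIES = 4      # hypothetical number of people in one corpus
LATENT_DIM = 100        # size of the random noise vector
IMG_PIXELS = 64 * 64 * 3

class ConditionalGenerator(nn.Module):
    """Toy conditional generator: noise + identity condition -> image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_IDENTITIES, 256),
            nn.ReLU(),
            nn.Linear(256, IMG_PIXELS),
            nn.Tanh(),  # pixel values scaled to [-1, 1]
        )

    def forward(self, z, identity_mix):
        # identity_mix is a distribution over identities: a one-hot vector
        # conditions on a single person, a soft mixture blends several.
        return self.net(torch.cat([z, identity_mix], dim=1))

gen = ConditionalGenerator()
z = torch.randn(1, LATENT_DIM)
blend = torch.tensor([[0.5, 0.3, 0.2, 0.0]])   # a blended "celebrity"
frame = gen(z, blend).view(1, 3, 64, 64)
print(frame.shape)   # torch.Size([1, 3, 64, 64])

Blending the condition vector, rather than committing to one identity, is one simple way to get the mutating, in-between faces I described above.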
Reflecting on Renate’s excellent framing of this month’s topic: my position, as a new media artist engaging with these complex topics, has focused on developing and democratizing a technical and experiential understanding of the current virtual landscape, with the aim of opening up conversations about how these socio-technical systems are designed and used beyond insular groups of surveillance-capitalist technocrats, to anyone who may be impacted by them. I think this can be a productive way to begin formulating effective responses and alternatives to things like predictive policing and social media echo chambers.
I’d love to hear from others in the group and on the listserv about tactics they’ve been using to investigate social media algorithms, surveillance capitalism, and other related topics.
Links:
1. Crowd-Sourced Intelligence Agency: http://www.crowdsourcedintel.org/
2. Infodemic: http://www.jennifergradecki.com/InfodemicVideo.html
3. Going Viral: https://www.goingviral.art/
--
Jennifer Gradecki, PhD
Assistant Professor, Art + Design
Lake Hall 213A