[-empyre-] Data Visualization and Decelerationist Aesthetics

Annina Ruest arust at syr.edu
Thu Jul 14 01:26:03 AEST 2016


Yes, I am aware that it is a dilemma that the pies do not show more information beyond the woman/man gender ratio. In practice, it is a bit more complicated: when I (manually) count gender ratios, I check how a person talks about themselves in their bio, and that is how I count them. Oftentimes, people choose to keep more specific information about themselves private from the general public. I really don’t want to deduce anything (gender, race, sexual orientation, etc.) about anybody beyond what they make public to me. Gender seems to be something that people generally feel free to share, and it seems to me that, at least outwardly (in a public bio), people tend to describe themselves as either a woman or a man.

Although they are limited, a set of woman/man gender ratios can reveal issues that may not be immediately obvious. For example, in the last iteration of my project I collected gender ratios from both art and tech and found that the share of women represented in many art venues is about the same as the share of female technologists working in major tech companies. Tech has the reputation of being grim for women, but based on the data I collected it seems to me that art is even worse, given that more women graduate from art programs than from CS and engineering programs. In general, it can be said that creator roles (whether in art or in tech) are more available to men than to women.
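
For anyone curious about the arithmetic behind these comparisons, here is a minimal sketch in Python (the venue names, labels, and numbers are made up for illustration; my actual counting is done by hand from public bios). It turns manually recorded, self-reported labels into a share of women per venue or company, and keeps everyone who does not publicly state a gender in a separate "unknown" tally rather than guessing:

    from collections import Counter, defaultdict

    # Each record is (venue_or_company, self_reported_label), recorded by hand
    # from public bios; anything not publicly stated stays "unknown".
    # These example rows are hypothetical, not real counts.
    records = [
        ("Example Gallery", "woman"),
        ("Example Gallery", "man"),
        ("Example Gallery", "man"),
        ("Example Tech Co", "woman"),
        ("Example Tech Co", "man"),
        ("Example Tech Co", "unknown"),
    ]

    # Tally labels per venue/company.
    tallies = defaultdict(Counter)
    for place, label in records:
        tallies[place][label] += 1

    # Compute the share of women among people who publicly identify as
    # a woman or a man; report the rest separately instead of guessing.
    for place, counts in tallies.items():
        counted = counts["woman"] + counts["man"]
        share = counts["woman"] / counted if counted else float("nan")
        print(f"{place}: {share:.0%} women ({counted} counted, "
              f"{counts['unknown']} not publicly stated)")

The only point of the sketch is that the percentages are computed strictly from people who publicly describe themselves as a woman or a man; everyone else is reported separately instead of being deduced.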

So here’s a question: I’ve read your posts, but I still don’t get your objection to overproduction. I understand that we (as a society) live with abundance and that we are wasteful. But when it comes to creative production or working for social justice or a political cause, it seems to me that it is not so clear-cut. Are there rules about when it is o.k. to be wasteful and when it is not? How do you decide whether you would want more of something and when it is too much? If you approach everything from the point of view of “we are wasteful,” do you not just find more waste? My project obviously has the potential for food waste (among other wastes) because it uses food to “perform”. So this interests me. Please let me know!

Thanks!
Annina



> On Jul 12, 2016, at 8:31 PM, Katherine Behar <kb at katherinebehar.com> wrote:
> 
> ----------empyre- soft-skinned space----------------------
> Hi, 
> 
> Thank you for sharing these examples, Annina and Catherine. There are some great projects out there. Annina’s pies and A Sort of Joy are favorites. Let me try to explain. 
> 
> Visuals aside, I’m super interested in what these projects do performatively. I’m less interested in works that reiterate data because they can wind up reiterating data’s problems too. For example, to take something fairly obvious, visualizations that picture data in male/female or women/men terms subscribe algorithmically to a cultural gender binary. Such binaries will fail to culturally account for—or algorithmically count at all—folks who identify as various shades of gender queer. Or worse, they might misrepresent individuals by ascribing them to categories that don’t match their gender identity. Lisa Nakamura writes powerfully about this with regard to race. Even when such visualizations are visually inventive (and some are great on those terms), they may still risk re-performing problems with data and its collection.
> 
> But I love the pies! Why? Because to my mind, projects like the pies and the MoMA performance are able to performatively speak to some surplus or illogical quotient beyond what the data at first glance appears to measure. And that, I think, is important feminist work. 
> 
> Annina, I think your project goes far beyond reductive binaries by providing a remarkably undigested (pun intended) surplus—the pie itself! The pie makes such a great, subversive addition to the data because it also references the types of affective labor that are conventionally ascribed to women, such as care taking, pie baking, and gift giving, that are problematically left uncounted in census-type gathering of labor data and/or don’t conform to the economic model of wage labor. Plus, the pie conveys the crucial sense that the data is incomplete—there might be missing data that got eaten or there might be external data being produced in the conversations shared over a slice. 
> 
> By far my favorite is the edible magic marker version, in which the failure of the robot to adequately represent the data is foregrounded to hilarious effect. It’s so fun and absurd that it made me laugh out loud. I think that piece is a great send-up of the inherent sketchiness of all pie charts to begin with, which were, after all, famously despised by the likes of Edward Tufte.
> 
> I’d be interested to see more of the performance, but I felt that the polyvocal layering was working in a similar direction toward an indication of surplus.
> 
> Best, 
> Katherine
> 
> 
> 
> 
>> On Jul 12, 2016, at 1:32 PM, Annina Ruest <arust at syr.edu> wrote:
>> 
>> ----------empyre- soft-skinned space----------------------
>> A Sort of Joy is super cool! Here’s a more conventional visualization of museum artists by Misha Rabinovich, from a workshop on feminist data that Micol Hebron and I did at LACMA. It shows the top 100 exhibited artists at LACMA, color-coded by gender.  http://misharabinovich.com/blog/?p=250 
>> 
>> Here is another one, made by Lisa Fedak (from the same workshop). It uses the generative nature of Google search to show what Google really thinks about equal pay:
>> http://gallerytally.tumblr.com/post/141212943217/poster-for-google-search-regarding-equal-pay
>> 
>> Annina
>> 
>>> On Jul 12, 2016, at 6:33 AM, kanarinka <kanarinka at ikatun.org> wrote:
>>> 
>>> ----------empyre- soft-skinned space----------------------
>>> Another piece in a similar vein is A Sort of Joy (Thousands of Exhausted Things) by the Office of Creative Research and Elevator Repair Service Theater, in which the groups sorted the entire MoMA archive by author name and then proceeded to read the names as performance art. Men read men's names and women read women's. Sorted in this way, the starkest thing that emerges from the metadata is the pervasive Anglo-ness/Western-ness ("John", "Michael", etc.) and maleness of the archive. 
>>> 
>>> https://vimeo.com/133815147
>>> 
>>> Catherine
>>> 
>>> 
>>> ///////////////////////////// 
>>> kanarinka at ikatun.org   ||   @kanarinka   ||   +1 617 501 2441   ||   www.kanarinka.com
>>> 
>>> On Mon, Jul 11, 2016 at 3:57 PM Annina Ruest <arust at syr.edu> wrote:
>>> ----------empyre- soft-skinned space----------------------
>>>> On Jul 11, 2016, at 12:43 PM, Katherine Behar <kb at katherinebehar.com> wrote:
>>>>     *Here I wonder, how does this fit with Catherine’s distinction between data vis critique and generative data vis? I think all data visualization is generative, and I’m concerned about that because I feel that we don’t need *more* of anything right now; if I may use the royal “we,” we are already producing too much, too quickly, I think.
>>> 
>>> 
>>> I would like to offer an example of generative feminist data visualization that I cannot get enough of and explain why I think it’s so awesome in all its generative, data-visualization-producing glory: I consider the Gallery Tally Project (directed by Micol Hebron) http://gallerytally.tumblr.com/ to be a generative feminist data visualization project. Micol started counting how many women/men were represented by commercial galleries in specific cities and asked people to make posters using the data for group exhibitions online and offline. I would argue that Micol created an algorithm that, through crowd-sourcing, produces a wealth of data and feminist visualizations, creating absolutely indispensable results. I wish that there were even more of an overproduction of feminist data of the kind that Gallery Tally produces. I also think that Gallery Tally is a great model for showing that we are not just at the mercy of algorithms producing terrifying racist data (such as the results described in the NYT article on AI that Catherine linked [1]), but that we can influence what kind of data is being produced by insisting that intersectional feminism is indispensable to any kind of algorithmic pursuit: in many tech-centric departments (CS, engineering), the humanities are treated as a sideshow, but they are more essential to tech production than ever.
>>> 
>>> 
>>> Annina
>>> 
>>> [1] http://www.nytimes.com/2016/06/26/opinion/sunday/artificial-intelligences-white-guy-problem.html?_r=0
>>> 
>>> _______________________________________________
>>> empyre forum
>>> empyre at lists.artdesign.unsw.edu.au
>>> http://empyre.library.cornell.edu
>> 
>> _______________________________________________
>> empyre forum
>> empyre at lists.artdesign.unsw.edu.au
>> http://empyre.library.cornell.edu
> 
> _______________________________________________
> empyre forum
> empyre at lists.artdesign.unsw.edu.au
> http://empyre.library.cornell.edu


