
»To me, the users are not abstract but real people who got hurt«

Caroline Sinders is a machine-learning researcher, designer and artist. She works for clients ranging from IBM and Mozilla to Amnesty International and the United Nations. But however diverse her clientele, her focus is always on human rights and activism.


It’s hard to decide where to begin when looking at the work of Caroline Sinders and at the idealism and resilience it takes to dive into the abyss of hate speech and conspiracy theories like she does. Instead of despairing, the Louisiana-born artist, who currently lives in London, makes her findings available to interested parties or turns them into research-driven art. She is a member of the non-profit media collective Unicorn Riot and has shown her art at Tate Modern, MoMA PS1 and Ars Electronica. We talk about her online investigations, about feminism and AI, about data sets of cypress trees, and about why art is a perfect tool for understanding the mechanisms of the digital world.

You’re known for your work as a machine-learning design researcher and artist. But you started out as a photojournalist. How did you get into technology ethics?
Caroline Sinders: I’ve always been interested in technology. I just didn’t know how to make that my job. When I went to high school in the early 2000s, people didn’t understand technology well enough to recommend going into user research or experience design, or to suggest majoring in political science with a focus on the internet. As I had already won some awards for my pictures, I studied photography at NYU and combined it with ethnographic research, showing people and communities and the issues they face.

What kind of photography projects did you work on exactly?
I was photographing Jewish Russian immigrants who came to Brooklyn in the 1990s, and the disappearance of the furniture supply stores in one particular street in Lower Manhattan. I also did a comprehensive project about my family and our community after Hurricane Katrina. I was interested in how you portray a city after a disaster, knowing that the traumatic scars are there even if the city is rebuilt. It’s called How to Explain a Hurricane to an Algorithm because I’m still working on it, bringing technological research into it.

Since 2005, when Hurricane Katrina hit her hometown New Orleans, Caroline Sinders has been working on her interactive project How to Explain a Hurricane to an Algorithm, using family pictures and experiences and “AI, data, loss”, as she describes it

Someone wrote that you now walk through the internet like a photojournalist with a camera …
I’ve always been interested in people and the world around them. Where I used to be on site, I’m now investigating the problems communities are facing online.

The problems you are talking about are harassment, abuse and misinformation. In investigating them, you also always focus on design. As opposed to human-centered design, you are calling for human rights centered design.
But it is not opposed to human-centered design, I just think it needs an update. And that means including human rights research and human rights findings in the design process. It involves community research, collaboration and absolute transparency. Not everyone can add this to their work process right away, but it’s slowly getting there. So I hope that there is a shift away from the traditional design process, especially in social networks and big technology.

Could you explain this a little bit further?
We have to recognize that many of the products we’re building have human rights implications. The easiest ones to point out are social networks, but there are many others, too. Design doesn’t just randomly appear. When we are online, we’re still part of the world and of a system that has ties to certain interests and politics, and you have to understand that all different kinds of harm can manifest in a digital space. You also have to understand that design is not just what something looks like but the whole pipeline of what we are building. Product and technology teams often don’t think through the implications of what they are doing and how much harm they can cause.

»Instead of thinking about personas you have to think about who your most vulnerable users are«

What are the exact problems?
I think one is that designers often work with personas, but with very flat ones. Human rights centered design asks you to think about what would happen if someone used your product in a country that is not your own. Have you thought through its political structure? And what might be helpful or hurtful to somebody there? Imagine that you are an email provider and you require people’s real names and identities for identification. How secure is this? And what happens when somebody gets access to this material? In countries undergoing political strife, how safe is it for people to use your email service to connect with each other? Human rights centered design should get people to really think through their design process on a broader and more pluralistic scale. These are things I don’t see in an ideation room where new products are being built. Instead of thinking about personas you have to think about who your most vulnerable users are. When you design for their needs, you generally design for a safer experience. Or look at location services: they are often turned on by default, before you can turn them off. That could be very unsafe for some people.

Is this a task designers should take on, one they often neglect? Also with an eye on AI?
That’s a good question. I would say that our role as designers is to build legible, harm-reducing experiences and to put the safety of users at the center. There is still a lot of work to do and a lot of questions to be answered. Should we build data warning labels? Or can there be more explanatory AI that shows the user how the AI is working within the product? That is something I’m super interested in. Google search and the Discover Weekly playlist from Spotify are very popular AI products. Spotify designers say that their research shows users don’t really care how the AI works. But I think a lot of users do want to know why something is shown to them. Concern about data is growing, and so is the wish for more privacy, even in the US, which is built on consumerism, innovation and capitalism, and where privacy policy is therefore not a cultural expectation people have.

One of Caroline Sinders’ best known works is her Feminist Data Set, a project that questions every step that must be taken to build an AI – and points out difficulties and impossibilities

You have a great metaphor for AI: you say AI is like salt.
Oh yes, that metaphor comes from A People’s Guide to AI. It’s a fantastic zine that I recommend everybody read. I really like the salt metaphor, which says that by itself salt is not interesting, it’s even inedible. But in the right quantity it’s transformative. Just like AI, which on its own is just pieces of code that can’t do anything. AI needs a context, or to be programmed to do something. I think it’s helpful to demystify AI: it cannot have a mind of its own and harm you.

The term “human” runs through all your work. It also appears in relation to data sets, the basis of machine learning. You emphasize the humanity of data, that it’s people’s output and must be treated with respect.
There’s this nonchalance about getting and using data. When I did my master’s in a technology design program, we built a lot of stuff and therefore used lots of data sets. But we weren’t really thinking about where they came from or why it was so easy to have access. We even scraped data from Twitter. I saw this attitude in the industry as well. There, data is something you get very easily and very cheaply and without passing a lot of safeguards. Data feels like a ubiquitous common material, but in fact it’s people’s information and we need to be protective of it.

»I would say that our role as designers is to build legible, harm-reducing experiences and to put the safety of users at the center«

Online harassment is also an important part of your work. You’re investigating white supremacy groups, the alt-right and other sources of hate speech. How do you deal with so much traumatic data?
I really appreciate you’re asking this question, because it doesn’t get asked a lot. When I’m focusing on hate speech, I never work full-time on a project. That’s extraordinarily helpful for building resilience. A lot of researchers do it this way, because if you’re studying this kind of traumatic subject matter, it deeply affects you; it can give you nightmares and panic attacks. I’m also in constant contact with the group of people I’m working with, and that helps, too. But if I still feel affected by the subject matter, I take time off.

What do you do with the findings? Do you use them for your projects, or is it a kind of activism?
I try to make sure that it’s both. Sometimes I make art about this information or disinformation, and I also work with a journalism collective called Unicorn Riot, and we publish neo-Nazi Discord channels online so that journalists can use them. Online harassment, hate speech and white supremacy are still some of the major things I’m focusing on. This also means that I spend a lot of time interviewing victims of harassment, and I think this is also why humans are so centered in my work. To me, the users are not abstract but real people who got hurt.

Her installation Within the Terms and Conditions, which was recently on view in a London gallery, is based on weeks of research in which Sinders explored the mechanisms of neo-Nazis, alt-right groups and conspiracy theorists on YouTube. Photo: Tim Bowditch

You say that this work gave rise to the idea for the Feminist Data Set, which is one of your best-known projects.
Working so much with hate speech and harassment, I really wanted to do something that was the opposite of that. That’s how the idea for the Feminist Data Set came up. It is a critical design art project that uses intersectional feminism as an investigatory framework and deliberately biases machine learning. Looking at the machine-learning pipeline from start to finish, it is effectively asking whether it’s possible to build a feminist AI system. And not one that just focuses on the algorithm to be used, but literally on every step: assembling data – what could a feminist way to do that look like? What would be a feminist way to clean and label data? Which tools are we using? And so on. The project looks at every small step and asks if there is an intersectional feminist alternative. And if not, how do I make one?
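To make the pipeline she describes more concrete, here is a minimal Python sketch of those stages – assembling, cleaning and labeling data – with the questions the project raises attached as comments. All texts, labels and helper names are hypothetical illustrations, not the actual code or data of the Feminist Data Set.

```python
# A minimal sketch of the machine-learning pipeline stages the
# Feminist Data Set interrogates. Everything here is a hypothetical
# placeholder, not the project's real code or data.

def assemble() -> list[str]:
    # Assembling: where does the data come from, and who consented?
    return [
        "  An example community-submitted text.  ",
        "Another example text, gathered in a workshop.",
    ]

def clean(text: str) -> str:
    # Cleaning: even trimming and lowercasing are editorial decisions.
    return text.strip().lower()

def label(text: str) -> dict[str, str]:
    # Labeling: who labels the data, and are they paid fairly?
    return {"text": text, "label": "unlabeled"}

dataset = [label(clean(t)) for t in assemble()]
print(dataset)

# Training would follow here; the choice of library, cloud provider
# and hardware is yet another step the project asks us to question.
```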

How will something like this be implemented?
It’s a critical design project that highlights all the problems and concessions we have to make. I think it will manifest in a visual book and as a series of installations that show each step of the project. Unlike with hardware, it’s much harder to show the process and the software. The Feminist Data Set should make people think about the whole framework. I’m a white woman making the set. This is something that, like many other things, needs to be pointed out. I’ll have to use Amazon Web Services because I cannot build my own version. I have to use a computer, but where does the silver in its microchips come from? How ethically sourced is it? These are all problems we have to be aware of. It’s just not enough to be part of a collective that has good intentions; that doesn’t solve everything. You must always question your own practice.

So your research-driven art is giving people ideas and making things visible in another way?
Yes, that’s a good way to describe it. It makes certain urgencies visible and tangible. That’s why some social scientists, NGOs and think tanks are starting to work with artists: they recognize the power of art and its ability to make invisible things visible. Even if you cannot change things immediately, you can show the problem and potential alternatives, or provoke discussions about solutions. Another topic I’ve just started to research is scale. One of the questions is: if a project, a collective, an open-source tool or a community grows to a certain size, does it become inequitable? Is there a certain size at which this starts to happen? Or is it possible to create a cooperative as an alternative to Uber, for example? Do we need other forms of technology for that? And what else do you need to ensure that it still treats people fairly? Especially in the US, that’s the total opposite of how we’re taught to build things. I’m very interested in these sorts of questions.

Caroline Sinders’ latest project, supported by the Ars Electronica Futurelab and created together with artist Anna Ridler, is a data set of cypresses that tells of climate grief and disappearing trees

You are influenced by artist groups like Forensic Architecture, which is known in Germany for its investigations of the racially motivated NSU murders, and also by Scan Map, which works in New York. What interests you about their work?
Scan Map is a collective that came together right after the death of George Floyd and the protests that followed. With access to police radio frequencies in New York City, they published police movements in real time, so the protesters knew when they had to move or leave. Scan Map came together really quickly, while the research of Forensic Architecture can stretch over years. They both work in the art context but without the ego of a traditional artist, and their work embodies usefulness. And usefulness is the whole ethos of my design work, my art and my research work. I’m always struck by the amount of work Forensic Architecture is doing, by their comprehensive investigations and the different kinds of experts they work with – from audio experts to data visualizers, machine-learning experts and all kinds of specialists in open-source investigation methods. That is really powerful. Other projects I really admire are Radical Love by Heather Dewey-Hagborg, who made 3D portraits out of DNA that Chelsea Manning sent her – portraits that don’t look like Chelsea at all and show the inaccuracy of data sets and the gender identity stereotypes in forensic DNA phenotyping. Or The Library of Missing Datasets by Mimi Onuoha, which visualizes that in a society of obsessive data collection, some information is simply excluded.

You just had an exhibition yourself in London called Within the Terms and Conditions, which deals with YouTube videos.
Yes, I spent three months watching YouTube videos, 150 hours altogether, I think. One thing about my art projects is, if I’m incorporating data into them, I always like to look at all of it. I had also been reading a lot about misinformation, including about the AfD in Germany and their use of YouTube, and I studied the work of the British NGO Hope not Hate. I studied themes like anti-vaccine videos and 5G conspiracy theories and how they were emerging. And the rise of white supremacists staging their YouTube shows to look like newscasts: they have a desk, they wear a suit and tie and have big graphics behind them. I tried to find their channels and collect what they were talking about.

»Data feels like a ubiquitous common material, but in fact it’s people’s information and we need to be protective of it«

That sounds challenging.
What helps me is that I get into a mindset where I kind of tune out what it is I’m looking at. I turn the subtitles on and fully focus on the argumentation and whether it fits into the methodology. That gives me a shield. But at one point during the pandemic, when I lived with my partner in a one-bedroom apartment, he asked me to stop because he couldn’t stand the content any more.

Another project of yours is the one you’re doing at the Ars Electronica Futurelab at the moment.
Yes, it’s a project about climate change, and I’m collaborating with Anna Ridler, of whom I’ve been a big fan for many years. We’re both interested in climate grief, a term coined by the researcher Britt Wray. We were interested in handmade data sets and in an AI project focusing on this issue. Anna’s family is from Georgia and I’m from Louisiana – two states that share similar ecosystems and lie on the Gulf Coast. The coastline and the trees are disappearing there, and we decided to build a data set out of cypress trees. Especially in Louisiana, these very tall indigenous trees make up the swamp. It’s a very rich ecosystem, a wetland with a variety of flora and fauna, which is also a barrier against hurricanes. But due to climate change, the drilling for oil and the dredging of the swamps to make way for ships, there are fewer and fewer of them. At the beginning of the pandemic, I had to travel back to Louisiana because a family member had passed away. During that time, I photographed the data set for Anna and me to annotate – and for Anna to build the AI video we’re creating.
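As an aside, a handmade data set like this typically pairs each photograph with hand-written annotations. Here is a minimal Python sketch of what that organization could look like; the folder name, file pattern and metadata fields are hypothetical illustrations, not the artists’ actual schema.

```python
# A minimal sketch of organizing a handmade image data set for
# annotation. All paths and fields are hypothetical examples.
import json
from pathlib import Path

records = []
for image_path in sorted(Path("cypress_photos").glob("*.jpg")):
    records.append({
        "file": image_path.name,
        "species": "bald cypress",          # example hand-written label
        "region": "Louisiana Gulf Coast",   # example metadata
    })

# Store the annotations alongside the images for later model training.
Path("annotations.json").write_text(json.dumps(records, indent=2))
print(f"Annotated {len(records)} images")
```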

Putting your insights and demands together: are these the necessary prerequisites for an ethical technology of the future?
Obviously the problems are more complex than we could ever get into in one project or talk. But another important thing I want to point out is that at the core of the human rights centered design ethos is the ability to take criticism and, instead of rebuking it, actively solve the problem. A bug is a feature until you fix it. It is important to remove our own egos, apologize for any harm and minimize it quickly and efficiently. There must also be easier channels for criticism and a feedback process – like when an app is updated and people can respond to it. We need similar mechanisms in the design process, and much more transparency with regard to what has changed and why. It’s all about caring and repairing, and I think that’s the only way to really improve things.
