Fortifying Truth in the Age of Deepfakes: Lessons from Yvette Alberdingk Thijm, Former Executive Director of Witness

February 23, 2024 | Podcasts, Tech Matters Podcast

Tech Matters Podcast on Spotify
Tech Matters Podcast on Apple
Tech Matters Podcast on Pocket Casts
Tech Matters RSS Podcast Feed

“Seeing is no longer believing. You need to fortify the truth.” (Yvette Alberdingk Thijm)

Generative AI, deepfakes, synthetic media… we’ve known the risks for a while, but the dangers feel ever closer given the amazing advancements of companies like OpenAI. Meanwhile, human rights activists have faced these risks for years.

Witness was ahead of its time, giving people tools to document human rights violations (a departure from conventional notions of “acquiring data”) and prioritizing authenticity, personal security, and safety through its focus on human rights use cases. Authenticity is a major issue: amid the ongoing challenge of combating weaponized misinformation and disinformation, organizations like Witness have been navigating all the complexities of trust: trusting the source, but also trusting those who make the tools to document the truth, especially across cultural and social barriers.

Yvette is an advisor, consultant, and board member, and previously the Executive Director of Witness, with decades of experience leading organizational design and ecosystem thinking at the intersection of human rights, video, and technology. Tune in to hear how she has been addressing trust and authenticity in media!

Transcript

Jim Fruchterman [00:00]

Welcome to Tech Matters, a bi-weekly podcast about digital technology and social entrepreneurship. I’m your host, Jim Fruchterman. Over the course of this series, I’ll be talking to some amazing social change leaders about how they’re using tech to help tackle the wicked problems of the world. 

We’ll also learn from them about what it means to be a tech social entrepreneur, how to build a great tech team, exit strategies, the ethical use of data, finding money, of course, and finally, making sure that when you’re designing software, you’re putting people first. 

Over 20 years ago, I had an idea for a human rights project that I was going to call “The Witness Project”. But then a little bit of research turned up… there already was a Witness Project! Both ideas shared a common theme: How can we shift power and authority and voice to local human rights defenders, instead of reinterpreting or editing what they have to say and getting in between them and the rest of the world?

Today, I’ll be interviewing Yvette Alberdingk Thijm, the long-time Executive Director of Witness, who led the organization for 15 of the last 20 years before stepping down in 2023.

I’m so excited to hear from her about what it took to turn Witness into the human rights powerhouse that it is.

Yvette Alberdingk Thijm [1:28]

It really started back in the 80s when video cameras first got into the hands of amateurs. And one of those amateurs was a musician called Peter Gabriel, and he was on a Human Rights Now! tour with Amnesty International to celebrate the 40th anniversary of the Universal Declaration of Human Rights. 

And he started filming people with a big fat camcorder that he had with him, and he met people who had experienced human rights abuses or had seen members of their family being disappeared, or had experienced torture. And he just realized that that piece of technology was very powerful in the sense that their stories could no longer be buried or forgotten. So that then sparked the notion of Witness. 

Then in 1991, in North America, the Rodney King beating happened. A Black motorist was savagely beaten by Los Angeles police, and that was captured on a camcorder by a guy called George Holliday. Of course, people knew that there was systemic racism and police brutality, but seeing it caught on camera really changed the equation. And that actually led to the idea of Witness being created.

And I think then there was a guy called Jim Fruchterman who had some ideas about how the truth was really important. 

Jim [3:00]

This focus on strengthening local human rights defenders is so exciting to me for three big reasons. The first is that this isn’t just data (even though we talk a lot about local control of data); these are stories and facts, and, compellingly told, they can change societies and the human rights context.

And that’s the second part. International human rights groups are important, but it’s societies that actually change themselves to reduce or eliminate a certain pattern of human rights abuses. And this is why it’s so important that the data be combined with the local context. 

And lastly, the third part is about trust. There’s so much misinformation and disinformation out there, and it’s so easy to make a deepfake, that we need groups like Witness to train local human rights defenders to deliver their stories in a way that carries such authenticity and rings so true that it overcomes efforts to discredit the people trying to defend those human rights.

Yvette [4:05]

It started by literally taking that piece of technology, a video camera, putting it in a cardboard box, and shipping it to the other side of the world. For example, to the Philippines, where people were defending their land, to support activists with a tool for the accountability and justice they were seeking.

And then it evolved together with the way technology evolved. Along the way we started doing training, because people realized that the footage that came back sometimes needed to be more specifically aimed at holding perpetrators accountable, or stronger in advocacy or verification.

And then in 2005, YouTube was launched. This was at a time when there were very few videos of human rights abuses online, right? There were Egyptian bloggers starting to post things online, of police beatings, but these were the early days of such videos actually showing up for the world to experience. So what Witness did at the time was conceptualize something called the Hub, which was sort of the first live human rights video community website. That was quite revolutionary at the time, because even other human rights people said, “No, human rights videos should not be online”. But the essential values that informed it, that people should be able to contextualize videos themselves, and that there needs to be a community of activists with a dedicated space to take action and be in solidarity with each other, I believe those are still very essential values. And there was, I think, a very prescient sense that relying on privately owned, for-profit tech companies and platforms was potentially also very harmful.

So I think that since then, obviously, there have been a lot of learnings from that.

Jim [6:27]

Well, of course, back then, and it continues to be the case, disintermediation was the name of the game, right? New organizations were getting started that said, in effect, “Don’t pay attention to the local activists. We’re going to talk directly to the people on the ground”. And Witness made a deliberate decision not to synthesize the voice of local activists, but instead to hold it up. So, do you think that actually is the right long-term equation?

Yvette [6:59]

Oh, absolutely. Absolutely. To me, it’s a no-brainer. But you’re right, maybe in the NGO world it’s not necessarily a no-brainer. Look at big media, which has never really gotten it fully right and has actually done a lot of harm in how it represents people’s voices. And one particular piece that I know you’re very familiar with is that, at the time, tech was a very glossy thing. Everybody was like, “OK, let’s just make sure there’s an app for that” and “Let’s just make sure we support these communities”. Always with this slightly colonial way of thinking, tech colonialism, I guess you might call it. These communities would get these tools, but the tools were never really coming from the local context, from what people actually need.

I remember, and this is a very old-school example, we used to work with these little mini cameras called Flip cameras, and we did a great partnership with people who were being forcibly evicted in Cambodia. When we started working with them, there was a little blinking red light on that Flip camera. The local activists were doing that work at great risk, because they were under a lot of threat, and they basically said, this blinking red light is going to completely give me away.

Those tools were never developed in consultation with the people who were most at risk.

We’ve always iterated as technology developed, but I think one of the major pivots happened around 2011, after we did the Cameras Everywhere report. Because one of the things we really started to see in the world was the huge volume of user-generated video, and that anybody could be a witness. And the digital divide was closing, even though it was still absolutely there.

And I remember, this is around the time the Syrian conflict started, there were millions of videos, sadly, coming out of that conflict, where people living in Aleppo were just filming what was happening when a barrel bomb fell on their neighbor’s house. Over the years that has led to many guides and resources, very frequently shared and developed together with activists, including, very recently, a video evidence guide for land defenders on how tools like video can support documentation when big mining companies and governments make incursions and commit environmental violations on people’s land.

Jim [9:55]

Well, that’s very timely, because society is going through this giant energy transition, which is going to require us to mine a whole bunch of places that aren’t mined today. And there are people who live there, and based on past experience, maybe that’s not going to go very well for them.

So you’re in this deep partnership with local groups, you’re paying attention to technology, you’re paying attention to what they actually want, you’re actually listening to them. But you’ve also gone in the partnership direction when it comes to technology itself, and I wanted to ask you about the Witnessing App kind of idea.

Yvette [10:30]

Yeah, absolutely. So one thing we learned is technology is hard [laughs], and I think you know that. But we started collaborating with an amazing group called The Guardian Project, who are Android developers, to ask: what are the specific needs if you look at this through a human rights lens? There needs to be more safety and security. How do I share a video securely? And how do I authenticate it? How do I know when and how it was actually taken? What’s the provenance, as we like to say, of that video?

So we worked together and created something called InformaCam, which did something people might now see as a blockchain-like thing: it took all of the data that was married to your video and processed it on your phone, so that you could prove the video had not been tampered with.

Jim [11:29]

Let me interject a little bit here about InformaCam. The idea was to have a specialized app with protections against deepfaking. When you took a video with this specific app, it would store additional information, using encryption and the like, to lock in that the video really was taken on this camera, at this time, in this place, so that tampering would be obvious. A video taken this way by a human rights defender, or a witness to civil rights abuses, would be regarded as far more authentic than something that could have been faked.
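To make that tamper-evidence idea concrete, here is a minimal sketch of the general technique. This is not InformaCam’s or ProofMode’s actual code; the metadata fields, the device identifier, and the use of Ed25519 signing are illustrative assumptions. The core idea is simply: hash the captured video, bundle the hash with capture metadata, and sign the bundle with a key held on the device, so that any later change to the video or its metadata makes verification fail.

```python
# Illustrative sketch only; NOT InformaCam's or ProofMode's real implementation.
# Technique: hash the video, bundle the hash with capture metadata, and sign
# the bundle with a device-held key so any tampering breaks verification.
import hashlib
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def build_manifest(video: bytes, device_id: str, lat: float, lon: float) -> dict:
    """Bind the video's hash to its capture context (fields are hypothetical)."""
    return {
        "video_sha256": hashlib.sha256(video).hexdigest(),
        "device_id": device_id,           # hypothetical device identifier
        "captured_at": int(time.time()),  # capture time (Unix seconds)
        "gps": {"lat": lat, "lon": lon},  # capture location
    }


def sign_manifest(manifest: dict, key: ed25519.Ed25519PrivateKey) -> bytes:
    """Sign the canonical JSON form of the manifest with the device's key."""
    return key.sign(json.dumps(manifest, sort_keys=True).encode())


def verify(video: bytes, manifest: dict, sig: bytes,
           pub: ed25519.Ed25519PublicKey) -> bool:
    """Fail if the video bytes, the metadata, or the signature were altered."""
    if hashlib.sha256(video).hexdigest() != manifest["video_sha256"]:
        return False
    try:
        pub.verify(sig, json.dumps(manifest, sort_keys=True).encode())
        return True
    except InvalidSignature:
        return False


# Usage: the private key stays on the phone; verifiers hold the public key.
key = ed25519.Ed25519PrivateKey.generate()
video = b"...raw video bytes..."
manifest = build_manifest(video, "device-123", -1.2921, 36.8219)
sig = sign_manifest(manifest, key)
assert verify(video, manifest, sig, key.public_key())
assert not verify(video + b"x", manifest, sig, key.public_key())  # tamper detected
```

In a real system the device key itself would need protection (for example, a hardware keystore), and tying the public key to a person raises exactly the identity and anonymity tensions Yvette describes next.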

Yvette [12:00]

I remember the meetings we had then. They were always meetings between activists and technologists, and in that process people would say: What about this? What about that? Would this work? Would that work? And because that process was so inclusive of people’s actual needs and uses, someone would say something like, “Well, there might be one moment where I would like my identity to be known, when I’m a witness myself and I testify to this video in an international criminal justice proceeding”.

But there are many other cases where I don’t want my name or my identity anywhere near this video. So there is that nuance. We also learned, as we had from the Hub, that collaboration with people who understand technology really well is very important. That work eventually became an app called ProofMode. And even if these apps never become widely adopted, they send a very important message to people who are developing new technologies, because these are the standards that really should be there.

And one thing I remember is that at some point, while we were developing this, we talked to some commercial venture capital people who said, “Oh, this is a great app, but it will never scale, because of all these things, security, privacy protections; you need to take them out of there”. I mean, what Nathan and The Guardian Project did really was far ahead of where the commercial developer world was at.

And we were like, actually, no, because that’s contrary to everything we were hoping to achieve here. And today, Witness has a particular program about content authenticity, because these big tech companies are now actually developing the same kind of provenance and authenticity infrastructures.

But again, they’re being developed without enough thought to what this actually means if you’re, say, a woman in Iran. All of a sudden, either your story will simply not be believed because you didn’t use the right technology, because that technology is not accessible to people in the majority of the world. Or it will put you at incredible risk, because the only way you can use a certain tool to show that the video is real and came from you is through identity, like putting your name on it, which doesn’t work.

So one of the ways our strategies have evolved is that we are in things like the Partnership on AI and the Content Authenticity Initiative, to bring in human rights perspectives as people start to develop tools that, when they really scale, are going to have a huge impact.

Jim [15:04]

This is kind of fascinating. You’re describing how, maybe ten years ahead of the times, you built all this authenticity infrastructure to serve human rights, with human rights use cases in mind. And now the world is freaking out, which it has been all along, but now it’s really, really freaking out, about how we can make up videos that appear to show, say, a former president running from the police or something.

When it didn’t really happen. So what’s the balancing act you’ve tried to strike between changing the way the commercial tech industry works, to make it less human-rights-hostile, or maybe more human-rights-friendly, versus actually serving the human rights groups where they are right now, with the challenges they have? How do you navigate that balancing act?

Yvette [15:55]

Yeah, I really love that question, and I think the answer is that it’s never a versus. To give you one very concrete example, a couple of months ago Witness hosted a convening in Nairobi, where we brought together activists, journalists, developers, policy advocates, and creators of deepfakes, people who use generative AI, and basically talked through, in a two-day workshop, the opportunities and the risks of these technologies. That actually resulted in a set of recommendations.

One of the risks of generative AI that many activists are identifying is that their credibility may be utterly undermined, right? Bad actors will create videos of amazing human rights activists doing a drug deal somewhere, because it’s just too easy to attack people’s credibility. But one of the other key things that emerged there, again going back to the very early beginnings of Witness, is the accessibility of technology. Who is going to have, we call it equity, who is going to have access to the tools to figure out whether a video is real or not?

So, to go back to your question: the tension, I don’t think it’s a tension. The only way we can work is by always being part of an ecosystem of actors that are turning to technology, whether it is Indigenous groups changing the narrative on the big mining companies, or people who really need strong tools for video evidence or open source investigation.

The one thing I would say is, about eight years ago, Witness started looking at synthetic media. The idea that anybody could just create an image or a video, could make anybody say anything or do anything without that person ever actually being part of that video, might have seemed slightly funny to people in some ways, right? When you see Putin do something, or Elon Musk do something. But for the people who are most at risk from these kinds of technologies, it becomes a very serious matter.

For example, my colleague Nkem Agunwa was recently talking about this. Take the End SARS movement against the brutal policing that happened in Nigeria, where protesters were very much attacked, right? There are people already filming and creating content under very dangerous, risky circumstances. Bad actors and governments are weaponizing a lot of technology against them; people get surveilled. Now there’s this layer of generative AI that opens the floodgates to disinformation, misinformation, and false narratives, and a huge amount of distrust: accounts that people might have risked their lives for get dismissed as “Oh, this is fake” or “This is a deepfake”. The sheer volume of this kind of disinformation and fake content creates very real harms and real risks for activists. So in thinking about solutions, we need to really involve the people who are most at risk, the people on the frontlines.

Jim [19:39]

So here we have these small groups, including Witness, relatively speaking, with far less money than the big tech industry or governments. And yet part of the storytelling goal of human rights is to speak truth to power. How do you actually approach that? What do you think are the essentials for actually making progress when everyone distrusts everything?

Yvette [20:06]

Seeing is no longer believing, right? So you need to fortify the truth. And the way you fortify the truth is by supporting the people who are telling the true stories of human rights abuse, or filming people coming into Ecuador and taking their land. You need to support them to ensure that their truths are stronger. That may mean various tools and tactics for how you film something in a way that makes it harder to dismiss, right? Or it may be, for example, how you deal with bad governments, of which there are many, including in liberal democracies, doing things like shutting down the internet, which is the tool of choice these days. What are the ways you can actually preserve your documentation?

So I think there’s a lot of guidance and resources that Witness has and shares. There are also things like pushing for media literacy; the Witness Africa team did great campaigns like “verify before you share”…

So there are lots of different tactics and tools there. But the other part is advocating to the tech companies. For example, many Indigenous youth use TikTok as a way to amplify their narratives, and that’s actually a great opportunity. But if TikTok doesn’t have the right policies in place, or potentially jeopardizes privacy, or mass-deletes videos, as it did with Ukrainian war crimes videos, you see the huge challenges of people using these platforms for their advocacy.

So we do advocate very strongly to tech companies, particularly when it comes to emerging tech. We call it pipeline responsibility: every step of the way, how do you start really thinking about the use cases that concern not your average consumer, but the people most at risk? And how do you listen better to the many incredible experts, from Nighat Dad to Esra’a Al Shafei to Timnit Gebru, people who have deep expertise and who for years have been saying, take the following things into account, as opposed to listening to the CEOs of the tech companies?

Jim [22:50]

Well, just because I know who some of those people are doesn’t mean all of our listeners do. Of course, Timnit very famously was fired by Google for pointing out inconveniently true things about how Google was going about its AI work. Do you want to say anything about those people?

Yvette [23:08]

There are many, many of them. What is particular about someone like Nighat Dad, who runs the Digital Rights Foundation in Pakistan, is that they don’t necessarily come from Global North perspectives, or from dominant perspectives, right? And Esra’a Al Shafei is a brilliant Bahraini developer, tech maven, and human rights activist with years and years of experience developing platforms that were, for example, better adjusted to the security concerns of LGBTQ people in the Middle East. There’s so much knowledge, but people need to incorporate it.

Jim [23:57]

And I always remember that Esra’a never wanted her picture circulated by any means. There was always a cartoon version of her, to make it just a little bit harder for governments to spot her.

But let’s talk about that dominant Global North power thing. Because you and I are from the Global North, right? We run organizations that are based in the Global North. Why on Earth should local communities in Pakistan or the Persian Gulf trust Witness? How do you actually go about that, when the natural starting point might be, “You’re part of the former colonial powers”?

Yvette [24:41]

Actually, first of all, I love that question, because nobody should simply trust others, particularly not Global North organizations.

I think that trust is earned. And we also learn a lot together. To give you the example of the video evidence guide for land defenders: it was developed over a period of probably ten years of longstanding collaborative relationships and learnings with many different groups. And one of the things we often talk about at Witness is that people who are turning to these tools and technologies have very similar learnings. Internet shutdowns are a very good example; we saw them happening in many different places in the world.

So I think, from a humble perspective, you can say that as a global team of activists, we are probably in a better position to go, “Oh wait, what the people in Sudan are experiencing right now is actually what we saw happening in Myanmar right before that”. So how do we then help connect the dots between the different learnings, and hopefully accelerate them?

Jim [26:05]

I think you’ve underscored that a lot of the wisdom you purvey actually represents the collective wisdom of the people you’re serving. It’s often not something you created. But boy, these activists figure out a solution to an internet shutdown, and, well, let’s pass it on to the next country, right?

Yvette [26:25]

Yeah. Yeah. 

Jim [26:25]

So, you’ve stewarded Witness for 15 years through some huge tech transitions, right? The growth of YouTube, a high-quality camera in everyone’s pocket, incredible growth all around… In some ways, the Syrian conflict was over-documented, whereas when you started, human rights abuses were rarely seen, and people could deny these bad things were happening.

And now that’s getting harder and harder, and you’re in this arms race where the fake content is competing with the real content. So, what do you see as the future of, you know, video documentary and storytelling for human rights? 

Yvette [27:14]

The biggest fear for me is that distrust becomes the easy go-to. That when people actually do share a story that is extremely important for accountability, or for advocacy, it will too easily be dismissed. So I do think we’re at a very critical, pivotal moment. But we also see mass organizing. We see many people participating in change. I believe, and maybe this is slightly self-serving, that the way Witness is thinking about the future and fortifying the truth is that it’s not just Witness. It is Witness plus many, many other groups around the world that we’ve had the privilege of working with for the last 30 years.

I have a lot of hope for that, right? But I do think that video is increasingly central to many things. We need to be proactive and we need to protect the truth in the most essential way, if that makes sense. 

Jim [28:20]

Absolutely. And I think that’s kind of the takeaway from today: these essential challenges remain, even as the technology evolves and society evolves. What Witness is doing today is similar in philosophy to what you were doing 15 or 20 years ago, even if the tech has changed and the environment has changed.

And so, Yvette, I really appreciate the chance to go through the story and also to understand some of these key trends that transcend your experience at Witness.

Yvette [29:02]

Thank you and thanks, Jim. It was a pleasure to be here. 

Jim [29:04]

Thanks for listening to this episode. I think as you listen to Yvette, you understand how important trust is in doing the work of social good. And of course, it’s multi-layered. It’s not just making human rights defenders more trustworthy by teaching them how to make a good video and doing it in an authentic way. But it’s also the trust that Yvette and her team have generated with human rights defenders across the world, who have every reason in the world to not trust an international organization based in the United States. 

So how do you do that? The positive word of mouth of human rights defenders is the most valuable asset you can have if you want the opportunity to make a difference in the human rights field. It’s very impressive that Witness has been able to build that under Yvette’s leadership.

To hear more interviews like this one, be sure to follow the Tech Matters podcast on Spotify, Apple Podcasts, or your favorite platform, where we’ll be publishing new episodes every two weeks.

Is there something you found particularly insightful about this episode? Anything you disagreed with? Let us know by sending an email to [email protected]. We want to hear your thoughts, and of course, feel free to leave us a rating.

Thank you so much for listening and see you next time!
