
Transcript: The Futurist Summit: The Age of AI: Democracy on the Line with Maria Ressa


MS. ZAKRZEWSKI: Well, welcome back to Post Live. I’m Cat Zakrzewski, national tech policy reporter here at The Washington Post, and my guest today needs little introduction, Maria Ressa, a Nobel Prize winner who actually flew in today from the Philippines to join us.

Thank you so much for being here, Maria.

MS. RESSA: No, thanks for having me. Thank you for coming.

MS. ZAKRZEWSKI: And, Maria, I want to start with your comments at the end of that video, that, quote, “If we don’t have integrity of facts, we cannot have integrity of elections.” You’ve warned the world will know whether democracy, quote, “lives or dies by the end of this year.” Help set the stage for us. What’s at stake?

MS. RESSA: First, journalism, facts, right? With all of the big tech changes that have happened, digital news may disappear, may not survive the next year. That’s the first.

The second thing is that you’re seeing generative AI. I was sitting in the audience listening to everyone else. Please understand you’re listening to people who not only created it but want you to use it. Right? It’s a sales pitch. It is hype. But let’s put generative AI aside for later, and let me answer your question directly.

As of January this year, V-Dem out of Sweden said that 71 percent of the world is now under authoritarian rule. We’ve seen that increase as the attacks on journalists increased, right, hand in hand. So the quality of democracy has gone down as attacks on journalism have increased, and journalists have had to sacrifice more to keep giving you facts to hold power to account.

The biggest thing with generative AI is that we will not only have to deal with the harms of social media. Let’s be clear what that is: it is the new distribution system for every piece of information, and by design–this is an MIT study from 2018–social media spreads lies at least six times faster than facts. And I am sorry to bring you to X, but I’ll tweet that same study later on.

And our data in the Philippines shows that as of 2017, if you lace it with fear, anger, and hate, it spreads even faster. That’s like telling your child, “Lie. I’m going to reward you. I’m going to give you ice cream each time you lie.” Carry that through the years and you’re going to have an adult who’s gotten overweight because they keep eating ice cream, and also somebody who has absolutely no integrity. It is the demise of democracy, and I feel, you know, that last phrase that you heard me say, which is “no facts, no truth, no trust.” If you don’t know what the facts are, how are you going to vote? Does this technology take away your agency? If the technology has hacked our biology, how are you going to choose? That’s the first thing: social media, our first human contact with artificial intelligence, which, by the way, is neither artificial nor intelligent.

This is a body of study that began 70 years ago. There are many different ways you can use machine learning or artificial intelligence, but that phrase itself was coined in 1956 as a marketing phrase. It’s meant to market to you, to make you use it. So that’s that, I would say.

But last part in this is generative AI is now going to take us to enshittification, the enshittification of the internet.

MS. RESSA: And I’m going to footnote that word to Cory Doctorow, but what he meant–and this is an academic study that came out in January this year. This generative AI, you know, some of the folks here said, well, you can create content. I cringe, because that means you can create crap really quickly, and then it’s going to come to you. And you won’t be able to tell the difference between quality and crap.

The enshittification: as of January this year, an academic study said 57.1 percent of the content of the internet is low quality. These are derivative versions of what The Washington Post would do, what I would do, or bad translations. That’s before all the deepfakes started taking off.

Let me shut up because I sound like doomsday.

MS. ZAKRZEWSKI: No, I mean, I’m going to ask you a question that might lead to more doomsday answers right now.

MS. ZAKRZEWSKI: But I want to ask you–I mean, we’re about halfway through this crucial year for democracies around the world. We’re still a few months away from our election here in the United States, but when you look at the elections that we’ve already seen happen this year in India, Taiwan, most recently the EU, what can we learn? What can we maybe learn from those elections about what we could expect here from AI and social media?

MS. RESSA: I think Americans are really not prepared for these elections, because you are going to walk into them using–does anyone here not have a cell phone? This monitors you, right, whether it’s social media or Netflix or Spotify or any of these things. It turns you into numbers. And then generative AI essentially takes those numbers and tries to distill us into them. But numbers aren’t who we are.

And let me go to the elections. I’m sorry. Let me go to your question. There’s so much, because I made notes from everyone else who was in front of me.

One election stood out as a good election, as a surprise election where democracy won. Taiwan wasn’t a surprise, because Taiwan has always been fighting China. If you’re familiar with Taiwan, you know that this is a population that is always on alert. So the fact that they can fight disinformation, which even they raised the alarm on, is not a surprise. They were prepared for it.

Poland was the bright spot in elections, because you already had a rightist government. We all expected that government to win, but that government passed an abortion law that brought the women and the youth out to vote. They walked out of social media. They walked out of the virtual world, and they redefined what civic engagement is, because they felt like their backs were up against the wall.

America is so far from really feeling that. In fact, you’re in denial that you’re being manipulated, and what we’ve documented in data are seven layers of manipulation, the campus protests being only the tip of the iceberg. Gaza is a fault line. So we’ve had eight years of insidious manipulation.

How will you choose to vote? So please do better. Where you go, you will wind up taking the rest of us with you.

MS. ZAKRZEWSKI: So for the people in this room, we have policymakers, people from the tech companies gathered here. What steps can we take to ensure we don’t repeat those mistakes here in the U.S.?

MS. RESSA: Yeah. So let me talk first about the news organizations. The biggest mistake we made with social media was literally not building our own tech: we took the share buttons, put them on our websites, and shared you with them. That’s the social graph. That is the data that you have.

And we’ve seen the impact of social media that hasn’t really been addressed. The EU has the most aggressive legislation on it. But I always joke that the EU won the race of the turtles.

I am the vice chair of the Internet Governance Forum at the UN. The chair of it is Vint Cerf, the Father of the Internet. He’s with Google. He did the TCP/IP protocol. We fight all the time in a really nice way, but the Internet Governance Forum is one way that the internet used to be governed. It won’t be that way anymore, because the software has been so important.

For civil society, for people who are on these devices, you have to understand that in the age of exponential lies, we have to define what civic engagement is. We have to walk out of the virtual world, where we don’t even realize we’re being manipulated, into the physical world, and use exactly what the tech companies have used, which is our propensity to believe our family and friends. Like those stats said, people will believe their family and friends 94 percent more than they will an institution–94 percent. Even if your family and friends are lying, you know–and that’s part of the problem that we have.

What we’re dealing with today are the cascading failures that began with distribution, with the replacement of news organizations as gatekeepers by technology. And that hasn’t changed. You’ve done the reporting on this. You’re walking into elections with the social media companies taking away all of the guardrails they put in place for America’s 2020 elections. You will be far more vulnerable.

Cat, you tell me why they’ve taken it away.

MS. ZAKRZEWSKI: So I think there have been several factors. We’ve seen Elon Musk take over Twitter, and that had an effect in Silicon Valley, where the companies dismantled some of these systems. And there has been a conservative legal campaign to dismantle some of these systems, one that we’ve seen come out of Congress and fight in the courts all the way to the Supreme Court.

MS. RESSA: And I’ve seen this in countries in the Global South. I feel like in many ways Americans don’t think it’s happening in their country, but it is.

And, you know, here’s the upside. My country went through the years of Duterte, the six years of Rodrigo Duterte, where the number of people killed in the brutal drug war became our first casualty in the battle for facts. The police still say it’s only about 8,000. Well, actually, they first claimed it was 2,000. Then it became 8,000. But our Commission on Human Rights said in 2018 that it was at least 30,000. So that was our first casualty. We were in hell, and in our 2022 elections, we moved to purgatory.

I’m just so worried America is going to hell.

MS. ZAKRZEWSKI: And just to help our audience understand the stakes, I want to take a moment here to show an example of, you know, how a deepfake could be used to mislead voters in elections taking place around the world this year. Let’s turn to the video.

MS. ZAKRZEWSKI: And this is an example that shows how good the technology has gotten at making it almost impossible to separate fact from fiction online, and, you know, this is a deepfake of Donald Trump. It’s difficult to know how much reach this video has garnered online or what impact it might have had on Trump’s favorability abroad. But in this era, when it’s possible to cheaply and quickly make a video like that one, what should people do? How should people sift through what they’re seeing online?

MS. RESSA: I think there are three things that need to happen, and Dmitry Muratov and I rolled out a plan for them. Dmitry is a Russian journalist who, with me in 2021, received the Nobel Peace Prize. He’s been forced to shut down his news organization, right? Most of his journalists are now living in exile.

In 2022, we came out with a 10-point action plan, and this pulls it up to a high level. Three hundred Nobel Prize winners, civil society groups, journalists, and people working in the information ecosystem have signed on to it.

It distills to just three points. The first is to stop surveillance for profit. Everything that you have heard today is built on surveillance capitalism, which we didn’t even really name until 2019, when Shoshana Zuboff, the Harvard emeritus professor, came out with a 750-page book on surveillance capitalism, a new business model that the tech companies were using. That means that you, your data, your privacy–you know, the companies will tell you they’ve created models of each person on their platform. I just say we’ve been cloned without our permission.

And then AI comes in and takes all of our clones, and that’s the mother lode database for microtargeting. Microtargeting is not advertising in the old sense. Microtargeting is finding the moment you are weakest to a message and then selling that to a company or a country. So that’s the first. That’s still social media. That’s still distribution.

And now you have to ask me the question again because I forgot where I was headed.

MS. ZAKRZEWSKI: I just think, you know, in this era where these deep fakes are so cheap and easy–

MS. RESSA: Oh, what do we do? Yes, yes.

MS. ZAKRZEWSKI: –how can we separate fact from fiction online?

MS. RESSA: Actually, you can’t.

MS. RESSA: This is the problem. And, you know, I know because I have several deepfakes also. My first deepfake was in March last year, and it came out of Russia–it was a Russian advertising system that put it out. It was me selling Bitcoin.

MS. RESSA: So it sounded like me. It was wagging its mouth like me, but it came from Russia. And what they did is the distribution was on Facebook, but they used a credit card to sell it on Bing. So that was interesting to me. This is where it goes hand in hand.

And sorry, you used Donald Trump, but let me talk about business and what happened at a Hong Kong company. A guy at a Hong Kong company was asked–his CFO was supposedly in London, and he was asked to wire $25 million to an account in London. And he said okay. They had a Zoom meeting with many different people in it, including his CFO, and after that meeting, he wired the $25 million. Every single person in that Zoom meeting with him was a deepfake. I’ll tweet that story too. Right?

So you can’t tell the difference between fact and fiction, and so this is part of the reason. So let me say the three things. Stop surveillance for profit. Stop coded bias. Coded bias, you heard a little bit about it in the medical panel but not enough. Coded bias means that if you’re a woman or LGBTQ+, if you are brown or Black or from the Global South, if you are marginalized in the real world, you’re further marginalized online. The code that is exported to the rest of the world has these biases built in.

So–and then the third one. Sorry, I keep giving you bad news. The third one is journalism as an antidote to tyranny, because those first two have been used and exploited by illiberal forces, and without you knowing it, they have been elected democratically. Seventy-one percent.

MS. ZAKRZEWSKI: And I want to come back to that question of journalism. But first, I quickly want to ask–I mean, as someone who’s been a target of so many online attacks, I think if I saw a deepfake of you selling crypto, I would maybe be a little skeptical if that came up in my news feed. But, you know, given that these are happening more frequently and increasing in sophistication, are there steps that you’re personally taking to protect yourself in the era of deepfakes?

MS. RESSA: You know, the hard part–the worst that I’ve dealt with has been an average of 90 hate messages per hour. Ninety, nine zero. Right? And I cringe every time I hear someone say, yes, but this is free speech. Please know the difference. This is not free speech. This is free speech being used to stifle free speech. Right? We just did a whole bunch of reports on gendered disinformation. We came out with a study in Brazil, which is also moving from hell to purgatory. They went from Bolsonaro to Lula. Gendered disinformation there is taking women journalists, women activists, and women politicians and pounding them into silence. Women politicians all around the world are opting out of the public sphere because they have kids. They don’t want to be doxed.

Gendered dehumanization of women is off the scale. I mean, even Věra Jourová in the EU, who’s probably the most powerful figure globally in terms of regulating big tech, has already warned that women are getting forced out.

Sorry. Let me go back to your question. Please remind me again.

MS. ZAKRZEWSKI: So I was just wondering if there’s any steps that you personally have taken–

MS. RESSA: Are doing. Yeah, yeah.

MS. ZAKRZEWSKI: –to protect yourself.

MS. RESSA: I keep talking to you. You know, we started looking at 2024 as early as 2020, because I come from a country that elected Duterte democratically, that elected Ferdinand Marcos. You know, Milan Kundera said, “The struggle of man against power is the struggle of memory against forgetting,” and we elected the only son and namesake of Ferdinand Marcos, our dictator, whom we ousted in a people power revolt in 1986. Thankfully, we’re still in purgatory. Maybe it’s that the baseline set by what preceded him was just so bad. But I would not wish it on anyone else.

What do I do for myself? During that time period, you have no choice but to be a punching bag. That’s the way I felt.

In order to be here today, I’ve had to ask my courts for approval. The Supreme Court of the Philippines has to know my flights, my hotels. So I haven’t gained all my rights back. But warning, you can lose your rights like this. And what do we do? Move into the real world. Understand the hype that you’re being fed is hype, right, and organize in the real world. If we miss 2024, the tilt of the world will take a decade to return. Minimum. In the Philippines, we knew from the first Marcos, our police and military were radicalized under that dictatorship, and it took a decade after 1986 to bring human rights back. I don’t want to see that happen to you. Yeah.

MS. ZAKRZEWSKI: And throughout this conversation, you’ve pointed to journalism as a key pillar in this crisis. I mean, I want to better understand–you were showing me backstage a little bit of the work that you’re doing with AI at Rappler. But given these challenges, what steps can media companies take to promote truth in this environment?

MS. RESSA: First, understand we’re all on the same side. We’re all on the side of facts. Why are we not collaborating, and actually demanding better, the same way the WGA demanded better as soon as generative AI came out? Right? We’re each still in the old world. We think we have a vestigial tail, that we still have power. We don’t. We must stand on the side of facts.

But let me quickly tell you why you will get less news in your feed. Starting in 2018, Meta, the world’s largest distributor of news–Mark Zuckerberg said it–began to choke traffic to news sites. That wasn’t such a steep drop in 2018, even after the Cambridge Analytica scandal. And, you know, Americans were the most compromised accounts.

But the country with the second-highest number of compromised accounts was the Philippines, because we, your former colony, tend to be the testing ground for you. And then what happened after that was that generative AI came in. Right?

So sorry about this. Let me come back to that question of what we are going to do about it, right? I think we have to move into the real world. You have to organize and understand and accept that you are being manipulated, that this technology is treating us like Pavlov’s dogs.

Having rolled out the tech ourselves, we were one of the first globally to use generative AI: we took a foundational model, trained it on our data, and now, if you go on Rappler, every single story gives you a three-point bullet summary. We rolled it out in June last year and announced it near the end of the year, after we perfected it.
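
[For illustration: a minimal sketch of what such a per-story summarization step might look like, assuming the open-source Hugging Face transformers library and a generic pre-trained summarization model. This is not Rappler’s actual system; the model name, the input truncation, and the three-bullet formatting are assumptions made only for illustration.]

```python
# Illustrative sketch only -- not Rappler's actual pipeline.
# Assumes the Hugging Face "transformers" library and a publicly available
# summarization model (facebook/bart-large-cnn) chosen purely as an example.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

def three_point_summary(article_text: str) -> list[str]:
    """Summarize an article and return at most three bullet points."""
    # BART-style models have a limited input window, so truncate long articles.
    result = summarizer(article_text[:3000], max_length=130, min_length=30,
                        do_sample=False)
    summary = result[0]["summary_text"]
    # Split the generated summary into sentences and keep the first three.
    sentences = [s.strip() for s in summary.replace("\n", " ").split(". ")
                 if s.strip()]
    return ["- " + s.rstrip(".") + "." for s in sentences[:3]]

if __name__ == "__main__":
    with open("story.txt", encoding="utf-8") as f:
        for bullet in three_point_summary(f.read()):
            print(bullet)
```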

But generative AI as a technology is a probabilistic machine. It is not anchored in facts. Every single thing it spews out is based on the data you have fed it. So understand that truth and facts are not going to be in the new technology that you have, but they can program it to ask how you feel and make it seem human. And they now know that they shouldn’t make it seem human. Well, China, for example, has now rolled out a tech that pretends to be a dead relative, someone you love who’s dead, supposedly to help you cope. But that again leads us to this fantasy-versus-reality question.

Last thing I’ll say, because there’s so much I can say about the technology: we are going to use a chatbot. We will. But before we roll out that chatbot, we are going to anchor it in a very strong ontology and put GraphRAG on it, something all these big tech companies should have done in the first place but don’t, because they can get away with passing the harms on to us. And every single harm that happens will have to be paid for by whatever government is in place.
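
[For illustration: the “strong ontology plus GraphRAG” idea can be pictured as retrieval that is restricted to a curated knowledge graph of vetted facts, and the chatbot is only allowed to draw on what that graph returns. Below is a minimal sketch assuming the Python networkx library; the tiny graph, the keyword-overlap retrieval rule, and the function names are invented purely for illustration and do not describe Rappler’s design.]

```python
# Minimal sketch of graph-anchored retrieval (a GraphRAG-style approach).
# Not Rappler's implementation; the toy facts below paraphrase figures
# mentioned elsewhere in this transcript.
import networkx as nx

# A toy knowledge graph: nodes are vetted facts, edges connect related claims.
graph = nx.Graph()
graph.add_node("claim:drug-war-deaths",
               text="The Commission on Human Rights estimated at least "
                    "30,000 drug-war deaths as of 2018.")
graph.add_node("claim:police-count",
               text="Police figures have ranged from about 2,000 to 8,000.")
graph.add_edge("claim:drug-war-deaths", "claim:police-count",
               relation="contradicts")

def retrieve_context(graph: nx.Graph, query: str, hops: int = 1) -> list[str]:
    """Find fact nodes whose text shares a keyword with the query, then pull
    in their neighbors so directly related, vetted claims come along too."""
    words = [w.strip("?.,!").lower() for w in query.split() if len(w) > 3]
    seeds = [n for n, d in graph.nodes(data=True)
             if any(w in d["text"].lower() for w in words)]
    nodes = set(seeds)
    for seed in seeds:
        nodes.update(nx.ego_graph(graph, seed, radius=hops).nodes)
    return [graph.nodes[n]["text"] for n in nodes]

def answer(query: str) -> str:
    context = retrieve_context(graph, query)
    # A real system would hand `context` to a generative model with
    # instructions to answer only from this text -- that is the "anchoring."
    return "Vetted facts the chatbot may draw on:\n- " + "\n- ".join(context)

print(answer("How many people died in the drug war?"))
```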

Anyway, so to go back to what journalists can do, number one, less news on your feed, because Meta, as of January 2024, decided to choke traffic to news websites in particular. And what does that mean? A drop of 50 percent to 85 percent globally for news.

The second is search. There are only three ways a website gets traffic, right? Social, search, and direct. Search: last March, SGE, the search generative experience–generative AI on search–was rolled out as an experiment in 125 countries. The U.S. is one of them. The Philippines is one of them. Search generative experience was built on top of search, which means if The Washington Post or I opt out of it, we will disappear from the internet. Search will not have us. So we don’t have the option to opt out.

Once search generative experience was put in place–it now has a new name–it was rolled out just in May. You will now get a chatbot that spews back what our websites tell you. There is a probability that it will be wrong, but our links will no longer be there. So our traffic will decrease even more.

You know, this year there’s been a–I can’t even use the word “bloodbath” today. There have been so many journalists laid off, so less news in your feed but more outrage, more polarization. Polarization is an algorithmic choice on the distribution platforms.

I sound really–so what do we do? Right? Let me tell you what worked in the Philippines, even though we elected Ferdinand Marcos, my new president, President Marcos. I have to be happier, right? Because we’re now in purgatory. So, yeah, I went from 11 courts to now only two. That’s not bad in eight years.

What we did was build a four-layer pyramid, a whole-of-society approach, what we called an influencer marketing campaign for facts. In time for our elections, within three months, we had 60 news organizations at the bottom of the pyramid.

Fact checks don’t spread as much, right? Because lies spread at least six times faster. But what we did was consolidate our data into a data feed that we processed all the way up the four layers of the pyramid. So fact checks were the first layer.

The second layer we called mesh distribution: 116 different civil society groups, human rights organizations, the church came in. Business finally came in. This was in 2022. Every day we would point out to those 116 groups, which had tens of thousands of people, here are five fact checks you can share. They were asked to share the fact checks with emotion, but not to use anger. And what we found from that study was that inspiration spreads as fast as anger. Inspiration.

The third layer was the academics, eight universities, because we punched the data up to them, and instead of first doing peer review for their academic journals, we asked them every week to rotate telling the public how we were being manipulated.

And the last layer, layer four, were legal groups left, right, and center, and they protected the four-layer pyramid.

Before our elections, we took over the center of the information ecosystem with facts. It wasn’t enough to, you know, stop the election–not that we wanted to do that–because it was never about taking a political position so much as about making sure you weren’t being manipulated, that you had the facts to be able to vote.

MS. ZAKRZEWSKI: And, Maria, I want to leave our conversation on that inspirational message. Thank you so much for being here with us today at Washington Post Live.

MS. RESSA: Thank you. Thank you. Please vote.

MS. ZAKRZEWSKI: I really appreciate it. Thank you.


