
Opinion | Will A.I. Break the Internet? Or Save It?


[MUSIC PLAYING]

ezra klein

From New York Times Opinion, this is “The Ezra Klein Show.”

Earlier this week, we did an episode on how to use A.I. right now. Now, I want to turn the question around and look at how A.I. is being used on you right now. One of the conversations that has been sticking in my head was with this person in the A.I. world who was saying to me that if you look at where use has been sticky, if you look at where people keep using it day after day, you’re looking at places where the product doesn’t need to be very good. That’s why it’s really helpful for college and high school students, college and high school papers — they’re often not very good. That’s sort of the point. It’s why it’s working pretty well for very low-level coding tasks. That kind of work doesn’t need to be very good. It gets checked and compiled, and so on.

But there’s something else that it is working really well for, which is spewing mediocre content onto the internet. And the reason is that a lot of what is on the internet right now isn’t very good. Its point is not to be good — spam isn’t very good, marketing emails aren’t very good, social media bots aren’t very good. Frankly, a lot of social media posters, even when they’re not bots, are not very good.

There are all kinds of websites and internet operations that are filler content designed to give search engines something to index — filler content structured to do well in a Google result so people click on it and then see an ad.

Something you’re going to hear a lot of in this episode is the term S.E.O., and that is what we’re talking about: search engine optimization. Things that are built to rank highly in Google and Bing just to get somebody to click on the website. It doesn’t always matter whether that person actually reads the website.

But into this comes A.I. Over the last year, Google and the big social platforms — they have been flooded with A.I. spam, flooded with fake news sites filled with stolen or made up stories. There are TikToks of A.I. voices reading random text off of Reddit, nonsensical YouTube videos for kids. It’s no novel observation to say the internet has felt like it is in a state of decay for a while.

Google search results, Facebook, Twitter, or X, YouTube, TikTok — all of it felt better, more human, more delightful, more spontaneous, more real a few years ago. So what happens when this flood of content hits this decaying internet?

And then — and I actually think this is the harder, weirder question — what happens when this flood of A.I. content gets better? What happens when it doesn’t feel like garbage anymore? What happens when we don’t know if there’s a person on the other end of what we’re seeing or reading or hearing?

Should we care? What if that content is actually better than a lot of what we’re getting right now? Is that an internet we want to be on or not?

My friend Nilay Patel is the co-founder and editor in chief of the tech news site The Verge, and host of the great “Decoder” podcast. And I got to be honest, I can’t tell from this conversation if Nilay is more or less optimistic than me because he seems to think A.I. is going to break the internet. But he seems kind of happy about it.

Before we get into the actual conversation here, we are nominated for a Webby — speaking of hopefully good things on the internet — in the Best Interview Talk Show category. We are up against Oprah here, so we are decided underdogs, but this is a voting category so if we’re going to win, we need your help. You can vote using the link in the show notes or go to vote.webbyawards.com

And as always, if you want to email me with guest suggestions or thoughts on the episode, that is ezrakleinshow@nytimes.com.

[MUSIC PLAYING]

Nilay Patel, welcome to the show.

nilay patel

Thank you for having me. This is very exciting.

ezra klein

Let’s just begin with the big question here, which is what is A.I. doing to the internet right now?

nilay patel

It is flooding our distribution channels with a cannon-blast of — at best — C+ content that I think is breaking those distribution channels.

ezra klein

Why would it break them?

nilay patel

So most of the platforms on the internet are based on the idea that the people using those platforms will in some sort of crowdsourced way find the best stuff. And you can disagree with that notion. I think maybe the last 10 years have proven that that notion is not 100 percent true when it’s all people.

When you increase the supply of stuff onto those platforms to infinity, that system breaks down completely. Recommendation algorithms break down completely, our ability to discern what is real and what is false breaks down completely, and I think importantly, the business models of the internet break down completely. So if you just think about the business model of the internet as — there’s a box that you can upload some content into, and then there’s an algorithm between you and an audience, and some audience will find the stuff you put in the box, and then you put an infinite amount of stuff into the box — all of that breaks.

My favorite example of this is Amazon, which allows people to self-publish books. Their response to the flood of A.I.-generated books was to limit the number of books you can upload to three books in a day. This is really — like, that’s a ridiculous response to this. It just implies that the systems that we’ve built to organize audiences and deliver the right thing to the right person at the right time are not capable of handling an increase in supply at the level that A.I. is already producing.

ezra klein

Thank you for bringing in the supply language. So, I’ve been trying to think about this as this supply and demand mismatch. We have already had way more supply than there is demand. I wasn’t buying a lot of self-published Amazon books. Is the user experience here actually different?

nilay patel

I think that’s a great question. The folks who write the algorithms, the platforms, their C.E.O.s, they will all tell you this is just a new challenge for us to solve: we have to figure out what is human, what is A.I.-generated. I actually think the supply increase is very meaningful. Like, maybe the most meaningful thing that will happen to the internet, because it will sort the platforms that allow it to be there and have those problems from the places that don’t. And I think that has not been a sorting that has occurred on the internet in quite some time, where there’s two different kinds of things.

The example that I’ll give you is, every social media platform right now is turning into a short-form video Home Shopping Network. LinkedIn just added short form videos. Like, they’re all headed towards the same place all the time because they all have the same pressures.

ezra klein

Didn’t we already pivot to video a couple years ago?

nilay patel

We pivoted to video — I actually love it when LinkedIn adds and takes away these features that other platforms have. They added stories because Snapchat and Instagram had stories, and they took the stories away because I don’t think LinkedIn influencers want to do Instagram Reels, but now they’re adding it again.

And what you see is those platforms, their product — the thing that makes them money — is advertising, which is fine. But they don’t actually sell anything in the end. They sell advertising. Someone else down the line has to make a transaction. They have to buy a good or a service from someone else. And if you don’t have that, if you’re just selling advertising that leads to another transaction, eventually you optimize the entire pipe to the transaction to get people to buy things, which is why TikTok is now — like, all of TikTok is TikTok Shop, because they just want you to make a transaction. And those platforms are going to be most open to A.I., because that is the most optimizable thing to get people to make a transaction. And I think real people will veer away from that.

ezra klein

So I want to hold on to something that you’re getting at here, which, to me, is one of the most under-discussed parts of A.I.: how do you actually make money off of it? And right now, there are not actually that many ways.

So, what you can do is you can pay some money to the big A.I. companies. So you get the pro-version of their models. There is a certain amount of enterprise software flying around. You can subscribe to versions of Microsoft Copilot, or there’s going to be more things like that, where you can subscribe to something that is supposed to get you to buy the next iteration of Slack or whatever the enterprise software is. But it is hard to not notice that a lot of the A.I. is being built by companies that exist on advertising.

Google has a huge A.I. program, Meta has a huge A.I. program, and advertising is fundamentally a persuasion game. They are trying to persuade you, with the advertising, to do something, to buy something. And right now, it’s pretty bad. I always think it’s funny how long after I make a significant purchase I will be advertised to make that purchase again.

It’s like, you just bought a fair amount of luggage, would you like any more luggage from the same company you already bought it from? It’s a very weird — but if this gets good, what is that? What are safe business models and what are very unethical ones? Because when we talk about harms and benefits from A.I., how people are making money off of it is going to be a pretty big intermediary there.

nilay patel

Yeah, I’ve been talking to a lot of C.E.O.s of web companies and email companies on Decoder for the past year. I asked them all the same question: Why would you start a website? Why would you send an email? So you ask the C.E.O. of Squarespace or Wix. Or, we just had the C.E.O. of MailChimp on the show, and her answer is a little terrifying. Like, maybe openly terrifying.

She’s like, we’ll collect enough data on you, and then we’ll know exactly when to send you an email so that you buy the right thing at the right time. And we’ll just have A.I. automate that whole process. So you come to the website for your local dry cleaner or luggage store, you type in your email address to get the 10 percent off coupon, we look at what you were looking at. And then somewhere down the line, when some other data broker has told us that you searched for a flight, we will send you a precisely targeted, generated email that says: You’re going to Paris? Buy this suitcase that matches your style from our store at this dynamically generated price.

ezra klein

But how is A.I. changing that at all? Because that sounds to me like the thing that is already happening.

nilay patel

So, this is what I mean by the increase in scale. That’s the dream. This is supposed to be what actually happens, but they can only do it in broad cohorts, which is why you get the luggage email after you’ve bought the luggage, or the luggage ad after you’ve already bought the luggage.

They know you are a person who used a Wi-Fi network in a certain location at a certain time; they can track that all over the place. They know what you’ve searched for. They know that you went and made a luggage transaction. You are now categorized into people who are likely to buy luggage, whether or not that loop was closed. You put some luggage in a shopping cart. But that’s still a cohort; they can only do that broadly. And these cohorts can be pretty refined, but they can only do it broadly. With A.I., the idea is we can do that to you individually — the A.I. will write you an email, will write you a marketing message, will set you a price. That is a 100x increase in the amount of email that will be generated.

So now our email algorithms will be overflooded with commercial pitches generated by A.I. And this sort of makes sense, right? It makes sense for a Google to want to be able to dynamically generate A.I. advertising across the entire web. It makes sense for Meta to invest massively in A.I. so that when you’re watching Instagram and you scroll, a dynamically generated Instagram video that is an ad just for you appears. And all of that is down to their belief in targeting — their absolute belief that they can sell more products for their clients by targeting the ads more directly. And you are in that uncanny valley, where the targeting doesn’t actually work as well as it should and no one will admit it.

ezra klein

When I get spammy advertising, I don’t really think about there being a human on the other end of it. Maybe to some degree there is, but it isn’t part of the transaction happening in my head. There are a lot of parts of the internet where I do think of there being a human on the other end — social media, reviews on Amazon, books — I assume the person who wrote the book is a person. How much of what I’m currently consuming may not be done by a human in the way I think it is, and how much do you think that’s going to be in a year, or two, or three years?

nilay patel

I’m guessing your media diet is pretty well human-created because I know that you are very thoughtful about what you consume and what signals you’re sending to the algorithms that deliver your content. I think for most people —

ezra klein

My mom’s, let’s use my mom’s.

nilay patel

Moms are good. I would love to take my mom’s phone and throw it into the ocean and never let her have it again. I openly fear what content comes to my mother through WhatsApp. It terrifies me that I don’t have a window into that. I can’t monitor it. The same software I want to use to watch my daughter’s internet consumption, I would love to apply to my parents, because I don’t think they have the media literacy — they’re much older — to even know, OK, this might be just some A.I.-generated spam that’s designed to make me feel a certain way.

And I think that is the heart of what’s coming. I think right now it’s higher than people think, the amount of A.I.-generated noise, and it is about to go to infinity. And the products we have to help people sort through those things are fundamentally in tension with that. Google is at the heart of this tension — you can take any business at Google and say, what happens when the A.I. flood comes to you? And I don’t think they’re ready for it.

ezra klein

How can they not be ready for that?

nilay patel

Because they’re the ones making it. This is the central tension of — in particular, I think Google. So, Google depends on the web, the richness of the web is what Sundar Pichai will tell you. He used to run search, he thinks about the web. He cares about it, and you look at the web and you’re like, you didn’t make this rich at all. You’ve made this actually pretty horrible for most people most of the time. Most people — if you search Google to get a credit card, that is a nightmarish experience — like, fully nightmarish. It feels like getting mugged.

We just went on vacation. And I googled a restaurant review in Cancun, and I got about halfway through the actual review when I realized it was sponsored content by Certified Angus Beef. And just in the middle of this review, they’re like this restaurant uses this kind of beef and here’s why it’s great. And I was like — this is — I read an ad. And Google should have told me that this was an ad. Like, this isn’t useful to me in any way — like, I’m discarding this. I don’t want this anymore.

I don’t think Google can discern what is good or bad about the web. I don’t think Google has reckoned with how its incentives have shaped the web as a whole. And I certainly don’t think that people who are making Google search can say A.I. is bad — A.I. content is bad, because the whole other part of Google that is making the A.I. content can’t deal with that.

ezra klein

This helps explain a story that I found very strange. So, 404 Media, which is a sort of newer outlet reporting on tech, found that Google News was boosting stolen A.I. versions of news articles — and we’re seeing this all over. An article by me or by some other journalist shows up in another place, very slightly rewritten by an A.I. system, with an A.I.-generated author and photo on top of it. So, we’re seeing a lot of this.

And when 404 Media asked Google about this, Google News said that for them, it was not a really relevant question whether an article was by an A.I. or a human. That struck me as a very strange thing to say, to admit. Is your view that it’s because their business is in the future replacing human-generated content with A.I., and saying that’s good — like, that’s the thing happening at the center there?

nilay patel

Yeah. Fundamentally, I think if you are at Google and the future of your stock price depends on Gemini being a good competitor to GPT-4 or 5 or whatever OpenAI has, you cannot run around saying this is bad, that the things it makes are bad.

I think this is actually in stark contrast to how people feel about that right now. One of the funniest cultural trends of the moment is that saying something is A.I.-generated is actually a great way to say it’s bad.

So, I saw people reacting to the cover of the new Beyoncé album, “Cowboy Carter,” which is a picture of her on a stunning horse. It’s Beyoncé, it’s very obviously human-made, and people who don’t like it ask: Was this made by A.I.? And it’s like, well, you know for a fact that Beyoncé did not have A.I. generate the cover of — like, you can look at it and you can discern that it isn’t. But you can say, was this A.I.-generated? And that is code for: this is bad.

ezra klein

What about when it’s not?

nilay patel

I don’t know how fast that is coming. I think that is farther away than people think. I think ‘will it fool you on a phone screen?’ is here already, but ‘is this good’ is, I think, farther away than —

ezra klein

But a lot of internet content is bad.

nilay patel

That’s fair.

ezra klein

I mean, you know this better than me. Look, I think it is axiomatic that A.I. content is worse right now than it will ever be.

nilay patel

Sure.

ezra klein

I mean the advance in image generation over the past year has been significant. That’s very real. And preparing for this conversation, I found myself really obsessing over this question, because one way to talk to you about this is, there’s all this spammy garbage coming from A.I. that is flooding the internet.

But you can imagine an A.I. developer sitting in the third chair here and saying, yeah, sure, but eventually it’s not going to be spammy garbage. We’re getting better at this. And compared to what people are getting from a lot of websites, if you’re going to Quora or ask.com or parts of Reddit or whatever, we can do better than that. The median A.I.-generated article within three years is going to be better than the median human-produced piece of content.

And I really — I found that I did not know how to answer the question in myself — is that a better or a worse internet? To take almost Google’s side on this, should it matter if it’s done by a human or an A.I., or is that some kind of — what’s the word — like, sentimentality on my part?

nilay patel

I think there’s a sentimentality there. If you make a content farm that is the best content farm, that has the most answers about when the Super Bowl starts, and those pages are great, I think that’s a dead-end business. Google is just going to answer the questions. I think that’s fine. I think if you ask Google what time the Super Bowl is, Google should just tell you. I think if you ask Google how long to boil an egg, Google can just tell you. You don’t need to go to some web page laden with ads and weird headings to find those answers. But these models, in their most reductive essence, are just statistical representations of the past. They are not great at new ideas.

And I think that the power of human beings sort of having new ideas all the time, that’s the thing that the platforms won’t be able to find. That’s why the platforms feel old. Social platforms, like, enter a decay state where everyone’s making the same thing all the time. It’s because we’ve optimized for the distribution, and people get bored, and that boredom actually drives much more of the culture than anyone gives it credit for, especially an A.I. developer who can only look backwards.

ezra klein

I’m going to spend some time thinking about the idea that boredom is an under-discussed driver of our culture. But I want to get at something else in there — this idea of Google answering the question. We’re already seeing the beginnings of these A.I. systems where you search the question that might — at another time — have brought you to The Verge, to CNN, to The New York Times, to whatever.

But now, Perplexity — there’s a product, Arc — they’ll basically use A.I. to create a little web page for you. The A.I. itself will read, “read” — in quotation marks — the A.I. itself will absorb some websites, create a representation of them for you, and you’ll never go to the place that actually created that data about the past that the A.I. used to give you something in the present.

Casey Newton, at Platformer, his word was he felt revulsion, and that was how I felt about Arc’s product here. You take all this work other people have done, you remix it under your thing, they don’t get the visit to their web page, nobody has the experience with the work that would lead them to subscribe. But two things in the long run happen from that.

One is that you destroy the store of growing value, growing informational value, that you need to keep the internet healthy. You make it, say, impossible to do the news gathering that allows you to be news, because there’s no business model for it. The other is that you also destroy the training data for the A.I. itself, because it needs all that work that we’re all doing to train.

The thing they need is data. The A.I. is polluting that data with A.I. content currently, but it also can begin to destroy that data by making it unprofitable for people to create more of it in the future. I think Ryan Broderick has called A.I. search a doomsday cult. How do you think about this sort of deeper poisoning of the informational commons?

nilay patel

I think there’s a reason that the A.I. companies are leading the charge to watermark and label content as A.I.-generated. Most of those labels are in the metadata of an image. So most pictures you see on the internet, they carry some amount of metadata that describes the picture: what camera it was taken on, when it was taken, what image-editing software was used.

So, Adobe and a bunch of other companies are like, we’ll just add another field that says, here are all the A.I.-generated edits that were made on this photo. I think it is in their self-interest to make sure that is true and they can detect it and exclude it if they need to. I think there are moral reasons to do it too.
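A minimal sketch of the kind of per-image metadata being described here, assuming the Pillow library and a local file named photo.jpg (both hypothetical choices for illustration); real A.I.-provenance labels such as Content Credentials/C2PA are carried in a richer, signed manifest rather than a plain EXIF field:

```python
# Minimal sketch: read the EXIF-style metadata an image already carries
# (camera, capture time, editing software). Assumes Pillow is installed
# and a local file named photo.jpg exists (hypothetical example file).
from PIL import Image
from PIL.ExifTags import TAGS

def describe_image_metadata(path: str) -> dict:
    """Return the image's EXIF metadata with human-readable tag names."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    for name, value in describe_image_metadata("photo.jpg").items():
        print(f"{name}: {value}")  # e.g. Make, Model, Software, DateTime
    # An A.I.-provenance scheme would add a further, signed record here
    # describing which edits (including A.I.-generated ones) were applied.
```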

ezra klein

So their training data remains less corrupted?

nilay patel

Yeah. I think there’s a very straightforward incentive for them to figure out the watermarking, labeling stuff they want to do. And they have coalitions and task forces, and Adobe talks about the image of the Pope in the puffer jacket as a “catalyzing moment” for the metadata of A.I., because people freaked out. They’re like, oh, this thing looks real. But they have a real incentive to make sure that they never train on other A.I.-generated content.

So that’s one aspect, which I think is just sort of immediately self-interested. The other thing is — that’s why I keep asking people why would anyone make a web page?

There’s a site I think about all the time. It’s called HouseFresh, which is a site that only reviews air purifiers. And to me, this is the internet. Like, this is what the internet is for. You care about air purifiers so much you’ve set up a series of web pages where you express your expertise in air purifiers and tell people which ones to buy. That’s all they do. And Google has started down-ranking them, because big publishers boost their content, because A.I. is lifting their content, because companies like CNN, in order to gain some affiliate ad revenue somewhere, have set up their own little mini-content farms full of affiliate links.

I’m not saying we don’t — like, other publishers do this. But the point of these algorithms is, ideally, to bring you to the HouseFresh people, to bring you to the person who cares so much about air purifiers they made a website about air purifiers, and we’re not doing that anymore. And so if you were to say, where should a young person who cares the most about cars, or who cares the most about coffee, or whatever, go? Where are they going to go? Where are they going to make stuff? They’re going to pick a closed platform that ideally offers them some built-in monetization, that ideally offers them some ability to connect directly with an audience. They’re not going to go to a public space like the web, where they might own their own business, which would be good. But they’re also basically at the mercy of thieves who come in the night and take all their work away.

ezra klein

But also, if you kill HouseFresh, then two years later when you ask the A.I. what air purifier should I get, how does it know what to tell you?

nilay patel

Yeah, I don’t know the answer to that question.

ezra klein

I don’t think they do either.

nilay patel

Yeah again, this is why I think that they are so hell-bent on labeling everything. I think they need some people around in the future.

ezra klein

But labeling is good. I mean, that keeps you from getting too much garbage in your data set. But replacing a bunch of the things that the entire informational world relies on to subsidize itself — to fund itself — like this to me is a thing that they don’t have an answer for.

nilay patel

Wait, let me ask you a harder question. Do they care?

ezra klein

Depends on they, but I don’t think so.

nilay patel

Yeah.

ezra klein

Or at least they care in the way that I came to realize Facebook, now Meta, cared about journalism. People say they didn’t care about journalism. I don’t believe that’s actually true. They didn’t care enough for it to mean anything. Like, if you asked them, if you talked with them, if you had a drink, they would think what was happening to journalism was sad.

nilay patel

[LAUGHS]

ezra klein

And if it would cost them nothing, they would like to help. But if it would cost them anything — or forget costing them anything. If they would begin to help and then recognize an opportunity had been created that they could take instead of you, they would do that. That’s the way they care.

[MUSIC PLAYING]

So when you have a financial crisis, you have something oftentimes called a flight to quality. Investors flood into the things they know they can trust, usually treasury bonds, and I’ve been wondering if this won’t happen in this era of the internet — if I wanted to take an optimistic perspective on it — that as you have a sort of ontological collapse, as you don’t know what anything is.

I already feel this way with product reviews. When I search product reviews, I get reviews now from tons of sites that I know don’t really invest that much in product reviews. CNN, all these other organizations that have not really, truly invested in high-quality product reviewing: when you search, you now get them, and they’re telling you what to buy.

That makes me trust the Wirecutter more. It’s a New York Times property, but it’s also one that I know we’ve put a lot of money into. Similarly, the other one I use, which is a Vox Media property, is The Strategist at New York, because I knew what the development of that looked like. I know what they put into that.

You can imagine this happening in news for things like The New York Times or The Washington Post. You can imagine it in a couple of different places. If people begin to feel that there is a lie at the heart of the internet they’re being given, that they can’t figure out what is what and who is who and if it is a who at all — I mean, maybe you just end up in this internet where there’s more of a value on something that can be verified.

nilay patel

I keep a list of TikToks that I think each individually should be a Ph.D. thesis in media studies. It’s a long list now. And all of them are basically just layers of copyright infringement in their own weird way.

My favorite is — it’s a TikTok, it has millions of views. It’s just a guy reading a summary of an article in the journal Nature. It has millions of views.

This is more people than have ever considered any one article in the journal Nature — which is a great journal. I don’t mean to denigrate it. It’s a proper scientific journal. They work really hard on it. And you just go five steps down the line, and there’s a guy on TikTok summarizing a summary of Nature, and you’re like, what is this? What is this thing that I’m looking at?

Will any of the million viewers of this TikTok buy one copy of Nature because they have encountered this content? Why did this happen?

And the idea is, in my mind at least, that those people who curate the internet, who have a point of view, who have a beginning, a middle and an end to the story they’re trying to tell all the time about the culture we’re in or the politics we’re in or whatever. They will actually become the centers of attention, and you cannot replace that with A.I.

You cannot replace that curatorial function or that guiding function that we’ve always looked to other individuals to do.

And those are real relationships. I think those people can stand in for institutions and brands. I think The New York Times means something; being Ezra Klein, a New York Times journalist, means something. It appends some value to your name, but the institution has to protect that value.

I think that stuff is still really powerful, and I think as the flood of A.I. comes to our distribution networks, the value of having a powerful individual who curates things for people, combined with a powerful institution who protects their integrity actually will go up. I don’t think that’s going to go down.

You mentioned 404 Media. 404 Media is a bunch of journalists who were at Motherboard at Vice. Vice is a disaster. They quit, they started a new media company, and we now all talk about 404 Media all the time. This thing is 25 minutes old. We don’t talk about Jason Koebler the editor in chief. We talk about 404 Media, the institution that they made — a new brand that stands for something, that does reporting and talks about something. I think there’s still meaning there.

ezra klein

You said something on your show that I thought was one of the wisest, single things I’ve heard on the whole last decade and a half of media, which is that places were building traffic thinking they were building an audience. And the traffic, at least in that era, was easy, but an audience is really hard. Talk a bit about that.

nilay patel

Yeah first of all, I need to give credit to Casey Newton for that line. That is something — at The Verge, we used to say that to ourselves all the time just to keep ourselves from the temptations of getting cheap traffic. I think most media companies built relationships with the platforms, not with the people that were consuming their content.

They didn’t think about them very much. They thought about what was hitting in the Facebook algorithm, they thought about what Google search wanted for Game of Thrones coverage that day, which was everything all the time. And everybody had a Game of Thrones program. Fox had one, The Verge had one, The New York Times had one. Why?

That’s weird. It’s that we constructed this artificial phenomenon because people searched for — I mean, just to say the answer because we know it — because people searched for “Game of Thrones” content the morning after the show, and that was an easy way to get a bunch of traffic. And at least a theory of the time was that you could turn traffic into money through advertising, which was not totally wrong, but not nearly as right as the entire era of business models was predicated on.

The other thing that those business models were predicated upon was you’d get so good at being a supplier to one platform or another with Game of Thrones content or whatever it was that they would pay you money for it directly — that Google would say, this is the Game of Thrones link that most people are clicking on. We ought to pay Vanity Fair for its Game of Thrones content to surface it. Or all of BuzzFeed was we’re going to be so good at going viral on Facebook that Facebook will pay us money.

And that absolutely didn’t pan out. But no one hedged that bet, which is utterly bananas to me. No one said we should take these people who came here for a Game of Thrones and figure out how to make them care about us, and we should care about them. Everyone just looked at it as a number that was going up against some amount of interest as demonstrated by some platform somewhere.

And I think that is the mistake. It is the mistake that creators on the creator platforms are not making, because the terms of that arrangement are so much more cynical. You see it with TikTokers: at any moment their videos can get downranked, their accounts can get yanked, their stuff can get banned. They’re constantly trying to get you to go to Instagram.

Every YouTuber gets their wings when they make the video about how they’re mad at YouTube. There’s a woodworking YouTuber that I used to follow, and he just sort of got to the point where he’s like, I hate YouTube. I’m leaving. And it’s like dude, you made videos about jointing wood, like what are you doing?

And it’s like, his relationship with the platform was so cynical that he was like, I’m moving my business elsewhere. You can sign up for a master class. Those individuals have these very cynical, very commercial relationships with the platforms that the media companies, for some reason, just never hedged. And so they actually do have audiences. And I think media companies need to get way back in the game of having true audiences.

ezra klein

This gets to something that does worry me about this phase of A.I. hitting the internet, which is that it’s hitting an internet in a moment of decay and weakness. And here, by internet, I mean the sort of content-generating internet, and I break that into a couple of categories. The media is very weak right now. In the media business, we have seen closures left and right, layoffs left and right. I mean, a bunch of players like Vice and BuzzFeed who were believed to be the next generation of juggernauts are functionally gone as news organizations.

The big content platforms, they’re doing fine from a financial standpoint, but people hate them. The relationship between the users and Facebook, the users and YouTube, the users and — to some degree, you’re even seeing that now with TikTok — is just darkening in a way that it wasn’t in 2014.

And so, there’s a lot of desperation on all sides. Sometimes the desperation is you don’t have the money to pay the journalists you need to do the work you want to do. Sometimes the desperation is that you’re trying to figure out something to make this audience like you again and not get eaten by TikTok or whatever comes after TikTok.

And into this comes A.I., and all the money that A.I. seems to bring, and even the A.I. companies might pay you some money for your stuff.

Reddit just licensed a bunch of its content as training data to Google.

So, you could really imagine a thing happening again, where all these media companies or content companies of some form or another, license out what they have for pennies on the dollar, because at least you can make some money off of it that way.

But what worries me is both the weakness, but also that it does not feel to me like anybody knows what the relationship to this is supposed to be. Do you use it? Are you just training data for it? Like, what are you in relationship to the A.I. era?

nilay patel

As a consumer or as a producer?

ezra klein

As a producer.

nilay patel

The idea that media companies are going to license their stuff to the A.I. companies is just the end of the road that we’ve been on for a long time. We are suppliers to algorithms. OK? And in any normal functioning capitalist economy, supplier margins get squeezed to zero and then maybe we all die. Like, that’s the game we’ve been playing without saying it for a long time —

ezra klein

Which I think is why you see The New York Times suing OpenAI, like a real desire to not be in that game again.

nilay patel

You see The New York Times suing OpenAI, but you don’t see them suing Google, you don’t see them de-S.E.O.ing pages across New York Times. Like, they still need the audience from these platforms. And I think there’s a very tense relationship there. The idea that you could sue OpenAI and win some precedent that gives you an enormous amount of leverage over Google I think is a very powerful idea.

Most of the media company executives I talk to would love for that to be the outcome. I don’t know if that’s going to be the outcome. I feel like I should warn your audience, like — I’m a failed copyright lawyer. I wasn’t good at it, but I did it for a minute. Copyright law is a coin flip. Like, these cases are true coin flips. They are not predictable. The legal system itself is not predictable, copyright law inherently is unpredictable.

And a really interesting facet of the internet we live in today is that most of the copyright law decisions were won by a young, upstart, friendly Google. YouTube exists because it was Google. Like, Viacom famously sued YouTube, and they might have won and put it out of business, but Google, the friendly Google company with the water slides in the office, the upstarts that made the product you loved, went and won that case. Google Books: we’re going to index all the books without asking for permission. They won that case, because they were friendly Google, and the judges were like, look at these cute kids making a cool internet. Like, it was new and novel. Google image search — these are all massive copyright decisions that Google won as a startup company run by young people building a new product that the judges were using on their Dell desktops or whatever.

These aren’t those companies anymore. They’re going to go into a legal system as behemoths, as some of the biggest, best-funded companies in the world that have done bad things to the judges’ teenage children. Like, all these things are different now. And so, I don’t know if Google, or OpenAI, or Microsoft gets the benefit of being like, we’re young and cool and hip, bend copyright law to our will.

ezra klein

You don’t want to staunch innovation. Like, that was the big fear in that era. We don’t know what we’re building, and that’s still the thing you hear, and it’s not even untrue. You crack down on copyright and maybe you do staunch innovation. You don’t crack down on copyright and maybe you destroy the seed corn of the informational commons. It’s very fraught for the copyright judges, but also just for all of us.

nilay patel

Yeah, what you are as a producer on the internet is totally governed by copyright law. Like, a joke at The Verge is that copyright law is the only functional regulation on the internet. The entire internet is just speech; that’s all it is, top to bottom, it’s speech.

In the United States, we don’t love a speech regulation, and I think for good reason. But we love copyright law, we love it. Can’t get enough of it. Like, YouTubers know the YouTube copyright system back and forth, because that’s the thing that takes their content down. And we allow this regulation on the internet at scale.

And so the parameters of this one body of law, as applied to A.I., which is a taking. Training an A.I. model is fundamentally a taking, and the A.I. company —

ezra klein

Taking in the legal sense of the term?

nilay patel

No, in the moral sense of the term. They come to your website and they take your stuff. It’s not a zero sum taking, but they’ve extracted value to create more value for themselves. I think that’s just a moral taking. There’s some permission there that did not occur. Joanna Stern at The Wall Street Journal just interviewed Mira Murati, the C.T.O. of OpenAI, about training data for Sora, the video generator, and Mira said, we just use what’s publicly available. And it’s like yo, that doesn’t make any sense. Like, there are lots of rules about what’s publicly available. Like, you can’t just take stuff because you can link to it on the internet, that’s not how it actually works.

ezra klein

Let me try to take the argument I hear from the A.I. side of this, which is that there is functionally nothing in human culture and human endeavor that is not trained on all that has come before it — that I, as a person, am trained on all this embedded knowledge in society, that every artist has absorbed, all this other art that the A.I. — I mean, this is just learning. And as long as you are transforming that learning into something else, as long as you are doing something new with that learning, then one, copyright law is not supposed to apply to you in some way or another, although that’s obviously complicated.

But two, to go back to your point of morality, if you want to see culture, humanity, technology advance, it is also not supposed to apply to you, because if you do not let things learn — people, organizations, models — you are not going to get the advances built on all that has come before. And that’s how we’ve always done it. What’s your answer to them?

nilay patel

I hear this idea all the time, often from the sorts of people in Silicon Valley who say they do first principles thinking — which is one of my favorite phrases, because it just means what if we learn nothing? Like, what if none of the history of the world applied to us and we could start over to our benefit? And that’s usually what that’s code for.

So I hear those arguments and I think, you guys just weren’t paying attention. You’re entering a zone where the debate has been raging for decades. A lot of copyright law is built around a controversy around player pianos, and whether player pianos would displace musicians. But you just have to rewind the clock to the 80s and be like, should sampling be legal in music?

And now we are having the exact same conversation in the exact same way with the exact same parameters. The only thing that’s different now is any kid can sample any song at scale, feed it into an A.I. and have Taylor Swift sing the Dolly Parton song for them. That’s a weird new turn in the same debate, but it is a massively age-old debate, and the parameters of the debate are pretty well known.

How do you incentivize new art? How do you make sure that it’s economically valuable to make new things? How do you make sure the distributors don’t gain too much power, and then how do you make sure that when people are building on the past, the people whose art they’re building on retain some value?

And that I think is — the A.I. companies have no answer to that last question. We’re just going to take a bunch of stuff, and now we’re just going to say, look, we just summarized the web. The people who made the web get nothing; for that, you will pay us $20 a month for the service.

But somewhere in there, as a policy matter as a moral matter, the people who made the foundations of the work should get paid. And this is where the sampling debate has ended up. There’s a huge variety of licensing schemes and sample clearances so that those artists get paid.

ezra klein

Judge Patel, if you’re thinking about cases in this area, like, what do you think the answer is here? Is it the sampling model, is it something else? What do you think the right broad strokes resolution is?

nilay patel

Let me stick on the music example for one second, because I think music is really interesting because it’s kind of a closed ecosystem. There’s only so many big music companies. It’s the same lawyers, and the same executives, and the same managers going to the same clearing houses and having the same approaches. We’re going to give you a songwriting credit because we interpolated the bass line of this song into that song, and now here’s some money. And this is the mechanism by which we’ll pay you. The A.I. companies are not a closed ecosystem, it is just a free for all. It’s the open web, it’s a bunch of players.

So, I think in those cases, you’re just going to end up with vastly more outcomes which I think leads to even more chaos, because some companies will take the deal. I’m guessing The New York Times is going to pursue this all the way to the Supreme Court. This is an existential issue for The Times.

Some companies don’t have the money to pay for Supreme Court litigation, and they’ll take a shittier deal, like a pennies-on-the-dollar deal, and maybe just go out of business. And I think that range of outcomes in the near term represents a massive failure of collective action on the part of the media industry to not say: this is actually the moment where we should demand that human journalists doing the real work that is dangerous are valuable. We need them, and we will all, together, approach these players in a way that creates at least a semblance of a closed ecosystem.

ezra klein

Well, the media industry, but also at some point this is a regulatory question, a question of law. I mean, nothing is stopping Congress from making copyright law designed for the A.I. era. Nothing is stopping Congress from saying, this is how we think this should work across industries. Not just media, but novelists, but everybody.

nilay patel

Well, there are some things that stop Congress from doing a lot of things. The idea that Congress could pass a massive rewrite of copyright law at this moment in time is pretty far afield.

ezra klein

But won’t and couldn’t, I do want to make this distinction here. What you’re saying is Congress is too polarized and bitterly divided over everything and can’t do anything and can’t get anything done, and that’s my whole job, man, I know. But what I am saying is that you could write a law like this.

This is something that ultimately, I don’t just think it’s like a media collective-action problem, but is going to be ultimately a societal-level collective action problem. And maybe we cannot, as a society, act collectively very well. I buy that totally.

nilay patel

So there is one law. There’s the J.C.P.A., the Journalism Competition and Preservation Act, which allows media companies to escape antitrust law and bargain collectively with whoever they wish to bargain with. I don’t know if that’s going to pass; I know there’s a lot of interest in it.

So, there are these approaches that have appeared in Congress to solve these problems, but the thing I’m getting at is you have sort of the rapacious wolves, and then you have an industry that’s weak — as you said — that, I think is not motivated to value the work it does as highly as it should. And that is step one.

ezra klein

You and I are both fans of Marshall McLuhan, the media theorist. And he’s got this famous line, ‘the medium is the message.’ And more deeply, what he says is that people, when they see a new medium, they tend to think about the content. For television, it’s the shows, what do you think about this show or that show? For Twitter, the tweets, for a newspaper, the articles. But you have to look behind the content to the actual medium itself to understand what it is trying to tell you.

Twitter, at least in its early stages, was about: all these things can and should be discussed in 140 characters. Television made things much more visual; things should be entertainment, they should be entertaining, the news should be entertaining, which was a little bit of a newer concept back then.

I’ve been trying to think about what is the message of the medium of A.I. What is the message of the medium of ChatGPT, of Claude 3, et cetera? One of the chilling thoughts that I have about it is that its fundamental message is that you are derivative, you are replaceable.

A.I. isn’t good at ideas, yet. It is good at style. It can sound like Taylor Swift. It can draw like any artist you might want to imagine. It can create something that looks like a Jackson Pollock. It can write like Ezra Klein. It may not be exactly as good at high levels of these professions, but what it is, functionally, is an amazing mimic.

And what it is saying — and I think this is why a lot of people who use it for long enough end up in a kind of metaphysical shock, as it’s been described to me. What it’s been saying is, you’re not that special. And that’s one reason, I think, that it can — we worry about it proliferating all over social media. It can sound like a person quite easily. We’ve long passed the Turing test. And so one, I’m curious if that tracks for you, and two, what does it mean to unleash on all of society a tool whose basic message is: it’s pretty easy to do what you do, sound like you sound, make what you make?

nilay patel

I have a lot of thoughts about this. I disagree on the basic message. I do think one of the messages of A.I. is that most people make middling work, and middling work is easy to replace. Every email I write is not a great work of art. Like, so much of what we produce just to get through the day is effectively middling. And sure, A.I. should replace a bunch of that. And I think that metaphysical shock comes from the idea that computers shouldn’t be able to do things on their own, and you have a computer that can just do a bunch of stuff for you. And that changes your relationship to the computer in a meaningful way, and I think that’s extremely real.

But the place where I have thought about this the most: I was at the Eras Tour in Chicago when I watched Taylor Swift walk onto a stage, and I saw 60,000 people in Soldier Field just lose their minds, just go nuts. And I’m watching the show, and I’m a Taylor Swift fan. I was there with my niece and nephew and my wife, and we were all dressed up. Why am I thinking about A.I. right now? Like, truly, why am I thinking about A.I. right now?

It’s because this person has made all of these people feel something. The art that has been created by this one very singular individual has captivated all of these people together, because of her story, because of the lyrics, because it means something to them. And I watch people use Midjourney or generate a story with an A.I. tool, and they show the art to you at the end of it, and they’re glowing. Like, look at this wonderful A.I. painting. It’s a car that’s a shark that’s going through a tornado and I told my daughter a story about it. And I’m like yeah, but this — I don’t want anything to do with this. Like, I don’t care about this. And that happens over and over again. The human creativity is reduced to a prompt, and I think that’s the message of A.I. that I worry about the most, is when you take your creativity and you say, this is actually easy. It’s actually easy to get to this thing that’s a pastiche of the thing that was hard, you just let the computer run its way through whatever statistical path to get there. Then I think more people will fail to recognize the hard thing for being hard. And that’s — truly the message of A.I. is that, maybe this isn’t so hard and there’s something very dangerous to our culture embedded in that.

ezra klein

I want to put a pin in the hard things, easy things. I’m a little bit obsessed by that and want to come back to it. But first I want to talk about A.I. art for a minute, because I do think when we’re talking about everything that’s going to come on the internet, we’re talking about A.I. art. Obviously, much of it is going to get better. Some of it is not distinguishable.

You talked about the example where somebody comes and hands you the A.I. art and says, hey, I did this with an A.I. And I’m like, eh — and I have that experience a lot. I’ve also really been trying to use these systems and push them, and play with them, and have A.I. character relationships on my phone with Kindroids and whatever.

And there is this deep hollowness at the center of it. It is style without substance. It can mimic me. It can’t think.

nilay patel

Have you found an A.I. that can actually write like you?

ezra klein

I found an A.I. that can mimic certain stylistic tics I have in a way that is better than I think most people could do. I have not found any A.I. that can, in any way, improve my writing for all that you’re constantly told it can. And in fact, the more I try, the worse my writing gets because typically what you have to do to improve your writing is recognize if you’re writing the wrong thing.

I don’t find writing hard, I find thinking hard. I find learning hard. How good a piece of writing is going to be for me is typically about, did I do enough work beforehand? And A.I. can never tell me you didn’t do enough work, you need to make three more phone calls. You need to read that piece you skimmed.

But it can mimic, and I think it’s going to get better and better at mimicking. I think GPT-3 was much worse at mimicking me than GPT-3.5 was, worse than GPT-4 is, and GPT-5 will be even better than that. I believe this is going to get stronger. It raises a question of whether there is anything essential about something being from a human, in a wide-frame way. Taylor Swift is singular, but the point is that she’s a singular phenomenon. Do we care that things come from people?

I was thinking, when I was preparing for this show with you, of the Walter Benjamin essay. It’s called “The Work of Art in the Age of Mechanical Reproduction.”

nilay patel

This is, like, The Verge’s DNA.

ezra klein

Is it? Yeah, so it comes out in 1935. It’s about the ability to reproduce art. And he says, and I’ll quote it here, “that which withers in the Age of Mechanical Reproduction is the aura of the work of art.” Then he goes on to say, “by making many reproductions, it substitutes a plurality of copies for a unique existence.”

Benjamin is saying this at different times here and in different ways, and I’m going to simplify it by trying to bring it into the present, but it’s that there is something lost when you take the painting and make a copy of a painting. And he’s obviously right, and he’s obviously — then on the other hand, a lot of people like copies of paintings. It’s easy for the artist to think more of the original than the original deserves to be thought of.

But I wonder about this with humans. How much of something is just the fact that there’s a human behind it? My Kindroid is no worse at texting me than most people I know. But the fact that my Kindroid has to like me is meaningful to me, in the sense that I don’t care if it likes me, because there’s no achievement in it liking me.

The fact that there is a human on the other side of most text messages I send matters. I care about it because it is another mind. The Kindroid might be better in a formulaic way. The Kindroid might be better in terms of the actual text. I can certainly tune it more to my kind of theoretical liking, but the friction of another person is meaningful to me. Like, I care that my best friend likes me and could choose not to. Is there an aura problem here?

nilay patel

It is so hard to make someone else feel anything other than pain. Like, it’s just like — it’s —

ezra klein

Christ, that’s the darkest thing I’ve ever heard you say.

nilay patel

Yeah, but I believe it in my soul.

ezra klein

Really?

nilay patel

Yeah. I think the hardest thing to —

ezra klein

This took a really different turn as a show right now. [LAUGHS]

nilay patel

Maybe —

ezra klein

You don’t make people laugh, you don’t give them hugs?

nilay patel

No, I think that’s hard. I think that effort is worth it. That’s why I don’t think it’s a dark thing to say. I think the essence of being a good person is pointing your effort at making other people not feel pain. I think bullies make people feel pain because it’s easy. Again, I come back to Taylor Swift in Soldier Field. The thing that was going through my head is, this person is making 60,000 people feel joy, and she’s doing it through art. That is the purpose of art. The purpose of art is to inspire feelings, to inspire emotion.

And so I look at this A.I., and it’s like, we’re going to be flooded with this stuff, and the only emotion that it is really meant to inspire is materialism, is a transaction. That’s bad. I just think that’s bad. I think we should make some stuff that inspires more joy, that inspires more affection, that inspires more consternation.

And one of the messages embedded in the medium of A.I. is that there is an answer. That’s weird. That is a truly weird thing for a computer to say to you. You ask it about a war, and it’s like I won’t answer that question because there’s no answer there. You ask it about how to cook an egg and it’s like here’s the answer. You’re like what are the four steps to fold a bed sheet? It’s like here’s the answer, I did it. Tell me a bedtime story for my child. It says, here’s an answer, I just delivered this to you at your specifications.

And I think the thing you’re saying about having another mind there is — you want to be in a relationship, like an emotional relationship with another person. Maybe it’s mediated by technology, maybe we’re face-to-face like we are now, but that tension and that reality of — oh, I can direct my effort towards negative and positive outcomes, I have never found it with an A.I.

ezra klein

Shannon Vallor is a philosopher of technology, and she’s got a book coming out called “The A.I. Mirror,” and I like the way she puts this, because there’s this way that A.I. turns a somewhat warped mirror back on ourselves. When I was saying a few minutes ago that the message of A.I. is that you’re derivative, that leaves something out. What it’s really saying is that the part of you the economy often values is derivative, is copyable, because we actually ask people a lot of the time to act like they’re machines.

This is why I don’t take much comfort in the Taylor Swift example. You said a few minutes ago, most people do mediocre work most of the time. Even great people do mediocre work most of the time. We constantly ask huge amounts of the population to do things that are very rote. Keep inputting this data on forms, keep filling out this tax form. Some lawyers are arguing before the Supreme Court; a lot of them are just writing up various contracts. And that’s a good job in the sense that it pays well, it’s inside work, but it doesn’t ask you to be that full of a human being.

Now, you can imagine a sort of utopian politics in society — and people on the left sometimes do — that this comes in and it’s like great, we can automate away this derivative inhuman work, and people will be free to be more full human beings. You actually like — maybe the value of you is not what you can create but what you can experience. A.I. can’t enjoy a day at the park with its family.

But we have an entire society set up to encourage you to premise your self-worth on your work and your wages. And also, if you lose that work and those wages, to rob you of that self-worth. And one thing I’m sure of is that our politics and our economic systems are not going to advance as quickly as A.I. is going to advance.

This is where I think people do properly worry about automation. When people lost manufacturing jobs to lower-wage workers in China, we didn’t say, great, you don’t have to do this stultifying work in the factory anymore. We said, you’re out of work, you’re screwed. And I do think one of the deep confrontations here is, what do we value in people, and then how do we express that value? Because I think what A.I. in some ways is going to take advantage of, or at least is going to challenge, is the extent to which we value people socially for their economic contribution, for what they’re paid. That’s a pretty thin reed for human value to rest on.

nilay patel

Yeah, I buy that. One of my favorite things that I’ve covered in the past few years is a thing called robotic process automation, which is very funny. Just abstractly, deeply hilarious. There are lots and lots of companies throughout the United States that built computer systems 10, 15 years ago, 20 years ago. Hospital systems are famous for this. They have billing systems. They have buildings full of people who use Microsoft Excel on Windows ‘95.

And replacing that is costly and complicated. It can’t break — if you put in the new system and it didn’t bring all the data over in exactly the right way, the whole hospital stops working. So they just buy other computers to use their old computers. Which is wild, and there are, like, billion-dollar companies that do this.

They will sell you a brand new, state-of-the-art computer, and it will connect to the keyboard and monitor jack of your old computer, and it will just use the Windows ’95 for you, which is just bonkers. It’s like a Rube Goldberg machine of computers using old computers, and then your office full of accountants who knew how to use your old system will go away.

But then A.I. creates the scale problem. What if we do that but instead of some hospital billing system built in the ‘90s, it’s just the concept of Microsoft Excel, and now you can just sort of issue a command on your computer and it’ll go use Excel for you and you don’t need an accountant, you don’t need a lawyer.

And I think even in those cases, what you’re going to find is the same thing you talked about with writing — you have to know what you want. You have to know what the system doesn’t know. You have to be able to challenge the model and have it deliver you the thing that, in most business model conversations, I find to be the most important words: “our assumption is” — and then you can poke at that really hard.

ezra klein

What percent of workers are actually asked to poke at the assumptions of their organization? Because I worry it’s not as high as you think it is, or are implying there. I’m not worried about Taylor Swift. I’m not worried about Nilay Patel. And I don’t just want to make this about wages. That’s sort of another conversation, about jobs.

But I do — I mean, as you were saying, these are billion dollar companies that automate people who do backend office work already.

nilay patel

All over the place.

ezra klein

There’s a huge amount of work like that. And I wish I felt confident, as some of the economists say, that we’ll just move people up into the jobs where they use more human judgment. David Autor, who’s a great labor economist at MIT, just made this argument recently: that what A.I. is going to do is make it possible for more people to exercise judgment and discernment within their work. And I hope he is right. I really hope he is right. But I think a lot of organizations are not set up for a lot of people to use judgment and discernment. They treat a lot of people like machines, and they don’t want them doing things that are complicated, stepping out of line and poking at the assumptions in the Excel doc. They want the Excel doc ported over without any mistakes. It seems plausible to me that that’s what we’re going to get.

nilay patel

Do you think their bosses want to be able to poke at the assumptions though?

ezra klein

But if you — I mean, this is actually something I believe about the whole situation. The economy needs fewer bosses than workers.

nilay patel

Yeah.

ezra klein

Think about this in the journalist context or the writing context, where I think what A.I. naturally implies it’s going to do is turn many more people into editors rather than writers. Because for a lot of content creation that doesn’t require a lot of poking at assumptions, mid-level social media marketing — a lot of people are doing that job right now. But the people doing marketing for a mall —

nilay patel

Yeah, that is the MailChimp example. That is the product that they are building.

ezra klein

And so what you have then is, we used to have a bunch of these social media marketers, and now you have one person overseeing a couple of systems, making sure they didn’t say something totally crazy. But you need fewer editors than you need writers. I mean, you know how The Verge is structured. You know how The Times is structured. And this is one of my deep worries.

And then this goes to the thing you were getting at earlier, which is one way I think that A.I. could actually not make us more productive, more innovative, is that a lot of the innovation, a lot of the big insights happen when we’re doing the hard thing, when we’re sitting there trying to figure out the first draft, or learn about a thing, or figure out what we’re doing.

One of the messages of the medium of A.I. is be efficient. Don’t waste your time on all this. Just tell the system what to do and do it. But there’s a reason I don’t have interns write my first draft for me.

nilay patel

Yeah.

ezra klein

They could do it. But you don’t get great ideas, or at least not as many of them, editing a piece of work as you do reporting it out, doing the research, writing the first draft. That’s where you do the thinking. And I do think A.I. is built to kind of devalue that whole area of thinking.

nilay patel

We are working on a big story at The Verge right now that I’m very excited about. But there are four of us right now in an argument about whether we should tell that story in chronological order or as a series of vignettes. There is no right answer to this question. There’s just four people who are battling it back and forth.

ezra klein

I think vignettes.

nilay patel

Yeah. By the way, I’m on team vignette.

ezra klein

Good man. [LAUGHS]

nilay patel

My belief is that it’s easier to digest a long story when it’s composed of lots of little stories as opposed to one long one. I’m being outvoted right now — editor in chief. I should replace them all with A.I., just get them out of here. [CHUCKLES] But that is the kind of work that I think makes the end product great. And I think going from good to great is still very human.

In the economy, though, you’re right, most people are not challenged to go from good to great. Most people are challenged to produce good consistently. And I think that is kind of demoralizing. I don’t know how many first-year Deloitte consultants you have encountered in your life. I’ve encountered quite a few of them. I went to law school; it’s like a factory for that kind of thing. Or first-year law associates.

They’re not in love with their jobs. They’re in love with the amount of money they make, that’s for sure. But any first-year associate doing doc review in a basement — yeah, you could probably just be like, tell the A.I. to find the four pieces of relevant information in these 10,000 page records from whatever giant corporation we’re suing today. That’s fine.

I think that there’s a turn there where maybe we need fewer first-year associates doing that thing and we need more first-year associates doing something else that is difficult, that the A.I. can’t yet do. And I think a lot of this conversation is predicated on the notion that generative A.I. systems, L.L.M.s, will continue on a linear curve up in terms of capability. I don’t know if that’s true.

But I hear a lot of this conversation, and I’m like, there’s always a thing they can’t do. And maybe that thing is not the thing with the most scale, like social media marketing for the mall, but it is always the next amount of complexity. And there’s no guarantee that this set of technologies will actually turn that corner. And you can keep going all the way to A.G.I. There’s no guarantee that an L.L.M. is going to hit A.G.I. and just run the world economy for us. There’s a lot of steps between here and there that I think human beings can fit into.

[MUSIC PLAYING]

ezra klein

So I want to go back, then, to the internet for a bit, which is I think the presentation we’ve offered is fairly pessimistic. You, when I read and listen to you on this, are — I wouldn’t call it pessimistic. I would say a little excited by the idea of a cleansing fire.

So one theory here — and you should tell me if this is reading you right — is that this will break a lot of the current internet. The current internet is weakened. It’s weakened in many cases for good reasons. Google, Meta, et cetera, they’ve not created an internet many of us like. And this will just make it impossible for that internet to survive. The distribution channels will break. And then something. So first, is that how you see it? And second, then what something?

nilay patel

That is very much how I see it. I would add a generational tinge to that, which is, I grew up in that weird middle generation between Gen X and millennials. I think temperamentally I’m much more Generation X. But they describe that generation as: you didn’t have computers, and then you had computers. You played the Oregon Trail. That’s me on the nose.

I distinctly remember life before computers. It’s an experience that I had quite viscerally. And that shapes my view of these tools. It shapes my view of these companies. Well, there’s a huge generation now that only grew up in this way. There’s a teenage generation right now that is only growing up in this way. And I think their natural inclination is to say, well, this sucks. I want my own thing. I want my own system of consuming information. I want my own brands and institutions. And I don’t think that these big platforms are ready for that moment. I think that they think they can constantly be information monopolies while they are fending off A.I.-generated content from their own A.I. systems. So somewhere in there all of this stuff does break. And the optimism that you are sensing from me is, well, hopefully we build some stuff that does not have these huge dependencies on platform companies that have no interest at the end of the line except a transaction.

ezra klein

OK, but you’re telling me how the old thing dies. And I agree with you that at some point the old thing dies. You can feel it. It’s moribund right now. You’re not telling me what the new thing is, and I’m not saying you fully know. But I don’t think the new thing is just a business model that is not as dependent on Meta. I mean, on some level, there’s going to be a lot of A.I. around here.

nilay patel

It’s an audience model. It’s not dependent on these algorithms.

ezra klein

But is there — I guess one question I have is that, one — I mean, you know where the venture capital is going right now.

nilay patel

Yeah.

ezra klein

Everything is going to be built with A.I. —

nilay patel

Sure.

ezra klein

— laced through every piece of it. And some of it, for all we’re talking about, might be cool, right? I’m not saying you’re mostly going to make great art with A.I. But actually, Photoshop did create a lot of amazing things.

And people are going to get better at using this. They’re going to get more thoughtful about using it. The tools are going to get better. But also the people are going to figure out how to use the tools. I mean, you were talking about player pianos earlier. I mean, way beyond player pianos, you have huge libraries of sounds you can manipulate however you want. And now I go listen to a lot of experimental electronic music. And I think a lot of that is remarkable art. I think a lot of that is deeply moving.

I am curious what, to you, the good A.I. internet is, because I don’t think that the next internet is just going to be like we’re going to roll the clock back on the business model. The technology is going to roll forward into all this stuff people are building.

nilay patel

I’m not so sure about that.

ezra klein

Really?

nilay patel

I think we’re about to split the internet in two. I think there will be a giant commercial A.I.-infested internet. That’s the platform internet. That’s where it’s going. Moribund, I agree. But it will still be huge. It’s not going away tomorrow. And they will figure out — these are big companies full of smart people with the most technology.

Mark Zuckerberg is like, I have the most NVIDIA H100 GPUs. Come work here. We’ll pay you the most money. They will invent some stuff and it will be cool. I’m excited about it. But that version of the internet —

ezra klein

You sure sound excited about it. [LAUGHS]

nilay patel

Well, I am. I mean, I love technology. This is our — The Verge’s competitive differentiation in the entire media industry is, like, we really like it. And I’m excited to see what they build. I think there’s some really neat things being built. When I think about the information ecosystem, I’m vastly more pessimistic because of the fact that all of these networks are geared to drive you towards a transaction.

And I don’t mean that in some anticapitalist way. I mean literally the incentives are to get you to buy something. So more and more of the stuff that you consume is designed around pushing you towards a transaction. That’s weird. I think there’s a vast amount of white space in the culture for things that are not directly transactable.

I think next to that you’re going to get a bunch of people, companies who say our differentiation in this market is that there’s no A.I. here. And they will try to sell that. And I don’t know how that experiment plays out. I don’t know if that experiment will be successful.

I do know that that experiment will be outside of the distribution channels that exist now because those distribution channels are being run by companies that are invested heavily in A.I. And I’m hopeful that over there, on whatever new non-A.I. internet that exists, that some amount of pressure is placed on the other distribution channels to also make that distinction clear.

ezra klein

I’m just thinking about this, and the thing that it brings to mind for me is the resurgence of vinyl —

nilay patel

Yeah.

ezra klein

— and the dominance of streaming platforms. So what I would think of as the music industry of — how many years ago was C.D.s? I don’t actually remember now. But what it did was split into — there’s been a resurgence of vinyl, the sort of analog. It’s a little cool. I actually just bought a record player recently, or was given one by my wonderful partner. But that’s not very big.

Then there’s these huge streaming platforms, right? I mean, most people are listening on Spotify, on Apple Music, on YouTube Music, on Amazon, et cetera. And I don’t think we feel like we figured that out very well. But I do think that’s probably going to be the dynamic. I mean, I do think there are going to be things you go to because you believe it is a human being or because you believe the A.I. is used well.

I do also think the big things to come are going to be the things that figure out how to use A.I. well rather than poorly. Maybe that also means honestly and transparently, rather than dishonestly and opaquely.

nilay patel

Yeah.

ezra klein

Maybe the social internet dies because, one, we don’t really like it that much anymore anyway, but also because it’s too hard to figure out what’s what. But actually, an internet of A.I. helpers, assistants, friends, et cetera, thrives. And on the other side, you have a real human. I don’t know. But give me more of the Nilay technology side.

nilay patel

Yeah.

ezra klein

What can A.I. do well? If you were building something or if you were imagining something to be built, what comes after?

nilay patel

By the way, the music industry just released its numbers. Vinyl outsold CDs for the second year running. There was double the revenue in vinyl compared to CDs.

ezra klein

That’s wild, actually.

nilay patel

It’s crazy. And all of that in total is 11 percent of music industry revenues in ’23, compared to the 84 percent of revenue that is streaming. So you are correct. This is a big distinction. People want to buy things, and so they buy one thing that they like. And they consume everything in streaming.

What happens when Spotify is overrun by A.I. music? You can see it coming. What happens when you can type into Spotify, man, I’d really like to listen to a country song. Just make me one. And no one down the line has to get paid for that. Spotify can just generate that for you.

I think that’s going to push more people in the other direction. I really do. That there will be this huge pot of “just make me whatever exactly I want at this moment” money over here. But the cool people are still going to gravitate towards things that are new. I just believe that so firmly in my heart that when I think about where does the technology for that come from, I still think it comes from basic open platforms and open distribution.

The great power of the internet is that you can just make a whole new thing. And I don’t think that anyone has really thought through what does it mean to decentralize these platforms. What does it mean to — I don’t know — build an old-school portal where it’s just people pointing at great stuff as opposed to open this app and an algorithm will just deliver you exactly what we think you want, or, down the line, generate content for you that we think that you will continue watching.

ezra klein

I think — and this is maybe a little bit of a counterintuitive thought — that this is actually a great time to begin things in media. I think that we have a more realistic sense of the business model and what will actually work. You need to build an audience. You need to build something people will actually pay you for. I think a lot of the problem right now is things built for another business model that failed are having a lot of trouble transitioning because it’s very, very hard to transition a structure. Now, that doesn’t mean it’s a great business. It’s not what I hoped it would become. It’s not the advertising revenue I hoped we would have. But it’s something.

What feels fully unsolved to me right now is distribution, right? When I was a blogger, the way distribution worked was people would find me because other blogs would link to me. And then if they liked me, they would put me in their bookmarks section.

Then they would come back the next day by clicking on a bookmark. I don’t think any of us think that much about bookmarks anymore. That’s not really how the internet works. Things moved to search. They moved, primarily for a long time, to social. And that was a way you could create distribution.

You could go from — you started a website. We started Vox, right? We started Vox in 2014 or 2015. The day before we launched, we had no visitors. And pretty quickly we had a lot of things that were working on social and working on Search. And we had millions and millions and millions every month.

But now social is broken as a distribution mechanism. I mean, Elon Musk has made Twitter anti-news distribution. Google search has become very, very messy. People don’t have the old bookmarks habit in the way they did. And so if you’re starting something new, the question of how you build that audience, how you go from nothing to an audience, feels very unsolved.

nilay patel

Yeah. That’s the cleansing fire. That’s the thing I’m excited about. Here’s a new problem in media. Here’s a new problem that’s being created by A.I.

If I were to tell you five years ago, I’m going to launch a new property and the core insight that I have is that we need to replace the distribution mechanisms of the internet, you would not pay me any money. You would not fund that idea. You would not say — well, you would say, get some traffic on Twitter and start a Substack or start a YouTube channel, anything except figure out a new distribution method to compete with these social media companies.

You have that idea now. And people are like, yeah, that’s the problem. We have to solve that problem. That is the problem to solve, because Twitter has blown itself up in whatever way Elon is blowing it up, because the other social channels have become the Home Shopping Network, by and large, because YouTube has optimized itself into making Mr. Beasts and only Mr. Beasts, right?

It’s weird, by the way, that YouTube exists. We’ve barely talked about it on this podcast. It is the thing most people watch most of the time. It supports no journalism at scale. The idea that there’s not an ABC News of YouTube on a distribution platform of that size is a moral failing on Google’s part. I really believe this. And yet we never really talk about it. It’s just — YouTube is ignored. It has become such an infrastructure that we never talk about it.

ezra klein

But my view is that YouTube is the most politically important platform. Everyone wants to talk about TikTok. I think YouTube is much more significant.

nilay patel

Yeah, and they run it really well. They run it as infrastructure. And they talk about it as infrastructure. But it’s weird that we have not built great media company-sized media companies on YouTube’s pipes. We just haven’t done it. So you look at that landscape now and you’re like, well, if I want to do that, if I want to build my own audience, I cannot depend on these companies. I have to be able to do something else.

And maybe A.I. does help you do that. Maybe it does help you send a million marketing messages so people start coming to your website directly. Maybe it does start crafting home pages personalized for people based on your library of content so people see the thing they like the most when they show up. There’s a bunch of moves we can all take from social media companies now to build more engaging, more interesting products using A.I., which will make it easier because the A.I. is a technology commodity. You can just go buy it and use it.

But we have to actually build those products. We have to want to build those products as an industry. And my pessimism is rooted in the idea that the industry kind of sucks at this. We are very much stuck in, we should go send some reporters out into the world, they should come back, write down what they saw, and then hopefully someone else points them at it. And it’s just like, well, that’s been a losing proposition for a decade. We should try something else.

ezra klein

Do you think, beyond the media, because not everything online is media —

Do you think, beyond the media, that there are glimmers of the next thing? I mean, let me give you the thesis I have, which is that the next thing is that the A.I. is somehow your assistant to the internet, right? We seem to me to be moving towards something where the overwhelm is so profound that you actually need some kind of agent working on your behalf to make it through all this.

I mean, you can imagine this is the world of “Her,” the Spike Jonze movie. But you can imagine it as other things, too. There are going to be software coding agents. The guys who started Instagram then started this thing called Artifact, which was using more A.I. personalization to try to tell people what they might like in the news. It didn’t really work out, but it was an interesting project for a minute.

I think a lot of us feel we spent years now being acted upon by algorithms. And one thing about A.I. is that it’s an algorithm you act on, right? You tell it how to act. Assuming that business model allows that, that it doesn’t have a secret instruction to sell you soap or whatever —

— that’s interesting, right? That’s a pretty profound inversion of the internet we’ve been in.

nilay patel

Let me poke really hard at the true difference between an algorithm that shows you stuff and an algorithm that goes and gets you what you want, because I don’t know that there’s a huge difference in the outcome of those two different processes. So for example, I do not trust the YouTube Kids algorithm. I watch my daughter watch YouTube.

ezra klein

No, why would you?

nilay patel

It is just a nightmare. I don’t know why we let her do it, but we did. And now we’re in the rabbit hole and that’s life. I mean, she’s five. And I will literally say, are you watching garbage? And she’d be like, I am, because she knows what I think is garbage. She’s much smarter than the YouTube Kids algorithm. And then she’s like, can I watch a little more garbage? This is a real conversation I have with my five-year-old all the time.

I would love an A.I. that would just preempt that conversation. Just watch this whole iPad for me and make sure my kid is safe. That’s great. But that is a limitation. It is not an expansion. And I think the thing that I’m seeking with all of these tools is how do we help people expand the set of things that they’re looking at.

ezra klein

Well, let me push on this for a minute, because for a long time a lot of us have asked people at the social media companies — I have, I’m sure you have — why don’t you give me access to the dials of the algorithm?

nilay patel

Yeah.

ezra klein

Right? I don’t want to see things going viral. If there’s a virality scale of 1 to 10, I want to always be at a 6, right?

I don’t want to see anything over a 6. And I can’t. I wish I could say to Google, I would like things that are not optimized for S.E.O. I just don’t want to see recipes that have a long personal story at the top. Just don’t show me any of them.

nilay patel

Yeah.

ezra klein

But I can’t do that. But one of the interesting things about using the current generation of A.I. models is you actually do have to talk to it like that. I mean, whether I am creating a Replika or a Kindroid or a Character.AI, I have to tell that thing what it is supposed to be, how I want it to talk to me, how I want it to act in the world, what it is interested in, what kinds of expertise it has and does not.

When I’m working with Claude 3, which is the A.I. I use the most right now, I have one instance of it where I’m just like, you are a productivity coach and you are here to help me stay on task. But I have another where I’m getting some help on, in theory, looking at political science papers, though it’s actually not that good at that.

But this ability to tell this extraordinarily protean algorithm what I want it to do in plain English, that is different, right? The one thing that A.I. seems to make possible is an algorithm that you shape in plain English, an agent that you are directing to help you, in some cases, maybe create the internet, but much more often to navigate it.

Right now it is very hard for me to keep up on the amount of news, particularly around the amount of local news I would like to keep up on. If there is a system that I could say, hey, here’s some things I’m interested in from these kinds of sources, that would be very helpful to me. It doesn’t seem like an impossible problem. In fact, it seems like a problem that is inches away from being solved. That might be cool.

nilay patel

I think that’d be great. I’ve known you for a long time. I think you have a unique ability to articulate exactly what you want and tell it to a computer. [LAUGHS] And you have to scale that idea, right? You have to go to the average — our mothers and say, OK, you have to tell the algorithm exactly what you want. And maybe they’ll get close to it, maybe they won’t, right?

ezra klein

You don’t feel like mothers are able to tell you what they want?

nilay patel

[LAUGHS] I like that idea a lot. I think fundamentally that is still an A.I. closing the walls around you. And I think the power of the recommendation algorithm is not expressed in virality. It’s actually to help you expand your filter bubble. Here’s a band you’d never heard of before. Here’s a movie you never thought of watching. Here’s an article about a subject that you weren’t interested in before.

I think TikTok, in its 2020 TikTok moment, was terrific at this. Everyone was going to sing a sea shanty for five minutes, right? Why do we suddenly care about this and it’s gone? And it was able to create cultural moments out of things that no one had ever really thought of before. And I want to make sure, as I use A.I., that I’m actually preserving that, instead of actually just recreating a much more complicated filter bubble.

ezra klein

I think it’s a good place to end. Always our final question, for the Nilay Patel recommendation algorithm —

what are three books you’d recommend to the audience?

nilay patel

Well, I’m sorry, Ezra, I brought you six.

ezra klein

Did you really?

nilay patel

Is that allowed?

ezra klein

Did you actually bring six?

nilay patel

I didn’t bring six physical books, but I have six recommendations for you.

ezra klein

Damn. All right, go through them quick, man.

nilay patel

They’re in two categories: the three books that I thought of, and three books from Verge people that are important if people are interested in these ideas. So the first one is “The Conquest of Cool” by Thomas Frank, one of my favorite books of all time. It is about how advertising agencies in the ’60s co-opted the counterculture and basically replaced counterculture in America. I’ve thought about this a lot because I’m constantly wondering where the punk bands and Rage Against the Machines of 2024 are. And the answer is that they’re the mainstream culture. It’s very interesting. Love that book. It explains, I think, a lot about our culture.

Two is “Liar in a Crowded Theater” by Jeff Kosseff, which is a book about the First Amendment and why we preserve the ability to lie in America. I have very complicated thoughts about the First Amendment right now. I think social media companies should do a better job protecting my kid. I also think the First Amendment is really important. And those ideas are crashing into each other.

Third, I love the band New Order. I know you’re a music fan, so I brought you a music recommendation. It’s “Substance: Inside New Order” by Peter Hook, who is the bassist of New Order. This band hates each other. They broke up acrimoniously, so the book is incredibly bitchy. It’s just a lot of shit-talking about the ‘80s. It’s great.

But inside the book, he is constantly talking about how the technology they used to make the music of New Order didn’t work very well. And there’s long vignettes of why the songs sound the way they do because of how the synthesizers worked. And that just brings together all the ideas I can think of. So those are the three outside of The Verge universe.

But there are three from Verge people that I think are very important. The first is “Everything I Need I Get From You” by Kaitlyn Tiffany, who’s one of my favorite Verge expats. It is about how the entire internet was shaped by the fandom of the band One Direction. And I think this is totally underemphasized, underreported that fandoms are actually what shape the internet. And a lot of what we think of as internet culture is actually fandom culture. And so Kait’s book is really good.

The other, obviously, that I have to shout out is “Extremely Hardcore” by Zoë Schiffer, who basically wrote about the downfall of Twitter. And I think understanding how a social network works — these are lots of people making lots of decisions, and it was just dismantled. And now you can see how the social network broke. And I think we take these things for granted.

And then the third is “Beyond Measure” by James Vincent, which is a history of the systems of measurement and how political they are. And it is one of my favorite books because it is — you just take this stuff for granted. And you look at it, and you’re like, oh, this was deeply, deeply acrimonious.

ezra klein

Nilay Patel, you’re saving the internet through blogging again.

Your podcast is “Decoder.” Thank you very much.

nilay patel

Thanks, man. [MUSIC PLAYING]

ezra klein

This episode of “The Ezra Klein Show” was produced by Claire Gordon. Fact-checking by Michelle Harris with Kate Sinclair and Mary Marge Locker. Our senior engineer is Jeff Geld. We’ve got additional mixing by Isaac Jones and Efim Shapiro. Our senior editor is Claire Gordon. The show’s production team also includes Annie Galvin, Rollin Hu and Kristin Lin. We have original music by Isaac Jones. Audience strategy by Kristina Samulewski and Shannon Busta. The executive producer of New York Times Opinion Audio is Annie-Rose Strasser. And special thanks here to Sonia Herrero.

[MUSIC PLAYING]


