Apple Is Bringing A.I. to Your Personal Life, Like It or Not
Last week, Apple held its Worldwide Developers Conference, the annual event that is often used to showcase the company’s most significant innovations. Much of the presentation this year was devoted to A.I., or, as the company is branding it, Apple Intelligence. Whereas Google and Microsoft have leaped headlong into A.I. with their Gemini and Copilot products, respectively, Apple is so far taking a narrower approach. The A.I. model it is unveiling on iPhone hardware is relatively weak. A.I. models are measured by their number of “parameters,” or the variables adjusted during the training process; while OpenAI’s GPT-4 reportedly has more than one and a half trillion parameters, Apple’s model has three billion. For queries that require more horsepower, users will be offered the option to outsource a task to ChatGPT in the cloud, through a corporate licensing deal that is reportedly not in exchange for a fee but for exposure for OpenAI. In other words, there’s no Apple-made superintelligent thinking machine—at least not yet.
Accordingly, the reaction to the conference presentation has been somewhat muted. In New York magazine, John Herrman wrote that it represented “a cautious approach by Apple,” and speculated that the company might be wary of overinvesting in a technology that isn’t quite as far along as it is often marketed to be. In the Washington Post, Josh Tyrangiel described Apple Intelligence as “the first rational theory of AI for the masses,” praising the applications’ limited scope and the partnership between the veteran computing company and the upstart OpenAI. I suppose we should be celebrating the fact that Apple hasn’t entered the A.I. arms race full throttle. Google’s rush to keep pace with Microsoft’s A.I. developments has already resulted in the accelerated decay of Google Search tools. But I had a less sanguine reaction to the W.W.D.C. keynote. Apple Intelligence, a small model that could eventually be nestled on more than a billion iPhones around the world, crosses a kind of Rubicon: A.I. is entering our personal lives, and once it’s there it’s not likely to retreat.
As demonstrated at the presentation, A.I. for the iPhone will soon be available to rewrite your e-mails for you; summarize your overactive group texts; and triage your notifications, sorting which messages you see first. Apple’s C.E.O., Tim Cook, described the tool as a “new personal intelligence system,” not simply a tool but a secondary, semi-autonomous brain. His remarks reminded me of Steve Jobs’s declaration, in 1990, that the computer is a “bicycle for the mind,” but in this case the computer is now just the mind, and the human being using it becomes engaged in a kind of automation of the self. Over the past two decades, Apple has succeeded in integrating iPhones into all of the mundane tasks of our daily lives: contacting friends, navigating places, sending work e-mails, making payments. Its introduction of Apple Intelligence marks a step into a new technological era—call it the domestication of generative A.I.
Apple is confident that its access to all of your day-to-day information—where you travel, what you read, whom you talk to and when—will allow its A.I. tools to be more dynamic and more useful than OpenAI’s ChatGPT (which, for all its alarming powers, still operates out of a contained prompt window). “It has to understand you and be grounded in your personal context, like your routine, your relationships, and more,” Cook said during the event. Siri, Apple’s voice assistant on the iPhone since 2011, can understand what you say (most of the time), set alarms, and check the weather. But Apple Intelligence, which performs like a turbocharged Siri, is more of a ghost in the machine, animating the functions of your phone. The A.I. community calls this sort of tool an “agent.” Let your A.I. agent access all of your contacts, texts, and calendars, and it will competently plan your life. Apple presented, as an example, asking the tool, “Play the podcast my wife sent the other day.” Having a machine that can decode such vague references seems quite convenient, but consider that it also requires the phone to understand who your wife is and to rifle through your conversations with her.
The fact that Apple’s A.I. is designed to run on the device itself, rather than via the cloud, promises to protect users’ vulnerable personal data to some extent. Yet generative A.I. remains prone to random misunderstandings or “hallucinations,” the somewhat euphemistic A.I. term of art for dramatic mistakes. The technology has no ability to determine what is factually accurate or connected to reality. Cook told the Post that the tool would not achieve a hundred-per-cent accuracy but added, less than reassuringly, “I am confident it will be very high quality.” One can imagine that a single bizarre accident might be enough to turn an iPhone user off A.I. A wrong answer to “When is my mom landing?”—another sample query from the conference—could yield an airport-pickup snafu. More frightening would be a reply-all disaster—perhaps the A.I. might misunderstand your use of the term “everyone” and e-mail every person in your contact list. Maybe you don’t respond very quickly to messages from your boss, and the A.I., clocking this, decides to start hiding them. (Generative A.I. has been known to be so affirming of its user’s desires that it even hallucinates fictional sources, including Web sites and books, for the faux facts that it produces.)
The Apple event revealed a certain softening of the company’s walled-garden approach to personal technology. Under Jobs and the designer Jony Ive, Apple took the position that its own products were ideally designed and thus self-sufficient. There was no need for customization beyond the surface-level choice of phone-casing color; the operating system and graphic interface were one-size-fits-all. With the new OS update, users can change their phone’s home-screen aesthetic, adding new skins or color palettes so that one’s assembled apps look more like a vintage Winamp interface than a sleek Apple product. When the physical button on the side of the phone is pushed, a bump appears in the digital black border of the screen, as if that is being pushed, too. The effect over all is friendlier, almost personified, paving the way for us to view our smartphones less as perfect static machines than as malleable supporting characters in our lives. The A.I. technology that Apple is demonstrating may not be the most powerful out there, but it sends a powerful message that A.I. belongs in every corner of our lives.
In the year and a half since OpenAI unleashed ChatGPT on the public, we’ve been left to speculate about what drastic effects generative A.I. might have on society. Will it destroy jobs? Drive us to emotional relationships with robots? Accidentally cause human extinction? So far, though, I’ve come to think of A.I. as an accelerant for the kinds of automation already taking place on the social-media-era Internet. Algorithmic feeds—driven by machine learning, an earlier form of A.I.—push their consumers toward generic content and encourage creators to tailor their work toward the lowest common denominator. They demonstrate our tendency to look in whatever direction machines point our attention. With A.I. on our phones, a similar force will exert itself on our lives even outside of social-media platforms. We will rapidly enter a world in which we don’t know whether a text message was written by the person sending it or by Apple Intelligence, a world in which our phones help shape who we’re in touch with and how we recall our own memories. (At W.W.D.C., Apple showed off a feature that doesn’t just create slideshows of photos, as iPhones do now, but composes filmlike montages with their own internal narrative structures designed by A.I.) On one level, we may feel relieved to seamlessly automate some details of our cluttered lives away. But doing so means relinquishing control over the most basic units of human communication. Like many people, I use my iPhone nearly every hour that I am conscious, every day, but the new announcements left me craving a dumbphone, since those machines, at least, cannot attempt to think for me. ♦