Image Playground's idea of an Apple in a cage. Background expanded using Photoshop's Generative Extend.

The AI trap has closed around Apple. Don’t be next.

Apple tried to make AI a personal answer engine. It failed. It's a warning to all businesses.

Adam Tinworth

Apple has fumbled it.

Not words we get to write all that often, but it’s true. Apple has completely blown it over AI.

A little under a year ago, Apple announced a raft of new AI features coming to its platforms. They ranged from the trivial — AI-generated emoji — to the transformative, like a brand-new Siri that understood all your personal data and context, and could answer questions about any of it for you.

The trivial bits have shipped. For example, here's what the Image Playground AI image generation app produced when I asked it for a picture of me writing on my laptop:

An image of Adam Tinworth at work, generated by Apple's Image Playground app.

That’s flattering. But not great.

The improved, AI-powered Siri? Delayed — to an unknown date.

Apple’s AI anguish

This is… unusual. Apple very rarely announces something that never ships. And pretty much never something this big.

John Gruber, not normally someone to be harsh on Apple, put it perfectly in an excoriating piece, the tone of which you can get from its title: Something Is Rotten in the State of Cupertino.

What Apple showed regarding the upcoming “personalized Siri” at WWDC was not a demo. It was a concept video. Concept videos are bullshit, and a sign of a company in disarray, if not crisis. The Apple that commissioned the futuristic “Knowledge Navigator” concept video in 1987 was the Apple that was on a course to near-bankruptcy a decade later. Modern Apple — the post-NeXT-reunification Apple of the last quarter century — does not publish concept videos. They only demonstrate actual working products and features.

Apple “demonstrated” something last year that they’re not confident they can deliver a year later. This is not good.

The seduction of the AI trap

Now — I’m in danger of repeating myself here. I’ve made the point for both NEXT Insights and this very site that LLMs are, fundamentally, guessing machines, not answer engines.

With this Siri project, Apple is trying to use an LLM as an answer engine. Oops.

Now, let’s be fair: in theory, it’s guessing from a much smaller set of data, your own personal data. And that should, in theory, make it less likely to make bad guesses. But it will make bad guesses. And maybe that’s Apple’s problem here. They’re trying to make LLMs do something they’re just not good at: giving you the right answer from a set of data 100% of the time. And perhaps the underlying reason for the delay is less that Apple is bad at AI (which it probably is), and more that AI can’t actually do what they’re asking of it, at the standard they require.

The news summary warning signal

They’ve already been burnt on this once. Remember the news notification summaries? Apple had to suspend them after they kept making things up:

The BBC was among the groups to complain about the feature, after an alert generated by Apple's AI falsely told some readers that Luigi Mangione, the man accused of killing UnitedHealthcare CEO Brian Thompson, had shot himself.
The feature had also inaccurately summarised headlines from Sky News, the New York Times and the Washington Post, according to reports from journalists and others on social media.

One of the biggest companies in the world has fallen into the trap of thinking LLMs are answer engines, and it is paying the reputational price for doing so.

How much worse would that damage be if the company putting out a product that was too inaccurate, too often, was one that depends on its reputation for truth and accuracy to survive at all?

Apple’s fumble is a great, big warning light we should be paying attention to. AI is not going away — but businesses that rush to implement it inappropriately might well do.


Adam Tinworth

Adam is a digital journalism lecturer, trainer and writer. He's been a blogger for over 20 years, a journalist for 30 and teaches audience strategy and engagement at City St George’s, London.
