
The long wait for a smarter Siri is to get even longer, with some indications that the new features we were originally expecting in iOS 18.4 may now be pushed back to iOS 19.
Apple hasn’t provided any real explanation, but two theories have so far been put forward, and now a developer and data analyst has suggested that security concerns may be a third reason – and by far the biggest problem …
Smarter Siri delay
Apple first promised a much smarter Siri at WWDC in June of last year. While the company has launched some Apple Intelligence features, and Siri can now at least cope with verbal stumbles and some compound commands, we haven’t yet seen any sign of the three key improvements promised by Apple:
- Contextual awareness, where it can access your personal information
- The ability to see what’s on your screen, and respond to that
- In-app actions, where you tell Siri what you want to do and it will use your apps to achieve it
We were originally expecting these features to be included in iOS 18.4, while a more conversational Siri would wait until next year, but Apple has now said that it has hit problems with this timeline.
We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps. It’s going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year.
It’s not completely clear what the company means by “the coming year” – whether that is later this year, or at some point next year, but it could easily mean we won’t get a smarter Siri until iOS 19.
Two reasons have so far been suggested
Bloomberg has consistently reported that Apple has been struggling to prepare these features for launch, indicating that there are simply too many bugs at present.
Inside Apple, many employees testing the new Siri have found that these features don’t yet work consistently.
At the time, however, Mark Gurman expected the launch to be delayed from iOS 18.4 to 18.5.
It’s also been suggested that the underlying reason for these bugs is that Apple currently has two completely separate versions of Siri, with one layered on top of the other. The original version still handles tasks Siri has always been able to execute, while the second layer intercepts more complex commands. Apple is reportedly struggling to integrate everything into a single version of Siri.
But security may be the biggest concern
Developer Simon Willison, creator of the open-source data analysis tool Datasette, suggests that Apple may also be struggling to keep a smarter Siri secure. Specifically, he thinks it may be vulnerable to prompt injection attacks.
These are a well-known problem with all generative AI systems. Essentially an attacker attempts to override the built-in safety measures in a large language model (LLM) by fooling it into replacing the safeguards with new instructions. Willison says this poses a potentially massive risk with a new Siri.
These new Apple Intelligence features involve Siri responding to requests to access information in applications and then performing actions on the user’s behalf.
This is the worst possible combination for prompt injection attacks! Any time an LLM-based system has access to private data, tools it can call, and exposure to potentially malicious instructions (like emails and text messages from untrusted strangers) there’s a significant risk that an attacker might subvert those tools and use them to damage or exfiltrating a user’s data.
In other words, fooling ChatGPT into giving you instructions for building a bomb is one level of risk, but tricking Siri into handing over your personal data is something else altogether.
Apple commenter John Gruber finds this theory credible, saying that nobody has yet succeeded in stopping prompt injection attacks.
A pessimistic way to look at this personalized Siri imbroglio is that Apple cannot afford to get this wrong, but the nature of LLMs’ susceptibility to prompt injection might mean it’s impossible to ever get right. And if it is possible, it will require groundbreaking achievements. It’s not enough for Apple to “catch up”. They have to solve a vexing problem — as yet unsolved by OpenAI, Google, or any other leading AI lab — to deliver what they’ve already promised.
He wonders aloud how Apple allowed itself to promise – and even advertise – features which may prove too dangerous to ever launch.
9to5Mac’s Take
While I’m sure the earlier reports are accurate, it does seem entirely plausible that security is also a key factor.
Apple has turned privacy into a key differentiator and selling point for its products, so any vulnerability that allows a rogue app to access and export your personal data would be a complete disaster for the iPhone maker.
While I wouldn’t expect the smarter Siri project to turn into another AirPower, it is looking increasingly like Apple should have just kept quiet about timings until it was closer to addressing all of the challenges.
Image: Apple and Michael Bower/9to5Mac
FTC: We use income earning auto affiliate links. More.