
Bloomberg’s Mark Gurman reported a couple of days ago on what he described as “Apple’s biggest push into health yet with a new AI doctor.”
While I completely understand why Apple would want to do this, I think the company will need to tread extremely carefully to avoid the risk of doing more harm than good …
Apple’s most important contribution
Apple CEO Tim Cook said back in 2019 that he believed health initiatives would be the most important thing the company ever did.
On healthcare in particular and your wellbeing, this is an area that I believe, if you zoom out into the future, and you look back, and you ask the question, “What was Apple’s greatest contribution to mankind,” it will be about health.
Because our business has always been about enriching people’s lives [and now we are] empowering the individual to manage their health. And we’re just at the front end of this.
He’s reiterated that view on a number of occasions – and I think he may well be right.
While I’ve personally swapped out my Apple Watch for a smart ring, the form factor isn’t important; I fully expect an Apple Ring before too long. Healthtech is indeed a massively important and impactful field, and Apple is absolutely making a huge contribution.
Features like AFib detection, Fall Detection, Emergency SOS, and more have already saved a great many lives. I said a few years ago that we are witnessing the quiet beginning of a huge Apple Health revolution.
Less visibly, but even more importantly, Apple’s ResearchKit is providing health researchers with a breadth and depth of data that will undoubtedly transform our understanding of countless medical conditions. That data will, over time, save way more lives than any number of direct health alerts to individual Watch wearers.
App-based health coaching
App-based health coaches are already a thing, of course. Apple’s Activity rings have made a massive contribution to encouraging people to get more exercise, supported by quieter but still important things like prompts to stand and take time out for some breathing exercises.
Using health devices like smart watches and smart rings to collect data in order to make personalized recommendations is something I really appreciate. Indeed, I’ve suggested that Apple should buy Oura not for the hardware – which the Cupertino company can easily replicate – but for the excellent app.
I gave the example of the personalized input the Oura app gave me regarding sleep quality:
It noticed I slept better when going to bed at around 11pm, so proactively recommended that bedtime, prompting me to start winding down an hour or so beforehand. It also noted I slept better after a medium amount of exercise, rather than soon after a lot or a little […]
The app has completely won me over. It’s clear that a huge amount of thought has gone into deciding what to show as standard, when to prompt you with personalized alerts and suggestions, and when to offer more general advice.
Apple’s Health app is visually beautiful, and very easy to use, but it’s never made me feel like I have a personal health and fitness coach in my pocket; the Oura app does.
Oura is in the process of rolling out an AI-based health coach into the app, and I’m cautiously curious about that.
But an AI doctor is something else
What worried me about Gurman’s report was that “AI doctor” label.
Of course, this phrase is almost certainly one Gurman has coined to describe his own understanding of the goal of the feature – I don’t think it’s remotely likely Apple will actually use that term.
But whatever label the iPhone maker lands on, perceptions matter. If users perceive the advice to be coming from an AI personal trainer or coach, that’s one thing. They understand the limits of that expertise, whether offered by a human or an app.
But if users perceive the new AI to be offering them more general health advice, and it strays into areas more commonly associated with family doctors than PTs, that potentially puts Apple in very dangerous territory. People may rely on AI advice when they should really be seeing a medical professional.
Of course, you can argue that the app would always play safe – in case of any doubt at all, it will direct users to seek out a doctor. But there are two big problems with this.
First, if the AI doctor makes a habit of suggesting people need to visit a real one, it will eventually be ignored. Constantly crying wolf runs the huge risk of people failing to seek help on the few occasions they really need to.
Second, it’s an AI system. There’s no telling what it will or won’t do. Its guidelines may tell it to err on the side of caution, but as I’ve said more times than I can count, AI systems are not intelligent. Nor do they reliably follow their guidelines.
So yes, I’m totally persuaded that healthtech can make a massive contribution to the world, and I’m certain Apple will be a key player in that. But I very much hope this is one area of AI where Apple will be exceedingly careful not to over-promise.
Photo by Alex Knight on Unsplash