ChatGPT and Mac app integrations point to an exciting future


The long-term promise of Apple Intelligence and next-gen Siri is that it will be able to access all our apps, and the data stored in those apps, to become massively more helpful.

ChatGPT has effectively given us a preview of this type of capability through its integration with a handful of Mac apps, and I’ve been putting it to the test …

ChatGPT’s Mac app integration

Back in November of last year, the ChatGPT Mac app gained the ability to access a few Apple and third-party apps. This was originally aimed at helping with coding.

Support for further apps rolled out, including Apple’s Notes app and TextEdit. This feature is now available to free as well as paid ChatGPT users.

A workaround to preview the future

While the list of compatible apps is still very small, I realized I could preview its broader usefulness by pasting content from unsupported apps into supported ones. This is obviously a bit clunky, but it does give us an idea of what the feature will be able to do for us as it supports more apps.

For example, I had a lengthy Pages document that I wanted ChatGPT to summarize for me. While Apple Intelligence claims this capability, my own experiments have shown that it’s unfortunately not very good – often omitting very important information while including things which don’t much matter.

ChatGPT is not only considerably better at it, but lets me ask follow-up questions in a very conversational way.

I simply opened the Pages document, selected all the text, copied it, opened a new TextEdit document, and pasted in the text. This sounds clunkier than it is in practice: a quick CMD-A, CMD-C, and CMD-V takes just seconds.

For some other tests, I’ve had to use a more convoluted process! For example, I wanted ChatGPT to analyze an Excel spreadsheet for me. I tried pasting the cells into Notes, which did format them correctly as a table, but the ChatGPT Mac app can’t yet see the content of tables. When I pasted the table into the ChatGPT app itself, it was able to understand it.

It feels like a very natural process

Ignoring the clunkiness needed for the experiments, the actual process once the document is ready feels very natural. When I say “Tell me about this”, ChatGPT immediately does a very good job of summarizing it. The ability to ask follow-up questions feels remarkably like a real conversation, and ChatGPT even searched the web for more information when it needed to answer some of my questions.

I use the Notes app extensively, not just for personal notes, but also as a database of reference information I’ve saved over the years. For example, if I read an article with good advice I plan to follow in future, like a good mix of camera angles to use in short films, I’ll either paste the text into a note or create my own notes based on it for later use.

I’ve experimented with things like pointing ChatGPT to a tutorial in a note and asking questions like “How easy or difficult do you think this would be?” or “How much time do you think it would take?” and been impressed.

For example, I pointed it to a Home Assistant tutorial and asked it “How much time do you think this would take if I’m currently using HomeKit and have about 50 smart home devices?” It then gave what appeared to be very plausible estimates for each step of the process. (The total, if you’re interested, was 15-30 hours, which was enough to put me off any thought of converting my existing smart home, though I might consider it for a future one.)

That (indirect!) spreadsheet analysis? I was very impressed with its ability to make sense of the table despite minimal contextual clues. For example, despite the lack of any label around ‘property’ or ‘apartment,’ it was able to figure out that I was comparing the financial impact of three different potential apartment purchases, draw useful conclusions, and answer my questions about it. Once we’re able just to point to an open spreadsheet, this will be super-slick!

This points to an exciting future

I’m using ChatGPT here, but let’s imagine all this working natively with Apple Intelligence.

Imagine a future where I can select any app to work with – or an even better one where I don’t have to specify apps at all, as Apple Intelligence will have access to all of them, and will have a good idea of what to look for where.

That feature I asked for way back in 2016 looks like it might finally be close to landing.

Hey Siri, arrange lunch with Sam next week

Working – I’ll get back to you shortly …

Ok, I arranged lunch with Sam for 1pm next Wednesday at Bistro Union at Clapham Park

I said it should be able to do this by checking my calendar for free slots, checking hers at the free/busy level (with permission), knowing where we will each be on the day to find a mutually convenient place, knowing our restaurant preferences, and so on. Being able to access data from all my installed apps should – eventually – enable this type of capability.

Given the pace of Apple Intelligence to date, I’m obviously not holding my breath. But based on what we’ve just seen at Google I/O, Google now appears to be getting very close indeed to this capability. Perhaps that’s the way we’ll have to go first – work in Google apps to get early access – but whenever it does finally come to iPhones as a native ability, it really will have been worth the wait.

Photo by BoliviaInteligente on Unsplash
