I tested Apple Intelligence on my iPhone 15 Pro Max: 4 ways it spoiled me rotten
Apple Intelligence, if it were personified, would be a royal attendant who feeds me grapes and fans me with palm leaves. I’ve never felt so pampered. Is this what it’s like to be catered to? Is this what it’s like to be spoiled rotten?
If you’ve been out of the loop, Apple Intelligence is the Cupertino-based tech giant’s new suite of AI features, which were announced at WWDC 2024 in June.
Some of the most highly anticipated Apple Intelligence features, like Genmoji (AI-generated emojis) and Image Playground (AI-generated images), are not available yet. However, there are still some Apple Intelligence utilities you can test right now — especially now that they’ve debuted in earnest in the iOS 18.1 developer beta that Apple dropped on Monday.
I tested some Apple Intelligence features on my iPhone 15 Pro Max, and to put it succinctly, I appreciate how much the suite indulges my laziness. Of course, it’s not perfect — not yet, at least. After all, the iOS 18.1 developer beta is expectedly a bit rough around the edges as Apple collects feedback from testers. (This is why you should always back up your iPhone before installing any iOS beta; betas can be risky.)
Overall, though, Apple has something here that will make its “Pro” and “Pro Max” iPhone models more enticing than ever. (Apple Intelligence is only available on the current-gen “Pro” variants: the iPhone 15 Pro and iPhone 15 Pro Max.)
New ‘Summarize’ tool makes Safari more attractive
Apple basically put a TLDR (too long, didn’t read) button in Safari, letting me skip ultra-long articles and get straight to the point. After opening Safari, I can press and hold the icon on the left of the URL bar, tap “Summarize,” and, wham, Apple Intelligence gives me the “CliffsNotes,” if you will, of any article I don’t have time to read. It’s perfect for when you want to get the gist of a story as quickly as possible.
As someone with an attention deficit, I can’t help but get a little antsy when I stumble upon a wall of text or a verbose story. Typically, I’d read a few sentences and give up. With the “Summarize” tool, however, I got short-and-sweet synopses of articles that ramble, meander, and seem to go nowhere fast. In fact, at times, I found myself wanting to read a lengthy article in its entirety after the AI-powered summary revealed that the story was juicier than expected.
That being said, when it comes to delving into dense articles, why would I stick with Google Chrome? I’m hopping and skipping over to Safari to take advantage of the new Apple Intelligence-powered Summarize tool.
The only downside is that the Summarize tool tests my patience sometimes. It can be a few seconds too slow for my tastes, but that isn’t unique to Apple Intelligence (ChatGPT, Copilot, and Gemini can be slowpokes, too). Admittedly, though, it’s worth the wait.
‘Writing Tools’ feature is surprisingly useful
One of the most popular use cases for AI, whether it’s ChatGPT, Google Gemini, or Copilot, is “tone tuning” text. For example, you can drop in an email draft and ask those AI tools to help adjust your tone.
However, I’ll admit that I was one of those people who thought, “Pfft, I don’t need an AI chatbot to tell me how to make something sound more professional!” As it turns out, the Apple Intelligence-based tone-adjustment tool, which can be accessed via the new Writing Tools feature, is a lot more helpful than I thought.
You’d be surprised how often you think you’re coming across as friendly and congenial in emails and texts when, in fact, your messages are being read as prickly.
To reduce that risk, I found myself using the “friendly” tone adjustment to nix unintended snippiness. The best part is that I was able to use Writing Tools in almost any text field across the iOS 18.1 developer beta. I just highlight the text, hit “Writing Tools,” and choose my desired tone.
As a result, I’ve definitely seen more positive responses from my co-workers, friends, and other loved ones.
No more endless scrolling on the Photos app
I take a lot of pictures and selfies, but no, I don’t categorize them or sort them into albums (because, spoiler alert, I’m lazy).
I just let them pile up into a haphazard collection of random snapshots. Every now and then, however, I need to track down one specific picture, which means scrolling endlessly through the pile.
Fortunately, with the iOS 18.1 developer beta, I can use natural language to search for a particular picture in the Photos app. For example, I typed the word “pancakes” to find a saved screenshot of my favorite IHOP order.
I could even type in the words “woman with a red top,” and surprisingly, all of the selfies of me wearing red (my favorite color) surfaced. Very cool feature!
Siri is more helpful than ever
Siri received an AI-powered boost with Apple Intelligence, but my favorite perk is its contextual awareness. For example, if I’m perusing a webpage in Safari, I can say something like, “Hey Siri, send this article to Jason.”
Siri knows which article I’m looking at, so it can snatch the URL and send it to my fiancé without my having to lift a single finger. (I’m telling you — I’ve been spoiled rotten.)
Is Apple Intelligence worth it?
As hinted at the outset, Apple Intelligence nurtures my laziness — and I dig that. I don’t always want to suffer through several paragraphs to get to the author’s point. I don’t want to spend too much time hemming and hawing over how to best respond to someone. And finally, I don’t want to scroll endlessly through my cluttered gallery to find a specific photo — it’s like finding a needle in a haystack.
Apple Intelligence addresses all of those concerns. I was initially skeptical of Apple’s new suite of AI features, but as it turns out, Apple Intelligence is useful, and yes, totally worth it.
Apple Intelligence is expected to officially roll out with iOS 18 later this year (though keep in mind that reports claim some features may be delayed).