Oh, Apple. You waited until we all got comfortable, and then you dropped a whole new set of Apple Intelligence features like it was no big deal.
In an update dated September 15, 2025, Apple says the new Apple Intelligence features are available now, and they land across iPhone, iPad, Mac, Apple Watch, and even Vision Pro. The big headline for normal humans, not robot enthusiasts, is Live Translation baked into Messages, FaceTime, and Phone.
So yes, you can finally talk to people in other languages without doing the classic, awkward “hold on, I’m opening an app” move. Even better, Apple says you can use Live Translation with AirPods Pro 3 for in-person conversations, which sounds like sci-fi, but in a very Apple way.
Also, visual intelligence is getting smarter about what’s on your screen. Think “I’m looking at something, I want to know what it is, and I want my iPhone to stop pretending it doesn’t have eyes.”
And because Apple knows we love a good power user moment, Shortcuts can now tap into Apple Intelligence models directly. Translation: your goofy little automations are about to get a lot less goofy and a lot more useful.
My take: this is the kind of AI rollout I actually want from Apple. Less hype, more stuff I'll use on a random Tuesday when I'm traveling, texting, or trying to automate my life because I refuse to do repetitive tasks like a medieval peasant.