Oh, Apple. Just when we finally got used to saying “Apple Intelligence” without sounding like we’re auditioning for a sci-fi movie, they go and drop a fresh batch of features.
Apple says new Apple Intelligence features are available now, and the headline act is Live Translation built right into Messages, FaceTime, and Phone. So yes, your iPhone can basically play interpreter while you pretend you totally remembered your high school Spanish.
It gets even better: Live Translation can also work with AirPods Pro 3 for in-person conversations, which is both insanely useful and a little hilarious to imagine in real life. Like, “Hold on, let me double-press my AirPods stems so I can understand what you just said.”
There’s also updated visual intelligence that helps you do more with what’s on your screen. Think quick actions like summarizing text, translating stuff, or even pulling an event off a flyer and tossing it into your calendar, all without the usual copy-paste gymnastics.
And for the automation nerds (hi, it’s me), Shortcuts can now tap into Apple Intelligence models, either on-device or via Private Cloud Compute, which means your little personal workflows can get way smarter. If you’ve ever wanted your iPhone to feel like a helpful assistant instead of a glowing rectangle that judges you, this is the vibe.
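If you’re the kind of nerd who likes peeking under the hood: the models Shortcuts is tapping into are the same ones Apple exposes to developers through the Foundation Models framework. Here’s a minimal Swift sketch of what that looks like, with the caveat that the `summarize` helper, the prompt wording, and the fallback message are all just illustrative, not the exact plumbing Shortcuts uses:

```swift
import FoundationModels

// A minimal sketch of calling the on-device Apple Intelligence model,
// roughly what a model-powered Shortcuts action does behind the scenes.
// `summarize` is a hypothetical helper, not an Apple API.
func summarize(_ text: String) async throws -> String {
    // Bail out gracefully if Apple Intelligence isn't available on this device.
    guard case .available = SystemLanguageModel.default.availability else {
        return "Apple Intelligence isn't available here."
    }

    // Instructions steer the model; the session keeps context across turns.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one or two sentences."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Point a Shortcut (or your own app) at something like that, and “summarize this flyer for me” stops sounding like science fiction.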
My take: this is the kind of update that makes Apple’s AI strategy feel less like a buzzword and more like “oh wow, I will actually use this.” If you’re already living on iPhone, iPad, and Mac, the whole cross-device thing is where it starts to feel very Apple, in the best way.