Apple’s Visual Intelligence feature, introduced a couple of years ago, is expected to be a key component of the company’s upcoming AI tech. According to Bloomberg’s Mark Gurman, this feature might be the central part of Apple’s new smart glasses, AirPods with cameras, and AI pendant accessory.
Visual Intelligence lets users learn about places and objects around them by pointing their camera at something. With iOS 26, Apple supercharged it with screenshot search: when users take a screenshot, the system suggests relevant actions, letting them highlight a product to search for it on Google or Etsy, or ask Apple Intelligence or ChatGPT questions about whatever they've highlighted.
Here’s what Visual Intelligence is, and where it’s going
Now, Bloomberg’s Mark Gurman says Apple wants to upgrade Visual Intelligence with the ability to identify individual items and ingredients when users take a photo of a plate of food. There are also plans to improve turn-by-turn directions so they not only say how far users are from their next turn but also tell them exactly where to go based on landmarks, and even remind users about a particular object or place as they walk around.
These 3 products will reportedly also rely on Visual Intelligence
While these new experiences would work on iPhone models that already support Apple Intelligence, the company reportedly sees the feature as a central part of its new smart glasses, AirPods with cameras, and AI pendant accessory. According to Gurman, Apple’s smart glasses will rely on the new Siri and will feature higher-end materials and more cameras than Meta’s offerings.