Apple’s Visual Intelligence feature, which uses AI to identify objects, translate text, and provide other smart capabilities, has expanded with iOS 18.4. Initially limited to the iPhone 16 lineup, it is now also available on the iPhone 15 Pro, iPhone 15 Pro Max, and iPhone 16e.
The feature can be launched from the Action button, the Lock Screen, or Control Center, adding smarter capabilities to the camera. With Visual Intelligence, users can learn more about the world around them, such as looking up restaurant details, identifying plants and animals, and translating text.
Apple’s plans for Visual Intelligence reportedly extend beyond the iPhone: future devices such as AirPods and the Apple Watch are rumored to gain cameras to support similar AI-driven features. Camera-equipped accessories would extend these capabilities across more of Apple’s ecosystem.
With further improvements and expansions expected, Visual Intelligence looks set to become a core part of Apple’s strategy for integrating AI across its products.