Apple's Visual Intelligence on iPhone 16: A New Way to Explore the World

Apple’s new Visual Intelligence tool on iPhone 16 enables users to access information directly by pointing their camera at objects.

Apple’s latest iPhone 16 lineup introduces Visual Intelligence, a feature accessed through the new Camera Control button that streamlines how users look up information about the world around them. By pointing the iPhone at a storefront or object, users can pull up relevant details, such as store hours or product information, without opening multiple apps.

Visual Intelligence, included in the iOS 18.2 developer beta, offers options to send what the camera sees to ChatGPT or to run a Google Search directly from the camera interface. Users can take a photo and instantly receive details on everything from restaurant menus to historical landmarks, without unlocking the phone or typing a query.

For instance, a New York-based journalist tested the feature by pointing the iPhone 16 camera at a Japanese tea shop in Bowery Market. With one tap, Visual Intelligence displayed the shop’s hours and images of its drinks, along with options to call the establishment or place an order. The functionality extends beyond stores: when the journalist pointed the camera at a retro gaming console, Visual Intelligence seamlessly handed off to ChatGPT to identify the item and share its release date.

Visual Intelligence highlights Apple’s commitment to simplifying mobile searches. Although Google Lens offers a similar experience, Apple’s integration of this feature in a dedicated button on the iPhone 16 suggests a more streamlined, camera-based approach to accessing information. It demonstrates Apple’s vision of reducing the need for separate apps and advancing intuitive, camera-led discovery on mobile devices.

The initial response from testers indicates that Visual Intelligence is particularly valuable for those exploring unfamiliar places, such as tourists needing quick insights into local attractions. While some may prefer established browsing habits, the potential convenience of Visual Intelligence suggests that it could reshape how people use their phones over time.

Though the feature remains in its early stages and is currently limited to developers, Apple plans to refine Visual Intelligence before its full release, potentially changing the way users interact with their surroundings through their iPhone cameras.
