Apple Introduces Google Lens-Like Visual Intelligence in iPhone 16 Camera

During Monday’s iPhone 16 event, Apple announced a brand new lineup of iPhones, new AirPods and AirPods Max models, and the Apple Watch Series 10. Apple also took the opportunity to expand on Apple Intelligence, the suite of artificial intelligence (AI) tools coming to the iPhone. This includes a new Visual Intelligence feature, which essentially gives the iPhone camera Google Lens-like capabilities.

Apple’s Visual Intelligence lets you capture a photo of things around you, such as a flyer or a restaurant, and then uses the iPhone’s AI capabilities to search for it and give you more information.

Apple says captured data will remain private when processed by Apple Intelligence and the company’s Private Cloud Compute, though users can opt in to third-party integrations with the new camera experience.

Third-party integrations include the ability to search Google for whatever the camera captures, much like opening Google Lens straight from the iPhone camera app. Users can also allow ChatGPT integration with the Visual Intelligence feature, which would allow the AI chatbot to process the image data captured by the camera.

These third-party integrations require the user to give permissions on an opt-in basis.

The iPhone 16 and iPhone 16 Plus are available for pre-order starting at $799 and $899, respectively, but the Apple Intelligence features won’t arrive right away. Apple says some of its AI features will begin rolling out in beta next month, with more to come over the following several months.
