Siri Integrates ChatGPT for Onscreen Awareness on Supported iPhones

Siri can now extract and process onscreen information with ChatGPT on supported iPhones, enhancing functionality.

Apple is evolving Siri by integrating ChatGPT to enable onscreen awareness on supported iPhones. This allows Siri to extract information from whatever is currently on the display, including webpages, photos, and live camera feeds. While a fully integrated onscreen awareness feature is expected to arrive with iOS 18.4, iPhones that support Apple Intelligence can already leverage this ChatGPT-based functionality.

Siri’s Onscreen Awareness: A Game-Changing Upgrade

Apple announced onscreen awareness during the unveiling of iOS 18. The feature enables Siri to interact with onscreen content and perform actions across different apps. For instance, if a contact sends a new address, Siri can add it to their contact card directly. However, this deeper inter-app interaction is slated for iOS 18.4’s release next year.

Currently, Siri’s ability to access onscreen information relies on ChatGPT, available on iPhone 15 Pro, iPhone 15 Pro Max, and upcoming iPhone 16 models. By integrating ChatGPT, Siri can summarize content, analyze photos, or extract hidden webpage details with simple commands.

How to Use Siri with ChatGPT for Onscreen Content

To utilize this feature, users must own a compatible iPhone. Activating Siri and saying phrases like “Summarize the information on this screen” or “Describe what’s on my screen” prompts Siri to ask permission to send a screenshot to ChatGPT. Once approved, ChatGPT processes the image and generates a response that Siri reads aloud.

This functionality extends beyond text. Users viewing photos or live images can ask Siri questions about the content. For example, pointing the Camera app at an animal and asking “What animal is this?” triggers a ChatGPT analysis. Similarly, users can request webpage content that isn’t currently visible, such as restaurant opening hours, without scrolling.

Key Use Cases for Siri’s Onscreen Awareness

  1. Webpage Extraction: Users can extract information from full webpages, including non-visible sections, by choosing “Full Content” when prompted.
  2. Photo Analysis: Siri can identify elements in photos, such as animal species or object names, by sending images to ChatGPT.
  3. Text Summaries: Lengthy documents or web articles can be summarized efficiently by ChatGPT through Siri.
  4. Live Camera Queries: iPhone 15 Pro and 15 Pro Max owners can use the Camera app to analyze real-world objects in real-time.

Limitations and Future Enhancements

While the current integration is powerful, some limitations remain. Because true onscreen interactivity is not yet available, users must manually copy any extracted information into other apps. Actions like adding an onscreen address directly to Contacts will only become possible with iOS 18.4’s full rollout.

Privacy Considerations

All queries handled by ChatGPT are processed in the cloud, not on the device. Users are advised to avoid sharing sensitive content onscreen or through photos when using this feature.

With the integration of ChatGPT, Siri offers a capable interim form of onscreen awareness and significant time-saving tools. As Apple refines the feature with iOS 18.4, further improvements will deepen app interoperability and streamline the user experience.
