
15 Sep, 2024

Apple's new "Visual Intelligence" feature, introduced at the iPhone 16 event, lets users point the iPhone camera at their surroundings to identify objects, copy details from posters, or look up information in real time.

While this tool is useful on the iPhone, its potential for augmented reality (AR) glasses is even greater.

Visual Intelligence could be a key component in future Apple AR glasses, enabling users to interact with the world hands-free, simply by looking at objects and asking questions. Similar to how Meta has explored AI-powered glasses, Apple's version would likely integrate with the iPhone, enhancing its functionality with apps and personal data.

Though Apple’s Vision Pro headset already offers a glimpse of its AR ambitions, it’s not suited for everyday use. Reports suggest true AR glasses may not arrive until 2027, but the Visual Intelligence feature appears to be a foundational step in Apple’s long-term vision.

By refining this technology now, Apple could be preparing for a future where AR glasses become a mainstream product, competing with rivals like Meta, Snap, and Google.