
15 Sep, 2024

Apple's new "Visual Intelligence" feature, introduced at the iPhone 16 event, allows users to scan their surroundings with the iPhone camera to identify objects, copy details from posters, or look up information in real time.

While this tool is useful on the iPhone, its potential for augmented reality (AR) glasses is even greater.

Visual Intelligence could be a key component in future Apple AR glasses, enabling users to interact with the world hands-free, simply by looking at objects and asking questions. Similar to how Meta has explored AI-powered glasses, Apple's version would likely integrate with the iPhone, enhancing its functionality with apps and personal data.

Though Apple’s Vision Pro headset already offers a glimpse of its AR ambitions, it’s not suited for everyday use. Reports suggest true AR glasses may not arrive until 2027, but the Visual Intelligence feature appears to be a foundational step in Apple’s long-term vision.

By refining this technology now, Apple could be preparing for a future where AR glasses become a mainstream product, competing with rivals like Meta, Snap, and Google.