
12 Mar, 2024
1 min read

Users can now point Meta's smart glasses at landmarks and receive detailed descriptions, transforming the eyewear into a personal tour guide.

Meta's Chief Technology Officer, Andrew Bosworth, announced the new feature via Threads, highlighting its utility for travelers. Mark Zuckerberg further showcased the feature on Instagram, demonstrating how the glasses provide verbal descriptions of landmarks like Big Sky Mountain and the Roosevelt Arch.

This functionality enhances the user experience by offering contextual information about the wearer's surroundings.

The landmark identification feature was first previewed at Meta's Connect event last year, part of the company's initiative to introduce multimodal capabilities to its smart glasses. Powered by real-time data access and Bing Search integration, these glasses can now answer questions and provide information based on the user's surroundings.

While the feature is currently available only to a limited number of users in Meta's early access program, Bosworth said that more people will gain access soon.