Meta is reportedly bringing artificial intelligence (AI) to its Ray-Ban smart glasses starting next month. While the smart glasses already offer a Meta voice assistant, an update will let the assistant perform translations and identify objects, animals and monuments.

The New York Times has been testing these features in early access since last December. According to the NYT, after prompting the glasses with 'Hey Meta,' you can ask the AI to identify pets, fruits and monuments, and even translate languages. Translation supports English, Spanish, Italian, French and German.

However, the NYT says the feature didn't work 100 percent of the time: it couldn't identify zoo animals that were too far behind cages, and it failed to recognize an exotic fruit.

Meta will likely continue adding new features to the Ray-Ban smart glasses over time. However, if you want to join the early access program, you'll unfortunately need to be in the U.S.

Source: New York Times, The Verge