"Hey Meta!" That's all you need to say to summon your very own virtual assistant with Meta's Ray-Ban smart glasses.
This isn't your average virtual assistant, though: it can see and hear what's happening around you, making for a far more immersive experience.
Meta is now rolling out its multimodal AI features in an early access test. These features allow the AI assistant to provide information about things it can see and hear through the glasses' camera and microphones. Imagine having a personal fashion advisor, language translator, and tour guide all in one - that's what these smart glasses offer.
In a demonstration, Mark Zuckerberg asked the glasses to suggest trousers that would match a shirt he was holding. The AI assistant described the shirt and offered a couple of suggestions for complementary trousers. It also translated text and provided image captions.
Zuckerberg envisions people interacting with the Meta AI assistant throughout the day, asking questions about their surroundings or what they're looking at. It's like having a knowledgeable friend with you at all times, ready to provide insights and information.
The AI assistant can also help caption photos, translate text, and summarise information - features seen in other products from tech giants like Microsoft and Google.
This test period is currently limited to a small number of people in the US who opt in. So, if you fancy having your own personal AI assistant, why not give it a try? It could revolutionise the way you interact with the world.