Artificial intelligence on Ray-Ban Meta glasses can now answer questions about what you see
Ray-Ban Meta glasses have launched an early access program for multimodal AI, letting you ask the assistant questions about what you're seeing.
Launched at Connect 2023, the Ray-Ban Meta Smart Glasses are the successor to 2021's Ray-Ban Stories: camera glasses that let you capture first-person photos and videos, answer calls, and listen to music hands-free. Compared to the original model, the new glasses have better camera quality, a superior microphone array, are waterproof, and can live stream to Instagram.
But their most important new feature is Meta AI. Currently available only in the US, it's a conversational assistant you can invoke by saying "Hey Meta." Meta AI is far more advanced than the current versions of Alexa, Siri, or Google Assistant because it is powered by Meta's Llama 2 large language model, the same kind of technology that underpins ChatGPT.
In the new early access program, Meta AI is no longer limited to voice input. It can now also answer questions about what you're looking at or about the last photo you took.
This multimodal feature has many potential uses, such as answering questions about what you're cooking, suggesting captions for photos, or even translating a poster or logo from another language.
The multimodal early access program is available to a "limited number" of owners in the U.S. and should roll out more widely next year.
Another upcoming update to Meta AI is real-time search. The assistant will automatically decide when to use Bing web searches to answer questions about current events, sports scores, and more. This live search feature will be "rolled out in phases" to all U.S. customers.
Meanwhile, there is still no official timeline for even the audio-only Meta AI to reach the 14 other countries where the glasses are sold.
Source: UploadVR