Google shows off HUD-enabled multimodal AI smart glasses at I/O 2024

Google showed a short demo of HUD-enabled AI smart glasses at I/O 2024.

The pre-recorded demo was shown as part of Google's announcement of Project Astra, a real-time natively multimodal AI assistant in development that can remember and reason about everything it sees and hears, unlike current multimodal AI systems, which only capture an image when asked about what they see. Google says Astra works by "continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall."
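To make that description concrete, here is a minimal sketch of what such a pipeline could look like. This is purely illustrative, assuming a vision encoder and a speech transcriber feeding one shared cache; the class names, stubs, and retrieval logic are hypothetical, not Google's implementation.

```python
# Illustrative sketch of the pipeline Google describes for Astra:
# continuously encode video frames, merge video and speech input into a
# single timeline of events, and cache that timeline for efficient recall.
# All names here are hypothetical.

import time
from collections import deque
from dataclasses import dataclass, field


@dataclass(order=True)
class Event:
    timestamp: float
    kind: str = field(compare=False)        # "frame" or "speech"
    payload: object = field(compare=False)  # frame embedding or transcript text


class EventTimeline:
    """Rolling cache that interleaves encoded frames and speech by time."""

    def __init__(self, max_events: int = 10_000):
        # Bounded deque: old events fall off as new ones arrive.
        self.events: deque = deque(maxlen=max_events)

    def add_frame(self, embedding) -> None:
        # In a real system, `embedding` would come from a vision encoder
        # run continuously over the camera stream.
        self.events.append(Event(time.time(), "frame", embedding))

    def add_speech(self, transcript: str) -> None:
        # Speech is transcribed and interleaved into the same timeline,
        # so video and audio share one temporal index.
        self.events.append(Event(time.time(), "speech", transcript))

    def recall(self, since_seconds: float) -> list:
        """Retrieve everything seen or heard within a recent window."""
        cutoff = time.time() - since_seconds
        return [e for e in self.events if e.timestamp >= cutoff]
```

Because both modalities land in one timestamp-ordered cache, a question like "where did I leave my glasses?" becomes a retrieval over recent events rather than a one-off image capture, which is the distinction the Astra announcement emphasizes.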

The Project Astra demo video starts on a smartphone, but halfway through the user picks up and puts on a pair of bulky smart glasses.

The smart glasses feature a fixed heads-up display (HUD), where blue audio-input indicators show when the wearer is speaking and white text shows the AI's response.

Google DeepMind CEO Demis Hassabis said that "exciting new form factors like glasses" are "easy to envision" as the ultimate goal of Project Astra, but made no specific product announcement; near the end of the video, a disclaimer appears with the tagline "prototype shown."

Not mentioned in the Google I/O 2024 keynote was Android XR, the company's spatial computing platform for Samsung's upcoming head-mounted display. Google may be waiting to let Samsung make the announcement later this year.

According to reports, Meta plans to introduce a HUD in its next-generation Ray-Ban Meta smart glasses in 2025. Will Google compete with it directly, or does it plan to make Project Astra available to third-party hardware makers?

Source: UploadVR
