
MOUNTAIN VIEW, Calif.—At its recent I/O 2024 developer conference, Google introduced a series of updates across its Gemini family of models, including the new 1.5 Flash, a lightweight model built for speed and efficiency, and Project Astra, the company’s latest “vision for the future of AI assistants.” Building on Gemini, a family of AI models that can understand and generate text as well as images, audio, video, and code, Google developed prototype agents that it said can process information faster by continuously encoding video frames, combining the video and speech input into a single timeline of events, and caching that information for efficient recall.
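The pipeline Google describes—continuously encoded frames and speech merged into a time-ordered event log that can be recalled later—can be sketched at a very high level as follows. This is an illustrative assumption only; the class, method names, and string "encodings" below are hypothetical stand-ins, not Google's actual API or architecture.

```python
from dataclasses import dataclass, field
from bisect import insort

@dataclass(order=True)
class Event:
    """One moment on the timeline: a video frame or a speech utterance."""
    timestamp: float
    kind: str = field(compare=False)     # "frame" or "speech"
    payload: str = field(compare=False)  # stand-in for an encoded embedding

class TimelineCache:
    """Merges frame and speech events into one sorted, cached timeline."""

    def __init__(self):
        self._events = []  # kept sorted by timestamp via insort

    def add_frame(self, t: float, encoding: str) -> None:
        insort(self._events, Event(t, "frame", encoding))

    def add_speech(self, t: float, text: str) -> None:
        insort(self._events, Event(t, "speech", text))

    def recall(self, start: float, end: float) -> list:
        """Return cached events whose timestamps fall within [start, end]."""
        return [e for e in self._events if start <= e.timestamp <= end]

cache = TimelineCache()
cache.add_frame(0.0, "enc(frame_0)")
cache.add_speech(0.5, "where did I leave my glasses?")
cache.add_frame(1.0, "enc(frame_1)")

print([e.kind for e in cache.recall(0.0, 0.6)])  # → ['frame', 'speech']
```

Keeping the merged timeline sorted and cached is what allows an assistant to answer questions about something it saw moments ago without reprocessing the raw video.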

Project Astra, an advanced agent that can see its surroundings and converse in real time, has “the potential to optimize several everyday tasks,” according to the company. A potential wearable version of Project Astra would combine smart glasses with advanced AI capabilities for real-life assistance, the company said.

In a recent blog entry announcing the updates, Demis Hassabis, CEO of Google DeepMind, noted that “while we’ve made incredible progress developing AI systems that can understand multimodal information, getting response time down to something conversational is a difficult engineering challenge. Over the past few years, we've been working to improve how our models perceive, reason and converse to make the pace and quality of interaction feel more natural.”

About 10 years ago, Google introduced Google Glass, a wearable, voice- and motion-controlled Android device that resembled a pair of eyeglasses and displayed information directly in the user's field of vision. Google stopped production of the Google Glass Enterprise Edition as of March 15, 2023, and supported it until September 15, 2023, according to the Google website.

Google’s Astra announcement comes on the heels of news from Meta Platforms, Inc. and EssilorLuxottica, which have launched a series of new software features, updates to Meta AI, and a more diverse product assortment for the Ray-Ban Meta Smart Glasses, as VMAIL reported. Ray-Ban Meta frames now feature Meta AI with Vision functionality in beta in the U.S. and Canada, which allows users to do things like translate text on signs, identify objects in front of them, and write captions for photos. Video calling is also new to the smart glasses.

Google said it has not committed to a timeline for a potential launch of Project Astra.