Google's next-gen AI assistant could control your entire phone

Tired of barking commands at your AI assistant? Google's next-generation AI assistant can solve that problem for you and do a lot more.

 


AI Could Soon Take Control of Your Android Phone

Google has demonstrated its vision of a 'comprehensive AI assistant' that can understand the context around you, offer solutions, and take action on your behalf. The goal is an assistant that figures out on its own when it's needed and steps in without you having to summon it manually.

This new assistant is called Project Astra, and Google showed off some pretty impressive demos of what it can do at I/O 2025. In one demo, a user had trouble with her bike's brakes and asked Astra to find a bike manual online.

Once Astra found the manual, it was asked to scroll until it reached the section on the brakes, which it did perfectly. The user then asked Astra to find instructional videos on YouTube and even to contact a bicycle shop about the parts she needed. Astra called the nearest shop on her behalf and asked whether those parts were in stock.

 

The Verge also reported seeing a demo in which Bibo Xiu, a product manager on Google's DeepMind team, pointed her phone camera at a pair of Sony headphones and asked Astra to identify them. Astra said they were either the WH-1000XM4 or the WH-1000XM3, a mix-up most people would probably make themselves.

Once the headphones were identified, Xiu asked Astra to open a manual and explain how to pair them with her phone, then interrupted the assistant mid-sentence and asked it to pair the headphones for her instead. As you might expect, Astra complied without any problems.

From the demo, it looks like Astra navigates by simulating touch input. The screen-recording indicators also suggest that Astra reads what's on screen and decides where to go next as it works through different UIs to complete its task.
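To give a sense of what "simulating screen inputs" can look like on Android, here is a minimal, hypothetical Kotlin sketch of an AccessibilityService that can read the active window and inject a scroll gesture. Google hasn't said how Astra is actually implemented; the class name, coordinates, and logic below are illustrative assumptions, showing only the kind of platform capability an on-phone agent would need.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

// Hypothetical sketch, not Astra's actual implementation.
// An AccessibilityService can inspect the UI tree and dispatch gestures,
// which is roughly what "reading the screen and simulating input" requires.
class AgentInputService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        // An agent could inspect rootInActiveWindow here to "read the screen"
        // (e.g. look for a node whose text mentions "brakes") before acting.
    }

    override fun onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }

    // Simulate an upward swipe, which scrolls the visible content down.
    fun scrollDown() {
        val path = Path().apply {
            moveTo(540f, 1600f)   // start near the bottom of an assumed 1080x2400 screen
            lineTo(540f, 600f)    // drag upward over 300 ms
        }
        val gesture = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(path, 0, 300))
            .build()
        dispatchGesture(gesture, null, null)
    }
}
```

A service like this also has to be declared in the app manifest and explicitly switched on by the user in Accessibility settings, which lines up with the demo detail below: Xiu had to manually grant Astra access to her screen.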

Universal AI assistants are still a little way off

While impressive, these demos aren't perfect. They still require user input, and in the case of Xiu's demo, she had to manually enable a feature that allowed Astra to access her phone screen.

For now, Project Astra is more of a testing ground for Google's boldest AI ambitions. Features that work well here will eventually trickle down to tools like Gemini and be made available to users. Google says its ultimate vision is to 'turn the Gemini app into a full-blown AI assistant that performs everyday tasks for us.'

Google is working hard to gradually phase out older tools in favor of newer, AI-powered ones. AI Mode is replacing Google Search, and Gemini has an impressive list of features you should try.


However, even the most advanced AI systems today require you to enter a prompt at each step and supply the necessary data and context, and you may still need to take manual action yourself. Because Astra can access the internet and Google services, it aims to replace all of those inputs by pulling your information from a variety of platforms and building the context it needs to act.

That's not an easy goal to achieve, and a universal AI assistant like Astra also raises privacy and security concerns. Astra could, in principle, do the heavy lifting locally using the Gemini Nano model, but the demos showed no sign of that.

Building an assistant like this will take time, but with these demos, Google has given us a glimpse of the future. A truly universal AI assistant may not arrive tomorrow, but it feels closer than ever, and people are eagerly awaiting it.
