Google is rolling out a new AI feature for Gemini Live that lets it view your phone screen or use your camera feed in real time.
You can ask questions about whatever it sees, whether that's picking a paint colour or figuring out what's on your screen.
Gemini really woke up and chose context.
These updates are powered by “Project Astra,” which Google first showed off nearly a year ago.
They’re now available for Gemini Advanced users on the Google One AI Premium plan.
One Reddit user spotted the feature on their Xiaomi phone and shared a quick demo.
It shows Gemini reading the screen and responding instantly. Siri can only dream at this point.
Another feature now live is video mode, which lets you point your phone camera at something and ask questions on the spot.
Quick takes:
Gemini can now answer questions about your screen or camera view.
It’s included in the Google One AI Premium plan for Gemini Advanced users.
Google’s rolling this out ahead of Alexa and Siri’s next big updates.
Amazon and Apple are still working on similar upgrades for Alexa and Siri, but Google seems to be moving faster here.
On Samsung devices, Gemini remains the default assistant; Bixby still exists but isn’t getting the same attention.
Alexa, Siri… y’all still updating?