Your Samsung Galaxy S25 series phone can now see what you see and actually help

Samsung is rolling out a new visual AI feature to Galaxy S25 series users: real-time visual interaction powered by Google's Gemini Live. Starting today, Galaxy S25 owners can talk to their phones while showing them what they see, a far more natural and intuitive way to interact with AI. The feature arrives as a free update.
Instead of just typing or speaking commands, Gemini Live with camera and screen sharing lets you actually show your phone what you’re talking about. This can be really helpful for simple everyday tasks such as picking out a shirt that matches a certain pair of pants, sorting through a cluttered closet, or getting a second opinion while shopping online.
Samsung says this is all about making the phone feel more intuitive, more helpful, and just generally easier to live with day-to-day. And while AI features on phones are nothing new, being able to combine visual context with live conversation makes the Galaxy S25 genuinely stand out.
In terms of competition, this is where things get interesting. Apple’s Siri and Amazon’s Alexa are still very much voice-first and don’t offer anything close to this kind of real-time, camera-based interaction. Google’s own Pixel phones — which are deeply tied to Gemini — just added this feature today as well. So, for now at least, Samsung and Google are the first out of the gate with something genuinely different.
This all builds on the AI work Samsung’s already done with the S25, like its ProVisual Engine for smarter image processing and improved multi-tasking features. But Gemini Live feels like a step toward the kind of AI we’ve only seen in sci-fi movies, where your device understands your environment and helps in the moment, without needing a bunch of prompts or button presses.
"Together with Google, we are marking a bold step toward the future of mobile AI, delivering smarter interactions that are deeply in sync with how we live, work and communicate. With this new visual capability, Galaxy S25 series brings next-generation AI experiences to life, setting new standards for how users engage with the world through their devices," said Jay Kim, Executive Vice President and Head of Customer Experience Office, Mobile eXperience Business at Samsung Electronics.
It works like this:
- Press and hold the side button to activate Gemini Live
- Use your camera or share your screen
- Ask questions, get suggestions — all in real time
- No app-switching or typing needed
Looking back at the Galaxy S24, which was solid but mostly evolutionary, this new feature helps set the S25 apart in a big way. If Samsung and Google continue down this path, we might finally be getting smartphones that feel more like actual assistants, not just smarter phones.