This Is What Surprised Me Most About the Google I/O AI Shopping Feature

An AI fashion tool was not on my Google I/O bucket list. At Google's annual I/O developer conference, the company introduced a number of Gemini AI updates, many of which are coming to Search and, notably, to our online shopping experiences. The new AI shopping feature lets you virtually "try on" different articles of clothing: upload a photo of your body, and the AI imagines what a garment might look like on you.

Google built a custom image-generation model to power the feature. The idea is simple: Google's AI takes an input image of your body and an input image of the garment and combines them. The actual process behind it is surely more complicated, but in the live demo it seemed to work flawlessly. The virtual try-on feature is available today in the US, with more visual shopping and agentic AI updates coming soon.
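Google hasn't published technical details of the try-on model or a developer API for it, so purely as an illustration of the "two images in, one image out" idea, here is a minimal Python sketch that composites a garment cutout onto a person photo with Pillow. The file names, the paste coordinates and the naive_try_on helper are all hypothetical; Google's custom image-generation model effectively redraws the garment to fit the body rather than pasting it on top.

from PIL import Image

def naive_try_on(person_path, garment_path, box):
    """Paste a garment cutout (PNG with transparency) onto a person photo inside box=(left, top, right, bottom)."""
    person = Image.open(person_path).convert("RGBA")
    garment = Image.open(garment_path).convert("RGBA")
    # Resize the garment to the target region (for example, the torso).
    left, top, right, bottom = box
    garment = garment.resize((right - left, bottom - top))
    # Alpha-composite: transparent garment pixels leave the photo untouched.
    person.paste(garment, (left, top), mask=garment)
    return person

# Hypothetical file names and torso coordinates, for illustration only.
preview = naive_try_on("me.jpg", "shirt_cutout.png", box=(120, 200, 380, 520))
preview.convert("RGB").save("try_on_preview.jpg")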

I was very intrigued when I saw the live demo. I shop online for nearly everything I need, and I have been burned many times by misjudging how clothing I see on models would look on me. But I'm an AI reporter, and I spend a lot of time worrying about the privacy implications of image and video tools, so I was skeptical, too.

Read more: Everything Announced at Google I/O 2025

I contacted Google after the keynote to ask about the privacy policies around this new feature. A Google spokesperson said, “Your uploaded photo is never used beyond trying things on virtually, nor is your photo used for training purposes. It is not shared with other Google products, services or third parties, and you can delete or replace it at any time.” 

This was a jaw-dropping, but pleasant, surprise. In the age of AI, tech companies are typically so data-hungry that a source of data like this seemed like a no-brainer for Google to use. Google spent a decent chunk of time during I/O showing off its new AI image and video tools, and human-generated photos like these would be useful for future model improvements.

Tech and fashion companies have been working on this problem for years. My CNET colleague Katie Collins wrote about a dress-sizing app all the way back in 2012, and Amazon has integrated AI into its fashion sales in recent years, too.

This fashion model, as Vidhya Srinivasan, Google's vice president and general manager of ads and commerce, called it during the keynote, has "a deep understanding of the human body." We'll have to test it to see whether it really works for all body types and sizes; AI image generators, especially early ones from Google, haven't always handled diversity well. But I'm willing to give Google a chance, since it says it won't use my pictures to train its AI models.

As much as I want to believe Google has given us a 2025 version of Cher's closet from Clueless, I'm still a little skeptical. There's no guarantee that the AI version of yourself Google generates will actually reflect how the clothing looks on you in real life. But maybe this is a genuinely good use of AI, instead of one that fills the internet with slop.




