Updated on July 24 at 9:30 a.m. ET: Mashable Tech Editor Timothy Beck Werth first tried the beta version of the Google Shopping “Try it on” feature in May, back when it initially became available for testing. As of this writing, Google is rolling the feature out to all users in the US on desktop and mobile devices. You can try this virtual Clueless closet for yourself inside Google Shopping now: just click on an apparel product and look for the “Try it on” button.
At Google I/O 2025, the tech company announced a ton of new AI features, and one of the most interesting is a virtual clothing try-on tool.
The Google Shopping “Try it on” feature lets users upload a photo of themselves and then virtually try on clothes, basically the IRL version of the Clueless closet millennials have been dreaming about since 1995. Or, as Mashable Shopping Reporter Haley Henschel put it, “Google’s latest shopping feature makes Cher Horowitz’s computerized closet a reality.”
Almost as soon as the feature launched, users started trying to “jailbreak” the tool, which is becoming a fun little tradition among tech writers whenever a new AI model or tool is released. On Friday, The Atlantic reported that “Google’s new AI shopping tool appears eager to give J.D. Vance breasts.” Hilarious, right? What’s less hilarious: the same tool can also generate breasts on photos of underage users, again per The Atlantic.
I decided to give the “Try it on” feature a test spin, and I’ll explore the good, the bad, and the mortifying below. As a shopping tool, I have to say I’m impressed.
How to use Google’s “Try it on” AI shopping tool
The virtual try-on feature is one of the free AI tools Google launched this week, and users can sign up to try it now. Officially, the product is part of Google Labs, where users can test experimental AI tools. Signing up is simple:
Sign in to your Google account
Head to Search Labs and click to turn the experiment on
Take a full-body photo of yourself and upload it
Navigate to Google Shopping and click a product you want to “try on”
Look for the “Try it on” button over the product image

The “Try it on” button appears over the product image.
Credit: Screenshot courtesy of Google
As a fashion tool, Google’s “Try it on” feature really works
Purely as a tool for trying on clothes, the new virtual try-on experience is pretty damn impressive. The tool uses a custom image generation model trained for fashion, per Google.
I’m always skeptical of new AI tools until I’ve tried them myself. I also care about my personal style and consider myself up to date on men’s fashion trends, so I wasn’t sure what to expect here. Still, the tool works as advertised. In a flashy I/O presentation, Google showed models seamlessly trying on one outfit after the next, and while the actual tool is a little slower (it takes about 15 seconds to generate an image), the real product experience is quite similar to the demo.
To show you what I mean, let’s compare some selfies I recently took on a trip to Banana Republic here in New York City to the AI photos Google generated for the same clothes. For reference, here’s the original photo I uploaded (and keep in mind that I’m a Tech Editor, not a fashion model):

The photo I used to virtually try on clothes.
Credit: Timothy Beck Werth / Mashable
In this first photo, I’m wearing a blue cashmere polo, and the AI image looks more or less like the real one taken in the Banana Republic dressing room:

Trying on a blue polo…
Credit: Timothy Beck Werth / Mashable

And here’s how Google imagined the same shirt. AI-generated image.
Credit: Timothy Beck Werth / Mashable
I found the AI shopping tool came pretty close to capturing the overall fit and style of the shirts. It even changed my pants and shoes to better match the product. If anything, the virtual try-on tool errs on the side of making me look slimmer than I am IRL.

I ended up buying this one.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable

Yeah, I bought this one, too.
Credit: Timothy Beck Werth / Mashable

AI-generated image.
Credit: Timothy Beck Werth / Mashable
In this photo, Google added a necklace around my neck that I would never wear in real life, and the AI-generated shirt is a bit more slim-cut than it’s supposed to be, but the overall style is generally accurate.

I decided this isn’t my style.
Credit: Timothy Beck Werth / Mashable

Neither is the imaginary necklace, watch, and matching white sneakers.
Credit: Timothy Beck Werth / Mashable
While the images are generating, you see a message that says: “AI images may include errors. Fit and look won’t be exact.”
But for an experimental tool, it’s surprisingly on point. People have been hoping for a tool like this for decades, and thanks to the age of artificial intelligence, we finally have one.
Of course, not all of the errors this tool makes are so flattering…
Google also removed my shirt and imagined my chest hair
Here’s where things get interesting. In The Atlantic piece I mentioned above, the reporters found that if you asked the tool to generate an image of a revealing dress or top, it would sometimes add or enlarge breasts in the original photo. That’s particularly likely to happen with women’s clothing, for reasons that should be obvious.
When I used the tool with a pink midi dress, the results were mortifyingly accurate. I’d bet that’s pretty much exactly what I would look like wearing that particular low-cut midi dress.
I’ll spare you the actual image, but to picture me in the dress, Google had to digitally remove most of my shirt and render my chest hair. Again, I’m struck by how accurate the results were. Now, when I “tried on” a pink women’s sweater, Google did give me some extra padding in the chest area, but I’ve also been open about the fact that that’s not entirely Google’s fault in my case. Thankfully, the feature was not available for lingerie.
What can Google do about these problems? I’m not sure. Men have every right to wear cute pink midi dresses, and Google can hardly prohibit users from trying on cross-gender clothing. I wouldn’t be surprised if Google eventually removes the tool from any product that shows too much skin. While The Atlantic criticizes Google for altering photos of its writers taken when they were underage, those writers were the ones who uploaded the photos, in violation of Google’s own safety policies. And I suspect the results would be much the same with virtually any AI image generator.
In a statement to Mashable, a Google spokesperson said, “We have strong protections, including blocking sensitive apparel categories and preventing the upload of images of clearly identifiable minors. As with all image generation, it won’t always get it right, and we’ll continue to improve the experience in Labs.”
Could people abuse the virtual try-on tool to cyberbully their peers or create deepfakes of celebrities? Theoretically, yes. But that’s a problem inherent to AI in general, not this specific tool.
In its safety guidelines for this product, Google bans two categories of images, in addition to its general AI content guidelines:
“Adult-oriented content, child sexual abuse imagery, non-consensual sexual content, and sexually explicit content.”
“Inappropriate content such as dangerous, derogatory, or shocking.”
Again, you can try out the tool at Google Search Labs.


