Google’s Virtual Try-On has become a full-fledged AI-powered shopping feature, now spanning both apparel and beauty.
By blending generative AI with augmented reality, it lets shoppers preview products on themselves or diverse model types and experiment with different looks, all before making a purchase.
In this article, we’ll break down how it works, what categories it supports, the brands involved, and what it means for the future of SEO.
TL;DR
- Google’s Virtual Try-On uses AI and AR to let users preview apparel and cosmetics directly in search
- Clothing try-on is powered by image generation models that render items on diverse real models or your own uploaded photo
- Beauty try-on uses real-time AR overlays for lipstick, foundation, and full makeup looks via your phone’s camera
- The system showcases inclusive models covering a wide range of sizes, skin tones, and hair types, so shoppers can pick a model who looks like them
- Try-on is embedded in Google Search and Shopping, marked by “Try On” badges across supported products
- Fashion retailers like H&M, Levi’s, and Everlane and beauty giants like MAC, Fenty, and Dior are already on board
- Shoppers get better sizing confidence, leading to fewer returns, higher conversion, and a smoother experience
- For retailers, it’s a new path to discovery that boosts engagement and aligns with visual, immersive search trends
What Is Google Virtual Try-On?
Google Virtual Try-On is an augmented reality (AR) shopping component built right into Google Search and Shopping.
It lets users preview how cosmetics and clothing look in real life, without ever stepping into a store.
From swiping through lipstick shades to finding your perfect foundation match, or seeing how a jacket fits on different people, this tool bridges the gap between static product pages and the fitting room.
Powered by Google’s AR tech, it brings a hands-on experience to online shopping, helping users try before they buy, all with just a search.
How the Technology Works
Powered by AI and AR
So, how does Google’s Virtual Try-On tool actually work?
Google’s system uses a custom diffusion-based image generation model trained on pairs of photos showing people in different poses wearing the same garment.
The AI learns how fabric drapes, folds, stretches, and casts shadows across different human body types, and with just one product image, it can generate a realistic render of that item on a person.
Initially, Google used real model photos instead of avatars; since 2025, users have been able to upload a full-body photo and see the clothes virtually applied to themselves.
This system is the first at Google’s scale to handle billions of clothing items from the Shopping Graph.
It doesn’t rely on creating a 3D avatar; a single photo is enough for the AI to do its work. That simplicity is what makes the feature both scalable and user-friendly.
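To make the idea concrete, here is a deliberately tiny, hypothetical sketch of what garment-conditioned diffusion sampling does at a high level: start from noise and iteratively denoise toward an image that keeps the person’s pixels where the garment doesn’t cover and the garment’s pixels where it does. This is a toy illustration, not Google’s model; the function name, the linear schedule, and the blending shortcut are all assumptions for clarity.

```python
import random

def toy_tryon_diffusion(person_img, garment_img, steps=10, seed=0):
    """Toy sketch of garment-conditioned denoising (NOT Google's model).

    A real try-on diffusion model learns to denoise a latent image while
    attending to a garment image; here we fake that with a weighted blend
    that "denoises" random noise toward a composite of person and garment.
    """
    rng = random.Random(seed)
    # The composite a trained model would converge to: person pixels where
    # the garment is absent (None), garment pixels where it covers the body.
    target = [g if g is not None else p for p, g in zip(person_img, garment_img)]
    # Start from pure noise, as diffusion sampling does.
    x = [rng.random() for _ in range(len(person_img))]
    for t in range(steps):
        alpha = (t + 1) / steps  # simple linear denoising schedule
        x = [(1 - alpha) * xi + alpha * ti for xi, ti in zip(x, target)]
    return x

person = [0.2, 0.4, 0.6, 0.8]     # flattened grayscale "photo"
garment = [None, 0.9, 0.9, None]  # garment covers the middle pixels
out = toy_tryon_diffusion(person, garment)
print([round(v, 2) for v in out])  # → [0.2, 0.9, 0.9, 0.8]
```

The point of the sketch is the conditioning: the single garment image steers every denoising step, which is why no 3D avatar or body scan is needed.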
When it comes to cosmetics and hair, Google relies on AR overlays powered by partners like Perfect Corp and PulpoAR.
Using segmentation models, the system detects facial details (like lips, eyes, skin, and hair) and layers on makeup in real time through your phone’s camera.
For example, searching “soft glam makeup” might surface a curated set of products, and pressing the “See the look on you” button lets you preview them live on your face.
Source: Google
The AR system can combine multiple products at once – lipstick, foundation, mascara, and more – to preview a full look in seconds.
It’s fast and intuitive, making the discovery experience feel like trying on makeup in-store.
And since it works directly in search results, there’s no app-hopping required.
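The compositing step behind these AR overlays can be sketched simply: once a segmentation model has flagged which pixels belong to the lips, the product color is alpha-blended into just those pixels, every camera frame. The snippet below is a minimal, hypothetical illustration of that blend using plain Python lists; the function name, mask format, and opacity value are assumptions, and a real pipeline would run a neural segmentation model on GPU.

```python
def apply_lip_color(frame, lip_mask, lip_rgb, opacity=0.6):
    """Blend a lipstick color into a frame wherever the lip mask is set.

    frame:    list of (r, g, b) pixels
    lip_mask: list of 0/1 flags from a (hypothetical) segmentation model
    This is the per-frame compositing step an AR try-on system performs.
    """
    out = []
    for (r, g, b), m in zip(frame, lip_mask):
        if m:  # only tint pixels the segmenter labeled as lips
            r = round((1 - opacity) * r + opacity * lip_rgb[0])
            g = round((1 - opacity) * g + opacity * lip_rgb[1])
            b = round((1 - opacity) * b + opacity * lip_rgb[2])
        out.append((r, g, b))
    return out

frame = [(200, 150, 140), (190, 120, 110), (60, 60, 60)]
mask = [1, 1, 0]  # segmenter says the first two pixels are lips
tinted = apply_lip_color(frame, mask, lip_rgb=(180, 30, 60))
print(tinted)  # → [(188, 78, 92), (184, 66, 80), (60, 60, 60)]
```

Combining multiple products (foundation, blush, mascara) is just this same blend repeated with different masks and colors, which is why a full look renders in seconds.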
Use of Diverse Body Models
When Google introduced Virtual Try-On for apparel, a key focus was diversity and inclusivity.
Many online shoppers struggle to find models that look like them. Google cited that 42% don’t feel represented by model images, and over half have been disappointed by an online clothing purchase because the way the fabric fell over the body looked different in person.
To address this, Google built a virtual model library representing a wide range of body shapes, sizes, and skin tones.
Range of Sizes
Google’s model set spans from XXS through 4XL in women’s sizing.
Source: Google
In practice, that meant using around 80 real models of various shapes and body types, from petite to plus-size.
An official Google image showed a grid of 80 different models, all dressed in identical outfits, to demonstrate the breadth of representation.
Skin Tones and Ethnic Diversity
The models were chosen to reflect different ethnic backgrounds and skin colors.
Google even used the Monk Skin Tone (MST) scale (a 10-skin-tone scale developed to evaluate AI fairness) as a guide to make sure a wide spectrum of skin tones is included.
For shoppers, this means if you have a darker complexion, you’re not stuck only seeing an outfit on a light-skinned model (or vice versa). You can pick a model with a skin tone similar to yours, giving you a more realistic idea of how the colors will look against your complexion.
In foundation try-on, Google took this further by providing 148 model examples covering various skin tones and ages, ensuring that when you’re trying to match a foundation shade, you can see it on a model that closely matches your complexion.
Hair
The “diverse set of real models” also vary in hair type.
Source: Google
Hair can affect how an outfit appears (necklines, for instance), and including models with different hairstyles, from natural curls to shaved heads, adds to the relatability.
It also helps the AR hair color feature, which required models with different base hair colors and textures.
Step-by-Step: How to Use Google’s Try-On Feature
Wondering how to use virtual try-on in Google Shopping?
Google has baked Virtual Try-On directly into mobile and desktop Search, especially within the Shopping tab.
When you search for supported clothing or beauty items, you’ll see a “Try On” badge on product listings, signaling that VTO is available.
For example, a query like “red lipstick” or “women’s jeans” may return results with a prompt to try before you buy.
These badges make it easy to spot which items support try-on, helping users skip the guesswork.
And because they appear directly in the product carousel or snippet, the entry point feels seamless.
Just hit the badge to launch the experience.
Tapping a product with the try-on badge opens Google’s dedicated interface. For clothing, you’ll be prompted to either choose “Select a model” or “Try it on” using a photo.
In the model-based flow, you scroll through dozens of real images covering a full range of sizes and skin tones, then pick the one closest to your look.
Source: Google
If you’ve opted into Search Labs, you’ll see a newer option: “Try on with your photo.”
Upload a photo with your full-body image (standing, front-facing, in fitted clothes), and within seconds, Google renders that item on your image. It’s clean, quick, and surprisingly realistic.
For cosmetics, pressing the “Try it on” icon activates AR mode via your phone’s front camera. You’ll see your face with the selected lipstick or blush applied live, and can swipe through a carousel of shades or styles.
Don’t want to use your own image?
Source: Google
You can switch to model previews instead, sorted by skin tone and face shape.
For multi-product looks, like celebrity-inspired makeup styles, the same flow applies.
You tap “See the look on you,” and Google overlays the entire set at once. It’s like a makeover, minus the makeup remover.
Clothing try-ons work on both desktop and mobile, since they rely on generated images – no special tech needed. But AR beauty try-ons are mobile-only, requiring a front-facing camera through Google Search on Android or iOS.
There’s no app to install, and no profile setup required.
As of 2025, the most advanced capabilities, like personal photo upload, are available only in the U.S. through Search Labs.
But Google’s expansion track suggests broader rollout is on the horizon.
Brands That Support Virtual Try-On
So, which brands support Google virtual try-on?
As of 2025, a growing list of fashion and beauty retailers have integrated their products into the experience, and it’s not just the usual suspects.
On the fashion side, Google launched apparel try-ons with partners like Anthropologie, Everlane, H&M, and LOFT for women’s tops.
Since then, more brands have joined the party, including Levi’s, Abercrombie & Fitch, Staud, Pistola Denim, and even athletic labels like Adidas.
One tap, and you can see how an Abercrombie sweater fits different body shapes and sizes or how a pair of Levi’s jeans looks in motion.
Beauty brands are just as invested.
Google now supports 50+ cosmetics labels through AR-powered try-on tools.
Source: Google
That includes heavy hitters like MAC, Revlon, CoverGirl, Dior, Fenty Beauty, and more, all offering digital previews of their products.
Benefits of Virtual Try-On for Shoppers
The benefits of AR in e-commerce?
A lot more confident shoppers.
Virtual try-on gives people a far better sense of how different materials will look on them – from how a dress moves on a body like theirs to whether a foundation shade actually matches their skin tone.
That leads to the next major perk: fewer returns and less buyer’s remorse.
When you can see a jacket on your frame or lipstick on your lips, you’re far less likely to play the buy-two-sizes-and-send-one-back game.
Some experts believe tools like Google’s virtual try-on could seriously cut down on e-commerce return rates, simply by setting clearer expectations upfront.
It’s also more efficient. You save time, as well as save money, by avoiding the trial-and-error approach. No need to order five lipsticks to find the one that works.
Ready to future-proof your e-commerce store for Google’s AI-powered Shopping Graph?
Our team can help. Explore our SEO services built for visual, immersive, and AI-driven product discovery.
Why It’s a Game-Changer for Retailers
For retailers, Google’s Virtual Try-On isn’t just a shiny new feature. It’s a powerful tool for engagement and conversions.
By letting users preview products right inside Google’s search results, brands can connect with high-intent shoppers in the moment. No extra clicks, no friction.
And there’s data to back it up. One study by Shopify found that AR-powered ads outperformed static ads by 94% on average.
Shoppers are also 41% more likely to consider a product with a try-on option, and nearly three-quarters say they’d pay more after experiencing a product through AR.
That’s not just an increase in engagement. That’s a serious boost in purchase intent and return on ad spend.
Confident shoppers are more likely to click “buy,” and when they’ve interacted with the product through AR, they’re less likely to abandon their cart or bounce early.
Google is also integrating this with its Shopping Graph (now over 50 billion product listings) and testing things like AI-powered personal shoppers and agentic checkout, where users can complete purchases directly in the SERP using Google Pay.
This means the entire path from discovery to purchase is getting compressed, and retailers that don’t adapt risk missing out.
To keep up, retailers need to optimize for this new virtual shopping capability. That means high-quality product imagery, AR-ready assets, and a well-fed product feed.
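A concrete starting point for a well-fed product feed is Product structured data on every product page. The snippet below builds an illustrative JSON-LD block; the schema.org property names (`name`, `image`, `sku`, `brand`, `offers`) are real, but the product values, URLs, and SKU are hypothetical examples.

```python
import json

# Hypothetical example of the Product structured data a retailer might
# expose to Google's Shopping Graph. Values are illustrative; the
# schema.org property names are real.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Slim-Fit Denim Jacket",
    "image": [
        "https://example.com/jacket-front.jpg",
        "https://example.com/jacket-back.jpg",
    ],
    "description": "Mid-wash slim-fit denim jacket.",
    "sku": "DJ-1042",
    "brand": {"@type": "Brand", "name": "ExampleBrand"},
    "offers": {
        "@type": "Offer",
        "price": "89.00",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Serialize for a <script type="application/ld+json"> tag on the page.
print(json.dumps(product_jsonld, indent=2))
```

High-resolution, front-facing imagery matters just as much here: the same product images that populate this markup are what try-on features render from.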
Want to see how your current product pages stack up for AR and visual search?
Try our Free SEO Tools, from structured data checkers to product schema audits, and start optimizing for Google’s next-gen shopping experience.
Limitations and Challenges
Is Google’s Virtual Try-On accurate? The short answer: It’s impressively realistic, but not 100% perfect.
At its core, it’s still an approximation.
You’re looking at a generated preview, not the real thing. So while it shows how a dress might drape over the human body or how a lipstick could look, it won’t capture every nuance.
And subtle details, like undertones in makeup, may not show up perfectly on screen, especially without good lighting.
So while it’s a helpful guide, it shouldn’t be treated as a flawless predictor.
You might still experience slight differences in fit or color once the product arrives.
What’s Next for AR in E-commerce?
The future of AR shopping? It’s only just getting started.
Google is already expanding virtual try-ons beyond tops and makeup. Think full outfits, fitted clothing, footwear, accessories, and men’s clothing, which was slated for rollout shortly after the initial launch.
Soon, you might see how jeans fit your waist, how a coat buttons up, which outfits look best for rainy weather, or how a pair of trainers looks in motion, all before buying.
There’s also a clear push toward deeper personalization.
The line between digital and physical shopping is blurring fast, and Google is building the infrastructure to support that shift.
For e-commerce marketers and brand teams, this means rethinking what visibility looks like.
This isn’t just about blue links anymore. It’s about being ready for a world where product discovery happens visually and interactively, everywhere.
For marketers and brand strategists, now is the time to prepare for this AR-driven e-commerce wave.
Success will require a mindset shift toward what we are calling “Search Everywhere Optimization,” ensuring your products are optimized not just for traditional search results but for visual and interactive discovery across platforms in the coming months and years.
Want the full roadmap for AR, AI, and visual search success?
Download our Search Everywhere Optimization guide. It’s packed with insights on how to stay visible across every platform consumers use to discover products.