Aura Color AI — Completing the Color Journey
When we launched the Aura Color Explorer, it completely changed how people chose wraps online. For the first time, you could spin a car in real time, see light sweep across its curves, and explore every finish with physically accurate reflections. It was closer to a video game than an e-commerce tool — and it worked.
But there was always one thing people asked for: “Can I see it on my own car?”
Because even with all the high-fidelity renders and finished install photos, a color choice is still a leap of faith. Vinyl isn’t cheap, and if you’re not one hundred percent sure, it’s easy to close the tab and walk away. That hesitation — that moment between curiosity and commitment — became our next design problem.
The Solution
The answer turned out to be surprisingly simple. With the democratization of AI, platforms like Replicate made it possible to run advanced image models without the heavy infrastructure we’d needed even a year ago. We extended the Aura Color Explorer with Google’s Nano Banana model, letting users drag and drop a photo of their own car and instantly see it “rewrapped” in any Aura color.
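The front end of that interaction is deliberately simple: standard browser drag-and-drop, with the photo read into a data URL we can send along with the chosen swatch. A sketch of the idea (`sendForRecolor` is a stand-in for the app’s actual upload path):

```typescript
// Stand-in for the app's real upload path; shown here only for illustration.
declare function sendForRecolor(dataUrl: string): void;

// Wire up the drop zone: read the dropped photo into a data URL
// that can travel to the recolor call alongside the selected swatch.
const dropZone = document.querySelector<HTMLElement>("#drop-zone");

dropZone?.addEventListener("dragover", (event) => {
  event.preventDefault(); // required so the browser allows the drop
});

dropZone?.addEventListener("drop", (event) => {
  event.preventDefault();
  const file = event.dataTransfer?.files[0];
  if (!file || !file.type.startsWith("image/")) return;

  const reader = new FileReader();
  reader.onload = () => {
    // reader.result is a data URL like "data:image/jpeg;base64,..."
    sendForRecolor(reader.result as string);
  };
  reader.readAsDataURL(file);
});
```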

Behind the scenes, the process is a delicate handoff between two images: the user’s photo and the swatch they selected. We pass both into the model with carefully tuned instructions that tell it exactly what to do: replace the existing paint color with the new one, preserve every reflection and contour, and don’t touch anything else.
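In code, that handoff is only a few lines. Here’s a simplified sketch using Replicate’s Node client; the `prompt` and `image_input` fields follow the model’s schema on Replicate at the time of writing, and error handling is omitted:

```typescript
import Replicate from "replicate";

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });

// Two images in, one image out: the user's photo and the selected swatch,
// plus a single tightly scoped instruction.
async function recolorCar(carPhotoUrl: string, swatchUrl: string) {
  const output = await replicate.run("google/nano-banana", {
    input: {
      prompt:
        "Replace the car's paint color with the color shown in the second " +
        "image. Preserve every reflection, contour, shadow, and the " +
        "background. Do not change anything else.",
      // User photo first, Aura swatch second.
      image_input: [carPhotoUrl, swatchUrl],
    },
  });
  return output; // the recolored image (shape depends on client version)
}
```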
The result is eerily convincing. The AI doesn’t create a new car — it transforms your car, keeping your driveway, your reflections, your angle. That realism is what converts curiosity into confidence.
What We Learned
Our first prototypes went in the opposite direction. We used a generative model to recreate the car from scratch — analyzing the upload, describing it in text, then re-rendering the entire image in the new color. It was technically impressive, but something was missing. It wasn’t your car anymore, and that emotional connection vanished.
When we switched to Nano Banana, the realism came back, but it taught us something unexpected: less is more.

We originally over-described the transformation, feeding the model long, detailed prompts about body panels, highlights, and reflections. The model got confused: outputs came back over-processed, sometimes with added artifacts. The best results came from giving it less context and clearer intent. Just: “Replace this color with that color, and nothing else.”
That simplicity became the breakthrough.
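To make that concrete, here’s the shape of the two prompting styles, paraphrased rather than quoted verbatim:

```typescript
// Paraphrased examples of the two prompting styles, not our literal prompts.

// Where we started: narrating the physics to the model.
const overSpecified =
  "Repaint the hood, roof, doors, fenders, and bumpers in the target " +
  "color. Recompute specular highlights on every panel, preserve the " +
  "falloff along curved surfaces, keep the ambient reflections from the " +
  "environment, maintain panel gaps, trim, and badges...";

// Where we landed: clear intent, minimal context.
const minimal = "Replace this color with that color, and nothing else.";
```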
The Experience
Together, the three pillars (finished car photos, the 3D Color Explorer, and the personalized AI recolor) form a complete user journey. You start by exploring the full catalog, then see each finish under realistic lighting, then visualize it in your own world. It closes the last gap between imagination and reality, and for a decision as personal (and expensive) as wrapping a car, that’s everything.