
In 2017, Oenobiol wanted to create an unprecedented digital experience to promote its self-tanning range. The challenge was not to show a generic before/after: it was to allow each customer to see her own result, on her own face, before even purchasing the product.
The application had to run unattended on a tablet - no salesperson, no complex handling - in partner retail spaces. Navigation had to be intuitive through simple taps, the flow fluid, and the experience convincing enough to drive a purchase.
The core technical challenge: detect the user's natural skin tone, transform the pixels into a deeper hue simulating the effect of the dietary supplement, and present the result credibly and flatteringly - without artifacts, without discrepancies between the skin and the background.
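The production application was built for UWP (C#); purely as an illustration of the first step, detecting which pixels belong to skin, here is a minimal Python sketch using a classic rule-based approach in the YCbCr colour space. The BT.601 conversion is standard, but the Cb/Cr thresholds are illustrative textbook values, not the tuning actually shipped.

```python
def rgb_to_ycbcr(r, g, b):
    # ITU-R BT.601 full-range RGB -> YCbCr conversion
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    # Classic Cb/Cr "box" rule for skin detection;
    # thresholds are illustrative, not the production values
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173
```

Such a per-pixel rule is cheap enough to run in real time on a tablet; in practice it would be combined with the face-detection bounding box to exclude skin-coloured background objects.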
Design and develop a UWP (Universal Windows Platform) application deployed on tablet, allowing each user to simulate the visual effect of a natural tan on a photo of herself taken in real time. The application had to work in complete autonomy, with 100% touch navigation by tap, covering the entire journey: from photo capture to saving or sharing the transformed photo.
In parallel, integrate direct access to product information and a geolocated store locator to transform the experience into an immediate conversion lever.
Expected impact: Each personalized simulation creates an emotional projection. Proof through imagery accelerates the purchase decision.
The welcome screen features the Oenobiol/Mazarine key visual, the brand tagline and a single call-to-action: Try. One tap is enough to enter the experience.
Zero friction · 100% brand identity
The application activates the front camera and guides the user to position her face within the capture zone. A text prompt accompanies the framing.
Visual guidance → Optimal positioning → Automatically triggered capture
A visual scan animation confirms face detection. No action required: the algorithm automatically identifies the skin tone by matching it against a reference complexion palette.
Passive detection · Frictionless experience
Once the complexion is identified, the application applies the corresponding filter. The skin area pixels are transformed from their initial tone to a deeper, golden hue, simulating the progressive effect of the self-tanning range. The background, clothing and non-skin details remain unchanged.
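The filter described above can be sketched as a per-pixel blend: skin pixels are moved toward a deeper golden target while everything else passes through untouched. This Python sketch is an assumption-laden illustration; the target colour and blend strength are invented values, not the calibrated filter from the shipped UWP app.

```python
def apply_tan(pixel, is_skin_pixel, strength=0.35):
    """Blend a skin pixel toward a deeper golden target tone.

    Non-skin pixels (background, clothing) are returned unchanged,
    so only the detected skin area is transformed. The target colour
    and strength below are illustrative, not the production tuning.
    """
    if not is_skin_pixel:
        return pixel
    target = (188, 128, 82)  # hypothetical golden-tan reference
    return tuple(round(c + (t - c) * strength)
                 for c, t in zip(pixel, target))
```

Because the blend is a simple linear interpolation per channel, the whole frame can be processed in one pass, which is what makes the "under 2 seconds" rendering budget realistic on tablet hardware.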
Before: natural complexion → After: simulated natural tan · In under 2 seconds
A tap toggles between the original photo and the transformed photo. The user can switch back and forth at will to compare the complexion with the tan effect on and off.
Proof through comparison · A single gesture
From the filter screen, a tap opens the full product card: description, packshot visual, usage tips. A geolocated store locator then points to the nearest point of sale, via automatic geolocation or manual postal code entry.
From desire to purchase · Without leaving the app
The user can save her transformed photo to the device's gallery, or export an animated GIF showing the before/after transition.
Personal proof · User-generated brand content
Let's talk about your project and build together the solution that will make the difference.
Start the conversation