Maybelline: The First Mobile Beauty Assistant for Drugstore Shopping

Shade Confidence, Powered by Early Mobile Innovation. Built in the first wave of smartphone retail, this experience helped establish the behaviors shoppers now expect from beauty tech.

Client: Maybelline
Platform: iOS Mobile Application
Focus: Retail Innovation + Proprietary Tech
Launch: Canada — #2 App Store
Client Context

Maybelline operates at mass scale, where the drugstore shelf is the primary point of purchase—and also the biggest friction point. Hundreds of foundations, no ability to swatch, no guidance. A consumer alone in an aisle making a $20 decision with no information.

The objective was to make shade selection feel as confident as a department-store counter: personalized, informative, and low-risk—inside the exact retail environment where that confidence had never existed before.

The Challenge

The "wall of packaging" problem.

The experience had to deliver real guidance in real retail conditions — variable lighting, fast adoption, zero learning curve.

01 · Mismatch Risk

No Way to Swatch

Drugstore shoppers make shade decisions with only packaging as a guide. The wrong foundation means a wasted purchase and eroded brand trust — and it happens constantly at mass retail scale.

Core tension: High purchase risk, zero validation
02 · Decision Stress

Hundreds of Options, Zero Guidance

Faced with an entire wall of foundations and no expert in sight, shoppers default to familiar choices or walk away entirely. The opportunity to win a new customer disappears in that aisle moment.

Core tension: Complexity kills conversion
03 · Brand Trust

The Shelf Needed to Become a Guide

Maybelline needed to deliver a department-store quality experience inside a drugstore aisle — personalized, accurate guidance at the exact moment of purchase, powered entirely by the device in a shopper's hand.

Core tension: Brand equity won or lost in the aisle
04 · Tech Reality

Built for Early Mobile Hardware

This was the first wave of smartphone retail. Variable store lighting, limited processing power, inconsistent cameras. Collab engineered a solution purpose-built for real-world retail conditions — and it worked.

Core tension: Emerging tech, real-world performance

Collab engineered a simulated retail environment to study the conditions of imperfect light.

Maybelline Shadefinder campaign
Collab's Integrated Approach

Strategy, experience design, and technology in lockstep.

Collab operated as one integrated team across strategy, design, and engineering. The Shadefinder engine and the consumer experience were designed together from day one — ensuring the algorithm was built around real user behavior and the precise realities of in-store retail.

Our technology team spent a year building and studying simulated retail environments — modeling the conditions of imperfect light, variable cameras, and real shopper behavior before a single line of consumer-facing code was written. That foundation is what made Shadefinder perform.

Maybelline Makeup Express mobile application by Collab Studio
Capabilities Deployed
Product + experience strategy for retail-to-mobile behavior change
UX/UI design for guided in-store decision flow
Mobile application design and engineering
Computer vision R&D for shade analysis
Augmented reality framework for in-store content
Content system design
Prototype-to-launch coordination
Proprietary technology development (Shadefinder)
The App

Beauty Directory, Lessons, Backstage content and more — all under one roof.

Makeup Express app — home screen and navigation
Home — Beauty Directory, Lessons, Shadefinder & Backstage
Makeup Express app — beauty directory and products
Beauty Directory — Products, Tools & Makeup Looks
Makeup Express app — backstage content and videos
Backstage — Trend Reports, Videos & Talent Interviews

Makeup Express — App Overview

The Outcome

Makeup Express — the beauty industry's first app of its kind.

Shadefinder — Proprietary Core Technology

Collab built a proprietary image processing and color analysis engine from scratch. Shadefinder analyzed skin tone through the phone camera and matched it to the right Maybelline foundation—designed to grow smarter over time using AI. A year of color science R&D went into making it work reliably in variable retail lighting.

In-Store AR Content Activation

The app used augmented reality to trigger rich content experiences via in-store signage throughout the retailer. Shoppers could point their phone at shelf displays to unlock makeup tips, application guides, and product information—turning passive packaging into an active, personal conversation with the brand.

Launched in Canada — Hit #1 in the App Store

Makeup Express launched in Canada as the first mobile beauty application of its kind. It immediately topped the iTunes charts — validating that when the technology meets a real consumer need in a real retail moment, adoption follows.

A Blueprint for Modern Beauty Tech

The behaviors Collab established with Makeup Express — shade-matching on device, AR-triggered in-store content, personalized guidance at shelf — are now baseline expectations in beauty retail. This was the proof of concept that opened that door.

The Shadefinder Flow

The complete in-aisle experience — from photo capture to matched foundation recommendation.

Shadefinder — face and jaw capture screen
Step 1 — Face Reader: align your jaw to capture your true skin tone
Shadefinder — wrist reader and shade result
Step 2 — Wrist Reader: an alternate capture method for accuracy in any lighting
Shadefinder — coverage, skin type and product match
Step 3 — Your Match: coverage preference, skin type, and the right foundation — with saved shades over time
See It In Action

The beauty industry's first augmented reality mobile app — turning a smartphone into a personal beauty assistant.

Makeup Express used augmented reality to activate rich content experiences via in-store signage throughout the retailer. Shoppers pointed their phone at signage to unlock personalized makeup tips, application guides, and shade recommendations — giving every shopper a personalized beauty-counter experience that kept them connected to the products and was designed to grow smarter over time.


Makeup Express — In Store Demo

What Made This Possible

One team from insight to deployment.

Strategy, design, and technology built together — so the consumer insight carried all the way through to the shelf.

Strategy identified the precise in-aisle moment where consumer confidence is earned — and built the product around it
Design translated sophisticated technology into an experience shoppers could pick up and use instantly
Technology delivered proprietary computer vision and AR on early mobile hardware, purpose-built for retail conditions
Integration across all three disciplines meant every decision reinforced the same consumer outcome
Next Possibilities

Makeup Express stands as a blueprint for the next generation of personalized retail experiences.

On-Device Personalization

Extend the Shadefinder model to on-device AI — real-time shade matching that improves with every use and works offline in store environments.

Omnichannel Continuity

Bridge the in-store discovery moment to e-commerce reorder — building a persistent shade profile that travels with the consumer across every touchpoint.

Privacy-Forward Computer Vision

Rebuild the color analysis engine on modern privacy-first architecture — on-device processing, no image storage, compliant with evolving data regulations.

Retail Media Pipeline

Convert the AR content layer into a measurable retail media channel — turning in-store engagement into first-party data that strengthens both brand and retailer relationships.

Proprietary Technology

Precision Beauty: The Engineering & UX of Shadefinder

A year of R&D went into solving one problem — making a phone camera reliable enough to match skin tone under drugstore fluorescents.

01 · Lighting-Resilient Dataset
Built using an extensive image dataset of skin photographed under random lighting conditions — fluorescent, warm, cool, mixed. The model had to work in any drugstore, not just ideal conditions.
[Diagram: the same skin tone rendered under three light conditions: warm, cool, fluorescent]
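As a concrete illustration of what a lighting-resilient dataset can involve, here is a minimal, hypothetical Python sketch of lighting augmentation: one reference skin capture rendered under simulated warm, cool, and fluorescent casts via per-channel gains. The gain values and function names are illustrative assumptions, not Shadefinder's actual pipeline.

```python
import numpy as np

# Hypothetical light casts: per-channel RGB gains approximating each
# store lighting condition (values chosen for illustration only).
LIGHT_CASTS = {
    "warm":        (1.15, 1.00, 0.85),
    "cool":        (0.90, 1.00, 1.15),
    "fluorescent": (0.95, 1.10, 1.00),
}

def augment_lighting(image: np.ndarray) -> dict:
    """Render one reference capture under each simulated light condition."""
    return {
        name: np.clip(image * gains, 0, 255)
        for name, gains in LIGHT_CASTS.items()
    }

reference = np.full((2, 2, 3), [200.0, 170.0, 140.0])  # one skin-tone patch
variants = augment_lighting(reference)
print(sorted(variants))  # ['cool', 'fluorescent', 'warm']
```

Training against such synthetic casts (alongside real captures) is one standard way to keep a color model from overfitting to ideal studio lighting.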
02 · Accounting for Color Perception
Human color perception varies. The system integrated individual perceptual data to refine shade selection accuracy — not just what the camera sees, but how the person perceives color contrast on their own skin.
[Diagram: warm, cool, and neutral undertones]
03 · Environmental Normalization
Before capture, users step out of frame so the system can read and normalize ambient lighting conditions. A simple UX gesture that solved a complex computer vision problem — calibrate the environment, then read the skin.
[UI: "Reading ambient light…" calibration prompt, asking the user to step out of frame]
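The calibrate-then-capture gesture maps to a simple computer-vision idea: estimate the ambient white point from the empty frame, then correct the skin capture before analysis. A minimal sketch, assuming a von Kries-style per-channel correction; all names and values here are hypothetical, not the production engine.

```python
import numpy as np

def estimate_white_point(empty_frame: np.ndarray) -> np.ndarray:
    """Average RGB of an empty-frame capture approximates the ambient light color."""
    return empty_frame.reshape(-1, 3).mean(axis=0)

def normalize_ambient(skin_frame: np.ndarray, white_point: np.ndarray) -> np.ndarray:
    """Von Kries-style correction: scale each channel so the ambient
    white point maps back to neutral gray."""
    gains = white_point.mean() / white_point      # per-channel correction gains
    return np.clip(skin_frame * gains, 0, 255)    # apply gains pixel-wise

# Simulated warm store light: red channel boosted, blue suppressed.
ambient = np.full((4, 4, 3), [230.0, 200.0, 170.0])  # "step out of frame" capture
skin = np.full((4, 4, 3), [210.0, 170.0, 130.0])     # skin photographed in that light

wp = estimate_white_point(ambient)
balanced = normalize_ambient(skin, wp)
print(balanced[0, 0])  # color cast reduced before shade analysis
```

Calibrating on the empty frame rather than the skin itself avoids the classic gray-world failure mode, where the subject's own color skews the white-point estimate.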
Step 1: Capture Your Skin
Two input methods: take a live photo or upload an existing one. Face reader or wrist reader — whichever gives better results in your lighting.
[UI: "Take a Photo" and "Upload a Photo" options]
Step 2: Shade Analysis
The Shadefinder engine processes the image — normalizing for light, reading undertone, mapping to the product shade range. The result: a precise position on the warm-to-cool, dark-to-light spectrum.
[Diagram: dark-to-light and warm-to-cool shade axes]
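The warm-to-cool, dark-to-light positioning can be sketched with a crude heuristic: depth from relative luminance, warmth from the red/blue balance of a corrected sample. This is a hypothetical illustration only; the real engine's color science was far more involved.

```python
def shade_position(rgb):
    """Map a light-corrected skin RGB sample to a (depth, warmth) position.

    depth:  0.0 = light, 1.0 = dark (from relative luminance)
    warmth: -1.0 = cool, +1.0 = warm (red vs. blue balance, a crude
            stand-in for undertone reading)
    """
    r, g, b = (c / 255.0 for c in rgb)
    luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec. 709 luma weights
    depth = 1.0 - luminance
    warmth = max(-1.0, min(1.0, (r - b) / max(r + b, 1e-6)))
    return depth, warmth

light_sample = shade_position((235, 200, 165))
deep_sample = shade_position((120, 90, 75))
print(light_sample, deep_sample)
```

Once every catalog shade is placed on the same plane, matching reduces to finding the nearest point, which is what makes the spectrum framing useful.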
Step 3: Coverage Personalization
Shade match alone isn't enough. The app layers in coverage preference — sheer, medium, or full — to narrow the recommendation to exactly the right product for this person's needs.
[UI: sheer, medium, and full coverage options]
Step 4: Your Match
A specific product and shade number — not a range, not a guess. "Dream Nude Airfoam, 320 Honey Beige." Saved over time so the shopper builds a personal shade profile with every use.
[UI: match card showing Dream Nude Airfoam, 320 Honey Beige]
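Steps 2 through 4 combine into a nearest-neighbor lookup: filter a shade catalog by coverage preference, then pick the closest shade on the (depth, warmth) plane. The catalog entries, coordinates, and coverage tags below are invented for illustration, not Maybelline product data.

```python
from math import dist

# Hypothetical slice of a shade catalog:
# (product, shade, (depth, warmth) position, coverage)
CATALOG = [
    ("Dream Nude Airfoam", "310 Sun Beige",   (0.55, 0.30), "sheer"),
    ("Dream Nude Airfoam", "320 Honey Beige", (0.60, 0.20), "sheer"),
    ("Fit Me Matte",       "330 Toffee",      (0.62, 0.18), "full"),
]

def match_shade(position, coverage):
    """Nearest catalog shade on the (depth, warmth) plane,
    restricted to the shopper's coverage preference."""
    candidates = [entry for entry in CATALOG if entry[3] == coverage]
    return min(candidates, key=lambda entry: dist(entry[2], position))

product, shade, _, _ = match_shade((0.61, 0.21), "sheer")
print(product, shade)  # nearest sheer-coverage entry
```

Filtering before the distance search is what turns a pure color match into a product recommendation: the closest shade overall may belong to a coverage line the shopper doesn't want.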