CPA configuration was fragmented. Users could create an order optimizing for CPA without clearly defining the optimization event signal, making optimization ineffective and diluting model performance.
Context
We launched two CPA-based ML models:
- Cost per Event Category — e.g., optimizing toward "Landing Page" as a category
- Cost per Action (per specific event) — optimizing toward a user-selected event
The DSP is flexible: events can be associated at both campaign and order level. That flexibility is powerful, but it can be confusing without structured guidance.
What was broken
The initial experience allowed users to:
- Select CPA optimization
- Add multiple events for measurement
- Never explicitly specify which event the model should optimize toward
Additionally, the legacy model optimized toward all tracked events, which diluted the signal and reduced efficiency. The system technically worked, but structurally encouraged misconfiguration.
Data analysis
Before redesigning the flow, I used SQL to understand real usage:
- Which events were used most frequently
- How often advertisers used campaign vs. order-level associations
- How often multiple measurement events were present without a primary optimization target
This analysis confirmed we should design for the most common patterns while preserving advanced flexibility.
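The analysis itself ran as SQL against DSP configuration tables, but the metrics above can be sketched as simple aggregations. This is an illustrative TypeScript sketch only: the record shape (`EventAssociation`) and all field names are assumptions, not the production schema.

```typescript
// Hypothetical shape of one event association row; all names are
// assumptions for illustration.
interface EventAssociation {
  orderId: string;
  eventName: string;
  level: "campaign" | "order";
  isOptimizationTarget: boolean;
}

// How frequently each event appears across all associations.
function eventFrequency(rows: EventAssociation[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of rows) {
    counts.set(r.eventName, (counts.get(r.eventName) ?? 0) + 1);
  }
  return counts;
}

// Orders that track multiple events but never mark one as the
// optimization target -- the misconfiguration the redesign removes.
function ordersMissingTarget(rows: EventAssociation[]): string[] {
  const byOrder = new Map<string, EventAssociation[]>();
  for (const r of rows) {
    const list = byOrder.get(r.orderId) ?? [];
    list.push(r);
    byOrder.set(r.orderId, list);
  }
  const missing: string[] = [];
  for (const [orderId, assoc] of byOrder) {
    if (assoc.length > 1 && !assoc.some((a) => a.isOptimizationTarget)) {
      missing.push(orderId);
    }
  }
  return missing;
}
```

The same grouping logic also answers the campaign-vs-order question by filtering on `level`.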
My role
I owned the end-to-end UX redesign:
- Mapped ML model requirements to configuration steps
- Identified fragmentation and signal-dilution failure points
- Designed a guided setup flow with conditional logic
- Partnered with engineering to ensure feasibility within existing constraints
Strategy: progressive disclosure tied to goal selection
Instead of redesigning visuals, I restructured decision sequencing to ensure the model always receives a valid signal during order creation.
- If CPA per event category → require category selection + pixel selection (pixel selection is implicit when the advertiser has a single pixel)
- If CPA per specific event → require selection of at least one optimization event
- Render only the fields needed for the chosen goal: conditional + progressive disclosure
- Ensure no extra post-creation steps are needed for effective optimization
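The branching above can be expressed as a single pure function that maps the chosen goal (plus advertiser state) to the fields the UI must render. A minimal sketch, assuming hypothetical names (`OptimizationGoal`, `requiredFields`, `pixelCount`) that are not the actual DSP codebase:

```typescript
type OptimizationGoal = "CPA_EVENT_CATEGORY" | "CPA_SPECIFIC_EVENT";

interface AdvertiserConfig {
  pixelCount: number; // how many pixels the advertiser has set up
}

// Progressive disclosure: render only the fields the chosen goal needs.
function requiredFields(
  goal: OptimizationGoal,
  advertiser: AdvertiserConfig
): string[] {
  switch (goal) {
    case "CPA_EVENT_CATEGORY": {
      const fields = ["eventCategory"];
      // Pixel selection is implicit when there is exactly one pixel.
      if (advertiser.pixelCount > 1) fields.push("pixel");
      return fields;
    }
    case "CPA_SPECIFIC_EVENT":
      // At least one optimization event must be selected.
      return ["optimizationEvents"];
  }
}
```

Keeping this mapping in one place means the form, its validation, and any future goals all read from the same source of truth.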
Key UX change
The redesign made the dependency explicit:
- Measurement events are not the same as the optimization target
- CPA optimization requires a clear event signal at setup
- The flow enforces this requirement at the moment it matters
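The measurement/optimization distinction can be made structural rather than procedural: model the two as separate fields and gate order creation on the optimization target. A sketch under assumed names (`CpaOrderConfig`, `canCreateOrder`); the real schema is not shown in this case study:

```typescript
interface EventRef {
  id: string;
}

interface CpaOrderConfig {
  // Tracked for reporting only; these do not drive the model.
  measurementEvents: EventRef[];
  // The signal the CPA model optimizes toward; required at setup.
  optimizationTarget: EventRef | null;
}

// Enforced at order creation -- the moment it matters -- not after.
function canCreateOrder(config: CpaOrderConfig): boolean {
  return config.optimizationTarget !== null;
}
```

With this shape, adding ten measurement events still cannot substitute for naming the one target the model trains against.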
Technical depth
The solution required aligning UX logic with backend constraints:
- Conditional rendering tied to model selection and advertiser configuration (e.g., pixel count)
- Handling event hierarchy across campaign and order levels
- Validation states that prevent incomplete or ineffective configurations
- Preserving API contracts without expanding engineering scope
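The hierarchy and validation points above can be sketched as one resolution step plus one derived validation state. This is an assumption-laden illustration (the override rule and names `resolveOptimizationEvent` / `validationError` are hypothetical), not the production contract:

```typescript
interface EventRef {
  id: string;
}

// Assumed hierarchy rule: an order-level association overrides the
// campaign-level default.
function resolveOptimizationEvent(
  campaignDefault: EventRef | null,
  orderOverride: EventRef | null
): EventRef | null {
  return orderOverride ?? campaignDefault;
}

// Validation state derived from the resolved hierarchy: creation is
// blocked until an effective optimization event exists at some level.
function validationError(
  campaignDefault: EventRef | null,
  orderOverride: EventRef | null
): string | null {
  return resolveOptimizationEvent(campaignDefault, orderOverride)
    ? null
    : "An optimization event is required at the campaign or order level.";
}
```

Deriving the validation state from the same resolution function keeps the UI and the API contract in agreement without expanding engineering scope.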
Before & After
Original flow: CPA selected with no enforced event requirement and no guidance on next steps.
Redesign: Goal selection requires event configuration, ensuring a valid optimization signal.
Outcome
The redesign did not change the ML models. It ensured they received the signal they were built to optimize toward.