CASE STUDY
Digital Marketing Simulation Redesign
Redesigning Performance Results to Drive Clearer Learning Outcomes

MY ROLE
Lead Designer
TIMELINE
Six Months
TEAM
Designer, product manager, and five engineers
KEY FACTOR
Visual Hierarchy
Product Background
Stukent’s Digital Marketing Simulation allows students to create and manage advertising campaigns for a backpack brand. After completing campaign tasks, students submit their work and receive performance results that include key marketing metrics.
The Challenge
Both students and instructors consistently reported that these results were difficult to interpret. Students struggled to understand what the metrics meant or how their ads performed, while instructors found it challenging to guide students toward improvement.
The goal of this project was to redesign the results experience to make performance data clearer, more intuitive, and actionable for students.
Key Problems Identified
Several usability challenges existed in the results experience.
Confusing Data Presentation
Key metrics were displayed inconsistently across the page, making it difficult for students and instructors to quickly assess performance.
Lack of Information Hierarchy
Important insights were buried within dense content, making it hard for students to identify what mattered most.
Limited Actionable Feedback
Students and instructors could see their results but lacked guidance on how to interpret them or improve future campaigns.
Instructor Frustration
Instructors found it difficult to explain results to students because the interface lacked clarity and structured insights.
Research and Discovery | UX Audit
The first step in the process was conducting a UX audit with the product manager to better understand the existing results experience. During the audit we analyzed:
How campaign data was displayed
The organization of information across the results page
Areas where users frequently became confused
How different sections of the interface related to one another
Our findings revealed that the simulation was already generating the right data. The issue wasn't missing information; it was how that information was structured and presented.
The interface lacked a clear hierarchy, which made it difficult for users to identify the most important insights or understand how different metrics related to their ad performance.

Design Process
STEP ONE
Low-Fidelity Exploration
To explore solutions, I began with low-fidelity wireframes focused on restructuring the information architecture.
One of the key ideas introduced during this phase was splitting the results into two levels:
A summary overview page
A deeper analysis page
The summary page would provide a quick snapshot of overall performance metrics, while the deeper analysis page would allow students to explore individual ads and understand specific performance insights.
This approach allowed students to first understand the big picture before diving into a detailed analysis.

STEP TWO
High-Fidelity Exploration
Once the structural direction was validated, I created high-fidelity designs focused on improving readability and visual hierarchy:
Clear grouping of performance metrics
Visual prioritization of key results
Structured sections for different advertising channels
Improved readability for complex data
These refinements helped transform the results page from a dense information display into a more intuitive analytics dashboard.

STEP THREE
Prototyping
To validate the redesigned flow, I developed interactive prototypes that simulated how users would navigate between summary insights and deeper ad performance analysis. The prototypes tested:
The relationship between campaign-level metrics and individual ad performance
How students could move from high-level results to detailed insights
Whether the new structure improved comprehension
Prototyping confirmed that separating the overview from the detailed analysis reduced confusion and improved clarity.
Key Features and Solutions
Two-Level Results Structure
A summary page provides a clear overview of campaign performance, while a secondary page allows deeper exploration of individual ads.
Improved Information Hierarchy
Metrics are organized to highlight the most important insights first, making it easier for students to understand results at a glance.
Ad-Level Performance Insights
Students can drill down into specific ads to identify strengths and areas for improvement.
Clearer Data Visualization
The redesign uses structured layouts and visual grouping to simplify complex marketing metrics.
Instructor-Friendly Insights
The clearer results experience helps instructors guide students toward better marketing decisions.

Impact, Results, and Key Learning
The redesigned results experience significantly improved how students interpret their campaign performance within the simulation.
This project reinforced the importance of clear information hierarchy when designing data-heavy experiences. Even when the correct data exists, poor structure can prevent users from understanding it.
It also highlighted the value of breaking complex analytics into progressive layers, allowing students to start with high-level insights before exploring deeper performance details.
Close collaboration with the product manager and engineers throughout the process ensured the solution was both usable for students and feasible to implement. The redesigned experience:
Simplified how students read and interpret marketing metrics
Provided clearer connections between ad decisions and performance outcomes
Enabled instructors to more effectively guide student improvement
Created a scalable analytics framework for future simulations