Engage





I began with a full audit of both tools, mapping every task, page, and dependency. This audit became the foundation for research and alignment, informed the eventual removal of low-value functionality, and surfaced clear negative patterns:
- Features had accumulated without consolidation
- Tabs were misused as containers for unrelated actions
- Key workflows were buried several layers deep
- Entire sections existed that no one actively used

Using the IX sitemap, I created a Maze survey to validate which pages actually mattered. Users reported whether they used each page in their day-to-day work. We discovered:
- Large portions of the navigation had zero usage (highlighted in red rectangles above)
- High value tasks were scattered across unrelated sections
- Usage patterns varied significantly by role
This gave us the confidence to simplify aggressively without fear of breaking real workflows.

I conducted interviews with one executive stakeholder, seven internal users, and three external users. Here are the key insights:
- Engage was not a primary tool
  - Only 22% used it occasionally for reports
  - 100% preferred IX or external tools for core work
- Speed and access mattered more than breadth
  - Users wanted immediate access to panels and controls
- Navigation actively worked against users
  - Command+F was a common coping mechanism
- Reporting was essential but untrustworthy
  - Panel health visibility was a business-critical gap
- User roles were fundamentally different
  - Each role had distinct goals, permissions, and mental models

From this, I worked with Product and Engineering to define essential tasks, eliminate manual processes where possible, and identify opportunities for automation.

Four primary user groups emerged, each with distinct responsibilities and success metrics. These roles directly informed permissions, visibility, and navigation structure in the new platform, shifting the IA from feature-driven to task- and role-driven.

Using insights from research, I ran an open card sort with 15 stakeholders to understand how users grouped tasks conceptually. This helped establish a shared mental model across teams. I then translated those learnings into multiple IA explorations and validated them through tree testing with 17 participants. Learnings from this validation included:
- Panel settings versus admin settings needed clearer separation
- Email communication was critical and needed higher visibility

The final IA was organized around two principles:
- Primary navigation aligned to core user tasks:
  - Monitoring panel performance
  - Managing panel setup and details
  - Managing panelists, including banning and unbanning
- Administrative functions were isolated:
  - Admin-only tools such as rewards and translations lived in a separate context
The UI was rebuilt using the Cint design system to ensure consistency, accessibility, and scalability across products. The result was a platform that felt intentional, predictable, and aligned with how teams actually work.

This work established a scalable foundation rather than a one-time redesign. While additional discovery and design work remains, the roadmap is now structured around validated user needs and business priorities rather than legacy constraints.














