Suzy Signals

2025

As Product Design Director and UX/UI Designer at Suzy, I led the end-to-end vision for a new product called Suzy Signals—an AI-powered mobile experience that reimagines how business leaders engage with research.

Our CEO challenged us to deliver a fast, accessible, and culturally intelligent decision engine—something closer to TikTok than Tableau. My role combined high-level product strategy with hands-on UX/UI design execution, from whiteboard to wireframes.

Problem

Marketing leaders don’t need raw data—they need clarity. But most research platforms:

  • Take too long to deliver answers

  • Rely on complex, researcher-centric interfaces

  • Present insights in static, hard-to-translate formats

There was a gap between the speed of cultural change and the pace of traditional research. We set out to close that gap.

Opportunity

What if we gave teams an AI-powered feed of business-relevant signals—from market trends to cultural shifts—and let them chat with those signals, launch research instantly, and receive visually compelling insight decks within minutes?

We called it Suzy Signals:
A mobile decision engine for the TikTok era of business intelligence.

Design Goals

As both design director and individual contributor, I defined five principles to guide the experience:

  1. Swipeable, not scrollable – Mimic the familiarity of TikTok while serving insight, not entertainment

  2. Conversational, not configurational – Let AI handle the mechanics of research design

  3. Visual-first – Replace static reports with auto-generated, story-driven presentations

  4. Actionable, not academic – Deliver answers, not just data

  5. AI-shaped, but human-refined – Mimic expert prompting behind the scenes

My Process

I ran a parallel-track approach—balancing vision and execution:

  • Led internal design sprints with product, engineering, and AI teams

  • Built mobile wireframes in Figma, defining navigation, layouts, and signal taxonomy

  • Crafted AI prompt templates by mimicking expert researchers: I manually iterated in ChatGPT to simulate how study summaries, survey content, and signal headlines should sound (a sketch of one such template appears below)

  • Directed content quality, including visual sourcing for CPG imagery that would make each signal card feel premium, modern, and brand-aware

I bridged the technical constraints of LLMs with user expectations—especially for teams unfamiliar with research jargon or formatting.
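
To make that prompt-template work concrete, below is a minimal sketch of how one signal-copy template might be structured. The wording of the prompt, the field names, and the example signal are illustrative assumptions, not the production implementation.

    # Hypothetical prompt template for signal copy (illustrative only, not production code).
    def build_signal_copy_prompt(signal: dict) -> str:
        """Render an 'expert researcher' prompt for one trend signal."""
        return (
            "You are a senior consumer-insights researcher.\n"
            "Given the trend signal below, write:\n"
            "1. A headline under 12 words, in plain business language.\n"
            f"2. A two-sentence summary of why this matters for a {signal['category']} brand.\n"
            "Avoid research jargon. Keep the tone confident, not hyped.\n\n"
            f"Trend signal: {signal['text']}\n"
            f"Supporting data point: {signal['data_point']}"
        )

    # Example signal, invented for illustration only.
    example = {
        "category": "CPG beverage",
        "text": "Gen Z shoppers are trading down to store-brand sparkling water",
        "data_point": "Private-label sparkling water sales up 18% year over year",
    }
    print(build_signal_copy_prompt(example))

In a sketch like this, the expert persona, the output constraints, and the tone rules live in the template itself, so the same structure can be reused for study summaries, survey copy, and signal headlines.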

Key Screens

  1. Signal Feed
    A swipeable feed of AI-curated signals—visuals, headlines, data points—with quick actions to explore or generate a study.

  2. Signal Chat
    A tap opens an AI assistant that explains what the signal means for your brand—and how to act on it.

  3. Auto Study Generator
    One tap creates a research study draft—audience, questions, goals—based on the selected signal.

  4. Insight Deck
    Results are delivered as a visual, swipeable “story deck.” Users can tap into each insight or chat with the data.

  5. Chat to Your Data
    Users can ask questions like “What’s driving Gen Z interest?” and the system returns visualized, natural language insights.

AI Integration

I helped prototype the entire LLM interaction model, including:

  • Prompt engineering for converting trend signals into:

    • Study goals and hypotheses

    • Target audience definitions

    • Key survey questions and follow-ups

    • Executive-ready summaries and insights

  • Content tuning for tone, trust, and brevity

  • Image pairing guidance for each signal (especially CPG-related signals where branded visual context mattered)

In early development, I mimicked how expert researchers might prompt ChatGPT, then refined the outputs that fed directly into the interface and presentations.
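
As one illustration of that prompt-to-interface handoff, here is a minimal sketch of how a trend signal might be turned into a structured study draft that the Auto Study Generator could render. The message structure, JSON field names, and mocked output are assumptions made for illustration, not Suzy's actual pipeline.

    import json

    # Hypothetical JSON contract for a study draft (illustrative, not Suzy's schema).
    STUDY_DRAFT_KEYS = {"goal", "hypotheses", "target_audience", "questions"}

    def build_study_messages(signal_headline: str, brand_context: str) -> list:
        """Build chat-style messages asking the model for a machine-readable study draft."""
        system = (
            "You are an expert market researcher. Respond with JSON only, using the keys "
            "goal, hypotheses, target_audience, and questions (5-7 survey questions)."
        )
        user = (
            f"Trend signal: {signal_headline}\n"
            f"Brand context: {brand_context}\n"
            "Draft a quantitative study a marketing VP could launch today."
        )
        return [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ]

    def parse_study_draft(raw_response: str) -> dict:
        """Validate the model's JSON so the UI never renders a malformed draft."""
        draft = json.loads(raw_response)
        missing = STUDY_DRAFT_KEYS - draft.keys()
        if missing:
            raise ValueError(f"Draft is missing fields: {sorted(missing)}")
        return draft

    # Mocked model output stands in for a real LLM call in this sketch.
    mocked = json.dumps({
        "goal": "Understand why Gen Z shoppers are switching to store-brand sparkling water",
        "hypotheses": ["Price is the primary driver of switching"],
        "target_audience": "US adults 18-26 who bought sparkling water in the past month",
        "questions": ["Which brand of sparkling water did you buy most recently?"],
    })
    print(parse_study_draft(mocked)["goal"])

Requesting a fixed JSON contract and validating it before rendering is one way to keep auto-generated drafts predictable enough for a consumer-grade interface.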

Intended Outcomes

We’re piloting Suzy Signals with strategy teams across three Fortune 500 brands. Early indicators show:

  • Projected 60% reduction in time-to-insight

  • Higher engagement from VP-level stakeholders who previously avoided traditional dashboards

  • Stronger product-market fit for Suzy as an AI-native research platform

Reflection

This project blurred the line between designer, strategist, and prompt engineer. I had to design not just the interface—but the voice and logic of the AI itself.

As Product Design Director and UX/UI Designer, I wasn’t just shaping pixels—I was shaping behavior, comprehension, and trust.

Mimicking expert prompts and pairing AI-generated insights with emotionally resonant CPG imagery let me transform a complex data tool into something intuitive, useful, and brand-worthy. This wasn’t about making AI look smart. It was about making humans feel smarter.