My Role & How I Build
Syncly is a sub-15 person YC startup. Two designers each own an entire product line. I own the Social Listening product and AI Agent experience end-to-end.
Figma → Google Antigravity / Cursor (Claude Code) → Vercel → Customer interviews & testing → repeat.
Every week I deploy working prototypes to Vercel and demo directly to the CTO, ML engineers, and business leads. No static mockups, no handoff delays. Stakeholders interact with real products, and feedback turns into the next iteration the same week.

About Users
Our customers are brand partnership and marketing teams at cosmetics, fashion, and other consumer brands. They need to track what hundreds of creators are saying about their brand across YouTube, TikTok, and Instagram, but today that means watching videos one by one. Syncly helps them monitor, analyze, and act on creator content at scale without the manual work.
Context
Market opportunity — Brands can't see what's inside creator video: The creator economy has shifted to short-form video. Brand partnership teams at global companies need to understand what's happening inside YouTube, TikTok and Instagram content, but legacy social listening tools were built for text. They track hashtags and captions, but miss products shown on screen, brands mentioned in speech, and sentiment expressed through visuals.
Platform expansion — One platform for CX data, creator analytics, and brand partnerships: Syncly already centralizes VoC customer feedback into AI-powered insights. Adding video social listening creates a unique position: the only platform where brands can connect CX data with creator analytics, performance tracking, and partnership intelligence in one place.
Problem
After launching video social listening, Ask Syncly (AI Agent) usage dropped 49% in 6 weeks.
The AI chat was built for CX text data. When the platform expanded into video, it kept returning text-only answers with no connection to source clips, creators, or visual context. Users had no reason to ask when the answers didn't match the data they were actually looking at.

Project Goal
Redesign Ask Syncly (the AI Agent) to deliver video-native answers, so users can browse, analyze, and report without leaving the AI chat.
What I Shipped
1. Social Listening Experience
Overview / Performance / Conversation:
Three views answering one question: "What's happening with our brand right now?" Overview captures volume and top creators, Performance tracks changes over time, and Conversation reveals what people are actually saying. Brand teams can scan across platforms without opening a single video.
People / Posts:
"Who's talking about our brand, and what content is working?" People ranks creators by engagement. Posts breaks down individual videos. This is where teams evaluate creators and make partnership decisions.
The dashboard answers "what's happening." But brands kept asking "so what should we do?"
That's where Ask Syncly comes in.
AI Agent: Quick Analysis While Browsing
General mode: Ask Syncly sits as a side panel on top of the dashboard, generating structured answers from sentiment analysis to competitive comparisons with a single question. Not a conversation tool but a decision-making tool.
Content mode: "What narratives are working for this brand?" Content mode analyzes patterns across top-performing videos, identifying which hooks, formats, and storytelling structures drive engagement. It replaces gut feeling with evidence from actual content performance.
AI Agent: Deep Research & Reports
Power users wanted two things the side panel couldn't support: reusing prompts they run every week, and sharing AI answers with leadership as formatted reports. I repositioned the AI Agent as the primary entry point for analysis, evolving it from a reactive Q&A tool into a proactive research interface with Report mode, saved prompts, and General/Report switching built in.
Currently Building
Influencer Discovery
Social listening tells you who's talking about your brand. The next question brands ask is "who should we work with?" Influencer Discovery recommends creators based on listening data.
Impacts

Four weeks after MVP launch, the redesigned Ask Syncly reversed the usage drop that triggered this project. Weekly active usage grew 71.2%, driven by preset questions that lowered the barrier to start and a video-tailored answer structure that gave users a reason to stay.
64% of power users adopted the new Report mode within the same period. It was the strongest signal from research, and adoption followed quickly once the capability launched.
Takeaways
Shipping with GenAI-powered prototyping
Using Antigravity, Cursor and Claude Code, I moved from concept to working prototype in days, not weeks. Deploying to Vercel meant stakeholders tested real products, not static screens, and decisions happened faster because everyone could interact with the actual experience.
Aligning scope without a PM
With no PM on the project, I built a priority matrix to align the team on what to build and why. It wasn't just a planning tool. It was how I kept the CTO and engineers focused on the highest-impact problems instead of building everything at once.
Designing for AI trust
AI will get things wrong. The core design challenge was making sure users can verify answers against source data, understand where insights come from, and stay in control when the AI fails. Every interaction pattern in Ask Syncly was built around this principle.
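As a hypothetical sketch of this principle (the type and field names below are illustrative, not Syncly's actual schema), a video-native AI answer could carry explicit references back to the source clips it was derived from, so the UI always has a verification path and can flag claims with no evidence:

```typescript
// Illustrative shape of a video-native AI answer. Every claim
// carries references back to source clips so users can verify it.
interface SourceClip {
  platform: "youtube" | "tiktok" | "instagram";
  creator: string;
  videoId: string;
  timestampSec: number; // where in the video the evidence appears
}

interface AgentClaim {
  text: string;
  confidence: "high" | "medium" | "low";
  sources: SourceClip[]; // empty sources = unverifiable
}

// The UI can refuse to present unsourced claims as facts
// and instead surface them with a warning state.
function verifiable(claim: AgentClaim): boolean {
  return claim.sources.length > 0;
}

const claim: AgentClaim = {
  text: "Unboxing hooks drive the highest engagement this week.",
  confidence: "medium",
  sources: [
    { platform: "tiktok", creator: "@example", videoId: "abc123", timestampSec: 14 },
  ],
};

console.log(verifiable(claim)); // true
```

The design choice this encodes: verifiability is a property of the data contract, not an afterthought in the chat UI, so "stay in control when the AI fails" becomes a rendering rule rather than a manual review step.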


