DORA AI: A CASE STUDY
Product Marketing, Usability Testing, Content Strategy
2023-2024
At Dora, I facilitated the launch of Dora AI, Dora's generative text-to-website feature, which aimed to transform how people built websites using natural language.
The product ultimately earned two Golden Kitty Awards on Product Hunt (Best AI Tool & Best No Code Tool), but its path to launch was anything but smooth.
A quick overview of Dora AI's primary user flow and features. Featuring my voice.


MY ROLE
I facilitated the go-to-market strategy and led usability testing through two distinct phases:
- May 2023: The teaser and waitlist phase — seizing the peak of AI discourse to capture maximum attention
- March 2024: The long-awaited launch phase — regaining trust and ensuring retention amid delays and shifting expectations
THE PROBLEM
How do you design and launch a cutting-edge AI feature in a saturated market, with limited media reach, a lean team, and two diverging user groups?
Launching Dora AI wasn’t just about shipping an AI tool — it was about reconciling the growing divide between our two distinct user groups:
- Legacy no-code users: skilled designers and developers expecting precise control over visual and motion design
- New AI users: attracted by the promise of instant, text-to-website generation with minimal skill barrier
After teasing Dora AI in May 2023 and achieving viral traction, user expectations were sky-high — but our AI feature wasn’t ready. Developmental delays (largely due to the instability of LoRA models at the time) meant the tool was still in its early stages months later. The long silence eroded trust, created confusion around Dora’s identity, and led to vocal frustration from a waitlist that had grown to over 300,000 in just two weeks.
WHAT I DID
Part 1: Winning the Waitlist | May 2023
To make a splash in an increasingly crowded AI space, I helped architect and execute a multi-channel campaign that generated massive viral traction:
Waitlist strategy
- Collected leads using a product waitlist
- Incentivized sign-ups using referral mechanics (e.g. refer a friend to move up 100 spots)
- Boosted engagement with our Product Hunt launch (e.g. "First 1,000 upvoters on Product Hunt get early access")
Influencer marketing
- Partnered with creators at the intersection of design and technology to build early buzz
- Designed creative briefs to align messaging and provide high-quality assets to creators
Community engagement
- Empowered brand ambassadors with content and strategies to spread teasers via social media
- Led pre- and post-launch content strategy, coordinating content across socials, email, and Discord
As a result, we accrued over 300K waitlist sign-ups in under two weeks, 10M+ impressions at a CPM below 6, and more than 350% follower growth across social media platforms. We also achieved Product of the Day, Week, and Month on Product Hunt.

Our launch on Product Hunt, with all its accolades. All copy and branding materials written by me, including the pinned maker's comment.

Our initial concept teaser, released May 2023, received over 108K views on YouTube alone. Kudos to our designers, Faye Zheng, Alisa Fan, & Christina Liu.
Our social media stats, 3 months post-launch, following marketing campaigns run by Eve Li, Kay Feng, and me. This was a more than 350% increase in total followers.
Part 2: Developmental Delays | June 2023 - January 2024
Despite the buzz, Dora AI wasn’t ready for launch. Users grew impatient, confused, and angry. Many felt we had “overpromised and disappeared.” Meanwhile, our small team (~30 people) was split between updating the existing no-code builder and refining the AI tool.
Recognizing the growing distrust, I pushed for proactive damage control through community engagement, expectation management, and transparency:
Reframed messaging
- Positioned the release as "Dora AI Beta" rather than "Dora AI 1.0" or simply "Dora AI"
Email and social campaigns
- Rolled out phased communications to re-engage waitlisted users and set clearer expectations, such as:
  - Social media and email updates sharing timeline expectations and progress
  - Coupons and offers for active and waitlisted users
Community events and giveaways
- Organized and hosted events to keep the community active and engaged outside the AI tool, such as Dora Expo and the 3D Web Design Challenge
Streamlined feedback channels
- Restructured our feedback Discord and the tool's built-in feedback feature, ensuring users had access to the team as well as previous user suggestions
Created Dora Updates
- Pitched, wireframed, and led content strategy for our Dora Updates page, improving transparency around our progress while commemorating key developmental milestones


All assets were reworked from "1.0" to "Beta" before release.


Landing pages for Dora Expo and 3D Design Challenge.

I also wrote entries for Dora Updates on a monthly basis.
As a result, from June 2023 to February 2024, we maintained a steady average of ~20K registrations per month. We also triaged more than 300 feedback requests each month, which heavily influenced our product roadmap and bi-weekly retrospectives.
Crucially, our findings highlighted diverging needs within our user base. No-code users desperately needed core fixes, such as faster runtimes, e-commerce features, and forms. Meanwhile, AI users struggled with the technical complexity of the Dora editor. This clarified our divided priorities and precipitated the decision to go all-in on Dora AI and launch as soon as possible.
Part 3: The Launch | January - March 2024
When we finally reached a viable MVP, I co-led a comprehensive launch strategy designed not just to celebrate the release, but to repair relationships and retain users. Below are some of my key initiatives:
Dora AI launch website:
- Oversaw end-to-end development of all 3 phases of the launch website, designed and developed over 2 months with a team of 3 designers and 3 developers
- Led low-fi prototyping and content strategy to reinforce our new positioning around "beta" and manage expectations
- Emphasized displaying the AI tool's capabilities in a clear and engaging way
(Full case study coming soon)
Open and closed beta testing:
- Invited ambassadors, creators, and long-time waitlist users to test Dora AI and share feedback
- Designed and facilitated a protocol for ~15 usability interviews (closed beta testing)
- Led survey data collection from ~200 users (open beta testing)
- Synthesized qualitative and quantitative feedback for the development team, influencing the product roadmap and development priorities
Feedback infrastructure:
- Created a dedicated Discord server for Dora AI
- Built feedback and troubleshooting channels within the main community
- Synthesized and presented user insights to the product and design teams
- Responded quickly to early criticism (e.g. credit limits felt too low) by raising the default credit cap based on aggregated feedback and introducing giveaway events for earning extra credits
Transparent marketing campaigns:
- Prioritized showing real use cases and hands-on demos
- Emphasized the "beta" nature of the tool
- Invited recognized industry leaders in design, technology, and product to elevate credibility





Final landing page for Dora AI, released in March. Bonus: my initial wireframes!

Isa in her natural habitat: running user studies (:


Our beta testing Discord alone hosted almost 200 users while our main server grew from 40K to 100K+ users.

Rather than flashy demos, we prioritized reviews of our tool's key functionalities and practical tips from high-credibility industry leaders such as Michael Riddering and Adham Dannaway.
As a result, we retained over 20% of our waitlisted users, with a 310% increase in registrations at launch in March compared to the previous month. We also saw a dramatic 2,400% increase in new subscribed users that month.

WHAT I LEARNED
Keep users in the loop — consistent communication builds trust, especially when you're still earning it.
The long gap between teaser and release was one of the biggest reasons users lost trust. In hindsight, we could have minimized the teaser-to-release window by prioritizing the development of the AI feature or having a smaller, clearer MVP. In addition, regardless of the wait time, providing regular product updates, avoiding hard promises before our timeline was set, and maintaining engagement with our users during the wait would have greatly increased waitlist retention, especially as a new, indie tool without prior credentials.
Feedback is only valuable when it’s acknowledged.
Many passionate users gave us detailed feedback, but without transparent implementation or acknowledgement, they felt ignored. Open beta testing, Dora Updates, and feedback channels helped bridge this, showing that we were listening and reinforcing our "building in public" narrative.
Your product should excel at one thing before trying to do everything.
The divergent needs of AI and no-code users stretched our small team thin. Clearer product positioning and roadmap definition could have helped us manage expectations and resources more effectively. For example, framing Dora AI as an experimental extension (rather than a replacement) during the 2023 teaser would have aligned user expectations earlier. It would also have enabled us to identify and serve the two user groups more distinctly through tailored updates and content.
Don't reinvent the wheel. Learn from those who came before.
Before Dora, I had minimal experience building and launching a product from scratch. To bridge that gap, I actively networked with product marketers and researched best practices. Our open beta approach, for instance, was inspired by Figma’s shift from stealth to open beta. Articles such as Emily Brody's Anatomy of Launching a Figma Open Beta helped me identify early on which metrics we needed to track, such as feature adoption (we tracked drop-offs at every stage of the main AI flow).