Genkle:
Speaking English Diary
An AI-powered speaking diary that transforms voice logs into personalized learning loops. It achieved a 20%+ signup conversion rate with $0 ad spend through automated GTM execution.
Advanced learners had nowhere to go.
While the market is saturated with beginner-level tools, advanced speakers face a unique challenge: subtle, persistent mistakes they don't even notice.
These users don't want to "study"—they want to use the language naturally and improve without being forced back into a classroom environment.
Typing a long diary after a busy day is exhausting, so we asked a simple question: "What if you could just speak?"
Speak → Correct → Recall
Voice → Study Cards
Users recount their day via voice. OpenAI Whisper transcribes it, and GPT auto-generates grammar corrections and native expression flashcards. Truly personalized study materials built from their own stories.
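The transcription-to-flashcard step can be sketched as follows. This is an illustrative sketch, not the production code: the Whisper and GPT calls themselves are elided into comments, and the function names, prompt wording, and JSON schema are assumptions.

```javascript
// Illustrative sketch of the diary -> flashcard step. The actual Whisper
// transcription and GPT chat call are elided; shown here are the prompt we
// might send and how a JSON reply becomes flashcards.

// Build a correction prompt from a Whisper transcript (wording is hypothetical).
function buildCardPrompt(transcript) {
  return [
    "You are an English tutor for advanced learners.",
    "Correct the diary entry below and suggest native expressions.",
    'Reply as JSON: {"cards":[{"original":"","corrected":"","note":""}]}',
    "",
    transcript,
  ].join("\n");
}

// Parse the model's JSON reply into flashcards, dropping malformed items.
function parseCards(replyText) {
  const parsed = JSON.parse(replyText);
  return (parsed.cards || []).filter(
    (c) => typeof c.original === "string" && typeof c.corrected === "string"
  );
}

const reply = JSON.stringify({
  cards: [
    {
      original: "I go to gym yesterday",
      corrected: "I went to the gym yesterday",
      note: "past tense",
    },
  ],
});
console.log(parseCards(reply).length); // 1
```

Validating each card's shape before storing it keeps one malformed model reply from corrupting the user's deck.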
Bookmark & Review System
Each auto-generated card supports bookmarking and personal notes, so users can spot repeated mistakes at a glance and review them easily. Anchoring content to personal memories dramatically improves both learning effectiveness and retention.
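One way the "repeated mistakes at a glance" view could work is a simple aggregation over bookmarked cards. This is a hypothetical sketch; the field names (`bookmarked`, `note`) and the threshold of two occurrences are assumptions, not the app's actual schema.

```javascript
// Hypothetical sketch: surface mistake types that appear on two or more
// bookmarked cards, by counting cards grouped on their note/tag.
function repeatedMistakes(cards) {
  const counts = new Map();
  for (const c of cards) {
    if (!c.bookmarked) continue; // only user-flagged cards count
    counts.set(c.note, (counts.get(c.note) || 0) + 1);
  }
  return [...counts.entries()]
    .filter(([, n]) => n >= 2) // assumed threshold for "repeated"
    .map(([note]) => note);
}

const cards = [
  { note: "past tense", bookmarked: true },
  { note: "articles", bookmarked: true },
  { note: "past tense", bookmarked: true },
  { note: "articles", bookmarked: false },
];
console.log(repeatedMistakes(cards)); // [ 'past tense' ]
```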
AI Diary Image Generation
For Premium plans, AI generates an engaging visual recap of the day's diary entry. Going beyond a simple learning app, it serves as a memory archive, creating a powerful user lock-in effect.
CAC exceeded LTV in a saturated market.
Video ad campaigns on Meta and TikTok proved economically unsustainable. The unit economics broke down fast: at €4.99/month subscription with a €9 LTV target, the paid CAC needed to stay under €2 — impossible in a market dominated by apps with 8-figure ad budgets.
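The back-of-envelope math behind those numbers: a €9 LTV at €4.99/month implies an average paid lifetime of under two months, and a €2 CAC ceiling against that LTV is a 4.5:1 LTV:CAC ratio (the ratio target is my reading of the figures above, not a stated benchmark).

```javascript
// Back-of-envelope check of the unit economics quoted above.
const monthlyPrice = 4.99; // subscription price, EUR
const ltv = 9;             // lifetime value target, EUR
const cacCeiling = 2;      // max sustainable paid CAC, EUR

// Average paid lifetime implied by the LTV target.
const impliedLifetimeMonths = ltv / monthlyPrice; // ~1.8 months
// The LTV:CAC ratio implied by the EUR 2 ceiling.
const ltvToCac = ltv / cacCeiling; // 4.5

console.log(impliedLifetimeMonths.toFixed(1)); // 1.8
console.log(ltvToCac); // 4.5
```

With churn that fast, every euro of paid acquisition has to pay back almost immediately, which is what made the channel unworkable.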
$0 Ad-Spend Hyper-Targeted Outbound Automation
I developed a custom Node.js automation pipeline. After collecting posts related to "english speaking improvement" via the Reddit API, I used GPT-4o-mini in a two-stage filtering process to pinpoint the exact target audience.
Gathered up to 1,000 posts via keyword search and pagination. Batch-checked Google Sheets to skip previously processed Post IDs and Authors.
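The dedupe step above can be reduced to a pure filter: load the already-processed Post IDs and Authors from the sheet into two sets, then drop any fetched post that matches either. This is an illustrative sketch with assumed field names; the Reddit and Google Sheets I/O is elided.

```javascript
// Sketch of the batch dedupe step (field names are illustrative).
// seenIds / seenAuthors would be built from the Google Sheets log in one
// read, so each new post costs two O(1) set lookups instead of a sheet scan.
function filterUnseen(posts, seenIds, seenAuthors) {
  return posts.filter(
    (p) => !seenIds.has(p.id) && !seenAuthors.has(p.author)
  );
}

const posts = [
  { id: "a1", author: "user1" },
  { id: "b2", author: "user2" },
  { id: "c3", author: "user3" },
];
const fresh = filterUnseen(posts, new Set(["a1"]), new Set(["user3"]));
console.log(fresh.map((p) => p.id)); // [ 'b2' ]
```

Skipping on author as well as post ID avoids commenting twice on the same person across different threads.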
Phase 1: Batch-classified all titles in a single API call (fast & cheap). Phase 2: Deep content analysis on passed posts, generating contextual comments while strictly avoiding duplicate approaches.
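Phase 1 works because titles are short enough to pack into one prompt. A sketch of that batching, with assumed prompt wording and a hypothetical Y/N reply format (the production prompt is not shown in this write-up):

```javascript
// Sketch of Phase 1: number every title, ask for one Y/N verdict per line
// in a single cheap GPT-4o-mini call, then map verdicts back to posts.
// Prompt wording and reply format are illustrative assumptions.
function buildBatchPrompt(titles) {
  return (
    "For each numbered Reddit title, answer Y if the author wants to " +
    "improve their English speaking, else N. One letter per line.\n" +
    titles.map((t, i) => `${i + 1}. ${t}`).join("\n")
  );
}

// Keep only the posts whose line in the model's reply is "Y".
function applyVerdicts(posts, replyText) {
  const verdicts = replyText.trim().split("\n").map((l) => l.trim());
  return posts.filter((_, i) => verdicts[i] === "Y");
}

const posts = [
  { title: "How do I stop making the same speaking mistakes?" },
  { title: "Best beginner vocabulary apps?" },
  { title: "Fluent but my grammar slips in meetings" },
];
const passed = applyVerdicts(posts, "Y\nN\nY");
console.log(passed.length); // 2
```

Only the posts that pass this cheap screen go on to the expensive Phase 2 full-content analysis, which is what keeps the per-day API cost near zero.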
Routed generated comments to Slack. Clicking ✅ triggered a webhook for automated Reddit posting, logging status (Posted/Rejected) directly to Google Sheets.
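The Slack review message can be built as a Block Kit payload with approve/reject buttons. The block structure below follows Slack's real Block Kit shape, but the `action_id` values and the post fields are illustrative assumptions; the webhook handler that does the actual Reddit posting and sheet logging is elided.

```javascript
// Sketch of the Slack review message (Block Kit shape; action_id values
// are illustrative). Clicking a button sends an interaction payload to our
// webhook, which posts the comment to Reddit (or not) and logs
// Posted/Rejected back to Google Sheets.
function buildReviewMessage(post, draftComment) {
  return {
    blocks: [
      {
        type: "section",
        text: { type: "mrkdwn", text: `*${post.title}*\n${draftComment}` },
      },
      {
        type: "actions",
        elements: [
          {
            type: "button",
            text: { type: "plain_text", text: "✅ Post" },
            style: "primary",
            action_id: "approve_comment",
            value: post.id,
          },
          {
            type: "button",
            text: { type: "plain_text", text: "❌ Reject" },
            style: "danger",
            action_id: "reject_comment",
            value: post.id,
          },
        ],
      },
    ],
  };
}

const msg = buildReviewMessage(
  { id: "a1", title: "How do I stop making the same mistakes?" },
  "Have you tried keeping a spoken diary? ..."
);
console.log(msg.blocks.length); // 2
```

Carrying the post ID in the button's `value` lets the webhook handler look up the right draft without any extra state.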
Validated at Zero Cost
Automated Reddit scraping — filtering down to 10-20 real target customers
The percentage of all scraped posts that mapped to highly qualified leads
Over 20% of hand-picked post authors signed up — at $0 ad spend.
Instead of chasing broad reach, I maximized filtering precision: of roughly 250 daily posts, 10-20 high-intent targets were pinpointed, and over 20% of those contacted signed up, proving niche demand with $0 ad spend.