
WhatDoc
Connect a GitHub repository and generate polished, AI-powered documentation in under a minute with AST-aware extraction, editable markdown, and instant deployment.
Timeline
2+ months
Role
Full Stack Engineer
Team
Solo
Status
Completed
Technology Stack
Key Challenges
- Balancing AI generation quality with token limits and free-tier cost constraints
- Making repo ingestion deterministic enough to avoid hallucinated documentation
- Handling domain setup, SSL, deployment, and production reliability on a student budget
- Designing documentation themes that feel premium instead of generic markdown exports
Key Learnings
- AST-first extraction beats blind prompt stuffing for trustworthy technical docs
- Infrastructure and delivery often take more time than the AI layer itself
- Strong storytelling and presentation can change how technical work is perceived
- Careful free-tier limits and BYOK flows can make AI products usable without burning cash
WhatDoc: docs that do not look boring
Overview
WhatDoc is an AI documentation platform that turns GitHub repositories into beautiful, shareable docs almost instantly. A user drops in a repo URL, and WhatDoc inspects the codebase, extracts the logic that actually matters, generates structured markdown, and publishes it in a polished template.
The product came from a simple frustration: developers build impressive systems, then explain them with a README that reads like unfinished raw footage.
As a video editor, I learned that great storytelling is what makes people care. I wanted to bring that same idea to developer documentation.
Why I built it
One README is rarely enough when:
- you are explaining architecture in interviews
- recruiters need to understand what you actually built
- teammates need to ramp up quickly
- your portfolio should show engineering depth, not just screenshots
- you revisit a project three months later and need your own memory back
I wanted documentation to feel like a product surface, not an afterthought.
What the product does
WhatDoc takes a GitHub repo and gives the user a clean pipeline:
- Connect GitHub and import a public or private repository
- Analyze the codebase with AST-aware extraction and structure detection
- Generate documentation with an LLM constrained by the extracted project map
- Edit and deploy instantly using a live markdown editor and template system
The result is documentation that looks closer to Twilio's or MDN's docs, or a premium SDK portal, than to a default markdown page.
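The four-step flow above can be sketched as a single orchestration function. Every name here is hypothetical, and the stubs exist only so the sketch runs end to end; none of this is WhatDoc's actual code:

```python
# Illustrative orchestration only; every function name here is hypothetical.
def generate_docs(repo_url: str) -> str:
    repo_path = import_repository(repo_url)      # 1. connect and import
    project_map = extract_structure(repo_path)   # 2. AST-aware analysis
    markdown = generate_markdown(project_map)    # 3. constrained LLM call
    return publish(markdown)                     # 4. template + deploy

# Stub implementations so the sketch runs end to end:
def import_repository(url: str) -> str:
    return url.rstrip("/").split("/")[-1]

def extract_structure(path: str) -> dict:
    return {"name": path, "kind": "app"}

def generate_markdown(project_map: dict) -> str:
    return f"# {project_map['name']}\n\nAuto-generated documentation.\n"

def publish(markdown: str) -> str:
    return markdown
```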
Engine architecture
The core pipeline is intentionally simple and deterministic:
1. Ephemeral repo ingestion
Repositories are shallow-cloned into temporary storage, parsed, and deleted immediately after generation. This keeps the pipeline lightweight and ensures no user code is retained once the docs are produced.
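A minimal sketch of that ingestion step, assuming `git` is available on the host. The function name and temp-dir prefix are illustrative, not WhatDoc's internals:

```python
import shutil
import subprocess
import tempfile
from contextlib import contextmanager

@contextmanager
def ephemeral_clone(repo_url: str, depth: int = 1):
    """Shallow-clone a repository into a temp dir and delete it afterwards."""
    workdir = tempfile.mkdtemp(prefix="whatdoc-")
    try:
        subprocess.run(
            ["git", "clone", "--depth", str(depth), repo_url, workdir],
            check=True,
            capture_output=True,
        )
        yield workdir
    finally:
        # Nothing from the user's repo outlives the generation run.
        shutil.rmtree(workdir, ignore_errors=True)
```

The context manager guarantees cleanup even if parsing or generation raises partway through.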
2. AST and paradigm detection
Instead of treating every repository like a blob of text, WhatDoc inspects the code structure and detects what kind of project it is — API, frontend app, SDK, CLI, and so on. That lets the generator adapt the document structure instead of producing one generic template for everything.
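As an illustration of that detection idea, a toy Python-only heuristic could vote on a paradigm from each file's top-level imports. The hint table and fallback below are invented for this sketch, not the production detector:

```python
import ast
from pathlib import Path

# Illustrative heuristics only; the real detector is more involved.
PARADIGM_HINTS = {
    "fastapi": "api", "flask": "api", "django": "api",
    "click": "cli", "argparse": "cli",
}

def detect_paradigm(repo_path: str) -> str:
    """Guess what kind of project this is from its imports."""
    votes: dict[str, int] = {}
    for py_file in Path(repo_path).rglob("*.py"):
        try:
            tree = ast.parse(py_file.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue  # skip files the parser cannot handle
        for node in ast.walk(tree):
            names = []
            if isinstance(node, ast.Import):
                names = [alias.name for alias in node.names]
            elif isinstance(node, ast.ImportFrom) and node.module:
                names = [node.module]
            for name in names:
                hint = PARADIGM_HINTS.get(name.split(".")[0])
                if hint:
                    votes[hint] = votes.get(hint, 0) + 1
    return max(votes, key=votes.get) if votes else "library"
```

Because the signal comes from parsed syntax rather than raw text, a mention of "flask" in a comment or README never skews the vote.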
3. Prompting with guard rails
The LLM is not asked to improvise from chaos. The system first extracts the project DNA, trims noise, and then feeds a structured view into the model. That constraint makes the output more reliable and dramatically reduces fake endpoints or invented architecture.
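The guard-rail idea can be sketched as serializing the extracted map into an explicit, size-bounded prompt. The rule text and character limit below are assumptions for illustration, not the production prompt:

```python
import json

# Assumed rule text; the real system prompt is more detailed.
SYSTEM_RULES = (
    "Document ONLY the modules, functions, and endpoints listed in the "
    "project map below. If something is not in the map, do not mention it."
)

def build_prompt(project_map: dict, max_chars: int = 12_000) -> str:
    """Serialize the extracted project map into a bounded, explicit prompt."""
    payload = json.dumps(project_map, indent=2)
    if len(payload) > max_chars:
        # Trim rather than let the model improvise from a truncated blob.
        payload = payload[:max_chars] + "\n... (map truncated)"
    return f"{SYSTEM_RULES}\n\nPROJECT MAP:\n{payload}"
```

Bounding the payload also keeps each generation inside predictable token costs, which matters on a free tier.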
4. Rendering and publishing
Generated markdown is stored, edited in a live interface, and rendered through a curated template system. The goal is not just to generate docs quickly, but to make them feel intentionally designed.
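A stripped-down stand-in for the template layer, using the stdlib's `string.Template`; the real themes and markup are far richer than this, and the theme names are invented:

```python
from string import Template

# Minimal stand-in for the template system; real themes are richer.
PAGE = Template(
    "<!doctype html><html><head><title>$title</title>"
    '<link rel="stylesheet" href="$theme.css"></head>'
    "<body><main>$body</main></body></html>"
)

def render_page(title: str, body_html: str, theme: str = "premium") -> str:
    """Wrap rendered markdown HTML in a themed page shell."""
    return PAGE.substitute(title=title, body=body_html, theme=theme)
```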
Features I focused on
- Smart code ingestion that filters noisy files before generation
- Template-driven presentation so different repos can match different aesthetics
- BYOK support for users who want unlimited generations with their own Gemini key
- GitHub OAuth flow for a smoother import experience
- Live editor for quick cleanup and customization after generation
- One-click deployment to a shareable hosted documentation page
- Custom 404 experience because broken URLs should still feel polished
The hard part was not the AI
Ironically, the AI layer was not the toughest piece.
The part that took more time was the production plumbing:
- getting domains and routing correct
- setting up SSL properly
- making deployment predictable
- keeping the experience smooth on free-tier infrastructure
- making the hosted result feel fast and stable enough to share publicly
That was a good reminder that shipping real products is usually less about one clever model call and more about stitching the whole system together reliably.
Real-world constraints at launch
I shipped this on a student budget, so the product design had to reflect reality.
- Free users were limited to two generations by default
- Power users could add their own Gemini API key for effectively unthrottled usage
- Subdomain customization was planned as a staged rollout rather than something I forced into day one
- Reliability safeguards mattered because the stack was running on constrained infrastructure
Those limitations shaped the product in a healthy way: every feature had to justify its cost.
Why this project matters to me
WhatDoc combines a few things I care about deeply:
- building developer tools that solve an actual pain point
- using design to make technical work more accessible
- shipping full-stack products, not just prototypes
- thinking about infrastructure, UX, and storytelling as one system
It is also a good snapshot of how I like to work as an engineer: build something useful, ship it live, and keep refining the experience until it feels intentional.
Outcome
The product is live at whatdoc.xyz, and it represents the kind of engineering work I want to keep doing: practical AI, strong UX, and end-to-end product ownership.
If a team values people who can both build and ship, this project is probably the clearest example of how I think.
