I Built Real Products With a Vibe Workflow #14
KhazP
started this conversation in
Show and tell
Important
My biggest takeaway from AI-assisted product development is simple: the bottleneck is rarely the model. It is the quality of the context you give it.
A case study on how I used a structured AI-assisted workflow to ship products across web apps, mobile apps, browser extensions, and client-facing production sites.
The idea
I do not think the most important variable in AI-assisted software development is the model.
I think it is the context stack.¹
What repeatedly failed for me was not "AI can't code."
It was "AI cannot reliably build a product it barely understands."
Once I stopped treating prompting as a one-shot act and started treating it as a pipeline that produces durable project artifacts, the quality of my output changed dramatically.
That pipeline became:
1. I started by running this manually with the vibe-coding-prompt-template.
2. Then I turned it into a product with vibeworkflow.app.
3. Then I used the same system to ship real products.²
Note
The main shift was moving from "generate code for me" to "generate the artifacts that make good code much more likely."
The workflow
```mermaid
flowchart TD
  A[Idea] --> B[Deep Research]
  B --> C[PRD]
  C --> D[Technical Design]
  D --> E[AGENTS.md + Tool Rules]
  E --> F[Build Plan / Export Kit]
  F --> G[Cursor / Claude Code / Windsurf / Copilot]
  G --> H[Shipped Product]
  H --> I[Feedback]
  I --> B
```
The artifact stack
How I use each stage in practice
1. Deep research
Before writing code, I use AI to analyze the market, technical feasibility, competitors, tooling risks, and likely execution traps.
2. PRD
Then I convert the idea into a scoped product definition with priorities, user stories, constraints, and success criteria.
3. Technical design
Next comes architecture: state, data, APIs, file structure, deployment, performance, and dependencies.
4. Agent instructions
I generate a universal AGENTS.md plus tool-specific rules so the coding agent understands the product, the codebase, and the implementation style.
5. Build and export
Finally, I create a reusable kit: docs, rules, kickoff prompts, and a structured folder that can actually seed implementation.
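Concretely, the kit is a seed folder an agent can start from. The layout below is illustrative (names are mine, not the exact output of vibeworkflow.app):

```text
project-kit/
├── docs/
│   ├── research.md          # deep-research findings
│   ├── prd.md               # scoped product definition
│   └── technical-design.md  # architecture and constraints
├── AGENTS.md                # universal agent instructions
├── .cursor/rules/           # tool-specific rules
└── prompts/
    └── kickoff.md           # first implementation prompt
```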
Why this beats raw prompting
Raw prompting feels fast at the start.
It often becomes expensive later.
What used to go wrong
What changed with the workflow
The workflow fixed this by making the important decisions explicit before build time.
In practice, that means the AI is no longer guessing:
Tip
The real win is not just cleaner code. It is lower ambiguity.
Case studies
Here is the short version:
1. Vibe Workflow — productizing the method itself
This was the productized version of the workflow.
What it does
It turns the original vibe-coding prompt template into an interactive UI pipeline.
Core ideas
Why it matters
This project turned the methodology from "something I personally do" into "something other builders can use."
Notable design decisions
Real insight
This was the moment I realized the workflow itself was becoming a product.
2. Money Visualizer — planning for hidden complexity
Money Visualizer is one of the clearest examples of why planning matters.
Product thesis
Help people see what money looks like in physical space instead of only understanding it numerically.
Surface simplicity
The interface feels simple:
Under-the-hood complexity
Why the workflow helped
This product needed early clarity around:
Without a planning-first workflow, this kind of app easily turns into an overbuilt demo.
With the workflow, it became a product with a strong experience model.
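To make the "surface simplicity, hidden complexity" point concrete, here is a minimal sketch of the kind of conversion such an app rests on. The function names and constants are mine, not Money Visualizer's actual code; the bill figures are the commonly cited dimensions of a US $100 bill.

```typescript
// Rough physical-space conversion for a dollar amount, assuming
// US $100 bills (~0.10922 mm thick, ~1 g each). Illustrative only —
// not Money Visualizer's real implementation.
const BILL_THICKNESS_M = 0.00010922;
const BILL_WEIGHT_KG = 0.001;

function asHundreds(amount: number): number {
  // Round up: a partial bill still occupies a slot in the stack.
  return Math.ceil(amount / 100);
}

function stackHeightMeters(amount: number): number {
  return asHundreds(amount) * BILL_THICKNESS_M;
}

function stackWeightKg(amount: number): number {
  return asHundreds(amount) * BILL_WEIGHT_KG;
}
```

Even this toy version surfaces real product decisions (rounding, units, which bill to assume) that are cheap to settle in a technical design doc and expensive to renegotiate mid-build.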
3. Çağla Cabaoğlu Gallery — applying the workflow to a design-sensitive client site
A contemporary art gallery website requires something very different from a visualization app.
Product requirements
Stack
Why this project mattered
It proved the workflow could scale beyond startup-style utility products and into a client-facing, design-sensitive production site.
What the workflow clarified early
This project reinforced a core truth:
4. RealDex — where weak planning gets punished
RealDex is a wildlife collection app: think "Pokédex for real animals."
Core loop
Technical ingredients
Why it was interesting
Cross-platform mobile is where weak planning gets punished.
If your product loop is fuzzy, onboarding suffers.
If your ML behavior is unclear, trust suffers.
If your offline rules are messy, retention suffers.
What the workflow improved
RealDex showed me that AI-assisted workflows are not just for web apps. They are especially valuable when complexity spans UX, device features, storage, ML, and monetization.
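Pinning down offline rules before build time can be as simple as writing the sync model out as code. This is my sketch of an offline-first capture queue, not RealDex's implementation:

```typescript
// Offline-first capture queue: sightings are recorded locally first and
// flushed when connectivity returns. Hypothetical sketch, not RealDex code.
type Sighting = { id: string; species: string; capturedAt: number };
type QueueItem = { sighting: Sighting; status: "pending" | "synced" };

class CaptureQueue {
  private items: QueueItem[] = [];

  record(sighting: Sighting): void {
    this.items.push({ sighting, status: "pending" });
  }

  // In a real app `upload` would be async; kept synchronous for clarity.
  // Items that fail stay "pending", so offline behavior is predictable.
  flush(upload: (s: Sighting) => boolean): number {
    let synced = 0;
    for (const item of this.items) {
      if (item.status === "pending" && upload(item.sighting)) {
        item.status = "synced";
        synced++;
      }
    }
    return synced;
  }

  pendingCount(): number {
    return this.items.filter((i) => i.status === "pending").length;
  }
}
```

Writing even this much down forces the ambiguous questions ("what happens to a failed upload?") to be answered in the plan rather than in a bug report.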
5. Money Visualizer Extension — small surface, real complexity
Once the main web app existed, a browser companion became a natural next step.
Constraints that mattered
Why the workflow was useful
Extensions are deceptively small projects.
They can become messy quickly if responsibilities are not clearly separated.
The same planning process helped define:
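One concrete way to make that separation explicit is a typed message contract between the content script and the background worker. The message names here are hypothetical, not the extension's real protocol:

```typescript
// Typed contract between content script and background worker.
// Keeping the router a pure function makes the responsibility split
// testable without browser APIs. Names are illustrative only.
type Msg =
  | { kind: "PRICE_FOUND"; amount: number; currency: string }
  | { kind: "SETTINGS_CHANGED"; enabled: boolean };

type State = { enabled: boolean; lastAmount: number | null };

function handleMessage(state: State, msg: Msg): State {
  switch (msg.kind) {
    case "PRICE_FOUND":
      // Ignore detected prices while the extension is toggled off.
      return msg.amount >= 0 && state.enabled
        ? { ...state, lastAmount: msg.amount }
        : state;
    case "SETTINGS_CHANGED":
      return { ...state, enabled: msg.enabled };
  }
}
```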
6. AspectRatioViewer — focused tools can still win
AspectRatioViewer was a sharper utility product and an important personal milestone because I eventually sold it.
What it did
Why it mattered
This project reminded me that not all good products need to be broad.
Some of the best ideas are highly specific and immediately useful.
It also reinforced something practical:
AI-assisted development is not only about building faster.
It is also about increasing the number of credible product experiments you can run.
That matters if you care about validation, monetization, or acquisition outcomes.
7. Smaller tools — proof that the workflow generalizes
I also built smaller, targeted products in the same ecosystem.
Localization Comparison Tool
A desktop-style utility for comparing localization files, surfacing quality issues, and improving translation workflows.
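The core comparison such a tool needs is small: which keys exist in the source locale but are missing, or apparently untranslated, in the target. A simplified sketch (real locale files are usually nested JSON):

```typescript
// Compare a source locale against a target locale.
// Flags keys that are absent, and keys whose value is identical to the
// source (a common signal of an untranslated string). Illustrative sketch.
type Locale = Record<string, string>;

function compareLocales(source: Locale, target: Locale) {
  const missing = Object.keys(source).filter((k) => !(k in target));
  const untranslated = Object.keys(source).filter(
    (k) => k in target && target[k] === source[k]
  );
  return { missing, untranslated };
}
```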
Reddit to AI
A browser workflow that takes noisy Reddit threads and converts them into clean, structured context for AI tools.
GetirFiltre
A targeted browser extension for filtering and improving discovery in a delivery marketplace workflow.
Why these mattered
These projects demonstrated that the workflow scales across product categories:
Patterns that kept repeating
1. AI configs became first-class artifacts
Every serious project benefited from explicit AI-facing documentation.
Examples:
- AGENTS.md
- CLAUDE.md
- .cursor/rules

These are not sidecars anymore.
They are part of how the project thinks.
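To make "AI-facing documentation" less abstract, here is the shape such a file tends to take. This excerpt is illustrative, not any project's actual AGENTS.md:

```markdown
# AGENTS.md (illustrative excerpt)

## What this product is
A browser extension that converts on-page prices into physical-space
comparisons. See docs/prd.md for scope and priorities.

## Fixed decisions — do not revisit
- TypeScript strict mode; no new runtime dependencies without approval.
- All user-facing strings go through the i18n layer.

## Style
- Prefer small pure functions; side effects live at the edges.
```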
2. Documentation stopped being "later work"
Research docs, PRDs, technical designs, and build plans stopped being optional and started becoming the engine of implementation.
3. Fast deployment changed product behavior
Rapid iteration loops made it realistic to refine products aggressively instead of treating deployment like a rare event.
4. i18n and compliance worked better when planned early
Localization, privacy, analytics, consent, and store/compliance concerns all become cheaper when they are designed into the plan.
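"Planned early" for i18n mostly means deciding fallback rules before any strings exist. A typical chain is `tr-TR` → `tr` → a default locale; the function below is my sketch of that rule, not any project's real resolver:

```typescript
// Resolve a requested locale against the locales a product actually ships,
// falling back from region-specific tag to base language to a default.
// Illustrative sketch of a decision worth fixing at planning time.
function resolveLocale(
  requested: string,
  available: string[],
  fallback = "en"
): string {
  if (available.includes(requested)) return requested;
  const base = requested.split("-")[0]; // "tr-TR" -> "tr"
  if (available.includes(base)) return base;
  return fallback;
}
```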
5. The workflow improved not just code quality, but product quality
This was probably the biggest surprise.
The strongest benefit was often not "cleaner code."
It was "clearer product thinking."
Lessons learned
Context is leverage
The better the context stack, the better the AI output.
The repo needs a memory
If the AI cannot understand the repo's rules and purpose, it becomes inconsistent fast.
Build kits are better than chat logs
A reusable project folder beats a brilliant but disposable conversation.
Taste has to be operationalized
If you want elegant UX, maintainable structure, or polished interaction, you need to describe those things concretely.
AI is strongest inside a system
The best results came when AI was part of a repeatable workflow, not when it was asked to freestyle.
Shipping teaches the workflow what to become
The workflow improved because real projects exposed its weak spots.
That feedback loop is what made it durable.
CTA
If you want to explore the system that made these projects possible:
- vibeworkflow.app
- vibe-coding-prompt-template
- vibe-coding-template-webapp

My main takeaway after building this ecosystem is simple:
That is when "vibe coding" stops feeling like luck and starts feeling like a workflow.
Footnotes
By "context stack," I mean the durable set of artifacts that tells an AI what the product is, who it is for, how it should be built, and which decisions are already fixed. ↩
This started as a manual prompt template, then became a productized workflow, and then became the operating system behind multiple shipped projects. ↩