How My AI Workflow Evolved from Prompts to Workflow

This post is a summary and discussion of the original article "How My AI Workflow Evolved from Prompts to Workflow" by Hoang Nguyen, published on codeaholicguy.com on April 11, 2026. All credit for the original ideas and experiences belongs to the author.


From Random Prompts to a Real Workflow

Over the last six months, Hoang Nguyen has been evolving his AI coding workflow while building AI DevKit. The most important shift was not better code generation; it was moving from reusable commands and templates to a workflow that can carry context, trigger the right behavior, and verify work automatically.

A Concrete Example That Made the Difference

A recent feature made this evolution obvious. He used Codex to build interactive skill selection for the ai-devkit skill add command. With just one sentence of instruction, the workflow carried the task through:

  • Requirements gathering
  • Design
  • Planning
  • Implementation
  • Verification
  • Tests
  • Code review

The entire session took under an hour, with the actual feature flow taking about 30 minutes. What made this remarkable was not just that AI wrote code; it was that the workflow left behind requirements docs, design artifacts, tests derived from requirements, and verification against the spec, rather than just a code diff.

Key Observations from the Workflow

Several things stood out in practice:

  • Memory pulled back an old CLI rule that the author had forgotten he stored — the system remembered so he didn't have to.
  • Review phases could loop backward instead of blindly moving forward, catching problems before they compounded.
  • Verification caught drift between implementation and design — ensuring the final code actually matched the spec.
  • Human judgment remained essential — he still made the product decisions and fixed the last failing test himself.
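The staged flow described above, including a review gate that can loop backward rather than always advancing, can be sketched in a few lines. This is a hypothetical illustration, not the AI DevKit implementation: the stage names mirror the article's list, but `Task`, `run_stage`, and `run_workflow` are invented for this sketch, and real stages would call a model rather than produce placeholder artifacts.

```python
# Hypothetical sketch (not AI DevKit's actual code): a staged workflow
# where any stage can request a loop back to an earlier stage instead
# of blindly moving forward.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Task:
    description: str
    artifacts: dict = field(default_factory=dict)  # context carried between stages

# Stage names taken from the article's list.
STAGES = ["requirements", "design", "plan", "implement", "verify", "test", "review"]

def run_stage(name: str, task: Task) -> Optional[str]:
    """Run one stage; return the name of an earlier stage to loop back
    to, or None to proceed. A real stage would invoke a model here."""
    task.artifacts[name] = f"{name} output for: {task.description}"
    # Example check: verification needs a design artifact to compare against.
    if name == "verify" and "design" not in task.artifacts:
        return "design"
    return None

def run_workflow(task: Task, max_loops: int = 3) -> Task:
    i, loops = 0, 0
    while i < len(STAGES):
        target = run_stage(STAGES[i], task)
        if target is not None and loops < max_loops:
            i = STAGES.index(target)  # loop backward to fix drift
            loops += 1
        else:
            i += 1
    return task

done = run_workflow(Task("interactive skill selection"))
print(sorted(done.artifacts))  # every stage leaves an artifact behind, not just a diff
```

The key design point is that the loop index can move backward: review and verification are gates, not formalities, which is what lets the system catch drift before it compounds.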

The Bigger Question: Prompts vs. Workflow Layer

The author poses a thought-provoking question to the community: Are you mostly optimizing prompts, or are you now trying to optimize the workflow layer around the model?

This distinction matters. Optimizing prompts is a local improvement — you get better output from a single interaction. Optimizing the workflow layer is a systemic improvement — you build a system that orchestrates AI across multiple steps, preserves context, enforces structure, and verifies results.

This connects directly to the "Engineer A vs. Engineer B" framing Hoang explored in his earlier post on agentic engineering: Engineer A uses AI faster; Engineer B uses AI at scale by coordinating multiple agents and workflows. The exponential effect doesn't come from typing speed — it comes from accumulated workflow leverage.

Why This Matters for Every Developer

As AI tools evolve rapidly — Cursor, Claude Code, Codex — the interface changes but the underlying insight remains: don't couple your discipline to a single tool. Build your workflow so it works across interfaces. A workflow that only functions inside one UI is fragile.

The shift from "AI as a prompt target" to "AI as a workflow participant" is one of the most important mental model changes an engineer can make right now.


Original article: "How My AI Workflow Evolved from Prompts to Workflow" by Hoang Nguyen, codeaholicguy.com, April 11, 2026

Also discussed on: Hacker News
