The Human-AI Workflow
You're the pilot. Here's how the collaboration actually works.
The Metaphor: Your Airplane / My Airplane
In aviation, pilots explicitly transfer control:
- "Your airplane" — you take the controls
- "My airplane" — I'm flying now
AI-assisted development works the same way:
Human ("My airplane")
- Setting direction and goals
- Architectural decisions
- Validating proposals
- Course corrections
- Final approval
AI ("Your airplane")
- Code generation
- Pattern application
- Documentation
- Test creation
- Bulk operations
You're always pilot-in-command. But workload shifts based on the task.
A Real Session
Here's an actual exchange from building simple_htmx:
Result: 34 classes, 40 tests, 4 hours.
See It In Action
Watch real development sessions:
The Reference Documentation System
AI works best with context. We maintain living documentation:
- CLAUDE_CONTEXT.md — Eiffel fundamentals, session startup
- gotchas.md — Known pitfalls and solutions
- patterns.md — Verified working code patterns
- Project ROADMAPs — Project-specific context
When AI reads these first, accuracy goes from ~60% to ~95%.
The documentation captures institutional knowledge. AI gets smarter with every session. Mistakes don't repeat.
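One way to make "AI reads these first" mechanical is a small session-startup step that gathers the reference docs into a single context preamble. This Python sketch is illustrative only: the filenames come from the list above, but the function name and the concatenation approach are assumptions, not the project's actual tooling.

```python
from pathlib import Path

# Reference docs in the order the AI should read them,
# per the documentation system described above.
REFERENCE_DOCS = ["CLAUDE_CONTEXT.md", "gotchas.md", "patterns.md"]

def build_session_context(doc_dir: str) -> str:
    """Concatenate reference docs into one preamble, skipping
    any file that is missing, and preserving read order."""
    parts = []
    for name in REFERENCE_DOCS:
        path = Path(doc_dir) / name
        if path.is_file():
            parts.append(f"## {name}\n{path.read_text()}")
    return "\n\n".join(parts)
```

Feeding the result in at the start of every session is what keeps mistakes from repeating: the gotchas and verified patterns travel with the context instead of living only in someone's head.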
The Verification Layer
AI writes code. But who checks AI?
Most AI errors are caught at Layers 1-2, by the compiler and by contract checks. Before human review. Before production.
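The workflow leans on Eiffel's built-in Design by Contract for that early catch. As a language-neutral illustration, here is a minimal Python analogue: a hypothetical `contract` decorator (not part of the project) that enforces a precondition and a postcondition the moment AI-generated code runs.

```python
import functools

def contract(pre=None, post=None):
    """Minimal Design-by-Contract analogue. The real project uses
    Eiffel's native require/ensure clauses; this decorator only
    sketches the idea."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition violated: {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition violated: {fn.__name__}"
            return result
        return inner
    return wrap

@contract(pre=lambda items: len(items) > 0,   # caller must pass a non-empty list
          post=lambda r: r >= 0)              # an average of lengths is never negative
def average_length(items):
    # If AI-generated code divided by the wrong count or returned
    # garbage, the contract would fail here, before any human review.
    return sum(len(s) for s in items) / len(items)
```

Calling `average_length([])` fails the precondition immediately; a buggy body that produced a negative result would fail the postcondition. That is the verification layer doing its job mechanically, long before a human reads the diff.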
What Makes It Work
- Clear Direction — AI needs to know what you want
- Reference Context — AI needs to know your patterns
- Rapid Feedback — Compiler/contracts catch errors fast
- Iteration — Build incrementally, verify continuously
- Human Judgment — You're always the decision-maker
Productivity Results
This isn't hype. This is measured output.