Example Use Case

Reducing Token Waste in Documentation Workflows

How a software consultancy structured documentation prompts and cut costs by 63%.

Note: This is an illustrative example based on common LLM workflows, not real customer data.
Agency: Software consultancy (internal AI development tools)

The problem

A consulting agency used LLMs to generate internal architecture documentation from source code. Prompts were written ad hoc by individual engineers and produced long, verbose responses that required follow-up prompts to refine.

The extra round trips increased API usage and slowed documentation generation.

The solution

The team adopted promptctl to generate structured prompts automatically:

  • Deterministic prompt structuring engine
  • Enforced clear sections, constraints, and output formats
  • Cost estimation across models before sending requests
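The approach can be sketched as follows. promptctl's actual API is not shown in this case study, so every function name, section label, and per-token rate below is an illustrative assumption, not the tool's real interface:

```python
# Illustrative sketch only: a fixed-template prompt builder plus a pre-send
# cost estimate. All names and rates are hypothetical placeholders.

def build_prompt(context: str, task: str, constraints: list[str],
                 output_format: str) -> str:
    """Assemble a prompt with fixed sections so responses stay scoped and terse."""
    constraint_lines = "\n".join(f"- {c}" for c in constraints)
    return (
        f"## Context\n{context}\n\n"
        f"## Task\n{task}\n\n"
        f"## Constraints\n{constraint_lines}\n\n"
        f"## Output format\n{output_format}\n"
    )

# Hypothetical USD rates per 1K tokens, used to compare models before sending.
RATES = {"model-small": 0.0005, "model-large": 0.01}

def estimate_cost(prompt: str, expected_output_tokens: int, model: str) -> float:
    """Rough pre-send cost estimate; ~4 characters per token is a crude heuristic."""
    prompt_tokens = len(prompt) // 4
    return (prompt_tokens + expected_output_tokens) * RATES[model] / 1000

prompt = build_prompt(
    context="Module: payments service",
    task="Summarize the public API surface.",
    constraints=["Max 200 words", "No code excerpts"],
    output_format="Markdown bullet list",
)
print(f"{estimate_cost(prompt, expected_output_tokens=300, model='model-small'):.6f}")
```

Fixing the section order is what makes the structuring deterministic: two engineers documenting different modules produce prompts that differ only in their filled-in content, so output length and follow-up rates become predictable.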

The result

  • 63% cost reduction
  • $950 in monthly savings
  • 2.0x faster documentation generation

"Structured prompts reduced the number of follow-up calls dramatically."

— Lead engineer, software consultancy