# Docs Automation
The docs live in this repository, so pages can include source examples directly and generated content can be refreshed from the codebase.
## Current Flow

```shell
uv run scripts/docs.py generate
uv run scripts/docs.py build
uv run scripts/docs.py screenshot
uv run scripts/docs.py assess
```

- `generate` refreshes `_generated/` from the Python source tree.
- `build` runs a strict Zensical build.
- `screenshot` captures the built local site and the live site for visual comparison with `google-chrome`.
- `assess` runs deterministic screenshot checks for capture dimensions, blank or unstyled pages, the blue home-page header, and visible dark terminal areas.
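The implementation of `scripts/docs.py` is not reproduced here, but the four steps above suggest a simple subcommand layout. A minimal sketch with `argparse` (the real script's structure and options may differ):

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical subcommand layout for scripts/docs.py."""
    parser = argparse.ArgumentParser(prog="docs.py", description="Docs automation entry point")
    sub = parser.add_subparsers(dest="command", required=True)
    for name in ("generate", "build", "screenshot", "assess"):
        # Each step becomes its own subcommand with no extra options
        sub.add_parser(name, help=f"run the {name} step")
    return parser
```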
## Terminal Captures

Use `scripts/docs_terminal_capture.py` to run a command and write a terminal-style SVG that can be
embedded in docs:

```shell
uv run scripts/docs_terminal_capture.py \
  --command "uvx fast-agent-mcp@latest --help" \
  --output docs/docs/ref/terminal-uvx.svg
```

The script uses a pseudo-terminal through `script(1)` when available, then renders the captured ANSI
output with Rich. This makes CLI examples reproducible while still looking close to what users see
in a terminal.
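The capture side of that approach can be sketched in a few lines. This is a simplified illustration, not the code in `scripts/docs_terminal_capture.py`, and it assumes the util-linux `script(1)`:

```python
import shutil
import subprocess


def capture_output(command: str) -> str:
    """Run *command* and return its output as text.

    Prefers script(1) so the child process sees a pseudo-terminal and keeps
    its ANSI colour codes; falls back to a plain pipe otherwise.
    """
    if shutil.which("script"):
        # -q: suppress the start/done banner, -c: run this command,
        # /dev/null: discard the typescript file (we read stdout instead)
        result = subprocess.run(
            ["script", "-q", "-c", command, "/dev/null"],
            capture_output=True, text=True,
        )
    else:
        # Without script(1) the child sees a pipe, so colours are usually off
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return result.stdout
```

The captured string can then be fed to a renderer such as Rich; the pseudo-terminal is what convinces the child process to emit colour at all.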
Example output: see the generated `docs/docs/ref/terminal-uvx.svg` after running the command above.
## Visual Assessment

`scripts/docs_visual_assess.py` adapts the screenshot QA pattern used by the visual inspection tools
in /home/ssmith/temp/html-agent-dev: deterministic checks run first, and an optional vision judge
can inspect the same screenshots with a docs-specific rubric.
```shell
# Run deterministic checks only
uv run scripts/docs.py assess

# Write the vision prompt/card without calling a model
uv run scripts/docs_visual_assess.py --dry-run

# Run the vision judge when credentials are available
uv run scripts/docs_visual_assess.py --vision --model gpt-5.5
```
The deterministic path is intended for routine local and CI use. The vision path adds checks that pixel metrics cannot reliably catch: literal Markdown artifacts, overlapping labels, awkward mobile wrapping, weak feature copy, and whether CLI examples read like real terminal output.
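As one illustration of the deterministic style of check (a sketch, not the actual code in `scripts/docs_visual_assess.py`), a capture-dimension check can read width and height straight from a PNG's IHDR chunk without any imaging library:

```python
import struct

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"


def png_dimensions(data: bytes) -> tuple[int, int]:
    """Return (width, height) from the IHDR chunk of a PNG byte string."""
    if data[:8] != PNG_SIGNATURE:
        raise ValueError("not a PNG file")
    # Bytes 8-15 hold the IHDR chunk length and type; width and height
    # follow immediately as two big-endian 32-bit integers.
    width, height = struct.unpack(">II", data[16:24])
    return width, height
```

A check like this fails fast and identically on every run, which is exactly what makes the deterministic path suitable for CI.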
## Source-Backed Includes

`pymdownx.snippets` is configured with both `docs/docs` and the repository root. Documentation
pages can include examples directly from `examples/`:
"""
Parallel Workflow showing Fan Out and Fan In agents, using different models
"""
import asyncio
from pathlib import Path
from fast_agent import FastAgent
from fast_agent.core.prompt import Prompt
# Create the application
fast = FastAgent(
"Parallel Workflow",
)
@fast.agent(
name="proofreader",
instruction=""""Review the short story for grammar, spelling, and punctuation errors.
Identify any awkward phrasing or structural issues that could improve clarity.
Provide detailed feedback on corrections.""",
)
@fast.agent(
name="fact_checker",
instruction="""Verify the factual consistency within the story. Identify any contradictions,
logical inconsistencies, or inaccuracies in the plot, character actions, or setting.
Highlight potential issues with reasoning or coherence.""",
)
@fast.agent(
name="style_enforcer",
instruction="""Analyze the story for adherence to style guidelines.
Evaluate the narrative flow, clarity of expression, and tone. Suggest improvements to
enhance storytelling, readability, and engagement.""",
model="sonnet",
)
@fast.agent(
name="grader",
instruction="""Compile the feedback from the Proofreader, Fact Checker, and Style Enforcer
into a structured report. Summarize key issues and categorize them by type.
Provide actionable recommendations for improving the story,
and give an overall grade based on the feedback.""",
)
@fast.parallel(
fan_out=["proofreader", "fact_checker", "style_enforcer"],
fan_in="grader",
name="parallel",
)
async def main() -> None:
async with fast.run() as agent:
await agent.parallel.send(
Prompt.user("Student short story submission", Path("short_story.txt"))
)
if __name__ == "__main__":
asyncio.run(main())
Prefer direct includes for examples that are meant to stay runnable. This keeps docs and examples on one source of truth and makes drift visible in ordinary code review.
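In a docs page, a `pymdownx.snippets` include wraps the directive in an ordinary code fence (the path below is illustrative; substitute whichever example file the page documents):

````markdown
```python
--8<-- "examples/workflows/parallel.py"
```
````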
## Provider Overviews

Provider prose can live next to provider implementation code:

- `src/fast_agent/llm/provider/anthropic/provider_docs.md`
- `src/fast_agent/llm/provider/openai/provider_docs.md`
`docs/generate_reference_docs.py` copies those files into `_generated/provider_overview_*.md`.
The public provider page includes the generated snippets, so feature prose can be reviewed beside
the implementation it describes.
## Proposed Next Automations

- Add a CI docs job that runs `uv run scripts/docs.py generate`, fails if generated files changed, then runs `uv run scripts/docs.py build` and `uv run scripts/docs.py assess`.
- Add a snippet verifier that scans docs for `--8<--` includes and confirms every referenced file exists under an allowed root.
- Add example smoke tests for docs-included examples so pages cannot point at broken sample code.
- Store baseline screenshots for the home page and provider docs page, then compare new Chrome screenshots in CI with a small pixel-difference threshold.