> **Warning**
> Ossature is currently in its 0.x series and should be considered unstable. APIs, spec formats, CLI flags, and internal behavior may change significantly between releases without prior deprecation. Pin your version and check the changelog before upgrading.
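One way to pin, if you track dependencies in a `requirements.txt` (the version number below is purely illustrative, not a real release):

```
# requirements.txt — pin an exact version so upgrades are deliberate
# (0.4.2 is a placeholder; check PyPI for actual releases)
ossature==0.4.2
```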
An open-source harness for spec-driven code generation.
You write a specification, optionally lay out the architecture, and Ossature breaks it down into a build plan that gets executed step by step, with an LLM doing the code generation under tight constraints. The specs are your source of truth: you review the plan before anything gets built, and when a step breaks you fix that step and keep going instead of starting over.
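The resume-from-a-failed-step behavior can be sketched as a loop over plan steps that stops at the first failure and can be re-entered with the work already done. This is a conceptual sketch only, not Ossature's actual internals; `StepResult` and `run_plan` are hypothetical names:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class StepResult:
    ok: bool
    step: str

def run_plan(steps: list[str],
             generate: Callable[[str], StepResult],
             completed: Optional[list[StepResult]] = None):
    """Run the remaining plan steps; stop at the first failure so it
    can be fixed and the plan resumed, rather than rebuilt from scratch."""
    completed = completed if completed is not None else []
    for step in steps[len(completed):]:       # skip steps already done
        result = generate(step)               # stands in for LLM-backed codegen
        if not result.ok:
            return completed, step            # report the failing step
        completed.append(result)
    return completed, None                    # whole plan succeeded
```

After fixing the failing step (here, editing the plan), you call `run_plan` again with the previously completed results and it picks up where it left off.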
Works with Anthropic, OpenAI, Mistral, Google, and most other hosted providers, as well as local models through Ollama.
Ossature (pronounced OSS-uh-cher) means the underlying framework or skeleton of a structure.
Requires Python 3.14+.
```shell
pip install ossature
```

Or run it directly with uvx:
```shell
uvx ossature --version
```

Set your LLM provider API key:
```shell
export ANTHROPIC_API_KEY="sk-ant-..."
# or OPENAI_API_KEY, MISTRAL_API_KEY, etc.
```

Create and build a project:
```shell
ossature init myproject && cd myproject
ossature new my-feature
# edit specs/my-feature.smd
ossature validate
ossature audit
ossature build
```

The default model is `anthropic:claude-sonnet-4-6`. To use a different model, set the `model` field in `ossature.toml`:
```toml
[llm]
model = "openai:gpt-5.2" # or mistral:devstral-latest, etc.
```

The API key you export must match the provider in your model string (e.g., `OPENAI_API_KEY` for `openai:…`). See the configuration docs for per-role overrides and all available options.
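The provider-to-key correspondence follows directly from the model string: the prefix before the colon names the provider, and the key is exported as `<PROVIDER>_API_KEY`. A minimal helper to preflight this locally (`required_key_env` and `key_is_set` are illustrative helpers, not part of Ossature's API, and the uppercase-prefix naming is an assumption based on the examples above):

```python
import os

def required_key_env(model: str) -> str:
    """Map a 'provider:model' string to the env var its key lives in,
    e.g. 'openai:gpt-5.2' -> 'OPENAI_API_KEY'."""
    provider = model.split(":", 1)[0]
    return f"{provider.upper()}_API_KEY"

def key_is_set(model: str) -> bool:
    """True if the matching API key is exported and non-empty."""
    return bool(os.environ.get(required_key_env(model)))
```

Running `key_is_set("openai:gpt-5.2")` before a long build is a cheap way to catch a missing or mismatched key early.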
See ossature-examples for complete projects with specs, build plans, and generated code.
Full docs at docs.ossature.dev. The workflow guide walks through a complete project from init to generated code.
MIT
