# Installation

## Requirements

- Rust 1.75+ (MSRV — Minimum Supported Rust Version)
- Tokio runtime (TraitClaw is async-first)
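TraitClaw will refuse to build on an older toolchain anyway, but it is cheap to check up front. A small sketch (assumes `rustc` is on `PATH`; the `meets_msrv` helper is ours, not part of TraitClaw):

```sh
# True when the installed version ($1) is at least the MSRV ($2); sort -V
# orders version strings, so the MSRV must sort first (or equal) to pass.
meets_msrv() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if meets_msrv "$(rustc --version | awk '{print $2}')" "1.75.0"; then
  echo "toolchain OK for TraitClaw"
else
  echo "rustc too old: TraitClaw needs 1.75+" >&2
fi
```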
## Quick Install

```sh
cargo add traitclaw traitclaw-openai-compat tokio anyhow
```

This gives you the core framework with the OpenAI-compatible provider (works with OpenAI, Ollama, Groq, Mistral, and vLLM).
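The `cargo add` line above edits `Cargo.toml` for you; the result looks roughly like this (version numbers are illustrative, since `cargo add` pins whatever is current):

```toml
[dependencies]
traitclaw = "1.0"
traitclaw-openai-compat = "1.0"
tokio = "1"
anyhow = "1"
```

One Tokio-specific note: `#[tokio::main]` requires Tokio's `macros` and `rt-multi-thread` features, so if your build complains, re-run as `cargo add tokio --features macros,rt-multi-thread`.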
## Feature Flags

The `traitclaw` meta-crate re-exports everything through feature flags:

```toml
[dependencies]

# Minimal — just core + macros + OpenAI-compatible provider
traitclaw = "1.0"

# With specific features
traitclaw = { version = "1.0", features = ["steering", "sqlite", "mcp"] }

# Everything
traitclaw = { version = "1.0", features = ["full"] }
```

### Available Features
| Feature | Crate | Default | Description |
|---|---|---|---|
| `openai-compat` | `traitclaw-openai-compat` | ✅ | OpenAI, Ollama, Groq, Mistral, vLLM |
| `macros` | `traitclaw-macros` | ✅ | `#[derive(Tool)]` proc macro |
| `steering` | `traitclaw-steering` | ❌ | Guards, Hints, Trackers |
| `sqlite` | `traitclaw-memory-sqlite` | ❌ | SQLite persistent memory |
| `mcp` | `traitclaw-mcp` | ❌ | Model Context Protocol client |
| `rag` | `traitclaw-rag` | ❌ | RAG pipeline (BM25, embeddings) |
| `team` | `traitclaw-team` | ❌ | Multi-agent orchestration |
| `eval` | `traitclaw-eval` | ❌ | Evaluation & benchmarking |
| `strategies` | `traitclaw-strategies` | ❌ | ReAct, CoT, MCTS reasoning |
| `full` | All of the above | ❌ | Everything |
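If a default feature gets in the way (say you ship your own provider and don't want `openai-compat` compiled in), standard Cargo `default-features` handling applies. A sketch, using the feature names from the table above:

```toml
[dependencies]
# Opt out of the defaults, then re-enable only what you need.
traitclaw = { version = "1.0", default-features = false, features = ["macros"] }
```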
## Provider Setup

### OpenAI

```sh
export OPENAI_API_KEY="sk-..."
```

```rust
use traitclaw_openai_compat::OpenAiCompatProvider;

let provider = OpenAiCompatProvider::openai("gpt-4o-mini", std::env::var("OPENAI_API_KEY")?);
```

### Anthropic
```sh
cargo add traitclaw-anthropic
export ANTHROPIC_API_KEY="sk-ant-..."
```

```rust
use traitclaw_anthropic::AnthropicProvider;

let provider = AnthropicProvider::new("claude-sonnet-4-20250514", std::env::var("ANTHROPIC_API_KEY")?);
```

### Ollama (Local)
No API key needed — just run Ollama locally:

```sh
ollama pull llama3.2
```

```rust
use traitclaw_openai_compat::OpenAiCompatProvider;

let provider = OpenAiCompatProvider::ollama("llama3.2");
```

### Groq / Mistral / vLLM
```rust
// Groq
let provider = OpenAiCompatProvider::groq("llama-3.3-70b-versatile", groq_key);

// Mistral
let provider = OpenAiCompatProvider::new("https://api.mistral.ai/v1", "mistral-large", key);

// Any OpenAI-compatible endpoint
let provider = OpenAiCompatProvider::new("http://localhost:8000/v1", "my-model", "no-key");
```

## Next Steps
Quick Start → Build your first agent in 5 minutes.