Overview
This project uses two model tiers:

- `PRIMARY_MODEL` for heavyweight generation paths
- `SECONDARY_MODEL` for lightweight or high-volume paths

Both are set in `provider:model` format (for example `anthropic:claude-opus-4-5` or `openai:gpt-5-mini`).
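The `provider:model` format splits on the first colon. A minimal sketch of such a parser, assuming a fixed provider list — the function name `parseModelString`, the `KNOWN_PROVIDERS` set, and the return shape are all illustrative, not the actual code in `src/lib/model-selection.ts`:

```typescript
// Illustrative parser for the provider:model format; the real
// implementation lives in src/lib/model-selection.ts.
type ParsedModel = { provider: string; model: string };

const KNOWN_PROVIDERS = new Set(["anthropic", "openai"]); // assumed provider list

function parseModelString(value: string): ParsedModel | null {
  const idx = value.indexOf(":");
  // Reject strings with no colon, an empty provider, or an empty model id.
  if (idx <= 0 || idx === value.length - 1) return null;
  const provider = value.slice(0, idx);
  const model = value.slice(idx + 1);
  if (!KNOWN_PROVIDERS.has(provider)) return null; // unsupported provider
  return { provider, model };
}
```

Splitting on the first colon (rather than all colons) keeps model ids that themselves contain colons intact.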
Where each tier is used
PRIMARY_MODEL
| Area | Code path | Notes |
|---|---|---|
| Agent main chat loop | src/agents/agent.ts | Used when no agent-specific llm_model override is set |
| Module overview docs | src/services/doc-generator.ts | generateModuleDocs |
| Business process docs | src/services/doc-generator.ts | generateBusinessProcessDocs |
| Clean Core analysis narratives | src/services/clean-core-analyzer.ts | Persists model_used |
| Conversion readiness narratives | src/services/conversion-readiness-analyzer.ts | Persists model_used |
SECONDARY_MODEL
| Area | Code path | Notes |
|---|---|---|
| Agent title generation | src/agents/agent.ts | First-turn title synthesis |
| Technical docs generation | src/services/doc-generator.ts | generateTechnicalDocs |
| Business process inference | src/services/business-process-inferrer.ts | Inference pass used by enrichment |
Defaults
Defaults are defined in `src/lib/model-selection.ts`:

- `PRIMARY_MODEL` default: `anthropic:claude-opus-4-5`
- `SECONDARY_MODEL` default: `anthropic:claude-haiku-4-5`
What happens with incorrect values
1) Invalid model string format or unsupported provider
If `PRIMARY_MODEL`/`SECONDARY_MODEL` is set to an invalid string, selection code falls back to the default tier model instead of crashing.
Examples:
- `PRIMARY_MODEL=not-a-model` -> falls back to `anthropic:claude-opus-4-5`
- `SECONDARY_MODEL=foo:bar` -> falls back to `anthropic:claude-haiku-4-5`
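The fallback behavior can be sketched as follows. This is a hedged approximation, not the actual code in `src/lib/model-selection.ts`; the `TIER_DEFAULTS` map, the `KNOWN_PROVIDERS` set, and the function names are assumptions for illustration:

```typescript
// Illustrative tier selection with fallback to defaults on invalid input.
const TIER_DEFAULTS: Record<string, string> = {
  PRIMARY_MODEL: "anthropic:claude-opus-4-5",
  SECONDARY_MODEL: "anthropic:claude-haiku-4-5",
};

const KNOWN_PROVIDERS = new Set(["anthropic", "openai"]); // assumed provider list

function isValidModelString(value: string): boolean {
  const idx = value.indexOf(":");
  return idx > 0 && idx < value.length - 1 && KNOWN_PROVIDERS.has(value.slice(0, idx));
}

function resolveTierModel(
  tier: "PRIMARY_MODEL" | "SECONDARY_MODEL",
  env: Record<string, string | undefined> = process.env,
): string {
  const raw = env[tier];
  if (raw && isValidModelString(raw)) return raw;
  // Unset, malformed, or unsupported-provider values fall back instead of throwing.
  return TIER_DEFAULTS[tier];
}
```

Falling back rather than throwing keeps a misconfigured environment variable from taking down every generation path at startup.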
2) Valid format, but provider rejects model id
If the value is parseable (for example `openai:gpt-5-2`) but the provider does not recognize it, the runtime fails at provider call time (for example, with model-not-found errors).
3) Invalid per-agent override sent to API
Agent-level `llmModel` override values are validated in routes. Invalid values are rejected with HTTP 400.
Agent override behavior
For the main chat loop:

- If `agents.llm_model` is set, it overrides `PRIMARY_MODEL`
- If `agents.llm_model` is `null`, runtime uses `PRIMARY_MODEL`

`llm_provider` is treated as derived metadata and is not used as a standalone selector.
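The override rule above can be sketched as a small resolution helper. The column names follow this doc; the `resolveChatModel` function and `AgentRow` interface are illustrative, not the actual code in `src/agents/agent.ts`:

```typescript
// Illustrative chat-loop model resolution from an agent row.
const PRIMARY_DEFAULT = "anthropic:claude-opus-4-5"; // default from src/lib/model-selection.ts

interface AgentRow {
  llm_model: string | null;     // per-agent override, if any
  llm_provider?: string | null; // derived metadata only; never used for selection
}

function resolveChatModel(
  agent: AgentRow,
  env: Record<string, string | undefined> = process.env,
): string {
  // agents.llm_model wins when set; otherwise fall through to PRIMARY_MODEL.
  if (agent.llm_model) return agent.llm_model;
  return env.PRIMARY_MODEL ?? PRIMARY_DEFAULT;
}
```

Note that `llm_provider` is deliberately ignored here, matching the rule that it is derived metadata rather than a standalone selector.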
Hardcoded model values (not tier-controlled)
These values are currently hardcoded and not controlled by `PRIMARY_MODEL`/`SECONDARY_MODEL`:

- Embeddings: `text-embedding-3-small` in `src/services/embedding-service.ts` and `src/data/embed-knowledge.ts`
The previous behavior of inferring `openai:gpt-5-mini` from provider-only rows was removed: if `llm_model` is `null`, no OpenAI model is inferred.