Configure LibreChat as your AI-powered SEA™ interface.
```
┌────────────────────┐      ┌────────────────────┐
│   LibreChat UI     │─────▶│      SEA™ API      │
│  http://localhost  │      │ /v1/chat/complete  │
│  :3080             │      │ /v1/models         │
└────────────────────┘      └─────────┬──────────┘
                                      │
                            ┌─────────▼──────────┐
                            │   Agent Society    │
                            │   Domain Forge     │
                            │   Knowledge Graph  │
                            └────────────────────┘
```
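Before pointing LibreChat at the API, it is worth confirming the endpoints in the diagram actually respond. A minimal sketch using Python's `requests`, assuming the SEA™ API is OpenAI-compatible (so `/v1/models` returns a `data` list) and is reachable from your machine; the fallback URL `http://localhost:8080` is an assumption, not part of the deployment:

```python
# Smoke test: confirm the SEA API answers on /v1/models before wiring up LibreChat.
# http://localhost:8080 is an assumed host-side URL; adjust to your deployment.
import os

import requests

base_url = os.environ.get("SEA_API_URL", "http://localhost:8080")
headers = {"Authorization": f"Bearer {os.environ['SEA_API_KEY']}"}

resp = requests.get(f"{base_url}/v1/models", headers=headers, timeout=10)
resp.raise_for_status()

# OpenAI-compatible APIs return {"data": [{"id": "..."}]}; print the model ids.
print([model["id"] for model in resp.json().get("data", [])])
```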
The configuration file is located at `deploy/librechat/librechat.yaml`:
```yaml
version: 1.2.0
cache: true

endpoints:
  custom:
    - name: "SEA™"
      apiKey: "${SEA_API_KEY}"
      baseURL: "${SEA_API_URL:-http://sea-api:8080}/v1"
      models:
        default:
          - "sea-default"      # General purpose
          - "sea-analysis"     # Deep semantic analysis
          - "sea-governance"   # Policy-aware reasoning
        fetch: true
      titleConvo: true
      titleMethod: "completion"
      summarize: true
      forcePrompt: true
      modelDisplayLabel: "SEA™ Agent"
      features:
        streaming: false  # Enable when implemented
      moderation:
        enabled: true
        provider: "SEA™"
        endpoint: "${SEA_API_URL}/api/v1/moderation"

interface:
  endpointsMenu: true
  modelSelect: true
  parameters: true
  sidePanel: true
  presets: true
```
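If LibreChat does not pick up the endpoint, a quick way to rule out YAML mistakes is to parse the file and check the SEA™ entry directly. A rough sketch, assuming PyYAML is installed; the key names simply mirror the config above:

```python
# Sanity-check librechat.yaml: parse it and confirm the SEA endpoint is defined.
import yaml

with open("deploy/librechat/librechat.yaml") as f:
    config = yaml.safe_load(f)

custom_endpoints = config.get("endpoints", {}).get("custom", [])
sea = next((e for e in custom_endpoints if "SEA" in e.get("name", "")), None)

assert sea is not None, "no SEA endpoint under endpoints.custom"
assert sea.get("baseURL", "").endswith("/v1"), "baseURL should end with /v1"
assert "sea-default" in sea.get("models", {}).get("default", []), "sea-default missing"

print("SEA endpoint looks good:", sea["baseURL"])
```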
**sea-default**: General-purpose assistant.

Example prompt:

```
Explain the Softmax Router pattern in the Cognitive Architecture
```

**sea-analysis**: Deep semantic analysis.

Example prompt:

```
Analyze the case-management bounded context for DDD violations
```

**sea-governance**: Policy-aware reasoning.

Example prompt:

```
Can an R-AA agent approve its own recommendations? Cite the relevant policy.
```
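These models can also be exercised directly over the chat completions route that LibreChat calls behind the scenes. A hedged sketch, assuming the standard OpenAI-style `/v1/chat/completions` path (the diagram's `/v1/chat/complete` looks truncated; adjust if your SEA™ deployment uses a different path) and a non-streaming response, matching `streaming: false` above:

```python
# Ask sea-governance the example question above via a direct API call.
import os

import requests

base_url = os.environ.get("SEA_API_URL", "http://localhost:8080")
headers = {
    "Authorization": f"Bearer {os.environ['SEA_API_KEY']}",
    "Content-Type": "application/json",
}
payload = {
    "model": "sea-governance",
    "messages": [{
        "role": "user",
        "content": "Can an R-AA agent approve its own recommendations? "
                   "Cite the relevant policy.",
    }],
    "stream": False,  # streaming is disabled in librechat.yaml
}

resp = requests.post(f"{base_url}/v1/chat/completions",
                     headers=headers, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```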
**Spec validation**:

```
Validate the spec file: docs/specs/shared/sds/048-platform-ui-integration.md
Check for:
1. Missing cross-references
2. Incomplete CQRS tables
3. Policy coverage gaps
```
**Type generation**:

```
Generate TypeScript types for the case-management domain.
Use the SDS-012 specification as the source of truth.
Output format: Zod schemas with domain invariants.
```
**Knowledge Graph query**:

```
Find all entities in the governance bounded context that have
HITL requirements. Use SPARQL to query the Knowledge Graph.
```
Create saved prompts for common workflows:
```json
{
  "name": "Spec Review",
  "system": "You are a SEA™ spec reviewer. Check for completeness, cross-references, and policy coverage.",
  "model": "sea-analysis",
  "temperature": 0.3
}
```
```json
{
  "name": "Semantic Debt",
  "system": "Analyze the codebase for semantic debt. Reference SDS-016 for debt classification.",
  "model": "sea-analysis",
  "temperature": 0.2
}
```
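Presets only shape what LibreChat sends, so the same request can be reproduced outside the UI, for example in a review script. A sketch that mirrors the "Spec Review" preset above, under the same endpoint assumptions as the earlier examples:

```python
# Reproduce the "Spec Review" preset as a direct chat completion call.
import os

import requests

preset = {
    "system": "You are a SEA™ spec reviewer. Check for completeness, "
              "cross-references, and policy coverage.",
    "model": "sea-analysis",
    "temperature": 0.3,
}

payload = {
    "model": preset["model"],
    "temperature": preset["temperature"],
    "messages": [
        {"role": "system", "content": preset["system"]},
        {"role": "user",
         "content": "Review docs/specs/shared/sds/048-platform-ui-integration.md"},
    ],
}

base_url = os.environ.get("SEA_API_URL", "http://localhost:8080")
headers = {"Authorization": f"Bearer {os.environ['SEA_API_KEY']}"}
resp = requests.post(f"{base_url}/v1/chat/completions",
                     headers=headers, json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```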
Add the LibreChat service to your Docker Compose stack:

```yaml
services:
  librechat:
    image: librechat/librechat:latest
    ports:
      - "3080:3080"
    volumes:
      - ./deploy/librechat/librechat.yaml:/app/librechat.yaml
    environment:
      - SEA_API_KEY=${SEA_API_KEY}
      - SEA_API_URL=http://sea-api:8080
    depends_on:
      - sea-api
```
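After `docker compose up`, LibreChat can take a short while to start accepting connections. A small wait-until-ready sketch that polls the published UI port; the 60-second ceiling and 2-second interval are arbitrary choices:

```python
# Poll the LibreChat UI until it responds, or give up after ~60 seconds.
import time

import requests

URL = "http://localhost:3080"
deadline = time.time() + 60

while time.time() < deadline:
    try:
        if requests.get(URL, timeout=2).status_code < 500:
            print("LibreChat is up at", URL)
            break
    except requests.exceptions.ConnectionError:
        pass
    time.sleep(2)
else:
    raise SystemExit("LibreChat did not become ready in time")
```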
| Issue | Cause | Solution |
|---|---|---|
| Models not loading | API unreachable | Check `docker-compose logs sea-api` |
| "Unauthorized" error | Missing API key | Set `SEA_API_KEY` in the environment |
| Slow responses | LLM cold start | Wait, or pre-warm with a simple query |
| No governance context | Missing regime | Configure the regime in the request context |
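For the cold-start row above, a single throwaway completion is usually enough to warm the model before users arrive. A sketch under the same endpoint assumptions as the earlier examples; `max_tokens: 1` just keeps the warm-up call cheap:

```python
# Pre-warm sea-default with a trivial one-token completion.
import os

import requests

base_url = os.environ.get("SEA_API_URL", "http://localhost:8080")
headers = {"Authorization": f"Bearer {os.environ['SEA_API_KEY']}"}
payload = {
    "model": "sea-default",
    "messages": [{"role": "user", "content": "ping"}],
    "max_tokens": 1,
}

requests.post(f"{base_url}/v1/chat/completions",
              headers=headers, json=payload, timeout=300).raise_for_status()
print("sea-default warmed up")
```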
Last Updated: January 2026