Agent Orchestration

ContextRouter uses LangGraph for agent orchestration. Each agent is a state machine that processes requests through a series of nodes with conditional routing.
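
A minimal sketch of that pattern in LangGraph (the node names, state fields, and routing condition below are illustrative, not ContextRouter's actual graph definitions):

from typing import TypedDict
from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    query: str
    route: str
    answer: str

def classify(state: AgentState) -> AgentState:
    # Decide which downstream node should handle the request
    return {**state, "route": "retrieve" if "?" in state["query"] else "respond"}

def retrieve(state: AgentState) -> AgentState:
    return {**state, "answer": "retrieved context"}

def respond(state: AgentState) -> AgentState:
    return {**state, "answer": state.get("answer", "") + " -> final reply"}

graph = StateGraph(AgentState)
graph.add_node("classify", classify)
graph.add_node("retrieve", retrieve)
graph.add_node("respond", respond)
graph.set_entry_point("classify")
# Conditional routing: the next node depends on the state produced by "classify"
graph.add_conditional_edges("classify", lambda s: s["route"], {"retrieve": "retrieve", "respond": "respond"})
graph.add_edge("retrieve", "respond")
graph.add_edge("respond", END)
app = graph.compile()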

Model Registry

All LLM usage goes through the central registry with automatic fallback:

from contextrouter.modules.models import model_registry

model = model_registry.get_llm_with_fallback(
    key="openai/gpt-5-mini",
    fallback_keys=["anthropic/claude-sonnet-4", "vertex/gemini-2.5-flash"],
    strategy="fallback",
    config=config,
)
response = await model.generate(request)
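
The fallback strategy tries the primary key first and only falls through to the next entry in fallback_keys when a call fails. A rough behavioral sketch of that loop (not the registry's actual implementation):

async def generate_with_fallback(models, request):
    # models: LLM clients in priority order (primary first, then fallbacks)
    last_error = None
    for model in models:
        try:
            return await model.generate(request)
        except Exception as exc:  # e.g. provider outage, rate limit, timeout
            last_error = exc
    raise last_error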

Supported Providers

| Provider | Models | Backend |
|---|---|---|
| OpenAI | GPT-5, GPT-5-mini, o1, o3 | openai.py |
| Anthropic | Claude Sonnet 4, Haiku | anthropic.py |
| Vertex AI | Gemini 2.5 Flash/Pro | vertex.py |
| Groq | Llama (ultra-fast inference) | groq.py |
| Perplexity | Sonar (web-grounded) | perplexity.py |
| RLM | Recursive Language Model | rlm.py |
| HuggingFace | HF Inference / Hub | huggingface.py |
| OpenRouter | Multi-provider gateway | openrouter.py |
| LiteLLM | Universal adapter | litellm.py |
| Local | Ollama, vLLM (OpenAI-compat) | local_openai.py |
| RunPod | Serverless inference | runpod.py |
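
Model keys follow the provider/model pattern shown in the snippet above, so switching backends is a matter of changing the key. For example (the Groq model name here is a placeholder; check the registry for the exact keys available):

model = model_registry.get_llm_with_fallback(
    key="groq/llama-3.3-70b-versatile",         # served by groq.py
    fallback_keys=["vertex/gemini-2.5-flash"],  # served by vertex.py
    strategy="fallback",
    config=config,
)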

Agent Graphs

Agent graphs are defined in cortex/graphs/ as LangGraph StateGraph instances:

(Diagram: router-agents)

Built-in Graphs

| Graph | Purpose | Location |
|---|---|---|
| Dispatcher | Central request routing | graphs/dispatcher.py |
| RAG Retrieval | Knowledge retrieval pipeline | graphs/rag_retrieval/ |
| Gardener | Product taxonomy classification | graphs/commerce/gardener/ |
| Matcher | Product linking (RLM-based) | graphs/commerce/matcher/ |
| News Engine | Multi-stage news pipeline | graphs/news_engine/ |
| Analytics | Data analytics pipeline | graphs/analytics/ |
| Self-healing | Auto-recovery logic | graphs/self_healing/ |
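
Each entry compiles to a runnable LangGraph app. A hypothetical invocation of the dispatcher (the import path, exported name, and state keys are assumptions; see graphs/dispatcher.py for the real ones):

# Hypothetical import; the actual module path and graph name may differ.
from contextrouter.cortex.graphs.dispatcher import dispatcher_graph

# Async run of the compiled StateGraph with an illustrative input state
result = await dispatcher_graph.ainvoke({"query": "route this request"})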

Plugin Architecture

ContextRouter supports a plugin system for extending functionality:

from contextrouter.core.plugins import PluginManager
# Plugins register connectors, transformers, providers
plugin_manager = PluginManager()
plugin_manager.discover_plugins()

Plugins can register:

  • Connectors — new data sources
  • Transformers — new processing stages (NER, classification, etc.)
  • Providers — new storage backends
  • Tools — new LLM function calling tools
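
As a rough illustration, a plugin that contributes a custom connector might look like the sketch below; the base class, register hook, and registry methods shown are assumptions, not the actual contextrouter.core.plugins interface:

# Hypothetical plugin skeleton; the real base class, hook names, and
# registration methods may differ from contextrouter.core.plugins.
class WarehouseConnector:
    """Illustrative connector for a custom data source."""
    def fetch(self, query: str) -> list[dict]:
        return [{"source": "warehouse", "query": query}]

class WarehousePlugin:
    name = "warehouse"

    def register(self, registry):
        # A plugin contributes connectors, transformers, providers, or tools.
        registry.add_connector("warehouse", WarehouseConnector)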