reaatech/llm-router
These packages provide an intelligent routing engine that directs LLM requests across multiple providers based on cost, latency, and model capability. They help you manage complex LLM integrations by automating fallback chains, enforcing budget limits, and providing observability into model performance. The system is built around a central `LLMRouter` class that composes pluggable strategies, telemetry, and circuit breakers into a single, configuration-driven pipeline.
Packages
7 packages
@reaatech/llm-router-cli
Provides a command-line interface for testing, benchmarking, and managing LLM routing configurations. It allows you to execute prompts, compare model performance, generate cost reports, and validate configuration files against the `llm-router` schema.
- status: published (1 day ago)
@reaatech/llm-router-core
Provides the shared TypeScript interfaces and Zod schemas for defining LLM routing configurations, budget tracking, and circuit breaker states. It serves as the foundational type library for the `@reaatech/llm-router` ecosystem, requiring `zod` as a runtime dependency for schema validation.
- status: published (1 day ago)
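As a rough illustration of the kind of configuration types the core package describes, here is a minimal sketch. All type names and fields below are assumptions for illustration; the real package defines its shapes as Zod schemas, which a plain `validateConfig` function stands in for here.

```typescript
// Hypothetical shapes illustrating a routing configuration. The real
// package validates these with Zod; a plain function stands in here.
type CircuitState = "closed" | "open" | "half-open";

interface ModelConfig {
  id: string;                 // e.g. "gpt-4o"
  provider: string;           // e.g. "openai"
  costPerInputToken: number;  // USD per input token
  costPerOutputToken: number; // USD per output token
}

interface BudgetConfig {
  dailyLimitUsd: number;
  alertThreshold: number; // fraction of the limit, e.g. 0.8
}

interface RouterConfig {
  models: ModelConfig[];
  budget: BudgetConfig;
  fallbackOrder: string[]; // model ids, highest priority first
}

// Stand-in for a schema parse: reject obviously invalid configs.
function validateConfig(config: RouterConfig): RouterConfig {
  if (config.models.length === 0) throw new Error("at least one model required");
  if (config.budget.dailyLimitUsd <= 0) throw new Error("daily limit must be positive");
  return config;
}

const config = validateConfig({
  models: [{ id: "model-a", provider: "demo", costPerInputToken: 1e-6, costPerOutputToken: 2e-6 }],
  budget: { dailyLimitUsd: 10, alertThreshold: 0.8 },
  fallbackOrder: ["model-a"],
});
console.log(config.models[0].id); // "model-a"
```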
@reaatech/llm-router-engine
Provides an `LLMRouter` class that orchestrates model selection, fallback chains, cost tracking, and A/B testing for LLM requests. It requires a user-provided `executeModel` callback to interface with specific provider SDKs.
- status: published (1 day ago)
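The callback-driven pattern described above can be sketched as follows. The class and method names are illustrative assumptions, not the engine package's actual API: the router walks a priority list and delegates every provider call to a user-supplied `executeModel` function.

```typescript
// Illustrative sketch: the router owns model selection, the caller owns
// provider I/O via an executeModel callback. Names are assumptions.
interface ModelResult {
  text: string;
  costUsd: number;
}

type ExecuteModel = (modelId: string, prompt: string) => Promise<ModelResult>;

class LLMRouter {
  constructor(
    private models: string[],          // candidate model ids, in priority order
    private executeModel: ExecuteModel // provider-specific execution callback
  ) {}

  // Try each model in order until one succeeds.
  async route(prompt: string): Promise<ModelResult> {
    let lastError: unknown;
    for (const id of this.models) {
      try {
        return await this.executeModel(id, prompt);
      } catch (err) {
        lastError = err; // fall through to the next model
      }
    }
    throw lastError ?? new Error("no models configured");
  }
}

// Usage with a stubbed provider callback:
const router = new LLMRouter(["primary", "backup"], async (id, prompt) => {
  if (id === "primary") throw new Error("rate limited");
  return { text: `(${id}) echo: ${prompt}`, costUsd: 0.0001 };
});

router.route("hello").then((r) => console.log(r.text)); // "(backup) echo: hello"
```

Keeping provider SDK calls behind the callback means the router stays dependency-free while still controlling selection and fallback order.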
@reaatech/llm-router-fallback
Implements resilience patterns for LLM API calls, providing a `FallbackChain` class that manages ordered model failover, circuit breaking, and exponential backoff retries. It requires a model registry and an execution function to coordinate request attempts across prioritized model lists.
- status: published (1 day ago)
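Ordered failover with exponential backoff, as described above, might look like this minimal sketch. The constructor signature and class shape are assumptions, not the package's real API.

```typescript
// Sketch of ordered model failover with exponential backoff retries.
type Exec = (modelId: string) => Promise<string>;

class FallbackChain {
  constructor(
    private models: string[], // prioritized model ids
    private exec: Exec,
    private maxRetries = 2,
    private baseDelayMs = 100
  ) {}

  async run(): Promise<string> {
    for (const id of this.models) {
      for (let attempt = 0; attempt <= this.maxRetries; attempt++) {
        try {
          return await this.exec(id);
        } catch {
          // exponential backoff: base, 2x base, 4x base, ...
          const delay = this.baseDelayMs * 2 ** attempt;
          await new Promise((resolve) => setTimeout(resolve, delay));
        }
      }
      // retries exhausted for this model; fail over to the next one
    }
    throw new Error("all models in the chain failed");
  }
}

// Usage: the first model always fails, so the chain falls over.
const chain = new FallbackChain(
  ["flaky-model", "stable-model"],
  async (id) => {
    if (id === "flaky-model") throw new Error("unavailable");
    return `answered by ${id}`;
  },
  1, // maxRetries per model
  5  // baseDelayMs, kept tiny for the example
);
chain.run().then(console.log); // "answered by stable-model"
```

A production circuit breaker would additionally skip models whose recent failure rate tripped it open; that state machine is omitted here for brevity.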
@reaatech/llm-router-mcp
Exposes an LLM router as a Model Context Protocol (MCP) server, with tools to route requests, query model metadata, and retrieve cost reports. A factory function creates the server instance, which requires an implementation of a `RouterInterface` to handle the underlying routing logic.
- status: published (1 day ago)
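The `RouterInterface` contract this package requires might look roughly like the sketch below; the exact method names and shapes are assumptions. An MCP server would wrap an implementation like this and expose its methods as MCP tools (the MCP server wiring itself is omitted).

```typescript
// Assumed shape of the routing contract the MCP server delegates to.
interface RouterInterface {
  route(prompt: string): Promise<{ model: string; text: string }>;
  listModels(): string[];
  costReport(): { totalUsd: number };
}

// A stub implementation suitable for local testing:
class StubRouter implements RouterInterface {
  private spent = 0;

  async route(prompt: string) {
    this.spent += 0.001; // pretend each request costs a tenth of a cent
    return { model: "stub-model", text: `echo: ${prompt}` };
  }

  listModels() {
    return ["stub-model"];
  }

  costReport() {
    return { totalUsd: this.spent };
  }
}

const routerImpl: RouterInterface = new StubRouter();
routerImpl.route("ping").then((r) => console.log(r.text)); // "echo: ping"
```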
@reaatech/llm-router-strategies
Provides a collection of routing strategies and an orchestrator class to select the optimal LLM for a request based on cost, latency, capability, or judgment. It exposes a `StrategyOrchestrator` and several strategy classes that implement a common interface for evaluating and selecting models from a provided pool.
- status: published (1 day ago)
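The common strategy interface described above can be sketched as follows; all shapes and names here are illustrative assumptions. Each strategy selects from a model pool, and the orchestrator simply delegates to whichever strategy it was configured with.

```typescript
// Sketch of pluggable routing strategies behind a shared interface.
interface ModelInfo {
  id: string;
  costPer1kTokens: number;
  avgLatencyMs: number;
}

interface RoutingStrategy {
  select(pool: ModelInfo[]): ModelInfo;
}

class CostOptimizedStrategy implements RoutingStrategy {
  select(pool: ModelInfo[]): ModelInfo {
    return pool.reduce((best, m) => (m.costPer1kTokens < best.costPer1kTokens ? m : best));
  }
}

class LatencyOptimizedStrategy implements RoutingStrategy {
  select(pool: ModelInfo[]): ModelInfo {
    return pool.reduce((best, m) => (m.avgLatencyMs < best.avgLatencyMs ? m : best));
  }
}

class StrategyOrchestrator {
  constructor(private strategy: RoutingStrategy) {}

  pick(pool: ModelInfo[]): ModelInfo {
    if (pool.length === 0) throw new Error("empty model pool");
    return this.strategy.select(pool);
  }
}

const pool: ModelInfo[] = [
  { id: "fast-but-pricey", costPer1kTokens: 0.03, avgLatencyMs: 300 },
  { id: "cheap-but-slow", costPer1kTokens: 0.002, avgLatencyMs: 1200 },
];

const byCost = new StrategyOrchestrator(new CostOptimizedStrategy()).pick(pool);
console.log(byCost.id); // "cheap-but-slow"
```

Because every strategy implements the same interface, swapping selection behavior is a one-line configuration change rather than a rewrite of the routing path.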
@reaatech/llm-router-telemetry
Tracks LLM request costs, enforces budget limits, and exports OpenTelemetry-compatible metrics. It provides a set of classes including `CostTracker`, `BudgetManager`, and `CostReporter` designed to integrate with the `llm-router` ecosystem.
- status: published (1 day ago)
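A minimal sketch of cost tracking with budget enforcement, the pattern this package describes. The class and method names below are assumptions, and the OpenTelemetry export side is omitted.

```typescript
// Sketch: a tracker accumulates spend, a budget manager gates requests.
class CostTracker {
  private totalUsd = 0;

  record(costUsd: number): void {
    this.totalUsd += costUsd;
  }

  total(): number {
    return this.totalUsd;
  }
}

class BudgetManager {
  constructor(private tracker: CostTracker, private limitUsd: number) {}

  // Returns false once spend has reached the configured limit.
  allowRequest(): boolean {
    return this.tracker.total() < this.limitUsd;
  }
}

const tracker = new CostTracker();
const budget = new BudgetManager(tracker, 0.01);

tracker.record(0.004);
console.log(budget.allowRequest()); // true: $0.004 of $0.01 spent
tracker.record(0.007);
console.log(budget.allowRequest()); // false: limit exceeded
```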
