DSPy vs LiteLLM
Side-by-side comparison with live GitHub signals. Last updated May 16, 2026.
| Metric | DSPy | LiteLLM |
|---|---|---|
| GitHub Stars | 34.5K | 47.2K |
| Contributors | 393 | 385 |
| Last Commit | May 15, 2026 | May 16, 2026 |
| Open Issues | 474 | 3091 |
| License | MIT | MIT |
| Pricing | Free (open-source) | Free (open-source) |
| Free Tier | Yes | Yes |
| Category | frameworks | frameworks |
| Trending | No | No |
Shared Tags
python, open-source
Only in DSPy
prompt-optimization, programming, research, rag, stanford
Only in LiteLLM
api-gateway, multi-model, proxy, load-balancing
About DSPy
DSPy replaces prompt engineering with programming. Instead of writing prompts, you define modules with input/output signatures, and DSPy automatically optimizes the prompts and weights for your pipeline. It supports chain-of-thought, retrieval-augmented generation, and multi-hop reasoning patterns. From Stanford NLP.
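To make the signature idea concrete, here is a toy, stdlib-only sketch of how a declarative signature like `"question -> answer"` can be parsed into named fields and rendered as a prompt. This is an illustration of the concept only, not DSPy's implementation; in DSPy itself you would pass such a string to a module like `dspy.ChainOfThought` and let the framework build and optimize the prompt.

```python
# Toy illustration of a declarative signature: field names, not prompt text,
# drive the rendered prompt. This is NOT DSPy's actual code.

def parse_signature(sig: str):
    """Split 'in1, in2 -> out1, out2' into input and output field names."""
    inputs, outputs = sig.split("->")
    return (
        [f.strip() for f in inputs.split(",")],
        [f.strip() for f in outputs.split(",")],
    )

def render_prompt(sig: str, **values) -> str:
    """Render a basic prompt from the signature and the given input values."""
    inputs, outputs = parse_signature(sig)
    lines = [f"{name.capitalize()}: {values[name]}" for name in inputs]
    lines.append(f"{outputs[0].capitalize()}:")  # the model completes this field
    return "\n".join(lines)

print(render_prompt("question -> answer", question="What is DSPy?"))
# Question: What is DSPy?
# Answer:
```

The point of the pattern is that the prompt text becomes an artifact the framework can rewrite and optimize, rather than something you hand-maintain.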
About LiteLLM
LiteLLM provides a unified OpenAI-compatible API for 200+ LLM providers (OpenAI, Anthropic, Google, Azure, AWS Bedrock, Ollama, and more). Use one interface to call any model, with built-in load balancing, fallbacks, spend tracking, and rate limiting. Essential infrastructure for multi-model agent systems.
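The fallback behavior described above can be sketched in a few lines of plain Python: try each configured model in order and return the first success. The provider callables and model names below are stand-ins; LiteLLM itself handles this (plus load balancing, spend tracking, and rate limiting) behind its OpenAI-compatible `completion()` interface.

```python
# Toy sketch of provider fallback, NOT LiteLLM's implementation.

def complete_with_fallbacks(prompt, providers):
    """providers: list of (name, callable) pairs, tried in order until one succeeds."""
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:  # a real router would inspect error types and retry
            errors[name] = exc
    raise RuntimeError(f"all providers failed: {errors}")

# Stand-in provider functions for demonstration:
def flaky(prompt):
    raise TimeoutError("primary provider timed out")

def stable(prompt):
    return f"echo: {prompt}"

print(complete_with_fallbacks("hello", [("gpt-4o", flaky), ("claude", stable)]))
# ('claude', 'echo: hello')
```

Because every provider is exposed behind one call shape, swapping or reordering models is a configuration change rather than a code change, which is the core appeal for multi-model agent systems.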