Quasar
10B parameter foundation model with 2M token context using Gated Linear Attention
Open Source · Free
About
Open-source foundation model framework built on Qwen3.5-9B that reaches a 2 million token context window through a Gated Linear Attention (GLA) architecture. Unlike standard softmax attention, whose cost grows quadratically with sequence length, GLA's linear attention processes extremely long contexts efficiently. Ideal for researchers and developers working with large documents, codebases, or multi-turn conversations that require extensive context retention.
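To illustrate why GLA scales linearly, here is a minimal NumPy sketch of the gated linear attention recurrence: a fixed-size key-value state is decayed by a data-dependent gate and updated once per token, so cost grows with sequence length T rather than T². This is an illustrative sketch of the general GLA idea, not Quasar's actual implementation; the function name and shapes are assumptions.

```python
import numpy as np

def gated_linear_attention(q, k, v, alpha):
    """Recurrent form of (unnormalized) gated linear attention, O(T) in length.

    q, k:  (T, d_k) queries and keys
    v:     (T, d_v) values
    alpha: (T, d_k) per-dimension forget gates in (0, 1)
    """
    T, d_k = q.shape
    d_v = v.shape[1]
    S = np.zeros((d_k, d_v))      # running key-value state (fixed size)
    out = np.empty((T, d_v))
    for t in range(T):
        # decay the state per key dimension, then add the new outer product
        S = alpha[t][:, None] * S + np.outer(k[t], v[t])
        out[t] = q[t] @ S         # read out with the current query
    return out
```

With all gates set to 1 this reduces to plain linear attention (a running sum of key-value outer products); gates below 1 let the model forget stale context, which is what makes multi-million-token windows tractable.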
Tags
open-source · framework · rag · python · self-hosted
Quick Info
- Organization: SilX AI
- Pricing: open-source
- Free Tier: Yes
- Updated: Mar 23, 2026
Also in Frameworks
LangChain
Build context-aware reasoning applications with LLMs
OSS · Free
LangChain AI
131.9K · 850.0K/w · today · 469
AutoGen
Microsoft's framework for building multi-agent AI systems
OSS · Free
Microsoft
56.5K · 3d ago · 444
CrewAI
Multi-agent orchestration framework for collaborative AI workflows
OSS · Free
CrewAI Inc
47.7K · today · 284