AsyncFC
Asynchronous function calling framework that parallelizes LLM tool execution for faster agents
Open Source · Free
About
Academic framework from UC Berkeley that decouples LLM decoding from function execution, enabling concurrent tool use without requiring model modifications. It reduces end-to-end latency by letting agents execute multiple functions simultaneously while the LLM continues decoding. Particularly useful for agents that call multiple tools or APIs in workflows where execution order isn't strictly sequential.
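The latency win described above comes from dispatching each tool call as soon as it is emitted instead of awaiting it before the next one starts. A minimal sketch with Python's asyncio (the tool names and sleep-based latencies below are hypothetical stand-ins, not AsyncFC's actual API):

```python
import asyncio
import time

# Hypothetical tools; sleeps stand in for real API latency.
async def get_weather(city: str) -> str:
    await asyncio.sleep(1.0)
    return f"weather({city}): sunny"

async def get_stock(symbol: str) -> str:
    await asyncio.sleep(1.0)
    return f"stock({symbol}): 123.45"

async def run_sequential() -> list[str]:
    # Baseline: each call waits for the previous one (~2s total).
    return [await get_weather("SF"), await get_stock("AAPL")]

async def run_concurrent() -> list[str]:
    # Parallel dispatch: start every call immediately, then gather;
    # latency is roughly the slowest call, not the sum (~1s total).
    tasks = [
        asyncio.create_task(get_weather("SF")),
        asyncio.create_task(get_stock("AAPL")),
    ]
    return await asyncio.gather(*tasks)

start = time.perf_counter()
results = asyncio.run(run_concurrent())
elapsed = time.perf_counter() - start
print(results)
print(f"elapsed: {elapsed:.1f}s")  # roughly 1s instead of ~2s sequential
```

This only helps when the calls are independent; a call whose argument depends on another call's result still has to wait, which is why the framework targets workflows where execution order isn't strictly sequential.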
Details
| Language | Python |
| Patterns | |
Tags
framework · tool-use · open-source · python · autonomous
Quick Info
- Organization: UC Berkeley
- Pricing: open-source
- Free Tier: Yes
- Updated: May 16, 2026
Also in Frameworks
LangChain
Build context-aware reasoning applications with LLMs
OSS · Free
LangChain AI
136.8K · 850.0K/w · today · 469
AutoGen
Microsoft's framework for building multi-agent AI systems
OSS · Free
Microsoft
58.1K · 4w ago · 445
CrewAI
Multi-agent orchestration framework for collaborative AI workflows
OSS · Free
CrewAI Inc
51.5K · today · 296