DeepYard

Context7 vs Memory MCP

Side-by-side comparison with live GitHub signals. Last updated April 1, 2026.


Context7

Up-to-date library documentation injected directly into your AI coding context

OSS · Free
51.3K stars · 35.0K/wk · last commit today · 114 contributors

Memory MCP

Persistent knowledge graph memory across AI coding sessions

OSS · Free
82.7K stars · 15.0K/wk · last commit 3d ago · 430 contributors
| Metric       | Context7     | Memory MCP   |
|--------------|--------------|--------------|
| GitHub Stars | 51.3K        | 82.7K        |
| Contributors | 114          | 430          |
| Last Commit  | Mar 31, 2026 | Mar 29, 2026 |
| Open Issues  | 159          | 628          |
| License      | open-source  | open-source  |
| Pricing      | open-source  | open-source  |
| Free Tier    | Yes          | Yes          |
| Category     | mcp-servers  | mcp-servers  |
| Trending     | No           | No           |

Shared Tags

No shared tags

Only in Context7

documentation · library-docs · context-injection · rag · developer-tools · upstash

Only in Memory MCP

memory · knowledge-graph · persistence · anthropic · reference-server · context-management

About Context7

Context7 solves the stale-docs problem: instead of relying on an LLM's outdated training data, it fetches live, version-specific documentation for thousands of popular libraries and injects it directly into the model's context window. Ask Claude Code to use React 19, Next.js 15, or any other library and get answers grounded in the actual current API — not hallucinated method signatures from two years ago.
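Both tools plug into an MCP client the same way: you register them as servers in the client's config. As a rough sketch (exact file location and package names depend on your client; the `npx` package identifiers below reflect each project's published npm package but should be checked against the current READMEs):

```json
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    }
  }
}
```

Once registered, a prompt like "use context7" nudges the model to pull current docs before answering, while the memory server's tools are available for the model to call on its own.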


About Memory MCP

The official Anthropic memory MCP server gives AI assistants persistent, cross-session memory using a local knowledge graph stored as a JSON file. Claude can create entities (people, projects, concepts), record relations between them, and add observations over time — then recall that information in future sessions. Ideal for long-running projects where continuity matters: the assistant remembers your architecture decisions, team conventions, and in-progress work without you having to repeat context.
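To make the entity/relation/observation model concrete, here is a minimal Python sketch of a knowledge-graph memory file. It assumes a JSON Lines layout with `type`, `name`, `entityType`, `observations`, `from`, `to`, and `relationType` fields, which mirrors the reference server's conventions but is an illustration, not its actual implementation:

```python
import json
from pathlib import Path

# Hypothetical memory store; the real server manages its own file.
MEMORY_FILE = Path("memory.jsonl")

def remember(records):
    """Append entity/relation records to the memory file, one JSON object per line."""
    with MEMORY_FILE.open("a") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")

def recall(name):
    """Return every stored record that names the given entity,
    either directly or as an endpoint of a relation."""
    if not MEMORY_FILE.exists():
        return []
    hits = []
    with MEMORY_FILE.open() as f:
        for line in f:
            r = json.loads(line)
            if r.get("name") == name or name in (r.get("from"), r.get("to")):
                hits.append(r)
    return hits

remember([
    {"type": "entity", "name": "billing-service", "entityType": "project",
     "observations": ["Uses Postgres 16", "Deploys via GitHub Actions"]},
    {"type": "relation", "from": "alice", "to": "billing-service",
     "relationType": "maintains"},
])
print(len(recall("billing-service")))  # both records mention the entity
```

The point of the structure is that recall is by name, not by conversation: a future session can ask about `billing-service` and retrieve both the entity's observations and who maintains it.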
