Mem0, a startup that bills itself as the memory layer for AI applications, has raised $24 million in seed and Series A funding to bring persistent, portable memory and storage to developers and their users. The round was led by Basis Set Ventures, with participation from Y Combinator, Kindred Ventures, Peak XV Partners, and the GitHub Fund, along with a roster of operator angels. Pronounced “mem zero,” the company pitches a “memory passport” that follows users across apps and agents, making AI that remembers context the way people do.
Why the AI memory layer matters for long-term context
Large language models are mostly stateless: once a session ends, their context is gone. Context windows and retrieval-augmented generation have improved short-term recall, but they haven’t addressed long-term, user-specific memory that persists with a person across tools, devices, and model providers. As AI agents multiply, the friction of re-teaching each assistant the same context becomes a tax on adoption and a drag on accuracy.

Mem0’s wager is that a neutral, developer-first memory abstraction becomes core infrastructure, much as identity layers (OAuth) and financial data aggregators (Plaid) standardized portability in their domains. The startup contends that labs building their own memory stacks have little incentive to make them interoperable, creating potential lock-in just as models converge on similar capabilities. A shared memory layer, by contrast, can enable day-one personalization across ecosystems without requiring developers to commit to a single vendor.
Mem0’s funding round details and notable backers
The total combines a previously unannounced seed round with a new $20 million Series A. Earlier backers include Basis Set, Kindred Ventures, and Y Combinator, while Peak XV Partners and the GitHub Fund joined as new investors.
Prominent angels include HubSpot’s Dharmesh Shah, Scott Belsky, Datadog’s Olivier Pomel, former GitHub CEO Thomas Dohmke, Supabase’s Paul Copplestone, PostHog’s James Hawkins, and Weights & Biases’ Lukas Biewald, a roster that signals confidence from builders who have shipped developer platforms before.
Traction points and developer adoption milestones
Mem0 has quickly become a popular choice for AI memory among developers, with more than 41,000 GitHub stars and 13 million Python package downloads.
Its cloud API processed 35 million calls in the first quarter of this year and 186 million by the third quarter, roughly 30% month-over-month growth. More than 80,000 developers have signed up for the managed service, and Mem0 is the sole memory provider for the new AWS Agent SDK, a key distribution channel as enterprises trial multi-agent workflows.

How Mem0 evolved from a side project into a platform
Co-founder and CEO Taranjeet Singh shipped product at Paytm, was Khatabook’s first growth engineer, and later built one of the first GPT app stores out of India. That work got him thinking about indexing and retrieving unstructured data, which led to Embedchain, an open-source project that began as a side discovery during a roughly year-long stint at 500 Startups, and to a collaboration with CTO Deshraj Yadav, who previously led the AI Platform for Tesla Autopilot. The duo also built EvalAI, an open-source take on Kaggle-style competitions.
Mem0 itself was a pivot, prompted when a viral meditation app exposed a simple gap: users wanted an AI companion that would remember their regular sessions and progress. That realization, that persistent, evolving memory could become a baseline product need, turned a utility into a platform vision. Singh’s cold outreach to founders and operators helped drive early adoption and ultimately opened a path to Silicon Valley backers.
What Mem0 offers developers building AI assistants
Mem0 exposes an API to save, retrieve, and evolve user memories that can be shared across models, applications, and devices. It is model agnostic, working with OpenAI, Anthropic, and open-source LLMs, and supports popular frameworks such as LangChain and LlamaIndex. Use cases include therapy assistants that remember previous conversations, productivity agents that learn routines, and financial copilots that carry preferences and history while enforcing appropriate access controls.
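As a rough illustration, here is a minimal sketch using the open-source mem0 Python package; the method names, arguments, and return shapes shown are assumptions and may differ across SDK versions:

```python
# Minimal sketch, not an official example: assumes `pip install mem0ai`
# and an OpenAI API key in the environment for the default backend.
from mem0 import Memory

memory = Memory()

# Persist a user-specific fact captured during a conversation.
memory.add(
    "I meditate for 20 minutes every morning before work.",
    user_id="alice",
)

# Later, from any app or agent, pull back whatever is relevant to a new query.
results = memory.search("What is this user's meditation routine?", user_id="alice")
print(results)  # return shape (list vs. dict of matches) varies by SDK version
```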
Central to the company’s pitch is the view that memory is more than vector storage. The layer works to evolve context by summarizing, deduplicating, and updating facts over time, so that agents adapt rather than hoard transcripts. For teams already investing in retrieval pipelines, a pluggable memory layer promises faster iteration, fewer one-off glue scripts, and clearer governance boundaries.
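Continuing the hypothetical sketch above, the “evolving” part would mean that a later, contradictory statement is reconciled against the stored fact rather than appended as another transcript; the exact consolidation behavior depends on the configured LLM and SDK version:

```python
# A newer statement that supersedes an earlier memory; the layer is meant
# to update or replace the stale fact rather than accumulate duplicates.
memory.add("I've switched to 10-minute evening meditation sessions.", user_id="alice")

# Inspect what the layer now holds for this user; ideally the updated
# routine appears instead of the old one.
print(memory.get_all(user_id="alice"))
```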
Competitive landscape and key risks for a memory layer
Incumbent labs are racing to build long-term memory directly into their assistants, and executives have declared that persistent memory will be foundational to future agents and hardware. Startups such as Supermemory, Letta, and Memories.ai are also testing different approaches to cross-app recall. Mem0’s differentiator is neutrality and portability: user-consented memory that can travel to whichever agents and providers a user trusts, without lock-in.
The hard problems today are less about API polish than about trust: consent and revocation flows, privacy by design, enterprise-grade security, and data residency controls. If Mem0 becomes the “Plaid for memory,” it will be because it nails reliability, permissions, and interoperability while demonstrating measurable gains in accuracy and user retention for the apps that adopt it.
Outlook for Mem0 as it scales its AI memory layer
With new funding and momentum, Mem0 is working to turn a popular open-source project into critical AI infrastructure. The thesis is simple: as models get cheaper and more commoditized, persistent memory that is portable, governed, and developer-friendly becomes the new moat. If the company keeps growing and keeps landing platform integrations like the AWS Agent SDK, it could help define how AI remembers across the modern app stack.