Open Source
Explore the latest AI open-source projects from GitHub and HuggingFace.
Mastra is an open-source TypeScript framework for building AI-powered applications and agents, created by the team behind Gatsby. It provides a complete set of AI primitives, including workflows, agents, RAG, and evaluations, all built on a modern TypeScript stack.

## Background and Origin

Mastra was created by the team that built Gatsby, the popular React-based static site generator. Drawing on their deep experience with developer tooling and the JavaScript ecosystem, the team pivoted to address the growing need for a TypeScript-native AI agent framework. The project launched on GitHub and quickly gained traction, reaching over 21,000 stars by February 2026.

## Core Architecture

Mastra is built around a modular architecture that lets developers compose AI applications from discrete primitives. At its core, the framework provides a unified interface for connecting to over 40 AI providers, including OpenAI, Anthropic, Google, and open-source models. This model-routing capability means developers can switch between providers without rewriting application logic.

The agent system supports autonomous decision-making using LLMs combined with custom tools. Agents can be configured with specific instructions, memory contexts, and tool access, enabling them to handle open-ended tasks independently.

## Workflow Engine

One of Mastra's standout features is its graph-based workflow engine. Developers define complex orchestration logic using an intuitive API with methods like `.then()`, `.branch()`, and `.parallel()`. This approach brings familiar programming patterns to AI workflow design rather than requiring visual builders or YAML configurations.

The workflow engine includes built-in human-in-the-loop support with persistent state management. Workflows can be suspended at any point to await human input and then resumed, making the engine suitable for applications that require approval steps or manual review.
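Mastra's actual workflow API is richer than can be shown here, but the chaining pattern behind `.then()` and `.parallel()` can be illustrated with a minimal, self-contained TypeScript sketch. The `Workflow` class and the steps below are hypothetical stand-ins for illustration only, not Mastra's real exports:

```typescript
// Toy illustration of fluent, graph-style workflow chaining.
// NOT Mastra's actual API; a minimal sketch of the pattern.
type Step<I, O> = (input: I) => Promise<O>;

class Workflow<I, O> {
  constructor(private run: Step<I, O>) {}

  // Chain a step that consumes the previous step's output.
  then<N>(next: Step<O, N>): Workflow<I, N> {
    return new Workflow(async (input: I) => next(await this.run(input)));
  }

  // Fan out: run several steps against the same output concurrently.
  parallel<N>(steps: Step<O, N>[]): Workflow<I, N[]> {
    return new Workflow(async (input: I) => {
      const out = await this.run(input);
      return Promise.all(steps.map((s) => s(out)));
    });
  }

  execute(input: I): Promise<O> {
    return this.run(input);
  }
}

// Hypothetical usage: normalize a string, then compute two metrics in parallel.
const wf = new Workflow(async (text: string) => text.trim())
  .then(async (t) => t.toUpperCase())
  .parallel([
    async (t) => t.length,
    async (t) => t.split(/\s+/).length,
  ]);

wf.execute("  hello agent world  ").then((scores) => console.log(scores));
```

Because each combinator returns a new `Workflow`, the chain forms a small execution graph that only runs when `execute()` is called; Mastra's engine layers persistence and suspend/resume on top of a similar compositional idea.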
## Memory and Context Management

Mastra provides sophisticated context-management capabilities, including conversation history tracking, RAG integration for knowledge retrieval, working memory for short-term context, and semantic memory for long-term knowledge storage. This layered memory architecture helps agents maintain coherent interactions across extended conversations.

## MCP Server Support

The framework includes native support for the Model Context Protocol (MCP), allowing developers to author MCP servers directly within Mastra. This enables standardized tool and context sharing between AI applications, aligning with the emerging industry standard for AI interoperability.

## Production Readiness

Mastra ships with built-in evaluation tools and observability features designed for production deployments. The evaluation system lets developers measure agent performance against custom criteria, while the observability tools provide insight into workflow execution, token usage, and latency metrics.

The latest release, `@mastra/core@1.4.0` (February 13, 2026), embeds documentation in published npm packages under `dist/docs/`, enabling coding agents and AI assistants to understand and use the framework by reading documentation directly from `node_modules`.

## Framework Integration

Mastra integrates seamlessly with React, Next.js, and Node.js applications, or it can run as a standalone service. This flexibility makes it suitable for adding AI capabilities to existing web applications or for building dedicated AI services from scratch.

## Community and Ecosystem

With over 330 contributors and more than 1,600 projects built on Mastra, the ecosystem is growing rapidly. The project is backed by Y Combinator and maintains active development, with regular releases addressing community feedback and expanding provider support.
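The layered memory architecture described under "Memory and Context Management", where a small working-memory buffer is backed by a longer-term store, can be sketched in self-contained TypeScript. The class and method names below are illustrative assumptions, not Mastra's API, and the keyword index stands in for real semantic (embedding-based) retrieval:

```typescript
// Toy illustration of layered agent memory: a bounded working-memory
// buffer of recent turns, plus a keyword-indexed long-term store.
// Names are hypothetical; NOT Mastra's actual API.
interface Turn {
  role: "user" | "assistant";
  content: string;
}

class LayeredMemory {
  private working: Turn[] = [];
  private longTerm = new Map<string, string>();

  constructor(private workingLimit = 4) {}

  // Record a turn; turns that overflow working memory are archived
  // long-term, indexed by the words they contain.
  remember(turn: Turn): void {
    this.working.push(turn);
    while (this.working.length > this.workingLimit) {
      const old = this.working.shift()!;
      for (const word of old.content.toLowerCase().split(/\W+/)) {
        if (word) this.longTerm.set(word, old.content);
      }
    }
  }

  // Build context for a query: archived turns matching the query's
  // words, followed by everything still in working memory.
  context(query: string): string[] {
    const recalled = new Set<string>();
    for (const word of query.toLowerCase().split(/\W+/)) {
      const hit = this.longTerm.get(word);
      if (hit) recalled.add(hit);
    }
    return [...recalled, ...this.working.map((t) => t.content)];
  }
}

// Hypothetical usage: the deployment region falls out of working memory
// but is still recoverable from the long-term layer.
const mem = new LayeredMemory(2);
mem.remember({ role: "user", content: "We deploy to eu-west-1" });
mem.remember({ role: "assistant", content: "Noted the region" });
mem.remember({ role: "user", content: "Now set up logging" });
console.log(mem.context("which region do we deploy to?"));
```

The point of the layering is that the prompt-sized working buffer stays small while older facts remain retrievable on demand, which is how an agent can stay coherent across conversations longer than its context window.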