Open Source
Explore the latest AI open-source projects from GitHub and HuggingFace.

Megatron-LM is NVIDIA's GPU-optimized library for distributed training of large transformer models at scale. It provides composable building blocks through Megatron Core, supporting advanced parallelism strategies including Tensor Parallel, Pipeline Parallel, Data Parallel, Expert Parallel, and Context Parallel. The library achieves up to 47% Model FLOP Utilization on H100 clusters and scales from 2B to 462B parameters across thousands of GPUs. Recent updates include Dynamic Context Parallelism with 1.48x speedup for variable-length sequences, Megatron Bridge for Hugging Face checkpoint interoperability, and expanded MoE support for DeepSeek-V3 and Qwen3 architectures.
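To put the 47% Model FLOP Utilization figure in context, here is a minimal sketch of how MFU is commonly estimated: observed token throughput times the model's FLOPs per token, divided by the hardware's theoretical peak. The 6N FLOPs-per-token approximation for a transformer forward+backward pass and all hardware numbers below are illustrative assumptions, not Megatron-LM's actual measurements.

```python
def model_flop_utilization(tokens_per_second: float,
                           flops_per_token: float,
                           peak_flops: float) -> float:
    """Fraction of theoretical peak FLOPs spent on model math."""
    return tokens_per_second * flops_per_token / peak_flops

# Common approximation: forward + backward costs ~6 FLOPs
# per parameter per token for a dense transformer.
n_params = 70e9                      # hypothetical 70B-parameter model
flops_per_token = 6 * n_params

# Hypothetical cluster: 8 H100s at ~989 TFLOP/s dense BF16 each.
peak = 8 * 989e12
tokens_per_second = 9000             # assumed measured throughput

mfu = model_flop_utilization(tokens_per_second, flops_per_token, peak)
print(f"MFU: {mfu:.1%}")             # roughly 48% under these assumptions
```

With these made-up inputs the formula lands near the utilization range the library reports; in practice the throughput term comes from profiling a real training run.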

Shubhamsaboo
Collection of 100+ production-ready LLM apps with AI agents, RAG, voice agents, and MCP using OpenAI, Anthropic, Gemini, and open-source models.
infiniflow
Leading open-source RAG engine with deep document understanding, grounded citations, and agent capabilities, with 73K+ GitHub stars.
Unsloth AI
Open-source LLM fine-tuning optimizer delivering 2x faster training with 70% less VRAM, supporting models from GPT-OSS to DeepSeek with zero accuracy loss.
Andrej Karpathy
A minimal LLM training framework that enables end-to-end training of a ChatGPT-class model from scratch for approximately $100.