Repomix is an open-source CLI and web tool that packages an entire codebase — local folder, GitHub URL, or zipped archive — into a single AI-friendly file in XML, Markdown, JSON, or plain text. We rate it 89/100 — the right pick for anyone who's tired of copy-pasting random files into ChatGPT, Claude, or Gemini and wants a deterministic, token-counted way to share full project context with an LLM.
Repomix is a free, MIT-licensed Node.js tool created by Kazuki Yamada (@yamadashy) and first published on . It packs a whole repository — directory tree and file contents, with .gitignore-aware exclusions and optional secret scanning — into one structured artifact that fits cleanly inside an LLM's context window. The repo has crossed 24,000 GitHub stars and 1,198 forks, and was nominated for the JSNation Open Source Awards 2025 "Powered by AI" category.
The problem it solves is unsexy but universal: large language models do their best work when they see all the relevant code at once, but tools like ChatGPT, Claude, and Gemini don't have a great native way to upload an entire repo. Before Repomix, most developers either pasted scattered snippets, manually concatenated files with shell scripts, or paid for a coding-specific IDE extension. Repomix turns that into a one-line command — npx repomix — that produces a deterministic, LLM-optimized file.
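That one-line workflow looks like this in practice. The `npx repomix` and `--remote` invocations are taken from this article; the `--style` flag for choosing an output format is recalled from the Repomix README and should be verified against `repomix --help`:

```shell
# Pack the current directory into a single LLM-ready file
# (writes repomix-output.xml by default)
npx repomix

# Markdown output instead of XML (flag name assumed; check repomix --help)
npx repomix --style markdown

# Pack a public GitHub repo without cloning it first
npx repomix --remote yamadashy/repomix
```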
Key features:

- **Structured output:** the default XML format wraps everything in a `<file_summary>` + `<directory_structure>` + `<files>` schema that LLMs parse very accurately.
- **Token counting:** built-in tiktoken counts show you the GPT-style token cost of every file plus the full bundle, so you know whether you're going to bust a 200k or 1M context limit before you send it.
- **Sensible exclusions:** respects `.gitignore` automatically, plus its own `.repomixignore`, plus standard `node_modules`/`.venv`/build-folder exclusions out of the box.
- **Remote packing:** `repomix --remote yamadashy/repomix` packs any public GitHub repo without you having to clone it locally — useful for code review of dependencies or open-source projects.

Sentiment is overwhelmingly positive. On Reddit's r/ChatGPTCoding and r/LocalLLaMA, threads consistently rank Repomix above competing tools like GitIngest and code2prompt for one reason: the XML output format and Tree-sitter compression are what Claude in particular asks for. On Hacker News, a benchmark on the python-docs-samples repo showed Repomix producing roughly 56M tokens versus GitIngest's 69M and code2prompt's 57M for the same codebase — small but real efficiency gains that compound at scale.
The most common complaint on GitHub Discussions and Reddit is the same: very large repositories (50k+ files) can blow past LLM context windows even after compression, and Repomix doesn't yet auto-shard the output across multiple files by topic. The other recurring feedback is that the Node.js dependency feels heavy for a tool people often install once and run from the command line — several users have requested a Go or Rust binary release. Both are open issues on the GitHub repo and acknowledged by the maintainer.
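Until auto-sharding ships, users who hit the context ceiling typically script the split themselves. A minimal sketch of that workaround, assuming a plain-text or Markdown bundle separated by blank lines; the 4-characters-per-token ratio is a common rule of thumb, not Repomix's tiktoken-based count, so leave generous headroom:

```python
def shard_bundle(text: str, max_tokens: int = 180_000,
                 chars_per_token: int = 4) -> list[str]:
    """Split a packed bundle on blank lines into chunks that each stay
    under an approximate token budget (estimated as chars / chars_per_token)."""
    budget = max_tokens * chars_per_token  # budget expressed in characters
    chunks: list[str] = []
    current: list[str] = []
    size = 0
    for block in text.split("\n\n"):
        block_len = len(block) + 2  # account for the separator we removed
        # Flush the current chunk if adding this block would exceed the budget
        if size + block_len > budget and current:
            chunks.append("\n\n".join(current))
            current, size = [], 0
        current.append(block)
        size += block_len
    if current:
        chunks.append("\n\n".join(current))
    return chunks
```

An oversized block is never dropped: it becomes its own over-budget chunk, which is usually preferable to silently losing a file.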
Repomix is completely free and open source under the MIT License. There is no paid tier, no usage cap, and no telemetry sent home from the CLI. The optional hosted web UI at repomix.com is also free, processes everything in-browser, and explicitly states in the privacy policy that uploaded code is not stored server-side.
| Plan | Price | Includes |
|---|---|---|
| CLI / Library | $0 (MIT) | Unlimited local and remote packing, all output formats, MCP server, Tree-sitter compression, Secretlint scanning |
| Web UI | $0 | Browser-based packing of public GitHub repos, no signup required |
| Sponsorship | Optional | GitHub Sponsors page for the maintainer; not required to use any feature |
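The MCP server row above refers to Repomix running as a Model Context Protocol server for agent clients. A registration entry might look something like the following — the `--mcp` flag and the exact JSON shape are assumptions based on common MCP client configs (Claude Desktop style), not verified against the Repomix docs:

```json
{
  "mcpServers": {
    "repomix": {
      "command": "npx",
      "args": ["-y", "repomix", "--mcp"]
    }
  }
}
```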
Best for: developers who routinely paste code into ChatGPT, Claude, or Gemini and want a deterministic way to include full project context; teams running AI code reviews in CI; anyone building agents on top of MCP that need to read codebases; technical writers documenting unfamiliar repos with LLM help.
Not ideal for: developers who already use a deeply integrated IDE assistant like Cursor, GitHub Copilot Workspace, or Aider — those tools manage context internally and Repomix becomes redundant. Also overkill for one-off questions about a single file.
Pros:
- Completely free and MIT-licensed, with no paid tier and no telemetry
- XML output plus Tree-sitter compression that Claude in particular handles well
- Built-in tiktoken token counts and Secretlint secret scanning
- Respects .gitignore and .repomixignore with sensible default exclusions
- Packs remote public GitHub repos without cloning

Cons:
- Very large repos (50k+ files) can still blow past LLM context windows; no auto-sharding yet
- Node.js dependency feels heavy for a run-once CLI; no Go or Rust binary release
The closest alternatives are GitIngest (Python-first, browser tool at gitingest.com, simpler but no Tree-sitter compression), code2prompt (Rust binary, fast, no XML format), and Yek (Rust, single-binary install, popular on Hacker News). For developers who already live inside an IDE assistant, Aider and Continue.dev handle context management without an external packer.
Yes — and "worth it" is almost the wrong question for an MIT-licensed tool that takes 30 seconds to try. If you regularly hand code to an LLM outside your IDE, Repomix is the path of least resistance, and the XML+Tree-sitter combination is meaningfully better than the obvious alternatives. The 89/100 reflects that it's the best-in-class option for its specific job, with realistic limits around very large repos and a Node-only distribution that some users would prefer to see ported to Go or Rust.
Installation is flexible: run it ad hoc or install it permanently (npx repomix, npm install -g repomix, Yarn, Bun, Homebrew, or Docker). There's also a Chrome extension and a web UI for users who prefer not to install anything.
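Concretely, the common install paths look like this. The npm commands appear in this article; the Yarn syntax is standard Yarn v1, and the Homebrew formula name is an assumption — verify with `brew search repomix`:

```shell
# One-off run, nothing installed
npx repomix

# Global install via npm
npm install -g repomix

# Global install via Yarn (classic syntax)
yarn global add repomix

# Homebrew (formula name assumed)
brew install repomix
```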