About
Repo Crawler is an MCP server that transforms GitHub repositories into structured intelligence for AI agents. Agents often need a deep understanding of a repository that goes beyond raw file content; Repo Crawler addresses this by exposing GitHub's entire data surface as structured tools, covering repository metadata, file trees, languages, commits, contributors, issues, pull requests, traffic, security alerts (Dependabot, code scanning, secret scanning), and Software Bills of Materials (SBOMs). It streamlines data extraction and copes with API quotas, context-window limits, and rate limiting through a multi-tiered, section-selective, gracefully degrading fetching mechanism.
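The section-selective, gracefully degrading fetching described above might look something like the sketch below. All names here (`fetch_sections`, the section keys, the error messages) are hypothetical illustrations, not the server's actual API: the idea is that callers request only the sections they need, and a section that fails (rate limit, missing permission, disabled feature) is recorded as an error rather than aborting the whole fetch.

```python
from typing import Any, Callable, Dict, List

def fetch_sections(
    fetchers: Dict[str, Callable[[], Any]],
    sections: List[str],
) -> Dict[str, Dict[str, Any]]:
    """Run only the requested section fetchers, degrading gracefully.

    Each fetcher is a zero-argument callable (e.g. one GitHub API call).
    Failures are captured per section so partial results still come back.
    """
    data: Dict[str, Any] = {}
    errors: Dict[str, str] = {}
    for name in sections:
        fetcher = fetchers.get(name)
        if fetcher is None:
            errors[name] = "unknown section"
            continue
        try:
            data[name] = fetcher()
        except Exception as exc:  # e.g. 403 rate limit, 404 feature disabled
            errors[name] = str(exc)
    return {"data": data, "errors": errors}

# Usage with stub fetchers standing in for real GitHub API calls:
def traffic_denied() -> Any:
    raise PermissionError("403: traffic requires push access")

out = fetch_sections(
    {"metadata": lambda: {"stars": 1}, "traffic": traffic_denied},
    ["metadata", "traffic"],
)
# out["data"] holds the successful section; out["errors"] records the failure.
```

A design like this lets an agent keep working with whatever sections succeeded, which is what "graceful degradation" means in practice when some endpoints are rate-limited or permission-gated.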