Evaluates and scores Markdown documentation against specific project goals using a structured review framework.
The Markdown Document Reviewer skill provides a rigorous, standardized framework for auditing documentation quality within Claude Code. It automates the evaluation of Markdown files by comparing them against objective benchmarks, generating quantitative scores for accuracy, completeness, relevance, and actionability. Beyond simple scoring, it performs deep content analysis to identify critical issues, performance bottlenecks, and potential bugs, ensuring your project documentation remains high-quality and aligned with its intended purpose.
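The weighted-score idea described above can be sketched in a few lines. This is an illustrative example only; the function and rubric names (`score_document`, `RUBRIC`) and the weights are assumptions, not the skill's actual implementation or API.

```python
# Hypothetical weighted rubric combining per-metric scores (0-100)
# into a single quantitative quality score. Weights are illustrative.
RUBRIC = {
    "accuracy": 0.35,
    "completeness": 0.25,
    "relevance": 0.20,
    "actionability": 0.20,
}

def score_document(metric_scores: dict) -> float:
    """Combine per-metric scores into one weighted total (0-100)."""
    return sum(RUBRIC[m] * metric_scores.get(m, 0) for m in RUBRIC)

print(score_document({"accuracy": 90, "completeness": 80,
                      "relevance": 70, "actionability": 60}))
```

A real reviewer would derive each metric score from content analysis rather than taking them as inputs, but the weighting step is the same shape.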
Key Features
1. Comprehensive scoring system for documentation quality metrics
2. Standardized reporting format for consistent document audits
3. Detection of critical issues and potential bugs in text
4. Automated accuracy gap analysis against goal files
5. Quantitative evaluation of content completeness and relevance
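The "accuracy gap analysis against goal files" listed above can be pictured as a comparison between goal statements and document text. This is a minimal sketch under assumed inputs; `find_gaps` and the simple keyword match are illustrative, not the skill's actual logic or file format.

```python
# Hypothetical gap analysis: flag goal statements whose key terms
# never appear anywhere in the reviewed document.
def find_gaps(goal_lines: list[str], doc_text: str) -> list[str]:
    """Return goal statements with no matching term in the document."""
    doc = doc_text.lower()
    return [g for g in goal_lines
            if not any(word in doc for word in g.lower().split())]

goals = ["installation steps", "error handling policy"]
doc = "This README covers installation and basic usage."
print(find_gaps(goals, doc))  # → ['error handling policy']
```

A production reviewer would likely use semantic matching rather than raw keyword overlap, but the output shape (a list of unmet goals) is the point of the sketch.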
Use Cases
1. Performing quality assurance on Markdown-based documentation during CI/CD
2. Auditing technical manuals to ensure they meet project goal specifications
3. Identifying actionability gaps in user guides and developer READMEs