Builds fully automated, AI-powered data collection agents that scrape, enrich, and store public data for free.
The Data Scraper Agent skill provides a comprehensive framework for building production-grade data pipelines within Claude Code. It automates the process of monitoring public websites, APIs, and RSS feeds, then uses the Gemini Flash LLM to score and summarize the findings. Designed for efficiency and cost-effectiveness, the skill utilizes a free stack—including GitHub Actions for scheduling and Notion or Supabase for storage—while incorporating a feedback loop that allows the agent to learn your preferences and improve its accuracy over time.
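The fetch → score → store loop described above can be sketched as a small pipeline. This is a minimal illustration, not the skill's actual code: `Item`, `run_pipeline`, and the `stub_score` callback are hypothetical names, and the stub stands in for the real Gemini Flash call and the Notion/Supabase write.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Item:
    title: str
    url: str
    summary: str = ""
    score: float = 0.0

def run_pipeline(items: List[Item],
                 score_fn: Callable[[Item], Tuple[float, str]],
                 threshold: float = 0.5) -> List[Item]:
    """Score each fetched item and keep those at or above the threshold."""
    kept = []
    for item in items:
        # In the real skill this would be an LLM (Gemini Flash) call.
        item.score, item.summary = score_fn(item)
        if item.score >= threshold:
            kept.append(item)
    # The real skill would write `kept` to Notion or Supabase here.
    return kept

# Demo with a stub in place of the LLM scorer:
def stub_score(item: Item) -> Tuple[float, str]:
    relevant = "python" in item.title.lower()
    return (0.9 if relevant else 0.1, f"Summary of {item.title}")

kept = run_pipeline(
    [Item("Python 3.13 released", "https://example.com/a"),
     Item("Local bake sale", "https://example.com/b")],
    stub_score,
)
```

Keeping the scorer as an injected callback is what makes the feedback loop possible: the agent can swap in a scorer tuned by your past accept/reject decisions without touching the fetch or storage stages.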
Key Features
- Feedback-driven learning system that refines AI results over time
- Support for Notion, Sheets, and Supabase storage backends
- Automated scheduling via GitHub Actions cron
- Multi-model fallback logic to work around API rate limits
- AI-powered data enrichment and scoring with Gemini Flash
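The multi-model fallback in the list above amounts to trying models in preference order and moving on when one is rate-limited. A minimal sketch, assuming hypothetical model names and a `RateLimitError` raised by the client on quota exhaustion:

```python
from typing import Callable, List, Optional, Tuple

class RateLimitError(Exception):
    """Raised by a model client when its API quota is exhausted."""

def call_with_fallback(prompt: str,
                       models: List[Tuple[str, Callable[[str], str]]]) -> Tuple[str, str]:
    """Try each (model_name, call_fn) pair in order, falling back on rate limits."""
    last_err: Optional[Exception] = None
    for name, call_fn in models:
        try:
            return name, call_fn(prompt)
        except RateLimitError as err:
            last_err = err  # quota hit: move on to the next model in the chain
    raise last_err if last_err else RuntimeError("no models configured")

# Demo: the primary model is rate-limited, so the chain falls through.
def primary(prompt: str) -> str:
    raise RateLimitError("quota exceeded")

def backup(prompt: str) -> str:
    return f"scored: {prompt}"

used_model, result = call_with_fallback("Rate this headline",
                                        [("model-a", primary), ("model-b", backup)])
```

Only rate-limit errors trigger fallback; other exceptions propagate, so a malformed request fails fast instead of burning quota on every model in the chain.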
Use Cases
- Aggregating and summarizing news, GitHub releases, or sports stats into a database
- E-commerce price tracking and competitive analysis alerts
- Automated job board monitoring with AI relevance scoring against a resume
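For the job-board use case, relevance scoring against a resume can be illustrated with a simple keyword-overlap metric. This is a stand-in for the skill's LLM-based scoring, shown only to make the idea concrete; `relevance_score` is a hypothetical name.

```python
import re

def relevance_score(job_posting: str, resume: str) -> float:
    """Fraction of the posting's distinct keywords (3+ letters) found in the resume."""
    def tokenize(text: str) -> set:
        return set(re.findall(r"[a-z]{3,}", text.lower()))

    posting_terms = tokenize(job_posting)
    if not posting_terms:
        return 0.0
    return len(posting_terms & tokenize(resume)) / len(posting_terms)
```

An LLM scorer would replace the token overlap with a judgment call ("would this candidate plausibly be a fit?"), but the pipeline contract is the same: posting plus resume in, a 0-to-1 score out, with a threshold deciding which postings reach your database.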