Discover our curated collection of MCP servers for data science & ML. Browse 10306 servers and find the perfect MCPs for your needs.
Enables reading and writing to a Pinecone index via the Model Context Protocol, facilitating retrieval-augmented generation (RAG).
Connects Large Language Models to open data sources using the Model Context Protocol.
Integrates TikTok access into applications via TikNeuron.
Enables LLMs to perform precise numerical calculations via a Model Context Protocol server.
Enables Claude to interact with locally running LLM models through LM Studio by providing a Model Context Protocol (MCP) server.
Provides authenticated access to Google Workspace APIs, offering integrated Gmail, Calendar, and Drive functionality.
Conduct in-depth cryptocurrency research locally using multiple data sources.
Analyzes PubMed medical literature to provide researchers with insights into medical research trends.
Allows AI agents to request human intervention when uncertain, preventing hallucinations and improving accuracy.
Facilitates agentic data analysis on JSON and CSV files for AI models like Claude Code.
Implements a sophisticated database design for Artificial General Intelligence (AGI) memory management, featuring multiple storage and retrieval mechanisms inspired by human cognitive architecture.
Extracts, cleans, and summarizes content from various media sources using AI-powered processing.
Provides a unified knowledge graph and shared memory storage for connecting AI agents and multi-agent systems.
Enables AI agents to connect with Apache Spark History Servers for intelligent job analysis and performance monitoring.
Translates natural-language prompts into precise Perfetto queries to analyze trace data without requiring manual SQL writing.
Connects large language models to the Japanese Ministry of Land, Infrastructure, Transport and Tourism (MLIT) Data Platform API, enabling intuitive, conversational data search and retrieval.
Expedites the creation of production-ready RESTful APIs and MCP servers using Node.js, Express, and various database options.
Provides AI models and clients with comprehensive access to VictoriaMetrics instances, enabling seamless integration with its APIs and embedded documentation.
Optimizes AI coding agent context windows by providing full codebase visibility, reducing token consumption, and improving response quality through intelligent compression.
Equips AI agents with persistent memory, real-time observability, and advanced memory management capabilities.