Discover our curated collection of MCP servers for data science & ML. Browse 6,326 servers and find the perfect MCPs for your needs.
Provides a professional academic literature search server, integrating seamlessly with AI assistants to enhance research workflows.
Streamlines development workflows by providing consistent environments, tooling configurations, and coding patterns for modern AI Agentic coding across multiple repositories.
Enables AI models to interact with and retrieve data from the XRP Ledger.
Enables querying and managing MLflow tracking servers using natural language via the Model Context Protocol (MCP).
Provides access to the gnomAD GraphQL API for AI assistants.
Aggregates real-time cryptocurrency news from multiple RSS feeds to empower informed decision-making.
Exposes OpenAI agents and tools as a Model Context Protocol (MCP) server.
Provides extensive tools, resources, and AI-assisted prompts for managing and analyzing Apache Druid clusters through the Model Context Protocol.
Enables large language models to interact with and analyze Reddit content via the Model Context Protocol (MCP).
Enables interaction with Timeplus clusters through an MCP server, providing tools for querying, listing databases and tables, and managing Kafka topics and streams.
Analyzes text documents by counting words and characters.
Automates prompt engineering to refine and optimize interactions with large language models across multiple providers.
Enhances literature reviews by enabling LLMs to access and interact with academic papers.
Standardizes context interaction between AI models and development environments using FastAPI and the Model Context Protocol (MCP).
Enables AI agents to interact with Google BigQuery databases via natural language queries and schema exploration.
Accesses protein function and sequence information directly from UniProt.
Captures and organizes technical details, code context, GitHub issues, and personal reflections into a Git-based project journal.
Provides large language models (LLMs) with persistent, long-term memory capabilities across conversations using a multi-layered architecture and rapid GPU-accelerated search.
Enables LLMs to read, write, and manage schemas in PostgreSQL databases.
Indexes and provides semantic search for local documents using a vector database and local language models.