Discover our curated collection of MCP servers for data science & ML. Browse 10401 servers and find the right MCPs for your needs.
Simplifies integration between enterprise systems and AI platforms by supplying business data to AI applications.
Enables querying live SAP BusinessObjects BI data from AI clients like Claude Desktop through CData JDBC Drivers.
Provides web search and intelligent research capabilities through Claude CLI integration for MCP clients.
Manages and queries RSS/news feeds, enabling AI assistants to fetch and search information in real-time.
Provides large language models with comprehensive Reddit access via a three-layer architecture, optimizing research and content retrieval.
Provides structured access to the ConceptNet knowledge graph, delivering semantic knowledge for Large Language Models and AI applications.
Provides real-time access to Orlen wholesale fuel prices within the Claude AI environment.
Transforms how AI systems reason by simulating the emergence of insights from interacting thought fragments, mimicking complex physical systems.
Empower qualitative researchers with AI-driven analysis tools and a community-contributed methodology library.
Capture photos and record videos programmatically through a Node.js-based Model Context Protocol (MCP) service.
Integrate Apple's on-device AI models with any application via an OpenAI-compatible API server for private, local AI capabilities on Mac and Windows.
Exposes multi-variant Thai-to-Roman transliteration capabilities as tools for Large Language Models and offers a web-based interface.
Builds a self-improving knowledge graph that enhances AI agent memory and learning capabilities.
Manages post-quantum encrypted secrets for AI agents, offering secure access via a zero-knowledge proxy and the MCP protocol.
Provides a privacy-first, self-hosted Model Context Protocol (MCP) governance gateway for securely orchestrating AI toolchains.
Integrates the Bureau of Labor Statistics Public Data API with AI clients to query U.S. economic data directly.
Exposes a Model Context Protocol (MCP) server over SSE/HTTP, enabling LLM agents to interact with live Zenoh networks.
Provides smart, decaying, frequency-weighted memory for AI agents, preventing stale or irrelevant information from resurfacing.
Empowers AI agents with research-grounded structured thinking and steel-manning verification, leveraging sequential decomposition and cognitive mode separation.
Provides a persistent memory layer for AI agents, enabling storage, recall, and enrichment of long-term context through semantic search, knowledge graphs, and dreaming.