Discover our curated collection of MCP servers for data science & ML. Browse 10247 servers and find the perfect MCPs for your needs.
Enables AI agents to query data stored in Axiom using the Axiom Processing Language (APL).
Enables seamless integration with OpenRouter.ai's diverse model ecosystem.
Enables interaction with the Semantic Scholar API to search for papers, retrieve paper and author details, and fetch citations and references.
Provides real-time and historical cryptocurrency market data for LLMs.
Enables natural language querying of PostgreSQL databases using Ollama's LLM and the Model Context Protocol.
Enhance function-calling-enabled language models and agents with a robust collection of tools.
Integrates Naver Search and DataLab APIs for comprehensive searching and trend analysis across Naver services.
Converts Compiled HTML Help (CHM) files to Markdown format, optimized for technical documentation and AI integration.
Deploys a complete AWS backend infrastructure for retrieval-augmented generation applications, integrating with Gemini Pro and a Streamlit UI.
Transforms OpenAPI/Swagger specifications into Model Context Protocol (MCP) format, enabling AI assistants to interact with REST APIs through a standardized protocol.
Provides LLM tools to search and retrieve comprehensive clinical trial data from the official ClinicalTrials.gov REST API.
Enables AI assistants to perform advanced PDF processing operations by integrating with the Nutrient Document Web Service (DWS) Processor API.
Empowers your Navidrome music server with an AI-powered assistant, facilitating intelligent playlist creation, music discovery, and library management through natural language interaction.
Stream online music directly from your command-line interface with support for AI integration via Model Context Protocol.
Enables AI agents to search and analyze real-time Korean laws, precedents, and administrative rules using the National Legal Information Center Open API.
Integrates AI-powered search and vision analysis APIs for enhanced code development workflows via a specialized MiniMax Model Context Protocol (MCP) server.
Provides a fast, local, and persistent memory system for AI coding assistants, enabling them to store and recall information across sessions.
Eliminates the F# edit-build-run cycle by providing a live development server with hot reload, full project context, and live unit testing.
Protects AI agents from prompt injection, data exfiltration, and dangerous command execution by acting as an LLM security middleware and AI agent firewall.
Provides a comprehensive MicroPython firmware and script collection for the Clockwork Pi PicoCalc handheld device, powered by the Raspberry Pi Pico 2W.