Discover our curated collection of MCP servers for data science & ML. Browse 10,306 servers to find the MCP that fits your needs.
Downloads video and audio content from various online platforms for use with Large Language Models.
Aggregates trending topics from across the web using the Model Context Protocol (MCP).
Enables natural language queries against databases by converting them into SQL for data retrieval.
Securely executes stateful Python code in sandboxed environments designed for indefinitely long-running sessions.
Retrieves information from Wikipedia to provide context to Large Language Models (LLMs).
Enables AI assistants to retrieve and interpret real-time weather data via the Model Context Protocol.
Simplifies the integration of AI plugins and external tools into open-source large language models (LLMs).
Integrates Google Gemini AI into Claude Code, enabling powerful AI collaboration and enhanced functionality.
Facilitates the development and configuration of AI applications that integrate with Tableau Cloud and Server.
Integrates Google's Gemini AI capabilities into Claude Code, providing a powerful AI assistant with a massive context window for code analysis.
Enables large language models (LLMs) like Claude to backtest trading ideas and automate investment strategies.
Visualizes directories and their contents with AI-optimized output formats for enhanced readability and token efficiency.
Integrates AI assistants with real-time weather data through a lightweight Model Context Protocol server.
Establishes a privacy-first, local RAG server for AI coding assistants, enabling semantic search over personal documents without exposing data to external services.
Integrates Anki with AI assistants, enabling natural language interaction for enhanced spaced repetition learning.
Provides a cards-first context system for coding agents that intelligently indexes code repositories, saving tokens and improving context relevance.
Automates intelligent browser interactions and data extraction using an LLM-driven CodeAgent architecture and direct Chrome DevTools Protocol communication.
Provides compact, efficient, and extensible long-term memory for LLM agents.
Provides AI session memory with quality loops to enable agents to retain context and decisions across sessions.
Compiles raw research and files into a persistent markdown wiki, knowledge graph, and search index for AI agents to leverage.