Explore our complete collection of MCP servers that connect AI to your favorite tools.
Provides Yr weather data as context for Large Language Model (LLM) tools.
Serves as an MCP server implementation based on the Marker project.
Enables LLMs to execute Python code, manage files, and interact with Python environments.
Connects Model Context Protocol (MCP) servers to OpenAI-compatible LLMs like Ollama.
Retrieves and analyzes Sentry issues, providing detailed information about errors in applications.
Simplifies the process of starting and managing Model Context Protocol (MCP) servers for development.
Enables LLMs and AI Agents to manage serverless cron jobs through CRUD operations via the Cronlytic API.
Provides audio playback functionality for AI agents, enabling notification sounds for task completion.
Enables large language models to query live LinkedIn Ads data using natural language, leveraging CData JDBC drivers.
Connects large language models to live Microsoft Planner data, enabling natural language querying without SQL.
Facilitates connecting large language models to live Quickbase data through a Model Context Protocol (MCP) server.
Enhances local LLMs with agentic capabilities, enabling autonomous tool creation and invocation.
Manages structured game worlds for LLM-driven text adventures and role-playing games.
Connects large language models to Oracle Cloud Infrastructure, enabling direct interaction with cloud resources.
Manages collections and documents in various vector databases, including Weaviate and Milvus, through a Model Context Protocol (MCP) server.
Develops a custom MCP server and client for concept testing using Node.js, TypeScript, and Gemini AI.
Integrates AI assistants with Obsidian vaults, enabling direct file operations and markdown content analysis.
Provides a secure, unified Model Context Protocol (MCP) gateway to democratize AI agent access to AWS resources.
Manages personal finances and tracks expenses through natural language conversations with AI assistants.
Performs safe, surgical, and deterministic modifications within Obsidian vaults, acting as the 'hands' for AI agents.
Generates diverse charts and visualizations with dual SVG/terminal output, and integrates with AI agents via a Model Context Protocol server.
Transforms apcore module registries into Model Context Protocol (MCP) tool definitions and OpenAI-compatible function calling formats.
Aggregates and organizes over 200 specialized AI skills, prompts, and coding tasks for seamless integration with various AI assistants.
Enables AI agents to query PDF documents using natural language and receive grounded, source-attributed answers through retrieval-augmented generation.
Provides comprehensive data and intelligence on Magic: The Gathering cards, rules, and formats for AI assistants.
Executes pytest test suites, parses results, generates reports, and stores outcomes in a QA platform database via a Model Context Protocol server.
Automates AI-powered social media content creation, scheduling, and performance optimization across diverse platforms.
Enables AI assistants to upload, transform, and serve images globally via the Spronta Image CDN.
Empowers AI agents to autonomously schedule, monitor, and execute event-driven or time-based tasks without manual scripting.
Provides a RAM-resident, graph-primary memory system that makes LLM agents smarter through LLM-driven consolidation.