Explore the complete collection of MCP servers for connecting AI to your favorite tools.
Automates Slack message fetching and processing using AI agents.
Provides unified mail operations with atomic tools for organizing and managing email efficiently.
Enables the deployment of a Model Context Protocol (MCP) server on Cloudflare Workers without requiring authentication.
Scans project dependencies to generate comprehensive markdown license reports.
Provides a local Alloy language server to facilitate the generation and execution of Alloy modeling code.
Integrates the DeepL Translation API to offer superior translation, document handling, and linguistic control for a wide range of languages.
Provides SQLite-backed memory for agents, featuring a CLI and programmatic API for robust storage and retrieval of information.
Provides a production-ready server for integrating OpenAI models and custom tools with the Model Context Protocol.
Extracts and structures Ant Design v4 component documentation into JSON for AI agent analysis.
Provides full CRUD operations and database exploration for SQLCipher 4 encrypted SQLite databases via Model Context Protocol.
Parses and analyzes HTTP Archive (HAR) files to extract comprehensive insights from network traffic captures.
Provides a production-ready, OAuth-protected server template for AI clients like Claude.
Provides a visual status indicator in a terminal window for AI assistants, showing working, waiting, or completion states.
Enhances user requests into AI-optimized prompts using advanced prompt engineering techniques.
Searches and analyzes submissions from the Zenn AI Agent Hackathons using an AI agent and a chat-based web UI.
Enables production-safe debugging and diagnostic operations for Laravel applications through Codex CLI using a secure SSH-based runner.
Provides a local-first, assistant-agnostic persistent memory engine for AI agents, featuring advanced hybrid retrieval and neuroscience-inspired memory management.
Provides AI-ready documentation for the PayHere payment gateway, accessible as `llm.txt` or via an MCP server.
Enables AI agents to securely execute remote commands via a safety-first SSH proxy.
Empowers AI agents to read and modify Figma design documents, addressing the write limitations of the official Figma Model Context Protocol server.
Empowers AI agents to manage projects, issues, comments, artifacts, workflows, and workspace resources within the Toony platform.
Transforms plain-text Bible and Catechism references in Markdown into linked resources, including Bible Gateway URLs, Obsidian wiki-links, and Catholic Cross Reference links.
Extracts clean text from web pages for AI development environments while keeping all processing local for complete privacy.
Builds a unified, queryable knowledge graph from diverse codebases, including Python, Terraform/HCL, YAML, and Jinja2, to power LLM agents.
Establishes the authority layer for design-to-code component lifecycles by managing variant definitions and accessibility contracts.
Provides agent-native zero-knowledge proof infrastructure, enabling privacy-preserving computations within a secure AWS Nitro Enclave.
Provides persistent, intelligent memory and a robust knowledge graph for AI agents to learn, remember, and improve across sessions.
Performs comprehensive SEO auditing and real-time web monitoring for websites, integrating directly with AI conversation models.
Performs local semantic search across personal knowledge bases and exposes the results to Claude via an MCP server.
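Every server in this list speaks the same wire format: Model Context Protocol messages are JSON-RPC 2.0, and a client discovers what a server offers with requests such as `tools/list`. A minimal sketch of that framing follows; the tool name and schema shown are hypothetical illustrations, not taken from any server above.

```python
import json

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request as a JSON string."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# A client asking a server which tools it exposes:
request = make_request(1, "tools/list")

# A hypothetical server response advertising a single tool:
response = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "fetch_page_text",  # hypothetical tool name
                "description": "Extract clean text from a web page",
                "inputSchema": {
                    "type": "object",
                    "properties": {"url": {"type": "string"}},
                    "required": ["url"],
                },
            }
        ]
    },
})

# The client parses the result to learn which tools it may call:
tools = json.loads(response)["result"]["tools"]
print([t["name"] for t in tools])  # -> ['fetch_page_text']
```

Whether a server manages email, Figma documents, or SQLite memory, clients interact with it through this same request/response shape.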