This MCP server turns Neo4j into a backend for LLM-driven GraphRAG applications. It provides tools for semantic vector search, fulltext search using Lucene syntax, and methods to combine the results of both directly within Cypher queries. Built on LiteLLM, it supports a wide range of embedding providers (OpenAI, Azure, Bedrock, Cohere, Ollama), so AI agents can retrieve information from a Neo4j graph database and explore complex, connected data more easily.
Key Features
- Semantic vector search using Neo4j vector indexes.
- Cypher query execution augmented by vector or fulltext search results.
- Keyword fulltext search with Lucene syntax on Neo4j fulltext indexes.
- Multi-provider embedding support via LiteLLM for diverse AI models.
- Read-only Cypher queries for general graph traversal.
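As a sketch of the kind of query these tools enable, the following Cypher statement combines a vector search with graph traversal using Neo4j's built-in `db.index.vector.queryNodes` procedure. The index name `chunk_embeddings`, the `:Chunk`/`:Document` labels, and the `HAS_CHUNK` relationship are hypothetical placeholders, and `$queryVector` is assumed to be a pre-computed query embedding:

```cypher
// Hypothetical schema: a vector index named 'chunk_embeddings'
// over :Chunk(embedding), with chunks linked to parent documents.
// $queryVector is the embedding of the user's question.
CALL db.index.vector.queryNodes('chunk_embeddings', 5, $queryVector)
YIELD node, score
// Traverse from each matched chunk to its parent document.
MATCH (node)<-[:HAS_CHUNK]-(doc:Document)
RETURN doc.title AS document, node.text AS chunk, score
ORDER BY score DESC
```

This pattern, vector search followed by graph traversal in one query, is what distinguishes GraphRAG retrieval from plain vector lookup: the top-k similarity hits are enriched with surrounding graph context before being returned to the LLM.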