Introduction
Empower your development workflow with Memory IA, a robust Model Context Protocol (MCP) server that hosts an AI agent with persistent memory. Built on Python, FastAPI, and LangGraph, the server uses SQLite for long-term memory storage and leverages Ollama to run local large language models such as Llama 3.2 or Qwen. It exposes a standardized JSON-RPC interface, so it works out of the box as a multi-client AI backend for tools such as VS Code, Gemini-CLI, and Cursor.
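To make the architecture concrete, the sketch below shows the general shape of such a server: a FastAPI app exposing a single JSON-RPC endpoint backed by a SQLite table for long-term memories. The endpoint path (`/rpc`), method names (`memory.store`, `memory.recall`), and schema are illustrative assumptions for this sketch, not the project's actual API.

```python
# Minimal sketch (assumed names): FastAPI + SQLite behind a JSON-RPC endpoint.
import sqlite3

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()
db = sqlite3.connect("memory.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS memories (id INTEGER PRIMARY KEY, content TEXT)")

@app.post("/rpc")  # hypothetical endpoint path; the real server may differ
async def rpc(request: Request) -> JSONResponse:
    payload = await request.json()
    method, params = payload.get("method"), payload.get("params", {})

    if method == "memory.store":        # hypothetical method name
        db.execute("INSERT INTO memories (content) VALUES (?)", (params["content"],))
        db.commit()
        result = {"stored": True}
    elif method == "memory.recall":     # hypothetical method name
        rows = db.execute("SELECT content FROM memories").fetchall()
        result = {"memories": [row[0] for row in rows]}
    else:
        return JSONResponse({"jsonrpc": "2.0", "id": payload.get("id"),
                             "error": {"code": -32601, "message": "Method not found"}})

    return JSONResponse({"jsonrpc": "2.0", "id": payload.get("id"), "result": result})
```

A client such as VS Code, Gemini-CLI, or Cursor would then point its MCP configuration at this endpoint and issue JSON-RPC calls over HTTP, while the server persists and recalls memories across sessions.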