Provides an MCP server for accessing academic paper data from arXiv and serving prompt templates for large language models.
This project provides a practical implementation of a Model Context Protocol (MCP) server, designed to make it easier to build rich-context AI applications. Developed as a hands-on exercise from DeepLearning.AI's 'Build Rich-Context AI Apps with Anthropic' course, it demonstrates how MCP standardizes LLM access to external tools and data. The server exposes tools for searching and retrieving academic papers from arXiv, manages research-related resources, and supplies structured prompt templates that let LLMs synthesize and summarize research topics. Built with FastMCP, it uses a client-server architecture and includes Docker and Procfile support for straightforward deployment.
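The sketch below illustrates how such a server could be wired up with FastMCP: one tool that queries arXiv and one prompt template that guides an LLM through summarizing the results. The tool and prompt names (`search_papers`, `generate_search_prompt`) and the returned fields are illustrative assumptions, not the project's actual identifiers.

```python
# Minimal FastMCP server sketch (assumed names, not the project's real API surface).
import arxiv
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("research")


@mcp.tool()
def search_papers(topic: str, max_results: int = 5) -> list[dict]:
    """Search arXiv for papers on a topic and return basic metadata."""
    client = arxiv.Client()
    search = arxiv.Search(
        query=topic,
        max_results=max_results,
        sort_by=arxiv.SortCriterion.Relevance,
    )
    return [
        {
            "id": paper.get_short_id(),
            "title": paper.title,
            "authors": [author.name for author in paper.authors],
            "summary": paper.summary,
            "pdf_url": paper.pdf_url,
        }
        for paper in client.results(search)
    ]


@mcp.prompt()
def generate_search_prompt(topic: str, num_papers: int = 5) -> str:
    """Return a structured prompt asking an LLM to survey a research topic."""
    return (
        f"Use the search_papers tool to find {num_papers} papers about '{topic}'. "
        "Summarize each paper's key contribution, then synthesize the findings "
        "into a short overview of the current state of research on this topic."
    )


if __name__ == "__main__":
    # stdio transport lets an MCP client (e.g. Claude Desktop) launch and talk
    # to the server over standard input/output.
    mcp.run(transport="stdio")
```

An MCP client would typically start this script as a subprocess over stdio, discover the registered tool and prompt, and invoke them on behalf of the LLM during a research conversation.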