Client Server Assessment
Executes system commands generated by an LLM from client queries.
Overview
This project implements a client-server system in which a client sends a natural-language query to the server. The server passes the query to an LLM served by Ollama, derives an appropriate system command from the model's response, executes that command with subprocess, and returns the output to the client. The system is built with FastAPI, FastMCP, aiohttp, and standard Python libraries.
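As a minimal sketch of what this flow could look like on the server side (not the project's actual code): the route, model name, and prompt below are illustrative assumptions, and Ollama is assumed to be running locally with its default /api/generate endpoint.

```python
# Hypothetical server-side sketch: query -> LLM-generated command -> subprocess output.
import subprocess

import aiohttp
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint
MODEL = "llama3"  # placeholder model name

class Query(BaseModel):
    text: str

@app.post("/run")  # illustrative route, not necessarily the project's
async def run(query: Query) -> dict:
    # Ask the LLM to translate the natural-language query into a shell command.
    prompt = (
        "Return a single shell command, with no explanation, that answers: "
        f"{query.text}"
    )
    async with aiohttp.ClientSession() as session:
        async with session.post(
            OLLAMA_URL,
            json={"model": MODEL, "prompt": prompt, "stream": False},
        ) as resp:
            command = (await resp.json())["response"].strip()

    # Execute the generated command and return its output to the client.
    result = subprocess.run(command, shell=True, capture_output=True, text=True)
    return {"command": command, "stdout": result.stdout, "stderr": result.stderr}
```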
Key Features
- Built with FastAPI and FastMCP.
- Processes client queries using an LLM.
- Uses aiohttp for asynchronous client-server communication (a minimal client sketch follows this list).
- Executes system commands using subprocess.
- Generates system commands based on LLM output.
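A client interaction might look roughly like the following; the URL and /run route are assumptions carried over from the server sketch above, not the project's documented interface.

```python
# Hypothetical client sketch: send a natural-language query and print the server's response.
import asyncio

import aiohttp

async def send_query(text: str) -> dict:
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://localhost:8000/run", json={"text": text}
        ) as resp:
            return await resp.json()

if __name__ == "__main__":
    print(asyncio.run(send_query("show disk usage for the current directory")))
```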
Use Cases
- Automated system administration tasks.
- Remote command execution based on natural language queries.
- Experimenting with LLMs for system control.