# pyro-mcp

`pyro-mcp` is a minimal Python library that exposes an MCP-compatible server with one static tool.

## v0.0.1 Features

- Official Python MCP SDK integration.
- Public server factory: `pyro_mcp.create_server()`.
- One static MCP tool: `hello_static`.
- Runnable demonstration script.
- Project automation via `Makefile`, `pre-commit`, `ruff`, `mypy`, and `pytest`.

## Requirements

- Python 3.12+
- `uv` installed

## Setup

```bash
make setup
```

This installs runtime and development dependencies into `.venv`.

## Run the demo

```bash
make demo
```

Expected output:

```json
{
  "message": "hello from pyro_mcp",
  "status": "ok",
  "version": "0.0.1"
}
```
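
The payload above can be mirrored by a plain function. This is an illustrative sketch, not the library's actual implementation: the real `hello_static` tool is registered with the MCP SDK inside `pyro_mcp`, while the standalone function below only reproduces the demo's output shape.

```python
import json


def hello_static() -> dict[str, str]:
    """Return the static greeting payload shown in the demo output."""
    return {
        "message": "hello from pyro_mcp",
        "status": "ok",
        "version": "0.0.1",
    }


# Prints the same JSON document as `make demo`.
print(json.dumps(hello_static(), indent=2, sort_keys=True))
```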

## Run the Ollama tool-calling demo

Start Ollama and ensure the model is available (defaults to `llama3.2:3b`):

```bash
ollama serve
ollama pull llama3.2:3b
```

Then run:

```bash
make ollama-demo
```

You can also run `make demo ollama-demo` to execute both demos in one command.

The Make target defaults to the model `llama3.2:3b` and can be overridden:

```bash
make ollama-demo OLLAMA_MODEL=llama3.2:1b
```
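
The demo script's internals are not shown here, but honoring an `OLLAMA_MODEL` environment variable typically follows this pattern (a hypothetical sketch; the names `DEFAULT_MODEL` and `resolve_model` are illustrative, not part of `pyro_mcp`):

```python
import os

# Illustrative default mirroring the Make target's default model.
DEFAULT_MODEL = "llama3.2:3b"


def resolve_model() -> str:
    """Prefer the OLLAMA_MODEL environment variable, else the default."""
    return os.environ.get("OLLAMA_MODEL", DEFAULT_MODEL)


print(resolve_model())
```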

## Run checks

```bash
make check
```

`make check` runs:

- `ruff` lint checks
- `mypy` type checks
- `pytest` (with coverage threshold configured in `pyproject.toml`)
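
The coverage threshold referenced above lives in `pyproject.toml`. The exact numbers are project-specific; a typical `pytest-cov` configuration looks like this (illustrative values, not the repo's actual settings):

```toml
[tool.pytest.ini_options]
addopts = "--cov=pyro_mcp --cov-fail-under=90"

[tool.coverage.report]
show_missing = true
```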

## Run MCP server (stdio transport)

```bash
make run-server
```

## Pre-commit

Install hooks:

```bash
make install-hooks
```

Hooks run `ruff`, `mypy`, and `pytest` on each commit.
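
As a sketch of how such hooks are commonly wired, a local-hook configuration might look like this (illustrative only; the repository's actual `.pre-commit-config.yaml` is authoritative):

```yaml
repos:
  - repo: local
    hooks:
      - id: ruff
        name: ruff
        entry: uv run ruff check .
        language: system
        pass_filenames: false
      - id: mypy
        name: mypy
        entry: uv run mypy .
        language: system
        pass_filenames: false
      - id: pytest
        name: pytest
        entry: uv run pytest
        language: system
        pass_filenames: false
```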