Make workspace-core the default MCP profile

Flip bare `pyro mcp serve`, `create_server()`, and `Pyro.create_server()` to default to `workspace-core` in 4.0.0, while keeping `workspace-full` as the explicit advanced opt-in surface.
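A minimal sketch of the flipped default, assuming a simple profile-to-tools resolver; the tool names and the `resolve_profile_tools` helper below are illustrative assumptions, not the real Pyro internals:

```python
# Illustrative sketch only -- these tool names and this resolver are
# assumptions for explanation, not the actual Pyro 4.0.0 implementation.
WORKSPACE_CORE_TOOLS = ["read_file", "write_file", "list_dir"]  # hypothetical core set
WORKSPACE_FULL_EXTRAS = ["shell", "services", "snapshots", "disk"]  # per the commit text

def resolve_profile_tools(profile: str = "workspace-core") -> list[str]:
    """Bare calls now resolve to workspace-core; workspace-full stays the opt-in."""
    tools = list(WORKSPACE_CORE_TOOLS)
    if profile == "workspace-full":
        tools.extend(WORKSPACE_FULL_EXTRAS)
    return tools
```

The key design point is that the default moves into the parameter itself, so every bare entry point (`pyro mcp serve`, `create_server()`, `Pyro.create_server()`) picks up the same profile without callers changing.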

Rewrite the MCP-facing docs and host-specific examples around the bare default command, update package and catalog compatibility to 4.x, and move the public-contract wording from 3.x compatibility guidance to the new stable default.

Adjust the server, API, and contract tests so that bare server creation is now asserted to expose the `workspace-core` tool set, while explicit `workspace-full` coverage continues to prove that shells, services, snapshots, and disk tools remain available.
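The adjusted test shape could be sketched as follows, using a stand-in server object rather than the real Pyro API; the class, the baked-in tool names, and the test names are all assumptions for illustration:

```python
# Stand-in server for illustration; the real tests exercise Pyro.create_server().
CORE_TOOLS = {"read_file", "write_file", "list_dir"}          # hypothetical names
FULL_ONLY_TOOLS = {"shell", "services", "snapshots", "disk"}  # per the commit text

class StubServer:
    """Mimics the 4.0.0 behavior: bare construction yields workspace-core."""
    def __init__(self, profile: str = "workspace-core") -> None:
        self.tool_names = set(CORE_TOOLS)
        if profile == "workspace-full":
            self.tool_names |= FULL_ONLY_TOOLS

def test_bare_server_defaults_to_workspace_core() -> None:
    # Bare creation must expose exactly the core tool set, nothing more.
    assert StubServer().tool_names == CORE_TOOLS

def test_workspace_full_keeps_advanced_tools() -> None:
    # Explicit opt-in must still surface the advanced tools.
    assert FULL_ONLY_TOOLS <= StubServer(profile="workspace-full").tool_names
```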

Validation:
- `uv lock`
- `.venv/bin/pytest --no-cov tests/test_cli.py tests/test_api.py tests/test_server.py tests/test_public_contract.py`
- `UV_CACHE_DIR=.uv-cache make check`
- `UV_CACHE_DIR=.uv-cache make dist-check`
- real guest-backed smoke for bare `Pyro.create_server()` plus explicit `profile="workspace-full"`
Thales Maciel 2026-03-13 14:14:15 -03:00
parent 68d8e875e0
commit c00c699a9f
25 changed files with 170 additions and 121 deletions


@@ -4,10 +4,10 @@ Requirements:
 - `pip install openai` or `uv add openai`
 - `OPENAI_API_KEY`
-This is the recommended persistent-chat example. It mirrors the
-`workspace-core` MCP profile by deriving tool schemas from
-`Pyro.create_server(profile="workspace-core")` and dispatching tool calls back
-through that same profiled server.
+This is the recommended persistent-chat example. In 4.x the default MCP server
+profile is already `workspace-core`, so it derives tool schemas from
+`Pyro.create_server()` and dispatches tool calls back through that same
+default-profile server.
 """
 from __future__ import annotations
@@ -45,7 +45,7 @@ async def run_openai_workspace_core_example(*, prompt: str, model: str = DEFAULT
     from openai import OpenAI  # type: ignore[import-not-found]
     pyro = Pyro()
-    server = pyro.create_server(profile="workspace-core")
+    server = pyro.create_server()
     tools = [_tool_to_openai(tool) for tool in await server.list_tools()]
     client = OpenAI()
     input_items: list[dict[str, Any]] = [{"role": "user", "content": prompt}]