# mcp

MCP (Model Context Protocol) lets you build tools that LLMs can use. fastmcp makes this straightforward.

## what MCP is

MCP servers expose:

- **tools** - functions LLMs can call (actions, side effects)
- **resources** - read-only data (like GET endpoints)
- **prompts** - reusable message templates

clients (like Claude) discover and call these over stdio or HTTP.

## basic server

```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.resource("config://version")
def get_version() -> str:
    return "1.0.0"

if __name__ == "__main__":
    mcp.run()
```

fastmcp generates JSON schemas from type hints and docstrings automatically.

## running

```bash
# stdio (default, for local tools)
python server.py

# http (for deployment)
fastmcp run server.py --transport http --port 8000
```

## tools vs resources

**tools** do things:

```python
@mcp.tool
async def create_post(text: str) -> dict:
    """Create a new post."""
    return await api.create(text)
```

**resources** read things:

```python
@mcp.resource("posts://{post_id}")
async def get_post(post_id: str) -> dict:
    """Get a post by ID."""
    return await api.get(post_id)
```

## context

access MCP capabilities within tools:

```python
from fastmcp import FastMCP, Context

mcp = FastMCP("server")

@mcp.tool
async def process(uri: str, ctx: Context) -> str:
    await ctx.info(f"Processing {uri}...")
    data = await ctx.read_resource(uri)
    await ctx.report_progress(50, 100)
    return data
```

## middleware

add authentication or other cross-cutting concerns:

```python
from fastmcp import FastMCP
from fastmcp.server.middleware import Middleware

class AuthMiddleware(Middleware):
    async def on_call_tool(self, context, call_next):
        # extract auth from headers, set context state
        return await call_next(context)

mcp = FastMCP("server")
mcp.add_middleware(AuthMiddleware())
```

## decorator patterns

add parameters dynamically (from pdsx):

```python
import inspect
from functools import wraps

def filterable(fn):
    """Add a _filter parameter for JMESPath filtering."""
    @wraps(fn)
    async def wrapper(*args, _filter: str | None = None, **kwargs):
        result = await fn(*args, **kwargs)
        if _filter:
            import jmespath
            return jmespath.search(_filter, result)
        return result

    # modify signature to include new param
    sig = inspect.signature(fn)
    params = list(sig.parameters.values())
    params.append(inspect.Parameter(
        "_filter",
        inspect.Parameter.KEYWORD_ONLY,
        default=None,
        annotation=str | None,
    ))
    wrapper.__signature__ = sig.replace(parameters=params)
    return wrapper

@mcp.tool
@filterable
async def list_records(collection: str) -> list[dict]:
    ...
```

## response size protection

LLMs have context limits. protect against flooding:

```python
MAX_RESPONSE_CHARS = 30000

def truncate_response(records: list) -> list:
    import json
    serialized = json.dumps(records)
    if len(serialized) <= MAX_RESPONSE_CHARS:
        return records
    # truncate and add message about using _filter
    ...
```

## claude code plugins

structure for Claude Code integration:

```
.claude-plugin/
├── plugin.json        # plugin definition
└── marketplace.json   # marketplace metadata
skills/
└── domain/
    └── SKILL.md       # contextual guidance
```

**plugin.json**:

```json
{
  "name": "myserver",
  "description": "what it does",
  "mcpServers": "./.mcp.json"
}
```

skills are markdown files loaded as context when relevant to the task.
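a minimal sketch of what `skills/domain/SKILL.md` might look like, assuming the YAML-frontmatter format Claude Code uses for skills (the `posts` domain and the guidance text are made up for illustration):

```markdown
---
name: posts
description: guidance for working with the posts tools in this MCP server
---

prefer `_filter` (JMESPath) on `list_records` to keep responses small.
use resources (`posts://{post_id}`) for reads, tools for writes.
```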
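the `mcpServers` field in plugin.json points at a separate `.mcp.json`. a sketch of what that file might contain, assuming the standard Claude Code server-config shape (the `uv run` launch command is an assumption, swap in however the server is actually started):

```json
{
  "mcpServers": {
    "myserver": {
      "command": "uv",
      "args": ["run", "python", "server.py"]
    }
  }
}
```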
## entry points

expose both CLI and MCP server:

```toml
[project.scripts]
mytool = "mytool.cli:main"
mytool-mcp = "mytool.mcp:main"
```

sources:

- [fastmcp](https://github.com/jlowin/fastmcp)
- [pdsx](https://github.com/zzstoatzz/pdsx)
- [prefect-mcp-server-demo](https://github.com/zzstoatzz/prefect-mcp-server-demo)
- [gofastmcp.com](https://gofastmcp.com)
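for completeness, a sketch of the `mytool.mcp:main` target referenced in the entry points above (module layout and names are assumptions; it just wraps the basic-server pattern from earlier):

```python
# mytool/mcp.py
from fastmcp import FastMCP

mcp = FastMCP("mytool")

# ... @mcp.tool / @mcp.resource definitions ...

def main() -> None:
    """console-script entry point: run the server over stdio."""
    mcp.run()
```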