languages/python/README.md

# python

notes on python patterns - tooling, project structure, and building things.

## topics

- [uv](./uv.md) - cargo for python
- [project setup](./project-setup.md) - src/ layout, pyproject.toml, justfile
- [pydantic-settings](./pydantic-settings.md) - centralized typed configuration
- [tooling](./tooling.md) - ruff, ty, pre-commit
- [mcp](./mcp.md) - building MCP servers with fastmcp

## sources

patterns derived from building:

| project | what it is |
|---------|------------|
| [pdsx](https://github.com/zzstoatzz/pdsx) | ATProto MCP server and CLI |
| [raggy](https://github.com/zzstoatzz/raggy) | document loaders for LLMs |
| [prefect-pack](https://github.com/zzstoatzz/prefect-pack) | prefect utilities |
| [plyr.fm](https://github.com/zzstoatzz/plyr.fm) | music on atproto |

and studying:

| project | what it is |
|---------|------------|
| [fastmcp](https://github.com/jlowin/fastmcp) | MCP server framework |
| [docket](https://github.com/chrisguidry/docket) | distributed task system |
languages/python/mcp.md

# mcp

MCP (Model Context Protocol) lets you build tools that LLMs can use. fastmcp makes this straightforward.

## what MCP is

MCP servers expose:
- **tools** - functions LLMs can call (actions, side effects)
- **resources** - read-only data (like GET endpoints)
- **prompts** - reusable message templates

clients (like Claude) discover and call these over stdio or HTTP.

## basic server

```python
from fastmcp import FastMCP

mcp = FastMCP("my-server")

@mcp.tool
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.resource("config://version")
def get_version() -> str:
    return "1.0.0"

if __name__ == "__main__":
    mcp.run()
```

fastmcp generates JSON schemas from type hints and docstrings automatically.
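that introspection can be approximated with the stdlib - a rough sketch of the idea, not fastmcp's actual implementation (`tool_schema` and `PY_TO_JSON` are hypothetical names):

```python
import inspect
from typing import get_type_hints

# hypothetical sketch: derive a JSON-schema-ish tool description from
# a function's type hints and docstring, the way an MCP framework might
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_schema(fn) -> dict:
    hints = get_type_hints(fn)
    properties = {
        name: {"type": PY_TO_JSON.get(hint, "object")}
        for name, hint in hints.items()
        if name != "return"
    }
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": {"type": "object", "properties": properties},
    }

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b
```

`tool_schema(add)` yields a dict with the name, docstring, and typed parameters - which is why good type hints and docstrings directly improve how well the LLM uses your tools.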
35
+
36
+
## running
37
+
38
+
```bash
39
+
# stdio (default, for local tools)
40
+
python server.py
41
+
42
+
# http (for deployment)
43
+
fastmcp run server.py --transport http --port 8000
44
+
```
45
+
46
+
## tools vs resources
47
+
48
+
**tools** do things:
49
+
```python
50
+
@mcp.tool
51
+
async def create_post(text: str) -> dict:
52
+
"""Create a new post."""
53
+
return await api.create(text)
54
+
```
55
+
56
+
**resources** read things:
57
+
```python
58
+
@mcp.resource("posts://{post_id}")
59
+
async def get_post(post_id: str) -> dict:
60
+
"""Get a post by ID."""
61
+
return await api.get(post_id)
62
+
```
63
+
64
+
## context
65
+
66
+
access MCP capabilities within tools:
67
+
68
+
```python
69
+
from fastmcp import FastMCP, Context
70
+
71
+
mcp = FastMCP("server")
72
+
73
+
@mcp.tool
74
+
async def process(uri: str, ctx: Context) -> str:
75
+
await ctx.info(f"Processing {uri}...")
76
+
data = await ctx.read_resource(uri)
77
+
await ctx.report_progress(50, 100)
78
+
return data
79
+
```
80
+
81
+
## middleware
82
+
83
+
add authentication or other cross-cutting concerns:
84
+
85
+
```python
86
+
from fastmcp import FastMCP
87
+
from fastmcp.server.middleware import Middleware
88
+
89
+
class AuthMiddleware(Middleware):
90
+
async def on_call_tool(self, context, call_next):
91
+
# extract auth from headers, set context state
92
+
return await call_next(context)
93
+
94
+
mcp = FastMCP("server")
95
+
mcp.add_middleware(AuthMiddleware())
96
+
```
97
+
98
+
## decorator patterns
99
+
100
+
add parameters dynamically (from pdsx):
101
+
102
+
```python
103
+
import inspect
104
+
from functools import wraps
105
+
106
+
def filterable(fn):
107
+
"""Add a _filter parameter for JMESPath filtering."""
108
+
@wraps(fn)
109
+
async def wrapper(*args, _filter: str | None = None, **kwargs):
110
+
result = await fn(*args, **kwargs)
111
+
if _filter:
112
+
import jmespath
113
+
return jmespath.search(_filter, result)
114
+
return result
115
+
116
+
# modify signature to include new param
117
+
sig = inspect.signature(fn)
118
+
params = list(sig.parameters.values())
119
+
params.append(inspect.Parameter(
120
+
"_filter",
121
+
inspect.Parameter.KEYWORD_ONLY,
122
+
default=None,
123
+
annotation=str | None,
124
+
))
125
+
wrapper.__signature__ = sig.replace(parameters=params)
126
+
return wrapper
127
+
128
+
@mcp.tool
129
+
@filterable
130
+
async def list_records(collection: str) -> list[dict]:
131
+
...
132
+
```
133
+
134
+
## response size protection

LLMs have context limits. protect against flooding:

```python
import json

MAX_RESPONSE_CHARS = 30000

def truncate_response(records: list) -> list:
    serialized = json.dumps(records)
    if len(serialized) <= MAX_RESPONSE_CHARS:
        return records
    # one way to finish the elided step: drop records from the end until
    # the payload fits, then tell the model to narrow with _filter instead
    kept = list(records)
    while kept and len(json.dumps(kept)) > MAX_RESPONSE_CHARS:
        kept.pop()
    kept.append({
        "truncated": len(records) - len(kept),
        "hint": "use _filter to narrow results",
    })
    return kept
```
## claude code plugins

structure for Claude Code integration:

```
.claude-plugin/
├── plugin.json        # plugin definition
└── marketplace.json   # marketplace metadata

skills/
└── domain/
    └── SKILL.md       # contextual guidance
```

**plugin.json**:
```json
{
  "name": "myserver",
  "description": "what it does",
  "mcpServers": "./.mcp.json"
}
```

skills are markdown files loaded as context when relevant to the task.

## entry points

expose both CLI and MCP server:

```toml
[project.scripts]
mytool = "mytool.cli:main"
mytool-mcp = "mytool.mcp:main"
```

sources:
- [fastmcp](https://github.com/jlowin/fastmcp)
- [pdsx](https://github.com/zzstoatzz/pdsx)
- [gofastmcp.com](https://gofastmcp.com)
languages/python/project-setup.md

# project setup

consistent structure across projects: src/ layout, pyproject.toml as single source of truth, justfile for commands.

## directory structure

```
myproject/
├── src/myproject/
│   ├── __init__.py
│   ├── cli.py
│   ├── settings.py
│   └── _internal/    # private implementation
├── tests/
├── pyproject.toml
├── justfile
└── .pre-commit-config.yaml
```

the `src/` layout prevents accidental imports from the working directory. your package is only importable when properly installed.

## pyproject.toml

```toml
[project]
name = "myproject"
description = "what it does"
readme = "README.md"
requires-python = ">=3.10"
dynamic = ["version"]
dependencies = [
    "httpx>=0.27",
    "pydantic>=2.0",
]

[project.scripts]
myproject = "myproject.cli:main"

[build-system]
requires = ["hatchling", "uv-dynamic-versioning"]
build-backend = "hatchling.build"

[tool.hatch.version]
source = "uv-dynamic-versioning"

[tool.uv-dynamic-versioning]
vcs = "git"
style = "pep440"
bump = true
fallback-version = "0.0.0"

[dependency-groups]
dev = [
    "pytest>=8.0",
    "ruff>=0.8",
    "ty>=0.0.1a6",
]
```

key patterns:
- `dynamic = ["version"]` - version comes from git tags, not manual editing
- `[dependency-groups]` - dev deps separate from runtime deps
- `[project.scripts]` - CLI entry points

## versioning from git tags

with `uv-dynamic-versioning`, your version is derived from git:

```bash
git tag v0.1.0
git push --tags
```

no more editing `__version__` or `pyproject.toml` for releases.
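at runtime, read the derived version from package metadata rather than a hand-maintained `__version__` - a sketch, where `myproject` is a placeholder for your installed package name:

```python
from importlib.metadata import PackageNotFoundError, version

def get_version(package: str = "myproject") -> str:
    """Return the installed version derived from the git tag at build time."""
    try:
        return version(package)
    except PackageNotFoundError:
        # running from an uninstalled checkout; mirrors fallback-version above
        return "0.0.0"
```

this keeps the git tag as the single source of truth for the version everywhere.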
## justfile

```makefile
check-uv:
    #!/usr/bin/env sh
    if ! command -v uv >/dev/null 2>&1; then
        echo "uv is not installed. Install: curl -LsSf https://astral.sh/uv/install.sh | sh"
        exit 1
    fi

install: check-uv
    uv sync

test:
    uv run pytest tests/ -xvs

lint:
    uv run ruff format src/ tests/
    uv run ruff check src/ tests/ --fix

check:
    uv run ty check
```

run with `just test`, `just lint`, etc.

## multiple entry points

for projects with both CLI and MCP server:

```toml
[project.scripts]
myproject = "myproject.cli:main"
myproject-mcp = "myproject.mcp:main"
```

## optional dependencies

for features that not everyone needs:

```toml
[project.optional-dependencies]
mcp = ["fastmcp>=2.0"]
```

install with `uv sync --extra mcp` or `uv add 'myproject[mcp]'`.
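inside the package, a common companion pattern is to guard the optional import so the base install keeps working - a sketch assuming `fastmcp` is only present when the `mcp` extra is installed:

```python
# guard the optional dependency: the base install must not crash on import
try:
    import fastmcp  # noqa: F401  (only present with the "mcp" extra)
    HAS_MCP = True
except ImportError:
    HAS_MCP = False

def require_mcp() -> None:
    """Raise a helpful error when an MCP feature is used without the extra."""
    if not HAS_MCP:
        raise ImportError("MCP support requires the extra: uv add 'myproject[mcp]'")
```

call `require_mcp()` at the top of MCP-only entry points so users get an actionable message instead of a bare `ModuleNotFoundError`.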
sources:
- [pdsx/pyproject.toml](https://github.com/zzstoatzz/pdsx/blob/main/pyproject.toml)
- [raggy/pyproject.toml](https://github.com/zzstoatzz/raggy/blob/main/pyproject.toml)
languages/python/pydantic-settings.md

# pydantic-settings

replace scattered `os.getenv()` calls with a typed, validated settings class.

## the problem

```python
import os

REDIS_HOST = os.getenv("REDIS_HOST")
REDIS_PORT = os.getenv("REDIS_PORT")

if not REDIS_HOST or not REDIS_PORT:
    raise ValueError("REDIS_HOST and REDIS_PORT must be set")

# REDIS_PORT is still a string here
```

issues:
- validation happens where you use the value, not at startup
- no type coercion (port is a string)
- configuration scattered across files
- easy to forget validation

## the solution

```python
from pydantic import Field, SecretStr
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        env_file=".env",
        extra="ignore",
    )

    redis_host: str
    redis_port: int = Field(ge=0)
    openai_api_key: SecretStr

settings = Settings()
```

now:
- missing `REDIS_HOST` fails immediately at import
- `redis_port` is coerced to int and validated >= 0
- `openai_api_key` won't accidentally appear in logs
- all configuration in one place
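for intuition, here's a rough stdlib approximation of what the class above buys you - pydantic-settings also handles `.env` files, aliases, secrets, and much more, so this is only a sketch of the read-coerce-validate-at-startup idea:

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    redis_host: str
    redis_port: int

    @classmethod
    def from_env(cls) -> "Settings":
        # read, coerce, and validate once at startup - fail fast on bad config
        host = os.environ.get("REDIS_HOST")
        port_raw = os.environ.get("REDIS_PORT")
        if host is None or port_raw is None:
            raise ValueError("REDIS_HOST and REDIS_PORT must be set")
        port = int(port_raw)  # coercion: raises ValueError on non-numeric input
        if port < 0:
            raise ValueError("REDIS_PORT must be >= 0")
        return cls(redis_host=host, redis_port=port)
```

writing this by hand for every field is exactly the boilerplate pydantic-settings deletes.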
## field aliases

when env var names don't match your preferred attribute names:

```python
class Settings(BaseSettings):
    current_user: str = Field(alias="USER")
```

reads from `$USER`, accessed as `settings.current_user`.

## secrets

`SecretStr` prevents accidental exposure:

```python
class Settings(BaseSettings):
    api_key: SecretStr

settings = Settings()
print(settings.api_key)                     # SecretStr('**********')
print(settings.api_key.get_secret_value())  # actual value
```

## contextual serialization

when you need to unmask secrets for subprocesses:

```python
from pydantic import Secret, SerializationInfo

def maybe_unmask(v: Secret[str], info: SerializationInfo) -> str | Secret[str]:
    if info.context and info.context.get("unmask"):
        return v.get_secret_value()
    return v

# usage
settings.model_dump(context={"unmask": True})
```

## nested settings

for larger projects:

```python
class DatabaseSettings(BaseSettings):
    host: str = "localhost"
    port: int = 5432

class Settings(BaseSettings):
    database: DatabaseSettings = Field(default_factory=DatabaseSettings)
```

## why fail-fast matters

with `os.getenv()`, you find out about missing config when the code path executes - maybe in production, at 2am.

with pydantic-settings, invalid configuration fails at startup. deploy fails, not runtime.

sources:
- [how to use pydantic-settings](https://blog.zzstoatzz.io/how-to-use-pydantic-settings/)
languages/python/tooling.md

# tooling

ruff for linting and formatting. ty for type checking. pre-commit to enforce both.

## ruff

replaces black, isort, flake8, and dozens of plugins. one tool, fast.

```bash
uv run ruff format src/ tests/   # format
uv run ruff check src/ tests/    # lint
uv run ruff check --fix          # lint and auto-fix
```

### pyproject.toml config

```toml
[tool.ruff]
line-length = 88

[tool.ruff.lint]
fixable = ["ALL"]
extend-select = [
    "I",    # isort
    "B",    # flake8-bugbear
    "C4",   # flake8-comprehensions
    "UP",   # pyupgrade
    "SIM",  # flake8-simplify
    "RUF",  # ruff-specific
]
ignore = [
    "COM812",  # conflicts with formatter
]

[tool.ruff.lint.per-file-ignores]
"__init__.py" = ["F401", "I001"]  # unused imports ok in __init__
"tests/**/*.py" = ["S101"]        # assert ok in tests
```

## ty

astral's new type checker. still early but fast and improving.

```bash
uv run ty check
```

### pyproject.toml config

```toml
[tool.ty.src]
include = ["src", "tests"]
exclude = ["**/node_modules", "**/__pycache__", ".venv"]

[tool.ty.environment]
python-version = "3.10"

[tool.ty.rules]
# start permissive, tighten over time
unknown-argument = "ignore"
no-matching-overload = "ignore"
```

## pre-commit

enforce standards before commits reach the repo.

### .pre-commit-config.yaml

```yaml
repos:
  - repo: https://github.com/abravalheri/validate-pyproject
    rev: v0.24.1
    hooks:
      - id: validate-pyproject

  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.8.0
    hooks:
      - id: ruff-check
        args: [--fix, --exit-non-zero-on-fix]
      - id: ruff-format

  - repo: local
    hooks:
      - id: type-check
        name: type check
        entry: uv run ty check
        language: system
        types: [python]
        pass_filenames: false

  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: no-commit-to-branch
        args: [--branch, main]
```

install with:

```bash
uv run pre-commit install
```

never use `--no-verify` to skip hooks. fix the issue instead.

## pytest

```toml
[tool.pytest.ini_options]
asyncio_mode = "auto"
asyncio_default_fixture_loop_scope = "function"
testpaths = ["tests"]
```

`asyncio_mode = "auto"` means async tests just work - no `@pytest.mark.asyncio` needed.
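for example, a plain `async def` test is collected and run as-is under auto mode - a minimal sketch (driven manually at the bottom so it also runs outside pytest):

```python
import asyncio

# with asyncio_mode = "auto", pytest runs this coroutine directly -
# no @pytest.mark.asyncio decorator required
async def test_roundtrip():
    await asyncio.sleep(0)  # stand-in for a real await (HTTP call, DB query, ...)
    assert "abc".upper() == "ABC"

# outside pytest, drive the coroutine manually:
asyncio.run(test_roundtrip())
```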
sources:
- [pdsx/.pre-commit-config.yaml](https://github.com/zzstoatzz/pdsx/blob/main/.pre-commit-config.yaml)
- [raggy/pyproject.toml](https://github.com/zzstoatzz/raggy/blob/main/pyproject.toml)
languages/python/uv.md

# uv

uv isn't "faster pip." it's cargo for python - a unified toolchain that changes what's practical to do.

## install

```bash
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```

## the commands you actually use

```bash
uv sync          # install deps from pyproject.toml
uv run pytest    # run in project environment
uv add httpx     # add a dependency
uvx ruff check   # run a tool without installing it
```

never use `uv pip`. that's the escape hatch, not the workflow.

## zero-setup environments

run tools without installing anything:

```bash
uvx flask --help
uvx ruff check .
uvx pytest
```

this creates an ephemeral environment, runs the tool, done. no virtualenv activation, no pip install.

## the repro pattern

testing specific versions without polluting your environment:

```bash
# test against a specific version
uv run --with 'pydantic==2.11.4' repro.py

# test a git branch before it's released
uv run --with pydantic@git+https://github.com/pydantic/pydantic.git@fix-branch repro.py

# combine: released package + unreleased fix
uv run --with prefect==3.1.3 --with pydantic@git+https://github.com/pydantic/pydantic.git@fix repro.py
```

for monorepos with subdirectories:

```bash
uv run --with git+https://github.com/prefecthq/prefect.git@branch#subdirectory=src/integrations/prefect-redis repro.py
```

## shareable one-liners

no file needed:

```bash
uv run --with 'httpx==0.27.0' python -c 'import httpx; print(httpx.get("https://httpbin.org/get").json())'
```

share in github issues, slack, anywhere. anyone with uv can run it.
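for anything longer than a one-liner, uv also understands PEP 723 inline script metadata, so a single file can declare its own dependencies - a sketch with an empty dependency list (real deps like `"httpx"` would go in the list):

```python
# /// script
# requires-python = ">=3.10"
# dependencies = []  # e.g. ["httpx"] - uv installs these before running
# ///
# save as repro.py, then: uv run repro.py
# uv reads the metadata block and builds an ephemeral environment to match
import sys

print(f"running on python {sys.version_info.major}.{sys.version_info.minor}")
```

the script carries its environment with it - same shareability as the one-liner, but versionable in a gist or repo.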
## stdin execution

pipe code directly:

```bash
echo 'import sys; print(sys.version)' | uv run -
pbpaste | uv run --with pandas -
```

## project workflow

```bash
uv init myproject        # create new project
cd myproject
uv add httpx pydantic    # add deps
uv sync                  # install everything
uv run python main.py    # run in environment
```

`uv sync` reads `pyproject.toml` and `uv.lock`, installs exactly what's specified.

## why this matters

the old way:
1. install python (which version?)
2. create virtualenv
3. activate it (did you remember?)
4. pip install (hope versions resolve)
5. run your code

the uv way:
1. `uv run your_code.py`

uv handles python versions, environments, and dependencies implicitly. you stop thinking about environment management.

sources:
- [but really, what's so good about uv???](https://blog.zzstoatzz.io/but-really-whats-so-good-about-uv/)
- [running list of repros via uv](https://blog.zzstoatzz.io/running-list-of-repros-via-uv/)