languages/python/README.md (+15 -25)

 # python

-notes on writing python - the language itself, the ecosystem around it, and patterns that work.
+notes on writing python.

-python is easy to start but has depth. these notes focus on modern python (3.10+) with strong typing, async patterns, and the tooling that makes it feel like a proper systems language.
+python is dynamically typed but increasingly written with static type hints. it has first-class async support but doesn't force it on you. it has a global interpreter lock but that matters less than people think for I/O-bound work.

-## the language
+these notes focus on modern python (3.10+) - the patterns that emerge when you take typing seriously and use async where it makes sense.

-how to write python code:
+## the language

-- [typing](./typing.md) - modern type hints, Protocol, generics
-- [async](./async.md) - context managers, ContextVar, patterns
-- [pydantic](./pydantic.md) - models, validation, settings
+- [typing](./typing.md) - type hints, generics, Protocol
+- [async](./async.md) - async with, ContextVar, concurrency
 - [patterns](./patterns.md) - class design, decorators, error handling

 ## the ecosystem

-tooling and project structure:
-
-- [uv](./uv.md) - cargo for python
+- [uv](./uv.md) - package management that doesn't hurt
 - [project setup](./project-setup.md) - src/ layout, pyproject.toml, justfile
 - [tooling](./tooling.md) - ruff, ty, pre-commit
-- [mcp](./mcp.md) - building MCP servers with fastmcp
+- [pydantic](./pydantic.md) - validation at boundaries
+- [mcp](./mcp.md) - building tools for LLMs

 ## sources

-patterns derived from building:
-
-| project | what it is |
-|---------|------------|
-| [pdsx](https://github.com/zzstoatzz/pdsx) | ATProto MCP server and CLI |
-| [plyr-python-client](https://github.com/zzstoatzz/plyr-python-client) | uv workspace, multi-package repo |
-| [prefect-mcp-server-demo](https://github.com/zzstoatzz/prefect-mcp-server-demo) | MCP server patterns |
-
-and studying:
-
-| project | what it is |
-|---------|------------|
-| [fastmcp](https://github.com/jlowin/fastmcp) | MCP server framework |
-| [docket](https://github.com/chrisguidry/docket) | distributed task system |
+| project | notes |
+|---------|-------|
+| [pdsx](https://github.com/zzstoatzz/pdsx) | async patterns, MCP, CLI |
+| [plyr-python-client](https://github.com/zzstoatzz/plyr-python-client) | multi-package workspace |
+| [fastmcp](https://github.com/jlowin/fastmcp) | decorator patterns, generics |
+| [docket](https://github.com/chrisguidry/docket) | dependency injection, async lifecycle |

languages/python/async.md (+23 -132)

 # async

-async python patterns - context managers, concurrency, and request-scoped state.
+python's `async`/`await` syntax is straightforward. the interesting part is how you structure code around it.

-## async context managers
+## async with

-the core pattern for resource lifecycle:
+the core insight from async python codebases: `async with` is how you manage resources. not try/finally, not callbacks - the context manager protocol.

-```python
-from contextlib import asynccontextmanager
-from typing import AsyncIterator
+when you open a connection, start a session, or acquire any resource that needs cleanup, you wrap it in an async context manager:

+```python
+from contextlib import asynccontextmanager
+from typing import AsyncIterator
+
 @asynccontextmanager
-async def get_client(require_auth: bool = False) -> AsyncIterator[Client]:
-    """acquire a client, ensure cleanup."""
+async def get_client() -> AsyncIterator[Client]:
     client = Client()
-
-    if require_auth:
-        await client.login()
-
+    await client.connect()
     try:
         yield client
     finally:
         await client.close()
 ```

-usage:
-
-```python
-async with get_client(require_auth=True) as client:
-    await client.do_something()
-    # client.close() called automatically
-```
-
-## class-based context managers
-
-when you need state:
-
-```python
-class AsyncClient:
-    async def __aenter__(self) -> "AsyncClient":
-        self._session = await self._create_session()
-        return self
-
-    async def __aexit__(
-        self,
-        exc_type: type[BaseException] | None,
-        exc_val: BaseException | None,
-        exc_tb: TracebackType | None,
-    ) -> None:
-        await self._session.close()
-```
+the caller writes `async with get_client() as c:` and cleanup happens automatically, even if exceptions occur. this pattern appears constantly - database connections, HTTP sessions, file handles, locks.

-## AsyncExitStack
+the alternative - manual try/finally blocks scattered through the code, or worse, forgetting cleanup entirely - is why this pattern dominates. you encode the lifecycle once in the context manager, and every use site gets it right by default.
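
a minimal usage sketch for the `get_client` manager above - `do_something` stands in for whatever the client actually exposes:

```python
async def main() -> None:
    async with get_client() as client:
        await client.do_something()  # placeholder for real work against the client
    # client.close() has already run here, even if do_something raised
```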

-compose multiple context managers:
+## ContextVar

-```python
-from contextlib import AsyncExitStack
+python added `contextvars` to solve a specific problem: how do you have request-scoped state in async code without passing it through every function?

-async def with_dependencies() -> dict[str, Any]:
-    async with AsyncExitStack() as stack:
-        db = await stack.enter_async_context(get_database())
-        cache = await stack.enter_async_context(get_cache())
-
-        return {"db": db, "cache": cache}
-```
-
-useful when the number of context managers is dynamic.
-
-## ContextVar
-
-request-scoped state without passing through every function:
+in sync code, you might use thread-locals. but async tasks can interleave on the same thread, so thread-locals don't work. `ContextVar` gives each task its own copy:

 ```python
 from contextvars import ContextVar

-_current_user: ContextVar[User | None] = ContextVar("current_user", default=None)
-
-def get_current_user() -> User:
-    user = _current_user.get()
-    if user is None:
-        raise RuntimeError("no user in context")
-    return user
-
-async def with_user(user: User):
-    token = _current_user.set(user)
-    try:
-        yield
-    finally:
-        _current_user.reset(token)
+_current_request: ContextVar[Request | None] = ContextVar("request", default=None)
 ```

-each async task gets its own copy. no global state pollution.
+set it at the start of handling a request, and any code called from that task can access it. this is how frameworks like fastapi and fastmcp pass request context without threading it through every function signature.

-## semaphore for concurrency
-
-limit concurrent operations:
+the pattern: set at the boundary (request handler, task entry), read anywhere inside. reset when you're done.
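
a sketch of that boundary pattern against the `_current_request` var above, with `Request` and `handle` as placeholders for whatever your framework provides:

```python
async def handle_request(request: Request) -> None:
    token = _current_request.set(request)  # set at the boundary
    try:
        await handle()  # anything called from here can read _current_request.get()
    finally:
        _current_request.reset(token)  # reset when you're done
```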

-```python
-import asyncio
+## concurrency control

-async def batch_process(items: list[str], concurrency: int = 10) -> list[Result]:
-    semaphore = asyncio.Semaphore(concurrency)
+`asyncio.gather()` runs tasks concurrently, but sometimes you need to limit how many run at once - rate limits, connection pools, memory constraints.

-    async def process_one(item: str) -> Result:
-        async with semaphore:
-            return await do_work(item)
-
-    return await asyncio.gather(*[process_one(item) for item in items])
-```
-
-## gather with error handling
-
-don't let one failure kill everything:
+`asyncio.Semaphore` is the primitive for this. acquire before work, release after. the `async with` syntax makes it clean:

 ```python
-results = await asyncio.gather(
-    *tasks,
-    return_exceptions=True,
-)
+import asyncio
+
+semaphore = asyncio.Semaphore(10)

-successes = [r for r in results if not isinstance(r, Exception)]
-failures = [r for r in results if isinstance(r, Exception)]
+async def limited_work(item):
+    async with semaphore:
+        return await do_work(item)
 ```

-## shield for critical cleanup
-
-prevent cancellation during important operations:
-
-```python
-async def __aexit__(self, *args) -> None:
-    # don't let cancellation interrupt disconnect
-    await asyncio.shield(self._connection.disconnect())
-```
-
-## streaming responses
-
-async iteration over chunks:
-
-```python
-async with client.stream("GET", url) as response:
-    async for line in response.aiter_lines():
-        if line.startswith("data: "):
-            yield json.loads(line[6:])
-```
-
-## background tasks
-
-fire and forget with cleanup:
-
-```python
-class Worker:
-    async def __aenter__(self) -> "Worker":
-        self._heartbeat_task = asyncio.create_task(
-            self._heartbeat(),
-            name="worker.heartbeat",
-        )
-        return self
-
-    async def __aexit__(self, *args) -> None:
-        self._heartbeat_task.cancel()
-        with suppress(asyncio.CancelledError):
-            await self._heartbeat_task
-```
+at most 10 `do_work` calls run concurrently. the rest wait.
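
fanning out over a batch composes directly with `asyncio.gather()` - a sketch, with `items` standing in for your own data:

```python
results = await asyncio.gather(*(limited_work(item) for item in items))

# or, if one failure shouldn't kill the whole batch:
results = await asyncio.gather(
    *(limited_work(item) for item in items),
    return_exceptions=True,  # exceptions come back as values instead of propagating
)
```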

 sources:
 - [pdsx/src/pdsx/mcp/client.py](https://github.com/zzstoatzz/pdsx/blob/main/src/pdsx/mcp/client.py)

languages/python/pydantic.md (+22 -152)

 # pydantic

-data validation, serialization, and settings. the foundation for typed python.
-
-## BaseModel basics
-
-```python
-from pydantic import BaseModel, Field, ConfigDict
-
-class Track(BaseModel):
-    model_config = ConfigDict(extra="forbid")  # reject unknown fields
-
-    title: str
-    album: str | None = None
-    tags: list[str] = Field(default_factory=list)
-```
-
-`extra="forbid"` catches typos. if someone passes `titl` instead of `title`, it fails.
-
-## field aliases
-
-when external data uses different names:
-
-```python
-class Track(BaseModel):
-    model_config = ConfigDict(populate_by_name=True)
+pydantic is a library, not the language. but it's become foundational enough that it's worth understanding.

-    audio_url: str | None = Field(default=None, alias="r2_url")
-```
+## the core idea

-accepts both `audio_url` and `r2_url` as input.
+python's type hints don't do anything at runtime by default. `def foo(x: int)` accepts strings, floats, whatever - the annotation is just documentation.
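
a two-line illustration of that point - plain python never checks the annotation:

```python
def double(x: int) -> int:
    return x * 2

double("ha")  # returns "haha" - the hint is never enforced at runtime
```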

-## validators
-
-transform or check data:
+pydantic makes them real. define a model with type hints, and pydantic validates and coerces data to match:

 ```python
-from pydantic import field_validator, model_validator
-from typing import Self
+from pydantic import BaseModel

-class Document(BaseModel):
-    text: str
-    tokens: int | None = None
-
-    @field_validator("text", mode="before")
-    @classmethod
-    def strip_whitespace(cls, v: str) -> str:
-        return v.strip()
-
-    @model_validator(mode="after")
-    def compute_tokens(self) -> Self:
-        if self.tokens is None:
-            self.tokens = len(self.text.split())
-        return self
-```
-
-`mode="before"` runs before pydantic's own validation. `mode="after"` runs after the model is constructed.
-
-## discriminated unions
-
-polymorphic types with a type field:
-
-```python
-from typing import Annotated, Literal
-from pydantic import Field
-
-class TrackResult(BaseModel):
-    type: Literal["track"] = "track"
-    title: str
-
-class ArtistResult(BaseModel):
-    type: Literal["artist"] = "artist"
+class User(BaseModel):
     name: str
-
-SearchResult = Annotated[
-    TrackResult | ArtistResult,
-    Field(discriminator="type"),
-]
-
-def parse_results(data: list[dict]) -> list[SearchResult]:
-    from pydantic import TypeAdapter
-    adapter = TypeAdapter(list[SearchResult])
-    return adapter.validate_python(data)
-```
-
-pydantic looks at `type` to decide which model to use.
-
-## TypedDict for loose structures
-
-when you don't need full model machinery:
-
-```python
-from typing import TypedDict
+    age: int

-class RecordResponse(TypedDict):
-    uri: str
-    cid: str | None
-    value: dict[str, Any]
+user = User(name="alice", age="25")  # age coerced to int
+user = User(name="alice", age="not a number")  # raises ValidationError
 ```

-lighter than BaseModel, still typed.
+this is why pydantic shows up everywhere in python - it bridges the gap between python's dynamic runtime and the desire for validated, typed data.

 ## settings from environment

-replace `os.getenv()` with validated configuration:
+the most common use: replacing `os.getenv()` calls with validated configuration.

 ```python
-from pydantic import Field, SecretStr
-from pydantic_settings import BaseSettings, SettingsConfigDict
+from pydantic_settings import BaseSettings

 class Settings(BaseSettings):
-    model_config = SettingsConfigDict(
-        env_prefix="APP_",
-        env_file=".env",
-        extra="ignore",
-    )
-
     database_url: str
-    api_key: SecretStr
     debug: bool = False

-settings = Settings()
+settings = Settings()  # reads from environment
 ```

-- `env_prefix` means `APP_DATABASE_URL` maps to `database_url`
-- `SecretStr` hides values in logs
-- missing required fields fail at import, not at runtime
-
-## cached singleton
-
-```python
-from functools import lru_cache
+if `DATABASE_URL` isn't set, this fails at import time - not later when you try to connect. that fail-fast behavior catches configuration errors before they become runtime surprises.
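
the snippet above builds `Settings()` at module level. if you'd rather defer that, a cached accessor is a common companion sketch - one instance, created on first use, reused everywhere:

```python
from functools import lru_cache

@lru_cache
def get_settings() -> Settings:
    return Settings()  # constructed on first call, cached afterwards
```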

-@lru_cache
-def get_settings() -> Settings:
-    return Settings()
-```
+## when to use what

-one instance, created once, reused everywhere.
-
-## nested settings
-
-for complex configuration:
+pydantic models are heavier than they look - they do a lot of work on instantiation. for internal data you control, python's `dataclasses` are simpler:

 ```python
-class DatabaseSettings(BaseSettings):
-    host: str = "localhost"
-    port: int = 5432
+from dataclasses import dataclass

-class Settings(BaseSettings):
-    database: DatabaseSettings = Field(default_factory=DatabaseSettings)
-    cache: CacheSettings = Field(default_factory=CacheSettings)
+@dataclass
+class InternalRecord:
+    id: str
+    value: float
 ```

-## factory methods
-
-alternative constructors:
-
-```python
-class Config(BaseModel):
-    name: str
-    values: dict[str, Any]
-
-    @classmethod
-    def from_file(cls, path: Path) -> "Config":
-        data = json.loads(path.read_text())
-        return cls.model_validate(data)
-```
-
-## TypeAdapter for non-model validation
-
-validate without a full model:
-
-```python
-from pydantic import TypeAdapter
-
-adapter = TypeAdapter(list[int])
-result = adapter.validate_python(["1", "2", "3"])  # [1, 2, 3]
-```
-
-useful for validating function arguments or API responses.
+no validation, no coercion, just a class with fields. use pydantic at boundaries (API input, config files, external data) where you need validation. use dataclasses for internal data structures.
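
a small sketch of that split, with a hypothetical `TrackIn` model validating data at the boundary before handing a plain dataclass to internal code:

```python
from dataclasses import dataclass
from pydantic import BaseModel


class TrackIn(BaseModel):  # boundary: untrusted input
    title: str
    duration_s: float


@dataclass
class Track:  # internal: already-validated data
    title: str
    duration_s: float


def ingest(payload: dict) -> Track:
    validated = TrackIn.model_validate(payload)  # raises ValidationError on bad input
    return Track(title=validated.title, duration_s=validated.duration_s)
```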

 sources:
-- [pdsx/src/pdsx/_internal/types.py](https://github.com/zzstoatzz/pdsx/blob/main/src/pdsx/_internal/types.py)
-- [plyr-python-client/packages/plyrfm/src/plyrfm/_internal/types.py](https://github.com/zzstoatzz/plyr-python-client)
 - [how to use pydantic-settings](https://blog.zzstoatzz.io/how-to-use-pydantic-settings/)