Add pipeline engine and remove legacy compatibility paths

Thales Maciel 2026-02-25 22:40:03 -03:00
parent 3bc473262d
commit e221d49020
18 changed files with 1523 additions and 399 deletions


@@ -80,8 +80,7 @@ Create `~/.config/aman/config.json` (or let `aman` create it automatically on fi
{ "from": "docker", "to": "Docker" }
],
"terms": ["Systemd", "Kubernetes"]
},
"domain_inference": { "enabled": true }
}
}
```
@@ -92,6 +91,9 @@ Hotkey notes:
- Use one key plus optional modifiers (for example `Cmd+m`, `Super+m`, `Ctrl+space`).
- `Super` and `Cmd` are equivalent aliases for the same modifier.
- Invalid hotkey syntax in config prevents startup/reload.
- When `~/.config/aman/pipelines.py` exists, hotkeys come from `HOTKEY_PIPELINES`.
- `daemon.hotkey` is used as the default hotkey only when no pipelines file is present.
AI cleanup is always enabled and uses the locked local Llama-3.2-3B GGUF model
downloaded to `~/.cache/aman/models/` during daemon initialization.
@@ -107,11 +109,6 @@ Vocabulary correction:
- Wildcards are intentionally rejected (`*`, `?`, `[`, `]`, `{`, `}`) to avoid ambiguous rules.
- Rules are deduplicated case-insensitively; conflicting replacements are rejected.
Domain inference:
- Domain context is advisory only and is used to improve cleanup prompts.
- When confidence is low, it falls back to `general` context.
STT hinting:
- Vocabulary is passed to Whisper as `hotwords`/`initial_prompt` only when those
@@ -134,6 +131,12 @@ systemctl --user enable --now aman
- Press it again to stop and run STT.
- Press `Esc` while recording to cancel without processing.
- Transcript contents are logged only when `-v/--verbose` is used.
- Config changes are hot-reloaded automatically (polled every 1 second).
- `~/.config/aman/pipelines.py` changes are hot-reloaded automatically (polled every 1 second).
- Send `SIGHUP` to force an immediate reload of config and pipelines:
`systemctl --user kill -s HUP aman` (or send `HUP` to the process directly).
- Reloads are applied when the daemon is idle; invalid updates are rejected and the last valid config stays active.
- Reload success/failure is logged, and desktop notifications are shown when available.
Wayland note:
@@ -149,6 +152,77 @@ AI processing:
- Local llama.cpp model only (no remote provider configuration).
## Pipelines API
`aman` is split into:
- shell daemon: hotkeys, recording/cancel, and desktop injection
- pipeline engine: `lib.transcribe(...)` and `lib.llm(...)`
- pipeline implementation: Python callables mapped per hotkey
Pipeline file path:
- `~/.config/aman/pipelines.py`
- You can start from [`pipelines.example.py`](./pipelines.example.py).
- If `pipelines.py` is missing, `aman` uses a built-in reference pipeline bound to `daemon.hotkey`.
- If `pipelines.py` exists but is invalid, startup fails fast.
- Pipelines are hot-reloaded automatically when the module file changes.
- Send `SIGHUP` to force an immediate reload of both config and pipelines.
Expected module exports:
```python
HOTKEY_PIPELINES = {
    "Super+m": my_pipeline,
    "Super+Shift+m": caps_pipeline,
}

PIPELINE_OPTIONS = {
    "Super+Shift+m": {"failure_policy": "strict"},  # optional
}
```
Pipeline callable signature:
```python
def my_pipeline(audio, lib) -> str:
    text = lib.transcribe(audio)
    context = lib.llm(
        system_prompt="context system prompt",
        user_prompt=f"Transcript: {text}",
    )
    out = lib.llm(
        system_prompt="amanuensis prompt",
        user_prompt=f"context={context}\ntext={text}",
    )
    return out
```
`lib` API:
- `lib.transcribe(audio, hints=None, whisper_opts=None) -> str`
- `lib.llm(system_prompt=..., user_prompt=..., llm_opts=None) -> str`
Failure policy options:
- `best_effort` (default): pipeline errors return empty output
- `strict`: pipeline errors abort the current run
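A minimal sketch of how the engine could apply the failure policy around a pipeline call. The helper name `run_pipeline` is illustrative only and not part of the `aman` API:

```python
def run_pipeline(pipeline, audio, lib, failure_policy="best_effort"):
    """Hypothetical sketch: wrap a pipeline call with the failure policy.

    `pipeline`, `audio`, and `lib` follow the callable signature shown
    above; the policy strings match the documented options.
    """
    try:
        return pipeline(audio, lib)
    except Exception:
        if failure_policy == "strict":
            raise  # strict: abort the current run
        return ""  # best_effort: swallow the error, inject nothing
```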
Validation:
- `HOTKEY_PIPELINES` must be a non-empty dictionary.
- Every hotkey key must be a non-empty string.
- Every pipeline value must be callable.
- `PIPELINE_OPTIONS` must be a dictionary when provided.
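The rules above can be sketched as a validation pass over the loaded module's exports. The function name and error messages here are assumptions, not `aman` internals:

```python
def validate_pipelines(exports):
    """Illustrative check of the documented pipelines.py validation rules."""
    pipelines = exports.get("HOTKEY_PIPELINES")
    if not isinstance(pipelines, dict) or not pipelines:
        raise ValueError("HOTKEY_PIPELINES must be a non-empty dictionary")
    for hotkey, fn in pipelines.items():
        if not isinstance(hotkey, str) or not hotkey:
            raise ValueError(f"hotkey key must be a non-empty string: {hotkey!r}")
        if not callable(fn):
            raise ValueError(f"pipeline for {hotkey!r} must be callable")
    options = exports.get("PIPELINE_OPTIONS")
    if options is not None and not isinstance(options, dict):
        raise ValueError("PIPELINE_OPTIONS must be a dictionary when provided")
```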
Reference behavior:
- The built-in fallback pipeline (used when `pipelines.py` is missing) uses `lib.llm(...)` twice:
- first to infer context
- second to run the amanuensis rewrite
- The second pass requests JSON output and expects `{"cleaned_text": "..."}`.
- Deterministic dictionary replacements are then applied as part of that reference implementation.
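The reference behavior can be sketched as a pipeline of the same shape as a user pipeline. The prompts and the replacement rule below are placeholders (the rule mirrors the `docker` → `Docker` example from the config section), not the built-in implementation:

```python
import json


def reference_pipeline(audio, lib):
    """Sketch of the documented fallback: two lib.llm passes, a JSON-parsed
    second output, then deterministic dictionary replacements."""
    text = lib.transcribe(audio)
    # First pass: infer context (advisory, used to shape the second prompt).
    context = lib.llm(
        system_prompt="Infer the domain context of the transcript.",
        user_prompt=f"Transcript: {text}",
    )
    # Second pass: amanuensis rewrite, requesting JSON output.
    raw = lib.llm(
        system_prompt='Rewrite the transcript. Respond as JSON: {"cleaned_text": "..."}',
        user_prompt=f"context={context}\ntext={text}",
    )
    cleaned = json.loads(raw)["cleaned_text"]
    # Deterministic replacements applied last (example rule only).
    for frm, to in [("docker", "Docker")]:
        cleaned = cleaned.replace(frm, to)
    return cleaned
```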
Control:
```bash