Begin adding AI memory

2026-03-05 03:24:39 +00:00
parent f21abd6299
commit 06735bdfb1
26 changed files with 1446 additions and 110 deletions

@@ -0,0 +1,34 @@
# Feature Plan: Agent Knowledge Memory Foundation (Pre-11/12)
## Goal
Establish a scalable, queryable memory substrate so wiki and MCP features can rely on fast retrieval instead of markdown-file scans.
## Why This Comes Before 11/12
- Plan 11 (personal memory) needs performant retrieval and indexing guarantees.
- Plan 12 (MCP wiki/tools) needs a stable backend abstraction independent of UI and tool transport.
## Scope
- Pluggable memory search backend interface.
- Default Django backend for zero-infra operation.
- Optional Manticore backend for scalable full-text/vector-ready indexing.
- Reindex + query operational commands.
- System diagnostics endpoints for backend status and query inspection.
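The pluggable backend interface in the first scope item could look roughly like the sketch below. All names here (`MemorySearchBackend`, `MemoryHit`, the in-memory stand-in) are hypothetical illustrations of the shape, not the actual contents of `core/memory/search_backend.py`:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class MemoryHit:
    """One search result: document id, relevance score, preview text."""
    doc_id: str
    score: float
    snippet: str


class MemorySearchBackend(ABC):
    """Backend-agnostic contract: index documents, run queries, report health."""

    @abstractmethod
    def index(self, doc_id: str, text: str) -> None: ...

    @abstractmethod
    def query(self, text: str, limit: int = 10) -> list[MemoryHit]: ...

    @abstractmethod
    def status(self) -> dict: ...


class InMemoryBackend(MemorySearchBackend):
    """Toy stand-in for the default Django backend: naive term-count scoring."""

    def __init__(self) -> None:
        self._docs: dict[str, str] = {}

    def index(self, doc_id: str, text: str) -> None:
        self._docs[doc_id] = text

    def query(self, text: str, limit: int = 10) -> list[MemoryHit]:
        terms = text.lower().split()
        hits = []
        for doc_id, body in self._docs.items():
            lower = body.lower()
            score = sum(lower.count(t) for t in terms)
            if score:
                hits.append(MemoryHit(doc_id, float(score), body[:80]))
        hits.sort(key=lambda h: h.score, reverse=True)
        return hits[:limit]

    def status(self) -> dict:
        return {"backend": "in-memory", "documents": len(self._docs)}
```

A Manticore backend would implement the same three methods against the Manticore SQL/HTTP API, so callers never branch on which backend is active.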
## Implementation Slice
1. Add `core/memory/search_backend.py` abstraction and backends.
2. Add `memory_search_reindex` and `memory_search_query` management commands.
3. Add system APIs:
- backend status
- memory query
4. Add lightweight Podman utility script for Manticore runtime.
5. Add tests for diagnostics and query behavior.
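At its core, the `memory_search_reindex` command in step 2 just walks a document source and feeds the active backend. A minimal sketch of that loop, assuming only that the backend exposes an `index(doc_id, text)` method and that documents arrive as `(doc_id, text)` pairs (both shapes assumed, not taken from the actual command):

```python
from typing import Iterable, Tuple


def reindex(backend, documents: Iterable[Tuple[str, str]]) -> int:
    """Push every (doc_id, text) pair into the search backend.

    Returns the number of documents indexed, which the management
    command can report to the operator.
    """
    count = 0
    for doc_id, text in documents:
        backend.index(doc_id, text)
        count += 1
    return count
```

The corresponding `memory_search_query` command would wrap `backend.query(...)` the same way, printing hits for operator inspection.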
## Acceptance Criteria
- Memory retrieval works with `MEMORY_SEARCH_BACKEND=django` out of the box.
- Switching to `MEMORY_SEARCH_BACKEND=manticore` requires only env/config + container startup.
- Operators can verify backend health and inspect query output from the system diagnostics endpoints.
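The env-driven switch in the second criterion amounts to a small lookup from `MEMORY_SEARCH_BACKEND` to a backend implementation. A hypothetical sketch (the dotted paths and registry are illustrative, not the project's actual module layout):

```python
import os

# Hypothetical registry: env value -> dotted path of the backend class.
BACKENDS = {
    "django": "core.memory.search_backend.DjangoMemoryBackend",
    "manticore": "core.memory.search_backend.ManticoreMemoryBackend",
}


def resolve_backend_path() -> str:
    """Pick the backend from the environment, defaulting to zero-infra Django."""
    name = os.environ.get("MEMORY_SEARCH_BACKEND", "django")
    try:
        return BACKENDS[name]
    except KeyError:
        raise ValueError(f"Unknown MEMORY_SEARCH_BACKEND: {name!r}")
```

Because the switch is config-only, the Manticore path needs nothing beyond setting the variable and starting the container from the Podman utility script.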
## Out of Scope
- Full wiki article model/UI.
- Full MCP server process/tooling.
- Embedding generation pipeline (next slice after backend foundation).