Begin adding AI memory

2026-03-05 03:24:39 +00:00
parent f21abd6299
commit 06735bdfb1
26 changed files with 1446 additions and 110 deletions

.gitignore vendored

@@ -167,3 +167,5 @@ node_modules/
.podman/
.beads/
.sisyphus/
.container-home/


@@ -54,6 +54,13 @@ Prosody container helpers:
- `QUADLET_PROSODY_DATA_DIR`
- `QUADLET_PROSODY_LOGS_DIR`
Memory/wiki search helpers:
- `MEMORY_SEARCH_BACKEND` (`django` or `manticore`)
- `MANTICORE_HTTP_URL`
- `MANTICORE_MEMORY_TABLE`
- `MANTICORE_HTTP_TIMEOUT`
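A minimal environment sketch for enabling the Manticore backend (values shown mirror the defaults added to settings in this commit; adjust per deployment):

```bash
# Illustrative .env fragment — values are the settings defaults from this commit.
MEMORY_SEARCH_BACKEND=manticore          # or "django" (default, zero-infra)
MANTICORE_HTTP_URL=http://127.0.0.1:9308
MANTICORE_MEMORY_TABLE=gia_memory_items
MANTICORE_HTTP_TIMEOUT=5
```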
For XMPP media upload, configure one of:
- `XMPP_UPLOAD_SERVICE`
@@ -173,6 +180,26 @@ Certificate renewal helper (run as root on host):
./utilities/prosody/renew_prosody_cert.sh
```
### E) Manticore container for memory/wiki retrieval
```bash
./utilities/memory/manage_manticore_container.sh up
./utilities/memory/manage_manticore_container.sh status
./utilities/memory/manage_manticore_container.sh logs
```
Reindex memory into configured backend:
```bash
podman exec ur_gia /venv/bin/python manage.py memory_search_reindex --user-id 1 --statuses active
```
Query memory backend:
```bash
podman exec ur_gia /venv/bin/python manage.py memory_search_query --user-id 1 --query "reply style"
```
### C) Signal or WhatsApp send failures
- Verify account/link status in service pages.


@@ -67,3 +67,8 @@ CAPABILITY_ENFORCEMENT_ENABLED = (
)
TRACE_PROPAGATION_ENABLED = getenv("TRACE_PROPAGATION_ENABLED", "true").lower() in trues
EVENT_PRIMARY_WRITE_PATH = getenv("EVENT_PRIMARY_WRITE_PATH", "false").lower() in trues
MEMORY_SEARCH_BACKEND = getenv("MEMORY_SEARCH_BACKEND", "django")
MANTICORE_HTTP_URL = getenv("MANTICORE_HTTP_URL", "http://127.0.0.1:9308")
MANTICORE_MEMORY_TABLE = getenv("MANTICORE_MEMORY_TABLE", "gia_memory_items")
MANTICORE_HTTP_TIMEOUT = int(getenv("MANTICORE_HTTP_TIMEOUT", "5") or 5)


@@ -83,6 +83,21 @@ urlpatterns = [
system.EventProjectionShadowAPI.as_view(),
name="system_projection_shadow",
),
path(
"settings/system/event-ledger-smoke/",
system.EventLedgerSmokeAPI.as_view(),
name="system_event_ledger_smoke",
),
path(
"settings/system/memory-search/status/",
system.MemorySearchStatusAPI.as_view(),
name="system_memory_search_status",
),
path(
"settings/system/memory-search/query/",
system.MemorySearchQueryAPI.as_view(),
name="system_memory_search_query",
),
path(
"settings/command-routing/",
automation.CommandRoutingSettings.as_view(),


@@ -0,0 +1,13 @@
No Tasks Yet
This group has no derived tasks yet. To start populating this view:
Open Task Settings and confirm this chat is mapped under Group Mapping.
Send task-like messages in this group, for example: "task: ship v1", "todo: write tests", or "please review PR".
Mark completion explicitly with a phrase plus a reference, for example: "done #12", "completed #12", or "fixed #12".
Refresh this page; new derived tasks and events should appear automatically.


@@ -0,0 +1,34 @@
# Feature Plan: Agent Knowledge Memory Foundation (Pre-11/12)
## Goal
Establish a scalable, queryable memory substrate so wiki and MCP features can rely on fast retrieval instead of markdown-file scans.
## Why This Comes Before 11/12
- Plan 11 (personal memory) needs performant retrieval and indexing guarantees.
- Plan 12 (MCP wiki/tools) needs a stable backend abstraction independent of UI and tool transport.
## Scope
- Pluggable memory search backend interface.
- Default Django backend for zero-infra operation.
- Optional Manticore backend for scalable full-text/vector-ready indexing.
- Reindex + query operational commands.
- System diagnostics endpoints for backend status and query inspection.
## Implementation Slice
1. Add `core/memory/search_backend.py` abstraction and backends.
2. Add `memory_search_reindex` and `memory_search_query` management commands.
3. Add system APIs:
- backend status
- memory query
4. Add lightweight Podman utility script for Manticore runtime.
5. Add tests for diagnostics and query behavior.
## Acceptance Criteria
- Memory retrieval works with `MEMORY_SEARCH_BACKEND=django` out of the box.
- Switching to `MEMORY_SEARCH_BACKEND=manticore` requires only env/config + container startup.
- Operators can verify backend health and query output from system settings.
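The switch-by-config criterion above can be sketched with simplified stand-ins (these are not the real classes; the actual `DjangoMemorySearchBackend`/`ManticoreMemorySearchBackend` live in `core/memory/search_backend.py` and read Django settings rather than a string argument):

```python
# Simplified stand-ins for the pluggable backend selection described above.
class BaseMemorySearchBackend:
    name = "base"

    def search(self, query: str) -> list[str]:
        raise NotImplementedError


class DjangoBackend(BaseMemorySearchBackend):
    name = "django"

    def search(self, query: str) -> list[str]:
        # Zero-infra path: query source-of-truth ORM rows directly.
        return [f"orm-hit:{query}"]


class ManticoreBackend(BaseMemorySearchBackend):
    name = "manticore"

    def search(self, query: str) -> list[str]:
        # Indexed path: SQL-over-HTTP against the Manticore table.
        return [f"manticore-hit:{query}"]


def get_backend(setting: str) -> BaseMemorySearchBackend:
    # Unknown or empty values fall back to the Django backend,
    # preserving the out-of-the-box behavior.
    return ManticoreBackend() if setting == "manticore" else DjangoBackend()
```

Changing `MEMORY_SEARCH_BACKEND` then only changes which class the factory returns; callers of `search()` are unaffected.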
## Out of Scope
- Full wiki article model/UI.
- Full MCP server process/tooling.
- Embedding generation pipeline (next slice after backend foundation).


@@ -0,0 +1,25 @@
# Memory Backend Evaluation: Manticore vs Alternatives
## Decision Summary
- **Recommended now:** Manticore for indexed text retrieval and future vector layering.
- **Default fallback:** Django/ORM backend for zero-infra environments.
- **Revisit later:** dedicated vector DB only if recall quality or ANN latency requires it.
## Why Manticore Fits This Stage
- Already present in adjacent infra and codebase history.
- Runs well as a small standalone container with low operational complexity.
- Supports SQL-like querying and fast full-text retrieval for agent memory/wiki content.
- Lets us keep one retrieval abstraction while deferring embedding complexity.
## Tradeoff Notes
- Manticore-first gives immediate performance over markdown scans.
- For advanced ANN/vector-only workloads, Qdrant/pgvector/Weaviate may outperform with less custom shaping.
- A hybrid approach remains possible:
- Manticore for lexical + metadata filtering,
- optional vector store for semantic recall.
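A hypothetical score-fusion helper illustrating the hybrid shape (not part of this commit; `weight` balancing lexical vs semantic scores is an assumption for illustration):

```python
def merge_hits(
    lexical: dict[str, float],
    vector: dict[str, float],
    weight: float = 0.5,
    limit: int = 5,
) -> list[tuple[str, float]]:
    """Blend two {doc_id: score} maps with a weighted sum, best first."""
    combined: dict[str, float] = {}
    for doc_id, score in lexical.items():
        combined[doc_id] = (1.0 - weight) * score
    for doc_id, score in vector.items():
        # Documents found by both retrievers accumulate both contributions.
        combined[doc_id] = combined.get(doc_id, 0.0) + weight * score
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)[:limit]
```

Documents surfaced by both retrievers naturally rank above single-source hits, which is the main argument for the hybrid layout.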
## Practical Rollout
1. Start with `MEMORY_SEARCH_BACKEND=django` and verify API/command workflows.
2. Start Manticore container and switch to `MEMORY_SEARCH_BACKEND=manticore`.
3. Run reindex and validate query latency/quality on real agent workflows.
4. Add embedding pipeline only after baseline lexical retrieval is stable.


@@ -1,85 +0,0 @@
# Create a debug log to confirm script execution
import os
import sys
import django
LOG_PATH = os.environ.get("AUTH_DEBUG_LOG", "/tmp/auth_debug.log")
def log(data):
try:
with open(LOG_PATH, "a") as f:
f.write(f"{data}\n")
except Exception:
pass
# Set up Django environment
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "app.settings") # Adjust if needed
django.setup()
from django.contrib.auth import authenticate # noqa: E402
from django.contrib.auth.models import User # noqa: E402
def check_credentials(username, password):
"""Authenticate user via Django"""
user = authenticate(username=username, password=password)
return user is not None and user.is_active
def main():
"""Process authentication requests from Prosody"""
while True:
try:
# Read a single line from stdin
line = sys.stdin.readline().strip()
if not line:
break # Exit if input is empty (EOF)
# Log received command (for debugging)
# log(f"Received: {line}")
parts = line.split(":")
if len(parts) < 3:
log("Sending 0")
print("0", flush=True) # Invalid format, return failure
continue
command, username, domain = parts[:3]
password = (
":".join(parts[3:]) if len(parts) > 3 else None
) # Reconstruct password
if command == "auth":
if password and check_credentials(username, password):
log("Authentication success")
log("Sent 1")
print("1", flush=True) # Success
else:
log("Authentication failure")
log("Sent 0")
print("0", flush=True) # Failure
elif command == "isuser":
if User.objects.filter(username=username).exists():
print("1", flush=True) # User exists
else:
print("0", flush=True) # User does not exist
elif command == "setpass":
print("0", flush=True) # Not supported
else:
print("0", flush=True) # Unknown command, return failure
except Exception as e:
# Log any unexpected errors
log(f"Error: {str(e)}\n")
print("0", flush=True) # Return failure for any error
if __name__ == "__main__":
main()


@@ -1,10 +0,0 @@
#!/bin/sh
set -eu
# Prosody external auth expects a single long-lived stdin/stdout process.
# Keep one stable process chain and hand off with exec.
exec podman exec -i gia sh -lc '
cd /code &&
. /venv/bin/activate &&
exec python -u auth_django.py
'


@@ -840,7 +840,6 @@ class XMPPComponent(ComponentXMPP):
connected = self.connect()
if connected is False:
raise RuntimeError("connect returned false")
self.process(forever=False)
return
except Exception as exc:
self.log.warning("XMPP reconnect attempt failed: %s", exc)
@@ -1754,7 +1753,6 @@ class XMPPClient(ClientBase):
self.client.loop = self.loop
self.client.connect()
self.client.process(forever=False)
async def start_typing_for_person(self, user, person_identifier):
await self.client.send_typing_for_person(user, person_identifier, True)


@@ -3,7 +3,7 @@ from __future__ import annotations
import json
import time
from django.core.management.base import BaseCommand
from django.core.management.base import BaseCommand, CommandError
from core.models import ConversationEvent
@@ -16,6 +16,8 @@ class Command(BaseCommand):
parser.add_argument("--service", default="")
parser.add_argument("--user-id", default="")
parser.add_argument("--limit", type=int, default=200)
parser.add_argument("--require-types", default="")
parser.add_argument("--fail-if-empty", action="store_true", default=False)
parser.add_argument("--json", action="store_true", default=False)
def handle(self, *args, **options):
@@ -23,7 +25,14 @@ class Command(BaseCommand):
service = str(options.get("service") or "").strip().lower()
user_id = str(options.get("user_id") or "").strip()
limit = max(1, int(options.get("limit") or 200))
require_types_raw = str(options.get("require_types") or "").strip()
fail_if_empty = bool(options.get("fail_if_empty"))
as_json = bool(options.get("json"))
required_types = [
item.strip().lower()
for item in require_types_raw.split(",")
if item.strip()
]
cutoff_ts = int(time.time() * 1000) - (minutes * 60 * 1000)
queryset = ConversationEvent.objects.filter(ts__gte=cutoff_ts).order_by("-ts")
@@ -48,6 +57,11 @@ class Command(BaseCommand):
for row in rows:
key = str(row.get("event_type") or "")
event_type_counts[key] = int(event_type_counts.get(key) or 0) + 1
missing_required_types = [
event_type
for event_type in required_types
if int(event_type_counts.get(event_type) or 0) <= 0
]
payload = {
"minutes": minutes,
@@ -55,6 +69,8 @@ class Command(BaseCommand):
"user_id": user_id,
"count": len(rows),
"event_type_counts": event_type_counts,
"required_types": required_types,
"missing_required_types": missing_required_types,
"sample": rows[:25],
}
@@ -66,3 +82,14 @@ class Command(BaseCommand):
f"event-ledger-smoke minutes={minutes} service={service or '-'} user={user_id or '-'} count={len(rows)}"
)
self.stdout.write(f"event_type_counts={event_type_counts}")
if required_types:
self.stdout.write(
f"required_types={required_types} missing_required_types={missing_required_types}"
)
if fail_if_empty and len(rows) == 0:
raise CommandError("No recent ConversationEvent rows found.")
if missing_required_types:
raise CommandError(
"Missing required event types: " + ", ".join(missing_required_types)
)


@@ -19,6 +19,7 @@ class Command(BaseCommand):
parser.add_argument("--user-id", default="")
parser.add_argument("--session-id", default="")
parser.add_argument("--service", default="")
parser.add_argument("--recent-only", action="store_true", default=False)
parser.add_argument("--recent-minutes", type=int, default=0)
parser.add_argument("--limit-sessions", type=int, default=50)
parser.add_argument("--detail-limit", type=int, default=25)
@@ -29,7 +30,10 @@ class Command(BaseCommand):
user_id = str(options.get("user_id") or "").strip()
session_id = str(options.get("session_id") or "").strip()
service = str(options.get("service") or "").strip().lower()
recent_only = bool(options.get("recent_only"))
recent_minutes = max(0, int(options.get("recent_minutes") or 0))
if recent_only and recent_minutes <= 0:
recent_minutes = 120
limit_sessions = max(1, int(options.get("limit_sessions") or 50))
detail_limit = max(0, int(options.get("detail_limit") or 25))
as_json = bool(options.get("json"))
@@ -98,6 +102,7 @@ class Command(BaseCommand):
"user_id": user_id,
"session_id": session_id,
"service": service,
"recent_only": recent_only,
"recent_minutes": recent_minutes,
"limit_sessions": limit_sessions,
"detail_limit": detail_limit,


@@ -0,0 +1,72 @@
from __future__ import annotations
import json
from django.core.management.base import BaseCommand, CommandError
from core.memory.search_backend import get_memory_search_backend
class Command(BaseCommand):
help = "Run a query against configured memory search backend."
def add_arguments(self, parser):
parser.add_argument("--user-id", required=True)
parser.add_argument("--query", required=True)
parser.add_argument("--conversation-id", default="")
parser.add_argument("--statuses", default="active")
parser.add_argument("--limit", type=int, default=20)
parser.add_argument("--json", action="store_true", default=False)
def handle(self, *args, **options):
user_id_raw = str(options.get("user_id") or "").strip()
query = str(options.get("query") or "").strip()
conversation_id = str(options.get("conversation_id") or "").strip()
statuses = tuple(
item.strip().lower()
for item in str(options.get("statuses") or "active").split(",")
if item.strip()
)
limit = max(1, int(options.get("limit") or 20))
as_json = bool(options.get("json"))
if not user_id_raw:
raise CommandError("--user-id is required")
if not query:
raise CommandError("--query is required")
backend = get_memory_search_backend()
hits = backend.search(
user_id=int(user_id_raw),
query=query,
conversation_id=conversation_id,
limit=limit,
include_statuses=statuses,
)
payload = {
"backend": getattr(backend, "name", "unknown"),
"query": query,
"user_id": int(user_id_raw),
"conversation_id": conversation_id,
"statuses": statuses,
"count": len(hits),
"hits": [
{
"memory_id": item.memory_id,
"score": item.score,
"summary": item.summary,
"payload": item.payload,
}
for item in hits
],
}
if as_json:
self.stdout.write(json.dumps(payload, indent=2, sort_keys=True))
return
self.stdout.write(
f"memory-search-query backend={payload['backend']} count={payload['count']} query={query!r}"
)
for row in payload["hits"]:
self.stdout.write(
f"- id={row['memory_id']} score={row['score']:.2f} summary={row['summary'][:120]}"
)


@@ -0,0 +1,49 @@
from __future__ import annotations
import json
from django.core.management.base import BaseCommand
from core.memory.search_backend import get_memory_search_backend
class Command(BaseCommand):
help = "Reindex MemoryItem rows into the configured memory search backend."
def add_arguments(self, parser):
parser.add_argument("--user-id", default="")
parser.add_argument("--statuses", default="active")
parser.add_argument("--limit", type=int, default=2000)
parser.add_argument("--json", action="store_true", default=False)
def handle(self, *args, **options):
user_id_raw = str(options.get("user_id") or "").strip()
statuses = tuple(
item.strip().lower()
for item in str(options.get("statuses") or "active").split(",")
if item.strip()
)
limit = max(1, int(options.get("limit") or 2000))
as_json = bool(options.get("json"))
backend = get_memory_search_backend()
result = backend.reindex(
user_id=int(user_id_raw) if user_id_raw else None,
include_statuses=statuses,
limit=limit,
)
payload = {
"backend": getattr(backend, "name", "unknown"),
"user_id": user_id_raw,
"statuses": statuses,
"limit": limit,
"result": result,
}
if as_json:
self.stdout.write(json.dumps(payload, indent=2, sort_keys=True))
return
self.stdout.write(
f"memory-search-reindex backend={payload['backend']} "
f"user={user_id_raw or '-'} statuses={','.join(statuses) or '-'} "
f"scanned={int(result.get('scanned') or 0)} indexed={int(result.get('indexed') or 0)}"
)

core/memory/__init__.py Normal file

@@ -0,0 +1,3 @@
from .search_backend import get_memory_search_backend
__all__ = ["get_memory_search_backend"]


@@ -0,0 +1,283 @@
from __future__ import annotations
import hashlib
import json
import time
from dataclasses import dataclass
from typing import Any
import requests
from django.conf import settings
from core.models import MemoryItem
from core.util import logs
log = logs.get_logger("memory-search")
@dataclass
class MemorySearchHit:
memory_id: str
score: float
summary: str
payload: dict[str, Any]
def _flatten_to_text(value: Any) -> str:
if value is None:
return ""
if isinstance(value, dict):
parts = []
for key, item in value.items():
parts.append(str(key))
parts.append(_flatten_to_text(item))
return " ".join(part for part in parts if part).strip()
if isinstance(value, (list, tuple, set)):
return " ".join(_flatten_to_text(item) for item in value if item).strip()
return str(value).strip()
class BaseMemorySearchBackend:
def upsert(self, item: MemoryItem) -> None:
raise NotImplementedError
def delete(self, memory_id: str) -> None:
raise NotImplementedError
def search(
self,
*,
user_id: int,
query: str,
conversation_id: str = "",
limit: int = 20,
include_statuses: tuple[str, ...] = ("active",),
) -> list[MemorySearchHit]:
raise NotImplementedError
def reindex(
self,
*,
user_id: int | None = None,
include_statuses: tuple[str, ...] = ("active",),
limit: int = 2000,
) -> dict[str, int]:
queryset = MemoryItem.objects.all().order_by("-updated_at")
if user_id is not None:
queryset = queryset.filter(user_id=int(user_id))
if include_statuses:
queryset = queryset.filter(status__in=list(include_statuses))
scanned = 0
indexed = 0
for item in queryset[: max(1, int(limit))]:
scanned += 1
try:
self.upsert(item)
indexed += 1
except Exception as exc:
log.warning("memory-search upsert failed id=%s err=%s", item.id, exc)
return {"scanned": scanned, "indexed": indexed}
class DjangoMemorySearchBackend(BaseMemorySearchBackend):
name = "django"
def upsert(self, item: MemoryItem) -> None:
# No-op because Django backend queries source-of-truth rows directly.
_ = item
def delete(self, memory_id: str) -> None:
_ = memory_id
def search(
self,
*,
user_id: int,
query: str,
conversation_id: str = "",
limit: int = 20,
include_statuses: tuple[str, ...] = ("active",),
) -> list[MemorySearchHit]:
needle = str(query or "").strip().lower()
if not needle:
return []
queryset = MemoryItem.objects.filter(user_id=int(user_id))
if conversation_id:
queryset = queryset.filter(conversation_id=conversation_id)
if include_statuses:
queryset = queryset.filter(status__in=list(include_statuses))
hits: list[MemorySearchHit] = []
for item in queryset.order_by("-updated_at")[:500]:
text_blob = _flatten_to_text(item.content).lower()
if needle not in text_blob:
continue
raw_summary = _flatten_to_text(item.content)
summary = raw_summary[:280]
score = 1.0 + min(1.0, len(needle) / max(1.0, len(text_blob)))
hits.append(
MemorySearchHit(
memory_id=str(item.id),
score=float(score),
summary=summary,
payload={
"memory_kind": str(item.memory_kind or ""),
"status": str(item.status or ""),
"conversation_id": str(item.conversation_id or ""),
"updated_at": item.updated_at.isoformat(),
},
)
)
if len(hits) >= max(1, int(limit)):
break
return hits
class ManticoreMemorySearchBackend(BaseMemorySearchBackend):
name = "manticore"
def __init__(self):
self.base_url = str(
getattr(settings, "MANTICORE_HTTP_URL", "http://127.0.0.1:9308")
).rstrip("/")
self.table = str(
getattr(settings, "MANTICORE_MEMORY_TABLE", "gia_memory_items")
).strip() or "gia_memory_items"
self.timeout_seconds = int(getattr(settings, "MANTICORE_HTTP_TIMEOUT", 5) or 5)
def _sql(self, query: str) -> dict[str, Any]:
response = requests.post(
f"{self.base_url}/sql",
data={"mode": "raw", "query": query},
timeout=self.timeout_seconds,
)
response.raise_for_status()
payload = response.json()
if isinstance(payload, list):
return payload[0] if payload else {}
return dict(payload or {})
def ensure_table(self) -> None:
self._sql(
(
f"CREATE TABLE IF NOT EXISTS {self.table} ("
"id BIGINT,"
"memory_uuid STRING,"
"user_id BIGINT,"
"conversation_id STRING,"
"memory_kind STRING,"
"status STRING,"
"updated_ts BIGINT,"
"summary TEXT,"
"body TEXT"
")"
)
)
def _doc_id(self, memory_id: str) -> int:
digest = hashlib.blake2b(
str(memory_id or "").encode("utf-8"),
digest_size=8,
).digest()
value = int.from_bytes(digest, byteorder="big", signed=False)
return max(1, int(value))
def _escape(self, value: Any) -> str:
text = str(value or "")
text = text.replace("\\", "\\\\").replace("'", "\\'")
return text
def upsert(self, item: MemoryItem) -> None:
self.ensure_table()
memory_id = str(item.id)
doc_id = self._doc_id(memory_id)
summary = _flatten_to_text(item.content)[:280]
body = _flatten_to_text(item.content)
updated_ts = int(item.updated_at.timestamp() * 1000)
query = (
f"REPLACE INTO {self.table} "
"(id,memory_uuid,user_id,conversation_id,memory_kind,status,updated_ts,summary,body) "
f"VALUES ({doc_id},'{self._escape(memory_id)}',{int(item.user_id)},"
f"'{self._escape(item.conversation_id)}','{self._escape(item.memory_kind)}',"
f"'{self._escape(item.status)}',{updated_ts},"
f"'{self._escape(summary)}','{self._escape(body)}')"
)
self._sql(query)
def delete(self, memory_id: str) -> None:
self.ensure_table()
doc_id = self._doc_id(memory_id)
self._sql(f"DELETE FROM {self.table} WHERE id={doc_id}")
def search(
self,
*,
user_id: int,
query: str,
conversation_id: str = "",
limit: int = 20,
include_statuses: tuple[str, ...] = ("active",),
) -> list[MemorySearchHit]:
self.ensure_table()
needle = str(query or "").strip()
if not needle:
return []
where_parts = [f"user_id={int(user_id)}", f"MATCH('{self._escape(needle)}')"]
if conversation_id:
where_parts.append(f"conversation_id='{self._escape(conversation_id)}'")
statuses = [str(item or "").strip() for item in include_statuses if str(item or "").strip()]
if statuses:
in_clause = ",".join(f"'{self._escape(item)}'" for item in statuses)
where_parts.append(f"status IN ({in_clause})")
where_sql = " AND ".join(where_parts)
query_sql = (
f"SELECT memory_uuid,memory_kind,status,conversation_id,updated_ts,summary,WEIGHT() AS score "
f"FROM {self.table} WHERE {where_sql} ORDER BY score DESC LIMIT {max(1, int(limit))}"
)
payload = self._sql(query_sql)
rows = list(payload.get("data") or [])
hits = []
for row in rows:
item = dict(row or {})
hits.append(
MemorySearchHit(
memory_id=str(item.get("memory_uuid") or ""),
score=float(item.get("score") or 0.0),
summary=str(item.get("summary") or ""),
payload={
"memory_kind": str(item.get("memory_kind") or ""),
"status": str(item.get("status") or ""),
"conversation_id": str(item.get("conversation_id") or ""),
"updated_ts": int(item.get("updated_ts") or 0),
},
)
)
return hits
def get_memory_search_backend() -> BaseMemorySearchBackend:
backend = str(getattr(settings, "MEMORY_SEARCH_BACKEND", "django")).strip().lower()
if backend == "manticore":
return ManticoreMemorySearchBackend()
return DjangoMemorySearchBackend()
def backend_status() -> dict[str, Any]:
backend = get_memory_search_backend()
status = {
"backend": getattr(backend, "name", "unknown"),
"ok": True,
"ts": int(time.time() * 1000),
}
if isinstance(backend, ManticoreMemorySearchBackend):
try:
backend.ensure_table()
status["manticore_http_url"] = backend.base_url
status["manticore_table"] = backend.table
except Exception as exc:
status["ok"] = False
status["error"] = str(exc)
return status


@@ -39,6 +39,113 @@
</article>
</div>
<div class="column is-12">
<article class="box">
<h2 class="title is-6">Diagnostics Quick Checks</h2>
<p class="is-size-7 has-text-grey" style="margin-bottom: 0.65rem;">
Run projection shadow, event ledger smoke, and trace diagnostics from one place.
</p>
<div class="columns is-multiline">
<div class="column is-12-tablet is-3-desktop">
<form id="projection-shadow-form">
<label class="label is-size-7">Projection Shadow</label>
<div class="field">
<div class="control">
<input class="input is-small" name="session_lookup" placeholder="name | id | service | identifier" list="diagnostics-session-options" required>
<input type="hidden" name="session_id">
</div>
<p class="help is-size-7" title="Use this when a thread looks wrong in Compose and you need to compare DB messages vs event projection.">Pick a session to compare message table vs event projection.</p>
</div>
<div class="field">
<div class="control">
<input class="input is-small" name="detail_limit" value="25" type="number" min="0" max="200">
</div>
</div>
<button class="button is-small is-link is-light" type="submit">Run Shadow</button>
</form>
</div>
<div class="column is-12-tablet is-3-desktop">
<form id="event-ledger-smoke-form">
<label class="label is-size-7">Event Ledger Smoke</label>
<div class="field">
<div class="control">
<input class="input is-small" name="minutes" value="120" type="number" min="1" max="10080">
</div>
<p class="help is-size-7" title="Use this to check that recent traffic is actually writing canonical events.">Checks whether recent actions were written to `ConversationEvent`.</p>
</div>
<div class="field">
<div class="control">
<input class="input is-small" name="service" placeholder="service" list="diagnostics-service-options">
</div>
</div>
<div class="field">
<div class="control">
<input class="input is-small" name="require_types" placeholder="message_created,reaction_added" list="diagnostics-event-type-options">
</div>
<p class="help is-size-7" title="If set, response includes missing required event types so you can quickly spot dual-write gaps.">Optional required event types (comma-separated).</p>
</div>
<button class="button is-small is-link is-light" type="submit">Run Smoke</button>
</form>
</div>
<div class="column is-12-tablet is-3-desktop">
<form id="trace-diagnostics-form">
<label class="label is-size-7">Trace Diagnostics</label>
<div class="field">
<div class="control">
<input class="input is-small" name="trace_id" placeholder="select recent trace id or paste one" list="diagnostics-trace-options" required>
</div>
<p class="help is-size-7" title="Use this to reconstruct ingress -> persistence -> fanout path and jump to projection-shadow for linked sessions.">Use a trace id from the dropdown (recent traces), Event Ledger Smoke `sample[].trace_id`, or UR logs.</p>
</div>
<button class="button is-small is-link is-light" type="submit">Lookup Trace</button>
</form>
</div>
<div class="column is-12-tablet is-3-desktop">
<form id="memory-search-form">
<label class="label is-size-7">Memory Search</label>
<div class="field">
<div class="control">
<input class="input is-small" name="q" placeholder="search memory text" required>
</div>
<p class="help is-size-7" title="Use this for fast retrieval over AI memory/wiki text; backend can be django or manticore.">Query memory index and inspect top matches.</p>
</div>
<div class="field">
<div class="control">
<input class="input is-small" name="statuses" placeholder="active" value="active">
</div>
</div>
<button class="button is-small is-link is-light" type="submit">Query Memory</button>
<button class="button is-small is-light" type="button" id="memory-search-status">Backend Status</button>
</form>
</div>
</div>
<datalist id="diagnostics-session-options">
{% for row in diagnostics_options.sessions %}
<option value="{{ row.label }}" data-session-id="{{ row.id }}"></option>
{% endfor %}
</datalist>
<datalist id="diagnostics-trace-options">
{% for trace_id in diagnostics_options.trace_ids %}
<option value="{{ trace_id }}"></option>
{% endfor %}
</datalist>
<datalist id="diagnostics-service-options">
{% for service in diagnostics_options.services %}
<option value="{{ service }}"></option>
{% endfor %}
</datalist>
<datalist id="diagnostics-event-type-options">
{% for event_type in diagnostics_options.event_types %}
<option value="{{ event_type }}"></option>
{% endfor %}
</datalist>
<div class="buttons are-small" style="margin-bottom: 0.5rem;">
<button id="diagnostics-select-all" type="button" class="button is-light">Select All</button>
<button id="diagnostics-copy" type="button" class="button is-light">Copy</button>
</div>
<pre id="diagnostics-output" class="is-size-7" style="max-height: 20rem; overflow:auto; background:#f7f7f7; padding:0.75rem; border-radius:8px;"></pre>
</article>
</div>
<div class="column is-12">
<article class="box">
<h2 class="title is-6">Purge OSINT Setup Categories</h2>
@@ -72,5 +179,144 @@
</div>
</div>
</section>
{% endblock %}
<script>
(function () {
const out = document.getElementById("diagnostics-output");
const shadowForm = document.getElementById("projection-shadow-form");
const smokeForm = document.getElementById("event-ledger-smoke-form");
const traceForm = document.getElementById("trace-diagnostics-form");
const memoryForm = document.getElementById("memory-search-form");
const memoryStatusBtn = document.getElementById("memory-search-status");
const selectAllBtn = document.getElementById("diagnostics-select-all");
const copyBtn = document.getElementById("diagnostics-copy");
const sessionOptionMap = new Map();
const sessionDatalist = document.getElementById("diagnostics-session-options");
if (sessionDatalist) {
sessionDatalist.querySelectorAll("option").forEach(function (opt) {
const key = (opt.value || "").trim();
const sessionId = (opt.dataset.sessionId || "").trim();
if (key && sessionId) {
sessionOptionMap.set(key, sessionId);
}
});
}
function write(text) {
if (out) {
out.textContent = text;
}
}
function outputText() {
return out ? (out.textContent || "") : "";
}
function selectOutputText() {
if (!out) return;
const range = document.createRange();
range.selectNodeContents(out);
const selection = window.getSelection();
if (!selection) return;
selection.removeAllRanges();
selection.addRange(range);
}
async function copyOutputText() {
const text = outputText();
if (!text) return;
try {
await navigator.clipboard.writeText(text);
write(text + "\n\n[Copied]");
return;
} catch (err) {
const area = document.createElement("textarea");
area.value = text;
area.style.position = "fixed";
area.style.opacity = "0";
document.body.appendChild(area);
area.focus();
area.select();
try {
document.execCommand("copy");
write(text + "\n\n[Copied]");
} finally {
document.body.removeChild(area);
}
}
}
async function runGet(url, params) {
const query = new URLSearchParams(params);
const response = await fetch(`${url}?${query.toString()}`, { headers: { "Accept": "application/json" } });
const payload = await response.json();
return { status: response.status, payload };
}
shadowForm.addEventListener("submit", async function (ev) {
ev.preventDefault();
write("Running projection shadow...");
const form = new FormData(shadowForm);
const sessionLookup = (form.get("session_lookup") || "").toString().trim();
const sessionId = sessionOptionMap.get(sessionLookup) || sessionLookup;
const result = await runGet("{% url 'system_projection_shadow' %}", {
session_id: sessionId,
detail_limit: (form.get("detail_limit") || "25").toString().trim(),
});
write(JSON.stringify(result, null, 2));
});
smokeForm.addEventListener("submit", async function (ev) {
ev.preventDefault();
write("Running event ledger smoke...");
const form = new FormData(smokeForm);
const result = await runGet("{% url 'system_event_ledger_smoke' %}", {
minutes: (form.get("minutes") || "120").toString().trim(),
service: (form.get("service") || "").toString().trim(),
require_types: (form.get("require_types") || "").toString().trim(),
});
write(JSON.stringify(result, null, 2));
});
traceForm.addEventListener("submit", async function (ev) {
ev.preventDefault();
write("Running trace diagnostics...");
const form = new FormData(traceForm);
const result = await runGet("{% url 'system_trace_diagnostics' %}", {
trace_id: (form.get("trace_id") || "").toString().trim(),
});
write(JSON.stringify(result, null, 2));
});
memoryForm.addEventListener("submit", async function (ev) {
ev.preventDefault();
write("Running memory search...");
const form = new FormData(memoryForm);
const result = await runGet("{% url 'system_memory_search_query' %}", {
q: (form.get("q") || "").toString().trim(),
statuses: (form.get("statuses") || "active").toString().trim(),
limit: "20",
});
write(JSON.stringify(result, null, 2));
});
if (memoryStatusBtn) {
memoryStatusBtn.addEventListener("click", async function () {
write("Checking memory search backend...");
const result = await runGet("{% url 'system_memory_search_status' %}", {});
write(JSON.stringify(result, null, 2));
});
}
if (selectAllBtn) {
selectAllBtn.addEventListener("click", function () {
selectOutputText();
});
}
if (copyBtn) {
copyBtn.addEventListener("click", function () {
copyOutputText();
});
}
})();
</script>
{% endblock %}

View File

@@ -1,6 +1,7 @@
from io import StringIO
from django.core.management import call_command
from django.core.management.base import CommandError
from django.test import TestCase, override_settings
from core.events.ledger import append_event_sync
@@ -45,3 +46,46 @@ class EventLedgerSmokeCommandTests(TestCase):
rendered = out.getvalue()
self.assertIn("event-ledger-smoke", rendered)
self.assertIn("event_type_counts=", rendered)
def test_smoke_command_validates_required_types(self):
append_event_sync(
user=self.user,
session=self.session,
ts=1770000000001,
event_type="message_created",
direction="in",
origin_transport="signal",
origin_message_id="m-required",
payload={"message_id": "m-required"},
)
out = StringIO()
call_command(
"event_ledger_smoke",
user_id=str(self.user.id),
minutes=999999,
require_types="message_created",
stdout=out,
)
rendered = out.getvalue()
self.assertIn("required_types=", rendered)
self.assertIn("missing_required_types=[]", rendered)
def test_smoke_command_errors_when_required_type_missing(self):
with self.assertRaises(CommandError):
call_command(
"event_ledger_smoke",
user_id=str(self.user.id),
minutes=999999,
require_types="reaction_added",
stdout=StringIO(),
)
def test_smoke_command_errors_when_empty_and_fail_if_empty(self):
with self.assertRaises(CommandError):
call_command(
"event_ledger_smoke",
user_id=str(self.user.id),
minutes=1,
fail_if_empty=True,
stdout=StringIO(),
)

View File

@@ -136,3 +136,24 @@ class EventProjectionShadowTests(TestCase):
)
rendered = out.getvalue()
self.assertIn("shadow compare:", rendered)
def test_management_command_supports_recent_only_switch(self):
Message.objects.create(
user=self.user,
session=self.session,
ts=int(time.time() * 1000),
sender_uuid="+15550000001",
text="recent-only",
source_service="signal",
source_message_id="recent-only-1",
)
out = StringIO()
call_command(
"event_projection_shadow",
user_id=str(self.user.id),
recent_only=True,
limit_sessions=5,
stdout=out,
)
rendered = out.getvalue()
self.assertIn("shadow compare:", rendered)

View File

@@ -0,0 +1,62 @@
from io import StringIO
from django.core.management import call_command
from django.test import TestCase, override_settings
from core.models import AIRequest, MemoryItem, User, WorkspaceConversation
@override_settings(MEMORY_SEARCH_BACKEND="django")
class MemorySearchCommandTests(TestCase):
def setUp(self):
self.user = User.objects.create_user(
username="memory-search-user",
email="memory-search@example.com",
password="pw",
)
self.conversation = WorkspaceConversation.objects.create(
user=self.user,
platform_type="signal",
title="Memory Search Scope",
platform_thread_id="mem-scope-1",
)
request = AIRequest.objects.create(
user=self.user,
conversation=self.conversation,
window_spec={},
operation="memory_propose",
)
self.item = MemoryItem.objects.create(
user=self.user,
conversation=self.conversation,
memory_kind="fact",
status="active",
content={"text": "Prefers concise updates with action items."},
source_request=request,
)
def test_reindex_command_emits_summary(self):
out = StringIO()
call_command(
"memory_search_reindex",
user_id=str(self.user.id),
statuses="active",
limit=100,
stdout=out,
)
rendered = out.getvalue()
self.assertIn("memory-search-reindex", rendered)
self.assertIn("indexed=", rendered)
def test_query_command_returns_hit(self):
out = StringIO()
call_command(
"memory_search_query",
user_id=str(self.user.id),
query="concise updates",
statuses="active",
stdout=out,
)
rendered = out.getvalue()
self.assertIn("memory-search-query", rendered)
self.assertIn(str(self.item.id), rendered)

View File

@@ -0,0 +1,150 @@
from django.test import TestCase
from django.urls import reverse
from core.models import (
AIRequest,
ChatSession,
ConversationEvent,
MemoryItem,
Person,
PersonIdentifier,
User,
WorkspaceConversation,
)
class SystemDiagnosticsAPITests(TestCase):
def setUp(self):
self.user = User.objects.create_superuser(
username="sys-diag-admin",
email="sys-diag@example.com",
password="pw",
)
person = Person.objects.create(user=self.user, name="System Diagnostics Person")
identifier = PersonIdentifier.objects.create(
user=self.user,
person=person,
service="signal",
identifier="+15554443333",
)
self.session = ChatSession.objects.create(user=self.user, identifier=identifier)
self.workspace_conversation = WorkspaceConversation.objects.create(
user=self.user,
platform_type="signal",
title="Diag Memory Scope",
platform_thread_id=str(self.session.id),
)
self.client.force_login(self.user)
def test_event_ledger_smoke_api_returns_counts_and_missing_required(self):
ConversationEvent.objects.create(
user=self.user,
session=self.session,
ts=1700000000000,
event_type="message_created",
direction="in",
origin_transport="signal",
payload={"message_id": "m1"},
raw_payload={},
)
response = self.client.get(
reverse("system_event_ledger_smoke"),
{
"minutes": "999999",
"service": "signal",
"require_types": "message_created,reaction_added",
},
)
self.assertEqual(200, response.status_code)
payload = response.json()
self.assertTrue(payload.get("ok"))
self.assertEqual("signal", payload.get("service"))
self.assertIn("event_type_counts", payload)
self.assertIn("missing_required_types", payload)
self.assertIn("reaction_added", payload.get("missing_required_types") or [])
def test_trace_diagnostics_includes_projection_shadow_links(self):
trace_id = "trace-system-diag-1"
event = ConversationEvent.objects.create(
user=self.user,
session=self.session,
ts=1700000001000,
event_type="message_created",
direction="in",
origin_transport="signal",
trace_id=trace_id,
payload={"message_id": "m2"},
raw_payload={},
)
response = self.client.get(
reverse("system_trace_diagnostics"),
{"trace_id": trace_id},
)
self.assertEqual(200, response.status_code)
payload = response.json()
self.assertTrue(payload.get("ok"))
self.assertEqual(1, payload.get("count"))
self.assertIn(str(self.session.id), payload.get("related_session_ids") or [])
urls = payload.get("projection_shadow_urls") or []
self.assertTrue(urls)
self.assertIn(str(self.session.id), str(urls[0]))
events = payload.get("events") or []
self.assertEqual(str(event.id), str(events[0].get("id")))
self.assertIn(
str(self.session.id),
str(events[0].get("projection_shadow_url") or ""),
)
def test_memory_search_status_and_query_api(self):
request = AIRequest.objects.create(
user=self.user,
conversation=self.workspace_conversation,
window_spec={},
operation="memory_propose",
)
memory = MemoryItem.objects.create(
user=self.user,
conversation=self.workspace_conversation,
memory_kind="fact",
status="active",
content={"text": "User prefers concise status updates on WhatsApp."},
source_request=request,
)
status_response = self.client.get(reverse("system_memory_search_status"))
self.assertEqual(200, status_response.status_code)
status_payload = status_response.json()
self.assertTrue(status_payload.get("ok"))
self.assertIn("status", status_payload)
query_response = self.client.get(
reverse("system_memory_search_query"),
{"q": "concise status updates"},
)
self.assertEqual(200, query_response.status_code)
query_payload = query_response.json()
self.assertTrue(query_payload.get("ok"))
self.assertGreaterEqual(int(query_payload.get("count") or 0), 1)
first_hit = (query_payload.get("hits") or [{}])[0]
self.assertEqual(str(memory.id), str(first_hit.get("memory_id") or ""))
def test_system_settings_page_renders_searchable_datalists(self):
ConversationEvent.objects.create(
user=self.user,
session=self.session,
ts=1700000002000,
event_type="reaction_added",
direction="system",
origin_transport="signal",
trace_id="trace-system-diag-2",
payload={"message_id": "m3"},
raw_payload={},
)
response = self.client.get(reverse("system_settings"))
self.assertEqual(200, response.status_code)
content = response.content.decode("utf-8")
self.assertIn('datalist id="diagnostics-session-options"', content)
self.assertIn('datalist id="diagnostics-trace-options"', content)
self.assertIn('datalist id="diagnostics-service-options"', content)
self.assertIn('datalist id="diagnostics-event-type-options"', content)
self.assertIn(str(self.session.id), content)
self.assertIn("trace-system-diag-2", content)

View File

@@ -0,0 +1,52 @@
from django.test import TestCase
from django.urls import reverse
from core.models import ChatSession, Message, Person, PersonIdentifier, User
class SystemProjectionShadowAPITests(TestCase):
def setUp(self):
self.user = User.objects.create_superuser(
username="sys-shadow-admin",
email="sys-shadow@example.com",
password="pw",
)
person = Person.objects.create(user=self.user, name="System Shadow Person")
identifier = PersonIdentifier.objects.create(
user=self.user,
person=person,
service="signal",
identifier="+15553332222",
)
self.session = ChatSession.objects.create(user=self.user, identifier=identifier)
self.client.force_login(self.user)
def test_projection_shadow_requires_session_id(self):
response = self.client.get(reverse("system_projection_shadow"))
self.assertEqual(400, response.status_code)
payload = response.json()
self.assertFalse(payload.get("ok"))
self.assertEqual("session_id_required", payload.get("error"))
def test_projection_shadow_includes_cause_summary_and_samples(self):
Message.objects.create(
user=self.user,
session=self.session,
ts=1700000000000,
sender_uuid="+15553332222",
text="row-without-event",
)
response = self.client.get(
reverse("system_projection_shadow"),
{"session_id": str(self.session.id), "detail_limit": 10},
)
self.assertEqual(200, response.status_code)
payload = response.json()
self.assertTrue(payload.get("ok"))
self.assertIn("cause_summary", payload)
self.assertIn("cause_samples", payload)
cause_summary = dict(payload.get("cause_summary") or {})
cause_samples = dict(payload.get("cause_samples") or {})
self.assertIn("missing_event_write", cause_summary)
self.assertIn("missing_event_write", cause_samples)
self.assertGreaterEqual(int(cause_summary.get("missing_event_write") or 0), 1)

View File

@@ -1,5 +1,8 @@
import time
from django.http import JsonResponse
from django.shortcuts import render
from django.urls import reverse
from django.views import View
from core.models import (
@@ -29,6 +32,7 @@ from core.models import (
WorkspaceMetricSnapshot,
)
from core.events.projection import shadow_compare_session
from core.memory.search_backend import backend_status, get_memory_search_backend
from core.transports.capabilities import capability_snapshot
from core.views.manage.permissions import SuperUserRequiredMixin
@@ -143,19 +147,81 @@ class SystemSettings(SuperUserRequiredMixin, View):
)
return ("danger", "Unknown action.")
def get(self, request):
return render(
request,
self.template_name,
{
"counts": self._counts(request.user),
"notice_level": "",
"notice_message": "",
},
)
def post(self, request):
notice_level, notice_message = self._handle_action(request)
def _diagnostics_options(self, user):
session_rows = list(
ChatSession.objects.filter(user=user)
.select_related("identifier", "identifier__person")
.order_by("-last_interaction", "-id")[:120]
)
session_options = []
for row in session_rows:
identifier = getattr(row, "identifier", None)
person = getattr(identifier, "person", None) if identifier else None
session_options.append(
{
"id": str(row.id),
"label": " | ".join(
[
str(getattr(person, "name", "") or "-"),
str(row.id),
str(getattr(identifier, "service", "") or "-"),
str(getattr(identifier, "identifier", "") or "-"),
]
),
}
)
trace_options = []
seen_trace_ids = set()
for trace_id in (
ConversationEvent.objects.filter(user=user)
.exclude(trace_id="")
.order_by("-ts")
.values_list("trace_id", flat=True)[:400]
):
value = str(trace_id or "").strip()
if not value or value in seen_trace_ids:
continue
seen_trace_ids.add(value)
trace_options.append(value)
if len(trace_options) >= 120:
break
service_candidates = {"signal", "whatsapp", "xmpp", "instagram", "web"}
service_candidates.update(
str(item or "").strip().lower()
for item in ConversationEvent.objects.filter(user=user)
.exclude(origin_transport="")
.values_list("origin_transport", flat=True)
.distinct()[:50]
)
service_options = sorted(value for value in service_candidates if value)
event_type_candidates = {
"message_created",
"reaction_added",
"reaction_removed",
"read_receipt",
"message_updated",
"message_deleted",
}
event_type_candidates.update(
str(item or "").strip().lower()
for item in ConversationEvent.objects.filter(user=user)
.exclude(event_type="")
.values_list("event_type", flat=True)
.distinct()[:80]
)
event_type_options = sorted(value for value in event_type_candidates if value)
return {
"sessions": session_options,
"trace_ids": trace_options,
"services": service_options,
"event_types": event_type_options,
}
def _render_page(self, request, notice_level="", notice_message=""):
return render(
request,
self.template_name,
@@ -163,9 +229,21 @@ class SystemSettings(SuperUserRequiredMixin, View):
"counts": self._counts(request.user),
"notice_level": notice_level,
"notice_message": notice_message,
"diagnostics_options": self._diagnostics_options(request.user),
},
)
def get(self, request):
return self._render_page(request)
def post(self, request):
notice_level, notice_message = self._handle_action(request)
return self._render_page(
request,
notice_level=notice_level,
notice_message=notice_message,
)
class ServiceCapabilitySnapshotAPI(SuperUserRequiredMixin, View):
def get(self, request):
@@ -211,11 +289,25 @@ class TraceDiagnosticsAPI(SuperUserRequiredMixin, View):
.select_related("session")
.order_by("ts", "created_at")[:500]
)
related_session_ids = []
seen_sessions = set()
for row in rows:
session_id = str(row.session_id or "").strip()
if not session_id or session_id in seen_sessions:
continue
seen_sessions.add(session_id)
related_session_ids.append(session_id)
return JsonResponse(
{
"ok": True,
"trace_id": trace_id,
"count": len(rows),
"related_session_ids": related_session_ids,
"projection_shadow_urls": [
f"{reverse('system_projection_shadow')}?session_id={session_id}"
for session_id in related_session_ids
],
"events": [
{
"id": str(row.id),
@@ -223,6 +315,11 @@ class TraceDiagnosticsAPI(SuperUserRequiredMixin, View):
"event_type": str(row.event_type or ""),
"direction": str(row.direction or ""),
"session_id": str(row.session_id or ""),
"projection_shadow_url": (
f"{reverse('system_projection_shadow')}?session_id={str(row.session_id or '').strip()}"
if str(row.session_id or "").strip()
else ""
),
"origin_transport": str(row.origin_transport or ""),
"origin_message_id": str(row.origin_message_id or ""),
"payload": dict(row.payload or {}),
@@ -260,3 +357,105 @@ class EventProjectionShadowAPI(SuperUserRequiredMixin, View):
"cause_samples": dict(compared.get("cause_samples") or {}),
}
)
class EventLedgerSmokeAPI(SuperUserRequiredMixin, View):
def get(self, request):
minutes = max(1, int(request.GET.get("minutes") or 120))
service = str(request.GET.get("service") or "").strip().lower()
user_id = str(request.GET.get("user_id") or "").strip() or str(request.user.id)
limit = max(1, min(500, int(request.GET.get("limit") or 200)))
require_types_raw = str(request.GET.get("require_types") or "").strip()
required_types = [
item.strip().lower()
for item in require_types_raw.split(",")
if item.strip()
]
cutoff_ts = int(time.time() * 1000) - (minutes * 60 * 1000)
queryset = ConversationEvent.objects.filter(ts__gte=cutoff_ts).order_by("-ts")
if service:
queryset = queryset.filter(origin_transport=service)
if user_id:
queryset = queryset.filter(user_id=user_id)
rows = list(
queryset.values(
"id",
"user_id",
"session_id",
"ts",
"event_type",
"direction",
"origin_transport",
"trace_id",
)[:limit]
)
event_type_counts = {}
for row in rows:
key = str(row.get("event_type") or "")
event_type_counts[key] = int(event_type_counts.get(key) or 0) + 1
missing_required_types = [
event_type
for event_type in required_types
if int(event_type_counts.get(event_type) or 0) <= 0
]
return JsonResponse(
{
"ok": True,
"minutes": minutes,
"service": service,
"user_id": user_id,
"count": len(rows),
"event_type_counts": event_type_counts,
"required_types": required_types,
"missing_required_types": missing_required_types,
"sample": rows[:25],
}
)
class MemorySearchStatusAPI(SuperUserRequiredMixin, View):
def get(self, request):
return JsonResponse({"ok": True, "status": backend_status()})
class MemorySearchQueryAPI(SuperUserRequiredMixin, View):
def get(self, request):
query = str(request.GET.get("q") or "").strip()
user_id = int(request.GET.get("user_id") or request.user.id)
conversation_id = str(request.GET.get("conversation_id") or "").strip()
limit = max(1, min(50, int(request.GET.get("limit") or 20)))
statuses = tuple(
item.strip().lower()
for item in str(request.GET.get("statuses") or "active").split(",")
if item.strip()
)
if not query:
return JsonResponse({"ok": False, "error": "query_required"}, status=400)
backend = get_memory_search_backend()
hits = backend.search(
user_id=user_id,
query=query,
conversation_id=conversation_id,
limit=limit,
include_statuses=statuses,
)
return JsonResponse(
{
"ok": True,
"backend": getattr(backend, "name", "unknown"),
"query": query,
"count": len(hits),
"hits": [
{
"memory_id": item.memory_id,
"score": item.score,
"summary": item.summary,
"payload": item.payload,
}
for item in hits
],
}
)
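The aggregation inside `EventLedgerSmokeAPI` can be illustrated standalone: count events per type, then flag required types that saw zero events. This is a minimal sketch mirroring the view's logic; the row dicts stand in for the `.values()` rows and are hypothetical:

```python
# Sketch of the smoke-check aggregation: per-type counts plus a list of
# required event types that did not occur in the sampled rows.
def summarize_events(rows, require_types=""):
    required = [t.strip().lower() for t in require_types.split(",") if t.strip()]
    counts = {}
    for row in rows:
        key = str(row.get("event_type") or "")
        counts[key] = int(counts.get(key) or 0) + 1
    missing = [t for t in required if int(counts.get(t) or 0) <= 0]
    return {"event_type_counts": counts, "missing_required_types": missing}

rows = [{"event_type": "message_created"}, {"event_type": "message_created"}]
summary = summarize_events(rows, "message_created,reaction_added")
# missing_required_types -> ["reaction_added"]
```

A non-empty `missing_required_types` is what the management-command variant turns into a `CommandError`, while the API variant simply reports it in the JSON payload.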

View File

@@ -38,3 +38,9 @@ QUADLET_PROSODY_CONFIG_FILE=./utilities/prosody/prosody.cfg.lua
QUADLET_PROSODY_CERTS_DIR=./.podman/gia_prosody_certs
QUADLET_PROSODY_DATA_DIR=./.podman/gia_prosody_data
QUADLET_PROSODY_LOGS_DIR=./.podman/gia_prosody_logs
# Memory/wiki search backend foundation
MEMORY_SEARCH_BACKEND=django
MANTICORE_HTTP_URL=http://127.0.0.1:9308
MANTICORE_MEMORY_TABLE=gia_memory_items
MANTICORE_HTTP_TIMEOUT=5

View File

@@ -0,0 +1,83 @@
#!/usr/bin/env bash
set -euo pipefail
ROOT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")/../.." && pwd)"
STACK_ENV="${STACK_ENV:-$ROOT_DIR/stack.env}"
if [[ -f "$STACK_ENV" ]]; then
set -a
. "$STACK_ENV"
set +a
fi
STACK_ID="${GIA_STACK_ID:-${STACK_ID:-}}"
STACK_ID="$(echo "$STACK_ID" | tr -cs 'a-zA-Z0-9._-' '-' | sed 's/^-*//; s/-*$//')"
name_with_stack() {
local base="$1"
if [[ -n "$STACK_ID" ]]; then
echo "${base}_${STACK_ID}"
else
echo "$base"
fi
}
MANTICORE_CONTAINER="$(name_with_stack "manticore_gia")"
MANTICORE_CONFIG_FILE="${MANTICORE_CONFIG_FILE:-$ROOT_DIR/utilities/memory/manticore.conf}"
MANTICORE_DATA_DIR="${MANTICORE_DATA_DIR:-$ROOT_DIR/.podman/gia_manticore_data}"
MANTICORE_LOG_DIR="${MANTICORE_LOG_DIR:-$ROOT_DIR/.podman/gia_manticore_log}"
MANTICORE_MYSQL_PORT="${MANTICORE_MYSQL_PORT:-9306}"
MANTICORE_HTTP_PORT="${MANTICORE_HTTP_PORT:-9308}"
MANTICORE_SPHINX_PORT="${MANTICORE_SPHINX_PORT:-9312}"
mkdir -p "$MANTICORE_DATA_DIR" "$MANTICORE_LOG_DIR"
up() {
podman run -d \
--replace \
--name "$MANTICORE_CONTAINER" \
-p "${MANTICORE_MYSQL_PORT}:9306" \
-p "${MANTICORE_HTTP_PORT}:9308" \
-p "${MANTICORE_SPHINX_PORT}:9312" \
-v "$MANTICORE_DATA_DIR:/var/lib/manticore" \
-v "$MANTICORE_LOG_DIR:/var/log/manticore" \
-v "$MANTICORE_CONFIG_FILE:/etc/manticoresearch/manticore.conf:ro" \
docker.io/manticoresearch/manticore:latest >/dev/null
echo "Started $MANTICORE_CONTAINER"
}
down() {
podman rm -f "$MANTICORE_CONTAINER" >/dev/null 2>&1 || true
echo "Stopped $MANTICORE_CONTAINER"
}
status() {
podman ps --format "table {{.Names}}\t{{.Status}}" | grep -E "^$MANTICORE_CONTAINER\b" || true
}
logs() {
podman logs -f "$MANTICORE_CONTAINER"
}
case "${1:-}" in
up)
up
;;
down)
down
;;
restart)
down
up
;;
status)
status
;;
logs)
logs
;;
*)
echo "Usage: $0 {up|down|restart|status|logs}" >&2
exit 2
;;
esac
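The container-name derivation above (sanitize `STACK_ID`, then suffix the base name) can be sketched outside the shell for clarity. This is a loose Python translation of the `tr`/`sed` pipeline, not part of the script; the sample stack id is hypothetical:

```python
import re

def sanitize_stack_id(raw):
    """Mirror `tr -cs 'a-zA-Z0-9._-' '-'` plus the sed trim: squeeze each run
    of disallowed characters to a single '-', then strip edge dashes."""
    squeezed = re.sub(r"[^a-zA-Z0-9._-]+", "-", raw or "")
    return squeezed.strip("-")

def name_with_stack(base, stack_id):
    """Suffix the base container name only when a stack id is configured."""
    return f"{base}_{stack_id}" if stack_id else base

print(name_with_stack("manticore_gia", sanitize_stack_id("dev env!")))
# -> manticore_gia_dev-env
```

With an empty `STACK_ID` the base name is used unchanged, so single-stack hosts keep the plain `manticore_gia` container name.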

View File

@@ -0,0 +1,10 @@
searchd {
listen = 0.0.0.0:9312
listen = 0.0.0.0:9306:mysql
listen = 0.0.0.0:9308:http
log = /var/log/manticore/searchd.log
query_log = /var/log/manticore/query.log
pid_file = /var/run/manticore/searchd.pid
data_dir = /var/lib/manticore
auto_schema = 1
}
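With the HTTP listener above on 9308, the table named by `MANTICORE_MEMORY_TABLE` can be queried over Manticore's JSON `/search` endpoint. A minimal stdlib-only client sketch; the base URL and table name mirror the env defaults, and the exact response shape is left to the caller:

```python
import json
import urllib.request

def build_search_payload(table, query, limit=20):
    """Request body for Manticore's JSON /search endpoint (full-text match)."""
    return {
        "index": table,  # recent Manticore versions also accept "table" here
        "query": {"query_string": query},
        "limit": limit,
    }

def search_memory(query, base_url="http://127.0.0.1:9308",
                  table="gia_memory_items", timeout=5):
    """POST a full-text search and return the decoded JSON response."""
    body = json.dumps(build_search_payload(table, query)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/search",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With `auto_schema = 1` the table can be created implicitly on first insert, so the reindex command only needs write access to the HTTP endpoint.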