Implement Manticore fully and re-theme

CLAUDE.md (13)
@@ -46,3 +46,16 @@ Preferred terms:
 | "Them" | "contact" or "remote party" |
 
 Apply this in: comments, template labels, log messages, and variable names.
+
+## Runtime: uWSGI Reload File
+
+The app container uses uWSGI with a single touch-based reload sentinel:
+
+- Reload file: `/code/.uwsgi-reload`
+- Config: [docker/uwsgi.ini](/code/xf/GIA/docker/uwsgi.ini)
+
+Rules:
+
+- Never run `python manage.py ...` on the host. Run Django management commands inside Podman, for example with `podman exec`.
+- After changing templates or app code that should be picked up by the `gia` uWSGI service, touch `/code/.uwsgi-reload`.
+- If the uWSGI config itself changes, touch `/code/.uwsgi-reload` and restart the `gia` container so the new config is loaded.
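The sentinel relies on uWSGI's `touch-reload` option; a minimal sketch of the fragment [docker/uwsgi.ini](/code/xf/GIA/docker/uwsgi.ini) is assumed to contain (illustrative, not the full file):

```ini
[uwsgi]
; Reload all workers whenever this file's mtime changes,
; e.g. via: podman exec gia touch /code/.uwsgi-reload
touch-reload = /code/.uwsgi-reload
```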
@@ -59,6 +59,10 @@ Memory/wiki search helpers:
 - `MEMORY_SEARCH_BACKEND` (`django` or `manticore`)
 - `MANTICORE_HTTP_URL`
 - `MANTICORE_MEMORY_TABLE`
+- `MANTICORE_EVENT_TABLE`
+- `MANTICORE_METRIC_TABLE`
+- `COMPOSING_ABANDONED_WINDOW_SECONDS`
+- `CONVERSATION_EVENT_RETENTION_DAYS`
 - `MANTICORE_HTTP_TIMEOUT`
 
 For XMPP media upload, configure one of:
@@ -238,6 +242,9 @@ Performance defaults now applied in GIA:
 
 - Batched Manticore reindex writes (`REPLACE ... VALUES (...)` in chunks) for lower ingest latency.
 - Cached table-ensure checks to avoid `CREATE TABLE IF NOT EXISTS` overhead on every query.
+- Behavioral event dual-write uses `MANTICORE_EVENT_TABLE` (default `gia_events`) when event ledger flags are enabled.
+- Behavioral metrics are written by `python manage.py gia_analysis` into `MANTICORE_METRIC_TABLE` (default `gia_metrics`).
+- ORM shadow copies can be pruned with `python manage.py prune_behavioral_orm_data`; defaults are driven by `CONVERSATION_EVENT_RETENTION_DAYS`.
 - Runtime table maintenance available through MCP (`FLUSH RAMCHUNK`, `OPTIMIZE TABLE`) for steady query responsiveness.
 
 ### F) MCP server for task + memory tooling (VS Code)
@@ -12,6 +12,9 @@ GIA is a multi-transport communication workspace that unifies Signal, WhatsApp,
 - Supports canonical task creation from chat commands, web UI, and MCP tooling.
 - Bridges messages across transports (including XMPP) with attachment handling.
 - Tracks delivery/read metadata and typing state events.
+- Can dual-write canonical behavioral events to Manticore for time-series analysis.
+- Includes `gia_analysis` for rolling behavioral metric aggregation into Manticore.
+- Includes `prune_behavioral_orm_data` to keep Django event shadow tables bounded once Manticore is primary.
 - Provides AI workspace analytics, mitigation plans, and insight visualizations.
 - Exposes fine-grained capability policy controls for gateway commands, task intake, and command execution.
 - Separates XMPP encryption controls into plaintext rejection, component-chat encryption, and relayed-contact encryption.
@@ -108,6 +108,14 @@ EVENT_PRIMARY_WRITE_PATH = getenv("EVENT_PRIMARY_WRITE_PATH", "false").lower() i
 MEMORY_SEARCH_BACKEND = getenv("MEMORY_SEARCH_BACKEND", "django")
 MANTICORE_HTTP_URL = getenv("MANTICORE_HTTP_URL", "http://localhost:9308")
 MANTICORE_MEMORY_TABLE = getenv("MANTICORE_MEMORY_TABLE", "gia_memory_items")
+MANTICORE_EVENT_TABLE = getenv("MANTICORE_EVENT_TABLE", "gia_events")
+MANTICORE_METRIC_TABLE = getenv("MANTICORE_METRIC_TABLE", "gia_metrics")
+COMPOSING_ABANDONED_WINDOW_SECONDS = int(
+    getenv("COMPOSING_ABANDONED_WINDOW_SECONDS", "300")
+)
+CONVERSATION_EVENT_RETENTION_DAYS = int(
+    getenv("CONVERSATION_EVENT_RETENTION_DAYS", "90") or 90
+)
 MANTICORE_HTTP_TIMEOUT = int(getenv("MANTICORE_HTTP_TIMEOUT", "5") or 5)
 
 # Attachment security defaults for transport adapters.
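The `int(getenv(...) or N)` pattern in the settings above guards against env vars that are set but empty, since `int("")` raises. A standalone sketch (hypothetical `int_env` helper, not from the codebase):

```python
import os

def int_env(name: str, default: int) -> int:
    # Mirrors the settings pattern: an env var set to an empty string
    # would make int("") raise, so fall back via `or`.
    return int(os.environ.get(name, str(default)) or default)

os.environ["CONVERSATION_EVENT_RETENTION_DAYS"] = ""
print(int_env("CONVERSATION_EVENT_RETENTION_DAYS", 90))  # falls back to 90
```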
@@ -445,8 +445,13 @@ urlpatterns = [
         name="codex_approval",
     ),
     path(
-        "settings/availability/",
+        "settings/behavioral/",
         availability.AvailabilitySettingsPage.as_view(),
+        name="behavioral_signals_settings",
+    ),
+    path(
+        "settings/availability/",
+        RedirectView.as_view(pattern_name="behavioral_signals_settings", permanent=False),
         name="availability_settings",
     ),
     # AIs
@@ -1579,6 +1579,33 @@ class WhatsAppClient(ClientBase):
                 out.add(f"{mapped}@s.whatsapp.net")
         return out
 
+    def _message_identifier_candidates(self, *, sender, chat, is_from_me):
+        """
+        Resolve the logical contact for a WhatsApp message event.
+
+        Direct outbound messages must bind to the chat peer, not the sender,
+        otherwise the user's own account identifier can fan out the same message
+        into unrelated XMPP contact threads.
+        """
+        sender_value = self._jid_to_identifier(sender)
+        chat_value = self._jid_to_identifier(chat)
+        candidate_values = []
+
+        if chat_value.endswith("@g.us"):
+            candidate_values.append(chat)
+        elif is_from_me:
+            if chat_value:
+                candidate_values.append(chat)
+            elif sender_value:
+                candidate_values.append(sender)
+        else:
+            if sender_value:
+                candidate_values.append(sender)
+            elif chat_value:
+                candidate_values.append(chat)
+
+        return self._normalize_identifier_candidates(*candidate_values)
+
     async def _sync_contacts_from_client(self):
         if self._client is None:
             return
@@ -2666,7 +2693,11 @@ class WhatsAppClient(ClientBase):
             sender_jid=str(sender or ""),
         )
 
-        identifier_values = self._normalize_identifier_candidates(sender, chat)
+        identifier_values = self._message_identifier_candidates(
+            sender=sender,
+            chat=chat,
+            is_from_me=is_from_me,
+        )
         if not identifier_values:
             return
 
@@ -30,7 +30,7 @@ def settings_hierarchy_nav(request):
     business_plans_href = reverse("business_plan_inbox")
     tasks_href = reverse("tasks_settings")
     translation_href = reverse("translation_settings")
-    availability_href = reverse("availability_settings")
+    behavioral_href = reverse("behavioral_signals_settings")
 
     categories = {
         "general": {
@@ -99,6 +99,7 @@ def settings_hierarchy_nav(request):
             "translation_settings",
             "translation_preview",
             "availability_settings",
+            "behavioral_signals_settings",
             "codex_settings",
             "codex_approval",
         },
@@ -116,7 +117,12 @@ def settings_hierarchy_nav(request):
                     translation_href,
                     lambda: url_name in {"translation_settings", "translation_preview"},
                 ),
-                ("Availability", availability_href, lambda: path == availability_href),
+                (
+                    "Behavioral Signals",
+                    behavioral_href,
+                    lambda: url_name
+                    in {"availability_settings", "behavioral_signals_settings"},
+                ),
             ],
         },
     }
core/events/behavior.py (new file, 213 lines)
@@ -0,0 +1,213 @@
from __future__ import annotations

import json
import statistics
from dataclasses import dataclass
from typing import Any


def safe_int(value: Any, default: int = 0) -> int:
    try:
        return int(value)
    except Exception:
        return int(default)


def parse_payload(value: Any) -> dict:
    if isinstance(value, dict):
        return dict(value)
    if isinstance(value, str):
        text = value.strip()
        if not text:
            return {}
        try:
            loaded = json.loads(text)
        except Exception:
            return {}
        if isinstance(loaded, dict):
            return dict(loaded)
    return {}


def median_ms(values: list[int]) -> int:
    clean = [int(v) for v in values if safe_int(v, 0) > 0]
    if not clean:
        return 0
    return int(statistics.median(clean))


def z_score(value: int, baseline_samples: list[int]) -> float:
    clean = [int(v) for v in baseline_samples if safe_int(v, 0) > 0]
    if len(clean) < 2:
        return 0.0
    baseline = statistics.median(clean)
    stdev = statistics.pstdev(clean)
    if stdev <= 0:
        return 0.0
    return float((float(value) - float(baseline)) / float(stdev))
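A quick worked example of the baseline math in `median_ms` / `z_score` above, done directly with the stdlib `statistics` module (sample numbers are illustrative):

```python
import statistics

baseline = [1000, 1200, 1400, 1600, 1800]  # historical delay samples (ms)

# median_ms: central tendency of the positive samples
median = int(statistics.median(baseline))   # 1400

# z_score: deviation of a new observation from the baseline median,
# scaled by the population standard deviation
stdev = statistics.pstdev(baseline)         # ~282.84
z = (4000 - median) / stdev                 # ~9.19 -> strong anomaly
print(median, round(z, 2))
```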
@dataclass
class CompositionState:
    started_ts: int
    last_started_ts: int
    stopped_ts: int = 0
    revision: int = 1


class ComposingTracker:
    def __init__(self, window_ms: int = 300000):
        self.window_ms = max(1000, int(window_ms or 300000))
        self._state: dict[str, CompositionState] = {}

    def observe_started(self, session_id: str, ts: int) -> CompositionState:
        key = str(session_id or "").strip()
        if not key:
            raise ValueError("session_id is required")
        safe_ts_value = max(0, safe_int(ts, 0))
        state = self._state.get(key)
        if state is None:
            state = CompositionState(
                started_ts=safe_ts_value,
                last_started_ts=safe_ts_value,
                revision=1,
            )
            self._state[key] = state
            return state
        if state.stopped_ts > 0:
            state.revision += 1
        state.last_started_ts = safe_ts_value
        state.stopped_ts = 0
        return state

    def observe_stopped(self, session_id: str, ts: int) -> dict | None:
        key = str(session_id or "").strip()
        state = self._state.get(key)
        if state is None:
            return None
        safe_ts_value = max(0, safe_int(ts, 0))
        duration_ms = max(0, safe_ts_value - int(state.started_ts or 0))
        if duration_ms >= self.window_ms:
            self._state.pop(key, None)
            return {
                "started_ts": int(state.started_ts or 0),
                "stopped_ts": safe_ts_value,
                "duration_ms": duration_ms,
                "revision": int(state.revision or 1),
                "abandoned": True,
            }
        state.stopped_ts = safe_ts_value
        return None

    def observe_message(self, session_id: str) -> CompositionState | None:
        key = str(session_id or "").strip()
        if not key:
            return None
        return self._state.pop(key, None)
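The abandonment rule in `observe_stopped` above reduces to a single comparison; a minimal sketch of the same window check (default 300 s, expressed in ms, standalone for illustration):

```python
WINDOW_MS = 300_000  # matches ComposingTracker's default window

def is_abandoned(started_ts: int, stopped_ts: int, window_ms: int = WINDOW_MS) -> bool:
    # A composing burst that ran at least the full window before stopping,
    # with no message sent, is classified as abandoned.
    duration_ms = max(0, stopped_ts - started_ts)
    return duration_ms >= window_ms

print(is_abandoned(0, 299_999))  # False: stopped just inside the window
print(is_abandoned(0, 300_000))  # True: ran the full window
```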
def extract_metric_samples(rows: list[dict]) -> dict[str, list[int]]:
    delivered_by_message: dict[str, int] = {}
    read_by_message: dict[str, int] = {}
    delay_c_samples: list[int] = []
    delay_f_samples: list[int] = []
    revision_samples: list[int] = []
    abandoned_started = 0
    abandoned_total = 0
    composition_by_session: dict[str, dict[str, int]] = {}
    presence_by_session: dict[str, int] = {}

    for row in sorted(
        list(rows or []),
        key=lambda item: (
            safe_int(item.get("ts"), 0),
            str(item.get("kind") or ""),
            str(item.get("session_id") or ""),
        ),
    ):
        kind = str(row.get("kind") or "").strip().lower()
        session_id = str(row.get("session_id") or "").strip()
        ts = safe_int(row.get("ts"), 0)
        payload = parse_payload(row.get("payload"))
        message_id = str(
            payload.get("message_id")
            or payload.get("origin_message_id")
            or row.get("origin_message_id")
            or ""
        ).strip()

        if kind == "message_delivered" and message_id:
            delivered_by_message[message_id] = ts
            continue
        if kind == "message_read" and message_id:
            read_by_message[message_id] = ts
            continue
        if kind == "presence_available" and session_id:
            presence_by_session[session_id] = ts
            continue
        if kind == "composing_started" and session_id:
            abandoned_started += 1
            state = composition_by_session.get(session_id)
            if state is None:
                state = {"started_ts": ts, "revision": 1}
                composition_by_session[session_id] = state
            else:
                state["revision"] = int(state.get("revision", 1)) + 1
            if presence_by_session.get(session_id):
                delta = ts - int(presence_by_session.get(session_id) or 0)
                if delta >= 0:
                    delay_f_samples.append(delta)
            continue
        if kind == "composing_abandoned":
            abandoned_total += 1
            if session_id:
                composition_by_session.pop(session_id, None)
            continue
        if kind == "message_sent" and session_id:
            state = composition_by_session.pop(session_id, None)
            if state is None:
                continue
            delta = ts - int(state.get("started_ts") or 0)
            if delta >= 0:
                delay_c_samples.append(delta)
            revision_samples.append(max(1, int(state.get("revision") or 1)))

    delay_b_samples = []
    for message_id, delivered_ts in delivered_by_message.items():
        read_ts = safe_int(read_by_message.get(message_id), 0)
        if read_ts > 0 and read_ts >= delivered_ts:
            delay_b_samples.append(read_ts - delivered_ts)

    abandoned_rate_samples = []
    if abandoned_started > 0:
        abandoned_rate_samples.append(
            int(round((float(abandoned_total) / float(abandoned_started)) * 1000))
        )

    return {
        "delay_b": delay_b_samples,
        "delay_c": delay_c_samples,
        "delay_f": delay_f_samples,
        "revision": revision_samples,
        "abandoned_rate": abandoned_rate_samples,
    }


def summarize_metrics(window_rows: list[dict], baseline_rows: list[dict]) -> dict[str, dict]:
    window_samples = extract_metric_samples(window_rows)
    baseline_samples = extract_metric_samples(baseline_rows)
    metrics: dict[str, dict] = {}
    for metric in ("delay_b", "delay_c", "delay_f", "revision", "abandoned_rate"):
        samples = list(window_samples.get(metric) or [])
        if not samples:
            continue
        baseline = list(baseline_samples.get(metric) or [])
        value = median_ms(samples)
        baseline_value = median_ms(baseline)
        metrics[metric] = {
            "value_ms": int(value),
            "baseline_ms": int(baseline_value),
            "z_score": float(round(z_score(value, baseline), 6)),
            "sample_n": len(samples),
        }
    return metrics
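The rows fed to `extract_metric_samples` are flat dicts (`kind`, `session_id`, `ts`, `payload`). The delivered→read pairing that produces the `delay_b` samples can be sketched independently (illustrative data):

```python
# Illustrative event rows; timestamps in ms.
rows = [
    {"kind": "message_delivered", "ts": 1_000, "payload": {"message_id": "m1"}},
    {"kind": "message_read", "ts": 4_500, "payload": {"message_id": "m1"}},
    {"kind": "message_read", "ts": 2_000, "payload": {"message_id": "m2"}},  # no delivery seen
]

delivered, read = {}, {}
for row in rows:
    mid = row["payload"]["message_id"]
    if row["kind"] == "message_delivered":
        delivered[mid] = row["ts"]
    elif row["kind"] == "message_read":
        read[mid] = row["ts"]

# Only messages with both receipts, read at-or-after delivery, yield a sample.
delay_b = [read[m] - ts for m, ts in delivered.items() if read.get(m, 0) >= ts]
print(delay_b)  # [3500]
```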
@@ -5,12 +5,19 @@ import time
 from asgiref.sync import sync_to_async
 from django.conf import settings
 
+from core.events.manticore import get_event_ledger_backend
 from core.models import ConversationEvent
 from core.observability.tracing import ensure_trace_id
+from core.util import logs
+
+log = logs.get_logger("event-ledger")
 
 
 def event_ledger_enabled() -> bool:
-    return bool(getattr(settings, "EVENT_LEDGER_DUAL_WRITE", False))
+    return bool(
+        getattr(settings, "EVENT_LEDGER_DUAL_WRITE", False)
+        or getattr(settings, "EVENT_PRIMARY_WRITE_PATH", False)
+    )
 
 
 def event_ledger_status() -> dict:
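How the two flags interact downstream in `append_event_sync` can be summarized in one table; this sketch uses plain booleans in place of the Django settings (hypothetical `write_targets` helper, not from the codebase):

```python
def write_targets(dual_write: bool, primary_write: bool) -> dict:
    # Mirrors append_event_sync: the ORM row is only written while
    # Manticore is not yet the primary store; Manticore is attempted
    # whenever the ledger is enabled, and its failures only raise
    # once Manticore is primary (otherwise they are logged).
    return {
        "django_orm": dual_write and not primary_write,
        "manticore": dual_write or primary_write,
        "manticore_errors_fatal": primary_write,
    }

print(write_targets(dual_write=True, primary_write=False))
# {'django_orm': True, 'manticore': True, 'manticore_errors_fatal': False}
```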
@@ -72,38 +79,78 @@ def append_event_sync(
     normalized_direction = _normalize_direction(direction)
     normalized_trace = ensure_trace_id(trace_id, payload or {})
+
+    safe_ts = _safe_ts(ts)
     transport = str(origin_transport or "").strip().lower()
     message_id = str(origin_message_id or "").strip()
-    dedup_row = None
-    if transport and message_id:
-        dedup_row = (
-            ConversationEvent.objects.filter(
-                user=user,
-                session=session,
-                event_type=normalized_type,
-                origin_transport=transport,
-                origin_message_id=message_id,
-            )
-            .order_by("-created_at")
-            .first()
-        )
-        if dedup_row is not None:
-            return dedup_row
-
-    return ConversationEvent.objects.create(
-        user=user,
-        session=session,
-        ts=_safe_ts(ts),
-        event_type=normalized_type,
-        direction=normalized_direction,
-        actor_identifier=str(actor_identifier or "").strip(),
-        origin_transport=transport,
-        origin_message_id=message_id,
-        origin_chat_id=str(origin_chat_id or "").strip(),
-        payload=dict(payload or {}),
-        raw_payload=dict(raw_payload or {}),
-        trace_id=normalized_trace,
-    )
+    actor_identifier = str(actor_identifier or "").strip()
+    origin_chat_id = str(origin_chat_id or "").strip()
+    payload = dict(payload or {})
+    raw_payload = dict(raw_payload or {})
+
+    dual_write = bool(getattr(settings, "EVENT_LEDGER_DUAL_WRITE", False))
+    primary_write = bool(getattr(settings, "EVENT_PRIMARY_WRITE_PATH", False))
+    write_django = dual_write and not primary_write
+
+    row = None
+    if write_django:
+        dedup_row = None
+        if transport and message_id:
+            dedup_row = (
+                ConversationEvent.objects.filter(
+                    user=user,
+                    session=session,
+                    event_type=normalized_type,
+                    origin_transport=transport,
+                    origin_message_id=message_id,
+                )
+                .order_by("-created_at")
+                .first()
+            )
+        if dedup_row is not None:
+            row = dedup_row
+        else:
+            row = ConversationEvent.objects.create(
+                user=user,
+                session=session,
+                ts=safe_ts,
+                event_type=normalized_type,
+                direction=normalized_direction,
+                actor_identifier=actor_identifier,
+                origin_transport=transport,
+                origin_message_id=message_id,
+                origin_chat_id=origin_chat_id,
+                payload=payload,
+                raw_payload=raw_payload,
+                trace_id=normalized_trace,
+            )
+
+    try:
+        get_event_ledger_backend().upsert_event(
+            user_id=int(user.id),
+            person_id=str(session.identifier.person_id),
+            session_id=str(session.id),
+            event_type=normalized_type,
+            direction=normalized_direction,
+            ts=safe_ts,
+            actor_identifier=actor_identifier,
+            origin_transport=transport,
+            origin_message_id=message_id,
+            origin_chat_id=origin_chat_id,
+            payload=payload,
+            raw_payload=raw_payload,
+            trace_id=normalized_trace,
+        )
+    except Exception as exc:
+        if primary_write:
+            raise
+        log.warning(
+            "Event ledger manticore dual-write failed session=%s event_type=%s err=%s",
+            getattr(session, "id", "-"),
+            normalized_type,
+            exc,
+        )
+
+    return row
 
 
 async def append_event(**kwargs):
core/events/manticore.py (new file, 588 lines)
@@ -0,0 +1,588 @@
from __future__ import annotations

import hashlib
import json
import time
from urllib.parse import urlparse, urlunparse
from typing import Any

import requests
from django.conf import settings

from core.models import ConversationEvent
from core.util import logs
from core.events.behavior import parse_payload

log = logs.get_logger("event-manticore")


class ManticoreEventLedgerBackend:
    _table_ready_cache: dict[str, float] = {}
    _table_ready_ttl_seconds = 30.0

    def __init__(self):
        self.base_url = str(
            getattr(settings, "MANTICORE_HTTP_URL", "http://localhost:9308")
        ).rstrip("/")
        self.table = (
            str(getattr(settings, "MANTICORE_EVENT_TABLE", "gia_events")).strip()
            or "gia_events"
        )
        self.metrics_table = (
            str(getattr(settings, "MANTICORE_METRIC_TABLE", "gia_metrics")).strip()
            or "gia_metrics"
        )
        self.timeout_seconds = int(getattr(settings, "MANTICORE_HTTP_TIMEOUT", 5) or 5)
        self._table_cache_key = f"{self.base_url}|{self.table}"
        self._metrics_cache_key = f"{self.base_url}|{self.metrics_table}"

    def _candidate_base_urls(self) -> list[str]:
        parsed = urlparse(self.base_url)
        hostname = str(parsed.hostname or "").strip().lower()
        candidates = [self.base_url]
        if hostname in {"localhost", "127.0.0.1"}:
            replacement = parsed._replace(netloc=f"host.containers.internal:{parsed.port or 9308}")
            candidates.append(urlunparse(replacement))
        output = []
        seen = set()
        for value in candidates:
            key = str(value or "").strip()
            if not key or key in seen:
                continue
            seen.add(key)
            output.append(key)
        return output

    def _sql(self, query: str) -> dict[str, Any]:
        last_exc = None
        for base_url in self._candidate_base_urls():
            try:
                response = requests.post(
                    f"{base_url}/sql",
                    data={"mode": "raw", "query": query},
                    timeout=self.timeout_seconds,
                )
                response.raise_for_status()
                payload = response.json()
                if base_url != self.base_url:
                    self.base_url = base_url.rstrip("/")
                    self._table_cache_key = f"{self.base_url}|{self.table}"
                    self._metrics_cache_key = f"{self.base_url}|{self.metrics_table}"
                if isinstance(payload, list):
                    return payload[0] if payload else {}
                return dict(payload or {})
            except Exception as exc:
                last_exc = exc
        if last_exc is not None:
            raise last_exc
        return {}
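The `localhost` → `host.containers.internal` fallback in `_candidate_base_urls` is plain `urllib.parse` surgery; a standalone sketch (hypothetical `candidate_urls` helper):

```python
from urllib.parse import urlparse, urlunparse

def candidate_urls(base_url: str) -> list[str]:
    # When Manticore is addressed as localhost from inside a Podman
    # container, also try the host gateway alias as a fallback.
    parsed = urlparse(base_url)
    candidates = [base_url]
    if (parsed.hostname or "").lower() in {"localhost", "127.0.0.1"}:
        rewritten = parsed._replace(
            netloc=f"host.containers.internal:{parsed.port or 9308}"
        )
        candidates.append(urlunparse(rewritten))
    return candidates

print(candidate_urls("http://localhost:9308"))
# ['http://localhost:9308', 'http://host.containers.internal:9308']
```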
    def ensure_table(self) -> None:
        last_ready = float(
            self._table_ready_cache.get(self._table_cache_key, 0.0) or 0.0
        )
        if (time.time() - last_ready) <= float(self._table_ready_ttl_seconds):
            return
        self._sql(
            (
                f"CREATE TABLE IF NOT EXISTS {self.table} ("
                "id BIGINT,"
                "user_id BIGINT,"
                "person_id STRING,"
                "session_id STRING,"
                "transport STRING,"
                "kind STRING,"
                "direction STRING,"
                "ts BIGINT,"
                "ts_ref BIGINT,"
                "actor STRING,"
                "duration_ms BIGINT,"
                "abandoned INTEGER,"
                "revision INTEGER,"
                "payload JSON"
                ") engine='columnar' min_infix_len='2'"
            )
        )
        self._table_ready_cache[self._table_cache_key] = time.time()

    def ensure_metrics_table(self) -> None:
        last_ready = float(
            self._table_ready_cache.get(self._metrics_cache_key, 0.0) or 0.0
        )
        if (time.time() - last_ready) <= float(self._table_ready_ttl_seconds):
            return
        self._sql(
            (
                f"CREATE TABLE IF NOT EXISTS {self.metrics_table} ("
                "id BIGINT,"
                "user_id BIGINT,"
                "person_id STRING,"
                "window_days INTEGER,"
                "metric STRING,"
                "value_ms BIGINT,"
                "baseline_ms BIGINT,"
                "z_score FLOAT,"
                "sample_n INTEGER,"
                "computed_at BIGINT"
                ") engine='columnar'"
            )
        )
        self._table_ready_cache[self._metrics_cache_key] = time.time()

    def _escape(self, value: Any) -> str:
        text = str(value or "")
        return text.replace("\\", "\\\\").replace("'", "\\'")

    def _event_id(self, *, logical_key: str) -> int:
        digest = hashlib.blake2b(
            str(logical_key or "").encode("utf-8"),
            digest_size=8,
        ).digest()
        value = int.from_bytes(digest, byteorder="big", signed=False)
        return max(1, int(value))

    def _event_kind(self, event_type: str) -> str:
        normalized = str(event_type or "").strip().lower()
        return {
            "message_created": "message_sent",
            "delivery_receipt": "message_delivered",
            "read_receipt": "message_read",
            "typing_started": "composing_started",
            "typing_stopped": "composing_stopped",
            "composing_abandoned": "composing_abandoned",
            "presence_available": "presence_available",
            "presence_unavailable": "presence_unavailable",
        }.get(normalized, normalized)
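Document ids above are derived deterministically from the logical key, which is what makes `REPLACE INTO` behave as an idempotent upsert; string fields are backslash-escaped for Manticore SQL. A standalone sketch of both helpers:

```python
import hashlib

def event_id(logical_key: str) -> int:
    # 8-byte blake2b digest -> unsigned 64-bit id; same key, same id.
    digest = hashlib.blake2b(logical_key.encode("utf-8"), digest_size=8).digest()
    return max(1, int.from_bytes(digest, byteorder="big", signed=False))

def sql_escape(value: str) -> str:
    # Escape backslashes first, then single quotes.
    return value.replace("\\", "\\\\").replace("'", "\\'")

key = "1|session-9|message_created|out|whatsapp|m1|||1700000000|"
assert event_id(key) == event_id(key)  # deterministic
print(sql_escape("O'Brien"))           # O\'Brien
```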
    def _rows_from_sql_payload(self, payload: dict[str, Any]) -> list[dict]:
        data = payload.get("data") or payload.get("hits") or []
        if isinstance(data, dict):
            data = [data]
        rows = []
        for row in list(data or []):
            if isinstance(row, dict):
                rows.append(dict(row))
        return rows

    def _build_values(
        self,
        *,
        user_id: int,
        person_id: str,
        session_id: str,
        event_type: str,
        direction: str,
        ts: int,
        actor_identifier: str,
        origin_transport: str,
        origin_message_id: str,
        origin_chat_id: str,
        payload: dict | None,
        raw_payload: dict | None,
        trace_id: str,
    ) -> str:
        data = dict(payload or {})
        if raw_payload:
            data["raw_payload"] = dict(raw_payload)
        if trace_id:
            data["trace_id"] = str(trace_id)
        if origin_message_id:
            data["origin_message_id"] = str(origin_message_id)
        if origin_chat_id:
            data["origin_chat_id"] = str(origin_chat_id)
        data["legacy_event_type"] = str(event_type or "").strip().lower()

        ts_ref = 0
        try:
            ts_ref = int(data.get("message_ts") or data.get("source_ts") or 0)
        except Exception:
            ts_ref = 0
        try:
            duration_ms = int(data.get("duration_ms") or 0)
        except Exception:
            duration_ms = 0
        try:
            abandoned = 1 if bool(data.get("abandoned")) else 0
        except Exception:
            abandoned = 0
        try:
            revision = int(data.get("revision") or 0)
        except Exception:
            revision = 0

        logical_key = "|".join(
            [
                str(user_id),
                str(session_id),
                str(event_type or "").strip().lower(),
                str(direction or "").strip().lower(),
                str(origin_transport or "").strip().lower(),
                str(origin_message_id or "").strip(),
                str(origin_chat_id or "").strip(),
                str(actor_identifier or "").strip(),
                str(int(ts or 0)),
                str(trace_id or "").strip(),
            ]
        )
        doc_id = self._event_id(logical_key=logical_key)
        payload_json = json.dumps(data, separators=(",", ":"), sort_keys=True)
        return (
            f"({doc_id},{int(user_id)},'{self._escape(person_id)}',"
            f"'{self._escape(session_id)}','{self._escape(origin_transport)}',"
            f"'{self._escape(self._event_kind(event_type))}','{self._escape(direction)}',"
            f"{int(ts)},{ts_ref},'{self._escape(actor_identifier)}',{duration_ms},"
            f"{abandoned},{revision},'{self._escape(payload_json)}')"
        )

    def upsert_event(
        self,
        *,
        user_id: int,
        person_id: str,
        session_id: str,
        event_type: str,
        direction: str,
        ts: int,
        actor_identifier: str = "",
        origin_transport: str = "",
        origin_message_id: str = "",
        origin_chat_id: str = "",
        payload: dict | None = None,
        raw_payload: dict | None = None,
        trace_id: str = "",
    ) -> None:
        self.ensure_table()
        values = self._build_values(
            user_id=user_id,
            person_id=person_id,
            session_id=session_id,
            event_type=event_type,
            direction=direction,
            ts=ts,
            actor_identifier=actor_identifier,
            origin_transport=origin_transport,
            origin_message_id=origin_message_id,
            origin_chat_id=origin_chat_id,
            payload=payload,
            raw_payload=raw_payload,
            trace_id=trace_id,
        )
        self._sql(
            f"REPLACE INTO {self.table} "
            "(id,user_id,person_id,session_id,transport,kind,direction,ts,ts_ref,actor,duration_ms,abandoned,revision,payload) "
            f"VALUES {values}"
        )

    def query_rows(self, query: str) -> list[dict]:
        return self._rows_from_sql_payload(self._sql(query))

    def list_event_targets(self, *, user_id: int | None = None) -> list[dict]:
        filters = []
        if user_id is not None:
            filters.append(f"user_id={int(user_id)}")
        where_clause = f" WHERE {' AND '.join(filters)}" if filters else ""
        return self.query_rows(
            f"SELECT user_id, person_id FROM {self.table}{where_clause} "
            "GROUP BY user_id, person_id"
        )

    def fetch_events(
        self,
        *,
        user_id: int,
        person_id: str,
        since_ts: int,
    ) -> list[dict]:
|
||||||
|
return self.query_rows(
|
||||||
|
f"SELECT user_id, person_id, session_id, transport, kind, direction, ts, ts_ref, actor, duration_ms, abandoned, revision, payload "
|
||||||
|
f"FROM {self.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
f"AND person_id='{self._escape(person_id)}' "
|
||||||
|
f"AND ts>={int(since_ts)} "
|
||||||
|
"ORDER BY ts ASC"
|
||||||
|
)
|
||||||
|
|
||||||
|
def _metric_doc_id(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
user_id: int,
|
||||||
|
person_id: str,
|
||||||
|
window_days: int,
|
||||||
|
metric: str,
|
||||||
|
) -> int:
|
||||||
|
digest = hashlib.blake2b(
|
||||||
|
f"{int(user_id)}|{person_id}|{int(window_days)}|{metric}".encode("utf-8"),
|
||||||
|
digest_size=8,
|
||||||
|
).digest()
|
||||||
|
return max(1, int.from_bytes(digest, byteorder="big", signed=False))
|
||||||
|
|
||||||
|
def upsert_metric(
|
||||||
|
self,
|
||||||
|
*,
|
||||||
|
user_id: int,
|
||||||
|
person_id: str,
|
||||||
|
window_days: int,
|
||||||
|
metric: str,
|
||||||
|
value_ms: int,
|
||||||
|
baseline_ms: int,
|
||||||
|
z_score: float,
|
||||||
|
sample_n: int,
|
||||||
|
computed_at: int,
|
||||||
|
) -> None:
|
||||||
|
self.ensure_metrics_table()
|
||||||
|
doc_id = self._metric_doc_id(
|
||||||
|
user_id=user_id,
|
||||||
|
person_id=person_id,
|
||||||
|
window_days=window_days,
|
||||||
|
metric=metric,
|
||||||
|
)
|
||||||
|
self._sql(
|
||||||
|
f"REPLACE INTO {self.metrics_table} "
|
||||||
|
"(id,user_id,person_id,window_days,metric,value_ms,baseline_ms,z_score,sample_n,computed_at) "
|
||||||
|
f"VALUES ({doc_id},{int(user_id)},'{self._escape(person_id)}',{int(window_days)},"
|
||||||
|
f"'{self._escape(metric)}',{int(value_ms)},{int(baseline_ms)},"
|
||||||
|
f"{float(z_score)},{int(sample_n)},{int(computed_at)})"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_event_ledger_backend() -> ManticoreEventLedgerBackend:
|
||||||
|
return ManticoreEventLedgerBackend()
|
||||||
|
|
||||||
|
|
||||||
|
def upsert_conversation_event(event: ConversationEvent) -> None:
|
||||||
|
session = event.session
|
||||||
|
identifier = session.identifier
|
||||||
|
get_event_ledger_backend().upsert_event(
|
||||||
|
user_id=int(event.user_id),
|
||||||
|
person_id=str(identifier.person_id),
|
||||||
|
session_id=str(session.id),
|
||||||
|
event_type=str(event.event_type or ""),
|
||||||
|
direction=str(event.direction or "system"),
|
||||||
|
ts=int(event.ts or 0),
|
||||||
|
actor_identifier=str(event.actor_identifier or ""),
|
||||||
|
origin_transport=str(event.origin_transport or ""),
|
||||||
|
origin_message_id=str(event.origin_message_id or ""),
|
||||||
|
origin_chat_id=str(event.origin_chat_id or ""),
|
||||||
|
payload=dict(event.payload or {}),
|
||||||
|
raw_payload=dict(event.raw_payload or {}),
|
||||||
|
trace_id=str(event.trace_id or ""),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_behavioral_availability_stats(*, user_id: int) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
return backend.query_rows(
|
||||||
|
f"SELECT person_id, transport, "
|
||||||
|
"COUNT(*) AS total_events, "
|
||||||
|
"SUM(IF(kind IN ('presence_available','presence_unavailable'),1,0)) AS presence_events, "
|
||||||
|
"SUM(IF(kind='message_read',1,0)) AS read_events, "
|
||||||
|
"SUM(IF(kind IN ('composing_started','composing_stopped'),1,0)) AS typing_events, "
|
||||||
|
"SUM(IF(kind='message_sent',1,0)) AS message_events, "
|
||||||
|
"SUM(IF(kind='composing_abandoned',1,0)) AS abandoned_events, "
|
||||||
|
"MAX(ts) AS last_event_ts "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
"GROUP BY person_id, transport "
|
||||||
|
"ORDER BY total_events DESC, person_id ASC, transport ASC"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_behavioral_latest_states(
|
||||||
|
*,
|
||||||
|
user_id: int,
|
||||||
|
person_ids: list[str],
|
||||||
|
transport: str = "",
|
||||||
|
) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
cleaned_ids = [
|
||||||
|
str(value or "").strip()
|
||||||
|
for value in list(person_ids or [])
|
||||||
|
if str(value or "").strip()
|
||||||
|
]
|
||||||
|
if not cleaned_ids:
|
||||||
|
return []
|
||||||
|
id_clause = ",".join(f"'{backend._escape(value)}'" for value in cleaned_ids)
|
||||||
|
transport_clause = ""
|
||||||
|
if str(transport or "").strip():
|
||||||
|
transport_clause = (
|
||||||
|
f" AND transport='{backend._escape(str(transport or '').strip().lower())}'"
|
||||||
|
)
|
||||||
|
return backend.query_rows(
|
||||||
|
f"SELECT person_id, transport, kind, ts "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
f"AND person_id IN ({id_clause})"
|
||||||
|
f"{transport_clause} "
|
||||||
|
"ORDER BY person_id ASC, ts DESC"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_behavioral_events_for_range(
|
||||||
|
*,
|
||||||
|
user_id: int,
|
||||||
|
person_id: str,
|
||||||
|
start_ts: int,
|
||||||
|
end_ts: int,
|
||||||
|
transport: str = "",
|
||||||
|
) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
transport_clause = ""
|
||||||
|
if str(transport or "").strip():
|
||||||
|
transport_clause = (
|
||||||
|
f" AND transport='{backend._escape(str(transport or '').strip().lower())}'"
|
||||||
|
)
|
||||||
|
return backend.query_rows(
|
||||||
|
f"SELECT person_id, session_id, transport, kind, direction, ts, payload "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
f"AND person_id='{backend._escape(str(person_id or '').strip())}' "
|
||||||
|
f"AND ts>={int(start_ts)} AND ts<={int(end_ts)}"
|
||||||
|
f"{transport_clause} "
|
||||||
|
"ORDER BY ts ASC"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def get_recent_event_rows(
|
||||||
|
*,
|
||||||
|
minutes: int = 120,
|
||||||
|
service: str = "",
|
||||||
|
user_id: str = "",
|
||||||
|
limit: int = 200,
|
||||||
|
) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
cutoff_ts = int(time.time() * 1000) - (max(1, int(minutes)) * 60 * 1000)
|
||||||
|
where = [f"ts>={cutoff_ts}"]
|
||||||
|
if service:
|
||||||
|
where.append(f"transport='{backend._escape(str(service).strip().lower())}'")
|
||||||
|
if user_id:
|
||||||
|
where.append(f"user_id={int(user_id)}")
|
||||||
|
rows = backend.query_rows(
|
||||||
|
f"SELECT user_id, session_id, ts, kind, direction, transport, payload "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE {' AND '.join(where)} "
|
||||||
|
f"ORDER BY ts DESC "
|
||||||
|
f"LIMIT {max(1, min(int(limit), 500))}"
|
||||||
|
)
|
||||||
|
output = []
|
||||||
|
for row in list(rows or []):
|
||||||
|
payload = parse_payload(row.get("payload"))
|
||||||
|
legacy_event_type = str(payload.get("legacy_event_type") or "").strip().lower()
|
||||||
|
output.append(
|
||||||
|
{
|
||||||
|
"id": "",
|
||||||
|
"user_id": int(row.get("user_id") or 0),
|
||||||
|
"session_id": str(row.get("session_id") or ""),
|
||||||
|
"ts": int(row.get("ts") or 0),
|
||||||
|
"event_type": legacy_event_type or str(row.get("kind") or ""),
|
||||||
|
"kind": str(row.get("kind") or ""),
|
||||||
|
"direction": str(row.get("direction") or ""),
|
||||||
|
"origin_transport": str(row.get("transport") or ""),
|
||||||
|
"trace_id": str(payload.get("trace_id") or ""),
|
||||||
|
}
|
||||||
|
)
|
||||||
|
return output
|
||||||
|
|
||||||
|
|
||||||
|
def count_behavioral_events(*, user_id: int) -> int:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
rows = backend.query_rows(
|
||||||
|
f"SELECT COUNT(*) AS total_events "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)}"
|
||||||
|
)
|
||||||
|
if not rows:
|
||||||
|
return 0
|
||||||
|
try:
|
||||||
|
return int((rows[0] or {}).get("total_events") or 0)
|
||||||
|
except Exception:
|
||||||
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
def get_trace_ids(*, user_id: int, limit: int = 120) -> list[str]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
rows = backend.query_rows(
|
||||||
|
f"SELECT payload "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
"ORDER BY ts DESC "
|
||||||
|
f"LIMIT {max(1, min(int(limit) * 6, 1000))}"
|
||||||
|
)
|
||||||
|
seen = set()
|
||||||
|
output = []
|
||||||
|
for row in list(rows or []):
|
||||||
|
payload = parse_payload(row.get("payload"))
|
||||||
|
trace_id = str(payload.get("trace_id") or "").strip()
|
||||||
|
if not trace_id or trace_id in seen:
|
||||||
|
continue
|
||||||
|
seen.add(trace_id)
|
||||||
|
output.append(trace_id)
|
||||||
|
if len(output) >= max(1, min(int(limit), 500)):
|
||||||
|
break
|
||||||
|
return output
|
||||||
|
|
||||||
|
|
||||||
|
def get_trace_event_rows(*, user_id: int, trace_id: str, limit: int = 500) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
rows = backend.query_rows(
|
||||||
|
f"SELECT user_id, session_id, ts, kind, direction, transport, payload "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
"ORDER BY ts ASC "
|
||||||
|
f"LIMIT {max(1, min(int(limit) * 8, 5000))}"
|
||||||
|
)
|
||||||
|
output = []
|
||||||
|
target = str(trace_id or "").strip()
|
||||||
|
for row in list(rows or []):
|
||||||
|
payload = parse_payload(row.get("payload"))
|
||||||
|
if str(payload.get("trace_id") or "").strip() != target:
|
||||||
|
continue
|
||||||
|
output.append(
|
||||||
|
{
|
||||||
|
"id": "",
|
||||||
|
"ts": int(row.get("ts") or 0),
|
||||||
|
"event_type": str(
|
||||||
|
payload.get("legacy_event_type") or row.get("kind") or ""
|
||||||
|
).strip(),
|
||||||
|
"kind": str(row.get("kind") or "").strip(),
|
||||||
|
"direction": str(row.get("direction") or "").strip(),
|
||||||
|
"session_id": str(row.get("session_id") or "").strip(),
|
||||||
|
"origin_transport": str(row.get("transport") or "").strip(),
|
||||||
|
"origin_message_id": str(payload.get("origin_message_id") or "").strip(),
|
||||||
|
"payload": payload,
|
||||||
|
"trace_id": target,
|
||||||
|
}
|
||||||
|
)
|
||||||
|
if len(output) >= max(1, min(int(limit), 500)):
|
||||||
|
break
|
||||||
|
return output
|
||||||
|
|
||||||
|
|
||||||
|
def get_session_event_rows(*, user_id: int, session_id: str, limit: int = 2000) -> list[dict]:
|
||||||
|
backend = get_event_ledger_backend()
|
||||||
|
rows = backend.query_rows(
|
||||||
|
f"SELECT user_id, session_id, ts, kind, direction, transport, actor, payload "
|
||||||
|
f"FROM {backend.table} "
|
||||||
|
f"WHERE user_id={int(user_id)} "
|
||||||
|
f"AND session_id='{backend._escape(str(session_id or '').strip())}' "
|
||||||
|
"ORDER BY ts ASC "
|
||||||
|
f"LIMIT {max(1, min(int(limit), 5000))}"
|
||||||
|
)
|
||||||
|
output = []
|
||||||
|
for row in list(rows or []):
|
||||||
|
payload = parse_payload(row.get("payload"))
|
||||||
|
output.append(
|
||||||
|
{
|
||||||
|
"ts": int(row.get("ts") or 0),
|
||||||
|
"event_type": str(
|
||||||
|
payload.get("legacy_event_type") or row.get("kind") or ""
|
||||||
|
).strip(),
|
||||||
|
"kind": str(row.get("kind") or "").strip(),
|
||||||
|
"direction": str(row.get("direction") or "").strip(),
|
||||||
|
"session_id": str(row.get("session_id") or "").strip(),
|
||||||
|
"origin_transport": str(row.get("transport") or "").strip(),
|
||||||
|
"actor_identifier": str(row.get("actor") or "").strip(),
|
||||||
|
"origin_message_id": str(payload.get("origin_message_id") or "").strip(),
|
||||||
|
"payload": payload,
|
||||||
|
}
|
||||||
|
)
|
||||||
|
return output
|
||||||
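The `_metric_doc_id` helper above derives a stable 64-bit Manticore document id from the metric's logical key, so recomputing a metric `REPLACE`s the same row instead of accumulating duplicates. A standalone sketch of the same scheme (plain Python; the module-level function name here is illustrative, not part of the codebase):

```python
import hashlib


def metric_doc_id(user_id: int, person_id: str, window_days: int, metric: str) -> int:
    # Same scheme as _metric_doc_id above: blake2b over the "|"-joined logical
    # key, truncated to 8 bytes, clamped to >= 1 so it is a valid document id.
    digest = hashlib.blake2b(
        f"{int(user_id)}|{person_id}|{int(window_days)}|{metric}".encode("utf-8"),
        digest_size=8,
    ).digest()
    return max(1, int.from_bytes(digest, byteorder="big", signed=False))


# Deterministic: the same (user, person, window, metric) always maps to one id.
assert metric_doc_id(1, "p1", 30, "reply_latency") == metric_doc_id(1, "p1", 30, "reply_latency")
```

Because the id is a pure function of the key, `upsert_metric` stays idempotent without any read-before-write round trip to Manticore.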
@@ -2,6 +2,7 @@ from __future__ import annotations

 from dataclasses import dataclass

+from core.events.manticore import get_session_event_rows
 from core.models import ChatSession, ConversationEvent, Message

@@ -59,27 +60,56 @@ def _normalize_reactions(rows: list[dict] | None) -> list[dict]:
     )


-def project_session_from_events(session: ChatSession) -> list[dict]:
-    rows = list(
-        ConversationEvent.objects.filter(
-            user=session.user,
-            session=session,
-        ).order_by("ts", "created_at")
+def _event_rows_for_session(session: ChatSession):
+    try:
+        rows = get_session_event_rows(
+            user_id=int(session.user_id),
+            session_id=str(session.id),
+            limit=2000,
+        )
+    except Exception:
+        rows = []
+    if rows:
+        return rows, "manticore"
+    return (
+        list(
+            ConversationEvent.objects.filter(
+                user=session.user,
+                session=session,
+            ).order_by("ts", "created_at")
+        ),
+        "django",
     )


+def project_session_from_events(session: ChatSession) -> list[dict]:
+    rows, _source = _event_rows_for_session(session)
+
     projected: dict[str, _ProjectedMessage] = {}
     order: list[str] = []

     for event in rows:
-        payload = dict(event.payload or {})
-        event_type = str(event.event_type or "").strip().lower()
+        is_dict = isinstance(event, dict)
+        payload = dict(
+            (event.get("payload") if is_dict else getattr(event, "payload", {})) or {}
+        )
+        event_type = str(
+            (event.get("event_type") if is_dict else getattr(event, "event_type", ""))
+            or ""
+        ).strip().lower()
         message_id = str(
             payload.get("message_id") or payload.get("target_message_id") or ""
         ).strip()

         if event_type == "message_created":
             message_id = str(
-                payload.get("message_id") or event.origin_message_id or ""
+                payload.get("message_id")
+                or (
+                    event.get("origin_message_id")
+                    if is_dict
+                    else getattr(event, "origin_message_id", "")
+                )
+                or ""
             ).strip()
             if not message_id:
                 continue
@@ -88,10 +118,14 @@ def project_session_from_events(session: ChatSession) -> list[dict]:
                 state = _ProjectedMessage(message_id=message_id)
                 projected[message_id] = state
                 order.append(message_id)
-            state.ts = _safe_int(payload.get("message_ts"), _safe_int(event.ts))
+            state.ts = _safe_int(
+                payload.get("message_ts"),
+                _safe_int(event.get("ts") if is_dict else getattr(event, "ts", 0)),
+            )
             state.text = str(payload.get("text") or state.text or "")
             delivered_default = _safe_int(
-                payload.get("delivered_ts"), _safe_int(event.ts)
+                payload.get("delivered_ts"),
+                _safe_int(event.get("ts") if is_dict else getattr(event, "ts", 0)),
             )
             if state.delivered_ts is None:
                 state.delivered_ts = delivered_default or None
@@ -102,7 +136,10 @@ def project_session_from_events(session: ChatSession) -> list[dict]:
         state = projected[message_id]

         if event_type == "read_receipt":
-            read_ts = _safe_int(payload.get("read_ts"), _safe_int(event.ts))
+            read_ts = _safe_int(
+                payload.get("read_ts"),
+                _safe_int(event.get("ts") if is_dict else getattr(event, "ts", 0)),
+            )
             if read_ts > 0:
                 if state.read_ts is None:
                     state.read_ts = read_ts
@@ -114,11 +151,27 @@ def project_session_from_events(session: ChatSession) -> list[dict]:

         if event_type in {"reaction_added", "reaction_removed"}:
             source_service = (
-                str(payload.get("source_service") or event.origin_transport or "")
+                str(
+                    payload.get("source_service")
+                    or (
+                        event.get("origin_transport")
+                        if is_dict
+                        else getattr(event, "origin_transport", "")
+                    )
+                    or ""
+                )
                 .strip()
                 .lower()
             )
-            actor = str(payload.get("actor") or event.actor_identifier or "").strip()
+            actor = str(
+                payload.get("actor")
+                or (
+                    event.get("actor_identifier")
+                    if is_dict
+                    else getattr(event, "actor_identifier", "")
+                )
+                or ""
+            ).strip()
             emoji = str(payload.get("emoji") or "").strip()
             if not source_service and not actor and not emoji:
                 continue
|
|||||||
148
core/events/shadow.py
Normal file
148
core/events/shadow.py
Normal file
@@ -0,0 +1,148 @@
|
|||||||
|
from __future__ import annotations

from django.db.models import Count, Max, Q

from core.models import ConversationEvent, Person, User


def _kind_from_event_type(event_type: str) -> str:
    normalized = str(event_type or "").strip().lower()
    return {
        "message_created": "message_sent",
        "delivery_receipt": "message_delivered",
        "read_receipt": "message_read",
        "typing_started": "composing_started",
        "typing_stopped": "composing_stopped",
        "composing_abandoned": "composing_abandoned",
        "presence_available": "presence_available",
        "presence_unavailable": "presence_unavailable",
    }.get(normalized, normalized)


def get_shadow_behavioral_availability_stats(*, user: User) -> list[dict]:
    person_map = {
        str(row["id"]): str(row["name"] or "")
        for row in Person.objects.filter(user=user).values("id", "name")
    }
    rows = (
        ConversationEvent.objects.filter(
            user=user,
            session__identifier__person__isnull=False,
        )
        .values("session__identifier__person_id", "origin_transport")
        .annotate(
            total_events=Count("id"),
            presence_events=Count(
                "id",
                filter=Q(event_type__in=["presence_available", "presence_unavailable"]),
            ),
            read_events=Count("id", filter=Q(event_type="read_receipt")),
            typing_events=Count(
                "id",
                filter=Q(event_type__in=["typing_started", "typing_stopped"]),
            ),
            message_events=Count("id", filter=Q(event_type="message_created")),
            abandoned_events=Count("id", filter=Q(event_type="composing_abandoned")),
            last_event_ts=Max("ts"),
        )
        .order_by("-total_events", "session__identifier__person_id", "origin_transport")
    )
    output = []
    for row in rows:
        person_id = str(row.get("session__identifier__person_id") or "").strip()
        output.append(
            {
                "person_id": person_id,
                "person_name": person_map.get(person_id, person_id or "-"),
                "service": str(row.get("origin_transport") or "").strip().lower(),
                "total_events": int(row.get("total_events") or 0),
                "presence_events": int(row.get("presence_events") or 0),
                "read_events": int(row.get("read_events") or 0),
                "typing_events": int(row.get("typing_events") or 0),
                "message_events": int(row.get("message_events") or 0),
                "abandoned_events": int(row.get("abandoned_events") or 0),
                "last_event_ts": int(row.get("last_event_ts") or 0),
            }
        )
    return output


def get_shadow_behavioral_latest_states(
    *, user: User, person_ids: list[str], transport: str = ""
) -> list[dict]:
    queryset = ConversationEvent.objects.filter(
        user=user,
        session__identifier__person_id__in=[str(value) for value in person_ids],
        event_type__in=[
            "message_created",
            "delivery_receipt",
            "read_receipt",
            "typing_started",
            "typing_stopped",
            "composing_abandoned",
            "presence_available",
            "presence_unavailable",
        ],
    ).select_related("session__identifier")
    if transport:
        queryset = queryset.filter(origin_transport=str(transport).strip().lower())
    rows = []
    seen = set()
    for row in queryset.order_by(
        "session__identifier__person_id", "-ts", "-created_at"
    )[:500]:
        person_id = str(getattr(row.session.identifier, "person_id", "") or "").strip()
        if not person_id or person_id in seen:
            continue
        seen.add(person_id)
        rows.append(
            {
                "person_id": person_id,
                "transport": str(row.origin_transport or "").strip().lower(),
                "kind": _kind_from_event_type(row.event_type),
                "ts": int(row.ts or 0),
            }
        )
    return rows


def get_shadow_behavioral_events_for_range(
    *,
    user: User,
    person_id: str,
    start_ts: int,
    end_ts: int,
    transport: str = "",
) -> list[dict]:
    queryset = ConversationEvent.objects.filter(
        user=user,
        session__identifier__person_id=str(person_id or "").strip(),
        ts__gte=int(start_ts),
        ts__lte=int(end_ts),
        event_type__in=[
            "message_created",
            "delivery_receipt",
            "read_receipt",
            "typing_started",
            "typing_stopped",
            "composing_abandoned",
            "presence_available",
            "presence_unavailable",
        ],
    ).order_by("ts", "created_at")
    if transport:
        queryset = queryset.filter(origin_transport=str(transport).strip().lower())
    return [
        {
            "person_id": str(person_id or "").strip(),
            "session_id": str(row.session_id or ""),
            "transport": str(row.origin_transport or "").strip().lower(),
            "kind": _kind_from_event_type(row.event_type),
            "direction": str(row.direction or "").strip().lower(),
            "ts": int(row.ts or 0),
            "payload": dict(row.payload or {}),
        }
        for row in queryset[:1000]
    ]
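The shadow fallback above rewrites legacy `ConversationEvent.event_type` values into the Manticore `kind` vocabulary, with unknown types passing through unchanged. A minimal standalone check of that mapping (a trimmed copy of `_kind_from_event_type`, runnable without Django):

```python
def kind_from_event_type(event_type: str) -> str:
    # Mirrors _kind_from_event_type above: known legacy types are renamed,
    # anything else passes through normalized (trimmed, lowercased).
    normalized = str(event_type or "").strip().lower()
    return {
        "message_created": "message_sent",
        "delivery_receipt": "message_delivered",
        "read_receipt": "message_read",
        "typing_started": "composing_started",
        "typing_stopped": "composing_stopped",
    }.get(normalized, normalized)


assert kind_from_event_type(" Read_Receipt ") == "message_read"
assert kind_from_event_type("presence_available") == "presence_available"
```

The pass-through default matters: event types already expressed in the new vocabulary (such as `composing_abandoned`) survive the shadow path without a mapping entry.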
@@ -4,13 +4,16 @@ from typing import Iterable

 from django.core.management.base import BaseCommand

+from core.events.ledger import append_event_sync
 from core.models import Message
-from core.presence import AvailabilitySignal, record_inferred_signal
 from core.presence.inference import now_ms


 class Command(BaseCommand):
-    help = "Backfill inferred contact availability events from historical message/read-receipt activity."
+    help = (
+        "Backfill behavioral event ledger rows from historical message and "
+        "read-receipt activity."
+    )

     def add_arguments(self, parser):
         parser.add_argument("--days", type=int, default=30)
@@ -39,17 +42,18 @@ class Command(BaseCommand):
         user_filter = str(options.get("user_id") or "").strip()
         dry_run = bool(options.get("dry_run"))

-        created = 0
+        indexed = 0
         scanned = 0

         for msg in self._iter_messages(
             days=days, limit=limit, service=service_filter, user_id=user_filter
         ):
             scanned += 1
-            identifier = getattr(getattr(msg, "session", None), "identifier", None)
+            session = getattr(msg, "session", None)
+            identifier = getattr(session, "identifier", None)
             person = getattr(identifier, "person", None)
             user = getattr(msg, "user", None)
-            if not identifier or not person or not user:
+            if not session or not identifier or not person or not user:
                 continue

             service = (
@@ -60,76 +64,65 @@ class Command(BaseCommand):
             if not service:
                 continue

-            base_ts = int(getattr(msg, "ts", 0) or 0)
-            message_author = (
-                str(getattr(msg, "custom_author", "") or "").strip().upper()
-            )
-            outgoing = message_author in {"USER", "BOT"}
+            author = str(getattr(msg, "custom_author", "") or "").strip().upper()
+            outgoing = author in {"USER", "BOT"}
+            message_id = str(
+                getattr(msg, "source_message_id", "") or f"django-message-{msg.id}"
+            ).strip()

-            candidates = []
-            if base_ts > 0:
-                candidates.append(
-                    {
-                        "source_kind": "message_out" if outgoing else "message_in",
-                        "availability_state": "available",
-                        "confidence": 0.65 if outgoing else 0.75,
-                        "ts": base_ts,
-                        "payload": {
-                            "origin": "backfill_contact_availability",
-                            "message_id": str(msg.id),
-                            "inferred_from": "message_activity",
-                        },
-                    }
+            if not dry_run:
+                append_event_sync(
+                    user=user,
+                    session=session,
+                    ts=int(getattr(msg, "ts", 0) or 0),
+                    event_type="message_created",
+                    direction="out" if outgoing else "in",
+                    actor_identifier=str(
+                        getattr(msg, "sender_uuid", "") or identifier.identifier or ""
+                    ),
+                    origin_transport=service,
+                    origin_message_id=message_id,
+                    origin_chat_id=str(getattr(msg, "source_chat_id", "") or ""),
+                    payload={
+                        "origin": "backfill_contact_availability",
+                        "message_id": str(msg.id),
+                        "text": str(getattr(msg, "text", "") or ""),
+                        "outgoing": outgoing,
+                    },
                 )
+            indexed += 1

             read_ts = int(getattr(msg, "read_ts", 0) or 0)
-            if read_ts > 0:
-                candidates.append(
-                    {
-                        "source_kind": "read_receipt",
-                        "availability_state": "available",
-                        "confidence": 0.95,
-                        "ts": read_ts,
-                        "payload": {
-                            "origin": "backfill_contact_availability",
-                            "message_id": str(msg.id),
-                            "inferred_from": "read_receipt",
-                            "read_by": str(
-                                getattr(msg, "read_by_identifier", "") or ""
-                            ),
-                        },
-                    }
-                )
-
-            for row in candidates:
-                exists = user.contact_availability_events.filter(
-                    person=person,
-                    person_identifier=identifier,
-                    service=service,
-                    source_kind=row["source_kind"],
-                    ts=int(row["ts"]),
-                ).exists()
-                if exists:
-                    continue
-                created += 1
-                if dry_run:
-                    continue
-                record_inferred_signal(
-                    AvailabilitySignal(
-                        user=user,
-                        person=person,
-                        person_identifier=identifier,
-                        service=service,
-                        source_kind=row["source_kind"],
-                        availability_state=row["availability_state"],
-                        confidence=float(row["confidence"]),
-                        ts=int(row["ts"]),
-                        payload=dict(row["payload"]),
-                    )
+            if read_ts <= 0:
+                continue
+            if not dry_run:
+                append_event_sync(
+                    user=user,
+                    session=session,
+                    ts=read_ts,
+                    event_type="read_receipt",
+                    direction="system",
+                    actor_identifier=str(
+                        getattr(msg, "read_by_identifier", "") or identifier.identifier
+                    ),
+                    origin_transport=service,
+                    origin_message_id=message_id,
+                    origin_chat_id=str(getattr(msg, "source_chat_id", "") or ""),
+                    payload={
+                        "origin": "backfill_contact_availability",
+                        "message_id": str(msg.id),
+                        "message_ts": int(getattr(msg, "ts", 0) or 0),
+                        "read_by": str(
+                            getattr(msg, "read_by_identifier", "") or ""
+                        ).strip(),
+                    },
                 )
+            indexed += 1

         self.stdout.write(
             self.style.SUCCESS(
-                f"backfill_contact_availability complete scanned={scanned} created={created} dry_run={dry_run} days={days} limit={limit}"
+                "backfill_contact_availability complete "
+                f"scanned={scanned} indexed={indexed} dry_run={dry_run} "
+                f"days={days} limit={limit}"
             )
         )
@@ -5,12 +5,46 @@ import time

 from django.core.management.base import BaseCommand, CommandError

+from core.events.manticore import get_recent_event_rows
 from core.models import ConversationEvent


 class Command(BaseCommand):
     help = "Quick non-mutating sanity check for recent canonical event writes."

+    def _recent_rows(self, *, minutes: int, service: str, user_id: str, limit: int):
+        cutoff_ts = int(time.time() * 1000) - (minutes * 60 * 1000)
+        queryset = ConversationEvent.objects.filter(ts__gte=cutoff_ts).order_by("-ts")
+        if service:
+            queryset = queryset.filter(origin_transport=service)
+        if user_id:
+            queryset = queryset.filter(user_id=user_id)
+
+        rows = list(
+            queryset.values(
+                "id",
+                "user_id",
+                "session_id",
+                "ts",
+                "event_type",
+                "direction",
+                "origin_transport",
+                "trace_id",
+            )[:limit]
+        )
+        if rows:
+            return rows, "django"
+        try:
+            manticore_rows = get_recent_event_rows(
+                minutes=minutes,
+                service=service,
+                user_id=user_id,
+                limit=limit,
+            )
+        except Exception:
+            manticore_rows = []
+        return manticore_rows, "manticore" if manticore_rows else "django"
+
     def add_arguments(self, parser):
         parser.add_argument("--minutes", type=int, default=120)
         parser.add_argument("--service", default="")
@@ -34,24 +68,11 @@ class Command(BaseCommand):
|
|||||||
if item.strip()
|
if item.strip()
|
||||||
]
|
]
|
||||||
|
|
||||||
cutoff_ts = int(time.time() * 1000) - (minutes * 60 * 1000)
|
rows, data_source = self._recent_rows(
|
||||||
queryset = ConversationEvent.objects.filter(ts__gte=cutoff_ts).order_by("-ts")
|
minutes=minutes,
|
||||||
if service:
|
service=service,
|
||||||
queryset = queryset.filter(origin_transport=service)
|
user_id=user_id,
|
||||||
if user_id:
|
limit=limit,
|
||||||
queryset = queryset.filter(user_id=user_id)
|
|
||||||
|
|
||||||
rows = list(
|
|
||||||
queryset.values(
|
|
||||||
"id",
|
|
||||||
"user_id",
|
|
||||||
"session_id",
|
|
||||||
"ts",
|
|
||||||
"event_type",
|
|
||||||
"direction",
|
|
||||||
"origin_transport",
|
|
||||||
"trace_id",
|
|
||||||
)[:limit]
|
|
||||||
)
|
)
|
||||||
event_type_counts = {}
|
event_type_counts = {}
|
||||||
for row in rows:
|
for row in rows:
|
||||||
@@ -67,6 +88,7 @@ class Command(BaseCommand):
|
|||||||
"minutes": minutes,
|
"minutes": minutes,
|
||||||
"service": service,
|
"service": service,
|
||||||
"user_id": user_id,
|
"user_id": user_id,
|
||||||
|
"data_source": data_source,
|
||||||
"count": len(rows),
|
"count": len(rows),
|
||||||
"event_type_counts": event_type_counts,
|
"event_type_counts": event_type_counts,
|
||||||
"required_types": required_types,
|
"required_types": required_types,
|
||||||
@@ -79,7 +101,7 @@ class Command(BaseCommand):
|
|||||||
return
|
return
|
||||||
|
|
||||||
self.stdout.write(
|
self.stdout.write(
|
||||||
f"event-ledger-smoke minutes={minutes} service={service or '-'} user={user_id or '-'} count={len(rows)}"
|
f"event-ledger-smoke minutes={minutes} service={service or '-'} user={user_id or '-'} source={data_source} count={len(rows)}"
|
||||||
)
|
)
|
||||||
self.stdout.write(f"event_type_counts={event_type_counts}")
|
self.stdout.write(f"event_type_counts={event_type_counts}")
|
||||||
if required_types:
|
if required_types:
|
||||||
@@ -88,7 +110,7 @@ class Command(BaseCommand):
|
|||||||
)
|
)
|
||||||
|
|
||||||
if fail_if_empty and len(rows) == 0:
|
if fail_if_empty and len(rows) == 0:
|
||||||
raise CommandError("No recent ConversationEvent rows found.")
|
raise CommandError("No recent canonical event rows found.")
|
||||||
if missing_required_types:
|
if missing_required_types:
|
||||||
raise CommandError(
|
raise CommandError(
|
||||||
"Missing required event types: " + ", ".join(missing_required_types)
|
"Missing required event types: " + ", ".join(missing_required_types)
|
||||||
|
|||||||
core/management/commands/gia_analysis.py (new file, 96 lines)
@@ -0,0 +1,96 @@
+from __future__ import annotations
+
+import time
+
+from django.core.management.base import BaseCommand
+
+from core.events.behavior import summarize_metrics
+from core.events.manticore import get_event_ledger_backend
+from core.util import logs
+
+log = logs.get_logger("gia_analysis")
+
+
+class Command(BaseCommand):
+    help = "Compute behavioral metrics from Manticore event rows into gia_metrics."
+
+    def add_arguments(self, parser):
+        parser.add_argument("--once", action="store_true", default=False)
+        parser.add_argument("--user-id", type=int)
+        parser.add_argument("--person-id")
+        parser.add_argument("--sleep-seconds", type=float, default=60.0)
+        parser.add_argument("--window-days", nargs="*", type=int, default=[1, 7, 30, 90])
+
+    def _run_cycle(
+        self,
+        *,
+        user_id: int | None = None,
+        person_id: str = "",
+        window_days: list[int] | None = None,
+    ) -> int:
+        backend = get_event_ledger_backend()
+        now_ms = int(time.time() * 1000)
+        baseline_since = now_ms - (90 * 86400000)
+        windows = sorted({max(1, int(value)) for value in list(window_days or [1, 7, 30, 90])})
+
+        targets = backend.list_event_targets(user_id=user_id)
+        if person_id:
+            targets = [
+                row
+                for row in targets
+                if str(row.get("person_id") or "").strip() == str(person_id).strip()
+            ]
+
+        written = 0
+        for target in targets:
+            target_user_id = int(target.get("user_id") or 0)
+            target_person_id = str(target.get("person_id") or "").strip()
+            if target_user_id <= 0 or not target_person_id:
+                continue
+            baseline_rows = backend.fetch_events(
+                user_id=target_user_id,
+                person_id=target_person_id,
+                since_ts=baseline_since,
+            )
+            if not baseline_rows:
+                continue
+            for window in windows:
+                since_ts = now_ms - (int(window) * 86400000)
+                window_rows = [
+                    row
+                    for row in baseline_rows
+                    if int(row.get("ts") or 0) >= since_ts
+                ]
+                metrics = summarize_metrics(window_rows, baseline_rows)
+                for metric, values in metrics.items():
+                    backend.upsert_metric(
+                        user_id=target_user_id,
+                        person_id=target_person_id,
+                        window_days=int(window),
+                        metric=metric,
+                        value_ms=int(values.get("value_ms") or 0),
+                        baseline_ms=int(values.get("baseline_ms") or 0),
+                        z_score=float(values.get("z_score") or 0.0),
+                        sample_n=int(values.get("sample_n") or 0),
+                        computed_at=now_ms,
+                    )
+                    written += 1
+        return written
+
+    def handle(self, *args, **options):
+        once = bool(options.get("once"))
+        sleep_seconds = max(1.0, float(options.get("sleep_seconds") or 60.0))
+        user_id = options.get("user_id")
+        person_id = str(options.get("person_id") or "").strip()
+        window_days = list(options.get("window_days") or [1, 7, 30, 90])
+
+        while True:
+            written = self._run_cycle(
+                user_id=user_id,
+                person_id=person_id,
+                window_days=window_days,
+            )
+            self.stdout.write(f"gia-analysis wrote={written}")
+            if once:
+                return
+            time.sleep(sleep_seconds)
core/management/commands/manticore_backfill.py (new file, 46 lines)
@@ -0,0 +1,46 @@
+from __future__ import annotations
+
+from django.core.management.base import BaseCommand, CommandError
+
+from core.events.manticore import upsert_conversation_event
+from core.models import ConversationEvent
+
+
+class Command(BaseCommand):
+    help = "Backfill behavioral events into Manticore from ConversationEvent rows."
+
+    def add_arguments(self, parser):
+        parser.add_argument(
+            "--from-conversation-events",
+            action="store_true",
+            help="Replay ConversationEvent rows into the Manticore event table.",
+        )
+        parser.add_argument("--user-id", type=int, default=None)
+        parser.add_argument("--limit", type=int, default=5000)
+
+    def handle(self, *args, **options):
+        if not bool(options.get("from_conversation_events")):
+            raise CommandError("Pass --from-conversation-events to run this backfill.")
+
+        queryset = (
+            ConversationEvent.objects.select_related("session__identifier")
+            .order_by("ts", "created_at")
+        )
+        user_id = options.get("user_id")
+        if user_id is not None:
+            queryset = queryset.filter(user_id=int(user_id))
+
+        scanned = 0
+        indexed = 0
+        limit = max(1, int(options.get("limit") or 5000))
+        for event in queryset[:limit]:
+            scanned += 1
+            upsert_conversation_event(event)
+            indexed += 1
+
+        self.stdout.write(
+            self.style.SUCCESS(
+                "manticore-backfill scanned=%s indexed=%s user=%s"
+                % (scanned, indexed, user_id if user_id is not None else "-")
+            )
+        )
core/management/commands/prune_behavioral_orm_data.py (new file, 62 lines)
@@ -0,0 +1,62 @@
+from __future__ import annotations
+
+import time
+
+from django.conf import settings
+from django.core.management.base import BaseCommand
+
+from core.models import ConversationEvent
+
+
+class Command(BaseCommand):
+    help = (
+        "Prune high-growth behavioral ORM shadow tables after data has been "
+        "persisted to Manticore."
+    )
+
+    def add_arguments(self, parser):
+        parser.add_argument("--user-id", default="")
+        parser.add_argument("--dry-run", action="store_true", default=False)
+        parser.add_argument("--conversation-days", type=int)
+        parser.add_argument(
+            "--tables",
+            default="conversation_events",
+            help="Comma separated subset of: conversation_events",
+        )
+
+    def _cutoff_ms(self, days: int) -> int:
+        return int(time.time() * 1000) - (max(1, int(days)) * 24 * 60 * 60 * 1000)
+
+    def handle(self, *args, **options):
+        user_id = str(options.get("user_id") or "").strip()
+        dry_run = bool(options.get("dry_run"))
+        conversation_days = int(
+            options.get("conversation_days")
+            or getattr(settings, "CONVERSATION_EVENT_RETENTION_DAYS", 90)
+            or 90
+        )
+        selected_tables = {
+            str(item or "").strip().lower()
+            for item in str(options.get("tables") or "").split(",")
+            if str(item or "").strip()
+        }
+
+        deleted = {
+            "conversation_events": 0,
+        }
+
+        if "conversation_events" in selected_tables:
+            qs = ConversationEvent.objects.filter(
+                ts__lt=self._cutoff_ms(conversation_days)
+            )
+            if user_id:
+                qs = qs.filter(user_id=user_id)
+            deleted["conversation_events"] = int(qs.count() if dry_run else qs.delete()[0])
+
+        self.stdout.write(
+            "prune-behavioral-orm-data "
+            f"dry_run={dry_run} "
+            f"user_id={user_id or '-'} "
+            f"conversation_days={conversation_days} "
+            f"deleted={deleted}"
+        )
@@ -2,22 +2,15 @@ from __future__ import annotations
 
 from django.core.management.base import BaseCommand
 
-from core.models import ContactAvailabilityEvent, ContactAvailabilitySpan, Message
-from core.presence import AvailabilitySignal, record_native_signal
+from core.events.ledger import append_event_sync
+from core.models import Message
 from core.presence.inference import now_ms
 
-_SOURCE_ORDER = {
-    "message_in": 10,
-    "message_out": 20,
-    "read_receipt": 30,
-    "native_presence": 40,
-}
-
 
 class Command(BaseCommand):
     help = (
-        "Recalculate contact availability events/spans from persisted message, "
-        "read-receipt, and reaction history (deterministic rebuild)."
+        "Replay behavioral event ledger rows from persisted message, receipt, "
+        "and reaction history."
     )
 
     def add_arguments(self, parser):
@@ -39,70 +32,93 @@ class Command(BaseCommand):
             qs = qs.filter(user_id=str(user_id).strip())
         return qs.order_by("ts")[: max(1, int(limit))]
 
-    def _build_event_rows(self, messages):
-        rows = []
+    def handle(self, *args, **options):
+        days = max(1, int(options.get("days") or 90))
+        limit = max(1, int(options.get("limit") or 20000))
+        service_filter = str(options.get("service") or "").strip().lower()
+        user_filter = str(options.get("user_id") or "").strip()
+        dry_run = bool(options.get("dry_run"))
+
+        messages = list(
+            self._iter_messages(
+                days=days,
+                limit=limit,
+                service=service_filter,
+                user_id=user_filter,
+            )
+        )
+        indexed = 0
+
         for msg in messages:
-            identifier = getattr(getattr(msg, "session", None), "identifier", None)
+            session = getattr(msg, "session", None)
+            identifier = getattr(session, "identifier", None)
             person = getattr(identifier, "person", None)
             user = getattr(msg, "user", None)
-            if not identifier or not person or not user:
+            if not session or not identifier or not person or not user:
                 continue
 
             service = (
-                str(
-                    getattr(msg, "source_service", "")
-                    or getattr(identifier, "service", "")
-                )
+                str(getattr(msg, "source_service", "") or identifier.service or "")
                 .strip()
                 .lower()
             )
             if not service:
                 continue
 
-            ts = int(getattr(msg, "ts", 0) or 0)
-            if ts > 0:
-                author = str(getattr(msg, "custom_author", "") or "").strip().upper()
-                outgoing = author in {"USER", "BOT"}
-                rows.append(
-                    {
-                        "user": user,
-                        "person": person,
-                        "person_identifier": identifier,
-                        "service": service,
-                        "source_kind": "message_out" if outgoing else "message_in",
-                        "availability_state": "available",
-                        "confidence": 0.65 if outgoing else 0.75,
-                        "ts": ts,
-                        "payload": {
-                            "origin": "recalculate_contact_availability",
-                            "message_id": str(msg.id),
-                            "inferred_from": "message_activity",
-                        },
-                    }
+            author = str(getattr(msg, "custom_author", "") or "").strip().upper()
+            outgoing = author in {"USER", "BOT"}
+            message_id = str(
+                getattr(msg, "source_message_id", "") or f"django-message-{msg.id}"
+            ).strip()
+
+            if not dry_run:
+                append_event_sync(
+                    user=user,
+                    session=session,
+                    ts=int(getattr(msg, "ts", 0) or 0),
+                    event_type="message_created",
+                    direction="out" if outgoing else "in",
+                    actor_identifier=str(
+                        getattr(msg, "sender_uuid", "") or identifier.identifier or ""
+                    ),
+                    origin_transport=service,
+                    origin_message_id=message_id,
+                    origin_chat_id=str(getattr(msg, "source_chat_id", "") or ""),
+                    payload={
+                        "origin": "recalculate_contact_availability",
+                        "message_id": str(msg.id),
+                        "text": str(getattr(msg, "text", "") or ""),
+                        "outgoing": outgoing,
+                    },
                 )
+            indexed += 1
 
             read_ts = int(getattr(msg, "read_ts", 0) or 0)
             if read_ts > 0:
-                rows.append(
-                    {
-                        "user": user,
-                        "person": person,
-                        "person_identifier": identifier,
-                        "service": service,
-                        "source_kind": "read_receipt",
-                        "availability_state": "available",
-                        "confidence": 0.95,
-                        "ts": read_ts,
-                        "payload": {
+                if not dry_run:
+                    append_event_sync(
+                        user=user,
+                        session=session,
+                        ts=read_ts,
+                        event_type="read_receipt",
+                        direction="system",
+                        actor_identifier=str(
+                            getattr(msg, "read_by_identifier", "")
+                            or identifier.identifier
+                        ),
+                        origin_transport=service,
+                        origin_message_id=message_id,
+                        origin_chat_id=str(getattr(msg, "source_chat_id", "") or ""),
+                        payload={
                             "origin": "recalculate_contact_availability",
                             "message_id": str(msg.id),
-                            "inferred_from": "read_receipt",
+                            "message_ts": int(getattr(msg, "ts", 0) or 0),
                             "read_by": str(
                                 getattr(msg, "read_by_identifier", "") or ""
-                            ),
+                            ).strip(),
                         },
-                    }
-                )
+                    )
+                indexed += 1
 
             reactions = list(
                 (getattr(msg, "receipt_payload", {}) or {}).get("reactions") or []
@@ -114,138 +130,32 @@ class Command(BaseCommand):
                 reaction_ts = int(item.get("updated_at") or 0)
                 if reaction_ts <= 0:
                     continue
-                rows.append(
-                    {
-                        "user": user,
-                        "person": person,
-                        "person_identifier": identifier,
-                        "service": service,
-                        "source_kind": "native_presence",
-                        "availability_state": "available",
-                        "confidence": 0.9,
-                        "ts": reaction_ts,
-                        "payload": {
+                if not dry_run:
+                    append_event_sync(
+                        user=user,
+                        session=session,
+                        ts=reaction_ts,
+                        event_type="presence_available",
+                        direction="system",
+                        actor_identifier=str(item.get("actor") or ""),
+                        origin_transport=service,
+                        origin_message_id=message_id,
+                        origin_chat_id=str(getattr(msg, "source_chat_id", "") or ""),
+                        payload={
                             "origin": "recalculate_contact_availability",
                             "message_id": str(msg.id),
                             "inferred_from": "reaction",
                             "emoji": str(item.get("emoji") or ""),
-                            "actor": str(item.get("actor") or ""),
-                            "source_service": str(
-                                item.get("source_service") or service
-                            ),
+                            "source_service": str(item.get("source_service") or service),
                         },
-                    }
-                )
+                    )
+                indexed += 1
 
-        rows.sort(
-            key=lambda row: (
-                str(getattr(row["user"], "id", "")),
-                str(getattr(row["person"], "id", "")),
-                str(row.get("service") or ""),
-                int(row.get("ts") or 0),
-                _SOURCE_ORDER.get(str(row.get("source_kind") or ""), 999),
-                str((row.get("payload") or {}).get("message_id") or ""),
-            )
-        )
-        return rows
-
-    def handle(self, *args, **options):
-        days = max(1, int(options.get("days") or 90))
-        limit = max(1, int(options.get("limit") or 20000))
-        service_filter = str(options.get("service") or "").strip().lower()
-        user_filter = str(options.get("user_id") or "").strip()
-        dry_run = bool(options.get("dry_run"))
-        reset = not bool(options.get("no_reset"))
-        cutoff_ts = now_ms() - (days * 24 * 60 * 60 * 1000)
-
-        messages = list(
-            self._iter_messages(
-                days=days,
-                limit=limit,
-                service=service_filter,
-                user_id=user_filter,
-            )
-        )
-        rows = self._build_event_rows(messages)
-
-        keys_to_reset = set()
-        for row in rows:
-            keys_to_reset.add(
-                (
-                    str(getattr(row["user"], "id", "")),
-                    str(getattr(row["person"], "id", "")),
-                    str(row.get("service") or ""),
-                )
-            )
-
-        deleted_events = 0
-        deleted_spans = 0
-        if reset and keys_to_reset and not dry_run:
-            for user_id, person_id, service in keys_to_reset:
-                deleted_events += ContactAvailabilityEvent.objects.filter(
-                    user_id=user_id,
-                    person_id=person_id,
-                    service=service,
-                    ts__gte=cutoff_ts,
-                ).delete()[0]
-                deleted_spans += ContactAvailabilitySpan.objects.filter(
-                    user_id=user_id,
-                    person_id=person_id,
-                    service=service,
-                    end_ts__gte=cutoff_ts,
-                ).delete()[0]
-
-        created = 0
-        dedup_seen = set()
-        for row in rows:
-            dedup_key = (
-                str(getattr(row["user"], "id", "")),
-                str(getattr(row["person"], "id", "")),
-                str(getattr(row["person_identifier"], "id", "")),
-                str(row.get("service") or ""),
-                str(row.get("source_kind") or ""),
-                int(row.get("ts") or 0),
-                str((row.get("payload") or {}).get("message_id") or ""),
-                str((row.get("payload") or {}).get("inferred_from") or ""),
-            )
-            if dedup_key in dedup_seen:
-                continue
-            dedup_seen.add(dedup_key)
-
-            if not reset:
-                exists = ContactAvailabilityEvent.objects.filter(
-                    user=row["user"],
-                    person=row["person"],
-                    person_identifier=row["person_identifier"],
-                    service=row["service"],
-                    source_kind=row["source_kind"],
-                    ts=row["ts"],
-                ).exists()
-                if exists:
-                    continue
-
-            created += 1
-            if dry_run:
-                continue
-            record_native_signal(
-                AvailabilitySignal(
-                    user=row["user"],
-                    person=row["person"],
-                    person_identifier=row["person_identifier"],
-                    service=row["service"],
-                    source_kind=row["source_kind"],
-                    availability_state=row["availability_state"],
-                    confidence=float(row["confidence"]),
-                    ts=int(row["ts"]),
-                    payload=dict(row["payload"]),
-                )
-            )
-
         self.stdout.write(
             self.style.SUCCESS(
                 "recalculate_contact_availability complete "
-                f"messages_scanned={len(messages)} candidates={len(rows)} "
-                f"created={created} deleted_events={deleted_events} deleted_spans={deleted_spans} "
-                f"reset={reset} dry_run={dry_run} days={days} limit={limit}"
+                f"messages_scanned={len(messages)} indexed={indexed} "
+                f"dry_run={dry_run} no_reset={bool(options.get('no_reset'))} "
+                f"days={days} limit={limit}"
            )
        )
@@ -293,6 +293,7 @@ async def apply_read_receipts(
     read_by_identifier="",
     payload=None,
     trace_id="",
+    receipt_event_type="read_receipt",
 ):
     """
     Persist delivery/read metadata for one identifier's messages.
@@ -310,6 +311,9 @@ async def apply_read_receipts(
         read_at = int(read_ts) if read_ts else None
     except Exception:
         read_at = None
+    normalized_event_type = str(receipt_event_type or "read_receipt").strip().lower()
+    if normalized_event_type not in {"read_receipt", "delivery_receipt"}:
+        normalized_event_type = "read_receipt"
 
     rows = await sync_to_async(list)(
         Message.objects.filter(
@@ -324,13 +328,25 @@ async def apply_read_receipts(
         if message.delivered_ts is None:
             message.delivered_ts = read_at or message.ts
             dirty.append("delivered_ts")
-        if read_at and (message.read_ts is None or read_at > message.read_ts):
+        if (
+            normalized_event_type == "read_receipt"
+            and read_at
+            and (message.read_ts is None or read_at > message.read_ts)
+        ):
             message.read_ts = read_at
             dirty.append("read_ts")
-        if source_service and message.read_source_service != source_service:
+        if (
+            normalized_event_type == "read_receipt"
+            and source_service
+            and message.read_source_service != source_service
+        ):
             message.read_source_service = source_service
             dirty.append("read_source_service")
-        if read_by_identifier and message.read_by_identifier != read_by_identifier:
+        if (
+            normalized_event_type == "read_receipt"
+            and read_by_identifier
+            and message.read_by_identifier != read_by_identifier
+        ):
             message.read_by_identifier = read_by_identifier
             dirty.append("read_by_identifier")
         if payload:
@@ -346,7 +362,7 @@ async def apply_read_receipts(
                 user=user,
                 session=message.session,
                 ts=int(read_at or message.ts or 0),
-                event_type="read_receipt",
+                event_type=normalized_event_type,
                 direction="system",
                 actor_identifier=str(read_by_identifier or ""),
                 origin_transport=str(source_service or ""),
@@ -356,6 +372,7 @@ async def apply_read_receipts(
                     "message_id": str(message.id),
                     "message_ts": int(message.ts or 0),
                     "read_ts": int(read_at or 0),
+                    "receipt_event_type": normalized_event_type,
                     "read_by_identifier": str(read_by_identifier or ""),
                     "timestamps": [int(v) for v in ts_values],
                 },
@@ -364,7 +381,7 @@ async def apply_read_receipts(
         )
     except Exception as exc:
         log.warning(
-            "Event ledger append failed for read receipt message=%s: %s",
+            "Event ledger append failed for receipt message=%s: %s",
            message.id,
            exc,
        )
@@ -0,0 +1,33 @@
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+    dependencies = [
+        ("core", "0046_externalchatlink_provider_default_mock"),
+    ]
+
+    operations = [
+        migrations.AlterField(
+            model_name="conversationevent",
+            name="event_type",
+            field=models.CharField(
+                choices=[
+                    ("message_created", "Message Created"),
+                    ("message_edited", "Message Edited"),
+                    ("message_deleted", "Message Deleted"),
+                    ("reaction_added", "Reaction Added"),
+                    ("reaction_removed", "Reaction Removed"),
+                    ("read_receipt", "Read Receipt"),
+                    ("typing_started", "Typing Started"),
+                    ("typing_stopped", "Typing Stopped"),
+                    ("composing_abandoned", "Composing Abandoned"),
+                    ("presence_available", "Presence Available"),
+                    ("presence_unavailable", "Presence Unavailable"),
+                    ("participant_added", "Participant Added"),
+                    ("participant_removed", "Participant Removed"),
+                    ("delivery_receipt", "Delivery Receipt"),
+                ],
+                max_length=64,
+            ),
+        ),
+    ]
@@ -0,0 +1,16 @@
+from django.db import migrations
+
+
+class Migration(migrations.Migration):
+    dependencies = [
+        ("core", "0047_conversationevent_behavioral_event_types"),
+    ]
+
+    operations = [
+        migrations.DeleteModel(
+            name="ContactAvailabilitySpan",
+        ),
+        migrations.DeleteModel(
+            name="ContactAvailabilityEvent",
+        ),
+    ]
core/models.py (106 changed lines)

@@ -82,6 +82,7 @@ class User(AbstractUser):
     customer_id = models.UUIDField(default=uuid.uuid4, null=True, blank=True)
     billing_provider_id = models.CharField(max_length=255, null=True, blank=True)
     email = models.EmailField(unique=True)
+    allow_contacts_to_create_tasks = models.BooleanField(default=True)
 
     def __init__(self, *args, **kwargs):
         super().__init__(*args, **kwargs)
@@ -397,6 +398,9 @@ class ConversationEvent(models.Model):
         ("read_receipt", "Read Receipt"),
         ("typing_started", "Typing Started"),
         ("typing_stopped", "Typing Stopped"),
+        ("composing_abandoned", "Composing Abandoned"),
+        ("presence_available", "Presence Available"),
+        ("presence_unavailable", "Presence Unavailable"),
         ("participant_added", "Participant Added"),
         ("participant_removed", "Participant Removed"),
         ("delivery_receipt", "Delivery Receipt"),
@@ -2759,108 +2763,6 @@ class ContactAvailabilitySettings(models.Model):
     updated_at = models.DateTimeField(auto_now=True)
-
-
-class ContactAvailabilityEvent(models.Model):
-    SOURCE_KIND_CHOICES = (
-        ("native_presence", "Native Presence"),
-        ("read_receipt", "Read Receipt"),
-        ("typing_start", "Typing Start"),
-        ("typing_stop", "Typing Stop"),
-        ("message_in", "Message In"),
-        ("message_out", "Message Out"),
-        ("inferred_timeout", "Inferred Timeout"),
-    )
-    STATE_CHOICES = (
-        ("available", "Available"),
-        ("unavailable", "Unavailable"),
-        ("unknown", "Unknown"),
-        ("fading", "Fading"),
-    )
-
-    user = models.ForeignKey(
-        User,
-        on_delete=models.CASCADE,
-        related_name="contact_availability_events",
-    )
-    person = models.ForeignKey(
-        Person,
-        on_delete=models.CASCADE,
-        related_name="availability_events",
-    )
-    person_identifier = models.ForeignKey(
-        PersonIdentifier,
-        on_delete=models.SET_NULL,
-        null=True,
-        blank=True,
-        related_name="availability_events",
-    )
-    service = models.CharField(max_length=255, choices=CHANNEL_SERVICE_CHOICES)
-    source_kind = models.CharField(max_length=32, choices=SOURCE_KIND_CHOICES)
-    availability_state = models.CharField(max_length=32, choices=STATE_CHOICES)
-    confidence = models.FloatField(default=0.0)
-    ts = models.BigIntegerField(db_index=True)
-    payload = models.JSONField(default=dict, blank=True)
-    created_at = models.DateTimeField(auto_now_add=True)
-
-    class Meta:
-        indexes = [
-            models.Index(fields=["user", "person", "ts"]),
-            models.Index(fields=["user", "service", "ts"]),
-            models.Index(fields=["user", "availability_state", "ts"]),
-        ]
-
-
-class ContactAvailabilitySpan(models.Model):
-    STATE_CHOICES = ContactAvailabilityEvent.STATE_CHOICES
-
-    user = models.ForeignKey(
-        User,
-        on_delete=models.CASCADE,
-        related_name="contact_availability_spans",
-    )
-    person = models.ForeignKey(
-        Person,
-        on_delete=models.CASCADE,
-        related_name="availability_spans",
-    )
-    person_identifier = models.ForeignKey(
-        PersonIdentifier,
-        on_delete=models.SET_NULL,
-        null=True,
-        blank=True,
-        related_name="availability_spans",
-    )
-    service = models.CharField(max_length=255, choices=CHANNEL_SERVICE_CHOICES)
-    state = models.CharField(max_length=32, choices=STATE_CHOICES)
-    start_ts = models.BigIntegerField(db_index=True)
-    end_ts = models.BigIntegerField(db_index=True)
-    confidence_start = models.FloatField(default=0.0)
-    confidence_end = models.FloatField(default=0.0)
-    opening_event = models.ForeignKey(
-        ContactAvailabilityEvent,
-        on_delete=models.SET_NULL,
-        null=True,
-        blank=True,
-        related_name="opening_spans",
-    )
-    closing_event = models.ForeignKey(
-        ContactAvailabilityEvent,
-        on_delete=models.SET_NULL,
-        null=True,
-        blank=True,
-        related_name="closing_spans",
-    )
-    payload = models.JSONField(default=dict, blank=True)
-    created_at = models.DateTimeField(auto_now_add=True)
-    updated_at = models.DateTimeField(auto_now=True)
-
-    class Meta:
-        indexes = [
-            models.Index(fields=["user", "person", "start_ts"]),
-            models.Index(fields=["user", "person", "end_ts"]),
-            models.Index(fields=["user", "service", "start_ts"]),
-        ]
 
 
 class ExternalChatLink(models.Model):
     user = models.ForeignKey(
         User,
@@ -1,5 +1,6 @@
 import asyncio
 import re
+import time
 
 from asgiref.sync import sync_to_async
 from django.conf import settings
@@ -12,7 +13,8 @@ from core.clients.whatsapp import WhatsAppClient
 from core.clients.xmpp import XMPPClient
 from core.commands.base import CommandContext
 from core.commands.engine import process_inbound_message
-from core.events import event_ledger_status
+from core.events import append_event, event_ledger_enabled, event_ledger_status
+from core.events.behavior import ComposingTracker
 from core.messaging import history
 from core.models import PersonIdentifier
 from core.observability.tracing import ensure_trace_id
@@ -32,7 +34,13 @@ class UnifiedRouter(object):
         self.typing_auto_stop_seconds = int(
             getattr(settings, "XMPP_TYPING_AUTO_STOP_SECONDS", 3)
         )
+        self.composing_abandoned_window_seconds = int(
+            getattr(settings, "COMPOSING_ABANDONED_WINDOW_SECONDS", 300)
+        )
         self._typing_stop_tasks = {}
+        self._composing_tracker = ComposingTracker(
+            window_ms=self.composing_abandoned_window_seconds * 1000
+        )
 
         self.log = logs.get_logger("router")
         self.log.info("Initialised Unified Router Interface.")
@@ -85,6 +93,55 @@ class UnifiedRouter(object):
 
         self._typing_stop_tasks[key] = self.loop.create_task(_timer())
 
+    def _behavior_direction(self, protocol: str) -> str:
+        return "out" if str(protocol or "").strip().lower() == "xmpp" else "in"
+
+    def _event_ts_from_kwargs(self, kwargs: dict) -> int | None:
+        payload = dict(kwargs.get("payload") or {})
+        for candidate in (
+            kwargs.get("ts"),
+            payload.get("ts"),
+            payload.get("timestamp"),
+            payload.get("messageTimestamp"),
+            payload.get("message_ts"),
+        ):
+            try:
+                parsed = int(candidate)
+            except Exception:
+                continue
+            if parsed > 0:
+                return parsed
+        return int(time.time() * 1000)
+
+    async def _append_identifier_event(
+        self,
+        *,
+        identifier_row,
+        event_type: str,
+        protocol: str,
+        direction: str,
+        ts: int | None = None,
+        payload: dict | None = None,
+        raw_payload: dict | None = None,
+        actor_identifier: str = "",
+    ):
+        if not event_ledger_enabled():
+            return None
+        session = await history.get_chat_session(identifier_row.user, identifier_row)
+        await append_event(
+            user=identifier_row.user,
+            session=session,
+            ts=ts,
+            event_type=event_type,
+            direction=direction,
+            actor_identifier=str(actor_identifier or identifier_row.identifier or ""),
+            origin_transport=str(protocol or "").strip().lower(),
+            origin_chat_id=str(identifier_row.identifier or ""),
+            payload=dict(payload or {}),
+            raw_payload=dict(raw_payload or {}),
+        )
+        return session
+
     def _start(self):
         self.log.info("Starting unified router clients")
         self.xmpp.start()
@@ -117,6 +174,9 @@ class UnifiedRouter(object):
         message_text = str(kwargs.get("text") or "").strip()
         if local_message is None:
             return
+        self._composing_tracker.observe_message(
+            str(getattr(local_message, "session_id", "") or "")
+        )
         identifiers = await self._resolve_identifier_objects(protocol, identifier)
         if identifiers:
             outgoing = str(
@@ -239,6 +299,10 @@ class UnifiedRouter(object):
         timestamps = kwargs.get("message_timestamps") or []
         read_ts = kwargs.get("read_ts")
         payload = kwargs.get("payload") or {}
+        payload_type = str((payload or {}).get("type") or "").strip().lower()
+        receipt_event_type = (
+            "delivery_receipt" if payload_type == "delivered" else "read_receipt"
+        )
         trace_id = (
             ensure_trace_id(payload=payload)
             if bool(getattr(settings, "TRACE_PROPAGATION_ENABLED", True))
@@ -257,6 +321,7 @@ class UnifiedRouter(object):
                 read_by_identifier=read_by or row.identifier,
                 payload=payload,
                 trace_id=trace_id,
+                receipt_event_type=receipt_event_type,
             )
             record_native_signal(
                 AvailabilitySignal(
@@ -264,12 +329,13 @@ class UnifiedRouter(object):
                     person=row.person,
                     person_identifier=row,
                     service=str(protocol or "").strip().lower(),
-                    source_kind="read_receipt",
+                    source_kind=receipt_event_type,
                     availability_state="available",
                     confidence=0.95,
                     ts=int(read_ts or 0),
                     payload={
                         "origin": "router.message_read",
+                        "receipt_event_type": receipt_event_type,
                         "message_timestamps": [
                             int(v) for v in list(timestamps or []) if str(v).isdigit()
                         ],
@@ -309,11 +375,41 @@ class UnifiedRouter(object):
                     payload=payload,
                 )
             )
+            state_event = None
+            if state == "available":
+                state_event = "presence_available"
+            elif state == "unavailable":
+                state_event = "presence_unavailable"
+            if state_event:
+                try:
+                    await self._append_identifier_event(
+                        identifier_row=row,
+                        event_type=state_event,
+                        protocol=protocol,
+                        direction="system",
+                        ts=(ts or None),
+                        payload={
+                            "state": state,
+                            "confidence": confidence,
+                            **payload,
+                        },
+                        raw_payload=payload,
+                        actor_identifier=str(row.identifier or ""),
+                    )
+                except Exception as exc:
+                    self.log.warning(
+                        "Failed to append presence event for %s: %s",
+                        row.identifier,
+                        exc,
+                    )
         await self._refresh_workspace_metrics_for_identifiers(identifiers)
 
     async def started_typing(self, protocol, *args, **kwargs):
         self.log.info(f"Started typing ({protocol}) {args} {kwargs}")
         identifier = kwargs.get("identifier")
+        payload = dict(kwargs.get("payload") or {})
+        event_ts = self._event_ts_from_kwargs(kwargs)
+        direction = self._behavior_direction(protocol)
         identifiers = await self._resolve_identifier_objects(protocol, identifier)
         for src in identifiers:
             record_native_signal(
@@ -329,6 +425,30 @@ class UnifiedRouter(object):
                     payload={"origin": "router.started_typing"},
                 )
             )
+            try:
+                session = await history.get_chat_session(src.user, src)
+                state = self._composing_tracker.observe_started(
+                    str(session.id),
+                    int(event_ts or 0),
+                )
+                await append_event(
+                    user=src.user,
+                    session=session,
+                    ts=event_ts,
+                    event_type="typing_started",
+                    direction=direction,
+                    actor_identifier=str(src.identifier or ""),
+                    origin_transport=str(protocol or "").strip().lower(),
+                    origin_chat_id=str(src.identifier or ""),
+                    payload=dict(payload or {}, revision=int(state.revision or 1)),
+                    raw_payload=dict(payload or {}),
+                )
+            except Exception as exc:
+                self.log.warning(
+                    "Failed to append typing-start event for %s: %s",
+                    src.identifier,
+                    exc,
+                )
             if protocol != "xmpp":
                 set_person_typing_state(
                     user_id=src.user_id,
@@ -362,6 +482,9 @@ class UnifiedRouter(object):
     async def stopped_typing(self, protocol, *args, **kwargs):
         self.log.info(f"Stopped typing ({protocol}) {args} {kwargs}")
         identifier = kwargs.get("identifier")
+        payload = dict(kwargs.get("payload") or {})
+        event_ts = self._event_ts_from_kwargs(kwargs)
+        direction = self._behavior_direction(protocol)
         identifiers = await self._resolve_identifier_objects(protocol, identifier)
         for src in identifiers:
             record_native_signal(
@@ -377,6 +500,52 @@ class UnifiedRouter(object):
                     payload={"origin": "router.stopped_typing"},
                 )
             )
+            try:
+                session = await history.get_chat_session(src.user, src)
+                await append_event(
+                    user=src.user,
+                    session=session,
+                    ts=event_ts,
+                    event_type="typing_stopped",
+                    direction=direction,
+                    actor_identifier=str(src.identifier or ""),
+                    origin_transport=str(protocol or "").strip().lower(),
+                    origin_chat_id=str(src.identifier or ""),
+                    payload=dict(payload or {}),
+                    raw_payload=dict(payload or {}),
+                )
+                if session is not None:
+                    abandoned = self._composing_tracker.observe_stopped(
+                        str(session.id),
+                        int(event_ts or 0),
+                    )
+                    if abandoned is not None:
+                        await append_event(
+                            user=src.user,
+                            session=session,
+                            ts=int(abandoned.get("stopped_ts") or event_ts or 0),
+                            event_type="composing_abandoned",
+                            direction=direction,
+                            actor_identifier=str(src.identifier or ""),
+                            origin_transport=str(protocol or "").strip().lower(),
+                            origin_chat_id=str(src.identifier or ""),
+                            payload={
+                                **dict(payload or {}),
+                                "abandoned": True,
+                                "duration_ms": int(
+                                    abandoned.get("duration_ms") or 0
+                                ),
+                                "revision": int(abandoned.get("revision") or 1),
+                                "started_ts": int(abandoned.get("started_ts") or 0),
+                            },
+                            raw_payload=dict(payload or {}),
+                        )
+            except Exception as exc:
+                self.log.warning(
+                    "Failed to append typing-stop event for %s: %s",
+                    src.identifier,
+                    exc,
+                )
             if protocol != "xmpp":
                 set_person_typing_state(
                     user_id=src.user_id,
@@ -2,25 +2,7 @@ from __future__ import annotations
 
 from dataclasses import dataclass
 
-from django.db import transaction
-from core.models import (
-    ContactAvailabilityEvent,
-    ContactAvailabilitySettings,
-    ContactAvailabilitySpan,
-    Person,
-    PersonIdentifier,
-    User,
-)
-
-from .inference import fade_confidence, now_ms, should_fade
-
-POSITIVE_SOURCE_KINDS = {
-    "native_presence",
-    "read_receipt",
-    "typing_start",
-    "message_in",
-}
+from core.models import ContactAvailabilitySettings, Person, PersonIdentifier, User
 
 
 @dataclass(slots=True)
@@ -41,95 +23,24 @@ def get_settings(user: User) -> ContactAvailabilitySettings:
     return settings_row
 
 
-def _normalize_ts(value: int | None) -> int:
-    try:
-        ts = int(value or 0)
-    except Exception:
-        ts = 0
-    return ts if ts > 0 else now_ms()
-
-
-def _upsert_spans_for_event(event: ContactAvailabilityEvent) -> None:
-    prev = (
-        ContactAvailabilitySpan.objects.filter(
-            user=event.user,
-            person=event.person,
-            service=event.service,
-        )
-        .order_by("-end_ts", "-id")
-        .first()
-    )
-
-    if prev and prev.state == event.availability_state:
-        prev.end_ts = max(int(prev.end_ts or 0), int(event.ts or 0))
-        prev.confidence_end = float(event.confidence or 0.0)
-        prev.closing_event = event
-        prev.payload = dict(prev.payload or {})
-        prev.payload.update({"extended_by": str(event.source_kind or "")})
-        prev.save(
-            update_fields=[
-                "end_ts",
-                "confidence_end",
-                "closing_event",
-                "payload",
-                "updated_at",
-            ]
-        )
-        return
-
-    ContactAvailabilitySpan.objects.create(
-        user=event.user,
-        person=event.person,
-        person_identifier=event.person_identifier,
-        service=event.service,
-        state=event.availability_state,
-        start_ts=int(event.ts or 0),
-        end_ts=int(event.ts or 0),
-        confidence_start=float(event.confidence or 0.0),
-        confidence_end=float(event.confidence or 0.0),
-        opening_event=event,
-        closing_event=event,
-        payload=dict(event.payload or {}),
-    )
-
-
-@transaction.atomic
-def record_native_signal(signal: AvailabilitySignal) -> ContactAvailabilityEvent | None:
+def record_native_signal(signal: AvailabilitySignal) -> AvailabilitySignal | None:
+    """
+    Compatibility adapter for existing router call sites.
+
+    Availability state is now derived from behavioral events in Manticore, so this
+    function no longer persists a separate ORM availability row.
+    """
     settings_row = get_settings(signal.user)
     if not settings_row.enabled:
         return None
-    event = ContactAvailabilityEvent.objects.create(
-        user=signal.user,
-        person=signal.person,
-        person_identifier=signal.person_identifier,
-        service=str(signal.service or "").strip().lower() or "signal",
-        source_kind=str(signal.source_kind or "").strip() or "native_presence",
-        availability_state=str(signal.availability_state or "unknown").strip()
-        or "unknown",
-        confidence=float(signal.confidence or 0.0),
-        ts=_normalize_ts(signal.ts),
-        payload=dict(signal.payload or {}),
-    )
-    _upsert_spans_for_event(event)
-    _prune_old_data(signal.user, settings_row.retention_days)
-    return event
+    return signal
 
 
-def record_inferred_signal(
-    signal: AvailabilitySignal,
-) -> ContactAvailabilityEvent | None:
+def record_inferred_signal(signal: AvailabilitySignal) -> AvailabilitySignal | None:
     settings_row = get_settings(signal.user)
     if not settings_row.enabled or not settings_row.inference_enabled:
         return None
-    return record_native_signal(signal)
-
-
-def _prune_old_data(user: User, retention_days: int) -> None:
-    days = max(1, int(retention_days or 90))
-    cutoff = now_ms() - (days * 24 * 60 * 60 * 1000)
-    ContactAvailabilityEvent.objects.filter(user=user, ts__lt=cutoff).delete()
-    ContactAvailabilitySpan.objects.filter(user=user, end_ts__lt=cutoff).delete()
+    return signal
 
 
 def ensure_fading_state(
@@ -139,48 +50,5 @@ def ensure_fading_state(
     person_identifier: PersonIdentifier | None,
     service: str,
     at_ts: int | None = None,
-) -> ContactAvailabilityEvent | None:
-    settings_row = get_settings(user)
-    if not settings_row.enabled or not settings_row.inference_enabled:
-        return None
-
-    current_ts = _normalize_ts(at_ts)
-    latest = (
-        ContactAvailabilityEvent.objects.filter(
-            user=user,
-            person=person,
-            service=str(service or "").strip().lower(),
-        )
-        .order_by("-ts", "-id")
-        .first()
-    )
-    if latest is None:
-        return None
-    if latest.availability_state in {"fading", "unavailable"}:
-        return None
-    if latest.source_kind not in POSITIVE_SOURCE_KINDS:
-        return None
-    if not should_fade(
-        int(latest.ts or 0), current_ts, settings_row.fade_threshold_seconds
-    ):
-        return None
-
-    elapsed = max(0, current_ts - int(latest.ts or 0))
-    payload = {
-        "inferred_from": latest.source_kind,
-        "last_signal_ts": int(latest.ts or 0),
-        "elapsed_ms": elapsed,
-    }
-    return record_inferred_signal(
-        AvailabilitySignal(
-            user=user,
-            person=person,
-            person_identifier=person_identifier,
-            service=service,
-            source_kind="inferred_timeout",
-            availability_state="fading",
-            confidence=fade_confidence(elapsed, settings_row.fade_threshold_seconds),
-            ts=current_ts,
-            payload=payload,
-        )
-    )
+) -> None:
+    return None
@@ -1,11 +1,33 @@
 from __future__ import annotations
 
-from django.db.models import Q
-
-from core.models import ContactAvailabilityEvent, ContactAvailabilitySpan, Person, User
-
-from .engine import ensure_fading_state
-from .inference import now_ms
+from core.events.behavior import parse_payload
+from core.events.manticore import (
+    get_behavioral_events_for_range,
+    get_behavioral_latest_states,
+)
+from core.events.shadow import (
+    get_shadow_behavioral_events_for_range,
+    get_shadow_behavioral_latest_states,
+)
+from core.models import Person, User
+
+
+def _behavioral_state_from_kind(kind: str) -> tuple[str, float]:
+    normalized = str(kind or "").strip().lower()
+    if normalized == "presence_unavailable":
+        return ("unavailable", 0.95)
+    if normalized == "composing_abandoned":
+        return ("fading", 0.8)
+    if normalized in {
+        "presence_available",
+        "message_read",
+        "message_delivered",
+        "composing_started",
+        "composing_stopped",
+        "message_sent",
+    }:
+        return ("available", 0.75)
+    return ("unknown", 0.5)
 
 
 def spans_for_range(
@@ -17,43 +39,98 @@ def spans_for_range(
     service: str = "",
     limit: int = 200,
 ):
-    qs = ContactAvailabilitySpan.objects.filter(
-        user=user,
-        person=person,
-    ).filter(Q(start_ts__lte=end_ts) & Q(end_ts__gte=start_ts))
-    if service:
-        qs = qs.filter(service=str(service).strip().lower())
-
-    ensure_fading_state(
-        user=user,
-        person=person,
-        person_identifier=None,
-        service=(str(service or "").strip().lower() or "signal"),
-        at_ts=now_ms(),
-    )
-
-    return list(qs.order_by("-end_ts")[: max(1, min(int(limit or 200), 500))])
+    service_filter = str(service or "").strip().lower()
+    try:
+        rows = get_behavioral_events_for_range(
+            user_id=int(user.id),
+            person_id=str(person.id),
+            start_ts=int(start_ts),
+            end_ts=int(end_ts),
+            transport=service_filter,
+        )
+    except Exception:
+        rows = []
+    if not rows:
+        rows = get_shadow_behavioral_events_for_range(
+            user=user,
+            person_id=str(person.id),
+            start_ts=int(start_ts),
+            end_ts=int(end_ts),
+            transport=service_filter,
+        )
+
+    spans = []
+    current = None
+    for row in list(rows or []):
+        transport = str(row.get("transport") or "").strip().lower()
+        if service_filter and transport != service_filter:
+            continue
+        ts = int(row.get("ts") or 0)
+        state, confidence = _behavioral_state_from_kind(str(row.get("kind") or ""))
+        if current is None or str(current.get("state")) != state:
+            if current is not None:
+                spans.append(current)
+            current = {
+                "id": 0,
+                "service": transport or service_filter,
+                "state": state,
+                "start_ts": ts,
+                "end_ts": ts,
+                "confidence_start": float(confidence),
+                "confidence_end": float(confidence),
+                "payload": {
+                    "source": "manticore_behavioral",
+                    "kind": str(row.get("kind") or "").strip().lower(),
+                    "raw_payload": parse_payload(row.get("payload")),
+                },
+            }
+            continue
+        current["end_ts"] = max(int(current.get("end_ts") or 0), ts)
+        current["confidence_end"] = float(confidence)
+        payload = dict(current.get("payload") or {})
+        payload["kind"] = str(row.get("kind") or "").strip().lower()
+        current["payload"] = payload
+
+    if current is not None:
+        spans.append(current)
+    return list(reversed(spans[: max(1, min(int(limit or 200), 500))]))
 
 
 def latest_state_for_people(*, user: User, person_ids: list, service: str = "") -> dict:
     out = {}
     if not person_ids:
         return out
-    qs = ContactAvailabilityEvent.objects.filter(user=user, person_id__in=person_ids)
-    if service:
-        qs = qs.filter(service=str(service).strip().lower())
-    rows = list(qs.order_by("person_id", "-ts", "-id"))
+    service_filter = str(service or "").strip().lower()
+    try:
+        rows = get_behavioral_latest_states(
+            user_id=int(user.id),
+            person_ids=[str(value) for value in person_ids],
+            transport=service_filter,
+        )
+    except Exception:
+        rows = []
+    if not rows:
+        rows = get_shadow_behavioral_latest_states(
+            user=user,
+            person_ids=[str(value) for value in person_ids],
+            transport=service_filter,
+        )
 
     seen = set()
-    for row in rows:
-        person_key = str(row.person_id)
-        if person_key in seen:
+    for row in list(rows or []):
+        person_key = str(row.get("person_id") or "").strip()
+        transport = str(row.get("transport") or "").strip().lower()
+        if service_filter and transport != service_filter:
+            continue
+        if not person_key or person_key in seen:
             continue
         seen.add(person_key)
+        state, confidence = _behavioral_state_from_kind(str(row.get("kind") or ""))
         out[person_key] = {
-            "state": str(row.availability_state or "unknown"),
-            "confidence": float(row.confidence or 0.0),
-            "service": str(row.service or ""),
-            "ts": int(row.ts or 0),
-            "source_kind": str(row.source_kind or ""),
+            "state": state,
+            "confidence": float(confidence),
+            "service": transport or service_filter,
+            "ts": int(row.get("ts") or 0),
+            "source_kind": f"behavioral:{str(row.get('kind') or '').strip().lower()}",
         }
     return out
|||||||
@@ -6,10 +6,50 @@
 <!DOCTYPE html>
 <html lang="en-GB">
 <head>
+  <script>
+    (function () {
+      var storedTheme = "light";
+      try {
+        storedTheme = localStorage.getItem("theme") || "light";
+      } catch (error) {
+      }
+      var resolvedTheme = storedTheme === "dark" ? "dark" : "light";
+      document.documentElement.dataset.theme = resolvedTheme;
+      if (resolvedTheme === "dark") {
+        document.documentElement.style.backgroundColor = "#090d16";
+        document.documentElement.style.color = "#f8fafc";
+      } else {
+        document.documentElement.style.backgroundColor = "#f3f4f6";
+        document.documentElement.style.color = "#1b2433";
+      }
+      var faviconHref = resolvedTheme === "dark"
+        ? "{% static 'favicon-dark.svg' %}"
+        : "{% static 'favicon-light.svg' %}";
+      document.querySelectorAll("link[data-gia-favicon]").forEach(function (link) {
+        link.setAttribute("href", faviconHref);
+      });
+      document.querySelectorAll(".js-theme-logo").forEach(function (image) {
+        image.setAttribute("src", faviconHref);
+      });
+    })();
+  </script>
+  <style>
+    html[data-theme="dark"],
+    html[data-theme="dark"] body {
+      background: #090d16 !important;
+      color: #f8fafc !important;
+    }
+    html[data-theme="light"],
+    html[data-theme="light"] body {
+      background: #f3f4f6 !important;
+      color: #1b2433 !important;
+    }
+  </style>
 <meta charset="utf-8">
 <meta name="viewport" content="width=device-width, initial-scale=1">
 <title>{% block browser_title %}{% firstof page_browser_title page_title as explicit_title %}{% if explicit_title %}{{ explicit_title }} · GIA{% else %}{% with route_value=request.resolver_match.url_name|default:request.path_info|humanize_route %}{% if route_value %}{{ route_value }} · GIA{% else %}GIA{% endif %}{% endwith %}{% endif %}{% endblock %}</title>
-<link rel="shortcut icon" href="{% static 'favicon.ico' %}">
+<link rel="icon" type="image/svg+xml" href="{% static 'favicon-light.svg' %}" data-gia-favicon>
+<link rel="shortcut icon" href="{% static 'favicon-light.svg' %}" data-gia-favicon>
 <link rel="manifest" href="{% static 'manifest.webmanifest' %}">
 <link rel="stylesheet" href="{% static 'css/bulma.min.css' %}">
 <link rel="stylesheet" href="{% static 'css/bulma-tooltip.min.css' %}">
@@ -29,6 +69,66 @@
 <script src="{% static 'js/gridstack-all.js' %}"></script>
 <script defer src="{% static 'js/magnet.min.js' %}"></script>
 <script>
+  (function () {
+    function applyFavicon(theme) {
+      var href = theme === "dark"
+        ? "{% static 'favicon-dark.svg' %}"
+        : "{% static 'favicon-light.svg' %}";
+      document.querySelectorAll("link[data-gia-favicon]").forEach(function (link) {
+        link.setAttribute("href", href);
+      });
+      document.querySelectorAll(".js-theme-logo").forEach(function (image) {
+        image.setAttribute("src", href);
+      });
+    }
+
+    function applyTheme(mode) {
+      var validMode = mode === "dark" ? "dark" : "light";
+      document.documentElement.dataset.theme = validMode;
+      applyFavicon(validMode);
+      try {
+        localStorage.setItem("theme", validMode);
+      } catch (error) {
+      }
+      document.querySelectorAll(".js-theme-toggle").forEach(function (button) {
+        var nextMode = validMode === "dark" ? "light" : "dark";
+        button.setAttribute("data-theme-mode", validMode);
+        button.setAttribute("aria-label", "Theme: " + validMode + ". Click to switch to " + nextMode + ".");
+        button.setAttribute("title", "Theme: " + validMode);
+        var label = button.querySelector("[data-theme-label]");
+        if (label) {
+          label.textContent = validMode === "dark" ? "Dark" : "Light";
+        }
+      });
+    }
+
+    function cycleTheme() {
+      var currentTheme = "light";
+      try {
+        currentTheme = localStorage.getItem("theme") || "light";
+      } catch (error) {
+      }
+      if (currentTheme === "dark") {
+        applyTheme("light");
+        return;
+      }
+      applyTheme("dark");
+    }
+
+    try {
+      applyTheme(localStorage.getItem("theme") || "light");
+    } catch (error) {
+      applyTheme("light");
+    }
+
+    document.addEventListener("DOMContentLoaded", function () {
+      document.querySelectorAll(".js-theme-toggle").forEach(function (button) {
+        button.addEventListener("click", cycleTheme);
+      });
+      applyTheme(document.documentElement.dataset.theme || "light");
+    });
+  })();
+
 document.addEventListener("restore-scroll", function () {
   var scrollpos = localStorage.getItem("scrollpos");
   if (scrollpos) {
@@ -204,6 +304,28 @@
 .navbar {
   background-color:rgba(0, 0, 0, 0.03) !important;
 }
+.gia-brand-shell {
+  display: inline-flex;
+  align-items: center;
+}
+.gia-brand-logo {
+  display: inline-flex;
+  align-items: center;
+  justify-content: center;
+  padding: 0.45rem 0.75rem;
+  border-radius: 16px;
+  background: rgba(255, 255, 255, 0.82);
+  border: 1px solid rgba(21, 28, 39, 0.08);
+  box-shadow: 0 10px 24px rgba(21, 28, 39, 0.08);
+}
+.gia-brand-logo img {
+  display: block;
+}
+[data-theme="dark"] .gia-brand-logo {
+  background: rgba(255, 255, 255, 0.96);
+  border-color: rgba(255, 255, 255, 0.82);
+  box-shadow: 0 12px 28px rgba(0, 0, 0, 0.34);
+}
 .section > .container.gia-page-shell,
 .section > .container {
   max-width: 1340px;
@@ -227,11 +349,13 @@
   position: sticky;
   top: 0;
   background: rgba(248, 250, 252, 0.96) !important;
+  color: #1b1f2a !important;
   backdrop-filter: blur(6px);
   z-index: 1;
 }
 [data-theme="dark"] .table thead th {
   background: rgba(44, 44, 44, 0.96) !important;
+  color: #f3f5f8 !important;
 }
 .table td,
 .table th {
@@ -351,6 +475,26 @@
   color: #1f4f99 !important;
   font-weight: 600;
 }
+.theme-toggle-button {
+  display: inline-flex;
+  align-items: center;
+  justify-content: center;
+}
+.brand-theme-toggle {
+  min-width: 0;
+  padding: 0;
+  border: 0 !important;
+  background: transparent !important;
+  box-shadow: none !important;
+  line-height: 1;
+  width: 3.15rem;
+  height: 3.15rem;
+}
+.brand-theme-logo {
+  display: block;
+  width: 3.15rem;
+  height: 3.15rem;
+}
 .security-page-tabs a {
   transition: background-color 0.15s ease-in-out, color 0.15s ease-in-out;
 }
@@ -369,9 +513,20 @@
 
 <nav class="navbar" role="navigation" aria-label="main navigation">
   <div class="navbar-brand">
-    <a class="navbar-item" href="{% url 'home' %}">
-      <img src="{% static 'logo.svg' %}" width="112" height="28" alt="logo">
-    </a>
+    <div class="navbar-item">
+      <span class="gia-brand-shell">
+        {% if user.is_authenticated %}
+        <button class="button is-light theme-toggle-button gia-brand-logo brand-theme-toggle js-theme-toggle" type="button" data-theme-mode="light" aria-label="Theme toggle">
+          <img class="brand-theme-logo js-theme-logo" src="{% static 'favicon-light.svg' %}" alt="" aria-hidden="true">
+          <span class="is-sr-only" data-theme-label>Auto</span>
+        </button>
+        {% else %}
+        <a href="{% url 'home' %}" class="gia-brand-logo" aria-label="GIA home">
+          <img class="brand-theme-logo" src="{% static 'favicon-light.svg' %}" alt="logo">
+        </a>
+        {% endif %}
+      </span>
+    </div>
 
     <a role="button" class="navbar-burger" aria-label="menu" aria-expanded="false" data-target="bar">
       <span aria-hidden="true"></span>
@@ -473,6 +628,9 @@
   System
 </a>
 {% endif %}
+<a class="navbar-item{% if request.resolver_match.url_name == 'accessibility_settings' %} is-current-route{% endif %}" href="{% url 'accessibility_settings' %}">
+  Accessibility
+</a>
 <hr class="navbar-divider">
 <div class="navbar-item has-text-weight-semibold is-size-7 has-text-grey">
   Security
@@ -512,8 +670,8 @@
 <a class="navbar-item{% if request.resolver_match.url_name == 'translation_settings' %} is-current-route{% endif %}" href="{% url 'translation_settings' %}">
   Translation
 </a>
-<a class="navbar-item{% if request.resolver_match.url_name == 'availability_settings' %} is-current-route{% endif %}" href="{% url 'availability_settings' %}">
-  Availability
+<a class="navbar-item{% if request.resolver_match.url_name == 'availability_settings' or request.resolver_match.url_name == 'behavioral_signals_settings' %} is-current-route{% endif %}" href="{% url 'behavioral_signals_settings' %}">
+  Behavioral Signals
 </a>
 <hr class="navbar-divider">
 <div class="navbar-item has-text-weight-semibold is-size-7 has-text-grey">
@@ -525,10 +683,6 @@
 <a class="navbar-item{% if request.resolver_match.url_name == 'osint_workspace' %} is-current-route{% endif %}" href="{% url 'osint_workspace' %}">
   OSINT Workspace
 </a>
-<hr class="navbar-divider">
-<a class="navbar-item{% if request.resolver_match.url_name == 'accessibility_settings' %} is-current-route{% endif %}" href="{% url 'accessibility_settings' %}">
-  Accessibility
-</a>
 </div>
 </div>
 {% endif %}
@@ -3,7 +3,8 @@
 {% block content %}
 <section class="section">
   <div class="container">
-    <h1 class="title is-4">Availability Settings</h1>
+    <h1 class="title is-4">Behavioral Signals</h1>
+    <p class="subtitle is-6">Presence is only one slice. This page exposes the broader behavioral event surface used for timing, read, typing, and response analysis.</p>
     <form method="post" class="box">
       {% csrf_token %}
       <div class="columns is-multiline">
@@ -24,44 +25,92 @@
 </form>
+
 <div class="box">
-  <h2 class="title is-6">Availability Event Statistics Per Contact</h2>
+  <div class="level mb-3">
+    <div class="level-left">
+      <div>
+        <h2 class="title is-6 mb-1">Behavioral Event Statistics</h2>
+        <p class="help is-size-7">Primary source is `gia_events` in Manticore. When unavailable, this page falls back to Django `ConversationEvent` shadow rows.</p>
+      </div>
+    </div>
+    <div class="level-right">
+      <span class="tag is-light">Source: {{ behavioral_stats_source }}</span>
+    </div>
+  </div>
+
+  <div class="tags mb-4">
+    <span class="tag is-info is-light">Contacts: {{ behavioral_totals.contacts }}</span>
+    <span class="tag is-light">Events: {{ behavioral_totals.total_events }}</span>
+    <span class="tag is-light">Presence: {{ behavioral_totals.presence_events }}</span>
+    <span class="tag is-light">Read: {{ behavioral_totals.read_events }}</span>
+    <span class="tag is-light">Typing: {{ behavioral_totals.typing_events }}</span>
+    <span class="tag is-light">Messages: {{ behavioral_totals.message_events }}</span>
+    <span class="tag is-light">Abandoned: {{ behavioral_totals.abandoned_events }}</span>
+  </div>
+
+  <h3 class="title is-7">By Transport</h3>
+  <table class="table is-fullwidth is-striped is-size-7 mb-5">
+    <thead>
+      <tr>
+        <th>Service</th>
+        <th>Contacts</th>
+        <th>Total</th>
+        <th>Presence</th>
+        <th>Read</th>
+        <th>Typing</th>
+        <th>Messages</th>
+        <th>Abandoned</th>
+        <th>Last Event TS</th>
+      </tr>
+    </thead>
+    <tbody>
+      {% for row in transport_stats %}
+      <tr>
+        <td>{{ row.service }}</td>
+        <td>{{ row.contacts }}</td>
+        <td>{{ row.total_events }}</td>
+        <td>{{ row.presence_events }}</td>
+        <td>{{ row.read_events }}</td>
+        <td>{{ row.typing_events }}</td>
+        <td>{{ row.message_events }}</td>
+        <td>{{ row.abandoned_events }}</td>
+        <td>{{ row.last_event_ts }}</td>
+      </tr>
+      {% empty %}
+      <tr><td colspan="9">No behavioral transport summaries available.</td></tr>
+      {% endfor %}
+    </tbody>
+  </table>
+
+  <h3 class="title is-7">By Contact</h3>
   <table class="table is-fullwidth is-striped is-size-7">
     <thead>
       <tr>
         <th>Contact</th>
         <th>Service</th>
         <th>Total</th>
-        <th>Available</th>
+        <th>Presence</th>
-        <th>Fading</th>
-        <th>Unavailable</th>
-        <th>Unknown</th>
-        <th>Native</th>
         <th>Read</th>
         <th>Typing</th>
-        <th>Msg Activity</th>
+        <th>Messages</th>
-        <th>Timeout</th>
+        <th>Abandoned</th>
         <th>Last Event TS</th>
       </tr>
     </thead>
     <tbody>
-      {% for row in contact_stats %}
+      {% for row in behavioral_stats %}
       <tr>
-        <td>{{ row.person__name }}</td>
+        <td>{{ row.person_name }}</td>
         <td>{{ row.service }}</td>
         <td>{{ row.total_events }}</td>
-        <td>{{ row.available_events }}</td>
+        <td>{{ row.presence_events }}</td>
-        <td>{{ row.fading_events }}</td>
+        <td>{{ row.read_events }}</td>
-        <td>{{ row.unavailable_events }}</td>
-        <td>{{ row.unknown_events }}</td>
-        <td>{{ row.native_presence_events }}</td>
-        <td>{{ row.read_receipt_events }}</td>
         <td>{{ row.typing_events }}</td>
-        <td>{{ row.message_activity_events }}</td>
+        <td>{{ row.message_events }}</td>
-        <td>{{ row.inferred_timeout_events }}</td>
+        <td>{{ row.abandoned_events }}</td>
         <td>{{ row.last_event_ts }}</td>
       </tr>
       {% empty %}
-      <tr><td colspan="13">No availability events found.</td></tr>
+      <tr><td colspan="9">No behavioral events found in either Manticore or ConversationEvent shadow rows.</td></tr>
       {% endfor %}
     </tbody>
   </table>
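The help text and source tag in the hunk above describe a primary/fallback split: Manticore's `gia_events` table is queried first, and Django `ConversationEvent` shadow rows are used when it yields nothing. A minimal sketch of that selection logic, with hypothetical function names (the source labels `"manticore"` and `"conversation_event_shadow"` are the ones the tests in this commit assert on):

```python
# Illustrative sketch of the Manticore-primary / Django-shadow fallback the
# template describes. fetch_manticore / fetch_shadow are hypothetical callables
# standing in for the real query helpers, which this diff does not show.
def load_behavioral_stats(fetch_manticore, fetch_shadow):
    rows = []
    try:
        rows = fetch_manticore()  # query gia_events via Manticore HTTP
    except Exception:
        rows = []  # treat an unreachable Manticore the same as an empty result
    if rows:
        return rows, "manticore"
    # fall back to ConversationEvent shadow rows kept in Django
    return fetch_shadow(), "conversation_event_shadow"
```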
@@ -24,6 +24,8 @@
 <span class="tag is-light">Messages: {{ counts.messages }}</span>
 <span class="tag is-light">Queued: {{ counts.queued_messages }}</span>
 <span class="tag is-light">Events: {{ counts.message_events }}</span>
+<span class="tag is-info is-light">Behavioral Events: {{ counts.behavioral_event_rows }}</span>
+<span class="tag is-light">Event Shadow Rows: {{ counts.conversation_event_shadow_rows }}</span>
 <span class="tag is-light">Workspace: {{ counts.workspace_conversations }}</span>
 <span class="tag is-light">Snapshots: {{ counts.workspace_snapshots }}</span>
 <span class="tag is-light">AI Requests: {{ counts.ai_requests }}</span>
@@ -71,7 +73,7 @@
 <div class="control">
   <input class="input is-small" name="minutes" value="120" type="number" min="1" max="10080">
 </div>
-<p class="help is-size-7" title="Use this to check that recent traffic is actually writing canonical events.">Checks whether recent actions were written to `ConversationEvent`.</p>
+<p class="help is-size-7" title="Use this to check that recent traffic is actually writing canonical events.">Checks whether recent actions were written to the behavioral event ledger in Manticore, with Django shadow rows as fallback context.</p>
 </div>
 <div class="field">
 <div class="control">
@@ -1,14 +1,11 @@
 from __future__ import annotations
 
+from unittest.mock import patch
+
 from django.test import TestCase
 from django.urls import reverse
 
-from core.models import (
-    ContactAvailabilityEvent,
-    ContactAvailabilitySettings,
-    Person,
-    User,
-)
+from core.models import ContactAvailabilitySettings, ChatSession, ConversationEvent, Person, PersonIdentifier, User
 
 
 class AvailabilitySettingsPageTests(TestCase):
@@ -17,13 +14,13 @@ class AvailabilitySettingsPageTests(TestCase):
         self.client.force_login(self.user)
 
     def test_get_page_renders(self):
-        response = self.client.get(reverse("availability_settings"))
+        response = self.client.get(reverse("behavioral_signals_settings"))
         self.assertEqual(200, response.status_code)
-        self.assertContains(response, "Availability Settings")
+        self.assertContains(response, "Behavioral Signals")
 
     def test_post_updates_settings(self):
         response = self.client.post(
-            reverse("availability_settings"),
+            reverse("behavioral_signals_settings"),
             {
                 "enabled": "1",
                 "show_in_chat": "1",
@@ -42,35 +39,70 @@ class AvailabilitySettingsPageTests(TestCase):
         self.assertEqual(120, row.retention_days)
         self.assertEqual(300, row.fade_threshold_seconds)
 
-    def test_contact_event_stats_are_aggregated(self):
-        person = Person.objects.create(user=self.user, name="Alice")
-        ContactAvailabilityEvent.objects.create(
-            user=self.user,
-            person=person,
-            service="whatsapp",
-            source_kind="message_in",
-            availability_state="available",
-            confidence=0.9,
-            ts=1000,
-            payload={},
-        )
-        ContactAvailabilityEvent.objects.create(
-            user=self.user,
-            person=person,
-            service="whatsapp",
-            source_kind="inferred_timeout",
-            availability_state="fading",
-            confidence=0.5,
-            ts=2000,
-            payload={},
-        )
-        response = self.client.get(reverse("availability_settings"))
-        self.assertEqual(200, response.status_code)
-        stats = list(response.context["contact_stats"])
-        self.assertEqual(1, len(stats))
-        self.assertEqual("Alice", stats[0]["person__name"])
-        self.assertEqual(2, stats[0]["total_events"])
-        self.assertEqual(1, stats[0]["available_events"])
-        self.assertEqual(1, stats[0]["fading_events"])
-        self.assertEqual(1, stats[0]["message_activity_events"])
-        self.assertEqual(1, stats[0]["inferred_timeout_events"])
+    @patch("core.views.availability.get_behavioral_availability_stats")
+    def test_behavioral_manticore_stats_are_in_context(self, mocked_stats):
+        person = Person.objects.create(user=self.user, name="Alice")
+        mocked_stats.return_value = [
+            {
+                "person_id": str(person.id),
+                "transport": "whatsapp",
+                "total_events": 9,
+                "presence_events": 2,
+                "read_events": 3,
+                "typing_events": 2,
+                "message_events": 1,
+                "abandoned_events": 1,
+                "last_event_ts": 5555,
+            }
+        ]
+
+        response = self.client.get(reverse("behavioral_signals_settings"))
+
+        self.assertEqual(200, response.status_code)
+        stats = list(response.context["behavioral_stats"])
+        self.assertEqual(1, len(stats))
+        self.assertEqual("Alice", stats[0]["person_name"])
+        self.assertEqual(1, stats[0]["abandoned_events"])
+        self.assertEqual("manticore", response.context["behavioral_stats_source"])
+        self.assertContains(response, "Behavioral Event Statistics")
+        self.assertNotIn("contact_stats", response.context)
+        self.assertNotIn("parity_rows", response.context)
+
+    @patch("core.views.availability.get_behavioral_availability_stats")
+    def test_behavioral_stats_fallback_to_conversation_event_shadow(self, mocked_stats):
+        person = Person.objects.create(user=self.user, name="Alice")
+        identifier = PersonIdentifier.objects.create(
+            user=self.user,
+            person=person,
+            service="signal",
+            identifier="+15551230000",
+        )
+        session = ChatSession.objects.create(user=self.user, identifier=identifier)
+        ConversationEvent.objects.create(
+            user=self.user,
+            session=session,
+            ts=1234,
+            event_type="presence_available",
+            direction="system",
+            origin_transport="signal",
+        )
+        mocked_stats.return_value = []
+
+        response = self.client.get(reverse("behavioral_signals_settings"))
+
+        self.assertEqual(200, response.status_code)
+        stats = list(response.context["behavioral_stats"])
+        self.assertEqual(1, len(stats))
+        self.assertEqual("Alice", stats[0]["person_name"])
+        self.assertEqual(1, int(stats[0]["presence_events"]))
+        self.assertEqual(
+            "conversation_event_shadow", response.context["behavioral_stats_source"]
+        )
+
+    def test_legacy_availability_url_redirects(self):
+        response = self.client.get(reverse("availability_settings"))
+        self.assertEqual(302, response.status_code)
+        self.assertIn(
+            reverse("behavioral_signals_settings"),
+            str(response.headers.get("Location") or ""),
+        )
@@ -1,16 +1,11 @@
 from __future__ import annotations
 
+from unittest.mock import patch
+
 from django.core.management import call_command
 from django.test import TestCase
 
-from core.models import (
-    ChatSession,
-    ContactAvailabilityEvent,
-    Message,
-    Person,
-    PersonIdentifier,
-    User,
-)
+from core.models import ChatSession, Message, Person, PersonIdentifier, User
 from core.presence.inference import now_ms
@@ -30,7 +25,8 @@ class BackfillContactAvailabilityCommandTests(TestCase):
             user=self.user, identifier=self.identifier
         )
 
-    def test_backfill_creates_message_and_read_receipt_availability_events(self):
+    @patch("core.management.commands.backfill_contact_availability.append_event_sync")
+    def test_backfill_replays_message_and_read_receipt_events(self, mocked_append):
         base_ts = now_ms()
         Message.objects.create(
             user=self.user,
@@ -61,12 +57,5 @@
             "100",
         )
 
-        events = list(
-            ContactAvailabilityEvent.objects.filter(user=self.user).order_by(
-                "ts", "source_kind"
-            )
-        )
-        self.assertEqual(3, len(events))
-        self.assertTrue(any(row.source_kind == "message_in" for row in events))
-        self.assertTrue(any(row.source_kind == "message_out" for row in events))
-        self.assertTrue(any(row.source_kind == "read_receipt" for row in events))
+        event_types = [call.kwargs["event_type"] for call in mocked_append.call_args_list]
+        self.assertEqual(["message_created", "message_created", "read_receipt"], event_types)
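The rewritten assertion above pins the replay order: two `message_created` events followed by one `read_receipt`. A minimal sketch of a replay loop that would produce that call sequence; the row shapes are simplified and `append_event_sync` here is any callable with the same keyword interface, not the repository's actual command internals:

```python
# Illustrative replay loop matching the assertion order in the backfill test:
# every stored message is emitted as a message_created event, then each read
# receipt as a read_receipt event. Row shapes are simplified assumptions.
def replay_events(messages, read_receipts, append_event_sync):
    for message in messages:
        append_event_sync(event_type="message_created", ts=message["ts"])
    for receipt in read_receipts:
        append_event_sync(event_type="read_receipt", ts=receipt["ts"])
```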
161
core/tests/test_behavioral_event_platform.py
Normal file
161
core/tests/test_behavioral_event_platform.py
Normal file
@@ -0,0 +1,161 @@
|
|||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from io import StringIO
|
||||||
|
from unittest.mock import Mock, patch
|
||||||
|
|
||||||
|
from asgiref.sync import async_to_sync
|
||||||
|
from django.core.management import call_command
|
||||||
|
from django.test import TestCase, override_settings
|
||||||
|
|
||||||
|
from core.events.ledger import append_event_sync
|
||||||
|
from core.events.manticore import ManticoreEventLedgerBackend
|
||||||
|
from core.messaging import history
|
||||||
|
from core.models import (
|
||||||
|
ChatSession,
|
||||||
|
ConversationEvent,
|
||||||
|
Message,
|
||||||
|
Person,
|
||||||
|
PersonIdentifier,
|
||||||
|
User,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@override_settings(
|
||||||
|
EVENT_LEDGER_DUAL_WRITE=True,
|
||||||
|
MANTICORE_HTTP_URL="http://manticore.test:9308",
|
||||||
|
MANTICORE_EVENT_TABLE="gia_events_test",
|
||||||
|
)
|
||||||
|
class BehavioralEventPlatformTests(TestCase):
|
||||||
|
def setUp(self):
|
||||||
|
ManticoreEventLedgerBackend._table_ready_cache.clear()
|
||||||
|
self.user = User.objects.create_user(
|
||||||
|
username="behavior-user",
|
||||||
|
email="behavior@example.com",
|
||||||
|
password="x",
|
||||||
|
)
|
||||||
|
self.person = Person.objects.create(user=self.user, name="Behavior Person")
|
||||||
|
self.identifier = PersonIdentifier.objects.create(
|
||||||
|
user=self.user,
|
||||||
|
person=self.person,
|
||||||
|
service="whatsapp",
|
||||||
|
identifier="15550001234",
|
||||||
|
)
|
||||||
|
self.session = ChatSession.objects.create(
|
||||||
|
user=self.user,
|
||||||
|
identifier=self.identifier,
|
||||||
|
)
|
||||||
|
|
||||||
|
@patch("core.events.manticore.requests.post")
|
||||||
|
def test_append_event_dual_writes_to_manticore_and_django(self, mocked_post):
|
||||||
|
mocked_response = Mock()
|
||||||
|
mocked_response.json.side_effect = [{}, {}]
|
||||||
|
mocked_response.raise_for_status.return_value = None
|
||||||
|
mocked_post.return_value = mocked_response
|
||||||
|
|
||||||
|
row = append_event_sync(
|
||||||
|
user=self.user,
|
||||||
|
session=self.session,
|
||||||
|
ts=1700000000000,
|
||||||
|
event_type="delivery_receipt",
|
||||||
|
direction="system",
|
||||||
|
actor_identifier="15550001234",
|
||||||
|
origin_transport="whatsapp",
|
||||||
|
origin_message_id="wamid.001",
|
||||||
|
payload={"message_ts": 1699999999000},
|
||||||
|
)
|
||||||
|
|
||||||
|
self.assertIsNotNone(row)
|
||||||
|
self.assertEqual(1, ConversationEvent.objects.count())
|
        self.assertEqual(2, mocked_post.call_count)
        replace_query = mocked_post.call_args_list[-1].kwargs["data"]["query"]
        self.assertIn("REPLACE INTO gia_events_test", replace_query)
        self.assertIn("message_delivered", replace_query)

    @override_settings(
        EVENT_LEDGER_DUAL_WRITE=False,
        EVENT_PRIMARY_WRITE_PATH=True,
        MANTICORE_HTTP_URL="http://manticore.test:9308",
        MANTICORE_EVENT_TABLE="gia_events_primary",
    )
    @patch("core.events.manticore.requests.post")
    def test_primary_write_path_skips_django_rows(self, mocked_post):
        mocked_response = Mock()
        mocked_response.json.side_effect = [{}, {}]
        mocked_response.raise_for_status.return_value = None
        mocked_post.return_value = mocked_response

        row = append_event_sync(
            user=self.user,
            session=self.session,
            ts=1700000000100,
            event_type="message_created",
            direction="out",
            origin_transport="signal",
            origin_message_id="1700000000100",
        )

        self.assertIsNone(row)
        self.assertEqual(0, ConversationEvent.objects.count())
        self.assertEqual(2, mocked_post.call_count)

    @patch("core.events.manticore.requests.post")
    def test_delivery_receipts_write_delivery_event_type(self, mocked_post):
        mocked_response = Mock()
        mocked_response.json.side_effect = [{}, {}]
        mocked_response.raise_for_status.return_value = None
        mocked_post.return_value = mocked_response

        message = Message.objects.create(
            user=self.user,
            session=self.session,
            ts=1700000000200,
            sender_uuid="BOT",
            text="hello",
            custom_author="BOT",
            source_service="whatsapp",
            source_message_id="wamid.002",
            source_chat_id="15550001234",
        )

        updated = async_to_sync(history.apply_read_receipts)(
            self.user,
            self.identifier,
            [message.ts],
            read_ts=1700000000300,
            source_service="whatsapp",
            read_by_identifier="15550001234",
            payload={"type": "delivered"},
            receipt_event_type="delivery_receipt",
        )

        self.assertEqual(1, updated)
        event = ConversationEvent.objects.get()
        self.assertEqual("delivery_receipt", event.event_type)
        self.assertEqual("delivery_receipt", event.payload.get("receipt_event_type"))
        message.refresh_from_db()
        self.assertEqual(1700000000300, message.delivered_ts)
        self.assertIsNone(message.read_ts)

    @patch("core.management.commands.manticore_backfill.upsert_conversation_event")
    def test_backfill_command_replays_conversation_events(self, mocked_upsert):
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=1700000000400,
            event_type="read_receipt",
            direction="system",
            origin_transport="signal",
            origin_message_id="msg-123",
        )
        out = StringIO()

        call_command(
            "manticore_backfill",
            "--from-conversation-events",
            "--user-id",
            str(self.user.id),
            stdout=out,
        )

        self.assertEqual(1, mocked_upsert.call_count)
        self.assertIn("manticore-backfill scanned=1 indexed=1", out.getvalue())
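These tests script successive Manticore HTTP responses through `unittest.mock`'s `side_effect`; a standalone sketch of just the mocking mechanics (no GIA code involved, payload values are illustrative only):

```python
from unittest.mock import Mock

# A single Mock response whose .json() yields scripted payloads in order,
# mirroring how the tests above script two Manticore round-trips per write.
mocked_response = Mock()
mocked_response.json.side_effect = [{}, {"hits": {"total": 1}}]
mocked_response.raise_for_status.return_value = None

# Each .json() call consumes the next item in the side_effect list.
assert mocked_response.json() == {}
assert mocked_response.json() == {"hits": {"total": 1}}
assert mocked_response.json.call_count == 2
```

Because the same `Mock` object is returned for every `requests.post` call, asserting on `call_count` and `call_args_list` afterwards recovers the full write sequence.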
@@ -1,4 +1,5 @@
 from io import StringIO
+from unittest.mock import patch
 
 from django.core.management import call_command
 from django.core.management.base import CommandError
@@ -89,3 +90,29 @@ class EventLedgerSmokeCommandTests(TestCase):
             fail_if_empty=True,
             stdout=StringIO(),
         )
+
+    @patch("core.management.commands.event_ledger_smoke.get_recent_event_rows")
+    def test_smoke_command_falls_back_to_manticore_rows(self, mocked_rows):
+        mocked_rows.return_value = [
+            {
+                "id": "",
+                "user_id": int(self.user.id),
+                "session_id": str(self.session.id),
+                "ts": 1770000000002,
+                "event_type": "message_created",
+                "kind": "message_sent",
+                "direction": "in",
+                "origin_transport": "signal",
+                "trace_id": "",
+            }
+        ]
+        out = StringIO()
+        call_command(
+            "event_ledger_smoke",
+            user_id=str(self.user.id),
+            minutes=999999,
+            stdout=out,
+        )
+        rendered = out.getvalue()
+        self.assertIn("source=manticore", rendered)
+        self.assertIn("event_type_counts=", rendered)
@@ -1,5 +1,6 @@
 import time
 from io import StringIO
+from unittest.mock import patch
 
 from django.core.management import call_command
 from django.test import TestCase, override_settings
@@ -105,6 +106,38 @@ class EventProjectionShadowTests(TestCase):
             1,
         )
 
+    @patch("core.events.projection.get_session_event_rows")
+    def test_shadow_compare_can_project_from_manticore_rows(self, mocked_rows):
+        message = Message.objects.create(
+            user=self.user,
+            session=self.session,
+            ts=1700000000000,
+            sender_uuid="+15555550333",
+            text="hello",
+            delivered_ts=1700000000000,
+            read_ts=1700000000500,
+        )
+        mocked_rows.return_value = [
+            {
+                "ts": 1700000000000,
+                "event_type": "message_created",
+                "origin_transport": "signal",
+                "origin_message_id": str(message.id),
+                "payload": {"message_id": str(message.id), "text": "hello"},
+            },
+            {
+                "ts": 1700000000500,
+                "event_type": "read_receipt",
+                "origin_transport": "signal",
+                "origin_message_id": str(message.id),
+                "payload": {"message_id": str(message.id), "read_ts": 1700000000500},
+            },
+        ]
+
+        compared = shadow_compare_session(self.session, detail_limit=10)
+
+        self.assertEqual(0, compared["counters"]["missing_in_projection"])
+
     def test_management_command_emits_summary(self):
         out = StringIO()
         call_command(
219 core/tests/test_gia_analysis.py Normal file
@@ -0,0 +1,219 @@
from __future__ import annotations

import time
from io import StringIO
from unittest.mock import patch

from django.core.management import call_command
from django.test import SimpleTestCase, TestCase, override_settings

from core.events.behavior import ComposingTracker, summarize_metrics
from core.events.manticore import ManticoreEventLedgerBackend
from core.models import ChatSession, ConversationEvent, Person, PersonIdentifier, User


class ComposingTrackerTests(SimpleTestCase):
    def test_tracker_emits_abandoned_after_window(self):
        tracker = ComposingTracker(window_ms=300000)
        tracker.observe_started("session-1", 1000)
        abandoned = tracker.observe_stopped("session-1", 301500)
        self.assertIsNotNone(abandoned)
        self.assertEqual(300500, abandoned["duration_ms"])
        self.assertTrue(abandoned["abandoned"])


class BehavioralMetricSummaryTests(SimpleTestCase):
    def test_summarize_metrics_computes_core_intervals(self):
        rows = [
            {
                "session_id": "s1",
                "kind": "message_delivered",
                "ts": 1000,
                "payload": {"message_id": "m1"},
            },
            {
                "session_id": "s1",
                "kind": "message_read",
                "ts": 1600,
                "payload": {"message_id": "m1"},
            },
            {"session_id": "s1", "kind": "presence_available", "ts": 2000},
            {"session_id": "s1", "kind": "composing_started", "ts": 2100},
            {"session_id": "s1", "kind": "composing_started", "ts": 2200},
            {"session_id": "s1", "kind": "message_sent", "ts": 2600},
            {"session_id": "s2", "kind": "composing_started", "ts": 4000},
            {"session_id": "s2", "kind": "composing_abandoned", "ts": 710000},
        ]

        metrics = summarize_metrics(rows, rows)

        self.assertEqual(600, metrics["delay_b"]["value_ms"])
        self.assertEqual(500, metrics["delay_c"]["value_ms"])
        self.assertEqual(150, metrics["delay_f"]["value_ms"])
        self.assertEqual(2, metrics["revision"]["value_ms"])
        self.assertEqual(333, metrics["abandoned_rate"]["value_ms"])


@override_settings(
    EVENT_LEDGER_DUAL_WRITE=True,
    MANTICORE_HTTP_URL="http://manticore.test:9308",
    MANTICORE_EVENT_TABLE="gia_events_test",
    MANTICORE_METRIC_TABLE="gia_metrics_test",
)
class GiaAnalysisCommandTests(TestCase):
    def setUp(self):
        ManticoreEventLedgerBackend._table_ready_cache.clear()
        self.user = User.objects.create_user(
            username="analysis-user",
            email="analysis@example.com",
            password="x",
        )
        self.person = Person.objects.create(user=self.user, name="Analysis Person")
        self.identifier = PersonIdentifier.objects.create(
            user=self.user,
            person=self.person,
            service="whatsapp",
            identifier="15550009999",
        )
        self.session = ChatSession.objects.create(
            user=self.user,
            identifier=self.identifier,
        )

    @patch("core.events.manticore.requests.post")
    def test_metrics_table_upsert_and_analysis_command(self, mocked_post):
        now_ms = int(time.time() * 1000)
        mocked_response = type(
            "Response",
            (),
            {
                "raise_for_status": lambda self: None,
                "json": lambda self: {},
            },
        )
        mocked_post.return_value = mocked_response()

        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=now_ms - 5000,
            event_type="presence_available",
            direction="system",
            origin_transport="whatsapp",
            payload={},
        )
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=now_ms - 4500,
            event_type="typing_started",
            direction="in",
            origin_transport="whatsapp",
            payload={},
        )
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=now_ms - 3900,
            event_type="message_created",
            direction="in",
            origin_transport="whatsapp",
            payload={"message_id": "m1"},
        )
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=now_ms - 3600,
            event_type="delivery_receipt",
            direction="system",
            origin_transport="whatsapp",
            payload={"message_id": "m1"},
        )
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=now_ms - 3000,
            event_type="read_receipt",
            direction="system",
            origin_transport="whatsapp",
            payload={"message_id": "m1"},
        )

        out = StringIO()
        with patch(
            "core.management.commands.gia_analysis.get_event_ledger_backend"
        ) as mocked_backend:
            backend = mocked_backend.return_value
            backend.list_event_targets.return_value = [
                {"user_id": self.user.id, "person_id": str(self.person.id)}
            ]
            backend.fetch_events.return_value = [
                {
                    "user_id": self.user.id,
                    "person_id": str(self.person.id),
                    "session_id": str(self.session.id),
                    "kind": "presence_available",
                    "direction": "system",
                    "ts": now_ms - 5000,
                    "payload": {},
                },
                {
                    "user_id": self.user.id,
                    "person_id": str(self.person.id),
                    "session_id": str(self.session.id),
                    "kind": "composing_started",
                    "direction": "in",
                    "ts": now_ms - 4500,
                    "payload": {},
                },
                {
                    "user_id": self.user.id,
                    "person_id": str(self.person.id),
                    "session_id": str(self.session.id),
                    "kind": "message_sent",
                    "direction": "in",
                    "ts": now_ms - 3900,
                    "payload": {"message_id": "m1"},
                },
                {
                    "user_id": self.user.id,
                    "person_id": str(self.person.id),
                    "session_id": str(self.session.id),
                    "kind": "message_delivered",
                    "direction": "system",
                    "ts": now_ms - 3600,
                    "payload": {"message_id": "m1"},
                },
                {
                    "user_id": self.user.id,
                    "person_id": str(self.person.id),
                    "session_id": str(self.session.id),
                    "kind": "message_read",
                    "direction": "system",
                    "ts": now_ms - 3000,
                    "payload": {"message_id": "m1"},
                },
            ]

            call_command("gia_analysis", "--once", "--user-id", str(self.user.id), stdout=out)

        self.assertGreaterEqual(backend.upsert_metric.call_count, 3)
        self.assertIn("gia-analysis wrote=", out.getvalue())

    @patch("core.events.manticore.requests.post")
    def test_list_event_targets_uses_group_by_query(self, mocked_post):
        mocked_response = type(
            "Response",
            (),
            {
                "raise_for_status": lambda self: None,
                "json": lambda self: {"data": []},
            },
        )
        mocked_post.return_value = mocked_response()

        backend = ManticoreEventLedgerBackend()
        backend.list_event_targets(user_id=1)

        query = mocked_post.call_args.kwargs["data"]["query"]
        self.assertIn("GROUP BY user_id, person_id", query)
@@ -2,21 +2,13 @@ from __future__ import annotations
 
 from django.test import TestCase
 
-from core.models import (
-    ContactAvailabilityEvent,
-    ContactAvailabilitySettings,
-    ContactAvailabilitySpan,
-    Person,
-    PersonIdentifier,
-    User,
-)
+from core.models import ContactAvailabilitySettings, Person, PersonIdentifier, User
 from core.presence.engine import (
     AvailabilitySignal,
     ensure_fading_state,
     record_inferred_signal,
     record_native_signal,
 )
-from core.presence.inference import now_ms
 
 
 class PresenceEngineTests(TestCase):
@@ -31,118 +23,45 @@ class PresenceEngineTests(TestCase):
             service="signal",
             identifier="+15550001111",
         )
-        ContactAvailabilitySettings.objects.update_or_create(
-            user=self.user,
-            defaults={
-                "enabled": True,
-                "show_in_chat": True,
-                "show_in_groups": True,
-                "inference_enabled": True,
-                "retention_days": 90,
-                "fade_threshold_seconds": 1,
-            },
-        )
 
-    def test_read_receipt_signal_creates_available_event(self):
-        ts = now_ms()
-        event = record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="signal",
-                source_kind="read_receipt",
-                availability_state="available",
-                confidence=0.95,
-                ts=ts,
-                payload={"origin": "test"},
-            )
-        )
-        self.assertIsNotNone(event)
-        self.assertEqual(
-            1, ContactAvailabilityEvent.objects.filter(user=self.user).count()
-        )
-        self.assertEqual("available", event.availability_state)
-
-    def test_inactivity_transitions_to_fading(self):
-        base_ts = now_ms()
-        record_inferred_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="signal",
-                source_kind="read_receipt",
-                availability_state="available",
-                confidence=0.95,
-                ts=base_ts,
-            )
-        )
-        fade_event = ensure_fading_state(
+    def test_record_native_signal_is_a_compatibility_noop(self):
+        signal = AvailabilitySignal(
             user=self.user,
             person=self.person,
             person_identifier=self.identifier,
             service="signal",
-            at_ts=base_ts + 10_000,
+            source_kind="read_receipt",
+            availability_state="available",
+            confidence=0.95,
+            ts=1234,
+            payload={"origin": "test"},
         )
-        self.assertIsNotNone(fade_event)
-        self.assertEqual("fading", fade_event.availability_state)
+        result = record_native_signal(signal)
+        self.assertIs(result, signal)
 
-    def test_explicit_unavailable_blocks_fade_inference(self):
-        base_ts = now_ms()
-        record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="xmpp",
-                source_kind="native_presence",
-                availability_state="unavailable",
-                confidence=1.0,
-                ts=base_ts,
-            )
-        )
-        fade_event = ensure_fading_state(
-            user=self.user,
-            person=self.person,
-            person_identifier=self.identifier,
-            service="xmpp",
-            at_ts=base_ts + 60_000,
-        )
-        self.assertIsNone(fade_event)
-        self.assertEqual(
-            1, ContactAvailabilityEvent.objects.filter(user=self.user).count()
-        )
+    def test_record_inferred_signal_respects_settings(self):
+        ContactAvailabilitySettings.objects.update_or_create(
+            user=self.user,
+            defaults={"enabled": True, "inference_enabled": False},
+        )
+        signal = AvailabilitySignal(
+            user=self.user,
+            person=self.person,
+            person_identifier=self.identifier,
+            service="signal",
+            source_kind="typing_stop",
+            availability_state="fading",
+            ts=1234,
+        )
+        self.assertIsNone(record_inferred_signal(signal))
 
-    def test_adjacent_same_state_events_extend_single_span(self):
-        ts0 = now_ms()
-        record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="signal",
-                source_kind="typing_start",
-                availability_state="available",
-                confidence=0.9,
-                ts=ts0,
-            )
-        )
-        record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="signal",
-                source_kind="message_in",
-                availability_state="available",
-                confidence=0.8,
-                ts=ts0 + 5_000,
-            )
-        )
-        spans = list(
-            ContactAvailabilitySpan.objects.filter(user=self.user).order_by("start_ts")
-        )
-        self.assertEqual(1, len(spans))
-        self.assertEqual(ts0, spans[0].start_ts)
-        self.assertEqual(ts0 + 5_000, spans[0].end_ts)
+    def test_ensure_fading_state_no_longer_persists_shadow_rows(self):
+        self.assertIsNone(
+            ensure_fading_state(
+                user=self.user,
+                person=self.person,
+                person_identifier=self.identifier,
+                service="signal",
+                at_ts=1234,
+            )
+        )
@@ -1,14 +1,11 @@
 from __future__ import annotations
 
 from django.test import TestCase
+from unittest.mock import patch
 
 from core.clients import transport
-from core.models import Person, PersonIdentifier, PlatformChatLink, User
+from core.models import ChatSession, ConversationEvent, Person, PersonIdentifier, PlatformChatLink, User
-from core.presence import (
-    AvailabilitySignal,
-    latest_state_for_people,
-    record_native_signal,
-)
+from core.presence import latest_state_for_people
 from core.presence.inference import now_ms
 from core.views.compose import (
     _compose_availability_payload,
@@ -28,19 +25,16 @@ class PresenceQueryAndComposeContextTests(TestCase):
             identifier="15551234567",
         )
 
-    def test_latest_state_map_uses_string_person_keys(self):
-        record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="signal",
-                source_kind="message_in",
-                availability_state="available",
-                confidence=0.8,
-                ts=now_ms(),
-            )
-        )
+    @patch("core.presence.query.get_behavioral_latest_states")
+    def test_latest_state_map_uses_string_person_keys(self, mocked_states):
+        mocked_states.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "presence_available",
+                "ts": now_ms(),
+            }
+        ]
         state_map = latest_state_for_people(
             user=self.user,
             person_ids=[str(self.person.id)],
@@ -48,6 +42,29 @@ class PresenceQueryAndComposeContextTests(TestCase):
         )
         self.assertIn(str(self.person.id), state_map)
 
+    @patch("core.presence.query.get_behavioral_latest_states")
+    def test_latest_state_map_falls_back_to_conversation_event_shadow(
+        self, mocked_states
+    ):
+        session = ChatSession.objects.create(user=self.user, identifier=self.identifier)
+        ConversationEvent.objects.create(
+            user=self.user,
+            session=session,
+            ts=now_ms(),
+            event_type="presence_available",
+            direction="system",
+            origin_transport="signal",
+        )
+        mocked_states.return_value = []
+
+        state_map = latest_state_for_people(
+            user=self.user,
+            person_ids=[str(self.person.id)],
+            service="signal",
+        )
+
+        self.assertEqual("available", str(state_map[str(self.person.id)]["state"]))
+
     def test_context_base_matches_signal_identifier_variants(self):
         base = _context_base(
             user=self.user,
@@ -58,20 +75,29 @@ class PresenceQueryAndComposeContextTests(TestCase):
         self.assertIsNotNone(base["person_identifier"])
         self.assertEqual(str(self.person.id), str(base["person"].id))
 
-    def test_compose_availability_payload_falls_back_to_cross_service(self):
+    @patch("core.presence.query.get_behavioral_latest_states")
+    @patch("core.presence.query.get_behavioral_events_for_range")
+    def test_compose_availability_payload_falls_back_to_cross_service(
+        self, mocked_events, mocked_states
+    ):
         ts_value = now_ms()
-        record_native_signal(
-            AvailabilitySignal(
-                user=self.user,
-                person=self.person,
-                person_identifier=self.identifier,
-                service="whatsapp",
-                source_kind="message_in",
-                availability_state="available",
-                confidence=0.9,
-                ts=ts_value,
-            )
-        )
+        mocked_events.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "whatsapp",
+                "kind": "presence_available",
+                "ts": ts_value,
+                "payload": {},
+            }
+        ]
+        mocked_states.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "whatsapp",
+                "kind": "presence_available",
+                "ts": ts_value,
+            }
+        ]
         enabled, slices, summary = _compose_availability_payload(
             user=self.user,
             person=self.person,
@@ -85,6 +111,132 @@ class PresenceQueryAndComposeContextTests(TestCase):
         self.assertEqual("available", str(summary.get("state")))
         self.assertTrue(bool(summary.get("is_cross_service")))
 
+    @patch("core.presence.query.get_behavioral_latest_states")
+    @patch("core.presence.query.get_behavioral_events_for_range")
+    def test_compose_availability_payload_can_fallback_to_manticore_behavioral(
+        self,
+        mocked_events,
+        mocked_states,
+    ):
+        ts_value = now_ms()
+        mocked_events.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "presence_available",
+                "ts": ts_value - 30000,
+                "payload": {},
+            },
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "composing_abandoned",
+                "ts": ts_value - 5000,
+                "payload": {},
+            },
+        ]
+        mocked_states.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "composing_abandoned",
+                "ts": ts_value - 5000,
+            }
+        ]
+
+        enabled, slices, summary = _compose_availability_payload(
+            user=self.user,
+            person=self.person,
+            service="signal",
+            range_start=ts_value - 60000,
+            range_end=ts_value + 60000,
+        )
+
+        self.assertTrue(enabled)
+        self.assertGreaterEqual(len(slices), 1)
+        self.assertEqual("signal", str(slices[0].get("service")))
+        self.assertEqual("fading", str(summary.get("state")))
+        self.assertEqual(
+            "behavioral:composing_abandoned",
+            str(summary.get("source_kind")),
+        )
+
+    @patch("core.presence.query.get_behavioral_latest_states")
+    @patch("core.presence.query.get_behavioral_events_for_range")
+    def test_compose_availability_payload_falls_back_to_conversation_event_shadow(
+        self,
+        mocked_events,
+        mocked_states,
+    ):
+        ts_value = now_ms()
+        session = ChatSession.objects.create(user=self.user, identifier=self.identifier)
+        ConversationEvent.objects.create(
+            user=self.user,
+            session=session,
+            ts=ts_value - 10000,
+            event_type="presence_available",
+            direction="system",
+            origin_transport="signal",
+        )
+        mocked_events.return_value = []
+        mocked_states.return_value = []
+
+        enabled, slices, summary = _compose_availability_payload(
+            user=self.user,
+            person=self.person,
+            service="signal",
+            range_start=ts_value - 60000,
+            range_end=ts_value + 60000,
+        )
+
+        self.assertTrue(enabled)
+        self.assertGreaterEqual(len(slices), 1)
+        self.assertEqual("signal", str(slices[0].get("service")))
+        self.assertEqual("available", str(summary.get("state")))
+
+    @patch("core.presence.query.get_behavioral_latest_states")
+    @patch("core.presence.query.get_behavioral_events_for_range")
+    def test_compose_availability_payload_prefers_manticore_over_django(
+        self,
+        mocked_events,
+        mocked_states,
+    ):
+        ts_value = now_ms()
+        mocked_events.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "composing_abandoned",
+                "ts": ts_value - 5000,
+                "payload": {},
+            }
+        ]
+        mocked_states.return_value = [
+            {
+                "person_id": str(self.person.id),
+                "transport": "signal",
+                "kind": "composing_abandoned",
+                "ts": ts_value - 5000,
+            }
+        ]
+
+        enabled, slices, summary = _compose_availability_payload(
+            user=self.user,
+            person=self.person,
+            service="signal",
+            range_start=ts_value - 60000,
+            range_end=ts_value + 60000,
+        )
+
+        self.assertTrue(enabled)
+        self.assertGreaterEqual(len(slices), 1)
+        self.assertEqual("fading", str(slices[0].get("state")))
+        self.assertEqual("fading", str(summary.get("state")))
+        self.assertEqual(
+            "behavioral:composing_abandoned",
+            str(summary.get("source_kind")),
+        )
+
     def test_context_base_preserves_native_signal_group_identifier(self):
         PlatformChatLink.objects.create(
             user=self.user,
@@ -104,6 +256,33 @@ class PresenceQueryAndComposeContextTests(TestCase):
         self.assertTrue(bool(base["is_group"]))
         self.assertEqual("signal-group-123", str(base["identifier"]))
 
+    def test_context_base_prefers_explicit_signal_group_over_xmpp_identifier_match(self):
+        PlatformChatLink.objects.create(
+            user=self.user,
+            service="signal",
+            chat_identifier="signal-group-123",
+            chat_name="Signal Group",
+            is_group=True,
+        )
+        xmpp_person = Person.objects.create(user=self.user, name="Bridge Alias")
+        PersonIdentifier.objects.create(
+            user=self.user,
+            person=xmpp_person,
+            service="xmpp",
+            identifier="signal-group-123",
+        )
+
+        base = _context_base(
+            user=self.user,
+            service="signal",
+            identifier="signal-group-123",
+            person=None,
+        )
+
+        self.assertEqual("signal", str(base["service"]))
+        self.assertTrue(bool(base["is_group"]))
+        self.assertIsNone(base["person_identifier"])
+
     def test_manual_contact_rows_include_signal_groups(self):
         PlatformChatLink.objects.create(
             user=self.user,
80 core/tests/test_prune_behavioral_orm_data.py Normal file
@@ -0,0 +1,80 @@
from __future__ import annotations

from io import StringIO

from django.core.management import call_command
from django.test import TestCase, override_settings

from core.models import ChatSession, ConversationEvent, Person, PersonIdentifier, User


@override_settings(CONVERSATION_EVENT_RETENTION_DAYS=90)
class PruneBehavioralOrmDataCommandTests(TestCase):
    def setUp(self):
        self.user = User.objects.create_user(
            username="prune-user",
            email="prune@example.com",
            password="x",
        )
        self.person = Person.objects.create(user=self.user, name="Prune Person")
        self.identifier = PersonIdentifier.objects.create(
            user=self.user,
            person=self.person,
            service="signal",
            identifier="+15555550111",
        )
        self.session = ChatSession.objects.create(
            user=self.user,
            identifier=self.identifier,
        )

    def test_prune_command_deletes_old_shadow_rows(self):
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=1000,
            event_type="message_created",
            direction="in",
            origin_transport="signal",
        )

        out = StringIO()
        call_command("prune_behavioral_orm_data", stdout=out)

        self.assertEqual(0, ConversationEvent.objects.count())
        self.assertIn("prune-behavioral-orm-data", out.getvalue())

    def test_prune_command_supports_dry_run(self):
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=1000,
            event_type="message_created",
            direction="in",
            origin_transport="signal",
        )

        out = StringIO()
        call_command("prune_behavioral_orm_data", "--dry-run", stdout=out)

        self.assertEqual(1, ConversationEvent.objects.count())
        self.assertIn("dry_run=True", out.getvalue())

    def test_prune_command_can_limit_tables(self):
        ConversationEvent.objects.create(
            user=self.user,
            session=self.session,
            ts=1000,
            event_type="message_created",
            direction="in",
            origin_transport="signal",
        )

        call_command(
            "prune_behavioral_orm_data",
            "--tables",
            "conversation_events",
            stdout=StringIO(),
        )

        self.assertEqual(0, ConversationEvent.objects.count())
@@ -47,6 +47,15 @@ class SettingsIntegrityTests(TestCase):
         self.assertIsNotNone(settings_nav)
         self.assertEqual("Modules", settings_nav["title"])
+
+    def test_behavioral_settings_receives_modules_settings_nav(self):
+        response = self.client.get(reverse("behavioral_signals_settings"))
+        self.assertEqual(200, response.status_code)
+        settings_nav = response.context.get("settings_nav")
+        self.assertIsNotNone(settings_nav)
+        self.assertEqual("Modules", settings_nav["title"])
+        labels = [str(item["label"]) for item in settings_nav["tabs"]]
+        self.assertIn("Behavioral Signals", labels)
 
     def test_tasks_settings_cross_links_commands_and_permissions(self):
         TaskProject.objects.create(user=self.user, name="Integrity Project")
         response = self.client.get(reverse("tasks_settings"))
@@ -1,5 +1,6 @@
 from django.test import TestCase
 from django.urls import reverse
+from unittest.mock import patch
 
 from core.models import (
     AIRequest,
@@ -63,6 +64,31 @@ class SystemDiagnosticsAPITests(TestCase):
         self.assertIn("missing_required_types", payload)
         self.assertIn("reaction_added", payload.get("missing_required_types") or [])
+
+    @patch("core.views.system.get_recent_event_rows")
+    def test_event_ledger_smoke_api_can_use_manticore_source(self, mocked_rows):
+        mocked_rows.return_value = [
+            {
+                "id": "",
+                "user_id": int(self.user.id),
+                "session_id": str(self.session.id),
+                "ts": 1700000000000,
+                "event_type": "message_created",
+                "kind": "message_sent",
+                "direction": "in",
+                "origin_transport": "signal",
+                "trace_id": "",
+            }
+        ]
+        response = self.client.get(
+            reverse("system_event_ledger_smoke"),
+            {"minutes": "999999", "service": "signal"},
+        )
+        self.assertEqual(200, response.status_code)
+        payload = response.json()
+        self.assertTrue(payload.get("ok"))
+        self.assertEqual("manticore", payload.get("data_source"))
+        self.assertEqual(1, int(payload.get("count") or 0))
 
     def test_trace_diagnostics_includes_projection_shadow_links(self):
         trace_id = "trace-system-diag-1"
         event = ConversationEvent.objects.create(
@@ -95,6 +121,44 @@ class SystemDiagnosticsAPITests(TestCase):
             str(events[0].get("projection_shadow_url") or ""),
         )
+
+    @patch("core.views.system.get_trace_event_rows")
+    def test_trace_diagnostics_can_use_manticore_source(self, mocked_rows):
+        mocked_rows.return_value = [
+            {
+                "id": "",
+                "ts": 1700000001000,
+                "event_type": "message_created",
+                "direction": "in",
+                "session_id": str(self.session.id),
+                "origin_transport": "signal",
+                "origin_message_id": "m2",
+                "payload": {"trace_id": "trace-system-diag-m"},
+            }
+        ]
+        response = self.client.get(
+            reverse("system_trace_diagnostics"),
+            {"trace_id": "trace-system-diag-m"},
+        )
+        self.assertEqual(200, response.status_code)
+        payload = response.json()
+        self.assertTrue(payload.get("ok"))
+        self.assertEqual("manticore", payload.get("data_source"))
+        self.assertEqual(1, int(payload.get("count") or 0))
+        self.assertIn(str(self.session.id), payload.get("related_session_ids") or [])
+
+    @patch("core.views.system.get_trace_ids")
+    @patch("core.views.system.count_behavioral_events")
+    def test_system_settings_page_includes_manticore_trace_ids(
+        self, mocked_behavioral_count, mocked_trace_ids
+    ):
+        mocked_behavioral_count.return_value = 7
+        mocked_trace_ids.return_value = ["trace-from-manticore"]
+        response = self.client.get(reverse("system_settings"))
+        self.assertEqual(200, response.status_code)
+        content = response.content.decode("utf-8")
+        self.assertIn("trace-from-manticore", content)
+        self.assertIn("Behavioral Events: 7", content)
 
     def test_memory_search_status_and_query_api(self):
         request = AIRequest.objects.create(
             user=self.user,
@@ -127,7 +191,18 @@ class SystemDiagnosticsAPITests(TestCase):
         first_hit = (query_payload.get("hits") or [{}])[0]
         self.assertEqual(str(memory.id), str(first_hit.get("memory_id") or ""))
 
-    def test_system_settings_page_renders_searchable_datalists(self):
+    @patch("core.views.system.get_recent_event_rows")
+    @patch("core.views.system.count_behavioral_events")
+    def test_system_settings_page_renders_searchable_datalists(
+        self, mocked_behavioral_count, mocked_recent_rows
+    ):
+        mocked_behavioral_count.return_value = 3
+        mocked_recent_rows.return_value = [
+            {
+                "event_type": "presence_available",
+                "origin_transport": "whatsapp",
+            }
+        ]
         ConversationEvent.objects.create(
             user=self.user,
             session=self.session,
@@ -148,3 +223,5 @@ class SystemDiagnosticsAPITests(TestCase):
         self.assertIn('datalist id="diagnostics-event-type-options"', content)
         self.assertIn(str(self.session.id), content)
         self.assertIn("trace-system-diag-2", content)
+        self.assertIn("whatsapp", content)
+        self.assertIn("presence_available", content)
@@ -3,6 +3,8 @@ from __future__ import annotations
 import asyncio
 
 from asgiref.sync import async_to_sync
+from unittest.mock import patch
+
 from django.core.management import call_command
 from django.test import TestCase
 
@@ -10,8 +12,6 @@ from core.clients.whatsapp import WhatsAppClient
 from core.messaging import history, media_bridge
 from core.models import (
     ChatSession,
-    ContactAvailabilityEvent,
-    ContactAvailabilitySpan,
     Message,
     Person,
     PersonIdentifier,
@@ -175,6 +175,36 @@ class WhatsAppReactionHandlingTests(TestCase):
         )
         self.assertEqual("caption from wrapper", text)
+
+    def test_message_identifier_candidates_use_chat_for_direct_outbound(self):
+        values = self.client._message_identifier_candidates(
+            sender="441234567890@s.whatsapp.net",
+            chat="447356114729@s.whatsapp.net",
+            is_from_me=True,
+        )
+        self.assertIn("447356114729@s.whatsapp.net", values)
+        self.assertIn("447356114729", values)
+        self.assertNotIn("441234567890@s.whatsapp.net", values)
+
+    def test_message_identifier_candidates_use_sender_for_direct_inbound(self):
+        values = self.client._message_identifier_candidates(
+            sender="447356114729@s.whatsapp.net",
+            chat="441234567890@s.whatsapp.net",
+            is_from_me=False,
+        )
+        self.assertIn("447356114729@s.whatsapp.net", values)
+        self.assertIn("447356114729", values)
+        self.assertNotIn("441234567890@s.whatsapp.net", values)
+
+    def test_message_identifier_candidates_use_group_chat_for_group_events(self):
+        values = self.client._message_identifier_candidates(
+            sender="447356114729@s.whatsapp.net",
+            chat="120363402761690215@g.us",
+            is_from_me=True,
+        )
+        self.assertIn("120363402761690215@g.us", values)
+        self.assertIn("120363402761690215", values)
+        self.assertNotIn("447356114729@s.whatsapp.net", values)
 
 
 class RecalculateContactAvailabilityTests(TestCase):
     def setUp(self):
@@ -215,42 +245,21 @@ class RecalculateContactAvailabilityTests(TestCase):
             },
         )
 
-    def _projection(self):
-        events = list(
-            ContactAvailabilityEvent.objects.filter(user=self.user)
-            .order_by("ts", "source_kind", "id")
-            .values_list("service", "source_kind", "availability_state", "ts")
-        )
-        spans = list(
-            ContactAvailabilitySpan.objects.filter(user=self.user)
-            .order_by("start_ts", "end_ts", "id")
-            .values_list("service", "state", "start_ts", "end_ts")
-        )
-        return events, spans
-
-    def test_recalculate_is_deterministic_and_no_skew_on_rerun(self):
+    @patch("core.management.commands.recalculate_contact_availability.append_event_sync")
+    def test_recalculate_replays_message_read_and_reaction_events(self, mocked_append):
         call_command(
             "recalculate_contact_availability", "--days", "36500", "--limit", "500"
         )
-        first_events, first_spans = self._projection()
-        self.assertTrue(first_events)
-        self.assertTrue(first_spans)
-
-        call_command(
-            "recalculate_contact_availability", "--days", "36500", "--limit", "500"
+        event_types = [call.kwargs["event_type"] for call in mocked_append.call_args_list]
+        self.assertEqual(
+            ["message_created", "read_receipt", "presence_available"],
+            event_types,
         )
-        second_events, second_spans = self._projection()
-
-        self.assertEqual(first_events, second_events)
-        self.assertEqual(first_spans, second_spans)
-
-    def test_recalculate_no_reset_does_not_duplicate(self):
-        call_command(
-            "recalculate_contact_availability", "--days", "36500", "--limit", "500"
-        )
-        events_before = ContactAvailabilityEvent.objects.filter(user=self.user).count()
-        spans_before = ContactAvailabilitySpan.objects.filter(user=self.user).count()
 
+    @patch("core.management.commands.recalculate_contact_availability.append_event_sync")
+    def test_recalculate_no_reset_remains_idempotent_at_command_interface(
+        self, mocked_append
+    ):
         call_command(
             "recalculate_contact_availability",
             "--days",
@@ -259,7 +268,4 @@ class RecalculateContactAvailabilityTests(TestCase):
             "500",
             "--no-reset",
         )
-        events_after = ContactAvailabilityEvent.objects.filter(user=self.user).count()
-        spans_after = ContactAvailabilitySpan.objects.filter(user=self.user).count()
-        self.assertEqual(events_before, events_after)
-        self.assertEqual(spans_before, spans_after)
+        self.assertEqual(3, mocked_append.call_count)
@@ -15,6 +15,12 @@ _TRANSPORT_CAPABILITIES: dict[str, dict[str, bool]] = {
         "media_video": True,
         "media_audio": True,
         "media_documents": True,
+        "delivery_receipt": True,
+        "read_receipt": True,
+        "composing": True,
+        "composing_stopped": True,
+        "presence": False,
+        "presence_last_seen": False,
     },
     "whatsapp": {
         "send": True,
@@ -28,6 +34,12 @@ _TRANSPORT_CAPABILITIES: dict[str, dict[str, bool]] = {
         "media_video": True,
         "media_audio": True,
         "media_documents": True,
+        "delivery_receipt": True,
+        "read_receipt": True,
+        "composing": True,
+        "composing_stopped": True,
+        "presence": True,
+        "presence_last_seen": True,
     },
     "instagram": {
         "send": True,
@@ -41,6 +53,12 @@ _TRANSPORT_CAPABILITIES: dict[str, dict[str, bool]] = {
         "media_video": True,
         "media_audio": False,
         "media_documents": False,
+        "delivery_receipt": False,
+        "read_receipt": True,
+        "composing": True,
+        "composing_stopped": False,
+        "presence": True,
+        "presence_last_seen": False,
    },
     "xmpp": {
         "send": False,
@@ -54,6 +72,15 @@ _TRANSPORT_CAPABILITIES: dict[str, dict[str, bool]] = {
         "media_video": False,
         "media_audio": False,
         "media_documents": False,
+        "delivery_receipt": True,
+        "read_receipt": True,
+        "composing": True,
+        "composing_stopped": True,
+        "composing_paused": True,
+        "composing_inactive": True,
+        "composing_gone": True,
+        "presence": True,
+        "presence_last_seen": True,
     },
 }
 
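The flags added above are plain booleans per transport. A minimal sketch of how such a table can be queried with a safe default for unknown transports or unknown capability keys — the helper name `transport_supports` is my assumption; only the dict shape comes from the diff:

```python
# Trimmed copy of the _TRANSPORT_CAPABILITIES shape shown in the diff above.
_TRANSPORT_CAPABILITIES = {
    "signal": {"read_receipt": True, "presence": False},
    "whatsapp": {"read_receipt": True, "presence": True},
    "xmpp": {"read_receipt": True, "presence": True},
}

def transport_supports(service: str, capability: str) -> bool:
    # Unknown transports and unknown capability keys both fall back to False,
    # so callers never have to special-case a missing entry.
    return bool(_TRANSPORT_CAPABILITIES.get(str(service).lower(), {}).get(capability, False))
```

A flat dict like this keeps capability checks cheap and makes the per-transport differences reviewable in one place.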
@@ -1,14 +1,12 @@
 from __future__ import annotations
 
 from django.contrib.auth.mixins import LoginRequiredMixin
-from django.db.models import Count, Max, Q
 from django.shortcuts import render
 from django.views import View
 
-from core.models import (
-    ContactAvailabilityEvent,
-    ContactAvailabilitySettings,
-)
+from core.events.manticore import get_behavioral_availability_stats
+from core.events.shadow import get_shadow_behavioral_availability_stats
+from core.models import ContactAvailabilitySettings, Person
 
 
 def _to_int(value, default=0):
@@ -64,42 +62,105 @@ class AvailabilitySettingsPage(LoginRequiredMixin, View):
         return self.get(request)
 
     def get(self, request):
-        settings_row = self._settings(request)
-        contact_stats = list(
-            ContactAvailabilityEvent.objects.filter(
-                user=request.user, person__isnull=False
-            )
-            .values("person_id", "person__name", "service")
-            .annotate(
-                total_events=Count("id"),
-                available_events=Count("id", filter=Q(availability_state="available")),
-                fading_events=Count("id", filter=Q(availability_state="fading")),
-                unavailable_events=Count(
-                    "id", filter=Q(availability_state="unavailable")
-                ),
-                unknown_events=Count("id", filter=Q(availability_state="unknown")),
-                native_presence_events=Count(
-                    "id", filter=Q(source_kind="native_presence")
-                ),
-                read_receipt_events=Count("id", filter=Q(source_kind="read_receipt")),
-                typing_events=Count(
-                    "id",
-                    filter=Q(source_kind="typing_start") | Q(source_kind="typing_stop"),
-                ),
-                message_activity_events=Count(
-                    "id",
-                    filter=Q(source_kind="message_in") | Q(source_kind="message_out"),
-                ),
-                inferred_timeout_events=Count(
-                    "id", filter=Q(source_kind="inferred_timeout")
-                ),
-                last_event_ts=Max("ts"),
-            )
-            .order_by("-total_events", "person__name", "service")
-        )
-
+        behavioral_stats, stats_source = self._behavioral_stats(request.user)
+        transport_stats = self._transport_stats(behavioral_stats)
+        totals = self._totals(behavioral_stats)
         context = {
-            "settings_row": settings_row,
-            "contact_stats": contact_stats,
+            "settings_row": self._settings(request),
+            "behavioral_stats": behavioral_stats,
+            "behavioral_stats_source": stats_source,
+            "transport_stats": transport_stats,
+            "behavioral_totals": totals,
         }
         return render(request, self.template_name, context)
+
+    def _behavioral_stats(self, user):
+        try:
+            person_map = {
+                str(row["id"]): str(row["name"] or "")
+                for row in Person.objects.filter(user=user).values("id", "name")
+            }
+            rows = []
+            for row in list(
+                get_behavioral_availability_stats(user_id=int(user.id)) or []
+            ):
+                person_id = str(row.get("person_id") or "").strip()
+                rows.append(
+                    {
+                        "person_id": person_id,
+                        "person_name": person_map.get(person_id, person_id or "-"),
+                        "service": str(row.get("transport") or "").strip().lower(),
+                        "total_events": _to_int(row.get("total_events"), 0),
+                        "presence_events": _to_int(row.get("presence_events"), 0),
+                        "read_events": _to_int(row.get("read_events"), 0),
+                        "typing_events": _to_int(row.get("typing_events"), 0),
+                        "message_events": _to_int(row.get("message_events"), 0),
+                        "abandoned_events": _to_int(row.get("abandoned_events"), 0),
+                        "last_event_ts": _to_int(row.get("last_event_ts"), 0),
+                    }
+                )
+            if rows:
+                return rows, "manticore"
+        except Exception:
+            pass
+        return list(get_shadow_behavioral_availability_stats(user=user)), "conversation_event_shadow"
+
+    def _transport_stats(self, behavioral_stats: list[dict]) -> list[dict]:
+        by_transport = {}
+        for row in list(behavioral_stats or []):
+            service = str(row.get("service") or "").strip().lower() or "-"
+            state = by_transport.setdefault(
+                service,
+                {
+                    "service": service,
+                    "contacts": 0,
+                    "total_events": 0,
+                    "presence_events": 0,
+                    "read_events": 0,
+                    "typing_events": 0,
+                    "message_events": 0,
+                    "abandoned_events": 0,
+                    "last_event_ts": 0,
+                },
+            )
+            state["contacts"] += 1
+            for key in (
+                "total_events",
+                "presence_events",
+                "read_events",
+                "typing_events",
+                "message_events",
+                "abandoned_events",
+            ):
+                state[key] += _to_int(row.get(key), 0)
+            state["last_event_ts"] = max(
+                int(state.get("last_event_ts") or 0),
+                _to_int(row.get("last_event_ts"), 0),
+            )
+        return sorted(
+            by_transport.values(),
+            key=lambda row: (-int(row.get("total_events") or 0), str(row.get("service") or "")),
+        )
+
+    def _totals(self, behavioral_stats: list[dict]) -> dict:
+        totals = {
+            "contacts": 0,
+            "total_events": 0,
+            "presence_events": 0,
+            "read_events": 0,
+            "typing_events": 0,
+            "message_events": 0,
+            "abandoned_events": 0,
+        }
+        for row in list(behavioral_stats or []):
+            totals["contacts"] += 1
+            for key in (
+                "total_events",
+                "presence_events",
+                "read_events",
+                "typing_events",
+                "message_events",
+                "abandoned_events",
+            ):
+                totals[key] += _to_int(row.get(key), 0)
+        return totals
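The `_transport_stats` method added above folds per-contact rows into per-transport buckets. The same fold in isolation, reduced to a pure-Python sketch (a simplification for illustration, not the view code itself), makes the accumulation easy to check:

```python
def fold_by_transport(rows):
    # One bucket per service: count contacts, sum the event counters,
    # then sort busiest transport first, ties broken alphabetically.
    buckets = {}
    for row in rows:
        service = str(row.get("service") or "").strip().lower() or "-"
        state = buckets.setdefault(
            service, {"service": service, "contacts": 0, "total_events": 0}
        )
        state["contacts"] += 1
        state["total_events"] += int(row.get("total_events") or 0)
    return sorted(buckets.values(), key=lambda r: (-r["total_events"], r["service"]))

rows = [
    {"service": "signal", "total_events": 5},
    {"service": "signal", "total_events": 2},
    {"service": "xmpp", "total_events": 4},
]
```

Calling `fold_by_transport(rows)` here yields a two-element list with the `signal` bucket (2 contacts, 7 events) ahead of `xmpp`.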
@@ -179,18 +179,23 @@ def _format_ts_label(ts_value: int) -> str:
|
|||||||
|
|
||||||
|
|
||||||
def _serialize_availability_spans(spans):
|
def _serialize_availability_spans(spans):
|
||||||
|
def _value(row, key, default=None):
|
||||||
|
if isinstance(row, dict):
|
||||||
|
return row.get(key, default)
|
||||||
|
return getattr(row, key, default)
|
||||||
|
|
||||||
rows = []
|
rows = []
|
||||||
for row in list(spans or []):
|
for row in list(spans or []):
|
||||||
rows.append(
|
rows.append(
|
||||||
{
|
{
|
||||||
"id": int(getattr(row, "id", 0) or 0),
|
"id": int(_value(row, "id", 0) or 0),
|
||||||
"service": str(getattr(row, "service", "") or ""),
|
"service": str(_value(row, "service", "") or ""),
|
||||||
"state": str(getattr(row, "state", "unknown") or "unknown"),
|
"state": str(_value(row, "state", "unknown") or "unknown"),
|
||||||
"start_ts": int(getattr(row, "start_ts", 0) or 0),
|
"start_ts": int(_value(row, "start_ts", 0) or 0),
|
||||||
"end_ts": int(getattr(row, "end_ts", 0) or 0),
|
"end_ts": int(_value(row, "end_ts", 0) or 0),
|
||||||
"confidence_start": float(getattr(row, "confidence_start", 0.0) or 0.0),
|
"confidence_start": float(_value(row, "confidence_start", 0.0) or 0.0),
|
||||||
"confidence_end": float(getattr(row, "confidence_end", 0.0) or 0.0),
|
"confidence_end": float(_value(row, "confidence_end", 0.0) or 0.0),
|
||||||
"payload": dict(getattr(row, "payload", {}) or {}),
|
"payload": dict(_value(row, "payload", {}) or {}),
|
||||||
}
|
}
|
||||||
)
|
)
|
||||||
return rows
|
return rows
|
||||||
@@ -1765,8 +1770,31 @@ def _engage_source_from_ref(plan, source_ref):
 
 
 def _context_base(user, service, identifier, person):
+    service = _default_service(service)
     identifier_variants = _identifier_variants(service, identifier)
     person_identifier = None
+    if person is None and identifier:
+        for candidate in identifier_variants or [identifier]:
+            bare_id = str(candidate or "").strip().split("@", 1)[0].strip()
+            if not bare_id:
+                continue
+            group_link = PlatformChatLink.objects.filter(
+                user=user,
+                service=service,
+                chat_identifier=bare_id,
+                is_group=True,
+            ).first()
+            if group_link:
+                return {
+                    "person_identifier": None,
+                    "service": service,
+                    "identifier": _group_channel_identifier(
+                        service, group_link, bare_id
+                    ),
+                    "person": None,
+                    "is_group": True,
+                    "group_name": group_link.chat_name or bare_id,
+                }
     if person is not None:
         if identifier_variants:
             person_identifier = PersonIdentifier.objects.filter(
@@ -1811,24 +1839,6 @@ def _context_base(user, service, identifier, person):
     if group_link is not None:
         identifier = str(group_link.chat_jid or f"{bare_id}@g.us")
 
-    if person_identifier is None and identifier:
-        bare_id = identifier.split("@", 1)[0].strip()
-        group_link = PlatformChatLink.objects.filter(
-            user=user,
-            service=service,
-            chat_identifier=bare_id,
-            is_group=True,
-        ).first()
-        if group_link:
-            return {
-                "person_identifier": None,
-                "service": service,
-                "identifier": _group_channel_identifier(service, group_link, bare_id),
-                "person": None,
-                "is_group": True,
-                "group_name": group_link.chat_name or bare_id,
-            }
-
     return {
         "person_identifier": person_identifier,
         "service": service,
@@ -7,6 +7,12 @@ from django.urls import reverse
|
|||||||
from django.views import View
|
from django.views import View
|
||||||
|
|
||||||
from core.clients import transport
|
from core.clients import transport
|
||||||
|
from core.events.manticore import (
|
||||||
|
count_behavioral_events,
|
||||||
|
get_recent_event_rows,
|
||||||
|
get_trace_event_rows,
|
||||||
|
get_trace_ids,
|
||||||
|
)
|
||||||
from core.events.projection import shadow_compare_session
|
from core.events.projection import shadow_compare_session
|
||||||
from core.memory.search_backend import backend_status, get_memory_search_backend
|
from core.memory.search_backend import backend_status, get_memory_search_backend
|
||||||
from core.models import (
|
from core.models import (
|
||||||
@@ -56,12 +62,20 @@ class SystemSettings(SuperUserRequiredMixin, View):
|
|||||||
template_name = "pages/system-settings.html"
|
template_name = "pages/system-settings.html"
|
||||||
|
|
||||||
def _counts(self, user):
|
def _counts(self, user):
|
||||||
|
behavioral_event_rows = 0
|
||||||
|
try:
|
||||||
|
behavioral_event_rows = int(count_behavioral_events(user_id=int(user.id)) or 0)
|
||||||
|
except Exception:
|
||||||
|
behavioral_event_rows = 0
|
||||||
return {
|
return {
|
||||||
"chat_sessions": ChatSession.objects.filter(user=user).count(),
|
"chat_sessions": ChatSession.objects.filter(user=user).count(),
|
||||||
"messages": Message.objects.filter(user=user).count(),
|
"messages": Message.objects.filter(user=user).count(),
|
||||||
"queued_messages": QueuedMessage.objects.filter(user=user).count(),
|
"queued_messages": QueuedMessage.objects.filter(user=user).count(),
|
||||||
"message_events": MessageEvent.objects.filter(user=user).count(),
|
"message_events": MessageEvent.objects.filter(user=user).count(),
|
||||||
"conversation_events": ConversationEvent.objects.filter(user=user).count(),
|
"conversation_event_shadow_rows": ConversationEvent.objects.filter(
|
||||||
|
user=user
|
||||||
|
).count(),
|
||||||
|
"behavioral_event_rows": behavioral_event_rows,
|
||||||
"adapter_health_events": AdapterHealthEvent.objects.filter(
|
"adapter_health_events": AdapterHealthEvent.objects.filter(
|
||||||
user=user
|
user=user
|
||||||
).count(),
|
).count(),
|
||||||
@@ -203,6 +217,21 @@ class SystemSettings(SuperUserRequiredMixin, View):
|
|||||||
trace_options.append(value)
|
trace_options.append(value)
|
||||||
if len(trace_options) >= 120:
|
if len(trace_options) >= 120:
|
||||||
break
|
break
|
||||||
|
if len(trace_options) < 120:
|
||||||
|
try:
|
||||||
|
for trace_id in get_trace_ids(
|
||||||
|
user_id=int(user.id),
|
||||||
|
limit=max(1, 120 - len(trace_options)),
|
||||||
|
):
|
||||||
|
value = str(trace_id or "").strip()
|
||||||
|
if not value or value in seen_trace_ids:
|
||||||
|
continue
|
||||||
|
seen_trace_ids.add(value)
|
||||||
|
trace_options.append(value)
|
||||||
|
if len(trace_options) >= 120:
|
||||||
|
break
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
service_candidates = {"signal", "whatsapp", "xmpp", "instagram", "web"}
|
service_candidates = {"signal", "whatsapp", "xmpp", "instagram", "web"}
|
||||||
service_candidates.update(
|
service_candidates.update(
|
||||||
@@ -212,6 +241,18 @@ class SystemSettings(SuperUserRequiredMixin, View):
|
|||||||
.values_list("origin_transport", flat=True)
|
.values_list("origin_transport", flat=True)
|
||||||
.distinct()[:50]
|
.distinct()[:50]
|
||||||
)
|
)
|
||||||
|
try:
|
||||||
|
service_candidates.update(
|
||||||
|
str(item.get("origin_transport") or "").strip().lower()
|
||||||
|
for item in get_recent_event_rows(
|
||||||
|
minutes=60 * 24 * 30,
|
||||||
|
user_id=str(user.id),
|
||||||
|
limit=500,
|
||||||
|
)
|
||||||
|
if str(item.get("origin_transport") or "").strip()
|
||||||
|
)
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
service_options = sorted(value for value in service_candidates if value)
|
service_options = sorted(value for value in service_candidates if value)
|
||||||
|
|
||||||
event_type_candidates = {
|
event_type_candidates = {
|
||||||
@@ -229,6 +270,18 @@ class SystemSettings(SuperUserRequiredMixin, View):
             .values_list("event_type", flat=True)
             .distinct()[:80]
         )
+        try:
+            event_type_candidates.update(
+                str(item.get("event_type") or "").strip().lower()
+                for item in get_recent_event_rows(
+                    minutes=60 * 24 * 30,
+                    user_id=str(user.id),
+                    limit=500,
+                )
+                if str(item.get("event_type") or "").strip()
+            )
+        except Exception:
+            pass
         event_type_options = sorted(value for value in event_type_candidates if value)
 
         return {
@@ -306,10 +359,26 @@ class TraceDiagnosticsAPI(SuperUserRequiredMixin, View):
             .select_related("session")
             .order_by("ts", "created_at")[:500]
         )
+        data_source = "django"
+        if not rows:
+            try:
+                rows = get_trace_event_rows(
+                    user_id=int(request.user.id),
+                    trace_id=trace_id,
+                    limit=500,
+                )
+            except Exception:
+                rows = []
+            if rows:
+                data_source = "manticore"
         related_session_ids = []
         seen_sessions = set()
         for row in rows:
-            session_id = str(row.session_id or "").strip()
+            session_id = (
+                str(row.session_id or "").strip()
+                if not isinstance(row, dict)
+                else str(row.get("session_id") or "").strip()
+            )
             if not session_id or session_id in seen_sessions:
                 continue
             seen_sessions.add(session_id)
@@ -319,6 +388,7 @@ class TraceDiagnosticsAPI(SuperUserRequiredMixin, View):
             {
                 "ok": True,
                 "trace_id": trace_id,
+                "data_source": data_source,
                 "count": len(rows),
                 "related_session_ids": related_session_ids,
                 "projection_shadow_urls": [
@@ -327,19 +397,56 @@ class TraceDiagnosticsAPI(SuperUserRequiredMixin, View):
                 ],
                 "events": [
                     {
-                        "id": str(row.id),
-                        "ts": int(row.ts or 0),
-                        "event_type": str(row.event_type or ""),
-                        "direction": str(row.direction or ""),
-                        "session_id": str(row.session_id or ""),
+                        "id": (
+                            str(row.id)
+                            if not isinstance(row, dict)
+                            else str(row.get("id") or "")
+                        ),
+                        "ts": (
+                            int(row.ts or 0)
+                            if not isinstance(row, dict)
+                            else int(row.get("ts") or 0)
+                        ),
+                        "event_type": (
+                            str(row.event_type or "")
+                            if not isinstance(row, dict)
+                            else str(row.get("event_type") or "")
+                        ),
+                        "direction": (
+                            str(row.direction or "")
+                            if not isinstance(row, dict)
+                            else str(row.get("direction") or "")
+                        ),
+                        "session_id": (
+                            str(row.session_id or "")
+                            if not isinstance(row, dict)
+                            else str(row.get("session_id") or "")
+                        ),
                         "projection_shadow_url": (
-                            f"{reverse('system_projection_shadow')}?session_id={str(row.session_id or '').strip()}"
-                            if str(row.session_id or "").strip()
+                            f"{reverse('system_projection_shadow')}?session_id="
+                            f"{(str(row.session_id or '').strip() if not isinstance(row, dict) else str(row.get('session_id') or '').strip())}"
+                            if (
+                                str(row.session_id or "").strip()
+                                if not isinstance(row, dict)
+                                else str(row.get("session_id") or "").strip()
+                            )
                             else ""
                         ),
-                        "origin_transport": str(row.origin_transport or ""),
-                        "origin_message_id": str(row.origin_message_id or ""),
-                        "payload": dict(row.payload or {}),
+                        "origin_transport": (
+                            str(row.origin_transport or "")
+                            if not isinstance(row, dict)
+                            else str(row.get("origin_transport") or "")
+                        ),
+                        "origin_message_id": (
+                            str(row.origin_message_id or "")
+                            if not isinstance(row, dict)
+                            else str(row.get("origin_message_id") or "")
+                        ),
+                        "payload": (
+                            dict(row.payload or {})
+                            if not isinstance(row, dict)
+                            else dict(row.get("payload") or {})
+                        ),
                     }
                     for row in rows
                 ],
@@ -377,18 +484,7 @@ class EventProjectionShadowAPI(SuperUserRequiredMixin, View):
 
 
 class EventLedgerSmokeAPI(SuperUserRequiredMixin, View):
-    def get(self, request):
-        minutes = max(1, int(request.GET.get("minutes") or 120))
-        service = str(request.GET.get("service") or "").strip().lower()
-        user_id = str(request.GET.get("user_id") or "").strip() or str(request.user.id)
-        limit = max(1, min(500, int(request.GET.get("limit") or 200)))
-        require_types_raw = str(request.GET.get("require_types") or "").strip()
-        required_types = [
-            item.strip().lower()
-            for item in require_types_raw.split(",")
-            if item.strip()
-        ]
-
+    def _recent_rows(self, *, minutes: int, service: str, user_id: str, limit: int):
         cutoff_ts = int(time.time() * 1000) - (minutes * 60 * 1000)
         queryset = ConversationEvent.objects.filter(ts__gte=cutoff_ts).order_by("-ts")
         if service:
@@ -408,6 +504,37 @@ class EventLedgerSmokeAPI(SuperUserRequiredMixin, View):
                 "trace_id",
             )[:limit]
         )
+        if rows:
+            return rows, "django"
+        try:
+            manticore_rows = get_recent_event_rows(
+                minutes=minutes,
+                service=service,
+                user_id=user_id,
+                limit=limit,
+            )
+        except Exception:
+            manticore_rows = []
+        return manticore_rows, "manticore" if manticore_rows else "django"
+
+    def get(self, request):
+        minutes = max(1, int(request.GET.get("minutes") or 120))
+        service = str(request.GET.get("service") or "").strip().lower()
+        user_id = str(request.GET.get("user_id") or "").strip() or str(request.user.id)
+        limit = max(1, min(500, int(request.GET.get("limit") or 200)))
+        require_types_raw = str(request.GET.get("require_types") or "").strip()
+        required_types = [
+            item.strip().lower()
+            for item in require_types_raw.split(",")
+            if item.strip()
+        ]
+
+        rows, data_source = self._recent_rows(
+            minutes=minutes,
+            service=service,
+            user_id=user_id,
+            limit=limit,
+        )
         event_type_counts = {}
         for row in rows:
             key = str(row.get("event_type") or "")
@@ -423,6 +550,7 @@ class EventLedgerSmokeAPI(SuperUserRequiredMixin, View):
                 "minutes": minutes,
                 "service": service,
                 "user_id": user_id,
+                "data_source": data_source,
                 "count": len(rows),
                 "event_type_counts": event_type_counts,
                 "required_types": required_types,
@@ -18,6 +18,7 @@ max-requests=1000
 # Ensure old worker processes are cleaned up properly
 reload-on-as=512
 reload-on-rss=256
+touch-reload=/code/.uwsgi-reload
 vacuum=1
 home=/venv
 processes=2
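uWSGI's `touch-reload` option gracefully reloads workers whenever the named file's mtime changes. Per this repo's docs the sentinel is `/code/.uwsgi-reload` inside the `gia` container; the sketch below demonstrates the mechanism locally (the `/tmp` path is illustrative only):

```shell
# In the running container, the real invocation would be:
#   podman exec gia touch /code/.uwsgi-reload
# The mechanism itself is just an mtime bump on the sentinel file:
touch /tmp/.uwsgi-reload
test -f /tmp/.uwsgi-reload && echo "sentinel updated"
```

Note that a config change to uwsgi.ini itself still requires a container restart, since `touch-reload` only recycles workers.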
@@ -57,6 +57,10 @@ SIGNAL_NUMBER=
 MEMORY_SEARCH_BACKEND=django
 MANTICORE_HTTP_URL=http://localhost:9308
 MANTICORE_MEMORY_TABLE=gia_memory_items
+MANTICORE_EVENT_TABLE=gia_events
+MANTICORE_METRIC_TABLE=gia_metrics
+COMPOSING_ABANDONED_WINDOW_SECONDS=300
+CONVERSATION_EVENT_RETENTION_DAYS=90
 MANTICORE_HTTP_TIMEOUT=5
 ATTACHMENT_MAX_BYTES=26214400
 ATTACHMENT_ALLOW_PRIVATE_URLS=false