Fix all integrations

2026-03-08 22:08:55 +00:00
parent bca4d6898f
commit acedc01e83
58 changed files with 4120 additions and 960 deletions


@@ -88,6 +88,15 @@ Optional static token helper:
make token TOKEN_USER=<your_username>
```
Local code-quality checks:
```bash
make pre-commit
make pre-commit-glibc
```
`make pre-commit-glibc` selects `env` on musl systems and `genv` on glibc systems.
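The libc probe behind this selection keys off the first line of `ldd --version` output. A rough standalone sketch of the same check (assuming `ldd` is on `PATH`; `detect_libc` and `venv_dir` are illustrative names, not part of the Makefile):

```python
import subprocess

def detect_libc() -> str:
    """Mirror of the Makefile probe: musl's ldd mentions "musl" in its
    version banner; anything else is treated as glibc."""
    try:
        out = subprocess.run(
            ["ldd", "--version"],
            capture_output=True,
            text=True,
        )
    except OSError:
        return "glibc"  # conservative default when ldd is unavailable
    # musl's ldd prints its banner to stderr, glibc's to stdout.
    banner = (out.stdout + out.stderr).splitlines()
    first = banner[0].lower() if banner else ""
    return "musl" if "musl" in first else "glibc"

def venv_dir(libc: str) -> str:
    # `env` on musl systems, `genv` on glibc systems, per the Makefile.
    return "env" if libc == "musl" else "genv"
```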
## 6) Logs and health checks
Tail logs:
@@ -141,8 +150,21 @@ If blocked, use full recycle.
- Manual compose: `/compose/page/`
- AI workspace: `/ai/workspace/`
- OSINT search: `/search/page/`
- Security encryption settings: `/settings/security/encryption/`
- Security permissions settings: `/settings/security/permissions/`
- Command routing settings: `/settings/command-routing/`
- Task automation settings: `/settings/tasks/`
- Task inbox / manual task creation: `/tasks/`
## 10) Security and capability controls
- `Require OMEMO encryption` rejects plaintext XMPP messages before command routing.
- `Encrypt gateway component chat replies with OMEMO` only affects gateway/component conversations.
- `Encrypt contact relay messages to your XMPP client with OMEMO` only affects relayed contact chats.
- Fine-grained capability policy is configured in `/settings/security/permissions/` and applies by scope, service, and optional channel pattern.
- Trusted OMEMO key enforcement depends on trusted key records, not only the most recently observed client key.
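The scope/service/channel-pattern evaluation described above can be sketched as a small matcher. The record shape and `policy_allows` helper below are illustrative assumptions, not the actual GIA policy engine:

```python
import fnmatch

def policy_allows(policies, scope, service, channel=""):
    """Return True when some policy row grants `scope` for `service`,
    optionally constrained by a channel glob pattern (hypothetical shape)."""
    for p in policies:
        if p["scope"] != scope:
            continue
        # An empty service field means the rule applies to every service.
        if p.get("service") and p["service"] != service:
            continue
        pattern = p.get("channel_pattern") or "*"
        if fnmatch.fnmatch(channel or "", pattern):
            return p.get("allow", False)
    return False

rules = [
    {"scope": "gateway.tasks", "service": "xmpp", "channel_pattern": "ops-*", "allow": True},
]
```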
## 11) Common troubleshooting
### A) Compose restart errors / dependency improper state
@@ -237,6 +259,13 @@ make run
Then approve/enable the `manticore` MCP server in VS Code when prompted.
The MCP task surface now supports canonical task creation/completion in GIA:
- `tasks.create`
- `tasks.complete`
- `tasks.create_note`
- `tasks.link_artifact`
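An MCP client invokes tools like these via a JSON-RPC `tools/call` request. A sketch of the request body a client might send; the argument names are assumptions, the real tool schemas may differ:

```python
import json

def mcp_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request body of the kind an MCP
    client would send over its transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical argument names for illustration only.
req = mcp_tool_call("tasks.create", {"project": "ops", "title": "Rotate XMPP component secret"})
```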
Optional ultra-light Rust MCP worker:
```bash
@@ -247,6 +276,24 @@ make mcp-rust-build
Then enable `manticore-rust-worker` in `/code/xf/.vscode/mcp.json`.
It is intentionally `disabled: true` by default so the existing Python MCP server remains the baseline.
### H) Optional browser MCP for visual validation
To validate compose/tasks/settings flows visually, add a browser-capable MCP server in your editor workspace alongside `manticore`. A Playwright-style browser MCP is the intended integration point for GIA UI checks.
Recommended usage:
- keep browser MCP outside host-network mode
- point it at the local GIA app URL/port from the running stack
- use it for page-load, form-flow, and visual regression checks on compose/tasks/settings pages
### I) Task command shortcuts
Gateway / XMPP / chat task commands now include:
- `.l` -> list open tasks
- `.tasks add <project> :: <title>` -> create a canonical task in a named project
- `.task add <title>` -> create a task inside the current mapped chat scope
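The shortcut grammar above can be parsed roughly as follows; `parse_task_command` and its return shape are illustrative, not the real dispatcher contract:

```python
def parse_task_command(body: str):
    """Best-effort parser for the chat task shortcuts (illustrative)."""
    text = body.strip()
    if text == ".l":
        return {"action": "list_open"}
    if text.startswith(".tasks add "):
        # `.tasks add <project> :: <title>` targets a named project.
        rest = text[len(".tasks add "):]
        project, sep, title = rest.partition(" :: ")
        if sep and project.strip() and title.strip():
            return {"action": "create", "project": project.strip(), "title": title.strip()}
        return None
    if text.startswith(".task add "):
        # `.task add <title>` targets the current mapped chat scope.
        title = text[len(".task add "):].strip()
        return {"action": "create_scoped", "title": title} if title else None
    return None
```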
### C) Signal or WhatsApp send failures
- Verify account/link status in service pages.


@@ -4,6 +4,9 @@ TOKEN_USER ?= m
STACK_ID_CLEAN := $(shell sid="$${GIA_STACK_ID:-$${STACK_ID:-}}"; sid=$$(printf "%s" "$$sid" | tr -cs 'a-zA-Z0-9._-' '-' | sed 's/^-*//; s/-*$$//'); printf "%s" "$$sid")
STACK_SUFFIX := $(if $(STACK_ID_CLEAN),_$(STACK_ID_CLEAN),)
APP_CONTAINER := gia$(STACK_SUFFIX)
LOCAL_LIBC := $(shell if ldd --version 2>&1 | head -n1 | tr '[:upper:]' '[:lower:]' | grep -q musl; then printf musl; else printf glibc; fi)
LOCAL_VENV := $(if $(filter musl,$(LOCAL_LIBC)),env,genv)
PRE_COMMIT_BIN := $(firstword $(wildcard $(LOCAL_VENV)/bin/pre-commit) $(wildcard genv/bin/pre-commit) $(wildcard env/bin/pre-commit))
run:
	bash $(QUADLET_MGR) up
@@ -31,6 +34,23 @@ test:
		exit 125; \
	fi
pre-commit:
	@if [ -x "$(PRE_COMMIT_BIN)" ]; then \
		"$(PRE_COMMIT_BIN)" run -a; \
	else \
		echo "No local pre-commit executable found in $(LOCAL_VENV)/bin, genv/bin, or env/bin." >&2; \
		exit 127; \
	fi
pre-commit-glibc:
	@if [ -x "$(PRE_COMMIT_BIN)" ]; then \
		echo "Using $(LOCAL_VENV) ($(LOCAL_LIBC))"; \
		"$(PRE_COMMIT_BIN)" run -a; \
	else \
		echo "No local pre-commit executable found in $(LOCAL_VENV)/bin, genv/bin, or env/bin." >&2; \
		exit 127; \
	fi
migrate:
	@if podman ps --format '{{.Names}}' | grep -qx "$(APP_CONTAINER)"; then \
		podman exec "$(APP_CONTAINER)" sh -lc "cd /code && . /venv/bin/activate && python manage.py migrate"; \


@@ -9,9 +9,12 @@ GIA is a multi-transport communication workspace that unifies Signal, WhatsApp,
- Unifies chats from multiple protocols in one interface.
- Keeps conversation history in a shared model (`Person`, `PersonIdentifier`, `ChatSession`, `Message`).
- Supports manual, queue-driven, and AI-assisted outbound messaging.
- Supports canonical task creation from chat commands, web UI, and MCP tooling.
- Bridges messages across transports (including XMPP) with attachment handling.
- Tracks delivery/read metadata and typing state events.
- Provides AI workspace analytics, mitigation plans, and insight visualizations.
- Exposes fine-grained capability policy controls for gateway commands, task intake, and command execution.
- Separates XMPP encryption controls into plaintext rejection, component-chat encryption, and relayed-contact encryption.
## Operation Modes
@@ -104,6 +107,14 @@ Core behavior:
- XMPP bridge supports text, attachments, typing, and chat-state paths.
- Signal and WhatsApp media relay paths are normalized via shared transport/media logic.
## Settings Model
- `Security > Encryption`: transport-level XMPP/OMEMO controls, observed client state, and discovered key trust management.
- `Security > Permissions`: fine-grained capability policy by scope, service, and channel pattern.
- `Modules > Commands`: command profiles, bindings, and delivery behavior.
- `Modules > Task Automation`: task extraction defaults, channel overrides, and provider approval routing.
- `Modules > Business Plans`: generated document inbox and editor.
Key design points:
- Prefer shared media preparation over per-service duplicated logic.
@@ -121,6 +132,7 @@ Core components:
- `core/clients/xmpp.py`: XMPP component bridge and media upload relay.
- `rust/manticore-mcp-worker`: optional ultra-light MCP frontend for direct Manticore status/query/maintenance.
- `core/views/compose.py`: Manual compose UX, polling/ws, send pipeline, media blob endpoint.
- `core/tasks/engine.py`: Canonical task creation/completion helpers used by chat commands and UI.
- `core/views/workspace.py`: AI workspace operations and insight surfaces.
- `core/views/osint.py`: Search/workspace OSINT interactions.
@@ -144,6 +156,7 @@ After environment setup from `INSTALL.md`:
4. Open manual compose and test per-service send/receive.
5. Open AI workspace for analysis/mitigation workflows.
6. Verify queue workflows if approval mode is used.
7. Verify task creation from `/tasks/`, `.tasks add <project> :: <title>`, or scoped `.task add <title>`.
Recommended functional smoke test:
@@ -157,6 +170,7 @@ Recommended functional smoke test:
- After runtime code changes, restart runtime services before validation.
- Full environment recycle convention: `make stop && make run`.
- If single-service restart fails due to dependency state, use full recycle.
- Local repository checks are available via `make pre-commit`; use `make pre-commit-glibc` when you want libc-based `env`/`genv` selection.
## Security & Reliability Notes


@@ -91,6 +91,12 @@ XMPP_USER_DOMAIN = getenv("XMPP_USER_DOMAIN", "")
XMPP_PORT = int(getenv("XMPP_PORT", "8888") or 8888)
XMPP_SECRET = getenv("XMPP_SECRET")
XMPP_OMEMO_DATA_DIR = getenv("XMPP_OMEMO_DATA_DIR", "")
XMPP_UPLOAD_SERVICE = getenv("XMPP_UPLOAD_SERVICE", "").strip()
XMPP_UPLOAD_JID = getenv("XMPP_UPLOAD_JID", "").strip()
if not XMPP_UPLOAD_SERVICE and XMPP_UPLOAD_JID:
    XMPP_UPLOAD_SERVICE = XMPP_UPLOAD_JID
if not XMPP_UPLOAD_SERVICE and XMPP_USER_DOMAIN:
    XMPP_UPLOAD_SERVICE = XMPP_USER_DOMAIN
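The fallback chain above (explicit upload service, then upload JID, then user domain) can be expressed as a standalone helper for clarity; `resolve_upload_service` is an illustrative sketch, not part of the settings module:

```python
def resolve_upload_service(upload_service: str, upload_jid: str, user_domain: str) -> str:
    """Same precedence as the settings block: explicit service wins,
    then the upload JID, then the user domain; empty string if none set."""
    service = upload_service.strip()
    if not service and upload_jid.strip():
        service = upload_jid.strip()
    if not service and user_domain.strip():
        service = user_domain.strip()
    return service
```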
EVENT_LEDGER_DUAL_WRITE = getenv("EVENT_LEDGER_DUAL_WRITE", "false").lower() in trues
CAPABILITY_ENFORCEMENT_ENABLED = (


@@ -1,13 +1,11 @@
from app.settings import *  # noqa
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
        "LOCATION": "gia-test-cache",
    }
}
CACHALOT_ENABLED = False
CHANNEL_LAYERS = {"default": {"BACKEND": "channels.layers.InMemoryChannelLayer"}}


@@ -102,7 +102,7 @@
"severity": "high",
"category": "supply-chain",
"rule": "GIT_PYTHON_DEP",
"title": "Git/URL Python Dependency: git+https://git.example.invalid/vendor/django-crud-mixins",
"description": "Installing from git/URL bypasses PyPI integrity checks.",
"fix": "Publish to PyPI or pin to a specific commit hash",
"cwe": null,
@@ -522,9 +522,9 @@
"severity": "medium",
"category": "supply-chain",
"rule": "UNPINNED_PYTHON_DEP",
"title": "Unpinned Python Dependency: git+https://git.example.invalid/vendor/django-crud-mixins",
"description": "Python dependency without version pin. Pin to a specific version for reproducible builds.",
"fix": "Pin version: git+https://git.example.invalid/vendor/django-crud-mixins==x.y.z",
"cwe": null,
"owasp": null
},
@@ -812,7 +812,7 @@
"severity": "high",
"category": "supply-chain",
"categoryLabel": "SUPPLY CHAIN",
"title": "Git/URL Python Dependency: git+https://git.example.invalid/vendor/django-crud-mixins",
"file": "requirements.txt:26",
"action": "Publish to PyPI or pin to a specific commit hash",
"effort": "medium"
@@ -1162,9 +1162,9 @@
"severity": "medium",
"category": "supply-chain",
"categoryLabel": "SUPPLY CHAIN",
"title": "Unpinned Python Dependency: git+https://git.example.invalid/vendor/django-crud-mixins",
"file": "requirements.txt:26",
"action": "Pin version: git+https://git.example.invalid/vendor/django-crud-mixins==x.y.z",
"effort": "medium"
},
{


@@ -0,0 +1,158 @@
# Plan: Settings Integrity and Controls Reorganization
## Objective
Create a single coherent configuration model for Security, Commands, and Tasks so UI labels, enforcement behavior, docs, and navigation all match actual runtime behavior.
## Current Integrity Findings
### 1) Scope Registry and Enforcement Are Out of Sync
- Gateway command routes enforce scopes that are not exposed in Fine-Grained Security Scopes:
- `gateway.contacts`
- `gateway.help`
- `gateway.whoami`
- Fine-Grained Security Scopes currently expose only:
- `gateway.tasks`, `gateway.approval`, `tasks.submit`, `tasks.commands`, `command.bp`, `command.codex`, `command.claude`
- Result: users cannot configure all enforced gateway capabilities from the UI.
### 2) “Require Trusted Fingerprint” Semantics Are Incorrect
- UI and labels imply trust-list based enforcement.
- Runtime policy enforcement checks `UserXmppOmemoState.latest_client_key` equality, not `UserXmppOmemoTrustedKey` trust records.
- Result: behavior is “match latest observed key,” not “require trusted fingerprint.”
### 3) Command Surfaces Are Split Across Inconsistent Places
- Command Routing UI create flow exposes command slugs: `bp`, `codex`.
- Runtime command engine auto-bootstraps `claude` profile and bindings.
- Security scopes include `command.claude`, but Command Routing create UI does not.
- Result: commands are partially configurable depending on entrypoint.
### 4) Task and Command Control Planes Interlock Implicitly, Not Explicitly
- Task settings contain provider approval routing (Codex/Claude approver service/identifier).
- Security permissions contain policy gates (`tasks.*`, `command.*`, `gateway.*`).
- Command Routing controls profile/binding/variant policy.
- These are tightly coupled but not represented as one layered model in UI/docs.
### 5) Settings Shell Coverage Is Incomplete
- Shared settings hierarchy nav is context-processor driven and route-name based.
- Settings routes missing from modules/general/security group coverage include:
- `codex_settings`
- `codex_approval`
- `translation_preview`
- Result: some settings pages can miss expected local tabs/title context.
### 6) Documentation Drift
- Undocumented or under-documented features now in production behavior:
- Fine-Grained Security Scopes + Global Scope Override
- OMEMO trust management and per-direction encryption toggles
- Business Plan Inbox under Settings Modules
- Potentially misleading documentation:
- Security wording implies trusted-key enforcement that is not implemented.
## Reorganization Principles
1. One capability registry, reused by:
- Security Permissions UI
- command/task/gateway dispatch
- documentation generation
2. One settings-shell contract for every `/settings/*` page:
- title
- category tabs
- breadcrumb
3. Explicit layered model:
- Layer A: transport encryption/security
- Layer B: capability permissions (scope policy)
- Layer C: feature configuration (tasks/commands/providers)
4. No hardcoded duplicated scope lists in multiple files.
## Target Information Architecture
### Security
- `Encryption`: OMEMO transport controls + trust management.
- `Permissions`: Fine-Grained Security Scopes (capability policy only).
- `2FA`: account factor settings.
### Modules
- `Commands`: command profiles, bindings, variant policies.
- `Tasks`: extraction/defaults/overrides/provider pipelines.
- `Translation`: translation bridge settings.
- `Availability`: adapter availability controls.
- `Business Plans`: inbox/editor for generated artifacts.
### General
- `Notifications`
- `System`
- `Accessibility`
### AI
- `Models`
- `Traces`
## Phased Execution Plan
### Phase 1: Canonical Capability Registry
1. Add a central capability registry module (single source of truth):
- key
- label
- description
- group (`gateway`, `tasks`, `commands`, `agentic`, etc.)
- owning feature page URL
2. Migrate SecurityPage scope rendering to this registry.
3. Migrate gateway/command/task dispatchers to reference registry keys.
4. Add automated integrity test:
- every enforced scope key must exist in registry
- every registry key marked user-configurable must appear in Permissions UI
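The integrity test above reduces to two set comparisons. A sketch under assumed shapes (`REGISTRY` and the helper are illustrative, not the real registry module):

```python
# Illustrative registry: capability key -> metadata.
REGISTRY = {
    "gateway.tasks": {"user_configurable": True},
    "gateway.help": {"user_configurable": True},
}

def check_integrity(enforced_keys, permissions_ui_keys):
    """Return the two drift sets the Phase 1 test would assert are empty:
    enforced scopes missing from the registry, and user-configurable
    registry keys missing from the Permissions UI."""
    missing_from_registry = set(enforced_keys) - set(REGISTRY)
    configurable = {k for k, meta in REGISTRY.items() if meta.get("user_configurable")}
    missing_from_ui = configurable - set(permissions_ui_keys)
    return missing_from_registry, missing_from_ui
```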
### Phase 2: Trusted Key Enforcement Correction
1. Define authoritative trust policy behavior:
- `require_trusted_fingerprint` must validate against `UserXmppOmemoTrustedKey`.
2. Preserve backwards compatibility via migration path:
- existing latest-key behavior can be temporarily represented as an optional fallback mode.
3. Update labels/help to match exact behavior.
4. Add tests:
- trusted key allows
- untrusted key denies
- unknown key denies
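The target semantics reduce to membership in the trust records rather than equality with the latest observed key. A sketch with illustrative names:

```python
def trusted_key_decision(presented_key: str, trusted_keys: set) -> str:
    """Phase 2 semantics: allow only keys present in the trust records;
    untrusted and unknown (missing) keys are both denied."""
    key = (presented_key or "").strip().lower()
    if not key:
        return "deny-unknown"
    return "allow" if key in trusted_keys else "deny-untrusted"
```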
### Phase 3: Commands/Tasks Control Plane Alignment
1. Unify command surface definitions:
- Command Routing create/edit options include all supported command slugs (`bp`, `codex`, `claude`).
2. Add explicit cross-links:
- Tasks settings references Command Routing and Permissions scopes directly.
- Command Routing references Permissions scopes affecting each profile.
3. Introduce capability-impact preview panel:
- for each command/task action, show effective allow/deny by scope and channel.
### Phase 4: Settings Shell Normalization
1. Replace route-name allowlists in `settings_hierarchy_nav` with category mapping table.
2. Ensure all `/settings/*` pages declare category + tab metadata.
3. Include missing routes (`codex_settings`, `codex_approval`, `translation_preview`) in shell.
4. Add test to fail when a `/settings/*` route lacks shell metadata.
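Such a test can be approximated as a lookup against the category mapping table; the table shape and route names below are assumptions for illustration:

```python
# Hypothetical shell-metadata table keyed by route name.
SHELL_METADATA = {
    "codex_settings": {"category": "modules", "tab": "Commands"},
    "codex_approval": {"category": "modules", "tab": "Commands"},
    "translation_preview": {"category": "modules", "tab": "Translation"},
}

def routes_missing_shell_metadata(settings_route_names):
    """The Phase 4 test would fail when this returns a non-empty list."""
    return sorted(r for r in settings_route_names if r not in SHELL_METADATA)
```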
### Phase 5: Documentation Synchronization
1. Add a settings matrix doc generated (or validated) from the capability registry:
- capability key
- UI location
- enforced by code path
2. Update `README.md` and `INSTALL.md` security/modules sections.
3. Add "policy semantics" section clarifying:
- encryption-required vs per-scope OMEMO requirements
- trusted key behavior
- global override precedence
## Acceptance Criteria
- Every enforced scope is user-visible/configurable (or intentionally internal and documented).
- “Require Trusted Fingerprint” enforcement uses trust records, not only latest observed key.
- Command Routing and runtime-supported command slugs are aligned.
- All `/settings/*` pages show consistent settings shell navigation.
- Security/tasks/commands docs reflect real behavior and pass integrity checks.
## Risks and Mitigations
- Risk: policy behavior change blocks existing workflows.
- Mitigation: add compatibility flag and staged rollout.
- Risk: registry migration introduces missing scope mappings.
- Mitigation: integrity test that compares runtime-enforced keys vs registry.
- Risk: UI complexity increase.
- Mitigation: keep layered model with concise, context-aware summaries.
## Implementation Notes
- Keep migration incremental; avoid big-bang rewrite.
- Prioritize Phase 1 + Phase 2 first, because they are correctness and security semantics issues.
- Do not add new transport-specific branches; keep service-agnostic evaluation path in policy engine.


@@ -280,6 +280,43 @@ def _extract_signal_delete(envelope):
    }
def _extract_signal_group_identifiers(payload: dict) -> list[str]:
    envelope = payload.get("envelope") or {}
    if not isinstance(envelope, dict):
        return []
    candidates = []
    paths = [
        ("group",),
        ("groupV2",),
        ("dataMessage", "groupInfo"),
        ("dataMessage", "groupV2"),
        ("syncMessage", "sentMessage", "groupInfo"),
        ("syncMessage", "sentMessage", "groupV2"),
        ("syncMessage", "sentMessage", "message", "groupInfo"),
        ("syncMessage", "sentMessage", "message", "groupV2"),
    ]
    key_names = ("id", "groupId", "groupID", "internal_id", "masterKey")
    for path in paths:
        node = _get_nested(envelope, path)
        if isinstance(node, str):
            candidates.append(node)
            continue
        if not isinstance(node, dict):
            continue
        for key_name in key_names:
            value = str(node.get(key_name) or "").strip()
            if value:
                candidates.append(value)
    unique = []
    for value in candidates:
        cleaned = str(value or "").strip()
        if cleaned and cleaned not in unique:
            unique.append(cleaned)
    return unique
def _extract_signal_text(raw_payload, default_text=""):
    text = str(default_text or "").strip()
    if text:
@@ -332,6 +369,22 @@ def _identifier_candidates(*values):
    return out
def _dedupe_person_identifiers(rows):
    deduped = []
    seen = set()
    for row in rows or []:
        key = (
            int(getattr(row, "user_id", 0) or 0),
            str(getattr(row, "person_id", "") or ""),
            str(getattr(row, "service", "") or "").strip().lower(),
        )
        if key in seen:
            continue
        seen.add(key)
        deduped.append(row)
    return deduped
def _digits_only(value):
    return re.sub(r"[^0-9]", "", str(value or "").strip())
@@ -762,6 +815,31 @@ class HandleMessage(Command):
            log.warning("Signal reaction relay to XMPP failed: %s", exc)
            return
        if reply_to_self and str(text or "").strip().startswith("."):
            responded_user_ids = set()
            for identifier in identifiers:
                if identifier.user_id in responded_user_ids:
                    continue
                responded_user_ids.add(identifier.user_id)
                gateway_replies = await self.ur.xmpp.client.execute_gateway_command(
                    sender_user=identifier.user,
                    body=text,
                    service=self.service,
                    channel_identifier=str(identifier.identifier or ""),
                    sender_identifier=str(identifier.identifier or ""),
                    local_message=None,
                    message_meta={
                        "signal": {
                            "source_uuid": str(effective_source_uuid or ""),
                            "source_number": str(effective_source_number or ""),
                            "reply_to_self": True,
                        }
                    },
                )
                for line in gateway_replies:
                    await c.send(f"[>] {line}")
            return
        # Handle attachments across multiple Signal payload variants.
        attachment_list = _extract_attachments(raw)
        xmpp_attachments = []
@@ -1087,6 +1165,130 @@ class SignalClient(ClientBase):
        self.client.register(HandleMessage(self.ur, self.service))
        self._command_task = None
        self._raw_receive_task = None
        self._catalog_refresh_task = None
    async def _signal_api_get_list(
        self, session: aiohttp.ClientSession, path: str
    ) -> list[dict]:
        base = str(getattr(settings, "SIGNAL_HTTP_URL", "http://signal:8080")).rstrip(
            "/"
        )
        try:
            async with session.get(f"{base}{path}") as response:
                if response.status != 200:
                    body = str(await response.text()).strip()[:300]
                    self.log.warning(
                        "signal catalog fetch failed path=%s status=%s body=%s",
                        path,
                        response.status,
                        body or "-",
                    )
                    return []
                payload = await response.json(content_type=None)
        except Exception as exc:
            self.log.warning("signal catalog fetch failed path=%s error=%s", path, exc)
            return []
        return payload if isinstance(payload, list) else []
    async def _refresh_runtime_catalog(self) -> None:
        signal_number = str(getattr(settings, "SIGNAL_NUMBER", "") or "").strip()
        if not signal_number:
            return
        encoded_account = quote_plus(signal_number)
        timeout = aiohttp.ClientTimeout(total=20)
        async with aiohttp.ClientSession(timeout=timeout) as session:
            identities = await self._signal_api_get_list(
                session, f"/v1/identities/{encoded_account}"
            )
            groups = await self._signal_api_get_list(
                session, f"/v1/groups/{encoded_account}"
            )
        account_digits = _digits_only(signal_number)
        contact_rows = []
        seen_contacts = set()
        for identity in identities:
            if not isinstance(identity, dict):
                continue
            number = str(identity.get("number") or "").strip()
            uuid = str(identity.get("uuid") or "").strip()
            if account_digits and number and _digits_only(number) == account_digits:
                continue
            identifiers = []
            for candidate in (number, uuid):
                cleaned = str(candidate or "").strip()
                if cleaned and cleaned not in identifiers:
                    identifiers.append(cleaned)
            if not identifiers:
                continue
            key = (
                f"uuid:{uuid.lower()}"
                if uuid
                else f"phone:{_digits_only(number) or number}"
            )
            if key in seen_contacts:
                continue
            seen_contacts.add(key)
            contact_rows.append(
                {
                    "identifier": number or uuid,
                    "identifiers": identifiers,
                    "name": str(identity.get("name") or "").strip(),
                    "number": number,
                    "uuid": uuid,
                }
            )
        group_rows = []
        seen_groups = set()
        for group in groups:
            if not isinstance(group, dict):
                continue
            group_id = str(group.get("id") or "").strip()
            internal_id = str(group.get("internal_id") or "").strip()
            identifiers = []
            for candidate in (group_id, internal_id):
                cleaned = str(candidate or "").strip()
                if cleaned and cleaned not in identifiers:
                    identifiers.append(cleaned)
            if not identifiers:
                continue
            key = group_id or internal_id
            if key in seen_groups:
                continue
            seen_groups.add(key)
            group_rows.append(
                {
                    "identifier": group_id or internal_id,
                    "identifiers": identifiers,
                    "name": str(group.get("name") or "").strip(),
                    "id": group_id,
                    "internal_id": internal_id,
                }
            )
        transport.update_runtime_state(
            self.service,
            accounts=[signal_number],
            contacts=contact_rows,
            groups=group_rows,
            catalog_refreshed_at=int(time.time()),
            catalog_error="",
        )
    async def _refresh_runtime_catalog_safe(self) -> None:
        try:
            await self._refresh_runtime_catalog()
        except asyncio.CancelledError:
            raise
        except Exception as exc:
            transport.update_runtime_state(
                self.service,
                catalog_error=str(exc).strip()[:300],
                catalog_refreshed_at=int(time.time()),
            )
            self.log.warning("signal catalog refresh failed: %s", exc)
    async def _drain_runtime_commands(self):
        """Process queued runtime commands (e.g., web UI sends via composite router)."""
@@ -1237,7 +1439,104 @@ class SignalClient(ClientBase):
                self.log.warning(f"Command loop error: {exc}")
                await asyncio.sleep(1)
    async def _resolve_signal_group_identifiers(self, group_candidates: list[str]):
        unique_candidates = []
        for value in group_candidates or []:
            cleaned = str(value or "").strip()
            if cleaned and cleaned not in unique_candidates:
                unique_candidates.append(cleaned)
        if not unique_candidates:
            return []
        runtime_groups = transport.get_runtime_state(self.service).get("groups") or []
        expanded = list(unique_candidates)
        for item in runtime_groups:
            if not isinstance(item, dict):
                continue
            identifiers = []
            for candidate in item.get("identifiers") or []:
                cleaned = str(candidate or "").strip()
                if cleaned:
                    identifiers.append(cleaned)
            if not identifiers:
                continue
            if not any(candidate in identifiers for candidate in unique_candidates):
                continue
            for candidate in identifiers:
                if candidate not in expanded:
                    expanded.append(candidate)
        exact_identifiers = await sync_to_async(list)(
            PersonIdentifier.objects.filter(
                service=self.service,
                identifier__in=expanded,
            ).select_related("user", "person")
        )
        if exact_identifiers:
            return exact_identifiers
        bare_candidates = []
        for candidate in expanded:
            bare = str(candidate or "").strip().split("@", 1)[0].strip()
            if bare and bare not in bare_candidates:
                bare_candidates.append(bare)
        links = await sync_to_async(list)(
            PlatformChatLink.objects.filter(
                service=self.service,
                is_group=True,
                chat_identifier__in=bare_candidates,
            )
            .select_related("user", "person_identifier", "person")
            .order_by("id")
        )
        if not links:
            return []
        results = []
        seen_ids = set()
        for link in links:
            if link.person_identifier_id:
                if link.person_identifier_id not in seen_ids:
                    seen_ids.add(link.person_identifier_id)
                    results.append(link.person_identifier)
                continue
            if not link.person_id:
                continue
            group_pi = await sync_to_async(
                lambda: PersonIdentifier.objects.filter(
                    user=link.user,
                    person=link.person,
                    service=self.service,
                    identifier__in=expanded,
                )
                .select_related("user", "person")
                .first()
            )()
            if group_pi is None:
                group_pi = await sync_to_async(
                    lambda: PersonIdentifier.objects.filter(
                        user=link.user,
                        person=link.person,
                        service=self.service,
                    )
                    .select_related("user", "person")
                    .first()
                )()
            if group_pi is not None and group_pi.id not in seen_ids:
                seen_ids.add(group_pi.id)
                results.append(group_pi)
        return results
    async def _resolve_signal_identifiers(
        self,
        source_uuid: str,
        source_number: str,
        group_candidates: list[str] | None = None,
    ):
        group_rows = await self._resolve_signal_group_identifiers(group_candidates or [])
        if group_rows:
            return _dedupe_person_identifiers(group_rows)
        candidates = _identifier_candidates(source_uuid, source_number)
        if not candidates:
            return []
@@ -1248,7 +1547,7 @@ class SignalClient(ClientBase):
            )
        )
        if identifiers:
            return _dedupe_person_identifiers(identifiers)
        candidate_digits = {_digits_only(value) for value in candidates}
        candidate_digits = {value for value in candidate_digits if value}
        if not candidate_digits:
@@ -1256,11 +1555,13 @@ class SignalClient(ClientBase):
        rows = await sync_to_async(list)(
            PersonIdentifier.objects.filter(service=self.service).select_related("user")
        )
        return _dedupe_person_identifiers(
            [
                row
                for row in rows
                if _digits_only(getattr(row, "identifier", "")) in candidate_digits
            ]
        )
    async def _auto_link_single_user_signal_identifier(
        self, source_uuid: str, source_number: str
@@ -1301,7 +1602,123 @@ class SignalClient(ClientBase):
fallback_identifier, fallback_identifier,
int(owner.id), int(owner.id),
) )
return [pi] return _dedupe_person_identifiers([pi])
    async def _build_xmpp_relay_attachments(self, payload: dict):
        attachment_list = _extract_attachments(payload)
        xmpp_attachments = []
        compose_media_urls = []
        if not attachment_list:
            return xmpp_attachments, compose_media_urls
        fetched_attachments = await asyncio.gather(
            *[signalapi.fetch_signal_attachment(att["id"]) for att in attachment_list]
        )
        for fetched, att in zip(fetched_attachments, attachment_list):
            if not fetched:
                self.log.warning(
                    "signal raw attachment fetch failed attachment_id=%s", att["id"]
                )
                continue
            xmpp_attachments.append(
                {
                    "content": fetched["content"],
                    "content_type": fetched["content_type"],
                    "filename": fetched["filename"],
                    "size": fetched["size"],
                }
            )
            blob_key = media_bridge.put_blob(
                service="signal",
                content=fetched["content"],
                filename=fetched["filename"],
                content_type=fetched["content_type"],
            )
            if blob_key:
                compose_media_urls.append(
                    f"/compose/media/blob/?key={quote_plus(str(blob_key))}"
                )
        return xmpp_attachments, compose_media_urls

    async def _relay_signal_inbound_to_xmpp(
        self,
        *,
        identifiers,
        relay_text,
        xmpp_attachments,
        compose_media_urls,
        source_uuid,
        source_number,
        ts,
    ):
        resolved_text_by_session = {}
        for identifier in identifiers:
            user = identifier.user
            session_key = (identifier.user.id, identifier.person.id)
            mutate_manips = await sync_to_async(list)(
                Manipulation.objects.filter(
                    group__people=identifier.person,
                    user=identifier.user,
                    mode="mutate",
                    filter_enabled=True,
                    enabled=True,
                )
            )
            if mutate_manips:
                uploaded_urls = []
                for manip in mutate_manips:
                    prompt = replies.generate_mutate_reply_prompt(
                        relay_text,
                        None,
                        manip,
                        None,
                    )
                    result = await ai.run_prompt(
                        prompt,
                        manip.ai,
                        operation="signal_mutate",
                    )
                    uploaded_urls = await self.ur.xmpp.client.send_from_external(
                        user,
                        identifier,
                        result,
                        False,
                        attachments=xmpp_attachments,
                        source_ref={
                            "upstream_message_id": "",
                            "upstream_author": str(
                                source_uuid or source_number or ""
                            ),
                            "upstream_ts": int(ts or 0),
                        },
                    )
                resolved_text = relay_text
                if (not resolved_text) and uploaded_urls:
                    resolved_text = "\n".join(uploaded_urls)
                elif (not resolved_text) and compose_media_urls:
                    resolved_text = "\n".join(compose_media_urls)
                resolved_text_by_session[session_key] = resolved_text
                continue
            uploaded_urls = await self.ur.xmpp.client.send_from_external(
                user,
                identifier,
                relay_text,
                False,
                attachments=xmpp_attachments,
                source_ref={
                    "upstream_message_id": "",
                    "upstream_author": str(source_uuid or source_number or ""),
                    "upstream_ts": int(ts or 0),
                },
            )
            resolved_text = relay_text
            if (not resolved_text) and uploaded_urls:
                resolved_text = "\n".join(uploaded_urls)
            elif (not resolved_text) and compose_media_urls:
                resolved_text = "\n".join(compose_media_urls)
            resolved_text_by_session[session_key] = resolved_text
        return resolved_text_by_session
    async def _process_raw_inbound_event(self, raw_message: str):
        try:
@@ -1361,6 +1778,15 @@ class SignalClient(ClientBase):
        envelope = payload.get("envelope") or {}
        if not isinstance(envelope, dict):
            return
        group_candidates = _extract_signal_group_identifiers(payload)
        preferred_group_id = ""
        for candidate in group_candidates:
            cleaned = str(candidate or "").strip()
            if cleaned.startswith("group."):
                preferred_group_id = cleaned
                break
        if not preferred_group_id and group_candidates:
            preferred_group_id = str(group_candidates[0] or "").strip()
        sync_sent_message = _get_nested(envelope, ("syncMessage", "sentMessage")) or {}
        if isinstance(sync_sent_message, dict) and sync_sent_message:
            raw_text = sync_sent_message.get("message")
@@ -1395,6 +1821,7 @@ class SignalClient(ClientBase):
            identifiers = await self._resolve_signal_identifiers(
                destination_uuid,
                destination_number,
                group_candidates,
            )
            if not identifiers:
                identifiers = await self._auto_link_single_user_signal_identifier(
@@ -1532,7 +1959,12 @@ class SignalClient(ClientBase):
                or str(payload.get("account") or "").strip()
                or "self"
            )
            source_chat_id = (
                preferred_group_id
                or destination_number
                or destination_uuid
                or sender_key
            )
            reply_ref = reply_sync.extract_reply_ref(self.service, payload)
            for identifier in identifiers:
                session = await history.get_chat_session(
@@ -1599,7 +2031,11 @@ class SignalClient(ClientBase):
        ):
            return
        identifiers = await self._resolve_signal_identifiers(
            source_uuid,
            source_number,
            group_candidates,
        )
        reaction_payload = _extract_signal_reaction(envelope)
        edit_payload = _extract_signal_edit(envelope)
        delete_payload = _extract_signal_delete(envelope)
@@ -1620,6 +2056,7 @@ class SignalClient(ClientBase):
            identifiers = await self._resolve_signal_identifiers(
                destination_uuid,
                destination_number,
                group_candidates,
            )
            if not identifiers:
                identifiers = await self._auto_link_single_user_signal_identifier(
@@ -1730,7 +2167,13 @@ class SignalClient(ClientBase):
        text = _extract_signal_text(
            payload, str(data_message.get("message") or "").strip()
        )
        relay_text = text
        xmpp_attachments, compose_media_urls = await self._build_xmpp_relay_attachments(
            payload
        )
        if xmpp_attachments and _is_compose_blob_only_text(relay_text):
            relay_text = ""
        if not relay_text and not xmpp_attachments:
            return
        ts_raw = (
@@ -1753,8 +2196,17 @@ class SignalClient(ClientBase):
            or source_number
            or (identifiers[0].identifier if identifiers else "")
        )
        source_chat_id = preferred_group_id or source_number or source_uuid or sender_key
        reply_ref = reply_sync.extract_reply_ref(self.service, payload)
        resolved_text_by_session = await self._relay_signal_inbound_to_xmpp(
            identifiers=identifiers,
            relay_text=relay_text,
            xmpp_attachments=xmpp_attachments,
            compose_media_urls=compose_media_urls,
            source_uuid=source_uuid,
            source_number=source_number,
            ts=ts,
        )
        for identifier in identifiers:
            session = await history.get_chat_session(identifier.user, identifier)
@@ -1773,10 +2225,14 @@ class SignalClient(ClientBase):
            )()
            if exists:
                continue
            message_text = resolved_text_by_session.get(
                (identifier.user.id, identifier.person.id),
                relay_text if relay_text else "\n".join(compose_media_urls),
            )
            local_message = await history.store_message(
                session=session,
                sender=sender_key,
                text=message_text,
                ts=ts,
                outgoing=False,
                source_service=self.service,
@@ -1792,7 +2248,7 @@ class SignalClient(ClientBase):
            await self.ur.message_received(
                self.service,
                identifier=identifier,
                text=message_text,
                ts=ts,
                payload=payload,
                local_message=local_message,
@@ -1809,16 +2265,45 @@ class SignalClient(ClientBase):
        if not signal_number:
            return
        uri = f"ws://{SIGNAL_URL}/v1/receive/{signal_number}"
        poll_uri = f"http://{SIGNAL_URL}/v1/receive/{signal_number}"
        use_http_polling = False
        while not self._stopping:
            try:
                transport.update_runtime_state(self.service, accounts=[signal_number])
                if use_http_polling:
                    timeout = aiohttp.ClientTimeout(total=15)
                    async with aiohttp.ClientSession(timeout=timeout) as session:
                        async with session.get(poll_uri) as response:
                            response.raise_for_status()
                            payload = await response.json(content_type=None)
                    if isinstance(payload, dict):
                        payload = [payload]
                    if not isinstance(payload, list):
                        payload = []
                    for item in payload:
                        if not isinstance(item, dict):
                            continue
                        await self._process_raw_inbound_event(json.dumps(item))
                    continue
                async with websockets.connect(uri, ping_interval=None) as websocket:
                    async for raw_message in websocket:
                        await self._process_raw_inbound_event(raw_message)
            except asyncio.CancelledError:
                raise
            except Exception as exc:
                if (
                    not use_http_polling
                    and "server rejected WebSocket connection: HTTP 200" in str(exc)
                ):
                    self.log.info(
                        "signal raw-receive switching to HTTP polling for %s",
                        signal_number,
                    )
                    use_http_polling = True
                    continue
                self.log.warning("signal raw-receive loop error: %s", exc)
                await asyncio.sleep(2)
        transport.update_runtime_state(self.service, accounts=[])

    def start(self):
        self.log.info("Signal client starting...")
@@ -1828,6 +2313,10 @@
        self._command_task = self.loop.create_task(self._command_loop())
        if not self._raw_receive_task or self._raw_receive_task.done():
            self._raw_receive_task = self.loop.create_task(self._raw_receive_loop())
        if not self._catalog_refresh_task or self._catalog_refresh_task.done():
            self._catalog_refresh_task = self.loop.create_task(
                self._refresh_runtime_catalog_safe()
            )
        # Keep signalbot's internal receive consumer disabled to avoid competing
        # consumers. The raw loop adapts between websocket and HTTP polling
        # depending on the deployed signal-cli-rest-api behavior.
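The transport fallback above only flips to HTTP polling once, and only on the specific rejection that signal-cli-rest-api returns when its websocket endpoint is unavailable. That decision reduces to a small predicate; a minimal sketch (the function name `should_switch_to_polling` is illustrative, not part of this codebase):

```python
def should_switch_to_polling(error_text: str, already_polling: bool) -> bool:
    """Decide whether a raw-receive loop should fall back to HTTP polling.

    Switch at most once, and only for the specific websocket rejection
    observed from signal-cli-rest-api; any other error keeps the current
    transport and is retried after a short sleep.
    """
    if already_polling:
        return False
    return "server rejected WebSocket connection: HTTP 200" in str(error_text)
```

Keeping the condition this narrow means transient network errors never silently downgrade the transport; only the deterministic "endpoint is not a websocket" response does.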

@@ -5,6 +5,7 @@ import os
import secrets
import shutil
import time
from pathlib import Path
from typing import Any
from urllib.parse import quote_plus
@@ -444,23 +445,6 @@ def list_accounts(service: str):
    Return account identifiers for service UI list.
    """
    service_key = _service_key(service)
    state = get_runtime_state(service_key)
    accounts = state.get("accounts") or []
    if service_key == "whatsapp" and not accounts:
@@ -495,13 +479,24 @@ def _wipe_signal_cli_local_state() -> bool:
    Best-effort local signal-cli state reset for json-rpc deployments where
    REST account delete endpoints are unavailable.
    """
    config_roots = []
    base_dir = getattr(settings, "BASE_DIR", None)
    if base_dir:
        config_roots.append(str(Path(base_dir) / "signal-cli-config"))
    config_roots.extend(
        [
            "/code/signal-cli-config",
            "/signal-cli-config",
            "/home/.local/share/signal-cli",
        ]
    )
    removed_any = False
    seen_roots = set()
    for root in config_roots:
        root = str(root or "").strip()
        if not root or root in seen_roots:
            continue
        seen_roots.add(root)
        if not os.path.isdir(root):
            continue
        try:
@@ -554,7 +549,26 @@ def unlink_account(service: str, account: str) -> bool:
            continue
        if unlinked:
            return True
        wiped = _wipe_signal_cli_local_state()
        if not wiped:
            return False
        # Best-effort verification: if the REST API still reports the same account,
        # the runtime likely still holds active linked state and the UI should not
        # claim relink is ready yet.
        remaining_accounts = list_accounts("signal")
        for row in remaining_accounts:
            if isinstance(row, dict):
                candidate = (
                    row.get("number")
                    or row.get("id")
                    or row.get("jid")
                    or row.get("account")
                )
            else:
                candidate = row
            if _account_key(str(candidate or "")) == _account_key(account_value):
                return False
        return True
    if service_key in {"whatsapp", "instagram"}:
        state = get_runtime_state(service_key)
@@ -842,9 +856,7 @@ async def send_message_raw(
        runtime_result = await runtime_client.send_message_raw(
            recipient,
            text=text,
            attachments=attachments or [],
            metadata=dict(metadata or {}),
        )
        if runtime_result is not False and runtime_result is not None:
@@ -853,11 +865,8 @@ async def send_message_raw(
        log.warning("%s runtime send failed: %s", service_key, exc)
    # Web/UI process cannot access UR in-process runtime client directly.
    # Hand off send to UR via shared cache command queue.
    command_attachments = []
    for att in (attachments or []):
        row = dict(att or {})
        # Keep payload cache-friendly and avoid embedding raw bytes.
        for key in ("content",):
@@ -1082,9 +1091,8 @@ async def fetch_attachment(service: str, attachment_ref: dict):
    service_key = _service_key(service)
    if service_key == "signal":
        attachment_id = attachment_ref.get("id") or attachment_ref.get("attachment_id")
        if attachment_id:
            return await signalapi.fetch_signal_attachment(attachment_id)
    runtime_client = get_runtime_client(service_key)
    if runtime_client and hasattr(runtime_client, "fetch_attachment"):
@@ -1160,7 +1168,7 @@ def get_link_qr(service: str, device_name: str):
        response = requests.get(
            f"{base}/v1/qrcodelink",
            params={"device_name": device},
            timeout=5,
        )
        response.raise_for_status()
        return response.content
@@ -2105,45 +2105,28 @@ class WhatsAppClient(ClientBase):
        """
        Extract user-visible text from diverse WhatsApp message payload shapes.
        """
        for candidate in self._iter_message_variants(msg_obj):
            for value in (
                self._pluck(candidate, "conversation"),
                self._pluck(candidate, "Conversation"),
                self._pluck(candidate, "extendedTextMessage", "text"),
                self._pluck(candidate, "ExtendedTextMessage", "Text"),
                self._pluck(candidate, "extended_text_message", "text"),
                self._pluck(candidate, "imageMessage", "caption"),
                self._pluck(candidate, "videoMessage", "caption"),
                self._pluck(candidate, "documentMessage", "caption"),
            ):
                text = str(value or "").strip()
                if text:
                    return text
        for value in (
            self._pluck(event_obj, "message", "conversation"),
            self._pluck(event_obj, "message", "extendedTextMessage", "text"),
            self._pluck(event_obj, "Message", "conversation"),
            self._pluck(event_obj, "Message", "extendedTextMessage", "text"),
            self._pluck(event_obj, "conversation"),
            self._pluck(event_obj, "text"),
        ):
            text = str(value or "").strip()
            if text:
                return text
@@ -2318,7 +2301,40 @@ class WhatsAppClient(ClientBase):
            return str(user)
        return raw

    def _iter_message_variants(self, message_obj, max_depth: int = 8):
        wrapper_paths = (
            ("deviceSentMessage", "message"),
            ("DeviceSentMessage", "Message"),
            ("ephemeralMessage", "message"),
            ("EphemeralMessage", "Message"),
            ("viewOnceMessage", "message"),
            ("ViewOnceMessage", "Message"),
            ("viewOnceMessageV2", "message"),
            ("ViewOnceMessageV2", "Message"),
            ("viewOnceMessageV2Extension", "message"),
            ("ViewOnceMessageV2Extension", "Message"),
            ("editedMessage", "message"),
            ("EditedMessage", "Message"),
        )
        queue = [(message_obj, 0)]
        seen = set()
        while queue:
            current, depth = queue.pop(0)
            if current is None:
                continue
            marker = id(current)
            if marker in seen:
                continue
            seen.add(marker)
            yield current
            if depth >= max_depth:
                continue
            for path in wrapper_paths:
                nested = self._pluck(current, *path)
                if nested is not None:
                    queue.append((nested, depth + 1))

    def _direct_media_payload(self, message_obj):
        media_fields = (
            "imageMessage",
            "videoMessage",
@@ -2334,8 +2350,17 @@ class WhatsAppClient(ClientBase):
        for field in media_fields:
            value = self._pluck(message_obj, field)
            if value:
                return value
        return None

    def _resolve_media_message(self, message_obj):
        for candidate in self._iter_message_variants(message_obj):
            if self._direct_media_payload(candidate):
                return candidate
        return None

    def _is_media_message(self, message_obj):
        return self._resolve_media_message(message_obj) is not None

    def _infer_media_content_type(self, message_obj):
        if self._pluck(message_obj, "imageMessage") or self._pluck(
@@ -2439,13 +2464,14 @@ class WhatsAppClient(ClientBase):
        if not self._client:
            return []
        msg_obj = self._pluck(event, "message") or self._pluck(event, "Message")
        media_msg = self._resolve_media_message(msg_obj)
        if media_msg is None:
            return []
        if not hasattr(self._client, "download_any"):
            return []
        try:
            payload = await self._maybe_await(self._client.download_any(media_msg))
        except Exception as exc:
            self.log.warning("whatsapp media download failed: %s", exc)
            return []
@@ -2455,19 +2481,21 @@ class WhatsAppClient(ClientBase):
        if not isinstance(payload, (bytes, bytearray)):
            return []
        filename = self._pluck(
            media_msg, "documentMessage", "fileName"
        ) or self._pluck(
            media_msg, "document_message", "file_name"
        )
        content_type = (
            self._pluck(media_msg, "documentMessage", "mimetype")
            or self._pluck(media_msg, "document_message", "mimetype")
            or self._pluck(media_msg, "imageMessage", "mimetype")
            or self._pluck(media_msg, "image_message", "mimetype")
            or self._pluck(media_msg, "videoMessage", "mimetype")
            or self._pluck(media_msg, "video_message", "mimetype")
            or self._pluck(media_msg, "audioMessage", "mimetype")
            or self._pluck(media_msg, "audio_message", "mimetype")
            or self._infer_media_content_type(media_msg)
        )
        if not filename:
            ext = mimetypes.guess_extension(
@@ -2651,6 +2679,38 @@ class WhatsAppClient(ClientBase):
        if not identifiers:
            return
        is_self_chat = bool(
            is_from_me
            and str(sender or "").strip()
            and str(chat or "").strip()
            and str(sender).strip() == str(chat).strip()
        )
        if is_self_chat and str(text or "").strip().startswith("."):
            responded_user_ids = set()
            reply_target = str(chat or sender or "").strip()
            for identifier in identifiers:
                if identifier.user_id in responded_user_ids:
                    continue
                responded_user_ids.add(identifier.user_id)
                replies = await self.ur.xmpp.client.execute_gateway_command(
                    sender_user=identifier.user,
                    body=text,
                    service=self.service,
                    channel_identifier=str(identifier.identifier or ""),
                    sender_identifier=str(identifier.identifier or ""),
                    local_message=None,
                    message_meta={
                        "whatsapp": {
                            "sender": str(sender or ""),
                            "chat": str(chat or ""),
                            "self_chat": True,
                        }
                    },
                )
                for line in replies:
                    await self.send_message_raw(reply_target, f"[>] {line}")
            return
        attachments = await self._download_event_media(event)
        xmpp_attachments = []
        if attachments:
@@ -3186,28 +3246,20 @@ class WhatsAppClient(ClientBase):
        url = (attachment or {}).get("url")
        if url:
            safe_url = validate_attachment_url(url)
            filename, content_type = validate_attachment_metadata(
                filename=(attachment or {}).get("filename")
                or safe_url.rstrip("/").split("/")[-1]
                or "attachment.bin",
                content_type=(attachment or {}).get("content_type")
                or "application/octet-stream",
                size=(attachment or {}).get("size"),
            )
            return {
                "url": safe_url,
                "filename": filename,
                "content_type": content_type,
                "size": (attachment or {}).get("size"),
            }
        return None

    async def send_message_raw(
@@ -3364,18 +3416,26 @@ class WhatsAppClient(ClientBase):
            payload = await self._fetch_attachment_payload(attachment)
            if not payload:
                continue
            data = payload.get("content")
            source_url = str(payload.get("url") or "").strip()
            try:
                filename, mime = validate_attachment_metadata(
                    filename=payload.get("filename") or "attachment.bin",
                    content_type=payload.get("content_type")
                    or "application/octet-stream",
                    size=payload.get("size")
                    or (len(data) if isinstance(data, (bytes, bytearray)) else None),
                )
            except Exception as exc:
                self.log.warning("whatsapp blocked attachment: %s", exc)
                continue
            file_arg = (
                data
                if isinstance(data, (bytes, bytearray))
                else source_url
            )
            if not file_arg:
                continue
            mime = str(mime).lower()
            attachment_target = jid_obj if jid_obj is not None else jid
            send_method = "document"
@@ -3392,27 +3452,31 @@ class WhatsAppClient(ClientBase):
                send_method,
                mime,
                filename,
                (
                    len(data)
                    if isinstance(data, (bytes, bytearray))
                    else (payload.get("size") or 0)
                ),
            )
            try:
                if mime.startswith("image/") and hasattr(self._client, "send_image"):
                    response = await self._maybe_await(
                        self._client.send_image(attachment_target, file_arg, caption="")
                    )
                elif mime.startswith("video/") and hasattr(self._client, "send_video"):
                    response = await self._maybe_await(
                        self._client.send_video(attachment_target, file_arg, caption="")
                    )
                elif mime.startswith("audio/") and hasattr(self._client, "send_audio"):
                    response = await self._maybe_await(
                        self._client.send_audio(attachment_target, file_arg)
                    )
                elif hasattr(self._client, "send_document"):
                    response = await self._maybe_await(
                        self._client.send_document(
                            attachment_target,
                            file_arg,
                            filename=filename,
                            mimetype=mime,
                            caption="",
(File diff suppressed because it is too large.)
@@ -32,35 +32,93 @@ def settings_hierarchy_nav(request):
    translation_href = reverse("translation_settings")
    availability_href = reverse("availability_settings")
    categories = {
        "general": {
            "routes": {
                "notifications_settings",
                "notifications_update",
                "system_settings",
                "accessibility_settings",
            },
            "title": "General",
            "tabs": [
                (
                    "Notifications",
                    notifications_href,
                    lambda: path == notifications_href,
                ),
                ("System", system_href, lambda: path == system_href),
                (
                    "Accessibility",
                    accessibility_href,
                    lambda: path == accessibility_href,
                ),
            ],
        },
        "security": {
            "routes": {
                "security_settings",
                "encryption_settings",
                "permission_settings",
                "security_2fa",
            },
            "title": "Security",
            "tabs": [
                ("Encryption", encryption_href, lambda: path == encryption_href),
                ("Permissions", permissions_href, lambda: path == permissions_href),
                (
                    "2FA",
                    security_2fa_href,
                    lambda: path == security_2fa_href or namespace == "two_factor",
                ),
            ],
        },
        "ai": {
            "routes": {
                "ai_settings",
                "ai_models",
                "ais",
                "ai_create",
                "ai_update",
                "ai_delete",
                "ai_execution_log",
            },
            "title": "AI",
            "tabs": [
                ("Models", ai_models_href, lambda: path == ai_models_href),
                ("Traces", ai_traces_href, lambda: path == ai_traces_href),
            ],
        },
        "modules": {
            "routes": {
                "modules_settings",
                "command_routing",
                "business_plan_inbox",
                "business_plan_editor",
                "tasks_settings",
                "translation_settings",
                "translation_preview",
                "availability_settings",
                "codex_settings",
                "codex_approval",
            },
            "title": "Modules",
            "tabs": [
                ("Commands", commands_href, lambda: path == commands_href),
                (
                    "Business Plans",
                    business_plans_href,
                    lambda: url_name in {"business_plan_inbox", "business_plan_editor"},
                ),
                ("Task Automation", tasks_href, lambda: path == tasks_href),
                (
                    "Translation",
                    translation_href,
                    lambda: url_name in {"translation_settings", "translation_preview"},
                ),
                ("Availability", availability_href, lambda: path == availability_href),
            ],
        },
    }

    two_factor_security_routes = {
@@ -72,55 +130,29 @@ def settings_hierarchy_nav(request):
"phone_delete", "phone_delete",
} }
if url_name in categories["general"]["routes"]:
category = categories["general"]
elif url_name in categories["security"]["routes"] or (
namespace == "two_factor" and url_name in two_factor_security_routes
):
category = categories["security"]
elif url_name in categories["ai"]["routes"]:
category = categories["ai"]
elif url_name in categories["modules"]["routes"]:
category = categories["modules"]
else:
category = None
if category is None:
settings_nav = None
else:
settings_nav = {
"title": str(category.get("title") or "Settings"),
"tabs": [
_tab(label, href, bool(is_active()))
for label, href, is_active in category.get("tabs", [])
],
}
if not settings_nav:
return {}
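The refactor above replaces four hand-built nav dicts with a single category lookup followed by one `_tab` comprehension. A minimal standalone sketch of that pattern, using a stub `_tab` helper (the real `_tab` signature is an assumption here):

```python
def _tab(label, href, active):
    # Stand-in for the project's _tab helper; the returned shape is assumed.
    return {"label": label, "href": href, "active": bool(active)}

categories = {
    "ai": {
        "title": "AI",
        "tabs": [
            ("Models", "/settings/ai/models/", lambda: True),
            ("Traces", "/settings/ai/traces/", lambda: False),
        ],
    },
}

# Each tab carries a lazy is_active() predicate; it is only evaluated
# when the matching category is rendered.
category = categories["ai"]
settings_nav = {
    "title": str(category.get("title") or "Settings"),
    "tabs": [
        _tab(label, href, bool(is_active()))
        for label, href, is_active in category.get("tabs", [])
    ],
}
print([tab["active"] for tab in settings_nav["tabs"]])  # [True, False]
```

Deferring the active check to a lambda keeps the dict declarative: adding a category means adding data, not another `elif` branch.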

core/gateway/builtin.py (new file)

@@ -0,0 +1,417 @@
from __future__ import annotations
import re
from asgiref.sync import sync_to_async
from core.gateway.commands import (
GatewayCommandContext,
GatewayCommandRoute,
dispatch_gateway_command,
)
from core.models import (
CodexPermissionRequest,
CodexRun,
DerivedTask,
ExternalSyncEvent,
Person,
TaskProject,
User,
)
from core.tasks.engine import create_task_record_and_sync, mark_task_completed_and_sync
APPROVAL_PROVIDER_COMMANDS = {
".claude": "claude",
".codex": "codex_cli",
}
APPROVAL_EVENT_PREFIX = "codex_approval"
ACTION_TO_STATUS = {"approve": "approved", "reject": "denied"}
TASK_COMMAND_MATCH_RE = re.compile(r"^\s*(?:\.tasks\b|\.l\b|\.list\b)", re.IGNORECASE)
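The word boundaries in `TASK_COMMAND_MATCH_RE` matter: they let `.l` match while rejecting unrelated dot-commands that merely start with the same letters. A quick standalone check of the same pattern:

```python
import re

# Same pattern as TASK_COMMAND_MATCH_RE above; \b stops ".lx" from matching
# via the ".l" alternative while still allowing ".list".
pattern = re.compile(r"^\s*(?:\.tasks\b|\.l\b|\.list\b)", re.IGNORECASE)

samples = [".tasks list", ".l", ".list open", ".lx", "tasks"]
results = [bool(pattern.match(s)) for s in samples]
print(results)  # [True, True, True, False, False]
```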
def gateway_help_lines() -> list[str]:
return [
"Gateway commands:",
" .contacts — list contacts",
" .whoami — show current user",
" .help — show this help",
"Approval commands:",
" .approval list-pending [all] — list pending approval requests",
" .approval approve <key> — approve a request",
" .approval reject <key> — reject a request",
" .approval status <key> — check request status",
"Task commands:",
" .l — shortcut for open task list",
" .tasks list [status] [limit] — list tasks",
" .tasks add <project> :: <title> — create task in project",
" .tasks show #<ref> — show task details",
" .tasks complete #<ref> — mark task complete",
" .tasks undo #<ref> — remove task",
]
def _resolve_request_provider(request):
event = getattr(request, "external_sync_event", None)
if event is None:
return ""
return str(getattr(event, "provider", "") or "").strip()
async def _apply_approval_decision(request, decision):
status = ACTION_TO_STATUS.get(decision, decision)
request.status = status
await sync_to_async(request.save)(update_fields=["status"])
run = None
if request.codex_run_id:
run = await sync_to_async(CodexRun.objects.get)(pk=request.codex_run_id)
run.status = "approved_waiting_resume" if status == "approved" else status
await sync_to_async(run.save)(update_fields=["status"])
if request.external_sync_event_id:
evt = await sync_to_async(ExternalSyncEvent.objects.get)(
pk=request.external_sync_event_id
)
evt.status = "ok"
await sync_to_async(evt.save)(update_fields=["status"])
user = await sync_to_async(User.objects.get)(pk=request.user_id)
task = None
if run is not None and run.task_id:
task = await sync_to_async(DerivedTask.objects.get)(pk=run.task_id)
ikey = f"{APPROVAL_EVENT_PREFIX}:{request.approval_key}:{status}"
await sync_to_async(ExternalSyncEvent.objects.get_or_create)(
idempotency_key=ikey,
defaults={
"user": user,
"task": task,
"provider": "codex_cli",
"status": "pending",
"payload": {},
"error": "",
},
)
async def _approval_list_pending(user, scope, emit):
_ = scope
requests = await sync_to_async(list)(
CodexPermissionRequest.objects.filter(user=user, status="pending").order_by(
"-requested_at"
)[:20]
)
emit(f"pending={len(requests)}")
for req in requests:
emit(f" {req.approval_key}: {req.summary}")
async def _approval_status(user, approval_key, emit):
try:
req = await sync_to_async(CodexPermissionRequest.objects.get)(
user=user, approval_key=approval_key
)
emit(f"status={req.status} key={req.approval_key}")
except CodexPermissionRequest.DoesNotExist:
emit(f"approval_key_not_found:{approval_key}")
async def handle_approval_command(user, body, emit):
command = str(body or "").strip()
for prefix, expected_provider in APPROVAL_PROVIDER_COMMANDS.items():
if command.startswith(prefix + " ") or command == prefix:
sub = command[len(prefix) :].strip()
parts = sub.split()
if len(parts) >= 2 and parts[0] in ("approve", "reject"):
action, approval_key = parts[0], parts[1]
try:
req = await sync_to_async(
CodexPermissionRequest.objects.select_related(
"external_sync_event"
).get
)(user=user, approval_key=approval_key)
except CodexPermissionRequest.DoesNotExist:
emit(f"approval_key_not_found:{approval_key}")
return True
provider = _resolve_request_provider(req)
if not provider.startswith(expected_provider):
emit(
f"approval_key_not_for_provider:{approval_key} provider={provider}"
)
return True
await _apply_approval_decision(req, action)
emit(f"{action}d: {approval_key}")
return True
emit(f"usage: {prefix} approve|reject <key>")
return True
if not command.startswith(".approval"):
return False
rest = command[len(".approval") :].strip()
if rest.split() and rest.split()[0] in ("approve", "reject"):
parts = rest.split()
action = parts[0]
approval_key = parts[1] if len(parts) > 1 else ""
if not approval_key:
emit("usage: .approval approve|reject <key>")
return True
try:
req = await sync_to_async(
CodexPermissionRequest.objects.select_related("external_sync_event").get
)(user=user, approval_key=approval_key)
except CodexPermissionRequest.DoesNotExist:
emit(f"approval_key_not_found:{approval_key}")
return True
await _apply_approval_decision(req, action)
emit(f"{action}d: {approval_key}")
return True
if rest.startswith("list-pending"):
scope = rest[len("list-pending") :].strip() or "mine"
await _approval_list_pending(user, scope, emit)
return True
if rest.startswith("status "):
approval_key = rest[len("status ") :].strip()
await _approval_status(user, approval_key, emit)
return True
emit(
"approval: .approval approve|reject <key> | "
".approval list-pending [all] | "
".approval status <key>"
)
return True
def _parse_task_create(rest: str) -> tuple[str, str]:
text = str(rest or "").strip()
if not text.lower().startswith("add "):
return "", ""
payload = text[4:].strip()
if "::" in payload:
project_name, title = payload.split("::", 1)
return str(project_name or "").strip(), str(title or "").strip()
return "", ""
async def handle_tasks_command(
user,
body,
emit,
*,
service: str = "",
channel_identifier: str = "",
sender_identifier: str = "",
):
command = str(body or "").strip()
lower_command = command.lower()
if not TASK_COMMAND_MATCH_RE.match(command):
return False
if lower_command.startswith(".tasks"):
rest = command[len(".tasks") :].strip()
elif lower_command.startswith(".list"):
rest = ("list " + command[len(".list") :].strip()).strip()
else:
rest = ("list " + command[2:].strip()).strip()
if rest.startswith("list"):
parts = rest.split()
status_filter = parts[1] if len(parts) > 1 else "open"
limit = int(parts[2]) if len(parts) > 2 and parts[2].isdigit() else 10
tasks = await sync_to_async(list)(
DerivedTask.objects.filter(
user=user, status_snapshot=status_filter
).order_by("-id")[:limit]
)
if not tasks:
emit(f"no {status_filter} tasks")
else:
for task in tasks:
emit(f"#{task.reference_code} [{task.status_snapshot}] {task.title}")
return True
project_name, title = _parse_task_create(rest)
if project_name or rest.startswith("add "):
if not project_name or not title:
emit("usage: .tasks add <project> :: <title>")
return True
project = await sync_to_async(
lambda: TaskProject.objects.filter(user=user, name__iexact=project_name)
.order_by("name")
.first()
)()
if project is None:
emit(f"project_not_found:{project_name}")
return True
task, _event = await create_task_record_and_sync(
user=user,
project=project,
title=title,
source_service=str(service or "web").strip().lower() or "web",
source_channel=str(channel_identifier or "").strip(),
actor_identifier=str(sender_identifier or "").strip(),
immutable_payload={
"origin": "gateway.tasks.add",
"channel_service": str(service or "").strip().lower(),
"channel_identifier": str(channel_identifier or "").strip(),
},
event_payload={
"command": ".tasks add",
"via": "gateway_builtin",
},
)
emit(f"created #{task.reference_code} [{project.name}] {task.title}")
return True
if rest.startswith("show "):
ref = rest[len("show ") :].strip().lstrip("#")
try:
task = await sync_to_async(DerivedTask.objects.get)(
user=user, reference_code=ref
)
emit(f"#{task.reference_code} {task.title}")
emit(f"status: {task.status_snapshot}")
except DerivedTask.DoesNotExist:
emit(f"task_not_found:#{ref}")
return True
if rest.startswith("complete "):
ref = rest[len("complete ") :].strip().lstrip("#")
try:
task = await sync_to_async(DerivedTask.objects.select_related("project").get)(
user=user, reference_code=ref
)
await mark_task_completed_and_sync(
task=task,
actor_identifier=str(sender_identifier or "").strip(),
payload={
"marker": ref,
"command": ".tasks complete",
"via": "gateway_builtin",
},
)
emit(f"completed #{ref}")
except DerivedTask.DoesNotExist:
emit(f"task_not_found:#{ref}")
return True
if rest.startswith("undo "):
ref = rest[len("undo ") :].strip().lstrip("#")
try:
task = await sync_to_async(DerivedTask.objects.get)(
user=user, reference_code=ref
)
await sync_to_async(task.delete)()
emit(f"removed #{ref}")
except DerivedTask.DoesNotExist:
emit(f"task_not_found:#{ref}")
return True
emit(
"tasks: .l | .tasks list [status] [limit] | "
".tasks add <project> :: <title> | "
".tasks show #<ref> | "
".tasks complete #<ref> | "
".tasks undo #<ref>"
)
return True
async def dispatch_builtin_gateway_command(
*,
user,
command_text: str,
service: str,
channel_identifier: str,
sender_identifier: str,
source_message,
message_meta: dict,
payload: dict,
emit,
) -> bool:
text = str(command_text or "").strip()
async def _contacts_handler(_ctx, out):
persons = await sync_to_async(list)(Person.objects.filter(user=user).order_by("name"))
if not persons:
out("No contacts found.")
return True
out("Contacts: " + ", ".join([p.name for p in persons]))
return True
async def _help_handler(_ctx, out):
for line in gateway_help_lines():
out(line)
return True
async def _whoami_handler(_ctx, out):
out(f"user={user.get_username()} id={user.pk}")
return True
async def _approval_handler(_ctx, out):
return await handle_approval_command(user, text, out)
async def _tasks_handler(_ctx, out):
return await handle_tasks_command(
user,
text,
out,
service=service,
channel_identifier=channel_identifier,
sender_identifier=sender_identifier,
)
routes = [
GatewayCommandRoute(
name="contacts",
scope_key="gateway.contacts",
matcher=lambda value: str(value or "").strip().lower() == ".contacts",
handler=_contacts_handler,
),
GatewayCommandRoute(
name="help",
scope_key="gateway.help",
matcher=lambda value: str(value or "").strip().lower() == ".help",
handler=_help_handler,
),
GatewayCommandRoute(
name="whoami",
scope_key="gateway.whoami",
matcher=lambda value: str(value or "").strip().lower() == ".whoami",
handler=_whoami_handler,
),
GatewayCommandRoute(
name="approval",
scope_key="gateway.approval",
matcher=lambda value: str(value or "").strip().lower().startswith(".approval")
or any(
str(value or "").strip().lower().startswith(prefix + " ")
or str(value or "").strip().lower() == prefix
for prefix in APPROVAL_PROVIDER_COMMANDS
),
handler=_approval_handler,
),
GatewayCommandRoute(
name="tasks",
scope_key="gateway.tasks",
matcher=lambda value: bool(TASK_COMMAND_MATCH_RE.match(str(value or ""))),
handler=_tasks_handler,
),
]
handled = await dispatch_gateway_command(
context=GatewayCommandContext(
user=user,
source_message=source_message,
service=str(service or "xmpp"),
channel_identifier=str(channel_identifier or ""),
sender_identifier=str(sender_identifier or ""),
message_text=text,
message_meta=dict(message_meta or {}),
payload=dict(payload or {}),
),
routes=routes,
emit=emit,
)
if not handled and text.startswith("."):
emit("No such command")
return handled


@@ -1,10 +1,12 @@
from __future__ import annotations
import datetime
import json
import time
from pathlib import Path
from typing import Any
from asgiref.sync import async_to_sync
from django.conf import settings
from django.db.models import Q
from django.utils import timezone
@@ -27,9 +29,12 @@ from core.models import (
MemoryChangeRequest,
MemoryItem,
TaskArtifactLink,
TaskEpic,
TaskProject,
User,
WorkspaceConversation,
)
from core.tasks.engine import create_task_record_and_sync, mark_task_completed_and_sync
from core.util import logs
log = logs.get_logger("mcp-tools")
@@ -508,6 +513,55 @@ def tool_tasks_search(arguments: dict[str, Any]) -> dict[str, Any]:
return tool_tasks_list(arguments)
def tool_tasks_create(arguments: dict[str, Any]) -> dict[str, Any]:
user_id = int(arguments.get("user_id"))
project_id = str(arguments.get("project_id") or "").strip()
title = str(arguments.get("title") or "").strip()
if not project_id:
raise ValueError("project_id is required")
if not title:
raise ValueError("title is required")
project = TaskProject.objects.filter(user_id=user_id, id=project_id).first()
if project is None:
raise ValueError("project_id not found")
epic_id = str(arguments.get("epic_id") or "").strip()
epic = None
if epic_id:
epic = TaskEpic.objects.filter(project=project, id=epic_id).first()
if epic is None:
raise ValueError("epic_id not found for project")
due_at = str(arguments.get("due_date") or "").strip()
due_date = None
if due_at:
try:
due_date = datetime.date.fromisoformat(due_at)
except Exception as exc:
raise ValueError("due_date must be YYYY-MM-DD") from exc
task, event = async_to_sync(create_task_record_and_sync)(
user=project.user,
project=project,
epic=epic,
title=title,
source_service=str(arguments.get("source_service") or "web").strip().lower()
or "web",
source_channel=str(arguments.get("source_channel") or "").strip(),
actor_identifier=str(arguments.get("actor_identifier") or "").strip(),
due_date=due_date,
assignee_identifier=str(arguments.get("assignee_identifier") or "").strip(),
immutable_payload={
"source": "mcp.tasks.create",
"requested_by": str(arguments.get("actor_identifier") or "").strip(),
},
event_payload={
"source": "mcp.tasks.create",
"via": "mcp",
},
)
return {"task": _task_payload(task), "event": _event_payload(event)}
def tool_tasks_get(arguments: dict[str, Any]) -> dict[str, Any]:
task = _resolve_task(arguments)
payload = _task_payload(task)
@@ -555,6 +609,17 @@ def tool_tasks_create_note(arguments: dict[str, Any]) -> dict[str, Any]:
return {"task": _task_payload(task), "event": _event_payload(event)} return {"task": _task_payload(task), "event": _event_payload(event)}
def tool_tasks_complete(arguments: dict[str, Any]) -> dict[str, Any]:
task = _resolve_task(arguments)
event = async_to_sync(mark_task_completed_and_sync)(
task=task,
actor_identifier=str(arguments.get("actor_identifier") or "").strip(),
payload={"source": "mcp.tasks.complete", "via": "mcp"},
)
task.refresh_from_db()
return {"task": _task_payload(task), "event": _event_payload(event)}
def tool_tasks_link_artifact(arguments: dict[str, Any]) -> dict[str, Any]:
task = _resolve_task(arguments)
kind = str(arguments.get("kind") or "").strip() or "note"
@@ -987,6 +1052,26 @@ TOOL_DEFS: dict[str, dict[str, Any]] = {
},
"handler": tool_tasks_search,
},
"tasks.create": {
"description": "Create a canonical task inside GIA.",
"inputSchema": {
"type": "object",
"properties": {
"user_id": {"type": "integer"},
"project_id": {"type": "string"},
"epic_id": {"type": "string"},
"title": {"type": "string"},
"due_date": {"type": "string"},
"assignee_identifier": {"type": "string"},
"actor_identifier": {"type": "string"},
"source_service": {"type": "string"},
"source_channel": {"type": "string"},
},
"required": ["user_id", "project_id", "title"],
"additionalProperties": False,
},
"handler": tool_tasks_create,
},
"tasks.get": { "tasks.get": {
"description": "Get one derived task by ID, including links.", "description": "Get one derived task by ID, including links.",
"inputSchema": { "inputSchema": {
@@ -1029,6 +1114,20 @@ TOOL_DEFS: dict[str, dict[str, Any]] = {
},
"handler": tool_tasks_create_note,
},
"tasks.complete": {
"description": "Mark a task completed and append a completion event.",
"inputSchema": {
"type": "object",
"properties": {
"task_id": {"type": "string"},
"user_id": {"type": "integer"},
"actor_identifier": {"type": "string"},
},
"required": ["task_id"],
"additionalProperties": False,
},
"handler": tool_tasks_complete,
},
"tasks.link_artifact": { "tasks.link_artifact": {
"description": "Link an artifact (URI/path) to a task.", "description": "Link an artifact (URI/path) to a task.",
"inputSchema": { "inputSchema": {


@@ -0,0 +1,15 @@
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("core", "0043_userxmppsecuritysettings_encrypt_contact_messages_with_omemo"),
]
operations = [
migrations.AddField(
model_name="userxmppsecuritysettings",
name="encrypt_component_messages_with_omemo",
field=models.BooleanField(default=True),
),
]


@@ -2979,6 +2979,7 @@ class UserXmppSecuritySettings(models.Model):
related_name="xmpp_security_settings",
)
require_omemo = models.BooleanField(default=False)
encrypt_component_messages_with_omemo = models.BooleanField(default=True)
encrypt_contact_messages_with_omemo = models.BooleanField(default=True)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)


@@ -0,0 +1,111 @@
"""Canonical capability registry for command/task/gateway scope policy."""
from __future__ import annotations
from dataclasses import dataclass
@dataclass(frozen=True, slots=True)
class CapabilityScope:
key: str
label: str
description: str
group: str
configurable: bool = True
owner_path: str = "/settings/security/permissions/"
GLOBAL_SCOPE_KEY = "global.override"
CAPABILITY_SCOPES: tuple[CapabilityScope, ...] = (
CapabilityScope(
key="gateway.contacts",
label="Gateway contacts command",
description="Handles .contacts over gateway channels.",
group="gateway",
),
CapabilityScope(
key="gateway.help",
label="Gateway help command",
description="Handles .help over gateway channels.",
group="gateway",
),
CapabilityScope(
key="gateway.whoami",
label="Gateway whoami command",
description="Handles .whoami over gateway channels.",
group="gateway",
),
CapabilityScope(
key="gateway.tasks",
label="Gateway .tasks commands",
description="Handles .tasks list/show/complete/undo over gateway channels.",
group="tasks",
),
CapabilityScope(
key="gateway.approval",
label="Gateway approval commands",
description="Handles .approval/.codex/.claude approve/deny over gateway channels.",
group="command",
),
CapabilityScope(
key="tasks.submit",
label="Task submissions from chat",
description="Controls automatic task creation from inbound messages.",
group="tasks",
owner_path="/settings/tasks/",
),
CapabilityScope(
key="tasks.commands",
label="Task command verbs (.task/.undo/.epic)",
description="Controls explicit task command verbs.",
group="tasks",
owner_path="/settings/tasks/",
),
CapabilityScope(
key="command.bp",
label="Business plan command",
description="Controls Business Plan command execution.",
group="command",
owner_path="/settings/command-routing/",
),
CapabilityScope(
key="command.codex",
label="Codex command",
description="Controls Codex command execution.",
group="agentic",
owner_path="/settings/command-routing/",
),
CapabilityScope(
key="command.claude",
label="Claude command",
description="Controls Claude command execution.",
group="agentic",
owner_path="/settings/command-routing/",
),
)
SCOPE_BY_KEY = {row.key: row for row in CAPABILITY_SCOPES}
GROUP_LABELS: dict[str, str] = {
"gateway": "Gateway",
"tasks": "Tasks",
"command": "Commands",
"agentic": "Agentic",
"other": "Other",
}
def all_scope_keys(*, configurable_only: bool = False) -> list[str]:
rows = [
row.key
for row in CAPABILITY_SCOPES
if (not configurable_only or bool(row.configurable))
]
return rows
def scope_record(scope_key: str) -> CapabilityScope | None:
key = str(scope_key or "").strip().lower()
return SCOPE_BY_KEY.get(key)
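`scope_record` normalizes its argument before the dict lookup, so scope keys coming from user input or config are matched case-insensitively. A two-entry miniature of the registry showing that behavior (labels here are copied from the registry above; the trimmed dataclass fields are a simplification):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CapabilityScope:
    key: str
    label: str

SCOPES = (
    CapabilityScope("gateway.tasks", "Gateway .tasks commands"),
    CapabilityScope("command.codex", "Codex command"),
)
SCOPE_BY_KEY = {row.key: row for row in SCOPES}

def scope_record(scope_key):
    # Lookup is whitespace-tolerant and case-insensitive, as in the
    # registry module above; unknown keys return None rather than raising.
    return SCOPE_BY_KEY.get(str(scope_key or "").strip().lower())

print(scope_record("  Gateway.Tasks ").label)  # Gateway .tasks commands
print(scope_record("unknown.scope"))           # None
```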


@@ -2,7 +2,10 @@ from __future__ import annotations
from dataclasses import dataclass
from core.models import (
CommandSecurityPolicy,
UserXmppOmemoTrustedKey,
)
GLOBAL_SCOPE_KEY = "global.override"
OVERRIDE_OPTIONS = {"per_scope", "on", "off"}
@@ -64,7 +67,7 @@ def _match_channel(rule: str, channel: str) -> bool:
return current == value
def _omemo_facts(ctx: CommandSecurityContext) -> tuple[str, str, str]:
message_meta = dict(ctx.message_meta or {})
payload = dict(ctx.payload or {})
xmpp_meta = dict(message_meta.get("xmpp") or {})
@@ -76,7 +79,10 @@ def _omemo_facts(ctx: CommandSecurityContext) -> tuple[str, str]:
client_key = str(
xmpp_meta.get("omemo_client_key") or payload.get("omemo_client_key") or ""
).strip()
sender_jid = str(
xmpp_meta.get("sender_jid") or payload.get("sender_jid") or ""
).strip()
return status, client_key, sender_jid
def _channel_allowed_for_rules(rules: dict, service: str, channel: str) -> bool:
@@ -192,7 +198,7 @@ def evaluate_command_policy(
reason=f"channel={channel or '-'} not allowed by global override", reason=f"channel={channel or '-'} not allowed by global override",
) )
omemo_status, omemo_client_key = _omemo_facts(context) omemo_status, omemo_client_key, sender_jid = _omemo_facts(context)
if require_omemo and omemo_status != "detected": if require_omemo and omemo_status != "detected":
return CommandPolicyDecision(
allowed=False,
@@ -205,15 +211,25 @@ def evaluate_command_policy(
return CommandPolicyDecision(
allowed=False,
code="trusted_fingerprint_required",
reason=f"scope={scope} requires a trusted OMEMO key",
)
jid_bare = (
str(sender_jid.split("/", 1)[0] if sender_jid else "").strip().lower()
)
trusted_query = UserXmppOmemoTrustedKey.objects.filter(
user=user,
key_type="client_key",
key_id=omemo_client_key,
trusted=True,
)
if jid_bare:
trusted_query = trusted_query.filter(jid__iexact=jid_bare)
trusted_match = trusted_query.order_by("-updated_at").first()
if trusted_match is None:
return CommandPolicyDecision(
allowed=False,
code="trusted_key_missing",
reason=f"scope={scope} requires a trusted OMEMO key for this sender",
)
return CommandPolicyDecision(allowed=True)
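The trusted-key check above reduces the sender JID to its bare form before matching `UserXmppOmemoTrustedKey` rows, so a resource suffix or letter case cannot dodge the filter. The normalization can be sketched standalone:

```python
# Bare-JID normalization matching the jid_bare expression above:
# drop the resource after "/" and lowercase for case-insensitive matching.
def bare_jid(sender_jid: str) -> str:
    jid = str(sender_jid or "").strip()
    return (jid.split("/", 1)[0] if jid else "").strip().lower()

print(bare_jid("Alice@Example.ORG/laptop"))  # alice@example.org
print(bare_jid("bob@example.org"))           # bob@example.org
```

Pairing this with `jid__iexact` in the query is belt-and-braces: the lookup stays correct even if stored JIDs were not lowercased on write.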


@@ -65,6 +65,10 @@ _TASK_COMPLETE_CMD_RE = re.compile(
r"^\s*\.task\s+(?:complete|done|close)\s+#?(?P<reference>[A-Za-z0-9_-]+)\s*$", r"^\s*\.task\s+(?:complete|done|close)\s+#?(?P<reference>[A-Za-z0-9_-]+)\s*$",
re.IGNORECASE, re.IGNORECASE,
) )
_TASK_ADD_CMD_RE = re.compile(
r"^\s*\.task\s+(?:add|create|new)\s+(?P<title>.+?)\s*$",
re.IGNORECASE | re.DOTALL,
)
_DUE_ISO_RE = re.compile(
r"\b(?:due|by)\s+(\d{4}-\d{2}-\d{2})\b",
re.IGNORECASE,
@@ -367,6 +371,102 @@ def _next_reference(user, project) -> str:
return str(DerivedTask.objects.filter(user=user, project=project).count() + 1)
def create_task_record(
*,
user,
project,
title: str,
source_service: str,
source_channel: str,
origin_message: Message | None = None,
actor_identifier: str = "",
epic: TaskEpic | None = None,
due_date: datetime.date | None = None,
assignee_identifier: str = "",
immutable_payload: dict | None = None,
event_payload: dict | None = None,
status_snapshot: str = "open",
) -> tuple[DerivedTask, DerivedTaskEvent]:
reference = _next_reference(user, project)
task = DerivedTask.objects.create(
user=user,
project=project,
epic=epic,
title=str(title or "").strip()[:255] or "Untitled task",
source_service=str(source_service or "web").strip() or "web",
source_channel=str(source_channel or "").strip(),
origin_message=origin_message,
reference_code=reference,
status_snapshot=str(status_snapshot or "open").strip() or "open",
due_date=due_date,
assignee_identifier=str(assignee_identifier or "").strip(),
immutable_payload=dict(immutable_payload or {}),
)
event = DerivedTaskEvent.objects.create(
task=task,
event_type="created",
actor_identifier=str(actor_identifier or "").strip(),
source_message=origin_message,
payload=dict(event_payload or {}),
)
return task, event
async def create_task_record_and_sync(
*,
user,
project,
title: str,
source_service: str,
source_channel: str,
origin_message: Message | None = None,
actor_identifier: str = "",
epic: TaskEpic | None = None,
due_date: datetime.date | None = None,
assignee_identifier: str = "",
immutable_payload: dict | None = None,
event_payload: dict | None = None,
status_snapshot: str = "open",
) -> tuple[DerivedTask, DerivedTaskEvent]:
task, event = await sync_to_async(create_task_record)(
user=user,
project=project,
title=title,
source_service=source_service,
source_channel=source_channel,
origin_message=origin_message,
actor_identifier=actor_identifier,
epic=epic,
due_date=due_date,
assignee_identifier=assignee_identifier,
immutable_payload=immutable_payload,
event_payload=event_payload,
status_snapshot=status_snapshot,
)
await _emit_sync_event(task, event, "create")
return task, event
async def mark_task_completed_and_sync(
*,
task: DerivedTask,
actor_identifier: str = "",
source_message: Message | None = None,
payload: dict | None = None,
) -> DerivedTaskEvent:
task.status_snapshot = "completed"
await sync_to_async(task.save)(update_fields=["status_snapshot"])
event = await sync_to_async(DerivedTaskEvent.objects.create)(
task=task,
event_type="completion_marked",
actor_identifier=str(actor_identifier or "").strip(),
source_message=source_message,
payload=dict(payload or {}),
)
await _emit_sync_event(task, event, "complete")
return event
async def _derive_title(message: Message) -> str:
text = str(message.text or "").strip()
if not text:
@@ -600,6 +700,49 @@ async def _handle_scope_task_commands(
await _send_scope_message(source, message, "\n".join(lines))
return True
create_match = _TASK_ADD_CMD_RE.match(body)
if create_match:
task_text = str(create_match.group("title") or "").strip()
if not task_text:
await _send_scope_message(
source, message, "[task] title is required for .task add."
)
return True
epic = source.epic
epic_name = _extract_epic_name_from_text(task_text)
if epic_name:
epic, _ = await sync_to_async(TaskEpic.objects.get_or_create)(
project=source.project,
name=epic_name,
)
cleaned_task_text = _strip_epic_token(task_text)
task, _event = await create_task_record_and_sync(
user=message.user,
project=source.project,
epic=epic,
title=cleaned_task_text[:255],
source_service=source.service or message.source_service or "web",
source_channel=source.channel_identifier or message.source_chat_id or "",
origin_message=message,
actor_identifier=str(message.sender_uuid or ""),
due_date=_parse_due_date(cleaned_task_text),
assignee_identifier=_parse_assignee(cleaned_task_text),
immutable_payload={
"origin_text": text,
"task_text": cleaned_task_text,
"source": "chat_manual_command",
},
event_payload={
"origin_text": text,
"command": ".task add",
"via": "chat_command",
},
)
await _send_scope_message(
source, message, f"[task] created #{task.reference_code}: {task.title}"
)
return True
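The `.task add` branch above extracts inline due dates via `_parse_due_date`, whose body is outside this hunk; a sketch consistent with `_DUE_ISO_RE` (treat the helper's exact behavior as an assumption):

```python
import datetime
import re

# Same shape as _DUE_ISO_RE above: "due 2025-01-31" or "by 2025-01-31".
DUE_ISO_RE = re.compile(r"\b(?:due|by)\s+(\d{4}-\d{2}-\d{2})\b", re.IGNORECASE)

def parse_due_date(text: str):
    # Returns a datetime.date for the first "due/by YYYY-MM-DD" token,
    # or None when absent or not a valid calendar date.
    match = DUE_ISO_RE.search(text or "")
    if not match:
        return None
    try:
        return datetime.date.fromisoformat(match.group(1))
    except ValueError:
        return None

print(parse_due_date("finish slides by 2025-01-31"))  # 2025-01-31
print(parse_due_date("no deadline here"))             # None
```

Validating with `date.fromisoformat` after the regex catches impossible dates such as `2025-02-30` that the pattern alone would accept.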
undo_match = _UNDO_TASK_RE.match(body)
if undo_match:
reference = str(undo_match.group("reference") or "").strip()
@@ -690,11 +833,8 @@ async def _handle_scope_task_commands(
source, message, f"[task] #{reference} not found." source, message, f"[task] #{reference} not found."
) )
return True return True
task.status_snapshot = "completed" await mark_task_completed_and_sync(
await sync_to_async(task.save)(update_fields=["status_snapshot"])
event = await sync_to_async(DerivedTaskEvent.objects.create)(
task=task, task=task,
event_type="completion_marked",
actor_identifier=str(message.sender_uuid or ""), actor_identifier=str(message.sender_uuid or ""),
source_message=message, source_message=message,
payload={ payload={
@@ -703,7 +843,6 @@ async def _handle_scope_task_commands(
"via": "chat_command", "via": "chat_command",
}, },
) )
await _emit_sync_event(task, event, "complete")
await _send_scope_message(
source, message, f"[task] completed #{task.reference_code}: {task.title}"
)
@@ -763,6 +902,7 @@ def _is_task_command_candidate(text: str) -> bool:
if (
_LIST_TASKS_RE.match(body)
or _LIST_TASKS_CMD_RE.match(body)
or _TASK_ADD_CMD_RE.match(body)
or _TASK_SHOW_RE.match(body)
or _TASK_COMPLETE_CMD_RE.match(body)
or _UNDO_TASK_RE.match(body)
@@ -779,6 +919,7 @@ def _is_explicit_task_command(text: str) -> bool:
     return bool(
         _LIST_TASKS_RE.match(body)
         or _LIST_TASKS_CMD_RE.match(body)
+        or _TASK_ADD_CMD_RE.match(body)
        or _TASK_SHOW_RE.match(body)
         or _TASK_COMPLETE_CMD_RE.match(body)
         or _UNDO_TASK_RE.match(body)
@@ -878,16 +1019,12 @@ async def process_inbound_task_intelligence(message: Message) -> None:
         )
         return
-    task.status_snapshot = "completed"
-    await sync_to_async(task.save)(update_fields=["status_snapshot"])
-    event = await sync_to_async(DerivedTaskEvent.objects.create)(
+    await mark_task_completed_and_sync(
         task=task,
-        event_type="completion_marked",
         actor_identifier=str(message.sender_uuid or ""),
         source_message=message,
         payload={"marker": ref_code},
     )
-    await _emit_sync_event(task, event, "complete")
     return
     for source in sources:
@@ -913,10 +1050,9 @@ async def process_inbound_task_intelligence(message: Message) -> None:
             source_chat_id=message.source_chat_id,
         )
         title = await _derive_title_with_flags(cloned_message, flags)
-        reference = await sync_to_async(_next_reference)(message.user, source.project)
         parsed_due_date = _parse_due_date(task_text)
         parsed_assignee = _parse_assignee(task_text)
-        task = await sync_to_async(DerivedTask.objects.create)(
+        task, event = await create_task_record_and_sync(
             user=message.user,
             project=source.project,
             epic=epic,
@@ -924,8 +1060,7 @@ async def process_inbound_task_intelligence(message: Message) -> None:
             source_service=source.service or message.source_service or "web",
             source_channel=source.channel_identifier or message.source_chat_id or "",
             origin_message=message,
-            reference_code=reference,
-            status_snapshot="open",
+            actor_identifier=str(message.sender_uuid or ""),
             due_date=parsed_due_date,
             assignee_identifier=parsed_assignee,
             immutable_payload={
@@ -933,15 +1068,8 @@ async def process_inbound_task_intelligence(message: Message) -> None:
                 "task_text": task_text,
                 "flags": flags,
             },
+            event_payload={"origin_text": text},
         )
-        event = await sync_to_async(DerivedTaskEvent.objects.create)(
-            task=task,
-            event_type="created",
-            actor_identifier=str(message.sender_uuid or ""),
-            source_message=message,
-            payload={"origin_text": text},
-        )
-        await _emit_sync_event(task, event, "create")
         if bool(flags.get("announce_task_id", False)):
             try:
                 await send_message_raw(
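The hunks above collapse the repeated create/complete boilerplate (save, `DerivedTaskEvent` creation, `_emit_sync_event`) into single helpers. A minimal plain-Python sketch of the completion helper's shape — the `Task` dataclass and dict-based events here are illustrative stand-ins, not the project's ORM models:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    # plain stand-in for DerivedTask; field names mirror the diff
    reference_code: str
    status_snapshot: str = "open"
    events: list = field(default_factory=list)

def mark_task_completed_and_sync(task: Task, actor_identifier: str, payload: dict) -> dict:
    # one call replaces the old save + event-create + _emit_sync_event sequence
    task.status_snapshot = "completed"
    event = {
        "event_type": "completion_marked",
        "actor_identifier": actor_identifier,
        "payload": payload,
    }
    task.events.append(event)
    # the real helper would also emit the "complete" sync event here
    return event

task = Task(reference_code="42")
event = mark_task_completed_and_sync(task, "sender-uuid", {"via": "chat_command"})
print(task.status_snapshot)  # completed
```

The design point is that every caller — chat commands, inbound intelligence, MCP tools — now records the status change, the audit event, and the sync emission atomically through one entry point instead of repeating the three steps.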

View File

@@ -167,6 +167,10 @@
 .panel, .box, .modal {
   background-color: var(--modal-color) !important;
 }
+.box {
+  border: 1px solid rgba(25, 33, 52, 0.08);
+  box-shadow: 0 18px 48px rgba(20, 28, 45, 0.08);
+}
 .modal, .modal.box {
   background-color: var(--background-color) !important;
 }
@@ -200,6 +204,57 @@
 .navbar {
   background-color: rgba(0, 0, 0, 0.03) !important;
 }
+.section > .container.gia-page-shell,
+.section > .container {
+  max-width: 1340px;
+}
+.gia-page-header {
+  display: flex;
+  align-items: flex-start;
+  justify-content: space-between;
+  gap: 1rem;
+  margin-bottom: 1rem;
+  flex-wrap: wrap;
+}
+.gia-page-header .title,
+.gia-page-header .subtitle {
+  max-width: 72ch;
+}
+.gia-page-header .subtitle {
+  margin-bottom: 0;
+}
+.table thead th {
+  position: sticky;
+  top: 0;
+  background: rgba(248, 250, 252, 0.96) !important;
+  backdrop-filter: blur(6px);
+  z-index: 1;
+}
+[data-theme="dark"] .table thead th {
+  background: rgba(44, 44, 44, 0.96) !important;
+}
+.table td,
+.table th {
+  vertical-align: top;
+}
+.help {
+  max-width: 78ch;
+}
+.button.is-light {
+  border-color: rgba(27, 38, 59, 0.12);
+}
+.input,
+.textarea,
+.select select {
+  border-color: rgba(27, 38, 59, 0.18);
+  box-shadow: none;
+}
+.input:focus,
+.textarea:focus,
+.select select:focus {
+  border-color: rgba(27, 99, 214, 0.8);
+  box-shadow: 0 0 0 0.125em rgba(27, 99, 214, 0.14);
+}
 .grid-stack-item-content,
 .floating-window {

View File

@@ -2,9 +2,13 @@
 {% block content %}
 <section class="section">
-  <div class="container">
-    <h1 class="title is-4">Codex Status</h1>
-    <p class="subtitle is-6">Global per-user Codex task-sync status, runs, and approvals.</p>
+  <div class="container gia-page-shell">
+    <div class="gia-page-header">
+      <div>
+        <h1 class="title is-4">Codex Status</h1>
+        <p class="subtitle is-6">Worker-backed task sync status, runs, and approvals for the canonical GIA task store.</p>
+      </div>
+    </div>
     <article class="box">
       <div class="codex-inline-stats">

View File

@@ -5,6 +5,11 @@
   <div class="container">
     <h1 class="title is-4">Command Routing</h1>
     <p class="subtitle is-6">Configure commands, channel bindings, and per-command delivery in a predictable way.</p>
+    <p class="help">
+      Related controls:
+      <a href="{% url 'tasks_settings' %}">Task Automation</a> and
+      <a href="{% url 'permission_settings' %}">Security Permissions</a>.
+    </p>
     {% if scope_service and scope_identifier %}
     <article class="notification is-info is-light">
       Scoped to this chat only: <strong>{{ scope_service }}</strong> · <code>{{ scope_identifier }}</code>

View File

@@ -190,13 +190,22 @@
     data-person="{{ row.linked_person_name|default:'-'|lower }}"
     data-detected="{{ row.detected_name|default:'-'|lower }}"
     data-identifier="{{ row.identifier|lower }}"
-    data-search="{{ row.linked_person_name|default:'-'|lower }} {{ row.detected_name|default:'-'|lower }} {{ row.service|lower }} {{ row.identifier|lower }}">
+    data-search="{{ row.linked_person_name|default:'-'|lower }} {{ row.detected_name|default:'-'|lower }} {{ row.service|lower }} {{ row.identifier_search|default:row.identifier|lower }}">
   <td data-discovered-col="0" class="discovered-col-0">{{ row.linked_person_name|default:"-" }}</td>
   <td data-discovered-col="1" class="discovered-col-1">{{ row.detected_name|default:"-" }}</td>
   <td data-discovered-col="2" class="discovered-col-2">
     {{ row.service|title }}
   </td>
-  <td data-discovered-col="3" class="discovered-col-3"><code>{{ row.identifier }}</code></td>
+  <td data-discovered-col="3" class="discovered-col-3">
+    <code>{{ row.identifier }}</code>
+    {% if row.identifier_aliases %}
+    <div class="is-size-7 has-text-grey mt-1">
+      {% for alias in row.identifier_aliases %}
+      <div><code>{{ alias }}</code></div>
+      {% endfor %}
+    </div>
+    {% endif %}
+  </td>
   <td data-discovered-col="4" class="discovered-col-4">
     {% if not row.linked_person %}
     <div class="buttons are-small">

View File

@@ -48,12 +48,20 @@
       <p class="help is-size-7 has-text-grey">This is separate from command-scope policy checks such as Require Trusted Fingerprint.</p>
     </div>
     <input type="hidden" name="encrypt_contact_messages_with_omemo" value="0">
+    <input type="hidden" name="encrypt_component_messages_with_omemo" value="0">
+    <div class="field mt-3">
+      <label class="checkbox">
+        <input type="checkbox" name="encrypt_component_messages_with_omemo" value="1"{% if security_settings.encrypt_component_messages_with_omemo %} checked{% endif %}>
+        Encrypt gateway component chat replies with OMEMO
+      </label>
+      <p class="help is-size-7 has-text-grey mt-1">Controls only gateway/component command replies (for example, <code>.tasks</code> or approvals) sent to your XMPP client.</p>
+    </div>
     <div class="field mt-3">
       <label class="checkbox">
         <input type="checkbox" name="encrypt_contact_messages_with_omemo" value="1"{% if security_settings.encrypt_contact_messages_with_omemo %} checked{% endif %}>
         Encrypt contact relay messages to your XMPP client with OMEMO
       </label>
-      <p class="help is-size-7 has-text-grey mt-1">When enabled, relay text from contacts is sent with OMEMO when available. If disabled, relay text is sent in plaintext.</p>
+      <p class="help is-size-7 has-text-grey mt-1">Controls relayed contact chat text. Keep this off if you want normal contact chats while securing only component workflows.</p>
     </div>
     <button class="button is-link is-small" type="submit">Save</button>
   </form>
@@ -194,7 +202,7 @@
   <div class="box">
     <h2 class="title is-6">Global Scope Override</h2>
     <p class="is-size-7 has-text-grey mb-3">
-      This scope can force settings across all Command Security Scopes.
+      This scope can force settings across all Fine-Grained Security Scopes.
     </p>
     <div class="box" style="margin: 0; border: 1px solid rgba(60, 60, 60, 0.12);">
       <form method="post">
@@ -341,7 +349,7 @@
       {% endfor %}
     </select>
   </div>
-  <input class="input is-small" name="allowed_channel_pattern" value="{{ rule.pattern }}" placeholder="m@zm.is* or 1203*">
+  <input class="input is-small" name="allowed_channel_pattern" value="{{ rule.pattern }}" placeholder="user@example.test* or 1203*">
   <button class="button is-small is-light is-danger channel-rule-remove" type="button">Remove</button>
 </div>
 {% endfor %}
@@ -454,7 +462,7 @@
       {% endfor %}
     </select>
   </div>
-  <input class="input is-small scope-editable" data-lock-state="free" name="allowed_channel_pattern" value="{{ rule.pattern }}" placeholder="m@zm.is* or 1203*">
+  <input class="input is-small scope-editable" data-lock-state="free" name="allowed_channel_pattern" value="{{ rule.pattern }}" placeholder="user@example.test* or 1203*">
   <button class="button is-small is-light is-danger channel-rule-remove scope-editable" data-lock-state="free" type="button">Remove</button>
 </div>
@@ -516,7 +524,7 @@
       {% endfor %}
     </select>
   </div>
-  <input class="input is-small scope-editable" data-lock-state="free" name="allowed_channel_pattern" value="" placeholder="m@zm.is* or 1203*">
+  <input class="input is-small scope-editable" data-lock-state="free" name="allowed_channel_pattern" value="" placeholder="user@example.test* or 1203*">
   <button class="button is-small is-light is-danger channel-rule-remove scope-editable" data-lock-state="free" type="button">Remove</button>
 </div>
 </template>

View File

@@ -1,8 +1,12 @@
 {% extends "base.html" %}
 {% block content %}
-<section class="section"><div class="container">
-  <h1 class="title is-4">Task #{{ task.reference_code }}: {{ task.title }}</h1>
-  <p class="subtitle is-6">{{ task.project.name }}{% if task.epic %} / {{ task.epic.name }}{% endif %} · {{ task.status_snapshot }}</p>
+<section class="section"><div class="container gia-page-shell">
+  <div class="gia-page-header">
+    <div>
+      <h1 class="title is-4">Task #{{ task.reference_code }}: {{ task.title }}</h1>
+      <p class="subtitle is-6">{{ task.project.name }}{% if task.epic %} / {{ task.epic.name }}{% endif %} · {{ task.status_snapshot }}</p>
+    </div>
+  </div>
   <p class="is-size-7 has-text-grey" style="margin-top:-0.65rem; margin-bottom: 0.65rem;">
     Created by {{ task.creator_label|default:"Unknown" }}
     {% if task.origin_message_id %}

View File

@@ -1,9 +1,13 @@
 {% extends "base.html" %}
 {% block content %}
 <section class="section">
-  <div class="container">
-    <h1 class="title is-4">Task Inbox</h1>
-    <p class="subtitle is-6">Immutable tasks derived from chat activity.</p>
+  <div class="container gia-page-shell">
+    <div class="gia-page-header">
+      <div>
+        <h1 class="title is-4">Task Inbox</h1>
+        <p class="subtitle is-6">Canonical tasks live in GIA. Chats, XMPP, web UI, and agent tooling all operate on the same records.</p>
+      </div>
+    </div>
     <div class="buttons" style="margin-bottom: 0.75rem;">
       <a class="button is-small is-link is-light" href="{% url 'tasks_settings' %}{% if scope.person_id or scope.service or scope.identifier %}?{% if scope.person_id %}person={{ scope.person_id|urlencode }}{% endif %}{% if scope.service %}{% if scope.person_id %}&{% endif %}service={{ scope.service|urlencode }}{% endif %}{% if scope.identifier %}{% if scope.person_id or scope.service %}&{% endif %}identifier={{ scope.identifier|urlencode }}{% endif %}{% endif %}">Task Automation</a>
     </div>
@@ -139,6 +143,60 @@
       </article>
     </div>
     <div class="column">
+      <article class="box">
+        <div class="is-flex is-justify-content-space-between is-align-items-center" style="gap: 0.5rem; margin-bottom: 0.6rem; flex-wrap: wrap;">
+          <h2 class="title is-6" style="margin: 0;">Create Task</h2>
+          {% if scope.service or scope.identifier %}
+          <span class="tag task-ui-badge">{{ scope.service|default:"web" }}{% if scope.identifier %} · {{ scope.identifier }}{% endif %}</span>
+          {% else %}
+          <span class="tag task-ui-badge">web</span>
+          {% endif %}
+        </div>
+        <p class="help" style="margin-bottom: 0.65rem;">Use this to create canonical tasks directly from the web UI without relying on WhatsApp-derived source state.</p>
+        <form method="post">
+          {% csrf_token %}
+          <input type="hidden" name="action" value="task_create">
+          <input type="hidden" name="person" value="{{ scope.person_id }}">
+          <input type="hidden" name="service" value="{{ scope.service }}">
+          <input type="hidden" name="identifier" value="{{ scope.identifier }}">
+          <div class="columns is-multiline">
+            <div class="column is-7">
+              <label class="label is-size-7">Title</label>
+              <input class="input is-small" name="title" placeholder="Ship MCP browser validation for compose">
+            </div>
+            <div class="column is-5">
+              <label class="label is-size-7">Project</label>
+              <div class="select is-small is-fullwidth">
+                <select name="project_id">
+                  {% for project in project_choices %}
+                  <option value="{{ project.id }}" {% if selected_project and selected_project.id == project.id %}selected{% endif %}>{{ project.name }}</option>
+                  {% endfor %}
+                </select>
+              </div>
+            </div>
+            <div class="column is-5">
+              <label class="label is-size-7">Epic (optional)</label>
+              <div class="select is-small is-fullwidth">
+                <select name="epic_id">
+                  <option value="">No epic</option>
+                  {% for epic in epic_choices %}
+                  <option value="{{ epic.id }}">{{ epic.project.name }} / {{ epic.name }}</option>
+                  {% endfor %}
+                </select>
+              </div>
+            </div>
+            <div class="column is-3">
+              <label class="label is-size-7">Due Date</label>
+              <input class="input is-small" type="date" name="due_date">
+            </div>
+            <div class="column is-4">
+              <label class="label is-size-7">Assignee</label>
+              <input class="input is-small" name="assignee_identifier" placeholder="@operator">
+            </div>
+          </div>
+          <button class="button is-small is-link" type="submit">Create Task</button>
+        </form>
+      </article>
       <article class="box">
         <h2 class="title is-6">Recent Derived Tasks</h2>
         <table class="table is-fullwidth is-striped is-size-7">

View File

@@ -4,6 +4,11 @@
   <div class="container tasks-settings-page">
     <h1 class="title is-4">Task Automation</h1>
     <p class="subtitle is-6">Project defaults flow into channel overrides. Use Quick Setup for normal operation; open Advanced Setup for full controls.</p>
+    <p class="help">
+      Related controls:
+      <a href="{% url 'command_routing' %}">Commands</a> and
+      <a href="{% url 'permission_settings' %}">Security Permissions</a>.
+    </p>
     <div class="notification is-light">
       <div class="content is-size-7">

View File

@@ -33,6 +33,7 @@ class CommandRoutingVariantUITests(TestCase):
         self.assertContains(response, "bp set range")
         self.assertContains(response, "Send status to egress")
         self.assertContains(response, "Codex (codex)")
+        self.assertContains(response, "Claude (claude)")

     def test_variant_policy_update_persists(self):
         response = self.client.post(

View File

@@ -20,7 +20,7 @@ from core.models import (
     Person,
     PersonIdentifier,
     User,
-    UserXmppOmemoState,
+    UserXmppOmemoTrustedKey,
 )
 from core.security.command_policy import CommandSecurityContext, evaluate_command_policy
@@ -37,7 +37,7 @@ class CommandSecurityPolicyTests(TestCase):
             user=self.user,
             person=self.person,
             service="xmpp",
-            identifier="policy-user@zm.is",
+            identifier="policy-user@example.test",
         )
         self.session = ChatSession.objects.create(
             user=self.user,
@@ -58,7 +58,7 @@ class CommandSecurityPolicyTests(TestCase):
             profile=profile,
             direction="ingress",
             service="xmpp",
-            channel_identifier="policy-user@zm.is",
+            channel_identifier="policy-user@example.test",
             enabled=True,
         )
         CommandSecurityPolicy.objects.create(
@@ -74,13 +74,13 @@ class CommandSecurityPolicyTests(TestCase):
             text="#bp#",
             ts=1000,
             source_service="xmpp",
-            source_chat_id="policy-user@zm.is",
+            source_chat_id="policy-user@example.test",
             message_meta={},
         )
         results = async_to_sync(process_inbound_message)(
             CommandContext(
                 service="xmpp",
-                channel_identifier="policy-user@zm.is",
+                channel_identifier="policy-user@example.test",
                 message_id=str(msg.id),
                 user_id=self.user.id,
                 message_text="#bp#",
@@ -101,12 +101,13 @@ class CommandSecurityPolicyTests(TestCase):
             require_omemo=True,
             require_trusted_omemo_fingerprint=True,
         )
-        UserXmppOmemoState.objects.create(
+        UserXmppOmemoTrustedKey.objects.create(
             user=self.user,
-            status="detected",
-            latest_client_key="sid:abc",
-            last_sender_jid="policy-user@zm.is/phone",
-            last_target_jid="jews.zm.is",
+            jid="policy-user@example.test",
+            key_type="client_key",
+            key_id="sid:abc",
+            trusted=True,
+            source="test",
         )
         outputs: list[str] = []
@@ -119,11 +120,15 @@ class CommandSecurityPolicyTests(TestCase):
                 user=self.user,
                 source_message=None,
                 service="xmpp",
-                channel_identifier="policy-user@zm.is",
-                sender_identifier="policy-user@zm.is/phone",
+                channel_identifier="policy-user@example.test",
+                sender_identifier="policy-user@example.test/phone",
                 message_text=".tasks list",
                 message_meta={
-                    "xmpp": {"omemo_status": "detected", "omemo_client_key": "sid:abc"}
+                    "xmpp": {
+                        "omemo_status": "detected",
+                        "omemo_client_key": "sid:abc",
+                        "sender_jid": "policy-user@example.test/phone",
+                    }
                 },
                 payload={},
             ),
@@ -161,8 +166,8 @@ class CommandSecurityPolicyTests(TestCase):
                 user=self.user,
                 source_message=None,
                 service="xmpp",
-                channel_identifier="policy-user@zm.is",
-                sender_identifier="policy-user@zm.is/phone",
+                channel_identifier="policy-user@example.test",
+                sender_identifier="policy-user@example.test/phone",
                 message_text=".tasks list",
                 message_meta={"xmpp": {"omemo_status": "no_omemo"}},
                 payload={},
@@ -200,7 +205,7 @@ class CommandSecurityPolicyTests(TestCase):
             scope_key="gateway.tasks",
             context=CommandSecurityContext(
                 service="xmpp",
-                channel_identifier="policy-user@zm.is",
+                channel_identifier="policy-user@example.test",
                 message_meta={},
                 payload={},
             ),
@@ -226,3 +231,30 @@ class CommandSecurityPolicyTests(TestCase):
         )
         self.assertFalse(decision.allowed)
         self.assertEqual("service_not_allowed", decision.code)
+
+    def test_trusted_key_requirement_blocks_untrusted_key(self):
+        CommandSecurityPolicy.objects.create(
+            user=self.user,
+            scope_key="gateway.tasks",
+            enabled=True,
+            require_omemo=True,
+            require_trusted_omemo_fingerprint=True,
+        )
+        decision = evaluate_command_policy(
+            user=self.user,
+            scope_key="gateway.tasks",
+            context=CommandSecurityContext(
+                service="xmpp",
+                channel_identifier="policy-user@example.test",
+                message_meta={
+                    "xmpp": {
+                        "omemo_status": "detected",
+                        "omemo_client_key": "sid:missing",
+                        "sender_jid": "policy-user@example.test/phone",
+                    }
+                },
+                payload={},
+            ),
+        )
+        self.assertFalse(decision.allowed)
+        self.assertEqual("trusted_key_missing", decision.code)
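The new test pins the behavior the commit message describes: trusted-key enforcement matches the observed OMEMO client key against stored trusted key records, not merely the most recently seen key. A rough sketch of that decision logic, assuming the policy sees a set of trusted key ids — names and return shape here are illustrative, not the project's `evaluate_command_policy` signature:

```python
def evaluate_trusted_key(trusted_key_ids: set, meta: dict) -> tuple:
    """Return (allowed, code) mirroring the test expectations."""
    xmpp = meta.get("xmpp") or {}
    # the message must have arrived over OMEMO at all
    if xmpp.get("omemo_status") != "detected":
        return (False, "omemo_required")
    # and the observed client key must match a trusted key record
    if xmpp.get("omemo_client_key") not in trusted_key_ids:
        return (False, "trusted_key_missing")
    return (True, "ok")

print(evaluate_trusted_key(
    {"sid:abc"},
    {"xmpp": {"omemo_status": "detected", "omemo_client_key": "sid:abc"}},
))  # (True, 'ok')
print(evaluate_trusted_key(
    {"sid:abc"},
    {"xmpp": {"omemo_status": "detected", "omemo_client_key": "sid:missing"}},
))  # (False, 'trusted_key_missing')
```

The two calls correspond to the passing and blocking cases exercised by the tests above.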

View File

@@ -304,16 +304,16 @@ class XMPPReplyExtractionTests(SimpleTestCase):
             "xmpp",
             {
                 "reply_source_message_id": "xmpp-anchor-001",
-                "reply_source_chat_id": "user@zm.is/mobile",
+                "reply_source_chat_id": "user@example.test/mobile",
             },
         )
         self.assertEqual("xmpp-anchor-001", ref.get("reply_source_message_id"))
         self.assertEqual("xmpp", ref.get("reply_source_service"))
-        self.assertEqual("user@zm.is/mobile", ref.get("reply_source_chat_id"))
+        self.assertEqual("user@example.test/mobile", ref.get("reply_source_chat_id"))

     def test_extract_reply_ref_returns_empty_for_missing_id(self):
         ref = reply_sync.extract_reply_ref(
-            "xmpp", {"reply_source_chat_id": "user@zm.is"}
+            "xmpp", {"reply_source_chat_id": "user@example.test"}
         )
         self.assertEqual({}, ref)
@@ -333,7 +333,7 @@ class XMPPReplyResolutionTests(TestCase):
             user=self.user,
             person=self.person,
             service="xmpp",
-            identifier="contact@zm.is",
+            identifier="contact@example.test",
         )
         self.session = ChatSession.objects.create(
             user=self.user, identifier=self.identifier
@@ -345,8 +345,8 @@ class XMPPReplyResolutionTests(TestCase):
             text="xmpp anchor",
             source_service="xmpp",
             source_message_id="xmpp-anchor-001",
-            source_chat_id="contact@zm.is/mobile",
-            sender_uuid="contact@zm.is",
+            source_chat_id="contact@example.test/mobile",
+            sender_uuid="contact@example.test",
         )

     def test_resolve_reply_target_by_source_message_id(self):
@@ -354,7 +354,7 @@ class XMPPReplyResolutionTests(TestCase):
             "xmpp",
             {
                 "reply_source_message_id": "xmpp-anchor-001",
-                "reply_source_chat_id": "contact@zm.is/mobile",
+                "reply_source_chat_id": "contact@example.test/mobile",
             },
         )
         target = async_to_sync(reply_sync.resolve_reply_target)(
@@ -371,7 +371,7 @@ class XMPPReplyResolutionTests(TestCase):
             target_ts=int(self.anchor.ts),
             emoji="🔥",
             source_service="xmpp",
-            actor="contact@zm.is",
+            actor="contact@example.test",
             remove=False,
             payload={"target_xmpp_id": "xmpp-anchor-001"},
         )

View File

@@ -67,6 +67,8 @@ class MCPToolTests(TestCase):
         names = {item["name"] for item in tool_specs()}
         self.assertIn("manticore.status", names)
         self.assertIn("memory.propose", names)
+        self.assertIn("tasks.create", names)
+        self.assertIn("tasks.complete", names)
         self.assertIn("tasks.link_artifact", names)
         self.assertIn("wiki.create_article", names)
         self.assertIn("project.get_runbook", names)
@@ -102,6 +104,35 @@ class MCPToolTests(TestCase):
             "created", str((events_payload.get("items") or [{}])[0].get("event_type"))
         )

+    def test_task_create_and_complete_tools(self):
+        create_payload = execute_tool(
+            "tasks.create",
+            {
+                "user_id": self.user.id,
+                "project_id": str(self.project.id),
+                "title": "Create via MCP",
+                "source_service": "xmpp",
+                "source_channel": "component.example.test",
+                "actor_identifier": "mcp-user",
+            },
+        )
+        task_payload = create_payload.get("task") or {}
+        self.assertEqual("Create via MCP", str(task_payload.get("title") or ""))
+        self.assertEqual("xmpp", str(task_payload.get("source_service") or ""))
+        complete_payload = execute_tool(
+            "tasks.complete",
+            {
+                "user_id": self.user.id,
+                "task_id": str(task_payload.get("id") or ""),
+                "actor_identifier": "mcp-user",
+            },
+        )
+        completed_task = complete_payload.get("task") or {}
+        self.assertEqual(
+            "completed", str(completed_task.get("status_snapshot") or "")
+        )
+
     def test_memory_proposal_review_flow(self):
         propose_payload = execute_tool(
             "memory.propose",

View File

@@ -2,14 +2,19 @@ from __future__ import annotations
 from django.test import TestCase

-from core.models import Person, PersonIdentifier, User
+from core.clients import transport
+from core.models import Person, PersonIdentifier, PlatformChatLink, User
 from core.presence import (
     AvailabilitySignal,
     latest_state_for_people,
     record_native_signal,
 )
 from core.presence.inference import now_ms
-from core.views.compose import _compose_availability_payload, _context_base
+from core.views.compose import (
+    _compose_availability_payload,
+    _context_base,
+    _manual_contact_rows,
+)


 class PresenceQueryAndComposeContextTests(TestCase):
@@ -79,3 +84,84 @@ class PresenceQueryAndComposeContextTests(TestCase):
         self.assertEqual("whatsapp", str(slices[0].get("service")))
         self.assertEqual("available", str(summary.get("state")))
         self.assertTrue(bool(summary.get("is_cross_service")))
+
+    def test_context_base_preserves_native_signal_group_identifier(self):
+        PlatformChatLink.objects.create(
+            user=self.user,
+            service="signal",
+            chat_identifier="signal-group-123",
+            chat_name="Signal Group",
+            is_group=True,
+        )
+        base = _context_base(
+            user=self.user,
+            service="signal",
+            identifier="signal-group-123",
+            person=None,
+        )
+        self.assertTrue(bool(base["is_group"]))
+        self.assertEqual("signal-group-123", str(base["identifier"]))
+
+    def test_manual_contact_rows_include_signal_groups(self):
+        PlatformChatLink.objects.create(
+            user=self.user,
+            service="signal",
+            chat_identifier="signal-group-123",
+            chat_name="Misinformation Club",
+            is_group=True,
+        )
+        rows = _manual_contact_rows(self.user)
+        match = next(
+            (
+                row
+                for row in rows
+                if str(row.get("service")) == "signal"
+                and str(row.get("identifier")) == "signal-group-123"
+            ),
+            None,
+        )
+        self.assertIsNotNone(match)
+        self.assertEqual("Misinformation Club", str(match.get("detected_name") or ""))
+
+    def test_manual_contact_rows_collapse_signal_group_aliases(self):
+        PlatformChatLink.objects.create(
+            user=self.user,
+            service="signal",
+            chat_identifier="group.signal-club",
+            chat_name="Misinformation Club",
+            is_group=True,
+        )
+        transport.update_runtime_state(
+            "signal",
+            accounts=["+447700900001"],
+            groups=[
+                {
+                    "identifier": "group.signal-club",
+                    "identifiers": [
+                        "group.signal-club",
+                        "sEGA9F0HQ/eyLgmvKx23hha9Vp7mDRhpq23/roVSZbI=",
+                    ],
+                    "name": "Misinformation Club",
+                    "id": "group.signal-club",
+                    "internal_id": "sEGA9F0HQ/eyLgmvKx23hha9Vp7mDRhpq23/roVSZbI=",
+                }
+            ],
+        )
+        rows = [
+            row
+            for row in _manual_contact_rows(self.user)
+            if str(row.get("service")) == "signal"
+            and str(row.get("detected_name")) == "Misinformation Club"
+        ]
+        self.assertEqual(1, len(rows))
+        self.assertEqual("group.signal-club", str(rows[0].get("identifier") or ""))
+        self.assertEqual(
+            ["sEGA9F0HQ/eyLgmvKx23hha9Vp7mDRhpq23/roVSZbI="],
+            rows[0].get("identifier_aliases"),
+        )

View File

@@ -0,0 +1,85 @@
from __future__ import annotations
from django.test import RequestFactory, TestCase
from django.urls import resolve, reverse
from core.context_processors import settings_hierarchy_nav
from core.security.capabilities import all_scope_keys
from core.models import CommandProfile, TaskProject, User
class SettingsIntegrityTests(TestCase):
def setUp(self):
self.user = User.objects.create_user(
username="settings-user",
email="settings@example.com",
password="x",
)
self.client.force_login(self.user)
self.factory = RequestFactory()
def test_permissions_page_shows_gateway_capabilities(self):
response = self.client.get(reverse("permission_settings"))
self.assertEqual(200, response.status_code)
self.assertContains(response, "Gateway contacts command")
self.assertContains(response, "Gateway help command")
self.assertContains(response, "Gateway whoami command")
for scope_key in all_scope_keys(configurable_only=True):
self.assertContains(response, scope_key)
def test_capability_registry_excludes_removed_totp_scope(self):
self.assertNotIn("gateway.totp", all_scope_keys())
def test_codex_settings_receives_modules_settings_nav(self):
response = self.client.get(reverse("codex_settings"))
self.assertEqual(200, response.status_code)
settings_nav = response.context.get("settings_nav")
self.assertIsNotNone(settings_nav)
self.assertEqual("Modules", settings_nav["title"])
labels = [str(item["label"]) for item in settings_nav["tabs"]]
self.assertIn("Commands", labels)
self.assertIn("Task Automation", labels)
def test_business_plan_inbox_receives_modules_settings_nav(self):
response = self.client.get(reverse("business_plan_inbox"))
self.assertEqual(200, response.status_code)
settings_nav = response.context.get("settings_nav")
self.assertIsNotNone(settings_nav)
self.assertEqual("Modules", settings_nav["title"])
def test_tasks_settings_cross_links_commands_and_permissions(self):
TaskProject.objects.create(user=self.user, name="Integrity Project")
response = self.client.get(reverse("tasks_settings"))
self.assertEqual(200, response.status_code)
self.assertContains(response, "Task Automation")
self.assertContains(response, reverse("command_routing"))
self.assertContains(response, reverse("permission_settings"))
def test_command_routing_cross_links_tasks_and_permissions(self):
CommandProfile.objects.create(
user=self.user,
slug="bp",
name="Business Plan",
enabled=True,
trigger_token=".bp",
reply_required=False,
exact_match_only=False,
)
response = self.client.get(reverse("command_routing"))
self.assertEqual(200, response.status_code)
self.assertContains(response, reverse("tasks_settings"))
self.assertContains(response, reverse("permission_settings"))
def test_settings_nav_includes_codex_approval_route(self):
request = self.factory.post(reverse("codex_approval"))
request.user = self.user
request.resolver_match = resolve(reverse("codex_approval"))
settings_nav = settings_hierarchy_nav(request)["settings_nav"]
self.assertEqual("Modules", settings_nav["title"])
def test_settings_nav_includes_translation_preview_route(self):
request = self.factory.post(reverse("translation_preview"))
request.user = self.user
request.resolver_match = resolve(reverse("translation_preview"))
settings_nav = settings_hierarchy_nav(request)["settings_nav"]
self.assertEqual("Modules", settings_nav["title"])

View File

@@ -43,3 +43,22 @@ class SignalRelinkTests(TestCase):
)
self.assertEqual(200, response.status_code)
mock_unlink_account.assert_called_once_with("signal", "+447000000001")
@patch("core.views.signal.transport.get_link_qr")
def test_signal_account_add_renders_notify_when_qr_fetch_fails(self, mock_get_link_qr):
mock_get_link_qr.side_effect = RuntimeError("timeout")
response = self.client.post(
reverse(
"signal_account_add",
kwargs={"type": "modal"},
),
{"device": "My Device"},
HTTP_HX_REQUEST="true",
HTTP_HX_TARGET="modals-here",
)
self.assertEqual(200, response.status_code)
self.assertContains(response, "modal is-active")
self.assertContains(response, "Signal QR link is unavailable right now")
self.assertContains(response, "timeout")

View File

@@ -1,3 +1,5 @@
import os
import tempfile
from unittest.mock import Mock, patch
from django.test import TestCase
@@ -6,6 +7,40 @@ from core.clients import transport
class SignalUnlinkFallbackTests(TestCase):
def test_signal_wipe_uses_project_signal_cli_config_dir(self):
with tempfile.TemporaryDirectory() as tmpdir:
signal_root = os.path.join(tmpdir, "signal-cli-config")
os.makedirs(signal_root, exist_ok=True)
account_dir = os.path.join(signal_root, "account-data")
os.makedirs(account_dir, exist_ok=True)
keep_file = os.path.join(signal_root, "jsonrpc2.yml")
with open(keep_file, "w", encoding="utf-8") as handle:
handle.write("jsonrpc")
data_file = os.path.join(account_dir, "state.db")
with open(data_file, "w", encoding="utf-8") as handle:
handle.write("state")
with patch.object(transport.settings, "BASE_DIR", tmpdir):
result = transport._wipe_signal_cli_local_state()
self.assertTrue(result)
self.assertTrue(os.path.exists(keep_file))
self.assertFalse(os.path.exists(account_dir))
@patch("requests.get")
def test_signal_list_accounts_uses_fast_timeout(self, mock_get):
ok_response = Mock()
ok_response.ok = True
ok_response.text = "[]"
mock_get.return_value = ok_response
result = transport.list_accounts("signal")
self.assertEqual([], result)
mock_get.assert_called_once()
_, kwargs = mock_get.call_args
self.assertEqual(5, int(kwargs.get("timeout") or 0))
@patch("core.clients.transport._wipe_signal_cli_local_state")
@patch("requests.delete")
def test_signal_unlink_uses_rest_delete_when_available(
@@ -40,3 +75,25 @@ class SignalUnlinkFallbackTests(TestCase):
self.assertTrue(result)
self.assertEqual(2, mock_delete.call_count)
mock_wipe.assert_called_once()
@patch("core.clients.transport.list_accounts")
@patch("core.clients.transport._wipe_signal_cli_local_state")
@patch("requests.delete")
def test_signal_unlink_returns_false_when_account_still_listed_after_wipe(
self,
mock_delete,
mock_wipe,
mock_list_accounts,
):
bad_response = Mock()
bad_response.ok = False
mock_delete.return_value = bad_response
mock_wipe.return_value = True
mock_list_accounts.return_value = ["+447700900000"]
result = transport.unlink_account("signal", "+447700900000")
self.assertFalse(result)
self.assertEqual(2, mock_delete.call_count)
mock_wipe.assert_called_once()
mock_list_accounts.assert_called_once_with("signal")

View File

@@ -251,6 +251,28 @@ class TasksPagesManagementTests(TestCase):
self.assertEqual(200, response.status_code)
self.assertContains(response, "Scope Person")
def test_tasks_hub_can_create_manual_task_without_chat_source(self):
project = TaskProject.objects.create(user=self.user, name="Manual Project")
response = self.client.post(
reverse("tasks_hub"),
{
"action": "task_create",
"project_id": str(project.id),
"title": "Manual web task",
"due_date": "2026-03-10",
"assignee_identifier": "@operator",
},
follow=True,
)
self.assertEqual(200, response.status_code)
task = DerivedTask.objects.get(user=self.user, project=project, title="Manual web task")
self.assertEqual("web", task.source_service)
self.assertEqual("@operator", task.assignee_identifier)
self.assertEqual("2026-03-10", task.due_date.isoformat())
event = task.events.order_by("-created_at").first()
self.assertEqual("created", event.event_type)
self.assertEqual("web_ui", str((event.payload or {}).get("via") or ""))
def test_project_page_creator_column_links_to_compose(self):
project = TaskProject.objects.create(user=self.user, name="Creator Link Test")
session = ChatSession.objects.create(user=self.user, identifier=self.pid_signal)

View File

@@ -7,7 +7,7 @@ from django.core.management import call_command
from django.test import TestCase
from core.clients.whatsapp import WhatsAppClient
from core.messaging import history
from core.messaging import history, media_bridge
from core.models import (
ChatSession,
ContactAvailabilityEvent,
@@ -25,6 +25,16 @@ class _DummyXMPPClient:
return None
class _DummyDownloadClient:
def __init__(self, payload: bytes):
self.payload = payload
self.calls = []
async def download_any(self, message):
self.calls.append(message)
return self.payload
class _DummyUR:
def __init__(self, loop):
self.loop = loop
@@ -122,6 +132,49 @@ class WhatsAppReactionHandlingTests(TestCase):
self.assertEqual("offline", payload.get("presence"))
self.assertTrue(int(payload.get("last_seen_ts") or 0) > 0)
def test_download_event_media_unwraps_device_sent_image(self):
downloader = _DummyDownloadClient(b"png-bytes")
self.client._client = downloader
event = {
"message": {
"deviceSentMessage": {
"message": {
"imageMessage": {
"caption": "wrapped image",
"mimetype": "image/png",
}
}
}
}
}
attachments = async_to_sync(self.client._download_event_media)(event)
self.assertEqual(1, len(attachments))
self.assertEqual(1, len(downloader.calls))
self.assertEqual(
{"imageMessage": {"caption": "wrapped image", "mimetype": "image/png"}},
downloader.calls[0],
)
blob = media_bridge.get_blob(attachments[0]["blob_key"])
self.assertIsNotNone(blob)
self.assertEqual(b"png-bytes", blob["content"])
self.assertEqual("image/png", blob["content_type"])
def test_message_text_unwraps_device_sent_caption(self):
text = self.client._message_text(
{
"deviceSentMessage": {
"message": {
"imageMessage": {
"caption": "caption from wrapper",
}
}
}
}
)
self.assertEqual("caption from wrapper", text)
class RecalculateContactAvailabilityTests(TestCase):
def setUp(self):

View File

@@ -5,7 +5,11 @@ from unittest.mock import MagicMock
from asgiref.sync import async_to_sync
from django.test import TestCase
from core.clients.xmpp import XMPPComponent
from core.gateway.builtin import (
gateway_help_lines,
handle_approval_command,
handle_tasks_command,
)
from core.models import (
CodexPermissionRequest,
CodexRun,
@@ -16,19 +20,6 @@ from core.models import (
)
class _ApprovalProbe:
_resolve_request_provider = XMPPComponent._resolve_request_provider
_approval_event_prefix = XMPPComponent._approval_event_prefix
_APPROVAL_PROVIDER_COMMANDS = XMPPComponent._APPROVAL_PROVIDER_COMMANDS
_ACTION_TO_STATUS = XMPPComponent._ACTION_TO_STATUS
_apply_approval_decision = XMPPComponent._apply_approval_decision
_approval_list_pending = XMPPComponent._approval_list_pending
_approval_status = XMPPComponent._approval_status
_handle_approval_command = XMPPComponent._handle_approval_command
_gateway_help_lines = XMPPComponent._gateway_help_lines
_handle_tasks_command = XMPPComponent._handle_tasks_command
class XMPPGatewayApprovalCommandTests(TestCase):
def setUp(self):
self.user = User.objects.create_user(
@@ -43,7 +34,7 @@ class XMPPGatewayApprovalCommandTests(TestCase):
epic=None,
title="Approve me",
source_service="xmpp",
source_channel="jews.zm.is",
source_channel="component.example.test",
reference_code="77",
status_snapshot="open",
)
@@ -60,7 +51,7 @@ class XMPPGatewayApprovalCommandTests(TestCase):
task=self.task,
project=self.project,
source_service="xmpp",
source_channel="jews.zm.is",
source_channel="component.example.test",
status="waiting_approval",
request_payload={
"action": "append_update",
@@ -78,8 +69,7 @@ class XMPPGatewayApprovalCommandTests(TestCase):
resume_payload={},
status="pending",
)
self.probe = _ApprovalProbe()
self.probe = MagicMock()
self.probe.log = MagicMock()
def _run_command(self, text: str) -> list[str]:
messages = []
@@ -87,11 +77,9 @@ class XMPPGatewayApprovalCommandTests(TestCase):
def _sym(value):
messages.append(str(value))
handled = async_to_sync(XMPPComponent._handle_approval_command)(
handled = async_to_sync(handle_approval_command)(
self.probe,
self.user,
text,
"xmpp-approval-user@zm.is/mobile",
_sym,
)
self.assertTrue(handled)
@@ -140,12 +128,11 @@ class XMPPGatewayTasksCommandTests(TestCase):
epic=None,
title="Ship CLI",
source_service="xmpp",
source_channel="jews.zm.is",
source_channel="component.example.test",
reference_code="12",
status_snapshot="open",
)
self.probe = _ApprovalProbe()
self.probe = MagicMock()
self.probe.log = MagicMock()
def _run_tasks(self, text: str) -> list[str]:
messages = []
@@ -153,8 +140,7 @@ class XMPPGatewayTasksCommandTests(TestCase):
def _sym(value):
messages.append(str(value))
handled = async_to_sync(XMPPComponent._handle_tasks_command)(
handled = async_to_sync(handle_tasks_command)(
self.probe,
self.user,
text,
_sym,
@@ -164,14 +150,18 @@ class XMPPGatewayTasksCommandTests(TestCase):
return messages
def test_help_contains_approval_and_tasks_sections(self):
lines = self.probe._gateway_help_lines()
lines = gateway_help_lines()
text = "\n".join(lines)
self.assertIn(".approval list-pending", text)
self.assertIn(".tasks list", text)
self.assertIn(".tasks add", text)
self.assertIn(".l", text)
def test_tasks_list_show_complete_and_undo(self):
rows = self._run_tasks(".tasks list open 10")
self.assertIn("#12", "\n".join(rows))
rows = self._run_tasks(".l")
self.assertIn("#12", "\n".join(rows))
rows = self._run_tasks(".tasks show #12")
self.assertIn("status: open", "\n".join(rows))
rows = self._run_tasks(".tasks complete #12")
@@ -181,3 +171,24 @@ class XMPPGatewayTasksCommandTests(TestCase):
rows = self._run_tasks(".tasks undo #12")
self.assertIn("removed #12", "\n".join(rows))
self.assertFalse(DerivedTask.objects.filter(id=self.task.id).exists())
def test_tasks_add_creates_task_in_named_project(self):
rows = []
handled = async_to_sync(handle_tasks_command)(
self.user,
".tasks add Task Project :: Wire XMPP manual task create",
lambda value: rows.append(str(value)),
service="xmpp",
channel_identifier="component.example.test",
sender_identifier="operator@example.test",
)
self.assertTrue(handled)
self.assertTrue(any("created #" in row.lower() for row in rows))
created = DerivedTask.objects.filter(
user=self.user,
project=self.project,
title="Wire XMPP manual task create",
source_service="xmpp",
source_channel="component.example.test",
).first()
self.assertIsNotNone(created)

View File

@@ -0,0 +1,103 @@
from unittest.mock import AsyncMock, patch
from asgiref.sync import async_to_sync
from django.test import SimpleTestCase, TestCase, override_settings
from core.clients import transport
from core.clients.xmpp import XMPPComponent, _resolve_person_from_xmpp_localpart
from core.models import Person, PersonIdentifier, User
class SignalAttachmentFetchTests(SimpleTestCase):
def test_signal_service_allows_direct_url_fetch(self):
response = AsyncMock()
response.status = 200
response.headers = {"Content-Type": "image/png"}
response.read = AsyncMock(return_value=b"png-bytes")
request_ctx = AsyncMock()
request_ctx.__aenter__.return_value = response
request_ctx.__aexit__.return_value = False
session = AsyncMock()
session.get.return_value = request_ctx
session_ctx = AsyncMock()
session_ctx.__aenter__.return_value = session
session_ctx.__aexit__.return_value = False
with patch(
"core.clients.transport.aiohttp.ClientSession",
return_value=session_ctx,
):
fetched = async_to_sync(transport.fetch_attachment)(
"signal",
{
"url": "https://example.com/file_share/demo.png",
"filename": "demo.png",
"content_type": "image/png",
},
)
self.assertEqual(b"png-bytes", fetched["content"])
self.assertEqual("image/png", fetched["content_type"])
self.assertEqual("demo.png", fetched["filename"])
self.assertEqual(9, fetched["size"])
@override_settings(
XMPP_JID="component.example.test",
XMPP_USER_DOMAIN="example.test",
)
class XMPPContactJidTests(TestCase):
def _component(self):
return XMPPComponent(
ur=AsyncMock(),
jid="component.example.test",
secret="secret",
server="localhost",
port=5347,
)
def test_resolve_person_from_escaped_localpart(self):
user = User.objects.create_user(username="user", password="pw")
person = Person.objects.create(user=user, name="Misinformation Club")
resolved = _resolve_person_from_xmpp_localpart(
user=user,
localpart_value=r"misinformation\20club",
)
self.assertEqual(person.id, resolved.id)
def test_send_from_external_escapes_contact_jid(self):
user = User.objects.create_user(username="user2", password="pw")
person = Person.objects.create(user=user, name="Misinformation Club")
identifier = PersonIdentifier.objects.create(
user=user,
person=person,
service="signal",
identifier="group.example",
)
component = self._component()
component.send_xmpp_message = AsyncMock(return_value="xmpp-id")
with (
patch("core.clients.xmpp.transport.record_bridge_mapping"),
patch("core.clients.xmpp.history.save_bridge_ref", new=AsyncMock()),
):
async_to_sync(component.send_from_external)(
user,
identifier,
"hello",
False,
attachments=[],
source_ref={},
)
component.send_xmpp_message.assert_awaited_once_with(
"user2@example.test",
r"misinformation\20club|signal@component.example.test",
"hello",
use_omemo_encryption=True,
)

View File

@@ -0,0 +1,165 @@
from types import SimpleNamespace
from unittest.mock import AsyncMock, MagicMock, patch
from asgiref.sync import async_to_sync
from django.test import TestCase, override_settings
from slixmpp.plugins.xep_0356.permissions import MessagePermission
from core.clients.xmpp import XMPPComponent
from core.models import Person, PersonIdentifier, User
@override_settings(
XMPP_JID="component.example.test",
XMPP_USER_DOMAIN="example.test",
)
class XMPPCarbonTests(TestCase):
def _component(self):
component = XMPPComponent(
ur=MagicMock(),
jid="component.example.test",
secret="secret",
server="localhost",
port=5347,
)
component.log = MagicMock()
return component
def test_build_privileged_outbound_message_targets_contact(self):
component = self._component()
msg = component._build_privileged_outbound_message(
user_jid="user@example.test",
contact_jid="contact|signal@component.example.test",
body_text="hello from signal",
attachment_url="https://files.example.test/demo.png",
)
self.assertEqual("user@example.test", str(msg["from"]))
self.assertEqual("contact|signal@component.example.test", str(msg["to"]))
body = next(
(child for child in msg.xml if str(child.tag).endswith("body")),
None,
)
self.assertIsNotNone(body)
self.assertEqual("hello from signal", body.text)
oob = msg.xml.find(".//{jabber:x:oob}url")
self.assertIsNotNone(oob)
self.assertEqual("https://files.example.test/demo.png", oob.text)
def test_send_sent_carbon_copy_requires_outgoing_privilege(self):
component = self._component()
plugin = SimpleNamespace(
granted_privileges={"example.test": SimpleNamespace(message="none")},
send_privileged_message=MagicMock(),
_make_privileged_message=MagicMock(),
)
with (
patch.object(component.plugin, "get", return_value=plugin),
patch.object(component, "_user_xmpp_domain", return_value="other.example.test"),
):
sent = async_to_sync(component.send_sent_carbon_copy)(
user_jid="user@example.test",
contact_jid="contact|signal@component.example.test",
body_text="hello",
)
self.assertFalse(sent)
plugin.send_privileged_message.assert_not_called()
plugin._make_privileged_message.assert_not_called()
def test_send_sent_carbon_copy_sends_privileged_message_when_allowed(self):
component = self._component()
plugin = SimpleNamespace(
granted_privileges={
"example.test": SimpleNamespace(message=MessagePermission.OUTGOING)
},
send_privileged_message=MagicMock(),
)
with patch.object(component.plugin, "get", return_value=plugin):
sent = async_to_sync(component.send_sent_carbon_copy)(
user_jid="user@example.test",
contact_jid="contact|signal@component.example.test",
body_text="hello",
)
self.assertTrue(sent)
plugin.send_privileged_message.assert_called_once()
sent_message = plugin.send_privileged_message.call_args.args[0]
self.assertEqual("contact|signal@component.example.test", str(sent_message["to"]))
self.assertEqual("user@example.test", str(sent_message["from"]))
self.assertIsNotNone(
next(
(child for child in sent_message.xml if str(child.tag).endswith("body")),
None,
)
)
def test_send_sent_carbon_copy_uses_configured_domain_without_advertisement(self):
component = self._component()
wrapped = MagicMock()
plugin = SimpleNamespace(
granted_privileges={},
send_privileged_message=MagicMock(),
_make_privileged_message=MagicMock(return_value=wrapped),
)
with patch.object(component.plugin, "get", return_value=plugin):
sent = async_to_sync(component.send_sent_carbon_copy)(
user_jid="user@example.test",
contact_jid="contact|signal@component.example.test",
body_text="hello",
)
self.assertTrue(sent)
plugin.send_privileged_message.assert_not_called()
plugin._make_privileged_message.assert_called_once()
wrapped.send.assert_called_once()
def test_outgoing_relay_keeps_you_prefix_while_attempting_carbon_copy(self):
component = self._component()
user = User.objects.create_user(username="user", password="pw")
person = Person.objects.create(user=user, name="Contact")
identifier = PersonIdentifier.objects.create(
user=user,
person=person,
service="signal",
identifier="+15550000000",
)
call_order = []
async def send_xmpp_message(*args, **kwargs):
call_order.append("fallback")
return "xmpp-message-id"
async def send_sent_carbon_copy(*args, **kwargs):
call_order.append("carbon")
return True
component.send_sent_carbon_copy = AsyncMock(side_effect=send_sent_carbon_copy)
component.send_xmpp_message = AsyncMock(side_effect=send_xmpp_message)
with (
patch("core.clients.xmpp.transport.record_bridge_mapping"),
patch("core.clients.xmpp.history.save_bridge_ref", new=AsyncMock()),
):
async_to_sync(component.send_from_external)(
user,
identifier,
"hello",
True,
attachments=[],
source_ref={},
)
component.send_sent_carbon_copy.assert_awaited_once_with(
user_jid="user@example.test",
contact_jid="contact|signal@component.example.test",
body_text="hello",
)
component.send_xmpp_message.assert_awaited_once_with(
"user@example.test",
"contact|signal@component.example.test",
"YOU: hello",
use_omemo_encryption=True,
)
self.assertEqual(["fallback", "carbon"], call_order)

View File

@@ -54,12 +54,12 @@ def _xmpp_c2s_port() -> int:
def _xmpp_domain() -> str:
"""The VirtualHost domain (zm.is), derived from XMPP_JID or XMPP_DOMAIN."""
"""The VirtualHost domain, derived from XMPP_JID or XMPP_DOMAIN."""
domain = getattr(settings, "XMPP_DOMAIN", None)
if domain:
return str(domain)
jid = str(settings.XMPP_JID)
# Component JID is like "jews.zm.is" → parent domain is "zm.is"
# Component JIDs may be subdomains; derive the parent domain when needed.
parts = jid.split(".")
if len(parts) > 2:
return ".".join(parts[1:])
@@ -389,7 +389,10 @@ class XMPPAuthBridgeTests(SimpleTestCase):
"""Auth bridge returns 0 (or error) for a request with a wrong XMPP_SECRET."""
_, host, port, path = self._parse_endpoint()
# isuser command with wrong secret — should be rejected or return 0
query = "?command=isuser%3Anonexistent%3Azm.is&secret=wrongsecret"
query = (
f"?command=isuser%3Anonexistent%3A{urllib.parse.quote(_xmpp_domain())}"
"&secret=wrongsecret"
)
try:
conn = http.client.HTTPConnection(host, port, timeout=5)
conn.request("GET", path + query)
@@ -410,7 +413,8 @@ class XMPPAuthBridgeTests(SimpleTestCase):
secret = getattr(settings, "XMPP_SECRET", "")
_, host, port, path = self._parse_endpoint()
query = (
f"?command=isuser%3Anonexistent%3Azm.is&secret={urllib.parse.quote(secret)}"
f"?command=isuser%3Anonexistent%3A{urllib.parse.quote(_xmpp_domain())}"
f"&secret={urllib.parse.quote(secret)}"
)
try:
conn = http.client.HTTPConnection(host, port, timeout=5)

View File

@@ -9,7 +9,7 @@ from core.models import User, UserXmppOmemoState
@override_settings(
XMPP_JID="jews.zm.is",
XMPP_JID="component.example.test",
XMPP_SECRET="secret",
XMPP_ADDRESS="127.0.0.1",
XMPP_PORT=8888,
@@ -51,14 +51,14 @@ class XMPPOmemoObservationPersistenceTests(TestCase):
async_to_sync(XMPPComponent._record_sender_omemo_state)(
xmpp_component,
user,
sender_jid="xmpp-omemo-user@zm.is/mobile",
sender_jid="xmpp-omemo-user@example.test/mobile",
recipient_jid="jews.zm.is",
recipient_jid="component.example.test",
message_stanza=SimpleNamespace(xml=stanza_xml),
)
row = UserXmppOmemoState.objects.get(user=user)
self.assertEqual("detected", row.status)
self.assertEqual("sid:321,rid:654", row.latest_client_key)
self.assertEqual("jews.zm.is", row.last_target_jid)
self.assertEqual("component.example.test", row.last_target_jid)
class XMPPOmemoEnforcementTests(TestCase):
@@ -78,7 +78,7 @@ class XMPPOmemoEnforcementTests(TestCase):
# Create a plaintext message stanza (no OMEMO encryption)
stanza_xml = ET.fromstring(
"<message from='sender@example.com' to='jews.zm.is'>"
"<message from='sender@example.com' to='component.example.test'>"
"<body>Hello, world!</body>"
"</message>"
)
@@ -125,7 +125,7 @@ class XMPPOmemoEnforcementTests(TestCase):
# Create an OMEMO-encrypted message stanza
stanza_xml = ET.fromstring(
"<message from='sender@example.com' to='jews.zm.is'>"
"<message from='sender@example.com' to='component.example.test'>"
"<encrypted xmlns='eu.siacs.conversations.axolotl'>"
"<header sid='77'><key rid='88'>x</key></header>"
"</encrypted>"
@@ -167,14 +167,14 @@ class XMPPOmemoDeviceDiscoveryTests(TestCase):
# Create a mock XMPP component
self.mock_component = MagicMock()
self.mock_component.log = MagicMock()
self.mock_component.jid = "jews.zm.is"
self.mock_component.jid = "component.example.test"
def test_gateway_publishes_device_list_to_pubsub(self):
"""Test that the gateway publishes its device list to PubSub (XEP-0060).
This simulates the device discovery query that real XMPP clients perform.
When a client wants to send an OMEMO message, it:
1. Queries the PubSub node: pubsub.example.com/eu.siacs.conversations.axolotl/devices/jews.zm.is
1. Queries the PubSub node: pubsub.example.com/eu.siacs.conversations.axolotl/devices/component.example.test
2. Expects to receive a device list with at least one device
3. Retrieves keys for those devices
4. Encrypts the message
@@ -261,7 +261,7 @@ class XMPPOmemoDeviceDiscoveryTests(TestCase):
"""
# Simulate an OMEMO-encrypted message from a client device
client_stanza = ET.fromstring(
"<message from='testuser@example.com/mobile' to='jews.zm.is'>"
"<message from='testuser@example.com/mobile' to='component.example.test'>"
"<encrypted xmlns='eu.siacs.conversations.axolotl'>"
"<header sid='12345' schemeVersion='2'>"  # Device 12345
"<key rid='67890'>encrypted_payload_1</key>"  # To recipient device 67890
@@ -289,7 +289,7 @@ class XMPPOmemoDeviceDiscoveryTests(TestCase):
The OMEMO bootstrap must:
1. Initialize the session manager (which auto-creates devices)
2. Publish device list to PubSub at: eu.siacs.conversations.axolotl/devices/jews.zm.is
2. Publish device list to PubSub at: eu.siacs.conversations.axolotl/devices/component.example.test
3. Allow clients to discover and query those devices
If PubSub is slow or unavailable, this times out and prevents
@@ -309,15 +309,15 @@ class XMPPOmemoDeviceDiscoveryTests(TestCase):
def test_component_jid_device_discovery(self):
"""Test that component JIDs (without user@) can publish OMEMO devices.
A key issue with components: they use JIDs like 'jews.zm.is' instead of
A key issue with components: they use JIDs like 'component.example.test' instead of
'user@jews.zm.is'. This affects:
'user@component.example.test'. This affects:
1. Device list node path: eu.siacs.conversations.axolotl/devices/jews.zm.is
1. Device list node path: eu.siacs.conversations.axolotl/devices/component.example.test
2. Device identity and trust establishment
3. How clients discover and encrypt to the component
The OMEMO plugin must handle component JIDs correctly.
"""
component_jid = "jews.zm.is"
component_jid = "component.example.test"
# Component JID format (no user@ part)
self.assertNotIn("@", component_jid)
@@ -325,21 +325,21 @@ class XMPPOmemoDeviceDiscoveryTests(TestCase):
         # But PubSub device node still follows standard format
         pubsub_node = f"eu.siacs.conversations.axolotl/devices/{component_jid}"
         self.assertEqual(
-            "eu.siacs.conversations.axolotl/devices/jews.zm.is", pubsub_node
+            "eu.siacs.conversations.axolotl/devices/component.example.test", pubsub_node
         )

     def test_gateway_accepts_presence_subscription_for_omemo(self):
         """Test that gateway auto-accepts presence subscriptions for OMEMO device discovery.

-        When a client subscribes to the gateway component (jews.zm.is) for OMEMO:
-        1. Client sends: <presence type="subscribe" from="user@example.com" to="jews.zm.is"/>
+        When a client subscribes to the gateway component JID for OMEMO:
+        1. Client sends: <presence type="subscribe" from="user@example.com" to="component.example.test"/>
         2. Gateway should auto-accept and send presence availability
         3. This allows the client to add the gateway to its roster
         4. Client can then query PubSub for device lists
         """
         # Simulate a client sending presence subscription to gateway
         client_jid = "testclient@example.com"
-        gateway_jid = "jews.zm.is"
+        gateway_jid = "component.example.test"

         # Create a mock XMPP component with the subscription handler
         mock_component = MagicMock()
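The tests above pin the PubSub node format for OMEMO device lists on component JIDs. As a rough standalone illustration (the helper name is hypothetical, not from this codebase), the node path is just the axolotl namespace plus the bare JID, identically for accounts and components:

```python
AXOLOTL_NS = "eu.siacs.conversations.axolotl"


def device_list_node(jid: str) -> str:
    # Components use a bare domain JID ("component.example.test");
    # regular accounts use "user@domain". The node format is the same.
    bare = jid.split("/", 1)[0]  # strip any resource suffix
    return f"{AXOLOTL_NS}/devices/{bare}"
```

Clients that resolve this node for a component JID need no special casing, which is what `test_component_jid_device_discovery` asserts.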

View File

@@ -143,6 +143,7 @@ class CommandRoutingSettings(LoginRequiredMixin, View):
             "command_choices": (
                 ("bp", "Business Plan (bp)"),
                 ("codex", "Codex (codex)"),
+                ("claude", "Claude (claude)"),
             ),
             "scope_service": scope_service,
             "scope_identifier": scope_identifier,
@@ -165,21 +166,27 @@ class CommandRoutingSettings(LoginRequiredMixin, View):
             .lower()
             or "bp"
         )
+        default_name = {
+            "bp": "Business Plan",
+            "codex": "Codex",
+            "claude": "Claude",
+        }.get(slug, "Business Plan")
+        default_trigger = {
+            "bp": ".bp",
+            "codex": ".codex",
+            "claude": ".claude",
+        }.get(slug, ".bp")
         profile, _ = CommandProfile.objects.get_or_create(
             user=request.user,
             slug=slug,
             defaults={
-                "name": str(
-                    request.POST.get("name")
-                    or ("Codex" if slug == "codex" else "Business Plan")
-                ).strip()
-                or ("Codex" if slug == "codex" else "Business Plan"),
+                "name": str(request.POST.get("name") or default_name).strip()
+                or default_name,
                 "enabled": True,
                 "trigger_token": str(
-                    request.POST.get("trigger_token")
-                    or (".codex" if slug == "codex" else ".bp")
+                    request.POST.get("trigger_token") or default_trigger
                 ).strip()
-                or (".codex" if slug == "codex" else ".bp"),
+                or default_trigger,
                 "template_text": str(request.POST.get("template_text") or ""),
             },
         )
@@ -195,6 +202,10 @@ class CommandRoutingSettings(LoginRequiredMixin, View):
             profile.trigger_token = ".codex"
             profile.reply_required = False
             profile.exact_match_only = False
+        if slug == "claude":
+            profile.trigger_token = ".claude"
+            profile.reply_required = False
+            profile.exact_match_only = False
         profile.save(
             update_fields=[
                 "name",

View File

@@ -7,7 +7,7 @@ import time
 from datetime import datetime
 from datetime import timezone as dt_timezone
 from difflib import SequenceMatcher
-from urllib.parse import quote_plus, urlencode, urlparse
+from urllib.parse import urlencode, urlparse

 from asgiref.sync import async_to_sync
 from django.conf import settings
@@ -136,6 +136,13 @@ def _identifier_variants(service: str, identifier: str) -> list[str]:
     return variants


+def _group_channel_identifier(service: str, group_link: PlatformChatLink, bare_id: str) -> str:
+    service_key = _default_service(service)
+    if service_key == "whatsapp":
+        return str(group_link.chat_jid or f"{bare_id}@g.us").strip()
+    return bare_id
+
+
 def _safe_limit(raw) -> int:
     try:
         value = int(raw or 40)
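`_group_channel_identifier` keeps WhatsApp groups addressable by their full `@g.us` JID while other services keep the bare id. A simplified standalone sketch (the `PlatformChatLink` access is reduced to a plain attribute, so this is an assumption about shape, not the real model):

```python
from dataclasses import dataclass


@dataclass
class GroupLink:
    chat_jid: str = ""  # stand-in for PlatformChatLink.chat_jid


def group_channel_identifier(service: str, link: GroupLink, bare_id: str) -> str:
    # WhatsApp groups are addressed as "<id>@g.us"; prefer the stored JID
    # when present, otherwise synthesize it from the bare id.
    if service == "whatsapp":
        return (link.chat_jid or f"{bare_id}@g.us").strip()
    return bare_id
```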
@@ -424,7 +431,7 @@ def _extract_attachment_image_urls(blob) -> list[str]:
             direct_urls.append(normalized)
     urls.extend(direct_urls)
     blob_key = str(blob.get("blob_key") or "").strip()
-    # Prefer source-hosted URLs (for example share.zm.is) and use blob fallback only
+    # Prefer source-hosted URLs and use blob fallback only
     # when no usable direct URL exists.
     if blob_key and image_hint and not direct_urls:
         urls.append(f"/compose/media/blob/?key={quote_plus(blob_key)}")
@@ -1783,6 +1790,11 @@ def _context_base(user, service, identifier, person):
         service=service,
         identifier__in=identifier_variants or [identifier],
     ).first()
+    if person_identifier is None and identifier and person is None:
+        person_identifier = PersonIdentifier.objects.filter(
+            user=user,
+            identifier__in=identifier_variants or [identifier],
+        ).first()
     if person_identifier:
         service = person_identifier.service
@@ -1811,7 +1823,7 @@ def _context_base(user, service, identifier, person):
         return {
             "person_identifier": None,
             "service": service,
-            "identifier": f"{bare_id}@g.us",
+            "identifier": _group_channel_identifier(service, group_link, bare_id),
             "person": None,
             "is_group": True,
             "group_name": group_link.chat_name or bare_id,
@@ -2426,6 +2438,63 @@ def _signal_identifier_shape(value: str) -> str:
return "other" return "other"
def _preferred_signal_identifier(identifiers: list[str], *, is_group: bool) -> str:
cleaned = []
for value in identifiers:
candidate = str(value or "").strip()
if candidate and candidate not in cleaned:
cleaned.append(candidate)
if not cleaned:
return ""
if is_group:
for candidate in cleaned:
if candidate.startswith("group."):
return candidate
return cleaned[0]
for candidate in cleaned:
if _signal_identifier_shape(candidate) == "phone":
return candidate
for candidate in cleaned:
if _signal_identifier_shape(candidate) == "uuid":
return candidate
return cleaned[0]
def _signal_runtime_alias_map() -> dict[str, set[str]]:
state = transport.get_runtime_state("signal") or {}
alias_map: dict[str, set[str]] = {}
for bucket_name in ("contacts", "groups"):
rows = state.get(bucket_name) or []
if not isinstance(rows, list):
continue
for item in rows:
if not isinstance(item, dict):
continue
identifiers = []
candidates = item.get("identifiers")
if isinstance(candidates, list):
identifiers.extend(candidates)
identifiers.extend(
[
item.get("identifier"),
item.get("number"),
item.get("uuid"),
item.get("id"),
item.get("internal_id"),
]
)
unique = []
for value in identifiers:
candidate = str(value or "").strip()
if candidate and candidate not in unique:
unique.append(candidate)
if len(unique) < 2:
continue
for identifier in unique:
alias_map[identifier] = {value for value in unique if value != identifier}
return alias_map
def _manual_contact_rows(user):
    rows = []
    seen = set()
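`_preferred_signal_identifier` above prefers `group.` ids for groups and phone numbers over UUIDs for contacts. A condensed sketch with a naive shape detector (the real `_signal_identifier_shape` is not shown in this diff, so the regexes here are assumptions):

```python
import re


def identifier_shape(value: str) -> str:
    # Assumed shapes: E.164-style phone vs. 36-char hyphenated UUID.
    if re.fullmatch(r"\+\d{6,15}", value):
        return "phone"
    if re.fullmatch(r"[0-9a-fA-F-]{36}", value):
        return "uuid"
    return "other"


def preferred_identifier(identifiers: list[str], *, is_group: bool) -> str:
    cleaned = [v.strip() for v in identifiers if v and v.strip()]
    if not cleaned:
        return ""
    if is_group:
        return next((v for v in cleaned if v.startswith("group.")), cleaned[0])
    for shape in ("phone", "uuid"):  # preference order for contacts
        for v in cleaned:
            if identifier_shape(v) == shape:
                return v
    return cleaned[0]
```

Preferring the phone number makes the displayed identifier stable across Signal's uuid/number aliasing, while the alias map keeps the other forms searchable.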
@@ -2493,6 +2562,8 @@ def _manual_contact_rows(user):
         )

     for row in identifiers:
+        if _default_service(row.service) == "signal":
+            continue
         add_row(
             service=row.service,
             identifier=row.identifier,
@@ -2500,6 +2571,27 @@ def _manual_contact_rows(user):
             source="linked",
         )

+    group_links = (
+        PlatformChatLink.objects.filter(user=user, is_group=True)
+        .order_by("service", "chat_name", "chat_identifier")
+    )
+    for link in group_links:
+        if _default_service(link.service) == "signal":
+            continue
+        group_identifier = _group_channel_identifier(
+            str(link.service or "").strip(),
+            link,
+            str(link.chat_identifier or "").strip(),
+        )
+        if not group_identifier:
+            continue
+        add_row(
+            service=link.service,
+            identifier=group_identifier,
+            source=f"{_default_service(link.service)}_group",
+            detected_name=str(link.chat_name or "").strip(),
+        )
+
     signal_links = {
         str(row.identifier): row
         for row in (
@@ -2508,6 +2600,163 @@ def _manual_contact_rows(user):
.order_by("id") .order_by("id")
) )
} }
signal_state = transport.get_runtime_state("signal") or {}
signal_accounts = [
str(value or "").strip()
for value in (signal_state.get("accounts") or [])
if str(value or "").strip()
]
signal_account_set = set(signal_accounts)
signal_entities = {}
signal_alias_index = {}
def _signal_entity_key(identifiers_list: list[str], *, is_group: bool) -> str:
preferred = _preferred_signal_identifier(identifiers_list, is_group=is_group)
if is_group:
return f"group:{preferred}"
for candidate in identifiers_list:
if _signal_identifier_shape(candidate) == "uuid":
return f"contact:uuid:{candidate.lower()}"
for candidate in identifiers_list:
if _signal_identifier_shape(candidate) == "phone":
return f"contact:phone:{candidate}"
return f"contact:other:{preferred.lower()}"
def _resolve_signal_entity_key(candidate: str) -> str:
cleaned = str(candidate or "").strip()
if not cleaned:
return ""
for variant in _identifier_variants("signal", cleaned):
entity_key = signal_alias_index.get(variant)
if entity_key:
return entity_key
return ""
def _register_signal_entity(
*,
identifiers_list,
is_group: bool,
detected_name="",
person=None,
source="signal_runtime",
):
unique_identifiers = []
for value in identifiers_list or []:
cleaned = str(value or "").strip()
if (
not cleaned
or cleaned in unique_identifiers
or cleaned in signal_account_set
):
continue
unique_identifiers.append(cleaned)
if not unique_identifiers:
return None
entity_key = ""
for identifier in unique_identifiers:
entity_key = _resolve_signal_entity_key(identifier)
if entity_key:
break
if not entity_key:
entity_key = _signal_entity_key(unique_identifiers, is_group=is_group)
entity = signal_entities.get(entity_key)
if entity is None:
entity = {
"is_group": bool(is_group),
"identifiers": [],
"detected_name": _clean_detected_name(detected_name or ""),
"person": person,
"sources": set(),
}
signal_entities[entity_key] = entity
for identifier in unique_identifiers:
if identifier not in entity["identifiers"]:
entity["identifiers"].append(identifier)
for variant in _identifier_variants("signal", identifier):
signal_alias_index[variant] = entity_key
cleaned_name = _clean_detected_name(detected_name or "")
if cleaned_name and not entity["detected_name"]:
entity["detected_name"] = cleaned_name
if person is not None and entity.get("person") is None:
entity["person"] = person
entity["sources"].add(str(source or "").strip() or "signal_runtime")
return entity
for row in signal_links.values():
_register_signal_entity(
identifiers_list=[row.identifier],
is_group=False,
detected_name=(str(row.person.name or "").strip() if row.person else ""),
person=row.person,
source="linked",
)
signal_group_links = (
PlatformChatLink.objects.filter(user=user, service="signal", is_group=True)
.order_by("chat_name", "chat_identifier")
)
for link in signal_group_links:
group_identifier = _group_channel_identifier(
"signal",
link,
str(link.chat_identifier or "").strip(),
)
if not group_identifier:
continue
_register_signal_entity(
identifiers_list=[group_identifier],
is_group=True,
detected_name=str(link.chat_name or "").strip(),
source="signal_group",
)
signal_contacts = signal_state.get("contacts") or []
if isinstance(signal_contacts, list):
for item in signal_contacts:
if not isinstance(item, dict):
continue
candidate_identifiers = item.get("identifiers")
if not isinstance(candidate_identifiers, list):
candidate_identifiers = [
item.get("identifier"),
item.get("number"),
item.get("uuid"),
]
linked = None
for candidate in candidate_identifiers:
cleaned = str(candidate or "").strip()
if not cleaned:
continue
linked = signal_links.get(cleaned)
if linked is not None:
break
_register_signal_entity(
identifiers_list=candidate_identifiers,
is_group=False,
detected_name=str(item.get("name") or "").strip(),
person=(linked.person if linked else None),
source="signal_runtime",
)
signal_groups = signal_state.get("groups") or []
if isinstance(signal_groups, list):
for item in signal_groups:
if not isinstance(item, dict):
continue
candidate_identifiers = item.get("identifiers")
if not isinstance(candidate_identifiers, list):
candidate_identifiers = [
item.get("identifier"),
item.get("id"),
item.get("internal_id"),
]
_register_signal_entity(
identifiers_list=candidate_identifiers,
is_group=True,
detected_name=str(item.get("name") or "").strip(),
source="signal_group_raw",
)
    signal_chats = Chat.objects.all().order_by("-id")[:500]
    for chat in signal_chats:
        uuid_candidate = str(chat.source_uuid or "").strip()
@@ -2517,20 +2766,45 @@ def _manual_contact_rows(user):
         fallback_linked = signal_links.get(uuid_candidate)
         if fallback_linked is None and number_candidate:
             fallback_linked = signal_links.get(number_candidate)
-        for candidate in (uuid_candidate, number_candidate):
-            if not candidate:
-                continue
-            linked = signal_links.get(candidate) or fallback_linked
-            add_row(
-                service="signal",
-                identifier=candidate,
-                person=(linked.person if linked else None),
-                source="signal_chat",
-                account=str(chat.account or ""),
-                detected_name=_clean_detected_name(
-                    chat.source_name or chat.account or ""
-                ),
-            )
+        linked = fallback_linked
+        if linked is None:
+            for candidate in (uuid_candidate, number_candidate):
+                linked = signal_links.get(candidate)
+                if linked is not None:
+                    break
+        _register_signal_entity(
+            identifiers_list=[uuid_candidate, number_candidate],
+            is_group=False,
+            detected_name=_clean_detected_name(chat.source_name or chat.account or ""),
+            person=(linked.person if linked else None),
+            source="signal_chat",
+        )
+
+    for entity in signal_entities.values():
+        entity_identifiers = list(entity.get("identifiers") or [])
+        identifier_value = _preferred_signal_identifier(
+            entity_identifiers,
+            is_group=bool(entity.get("is_group")),
+        )
+        if not identifier_value:
+            continue
+        add_row(
+            service="signal",
+            identifier=identifier_value,
+            person=entity.get("person"),
+            source=",".join(sorted(entity.get("sources") or {"signal_runtime"})),
+            account=entity.get("detected_name") or "",
+            detected_name=entity.get("detected_name") or "",
+        )
+        if rows:
+            rows[-1]["identifier_aliases"] = [
+                candidate
+                for candidate in entity_identifiers
+                if str(candidate or "").strip() and candidate != identifier_value
+            ]
+            rows[-1]["identifier_search"] = " ".join(
+                [rows[-1]["identifier"]] + rows[-1]["identifier_aliases"]
+            ).strip()

     whatsapp_links = {
         str(row.identifier): row
@@ -3225,8 +3499,11 @@ class ComposeContactMatch(LoginRequiredMixin, View):
         value = str(identifier or "").strip()
         if not value:
             return set()
-        source_shape = _signal_identifier_shape(value)
         companions = set()
+        runtime_aliases = _signal_runtime_alias_map()
+        for variant in _identifier_variants("signal", value):
+            companions.update(runtime_aliases.get(variant) or set())
+        source_shape = _signal_identifier_shape(value)
         signal_rows = Chat.objects.filter(source_uuid=value) | Chat.objects.filter(
             source_number=value
         )
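`_signal_runtime_alias_map` builds a symmetric identifier-to-siblings index so that a lookup by any known alias (uuid, phone number, internal id) finds the rest of the set. The indexing step reduces to this sketch (input shape assumed from the runtime-state rows in the diff):

```python
def build_alias_map(rows: list[dict]) -> dict[str, set[str]]:
    # Each row may carry several identifiers for one contact or group.
    alias_map: dict[str, set[str]] = {}
    for row in rows:
        unique = []
        for value in row.get("identifiers", []):
            cleaned = str(value or "").strip()
            if cleaned and cleaned not in unique:
                unique.append(cleaned)
        if len(unique) < 2:
            continue  # a lone identifier has nothing to cross-reference
        for ident in unique:
            alias_map[ident] = {v for v in unique if v != ident}
    return alias_map
```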

View File

@@ -5,6 +5,7 @@ import requests

 from django.conf import settings
 from django.contrib import messages
 from django.db.models import Q
+from django.http import HttpResponse
 from django.shortcuts import render
 from django.urls import reverse
 from django.views import View
@@ -141,7 +142,12 @@ class SignalAccountUnlink(SuperUserRequiredMixin, View):
             else:
                 messages.error(
                     request,
-                    "Signal relink failed to clear current device state. Try relink again.",
+                    (
+                        "Signal relink could not verify that the current device state was "
+                        "cleared. The account may still be linked in signal-cli-rest-api. "
+                        "Try relink again, and if it still fails, restart the Signal service "
+                        "before requesting a new QR code."
+                    ),
                 )
         else:
             messages.warning(request, "No Signal account selected to relink.")
@@ -324,15 +330,16 @@ class SignalChatsList(SuperUserRequiredMixin, ObjectList):
             group_id = str(link.chat_identifier or "").strip()
             if not group_id:
                 continue
+            query = urlencode({"service": "signal", "identifier": group_id})
             rows.append(
                 {
                     "chat": None,
-                    "compose_page_url": "",
-                    "compose_widget_url": "",
+                    "compose_page_url": f"{reverse('compose_page')}?{query}",
+                    "compose_widget_url": f"{reverse('compose_widget')}?{query}",
                     "ai_url": reverse("ai_workspace"),
                     "person_name": "",
                     "manual_icon_class": "fa-solid fa-users",
-                    "can_compose": False,
+                    "can_compose": True,
                     "match_url": "",
                     "is_group": True,
                     "name": link.chat_name or group_id,
@@ -412,7 +419,21 @@ class SignalAccountAdd(SuperUserRequiredMixin, CustomObjectRead):
     def get_object(self, **kwargs):
         form_args = self.request.POST.dict()
         device_name = form_args["device"]
-        image_bytes = transport.get_link_qr(self.service, device_name)
+        try:
+            image_bytes = transport.get_link_qr(self.service, device_name)
+        except Exception as exc:
+            return render(
+                self.request,
+                "mixins/wm/modal.html",
+                {
+                    "window_content": "mixins/partials/notify.html",
+                    "message": (
+                        "Signal QR link is unavailable right now. "
+                        f"signal-cli-rest-api did not return a QR in time: {exc}"
+                    ),
+                    "class": "danger",
+                },
+            )
         base64_image = transport.image_bytes_to_base64(image_bytes)
         return base64_image
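Wrapping the QR fetch keeps a slow or down signal-cli-rest-api from surfacing as an unhandled 500; the view degrades to an error modal instead. Stripped of the Django plumbing, the guard is this pattern (the transport call is stubbed as a plain callable; names are illustrative):

```python
import base64


def fetch_qr(get_link_qr, device_name: str) -> dict:
    # get_link_qr is assumed to return raw PNG bytes or raise on timeout.
    try:
        image_bytes = get_link_qr(device_name)
    except Exception as exc:
        return {"ok": False, "message": f"QR unavailable: {exc}"}
    return {"ok": True, "image_b64": base64.b64encode(image_bytes).decode("ascii")}
```

Catching broad `Exception` here is deliberate: any transport failure should produce the same user-facing fallback rather than a traceback.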

View File

@@ -40,6 +40,14 @@ from core.models import (
     WorkspaceConversation,
     WorkspaceMetricSnapshot,
 )
+from core.security.capabilities import (
+    CAPABILITY_SCOPES,
+)
+from core.security.capabilities import GLOBAL_SCOPE_KEY as COMMAND_GLOBAL_SCOPE_KEY
+from core.security.capabilities import GROUP_LABELS as CAPABILITY_GROUP_LABELS
+from core.security.capabilities import (
+    scope_record,
+)
 from core.transports.capabilities import capability_snapshot
 from core.views.manage.permissions import SuperUserRequiredMixin
@@ -528,7 +536,7 @@ class SecurityPage(LoginRequiredMixin, View):
     template_name = "pages/security.html"
     page_mode = "encryption"

-    GLOBAL_SCOPE_KEY = "global.override"
+    GLOBAL_SCOPE_KEY = COMMAND_GLOBAL_SCOPE_KEY
     # Allowed Services list used by both Global Scope Override and local scopes.
     # Keep this in sync with the UI text on the Security page.
     POLICY_SERVICES = ["xmpp", "whatsapp", "signal", "instagram", "web"]
@@ -541,47 +549,7 @@ class SecurityPage(LoginRequiredMixin, View):
         "require_omemo",
         "require_trusted_fingerprint",
     )
-    POLICY_SCOPES = [
-        (
-            "gateway.tasks",
-            "Gateway .tasks commands",
-            "Handles .tasks list/show/complete/undo over gateway channels.",
-        ),
-        (
-            "gateway.approval",
-            "Gateway approval commands",
-            "Handles .approval/.codex/.claude approve/deny over gateway channels.",
-        ),
-        (
-            "gateway.totp",
-            "Gateway TOTP enrollment",
-            "Controls TOTP enrollment/status commands over gateway channels.",
-        ),
-        (
-            "tasks.submit",
-            "Task submissions from chat",
-            "Controls automatic task creation from inbound messages.",
-        ),
-        (
-            "tasks.commands",
-            "Task command verbs (.task/.undo/.epic)",
-            "Controls explicit task command verbs.",
-        ),
-        (
-            "command.bp",
-            "Business plan command",
-            "Controls Business Plan command execution.",
-        ),
-        ("command.codex", "Codex command", "Controls Codex command execution."),
-        ("command.claude", "Claude command", "Controls Claude command execution."),
-    ]
-    POLICY_GROUP_LABELS = {
-        "gateway": "Gateway",
-        "tasks": "Tasks",
-        "command": "Commands",
-        "agentic": "Agentic",
-        "other": "Other",
-    }
+    POLICY_GROUP_LABELS = CAPABILITY_GROUP_LABELS
    def _show_encryption(self) -> bool:
        return str(getattr(self, "page_mode", "encryption")).strip().lower() in {
@@ -774,8 +742,10 @@ class SecurityPage(LoginRequiredMixin, View):
             )
         }
         payload = []
-        for scope_key, label, description in self.POLICY_SCOPES:
-            key = str(scope_key or "").strip().lower()
+        for scope in CAPABILITY_SCOPES:
+            if not bool(scope.configurable):
+                continue
+            key = str(scope.key or "").strip().lower()
             item = rows.get(key)
             raw_allowed_services = [
                 str(value or "").strip().lower()
@@ -797,8 +767,8 @@ class SecurityPage(LoginRequiredMixin, View):
             payload.append(
                 {
                     "scope_key": key,
-                    "label": label,
-                    "description": description,
+                    "label": scope.label,
+                    "description": scope.description,
                     "enabled": self._apply_global_override(
                         bool(getattr(item, "enabled", True)),
                         global_overrides["scope_enabled"],
@@ -827,38 +797,20 @@ class SecurityPage(LoginRequiredMixin, View):
         return payload

     def _scope_group_key(self, scope_key: str) -> str:
-        key = str(scope_key or "").strip().lower()
-        if key in {"tasks.commands", "gateway.tasks"}:
-            return "tasks"
-        if key in {"command.codex", "command.claude"}:
-            return "agentic"
-        if key.startswith("gateway."):
-            return "command"
-        if key.startswith("tasks."):
-            if key == "tasks.submit":
-                return "tasks"
-            return "command"
-        if key.startswith("command."):
-            return "command"
-        if ".commands" in key:
-            return "command"
-        if ".approval" in key:
-            return "command"
-        if ".totp" in key:
-            return "command"
-        if ".task" in key:
-            return "tasks"
-        return "other"
+        row = scope_record(scope_key)
+        return row.group if row is not None else "other"

     def _grouped_scope_rows(self, request):
         rows = self._scope_rows(request)
-        grouped: dict[str, list[dict]] = {key: [] for key in self.POLICY_GROUP_LABELS}
+        grouped: dict[str, list[dict]] = {
+            key: [] for key in self.POLICY_GROUP_LABELS.keys()
+        }
         for row in rows:
             group_key = self._scope_group_key(row.get("scope_key"))
             grouped.setdefault(group_key, [])
             grouped[group_key].append(row)
         payload = []
-        for group_key in ("tasks", "command", "agentic", "other"):
+        for group_key in ("gateway", "tasks", "command", "agentic", "other"):
             items = grouped.get(group_key) or []
             if not items:
                 continue
@@ -875,6 +827,10 @@ class SecurityPage(LoginRequiredMixin, View):
         row = self._security_settings(request)
         if str(request.POST.get("encryption_settings_submit") or "").strip() == "1":
             row.require_omemo = _to_bool(request.POST.get("require_omemo"), False)
+            row.encrypt_component_messages_with_omemo = _to_bool(
+                request.POST.get("encrypt_component_messages_with_omemo"),
+                True,
+            )
             row.encrypt_contact_messages_with_omemo = _to_bool(
                 request.POST.get("encrypt_contact_messages_with_omemo"),
                 False,
@@ -882,6 +838,7 @@ class SecurityPage(LoginRequiredMixin, View):
             row.save(
                 update_fields=[
                     "require_omemo",
+                    "encrypt_component_messages_with_omemo",
                     "encrypt_contact_messages_with_omemo",
                     "updated_at",
                 ]

View File

@@ -1,5 +1,6 @@
 from __future__ import annotations

+import datetime
 import hashlib
 import json
 from urllib.parse import urlencode
@@ -40,6 +41,7 @@ from core.tasks.chat_defaults import (
 )
 from core.tasks.codex_approval import queue_codex_event_with_pre_approval
 from core.tasks.codex_support import resolve_external_chat_id
+from core.tasks.engine import create_task_record_and_sync
 from core.tasks.providers import get_provider
@@ -828,6 +830,9 @@ class TasksHub(LoginRequiredMixin, View):
         return {
             "projects": projects,
             "project_choices": all_projects,
+            "epic_choices": TaskEpic.objects.filter(
+                project__user=request.user
+            ).select_related("project").order_by("project__name", "name"),
             "tasks": tasks,
             "scope": scope,
             "person_identifier_rows": person_identifier_rows,
@@ -875,6 +880,60 @@ class TasksHub(LoginRequiredMixin, View):
            return redirect(f"{reverse('tasks_hub')}?{urlencode(query)}")
        return redirect("tasks_hub")
if action == "task_create":
project = get_object_or_404(
TaskProject,
user=request.user,
id=request.POST.get("project_id"),
)
epic = None
epic_id = str(request.POST.get("epic_id") or "").strip()
if epic_id:
epic = get_object_or_404(TaskEpic, id=epic_id, project=project)
title = str(request.POST.get("title") or "").strip()
if not title:
messages.error(request, "Task title is required.")
return redirect("tasks_hub")
scope = self._scope(request)
source_service = str(scope.get("service") or "").strip().lower() or "web"
source_channel = str(scope.get("identifier") or "").strip()
due_raw = str(request.POST.get("due_date") or "").strip()
due_date = None
if due_raw:
try:
due_date = datetime.date.fromisoformat(due_raw)
except Exception:
messages.error(request, "Due date must be YYYY-MM-DD.")
return redirect("tasks_hub")
task, _event = async_to_sync(create_task_record_and_sync)(
user=request.user,
project=project,
epic=epic,
title=title,
source_service=source_service,
source_channel=source_channel,
actor_identifier=str(request.user.username or request.user.id),
due_date=due_date,
assignee_identifier=str(
request.POST.get("assignee_identifier") or ""
).strip(),
immutable_payload={
"source": "tasks_hub_manual_create",
"person_id": scope["person_id"],
"service": source_service,
"identifier": source_channel,
},
event_payload={
"source": "tasks_hub_manual_create",
"via": "web_ui",
},
)
messages.success(
request,
f"Created task #{task.reference_code} in '{project.name}'.",
)
return redirect("tasks_task", task_id=str(task.id))
if action == "project_map_identifier": if action == "project_map_identifier":
project = get_object_or_404( project = get_object_or_404(
TaskProject, TaskProject,
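The manual `task_create` branch validates the due date with `date.fromisoformat` before anything is persisted, so malformed input yields a form error rather than a half-created task. The validation step in isolation:

```python
from __future__ import annotations

import datetime


def parse_due_date(raw: str) -> tuple[datetime.date | None, str]:
    """Return (date, "") on success or (None, error_message) on bad input."""
    cleaned = (raw or "").strip()
    if not cleaned:
        return None, ""  # due date is optional
    try:
        return datetime.date.fromisoformat(cleaned), ""
    except ValueError:
        return None, "Due date must be YYYY-MM-DD."
```

`fromisoformat` only accepts ISO 8601 dates, which is why the error text in the view pins the expected `YYYY-MM-DD` shape.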

View File

@@ -20,13 +20,6 @@ reload-on-as=512
 reload-on-rss=256
 vacuum=1
 home=/venv
-processes=4
+processes=2
 threads=2
-log-level=debug
+log-level=warning
-
-# Autoreload on code changes (graceful reload)
-py-autoreload=1
-# In the container the repository is mounted at /code
-# point autoreload at the actual in-container paths
-py-autoreload-on-edit=/code/core
-py-autoreload-on-edit=/code/app

View File

@@ -114,12 +114,14 @@ run_worker_container() {
     local cmd="$2"
     local with_uwsgi="${3:-0}"
     local with_whatsapp="${4:-0}"
+    local cpu_limit="${5:-0.5}"
     rm_if_exists "$name"
     local args=(
         --replace
         --name "$name"
         --pod "$POD_NAME"
+        --cpus "$cpu_limit"
         --user "$(id -u):$(id -g)"
         --env-file "$STACK_ENV"
         --env "SIGNAL_HTTP_URL=http://127.0.0.1:8080"
@@ -147,6 +149,7 @@ run_oneshot_container() {
         --replace
         --name "$name"
         --pod "$POD_NAME"
+        --cpus "0.5"
         --user "$(id -u):$(id -g)"
         --env-file "$STACK_ENV"
         --env "SIGNAL_HTTP_URL=http://127.0.0.1:8080"
@@ -223,6 +226,7 @@ start_stack() {
         --replace \
         --name "$REDIS_CONTAINER" \
         --pod "$POD_NAME" \
+        --cpus "0.1" \
         -v "$REPO_DIR/docker/redis.conf:/etc/redis.conf:ro" \
         -v "$REDIS_DATA_DIR:/data" \
         -v "$VRUN_DIR:/var/run" \
@@ -233,15 +237,17 @@ start_stack() {
         --replace \
         --name "$SIGNAL_CONTAINER" \
         --pod "$POD_NAME" \
-        -e MODE=json-rpc \
+        --cpus "0.2" \
+        -e MODE=native \
         -v "$ROOT_DIR/signal-cli-config:/home/.local/share/signal-cli" \
-        docker.io/bbernhard/signal-cli-rest-api:latest >/dev/null
+        localhost/gia-signal:latest >/dev/null

     if [[ "$PROSODY_ENABLED" == "true" ]]; then
         podman run -d \
             --replace \
             --name "$PROSODY_CONTAINER" \
             --pod "$POD_NAME" \
+            --cpus "0.2" \
             --env-file "$STACK_ENV" \
             --user "$PROSODY_RUN_USER" \
             --entrypoint prosody \
@@ -259,11 +265,11 @@ start_stack() {
run_oneshot_container "$MIGRATION_CONTAINER" ". /venv/bin/activate && python manage.py migrate --noinput" run_oneshot_container "$MIGRATION_CONTAINER" ". /venv/bin/activate && python manage.py migrate --noinput"
run_oneshot_container "$COLLECTSTATIC_CONTAINER" ". /venv/bin/activate && python manage.py collectstatic --noinput" run_oneshot_container "$COLLECTSTATIC_CONTAINER" ". /venv/bin/activate && python manage.py collectstatic --noinput"
run_worker_container "$APP_CONTAINER" "if [ \"\$OPERATION\" = \"uwsgi\" ] ; then . /venv/bin/activate && uwsgi --ini /conf/uwsgi.ini ; else . /venv/bin/activate && exec python manage.py runserver 0.0.0.0:8000; fi" 1 1 run_worker_container "$APP_CONTAINER" "if [ \"\$OPERATION\" = \"uwsgi\" ] ; then . /venv/bin/activate && uwsgi --ini /conf/uwsgi.ini ; else . /venv/bin/activate && exec python manage.py runserver 0.0.0.0:8000; fi" 1 1 0.5
run_worker_container "$ASGI_CONTAINER" "rm -f /var/run/asgi-gia.sock && . /venv/bin/activate && python -m pip install --disable-pip-version-check -q uvicorn && python -m uvicorn app.asgi:application --uds /var/run/asgi-gia.sock --workers 1" 0 1 run_worker_container "$ASGI_CONTAINER" "rm -f /var/run/asgi-gia.sock && . /venv/bin/activate && python -m pip install --disable-pip-version-check -q uvicorn && python -m uvicorn app.asgi:application --uds /var/run/asgi-gia.sock --workers 1" 0 1 0.3
run_worker_container "$UR_CONTAINER" ". /venv/bin/activate && python manage.py ur" 1 1 run_worker_container "$UR_CONTAINER" ". /venv/bin/activate && python manage.py ur" 1 1 0.5
run_worker_container "$SCHED_CONTAINER" ". /venv/bin/activate && python manage.py scheduling" 1 0 run_worker_container "$SCHED_CONTAINER" ". /venv/bin/activate && python manage.py scheduling" 1 0 0.3
run_worker_container "$CODEX_WORKER_CONTAINER" ". /venv/bin/activate && python manage.py codex_worker" 1 0 run_worker_container "$CODEX_WORKER_CONTAINER" ". /venv/bin/activate && python manage.py codex_worker" 1 0 0.4
} }
render_units() { render_units() {
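The new optional worker argument relies on bash's `${N:-default}` positional-parameter expansion, so existing call sites without a fifth argument keep working. A minimal sketch (the `cpus_for_worker` helper is illustrative, not from the repo):

```shell
# Illustrative only: mirrors the ${5:-0.5} default used for the cpu_limit arg.
cpus_for_worker() {
    # args: name cmd with_uwsgi with_whatsapp [cpu_limit]
    local name="$1"
    local cpu_limit="${5:-0.5}"   # fifth positional arg; falls back to 0.5
    echo "--cpus ${cpu_limit} (${name})"
}

cpus_for_worker app   cmd 1 1 0.3   # explicit limit
cpus_for_worker sched cmd 1 0       # default applies
```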


@@ -186,7 +186,7 @@ Image=docker.io/bbernhard/signal-cli-rest-api:latest
 ContainerName={with_stack('signal')}
 Pod={pod_ref}
 Volume={signal_cli_dir}:/home/.local/share/signal-cli
-Environment=MODE=json-rpc
+Environment=MODE=native
 [Service]
 Restart=always


@@ -35,11 +35,16 @@ XMPP_USER_DOMAIN=example.com
 XMPP_PORT=8888
 # Auto-generated if empty by Prosody startup helpers.
 XMPP_SECRET=
+# XEP-0363 upload service (used by clients + relay attachment upload).
+XMPP_UPLOAD_SERVICE=example.com
+# Optional legacy alias consumed by app fallback:
+# XMPP_UPLOAD_JID=upload.example.com
+XMPP_UPLOAD_BASE_URL=https://share.example.com/file_share
 # Directory for OMEMO key storage. Defaults to <BASE_DIR>/xmpp_omemo_data if unset.
 # XMPP_OMEMO_DATA_DIR=./.podman/gia_xmpp_omemo_data
 # Optional Prosody container storage/config paths used by utilities/prosody/manage_prosody_container.sh
-PROSODY_IMAGE=docker.io/prosody/prosody:latest
+PROSODY_IMAGE=docker.io/prosody/prosody:trunk
 QUADLET_PROSODY_CONFIG_FILE=./utilities/prosody/prosody.cfg.lua
 QUADLET_PROSODY_CERTS_DIR=./.podman/gia_prosody_certs
 QUADLET_PROSODY_DATA_DIR=./.podman/gia_prosody_data


@@ -41,7 +41,7 @@ up() {
     -p "${MANTICORE_SPHINX_PORT}:9312" \
     -v "$MANTICORE_DATA_DIR:/var/lib/manticore" \
     -v "$MANTICORE_LOG_DIR:/var/log/manticore" \
-    -v "$MANTICORE_CONFIG_FILE:/etc/manticoresearch/manticore.conf:ro" \
+    -v "$MANTICORE_CONFIG_FILE:/etc/manticoresearch/manticore.conf" \
     docker.io/manticoresearch/manticore:latest >/dev/null
     echo "Started $MANTICORE_CONTAINER"
 }


@@ -0,0 +1,121 @@
local st = require "prosody.util.stanza";
local jid_bare = require "prosody.util.jid".bare;

local xmlns_privilege = "urn:xmpp:privilege:2";
local xmlns_forward = "urn:xmpp:forward:0";
local xmlns_carbons = "urn:xmpp:carbons:2";

local bare_sessions = prosody.bare_sessions;
local allowed_component_jid = module:get_option_string("privileged_entity_jid");

module:log("info", "mod_privilege loaded for host=%s privileged_entity_jid=%s", module.host or "?", allowed_component_jid or "");

local function iter_sessions(user_sessions)
    if not user_sessions or not user_sessions.sessions then
        return function() return nil end;
    end
    return pairs(user_sessions.sessions);
end

local function send_privilege_advertisement(session)
    if not allowed_component_jid or allowed_component_jid == "" then
        module:log("debug", "Privilege advertisement skipped: no privileged_entity_jid configured");
        return;
    end
    if session.host ~= allowed_component_jid then
        module:log("debug", "Privilege advertisement skipped: session host %s != %s", session.host or "?", allowed_component_jid);
        return;
    end
    local msg = st.message({
        from = module.host;
        to = session.host;
        type = "normal";
    });
    msg:tag("privilege", { xmlns = xmlns_privilege })
        :tag("perm", { access = "message", type = "outgoing" }):up()
    :up();
    session.send(msg);
    module:log("info", "Advertised outgoing message privilege to %s", session.host);
end

local function unwrap_forwarded_message(stanza)
    local privilege = stanza:get_child("privilege", xmlns_privilege);
    if not privilege then
        return nil;
    end
    local forwarded = privilege:get_child("forwarded", xmlns_forward);
    if not forwarded then
        return nil;
    end
    for _, tag in ipairs(forwarded.tags or {}) do
        if tag.name == "message" then
            return tag;
        end
    end
    return nil;
end

local function is_carbon_delivery(inner)
    if not inner then
        return false;
    end
    return inner:get_child("sent", xmlns_carbons) ~= nil
        or inner:get_child("received", xmlns_carbons) ~= nil;
end

local function deliver_carbon_to_user_sessions(bare_jid, inner)
    local user_sessions = bare_sessions[bare_jid];
    if not user_sessions then
        module:log("debug", "Privilege carbon skipped for offline user %s", bare_jid);
        return true;
    end
    local delivered = false;
    for _, session in iter_sessions(user_sessions) do
        if session.full_jid and session.send then
            local copy = st.clone(inner);
            copy.attr.to = session.full_jid;
            session.send(copy);
            delivered = true;
        end
    end
    module:log("debug", "Privilege carbon delivered user=%s delivered=%s", bare_jid, tostring(delivered));
    return true;
end

local function handle_privileged_carbon(event)
    local origin, stanza = event.origin, event.stanza;
    if not origin or origin.type ~= "component" then
        return nil;
    end
    if allowed_component_jid and allowed_component_jid ~= "" and origin.host ~= allowed_component_jid then
        module:log("debug", "Ignoring privileged message from unexpected component %s", origin.host or "?");
        return nil;
    end
    local inner = unwrap_forwarded_message(stanza);
    if not is_carbon_delivery(inner) then
        module:log("debug", "Ignoring privileged message without carbon payload from %s", origin.host or "?");
        return nil;
    end
    local bare_to = jid_bare(inner.attr.to);
    local bare_from = jid_bare(inner.attr.from);
    if not bare_to or bare_to == "" or bare_to ~= bare_from then
        module:log("warn", "Rejected malformed privileged carbon from %s", origin.host or "?");
        return true;
    end
    if bare_to:match("@(.+)$") ~= module.host then
        module:log("debug", "Ignoring privileged carbon for remote host %s", bare_to);
        return nil;
    end
    return deliver_carbon_to_user_sessions(bare_to, inner);
end

module:hook_global("component-authenticated", function(event)
    module:log("info", "component-authenticated for %s", event.session and event.session.host or "?");
    send_privilege_advertisement(event.session);
end);

module:hook("pre-message/host", handle_privileged_carbon, 10);
module:hook("message/host", handle_privileged_carbon, 10);
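For context, the stanza shape this module unwraps looks roughly like the following (JIDs are placeholders; the exact payload depends on the sending component, but the nesting follows XEP-0356 privilege wrapping, XEP-0297 forwarding, and a XEP-0280 carbon marker on the inner message):

```xml
<!-- Privileged carbon as sent by the component to the host. -->
<message from="component.example.com" to="example.com">
  <privilege xmlns="urn:xmpp:privilege:2">
    <forwarded xmlns="urn:xmpp:forward:0">
      <message from="user@example.com" to="user@example.com" type="chat">
        <sent xmlns="urn:xmpp:carbons:2"/>
        <body>Relayed reply</body>
      </message>
    </forwarded>
  </privilege>
</message>
```

Note that `handle_privileged_carbon` requires the inner `from` and `to` to share the same bare JID, which is why both appear as `user@example.com` here.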


@@ -0,0 +1,138 @@
local st = require "prosody.util.stanza";
local jid_bare = require "prosody.util.jid".bare;

local xmlns_privilege = "urn:xmpp:privilege:2";
local xmlns_forward = "urn:xmpp:forward:0";
local xmlns_carbons = "urn:xmpp:carbons:2";

local bare_sessions = prosody.bare_sessions;
local allowed_component_jid = module:get_option_string("privileged_entity_jid");

module:log("info", "mod_privileged_carbons loaded for host=%s privileged_entity_jid=%s", module.host or "?", allowed_component_jid or "");

local function iter_sessions(user_sessions)
    if not user_sessions or not user_sessions.sessions then
        return function() return nil end;
    end
    return pairs(user_sessions.sessions);
end

local function unwrap_forwarded_message(stanza)
    local privilege = stanza:get_child("privilege", xmlns_privilege);
    if not privilege then
        return nil;
    end
    local forwarded = privilege:get_child("forwarded", xmlns_forward);
    if not forwarded then
        return nil;
    end
    for _, tag in ipairs(forwarded.tags or {}) do
        if tag.name == "message" then
            return tag;
        end
    end
    return nil;
end

local function is_carbon_delivery(inner)
    if not inner then
        return false;
    end
    return inner:get_child("sent", xmlns_carbons) ~= nil
        or inner:get_child("received", xmlns_carbons) ~= nil;
end

local function build_sent_carbon(inner, user_bare)
    local function rebuild_stanza(node)
        if type(node) ~= "table" or not node.name then
            return node;
        end
        local attr = {};
        for k, v in pairs(node.attr or {}) do
            attr[k] = v;
        end
        local rebuilt = st.stanza(node.name, attr);
        for _, child in ipairs(node) do
            if type(child) == "table" and child.name then
                rebuilt:add_direct_child(rebuild_stanza(child));
            elseif type(child) == "string" then
                rebuilt:add_direct_child(child);
            end
        end
        return rebuilt;
    end
    local copy = rebuild_stanza(inner);
    local function normalize_client_ns(node)
        if not node then
            return;
        end
        if node.attr then
            if node.attr.xmlns == nil or node.attr.xmlns == "jabber:component:accept" then
                node.attr.xmlns = "jabber:client";
            end
        end
        if node.tags then
            for _, child in ipairs(node.tags) do
                normalize_client_ns(child);
            end
        end
    end
    normalize_client_ns(copy);
    return st.message({ from = user_bare, type = inner.attr.type or "chat", xmlns = "jabber:client" })
        :tag("sent", { xmlns = xmlns_carbons })
            :tag("forwarded", { xmlns = xmlns_forward })
                :add_child(copy):reset();
end

local function deliver_carbon_to_user_sessions(bare_jid, inner)
    local user_sessions = bare_sessions[bare_jid];
    if not user_sessions then
        module:log("debug", "Privileged carbon skipped for offline user %s", bare_jid);
        return true;
    end
    local carbon = build_sent_carbon(inner, bare_jid);
    local delivered = false;
    for _, session in iter_sessions(user_sessions) do
        if session.full_jid and session.send then
            local copy = st.clone(carbon);
            copy.attr.xmlns = "jabber:client";
            copy.attr.to = session.full_jid;
            session.rawsend(tostring(copy));
            delivered = true;
        end
    end
    module:log("info", "Privileged carbon delivered user=%s delivered=%s", bare_jid, tostring(delivered));
    return true;
end

local function handle_privileged_carbon(event)
    local origin, stanza = event.origin, event.stanza;
    if not origin or origin.type ~= "component" then
        return nil;
    end
    if allowed_component_jid and allowed_component_jid ~= "" and origin.host ~= allowed_component_jid then
        module:log("debug", "Ignoring privileged message from unexpected component %s", origin.host or "?");
        return nil;
    end
    local inner = unwrap_forwarded_message(stanza);
    if not inner then
        module:log("debug", "Ignoring privileged message without forwarded payload from %s", origin.host or "?");
        return nil;
    end
    local bare_from = jid_bare(inner.attr.from);
    if not bare_from or bare_from == "" then
        module:log("warn", "Rejected malformed privileged carbon from %s", origin.host or "?");
        return true;
    end
    if bare_from:match("@(.+)$") ~= module.host then
        module:log("debug", "Ignoring privileged carbon for remote host %s", bare_from);
        return nil;
    end
    return deliver_carbon_to_user_sessions(bare_from, inner);
end

module:hook("pre-message/host", handle_privileged_carbon, 10);
module:hook("message/host", handle_privileged_carbon, 10);


@@ -1,5 +1,7 @@
 local env = os.getenv
 local xmpp_secret = env("XMPP_SECRET") or ""
+local xmpp_user_domain = env("XMPP_USER_DOMAIN") or "example.com"
+local xmpp_upload_base_url = env("XMPP_UPLOAD_BASE_URL") or "https://share.example.com/file_share"
 if xmpp_secret == "" then
     error("XMPP_SECRET is required for Prosody component authentication")
@@ -19,6 +21,7 @@ modules_enabled = {
     "saslauth";
     "tls";
     "blocklist";
+    "privileged_carbons";
     "carbons";
     "dialback";
     "limits";
@@ -34,6 +37,7 @@ modules_enabled = {
     "admin_adhoc";
     "announce";
     "http";
+    "http_file_share";
 }
 s2s_secure_auth = true
@@ -66,7 +70,10 @@ http_external_url = "https://share.example.com/"
 VirtualHost "example.com"
     authentication = "gia"
+    http_host = "share.example.com"
     external_auth_command = "/code/utilities/prosody/auth_django.sh"
+    privileged_entity_jid = env("XMPP_JID") or "gia.example.com"
+    http_file_share_size_limit = 104857600
 ssl = {
     key = "/etc/prosody/certs/cert.pem";
     certificate = "/etc/prosody/certs/cert.pem";


@@ -0,0 +1,85 @@
# Custom signal-cli-rest-api image with signal-cli 0.14.1
#
# signal-cli 0.14.1 ships pre-built as two standalone binaries:
# signal-cli-<VER>-Linux-client.tar.gz → single file "signal-cli-client" (JVM)
# signal-cli-<VER>-Linux-native.tar.gz → single file "signal-cli" (native/GraalVM)
#
# We pull the REST API Go binaries from the upstream bbernhard image and
# layer in the 0.14.1 native binary.
ARG SIGNAL_CLI_VERSION=0.14.1
ARG BBERNHARD_TAG=latest
# ── Stage 1: REST API binaries from upstream ───────────────────────────────
FROM docker.io/bbernhard/signal-cli-rest-api:${BBERNHARD_TAG} AS restapi
# ── Stage 2: runtime image ─────────────────────────────────────────────────
FROM ubuntu:noble
ARG SIGNAL_CLI_VERSION
ENV GIN_MODE=release
ENV PORT=8080
ENV SIGNAL_CLI_CONFIG_DIR=/home/.local/share/signal-cli
ENV SIGNAL_CLI_UID=1000
ENV SIGNAL_CLI_GID=1000
ENV SIGNAL_CLI_CHOWN_ON_STARTUP=true
ENV SIGNAL_CLI_REST_API_PLUGIN_SHARED_OBJ_DIR=/usr/bin/
ENV LANG=en_US.UTF-8
# Runtime deps (openjdk-21 for JVM fallback, supervisor for json-rpc mode)
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
util-linux supervisor openjdk-21-jre wget curl locales \
&& sed -i 's/# en_US.UTF-8 UTF-8/en_US.UTF-8 UTF-8/' /etc/locale.gen \
&& dpkg-reconfigure --frontend=noninteractive locales \
&& update-locale LANG=en_US.UTF-8 \
&& rm -rf /var/lib/apt/lists/*
# Copy REST API binaries from upstream image
COPY --from=restapi /usr/bin/signal-cli-rest-api /usr/bin/signal-cli-rest-api
COPY --from=restapi /usr/bin/jsonrpc2-helper /usr/bin/jsonrpc2-helper
COPY --from=restapi /usr/bin/signal-cli-rest-api_plugin_loader.so /usr/bin/signal-cli-rest-api_plugin_loader.so
COPY --from=restapi /entrypoint.sh /entrypoint.sh
COPY --from=restapi /etc/supervisor /etc/supervisor
# Download signal-cli 0.14.1 binaries.
# The tarballs each contain a single file (no subdirectory).
# native tarball → file named "signal-cli"
# client tarball → file named "signal-cli-client"
RUN mkdir -p /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin \
\
# Native binary (GraalVM compiled, no JVM needed)
&& wget -q "https://github.com/AsamK/signal-cli/releases/download/v${SIGNAL_CLI_VERSION}/signal-cli-${SIGNAL_CLI_VERSION}-Linux-native.tar.gz" \
-O /opt/signal-cli-native.tar.gz \
&& tar xzf /opt/signal-cli-native.tar.gz -C /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/ \
&& mv /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli \
/opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli-native \
&& chmod +x /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli-native \
&& rm /opt/signal-cli-native.tar.gz \
\
# JVM client (used when MODE != native, and as "signal-cli" wrapper)
&& wget -q "https://github.com/AsamK/signal-cli/releases/download/v${SIGNAL_CLI_VERSION}/signal-cli-${SIGNAL_CLI_VERSION}-Linux-client.tar.gz" \
-O /opt/signal-cli-client.tar.gz \
&& tar xzf /opt/signal-cli-client.tar.gz -C /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/ \
&& mv /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli-client \
/opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli \
&& chmod +x /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli \
&& rm /opt/signal-cli-client.tar.gz \
\
# Symlinks to /usr/bin (expected by signal-cli-rest-api)
&& ln -sf /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli /usr/bin/signal-cli \
&& ln -sf /opt/signal-cli-${SIGNAL_CLI_VERSION}/bin/signal-cli-native /usr/bin/signal-cli-native
# User + directories (mirror upstream; remove default ubuntu user first)
RUN userdel ubuntu -r 2>/dev/null || true \
&& groupadd -g 1000 signal-api \
&& useradd --no-log-init -M -d /home -s /bin/bash -u 1000 -g 1000 signal-api \
&& mkdir -p /home/.local/share/signal-cli /signal-cli-config
EXPOSE ${PORT}
ENTRYPOINT ["/entrypoint.sh"]
HEALTHCHECK --interval=20s --timeout=10s --retries=3 \
CMD curl -f http://localhost:${PORT}/v1/health || exit 1


@@ -33,12 +33,12 @@ Designed for **HTMX**, **Gridstack.js**, and **Django ORM** with built-in **acce
 Add to your `requirements.txt`:
 ```shell
-git+https://git.zm.is/XF/django-crud-mixins
+git+https://git.example.invalid/vendor/django-crud-mixins
 ```
 Or install via pip:
 ```shell
-pip install git+https://git.zm.is/XF/django-crud-mixins
+pip install git+https://git.example.invalid/vendor/django-crud-mixins
 ```
 ## 🔧 Usage


@@ -2,8 +2,8 @@
 name = django-crud-mixins
 version = 1.0.3
 author = Mark Veidemanis
-author_email = m@zm.is
-url = https://git.zm.is/XF/django-crud-mixins
+author_email = maintainer@example.test
+url = https://git.example.invalid/vendor/django-crud-mixins
 description = CRUD mixins for Django class-based views.
 long_description = file: README.md
 long_description_content_type = text/markdown
@@ -25,4 +25,4 @@ install_requires =
 [options.package_data]
 mixins = templates/mixins/*, README.md
 * = README.md