Compare commits


77 Commits

SHA1 Message Date
af5c212450 Ingest when there are no matches, add extra validation for send empty 2023-02-09 07:20:07 +00:00
2a034a16e7 Allow disabling notifications 2023-02-09 07:20:07 +00:00
c356f58d8a Add the time taken even where there are no hits 2023-02-09 07:20:28 +00:00
6a890723d9 Add index rule directive to example settings 2023-02-09 07:20:28 +00:00
f0455984ef Create sync versions of pathway to ingest messages to work around sync-only Django management commands 2023-02-09 07:20:45 +00:00
1b1dbbc76c Add mode to stored rules output 2023-02-08 18:26:40 +00:00
7e78c2857e Hide index field for rule searches 2023-02-08 17:46:43 +00:00
1eea401657 Reformat headers for extra rule fields 2023-02-08 16:35:23 +00:00
81c8e34211 Make notification rules queryable 2023-02-02 20:41:19 +00:00
df1e82c5f2 Ingest notification matches to ES 2023-02-02 20:04:55 +00:00
79b4512546 Make notification rule ID field UUID and fix default sources 2023-02-02 19:35:27 +00:00
97e932cbae Add more fine-grained permissions to rules 2023-02-02 19:08:10 +00:00
0cbd2d8a6f Check hash of message when determining if it is a new match 2023-02-02 18:58:40 +00:00
66596cda42 Add total hits to output 2023-02-01 07:20:24 +00:00
53cb9a7f76 When annotating results, don't send empty queries to Threshold 2023-02-01 07:20:31 +00:00
eb7ff88c15 Fix context view for certain mtypes 2023-02-01 07:20:31 +00:00
2153054cac Fix call to get_users 2023-02-01 07:20:15 +00:00
7b05e48d71 Make meta optional 2023-01-16 07:20:37 +00:00
4aa8e67e11 Make metadata return from search more flexible 2023-01-16 07:20:37 +00:00
2eb090f088 Use generic meta variable for returning more data about the search 2023-01-16 16:44:55 +00:00
bea84ee2b6 Set maximum amount 2023-01-16 07:20:37 +00:00
3d6c8d618f Document rule system 2023-01-16 07:20:37 +00:00
ef9734a34d Initialise ES client only on first search 2023-01-16 07:20:37 +00:00
c08ecc036f Check if date range is equal to 2023-01-16 07:20:37 +00:00
1964ab62ec Check the specified window 2023-01-16 01:17:19 +00:00
742a2f92da Prepopulate the match field 2023-01-16 00:29:54 +00:00
ddffc2c3d8 Store index and source as lists 2023-01-16 00:23:23 +00:00
f5e371bf5c Store dict in match field if empty 2023-01-16 00:20:40 +00:00
9de1787de6 Only look for ondemand rules in processing 2023-01-16 00:16:08 +00:00
a2207bbcf4 Support sending messages when a rule no longer matches and fix dual-use notification sender 2023-01-16 00:10:41 +00:00
75603570ff Finish implementing webhook delivery 2023-01-15 23:02:13 +00:00
2dd9efcc6f Fix window/interval validation and make aggs optional in parse_results 2023-01-15 20:27:19 +00:00
eb2486afba Allow using webhooks for notifications 2023-01-15 18:40:17 +00:00
46c7d96310 Allow clearing matches 2023-01-15 18:39:57 +00:00
6bfa0aa73b Implement running scheduled rules and check aggregations 2023-01-15 17:59:12 +00:00
435d9b5571 Implement running scheduled tasks 2023-01-14 17:24:54 +00:00
2a1e6b3292 Allow scheduling notification rules 2023-01-14 16:36:22 +00:00
9ee9c7abde Fix insights 2023-01-14 16:36:00 +00:00
dbf581245b Validate interval and window fields in form 2023-01-14 14:45:19 +00:00
fbe5607899 Add interval and window fields to NotificationRule 2023-01-14 14:36:46 +00:00
158fffed99 Show which fields matched 2023-01-13 07:20:31 +00:00
dd4b2ddd3a Log NTFY errors 2023-01-12 19:00:06 +00:00
092d4c64ff Don't show None to the user if no topic is set 2023-01-12 07:20:48 +00:00
9aacc2cc51 Lowercase msg before matching 2023-01-12 07:20:48 +00:00
031995d4b9 Allow partial matching on msg field 2023-01-12 07:20:48 +00:00
4f55ffeaf7 Allow overriding topic 2023-01-12 07:20:48 +00:00
0b840d227b Add priority to notification rules 2023-01-12 07:20:48 +00:00
e01aea7712 Properly check tokens in notification rules 2023-01-12 07:20:48 +00:00
b68d7606f8 Clean up debug statements 2023-01-12 07:20:48 +00:00
37789a1f18 Add local settings to processing 2023-01-12 07:20:48 +00:00
4dd8224a77 Finish implementing notification rules 2023-01-12 07:20:48 +00:00
f93d37d1c0 Implement notification rules and settings 2023-01-12 07:20:43 +00:00
a70bc16d22 Add CRUD and form helpers 2023-01-11 21:04:54 +00:00
a6b385c8bf Change default query string operator to and 2023-01-01 22:33:17 +00:00
e40b943a01 Add tracker 2022-12-09 17:06:27 +00:00
0a132c6e3a Fix deduplication function 2022-12-09 07:20:59 +00:00
bd8b995134 Fix dedup 2022-12-09 07:20:28 +00:00
5fdd5121eb Make buttons lighter 2022-12-08 07:20:46 +00:00
11f6d676f5 Change color of some buttons 2022-12-08 07:20:46 +00:00
78b28b3994 Fix partial swaps on table 2022-12-02 07:20:37 +00:00
32aa93a28e Remove old code 2022-11-10 07:20:20 +00:00
1b2a02b5ab Fix HTMX target for search results table 2022-11-10 07:20:20 +00:00
f1a68f92a0 Also load results pane with errors on load 2022-12-01 07:20:35 +00:00
ac3a57a2e8 Begin implementing smarter WM system for multi-type objects 2022-11-29 07:20:39 +00:00
fd4cecee05 Switch to UWSGI and improve Docker definitions 2022-11-29 16:06:44 +00:00
23b35da282 Add perms for all the indexes 2022-11-22 07:20:37 +00:00
ffc1aaa030 Mutate the response when reversing 2022-11-23 18:52:48 +00:00
1bdd332e6e Fix annotating results and remove debugging code 2022-11-23 18:39:36 +00:00
c49c8898eb Fix online info 2022-11-23 18:33:09 +00:00
0fd004ca7d Finish reimplementing Elasticsearch 2022-11-23 18:15:42 +00:00
7008c365a6 Begin modernising Docker files 2022-11-22 21:53:21 +00:00
39ae1203be Begin refactoring Elastic backend to use helper functions 2022-11-21 19:43:23 +00:00
61f93390d9 Replace OpenSearch with Elasticsearch 2022-11-21 07:20:29 +00:00
7702e04286 Rename elastic and update settings file 2022-11-21 19:20:02 +00:00
b6ea714d59 Add ripsecrets pre-commit hook 2022-11-21 19:19:44 +00:00
2933360560 Remove duplicate mtype column 2022-10-21 07:20:30 +01:00
987ba6ed3c Change druid URL 2022-10-04 21:47:37 +01:00
85 changed files with 4189 additions and 1850 deletions

.gitignore

@@ -154,4 +154,5 @@ cython_debug/
 .idea/
 .bash_history
+.python_history
 .vscode/


@@ -24,8 +24,7 @@ repos:
       exclude: ^core/static/css  # slow
     - id: djjs
       exclude: ^core/static/js  # slow
-  # - repo: https://github.com/thibaudcolas/curlylint
-  #   rev: v0.13.1
-  #   hooks:
-  #     - id: curlylint
-  #       files: \.(html|sls)$
+  - repo: https://github.com/sirwart/ripsecrets.git
+    rev: v0.1.5
+    hooks:
+    - id: ripsecrets

Dockerfile (new file)

@@ -0,0 +1,28 @@
# syntax=docker/dockerfile:1
FROM python:3
ARG OPERATION
RUN useradd -d /code pathogen
RUN mkdir -p /code
RUN chown -R pathogen:pathogen /code
RUN mkdir -p /conf/static
RUN chown -R pathogen:pathogen /conf
RUN mkdir /venv
RUN chown pathogen:pathogen /venv
USER pathogen
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /code
COPY requirements.txt /code/
RUN python -m venv /venv
RUN . /venv/bin/activate && pip install -r requirements.txt
# CMD . /venv/bin/activate && uwsgi --ini /conf/uwsgi.ini
CMD if [ "$OPERATION" = "uwsgi" ] ; then . /venv/bin/activate && uwsgi --ini /conf/uwsgi.ini ; else . /venv/bin/activate && exec python manage.py runserver 0.0.0.0:8000; fi
# CMD . /venv/bin/activate && uvicorn --reload --reload-include *.html --workers 2 --uds /var/run/socks/app.sock app.asgi:application
# CMD . /venv/bin/activate && gunicorn -b 0.0.0.0:8000 --reload app.asgi:application -k uvicorn.workers.UvicornWorker

Makefile (new file)

@@ -0,0 +1,20 @@
run:
	docker-compose --env-file=stack.env up -d

build:
	docker-compose --env-file=stack.env build

stop:
	docker-compose --env-file=stack.env down

log:
	docker-compose --env-file=stack.env logs -f

migrate:
	docker-compose --env-file=stack.env run --rm app sh -c ". /venv/bin/activate && python manage.py migrate"

makemigrations:
	docker-compose --env-file=stack.env run --rm app sh -c ". /venv/bin/activate && python manage.py makemigrations"

auth:
	docker-compose --env-file=stack.env run --rm app sh -c ". /venv/bin/activate && python manage.py createsuperuser"


@@ -1,40 +1,38 @@
-# OpenSearch settings
-OPENSEARCH_URL = "127.0.0.1"
-OPENSEARCH_PORT = 9200
-OPENSEARCH_TLS = True
-OPENSEARCH_USERNAME = "admin"
-OPENSEARCH_PASSWORD = ""
-OPENSEARCH_INDEX_MAIN = "pathogen-main"
-OPENSEARCH_INDEX_META = "pathogen-meta"
-OPENSEARCH_INDEX_INT = "pathogen-int"
-OPENSEARCH_MAIN_SIZES = ["20", "50", "100", "200", "400", "800"]
-OPENSEARCH_MAIN_SIZES_ANON = ["20", "50", "100"]
-OPENSEARCH_MAIN_SOURCES = ["dis", "4ch", "all"]
-OPENSEARCH_SOURCES_RESTRICTED = ["irc"]
+# Elasticsearch settings
+ELASTICSEARCH_URL = "10.1.0.1"
+ELASTICSEARCH_PORT = 9200
+ELASTICSEARCH_TLS = True
+ELASTICSEARCH_USERNAME = "admin"
+ELASTICSEARCH_PASSWORD = "secret"

 # Manticore settings
-MANTICORE_URL = "http://monolith-db-1:9308"
-MANTICORE_INDEX_MAIN = "main"
-MANTICORE_INDEX_META = "meta"
-MANTICORE_INDEX_INT = "internal"
-MANTICORE_MAIN_SIZES = ["20", "50", "100", "200", "400", "800"]
-MANTICORE_MAIN_SIZES_ANON = ["20", "50", "100"]
-MANTICORE_MAIN_SOURCES = ["dis", "4ch", "all"]
-MANTICORE_SOURCES_RESTRICTED = ["irc"]
-MANTICORE_CACHE = True
-MANTICORE_CACHE_TIMEOUT = 60
+MANTICORE_URL = "http://example-db-1:9308"
+
+DB_BACKEND = "ELASTICSEARCH"
+
+# Common DB settings
+INDEX_MAIN = "main"
+INDEX_RESTRICTED = "restricted"
+INDEX_META = "meta"
+INDEX_INT = "internal"
+INDEX_RULE_STORAGE = "rule_storage"
+MAIN_SIZES = ["1", "5", "15", "30", "50", "100", "250", "500", "1000"]
+MAIN_SIZES_ANON = ["1", "5", "15", "30", "50", "100"]
+MAIN_SOURCES = ["dis", "4ch", "all"]
+SOURCES_RESTRICTED = ["irc"]
+CACHE = False
+CACHE_TIMEOUT = 2

 DRILLDOWN_RESULTS_PER_PAGE = 15
 DRILLDOWN_DEFAULT_PARAMS = {
-    "size": "20",
+    "size": "15",
     "index": "main",
     "sorting": "desc",
     "source": "4ch",
 }

 # Encryption
 # ENCRYPTION = False
 # ENCRYPTION_KEY = b""
@@ -61,7 +59,7 @@ DRILLDOWN_DEFAULT_PARAMS = {
 # # Delay results by this many days
 # DELAY_DURATION = 10

-OPENSEARCH_BLACKLISTED = {}
+ELASTICSEARCH_BLACKLISTED = {}

 # URLs\
@@ -89,8 +87,8 @@ SECRET_KEY = "a"
 STRIPE_ADMIN_COUPON = ""

 # Threshold
-THRESHOLD_ENDPOINT = "http://threshold-app-1:13869"
-THRESHOLD_API_KEY = ""
+THRESHOLD_ENDPOINT = "http://threshold:13869"
+THRESHOLD_API_KEY = "api_1"
 THRESHOLD_API_TOKEN = ""
 THRESHOLD_API_COUNTER = ""
@@ -106,12 +104,3 @@ META_QUERY_SIZE = 10000
 DEBUG = True
 PROFILER = False
-
-if DEBUG:
-    import socket  # only if you haven't already imported this
-
-    hostname, _, ips = socket.gethostbyname_ex(socket.gethostname())
-    INTERNAL_IPS = [ip[: ip.rfind(".")] + ".1" for ip in ips] + [
-        "127.0.0.1",
-        "10.0.2.2",
-    ]


@@ -19,8 +19,9 @@ from django.contrib import admin
 from django.urls import include, path
 from django.views.generic import TemplateView

+# Notification settings and rules
 # Threshold API stuff
-from core.views import About, Billing, Cancel, Order, Portal, Signup
+from core.views import About, Billing, Cancel, Order, Portal, Signup, notifications
 from core.views.callbacks import Callback
 from core.views.manage.threshold.irc import (
     ThresholdIRCNetworkList,  # Actions and just get list output
@@ -63,20 +64,20 @@ from core.views.ui.drilldown import (  # DrilldownTableView,; Drilldown,
     DrilldownTableView,
     ThresholdInfoModal,
 )
-
-# from core.views.ui.insights import (
-#     Insights,
-#     InsightsChannels,
-#     InsightsInfoModal,
-#     InsightsMeta,
-#     InsightsNicks,
-#     InsightsSearch,
-# )
+from core.views.ui.insights import (
+    Insights,
+    InsightsChannels,
+    InsightsInfoModal,
+    InsightsMeta,
+    InsightsNicks,
+    InsightsSearch,
+)

 urlpatterns = [
     path("__debug__/", include("debug_toolbar.urls")),
     path("", DrilldownTableView.as_view(), name="home"),
     path("search/", DrilldownTableView.as_view(), name="search"),
+    path("search/partial/", DrilldownTableView.as_view(), name="search_partial"),
     path("about/", About.as_view(), name="about"),
     path("callback", Callback.as_view(), name="callback"),
     path("billing/", Billing.as_view(), name="billing"),
@@ -101,12 +102,32 @@ urlpatterns = [
     path("context/", DrilldownContextModal.as_view(), name="modal_context"),
     path("context_table/", DrilldownContextModal.as_view(), name="modal_context_table"),
     ##
-    # path("ui/insights/", Insights.as_view(), name="insights"),
-    # path("ui/insights/search/", InsightsSearch.as_view(), name="search_insights"),
-    # path("ui/insights/channels/", InsightsChannels.as_view(), name="chans_insights"),
-    # path("ui/insights/nicks/", InsightsNicks.as_view(), name="nicks_insights"),
-    # path("ui/insights/meta/", InsightsMeta.as_view(), name="meta_insights"),
-    # path("ui/insights/modal/", InsightsInfoModal.as_view(), name="modal_insights"),
+    path("ui/insights/index/<str:index>/", Insights.as_view(), name="insights"),
+    path(
+        "ui/insights/index/<str:index>/search/",
+        InsightsSearch.as_view(),
+        name="search_insights",
+    ),
+    path(
+        "ui/insights/index/<str:index>/channels/",
+        InsightsChannels.as_view(),
+        name="chans_insights",
+    ),
+    path(
+        "ui/insights/index/<str:index>/nicks/",
+        InsightsNicks.as_view(),
+        name="nicks_insights",
+    ),
+    path(
+        "ui/insights/index/<str:index>/meta/",
+        InsightsMeta.as_view(),
+        name="meta_insights",
+    ),
+    path(
+        "ui/insights/index/<str:index>/modal/",
+        InsightsInfoModal.as_view(),
+        name="modal_insights",
+    ),
     ##
     path(
         "manage/threshold/irc/overview/",
@@ -260,4 +281,34 @@ urlpatterns = [
         name="threshold_irc_msg",
     ),
     ##
+    path(
+        "notifications/<str:type>/update/",
+        notifications.NotificationsUpdate.as_view(),
+        name="notifications_update",
+    ),
+    path(
+        "rules/<str:type>/",
+        notifications.RuleList.as_view(),
+        name="rules",
+    ),
+    path(
+        "rule/<str:type>/create/",
+        notifications.RuleCreate.as_view(),
+        name="rule_create",
+    ),
+    path(
+        "rule/<str:type>/update/<str:pk>/",
+        notifications.RuleUpdate.as_view(),
+        name="rule_update",
+    ),
+    path(
+        "rule/<str:type>/delete/<str:pk>/",
+        notifications.RuleDelete.as_view(),
+        name="rule_delete",
+    ),
+    path(
+        "rule/<str:type>/clear/<str:pk>/",
+        notifications.RuleClear.as_view(),
+        name="rule_clear",
+    ),
 ] + static(settings.STATIC_URL, document_root=settings.STATIC_ROOT)


@@ -1,8 +1,13 @@
+import os
+
 import stripe
 from django.conf import settings
 from redis import StrictRedis

-r = StrictRedis(unix_socket_path="/var/run/redis/redis.sock", db=0)
+os.environ["DJANGO_ALLOW_ASYNC_UNSAFE"] = "true"
+
+r = StrictRedis(unix_socket_path="/var/run/socks/redis.sock", db=0)

 if settings.STRIPE_TEST:
     stripe.api_key = settings.STRIPE_API_KEY_TEST


@@ -1,7 +1,7 @@
 import random
 import string
 import time
-from datetime import datetime
+from abc import ABC, abstractmethod
 from math import floor, log10

 import orjson
@@ -11,19 +11,55 @@ from siphashc import siphash

 from core import r
 from core.db.processing import annotate_results
 from core.util import logs
-from core.views import helpers


-class StorageBackend(object):
+def remove_defaults(query_params):
+    for field, value in list(query_params.items()):
+        if field in settings.DRILLDOWN_DEFAULT_PARAMS:
+            if value == settings.DRILLDOWN_DEFAULT_PARAMS[field]:
+                del query_params[field]
+
+
+def add_defaults(query_params):
+    for field, value in settings.DRILLDOWN_DEFAULT_PARAMS.items():
+        if field not in query_params:
+            query_params[field] = value
+
+
+def dedup_list(data, check_keys):
+    """
+    Remove duplicate dictionaries from list.
+    """
+    seen = set()
+    out = []
+    dup_count = 0
+    for x in data:
+        dedupeKey = tuple(x[k] for k in check_keys if k in x)
+        if dedupeKey in seen:
+            dup_count += 1
+            continue
+        if dup_count > 0:
+            out.append({"type": "control", "hidden": dup_count})
+            dup_count = 0
+        out.append(x)
+        seen.add(dedupeKey)
+    if dup_count > 0:
+        out.append({"type": "control", "hidden": dup_count})
+    return out
+
+
+class StorageBackend(ABC):
     def __init__(self, name):
         self.log = logs.get_logger(name)
         self.log.info(f"Initialising storage backend {name}")
         self.initialise_caching()
-        self.initialise()
+        # self.initialise()

+    @abstractmethod
     def initialise(self, **kwargs):
-        raise NotImplementedError
+        pass

     def initialise_caching(self):
         hash_key = r.get("cache_hash_key")
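The dedup_list helper added above replaces runs of already-seen rows with control entries recording how many were hidden. A standalone illustration (function copied from the diff, sample data invented):

```python
def dedup_list(data, check_keys):
    """Remove duplicate dictionaries from list, leaving control markers."""
    seen = set()
    out = []
    dup_count = 0
    for x in data:
        # Build the identity of a row from the configured fields.
        dedupe_key = tuple(x[k] for k in check_keys if k in x)
        if dedupe_key in seen:
            dup_count += 1
            continue
        if dup_count > 0:
            # Emit a control row summarising the hidden duplicates.
            out.append({"type": "control", "hidden": dup_count})
            dup_count = 0
        out.append(x)
        seen.add(dedupe_key)
    if dup_count > 0:
        out.append({"type": "control", "hidden": dup_count})
    return out


rows = [
    {"msg": "hi", "nick": "a"},
    {"msg": "hi", "nick": "a"},  # duplicate, will be hidden
    {"msg": "yo", "nick": "b"},
]
deduped = dedup_list(rows, ["msg", "nick"])
```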
@@ -37,86 +73,36 @@ class StorageBackend(object):
             self.log.debug(f"Decoded hash key: {hash_key}")
         self.hash_key = hash_key

+    @abstractmethod
     def construct_query(self, **kwargs):
-        raise NotImplementedError
+        pass

-    def run_query(self, **kwargs):
-        raise NotImplementedError
-
-    def parse_size(self, query_params, sizes):
-        if "size" in query_params:
-            size = query_params["size"]
-            if size not in sizes:
-                message = "Size is not permitted"
-                message_class = "danger"
-                return {"message": message, "class": message_class}
-            size = int(size)
-        else:
-            size = 15
-        return size
-
-    def parse_index(self, user, query_params):
-        if "index" in query_params:
-            index = query_params["index"]
-            if index == "main":
-                index = settings.INDEX_MAIN
-            else:
-                if not user.has_perm(f"core.index_{index}"):
-                    message = "Not permitted to search by this index"
-                    message_class = "danger"
-                    return {
-                        "message": message,
-                        "class": message_class,
-                    }
-                if index == "meta":
-                    index = settings.INDEX_META
-                elif index == "internal":
-                    index = settings.INDEX_INT
-                elif index == "restricted":
-                    if not user.has_perm("core.restricted_sources"):
-                        message = "Not permitted to search by this index"
-                        message_class = "danger"
-                        return {
-                            "message": message,
-                            "class": message_class,
-                        }
-                    index = settings.INDEX_RESTRICTED
-                else:
-                    message = "Index is not valid."
-                    message_class = "danger"
-                    return {
-                        "message": message,
-                        "class": message_class,
-                    }
-        else:
-            index = settings.INDEX_MAIN
-        return index
-
-    def parse_query(self, query_params, tags, size, index, custom_query, add_bool):
+    def parse_query(self, query_params, tags, size, custom_query, add_bool, **kwargs):
         query_created = False
         if "query" in query_params:
             query = query_params["query"]
-            search_query = self.construct_query(query, size, index)
+            search_query = self.construct_query(query, size, **kwargs)
             query_created = True
         else:
             if custom_query:
                 search_query = custom_query
             else:
-                search_query = self.construct_query(None, size, index, blank=True)
+                search_query = self.construct_query(None, size, blank=True, **kwargs)

         if tags:
             # Get a blank search query
             if not query_created:
-                search_query = self.construct_query(None, size, index, blank=True)
+                search_query = self.construct_query(None, size, blank=True, **kwargs)
                 query_created = True
             for item in tags:
                 for tagname, tagvalue in item.items():
                     add_bool.append({tagname: tagvalue})

-        valid = self.check_valid_query(query_params, custom_query)
-        if isinstance(valid, dict):
-            return valid
+        bypass_check = kwargs.get("bypass_check", False)
+        if not bypass_check:
+            valid = self.check_valid_query(query_params, custom_query, **kwargs)
+            if isinstance(valid, dict):
+                return valid

         return search_query
@@ -128,80 +114,9 @@ class StorageBackend(object):
             message_class = "warning"
             return {"message": message, "class": message_class}

-    def parse_source(self, user, query_params):
-        if "source" in query_params:
-            source = query_params["source"]
-            if source in settings.SOURCES_RESTRICTED:
-                if not user.has_perm("core.restricted_sources"):
-                    message = "Access denied"
-                    message_class = "danger"
-                    return {"message": message, "class": message_class}
-            elif source not in settings.MAIN_SOURCES:
-                message = "Invalid source"
-                message_class = "danger"
-                return {"message": message, "class": message_class}
-
-            if source == "all":
-                source = None  # the next block will populate it
-
-        if source:
-            sources = [source]
-        else:
-            sources = list(settings.MAIN_SOURCES)
-            if user.has_perm("core.restricted_sources"):
-                for source_iter in settings.SOURCES_RESTRICTED:
-                    sources.append(source_iter)
-
-        if "all" in sources:
-            sources.remove("all")
-        return sources
-
-    def parse_sort(self, query_params):
-        sort = None
-        if "sorting" in query_params:
-            sorting = query_params["sorting"]
-            if sorting not in ("asc", "desc", "none"):
-                message = "Invalid sort"
-                message_class = "danger"
-                return {"message": message, "class": message_class}
-            if sorting == "asc":
-                sort = "ascending"
-            elif sorting == "desc":
-                sort = "descending"
-        return sort
-
-    def parse_date_time(self, query_params):
-        if set({"from_date", "to_date", "from_time", "to_time"}).issubset(
-            query_params.keys()
-        ):
-            from_ts = f"{query_params['from_date']}T{query_params['from_time']}Z"
-            to_ts = f"{query_params['to_date']}T{query_params['to_time']}Z"
-            from_ts = datetime.strptime(from_ts, "%Y-%m-%dT%H:%MZ")
-            to_ts = datetime.strptime(to_ts, "%Y-%m-%dT%H:%MZ")
-            return (from_ts, to_ts)
-        return (None, None)
-
-    def parse_sentiment(self, query_params):
-        sentiment = None
-        if "check_sentiment" in query_params:
-            if "sentiment_method" not in query_params:
-                message = "No sentiment method"
-                message_class = "danger"
-                return {"message": message, "class": message_class}
-            if "sentiment" in query_params:
-                sentiment = query_params["sentiment"]
-                try:
-                    sentiment = float(sentiment)
-                except ValueError:
-                    message = "Sentiment is not a float"
-                    message_class = "danger"
-                    return {"message": message, "class": message_class}
-            sentiment_method = query_params["sentiment_method"]
-        return (sentiment_method, sentiment)
+    @abstractmethod
+    def run_query(self, **kwargs):
+        pass

     def filter_blacklisted(self, user, response):
         """
@@ -217,7 +132,7 @@ class StorageBackend(object):
         # For every hit from ES
         for index, item in enumerate(list(response["hits"]["hits"])):
             # For every blacklisted type
-            for blacklisted_type in settings.OPENSEARCH_BLACKLISTED.keys():
+            for blacklisted_type in settings.ELASTICSEARCH_BLACKLISTED.keys():
                 # Check this field we are matching exists
                 if "_source" in item.keys():
                     data_index = "_source"
@@ -228,9 +143,7 @@ class StorageBackend(object):
                 if blacklisted_type in item[data_index].keys():
                     content = item[data_index][blacklisted_type]
                     # For every item in the blacklisted array for the type
-                    for blacklisted_item in settings.OPENSEARCH_BLACKLISTED[
-                        blacklisted_type
-                    ]:
+                    for blacklisted_item in settings.BLACKLISTED[blacklisted_type]:
                         if blacklisted_item == str(content):
                             # Remove the item
                             if item in response["hits"]["hits"]:
@@ -255,7 +168,7 @@ class StorageBackend(object):
         # Actually get rid of all the things we set to None
         response["hits"]["hits"] = [hit for hit in response["hits"]["hits"] if hit]

-    def query(self, user, search_query):
+    def query(self, user, search_query, **kwargs):
         # For time tracking
         start = time.process_time()
         if settings.CACHE:
@@ -265,8 +178,6 @@ class StorageBackend(object):
             cache_hit = r.get(f"query_cache.{user.id}.{hash}")
             if cache_hit:
                 response = orjson.loads(cache_hit)
-                print("CACHE HIT", response)
-
                 time_took = (time.process_time() - start) * 1000
                 # Round to 3 significant figures
                 time_took_rounded = round(
@@ -277,7 +188,31 @@ class StorageBackend(object):
                     "took": time_took_rounded,
                     "cache": True,
                 }
-        response = self.run_query(user, search_query)
+        response = self.run_query(user, search_query, **kwargs)
+
+        # For Elasticsearch
+        if isinstance(response, Exception):
+            message = f"Error: {response.info['error']['root_cause'][0]['type']}"
+            message_class = "danger"
+            return {"message": message, "class": message_class}
+        if "took" in response:
+            if response["took"] is None:
+                return None
+            if len(response["hits"]["hits"]) == 0:
+                message = "No results."
+                message_class = "danger"
+                time_took = (time.process_time() - start) * 1000
+                # Round to 3 significant figures
+                time_took_rounded = round(
+                    time_took, 3 - int(floor(log10(abs(time_took)))) - 1
+                )
+                return {
+                    "message": message,
+                    "class": message_class,
+                    "took": time_took_rounded,
+                }
+
+        # For Druid
         if "error" in response:
             if "errorMessage" in response:
                 context = {
@@ -287,12 +222,9 @@ class StorageBackend(object):
                 return context
             else:
                 return response
-        # response = response.to_dict()
-        # print("RESP", response)
-        if "took" in response:
-            if response["took"] is None:
-                return None
-        self.filter_blacklisted(user, response)
+
+        # Removed for now, no point given we have restricted indexes
+        # self.filter_blacklisted(user, response)

         # Parse the response
         response_parsed = self.parse(response)
@@ -308,18 +240,22 @@ class StorageBackend(object):
         time_took_rounded = round(time_took, 3 - int(floor(log10(abs(time_took)))) - 1)
         return {"object_list": response_parsed, "took": time_took_rounded}

+    @abstractmethod
     def query_results(self, **kwargs):
-        raise NotImplementedError
+        pass

     def process_results(self, response, **kwargs):
         if kwargs.get("annotate"):
             annotate_results(response)
+        if kwargs.get("reverse"):
+            response.reverse()
         if kwargs.get("dedup"):
-            response = response[::-1]
-        if kwargs.get("dedup"):
-            if not kwargs.get("dedup_fields"):
+            dedup_fields = kwargs.get("dedup_fields")
+            if not dedup_fields:
                 dedup_fields = ["msg", "nick", "ident", "host", "net", "channel"]
-            response = helpers.dedup_list(response, dedup_fields)
+            response = dedup_list(response, dedup_fields)
+        return response

+    @abstractmethod
     def parse(self, response):
-        raise NotImplementedError
+        pass
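The query timing in this file is rounded to three significant figures with the expression `round(t, 3 - int(floor(log10(abs(t)))) - 1)`, which appears in several places. Pulled out as a helper for illustration (the function name is ours, not the repository's):

```python
from math import floor, log10


def round_sig(value, figures=3):
    # round() takes a count of decimal places, so convert the desired
    # number of significant figures using the value's order of magnitude.
    return round(value, figures - int(floor(log10(abs(value)))) - 1)
```

This works for both large and sub-1 values: 1234.567 ms becomes 1230.0, while 0.012345 becomes 0.0123.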


@@ -4,9 +4,16 @@ import orjson
 import requests
 from django.conf import settings

-from core.db import StorageBackend
+from core.db import StorageBackend, add_defaults
 from core.db.processing import parse_druid
-from core.views import helpers
+from core.lib.parsing import (
+    parse_date_time,
+    parse_index,
+    parse_sentiment,
+    parse_size,
+    parse_sort,
+    parse_source,
+)

 logger = logging.getLogger(__name__)
@@ -77,7 +84,8 @@ class DruidBackend(StorageBackend):
         self.add_type("or", search_query, extra_should2)
         return search_query

-    def construct_query(self, query, size, index, blank=False):
+    def construct_query(self, query, size, blank=False, **kwargs):
+        index = kwargs.get("index")
         search_query = {
             "limit": size,
             "queryType": "scan",
@@ -107,19 +115,13 @@ class DruidBackend(StorageBackend):

     def parse(self, response):
         parsed = parse_druid(response)
-        print("PARSE LEN", len(parsed))
         return parsed

     def run_query(self, user, search_query):
         ss = orjson.dumps(search_query, option=orjson.OPT_INDENT_2)
         ss = ss.decode()
-        print(ss)
-        response = requests.post("http://broker:8082/druid/v2", json=search_query)
+        response = requests.post("http://druid:8082/druid/v2", json=search_query)
         response = orjson.loads(response.text)
-        print("RESPONSE LEN", len(response))
-        # ss = orjson.dumps(response, option=orjson.OPT_INDENT_2)
-        # ss = ss.decode()
-        # print(ss)
         return response

     def filter_blacklisted(self, user, response):
@@ -140,7 +142,7 @@ class DruidBackend(StorageBackend):
         add_bool = []
         add_in = {}

-        helpers.add_defaults(query_params)
+        add_defaults(query_params)

         # Now, run the helpers for SIQTSRSS/ADR
         # S - Size
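The add_defaults call above (and its counterpart remove_defaults, both defined in core/db in this changeset) fills in any missing query parameters from DRILLDOWN_DEFAULT_PARAMS. A self-contained illustration, with the Django settings object replaced by a plain dict copied from the example settings:

```python
# Stand-in for settings.DRILLDOWN_DEFAULT_PARAMS, copied from the
# example settings file in this changeset.
DRILLDOWN_DEFAULT_PARAMS = {
    "size": "15",
    "index": "main",
    "sorting": "desc",
    "source": "4ch",
}


def add_defaults(query_params):
    # Fill in any parameter the user did not supply.
    for field, value in DRILLDOWN_DEFAULT_PARAMS.items():
        if field not in query_params:
            query_params[field] = value


def remove_defaults(query_params):
    # Strip parameters still holding their default, keeping URLs minimal.
    for field, value in list(query_params.items()):
        if field in DRILLDOWN_DEFAULT_PARAMS:
            if value == DRILLDOWN_DEFAULT_PARAMS[field]:
                del query_params[field]


params = {"query": "hello", "size": "50"}
add_defaults(params)
```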
@@ -161,25 +163,25 @@ class DruidBackend(StorageBackend):
         else:
             sizes = settings.MAIN_SIZES
         if not size:
-            size = self.parse_size(query_params, sizes)
+            size = parse_size(query_params, sizes)
             if isinstance(size, dict):
                 return size

         # I - Index
-        index = self.parse_index(request.user, query_params)
+        index = parse_index(request.user, query_params)
         if isinstance(index, dict):
             return index

         # Q/T - Query/Tags
         search_query = self.parse_query(
-            query_params, tags, size, index, custom_query, add_bool
+            query_params, tags, size, custom_query, add_bool, index=index
         )
         # Query should be a dict, so check if it contains message here
         if "message" in search_query:
             return search_query

         # S - Sources
-        sources = self.parse_source(request.user, query_params)
+        sources = parse_source(request.user, query_params)
         if isinstance(sources, dict):
             return sources
         total_count = len(sources)
@@ -188,20 +190,20 @@ class DruidBackend(StorageBackend):
         add_in["src"] = sources

         # R - Ranges
-        from_ts, to_ts = self.parse_date_time(query_params)
+        from_ts, to_ts = parse_date_time(query_params)
         if from_ts:
             addendum = f"{from_ts}/{to_ts}"
             search_query["intervals"] = [addendum]

         # S - Sort
-        sort = self.parse_sort(query_params)
+        sort = parse_sort(query_params)
         if isinstance(sort, dict):
             return sort
         if sort:
             search_query["order"] = sort

         # S - Sentiment
-        sentiment_r = self.parse_sentiment(query_params)
+        sentiment_r = parse_sentiment(query_params)
         if isinstance(sentiment_r, dict):
             return sentiment_r
         if sentiment_r:
@@ -232,18 +234,13 @@ class DruidBackend(StorageBackend):
response = self.query(request.user, search_query) response = self.query(request.user, search_query)
# A/D/R - Annotate/Dedup/Reverse # A/D/R - Annotate/Dedup/Reverse
self.process_results( response = self.process_results(
response, response,
annotate=annotate, annotate=annotate,
dedup=dedup, dedup=dedup,
dedup_fields=dedup_fields, dedup_fields=dedup_fields,
reverse=reverse, reverse=reverse,
) )
# ss = orjson.dumps(list(response), option=orjson.OPT_INDENT_2)
# ss = ss.decode()
# print(ss)
# print("PARSED", results_parsed)
# return results_parsed
context = response context = response
return context return context
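The hunks above swap per-backend `self.parse_*` methods for module-level helpers imported from `core.lib.parsing`, all sharing one convention: return the parsed value on success, or a dict describing the error so the caller can return it directly. A minimal sketch of that convention — `parse_size` here is a hypothetical, simplified stand-in, not the project's actual implementation, and the sizes tuple is invented for illustration:

```python
# Sketch of the value-or-error-dict convention used by the shared parsers.
def parse_size(query_params, sizes):
    """Return a valid size, or an error dict the view can return directly."""
    size = query_params.get("size", 20)
    if size not in sizes:
        return {"message": "Size is not permitted", "class": "danger"}
    return size


def query_results(query_params, sizes=(15, 20, 50)):
    size = parse_size(query_params, sizes)
    if isinstance(size, dict):  # error dicts short-circuit the pipeline
        return size
    return {"object_list": [], "size": size}


print(query_results({"size": 20}))   # normal path
print(query_results({"size": 999}))  # error dict propagated unchanged
```

The `isinstance(..., dict)` check after every helper call in the diff is exactly this short-circuit: each stage either enriches the query or aborts with a user-facing message.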

core/db/elastic.py (new file, 610 lines)

@@ -0,0 +1,610 @@
# from copy import deepcopy
# from datetime import datetime, timedelta
from django.conf import settings
from elasticsearch import AsyncElasticsearch, Elasticsearch
from elasticsearch.exceptions import NotFoundError, RequestError

from core.db import StorageBackend, add_defaults

# from json import dumps
# pp = lambda x: print(dumps(x, indent=2))
from core.db.processing import parse_results
from core.lib.parsing import (
    QueryError,
    parse_date_time,
    parse_index,
    parse_rule,
    parse_sentiment,
    parse_size,
    parse_sort,
    parse_source,
)

# These are sometimes numeric, sometimes strings.
# If they are seen to be numeric first, ES will erroneously
# index them as "long" and then subsequently fail to index messages
# with strings in the field.
keyword_fields = ["nick_id", "user_id", "net_id"]

mapping = {
    "mappings": {
        "properties": {
            "ts": {"type": "date", "format": "epoch_second"},
            "match_ts": {"type": "date", "format": "iso8601"},
            "file_tim": {"type": "date", "format": "epoch_millis"},
            "rule_uuid": {"type": "keyword"},
        }
    }
}
for field in keyword_fields:
    mapping["mappings"]["properties"][field] = {"type": "text"}


class ElasticsearchBackend(StorageBackend):
    def __init__(self):
        super().__init__("Elasticsearch")
        self.client = None
        self.async_client = None

    def initialise(self, **kwargs):
        """
        Initialise the Elasticsearch API endpoint.
        """
        auth = (settings.ELASTICSEARCH_USERNAME, settings.ELASTICSEARCH_PASSWORD)
        client = Elasticsearch(
            settings.ELASTICSEARCH_URL, http_auth=auth, verify_certs=False
        )
        self.client = client

    async def async_initialise(self, **kwargs):
        """
        Initialise the Elasticsearch API endpoint in async mode.
        """
        global mapping
        auth = (settings.ELASTICSEARCH_USERNAME, settings.ELASTICSEARCH_PASSWORD)
        client = AsyncElasticsearch(
            settings.ELASTICSEARCH_URL, http_auth=auth, verify_certs=False
        )
        self.async_client = client
        # Create the rule storage indices
        if await client.indices.exists(index=settings.INDEX_RULE_STORAGE):
            await client.indices.put_mapping(
                index=settings.INDEX_RULE_STORAGE,
                properties=mapping["mappings"]["properties"],
            )
        else:
            await client.indices.create(
                index=settings.INDEX_RULE_STORAGE, mappings=mapping["mappings"]
            )

    def construct_context_query(
        self, index, net, channel, src, num, size, type=None, nicks=None
    ):
        # Get the initial query
        query = self.construct_query(None, size, blank=True)
        extra_must = []
        extra_should = []
        extra_should2 = []
        if num:
            extra_must.append({"match_phrase": {"num": num}})
        if net:
            extra_must.append({"match_phrase": {"net": net}})
        if channel:
            extra_must.append({"match": {"channel": channel}})
        if nicks:
            for nick in nicks:
                extra_should2.append({"match": {"nick": nick}})
        types = ["msg", "notice", "action", "kick", "topic", "mode"]
        fields = [
            "nick",
            "ident",
            "host",
            "channel",
            "ts",
            "msg",
            "type",
            "net",
            "src",
            "tokens",
        ]
        query["fields"] = fields
        if index == "internal":
            fields.append("mtype")
            if channel == "*status" or type == "znc":
                if {"match": {"channel": channel}} in extra_must:
                    extra_must.remove({"match": {"channel": channel}})
                extra_should2 = []
                # Type is one of msg or notice
                # extra_should.append({"match": {"mtype": "msg"}})
                # extra_should.append({"match": {"mtype": "notice"}})
                extra_should.append({"match": {"type": "znc"}})
                extra_should.append({"match": {"type": "self"}})
                extra_should2.append({"match": {"type": "znc"}})
                extra_should2.append({"match": {"nick": channel}})
            elif type == "auth":
                if {"match": {"channel": channel}} in extra_must:
                    extra_must.remove({"match": {"channel": channel}})
                extra_should2 = []
                extra_should2.append({"match": {"nick": channel}})
                # extra_should2.append({"match": {"mtype": "msg"}})
                # extra_should2.append({"match": {"mtype": "notice"}})
                extra_should.append({"match": {"type": "query"}})
                extra_should2.append({"match": {"type": "self"}})
                extra_should.append({"match": {"nick": channel}})
            else:
                for ctype in types:
                    extra_should.append({"match": {"mtype": ctype}})
        else:
            for ctype in types:
                extra_should.append({"match": {"type": ctype}})
        # query = {
        #     "index": index,
        #     "limit": size,
        #     "query": {
        #         "bool": {
        #             "must": [
        #                 # {"equals": {"src": src}},
        #                 # {
        #                 #     "bool": {
        #                 #         "should": [*extra_should],
        #                 #     }
        #                 # },
        #                 # {
        #                 #     "bool": {
        #                 #         "should": [*extra_should2],
        #                 #     }
        #                 # },
        #                 *extra_must,
        #             ]
        #         }
        #     },
        #     "fields": fields,
        #     # "_source": False,
        # }
        if extra_must:
            for x in extra_must:
                query["query"]["bool"]["must"].append(x)
        if extra_should:
            query["query"]["bool"]["must"].append({"bool": {"should": [*extra_should]}})
        if extra_should2:
            query["query"]["bool"]["must"].append(
                {"bool": {"should": [*extra_should2]}}
            )
        return query

    def construct_query(self, query, size=None, blank=False, **kwargs):
        """
        Accept some query parameters and construct an Elasticsearch query.
        """
        query_base = {
            # "size": size,
            "query": {"bool": {"must": []}},
        }
        if size:
            query_base["size"] = size
        query_string = {
            "query_string": {
                "query": query,
                # "fields": fields,
                # "default_field": "msg",
                # "type": "best_fields",
                "fuzziness": "AUTO",
                "fuzzy_transpositions": True,
                "fuzzy_max_expansions": 50,
                "fuzzy_prefix_length": 0,
                # "minimum_should_match": 1,
                "default_operator": "and",
                "analyzer": "standard",
                "lenient": True,
                "boost": 1,
                "allow_leading_wildcard": True,
                # "enable_position_increments": False,
                "phrase_slop": 3,
                # "max_determinized_states": 10000,
                "quote_field_suffix": "",
                "quote_analyzer": "standard",
                "analyze_wildcard": False,
                "auto_generate_synonyms_phrase_query": True,
            }
        }
        if not blank:
            query_base["query"]["bool"]["must"].append(query_string)
        return query_base

    def parse(self, response, **kwargs):
        parsed = parse_results(response, **kwargs)
        return parsed

    def run_query(self, user, search_query, **kwargs):
        """
        Low level helper to run an ES query.
        Accept a user to pass it to the filter, so we can
        avoid filtering for superusers.
        Accept fields and size, for the fields we want to match and the
        number of results to return.
        """
        if self.client is None:
            self.initialise()
        index = kwargs.get("index")
        try:
            response = self.client.search(body=search_query, index=index)
        except RequestError as err:
            print("Elasticsearch error", err)
            return err
        except NotFoundError as err:
            print("Elasticsearch error", err)
            return err
        return response

    async def async_run_query(self, user, search_query, **kwargs):
        """
        Low level helper to run an ES query.
        Accept a user to pass it to the filter, so we can
        avoid filtering for superusers.
        Accept fields and size, for the fields we want to match and the
        number of results to return.
        """
        if self.async_client is None:
            await self.async_initialise()
        index = kwargs.get("index")
        try:
            response = await self.async_client.search(body=search_query, index=index)
        except RequestError as err:
            print("Elasticsearch error", err)
            return err
        except NotFoundError as err:
            print("Elasticsearch error", err)
            return err
        return response

    async def async_store_matches(self, matches):
        """
        Store a list of matches in Elasticsearch.
        :param index: The index to store the matches in.
        :param matches: A list of matches to store.
        """
        if self.async_client is None:
            await self.async_initialise()
        for match in matches:
            result = await self.async_client.index(
                index=settings.INDEX_RULE_STORAGE, body=match
            )
            if not result["result"] == "created":
                self.log.error(f"Indexing failed: {result}")
        self.log.debug(f"Indexed {len(matches)} messages in ES")

    def store_matches(self, matches):
        """
        Store a list of matches in Elasticsearch.
        :param index: The index to store the matches in.
        :param matches: A list of matches to store.
        """
        if self.client is None:
            self.initialise()
        for match in matches:
            result = self.client.index(index=settings.INDEX_RULE_STORAGE, body=match)
            if not result["result"] == "created":
                self.log.error(f"Indexing failed: {result}")
        self.log.debug(f"Indexed {len(matches)} messages in ES")

    async def schedule_query_results(self, rule_object):
        """
        Helper to run a scheduled query with reduced functionality and async.
        """
        data = rule_object.parsed
        if "tags" in data:
            tags = data["tags"]
        else:
            tags = []
        if "query" in data:
            query = data["query"][0]
            data["query"] = query
        result_map = {}
        add_bool = []
        add_top = []
        if "source" in data:
            total_count = len(data["source"])
            total_sources = len(settings.MAIN_SOURCES) + len(
                settings.SOURCES_RESTRICTED
            )
            if total_count != total_sources:
                add_top_tmp = {"bool": {"should": []}}
                for source_iter in data["source"]:
                    add_top_tmp["bool"]["should"].append(
                        {"match_phrase": {"src": source_iter}}
                    )
                add_top.append(add_top_tmp)
        for field, values in data.items():
            if field not in ["source", "index", "tags", "query", "sentiment"]:
                for value in values:
                    add_top.append({"match": {field: value}})
        # Bypass the check for query and tags membership since we can search by msg, etc
        search_query = self.parse_query(
            data, tags, None, False, add_bool, bypass_check=True
        )
        if rule_object.window is not None:
            range_query = {
                "range": {
                    "ts": {
                        "gte": f"now-{rule_object.window}/d",
                        "lte": "now/d",
                    }
                }
            }
            add_top.append(range_query)
        self.add_bool(search_query, add_bool)
        self.add_top(search_query, add_top)
        if "sentiment" in data:
            search_query["aggs"] = {
                "avg_sentiment": {
                    "avg": {"field": "sentiment"},
                }
            }
        for index in data["index"]:
            if "message" in search_query:
                self.log.error(f"Error parsing query: {search_query['message']}")
                continue
            response = await self.async_run_query(
                rule_object.user,
                search_query,
                index=index,
            )
            self.log.debug(f"Running scheduled query on {index}: {search_query}")
            # self.log.debug(f"Response from scheduled query: {response}")
            if isinstance(response, Exception):
                error = response.info["error"]["root_cause"][0]["reason"]
                self.log.error(f"Error running scheduled search: {error}")
                raise QueryError(error)
            if len(response["hits"]["hits"]) == 0:
                # No results, skip
                continue
            meta, response = self.parse(response, meta=True)
            # print("Parsed response", response)
            if "message" in response:
                self.log.error(f"Error running scheduled search: {response['message']}")
                continue
            result_map[index] = (meta, response)
        # Average aggregation check
        # Could probably do this in elasticsearch
        for index, (meta, result) in result_map.items():
            # Default to true, if no aggs are found, we still want to match
            match = True
            for agg_name, (operator, number) in rule_object.aggs.items():
                if agg_name in meta:
                    agg_value = meta["aggs"][agg_name]["value"]
                    # TODO: simplify this, match is default to True
                    if operator == ">":
                        if agg_value > number:
                            match = True
                        else:
                            match = False
                    elif operator == "<":
                        if agg_value < number:
                            match = True
                        else:
                            match = False
                    elif operator == "=":
                        if agg_value == number:
                            match = True
                        else:
                            match = False
                    else:
                        match = False
                else:
                    # No aggregation found, but it is required
                    match = False
                result_map[index][0]["aggs"][agg_name]["match"] = match
        return result_map

    def query_results(
        self,
        request,
        query_params,
        size=None,
        annotate=True,
        custom_query=False,
        reverse=False,
        dedup=False,
        dedup_fields=None,
        tags=None,
    ):
        add_bool = []
        add_top = []
        add_top_negative = []
        add_defaults(query_params)
        # Now, run the helpers for SIQTSRSS/ADR
        # S - Size
        # I - Index
        # Q - Query
        # T - Tags
        # S - Source
        # R - Ranges
        # S - Sort
        # S - Sentiment
        # A - Annotate
        # D - Dedup
        # R - Reverse

        # S - Size
        if request.user.is_anonymous:
            sizes = settings.MAIN_SIZES_ANON
        else:
            sizes = settings.MAIN_SIZES
        if not size:
            size = parse_size(query_params, sizes)
            if isinstance(size, dict):
                return size
        rule_object = parse_rule(request.user, query_params)
        if isinstance(rule_object, dict):
            return rule_object
        if rule_object is not None:
            index = settings.INDEX_RULE_STORAGE
            add_bool.append({"rule_uuid": str(rule_object.id)})
        else:
            # I - Index
            index = parse_index(request.user, query_params)
            if isinstance(index, dict):
                return index
        # Q/T - Query/Tags
        search_query = self.parse_query(
            query_params, tags, size, custom_query, add_bool
        )
        # Query should be a dict, so check if it contains message here
        if "message" in search_query:
            return search_query
        # S - Sources
        sources = parse_source(request.user, query_params)
        if isinstance(sources, dict):
            return sources
        total_count = len(sources)
        total_sources = len(settings.MAIN_SOURCES) + len(settings.SOURCES_RESTRICTED)
        if total_count != total_sources:
            add_top_tmp = {"bool": {"should": []}}
            for source_iter in sources:
                add_top_tmp["bool"]["should"].append(
                    {"match_phrase": {"src": source_iter}}
                )
            add_top.append(add_top_tmp)
        # R - Ranges
        # date_query = False
        from_ts, to_ts = parse_date_time(query_params)
        if from_ts:
            range_query = {
                "range": {
                    "ts": {
                        "gt": from_ts,
                        "lt": to_ts,
                    }
                }
            }
            add_top.append(range_query)
        # S - Sort
        sort = parse_sort(query_params)
        if isinstance(sort, dict):
            return sort
        if sort:
            # For Druid compatibility
            sort_map = {"ascending": "asc", "descending": "desc"}
            sorting = [
                {
                    "ts": {
                        "order": sort_map[sort],
                    }
                }
            ]
            search_query["sort"] = sorting
        # S - Sentiment
        sentiment_r = parse_sentiment(query_params)
        if isinstance(sentiment_r, dict):
            return sentiment_r
        if sentiment_r:
            sentiment_method, sentiment = sentiment_r
            range_query_compare = {"range": {"sentiment": {}}}
            range_query_precise = {
                "match": {
                    "sentiment": None,
                }
            }
            if sentiment_method == "below":
                range_query_compare["range"]["sentiment"]["lt"] = sentiment
                add_top.append(range_query_compare)
            elif sentiment_method == "above":
                range_query_compare["range"]["sentiment"]["gt"] = sentiment
                add_top.append(range_query_compare)
            elif sentiment_method == "exact":
                range_query_precise["match"]["sentiment"] = sentiment
                add_top.append(range_query_precise)
            elif sentiment_method == "nonzero":
                range_query_precise["match"]["sentiment"] = 0
                add_top_negative.append(range_query_precise)
        # Add in the additional information we already populated
        self.add_bool(search_query, add_bool)
        self.add_top(search_query, add_top)
        self.add_top(search_query, add_top_negative, negative=True)
        response = self.query(
            request.user,
            search_query,
            index=index,
        )
        if "message" in response:
            return response
        # A/D/R - Annotate/Dedup/Reverse
        response["object_list"] = self.process_results(
            response["object_list"],
            annotate=annotate,
            dedup=dedup,
            dedup_fields=dedup_fields,
            reverse=reverse,
        )
        context = response
        return context

    def query_single_result(self, request, query_params):
        context = self.query_results(request, query_params, size=100)
        if not context:
            return {"message": "Failed to run query", "message_class": "danger"}
        if "message" in context:
            return context
        dedup_set = {item["nick"] for item in context["object_list"]}
        if dedup_set:
            context["item"] = context["object_list"][0]
        return context

    def add_bool(self, search_query, add_bool):
        """
        Add the specified boolean matches to search query.
        """
        if not add_bool:
            return
        for item in add_bool:
            search_query["query"]["bool"]["must"].append({"match_phrase": item})

    def add_top(self, search_query, add_top, negative=False):
        """
        Merge add_top with the base of the search_query.
        """
        if not add_top:
            return
        if negative:
            for item in add_top:
                if "must_not" in search_query["query"]["bool"]:
                    search_query["query"]["bool"]["must_not"].append(item)
                else:
                    search_query["query"]["bool"]["must_not"] = [item]
        else:
            for item in add_top:
                if "query" not in search_query:
                    search_query["query"] = {"bool": {"must": []}}
                search_query["query"]["bool"]["must"].append(item)

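The chain of `if operator == ">"` comparisons in `schedule_query_results` (flagged with a TODO in the file above) can be collapsed into an operator table. A possible simplification, sketched outside the project — the symbols `>`, `<`, `=` and the `meta["aggs"][name]["value"]` shape are taken from the diff, the helper itself is hypothetical:

```python
import operator

# Map the rule's comparison symbols to comparison functions.
OPS = {">": operator.gt, "<": operator.lt, "=": operator.eq}


def agg_matches(meta, aggs):
    """Return True only if every required aggregation passes its comparison."""
    for agg_name, (op, number) in aggs.items():
        if agg_name not in meta.get("aggs", {}):
            return False  # aggregation required but missing from results
        cmp = OPS.get(op)
        if cmp is None or not cmp(meta["aggs"][agg_name]["value"], number):
            return False
    return True


meta = {"aggs": {"avg_sentiment": {"value": 0.4}}}
print(agg_matches(meta, {"avg_sentiment": (">", 0.2)}))  # True
print(agg_matches(meta, {"avg_sentiment": ("<", 0.2)}))  # False
```

This keeps the "missing aggregation fails the rule" behaviour of the original while replacing nine branches with one lookup.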

@@ -5,9 +5,8 @@ from pprint import pprint
 import requests
 from django.conf import settings
-from core.db import StorageBackend
+from core.db import StorageBackend, add_defaults, dedup_list
 from core.db.processing import annotate_results, parse_results
-from core.views import helpers
 logger = logging.getLogger(__name__)
@@ -67,7 +66,7 @@ class ManticoreBackend(StorageBackend):
         sort = None
         query_created = False
         source = None
-        helpers.add_defaults(query_params)
+        add_defaults(query_params)
         # Check size
         if request.user.is_anonymous:
             sizes = settings.MANTICORE_MAIN_SIZES_ANON
@@ -292,7 +291,7 @@ class ManticoreBackend(StorageBackend):
         if dedup:
             if not dedup_fields:
                 dedup_fields = ["msg", "nick", "ident", "host", "net", "channel"]
-            results_parsed = helpers.dedup_list(results_parsed, dedup_fields)
+            results_parsed = dedup_list(results_parsed, dedup_fields)
         context = {
             "object_list": results_parsed,
             "card": results["hits"]["total"],
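The `dedup_list` helper used above moves from `core.views.helpers` into `core.db` in this change. Its semantics are not shown in the diff; a plausible sketch, assuming it drops any row whose values for the given fields duplicate an earlier row:

```python
def dedup_list(items, fields):
    """Keep only the first item for each combination of the given fields (assumed semantics)."""
    seen = set()
    out = []
    for item in items:
        key = tuple(item.get(f) for f in fields)
        if key not in seen:
            seen.add(key)
            out.append(item)
    return out


rows = [
    {"msg": "hi", "nick": "a"},
    {"msg": "hi", "nick": "a"},  # duplicate on both fields, dropped
    {"msg": "hi", "nick": "b"},
]
print(dedup_list(rows, ["msg", "nick"]))  # two rows survive
```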


@@ -1,485 +0,0 @@
# from copy import deepcopy
# from datetime import datetime, timedelta
from django.conf import settings
from opensearchpy import OpenSearch
from opensearchpy.exceptions import NotFoundError, RequestError

from core.db import StorageBackend

# from json import dumps
# pp = lambda x: print(dumps(x, indent=2))
from core.db.processing import annotate_results, parse_results
from core.views.helpers import dedup_list


class OpensearchBackend(StorageBackend):
    def __init__(self):
        super().__init__("Opensearch")

    def initialise(self, **kwargs):
        """
        Initialise the OpenSearch API endpoint.
        """
        auth = (settings.OPENSEARCH_USERNAME, settings.OPENSEARCH_PASSWORD)
        client = OpenSearch(
            # fmt: off
            hosts=[{"host": settings.OPENSEARCH_URL,
                    "port": settings.OPENSEARCH_PORT}],
            http_compress=False,  # enables gzip compression for request bodies
            http_auth=auth,
            # client_cert = client_cert_path,
            # client_key = client_key_path,
            use_ssl=settings.OPENSEARCH_TLS,
            verify_certs=False,
            ssl_assert_hostname=False,
            ssl_show_warn=False,
            # a_certs=ca_certs_path,
        )
        self.client = client

    def construct_query(self, query, size, use_query_string=True, tokens=False):
        """
        Accept some query parameters and construct an OpenSearch query.
        """
        if not size:
            size = 5
        query_base = {
            "size": size,
            "query": {"bool": {"must": []}},
        }
        query_string = {
            "query_string": {
                "query": query,
                # "fields": fields,
                # "default_field": "msg",
                # "type": "best_fields",
                "fuzziness": "AUTO",
                "fuzzy_transpositions": True,
                "fuzzy_max_expansions": 50,
                "fuzzy_prefix_length": 0,
                # "minimum_should_match": 1,
                "default_operator": "or",
                "analyzer": "standard",
                "lenient": True,
                "boost": 1,
                "allow_leading_wildcard": True,
                # "enable_position_increments": False,
                "phrase_slop": 3,
                # "max_determinized_states": 10000,
                "quote_field_suffix": "",
                "quote_analyzer": "standard",
                "analyze_wildcard": False,
                "auto_generate_synonyms_phrase_query": True,
            }
        }
        query_tokens = {
            "simple_query_string": {
                # "tokens": query,
                "query": query,
                "fields": ["tokens"],
                "flags": "ALL",
                "fuzzy_transpositions": True,
                "fuzzy_max_expansions": 50,
                "fuzzy_prefix_length": 0,
                "default_operator": "and",
                "analyzer": "standard",
                "lenient": True,
                "boost": 1,
                "quote_field_suffix": "",
                "analyze_wildcard": False,
                "auto_generate_synonyms_phrase_query": False,
            }
        }
        if tokens:
            query_base["query"]["bool"]["must"].append(query_tokens)
            # query["query"]["bool"]["must"].append(query_string)
            # query["query"]["bool"]["must"][0]["query_string"]["fields"] = ["tokens"]
        elif use_query_string:
            query_base["query"]["bool"]["must"].append(query_string)
        return query_base

    def run_query(self, client, user, query, custom_query=False, index=None, size=None):
        """
        Low level helper to run an ES query.
        Accept a user to pass it to the filter, so we can
        avoid filtering for superusers.
        Accept fields and size, for the fields we want to match and the
        number of results to return.
        """
        if not index:
            index = settings.INDEX_MAIN
        if custom_query:
            search_query = query
        else:
            search_query = self.construct_query(query, size)
        try:
            response = client.search(body=search_query, index=index)
        except RequestError as err:
            print("OpenSearch error", err)
            return err
        except NotFoundError as err:
            print("OpenSearch error", err)
            return err
        return response

    def query_results(
        self,
        request,
        query_params,
        size=None,
        annotate=True,
        custom_query=False,
        reverse=False,
        dedup=False,
        dedup_fields=None,
        lookup_hashes=True,
        tags=None,
    ):
        """
        API helper to alter the OpenSearch return format into something
        a bit better to parse.
        Accept a HTTP request object. Run the query, and annotate the
        results with the other data we have.
        """
        # is_anonymous = isinstance(request.user, AnonymousUser)
        query = None
        message = None
        message_class = None
        add_bool = []
        add_top = []
        add_top_negative = []
        sort = None
        query_created = False
        # Lookup the hash values but don't disclose them to the user
        # denied = []
        # if lookup_hashes:
        #     if settings.HASHING:
        #         query_params = deepcopy(query_params)
        #         denied_q = hash_lookup(request.user, query_params)
        #         denied.extend(denied_q)
        #         if tags:
        #             denied_t = hash_lookup(request.user, tags, query_params)
        #             denied.extend(denied_t)
        #         message = "Permission denied: "
        #         for x in denied:
        #             if isinstance(x, SearchDenied):
        #                 message += f"Search({x.key}: {x.value}) "
        #             elif isinstance(x, LookupDenied):
        #                 message += f"Lookup({x.key}: {x.value}) "
        #         if denied:
        #             # message = [f"{i}" for i in message]
        #             # message = "\n".join(message)
        #             message_class = "danger"
        #             return {"message": message, "class": message_class}
        if request.user.is_anonymous:
            sizes = settings.MAIN_SIZES_ANON
        else:
            sizes = settings.MAIN_SIZES
        if not size:
            if "size" in query_params:
                size = query_params["size"]
                if size not in sizes:
                    message = "Size is not permitted"
                    message_class = "danger"
                    return {"message": message, "class": message_class}
            else:
                size = 20
        source = None
        if "source" in query_params:
            source = query_params["source"]
            if source in settings.SOURCES_RESTRICTED:
                if not request.user.has_perm("core.restricted_sources"):
                    message = "Access denied"
                    message_class = "danger"
                    return {"message": message, "class": message_class}
            elif source not in settings.MAIN_SOURCES:
                message = "Invalid source"
                message_class = "danger"
                return {"message": message, "class": message_class}
            if source == "all":
                source = None  # the next block will populate it
        if source:
            sources = [source]
        else:
            sources = settings.MAIN_SOURCES
            if request.user.has_perm("core.restricted_sources"):
                for source_iter in settings.SOURCES_RESTRICTED:
                    sources.append(source_iter)
        add_top_tmp = {"bool": {"should": []}}
        for source_iter in sources:
            add_top_tmp["bool"]["should"].append({"match_phrase": {"src": source_iter}})
        add_top.append(add_top_tmp)
        # date_query = False
        if set({"from_date", "to_date", "from_time", "to_time"}).issubset(
            query_params.keys()
        ):
            from_ts = f"{query_params['from_date']}T{query_params['from_time']}Z"
            to_ts = f"{query_params['to_date']}T{query_params['to_time']}Z"
            range_query = {
                "range": {
                    "ts": {
                        "gt": from_ts,
                        "lt": to_ts,
                    }
                }
            }
            add_top.append(range_query)
        # if date_query:
        #     if settings.DELAY_RESULTS:
        #         if source not in settings.SAFE_SOURCES:
        #             if request.user.has_perm("core.bypass_delay"):
        #                 add_top.append(range_query)
        #             else:
        #                 delay_as_ts = datetime.now() - timedelta(
        #                     days=settings.DELAY_DURATION
        #                 )
        #                 lt_as_ts = datetime.strptime(
        #                     range_query["range"]["ts"]["lt"], "%Y-%m-%dT%H:%MZ"
        #                 )
        #                 if lt_as_ts > delay_as_ts:
        #                     range_query["range"]["ts"][
        #                         "lt"
        #                     ] = f"now-{settings.DELAY_DURATION}d"
        #                 add_top.append(range_query)
        # else:
        #     if settings.DELAY_RESULTS:
        #         if source not in settings.SAFE_SOURCES:
        #             if not request.user.has_perm("core.bypass_delay"):
        #                 range_query = {
        #                     "range": {
        #                         "ts": {
        #                             # "gt": ,
        #                             "lt": f"now-{settings.DELAY_DURATION}d",
        #                         }
        #                     }
        #                 }
        #                 add_top.append(range_query)
        if "sorting" in query_params:
            sorting = query_params["sorting"]
            if sorting not in ("asc", "desc", "none"):
                message = "Invalid sort"
                message_class = "danger"
                return {"message": message, "class": message_class}
            if sorting in ("asc", "desc"):
                sort = [
                    {
                        "ts": {
                            "order": sorting,
                        }
                    }
                ]
        if "check_sentiment" in query_params:
            if "sentiment_method" not in query_params:
                message = "No sentiment method"
                message_class = "danger"
                return {"message": message, "class": message_class}
            if "sentiment" in query_params:
                sentiment = query_params["sentiment"]
                try:
                    sentiment = float(sentiment)
                except ValueError:
                    message = "Sentiment is not a float"
                    message_class = "danger"
                    return {"message": message, "class": message_class}
            sentiment_method = query_params["sentiment_method"]
            range_query_compare = {"range": {"sentiment": {}}}
            range_query_precise = {
                "match": {
                    "sentiment": None,
                }
            }
            if sentiment_method == "below":
                range_query_compare["range"]["sentiment"]["lt"] = sentiment
                add_top.append(range_query_compare)
            elif sentiment_method == "above":
                range_query_compare["range"]["sentiment"]["gt"] = sentiment
                add_top.append(range_query_compare)
            elif sentiment_method == "exact":
                range_query_precise["match"]["sentiment"] = sentiment
                add_top.append(range_query_precise)
            elif sentiment_method == "nonzero":
                range_query_precise["match"]["sentiment"] = 0
                add_top_negative.append(range_query_precise)
        # Only one of query or query_full can be active at once
        # We prefer query because it's simpler
        if "query" in query_params:
            query = query_params["query"]
            search_query = self.construct_query(query, size, tokens=True)
            query_created = True
        elif "query_full" in query_params:
            query_full = query_params["query_full"]
            # if request.user.has_perm("core.query_search"):
            search_query = self.construct_query(query_full, size)
            query_created = True
            # else:
            #     message = "You cannot search by query string"
            #     message_class = "danger"
            #     return {"message": message, "class": message_class}
        else:
            if custom_query:
                search_query = custom_query
        if tags:
            # Get a blank search query
            if not query_created:
                search_query = self.construct_query(None, size, use_query_string=False)
                query_created = True
            for tagname, tagvalue in tags.items():
                add_bool.append({tagname: tagvalue})
        required_any = ["query_full", "query", "tags"]
        if not any([field in query_params.keys() for field in required_any]):
            if not custom_query:
                message = "Empty query!"
                message_class = "warning"
                return {"message": message, "class": message_class}
        if add_bool:
            # if "bool" not in search_query["query"]:
            #     search_query["query"]["bool"] = {}
            # if "must" not in search_query["query"]["bool"]:
            #     search_query["query"]["bool"] = {"must": []}
            for item in add_bool:
                search_query["query"]["bool"]["must"].append({"match_phrase": item})
        if add_top:
            for item in add_top:
                search_query["query"]["bool"]["must"].append(item)
        if add_top_negative:
            for item in add_top_negative:
                if "must_not" in search_query["query"]["bool"]:
                    search_query["query"]["bool"]["must_not"].append(item)
                else:
                    search_query["query"]["bool"]["must_not"] = [item]
        if sort:
            search_query["sort"] = sort
        if "index" in query_params:
            index = query_params["index"]
            if index == "main":
                index = settings.INDEX_MAIN
            else:
                if not request.user.has_perm(f"core.index_{index}"):
                    message = "Not permitted to search by this index"
                    message_class = "danger"
                    return {
                        "message": message,
                        "class": message_class,
                    }
                if index == "meta":
                    index = settings.INDEX_META
                elif index == "internal":
                    index = settings.INDEX_INT
                else:
                    message = "Index is not valid."
                    message_class = "danger"
                    return {
                        "message": message,
                        "class": message_class,
                    }
        else:
            index = settings.INDEX_MAIN
        results = self.query(
            request.user,  # passed through run_main_query to filter_blacklisted
            search_query,
            custom_query=True,
            index=index,
            size=size,
        )
        if not results:
            return False
        if isinstance(results, Exception):
            message = f"Error: {results.info['error']['root_cause'][0]['type']}"
            message_class = "danger"
            return {"message": message, "class": message_class}
        if len(results["hits"]["hits"]) == 0:
            message = "No results."
            message_class = "danger"
            return {"message": message, "class": message_class}
        results_parsed = parse_results(results)
        if annotate:
            annotate_results(results_parsed)
        if "dedup" in query_params:
            if query_params["dedup"] == "on":
                dedup = True
            else:
                dedup = False
        else:
            dedup = False
        if reverse:
            results_parsed = results_parsed[::-1]
        if dedup:
            if not dedup_fields:
                dedup_fields = ["msg", "nick", "ident", "host", "net", "channel"]
            results_parsed = dedup_list(results_parsed, dedup_fields)
        # if source not in settings.SAFE_SOURCES:
        #     if settings.ENCRYPTION:
        #         encrypt_list(request.user, results_parsed, settings.ENCRYPTION_KEY)
        #     if settings.HASHING:
        #         hash_list(request.user, results_parsed)
        #     if settings.OBFUSCATION:
        #         obfuscate_list(request.user, results_parsed)
        #     if settings.RANDOMISATION:
        #         randomise_list(request.user, results_parsed)
        # process_list(results)
        # IMPORTANT! - DO NOT PASS query_params to the user!
        context = {
            "object_list": results_parsed,
            "card": results["hits"]["total"]["value"],
            "took": results["took"],
        }
        if "redacted" in results:
            context["redacted"] = results["redacted"]
        if "exemption" in results:
            context["exemption"] = results["exemption"]
        if query:
            context["query"] = query
        # if settings.DELAY_RESULTS:
        #     if source not in settings.SAFE_SOURCES:
        #         if not request.user.has_perm("core.bypass_delay"):
        #             context["delay"] = settings.DELAY_DURATION
        # if settings.RANDOMISATION:
        #     if source not in settings.SAFE_SOURCES:
        #         if not request.user.has_perm("core.bypass_randomisation"):
        #             context["randomised"] = True
        return context

    def query_single_result(self, request, query_params):
        context = self.query_results(request, query_params, size=100)
        if not context:
            return {"message": "Failed to run query", "message_class": "danger"}
        if "message" in context:
            return context
        dedup_set = {item["nick"] for item in context["object_list"]}
        if dedup_set:
            context["item"] = context["object_list"][0]
        return context

View File

@@ -3,7 +3,7 @@ from datetime import datetime
 from core.lib.threshold import annotate_num_chans, annotate_num_users, annotate_online
 
-def annotate_results(results_parsed):
+def annotate_results(results):
     """
     Accept a list of dict objects, search for the number of channels and users.
     Add them to the object.
@@ -11,7 +11,7 @@ def annotate_results(results_parsed):
     """
     # Figure out items with net (not discord)
     nets = set()
-    for x in results_parsed:
+    for x in results:
         if "net" in x:
             nets.add(x["net"])
@@ -21,7 +21,7 @@ def annotate_results(results_parsed):
             set(
                 [
                     x["nick"]
-                    for x in results_parsed
+                    for x in results
                     if {"nick", "src", "net"}.issubset(x)
                     and x["src"] == "irc"
                     and x["net"] == net
@@ -32,33 +32,42 @@ def annotate_results(results_parsed):
             set(
                 [
                     x["channel"]
-                    for x in results_parsed
+                    for x in results
                     if {"channel", "src", "net"}.issubset(x)
                     and x["src"] == "irc"
                     and x["net"] == net
                 ]
             )
         )
-        online_info = annotate_online(net, nicks)
+        online_info = None
+        num_users = None
+        num_chans = None
+        if nicks:
+            online_info = annotate_online(net, nicks)
         # Annotate the number of users in the channel
-        num_users = annotate_num_users(net, channels)
+        if channels:
+            num_users = annotate_num_users(net, channels)
         # Annotate the number channels the user is on
-        num_chans = annotate_num_chans(net, nicks)
-    for item in results_parsed:
+        if nicks:
+            num_chans = annotate_num_chans(net, nicks)
+    for item in results:
         if "net" in item:
             if item["net"] == net:
                 if "nick" in item:
-                    if item["nick"] in online_info:
-                        item["online"] = online_info[item["nick"]]
+                    if online_info:
+                        if item["nick"] in online_info:
+                            item["online"] = online_info[item["nick"]]
                 if "channel" in item:
-                    if item["channel"] in num_users:
-                        item["num_users"] = num_users[item["channel"]]
+                    if num_users:
+                        if item["channel"] in num_users:
+                            item["num_users"] = num_users[item["channel"]]
                 if "nick" in item:
-                    if item["nick"] in num_chans:
-                        item["num_chans"] = num_chans[item["nick"]]
+                    if num_chans:
+                        if item["nick"] in num_chans:
+                            item["num_chans"] = num_chans[item["nick"]]
 
-def parse_results(results):
+def parse_results(results, meta=None):
     results_parsed = []
     stringify = ["host", "channel"]
     if "hits" in results.keys():
@@ -110,6 +119,16 @@ def parse_results(results):
         else:
             element["time"] = time
         results_parsed.append(element)
+    if meta:
+        meta = {"aggs": {}}
+        if "aggregations" in results:
+            for field in ["avg_sentiment"]:  # Add other number fields here
+                if field in results["aggregations"]:
+                    meta["aggs"][field] = results["aggregations"][field]
+        total_hits = results["hits"]["total"]["value"]
+        meta["total_hits"] = total_hits
+        return (meta, results_parsed)
     return results_parsed
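The new `meta` branch of `parse_results` builds a small summary dict alongside the parsed rows. A standalone sketch of that contract, run against a hand-built Elasticsearch-style response (the field list is assumed to contain only `avg_sentiment`, as in the diff):

```python
def extract_meta(results):
    """Pull aggregation values and the total hit count from an
    Elasticsearch-style response body, mirroring the new meta branch."""
    meta = {"aggs": {}}
    if "aggregations" in results:
        for field in ["avg_sentiment"]:  # other numeric fields would go here
            if field in results["aggregations"]:
                meta["aggs"][field] = results["aggregations"][field]
    meta["total_hits"] = results["hits"]["total"]["value"]
    return meta

# Hand-built response in the shape Elasticsearch returns
response = {
    "hits": {"total": {"value": 42}, "hits": []},
    "aggregations": {"avg_sentiment": {"value": 0.31}},
}
meta = extract_meta(response)
```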


@@ -6,10 +6,10 @@ def get_db():
         from core.db.druid import DruidBackend
 
         return DruidBackend()
-    elif settings.DB_BACKEND == "OPENSEARCH":
-        from core.db.opensearch import OpensearchBackend
+    elif settings.DB_BACKEND == "ELASTICSEARCH":
+        from core.db.elastic import ElasticsearchBackend
 
-        return OpensearchBackend()
+        return ElasticsearchBackend()
     elif settings.DB_BACKEND == "MANTICORE":
         from core.db.manticore import ManticoreBackend


@@ -1,9 +1,45 @@
 from django import forms
 from django.contrib.auth.forms import UserCreationForm
+from django.core.exceptions import FieldDoesNotExist
+from django.forms import ModelForm
 
-from .models import User
+from core.db.storage import db
+from core.lib.parsing import QueryError
+from core.lib.rules import NotificationRuleData, RuleParseError
 
-# Create your forms here.
+from .models import NotificationRule, NotificationSettings, User
+
+# flake8: noqa: E501
+
+
+class RestrictedFormMixin:
+    """
+    This mixin is used to restrict the queryset of a form to the current user.
+    The request object is passed from the view.
+    Fieldargs is used to pass additional arguments to the queryset filter.
+    """
+
+    fieldargs = {}
+
+    def __init__(self, *args, **kwargs):
+        # self.fieldargs = {}
+        self.request = kwargs.pop("request")
+        super().__init__(*args, **kwargs)
+        for field in self.fields:
+            # Check it's not something like a CharField which has no queryset
+            if not hasattr(self.fields[field], "queryset"):
+                continue
+            model = self.fields[field].queryset.model
+            # Check if the model has a user field
+            try:
+                model._meta.get_field("user")
+                # Add the user to the queryset filters
+                self.fields[field].queryset = model.objects.filter(
+                    user=self.request.user, **self.fieldargs.get(field, {})
+                )
+            except FieldDoesNotExist:
+                pass
 
 
 class NewUserForm(UserCreationForm):
@@ -32,3 +68,88 @@ class CustomUserCreationForm(UserCreationForm):
     class Meta:
         model = User
         fields = "__all__"
+
+
+class NotificationSettingsForm(RestrictedFormMixin, ModelForm):
+    class Meta:
+        model = NotificationSettings
+        fields = (
+            "topic",
+            "url",
+            "service",
+        )
+        help_texts = {
+            "topic": "The topic to send notifications to.",
+            "url": "Custom NTFY server/webhook destination. Leave blank to use the default server for NTFY. For webhooks this field is required.",
+            "service": "The service to use for notifications.",
+        }
+
+    def clean(self):
+        cleaned_data = super(NotificationSettingsForm, self).clean()
+        if "service" in cleaned_data:
+            if cleaned_data["service"] == "webhook":
+                if not cleaned_data.get("url"):
+                    self.add_error(
+                        "url",
+                        "You must set a URL for webhooks.",
+                    )
+
+
+class NotificationRuleForm(RestrictedFormMixin, ModelForm):
+    class Meta:
+        model = NotificationRule
+        fields = (
+            "name",
+            "data",
+            "interval",
+            "window",
+            "amount",
+            "priority",
+            "topic",
+            "url",
+            "service",
+            "send_empty",
+            "enabled",
+        )
+        help_texts = {
+            "name": "The name of the rule.",
+            "priority": "The notification priority of the rule.",
+            "url": "Custom NTFY server/webhook destination. Leave blank to use the default server for NTFY. For webhooks this field is required.",
+            "service": "The service to use for notifications",
+            "topic": "The topic to send notifications to. Leave blank for default.",
+            "enabled": "Whether the rule is enabled.",
+            "data": "The notification rule definition.",
+            "interval": "How often to run the search. On demand evaluates messages as they are received, without running a scheduled search. The remaining options schedule a search of the database with the window below.",
+            "window": "Time window to search: 1d, 1h, 1m, 1s, etc.",
+            "amount": "Amount of matches to be returned for scheduled queries. Cannot be used with on-demand queries.",
+            "send_empty": "Send a notification if no matches are found.",
+        }
+
+    def clean(self):
+        cleaned_data = super(NotificationRuleForm, self).clean()
+        if "service" in cleaned_data:
+            if cleaned_data["service"] == "webhook":
+                if not cleaned_data.get("url"):
+                    self.add_error(
+                        "url",
+                        "You must set a URL for webhooks.",
+                    )
+        try:
+            # Passing db to avoid circular import
+            parsed_data = NotificationRuleData(self.request.user, cleaned_data, db=db)
+            if cleaned_data["enabled"]:
+                parsed_data.test_schedule()
+        except RuleParseError as e:
+            self.add_error(e.field, f"Parsing error: {e}")
+            return
+        except QueryError as e:
+            self.add_error("data", f"Query error: {e}")
+            return
+        # Write back the validated data
+        # We need this to populate the index and source variable if
+        # they are not set
+        to_store = str(parsed_data)
+        cleaned_data["data"] = to_store
+        return cleaned_data
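Stripped of the Django form machinery, the webhook check shared by both `clean()` methods reduces to a small predicate. A minimal sketch (the function name and dict-based error return are illustrative, not part of the codebase):

```python
def webhook_errors(cleaned_data):
    """Mirror of the form-level rule: a webhook service must carry a URL;
    other services may leave it blank (NTFY falls back to its default)."""
    errors = {}
    if cleaned_data.get("service") == "webhook" and not cleaned_data.get("url"):
        errors["url"] = "You must set a URL for webhooks."
    return errors
```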


@@ -4,7 +4,7 @@ def construct_query(index, net, channel, src, num, size, type=None, nicks=None):
     extra_should = []
     extra_should2 = []
     if num:
-        extra_must.append({"equals": {"num": num}})
+        extra_must.append({"match_phrase": {"num": num}})
     if net:
         extra_must.append({"match_phrase": {"net": net}})
     if channel:
@@ -52,7 +52,7 @@ def construct_query(index, net, channel, src, num, size, type=None, nicks=None):
             extra_should.append({"match": {"nick": channel}})
         else:
             for ctype in types:
-                extra_should.append({"equals": {"mtype": ctype}})
+                extra_should.append({"match": {"mtype": ctype}})
     else:
         for ctype in types:
             extra_should.append({"match": {"type": ctype}})
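The swap from `equals` (Manticore's JSON syntax) to `match_phrase`/`match` keeps these filters valid in Elasticsearch's query DSL. A sketch of how the must-clauses accumulate; the `channel` clause body is elided in the diff above, so its `match_phrase` form here is an assumption:

```python
def build_extra_must(num=None, net=None, channel=None):
    """Assemble must-clauses the way the updated construct_query does,
    using match_phrase for exact-field filters."""
    extra_must = []
    if num:
        extra_must.append({"match_phrase": {"num": num}})
    if net:
        extra_must.append({"match_phrase": {"net": net}})
    if channel:  # assumed: clause body not shown in the hunk
        extra_must.append({"match_phrase": {"channel": channel}})
    return extra_must
```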


@@ -3,7 +3,7 @@ from math import ceil
 from django.conf import settings
 from numpy import array_split
 
-from core.db.opensearch import client, run_main_query
+from core.db.storage import db
 
 def construct_query(net, nicks):
@@ -43,27 +43,14 @@ def get_meta(request, net, nicks, iter=True):
             break
         meta_tmp = []
         query = construct_query(net, nicks_chunked)
-        results = run_main_query(
-            client,
+        results = db.query(
             request.user,
             query,
-            custom_query=True,
-            index=settings.OPENSEARCH_INDEX_META,
+            index=settings.INDEX_META,
         )
-        if "hits" in results.keys():
-            if "hits" in results["hits"]:
-                for item in results["hits"]["hits"]:
-                    element = item["_source"]
-                    element["id"] = item["_id"]
-                    # Split the timestamp into date and time
-                    ts = element["ts"]
-                    ts_spl = ts.split("T")
-                    date = ts_spl[0]
-                    time = ts_spl[1]
-                    element["date"] = date
-                    element["time"] = time
-                    meta_tmp.append(element)
+        if "object_list" in results.keys():
+            for element in results["object_list"]:
+                meta_tmp.append(element)
         for x in meta_tmp:
             if x not in meta:
                 meta.append(x)


@@ -3,7 +3,7 @@ from math import ceil
 from django.conf import settings
 from numpy import array_split
 
-from core.lib.druid import client, run_main_query
+from core.db.storage import db
 
 def construct_query(net, nicks):
@@ -45,7 +45,7 @@ def get_nicks(request, net, nicks, iter=True):
         if len(nicks_chunked) == 0:
             break
         query = construct_query(net, nicks_chunked)
-        results = run_main_query(client, request.user, query, custom_query=True)
+        results = db.query(request.user, query)
         if "hits" in results.keys():
             if "hits" in results["hits"]:
                 for item in results["hits"]["hits"]:
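Both the meta and nick lookups chunk the nick list before querying, one database round trip per chunk. The repo uses numpy's `array_split` with `ceil`; a pure-Python stand-in with an assumed chunk cap of 100 illustrates the shape:

```python
from math import ceil

def chunk_nicks(nicks, max_chunk=100):
    """Split a nick list into roughly equal chunks before querying,
    as get_meta/get_nicks do with numpy.array_split. max_chunk is an
    assumed cap, not a value from the repo."""
    if not nicks:
        return []
    num_chunks = ceil(len(nicks) / max_chunk)
    size = ceil(len(nicks) / num_chunks)
    return [nicks[i:i + size] for i in range(0, len(nicks), size)]

chunks = chunk_nicks([str(i) for i in range(250)])
```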

core/lib/notify.py Normal file

@@ -0,0 +1,105 @@
import requests

from core.util import logs

NTFY_URL = "https://ntfy.sh"

log = logs.get_logger(__name__)


# Actual function to send a message to a topic
def ntfy_sendmsg(**kwargs):
    """
    Send a message to a topic using NTFY.
    kwargs:
        msg: Message to send, must be specified
        notification_settings: Notification settings, must be specified
        url: URL to NTFY server, can be None to use default
        topic: Topic to send message to, must be specified
        priority: Priority of message, optional
        title: Title of message, optional
        tags: Tags to add to message, optional
    """
    msg = kwargs.get("msg", None)
    notification_settings = kwargs.get("notification_settings")
    title = kwargs.get("title", None)
    priority = notification_settings.get("priority", None)
    tags = kwargs.get("tags", None)
    url = notification_settings.get("url") or NTFY_URL
    topic = notification_settings.get("topic", None)

    headers = {"Title": "Fisk"}
    if title:
        headers["Title"] = title
    if priority:
        headers["Priority"] = priority
    if tags:
        headers["Tags"] = tags
    try:
        requests.post(
            f"{url}/{topic}",
            data=msg,
            headers=headers,
        )
    except requests.exceptions.ConnectionError as e:
        log.error(f"Error sending notification: {e}")


def webhook_sendmsg(**kwargs):
    """
    Send a message to a webhook.
    kwargs:
        msg: Message to send, must be specified
        notification_settings: Notification settings, must be specified
        url: URL to webhook, must be specified
    """
    msg = kwargs.get("msg", None)
    notification_settings = kwargs.get("notification_settings")
    url = notification_settings.get("url")
    try:
        requests.post(
            f"{url}",
            data=msg,
        )
    except requests.exceptions.ConnectionError as e:
        log.error(f"Error sending webhook: {e}")


# Sendmsg helper to send a message to a user's notification settings
def sendmsg(**kwargs):
    """
    Send a message to a user's notification settings.
    Fetches the user's default notification settings if not specified.
    kwargs:
        user: User to send message to, must be specified
        notification_settings: Notification settings, optional
        service: Notification service to use
    kwargs for both services:
        msg: Message to send, must be specified
        notification_settings: Notification settings, must be specified
        url: URL to NTFY server, can be None to use default
    extra kwargs for ntfy:
        title: Title of message, optional
        tags: Tags to add to message, optional
        notification_settings: Notification settings, must be specified
        topic: Topic to send message to, must be specified
        priority: Priority of message, optional
    """
    user = kwargs.get("user", None)
    notification_settings = kwargs.get(
        "notification_settings", user.get_notification_settings().__dict__
    )
    if not notification_settings:
        return
    service = notification_settings.get("service")
    if service == "none":
        # Don't send anything
        return
    if service == "ntfy":
        ntfy_sendmsg(**kwargs)
    elif service == "webhook":
        webhook_sendmsg(**kwargs)
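The optional NTFY attributes ride on HTTP headers and are only attached when set. The header assembly from `ntfy_sendmsg`, extracted as a pure function for illustration (`Fisk` is the module's hard-coded default title):

```python
def build_ntfy_headers(title=None, priority=None, tags=None):
    """Reproduce the header assembly from ntfy_sendmsg: start from the
    module's default title and attach optional NTFY headers only when
    a value is present."""
    headers = {"Title": "Fisk"}
    if title:
        headers["Title"] = title
    if priority:
        headers["Priority"] = priority
    if tags:
        headers["Tags"] = tags
    return headers
```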

core/lib/parsing.py Normal file

@@ -0,0 +1,177 @@
from datetime import datetime

from django.conf import settings
from django.core.exceptions import ValidationError

from core.models import NotificationRule


class QueryError(Exception):
    pass


def parse_rule(user, query_params):
    """
    Parse a rule query.
    """
    if "rule" in query_params:
        try:
            rule_object = NotificationRule.objects.filter(id=query_params["rule"])
        except ValidationError:
            message = "Rule is not a valid UUID"
            message_class = "danger"
            return {"message": message, "class": message_class}
        if not rule_object.exists():
            message = "Rule does not exist"
            message_class = "danger"
            return {"message": message, "class": message_class}
        rule_object = rule_object.first()
        if not rule_object.user == user:
            message = "Rule does not belong to you"
            message_class = "danger"
            return {"message": message, "class": message_class}
        return rule_object
    else:
        return None


def parse_size(query_params, sizes):
    if "size" in query_params:
        size = query_params["size"]
        if size not in sizes:
            message = "Size is not permitted"
            message_class = "danger"
            return {"message": message, "class": message_class}
        size = int(size)
    else:
        size = 15
    return size


def parse_index(user, query_params, raise_error=False):
    if "index" in query_params:
        index = query_params["index"]
        if index == "main":
            index = settings.INDEX_MAIN
        else:
            if not user.has_perm(f"core.index_{index}"):
                message = f"Not permitted to search by this index: {index}"
                if raise_error:
                    raise QueryError(message)
                message_class = "danger"
                return {
                    "message": message,
                    "class": message_class,
                }
            if index == "meta":
                index = settings.INDEX_META
            elif index == "internal":
                index = settings.INDEX_INT
            elif index == "restricted":
                if not user.has_perm("core.restricted_sources"):
                    message = f"Not permitted to search by this index: {index}"
                    if raise_error:
                        raise QueryError(message)
                    message_class = "danger"
                    return {
                        "message": message,
                        "class": message_class,
                    }
                index = settings.INDEX_RESTRICTED
            else:
                message = f"Index is not valid: {index}"
                if raise_error:
                    raise QueryError(message)
                message_class = "danger"
                return {
                    "message": message,
                    "class": message_class,
                }
    else:
        index = settings.INDEX_MAIN
    return index


def parse_source(user, query_params, raise_error=False):
    source = None
    if "source" in query_params:
        source = query_params["source"]
        if source in settings.SOURCES_RESTRICTED:
            if not user.has_perm("core.restricted_sources"):
                message = f"Access denied: {source}"
                if raise_error:
                    raise QueryError(message)
                message_class = "danger"
                return {"message": message, "class": message_class}
        elif source not in settings.MAIN_SOURCES:
            message = f"Invalid source: {source}"
            if raise_error:
                raise QueryError(message)
            message_class = "danger"
            return {"message": message, "class": message_class}
        if source == "all":
            source = None  # the next block will populate it
    if source:
        sources = [source]
    else:
        sources = list(settings.MAIN_SOURCES)
        if user.has_perm("core.restricted_sources"):
            for source_iter in settings.SOURCES_RESTRICTED:
                sources.append(source_iter)
    if "all" in sources:
        sources.remove("all")
    return sources


def parse_sort(query_params):
    sort = None
    if "sorting" in query_params:
        sorting = query_params["sorting"]
        if sorting not in ("asc", "desc", "none"):
            message = "Invalid sort"
            message_class = "danger"
            return {"message": message, "class": message_class}
        if sorting == "asc":
            sort = "ascending"
        elif sorting == "desc":
            sort = "descending"
    return sort


def parse_date_time(query_params):
    if set({"from_date", "to_date", "from_time", "to_time"}).issubset(
        query_params.keys()
    ):
        from_ts = f"{query_params['from_date']}T{query_params['from_time']}Z"
        to_ts = f"{query_params['to_date']}T{query_params['to_time']}Z"
        from_ts = datetime.strptime(from_ts, "%Y-%m-%dT%H:%MZ")
        to_ts = datetime.strptime(to_ts, "%Y-%m-%dT%H:%MZ")
        return (from_ts, to_ts)
    return (None, None)


def parse_sentiment(query_params):
    sentiment = None
    if "check_sentiment" in query_params:
        if "sentiment_method" not in query_params:
            message = "No sentiment method"
            message_class = "danger"
            return {"message": message, "class": message_class}
        if "sentiment" in query_params:
            sentiment = query_params["sentiment"]
            try:
                sentiment = float(sentiment)
            except ValueError:
                message = "Sentiment is not a float"
                message_class = "danger"
                return {"message": message, "class": message_class}
        sentiment_method = query_params["sentiment_method"]
        return (sentiment_method, sentiment)
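`parse_date_time` only fires when all four form fields are present, then glues date and time into one timestamp per bound. A standalone version of the same logic:

```python
from datetime import datetime

def parse_window(query_params):
    """Standalone version of parse_date_time: combine the four form
    fields into (from, to) datetimes, or (None, None) if any is missing."""
    keys = {"from_date", "to_date", "from_time", "to_time"}
    if not keys.issubset(query_params.keys()):
        return (None, None)
    from_ts = f"{query_params['from_date']}T{query_params['from_time']}Z"
    to_ts = f"{query_params['to_date']}T{query_params['to_time']}Z"
    fmt = "%Y-%m-%dT%H:%MZ"
    return (datetime.strptime(from_ts, fmt), datetime.strptime(to_ts, fmt))
```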

core/lib/rules.py Normal file

@@ -0,0 +1,656 @@
from yaml import dump, load
from yaml.parser import ParserError
from yaml.scanner import ScannerError

try:
    from yaml import CDumper as Dumper
    from yaml import CLoader as Loader
except ImportError:
    from yaml import Loader, Dumper

from datetime import datetime

import orjson
from asgiref.sync import async_to_sync
from siphashc import siphash

from core.lib.notify import sendmsg
from core.lib.parsing import parse_index, parse_source
from core.util import logs

log = logs.get_logger("rules")

SECONDS_PER_UNIT = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}
MAX_WINDOW = 2592000
MAX_AMOUNT_NTFY = 10
MAX_AMOUNT_WEBHOOK = 1000
HIGH_FREQUENCY_MIN_SEC = 60


class RuleParseError(Exception):
    def __init__(self, message, field):
        super().__init__(message)
        self.field = field
def format_ntfy(**kwargs):
    """
    Format a message for ntfy.
    If the message is a list, it will be joined with newlines.
    If the message is None, it will be replaced with an empty string.
    If specified, `matched` will be pretty-printed in the first line.
    kwargs:
        rule: The rule object, must be specified
        index: The index the rule matched on, can be None
        message: The message to send, can be None
        matched: The matched fields, can be None
        total_hits: The total number of matches, optional
    """
    rule = kwargs.get("rule")
    index = kwargs.get("index")
    message = kwargs.get("message")
    matched = kwargs.get("matched")
    total_hits = kwargs.get("total_hits", 0)
    if message:
        # Dump the message in YAML for readability
        messages_formatted = ""
        if isinstance(message, list):
            for message_iter in message:
                messages_formatted += dump(
                    message_iter, Dumper=Dumper, default_flow_style=False
                )
                messages_formatted += "\n"
        else:
            messages_formatted = dump(message, Dumper=Dumper, default_flow_style=False)
    else:
        messages_formatted = ""

    if matched:
        matched = ", ".join([f"{k}: {v}" for k, v in matched.items()])
    else:
        matched = ""

    notify_message = f"{rule.name} on {index}: {matched}\n{messages_formatted}"
    notify_message += f"\nTotal hits: {total_hits}"
    notify_message = notify_message.encode("utf-8", "replace")
    return notify_message


def format_webhook(**kwargs):
    """
    Format a message for a webhook.
    Adds some metadata to the message that would normally be only in
    notification_settings.
    Dumps the message in JSON.
    kwargs:
        rule: The rule object, must be specified
        index: The index the rule matched on, can be None
        message: The message to send, can be None, but will be sent as None
        matched: The matched fields, can be None, but will be sent as None
        total_hits: The total number of matches, optional
        notification_settings: The notification settings, must be specified
            priority: The priority of the message, optional
            topic: The topic of the message, optional
    """
    rule = kwargs.get("rule")
    index = kwargs.get("index")
    message = kwargs.get("message")
    matched = kwargs.get("matched")
    total_hits = kwargs.get("total_hits", 0)
    notification_settings = kwargs.get("notification_settings")
    notify_message = {
        "rule_id": rule.id,
        "rule_name": rule.name,
        "match": matched,
        "total_hits": total_hits,
        "index": index,
        "data": message,
    }
    if "priority" in notification_settings:
        notify_message["priority"] = notification_settings["priority"]
    if "topic" in notification_settings:
        notify_message["topic"] = notification_settings["topic"]
    notify_message = orjson.dumps(notify_message)
    return notify_message
def rule_notify(rule, index, message, meta=None):
    """
    Send a notification for a matching rule.
    Gets the notification settings for the rule.
    Runs the formatting helpers for the service.
    :param rule: The rule object, must be specified
    :param index: The index the rule matched on, can be None
    :param message: The message to send, can be None
    :param meta: dict of metadata, contains `aggs` key for the matched fields
    """
    # If there is no message, don't say anything matched
    if message:
        word = "match"
    else:
        word = "no match"
    title = f"Rule {rule.name} {word} on {index}"
    # The user notification settings are merged in with this
    notification_settings = rule.get_notification_settings()
    if not notification_settings:
        # No/invalid notification settings, don't send anything
        return
    if notification_settings.get("service") == "none":
        # Don't send anything
        return
    # Create a cast we can reuse for the formatting helpers and sendmsg
    cast = {
        "title": title,
        "user": rule.user,
        "rule": rule,
        "index": index,
        "message": message,
        "notification_settings": notification_settings,
    }
    if meta:
        if "matched" in meta:
            cast["matched"] = meta["matched"]
        if "total_hits" in meta:
            cast["total_hits"] = meta["total_hits"]

    if rule.service == "ntfy":
        cast["msg"] = format_ntfy(**cast)
    elif rule.service == "webhook":
        cast["msg"] = format_webhook(**cast)

    sendmsg(**cast)
class NotificationRuleData(object):
    def __init__(self, user, cleaned_data, db):
        self.user = user
        self.object = None

        # We are running live and have been passed a database object
        if not isinstance(cleaned_data, dict):
            self.object = cleaned_data
            cleaned_data = cleaned_data.__dict__

        self.cleaned_data = cleaned_data
        self.db = db
        self.data = self.cleaned_data.get("data")
        self.window = self.cleaned_data.get("window")
        self.parsed = None
        self.aggs = {}

        self.validate_user_permissions()

        self.parse_data()
        self.ensure_list()
        self.validate_permissions()
        self.validate_schedule_fields()
        self.validate_time_fields()
        if self.object is not None:
            self.populate_matched()

    def populate_matched(self):
        """
        On first creation, the match field is None. We need to populate it with
        a dictionary containing the index names as keys and False as values.
        """
        if self.object.match is None:
            self.object.match = {}
        for index in self.parsed["index"]:
            if index not in self.object.match:
                self.object.match[index] = False
        self.object.save()

    def store_match(self, index, match):
        """
        Store a match result.
        Accepts None for the index to set all indices.
        :param index: the index to store the match for, can be None
        :param match: the object that matched
        """
        if match is not False:
            # Dump match to JSON while sorting the keys
            match_normalised = orjson.dumps(match, option=orjson.OPT_SORT_KEYS)
            match = siphash(self.db.hash_key, match_normalised)

        if self.object.match is None:
            self.object.match = {}
        if not isinstance(self.object.match, dict):
            self.object.match = {}
        if index is None:
            for index_iter in self.parsed["index"]:
                self.object.match[index_iter] = match
        else:
            self.object.match[index] = match
        self.object.save()
        log.debug(f"Stored match: {index} - {match}")

    def get_match(self, index=None, match=None):
        """
        Get a match result for an index.
        If the index is None, it will return True if any index has a match.
        :param index: the index to get the match for, can be None
        """
        if self.object.match is None:
            self.object.match = {}
            self.object.save()
            return None
        if not isinstance(self.object.match, dict):
            return None
        if index is None:
            # Check if we have any matches on all indices
            return any(self.object.match.values())
        # Check if it's the same hash
        if match is not None:
            match_normalised = orjson.dumps(match, option=orjson.OPT_SORT_KEYS)
            match = siphash(self.db.hash_key, match_normalised)
            hash_matches = self.object.match.get(index) == match
            return hash_matches
        return self.object.match.get(index)
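`store_match` and `get_match` compare messages by a keyed siphash of the key-sorted JSON dump, so dict key ordering cannot produce a spurious "new match" notification. A stdlib approximation of the idea (the app itself uses `orjson` with `OPT_SORT_KEYS` plus `siphashc` keyed with `db.hash_key`; `sha256` here is only a stand-in):

```python
import hashlib
import json

def normalise_hash(match):
    """Stable hash of a match object: dump with sorted keys, then hash.
    Stand-in for the orjson + siphashc pipeline used by store_match."""
    normalised = json.dumps(match, sort_keys=True).encode()
    return hashlib.sha256(normalised).hexdigest()

# Key order does not affect the stored hash, so a re-fetched message
# compares equal to the previously stored match
a = {"nick": "sam", "msg": "hi"}
b = {"msg": "hi", "nick": "sam"}
```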
    def format_aggs(self, aggs):
        """
        Format aggregations for the query.
        We have self.aggs, which contains:
            {"avg_sentiment": (">", 0.5)}
        and aggs, which contains:
            {"avg_sentiment": {"value": 0.6}}
        It's matched already, we just need to format it like so:
            {"avg_sentiment": "0.6>0.5"}
        :param aggs: the aggregations to format
        :return: the formatted aggregations
        """
        new_aggs = {}
        for agg_name, agg in aggs.items():
            # Membership already validated by validate_schedule_fields
            op, value = self.aggs[agg_name]
            new_aggs[agg_name] = f"{agg['value']}{op}{value}"

        return new_aggs

    def reform_matches(self, index, matches, meta, mode):
        if not isinstance(matches, list):
            matches = [matches]
        matches_copy = matches.copy()
        match_ts = datetime.utcnow().isoformat()
        for match_index, _ in enumerate(matches_copy):
            matches_copy[match_index]["index"] = index
            matches_copy[match_index]["rule_uuid"] = self.object.id
            matches_copy[match_index]["meta"] = meta
            matches_copy[match_index]["match_ts"] = match_ts
            matches_copy[match_index]["mode"] = mode
        return matches_copy
    async def ingest_matches(self, index, matches, meta, mode):
        """
        Store all matches for an index.
        :param index: the index to store the matches for
        :param matches: the matches to store
        """
        new_matches = self.reform_matches(index, matches, meta, mode)
        await self.db.async_store_matches(new_matches)

    def ingest_matches_sync(self, index, matches, meta, mode):
        """
        Store all matches for an index.
        :param index: the index to store the matches for
        :param matches: the matches to store
        """
        new_matches = self.reform_matches(index, matches, meta, mode)
        self.db.store_matches(new_matches)

    async def rule_matched(self, index, message, meta, mode):
        """
        A rule has matched.
        If the previous run did not match, send a notification after formatting
        the aggregations.
        :param index: the index the rule matched on
        :param message: the message object that matched
        :param meta: metadata for the match, including the matched aggregations
        """
        current_match = self.get_match(index, message)
        log.debug(f"Rule matched: {index} - current match: {current_match}")
        if current_match is False:
            # Matched now, but not before
            if "matched" not in meta:
                meta["matched"] = self.format_aggs(meta["aggs"])
            rule_notify(self.object, index, message, meta)
            self.store_match(index, message)
            await self.ingest_matches(index, message, meta, mode)

    def rule_matched_sync(self, index, message, meta, mode):
        """
        A rule has matched.
        If the previous run did not match, send a notification after formatting
        the aggregations.
        :param index: the index the rule matched on
        :param message: the message object that matched
        :param meta: metadata for the match, including the matched aggregations
        """
        current_match = self.get_match(index, message)
        log.debug(f"Rule matched: {index} - current match: {current_match}")
        if current_match is False:
            # Matched now, but not before
            if "matched" not in meta:
                meta["matched"] = self.format_aggs(meta["aggs"])
            rule_notify(self.object, index, message, meta)
            self.store_match(index, message)
            self.ingest_matches_sync(index, message, meta, mode)

    # No async helper for this one as we only need it for schedules
    async def rule_no_match(self, index=None):
        """
        A rule has not matched.
        If the previous run did match, send a notification if configured to notify
        for empty matches.
        :param index: the index the rule did not match on, can be None
        """
        current_match = self.get_match(index)
        log.debug(f"Rule not matched: {index} - current match: {current_match}")
        if current_match is True:
            # Matched before, but not now
            if self.object.send_empty:
                rule_notify(self.object, index, "no_match", None)
            self.store_match(index, False)
            await self.ingest_matches(
                index=index, message={}, meta={"msg": "No matches"}, mode="schedule"
            )
    async def run_schedule(self):
        """
        Run the schedule query.
        Get the results from the database, and check if the rule has matched.
        Check if all of the required aggregations have matched.
        """
        response = await self.db.schedule_query_results(self)
        if not response:
            # No results in the result_map
            await self.rule_no_match()
        for index, (meta, results) in response.items():
            if not results:
                # Falsy results, no matches
                await self.rule_no_match(index)
            # Add the match values of all aggregations to a list
            aggs_for_index = []
            for agg_name in self.aggs.keys():
                if agg_name in meta["aggs"]:
                    if "match" in meta["aggs"][agg_name]:
                        aggs_for_index.append(meta["aggs"][agg_name]["match"])
            # All required aggs are present
            if len(aggs_for_index) == len(self.aggs.keys()):
                if all(aggs_for_index):
                    # All aggs have matched
                    await self.rule_matched(
                        index, results[: self.object.amount], meta, mode="schedule"
                    )
                    continue
            # Default branch, since the happy path has a continue keyword
            await self.rule_no_match(index)

    def test_schedule(self):
        """
        Test the schedule query to ensure it is valid.
        Run the query with the async_to_sync helper so we can call it from
        a form.
        Raises an exception if the query is invalid.
        """
        if self.db:
            sync_schedule = async_to_sync(self.db.schedule_query_results)
            sync_schedule(self)
    def validate_schedule_fields(self):
        """
        Ensure schedule fields are valid.
        index: can be a list, it will schedule one search per index.
        source: can be a list, it will be the filter for each search.
        tokens: can be a list, it will ensure the message matches any token.
        msg: can be a list, it will ensure the message contains any msg.
        No other fields can be lists containing more than one item.
        :raises RuleParseError: if the fields are invalid
        """
        is_schedule = self.is_schedule
        if is_schedule:
            allowed_list_fields = ["index", "source", "tokens", "msg"]
            for field, value in self.parsed.items():
                if field not in allowed_list_fields:
                    if len(value) > 1:
                        raise RuleParseError(
                            (
                                f"For scheduled rules, field {field} cannot contain "
                                "more than one item"
                            ),
                            "data",
                        )
                    if len(str(value[0])) == 0:
                        raise RuleParseError(f"Field {field} cannot be empty", "data")
            if "sentiment" in self.parsed:
                sentiment = str(self.parsed["sentiment"][0])
                sentiment = sentiment.strip()
                if sentiment[0] not in [">", "<", "="]:
                    raise RuleParseError(
                        (
                            "Sentiment field must be a comparison operator and then a "
                            "float: >0.02"
                        ),
                        "data",
                    )
                operator = sentiment[0]
                number = sentiment[1:]
                try:
                    number = float(number)
                except ValueError:
                    raise RuleParseError(
                        (
                            "Sentiment field must be a comparison operator and then a "
                            "float: >0.02"
                        ),
                        "data",
                    )
                self.aggs["avg_sentiment"] = (operator, number)
        else:
            if "query" in self.parsed:
                raise RuleParseError(
                    "Field query cannot be used with on-demand rules", "data"
                )
            if "tags" in self.parsed:
                raise RuleParseError(
                    "Field tags cannot be used with on-demand rules", "data"
                )
            if self.cleaned_data["send_empty"]:
                raise RuleParseError(
                    "Field cannot be used with on-demand rules", "send_empty"
                )

    @property
    def is_schedule(self):
        """
        Check if the rule is a schedule rule.
        :return: True if the rule is a schedule rule, False otherwise
        """
        if "interval" in self.cleaned_data:
            if self.cleaned_data["interval"] != 0:
                return True
        return False
    def ensure_list(self):
        """
        Ensure all values in the data field are lists.
        Convert all strings to lists with one item.
        """
        for field, value in self.parsed.items():
            if not isinstance(value, list):
                self.parsed[field] = [value]

    def validate_user_permissions(self):
        """
        Ensure the user can use notification rules.
        :raises RuleParseError: if the user does not have permission
        """
        if not self.user.has_perm("core.use_rules"):
            raise RuleParseError("User does not have permission to use rules", "data")
    def validate_time_fields(self):
        """
        Validate the interval, window, and amount fields.

        Prohibit window being specified with an on-demand interval.
        Prohibit window not being specified with a non-on-demand interval.
        Prohibit amount being specified with an on-demand interval.
        Prohibit amount not being specified with a non-on-demand interval.
        Validate the window format.
        Validate the window unit and enforce the maximum window size.

        :raises RuleParseError: if the fields are invalid
        """
        interval = self.cleaned_data.get("interval")
        window = self.cleaned_data.get("window")
        amount = self.cleaned_data.get("amount")
        service = self.cleaned_data.get("service")

        on_demand = interval == 0

        # Not on demand and interval is too low
        if not on_demand and interval <= HIGH_FREQUENCY_MIN_SEC:
            if not self.user.has_perm("core.rules_high_frequency"):
                raise RuleParseError(
                    "User does not have permission to use high frequency rules", "data"
                )
        if not on_demand:
            if not self.user.has_perm("core.rules_scheduled"):
                raise RuleParseError(
                    "User does not have permission to use scheduled rules", "data"
                )

        if on_demand and window is not None:
            # Interval is on demand and window is specified
            # We can't have a window with on-demand rules
            raise RuleParseError(
                "Window cannot be specified with on-demand interval", "window"
            )
        if not on_demand and window is None:
            # Interval is not on demand and window is not specified
            # We can't have a non-on-demand interval without a window
            raise RuleParseError(
                "Window must be specified with non-on-demand interval", "window"
            )
        if not on_demand and amount is None:
            # Interval is not on demand and amount is not specified
            # We can't have a non-on-demand interval without an amount
            raise RuleParseError(
                "Amount must be specified with non-on-demand interval", "amount"
            )
        if on_demand and amount is not None:
            # Interval is on demand and amount is specified
            # We can't have an amount with on-demand rules
            raise RuleParseError(
                "Amount cannot be specified with on-demand interval", "amount"
            )

        if window is not None:
            window_number = window[:-1]
            if not window_number.isdigit():
                raise RuleParseError("Window prefix must be a number", "window")
            window_number = int(window_number)
            window_unit = window[-1]
            if window_unit not in SECONDS_PER_UNIT:
                raise RuleParseError(
                    (
                        "Window unit must be one of "
                        f"{', '.join(SECONDS_PER_UNIT.keys())},"
                        f" not '{window_unit}'"
                    ),
                    "window",
                )
            window_seconds = window_number * SECONDS_PER_UNIT[window_unit]
            if window_seconds > MAX_WINDOW:
                raise RuleParseError(
                    f"Window cannot be larger than {MAX_WINDOW} seconds (30 days)",
                    "window",
                )

        if amount is not None:
            if service == "ntfy":
                if amount > MAX_AMOUNT_NTFY:
                    raise RuleParseError(
                        f"Amount cannot be larger than {MAX_AMOUNT_NTFY} for ntfy",
                        "amount",
                    )
            else:
                if amount > MAX_AMOUNT_WEBHOOK:
                    raise RuleParseError(
                        (
                            f"Amount cannot be larger than {MAX_AMOUNT_WEBHOOK} for "
                            f"{service}"
                        ),
                        "amount",
                    )
    def validate_permissions(self):
        """
        Validate permissions for the source and index variables.
        Also set the default values for the user if not present.
        Stores the default or expanded values in the parsed field.

        :raises QueryError: if the user does not have permission to use the source
        """
        if "index" in self.parsed:
            index = self.parsed["index"]
            if isinstance(index, list):
                for i in index:
                    parse_index(self.user, {"index": i}, raise_error=True)
            # else:
            #     db.parse_index(self.user, {"index": index}, raise_error=True)
        else:
            # Get the default value for the user if not present
            index = parse_index(self.user, {}, raise_error=True)
            self.parsed["index"] = [index]

        if "source" in self.parsed:
            source = self.parsed["source"]
            if isinstance(source, list):
                for i in source:
                    parse_source(self.user, {"source": i}, raise_error=True)
            # else:
            #     parse_source(self.user, {"source": source}, raise_error=True)
        else:
            # Get the default value for the user if not present
            source = parse_source(self.user, {}, raise_error=True)
            self.parsed["source"] = source

    def parse_data(self):
        """
        Parse the data in the text field to YAML.

        :raises RuleParseError: if the data is invalid
        """
        try:
            self.parsed = load(self.data, Loader=Loader)
        except (ScannerError, ParserError) as e:
            raise RuleParseError(f"Invalid YAML: {e}", "data")

    def __str__(self):
        """
        Get a YAML representation of the data field of the rule.
        """
        return dump(self.parsed, Dumper=Dumper)

    def get_data(self):
        """
        Return the data field as a dictionary.
        """
        return self.parsed
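The window validation in `validate_time_fields` above boils down to a small, testable routine. The sketch below is standalone: the `SECONDS_PER_UNIT` mapping and the 30-day `MAX_WINDOW` value are assumptions inferred from the error messages in the diff, not the project's actual constants.

```python
# Standalone sketch of the window parsing in validate_time_fields.
# SECONDS_PER_UNIT and MAX_WINDOW are assumed values, not the project's.
SECONDS_PER_UNIT = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}
MAX_WINDOW = 2592000  # 30 days in seconds


def parse_window(window: str) -> int:
    """Return the window size in seconds, validating prefix, unit and maximum."""
    number, unit = window[:-1], window[-1]
    if not number.isdigit():
        raise ValueError("Window prefix must be a number")
    if unit not in SECONDS_PER_UNIT:
        raise ValueError(f"Window unit must be one of {', '.join(SECONDS_PER_UNIT)}")
    seconds = int(number) * SECONDS_PER_UNIT[unit]
    if seconds > MAX_WINDOW:
        raise ValueError(f"Window cannot be larger than {MAX_WINDOW} seconds")
    return seconds
```

For example, `parse_window("30m")` yields 1800 seconds, while `parse_window("31d")` exceeds the cap and raises.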


@@ -0,0 +1,107 @@
import msgpack
from django.core.management.base import BaseCommand
from redis import StrictRedis

from core.db.storage import db
from core.lib.rules import NotificationRuleData
from core.models import NotificationRule
from core.util import logs

log = logs.get_logger("processing")


def process_rules(data):
    all_rules = NotificationRule.objects.filter(enabled=True, interval=0)

    for index, index_messages in data.items():
        for message in index_messages:
            for rule in all_rules:
                # Quicker helper to get the data without spinning
                # up a NotificationRuleData object
                parsed_rule = rule.parse()
                matched = {}

                # Rule is invalid, this shouldn't happen
                if "index" not in parsed_rule:
                    continue
                if "source" not in parsed_rule:
                    continue
                rule_index = parsed_rule["index"]
                rule_source = parsed_rule["source"]
                # if not type(rule_index) == list:
                #     rule_index = [rule_index]
                # if not type(rule_source) == list:
                #     rule_source = [rule_source]
                if index not in rule_index:
                    # We don't care about this index, go to the next one
                    continue
                if message["src"] not in rule_source:
                    # We don't care about this source, go to the next one
                    continue

                matched["index"] = index
                matched["source"] = message["src"]

                rule_field_length = len(parsed_rule.keys())
                matched_field_number = 0
                for field, value in parsed_rule.items():
                    # if not type(value) == list:
                    #     value = [value]
                    if field == "src":
                        # We already checked this
                        continue
                    if field == "tokens":
                        # Check if tokens are in the rule
                        # We only check if *at least one* token matches
                        for token in value:
                            if "tokens" in message:
                                if token in message["tokens"]:
                                    matched_field_number += 1
                                    matched[field] = token
                                    # Break out of the token matching loop
                                    break
                        # Continue to next field
                        continue
                    if field == "msg":
                        # Allow partial matches for msg
                        for msg in value:
                            if "msg" in message:
                                if msg.lower() in message["msg"].lower():
                                    matched_field_number += 1
                                    matched[field] = msg
                                    # Break out of the msg matching loop
                                    break
                        # Continue to next field
                        continue
                    if field in message and message[field] in value:
                        # Do exact matches for all other fields
                        matched_field_number += 1
                        matched[field] = message[field]

                # Subtract 2, 1 for source and 1 for index
                if matched_field_number == rule_field_length - 2:
                    meta = {"matched": matched, "total_hits": 1}
                    # Parse the rule, we saved some work above to avoid doing this,
                    # but it makes delivering messages significantly easier as we can
                    # use the same code as for scheduling.
                    rule_data_object = NotificationRuleData(rule.user, rule, db=db)
                    # rule_notify(rule, index, message, meta=meta)
                    rule_data_object.rule_matched_sync(
                        index, message, meta=meta, mode="ondemand"
                    )


class Command(BaseCommand):
    def handle(self, *args, **options):
        r = StrictRedis(unix_socket_path="/var/run/socks/redis.sock", db=0)
        p = r.pubsub()
        p.psubscribe("messages")
        for message in p.listen():
            if message:
                if message["channel"] == b"messages":
                    data = message["data"]
                    try:
                        unpacked = msgpack.unpackb(data, raw=False)
                    except TypeError:
                        continue
                    process_rules(unpacked)
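The matching in `process_rules` requires every rule field (besides index and source) to hit, with tokens matched on any-of, `msg` matched partially and case-insensitively, and everything else matched exactly. A simplified, self-contained sketch of that decision (a hypothetical helper, not part of the codebase):

```python
def rule_matches(parsed_rule: dict, index: str, message: dict) -> bool:
    """Simplified version of the matching in process_rules: every field in
    the rule (besides index/source) must match for the rule to fire."""
    if index not in parsed_rule.get("index", []):
        return False
    if message.get("src") not in parsed_rule.get("source", []):
        return False
    matched = 0
    for field, values in parsed_rule.items():
        if field in ("index", "source"):
            continue
        if field == "tokens":
            # at least one token must match
            if any(t in message.get("tokens", []) for t in values):
                matched += 1
            continue
        if field == "msg":
            # partial, case-insensitive match for msg
            if any(m.lower() in message.get("msg", "").lower() for m in values):
                matched += 1
            continue
        # exact match for all other fields
        if field in message and message[field] in values:
            matched += 1
    # index and source are rule keys that are never counted above
    return matched == len(parsed_rule) - 2
```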


@@ -0,0 +1,54 @@
import asyncio

from apscheduler.schedulers.asyncio import AsyncIOScheduler
from asgiref.sync import sync_to_async
from django.core.management.base import BaseCommand

from core.db.storage import db
from core.lib.parsing import QueryError
from core.lib.rules import NotificationRuleData, RuleParseError
from core.models import NotificationRule
from core.util import logs

log = logs.get_logger("scheduling")

INTERVALS = [5, 60, 900, 1800, 3600, 14400, 86400]


async def job(interval_seconds):
    """
    Run all schedules matching the given interval.

    :param interval_seconds: The interval to run.
    """
    matching_rules = await sync_to_async(list)(
        NotificationRule.objects.filter(enabled=True, interval=interval_seconds)
    )
    for rule in matching_rules:
        log.debug(f"Running rule {rule}")
        try:
            rule = NotificationRuleData(rule.user, rule, db=db)
            await rule.run_schedule()
            # results = await db.schedule_query_results(rule.user, rule)
        except QueryError as e:
            log.error(f"Error running rule {rule}: {e}")
        except RuleParseError as e:
            log.error(f"Error parsing rule {rule}: {e}")


class Command(BaseCommand):
    def handle(self, *args, **options):
        """
        Start the scheduling process.
        """
        scheduler = AsyncIOScheduler()
        for interval in INTERVALS:
            log.debug(f"Scheduling {interval} second job")
            scheduler.add_job(job, "interval", seconds=interval, args=[interval])
        scheduler.start()
        loop = asyncio.get_event_loop()
        try:
            loop.run_forever()
        except (KeyboardInterrupt, SystemExit):
            log.info("Process terminating")
        finally:
            loop.close()
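The scheduler above runs one recurring job per interval bucket, and each run picks up exactly the enabled rules whose `interval` matches that bucket. A stdlib-only sketch of that bucketing (the `Rule` dataclass is a stand-in for the Django model, not real project code):

```python
from dataclasses import dataclass


@dataclass
class Rule:
    """Stand-in for the NotificationRule model."""
    name: str
    interval: int  # seconds; 0 means on-demand
    enabled: bool = True


INTERVALS = [5, 60, 900, 1800, 3600, 14400, 86400]


def rules_for_interval(rules, interval_seconds):
    """Equivalent of the queryset filter in job(): enabled rules whose
    interval matches this bucket. On-demand rules (interval == 0) are
    never picked up by any scheduled bucket, since 0 is not in INTERVALS."""
    return [r for r in rules if r.enabled and r.interval == interval_seconds]
```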


@@ -0,0 +1,17 @@
# Generated by Django 4.1.3 on 2022-11-29 12:04

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0010_alter_perms_options'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='perms',
            options={'permissions': (('bypass_hashing', 'Can bypass field hashing'), ('bypass_blacklist', 'Can bypass the blacklist'), ('bypass_encryption', 'Can bypass field encryption'), ('bypass_obfuscation', 'Can bypass field obfuscation'), ('bypass_delay', 'Can bypass data delay'), ('bypass_randomisation', 'Can bypass data randomisation'), ('post_irc', 'Can post to IRC'), ('post_discord', 'Can post to Discord'), ('query_search', 'Can search with query strings'), ('use_insights', 'Can use the Insights page'), ('index_internal', 'Can use the internal index'), ('index_meta', 'Can use the meta index'), ('index_restricted', 'Can use the restricted index'), ('restricted_sources', 'Can access restricted sources'))},
        ),
    ]


@@ -0,0 +1,25 @@
# Generated by Django 4.1.3 on 2023-01-12 15:12

import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0011_alter_perms_options'),
    ]

    operations = [
        migrations.CreateModel(
            name='NotificationRule',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=255)),
                ('enabled', models.BooleanField(default=True)),
                ('data', models.TextField()),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]


@@ -0,0 +1,24 @@
# Generated by Django 4.1.3 on 2023-01-12 15:25

import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0012_notificationrule'),
    ]

    operations = [
        migrations.CreateModel(
            name='NotificationSettings',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('ntfy_topic', models.CharField(blank=True, max_length=255, null=True)),
                ('ntfy_url', models.CharField(blank=True, max_length=255, null=True)),
                ('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]


@@ -0,0 +1,18 @@
# Generated by Django 4.1.5 on 2023-01-12 18:06

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0013_notificationsettings'),
    ]

    operations = [
        migrations.AddField(
            model_name='notificationrule',
            name='priority',
            field=models.IntegerField(choices=[(1, 'min'), (2, 'low'), (3, 'default'), (4, 'high'), (5, 'max')], default=1),
        ),
    ]


@@ -0,0 +1,18 @@
# Generated by Django 4.1.5 on 2023-01-12 18:14

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0014_notificationrule_priority'),
    ]

    operations = [
        migrations.AddField(
            model_name='notificationrule',
            name='topic',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
    ]


@@ -0,0 +1,23 @@
# Generated by Django 4.1.3 on 2023-01-14 14:33

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0015_notificationrule_topic'),
    ]

    operations = [
        migrations.AddField(
            model_name='notificationrule',
            name='interval',
            field=models.CharField(choices=[('ondemand', 'On demand'), ('minute', 'Every minute'), ('15m', 'Every 15 minutes'), ('30m', 'Every 30 minutes'), ('hour', 'Every hour'), ('4h', 'Every 4 hours'), ('day', 'Every day'), ('week', 'Every week'), ('month', 'Every month')], default='ondemand', max_length=255),
        ),
        migrations.AddField(
            model_name='notificationrule',
            name='window',
            field=models.CharField(blank=True, max_length=255, null=True),
        ),
    ]


@@ -0,0 +1,18 @@
# Generated by Django 4.1.3 on 2023-01-14 14:54

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0016_notificationrule_interval_notificationrule_window'),
    ]

    operations = [
        migrations.AlterField(
            model_name='notificationrule',
            name='interval',
            field=models.IntegerField(choices=[(0, 'On demand'), (60, 'Every minute'), (900, 'Every 15 minutes'), (1800, 'Every 30 minutes'), (3600, 'Every hour'), (14400, 'Every 4 hours'), (86400, 'Every day')], default=0),
        ),
    ]


@@ -0,0 +1,27 @@
# Generated by Django 4.1.5 on 2023-01-15 00:58

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0017_alter_notificationrule_interval'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='perms',
            options={'permissions': (('post_irc', 'Can post to IRC'), ('post_discord', 'Can post to Discord'), ('use_insights', 'Can use the Insights page'), ('use_rules', 'Can use the Rules page'), ('index_internal', 'Can use the internal index'), ('index_meta', 'Can use the meta index'), ('index_restricted', 'Can use the restricted index'), ('restricted_sources', 'Can access restricted sources'))},
        ),
        migrations.AddField(
            model_name='notificationrule',
            name='match',
            field=models.BooleanField(default=False),
        ),
        migrations.AlterField(
            model_name='notificationrule',
            name='interval',
            field=models.IntegerField(choices=[(0, 'On demand'), (5, 'Every 5 seconds'), (60, 'Every minute'), (900, 'Every 15 minutes'), (1800, 'Every 30 minutes'), (3600, 'Every hour'), (14400, 'Every 4 hours'), (86400, 'Every day')], default=0),
        ),
    ]


@@ -0,0 +1,18 @@
# Generated by Django 4.1.5 on 2023-01-15 01:52

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0018_alter_perms_options_notificationrule_match_and_more'),
    ]

    operations = [
        migrations.AlterField(
            model_name='notificationrule',
            name='match',
            field=models.JSONField(blank=True, null=True),
        ),
    ]


@@ -0,0 +1,42 @@
# Generated by Django 4.1.5 on 2023-01-15 18:14

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0019_alter_notificationrule_match'),
    ]

    operations = [
        migrations.RenameField(
            model_name='notificationsettings',
            old_name='ntfy_topic',
            new_name='topic',
        ),
        migrations.RemoveField(
            model_name='notificationsettings',
            name='ntfy_url',
        ),
        migrations.AddField(
            model_name='notificationrule',
            name='service',
            field=models.CharField(blank=True, choices=[('ntfy', 'NTFY'), ('wehbook', 'Custom webhook')], max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='notificationrule',
            name='url',
            field=models.CharField(blank=True, max_length=1024, null=True),
        ),
        migrations.AddField(
            model_name='notificationsettings',
            name='service',
            field=models.CharField(blank=True, choices=[('ntfy', 'NTFY'), ('wehbook', 'Custom webhook')], max_length=255, null=True),
        ),
        migrations.AddField(
            model_name='notificationsettings',
            name='url',
            field=models.CharField(blank=True, max_length=1024, null=True),
        ),
    ]


@@ -0,0 +1,28 @@
# Generated by Django 4.1.5 on 2023-01-15 20:45

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0020_rename_ntfy_topic_notificationsettings_topic_and_more'),
    ]

    operations = [
        migrations.AddField(
            model_name='notificationrule',
            name='amount',
            field=models.IntegerField(blank=True, default=1, null=True),
        ),
        migrations.AlterField(
            model_name='notificationrule',
            name='service',
            field=models.CharField(choices=[('ntfy', 'NTFY'), ('webhook', 'Custom webhook')], default='ntfy', max_length=255),
        ),
        migrations.AlterField(
            model_name='notificationsettings',
            name='service',
            field=models.CharField(choices=[('ntfy', 'NTFY'), ('webhook', 'Custom webhook')], default='ntfy', max_length=255),
        ),
    ]


@@ -0,0 +1,23 @@
# Generated by Django 4.1.5 on 2023-01-15 23:34

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0021_notificationrule_amount_and_more'),
    ]

    operations = [
        migrations.AddField(
            model_name='notificationrule',
            name='send_empty',
            field=models.BooleanField(default=False),
        ),
        migrations.AlterField(
            model_name='notificationrule',
            name='amount',
            field=models.PositiveIntegerField(blank=True, default=1, null=True),
        ),
    ]


@@ -0,0 +1,17 @@
# Generated by Django 4.1.5 on 2023-02-02 19:07

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0022_notificationrule_send_empty_and_more'),
    ]

    operations = [
        migrations.AlterModelOptions(
            name='perms',
            options={'permissions': (('post_irc', 'Can post to IRC'), ('post_discord', 'Can post to Discord'), ('use_insights', 'Can use the Insights page'), ('use_rules', 'Can use the Rules page'), ('rules_scheduled', 'Can use the scheduled rules'), ('rules_high_frequency', 'Can use the high frequency rules'), ('index_internal', 'Can use the internal index'), ('index_meta', 'Can use the meta index'), ('index_restricted', 'Can use the restricted index'), ('restricted_sources', 'Can access restricted sources'))},
        ),
    ]


@@ -0,0 +1,20 @@
# Generated by Django 4.1.5 on 2023-02-02 19:08

import uuid

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0023_alter_perms_options'),
    ]

    operations = [
        migrations.AlterField(
            model_name='notificationrule',
            name='id',
            field=models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False),
        ),
    ]


@@ -0,0 +1,20 @@
# Generated by Django 4.1.5 on 2023-02-02 19:35

import uuid

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('core', '0024_alter_notificationrule_id'),
    ]

    operations = [
        migrations.AlterField(
            model_name='notificationrule',
            name='id',
            field=models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True),
        ),
    ]


@@ -1,13 +1,47 @@
import logging
import uuid

import stripe
from django.conf import settings
from django.contrib.auth.models import AbstractUser
from django.db import models
from yaml import load
from yaml.parser import ParserError
from yaml.scanner import ScannerError

from core.lib.customers import get_or_create, update_customer_fields

try:
    from yaml import CLoader as Loader
except ImportError:
    from yaml import Loader

logger = logging.getLogger(__name__)

PRIORITY_CHOICES = (
    (1, "min"),
    (2, "low"),
    (3, "default"),
    (4, "high"),
    (5, "max"),
)

INTERVAL_CHOICES = (
    (0, "On demand"),
    (5, "Every 5 seconds"),
    (60, "Every minute"),
    (900, "Every 15 minutes"),
    (1800, "Every 30 minutes"),
    (3600, "Every hour"),
    (14400, "Every 4 hours"),
    (86400, "Every day"),
)

SERVICE_CHOICES = (
    ("ntfy", "NTFY"),
    ("webhook", "Custom webhook"),
    ("none", "Disabled"),
)


class Plan(models.Model):
    name = models.CharField(max_length=255, unique=True)
@@ -60,6 +94,28 @@ class User(AbstractUser):
        plan_list = [plan.name for plan in self.plans.all()]
        return plan in plan_list

    def get_notification_settings(self, check=True):
        sets = NotificationSettings.objects.get_or_create(user=self)[0]
        if check:
            if sets.service == "ntfy" and sets.topic is None:
                return None
            if sets.service == "webhook" and sets.url is None:
                return None
        return sets

    @property
    def allowed_indices(self):
        indices = [settings.INDEX_MAIN]
        if self.has_perm("core.index_meta"):
            indices.append(settings.INDEX_META)
        if self.has_perm("core.index_internal"):
            indices.append(settings.INDEX_INT)
        if self.has_perm("core.index_restricted"):
            if self.has_perm("core.restricted_sources"):
                indices.append(settings.INDEX_RESTRICTED)
        return indices


class Session(models.Model):
    user = models.ForeignKey(User, on_delete=models.CASCADE)
@@ -107,17 +163,87 @@ class ContentBlock(models.Model):
class Perms(models.Model):
    class Meta:
        permissions = (
            ("post_irc", "Can post to IRC"),
            ("post_discord", "Can post to Discord"),
            ("use_insights", "Can use the Insights page"),
            ("use_rules", "Can use the Rules page"),
            ("rules_scheduled", "Can use the scheduled rules"),
            ("rules_high_frequency", "Can use the high frequency rules"),
            ("index_internal", "Can use the internal index"),
            ("index_meta", "Can use the meta index"),
            ("index_restricted", "Can use the restricted index"),
            ("restricted_sources", "Can access restricted sources"),
        )


class NotificationRule(models.Model):
    id = models.UUIDField(
        default=uuid.uuid4, primary_key=True, editable=False, unique=True
    )
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    name = models.CharField(max_length=255)
    priority = models.IntegerField(choices=PRIORITY_CHOICES, default=1)
    topic = models.CharField(max_length=255, null=True, blank=True)
    url = models.CharField(max_length=1024, null=True, blank=True)
    interval = models.IntegerField(choices=INTERVAL_CHOICES, default=0)
    window = models.CharField(max_length=255, null=True, blank=True)
    amount = models.PositiveIntegerField(default=1, null=True, blank=True)
    enabled = models.BooleanField(default=True)
    data = models.TextField()
    match = models.JSONField(null=True, blank=True)
    service = models.CharField(choices=SERVICE_CHOICES, max_length=255, default="ntfy")
    send_empty = models.BooleanField(default=False)

    def __str__(self):
        return f"{self.user} - {self.name}"

    def parse(self):
        try:
            parsed = load(self.data, Loader=Loader)
        except (ScannerError, ParserError) as e:
            raise ValueError(f"Invalid YAML: {e}")
        return parsed

    @property
    def matches(self):
        """
        Get the total number of matches for this rule.
        """
        if isinstance(self.match, dict):
            truthy_values = [x for x in self.match.values() if x is not False]
            return f"{len(truthy_values)}/{len(self.match)}"

    def get_notification_settings(self, check=True):
        """
        Get the notification settings for this rule.
        Notification rule settings take priority.
        """
        user_settings = self.user.get_notification_settings(check=False)
        user_settings = user_settings.__dict__
        if self.priority is not None:
            user_settings["priority"] = str(self.priority)
        if self.topic is not None:
            user_settings["topic"] = self.topic
        if self.url is not None:
            user_settings["url"] = self.url
        if self.service is not None:
            user_settings["service"] = self.service
        if self.send_empty is not None:
            user_settings["send_empty"] = self.send_empty
        if check:
            if user_settings["service"] == "ntfy" and user_settings["topic"] is None:
                return None
            if user_settings["service"] == "webhook" and user_settings["url"] is None:
                return None
        return user_settings


class NotificationSettings(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    topic = models.CharField(max_length=255, null=True, blank=True)
    url = models.CharField(max_length=1024, null=True, blank=True)
    service = models.CharField(choices=SERVICE_CHOICES, max_length=255, default="ntfy")

    def __str__(self):
        return f"Notification settings for {self.user}"
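The override logic in `NotificationRule.get_notification_settings` can be expressed as a pure merge: any non-None rule-level value beats the user's default, and a service without its required target is unusable. The helper below is a hypothetical illustration, not project code:

```python
def merge_notification_settings(user_settings: dict, rule_overrides: dict):
    """Hypothetical helper mirroring NotificationRule.get_notification_settings:
    any non-None rule value overrides the user's default."""
    merged = dict(user_settings)
    for key, value in rule_overrides.items():
        if value is not None:
            merged[key] = value
    # mirror the final check: a service without its required target is unusable
    if merged.get("service") == "ntfy" and merged.get("topic") is None:
        return None
    if merged.get("service") == "webhook" and merged.get("url") is None:
        return None
    return merged
```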


@@ -65,11 +65,16 @@ $(document).ready(function(){
            "file_ext": "off",
            "file_size": "off",
            "lang_code": "off",
            "tokens": "off",
            "rule_uuid": "off",
            "index": "off",
            "meta": "off",
            "match_ts": "off",
            //"lang_name": "off",
            // "words_noun": "off",
            // "words_adj": "off",
            // "words_verb": "off",
            // "words_adv": "off"
        },
    };
} else {


@@ -213,6 +213,21 @@
        }
    </style>
    {# Yes it's in the source, fight me #}
    <script type="text/javascript">
        var _paq = _paq || [];
        _paq.push(['trackPageView']);
        _paq.push(['enableLinkTracking']);
        (function() {
            _paq.push(['setTrackerUrl', 'https://api-a6fe73d3464641fe99ba77e5fdafa19c.s.zm.is']);
            _paq.push(['setSiteId', 4]);
            _paq.push(['setApiToken', 'je4TjsrunIM9uD4jrr_DGXJP4_b_Kq6ABhulOLo_Old']);
            var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0];
            g.type='text/javascript'; g.async=true; g.defer=true; g.src='https://c87zpt9a74m181wto33r.s.zm.is/embed.js'; s.parentNode.insertBefore(g,s);
        })();
    </script>
</head>
<body>
<body> <body>
@@ -234,10 +249,24 @@
                <a class="navbar-item" href="{% url 'home' %}">
                    Search
                </a>
                <a class="navbar-item" href="{% url 'rules' type='page' %}">
                    Rules
                </a>
                {% if user.is_authenticated %}
                    <div class="navbar-item has-dropdown is-hoverable">
                        <a class="navbar-link">
                            Account
                        </a>
                        <div class="navbar-dropdown">
                            <a class="navbar-item" href="{% url 'billing' %}">
                                Billing
                            </a>
                            <a class="navbar-item" href="{% url 'notifications_update' type='page' %}">
                                Notifications
                            </a>
                        </div>
                    </div>
                {% endif %}
                {% if user.is_superuser %}
                    <div class="navbar-item has-dropdown is-hoverable">
@@ -257,9 +286,21 @@
                {% endif %}
                {% if perms.core.use_insights %}
                    <div class="navbar-item has-dropdown is-hoverable">
                        <a class="navbar-link">
                            Insights
                        </a>
                        <div class="navbar-dropdown">
                            {% for index in user.allowed_indices %}
                                {% if index != "meta" %}
                                    <a class="navbar-item" href="{% url 'insights' index=index %}">
                                        {{ index }}
                                    </a>
                                {% endif %}
                            {% endfor %}
                        </div>
                    </div>
                {% endif %}
                <a class="navbar-item add-button">
                    Install
@@ -271,15 +312,15 @@
                <div class="buttons">
                    {% if not user.is_authenticated %}
                        <a class="button is-info" href="{% url 'signup' %}">
                            Sign up
                        </a>
                        <a class="button" href="{% url 'login' %}">
                            Log in
                        </a>
                    {% endif %}
                    {% if user.is_authenticated %}
                        <a class="button" href="{% url 'logout' %}">Logout</a>
                    {% endif %}
                </div>
@@ -320,8 +361,18 @@
    {% endblock %}
    <section class="section">
        <div class="container">
            {% block content_wrapper %}
                {% block content %}
                {% endblock %}
            {% endblock %}
            <div id="modals-here">
            </div>
            <div id="windows-here">
            </div>
            <div id="widgets-here" style="display: none;">
                {% block widgets %}
                {% endblock %}
            </div>
        </div>
    </section>
</body>


@@ -1,48 +1,152 @@
{% extends "base.html" %} {% extends 'base.html' %}
{% load static %} {% load static %}
{% load joinsep %}
{% block outer_content %}
{% if params.modal == 'context' %}
<div
style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_context' %}"
hx-vals='{"net": "{{ params.net|escapejs }}",
"num": "{{ params.num|escapejs }}",
"source": "{{ params.source|escapejs }}",
"channel": "{{ params.channel|escapejs }}",
"time": "{{ params.time|escapejs }}",
"date": "{{ params.date|escapejs }}",
"index": "{{ params.index }}",
"type": "{{ params.type|escapejs }}",
"mtype": "{{ params.mtype|escapejs }}",
"nick": "{{ params.nick|escapejs }}"}'
hx-target="#modals-here"
hx-trigger="load">
</div>
{% endif %}
<script src="{% static 'js/chart.js' %}"></script>
<script src="{% static 'tabs.js' %}"></script>
<script>
function setupTags() {
var inputTags = document.getElementById('tags');
new BulmaTagsInput(inputTags);
{% block content %} inputTags.BulmaTagsInput().on('before.add', function(item) {
<div class="block"> if (item.includes(": ")) {
{% for block in blocks %} var spl = item.split(": ");
{% if block.title is not None %} } else {
<h1 class="title">{{ block.title }}</h1> var spl = item.split(":");
{% endif %} }
<div class="box"> var field = spl[0];
<div class="columns"> try {
{% if block.column1 is not None %} var value = JSON.parse(spl[1]);
<div class="column"> } catch {
{{ block.column1 }} var value = spl[1];
</div> }
{% endif %} return `${field}: ${value}`;
{% if block.column2 is not None %} });
<div class="column"> inputTags.BulmaTagsInput().on('after.remove', function(item) {
{{ block.column2 }} var spl = item.split(": ");
</div> var field = spl[0];
{% endif %} var value = spl[1].trim();
{% if block.column3 is not None %} });
<div class="column"> }
{{ block.column3 }} function populateSearch(field, value) {
</div> var inputTags = document.getElementById('tags');
{% endif %} inputTags.BulmaTagsInput().add(field+": "+value);
</div> //htmx.trigger("#search", "click");
<div class="columns"> }
{% if block.image1 is not None %} </script>
<div class="column">
<img src="{% static block.image1 %}"> <div class="grid-stack" id="grid-stack-main">
</div> <div class="grid-stack-item" gs-w="7" gs-h="10" gs-y="0" gs-x="1">
{% endif %} <div class="grid-stack-item-content">
{% if block.image2 is not None %} <nav class="panel">
<div class="column"> <p class="panel-heading" style="padding: .2em; line-height: .5em;">
<img src="{% static block.image2 %}"> <i class="fa-solid fa-arrows-up-down-left-right has-text-grey-light"></i>
</div> Search
{% endif %} </p>
{% if block.image3 is not None %} <article class="panel-block is-active">
<div class="column"> {% include 'window-content/search.html' %}
<img src="{% static block.image3 %}"> </article>
</div> </nav>
{% endif %}
</div>
</div> </div>
{% endfor %} </div>
</div> </div>
<script>
var grid = GridStack.init({
cellHeight: 20,
cellWidth: 50,
cellHeightUnit: 'px',
auto: true,
float: true,
draggable: {handle: '.panel-heading', scroll: false, appendTo: 'body'},
removable: false,
animate: true,
});
// GridStack.init();
setupTags();
// a widget is ready to be loaded
document.addEventListener('load-widget', function(event) {
let container = htmx.find('#widget');
// get the scripts, they won't be run on the new element so we need to eval them
var scripts = htmx.findAll(container, "script");
let widgetelement = container.firstElementChild.cloneNode(true);
var new_id = widgetelement.id;
// check if there's an existing element like the one we want to swap
let grid_element = htmx.find('#grid-stack-main');
let existing_widget = htmx.find(grid_element, "#"+new_id);
// get the size and position attributes
if (existing_widget) {
let attrs = existing_widget.getAttributeNames();
for (let i = 0, len = attrs.length; i < len; i++) {
if (attrs[i].startsWith('gs-')) { // only target gridstack attributes
widgetelement.setAttribute(attrs[i], existing_widget.getAttribute(attrs[i]));
}
}
}
// clear the queue element
container.outerHTML = "";
// temporary workaround, other widgets can be duplicated, but not results
if (widgetelement.id == 'widget-results') {
grid.removeWidget("widget-results");
}
grid.addWidget(widgetelement);
// re-create the HTMX JS listeners, otherwise HTMX won't work inside the grid
htmx.process(widgetelement);
// update size when the widget is loaded
document.addEventListener('load-widget-results', function(evt) {
var added_widget = htmx.find(grid_element, '#widget-results');
var itemContent = htmx.find(added_widget, ".control");
var scrollheight = itemContent.scrollHeight+80;
var verticalmargin = 0;
var cellheight = grid.opts.cellHeight;
var height = Math.ceil((scrollheight + verticalmargin) / (cellheight + verticalmargin));
var opts = {
h: height,
}
grid.update(
added_widget,
opts
);
});
// run the JS scripts inside the added element again
// for instance, this will fix the dropdown
for (var i = 0; i < scripts.length; i++) {
eval(scripts[i].innerHTML);
}
});
</script>
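The size update in the script above converts the widget content's pixel height into whole GridStack rows before calling `grid.update`. As a standalone sketch of that conversion (the function name is mine, not from the source):

```javascript
// Hypothetical helper mirroring the height calculation above: GridStack
// sizes widgets in whole rows, so the content's pixel height is divided
// by the cell height (plus any vertical margin) and rounded up so the
// last partial row still fits.
function cellsForPixels(contentHeightPx, cellHeightPx, verticalMarginPx = 0) {
  return Math.ceil(
    (contentHeightPx + verticalMarginPx) / (cellHeightPx + verticalMarginPx)
  );
}
```

With the grid's configured `cellHeight: 20`, a 410px-tall results pane (scrollHeight plus the 80px padding added above) would occupy `cellsForPixels(410, 20)` rows.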
{% endblock %} {% endblock %}
{% block widgets %}
{% if table or message is not None %}
{% include 'partials/results_load.html' %}
{% endif %}
{% endblock %}


@@ -0,0 +1 @@
<button class="modal-close is-large" aria-label="close"></button>


@@ -0,0 +1,3 @@
<i
class="fa-solid fa-xmark has-text-grey-light float-right"
onclick='grid.removeWidget("widget-{{ unique }}");'></i>


@@ -0,0 +1,3 @@
<i
class="fa-solid fa-xmark has-text-grey-light float-right"
data-script="on click remove the closest <nav/>"></i>


@@ -1,20 +1,10 @@
{% extends 'wm/widget.html' %} {% extends 'wm/widget.html' %}
{% load static %} {% load static %}
{% block widget_options %}
gs-w="10" gs-h="1" gs-y="10" gs-x="1"
{% endblock %}
{% block heading %} {% block heading %}
Results Results
{% endblock %} {% endblock %}
{% block close_button %}
<i
class="fa-solid fa-xmark has-text-grey-light float-right"
onclick='grid.removeWidget("drilldown-widget-{{ unique }}"); //grid.compact();'></i>
{% endblock %}
{% block panel_content %} {% block panel_content %}
{% include 'partials/notify.html' %} {% include 'partials/notify.html' %}
<script src="{% static 'js/column-shifter.js' %}"></script> <script src="{% static 'js/column-shifter.js' %}"></script>
@@ -24,7 +14,9 @@
</span> </span>
{% endif %} {% endif %}
fetched {{ table.data|length }} hits in {{ took }}ms fetched {{ table.data|length }}
{% if params.rule is None %} hits {% else %} rule hits for {{ params.rule }}{% endif %}
in {{ took }}ms
{% if exemption is not None %} {% if exemption is not None %}
<span class="icon has-tooltip-bottom" data-tooltip="God mode"> <span class="icon has-tooltip-bottom" data-tooltip="God mode">
@@ -38,6 +30,6 @@
{% endif %} {% endif %}
{% endif %} {% endif %}
{% include 'ui/drilldown/table_results_partial.html' %} {% include 'partials/results_table.html' %}
{% include 'ui/drilldown/sentiment_partial.html' %} {% include 'partials/sentiment_chart.html' %}
{% endblock %} {% endblock %}


@@ -3,7 +3,9 @@
{% load static %} {% load static %}
{% load joinsep %} {% load joinsep %}
{% load urlsafe %} {% load urlsafe %}
{% load pretty %}
{% block table-wrapper %} {% block table-wrapper %}
<script src="{% static 'js/column-shifter.js' %}"></script>
<div id="drilldown-table" class="column-shifter-container" style="position:relative; z-index:1;"> <div id="drilldown-table" class="column-shifter-container" style="position:relative; z-index:1;">
{% block table %} {% block table %}
<div class="nowrap-parent"> <div class="nowrap-parent">
@@ -80,11 +82,11 @@
</div> </div>
<div class="nowrap-child"> <div class="nowrap-child">
<a <a
hx-get="search/{% querystring table.prefixed_order_by_field=column.order_by_alias.next %}&{{ uri }}" hx-get="search/partial/{% querystring table.prefixed_order_by_field=column.order_by_alias.next %}&{{ uri }}"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-trigger="click" hx-trigger="click"
hx-target="#results" hx-target="#drilldown-table"
hx-swap="innerHTML" hx-swap="outerHTML"
hx-indicator="#spinner" hx-indicator="#spinner"
style="cursor: pointer;"> style="cursor: pointer;">
{{ column.header }} {{ column.header }}
@@ -141,14 +143,10 @@
<i class="fa-solid fa-file-slash"></i> <i class="fa-solid fa-file-slash"></i>
</span> </span>
</td> </td>
{% elif column.name == 'tokens' %}
<td class="{{ column.name }} wrap" style="max-width: 10em">
{{ cell|joinsep:',' }}
</td>
{% elif column.name == 'src' %} {% elif column.name == 'src' %}
<td class="{{ column.name }}"> <td class="{{ column.name }}">
<a <a
class="has-text-link is-underlined" class="has-text-grey"
onclick="populateSearch('src', '{{ cell|escapejs }}')"> onclick="populateSearch('src', '{{ cell|escapejs }}')">
{% if row.cells.src == 'irc' %} {% if row.cells.src == 'irc' %}
<span class="icon" data-tooltip="IRC"> <span class="icon" data-tooltip="IRC">
@@ -173,7 +171,7 @@
{% elif column.name == 'type' or column.name == 'mtype' %} {% elif column.name == 'type' or column.name == 'mtype' %}
<td class="{{ column.name }}"> <td class="{{ column.name }}">
<a <a
class="has-text-link is-underlined" class="has-text-grey"
onclick="populateSearch('{{ column.name }}', '{{ cell|escapejs }}')"> onclick="populateSearch('{{ column.name }}', '{{ cell|escapejs }}')">
{% if cell == 'msg' %} {% if cell == 'msg' %}
<span class="icon" data-tooltip="Message"> <span class="icon" data-tooltip="Message">
@@ -281,7 +279,7 @@
</span> </span>
{% endif %} {% endif %}
</div> </div>
<a class="nowrap-child has-text-link is-underlined" onclick="populateSearch('nick', '{{ cell|escapejs }}')"> <a class="nowrap-child has-text-grey" onclick="populateSearch('nick', '{{ cell|escapejs }}')">
{{ cell }} {{ cell }}
</a> </a>
<div class="nowrap-child"> <div class="nowrap-child">
@@ -301,7 +299,7 @@
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_drilldown' type='window' %}" hx-post="{% url 'modal_drilldown' type='window' %}"
hx-vals='{"net": "{{ row.cells.net }}", "nick": "{{ row.cells.nick }}", "channel": "{{ row.cells.channel }}"}' hx-vals='{"net": "{{ row.cells.net }}", "nick": "{{ row.cells.nick }}", "channel": "{{ row.cells.channel }}"}'
hx-target="#items-here" hx-target="#windows-here"
hx-swap="afterend" hx-swap="afterend"
hx-trigger="click" hx-trigger="click"
class="has-text-black"> class="has-text-black">
@@ -336,7 +334,7 @@
{% if cell != '—' %} {% if cell != '—' %}
<div class="nowrap-parent"> <div class="nowrap-parent">
<a <a
class="nowrap-child has-text-link is-underlined" class="nowrap-child has-text-grey"
onclick="populateSearch('channel', '{{ cell|escapejs }}')"> onclick="populateSearch('channel', '{{ cell|escapejs }}')">
{{ cell }} {{ cell }}
</a> </a>
@@ -364,30 +362,26 @@
</span> </span>
{% endif %} {% endif %}
</td> </td>
{% elif column.name|slice:":6" == "words_" %} {% elif column.name == "tokens" %}
<td class="{{ column.name }}"> <td class="{{ column.name }}">
{% if cell.0.1|length == 0 %} <div class="tags">
<a {% for word in cell %}
class="tag is-info" <a
onclick="populateSearch('{{ column.name }}', '{{ cell }}')"> class="tag"
{{ cell }} onclick="populateSearch('{{ column.name }}', '{{ word }}')">
</a> {{ word }}
{% else %} </a>
<div class="tags"> {% endfor %}
{% for word in cell %} </div>
<a </td>
class="tag is-info" {% elif column.name == "meta" %}
onclick="populateSearch('{{ column.name }}', '{{ word }}')"> <td class="{{ column.name }}">
{{ word }} <pre>{{ cell|pretty }}</pre>
</a>
{% endfor %}
</div>
{% endif %}
</td> </td>
{% else %} {% else %}
<td class="{{ column.name }}"> <td class="{{ column.name }}">
<a <a
class="has-text-link is-underlined" class="has-text-grey"
onclick="populateSearch('{{ column.name }}', '{{ cell|escapejs }}')"> onclick="populateSearch('{{ column.name }}', '{{ cell|escapejs }}')">
{{ cell }} {{ cell }}
</a> </a>
@@ -433,11 +427,11 @@
<a <a
class="pagination-previous is-flex-grow-0 {% if not table.page.has_previous %}is-hidden-mobile{% endif %}" class="pagination-previous is-flex-grow-0 {% if not table.page.has_previous %}is-hidden-mobile{% endif %}"
{% if table.page.has_previous %} {% if table.page.has_previous %}
hx-get="search/{% querystring table.prefixed_page_field=table.page.previous_page_number %}&{{ uri }}" hx-get="search/partial/{% querystring table.prefixed_page_field=table.page.previous_page_number %}&{{ uri }}"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-trigger="click" hx-trigger="click"
hx-target="#results" hx-target="#drilldown-table"
hx-swap="innerHTML" hx-swap="outerHTML"
hx-indicator="#spinner" hx-indicator="#spinner"
{% else %} {% else %}
href="#" href="#"
@@ -453,11 +447,11 @@
<a <a
class="pagination-next is-flex-grow-0 {% if not table.page.has_next %}is-hidden-mobile{% endif %}" class="pagination-next is-flex-grow-0 {% if not table.page.has_next %}is-hidden-mobile{% endif %}"
{% if table.page.has_next %} {% if table.page.has_next %}
hx-get="search/{% querystring table.prefixed_page_field=table.page.next_page_number %}&{{ uri }}" hx-get="search/partial/{% querystring table.prefixed_page_field=table.page.next_page_number %}&{{ uri }}"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-trigger="click" hx-trigger="click"
hx-target="#results" hx-target="#drilldown-table"
hx-swap="innerHTML" hx-swap="outerHTML"
hx-indicator="#spinner" hx-indicator="#spinner"
{% else %} {% else %}
href="#" href="#"
@@ -482,11 +476,11 @@
{% if p == table.page.number %} {% if p == table.page.number %}
href="#" href="#"
{% else %} {% else %}
hx-get="search/{% querystring table.prefixed_page_field=p %}&{{ uri }}" hx-get="search/partial/{% querystring table.prefixed_page_field=p %}&{{ uri }}"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-trigger="click" hx-trigger="click"
hx-target="#results" hx-target="#drilldown-table"
hx-swap="innerHTML" hx-swap="outerHTML"
hx-indicator="#spinner" hx-indicator="#spinner"
{% endif %} {% endif %}
> >


@@ -0,0 +1,93 @@
{% include 'partials/notify.html' %}
<table
class="table is-fullwidth is-hoverable"
hx-target="#{{ context_object_name }}-table"
id="{{ context_object_name }}-table"
hx-swap="outerHTML"
hx-trigger="{{ context_object_name_singular }}Event from:body"
hx-get="{{ list_url }}">
<thead>
<th>id</th>
<th>user</th>
<th>name</th>
<th>interval</th>
<th>window</th>
<th>priority</th>
<th>topic</th>
<th>enabled</th>
<th>data length</th>
<th>match</th>
<th>actions</th>
</thead>
{% for item in object_list %}
<tr>
<td><a href="/search/?rule={{ item.id }}&query=*&source=all">{{ item.id }}</a></td>
<td>{{ item.user }}</td>
<td>{{ item.name }}</td>
<td>{{ item.interval }}s</td>
<td>{{ item.window|default_if_none:"—" }}</td>
<td>{{ item.priority }}</td>
<td>{{ item.topic|default_if_none:"—" }}</td>
<td>
{% if item.enabled %}
<span class="icon">
<i class="fa-solid fa-check"></i>
</span>
{% else %}
<span class="icon">
<i class="fa-solid fa-xmark"></i>
</span>
{% endif %}
</td>
<td>{{ item.data|length }}</td>
<td>{{ item.matches }}</td>
<td>
<div class="buttons">
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-get="{% url 'rule_update' type=type pk=item.id %}"
hx-trigger="click"
hx-target="#{{ type }}s-here"
hx-swap="innerHTML"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-pencil"></i>
</span>
</span>
</button>
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-delete="{% url 'rule_delete' type=type pk=item.id %}"
hx-trigger="click"
hx-target="#modals-here"
hx-swap="innerHTML"
hx-confirm="Are you sure you wish to delete {{ item.name }}?"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-xmark"></i>
</span>
</span>
</button>
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'rule_clear' type=type pk=item.id %}"
hx-trigger="click"
hx-target="#modals-here"
hx-swap="innerHTML"
hx-confirm="Are you sure you wish to clear matches for {{ item.name }}?"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-arrow-rotate-right"></i>
</span>
</span>
</button>
</div>
</td>
</tr>
{% endfor %}
</table>


@@ -1,163 +0,0 @@
{% extends "base.html" %}
{% load static %}
{% load joinsep %}
{% block outer_content %}
{% if params.modal == 'context' %}
<div
style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_context' %}"
hx-vals='{"net": "{{ params.net|escapejs }}",
"num": "{{ params.num|escapejs }}",
"source": "{{ params.source|escapejs }}",
"channel": "{{ params.channel|escapejs }}",
"time": "{{ params.time|escapejs }}",
"date": "{{ params.date|escapejs }}",
"index": "{{ params.index }}",
"type": "{{ params.type|escapejs }}",
"mtype": "{{ params.mtype|escapejs }}",
"nick": "{{ params.nick|escapejs }}"}'
hx-target="#modals-here"
hx-trigger="load">
</div>
{% endif %}
<script src="{% static 'js/chart.js' %}"></script>
<script src="{% static 'tabs.js' %}"></script>
<script>
function setupTags() {
var inputTags = document.getElementById('tags');
new BulmaTagsInput(inputTags);
inputTags.BulmaTagsInput().on('before.add', function(item) {
if (item.includes(": ")) {
var spl = item.split(": ");
} else {
var spl = item.split(":");
}
var field = spl[0];
try {
var value = JSON.parse(spl[1]);
} catch {
var value = spl[1];
}
return `${field}: ${value}`;
});
inputTags.BulmaTagsInput().on('after.remove', function(item) {
var spl = item.split(": ");
var field = spl[0];
var value = spl[1].trim();
});
}
function populateSearch(field, value) {
var inputTags = document.getElementById('tags');
inputTags.BulmaTagsInput().add(field+": "+value);
//htmx.trigger("#search", "click");
}
</script>
<div class="grid-stack" id="grid-stack-main">
<div class="grid-stack-item" gs-w="7" gs-h="10" gs-y="0" gs-x="1">
<div class="grid-stack-item-content">
<nav class="panel">
<p class="panel-heading" style="padding: .2em; line-height: .5em;">
<i class="fa-solid fa-arrows-up-down-left-right has-text-grey-light"></i>
Search
</p>
<article class="panel-block is-active">
{% include 'ui/drilldown/search_partial.html' %}
</article>
</nav>
</div>
</div>
</div>
<script>
var grid = GridStack.init({
cellHeight: 20,
cellWidth: 50,
cellHeightUnit: 'px',
auto: true,
float: true,
draggable: {handle: '.panel-heading', scroll: false, appendTo: 'body'},
removable: false,
animate: true,
});
// GridStack.init();
setupTags();
// a widget is ready to be loaded
document.addEventListener('load-widget', function(event) {
let container = htmx.find('#drilldown-widget');
// get the scripts, they won't be run on the new element so we need to eval them
var scripts = htmx.findAll(container, "script");
let widgetelement = container.firstElementChild.cloneNode(true);
// check if there's an existing element like the one we want to swap
let grid_element = htmx.find('#grid-stack-main');
let existing_widget = htmx.find(grid_element, '#drilldown-widget-results');
// get the size and position attributes
if (existing_widget) {
let attrs = existing_widget.getAttributeNames();
for (let i = 0, len = attrs.length; i < len; i++) {
if (attrs[i].startsWith('gs-')) { // only target gridstack attributes
widgetelement.setAttribute(attrs[i], existing_widget.getAttribute(attrs[i]));
}
}
}
// clear the queue element
container.outerHTML = "";
// temporary workaround, other widgets can be duplicated, but not results
if (widgetelement.id == 'drilldown-widget-results') {
grid.removeWidget("drilldown-widget-{{ unique }}");
}
grid.addWidget(widgetelement);
// re-create the HTMX JS listeners, otherwise HTMX won't work inside the grid
htmx.process(widgetelement);
// update size when the widget is loaded
document.addEventListener('load-widget-results', function(evt) {
var added_widget = htmx.find(grid_element, '#drilldown-widget-results');
console.log(added_widget);
var itemContent = htmx.find(added_widget, ".control");
console.log(itemContent);
var scrollheight = itemContent.scrollHeight+80;
var verticalmargin = 0;
var cellheight = grid.opts.cellHeight;
var height = Math.ceil((scrollheight + verticalmargin) / (cellheight + verticalmargin));
var opts = {
h: height,
}
grid.update(
added_widget,
opts
);
});
// run the JS scripts inside the added element again
// for instance, this will fix the dropdown
for (var i = 0; i < scripts.length; i++) {
eval(scripts[i].innerHTML);
}
});
</script>
<div id="modals-here">
</div>
<div id="items-here">
</div>
<div id="widgets-here" style="display: none;">
</div>
<div id="results" style="display: none;">
{% if table %}
{% include 'widgets/table_results.html' %}
{% endif %}
</div>
<script>
</script>
{% endblock %}


@@ -4,7 +4,7 @@
style="display: none;" style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}"}' hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}"}'
hx-post="{% url 'chans_insights' %}" hx-post="{% url 'chans_insights' index=index %}"
hx-trigger="load" hx-trigger="load"
hx-target="#channels" hx-target="#channels"
hx-swap="outerHTML"> hx-swap="outerHTML">
@@ -13,7 +13,7 @@
style="display: none;" style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}"}' hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}"}'
hx-post="{% url 'nicks_insights' %}" hx-post="{% url 'nicks_insights' index=index %}"
hx-trigger="load" hx-trigger="load"
hx-target="#nicks" hx-target="#nicks"
hx-swap="outerHTML"> hx-swap="outerHTML">
@@ -81,7 +81,7 @@
{% if item.src == 'irc' %} {% if item.src == 'irc' %}
<button <button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_insights' %}" hx-post="{% url 'modal_insights' index=index %}"
hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}", "channel": "{{ item.channel }}"}' hx-vals='{"net": "{{ item.net }}", "nick": "{{ item.nick }}", "channel": "{{ item.channel }}"}'
hx-target="#modals-here" hx-target="#modals-here"
hx-trigger="click" hx-trigger="click"


@@ -2,39 +2,7 @@
{% load static %} {% load static %}
{% block content %} {% block content %}
{% include 'partials/notify.html' %} {% include 'partials/notify.html' %}
<script> <script src="{% static 'tabs.js' %}"></script>
// tabbed browsing for the modal
function initTabs() {
TABS.forEach((tab) => {
tab.addEventListener('click', (e) => {
let selected = tab.getAttribute('data-tab');
updateActiveTab(tab);
updateActiveContent(selected);
})
})
}
function updateActiveTab(selected) {
TABS.forEach((tab) => {
if (tab && tab.classList.contains(ACTIVE_CLASS)) {
tab.classList.remove(ACTIVE_CLASS);
}
});
selected.classList.add(ACTIVE_CLASS);
}
function updateActiveContent(selected) {
CONTENT.forEach((item) => {
if (item && item.classList.contains(ACTIVE_CLASS)) {
item.classList.remove(ACTIVE_CLASS);
}
let data = item.getAttribute('data-content');
if (data === selected) {
item.classList.add(ACTIVE_CLASS);
}
});
}
</script>
<style> <style>
.icon { border-bottom: 0px !important;} .icon { border-bottom: 0px !important;}
</style> </style>
@@ -47,7 +15,7 @@
{% csrf_token %} {% csrf_token %}
<div class="field has-addons"> <div class="field has-addons">
<div class="control is-expanded has-icons-left"> <div class="control is-expanded has-icons-left">
<input id="query_full" name="query_full" class="input" type="text" placeholder="nickname"> <input id="query_full" name="query" class="input" type="text" placeholder="nickname">
<span class="icon is-small is-left"> <span class="icon is-small is-left">
<i class="fas fa-magnifying-glass"></i> <i class="fas fa-magnifying-glass"></i>
</span> </span>
@@ -55,7 +23,7 @@
<div class="control"> <div class="control">
<button <button
class="button is-info is-fullwidth" class="button is-info is-fullwidth"
hx-post="{% url 'search_insights' %}" hx-post="{% url 'search_insights' index=index %}"
hx-trigger="click" hx-trigger="click"
hx-target="#info" hx-target="#info"
hx-swap="outerHTML"> hx-swap="outerHTML">


@@ -3,7 +3,7 @@
style="display: none;" style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}' hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-vals='{"net": "{{ net }}", "nicks": "{{ nicks }}"}' hx-vals='{"net": "{{ net }}", "nicks": "{{ nicks }}"}'
hx-post="{% url 'meta_insights' %}" hx-post="{% url 'meta_insights' index=index %}"
hx-trigger="load" hx-trigger="load"
hx-target="#meta" hx-target="#meta"
hx-swap="outerHTML"> hx-swap="outerHTML">


@@ -1,122 +0,0 @@
{% load index %}
{% load static %}
<script src="{% static 'modal.js' %}"></script>
<script>
document.addEventListener("restore-modal-scroll", function(event) {
var modalContent = document.getElementsByClassName("modal-content")[0];
var maxScroll = modalContent.scrollHeight - modalContent.offsetHeight;
var scrollpos = localStorage.getItem('scrollpos_modal_content');
if (scrollpos == 'BOTTOM') {
modalContent.scrollTop = maxScroll;
} else if (scrollpos) {
modalContent.scrollTop = scrollpos;
};
});
document.addEventListener("htmx:beforeSwap", function(event) {
var modalContent = document.getElementsByClassName("modal-content")[0];
var scrollpos = modalContent.scrollTop;
if(modalContent.scrollTop === (modalContent.scrollHeight - modalContent.offsetHeight)) {
localStorage.setItem('scrollpos_modal_content', 'BOTTOM');
} else {
localStorage.setItem('scrollpos_modal_content', scrollpos);
}
});
</script>
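The listeners above persist the modal's scroll position across HTMX swaps, storing the sentinel string `'BOTTOM'` when the pane is scrolled fully down so it stays pinned to new content after a refresh. A minimal pure-function sketch of that save/restore logic (helper names are mine):

```javascript
// Save: record the numeric scroll offset, or the 'BOTTOM' sentinel when the
// pane is at its maximum scroll, so it follows new content after a swap.
function saveScrollPos(scrollTop, scrollHeight, offsetHeight) {
  const maxScroll = scrollHeight - offsetHeight;
  return scrollTop === maxScroll ? 'BOTTOM' : scrollTop;
}

// Restore: map the sentinel back to whatever the *new* maximum scroll is,
// otherwise reapply the saved numeric offset.
function restoreScrollPos(saved, scrollHeight, offsetHeight) {
  const maxScroll = scrollHeight - offsetHeight;
  return saved === 'BOTTOM' ? maxScroll : saved;
}
```

Resolving the sentinel at restore time matters because the refreshed table may be taller than it was when the position was saved.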
<style>
#tab-content-{{ unique }} div {
display: none;
}
#tab-content-{{ unique }} div.is-active {
display: block;
}
</style>
<div id="modal" class="modal is-active is-clipped">
<div class="modal-background"></div>
<div class="modal-content">
<div class="box">
{% include 'partials/notify.html' %}
<div class="tabs is-toggle is-fullwidth is-info" id="tabs-{{ unique }}">
<ul>
<li class="is-active" data-tab="1">
<a>
<span class="icon is-small"><i class="fa-solid fa-message-arrow-down"></i></span>
<span>Scrollback</span>
</a>
</li>
<li data-tab="2">
<a>
<span class="icon is-small"><i class="fa-solid fa-messages"></i></span>
<span>Context</span>
</a>
</li>
<li data-tab="3">
<a>
<span class="icon is-small"><i class="fa-solid fa-message"></i></span>
<span>Message</span>
</a>
</li>
<li data-tab="4">
<a>
<span class="icon is-small"><i class="fa-solid fa-asterisk"></i></span>
<span>Info</span>
</a>
</li>
</ul>
</div>
<div id="tab-content-{{ unique }}">
<div class="is-active" data-content="1">
<h4 class="subtitle is-4">Scrollback of {{ channel }} on {{ net }}{{ num }}</h4>
{% include 'modals/context_table.html' %}
{% if user.is_superuser and src == 'irc' %}
<form method="PUT">
<article class="field has-addons">
<article class="control is-expanded has-icons-left">
<input id="context-input" name="msg" class="input" type="text" placeholder="Type your message here">
<span class="icon is-small is-left">
<i class="fas fa-magnifying-glass"></i>
</span>
</article>
<article class="control">
<article class="field">
<button
id="search"
class="button is-info is-fullwidth"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-put="{% url 'threshold_irc_msg' net num %}"
hx-vals='{"channel": "{{ channel }}", "nick": "{{ nick }}"}'
hx-trigger="click"
hx-target="#context-input"
hx-swap="outerHTML">
Send
</button>
</article>
</article>
</article>
</form>
{% endif %}
</div>
<div data-content="2">
<h4 class="subtitle is-4">Scrollback of {{ channel }} on {{ net }}{{ num }} around {{ ts }}</h4>
Context
</div>
<div data-content="3">
<h4 class="subtitle is-4">Message details</h4>
Message deetails
</div>
<div data-content="4">
<h4 class="subtitle is-4">Information about {{ channel }} on {{ net }}{{ num }}</h4>
info
</div>
</div>
</div>
<script>initTabs("{{ unique }}");</script>
<button class="modal-close is-large" aria-label="close"></button>
</div>
</div>


@@ -1,177 +0,0 @@
<article class="table-container" id="modal-context-table">
<table class="table is-fullwidth">
<thead>
<th></th>
<th></th>
<th></th>
</thead>
<tbody>
{% for item in object_list %}
{% if item.type == 'control' %}
<tr>
<td></td>
<td>
<span class="icon has-text-grey" data-tooltip="Hidden">
<i class="fa-solid fa-file-slash"></i>
</span>
</td>
<td>
<p class="has-text-grey">Hidden {{ item.hidden }} similar result{% if item.hidden > 1 %}s{% endif %}</p>
</td>
</tr>
{% else %}
<tr>
<td>{{ item.time }}</td>
<td>
{% if item.type != 'znc' and item.type != 'self' and query is not True %}
<article class="nowrap-parent">
<article class="nowrap-child">
{% if item.type == 'msg' %}
<span class="icon" data-tooltip="Message">
<i class="fa-solid fa-message"></i>
</span>
{% elif item.type == 'join' %}
<span class="icon" data-tooltip="Join">
<i class="fa-solid fa-person-to-portal"></i>
</span>
{% elif item.type == 'part' %}
<span class="icon" data-tooltip="Part">
<i class="fa-solid fa-person-from-portal"></i>
</span>
{% elif item.type == 'quit' %}
<span class="icon" data-tooltip="Quit">
<i class="fa-solid fa-circle-xmark"></i>
</span>
{% elif item.type == 'kick' %}
<span class="icon" data-tooltip="Kick">
<i class="fa-solid fa-user-slash"></i>
</span>
{% elif item.type == 'nick' %}
<span class="icon" data-tooltip="Nick">
<i class="fa-solid fa-signature"></i>
</span>
{% elif item.type == 'mode' %}
<span class="icon" data-tooltip="Mode">
<i class="fa-solid fa-gear"></i>
</span>
{% elif item.type == 'action' %}
<span class="icon" data-tooltip="Action">
<i class="fa-solid fa-exclamation"></i>
</span>
{% elif item.type == 'notice' %}
<span class="icon" data-tooltip="Notice">
<i class="fa-solid fa-message-code"></i>
</span>
{% elif item.type == 'conn' %}
<span class="icon" data-tooltip="Connection">
<i class="fa-solid fa-cloud-exclamation"></i>
</span>
{% elif item.type == 'znc' %}
<span class="icon" data-tooltip="ZNC">
<i class="fa-brands fa-unity"></i>
</span>
{% elif item.type == 'query' %}
<span class="icon" data-tooltip="Query">
<i class="fa-solid fa-message"></i>
</span>
{% elif item.type == 'highlight' %}
<span class="icon" data-tooltip="Highlight">
<i class="fa-solid fa-exclamation"></i>
</span>
{% elif item.type == 'who' %}
<span class="icon" data-tooltip="Who">
<i class="fa-solid fa-passport"></i>
</span>
{% elif item.type == 'topic' %}
<span class="icon" data-tooltip="Topic">
<i class="fa-solid fa-sign"></i>
</span>
{% else %}
{{ item.type }}
{% endif %}
{% if item.online is True %}
<span class="icon has-text-success has-tooltip-success" data-tooltip="Online">
<i class="fa-solid fa-circle"></i>
</span>
{% elif item.online is False %}
<span class="icon has-text-danger has-tooltip-danger" data-tooltip="Offline">
<i class="fa-solid fa-circle"></i>
</span>
{% else %}
<span class="icon has-text-warning has-tooltip-warning" data-tooltip="Unknown">
<i class="fa-solid fa-circle"></i>
</span>
{% endif %}
{% if item.src == 'irc' %}
<a
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_drilldown' %}"
hx-vals='{"net": "{{ item.net|escapejs }}", "nick": "{{ item.nick|escapejs }}", "channel": "{{ item.channel|escapejs }}"}'
hx-target="#modals-here"
hx-trigger="click"
class="has-text-black">
<span class="icon" data-tooltip="Open drilldown modal">
<i class="fa-solid fa-album"></i>
</span>
</a>
{% endif %}
</article>
<a class="nowrap-child has-text-link is-underlined" onclick="populateSearch('nick', '{{ item.nick|escapejs }}')">
{{ item.nick }}
</a>
{% if item.num_chans != '—' %}
<article class="nowrap-child">
<span class="tag">
{{ item.num_chans }}
</span>
</article>
{% endif %}
</article>
{% endif %}
{% if item.type == 'self' %}
<span class="icon has-text-primary" data-tooltip="You">
<i class="fa-solid fa-message-check"></i>
</span>
{% elif item.type == 'znc' %}
<span class="icon has-text-info" data-tooltip="ZNC">
<i class="fa-brands fa-unity"></i>
</span>
{% elif query %}
<span class="icon has-text-info" data-tooltip="Auth">
<i class="fa-solid fa-passport"></i>
</span>
{% endif %}
</td>
<td class="wrap">{{ item.msg }}</td>
</tr>
{% endif %}
{% endfor %}
</tbody>
</table>
{% if object_list %}
<div
class="modal-refresh"
style="display: none;"
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{% url 'modal_context_table' %}"
hx-vals='{"net": "{{ net }}",
"num": "{{ num }}",
"src": "{{ src }}",
"channel": "{{ channel }}",
"time": "{{ time }}",
"date": "{{ date }}",
"index": "{{ index }}",
"type": "{{ type }}",
"mtype": "{{ mtype }}",
"nick": "{{ nick }}",
"dedup": "{{ params.dedup }}"}'
hx-target="#modal-context-table"
hx-trigger="every 5s">
</div>
{% endif %}
</article>
<script>
var modal_event = new Event('restore-modal-scroll');
document.dispatchEvent(modal_event);
</script>


@@ -0,0 +1,34 @@
{% include 'partials/notify.html' %}
{% if page_title is not None %}
<h1 class="title is-4">{{ page_title }}</h1>
{% endif %}
{% if page_subtitle is not None %}
<h1 class="subtitle">{{ page_subtitle }}</h1>
{% endif %}
{% load crispy_forms_tags %}
{% load crispy_forms_bulma_field %}
<form
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-post="{{ submit_url }}"
hx-target="#modals-here"
hx-swap="innerHTML">
{% csrf_token %}
{{ form|crispy }}
{% if hide_cancel is not True %}
<button
type="button"
class="button is-light modal-close-button">
Cancel
</button>
{% endif %}
<button type="submit" class="button modal-close-button">Submit</button>
</form>


@@ -0,0 +1,45 @@
{% include 'partials/notify.html' %}
{% if page_title is not None %}
<h1 class="title is-4">{{ page_title }}</h1>
{% endif %}
{% if page_subtitle is not None %}
<h1 class="subtitle">{{ page_subtitle }}</h1>
{% endif %}
<div class="buttons">
{% if submit_url is not None %}
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-get="{{ submit_url }}"
hx-trigger="click"
hx-target="#modals-here"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-plus"></i>
</span>
<span>{{ title_singular }}</span>
</span>
</button>
{% endif %}
{% if delete_all_url is not None %}
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-delete="{{ delete_all_url }}"
hx-trigger="click"
hx-target="#modals-here"
hx-swap="innerHTML"
hx-confirm="Are you sure you wish to delete all {{ context_object_name }}?"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-xmark"></i>
</span>
<span>Delete all {{ context_object_name }} </span>
</span>
</button>
{% endif %}
</div>
{% include detail_template %}


@@ -0,0 +1,45 @@
{% include 'partials/notify.html' %}
{% if page_title is not None %}
<h1 class="title is-4">{{ page_title }}</h1>
{% endif %}
{% if page_subtitle is not None %}
<h1 class="subtitle">{{ page_subtitle }}</h1>
{% endif %}
<div class="buttons">
{% if submit_url is not None %}
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-get="{{ submit_url }}"
hx-trigger="click"
hx-target="#modals-here"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-plus"></i>
</span>
<span>{{ title_singular }}</span>
</span>
</button>
{% endif %}
{% if delete_all_url is not None %}
<button
hx-headers='{"X-CSRFToken": "{{ csrf_token }}"}'
hx-delete="{{ delete_all_url }}"
hx-trigger="click"
hx-target="#modals-here"
hx-swap="innerHTML"
hx-confirm="Are you sure you wish to delete all {{ context_object_name }}?"
class="button">
<span class="icon-text">
<span class="icon">
<i class="fa-solid fa-xmark"></i>
</span>
<span>Delete all {{ context_object_name }} </span>
</span>
</button>
{% endif %}
</div>
{% include list_template %}


@@ -0,0 +1,27 @@
{% load static %}
{% include 'partials/notify.html' %}
{% if cache is not None %}
<span class="icon has-tooltip-bottom" data-tooltip="Cached">
<i class="fa-solid fa-database"></i>
</span>
{% endif %}
fetched {{ table.data|length }}
{% if params.rule is None %} hits {% else %} rule hits for {{ params.rule }}{% endif %}
in {{ took }}ms
{% if exemption is not None %}
<span class="icon has-tooltip-bottom" data-tooltip="God mode">
<i class="fa-solid fa-book-bible"></i>
</span>
{% else %}
{% if redacted is not None %}
<span class="icon has-tooltip-bottom" data-tooltip="{{ redacted }} redacted">
<i class="fa-solid fa-mask"></i>
</span>
{% endif %}
{% endif %}
{% include 'partials/results_table.html' %}
{% include 'partials/sentiment_chart.html' %}


@@ -1,6 +1,6 @@
 <form class="skipEmptyFields" method="POST" hx-post="{% url 'search' %}"
     hx-trigger="change"
-    hx-target="#results"
+    hx-target="#widgets-here"
     hx-swap="innerHTML"
     hx-indicator="#spinner">
     {% csrf_token %}
@@ -11,7 +11,7 @@
 <input
     hx-post="{% url 'search' %}"
     hx-trigger="keyup changed delay:200ms"
-    hx-target="#results"
+    hx-target="#widgets-here"
     hx-swap="innerHTML"
     name="query"
     value="{{ params.query }}"
@@ -26,10 +26,10 @@
 <div class="field">
     <button
         id="search"
-        class="button is-info is-fullwidth"
+        class="button is-fullwidth"
         hx-post="{% url 'search' %}"
         hx-trigger="click"
-        hx-target="#results"
+        hx-target="#widgets-here"
         hx-swap="innerHTML">
         Search
     </button>
@@ -41,7 +41,7 @@
 <div class="nowrap-parent">
     <div
         data-script="on click toggle .is-hidden on #options"
-        class="button is-light has-text-link is-right nowrap-child">
+        class="button is-right nowrap-child">
         Options
     </div>
     <div class="nowrap-child">
@@ -341,60 +341,64 @@
             </div>
         </div>
     </div>
+    {% if params.rule is None %}
     <div class="column is-narrow rounded-tooltip">
         <div class="field has-addons">
             <div class="control has-icons-left">
                 <span class="select is-warning">
                     <select {% if not user.is_superuser %}disabled{% endif %} id="index" name="index">
                         {% if params.index == 'main' %}
                             <option selected value="main">Main</option>
                         {% elif params.index == None %}
                             <option selected value="main">Main</option>
                         {% else %}
                             <option value="main">Main</option>
                         {% endif %}
                         {% if params.index == 'internal' %}
                             <option selected value="internal">Internal</option>
                         {% else %}
                             <option value="internal">Internal</option>
                         {% endif %}
                         {% if params.index == 'meta' %}
                             <option selected value="meta">Meta</option>
                         {% else %}
                             <option value="meta">Meta</option>
                         {% endif %}
                         {% if params.index == 'restricted' %}
                             <option selected value="restricted">Restricted</option>
                         {% else %}
                             <option value="restricted">Restricted</option>
                         {% endif %}
                     </select>
                     <span class="icon is-small is-left">
                         <i class="fas fa-magnifying-glass"></i>
                     </span>
                 </span>
             </div>
             <p class="control">
                 <a class="button is-static">
                     index
                 </a>
             </p>
         </div>
         {% if not user.is_superuser %}
             <span class="tooltiptext tag is-danger is-light">No access</span>
         {% endif %}
     </div>
+    {% endif %}
 </div>
 </div>
 <div class="block">
     <input
         hx-trigger="change"
         hx-post="{% url 'search' %}"
-        hx-target="#results"
+        hx-target="#widgets-here"
         hx-swap="innerHTML"
         id="tags"
         class="input"
@@ -404,4 +408,9 @@
         value="{{ params.tags }}">
 </div>
 <div class="is-hidden"></div>
+{% if params.rule is not None %}
+    <div style="display:none;">
+        <input name="rule" value="{{ params.rule }}">
+    </div>
+{% endif %}
 </form>


@@ -1,4 +1,4 @@
-{% extends 'wm/magnet.html' %}
+{% extends 'wm/window.html' %}
 {% block heading %}
     Drilldown


@@ -12,8 +12,9 @@
 <div class="modal-content">
     <div class="box">
         {% block modal_content %}
+            {% include window_content %}
         {% endblock %}
-        <button class="modal-close is-large" aria-label="close"></button>
+        {% include 'partials/close-modal.html' %}
     </div>
 </div>
 </div>


@@ -0,0 +1,6 @@
{% extends "base.html" %}
{% block content %}
{% include window_content %}
{% endblock %}


@@ -3,9 +3,7 @@
 <p class="panel-heading" style="padding: .2em; line-height: .5em;">
     <i class="fa-solid fa-arrows-up-down-left-right has-text-grey-light"></i>
     {% block close_button %}
-        <i
-            class="fa-solid fa-xmark has-text-grey-light float-right"
-            data-script="on click remove the closest <nav/>"></i>
+        {% include 'partials/close-window.html' %}
     {% endblock %}
     {% block heading %}
     {% endblock %}


@@ -1,24 +1,24 @@
-<div id="drilldown-widget">
-    <div id="drilldown-widget-{{ unique }}" class="grid-stack-item" {% block widget_options %}{% endblock %}>
+<div id="widget">
+    <div id="widget-{{ unique }}" class="grid-stack-item" {% block widget_options %}gs-w="10" gs-h="1" gs-y="10" gs-x="1"{% endblock %}>
         <div class="grid-stack-item-content">
             <nav class="panel">
                 <p class="panel-heading" style="padding: .2em; line-height: .5em;">
                     <i class="fa-solid fa-arrows-up-down-left-right has-text-grey-light"></i>
                     {% block close_button %}
-                        <i
-                            class="fa-solid fa-xmark has-text-grey-light float-right"
-                            onclick='grid.removeWidget("drilldown-widget-{{ unique }}");'></i>
+                        {% include 'partials/close-widget.html' %}
                     {% endblock %}
                     <i
                         class="fa-solid fa-arrows-minimize has-text-grey-light float-right"
                         onclick='grid.compact();'></i>
                     {% block heading %}
+                        {{ title }}
                     {% endblock %}
                 </p>
                 <article class="panel-block is-active">
                     <div class="control">
                         {% block panel_content %}
+                            {% include window_content %}
                         {% endblock %}
                     </div>
                 </article>


@@ -1,8 +1,10 @@
 <magnet-block attract-distance="10" align-to="outer|center" class="floating-window">
 {% extends 'wm/panel.html' %}
 {% block heading %}
+    {{ title }}
 {% endblock %}
 {% block panel_content %}
+    {% include window_content %}
 {% endblock %}
 </magnet-block>


@@ -0,0 +1,9 @@
import orjson
from django import template

register = template.Library()


@register.filter
def pretty(data):
    return orjson.dumps(data, option=orjson.OPT_INDENT_2).decode("utf-8")
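For reference, the 2-space indentation this filter produces can be approximated with the standard library's `json` module; `pretty_json` below is a hypothetical stand-in for the `orjson`-based filter, not part of the codebase:

```python
import json


def pretty_json(data):
    # Stdlib approximation of the template filter above:
    # orjson.dumps(data, option=orjson.OPT_INDENT_2) similarly emits 2-space indents.
    return json.dumps(data, indent=2)


print(pretty_json({"rule": "example", "matched": True}))
```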


@@ -8,56 +8,485 @@
 # from siphashc import siphash
 # from sortedcontainers import SortedSet
+import uuid

 # from core import r
-from django.conf import settings
+from django.core.exceptions import ImproperlyConfigured
+from django.core.paginator import Paginator
+from django.db.models import QuerySet
+from django.http import Http404, HttpResponse, HttpResponseBadRequest
+from django.urls import reverse
+from django.views.generic.detail import DetailView
+from django.views.generic.edit import CreateView, DeleteView, UpdateView
+from django.views.generic.list import ListView
+from rest_framework.parsers import FormParser
+
+from core.util import logs
+
+log = logs.get_logger(__name__)

-class SearchDenied:
-    def __init__(self, key, value):
-        self.key = key
-        self.value = value
-
-class LookupDenied:
-    def __init__(self, key, value):
-        self.key = key
-        self.value = value
-
-def remove_defaults(query_params):
-    for field, value in list(query_params.items()):
-        if field in settings.DRILLDOWN_DEFAULT_PARAMS:
-            if value == settings.DRILLDOWN_DEFAULT_PARAMS[field]:
-                del query_params[field]
-
-def add_defaults(query_params):
-    for field, value in settings.DRILLDOWN_DEFAULT_PARAMS.items():
-        if field not in query_params:
-            query_params[field] = value
-
-def dedup_list(data, check_keys):
-    """
-    Remove duplicate dictionaries from list.
-    """
-    seen = set()
-    out = []
-    dup_count = 0
-    for x in data:
-        dedupeKey = tuple(x[k] for k in check_keys if k in x)
-        if dedupeKey in seen:
-            dup_count += 1
-            continue
-        if dup_count > 0:
-            out.append({"type": "control", "hidden": dup_count})
-            dup_count = 0
-        out.append(x)
-        seen.add(dedupeKey)
-    if dup_count > 0:
-        out.append({"type": "control", "hidden": dup_count})
-    return out
+class RestrictedViewMixin:
+    """
+    This mixin overrides two helpers in order to pass the user object to the filters.
+    get_queryset alters the objects returned for list views.
+    get_form_kwargs passes the request object to the form class. Remaining permissions
+    checks are in forms.py
+    """
+
+    allow_empty = True
+    queryset = None
+    model = None
+    paginate_by = None
+    paginate_orphans = 0
+    context_object_name = None
+    paginator_class = Paginator
+    page_kwarg = "page"
+    ordering = None
+
+    def get_queryset(self, **kwargs):
+        """
+        This function is overridden to filter the objects by the requesting user.
+        """
        if self.queryset is not None:
            queryset = self.queryset
            if isinstance(queryset, QuerySet):
                # queryset = queryset.all()
                queryset = queryset.filter(user=self.request.user)
        elif self.model is not None:
            queryset = self.model._default_manager.filter(user=self.request.user)
        else:
            raise ImproperlyConfigured(
                "%(cls)s is missing a QuerySet. Define "
                "%(cls)s.model, %(cls)s.queryset, or override "
                "%(cls)s.get_queryset()." % {"cls": self.__class__.__name__}
            )
        if hasattr(self, "get_ordering"):
            ordering = self.get_ordering()
            if ordering:
                if isinstance(ordering, str):
                    ordering = (ordering,)
                queryset = queryset.order_by(*ordering)
        return queryset

    def get_form_kwargs(self):
        """Passes the request object to the form class.
        This is necessary to only display members that belong to a given user"""
        kwargs = super().get_form_kwargs()
        kwargs["request"] = self.request
        return kwargs
class ObjectNameMixin(object):
    def __init__(self, *args, **kwargs):
        if self.model is None:
            self.title = self.context_object_name.title()
            self.title_singular = self.context_object_name_singular.title()
        else:
            self.title_singular = self.model._meta.verbose_name.title()  # Hook
            self.context_object_name_singular = self.title_singular.lower()  # hook
            self.title = self.model._meta.verbose_name_plural.title()  # Hooks
            self.context_object_name = self.title.lower()  # hooks

        self.context_object_name = self.context_object_name.replace(" ", "")
        self.context_object_name_singular = (
            self.context_object_name_singular.replace(" ", "")
        )
        super().__init__(*args, **kwargs)
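The naming derivation in ObjectNameMixin can be seen in isolation; a minimal sketch, assuming a model verbose name like "notification rule" (the `derive_names` helper is illustrative, not part of the codebase):

```python
def derive_names(verbose_name: str, verbose_name_plural: str):
    # Mirrors ObjectNameMixin: title-case for display titles, then
    # lowercased and space-stripped for template context keys.
    title_singular = verbose_name.title()
    title = verbose_name_plural.title()
    context_object_name = title.lower().replace(" ", "")
    context_object_name_singular = title_singular.lower().replace(" ", "")
    return title, title_singular, context_object_name, context_object_name_singular


print(derive_names("notification rule", "notification rules"))
```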
class ObjectList(RestrictedViewMixin, ObjectNameMixin, ListView):
    allowed_types = ["modal", "widget", "window", "page"]
    window_content = "window-content/objects.html"
    list_template = None
    page_title = None
    page_subtitle = None

    list_url_name = None
    # WARNING: TAKEN FROM locals()
    list_url_args = ["type"]
    submit_url_name = None
    delete_all_url_name = None

    widget_options = None

    # copied from BaseListView
    def get(self, request, *args, **kwargs):
        type = kwargs.get("type", None)
        if not type:
            return HttpResponseBadRequest("No type specified")
        if type not in self.allowed_types:
            return HttpResponseBadRequest("Invalid type specified")
        self.request = request
        self.object_list = self.get_queryset(**kwargs)
        if isinstance(self.object_list, HttpResponse):
            return self.object_list
        if isinstance(self.object_list, HttpResponseBadRequest):
            return self.object_list
        allow_empty = self.get_allow_empty()
        self.template_name = f"wm/{type}.html"
        unique = str(uuid.uuid4())[:8]
        list_url_args = {}
        for arg in self.list_url_args:
            if arg in locals():
                list_url_args[arg] = locals()[arg]
            elif arg in kwargs:
                list_url_args[arg] = kwargs[arg]
        orig_type = type
        if type == "page":
            type = "modal"
        if not allow_empty:
            # When pagination is enabled and object_list is a queryset,
            # it's better to do a cheap query than to load the unpaginated
            # queryset in memory.
            if self.get_paginate_by(self.object_list) is not None and hasattr(
                self.object_list, "exists"
            ):
                is_empty = not self.object_list.exists()
            else:
                is_empty = not self.object_list
            if is_empty:
                raise Http404("Empty list")
        context = self.get_context_data()
        context["title"] = self.title + f" ({type})"
        context["title_singular"] = self.title_singular
        context["unique"] = unique
        context["window_content"] = self.window_content
        context["list_template"] = self.list_template
        context["page_title"] = self.page_title
        context["page_subtitle"] = self.page_subtitle
        context["type"] = type
        context["context_object_name"] = self.context_object_name
        context["context_object_name_singular"] = self.context_object_name_singular
        if self.submit_url_name is not None:
            context["submit_url"] = reverse(self.submit_url_name, kwargs={"type": type})
        if self.list_url_name is not None:
            context["list_url"] = reverse(self.list_url_name, kwargs=list_url_args)
        if self.delete_all_url_name:
            context["delete_all_url"] = reverse(self.delete_all_url_name)
        if self.widget_options:
            context["widget_options"] = self.widget_options

        # Return partials for HTMX
        if self.request.htmx:
            if request.headers["HX-Target"] == self.context_object_name + "-table":
                self.template_name = self.list_template
            elif orig_type == "page":
                self.template_name = self.list_template
            else:
                context["window_content"] = self.list_template
        return self.render_to_response(context)
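The `list_url_args` gathering above (flagged with `# WARNING: TAKEN FROM locals()`) prefers a local variable and falls back to the resolver's URL kwargs; a standalone sketch of that pattern (`gather_args` is a hypothetical helper for illustration):

```python
def gather_args(arg_names, local_vars, kwargs):
    # Mirrors ObjectList.get: take each named argument from the local
    # scope if present, otherwise from the URL kwargs.
    gathered = {}
    for arg in arg_names:
        if arg in local_vars:
            gathered[arg] = local_vars[arg]
        elif arg in kwargs:
            gathered[arg] = kwargs[arg]
    return gathered


print(gather_args(["type", "pk"], {"type": "widget"}, {"pk": 4}))
```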
class ObjectCreate(RestrictedViewMixin, ObjectNameMixin, CreateView):
    allowed_types = ["modal", "widget", "window", "page"]
    window_content = "window-content/object-form.html"
    parser_classes = [FormParser]
    page_title = None
    page_subtitle = None

    model = None

    submit_url_name = None
    submit_url_args = ["type"]

    request = None

    # Whether to hide the cancel button in the form
    hide_cancel = False

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.title = "Create " + self.context_object_name_singular

    def post_save(self, obj):
        pass

    def form_valid(self, form):
        obj = form.save(commit=False)
        if self.request is None:
            raise Exception("Request is None")
        obj.user = self.request.user
        obj.save()
        form.save_m2m()
        self.post_save(obj)
        context = {"message": "Object created", "class": "success"}
        response = self.render_to_response(context)
        response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

    def form_invalid(self, form):
        """If the form is invalid, render the invalid form."""
        return self.get(self.request, **self.kwargs, form=form)

    def get(self, request, *args, **kwargs):
        type = kwargs.get("type", None)
        if not type:
            return HttpResponseBadRequest("No type specified")
        if type not in self.allowed_types:
            return HttpResponseBadRequest("Invalid type specified")
        self.template_name = f"wm/{type}.html"
        unique = str(uuid.uuid4())[:8]
        self.request = request
        self.kwargs = kwargs
        if type == "widget":
            self.hide_cancel = True
        if type == "page":
            type = "modal"
        self.object = None
        submit_url_args = {}
        for arg in self.submit_url_args:
            if arg in locals():
                submit_url_args[arg] = locals()[arg]
            elif arg in kwargs:
                submit_url_args[arg] = kwargs[arg]
        submit_url = reverse(self.submit_url_name, kwargs=submit_url_args)
        context = self.get_context_data()
        form = kwargs.get("form", None)
        if form:
            context["form"] = form
        context["unique"] = unique
        context["window_content"] = self.window_content
        context["context_object_name"] = self.context_object_name
        context["context_object_name_singular"] = self.context_object_name_singular
        context["submit_url"] = submit_url
        context["type"] = type
        context["hide_cancel"] = self.hide_cancel
        if self.page_title:
            context["page_title"] = self.page_title
        if self.page_subtitle:
            context["page_subtitle"] = self.page_subtitle
        response = self.render_to_response(context)
        # response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

    def post(self, request, *args, **kwargs):
        self.request = request
        self.template_name = "partials/notify.html"
        return super().post(request, *args, **kwargs)
class ObjectRead(RestrictedViewMixin, ObjectNameMixin, DetailView):
    allowed_types = ["modal", "widget", "window", "page"]
    window_content = "window-content/object.html"
    detail_template = "partials/generic-detail.html"
    page_title = None
    page_subtitle = None

    model = None

    # submit_url_name = None
    detail_url_name = None
    # WARNING: TAKEN FROM locals()
    detail_url_args = ["type"]

    request = None

    def get(self, request, *args, **kwargs):
        type = kwargs.get("type", None)
        if not type:
            return HttpResponseBadRequest("No type specified")
        if type not in self.allowed_types:
            return HttpResponseBadRequest()
        self.template_name = f"wm/{type}.html"
        unique = str(uuid.uuid4())[:8]
        detail_url_args = {}
        for arg in self.detail_url_args:
            if arg in locals():
                detail_url_args[arg] = locals()[arg]
            elif arg in kwargs:
                detail_url_args[arg] = kwargs[arg]
        self.request = request
        self.object = self.get_object(**kwargs)
        if isinstance(self.object, HttpResponse):
            return self.object
        orig_type = type
        if type == "page":
            type = "modal"
        context = self.get_context_data()
        context["title"] = self.title + f" ({type})"
        context["title_singular"] = self.title_singular
        context["unique"] = unique
        context["window_content"] = self.window_content
        context["detail_template"] = self.detail_template
        if self.page_title:
            context["page_title"] = self.page_title
        if self.page_subtitle:
            context["page_subtitle"] = self.page_subtitle
        context["type"] = type
        context["context_object_name"] = self.context_object_name
        context["context_object_name_singular"] = self.context_object_name_singular
        if self.detail_url_name is not None:
            context["detail_url"] = reverse(
                self.detail_url_name, kwargs=detail_url_args
            )

        # Return partials for HTMX
        if self.request.htmx:
            if request.headers["HX-Target"] == self.context_object_name + "-info":
                self.template_name = self.detail_template
            elif orig_type == "page":
                self.template_name = self.detail_template
            else:
                context["window_content"] = self.detail_template
        return self.render_to_response(context)
class ObjectUpdate(RestrictedViewMixin, ObjectNameMixin, UpdateView):
    allowed_types = ["modal", "widget", "window", "page"]
    window_content = "window-content/object-form.html"
    parser_classes = [FormParser]
    page_title = None
    page_subtitle = None

    model = None

    submit_url_name = None
    submit_url_args = ["type", "pk"]

    request = None

    # Whether pk is required in the get request
    pk_required = True

    # Whether to hide the cancel button in the form
    hide_cancel = False

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.title = "Update " + self.context_object_name_singular

    def post_save(self, obj):
        pass

    def form_valid(self, form):
        obj = form.save(commit=False)
        if self.request is None:
            raise Exception("Request is None")
        obj.save()
        form.save_m2m()
        self.post_save(obj)
        context = {"message": "Object updated", "class": "success"}
        response = self.render_to_response(context)
        response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

    def form_invalid(self, form):
        """If the form is invalid, render the invalid form."""
        return self.get(self.request, **self.kwargs, form=form)

    def get(self, request, *args, **kwargs):
        self.request = request
        type = kwargs.get("type", None)
        pk = kwargs.get("pk", None)
        if not type:
            return HttpResponseBadRequest("No type specified")
        if not pk:
            if self.pk_required:
                return HttpResponseBadRequest("No pk specified")
        if type not in self.allowed_types:
            return HttpResponseBadRequest("Invalid type specified")
        self.template_name = f"wm/{type}.html"
        unique = str(uuid.uuid4())[:8]
        if type == "widget":
            self.hide_cancel = True
        if type == "page":
            type = "modal"
        self.object = self.get_object()
        submit_url_args = {}
        for arg in self.submit_url_args:
            if arg in locals():
                submit_url_args[arg] = locals()[arg]
            elif arg in kwargs:
                submit_url_args[arg] = kwargs[arg]
        submit_url = reverse(self.submit_url_name, kwargs=submit_url_args)
        context = self.get_context_data()
        form = kwargs.get("form", None)
        if form:
            context["form"] = form
        context["title"] = self.title + f" ({type})"
        context["title_singular"] = self.title_singular
        context["unique"] = unique
        context["window_content"] = self.window_content
        context["context_object_name"] = self.context_object_name
        context["context_object_name_singular"] = self.context_object_name_singular
        context["submit_url"] = submit_url
        context["type"] = type
        context["hide_cancel"] = self.hide_cancel
        if self.page_title:
            context["page_title"] = self.page_title
        if self.page_subtitle:
            context["page_subtitle"] = self.page_subtitle
        response = self.render_to_response(context)
        # response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

    def post(self, request, *args, **kwargs):
        self.request = request
        self.template_name = "partials/notify.html"
        return super().post(request, *args, **kwargs)
class ObjectDelete(RestrictedViewMixin, ObjectNameMixin, DeleteView):
    model = None
    template_name = "partials/notify.html"

    # Overridden to prevent the success URL from being used
    def delete(self, request, *args, **kwargs):
        """
        Call the delete() method on the fetched object and then redirect to the
        success URL.
        """
        self.object = self.get_object()
        # success_url = self.get_success_url()
        self.object.delete()
        context = {"message": "Object deleted", "class": "success"}
        response = self.render_to_response(context)
        response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

    # This will be used in newer Django versions; until then we get a warning
    def form_valid(self, form):
        """
        Call the delete() method on the fetched object.
        """
        self.object = self.get_object()
        self.object.delete()
        context = {"message": "Object deleted", "class": "success"}
        response = self.render_to_response(context)
        response["HX-Trigger"] = f"{self.context_object_name_singular}Event"
        return response

# from random import randint


@@ -0,0 +1,86 @@
from django.contrib.auth.mixins import LoginRequiredMixin, PermissionRequiredMixin
from django.shortcuts import render
from rest_framework.views import APIView

from core.forms import NotificationRuleForm, NotificationSettingsForm
from core.models import NotificationRule, NotificationSettings
from core.views.helpers import ObjectCreate, ObjectDelete, ObjectList, ObjectUpdate


# Notifications - we create a new notification settings object if there isn't one.
# Hence, there is only an update view, not a create view.
class NotificationsUpdate(LoginRequiredMixin, PermissionRequiredMixin, ObjectUpdate):
    permission_required = "use_rules"
    model = NotificationSettings
    form_class = NotificationSettingsForm

    page_title = "Update your notification settings"
    page_subtitle = (
        "At least the topic must be set if you want to receive notifications."
    )

    submit_url_name = "notifications_update"
    submit_url_args = ["type"]

    pk_required = False
    hide_cancel = True

    def get_object(self, **kwargs):
        notification_settings, _ = NotificationSettings.objects.get_or_create(
            user=self.request.user
        )
        return notification_settings


class RuleList(LoginRequiredMixin, ObjectList):
    list_template = "partials/rule-list.html"
    model = NotificationRule
    page_title = "List of notification rules"

    list_url_name = "rules"
    list_url_args = ["type"]

    submit_url_name = "rule_create"


class RuleCreate(LoginRequiredMixin, PermissionRequiredMixin, ObjectCreate):
    permission_required = "use_rules"
    model = NotificationRule
    form_class = NotificationRuleForm
    submit_url_name = "rule_create"


class RuleUpdate(LoginRequiredMixin, PermissionRequiredMixin, ObjectUpdate):
    permission_required = "use_rules"
    model = NotificationRule
    form_class = NotificationRuleForm
    submit_url_name = "rule_update"


class RuleDelete(LoginRequiredMixin, PermissionRequiredMixin, ObjectDelete):
    permission_required = "use_rules"
    model = NotificationRule


class RuleClear(LoginRequiredMixin, PermissionRequiredMixin, APIView):
    permission_required = "use_rules"

    def post(self, request, type, pk):
        template_name = "partials/notify.html"
        rule = NotificationRule.objects.get(pk=pk, user=request.user)
        if isinstance(rule.match, dict):
            for index in rule.match:
                rule.match[index] = False
        rule.save()
        cleared_indices = ", ".join(rule.match)
        context = {
            "message": f"Cleared match status for indices: {cleared_indices}",
            "class": "success",
        }
        response = render(request, template_name, context)
        response["HX-Trigger"] = "notificationruleEvent"
        return response
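RuleClear's core step, resetting every per-index match flag before saving, can be sketched without Django; `clear_matches` below is an illustrative helper, not part of the codebase:

```python
def clear_matches(match):
    # Reset every per-index flag, as RuleClear does before saving the rule,
    # and return the comma-joined index names used in the notify message.
    if isinstance(match, dict):
        for index in match:
            match[index] = False
        return ", ".join(match)
    return ""


match = {"main": True, "meta": False, "restricted": True}
print(clear_matches(match))  # -> main, meta, restricted
print(match)
```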


@@ -6,11 +6,11 @@ from django.conf import settings
 from django.http import HttpResponse, JsonResponse
 from django.shortcuts import render
 from django.urls import reverse
-from django.views import View
 from django_tables2 import SingleTableView
 from rest_framework.parsers import FormParser
 from rest_framework.views import APIView
+from core.db import add_defaults, remove_defaults
 from core.db.storage import db
 from core.lib.threshold import (
     annotate_num_chans,
@@ -18,7 +18,6 @@ from core.lib.threshold import (
     get_chans,
     get_users,
 )
-from core.views import helpers
 from core.views.ui.tables import DrilldownTable

 # from copy import deepcopy
@@ -97,139 +96,146 @@ def make_graph(results):
     return orjson.dumps(graph).decode("utf-8")
-def drilldown_search(request, return_context=False, template=None):
-    extra_params = {}
-    if not template:
-        template_name = "widgets/table_results.html"
-    else:
-        template_name = template
-    if request.user.is_anonymous:
-        sizes = settings.MAIN_SIZES_ANON
-    else:
-        sizes = settings.MAIN_SIZES
-    if request.GET:
-        if not request.htmx:
-            template_name = "ui/drilldown/drilldown.html"
-        query_params = request.GET.dict()
-    elif request.POST:
-        query_params = request.POST.dict()
-    else:
-        template_name = "ui/drilldown/drilldown.html"
-        params_with_defaults = {}
-        helpers.add_defaults(params_with_defaults)
-        context = {"sizes": sizes, "unique": "results", "params": params_with_defaults}
-        return render(request, template_name, context)
-    tmp_post = request.POST.dict()
-    tmp_get = request.GET.dict()
-    tmp_post = {k: v for k, v in tmp_post.items() if v and not v == "None"}
-    tmp_get = {k: v for k, v in tmp_get.items() if v and not v == "None"}
-    query_params.update(tmp_post)
-    query_params.update(tmp_get)
-    # URI we're passing to the template for linking
-    if "csrfmiddlewaretoken" in query_params:
-        del query_params["csrfmiddlewaretoken"]
-    # Parse the dates
-    if "dates" in query_params:
-        dates = parse_dates(query_params["dates"])
-        del query_params["dates"]
-        if dates:
-            if "message" in dates:
-                return render(request, template_name, dates)
-            query_params["from_date"] = dates["from_date"]
-            query_params["to_date"] = dates["to_date"]
-            query_params["from_time"] = dates["from_time"]
-            query_params["to_time"] = dates["to_time"]
-    if "query" in query_params:
-        # Remove null values
-        if query_params["query"] == "":
-            del query_params["query"]
-        # Turn the query into tags for populating the taglist
-        # tags = create_tags(query_params["query"])
-        # context["tags"] = tags
-    # else:
-    #     context = {"object_list": []}
-    if "tags" in query_params:
-        if query_params["tags"] == "":
-            del query_params["tags"]
-        else:
-            tags = parse_tags(query_params["tags"])
-            extra_params["tags"] = tags
-    context = db.query_results(request, query_params, **extra_params)
-    context["unique"] = "results"
-    # Valid sizes
-    context["sizes"] = sizes
-    # Add any default parameters to the context
-    params_with_defaults = dict(query_params)
-    helpers.add_defaults(params_with_defaults)
-    context["params"] = params_with_defaults
-    helpers.remove_defaults(query_params)
-    url_params = urllib.parse.urlencode(query_params)
-    context["client_uri"] = url_params
-    if "message" in context:
-        if return_context:
-            return context
-        response = render(request, template_name, context)
-        if request.GET:
-            if request.htmx:
-                response["HX-Push"] = reverse("home") + "?" + url_params
-        elif request.POST:
-            response["HX-Push"] = reverse("home") + "?" + url_params
-        return response
-    # Create data for chart.js sentiment graph
-    graph = make_graph(context["object_list"])
-    context["data"] = graph
-    context = make_table(context)
-    # URI we're passing to the template for linking, table fields removed
-    table_fields = ["page", "sort"]
-    clean_params = {k: v for k, v in query_params.items() if k not in table_fields}
-    clean_url_params = urllib.parse.urlencode(clean_params)
-    context["uri"] = clean_url_params
-    # unique = str(uuid.uuid4())[:8]
-    if return_context:
-        return context
-    response = render(request, template_name, context)
-    if request.GET:
-        if request.htmx:
-            response["HX-Push"] = reverse("home") + "?" + url_params
-    elif request.POST:
-        response["HX-Push"] = reverse("home") + "?" + url_params
-    return response
 class DrilldownTableView(SingleTableView):
     table_class = DrilldownTable
-    template_name = "widgets/table_results.html"
+    template_name = "wm/widget.html"
+    window_content = "window-content/results.html"
+    # htmx_partial = "partials/"
     paginate_by = settings.DRILLDOWN_RESULTS_PER_PAGE

-    def get_queryset(self, request, **kwargs):
-        context = drilldown_search(request, return_context=True)
-        # Save the context as we will need to merge other attributes later
-        self.context = context
-        if "object_list" in context:
-            return context["object_list"]
-        else:
-            return []
+    def common_request(self, request, **kwargs):
+        extra_params = {}
+        if request.user.is_anonymous:
+            sizes = settings.MAIN_SIZES_ANON
+        else:
+            sizes = settings.MAIN_SIZES
+        if request.GET:
+            self.template_name = "index.html"
+            # GET arguments in URL like ?query=xyz
+            query_params = request.GET.dict()
+            if request.htmx:
+                if request.resolver_match.url_name == "search_partial":
+                    self.template_name = "partials/results_table.html"
+        elif request.POST:
+            query_params = request.POST.dict()
+        else:
+            self.template_name = "index.html"
+            # No query, this is a fresh page load
+            # Don't try to search, since there's clearly nothing to do
+            params_with_defaults = {}
+            add_defaults(params_with_defaults)
+            context = {
+                "sizes": sizes,
+                "params": params_with_defaults,
+                "unique": "results",
+                "window_content": self.window_content,
+                "title": "Results",
+            }
+            return render(request, self.template_name, context)
+        # Merge everything together just in case
+        tmp_post = request.POST.dict()
+        tmp_get = request.GET.dict()
+        tmp_post = {k: v for k, v in tmp_post.items() if v and not v == "None"}
+        tmp_get = {k: v for k, v in tmp_get.items() if v and not v == "None"}
+        query_params.update(tmp_post)
+        query_params.update(tmp_get)
+        # URI we're passing to the template for linking
+        if "csrfmiddlewaretoken" in query_params:
+            del query_params["csrfmiddlewaretoken"]
+        # Parse the dates
+        if "dates" in query_params:
+            dates = parse_dates(query_params["dates"])
+            del query_params["dates"]
+            if dates:
+                if "message" in dates:
+                    return render(request, self.template_name, dates)
+                query_params["from_date"] = dates["from_date"]
+                query_params["to_date"] = dates["to_date"]
+                query_params["from_time"] = dates["from_time"]
+                query_params["to_time"] = dates["to_time"]
+        # Remove null values
+        if "query" in query_params:
+            if query_params["query"] == "":
+                del query_params["query"]
+        # Remove null tags values
+        if "tags" in query_params:
+            if query_params["tags"] == "":
+                del query_params["tags"]
+            else:
+                # Parse the tags and populate cast to pass to search function
+                tags = parse_tags(query_params["tags"])
+                extra_params["tags"] = tags
+        if "dedup" in query_params:
+            if query_params["dedup"] == "on":
+                extra_params["dedup"] = True
+            else:
+                extra_params["dedup"] = False
+        else:
+            extra_params["dedup"] = False
+        context = db.query_results(request, query_params, **extra_params)
+        # Unique is for identifying the widgets.
+        # We don't want a random one since we only want one results pane.
+        context["unique"] = "results"
+        context["window_content"] = self.window_content
+        context["title"] = "Results"
+        # Valid sizes
+        context["sizes"] = sizes
+        # Add any default parameters to the context
+        params_with_defaults = dict(query_params)
+        add_defaults(params_with_defaults)
+        context["params"] = params_with_defaults
# Remove anything that we or the user set to a default for
# pretty URLs
remove_defaults(query_params)
url_params = urllib.parse.urlencode(query_params)
context["client_uri"] = url_params
# There's an error
if "message" in context:
response = render(request, self.template_name, context)
# Still push the URL so they can share it to get assistance
if request.GET:
if request.htmx:
response["HX-Push"] = reverse("home") + "?" + url_params
elif request.POST:
response["HX-Push"] = reverse("home") + "?" + url_params
return response
# Create data for chart.js sentiment graph
graph = make_graph(context["object_list"])
context["data"] = graph
# Create the table
context = make_table(context)
# URI we're passing to the template for linking, table fields removed
table_fields = ["page", "sort"]
clean_params = {k: v for k, v in query_params.items() if k not in table_fields}
clean_url_params = urllib.parse.urlencode(clean_params)
context["uri"] = clean_url_params
# unique = str(uuid.uuid4())[:8]
# self.context = context
return context
     def get(self, request, *args, **kwargs):
-        self.object_list = self.get_queryset(request)
+        self.context = self.common_request(request)
+        if isinstance(self.context, HttpResponse):
+            return self.context
+        self.object_list = self.context["object_list"]
         show = []
         show = set().union(*(d.keys() for d in self.object_list))
         allow_empty = self.get_allow_empty()
@@ -245,17 +251,17 @@ class DrilldownTableView(SingleTableView):
         else:
             is_empty = not self.object_list  # noqa
         context = self.get_context_data()
+        if isinstance(self.context, HttpResponse):
+            return self.context
         for k, v in self.context.items():
             if k not in context:
                 context[k] = v
         context["show"] = show
-        if request.method == "GET":
-            if not request.htmx:
-                self.template_name = "ui/drilldown/drilldown.html"
+        # if request.htmx:
+        #     self.template_name = self.window_content
+        # if request.method == "GET":
+        #     if not request.htmx:
+        #         self.template_name = "ui/drilldown/drilldown.html"
         response = self.render_to_response(context)
         # if not request.method == "GET":
         if "client_uri" in context:
@@ -266,15 +272,15 @@ class DrilldownTableView(SingleTableView):
         return self.get(request, *args, **kwargs)


-class Drilldown(View):
-    template_name = "ui/drilldown/drilldown.html"
-    plan_name = "drilldown"
-
-    def get(self, request):
-        return drilldown_search(request)
-
-    def post(self, request):
-        return drilldown_search(request)
+# class Drilldown(View):
+#     template_name = "ui/drilldown/drilldown.html"
+#     plan_name = "drilldown"
+#
+#     def get(self, request):
+#         return drilldown_search(request)
+#
+#     def post(self, request):
+#         return drilldown_search(request)


 class DrilldownContextModal(APIView):
@@ -389,19 +395,6 @@ class DrilldownContextModal(APIView):
         if "message" in results:
             return render(request, self.template_name, results)
-        # if settings.HASHING:  # we probably want to see the tokens
-        #     if query_params["source"] not in settings.SAFE_SOURCES:
-        #         if not request.user.has_perm("core.bypass_hashing"):
-        #             for index, item in enumerate(results["object_list"]):
-        #                 if "tokens" in item:
-        #                     results["object_list"][index]["msg"] = results[
-        #                         "object_list"
-        #                     ][index].pop("tokens")
-        #                     # item["msg"] = item.pop("tokens")
-        # Make the time nicer
-        # for index, item in enumerate(results["object_list"]):
-        #     results["object_list"][index]["time"] = item["time"]+"SSS"
         unique = str(uuid.uuid4())[:8]
         context = {
             "net": query_params["net"],
@@ -449,45 +442,18 @@ class ThresholdInfoModal(APIView):
         nick = request.data["nick"]
         channel = request.data["channel"]

-        # SAFE BLOCK #
-        # Lookup the hash values but don't disclose them to the user
-        # if settings.HASHING:
-        #     SAFE_PARAMS = request.data.dict()
-        #     hash_lookup(request.user, SAFE_PARAMS)
         channels = get_chans(net, [nick])
-        print("CHANNELS", channels)
-        users = get_users(net, [nick])
-        print("USERS", users)
+        users = get_users(net, [channel])
         num_users = annotate_num_users(net, channels)
-        print("NUM_USERS", num_users)
         num_chans = annotate_num_chans(net, users)
-        print("NUM_CHANS", num_chans)
         if channels:
             inter_users = get_users(net, channels)
         else:
             inter_users = []
-        print("INTER_USERS", inter_users)
         if users:
             inter_chans = get_chans(net, users)
         else:
             inter_chans = []
-        print("INTER_CHANS", inter_chans)
-        # if settings.HASHING:
-        #     hash_list(request.user, inter_chans)
-        #     hash_list(request.user, inter_users)
-        #     hash_list(request.user, num_chans, hash_keys=True)
-        #     hash_list(request.user, num_users, hash_keys=True)
-        #     hash_list(request.user, channels)
-        #     hash_list(request.user, users)
-        # if settings.RANDOMISATION:
-        #     randomise_list(request.user, num_chans)
-        #     randomise_list(request.user, num_users)
-        # SAFE BLOCK END #
         unique = str(uuid.uuid4())[:8]
         context = {
@@ -502,5 +468,4 @@ class ThresholdInfoModal(APIView):
             "num_chans": num_chans,
             "unique": unique,
         }
-        print("CON", context)
         return render(request, self.template_name, context)


@@ -7,7 +7,7 @@ from django.views import View
 from rest_framework.parsers import FormParser
 from rest_framework.views import APIView

-from core.db.druid import query_single_result
+from core.db.storage import db
 from core.lib.meta import get_meta
 from core.lib.nicktrace import get_nicks
 from core.lib.threshold import (
@@ -23,8 +23,9 @@ class Insights(LoginRequiredMixin, PermissionRequiredMixin, View):
     template_name = "ui/insights/insights.html"
     permission_required = "use_insights"

-    def get(self, request):
-        return render(request, self.template_name)
+    def get(self, request, index):
+        context = {"index": index}
+        return render(request, self.template_name, context)


 class InsightsSearch(LoginRequiredMixin, PermissionRequiredMixin, View):
@@ -32,13 +33,16 @@ class InsightsSearch(LoginRequiredMixin, PermissionRequiredMixin, View):
     template_name = "ui/insights/info.html"
     permission_required = "use_insights"

-    def post(self, request):
+    def post(self, request, index):
         query_params = request.POST.dict()
-        if "query_full" in query_params:
-            query_params["query_full"] = "nick: " + query_params["query_full"]
-        context = query_single_result(request, query_params)
+        if "query" in query_params:
+            query_params["query"] = "nick: " + query_params["query"]
+        query_params["source"] = "all"
+        query_params["index"] = index
+        context = db.query_single_result(request, query_params)
         if not context:
             return HttpResponseForbidden()
+        context["index"] = index
         return render(request, self.template_name, context)
@@ -47,7 +51,7 @@ class InsightsChannels(LoginRequiredMixin, PermissionRequiredMixin, APIView):
     template_name = "ui/insights/channels.html"
     permission_required = "use_insights"

-    def post(self, request):
+    def post(self, request, index):
         if "net" not in request.data:
             return HttpResponse("No net")
         if "nick" not in request.data:
@@ -58,7 +62,13 @@ class InsightsChannels(LoginRequiredMixin, PermissionRequiredMixin, APIView):
         num_users = annotate_num_users(net, chans)
         if not chans:
             return HttpResponseForbidden()
-        context = {"net": net, "nick": nick, "chans": chans, "num_users": num_users}
+        context = {
+            "net": net,
+            "nick": nick,
+            "chans": chans,
+            "num_users": num_users,
+            "index": index,
+        }
         return render(request, self.template_name, context)
@@ -67,7 +77,7 @@ class InsightsNicks(LoginRequiredMixin, PermissionRequiredMixin, APIView):
     template_name = "ui/insights/nicks.html"
     permission_required = "use_insights"

-    def post(self, request):
+    def post(self, request, index):
         if "net" not in request.data:
             return HttpResponse("No net")
         if "nick" not in request.data:
@@ -82,7 +92,13 @@ class InsightsNicks(LoginRequiredMixin, PermissionRequiredMixin, APIView):
         online = annotate_online(net, nicks)
         if not nicks:
             return HttpResponseForbidden()
-        context = {"net": net, "nick": nick, "nicks": nicks, "online": online}
+        context = {
+            "net": net,
+            "nick": nick,
+            "nicks": nicks,
+            "online": online,
+            "index": index,
+        }
         return render(request, self.template_name, context)
@@ -91,7 +107,7 @@ class InsightsMeta(LoginRequiredMixin, PermissionRequiredMixin, APIView):
     template_name = "ui/insights/meta.html"
     permission_required = "use_insights"

-    def post(self, request):
+    def post(self, request, index):
         if "net" not in request.data:
             return HttpResponse("No net")
         if "nicks" not in request.data:
@@ -99,6 +115,10 @@ class InsightsMeta(LoginRequiredMixin, PermissionRequiredMixin, APIView):
         net = request.data["net"]
         nicks = request.data["nicks"]
         nicks = literal_eval(nicks)
+        # Check the user has permissions to use the meta index
+        if not request.user.has_perm("core.index_meta"):
+            return HttpResponseForbidden()
         meta = get_meta(request, net, nicks)
         unique_values = {}
         # Create a map of unique values for each key for each nick
@@ -122,7 +142,7 @@ class InsightsMeta(LoginRequiredMixin, PermissionRequiredMixin, APIView):
                     meta_dedup[k].add(v)
                     unique_values[nick][k].remove(v)
-        context = {"net": net, "nicks": nicks, "meta": meta_dedup}
+        context = {"net": net, "nicks": nicks, "meta": meta_dedup, "index": index}
         return render(request, self.template_name, context)
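The `meta_dedup` loop in `InsightsMeta` builds a map of unique values per metadata key across all of a user's nicks. A hypothetical standalone sketch of that collapse step (the helper name `dedup_meta` and the sample data are illustrative, not from the codebase):

```python
def dedup_meta(meta):
    """Collapse per-nick metadata ({nick: {key: [values]}}) into one
    map of unique values per key, as the meta_dedup loop does above.
    Hypothetical standalone version for illustration."""
    deduped = {}
    for nick, fields in meta.items():
        for key, values in fields.items():
            # A set per key removes duplicates seen across nicks
            deduped.setdefault(key, set()).update(v for v in values if v)
    return deduped


meta = {
    "alice": {"realname": ["Alice", "alice"], "host": ["a.example"]},
    "al1ce": {"realname": ["Alice"], "host": ["b.example"]},
}
print(dedup_meta(meta))
```

Using sets here means the template only ever renders each distinct value once, no matter how many nicks reported it.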
@@ -131,7 +151,7 @@ class InsightsInfoModal(LoginRequiredMixin, PermissionRequiredMixin, APIView):
     template_name = "modals/drilldown.html"
     permission_required = "use_insights"

-    def post(self, request):
+    def post(self, request, index):
         if "net" not in request.data:
             return JsonResponse({"success": False})
         if "nick" not in request.data:
@@ -163,5 +183,6 @@ class InsightsInfoModal(LoginRequiredMixin, PermissionRequiredMixin, APIView):
             "inter_users": inter_users,
             "num_users": num_users,
             "num_chans": num_chans,
+            "index": index,
         }
         return render(request, self.template_name, context)


@@ -12,10 +12,13 @@ def format_header(self):
     header = header.lower()
     header = header.title()
     if header != "Ident":
+        header = header.replace("Uuid", "UUID")
         header = header.replace("Id", "ID")
         header = header.replace("id", "ID")
     if header == "Ts":
         header = "TS"
+    if header == "Match Ts":
+        header = "Match TS"
     header = header.replace("Nsfw", "NSFW")
     return header
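The header prettifier above can be sketched as a standalone function. Note the leading underscore-to-space step is an assumption (it happens before the lines shown in this hunk), and the acronym fixups mirror the diff:

```python
def format_header(header):
    """Standalone sketch of the table-header prettifier above.
    Assumes raw column names like 'rule_uuid'; the underscore
    replacement is inferred, the acronym fixes follow the diff."""
    header = header.replace("_", " ")
    header = header.lower()
    header = header.title()
    if header != "Ident":
        header = header.replace("Uuid", "UUID")
        header = header.replace("Id", "ID")
        header = header.replace("id", "ID")
    if header == "Ts":
        header = "TS"
    if header == "Match Ts":
        header = "Match TS"
    header = header.replace("Nsfw", "NSFW")
    return header


print(format_header("rule_uuid"))  # Rule UUID
print(format_header("match_ts"))   # Match TS
```

The `Uuid` replacement has to run before the `Id` one: applying `Id -> ID` first would turn `Uuid` into `UuID` and the `Uuid -> UUID` pass would then miss it.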
@@ -64,19 +67,23 @@ class DrilldownTable(Table):
     mtype = Column()
     realname = Column()
     server = Column()
-    mtype = Column()
-    # tokens = Column()
+    tokens = Column()
     lang_code = Column()
     lang_name = Column()
-    words_noun = Column()
-    words_adj = Column()
-    words_verb = Column()
-    words_adv = Column()
+    # words_noun = Column()
+    # words_adj = Column()
+    # words_verb = Column()
+    # words_adv = Column()
     hidden = Column()
     filename = Column()
     file_md5 = Column()
     file_ext = Column()
     file_size = Column()
+    rule_uuid = Column()
+    index = Column()
+    meta = Column()
+    match_ts = Column()
+    mode = Column()
     template_name = "ui/drilldown/table_results.html"
     paginate_by = settings.DRILLDOWN_RESULTS_PER_PAGE


@@ -1,67 +1,194 @@
-version: "2"
+version: "2.2"
 services:
   app:
     image: pathogen/neptune:latest
-    build: ./docker
+    container_name: neptune
+    build:
+      context: .
+      args:
+        OPERATION: ${OPERATION}
     volumes:
       - ${PORTAINER_GIT_DIR}:/code
-      - ${NEPTUNE_LOCAL_SETTINGS}:/code/app/local_settings.py
-      - ${NEPTUNE_DATABASE_FILE}:/code/db.sqlite3
-    ports:
-      - "${NEPTUNE_PORT}:8000"
+      - ${PORTAINER_GIT_DIR}/docker/uwsgi.ini:/conf/uwsgi.ini
+      - ${APP_LOCAL_SETTINGS}:/code/app/local_settings.py
+      - ${APP_DATABASE_FILE}:/code/db.sqlite3
+      - neptune_static:${STATIC_ROOT}
     env_file:
-      - .env
+      - stack.env
     volumes_from:
       - tmp
     depends_on:
       redis:
         condition: service_healthy
       migration:
         condition: service_started
+      collectstatic:
+        condition: service_started
+    networks:
+      - default
+      - pathogen
+      - elastic
+
+  processing:
+    image: pathogen/neptune:latest
+    container_name: processing_neptune
+    build:
+      context: .
+      args:
+        OPERATION: ${OPERATION}
+    command: sh -c '. /venv/bin/activate && python manage.py processing'
+    volumes:
+      - ${PORTAINER_GIT_DIR}:/code
+      - ${PORTAINER_GIT_DIR}/docker/uwsgi.ini:/conf/uwsgi.ini
+      - ${APP_LOCAL_SETTINGS}:/code/app/local_settings.py
+      - ${APP_DATABASE_FILE}:/code/db.sqlite3
+      - neptune_static:${STATIC_ROOT}
+    env_file:
+      - stack.env
+    volumes_from:
+      - tmp
+    depends_on:
+      redis:
+        condition: service_healthy
+      migration:
+        condition: service_started
+      collectstatic:
+        condition: service_started
+    networks:
+      - default
+      - pathogen
+      - elastic
+
+  scheduling:
+    image: pathogen/neptune:latest
+    container_name: scheduling_neptune
+    build:
+      context: .
+      args:
+        OPERATION: ${OPERATION}
+    command: sh -c '. /venv/bin/activate && python manage.py scheduling'
+    volumes:
+      - ${PORTAINER_GIT_DIR}:/code
+      - ${PORTAINER_GIT_DIR}/docker/uwsgi.ini:/conf/uwsgi.ini
+      - ${APP_LOCAL_SETTINGS}:/code/app/local_settings.py
+      - ${APP_DATABASE_FILE}:/code/db.sqlite3
+      - neptune_static:${STATIC_ROOT}
+    env_file:
+      - stack.env
+    volumes_from:
+      - tmp
+    depends_on:
+      redis:
+        condition: service_healthy
+      migration:
+        condition: service_started
+      collectstatic:
+        condition: service_started
+    networks:
+      - default
+      - pathogen
+      - elastic
+
   migration:
     image: pathogen/neptune:latest
+    container_name: migration_neptune
+    build:
+      context: .
+      args:
+        OPERATION: ${OPERATION}
     command: sh -c '. /venv/bin/activate && python manage.py migrate --noinput'
     volumes:
       - ${PORTAINER_GIT_DIR}:/code
-      - ${NEPTUNE_LOCAL_SETTINGS}:/code/app/local_settings.py
-      - ${NEPTUNE_DATABASE_FILE}:/code/db.sqlite3
+      - ${APP_LOCAL_SETTINGS}:/code/app/local_settings.py
+      - ${APP_DATABASE_FILE}:/code/db.sqlite3
+      - neptune_static:${STATIC_ROOT}
     volumes_from:
       - tmp
     depends_on:
       redis:
         condition: service_healthy
-  # pyroscope:
-  #   image: pyroscope/pyroscope
-  #   environment:
-  #     - PYROSCOPE_LOG_LEVEL=debug
-  #   ports:
-  #     - '4040:4040'
-  #   command:
-  #     - 'server'
+
+  collectstatic:
+    image: pathogen/neptune:latest
+    container_name: collectstatic_neptune
+    build:
+      context: .
+      args:
+        OPERATION: ${OPERATION}
+    command: sh -c '. /venv/bin/activate && python manage.py collectstatic --noinput'
+    volumes:
+      - ${PORTAINER_GIT_DIR}:/code
+      - ${APP_LOCAL_SETTINGS}:/code/app/local_settings.py
+      - ${APP_DATABASE_FILE}:/code/db.sqlite3
+      - neptune_static:${STATIC_ROOT}
+    volumes_from:
+      - tmp
+    env_file:
+      - stack.env
+    depends_on:
+      redis:
+        condition: service_healthy
+
+  nginx:
+    image: nginx:latest
+    container_name: nginx_neptune
+    ports:
+      - ${APP_PORT}:9999
+    ulimits:
+      nproc: 65535
+      nofile:
+        soft: 65535
+        hard: 65535
+    volumes:
+      - ${PORTAINER_GIT_DIR}:/code
+      - ${PORTAINER_GIT_DIR}/docker/nginx/conf.d/${OPERATION}.conf:/etc/nginx/conf.d/default.conf
+      - neptune_static:${STATIC_ROOT}
+    volumes_from:
+      - tmp
+    networks:
+      - default
+      - pathogen
+    depends_on:
+      app:
+        condition: service_started
+
   tmp:
     image: busybox
-    command: chmod -R 777 /var/run/redis
+    container_name: tmp_neptune
+    command: chmod -R 777 /var/run/socks
     volumes:
-      - /var/run/redis
+      - /var/run/socks
+
   redis:
     image: redis
+    container_name: redis_neptune
     command: redis-server /etc/redis.conf
+    ulimits:
+      nproc: 65535
+      nofile:
+        soft: 65535
+        hard: 65535
     volumes:
       - ${PORTAINER_GIT_DIR}/docker/redis.conf:/etc/redis.conf
     volumes_from:
       - tmp
     healthcheck:
-      test: "redis-cli -s /var/run/redis/redis.sock ping"
+      test: "redis-cli -s /var/run/socks/redis.sock ping"
       interval: 2s
       timeout: 2s
       retries: 15
+    networks:
+      - default
+      - pathogen
+
 networks:
   default:
-    external:
-      name: pathogen
+    driver: bridge
+  pathogen:
+    external: true
+  elastic:
+    external: true
+
+volumes:
+  neptune_static: {}


@@ -1,60 +0,0 @@
version: "2"
services:
app:
image: pathogen/neptune:latest
build: ./docker/prod
volumes:
- ${PORTAINER_GIT_DIR}:/code
- ${PORTAINER_GIT_DIR}/docker/prod/uwsgi.ini:/conf/uwsgi.ini
- ${NEPTUNE_LOCAL_SETTINGS}:/code/app/local_settings.py
- ${NEPTUNE_DATABASE_FILE}:/code/db.sqlite3
ports:
- "${NEPTUNE_PORT}:8000" # uwsgi socket
env_file:
- ../stack.env
volumes_from:
- tmp
depends_on:
redis:
condition: service_healthy
migration:
condition: service_started
migration:
image: pathogen/neptune:latest
build: ./docker/prod
command: sh -c '. /venv/bin/activate && python manage.py migrate --noinput'
volumes:
- ${PORTAINER_GIT_DIR}:/code
- ${NEPTUNE_LOCAL_SETTINGS}:/code/app/local_settings.py
- ${NEPTUNE_DATABASE_FILE}:/code/db.sqlite3
volumes_from:
- tmp
depends_on:
redis:
condition: service_healthy
tmp:
image: busybox
command: chmod -R 777 /var/run/redis
volumes:
- /var/run/redis
redis:
image: redis
command: redis-server /etc/redis.conf
volumes:
- ${PORTAINER_GIT_DIR}/docker/redis.conf:/etc/redis.conf
volumes_from:
- tmp
healthcheck:
test: "redis-cli -s /var/run/redis/redis.sock ping"
interval: 2s
timeout: 2s
retries: 15
networks:
default:
external:
name: pathogen


@@ -0,0 +1,23 @@
upstream django {
#server app:8000;
#server unix:///var/run/socks/app.sock;
server app:8000;
}
server {
listen 9999;
location = /favicon.ico { access_log off; log_not_found off; }
location /static/ {
root /conf;
}
location / {
proxy_pass http://django;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $host;
}
}


@@ -0,0 +1,24 @@
upstream django {
server app:8000;
#server unix:///var/run/socks/app.sock;
}
server {
listen 9999;
location = /favicon.ico { access_log off; log_not_found off; }
location /static/ {
root /conf;
}
location / {
include /etc/nginx/uwsgi_params; # the uwsgi_params file you installed
uwsgi_pass django;
uwsgi_param Host $host;
uwsgi_param X-Real-IP $remote_addr;
uwsgi_param X-Forwarded-For $proxy_add_x_forwarded_for;
uwsgi_param X-Forwarded-Proto $http_x_forwarded_proto;
}
}


@@ -1,21 +0,0 @@
# syntax=docker/dockerfile:1
FROM python:3
RUN useradd -d /code pathogen
RUN mkdir /code
RUN chown pathogen:pathogen /code
RUN mkdir /conf
RUN chown pathogen:pathogen /conf
RUN mkdir /venv
RUN chown pathogen:pathogen /venv
USER pathogen
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
WORKDIR /code
COPY requirements.prod.txt /code/
RUN python -m venv /venv
RUN . /venv/bin/activate && pip install -r requirements.prod.txt
CMD . /venv/bin/activate && uwsgi --ini /conf/uwsgi.ini


@@ -1,19 +0,0 @@
wheel
django
django-crispy-forms
crispy-bulma
#opensearch-py
stripe
django-rest-framework
numpy
uwsgi
django-tables2
django-tables2-bulma-template
django-htmx
cryptography
siphashc
redis
sortedcontainers
django-debug-toolbar
django-debug-toolbar-template-profiler
orjson


@@ -1,2 +1,5 @@
-unixsocket /var/run/redis/redis.sock
+unixsocket /var/run/socks/redis.sock
 unixsocketperm 777
# For Monolith PubSub
port 6379


@@ -1,18 +0,0 @@
wheel
django
django-crispy-forms
crispy-bulma
#opensearch-py
stripe
django-rest-framework
numpy
django-tables2
django-tables2-bulma-template
django-htmx
cryptography
siphashc
redis
sortedcontainers
django-debug-toolbar
django-debug-toolbar-template-profiler
orjson


@@ -5,9 +5,8 @@ env=DJANGO_SETTINGS_MODULE=app.settings
 master=1
 pidfile=/tmp/project-master.pid
 socket=0.0.0.0:8000
-processes=5
 harakiri=20
-max-requests=5000
+max-requests=100000
 vacuum=1
 home=/venv
+processes=12


@@ -1,9 +1,10 @@
 wheel
+uwsgi
 django
 pre-commit
 django-crispy-forms
 crispy-bulma
-#opensearch-py
+elasticsearch[async]
 stripe
 django-rest-framework
 numpy
@@ -17,3 +18,5 @@ sortedcontainers
django-debug-toolbar django-debug-toolbar
django-debug-toolbar-template-profiler django-debug-toolbar-template-profiler
orjson orjson
msgpack
apscheduler


@@ -1,4 +1,6 @@
-NEPTUNE_PORT=5000
-PORTAINER_GIT_DIR=..
-NEPTUNE_LOCAL_SETTINGS=../app/local_settings.py
-NEPTUNE_DATABASE_FILE=../db.sqlite3
+APP_PORT=5000
+PORTAINER_GIT_DIR=.
+APP_LOCAL_SETTINGS=./app/local_settings.py
+APP_DATABASE_FILE=./db.sqlite3
STATIC_ROOT=/conf/static
OPERATION=dev