Compare commits


No commits in common. "46ef515d4fcc21d4efe1a3f0d1e0477da82aeb61" and "588f788e311e6e2d21786a015dcb9853c7024a8b" have entirely different histories.

67 changed files with 2156 additions and 11754 deletions

.gitignore vendored

@@ -1,9 +1,2 @@
 # Claude Code confidential files
 .claude/
-# Codex local workspace files
-.codex/
-# Python cache artifacts
-__pycache__/
-*.pyc


@ -1 +1 @@
25 main


@ -1,249 +0,0 @@
# TODO: Cove Data Protection Integration
**Date:** 2026-02-23
**Status:** Research COMPLETED — Ready for implementation
**Priority:** Medium
---
## 🎯 Goal
Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring via scheduled API polling. The integration runs server-side within the Backupchecks web application.
**Challenge:** Cove does NOT work with email notifications like other backup systems (Veeam, Synology, NAKIVO). We use the JSON-RPC API instead.
---
## ✅ Research Phase — COMPLETED (2026-02-23)
### Confirmed findings
- **API endpoint:** `https://api.backup.management/jsonapi`
- **Protocol:** JSON-RPC 2.0, POST requests, `Content-Type: application/json`
- **Authentication:** Login method returns a `visa` token — include in all subsequent calls
- **PartnerId:** `139124` (MCC Automatisering) — required for all queries; the partner name is NOT needed
- **All required data is available** — the earlier blockers (D02/D03 errors) were caused by legacy column codes. Replaced by D10/D11.
- **No MSP-level restriction** — every API user has the same access. All sub-customers are reachable via the top-level account.
- **No EnumerateAccounts needed** — `EnumerateAccountStatistics` with the right columns returns everything we need.
### Official documentation (from N-able support, Andrew Robinson)
- **Getting Started:** https://developer.n-able.com/n-able-cove/docs/getting-started
- **Column Codes:** https://developer.n-able.com/n-able-cove/docs/column-codes
- **Construct a Call:** https://developer.n-able.com/n-able-cove/docs/construct-a-json-rpc-api-call
- **Authorization:** https://developer.n-able.com/n-able-cove/docs/authorization
---
## 📡 API — Confirmed behavior
### Step 1: Login
```json
POST https://api.backup.management/jsonapi
Content-Type: application/json
{
  "jsonrpc": "2.0",
  "id": "jsonrpc",
  "method": "Login",
  "params": {
    "username": "{{cove_api_username}}",
    "password": "{{cove_api_password}}"
  }
}
```
**The response contains:**
- `visa` — session token (include in all subsequent calls)
- `result.PartnerId` — the partner ID (139124 for MCC Automatisering)
### Step 2: EnumerateAccountStatistics
```json
{
  "jsonrpc": "2.0",
  "visa": "{{visa}}",
  "id": "jsonrpc",
  "method": "EnumerateAccountStatistics",
  "params": {
    "query": {
      "PartnerId": 139124,
      "StartRecordNumber": 0,
      "RecordsCount": 250,
      "Columns": [
        "I1", "I18", "I8", "I78",
        "D09F00", "D09F09", "D09F15", "D09F08",
        "D1F00", "D1F15",
        "D10F00", "D10F15",
        "D11F00", "D11F15",
        "D19F00", "D19F15",
        "D20F00", "D20F15",
        "D5F00", "D5F15",
        "D23F00", "D23F15"
      ]
    }
  }
}
```
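The two calls above can be sketched in Python. This is a minimal sketch, not production code: the payload builders mirror the JSON bodies shown above, while `_post` is a hypothetical helper using only the standard library.

```python
import json
import urllib.request

API_URL = "https://api.backup.management/jsonapi"


def login_payload(username: str, password: str) -> dict:
    """Build the JSON-RPC Login request body (Step 1)."""
    return {
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "method": "Login",
        "params": {"username": username, "password": password},
    }


def stats_payload(visa: str, partner_id: int, columns: list[str],
                  start: int = 0, count: int = 250) -> dict:
    """Build the EnumerateAccountStatistics request body (Step 2)."""
    return {
        "jsonrpc": "2.0",
        "visa": visa,
        "id": "jsonrpc",
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": columns,
            }
        },
    }


def _post(payload: dict) -> dict:
    """POST a JSON-RPC payload and return the decoded JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

In the real client the `visa` token from the Login response would be fed into `stats_payload`, and pagination handled by advancing `start` until fewer than `count` records come back.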
---
## 📋 Column codes — what they mean
### Device info
| Column | Meaning | Type |
|--------|---------|------|
| `I1` | Device name (internal, unique) | String |
| `I18` | Computer name (human-readable) — empty for M365 | String |
| `I8` | Customer name | String |
| `I78` | Active datasources, e.g. `D01D02D10` | String |
### Datasource status (repeatable per datasource)
| Suffix | Meaning | Type |
|--------|---------|------|
| `F00` | Status of the last session | Int (see table) |
| `F09` | Time of the last **successful** session | Unix timestamp |
| `F15` | Time of the last session (regardless of status) | Unix timestamp |
| `F08` | Color bar for the last 28 days (28 digits) | String |
### Status values (F00)
| Value | Meaning |
|-------|---------|
| `1` | In process |
| `2` | Failed ❌ |
| `3` | Aborted |
| `5` | Completed ✅ |
| `6` | Interrupted |
| `8` | CompletedWithErrors ⚠️ |
| `9` | InProgressWithFaults |
| `10` | OverQuota |
| `11` | NoSelection (configured but nothing selected) |
| `12` | Restarted |
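A mapping from these F00 codes onto the three Backupchecks states could look like the sketch below. The grouping is our own proposal, not something the Cove API defines:

```python
def map_f00_status(code: int) -> str:
    """Map a Cove F00 session status code to a Backupchecks status.

    Proposed grouping: 5 is a clean success; hard errors count as failed;
    partial or ambiguous outcomes count as warning; in-process codes are
    left as unknown so a poller can skip them until the session finishes.
    """
    if code == 5:                  # Completed
        return "success"
    if code in (2, 3, 10):         # Failed, Aborted, OverQuota
        return "failed"
    if code in (6, 8, 11, 12):     # Interrupted, CompletedWithErrors, NoSelection, Restarted
        return "warning"
    return "unknown"               # 1/9: still running (possibly with faults)
```

Whether `NoSelection` and `Restarted` should be warnings or failures is exactly the kind of decision Phase 3 below needs to settle.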
### Datasources
| Code | Name | Used for |
|------|------|----------|
| `D09` | Total (all datasources combined) | Always present; best for overall status |
| `D1` | Files & Folders | Servers/workstations |
| `D2` | System State | Servers/workstations |
| `D10` | VssMsSql (SQL Server) | Servers running SQL |
| `D11` | VssSharePoint | Servers running SharePoint |
| `D19` | Microsoft 365 Exchange | M365 tenants |
| `D20` | Microsoft 365 OneDrive | M365 tenants |
| `D5` | Microsoft 365 SharePoint | M365 tenants |
| `D23` | Microsoft 365 Teams | M365 tenants |
**Note:** D02 and D03 are legacy codes — use D10 and D11 instead.
### Recognizing device types via I78
- `I78` contains values such as `D01D02`, `D01D02D10`, `D19D20D05D23`
- Empty `I18` field = Microsoft 365 tenant
- Non-empty `I18` field = server or workstation
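These two rules translate directly into code; a small sketch (the function names are ours):

```python
import re


def active_datasources(i78: str) -> list[str]:
    """Split an I78 value such as 'D01D02D10' into individual datasource codes."""
    return re.findall(r"D\d+", i78 or "")


def device_kind(i18: str) -> str:
    """An empty I18 (computer name) indicates a Microsoft 365 tenant."""
    return "m365_tenant" if not (i18 or "").strip() else "server_or_workstation"
```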
### D09F08 — decoding the color bar
28 characters, each character = 1 day (oldest first):
- `5` = Completed ✅
- `8` = CompletedWithErrors ⚠️
- `2` = Failed ❌
- `1` = In progress
- `0` = No backup
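Decoding the bar is then a character-by-character lookup (sketch; the label names are ours):

```python
COLORBAR_LEGEND = {
    "5": "completed",
    "8": "completed_with_errors",
    "2": "failed",
    "1": "in_progress",
    "0": "no_backup",
}


def decode_colorbar(bar: str) -> list[str]:
    """Decode a D09F08 value into one label per day, oldest day first."""
    return [COLORBAR_LEGEND.get(ch, "unknown") for ch in (bar or "")]
```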
---
## 🏗️ Architecture decision
**Chosen: Option 2 — Parallel Import System**
```
API Poller → Cove API Parser → JobRun (direct, without MailMessage)
```
Rationale:
- Clean separation between email-based and API-based imports
- No misuse of the MailMessage model for data without an email context
- Future-proof for other API-based backup systems
### Required database changes
- `JobRun.source_type` — new field: `"email"` or `"api"`
- `JobRun.external_id` — Cove `AccountId` as external reference
- `JobRun.mail_message` — must become nullable (or a separate table)
---
## 🔧 Implementation phases
### Phase 1: Database migration
- [ ] Add a `source_type` field to JobRun (`email` / `api`)
- [ ] Add an `external_id` field to JobRun (for the Cove AccountId)
- [ ] Make the `mail_message` FK nullable for API-based runs
- [ ] Write and test the migration
### Phase 2: Cove API client
- [ ] New file: `app/services/cove_client.py`
- [ ] Login method (fetch visa token)
- [ ] `enumerate_account_statistics()` method
- [ ] Handle pagination (RecordsCount / StartRecordNumber)
- [ ] Handle token expiry (re-login)
- [ ] Error handling & retry logic
### Phase 3: Data transformation
- [ ] New file: `app/services/cove_importer.py`
- [ ] Convert the Settings list into a dict for easy lookup
- [ ] Convert Unix timestamps to datetime
- [ ] Map datasource status to Backupchecks status (success/warning/failed)
- [ ] Determine device type (server vs M365) via `I18` and `I78`
- [ ] Create JobRun records per device
### Phase 4: Scheduled polling
- [ ] Cron job or scheduled task (every 15-60 minutes?)
- [ ] Duplicate detection based on `external_id` + timestamp
- [ ] Logging & audit trail
- [ ] Respect rate limits
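Duplicate detection can key on the Cove `AccountId` plus the last-session timestamp, which is what the proposed `external_id` field would store. A sketch (key format and function names are ours):

```python
def run_external_id(account_id: int, last_session_ts: int) -> str:
    """Stable external_id for a JobRun: Cove AccountId + session timestamp."""
    return f"cove:{account_id}:{last_session_ts}"


def filter_new_runs(candidates: list[tuple[int, int]],
                    known_ids: set[str]) -> list[str]:
    """Return external_ids for sessions not seen before (idempotent polling)."""
    fresh = []
    for account_id, ts in candidates:
        ext_id = run_external_id(account_id, ts)
        if ext_id not in known_ids:
            known_ids.add(ext_id)
            fresh.append(ext_id)
    return fresh
```

In the application, `known_ids` would be replaced by a uniqueness check against the `external_id` column, so repeated polls of the same session become no-ops.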
### Phase 5: UI changes
- [ ] Job Details: no "Download EML" button for API-based runs
- [ ] Indicate that a job originates from the Cove API (not email)
- [ ] Optionally show the 28-day color bar
### Phase 6: Configuration
- [ ] Store Cove API credentials in SystemSettings
- [ ] Make PartnerId configurable
- [ ] Make the polling interval configurable
---
## 🔑 API Credentials
- **API User:** `backupchecks-cove-01`
- **User ID:** `1665555`
- **PartnerId:** `139124`
- **Role:** SuperUser + SecurityOfficer
- **Portal:** https://backup.management/#/api-users
**IMPORTANT:** Store the token in a password manager — it cannot be retrieved again!
---
## ❓ Open questions for implementation
1. How do we store the Cove API credentials securely in Backupchecks? (SystemSettings? Environment variable?)
2. What is the desired polling frequency? (15 min / 30 min / 1 hour?)
3. Do we import historical data on the first run, or only new sessions?
4. Do we want to show the 28-day color bar (`D09F08`) in the UI?
5. Do we support multiple Cove accounts (multiple MSPs)?
---
## 🎯 Success Criteria (MVP)
- [ ] Backup status (success/warning/failed) per device visible in Backupchecks
- [ ] Customer name and device name correctly linked
- [ ] Time of the last backup available
- [ ] Visible in Daily Jobs & Run Checks
- [ ] Both servers and Microsoft 365 tenants are supported
- [ ] No duplicates on repeated polling
### Nice to Have
- [ ] 28-day history graph
- [ ] Per-datasource status (SQL, Exchange, etc.)
- [ ] Polling frequency configurable per customer


@ -1,128 +0,0 @@
# TODO: Notification System via Email Inbound
**Date:** 2026-02-26
**Status:** Idea — Ready for refinement
**Priority:** Medium
---
## 🎯 Goal
Build a notification flow in which colleagues send an email to `backups+notification@...`, after which Backupchecks fetches these messages and informs the operator inside the application (optionally also via email).
**Context:** This is an idea/TODO, not yet an implementation.
---
## ✅ Idea summary
### Desired behavior
- Incoming emails to `backups+notification@...` are fetched periodically
- Relevant information from the subject/body is turned into an operator notification
- The operator sees unread notifications in Backupchecks
- The operator can mark a notification as read/handled
- Actions are logged for audit and proper follow-up
---
## 🏗️ Architecture direction
**Preferred:** reuse the existing mail import pipeline
```
Mailbox Poller → Notification Parser → Operator Notification Store → UI Badge/Inbox
```
Rationale:
- Less new infrastructure required
- Consistent with the existing email-driven processing in Backupchecks
- Faster path to a usable MVP
---
## 🔧 Implementation phases
### Phase 1: Database model
- [ ] Add a new `operator_notifications` table
- [ ] Add fields for the status flow (`new`, `read`, `handled`)
- [ ] Add a relation to the operator (`operator_id`)
- [ ] Optional links to `customer_id` and `job_id`
### Phase 2: Mail ingest
- [ ] Include the alias `backups+notification@...` in processing
- [ ] Only process notification mails for this channel
- [ ] Duplicate protection on `message_id`
- [ ] Fallback for messages without a usable parse
### Phase 3: Parsing and priority
- [ ] Store subject/sender/body snippet
- [ ] Determine priority (`normal`/`high`) on keywords such as `urgent`, `critical`, `failed`
- [ ] Strip unsafe HTML
- [ ] Limit the maximum body size
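The keyword-based priority rule could look like this (the keyword list and function name are suggestions):

```python
HIGH_PRIORITY_KEYWORDS = ("urgent", "critical", "failed")


def classify_priority(subject: str, body: str) -> str:
    """Return 'high' when a trigger keyword occurs in subject or body."""
    text = f"{subject or ''} {body or ''}".lower()
    return "high" if any(k in text for k in HIGH_PRIORITY_KEYWORDS) else "normal"
```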
### Phase 4: Operator UI
- [ ] Notification badge in the main layout
- [ ] Operator notification overview (inbox)
- [ ] Detail view with metadata
- [ ] Actions: mark as read and as handled
### Phase 5: Settings and policy
- [ ] Settings for the mailbox alias
- [ ] Polling on/off + interval
- [ ] Sender allowlist (domains/addresses)
- [ ] Option to relay the email to the operator
### Phase 6: Logging and audit
- [ ] Event `notification_created`
- [ ] Event `notification_read`
- [ ] Event `notification_handled`
- [ ] Log rejections of disallowed senders
---
## 📋 Data model (proposal)
New table: `operator_notifications`
- `id` (PK)
- `message_id` (unique, nullable fallback)
- `sender`
- `recipient`
- `subject`
- `body_text`
- `priority` (`normal` / `high`)
- `state` (`new` / `read` / `handled`)
- `operator_id` (FK users.id)
- `customer_id` (FK customers.id, nullable)
- `job_id` (FK jobs.id, nullable)
- `received_at`
- `created_at`
- `read_at` (nullable)
- `handled_at` (nullable)
- `handled_by` (FK users.id, nullable)
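As a concrete starting point, the proposed table could be expressed in SQL roughly as below. The column types and defaults are assumptions; the real change should follow the project's existing migration style. SQLite is used here only to validate that the DDL parses.

```python
import sqlite3

# Hypothetical DDL for the proposed table; types and defaults are assumptions.
OPERATOR_NOTIFICATIONS_DDL = """
CREATE TABLE operator_notifications (
    id INTEGER PRIMARY KEY,
    message_id TEXT UNIQUE,
    sender TEXT NOT NULL,
    recipient TEXT,
    subject TEXT,
    body_text TEXT,
    priority TEXT NOT NULL DEFAULT 'normal',
    state TEXT NOT NULL DEFAULT 'new',
    operator_id INTEGER REFERENCES users(id),
    customer_id INTEGER REFERENCES customers(id),
    job_id INTEGER REFERENCES jobs(id),
    received_at TIMESTAMP,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
    read_at TIMESTAMP,
    handled_at TIMESTAMP,
    handled_by INTEGER REFERENCES users(id)
)
"""


def create_notifications_table(conn: sqlite3.Connection) -> None:
    """Create the proposed table (used here to sanity-check the schema)."""
    conn.execute(OPERATOR_NOTIFICATIONS_DDL)
```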
---
## ❓ Open questions
1. What is the exact mailbox address/domain for `backups+notification@...`?
2. Should notifications be routed to all operators, or to an assigned operator per customer/job?
3. Is email relay to the operator required for the MVP, or Nice to Have?
4. What is the retention period for notification content?
5. Should attachments remain ignored in phase 1?
---
## 🎯 Success Criteria (MVP)
- [ ] An email to `backups+notification@...` appears in Backupchecks within the polling interval
- [ ] Duplicates are not imported twice
- [ ] The operator sees an unread badge + details
- [ ] The operator can handle a notification
- [ ] Disallowed senders are rejected and logged
### Nice to Have
- [ ] Automatic mapping to customer/job via recognition in subject/body
- [ ] SLA timers on open notifications
- [ ] Escalation for unactioned high-priority notifications


@@ -17,4 +17,4 @@ ENV APP_PORT=8080
 EXPOSE 8080
 # Use the application factory from backend.app
-CMD ["gunicorn", "-b", "0.0.0.0:8080", "--timeout", "120", "backend.app:create_app()"]
+CMD ["gunicorn", "-b", "0.0.0.0:8080", "backend.app:create_app()"]


@@ -13,7 +13,6 @@ from .main.routes import main_bp
 from .main.routes_documentation import doc_bp
 from .migrations import run_migrations
 from .auto_importer_service import start_auto_importer
-from .cove_importer_service import start_cove_importer


 def _get_today_ui_date() -> str:
@@ -200,21 +199,6 @@ def create_app():
         return None

-    @app.context_processor
-    def inject_inbox_count():
-        """Inject inbox_count into every template so the sidebar badge is always visible."""
-        try:
-            from flask_login import current_user as _cu
-            if not _cu or not _cu.is_authenticated:
-                return {}
-            from .models import MailMessage
-            q = MailMessage.query
-            if hasattr(MailMessage, "location"):
-                q = q.filter(MailMessage.location == "inbox")
-            return {"inbox_count": int(q.count() or 0)}
-        except Exception:
-            return {}
-
     @app.get("/health")
     def health():
         return {"status": "ok"}
@@ -228,7 +212,4 @@ def create_app():
     # Start automatic mail importer background thread
     start_auto_importer(app)
-    # Start Cove Data Protection importer background thread
-    start_cove_importer(app)
     return app


@@ -1,11 +1,5 @@
-import base64
-import binascii
-import hashlib
-import os
 import random
-import secrets
 from functools import wraps
-from urllib.parse import urlencode

 from flask import (
     Blueprint,
@@ -17,10 +11,9 @@ from flask import (
     session,
 )
 from flask_login import login_user, logout_user, login_required, current_user
-import requests

 from ..database import db
-from ..models import SystemSettings, User
+from ..models import User

 auth_bp = Blueprint("auth", __name__, url_prefix="/auth")
@@ -38,146 +31,10 @@ def generate_captcha():
     return question, answer
-def _entra_effective_config() -> dict:
-    """Return effective Entra SSO config from DB settings with env fallback."""
-    settings = SystemSettings.query.first()
-    enabled = bool(getattr(settings, "entra_sso_enabled", False)) if settings else False
-    tenant_id = (getattr(settings, "entra_tenant_id", None) or "").strip() if settings else ""
-    client_id = (getattr(settings, "entra_client_id", None) or "").strip() if settings else ""
-    client_secret = (getattr(settings, "entra_client_secret", None) or "").strip() if settings else ""
-    redirect_uri = (getattr(settings, "entra_redirect_uri", None) or "").strip() if settings else ""
-    allowed_domain = (getattr(settings, "entra_allowed_domain", None) or "").strip().lower() if settings else ""
-    allowed_group_ids = (getattr(settings, "entra_allowed_group_ids", None) or "").strip() if settings else ""
-    auto_provision = bool(getattr(settings, "entra_auto_provision_users", False)) if settings else False
-    if not tenant_id:
-        tenant_id = (os.environ.get("ENTRA_TENANT_ID", "") or "").strip()
-    if not client_id:
-        client_id = (os.environ.get("ENTRA_CLIENT_ID", "") or "").strip()
-    if not client_secret:
-        client_secret = (os.environ.get("ENTRA_CLIENT_SECRET", "") or "").strip()
-    if not redirect_uri:
-        redirect_uri = (os.environ.get("ENTRA_REDIRECT_URI", "") or "").strip()
-    if not allowed_domain:
-        allowed_domain = (os.environ.get("ENTRA_ALLOWED_DOMAIN", "") or "").strip().lower()
-    if not enabled:
-        env_enabled = (os.environ.get("ENTRA_SSO_ENABLED", "") or "").strip().lower()
-        enabled = env_enabled in ("1", "true", "yes", "on")
-    if not auto_provision:
-        env_auto = (os.environ.get("ENTRA_AUTO_PROVISION_USERS", "") or "").strip().lower()
-        auto_provision = env_auto in ("1", "true", "yes", "on")
-    if not allowed_group_ids:
-        allowed_group_ids = (os.environ.get("ENTRA_ALLOWED_GROUP_IDS", "") or "").strip()
-    return {
-        "enabled": enabled,
-        "tenant_id": tenant_id,
-        "client_id": client_id,
-        "client_secret": client_secret,
-        "redirect_uri": redirect_uri,
-        "allowed_domain": allowed_domain,
-        "allowed_group_ids": allowed_group_ids,
-        "auto_provision": auto_provision,
-    }
-
-
-def _parse_group_ids(raw: str | None) -> set[str]:
-    if not raw:
-        return set()
-    normalized = raw.replace("\n", ",").replace(";", ",")
-    out = set()
-    for item in normalized.split(","):
-        value = (item or "").strip()
-        if value:
-            out.add(value.lower())
-    return out
-
-
-def _b64url_decode(data: str) -> bytes:
-    pad = "=" * (-len(data) % 4)
-    return base64.urlsafe_b64decode((data + pad).encode("ascii"))
-
-
-def _decode_id_token_payload(id_token: str) -> dict:
-    """Decode JWT payload without signature verification (token comes from Entra token endpoint)."""
-    if not id_token or "." not in id_token:
-        return {}
-    parts = id_token.split(".")
-    if len(parts) < 2:
-        return {}
-    try:
-        payload_raw = _b64url_decode(parts[1])
-        import json
-        payload = json.loads(payload_raw.decode("utf-8"))
-        if isinstance(payload, dict):
-            return payload
-    except (binascii.Error, ValueError, UnicodeDecodeError):
-        return {}
-    return {}
-
-
-def _resolve_sso_user(claims: dict, auto_provision: bool) -> User | None:
-    """Resolve or optionally create a local user from Entra claims."""
-    username = (
-        (claims.get("preferred_username") or "")
-        or (claims.get("upn") or "")
-        or (claims.get("email") or "")
-    ).strip()
-    email = ((claims.get("email") or claims.get("preferred_username") or "") or "").strip() or None
-    if not username:
-        return None
-    user = User.query.filter_by(username=username).first()
-    if not user and email:
-        user = User.query.filter_by(email=email).first()
-    if user:
-        return user
-    if not auto_provision:
-        return None
-    new_username = username
-    if User.query.filter_by(username=new_username).first():
-        base = new_username
-        idx = 1
-        while User.query.filter_by(username=f"{base}.{idx}").first():
-            idx += 1
-        new_username = f"{base}.{idx}"
-    # Random local password as fallback; SSO users authenticate via Entra.
-    random_password = secrets.token_urlsafe(32)
-    new_user = User(username=new_username, email=email, role="viewer")
-    new_user.set_password(random_password)
-    db.session.add(new_user)
-    db.session.commit()
-    return new_user
-
-
-def _groups_from_claims(claims: dict) -> set[str]:
-    groups = claims.get("groups")
-    if isinstance(groups, list):
-        return {str(x).strip().lower() for x in groups if str(x).strip()}
-    if isinstance(groups, str) and groups.strip():
-        return {groups.strip().lower()}
-    return set()
-
-
-def _captcha_enabled() -> bool:
-    """Return True when the login captcha is enabled in SystemSettings."""
-    try:
-        s = SystemSettings.query.first()
-        if s is None:
-            return True  # default on for new installs
-        return bool(getattr(s, "login_captcha_enabled", True))
-    except Exception:
-        return True  # fail-safe: keep captcha on if DB unreachable
 def captcha_required(func):
     @wraps(func)
     def wrapper(*args, **kwargs):
-        if request.method == "POST" and _captcha_enabled():
+        if request.method == "POST":
             expected = session.get("captcha_answer")
             provided = (request.form.get("captcha") or "").strip()
             if not expected or provided != expected:
@@ -185,19 +42,10 @@ def captcha_required(func):
                 # regenerate captcha for re-render
                 question, answer = generate_captcha()
                 session["captcha_answer"] = answer
-                cfg = _entra_effective_config()
-                entra_ready = bool(
-                    cfg.get("enabled")
-                    and cfg.get("tenant_id")
-                    and cfg.get("client_id")
-                    and cfg.get("client_secret")
-                )
                 return render_template(
                     "auth/login.html",
                     captcha_question=question,
-                    captcha_enabled=True,
                     username=request.form.get("username", ""),
-                    entra_sso_enabled=entra_ready,
                 )
         return func(*args, **kwargs)
@@ -207,27 +55,13 @@ def captcha_required(func):

 @auth_bp.route("/login", methods=["GET", "POST"])
 @captcha_required
 def login():
-    captcha_on = _captcha_enabled()
     if request.method == "GET":
         if not users_exist():
             return redirect(url_for("auth.initial_setup"))
         question, answer = generate_captcha()
         session["captcha_answer"] = answer
-        cfg = _entra_effective_config()
-        entra_ready = bool(
-            cfg.get("enabled")
-            and cfg.get("tenant_id")
-            and cfg.get("client_id")
-            and cfg.get("client_secret")
-        )
-        return render_template(
-            "auth/login.html",
-            captcha_question=question,
-            captcha_enabled=captcha_on,
-            entra_sso_enabled=entra_ready,
-        )
+        return render_template("auth/login.html", captcha_question=question)

     # POST
     username = (request.form.get("username") or "").strip()
@@ -238,19 +72,8 @@ def login():
         flash("Invalid username or password.", "danger")
         question, answer = generate_captcha()
         session["captcha_answer"] = answer
-        cfg = _entra_effective_config()
-        entra_ready = bool(
-            cfg.get("enabled")
-            and cfg.get("tenant_id")
-            and cfg.get("client_id")
-            and cfg.get("client_secret")
-        )
         return render_template(
-            "auth/login.html",
-            captcha_question=question,
-            captcha_enabled=captcha_on,
-            username=username,
-            entra_sso_enabled=entra_ready,
+            "auth/login.html", captcha_question=question, username=username
         )

     login_user(user)
@@ -258,180 +81,18 @@ def login():
     try:
         session["active_role"] = user.roles[0]
     except Exception:
         session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
-    session["auth_provider"] = "local"
     flash("You are now logged in.", "success")
     return redirect(url_for("main.dashboard"))
-@auth_bp.route("/entra/login")
-def entra_login():
-    """Start Microsoft Entra ID authorization code flow."""
-    cfg = _entra_effective_config()
-    if not cfg.get("enabled"):
-        flash("Microsoft Entra SSO is not enabled.", "warning")
-        return redirect(url_for("auth.login"))
-    if not cfg.get("tenant_id") or not cfg.get("client_id") or not cfg.get("client_secret"):
-        flash("Microsoft Entra SSO is not fully configured.", "danger")
-        return redirect(url_for("auth.login"))
-    redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
-    state = secrets.token_urlsafe(24)
-    nonce = hashlib.sha256(secrets.token_bytes(32)).hexdigest()
-    session["entra_state"] = state
-    session["entra_nonce"] = nonce
-    params = {
-        "client_id": cfg["client_id"],
-        "response_type": "code",
-        "redirect_uri": redirect_uri,
-        "response_mode": "query",
-        "scope": "openid profile email",
-        "state": state,
-        "nonce": nonce,
-        "prompt": "select_account",
-    }
-    auth_url = (
-        f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/authorize?"
-        f"{urlencode(params)}"
-    )
-    return redirect(auth_url)
-
-
-@auth_bp.route("/entra/callback")
-def entra_callback():
-    """Handle Microsoft Entra ID callback and log in mapped local user."""
-    cfg = _entra_effective_config()
-    if not cfg.get("enabled"):
-        flash("Microsoft Entra SSO is not enabled.", "warning")
-        return redirect(url_for("auth.login"))
-    error = (request.args.get("error") or "").strip()
-    if error:
-        desc = (request.args.get("error_description") or "").strip()
-        flash(f"Microsoft Entra login failed: {error} {desc}".strip(), "danger")
-        return redirect(url_for("auth.login"))
-    state = (request.args.get("state") or "").strip()
-    expected_state = (session.get("entra_state") or "").strip()
-    if not state or not expected_state or state != expected_state:
-        flash("Invalid SSO state. Please try again.", "danger")
-        return redirect(url_for("auth.login"))
-    code = (request.args.get("code") or "").strip()
-    if not code:
-        flash("No authorization code returned by Microsoft Entra.", "danger")
-        return redirect(url_for("auth.login"))
-    redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
-    token_url = f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/token"
-    token_payload = {
-        "client_id": cfg["client_id"],
-        "client_secret": cfg["client_secret"],
-        "grant_type": "authorization_code",
-        "code": code,
-        "redirect_uri": redirect_uri,
-        "scope": "openid profile email",
-    }
-    try:
-        token_resp = requests.post(token_url, data=token_payload, timeout=30)
-        token_resp.raise_for_status()
-        token_data = token_resp.json()
-    except Exception as exc:
-        flash(f"Failed to fetch token from Microsoft Entra: {exc}", "danger")
-        return redirect(url_for("auth.login"))
-    id_token = token_data.get("id_token")
-    claims = _decode_id_token_payload(id_token or "")
-    if not claims:
-        flash("Could not read Microsoft Entra ID token.", "danger")
-        return redirect(url_for("auth.login"))
-    expected_nonce = (session.get("entra_nonce") or "").strip()
-    token_nonce = (claims.get("nonce") or "").strip()
-    if expected_nonce and token_nonce and token_nonce != expected_nonce:
-        flash("Invalid SSO nonce. Please try again.", "danger")
-        return redirect(url_for("auth.login"))
-    allowed_domain = (cfg.get("allowed_domain") or "").strip().lower()
-    if allowed_domain:
-        token_tid = (claims.get("tid") or "").strip().lower()
-        token_domain = ""
-        upn = (claims.get("preferred_username") or claims.get("email") or "").strip().lower()
-        if "@" in upn:
-            token_domain = upn.split("@", 1)[1]
-        if allowed_domain not in {token_tid, token_domain}:
-            flash("Your Microsoft account is not allowed for this instance.", "danger")
-            return redirect(url_for("auth.login"))
-    allowed_groups = _parse_group_ids(cfg.get("allowed_group_ids"))
-    if allowed_groups:
-        claim_names = claims.get("_claim_names") or {}
-        groups_overage = isinstance(claim_names, dict) and "groups" in claim_names
-        token_groups = _groups_from_claims(claims)
-        if groups_overage:
-            flash(
-                "Group-based access check could not be completed because token group overage is active. "
-                "Limit group claims to assigned groups or reduce memberships.",
-                "danger",
-            )
-            return redirect(url_for("auth.login"))
-        if not token_groups:
-            flash(
-                "Group-based access is enabled, but no groups claim was received from Microsoft Entra. "
-                "Configure group claims in the Entra app token settings.",
-                "danger",
-            )
-            return redirect(url_for("auth.login"))
-        if token_groups.isdisjoint(allowed_groups):
-            flash("Your Microsoft account is not in an allowed security group.", "danger")
-            return redirect(url_for("auth.login"))
-    user = _resolve_sso_user(claims, auto_provision=bool(cfg.get("auto_provision")))
-    if not user:
-        flash(
-            "No local Backupchecks user is mapped to this Microsoft account. "
-            "Ask an admin to create or map your account.",
-            "danger",
-        )
-        return redirect(url_for("auth.login"))
-    login_user(user)
-    try:
-        session["active_role"] = user.roles[0]
-    except Exception:
-        session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
-    session["auth_provider"] = "entra"
-    session.pop("entra_state", None)
-    session.pop("entra_nonce", None)
-    flash("You are now logged in with Microsoft Entra.", "success")
-    return redirect(url_for("main.dashboard"))
 @auth_bp.route("/logout")
 @login_required
 def logout():
-    cfg = _entra_effective_config()
-    auth_provider = (session.get("auth_provider") or "").strip()
     logout_user()
     try:
         session.pop("active_role", None)
-        session.pop("auth_provider", None)
-        session.pop("entra_state", None)
-        session.pop("entra_nonce", None)
     except Exception:
         pass
-    if auth_provider == "entra" and cfg.get("enabled") and cfg.get("tenant_id"):
-        post_logout = url_for("auth.login", _external=True)
-        logout_url = (
-            f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/logout?"
-            f"{urlencode({'post_logout_redirect_uri': post_logout})}"
-        )
-        return redirect(logout_url)
     flash("You have been logged out.", "info")
     return redirect(url_for("auth.login"))


@ -3,232 +3,6 @@ Changelog data structure for Backupchecks
"""
CHANGELOG = [
{
"version": "v0.2.1",
"date": "2026-03-20",
"summary": "Patch release with inbox improvements, bug fixes for missed run detection, objects sorting, and the inbox sidebar badge.",
"sections": [
{
"title": "Added",
"type": "feature",
"subsections": [
{
"subtitle": "Inbox",
"changes": [
"Re-parse all now shows a live progress modal with a progress bar instead of blocking the browser; a new batch endpoint processes 50 messages at a time and the browser loops automatically until all messages are done, showing parsed / auto-approved / no match / errors stats in real time"
]
},
{
"subtitle": "Jobs export/import",
"changes": [
"Export schema bumped to v2: each job entry now includes the linked Cove account and Cloud Connect account objects",
"Import accepts both v1 and v2 files and automatically re-links Cove and Cloud Connect accounts on import if not already linked to a different job"
]
}
]
},
{
"title": "Fixed",
"type": "bugfix",
"subsections": [
{
"subtitle": "Missed run detection",
"changes": [
"Weekly inference now only looks at the last 90 days so old time slots are forgotten within 3 months after a schedule change",
"Monthly jobs that accumulated enough weekly hits no longer trigger daily missed runs; a cadence guard (median gap >= 20 days) routes them to monthly inference instead",
"Monthly inference limited to the last 180 days so schedule changes are reflected within 6 months",
"MIN_OCCURRENCES for weekly inference raised from 3 to 5 to reduce false positives during transitional periods"
]
},
{
"subtitle": "Run Checks and Job Detail modals",
"changes": [
"Objects list sort order: Warning items with a non-empty error_message were ranked as Critical instead of Warning; fixed so only error/failed/failure status triggers the Critical rank",
"Mail iframe height in Run Checks modal no longer collapses to near-zero; flex rules moved to the correct wrapper element"
]
},
{
"subtitle": "Inbox",
"changes": [
"Inbox badge in the sidebar now stays visible on all pages, not only the dashboard"
]
}
]
}
]
},
{
"version": "v0.2.0",
"date": "2026-03-20",
"summary": "This release is a significant update since v0.1.27 (released on February 23, 2026). It introduces a completely redesigned sidebar-first UI, a new Veeam Cloud Connect importer, Run Checks user preferences, and multiple Cove Data Protection improvements including historical run backfill and a run detail popup. Several bug fixes and UX refinements are included across the board.",
"sections": [
{
"title": "Added",
"type": "feature",
"subsections": [
{
"subtitle": "Veeam Cloud Connect",
"changes": [
"New Veeam Cloud Connect importer — HTML parser for daily report emails; upserts tenant rows into cloud_connect_accounts and creates JobRun records for linked accounts",
"Cloud Connect Accounts page (/cloud-connect/accounts): unmatched accounts first, matched accounts below; clickable-row UX with a single shared modal (pre-fills job name, backup type, customer)",
"CloudConnectAccount model and migration added",
"Sidebar link added for admin/operator roles"
]
},
{
"subtitle": "Cove Data Protection",
"changes": [
"Historical run backfill: when a new run is created for a linked job, up to 27 days of history are reconstructed from the colorbar field (D09F08); fully idempotent via external_id deduplication",
"Cove run details popup: Cove run rows in the job detail history table are now clickable; popup shows account name, computer, customer, datasource labels, last run, and status; mail section hidden for Cove runs",
"Cove Accounts page: same clickable-row UX as Cloud Connect with a single shared modal and customer datalist auto-complete"
]
},
{
"subtitle": "Run Checks and Settings",
"changes": [
"Run Checks user preferences (per user): sort mode and filter defaults (status, ticket, remark, search query) stored in DB and applied on page load",
"New route POST /run-checks/preferences to save current controls as user defaults; User Settings page includes a dedicated section",
"Login captcha toggle in Settings → General Security card: when disabled, the math captcha is hidden and not validated; migration adds column with DEFAULT TRUE"
]
}
]
},
{
"title": "Changed",
"type": "improvement",
"subsections": [
{
"subtitle": "Layout v2",
"changes": [
"Complete sidebar-first redesign: layout.css rewritten with IBM Plex Sans/Mono fonts and CSS design tokens; fixed dark sidebar (220 px)",
"base.html updated with Google Fonts preload and sidebar-aware structure"
]
},
{
"subtitle": "Performance and UX",
"changes": [
"Run Checks background sweep: missed-run generation and Autotask ticket polling now run in a background daemon thread on page load, throttled per job (10-minute minimum interval)",
"Cloud Connect run detail popup now shows a structured CC summary (account, repository, objects) instead of the raw report email; raw email accessible via a show toggle",
"Cloud Connect unique key changed from (user, section) to (user, section, repo_name) so users with multiple repositories each get a separate staging entry",
"Cove timestamp parsing now supports epoch milliseconds, microseconds, nanoseconds, and .NET JSON Date strings",
"Sandbox banner is now semi-transparent instead of solid red"
]
}
]
},
{
"title": "Fixed",
"type": "bugfix",
"changes": [
"Login page layout no longer breaks when flash messages are present",
"Delete all jobs in Settings → Maintenance no longer times out on large datasets (direct SQL DELETE FROM statements; handles 650K+ rows in seconds)",
"Automatic mail-to-job matching no longer selects archived jobs",
"Cove run creation transaction scope fixed (FK/visibility issue with second DB connection)",
"Cove link sync between both link paths (cove_accounts.job_id and jobs.cove_account_id)"
]
}
]
},
{
"version": "v0.1.27",
"date": "2026-02-23",
"summary": "This release is a major functional update since v0.1.26 (released on February 10, 2026). It introduces full Cove Data Protection integration, broad search and navigation improvements, and multiple workflow/ticketing fixes. It also adds Microsoft Entra SSO foundations (currently marked as untested in Backupchecks), along with extensive documentation updates and UI refinements.",
"sections": [
{
"title": "Added",
"type": "feature",
"subsections": [
{
"subtitle": "Cove Data Protection",
"changes": [
"Full Cove Data Protection integration with API importer and background polling",
"Cove Accounts staging/linking page for unmatched and matched account workflow",
"Manual import trigger and JobRun source tracking via source_type and external_id",
"CoveAccount model and migrations for staging and account linkage",
"Per-datasource object persistence for reporting and run inspection"
]
},
{
"subtitle": "Search and Navigation",
"changes": [
"Global grouped search with role-aware results",
"Per-section pagination in search",
"Remarks included in search results",
"Customer-to-Jobs quick filter navigation"
]
},
{
"subtitle": "Microsoft Entra SSO (Untested)",
"changes": [
"Added Microsoft Entra SSO login/callback/logout flow",
"Added settings and migrations for tenant/client/secret/redirect configuration",
"Added optional auto-provisioning for unknown users as Viewer",
"Added optional tenant/domain restriction",
"Added security-group gate using allowed Entra group IDs",
"Added dedicated Entra SSO documentation page"
]
},
{
"subtitle": "Other",
"changes": [
"Added Cove API test script (cove_api_test.py)",
"Added optional Autotask ID import toggle for jobs/customers import"
]
}
]
},
{
"title": "Changed",
"type": "improvement",
"subsections": [
{
"subtitle": "Cove Import and Linking",
"changes": [
"Immediate import after linking a Cove account",
"Type derivation refined to Server/Workstation/Microsoft 365",
"Cove Accounts display improved with clearer derived fields and readable datasource labels",
"Richer run details in Cove-created runs and datasource object records",
"Timestamp fallback for run creation from D09F15 to D09F09 when needed"
]
},
{
"subtitle": "Navbar Restructuring",
"changes": [
"Admin-only links grouped under an Admin dropdown",
"Secondary non-admin links grouped under a More dropdown",
"Cove Accounts moved back to main bar and Daily Jobs moved under More",
"Viewer role now has Customers and Jobs directly visible on navbar",
"Run Checks remains directly visible for daily operations"
]
},
{
"subtitle": "Documentation and UX",
"changes": [
"Documentation expanded and corrected across workflow and settings topics",
"Search UX improved with wildcard/contains filtering and section/pagination state preservation",
"Parser/Run Checks behavior updated for informational 3CX update handling"
]
}
]
},
{
"title": "Fixed",
"type": "bugfix",
"changes": [
"Fixed tickets not showing in Run Checks modal",
"Fixed copy ticket button behavior in Edge via improved clipboard fallback",
"Fixed resolved tickets incorrectly appearing on new runs using explicit link-based logic",
"Fixed duplicate tickets in Run Checks popup",
"Fixed missed-run ticket linking with internal and Autotask tickets",
"Fixed Cove run deduplication by scoping dedupe per job",
"Fixed the Cove 'Run import now' submit issue in the settings UI",
"Fixed checkbox auto-reselect behavior after reload",
"Fixed search template crash caused by section.items access",
"Stopped tracking Python cache artifacts in version control"
]
}
]
},
{
"version": "v0.1.26",
"date": "2026-02-10",


@ -1,390 +0,0 @@
"""Veeam Cloud Connect daily report importer.
Parses the HTML body of a Veeam Cloud Connect provider daily report email
and upserts each tenant (User row) into the cloud_connect_accounts staging
table identical in spirit to the Cove Data Protection importer.
Flow:
1. When the mail importer receives a Cloud Connect daily report it calls
   ``upsert_cloud_connect_report(mail_message_id, html_body)``.
2. Every User × section combination is upserted into cloud_connect_accounts.
3. Unlinked accounts appear on the new "Cloud Connect" review page where an
admin can create or link a Backupchecks job (same UX as Cove Accounts).
4. For linked accounts a JobRun is created/updated; the mail_message_id is
attached so the mail body is available in the popup.
Status mapping (row background colour in the HTML report):
#fb9895 → Failed
#ffd96c → Warning
#ffffff → Success
"""
from __future__ import annotations
import logging
import re
from datetime import datetime, timedelta
from typing import Optional
from .database import db
from .models import CloudConnectAccount, Customer, Job, JobRun, MailMessage
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# HTML parsing helpers
# ---------------------------------------------------------------------------
def _strip_tags(html: str) -> str:
"""Strip HTML tags and normalise whitespace."""
if not html:
return ""
text = re.sub(r"<br\s*/?>", " ", html, flags=re.IGNORECASE)
text = re.sub(r"<[^>]+>", "", text)
text = re.sub(r"\s+", " ", text)
return text.strip()
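The tag-stripping step above can be exercised in isolation. A self-contained sketch (the function name is illustrative; the regexes mirror the helper):

```python
import re

def strip_tags_demo(html: str) -> str:
    # <br> becomes a space so adjacent lines in a cell do not fuse together
    text = re.sub(r"<br\s*/?>", " ", html, flags=re.IGNORECASE)
    # drop all remaining tags, then collapse runs of whitespace
    text = re.sub(r"<[^>]+>", "", text)
    return re.sub(r"\s+", " ", text).strip()

print(strip_tags_demo("<td>Acme <b>BV</b><br/>Backup</td>"))  # Acme BV Backup
```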
def _row_status(row_style: str) -> str:
"""Map Veeam row background colour to a Backupchecks status string."""
m = re.search(r"background-color\s*:\s*([^;\"'\s]+)", row_style, re.IGNORECASE)
if not m:
return "Success"
colour = m.group(1).strip().lower()
if colour in {"#fb9895", "#ff9999", "#f4cccc", "#ffb3b3"}:
return "Failed"
if colour in {"#ffd96c", "#fff2cc", "#ffe599", "#f9cb9c"}:
return "Warning"
return "Success"
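The colour-to-status mapping is the entire status model for Cloud Connect rows, so it is worth seeing standalone. A self-contained sketch of the same lookup (constant and function names are illustrative):

```python
import re

FAILED_COLOURS = {"#fb9895", "#ff9999", "#f4cccc", "#ffb3b3"}
WARNING_COLOURS = {"#ffd96c", "#fff2cc", "#ffe599", "#f9cb9c"}

def row_status_demo(row_style: str) -> str:
    # pull the background-color value out of the <tr> style attribute
    m = re.search(r"background-color\s*:\s*([^;\"'\s]+)", row_style, re.IGNORECASE)
    if not m:
        return "Success"  # unstyled rows are healthy tenants
    colour = m.group(1).strip().lower()
    if colour in FAILED_COLOURS:
        return "Failed"
    if colour in WARNING_COLOURS:
        return "Warning"
    return "Success"
```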
def _parse_last_active(raw: str) -> Optional[datetime]:
"""Convert a 'Last active' string like '14 hours ago' to a UTC datetime.
Returns None when the value is 'never' or cannot be parsed.
"""
s = (raw or "").strip().lower()
if not s or s == "never":
return None
now = datetime.utcnow()
m = re.match(r"(\d+)\s+(hour|day|week|month)s?\s+ago", s)
if not m:
return None
n = int(m.group(1))
unit = m.group(2)
if unit == "hour":
return now - timedelta(hours=n)
if unit == "day":
return now - timedelta(days=n)
if unit == "week":
return now - timedelta(weeks=n)
if unit == "month":
return now - timedelta(days=n * 30)
return None
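The relative-time parsing can be sketched standalone; passing `now` in explicitly makes the behaviour testable (names are illustrative, the month approximation of 30 days matches the helper above):

```python
import re
from datetime import datetime, timedelta

def parse_last_active_demo(raw: str, now: datetime):
    # "14 hours ago" / "3 days ago" / "never" -> absolute datetime or None
    s = (raw or "").strip().lower()
    if not s or s == "never":
        return None
    m = re.match(r"(\d+)\s+(hour|day|week|month)s?\s+ago", s)
    if not m:
        return None
    n = int(m.group(1))
    step = {"hour": timedelta(hours=1), "day": timedelta(days=1),
            "week": timedelta(weeks=1), "month": timedelta(days=30)}[m.group(2)]
    return now - n * step
```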
def _parse_report_tables(html: str) -> list[dict]:
"""Extract all tenant rows from a Cloud Connect daily report HTML body.
Returns a list of dicts with keys:
section, user, repo_name, repo_type, num_items,
total_quota, used_space, free_space, last_active_raw,
last_active_dt, status
"""
if not html:
return []
# Section headers are <p> tags with font-size 18px just before each table.
# We walk the HTML top-to-bottom, tracking the current section name.
section_pattern = re.compile(
r'<p[^>]*font-size:\s*18px[^>]*>\s*(Backup|Replication|Agent)\s*</p>',
re.IGNORECASE | re.DOTALL,
)
table_pattern = re.compile(r'<table[^>]*>(.*?)</table>', re.IGNORECASE | re.DOTALL)
row_pattern = re.compile(r'<tr([^>]*)>(.*?)</tr>', re.IGNORECASE | re.DOTALL)
cell_pattern = re.compile(r'<t[dh][^>]*>(.*?)</t[dh]>', re.IGNORECASE | re.DOTALL)
results: list[dict] = []
# Find positions of section headers and tables to interleave them.
section_positions = [(m.start(), m.group(1)) for m in section_pattern.finditer(html)]
table_positions = [(m.start(), m.group(1)) for m in table_pattern.finditer(html)]
def _section_for_table(table_start: int) -> str:
"""Return the section name of the nearest preceding section header."""
current = "Backup"
for pos, name in section_positions:
if pos < table_start:
current = name
return current
for table_start, table_inner in table_positions:
section = _section_for_table(table_start)
rows = row_pattern.findall(table_inner)
if not rows:
continue
        # Parse the header row; a tenant table always starts with a "User" column.
        header_cells = [_strip_tags(c).strip() for c in cell_pattern.findall(rows[0][1])]
if not header_cells or header_cells[0].lower() != "user":
continue # not a tenant table (e.g. the version footer table)
for row_attr, row_inner in rows[1:]:
cells_raw = cell_pattern.findall(row_inner)
cells = [_strip_tags(c).strip() for c in cells_raw]
if not cells:
continue
user = cells[0] if len(cells) > 0 else ""
if not user or user.upper() == "TOTAL":
continue
# Column indices differ between Backup and Agent tables.
# Backup: User | #VM | Repo Name | Repo | Quota | Used | Free | Last active | Expiry
# Agent: User | #WS | #Server | Repo Name | Repo | Quota | Used | Free | Last active | Expiry
is_agent = section.lower() == "agent"
if is_agent:
num_ws = cells[1] if len(cells) > 1 else ""
num_srv = cells[2] if len(cells) > 2 else ""
num_items = f"{num_ws} WS / {num_srv} Server"
repo_name = cells[3] if len(cells) > 3 else ""
repo_type = cells[4] if len(cells) > 4 else ""
total_quota = cells[5] if len(cells) > 5 else ""
used_space = cells[6] if len(cells) > 6 else ""
free_space = cells[7] if len(cells) > 7 else ""
last_active_raw = cells[8] if len(cells) > 8 else ""
else:
num_items = cells[1] if len(cells) > 1 else ""
repo_name = cells[2] if len(cells) > 2 else ""
repo_type = cells[3] if len(cells) > 3 else ""
total_quota = cells[4] if len(cells) > 4 else ""
used_space = cells[5] if len(cells) > 5 else ""
free_space = cells[6] if len(cells) > 6 else ""
last_active_raw = cells[7] if len(cells) > 7 else ""
status = _row_status(row_attr)
last_active_dt = _parse_last_active(last_active_raw)
results.append({
"section": section,
"user": user,
"repo_name": repo_name,
"repo_type": repo_type,
"num_items": num_items,
"total_quota": total_quota,
"used_space": used_space,
"free_space": free_space,
"last_active_raw": last_active_raw,
"last_active_dt": last_active_dt,
"status": status,
})
return results
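The nearest-preceding-header rule used by `_section_for_table` is the one subtle piece of the interleaving logic. A minimal standalone sketch (positions are character offsets into the HTML; values are illustrative):

```python
def section_for_table_demo(section_positions, table_start):
    # the last header that starts before the table wins;
    # default to "Backup" when no header precedes it
    current = "Backup"
    for pos, name in section_positions:
        if pos < table_start:
            current = name
    return current

positions = [(100, "Backup"), (900, "Replication"), (1800, "Agent")]
print(section_for_table_demo(positions, 950))  # Replication
```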
# ---------------------------------------------------------------------------
# Public import entry point
# ---------------------------------------------------------------------------
def upsert_cloud_connect_report(mail_message_id: int, html_body: str) -> dict:
"""Parse a Cloud Connect daily report and upsert all tenant rows.
Called by the mail importer when it detects a Cloud Connect daily report.
Returns a summary dict: {total, linked, unlinked, created, skipped}.
"""
rows = _parse_report_tables(html_body)
if not rows:
return {"total": 0, "linked": 0, "unlinked": 0, "created": 0, "skipped": 0}
now = datetime.utcnow()
# Use the mail's received_at as the report date so that re-importing
# historical emails creates runs on the correct calendar day, not today.
_mail_msg = MailMessage.query.get(mail_message_id)
report_dt = (_mail_msg.received_at if _mail_msg and _mail_msg.received_at else now)
report_date = report_dt.date().isoformat()
counters = {"total": len(rows), "linked": 0, "unlinked": 0, "created": 0, "skipped": 0}
for row in rows:
user = row["user"]
section = row["section"]
repo_name = row["repo_name"] or "" # never None — part of unique key
# Upsert the staging record — keyed on (user, section, repo_name).
acc = CloudConnectAccount.query.filter_by(user=user, section=section, repo_name=repo_name).first()
if acc is None:
acc = CloudConnectAccount(
user=user,
section=section,
repo_name=repo_name,
first_seen_at=now,
)
db.session.add(acc)
acc.repo_name = repo_name
acc.repo_type = row["repo_type"]
acc.num_items = row["num_items"]
acc.total_quota = row["total_quota"]
acc.used_space = row["used_space"]
acc.free_space = row["free_space"]
acc.last_active_raw = row["last_active_raw"]
acc.last_active_dt = row["last_active_dt"]
acc.last_status = row["status"]
acc.last_seen_at = now
acc.last_mail_message_id = mail_message_id
db.session.flush()
if not acc.job_id:
counters["unlinked"] += 1
continue
# Account is linked — create a JobRun if not already present for this report date.
job = Job.query.get(acc.job_id)
if not job:
counters["skipped"] += 1
continue
# Deduplicate: one run per job per report date.
repo_slug = repo_name.lower().replace(" ", "_")
external_id = f"vcc-{user}-{section}-{repo_slug}-{report_date}".lower().replace(" ", "_")
existing = JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()
if existing:
# Update status and object links in case re-import has a different result.
existing.status = row["status"]
db.session.add(existing)
db.session.flush()
_persist_cc_objects(row, job.customer_id, job.id, existing.id, report_dt)
counters["skipped"] += 1
counters["linked"] += 1
continue
error_message = _build_error_message(row)
run = JobRun(
job_id=job.id,
mail_message_id=mail_message_id,
run_at=report_dt,
status=row["status"],
remark=error_message or None,
missed=False,
override_applied=False,
source_type="cloud_connect",
external_id=external_id,
)
db.session.add(run)
db.session.flush() # need run.id for object links
_persist_cc_objects(row, job.customer_id, job.id, run.id, report_dt)
counters["created"] += 1
counters["linked"] += 1
db.session.commit()
return counters
def _persist_cc_objects(
row: dict,
customer_id: int,
job_id: int,
run_id: int,
observed_at: datetime,
) -> None:
"""Upsert repository as a customer_object and create a run_object_link.
This mirrors the Cove datasource object persistence so Cloud Connect runs
have per-run object records suitable for reporting.
"""
from sqlalchemy import text as _text
repo_name = (row.get("repo_name") or "").strip() or "Unknown Repository"
repo_type = (row.get("repo_type") or "").strip()
object_name = f"{repo_name} ({repo_type})" if repo_type else repo_name
used = row.get("used_space") or ""
quota = row.get("total_quota") or ""
free = row.get("free_space") or ""
last = row.get("last_active_raw") or "unknown"
detail = f"Used: {used} / {quota} — Free: {free} — Last active: {last}"
try:
customer_object_id = db.session.execute(
_text("""
INSERT INTO customer_objects
(customer_id, object_name, object_type, first_seen_at, last_seen_at)
VALUES
(:customer_id, :object_name, :object_type, NOW(), NOW())
ON CONFLICT (customer_id, object_name)
DO UPDATE SET
object_type = COALESCE(EXCLUDED.object_type, customer_objects.object_type),
last_seen_at = NOW()
RETURNING id
"""),
{
"customer_id": customer_id,
"object_name": object_name,
"object_type": "cloud_connect_repo",
},
).scalar()
db.session.execute(
_text("""
INSERT INTO job_object_links
(job_id, customer_object_id, first_seen_at, last_seen_at)
VALUES (:job_id, :customer_object_id, NOW(), NOW())
ON CONFLICT (job_id, customer_object_id)
DO UPDATE SET last_seen_at = NOW()
"""),
{"job_id": job_id, "customer_object_id": customer_object_id},
)
db.session.execute(
_text("""
INSERT INTO run_object_links
(run_id, customer_object_id, status, error_message, observed_at)
VALUES
(:run_id, :customer_object_id, :status, :error_message, :observed_at)
ON CONFLICT (run_id, customer_object_id)
DO UPDATE SET
status = EXCLUDED.status,
error_message = EXCLUDED.error_message,
observed_at = EXCLUDED.observed_at
"""),
{
"run_id": run_id,
"customer_object_id": customer_object_id,
"status": row["status"],
"error_message": detail,
"observed_at": observed_at,
},
)
except Exception as exc:
logger.warning("CC object persist failed for run %s: %s", run_id, exc)
def _build_error_message(row: dict) -> str:
"""Build a human-readable remark for a Cloud Connect run."""
parts = [
f"Repository: {row['repo_name']} ({row['repo_type']})",
f"Used: {row['used_space']} / {row['total_quota']}",
f"Free: {row['free_space']}",
f"Last active: {row['last_active_raw'] or 'unknown'}",
]
if row["status"] == "Failed":
parts.append("⚠ Repository appears to be full or near full")
elif row["status"] == "Warning" and row["last_active_raw"].lower() in ("never", ""):
parts.append("⚠ Backup has never run")
elif row["status"] == "Warning" and "days ago" in row["last_active_raw"].lower():
parts.append(f"⚠ No recent activity: {row['last_active_raw']}")
return " | ".join(parts)
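For a Success row the remark reduces to the four base parts joined with pipes. A trimmed, self-contained sketch with sample values (all values illustrative):

```python
def build_error_message_demo(row: dict) -> str:
    # base parts only; the real helper appends warnings for Failed/Warning rows
    parts = [
        f"Repository: {row['repo_name']} ({row['repo_type']})",
        f"Used: {row['used_space']} / {row['total_quota']}",
        f"Free: {row['free_space']}",
        f"Last active: {row['last_active_raw'] or 'unknown'}",
    ]
    return " | ".join(parts)

row = {"repo_name": "Cloud Repo 01", "repo_type": "S3", "used_space": "1.2 TB",
       "total_quota": "2 TB", "free_space": "0.8 TB", "last_active_raw": "14 hours ago"}
```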


@ -1,706 +0,0 @@
"""Cove Data Protection API importer.
Fetches backup job run data from the Cove (N-able) API.
Flow (mirrors the mail Inbox flow):
1. All Cove accounts are upserted into the `cove_accounts` staging table.
2. Accounts without a linked job appear on the Cove Accounts page where
an admin can create or link a job (same as approving a mail from Inbox).
3. For accounts that have a linked job, a JobRun is created per new session
(deduplicated via external_id).
"""
from __future__ import annotations
import logging
import re
from datetime import date, datetime, timedelta, timezone
from typing import Any
import requests
from sqlalchemy import text
from .database import db
logger = logging.getLogger(__name__)
COVE_DEFAULT_URL = "https://api.backup.management/jsonapi"
# Columns to request from EnumerateAccountStatistics
COVE_COLUMNS = [
"I1", # Account/device name
"I18", # Computer name
"I8", # Customer / partner name
"I78", # Active datasource label
"D09F00", # Overall last session status
"D09F09", # Last successful session timestamp
"D09F15", # Last session end timestamp
"D09F08", # 28-day colorbar
# Datasource-specific status (F00) and last session time (F15)
"D1F00", "D1F15", # Files & Folders
"D10F00", "D10F15", # VssMsSql
"D11F00", "D11F15", # VssSharePoint
"D19F00", "D19F15", # M365 Exchange
"D20F00", "D20F15", # M365 OneDrive
"D5F00", "D5F15", # M365 SharePoint
"D23F00", "D23F15", # M365 Teams
]
# Mapping from Cove status code to Backupchecks status string
STATUS_MAP: dict[int, str] = {
1: "Warning", # In process
2: "Error", # Failed
3: "Error", # Aborted
5: "Success", # Completed
6: "Error", # Interrupted
7: "Warning", # NotStarted
8: "Warning", # CompletedWithErrors
9: "Warning", # InProgressWithFaults
10: "Error", # OverQuota
11: "Warning", # NoSelection
12: "Warning", # Restarted
}
# Mapping from Cove status code to readable label
STATUS_LABELS: dict[int, str] = {
1: "In process",
2: "Failed",
3: "Aborted",
5: "Completed",
6: "Interrupted",
7: "Not started",
8: "Completed with errors",
9: "In progress with faults",
10: "Over quota",
11: "No selection",
12: "Restarted",
}
# Datasource label mapping (column prefix → human-readable label)
DATASOURCE_LABELS: dict[str, str] = {
"D1": "Files & Folders",
"D10": "VssMsSql",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
class CoveImportError(Exception):
"""Raised when Cove API interaction fails."""
def _cove_login(url: str, username: str, password: str) -> tuple[str, int]:
"""Login to the Cove API and return (visa, partner_id).
Raises CoveImportError on failure.
"""
payload = {
"jsonrpc": "2.0",
"id": "jsonrpc",
"method": "Login",
"params": {
"username": username,
"password": password,
},
}
try:
resp = requests.post(
url,
json=payload,
headers={"Content-Type": "application/json"},
timeout=30,
)
resp.raise_for_status()
data = resp.json()
except requests.RequestException as exc:
raise CoveImportError(f"Cove login request failed: {exc}") from exc
except ValueError as exc:
raise CoveImportError(f"Cove login response is not valid JSON: {exc}") from exc
if "error" in data and data["error"]:
error = data["error"]
msg = error.get("message") or str(error) if isinstance(error, dict) else str(error)
raise CoveImportError(f"Cove login failed: {msg}")
# Visa is returned at the top level of the response (not inside result)
visa = data.get("visa") or ""
if not visa:
raise CoveImportError("Cove login succeeded but no visa token returned")
# PartnerId is inside result
result = data.get("result") or {}
partner_id = (
result.get("PartnerId")
or result.get("PartnerID")
or result.get("result", {}).get("PartnerId")
or 0
)
return visa, int(partner_id)
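The response unpacking above assumes the visa sits at the top level of the JSON-RPC reply and PartnerId inside `result`. A minimal sketch against a sample response shape (the PartnerId value comes from the research notes; the function name is illustrative):

```python
def parse_login_response_demo(data: dict):
    # visa lives at the top level, not inside result
    visa = data.get("visa") or ""
    result = data.get("result") or {}
    partner_id = result.get("PartnerId") or result.get("PartnerID") or 0
    return visa, int(partner_id)
```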
def _cove_enumerate(
url: str,
visa: str,
partner_id: int,
start: int,
count: int,
) -> list[dict]:
"""Call EnumerateAccountStatistics and return a list of account dicts.
Returns empty list when no more results.
"""
payload = {
"jsonrpc": "2.0",
"visa": visa,
"id": "jsonrpc",
"method": "EnumerateAccountStatistics",
"params": {
"query": {
"PartnerId": partner_id,
"StartRecordNumber": start,
"RecordsCount": count,
"Columns": COVE_COLUMNS,
}
},
}
try:
resp = requests.post(
url,
json=payload,
headers={"Content-Type": "application/json"},
timeout=60,
)
resp.raise_for_status()
data = resp.json()
except requests.RequestException as exc:
raise CoveImportError(f"Cove EnumerateAccountStatistics request failed: {exc}") from exc
except ValueError as exc:
raise CoveImportError(f"Cove EnumerateAccountStatistics response is not valid JSON: {exc}") from exc
if "error" in data and data["error"]:
error = data["error"]
msg = error.get("message") or str(error) if isinstance(error, dict) else str(error)
raise CoveImportError(f"Cove EnumerateAccountStatistics failed: {msg}")
result = data.get("result")
if result is None:
return []
# Unwrap possible nested result
if isinstance(result, dict) and "result" in result:
result = result["result"]
# Accounts can be a list directly or wrapped in an "Accounts" key
if isinstance(result, list):
return result
if isinstance(result, dict):
return result.get("Accounts", []) or []
return []
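The result unwrapping above handles three observed response shapes: a nested `result`, a bare list, and an `Accounts` wrapper. Re-implemented standalone for illustration:

```python
def unwrap_accounts_demo(result):
    # normalise all observed result shapes to a plain list of account dicts
    if result is None:
        return []
    if isinstance(result, dict) and "result" in result:
        result = result["result"]
    if isinstance(result, list):
        return result
    if isinstance(result, dict):
        return result.get("Accounts", []) or []
    return []
```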
def _flatten_settings(account: dict) -> dict:
"""Convert the Settings array in an account dict to a flat key→value dict.
Cove returns settings as a list of single-key dicts, e.g.:
[{"D09F00": "5"}, {"I1": "device name"}, ...]
"""
flat: dict[str, Any] = {}
settings_list = account.get("Settings") or []
if isinstance(settings_list, list):
for item in settings_list:
if isinstance(item, dict):
flat.update(item)
return flat
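The Settings flattening can be sketched standalone (the sample payload is illustrative, but mirrors the list-of-single-key-dicts shape documented in the docstring above):

```python
def flatten_settings_demo(account: dict) -> dict:
    # Settings is a list of single-key dicts; merge them into one flat dict
    flat = {}
    for item in account.get("Settings") or []:
        if isinstance(item, dict):
            flat.update(item)
    return flat
```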
def _map_status(code: Any) -> str:
"""Map a Cove status code (int) to a Backupchecks status string."""
if code is None:
return "Warning"
try:
return STATUS_MAP.get(int(code), "Warning")
except (ValueError, TypeError):
return "Warning"
def _status_label(code: Any) -> str:
"""Map a Cove status code (int) to a human-readable label."""
if code is None:
return "Unknown"
try:
return STATUS_LABELS.get(int(code), f"Code {int(code)}")
except (ValueError, TypeError):
return "Unknown"
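The defensive int-conversion around the status lookup is easy to miss: codes arrive as strings from the flattened settings and may be missing or malformed. A trimmed sketch using a subset of the codes (subset chosen for brevity):

```python
STATUS_MAP_DEMO = {2: "Error", 5: "Success", 8: "Warning"}

def map_status_demo(code):
    # unknown, missing, or unparsable codes all degrade to Warning
    if code is None:
        return "Warning"
    try:
        return STATUS_MAP_DEMO.get(int(code), "Warning")
    except (ValueError, TypeError):
        return "Warning"
```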
def _ts_to_dt(value: Any) -> datetime | None:
"""Convert a Unix timestamp (int or str) to a naive UTC datetime."""
ts = _extract_unix_ts(value)
if ts is None:
return None
try:
return datetime.fromtimestamp(ts, tz=timezone.utc).replace(tzinfo=None)
except (ValueError, TypeError, OSError):
return None
def _extract_unix_ts(value: Any) -> int | None:
"""Extract Unix epoch seconds from Cove timestamp variants.
Supports:
- plain epoch seconds ("1735689600")
- epoch milliseconds ("1735689600000")
- .NET JSON date style ("/Date(1735689600000)/")
"""
if value is None:
return None
# Plain int/float values
if isinstance(value, (int, float)):
ts = int(value)
else:
s = str(value).strip()
if not s:
return None
# Handle /Date(1735689600000)/ (optionally with timezone suffix)
m = re.search(r"/Date\(([-]?\d+)(?:[+-]\d{4})?\)/", s)
if m:
s = m.group(1)
try:
ts = int(s)
except (ValueError, TypeError):
return None
if ts <= 0:
return None
# Convert ms/us/ns to seconds when needed
if ts > 10_000_000_000_000_000: # ns
ts = ts // 1_000_000_000
elif ts > 10_000_000_000_000: # us
ts = ts // 1_000_000
elif ts > 10_000_000_000: # ms
ts = ts // 1_000
return ts if ts > 0 else None
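The magnitude-based unit detection is the core of the timestamp handling noted in the v0.2.0 changelog (seconds, ms, us, ns, and .NET JSON dates). A self-contained sketch of the same normalisation:

```python
import re

def extract_unix_ts_demo(value):
    # accepts epoch seconds, ms/us/ns, and .NET "/Date(...)/" strings
    if value is None:
        return None
    if isinstance(value, (int, float)):
        ts = int(value)
    else:
        s = str(value).strip()
        m = re.search(r"/Date\((-?\d+)(?:[+-]\d{4})?\)/", s)
        if m:
            s = m.group(1)
        try:
            ts = int(s)
        except ValueError:
            return None
    if ts > 10_000_000_000_000_000:   # nanoseconds
        ts //= 1_000_000_000
    elif ts > 10_000_000_000_000:     # microseconds
        ts //= 1_000_000
    elif ts > 10_000_000_000:         # milliseconds
        ts //= 1_000
    return ts if ts > 0 else None
```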
def _fmt_utc(dt: datetime | None) -> str:
"""Format a naive UTC datetime to readable text for run object messages."""
if not dt:
return "unknown"
return dt.strftime("%Y-%m-%d %H:%M UTC")
def _backfill_colorbar_runs(
cove_acc,
job,
colorbar: str,
last_run_at: datetime,
) -> int:
"""Create historical JobRun records from the 28-day colorbar string (D09F08).
The colorbar encodes backup status for each of the last 28 days. This is
called after an account is first linked so that historical data is visible
immediately instead of only accumulating day-by-day going forward.
Supported formats:
- Continuous string: "55525555..." (each char = 1 day, oldest first)
- Separated: "5 5 2 5 5..." or "5,5,2,5,5..."
Status code 0 means no backup ran that day and is skipped.
Returns the number of runs created.
"""
from .models import JobRun # local import to avoid circular deps
if not colorbar or not last_run_at:
return 0
# Normalise to a list of raw code strings
stripped = colorbar.strip()
if " " in stripped or "," in stripped:
raw_codes = re.split(r"[,\s]+", stripped)
else:
raw_codes = list(stripped)
if not raw_codes:
return 0
# Oldest day is first (index 0), newest day is last (index -1).
# The newest entry corresponds to the real run already created; skip it.
newest_date = last_run_at.date()
ref_time = last_run_at.time() # use same time-of-day as the real run
created = 0
for i, code_raw in enumerate(raw_codes):
try:
code = int(str(code_raw).strip())
except (ValueError, TypeError):
continue
if code == 0:
continue # no backup that day
# How many days before the newest date is this position?
days_ago = len(raw_codes) - 1 - i
if days_ago == 0:
# Most-recent day — real run already exists with precise timestamp
continue
run_date = newest_date - timedelta(days=days_ago)
run_dt = datetime(
run_date.year, run_date.month, run_date.day,
ref_time.hour, ref_time.minute, ref_time.second,
)
date_str = run_date.isoformat()
external_id = f"cove-colorbar-{cove_acc.account_id}-{date_str}"
existing = JobRun.query.filter_by(
job_id=job.id, external_id=external_id
).first()
if existing:
continue
status = _map_status(code)
run = JobRun(
job_id=job.id,
mail_message_id=None,
run_at=run_dt,
status=status,
remark=(
f"Cove historical run (28-day colorbar) | "
f"date: {date_str} | "
f"status: {_status_label(code)} ({code})"
),
missed=False,
override_applied=False,
source_type="cove_api",
external_id=external_id,
)
db.session.add(run)
db.session.flush()
created += 1
return created
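The colorbar normalisation and day arithmetic can be shown without the ORM. A minimal sketch that turns a colorbar string into (calendar day, status code) pairs, oldest first, skipping code 0 and the newest day exactly as the backfill does (date values are illustrative):

```python
import re
from datetime import date, timedelta

def colorbar_days_demo(colorbar: str, newest: date):
    stripped = colorbar.strip()
    # continuous "5025" vs separated "5 0 2 5" / "5,0,2,5"
    raw = re.split(r"[,\s]+", stripped) if (" " in stripped or "," in stripped) else list(stripped)
    out = []
    for i, c in enumerate(raw):
        try:
            code = int(c)
        except ValueError:
            continue
        if code == 0:
            continue  # no backup ran that day
        days_ago = len(raw) - 1 - i
        if days_ago == 0:
            continue  # newest day already has a real run with a precise timestamp
        out.append((newest - timedelta(days=days_ago), code))
    return out
```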
def run_cove_import(settings, include_reasons: bool = False):
"""Fetch Cove account statistics and update the staging table + JobRuns.
For every account:
- Upsert into cove_accounts (always)
- If the account has a linked job create a JobRun if not already seen
Args:
settings: SystemSettings ORM object with cove_* fields.
Returns:
By default: tuple(total_accounts, created_runs, skipped_runs, error_count).
When include_reasons=True:
tuple(total_accounts, created_runs, skipped_runs, error_count, reason_counts).
Raises:
CoveImportError if the API login fails.
"""
url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL
username = (getattr(settings, "cove_api_username", None) or "").strip()
password = (getattr(settings, "cove_api_password", None) or "").strip()
if not username or not password:
raise CoveImportError("Cove API username or password not configured")
visa, partner_id = _cove_login(url, username, password)
# Save partner_id back to settings
if partner_id and partner_id != getattr(settings, "cove_partner_id", None):
settings.cove_partner_id = partner_id
try:
db.session.commit()
except Exception:
db.session.rollback()
total = 0
created = 0
skipped = 0
errors = 0
reason_counts: dict[str, int] = {}
page_size = 250
start = 0
while True:
try:
accounts = _cove_enumerate(url, visa, partner_id, start, page_size)
except CoveImportError:
raise
except Exception as exc:
raise CoveImportError(f"Unexpected error fetching accounts at offset {start}: {exc}") from exc
if not accounts:
break
for account in accounts:
total += 1
try:
reason = _process_account(account)
if reason == "created":
created += 1
else:
skipped += 1
reason_counts[reason] = reason_counts.get(reason, 0) + 1
except Exception as exc:
errors += 1
reason_counts["error"] = reason_counts.get("error", 0) + 1
logger.warning("Cove import: error processing account: %s", exc)
try:
db.session.rollback()
except Exception:
pass
if len(accounts) < page_size:
break
start += page_size
# Update last import timestamp
settings.cove_last_import_at = datetime.utcnow()
try:
db.session.commit()
except Exception:
db.session.rollback()
if include_reasons:
return total, created, skipped, errors, reason_counts
return total, created, skipped, errors
def _process_account(account: dict) -> str:
"""Upsert a Cove account into the staging table and create a JobRun if linked.
Returns a result code:
- "created"
- "skip_invalid_account_id"
- "skip_unlinked"
- "skip_no_timestamp"
- "skip_missing_job"
- "skip_duplicate"
"""
from .models import CoveAccount, Job, JobRun
flat = _flatten_settings(account)
# AccountId is a top-level field
account_id = account.get("AccountId") or account.get("AccountID")
if not account_id:
return "skip_invalid_account_id"
try:
account_id = int(account_id)
except (ValueError, TypeError):
return "skip_invalid_account_id"
# Extract metadata from flat settings
account_name = (flat.get("I1") or "").strip() or None
computer_name = (flat.get("I18") or "").strip() or None
customer_name = (flat.get("I8") or "").strip() or None
datasource_types = (flat.get("I78") or "").strip() or None
# Prefer "last session end" (D09F15); fallback to "last successful session" (D09F09)
# so accounts without D09F15 can still produce an initial run.
last_run_ts_raw = flat.get("D09F15")
last_run_at = _ts_to_dt(last_run_ts_raw)
if last_run_at is None:
last_run_ts_raw = flat.get("D09F09")
last_run_at = _ts_to_dt(last_run_ts_raw)
colorbar_28d = (flat.get("D09F08") or "").strip() or None
try:
last_status_code = int(flat["D09F00"]) if flat.get("D09F00") is not None else None
except (ValueError, TypeError):
last_status_code = None
# Upsert into cove_accounts staging table
cove_acc = CoveAccount.query.filter_by(account_id=account_id).first()
if cove_acc is None:
cove_acc = CoveAccount(
account_id=account_id,
first_seen_at=datetime.utcnow(),
)
db.session.add(cove_acc)
cove_acc.account_name = account_name
cove_acc.computer_name = computer_name
cove_acc.customer_name = customer_name
cove_acc.datasource_types = datasource_types
cove_acc.last_status_code = last_status_code
cove_acc.last_run_at = last_run_at
cove_acc.colorbar_28d = colorbar_28d
cove_acc.last_seen_at = datetime.utcnow()
db.session.flush() # ensure cove_acc.id is set
# Backwards-compat fallback: resolve link from jobs.cove_account_id.
# This keeps legacy/manual job-detail linking working even if cove_accounts.job_id
# was not set yet.
if not cove_acc.job_id:
fallback_job = (
Job.query
.filter(Job.cove_account_id == account_id, Job.archived.is_(False))
.order_by(Job.id.asc())
.first()
)
if fallback_job:
cove_acc.job_id = fallback_job.id
# If still not linked to a job, nothing more to do (shows up in Cove Accounts page)
if not cove_acc.job_id:
db.session.commit()
return "skip_unlinked"
# Account is linked: create a JobRun if the last session is new
if not last_run_at:
db.session.commit()
return "skip_no_timestamp"
run_ts = _extract_unix_ts(last_run_ts_raw)
if not run_ts:
# Fallback to parsed datetime to keep dedup stable for non-numeric raw formats.
run_ts = int(last_run_at.replace(tzinfo=timezone.utc).timestamp())
# Fetch the linked job
job = Job.query.get(cove_acc.job_id)
if not job:
db.session.commit()
return "skip_missing_job"
external_id = f"cove-{account_id}-{run_ts}"
# Deduplicate per job + session, not globally.
# This avoids blocking a run on a newly linked/relinked job when the same
# Cove session was previously stored under another job.
existing = JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()
if existing:
db.session.commit()
return "skip_duplicate"
status = _map_status(last_status_code)
run_remark = (
f"Cove account: {account_name or account_id} | "
f"Computer: {computer_name or '-'} | "
f"Customer: {customer_name or '-'} | "
f"Last status: {_status_label(last_status_code)} ({last_status_code if last_status_code is not None else '-'}) | "
f"Last run: {_fmt_utc(last_run_at)}"
)
run = JobRun(
job_id=job.id,
mail_message_id=None,
run_at=last_run_at,
status=status,
remark=run_remark,
missed=False,
override_applied=False,
source_type="cove_api",
external_id=external_id,
)
db.session.add(run)
db.session.flush() # get run.id
# Persist per-datasource objects
if job.customer_id:
_persist_datasource_objects(flat, job.customer_id, job.id, run.id, last_run_at)
# Backfill historical runs from the 28-day colorbar when this is the first
# real run for this job (i.e. the account was just linked). Deduplication
# via external_id makes this safe to call on every import.
if colorbar_28d:
backfilled = _backfill_colorbar_runs(cove_acc, job, colorbar_28d, last_run_at)
if backfilled:
logger.info(
"Cove backfill: created %d historical runs for account %s",
backfilled,
account_id,
)
db.session.commit()
return "created"
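The `external_id` format used above (`cove-<account_id>-<unix_ts>`) is what makes re-imports idempotent per job: the same Cove session can never create two runs for the same job, while a relinked account can still re-import that session under its new job. A minimal sketch of that dedup rule (names are illustrative, not the real models):

```python
def make_external_id(account_id: int, run_ts: int) -> str:
    # Same format as the importer: one id per Cove session.
    return f"cove-{account_id}-{run_ts}"

def should_create_run(seen: set[tuple[int, str]], job_id: int, external_id: str) -> bool:
    # Dedup per (job, session), not globally: relinking an account to a
    # different job may legitimately re-import the same session.
    key = (job_id, external_id)
    if key in seen:
        return False
    seen.add(key)
    return True

seen: set[tuple[int, str]] = set()
eid = make_external_id(139124, 1708675200)
assert should_create_run(seen, job_id=1, external_id=eid) is True   # first import
assert should_create_run(seen, job_id=1, external_id=eid) is False  # duplicate session
assert should_create_run(seen, job_id=2, external_id=eid) is True   # same session, other job
```

In the real code the `seen` set is replaced by the `JobRun.query.filter_by(job_id=..., external_id=...)` lookup, so the rule survives restarts.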
def _persist_datasource_objects(
flat: dict,
customer_id: int,
job_id: int,
run_id: int,
observed_at: datetime,
) -> None:
"""Create run_object_links for each active datasource found in the account stats."""
for ds_prefix, ds_label in DATASOURCE_LABELS.items():
status_key = f"{ds_prefix}F00"
status_code = flat.get(status_key)
if status_code is None:
continue
status = _map_status(status_code)
ds_last_ts = _ts_to_dt(flat.get(f"{ds_prefix}F15"))
status_msg = (
f"Cove datasource status: {_status_label(status_code)} "
f"({status_code}); last session: {_fmt_utc(ds_last_ts)}"
)
# Use the same SQLAlchemy session/transaction as JobRun creation.
# A separate engine connection cannot reliably see the uncommitted run row.
customer_object_id = db.session.execute(
text(
"""
INSERT INTO customer_objects (customer_id, object_name, object_type, first_seen_at, last_seen_at)
VALUES (:customer_id, :object_name, :object_type, NOW(), NOW())
ON CONFLICT (customer_id, object_name)
DO UPDATE SET
last_seen_at = NOW(),
object_type = COALESCE(EXCLUDED.object_type, customer_objects.object_type)
RETURNING id
"""
),
{
"customer_id": customer_id,
"object_name": ds_label,
"object_type": "cove_datasource",
},
).scalar()
db.session.execute(
text(
"""
INSERT INTO job_object_links (job_id, customer_object_id, first_seen_at, last_seen_at)
VALUES (:job_id, :customer_object_id, NOW(), NOW())
ON CONFLICT (job_id, customer_object_id)
DO UPDATE SET last_seen_at = NOW()
"""
),
{"job_id": job_id, "customer_object_id": customer_object_id},
)
db.session.execute(
text(
"""
INSERT INTO run_object_links (run_id, customer_object_id, status, error_message, observed_at)
VALUES (:run_id, :customer_object_id, :status, :error_message, :observed_at)
ON CONFLICT (run_id, customer_object_id)
DO UPDATE SET
status = EXCLUDED.status,
error_message = EXCLUDED.error_message,
observed_at = EXCLUDED.observed_at
"""
),
{
"run_id": run_id,
"customer_object_id": customer_object_id,
"status": status,
"error_message": status_msg,
"observed_at": ds_last_ts or observed_at,
},
)


@@ -1,101 +0,0 @@
"""Cove Data Protection importer background service.
Runs a background thread that periodically fetches backup job run data
from the Cove API and creates JobRun records in the local database.
"""
from __future__ import annotations
import threading
import time
from datetime import datetime
from .admin_logging import log_admin_event
from .cove_importer import CoveImportError, run_cove_import
from .models import SystemSettings
_COVE_IMPORTER_THREAD_NAME = "cove_importer"
def start_cove_importer(app) -> None:
"""Start the Cove importer background thread.
The thread checks settings on every loop and only runs imports when
enabled and the configured interval has elapsed.
"""
# Avoid starting multiple threads if create_app() is called more than once.
if any(t.name == _COVE_IMPORTER_THREAD_NAME for t in threading.enumerate()):
return
def _worker() -> None:
last_run_at: datetime | None = None
while True:
try:
with app.app_context():
settings = SystemSettings.query.first()
if settings is None:
time.sleep(10)
continue
enabled = bool(getattr(settings, "cove_import_enabled", False))
try:
interval_minutes = int(getattr(settings, "cove_import_interval_minutes", 30) or 30)
except (TypeError, ValueError):
interval_minutes = 30
if interval_minutes < 1:
interval_minutes = 1
now = datetime.utcnow()
due = False
if enabled:
if last_run_at is None:
due = True
else:
due = (now - last_run_at).total_seconds() >= (interval_minutes * 60)
if not due:
time.sleep(5)
continue
try:
total, created, skipped, errors = run_cove_import(settings)
except CoveImportError as exc:
log_admin_event(
"cove_import_error",
f"Cove import failed: {exc}",
)
last_run_at = now
time.sleep(5)
continue
except Exception as exc:
log_admin_event(
"cove_import_error",
f"Unexpected error during Cove import: {exc}",
)
last_run_at = now
time.sleep(5)
continue
log_admin_event(
"cove_import",
f"Cove import finished. accounts={total}, created={created}, skipped={skipped}, errors={errors}",
)
last_run_at = now
except Exception:
# Never let the thread die.
try:
with app.app_context():
log_admin_event(
"cove_import_error",
"Cove importer thread recovered from an unexpected exception.",
)
except Exception:
pass
time.sleep(5)
t = threading.Thread(target=_worker, name=_COVE_IMPORTER_THREAD_NAME, daemon=True)
t.start()


@@ -35,11 +35,6 @@ def find_matching_job(msg: MailMessage) -> Optional[Job]:
     q = Job.query
-    # Never auto-match archived jobs.
-    # Archived jobs should remain historical and must not receive new mail links/runs.
-    if hasattr(Job, "archived"):
-        q = q.filter(Job.archived.is_(False))
     if norm_from is None:
         q = q.filter(Job.from_address.is_(None))
     else:
@@ -91,8 +86,6 @@ def find_matching_job(msg: MailMessage) -> Optional[Job]:
         return None
     q2 = Job.query
-    if hasattr(Job, "archived"):
-        q2 = q2.filter(Job.archived.is_(False))
     if norm_from is None:
         q2 = q2.filter(Job.from_address.is_(None))
     else:


@@ -14,7 +14,6 @@ from . import db
 from .models import MailMessage, SystemSettings, Job, JobRun, MailObject
 from .parsers import parse_mail_message
 from .parsers.veeam import extract_vspc_active_alarms_companies
-from .cloud_connect_importer import upsert_cloud_connect_report
 from .email_utils import normalize_from_address, extract_best_html_from_eml, is_effectively_blank_html
 from .job_matching import find_matching_job
 from .ticketing_utils import link_open_internal_tickets_to_run
@@ -273,37 +272,6 @@ def _store_messages(settings: SystemSettings, messages):
             btype = (getattr(mail, "backup_type", "") or "").strip().lower()
             jname = (getattr(mail, "job_name", "") or "").strip().lower()
-            # ── Veeam Cloud Connect daily report ──────────────────────
-            # One report contains all tenants. Upsert each into the
-            # cloud_connect_accounts staging table; linked accounts get
-            # a JobRun automatically — same flow as Cove Data Protection.
-            if bsw == "veeam" and btype == "cloud connect report":
-                try:
-                    result = upsert_cloud_connect_report(
-                        mail_message_id=mail.id,
-                        html_body=(mail.html_body or ""),
-                    )
-                    logger.debug(
-                        "Cloud Connect import: total=%s linked=%s unlinked=%s "
-                        "created=%s skipped=%s",
-                        result.get("total"), result.get("linked"),
-                        result.get("unlinked"), result.get("created"),
-                        result.get("skipped"),
-                    )
-                    if result.get("created", 0) > 0 or result.get("linked", 0) > 0:
-                        if hasattr(mail, "approved"):
-                            mail.approved = True
-                        if hasattr(mail, "approved_at"):
-                            mail.approved_at = datetime.utcnow()
-                        if hasattr(mail, "location"):
-                            mail.location = "history"
-                        auto_approved += 1
-                except Exception as cc_exc:
-                    logger.warning("Cloud Connect import failed: %s", cc_exc)
-                db.session.commit()
-                continue
-            # ── end Cloud Connect ──────────────────────────────────────
             if bsw == "veeam" and btype == "service provider console" and jname == "active alarms summary":
                 raw = (mail.text_body or "").strip() or (mail.html_body or "")
                 companies = extract_vspc_active_alarms_companies(raw)


@@ -26,8 +26,5 @@
 from . import routes_feedback  # noqa: F401
 from . import routes_api  # noqa: F401
 from . import routes_reporting_api  # noqa: F401
 from . import routes_user_settings  # noqa: F401
-from . import routes_search  # noqa: F401
-from . import routes_cove  # noqa: F401
-from . import routes_cloud_connect  # noqa: F401
 __all__ = ["main_bp", "roles_required"]


@@ -16,11 +16,9 @@ def api_job_run_alerts(run_id: int):
     tickets = []
     remarks = []
-    # Tickets linked to this run:
-    # 1. Explicitly linked via ticket_job_runs (audit trail when resolved)
-    # 2. Linked to the job via ticket_scopes (active on run date)
+    # Tickets linked to this specific run
+    # Only show tickets that were explicitly linked via ticket_job_runs
     try:
-        # First, get tickets explicitly linked to this run via ticket_job_runs
        rows = (
            db.session.execute(
                text(
@@ -45,11 +43,7 @@ def api_job_run_alerts(run_id: int):
            .all()
        )
-        ticket_ids_seen = set()
        for r in rows:
-            ticket_id = int(r.get("id"))
-            ticket_ids_seen.add(ticket_id)
            resolved_at = r.get("resolved_at")
            resolved_same_day = False
            if resolved_at and run_date:
@@ -58,62 +52,7 @@ def api_job_run_alerts(run_id: int):
            tickets.append(
                {
-                    "id": ticket_id,
-                    "ticket_code": r.get("ticket_code") or "",
-                    "description": r.get("description") or "",
-                    "start_date": _format_datetime(r.get("start_date")),
-                    "active_from_date": str(r.get("active_from_date")) if r.get("active_from_date") else "",
-                    "resolved_at": _format_datetime(r.get("resolved_at")) if r.get("resolved_at") else "",
-                    "active": bool(active_now),
-                    "resolved_same_day": bool(resolved_same_day),
-                }
-            )
-        # Second, get tickets linked to the job via ticket_scopes
-        # These are tickets that apply to the whole job (not just a specific run)
-        rows = (
-            db.session.execute(
-                text(
-                    """
-                    SELECT DISTINCT t.id,
-                           t.ticket_code,
-                           t.description,
-                           t.start_date,
-                           t.resolved_at,
-                           t.active_from_date
-                    FROM tickets t
-                    JOIN ticket_scopes ts ON ts.ticket_id = t.id
-                    WHERE ts.job_id = :job_id
-                      AND t.active_from_date <= :run_date
-                      AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
-                    ORDER BY t.start_date DESC
-                    """
-                ),
-                {
-                    "job_id": job.id if job else 0,
-                    "run_date": run_date,
-                },
-            )
-            .mappings()
-            .all()
-        )
-        for r in rows:
-            ticket_id = int(r.get("id"))
-            # Skip if already added via ticket_job_runs
-            if ticket_id in ticket_ids_seen:
-                continue
-            ticket_ids_seen.add(ticket_id)
-            resolved_at = r.get("resolved_at")
-            resolved_same_day = False
-            if resolved_at and run_date:
-                resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
-            active_now = r.get("resolved_at") is None
-            tickets.append(
-                {
-                    "id": ticket_id,
+                    "id": int(r.get("id")),
                    "ticket_code": r.get("ticket_code") or "",
                    "description": r.get("description") or "",
                    "start_date": _format_datetime(r.get("start_date")),
@@ -126,13 +65,9 @@ def api_job_run_alerts(run_id: int):
    except Exception as exc:
        return jsonify({"status": "error", "message": str(exc) or "Failed to load tickets."}), 500
-    # Remarks linked to this run:
-    # 1. Explicitly linked via remark_job_runs (audit trail when resolved)
-    # 2. Linked to the job via remark_scopes (active on run date)
+    # Remarks linked to this specific run
+    # Only show remarks that were explicitly linked via remark_job_runs
    try:
-        remark_ids_seen = set()
-        # First, remarks explicitly linked to this run.
        rows = (
            db.session.execute(
                text(
@@ -153,9 +88,6 @@ def api_job_run_alerts(run_id: int):
        )
        for rr in rows:
-            remark_id = int(rr.get("id"))
-            remark_ids_seen.add(remark_id)
            body = (rr.get("body") or "").strip()
            if len(body) > 180:
                body = body[:177] + "..."
@@ -169,64 +101,7 @@ def api_job_run_alerts(run_id: int):
            remarks.append(
                {
-                    "id": remark_id,
-                    "body": body,
-                    "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
-                    "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
-                    "resolved_at": _format_datetime(rr.get("resolved_at")) if rr.get("resolved_at") else "",
-                    "active": bool(active_now),
-                    "resolved_same_day": bool(resolved_same_day),
-                }
-            )
-        # Second, active job-level remarks from scope (not yet explicitly linked to this run).
-        ui_tz = _get_ui_timezone_name()
-        rows = (
-            db.session.execute(
-                text(
-                    """
-                    SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
-                    FROM remarks r
-                    JOIN remark_scopes rs ON rs.remark_id = r.id
-                    WHERE rs.job_id = :job_id
-                      AND COALESCE(
-                            r.active_from_date,
-                            ((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
-                          ) <= :run_date
-                      AND r.resolved_at IS NULL
-                    ORDER BY r.start_date DESC
-                    """
-                ),
-                {
-                    "job_id": job.id if job else 0,
-                    "run_date": run_date,
-                    "ui_tz": ui_tz,
-                },
-            )
-            .mappings()
-            .all()
-        )
-        for rr in rows:
-            remark_id = int(rr.get("id"))
-            if remark_id in remark_ids_seen:
-                continue
-            remark_ids_seen.add(remark_id)
-            body = (rr.get("body") or "").strip()
-            if len(body) > 180:
-                body = body[:177] + "..."
-            resolved_at = rr.get("resolved_at")
-            resolved_same_day = False
-            if resolved_at and run_date:
-                resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
-            active_now = resolved_at is None or (not resolved_same_day)
-            remarks.append(
-                {
-                    "id": remark_id,
+                    "id": int(rr.get("id")),
                    "body": body,
                    "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
                    "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",


@@ -1,219 +0,0 @@
"""Veeam Cloud Connect accounts review routes.
Mirrors the Cove Accounts flow:
/cloud-connect/accounts list all accounts (unmatched first)
/cloud-connect/accounts/scan-inbox process existing inbox mails
/cloud-connect/accounts/<id>/link link to existing job or create new job
/cloud-connect/accounts/<id>/unlink remove the job link
"""
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _log_admin_event
from ..models import CloudConnectAccount, Customer, Job, JobRun, MailMessage
from ..cloud_connect_importer import upsert_cloud_connect_report
@main_bp.route("/cloud-connect/accounts")
@login_required
@roles_required("admin", "operator")
def cloud_connect_accounts():
# Unmatched accounts shown first, then matched — same as Cove Accounts
unmatched = (
CloudConnectAccount.query
.filter(CloudConnectAccount.job_id.is_(None))
.order_by(CloudConnectAccount.user.asc(), CloudConnectAccount.section.asc())
.all()
)
matched = (
CloudConnectAccount.query
.filter(CloudConnectAccount.job_id.isnot(None))
.order_by(CloudConnectAccount.user.asc(), CloudConnectAccount.section.asc())
.all()
)
customers = Customer.query.filter_by(active=True).order_by(Customer.name.asc()).all()
customer_rows = [{"id": c.id, "name": c.name} for c in customers]
jobs = Job.query.filter_by(archived=False).order_by(Job.job_name.asc()).all()
# Attach derived fields for the template
for acc in unmatched + matched:
acc.derived_backup_software = "Veeam"
acc.derived_backup_type = (
"Cloud Connect Agent" if acc.section == "Agent" else "Cloud Connect Backup"
)
# Use repo_name as the suggested job name if set (distinguishes multiple repos
# per user); fall back to user name when repo_name is absent.
acc.derived_job_name = acc.repo_name.strip() if acc.repo_name and acc.repo_name.strip() else acc.user
return render_template(
"main/cloud_connect_accounts.html",
unmatched=unmatched,
matched=matched,
customers=customer_rows,
jobs=jobs,
)
@main_bp.route("/cloud-connect/accounts/scan-inbox", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cloud_connect_scan_inbox():
"""Process all existing inbox mails that are Cloud Connect report emails."""
mails = (
MailMessage.query
.filter(
MailMessage.location == "inbox",
db.func.lower(MailMessage.backup_type) == "cloud connect report",
)
.all()
)
total = len(mails)
total_accounts = 0
total_created = 0
errors = 0
for mail in mails:
try:
result = upsert_cloud_connect_report(
mail_message_id=mail.id,
html_body=(mail.html_body or ""),
)
total_accounts += result.get("total", 0)
total_created += result.get("created", 0)
except Exception as exc:
errors += 1
_log_admin_event(
event_type="cloud_connect_scan_error",
message=f"Failed to process mail {mail.id}: {exc}",
)
db.session.commit()
_log_admin_event(
event_type="cloud_connect_scan_inbox",
message=(
f"Scanned {total} inbox mail(s): "
f"{total_accounts} accounts upserted, {total_created} new runs created, {errors} error(s)"
),
)
flash(
f"Scanned {total} mail(s): {total_accounts} accounts upserted, "
f"{total_created} new runs created."
+ (f" {errors} error(s)." if errors else ""),
"success" if errors == 0 else "warning",
)
return redirect(url_for("main.cloud_connect_accounts"))
def _import_historical_runs(acc) -> int:
"""Re-process all stored Cloud Connect report emails for a newly linked account.
Returns the total number of new JobRun records created.
"""
mails = (
MailMessage.query
.filter(
db.func.lower(MailMessage.backup_type) == "cloud connect report",
MailMessage.location != "deleted",
MailMessage.html_body.isnot(None),
)
.order_by(MailMessage.received_at.asc())
.all()
)
runs_created = 0
for mail in mails:
try:
result = upsert_cloud_connect_report(
mail_message_id=mail.id,
html_body=mail.html_body or "",
)
runs_created += result.get("created", 0)
except Exception as exc:
_log_admin_event(
event_type="cloud_connect_historical_import_error",
message=f"Failed to re-process mail {mail.id}: {exc}",
)
return runs_created
@main_bp.route("/cloud-connect/accounts/<int:cc_account_db_id>/link", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cloud_connect_account_link(cc_account_db_id: int):
acc = CloudConnectAccount.query.get_or_404(cc_account_db_id)
action = (request.form.get("action") or "").strip() # "create" or "link"
if action == "create":
customer_id = request.form.get("customer_id", type=int)
if not customer_id:
flash("Please select a customer.", "danger")
return redirect(url_for("main.cloud_connect_accounts"))
customer = Customer.query.get_or_404(customer_id)
job_name = (acc.repo_name.strip() if acc.repo_name and acc.repo_name.strip() else acc.user.strip())
backup_type = "Cloud Connect Agent" if acc.section == "Agent" else "Cloud Connect Backup"
job = Job(
customer_id=customer.id,
backup_software="Veeam",
backup_type=backup_type,
job_name=job_name,
)
db.session.add(job)
db.session.flush()
acc.job_id = job.id
db.session.commit()
runs_created = _import_historical_runs(acc)
_log_admin_event(
event_type="cloud_connect_account_linked",
message=f"Cloud Connect account '{acc.user}' ({acc.section}) linked to new job '{job_name}' ({runs_created} historical run(s) created)",
details=f"customer={customer.name}, job_name={job_name}",
)
flash(
f"Job '{job_name}' created and linked to '{acc.user}' ({acc.section}). "
f"{runs_created} historical run(s) imported.",
"success",
)
elif action == "link":
job_id = request.form.get("job_id", type=int)
if not job_id:
flash("Please select a job.", "danger")
return redirect(url_for("main.cloud_connect_accounts"))
job = Job.query.get_or_404(job_id)
acc.job_id = job.id
db.session.commit()
runs_created = _import_historical_runs(acc)
_log_admin_event(
event_type="cloud_connect_account_linked",
message=f"Cloud Connect account '{acc.user}' ({acc.section}) linked to existing job '{job.job_name}' ({runs_created} historical run(s) created)",
details=f"job_id={job.id}, job_name={job.job_name}",
)
flash(
f"Linked '{acc.user}' ({acc.section}) to job '{job.job_name}'. "
f"{runs_created} historical run(s) imported.",
"success",
)
else:
flash("Unknown action.", "danger")
return redirect(url_for("main.cloud_connect_accounts"))
@main_bp.route("/cloud-connect/accounts/<int:cc_account_db_id>/unlink", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cloud_connect_account_unlink(cc_account_db_id: int):
acc = CloudConnectAccount.query.get_or_404(cc_account_db_id)
acc.job_id = None
db.session.commit()
flash(f"Unlinked '{acc.user}' ({acc.section}).", "success")
return redirect(url_for("main.cloud_connect_accounts"))


@@ -1,393 +0,0 @@
"""Cove Data Protection account review routes.
Mirrors the Inbox flow for mail messages:
/cove/accounts list all Cove accounts (unmatched first)
/cove/accounts/<id>/link link an account to an existing or new job
/cove/accounts/<id>/unlink remove the job link
"""
import re
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _log_admin_event
from ..cove_importer import CoveImportError, run_cove_import, DATASOURCE_LABELS as _COVE_DS_LABELS
from ..models import CoveAccount, Customer, Job, JobRun, SystemSettings
_COVE_DATASOURCE_LABELS = {
"D01": "Files & Folders",
"D1": "Files & Folders",
"D02": "System State",
"D2": "System State",
"D10": "VssMsSql",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D05": "M365 SharePoint",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
_COVE_M365_CODES = {"D19", "D20", "D05", "D5", "D23"}
_COVE_SERVER_CODES = {"D10", "D11"}
def _parse_cove_datasource_codes(raw: str | None) -> list[str]:
"""Extract datasource codes from Cove I78 strings like 'D01D02D10'."""
text = (raw or "").strip().upper()
if not text:
return []
return re.findall(r"D\d{1,2}", text)
def _derive_backup_type_for_account(cove_acc: CoveAccount) -> str:
"""Return Backupchecks-style backup type for a Cove account.
Heuristic:
- M365 datasource present -> Microsoft 365
- Server-specific datasource -> Server
- Otherwise -> Workstation
"""
codes = set(_parse_cove_datasource_codes(getattr(cove_acc, "datasource_types", None)))
if codes.intersection(_COVE_M365_CODES):
return "Microsoft 365"
if codes.intersection(_COVE_SERVER_CODES):
return "Server"
return "Workstation"
def _humanize_datasources(raw: str | None) -> str:
"""Return readable datasource labels from Cove I78 code string."""
labels: list[str] = []
for code in _parse_cove_datasource_codes(raw):
label = _COVE_DATASOURCE_LABELS.get(code, code)
if label not in labels:
labels.append(label)
return ", ".join(labels)
@main_bp.route("/cove/accounts")
@login_required
@roles_required("admin", "operator")
def cove_accounts():
settings = SystemSettings.query.first()
if not settings or not getattr(settings, "cove_enabled", False):
flash("Cove integration is not enabled.", "warning")
return redirect(url_for("main.settings", section="integrations"))
# Unmatched accounts (no job linked) shown first, like Inbox items
unmatched = (
CoveAccount.query
.filter(CoveAccount.job_id.is_(None))
.order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
.all()
)
# Matched accounts
matched = (
CoveAccount.query
.filter(CoveAccount.job_id.isnot(None))
.order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
.all()
)
customers = Customer.query.filter_by(active=True).order_by(Customer.name.asc()).all()
customer_rows = [{"id": c.id, "name": c.name} for c in customers]
jobs = Job.query.filter_by(archived=False).order_by(Job.job_name.asc()).all()
for acc in unmatched + matched:
acc.derived_backup_software = "Cove Data Protection"
acc.derived_backup_type = _derive_backup_type_for_account(acc)
acc.derived_job_name = (acc.account_name or acc.computer_name or f"Cove account {acc.account_id}").strip()
acc.datasource_display = _humanize_datasources(acc.datasource_types) or ""
return render_template(
"main/cove_accounts.html",
unmatched=unmatched,
matched=matched,
customers=customer_rows,
jobs=jobs,
settings=settings,
STATUS_LABELS={
1: "In process", 2: "Failed", 3: "Aborted", 5: "Completed",
6: "Interrupted", 7: "Not started", 8: "Completed with errors",
9: "In progress with faults", 10: "Over quota",
11: "No selection", 12: "Restarted",
},
STATUS_CLASS={
1: "warning", 2: "danger", 3: "danger", 5: "success",
6: "danger", 7: "secondary", 8: "warning", 9: "warning",
10: "danger", 11: "warning", 12: "warning",
},
)
@main_bp.route("/cove/accounts/<int:cove_account_db_id>/link", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_link(cove_account_db_id: int):
"""Link a Cove account to a job (create a new one or select existing)."""
cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)
action = (request.form.get("action") or "").strip() # "create" or "link"
linked_job_name = ""
if action == "create":
# Create a new job from the Cove account data
customer_id_raw = (request.form.get("customer_id") or "").strip()
if not customer_id_raw:
flash("Please select a customer.", "danger")
return redirect(url_for("main.cove_accounts"))
try:
customer_id = int(customer_id_raw)
except ValueError:
flash("Invalid customer selection.", "danger")
return redirect(url_for("main.cove_accounts"))
customer = Customer.query.get(customer_id)
if not customer:
flash("Customer not found.", "danger")
return redirect(url_for("main.cove_accounts"))
job_name = (cove_acc.account_name or cove_acc.computer_name or f"Cove account {cove_acc.account_id}").strip()
backup_type = _derive_backup_type_for_account(cove_acc)
job = Job(
customer_id=customer.id,
backup_software="Cove Data Protection",
backup_type=backup_type,
job_name=job_name,
cove_account_id=cove_acc.account_id,
active=True,
auto_approve=True,
)
db.session.add(job)
db.session.flush()
cove_acc.job_id = job.id
db.session.commit()
_log_admin_event(
"cove_account_linked",
f"Created job {job.id} and linked Cove account {cove_acc.account_id} ({cove_acc.account_name})",
details=f"customer={customer.name}, job_name={job_name}",
)
linked_job_name = job_name
flash(f"Job '{job_name}' created for customer '{customer.name}'.", "success")
elif action == "link":
# Link to an existing job
job_id_raw = (request.form.get("job_id") or "").strip()
if not job_id_raw:
flash("Please select a job.", "danger")
return redirect(url_for("main.cove_accounts"))
try:
job_id = int(job_id_raw)
except ValueError:
flash("Invalid job selection.", "danger")
return redirect(url_for("main.cove_accounts"))
job = Job.query.get(job_id)
if not job:
flash("Job not found.", "danger")
return redirect(url_for("main.cove_accounts"))
job.cove_account_id = cove_acc.account_id
cove_acc.job_id = job.id
db.session.commit()
_log_admin_event(
"cove_account_linked",
f"Linked Cove account {cove_acc.account_id} ({cove_acc.account_name}) to existing job {job.id}",
details=f"job_name={job.job_name}",
)
linked_job_name = job.job_name or ""
flash(f"Cove account linked to job '{job.job_name}'.", "success")
else:
flash("Unknown action.", "warning")
return redirect(url_for("main.cove_accounts"))
# Trigger an immediate import so the latest Cove run appears right away
# after linking (instead of waiting for the next scheduled/manual import).
settings = SystemSettings.query.first()
if settings and getattr(settings, "cove_enabled", False):
linked_job_id = cove_acc.job_id
before_count = 0
if linked_job_id:
before_count = (
JobRun.query
.filter_by(job_id=linked_job_id, source_type="cove_api")
.count()
)
try:
total, created, skipped, errors = run_cove_import(settings)
after_count = 0
if linked_job_id:
after_count = (
JobRun.query
.filter_by(job_id=linked_job_id, source_type="cove_api")
.count()
)
linked_created = max(after_count - before_count, 0)
_log_admin_event(
"cove_import_after_link",
(
"Triggered immediate Cove import after account link. "
f"accounts={total}, created={created}, skipped={skipped}, errors={errors}"
),
)
if linked_created > 0:
flash(
(
f"Immediate import complete for '{linked_job_name}'. "
f"New linked runs: {linked_created} (accounts: {total}, skipped: {skipped}, errors: {errors})."
),
"success" if errors == 0 else "warning",
)
else:
latest_cove = CoveAccount.query.get(cove_acc.id)
if latest_cove and latest_cove.last_run_at:
reason = (
"latest run seems unchanged (already imported) "
"or Cove has not published a newer session yet"
)
else:
reason = "Cove returned no usable last-session timestamp yet for this account"
flash(
(
f"Immediate import complete for '{linked_job_name}', but no new run was found yet. "
f"Reason: {reason}. (accounts: {total}, skipped: {skipped}, errors: {errors})"
),
"info" if errors == 0 else "warning",
)
except CoveImportError as exc:
_log_admin_event(
"cove_import_after_link_error",
f"Immediate Cove import after account link failed: {exc}",
)
flash(
"Account linked, but immediate import failed. "
"You can run import again from Cove settings.",
"warning",
)
except Exception as exc:
_log_admin_event(
"cove_import_after_link_error",
f"Unexpected immediate Cove import error after account link: {exc}",
)
flash(
"Account linked, but immediate import encountered an unexpected error. "
"You can run import again from Cove settings.",
"warning",
)
return redirect(url_for("main.cove_accounts"))
@main_bp.route("/cove/accounts/<int:cove_account_db_id>/unlink", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_unlink(cove_account_db_id: int):
"""Remove the job link from a Cove account (puts it back in the unmatched list)."""
cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)
old_job_id = cove_acc.job_id
if old_job_id:
job = Job.query.get(old_job_id)
if job and job.cove_account_id == cove_acc.account_id:
job.cove_account_id = None
cove_acc.job_id = None
db.session.commit()
_log_admin_event(
"cove_account_unlinked",
f"Unlinked Cove account {cove_acc.account_id} ({cove_acc.account_name}) from job {old_job_id}",
)
flash("Cove account unlinked.", "success")
return redirect(url_for("main.cove_accounts"))
@main_bp.route("/cove/run/<int:run_id>/detail")
@login_required
@roles_required("admin", "operator", "viewer")
def cove_run_detail(run_id: int):
"""Return structured Cove run details as JSON for the job detail popup."""
from ..database import db
from sqlalchemy import text as _sql_text
run = JobRun.query.get_or_404(run_id)
if getattr(run, "source_type", None) != "cove_api":
return jsonify({"status": "error", "message": "Not a Cove run"}), 400
cove_acc = CoveAccount.query.filter_by(job_id=run.job_id).first()
cove_summary = None
if cove_acc:
raw_ds = (cove_acc.datasource_types or "").strip().upper()
ds_codes = re.findall(r"D\d{1,2}", raw_ds) if raw_ds else []
ds_labels: list[str] = []
for code in ds_codes:
lbl = _COVE_DS_LABELS.get(code, code)
if lbl not in ds_labels:
ds_labels.append(lbl)
cove_summary = {
"account_name": cove_acc.account_name or "",
"computer_name": cove_acc.computer_name or "",
"customer_name": cove_acc.customer_name or "",
"datasources": ", ".join(ds_labels) if ds_labels else (raw_ds or ""),
"last_run_at": cove_acc.last_run_at.strftime("%Y-%m-%d %H:%M") if cove_acc.last_run_at else "",
"status": run.status or "",
}
# Per-run datasource objects from run_object_links
obj_rows = db.session.execute(
_sql_text("""
SELECT co.object_name AS name, rol.status, rol.error_message
FROM run_object_links rol
JOIN customer_objects co ON co.id = rol.customer_object_id
WHERE rol.run_id = :run_id
ORDER BY co.object_name ASC
"""),
{"run_id": run_id},
).mappings().all()
objects = [
{
"name": r["name"] or "",
"type": "",
"status": r["status"] or "",
"error_message": r["error_message"] or "",
}
for r in obj_rows
]
account_label = ""
if cove_acc:
account_label = cove_acc.account_name or cove_acc.computer_name or f"account {cove_acc.account_id}"
return jsonify({
"status": "ok",
"meta": {
"subject": account_label,
"from_address": "",
"backup_software": "Cove Data Protection",
"backup_type": "",
"job_name": "",
"overall_status": run.status or "",
"overall_message": run.remark or "",
"customer_name": cove_acc.customer_name if cove_acc else "",
"received_at": run.run_at.strftime("%Y-%m-%d %H:%M") if run.run_at else "",
"parsed_at": "",
"has_eml": False,
},
"cove_summary": cove_summary,
"cloud_connect_summary": None,
"objects": objects,
"body_html": "",
"mail": None,
})
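Each Cove run shown by this route ultimately comes from scheduled polling of the Cove JSON-RPC API described at the top of this note. Below is a minimal sketch of the two request payloads involved: `Login` (which yields the `visa` token) and `EnumerateAccountStatistics` with the current D10/D11 column codes. The endpoint, visa mechanism, PartnerId and column codes are confirmed above; all other parameter names are assumptions to verify against the N-able developer docs.

```python
# Sketch of the JSON-RPC 2.0 payloads run_cove_import would POST to
# https://api.backup.management/jsonapi (Content-Type: application/json).
# Field names inside "params" are assumptions, not verified API shapes.

def build_login_payload(username: str, password: str) -> dict:
    # Login returns a "visa" token that must accompany all later calls.
    return {
        "jsonrpc": "2.0",
        "id": "1",
        "method": "Login",
        "params": {"username": username, "password": password},
    }

def build_statistics_payload(visa: str, partner_id: int, columns: list[str]) -> dict:
    # EnumerateAccountStatistics with modern column codes (D10/D11),
    # replacing the legacy D02/D03 codes that caused the earlier errors.
    return {
        "jsonrpc": "2.0",
        "id": "2",
        "visa": visa,
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,  # 139124 for MCC Automatisering
                "Columns": columns,
                "StartRecordNumber": 0,
                "RecordsCount": 500,
            }
        },
    }
```

Both builders are pure functions, so the import job can construct and log payloads without touching the network.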

View File

@@ -63,27 +63,7 @@ def _get_or_create_settings_local():
@login_required
@roles_required("admin", "operator", "viewer")
def customers():
q = (request.args.get("q") or "").strip()
items = Customer.query.order_by(Customer.name.asc()).all()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
query = Customer.query
if q:
for pat in _patterns(q):
query = query.filter(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
items = query.order_by(Customer.name.asc()).all()
settings = _get_or_create_settings_local()
autotask_enabled = bool(getattr(settings, "autotask_enabled", False))
@@ -125,7 +105,6 @@ def customers():
can_manage=can_manage,
autotask_enabled=autotask_enabled,
autotask_configured=autotask_configured,
q=q,
)
@@ -505,7 +484,6 @@ def customers_export():
@roles_required("admin", "operator")
def customers_import():
file = request.files.get("file")
include_autotask_ids = bool(request.form.get("include_autotask_ids"))
if not file or not getattr(file, "filename", ""):
flash("No file selected.", "warning")
return redirect(url_for("main.customers"))
@@ -542,11 +520,10 @@
# Detect Autotask columns (backwards compatible - these are optional)
autotask_id_idx = None
autotask_name_idx = None
if include_autotask_ids:
    if "autotask_company_id" in header:
        autotask_id_idx = header.index("autotask_company_id")
    if "autotask_company_name" in header:
        autotask_name_idx = header.index("autotask_company_name")
if "autotask_company_id" in header:
    autotask_id_idx = header.index("autotask_company_id")
if "autotask_company_name" in header:
    autotask_name_idx = header.index("autotask_company_name")
for r in rows[start_idx:]:
if not r:
@@ -584,7 +561,7 @@
if active_val is not None:
existing.active = active_val
# Update Autotask mapping if provided in CSV
if include_autotask_ids and autotask_company_id is not None:
if autotask_company_id is not None:
existing.autotask_company_id = autotask_company_id
existing.autotask_company_name = autotask_company_name
existing.autotask_mapping_status = None  # Will be resynced
@@ -602,10 +579,7 @@
try:
db.session.commit()
flash(
f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
"success",
)
flash(f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}.", "success")
# Audit logging
import json
@@ -614,7 +588,6 @@
f"Imported customers from CSV",
details=json.dumps({
"format": "CSV",
"include_autotask_ids": include_autotask_ids,
"created": created,
"updated": updated,
"skipped": skipped
@@ -626,3 +599,5 @@
flash("Failed to import customers.", "danger")
return redirect(url_for("main.customers"))

View File

@@ -9,21 +9,6 @@ MISSED_GRACE_WINDOW = timedelta(hours=1)
@login_required
@roles_required("admin", "operator", "viewer")
def daily_jobs():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
# Determine target date (default: today) in Europe/Amsterdam
date_str = request.args.get("date")
try:
@@ -89,21 +74,10 @@
weekday_idx = target_date.weekday()  # 0=Mon..6=Sun
jobs_query = (
jobs = (
Job.query.join(Customer, isouter=True)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
if q:
for pat in _patterns(q):
jobs_query = jobs_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
jobs = (
jobs_query
.order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
.all()
)
@@ -332,7 +306,7 @@
)
target_date_str = target_date.strftime("%Y-%m-%d")
return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str, q=q)
return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str)
@main_bp.route("/daily-jobs/details")

View File

@@ -89,7 +89,6 @@ DOCUMENTATION_STRUCTURE = {
{'slug': 'general', 'title': 'General Settings'},
{'slug': 'mail-configuration', 'title': 'Mail Configuration'},
{'slug': 'autotask-integration', 'title': 'Autotask Integration'},
{'slug': 'entra-sso', 'title': 'Microsoft Entra SSO'},
{'slug': 'reporting-settings', 'title': 'Reporting Settings'},
{'slug': 'user-management', 'title': 'User Management'},
{'slug': 'maintenance', 'title': 'Maintenance'},

View File

@@ -1,53 +1,5 @@
from .routes_shared import *  # noqa: F401,F403
from .routes_shared import _format_datetime
from werkzeug.utils import secure_filename
import imghdr
# Allowed image extensions and max file size
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif', 'webp'}
MAX_FILE_SIZE = 5 * 1024 * 1024 # 5 MB
def _validate_image_file(file):
"""Validate uploaded image file.
Returns (is_valid, error_message, mime_type)
"""
if not file or not file.filename:
return False, "No file selected", None
# Check file size
file.seek(0, 2) # Seek to end
size = file.tell()
file.seek(0) # Reset to beginning
if size > MAX_FILE_SIZE:
return False, f"File too large (max {MAX_FILE_SIZE // (1024*1024)}MB)", None
if size == 0:
return False, "Empty file", None
# Check extension
filename = secure_filename(file.filename)
if '.' not in filename:
return False, "File must have an extension", None
ext = filename.rsplit('.', 1)[1].lower()
if ext not in ALLOWED_EXTENSIONS:
return False, f"Only images allowed ({', '.join(ALLOWED_EXTENSIONS)})", None
# Verify it's actually an image by reading header
file_data = file.read()
file.seek(0)
image_type = imghdr.what(None, h=file_data)
if image_type is None:
return False, "Invalid image file", None
mime_type = f"image/{image_type}"
return True, None, mime_type
@main_bp.route("/feedback") @main_bp.route("/feedback")
@ -69,14 +21,7 @@ def feedback_page():
if sort not in ("votes", "newest", "updated"): if sort not in ("votes", "newest", "updated"):
sort = "votes" sort = "votes"
# Admin-only: show deleted items where = ["fi.deleted_at IS NULL"]
show_deleted = False
if get_active_role() == "admin":
show_deleted = request.args.get("show_deleted", "0") in ("1", "true", "yes", "on")
where = []
if not show_deleted:
where.append("fi.deleted_at IS NULL")
params = {"user_id": int(current_user.id)} params = {"user_id": int(current_user.id)}
if item_type: if item_type:
@ -113,8 +58,6 @@ def feedback_page():
fi.status, fi.status,
fi.created_at, fi.created_at,
fi.updated_at, fi.updated_at,
fi.deleted_at,
fi.deleted_by_user_id,
u.username AS created_by, u.username AS created_by,
COALESCE(v.vote_count, 0) AS vote_count, COALESCE(v.vote_count, 0) AS vote_count,
EXISTS ( EXISTS (
@@ -152,8 +95,6 @@
"created_by": r["created_by"] or "-",
"vote_count": int(r["vote_count"] or 0),
"user_voted": bool(r["user_voted"]),
"is_deleted": bool(r["deleted_at"]),
"deleted_at": _format_datetime(r["deleted_at"]) if r["deleted_at"] else "",
}
)
@@ -164,7 +105,6 @@
status=status,
q=q,
sort=sort,
show_deleted=show_deleted,
)
@@ -195,31 +135,6 @@ def feedback_new():
created_by_user_id=int(current_user.id),
)
db.session.add(item)
db.session.flush() # Get item.id for attachments
# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_new"))
filename = secure_filename(file.filename)
file_data = file.read()
attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=None,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)
db.session.commit()
flash("Feedback item created.", "success")
@@ -233,8 +148,7 @@ def feedback_new():
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_detail(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
# Allow admins to view deleted items
if item.deleted_at is not None and get_active_role() != "admin":
if item.deleted_at is not None:
abort(404)
vote_count = (
@@ -260,15 +174,6 @@
resolved_by = User.query.get(item.resolved_by_user_id)
resolved_by_name = resolved_by.username if resolved_by else ""
# Get attachments for the main item (not linked to a reply)
item_attachments = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_item_id == item.id,
FeedbackAttachment.feedback_reply_id.is_(None),
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)
replies = (
FeedbackReply.query.filter(FeedbackReply.feedback_item_id == item.id)
@@ -276,25 +181,6 @@
.all()
)
# Get attachments for each reply
reply_ids = [r.id for r in replies]
reply_attachments_list = []
if reply_ids:
reply_attachments_list = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_reply_id.in_(reply_ids)
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)
# Map reply_id -> list of attachments
reply_attachments_map = {}
for att in reply_attachments_list:
if att.feedback_reply_id not in reply_attachments_map:
reply_attachments_map[att.feedback_reply_id] = []
reply_attachments_map[att.feedback_reply_id].append(att)
reply_user_ids = sorted({int(r.user_id) for r in replies})
reply_users = (
User.query.filter(User.id.in_(reply_user_ids)).all() if reply_user_ids else []
@@ -310,8 +196,6 @@
user_voted=bool(user_voted),
replies=replies,
reply_user_map=reply_user_map,
item_attachments=item_attachments,
reply_attachments_map=reply_attachments_map,
)
@main_bp.route("/feedback/<int:item_id>/reply", methods=["POST"])
@@ -338,31 +222,6 @@
created_at=datetime.utcnow(),
)
db.session.add(reply)
db.session.flush() # Get reply.id for attachments
# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_detail", item_id=item.id))
filename = secure_filename(file.filename)
file_data = file.read()
attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=reply.id,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)
db.session.commit()
flash("Reply added.", "success")
@@ -449,60 +308,3 @@ def feedback_delete(item_id: int):
flash("Feedback item deleted.", "success")
return redirect(url_for("main.feedback_page"))
@main_bp.route("/feedback/<int:item_id>/permanent-delete", methods=["POST"])
@login_required
@roles_required("admin")
def feedback_permanent_delete(item_id: int):
"""Permanently delete a feedback item and all its attachments from the database.
This is a hard delete - the item and all associated data will be removed permanently.
Only available for items that are already soft-deleted.
"""
item = FeedbackItem.query.get_or_404(item_id)
# Only allow permanent delete on already soft-deleted items
if item.deleted_at is None:
flash("Item must be deleted first before permanent deletion.", "warning")
return redirect(url_for("main.feedback_detail", item_id=item.id))
# Get attachment count for feedback message
attachment_count = FeedbackAttachment.query.filter_by(feedback_item_id=item.id).count()
# Hard delete - CASCADE will automatically delete:
# - feedback_votes
# - feedback_replies
# - feedback_attachments (via replies CASCADE)
# - feedback_attachments (direct, via item CASCADE)
db.session.delete(item)
db.session.commit()
flash(f"Feedback item permanently deleted ({attachment_count} screenshot(s) removed).", "success")
return redirect(url_for("main.feedback_page", show_deleted="1"))
@main_bp.route("/feedback/attachment/<int:attachment_id>")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_attachment(attachment_id: int):
"""Serve a feedback attachment image."""
attachment = FeedbackAttachment.query.get_or_404(attachment_id)
# Check if the feedback item is deleted - allow admins to view
item = FeedbackItem.query.get(attachment.feedback_item_id)
if not item:
abort(404)
if item.deleted_at is not None and get_active_role() != "admin":
abort(404)
# Serve the image
from flask import send_file
import io
return send_file(
io.BytesIO(attachment.file_data),
mimetype=attachment.mime_type,
as_attachment=False,
download_name=attachment.filename,
)

View File

@@ -9,28 +9,12 @@ from ..ticketing_utils import link_open_internal_tickets_to_run
import time
import re
import html as _html
from sqlalchemy import cast, String
@main_bp.route("/inbox")
@login_required
@roles_required("admin", "operator", "viewer")
def inbox():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
try:
page = int(request.args.get("page", "1"))
except ValueError:
@@ -44,18 +28,6 @@
# Use location column if available; otherwise just return all
if hasattr(MailMessage, "location"):
query = query.filter(MailMessage.location == "inbox")
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(MailMessage.from_address, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.subject, "").ilike(pat, escape="\\"))
| (cast(MailMessage.received_at, String).ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.job_name, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.parse_result, "").ilike(pat, escape="\\"))
| (cast(MailMessage.parsed_at, String).ilike(pat, escape="\\"))
)
total_items = query.count()
total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
@@ -107,7 +79,6 @@
customers=customer_rows,
can_bulk_delete=(get_active_role() in ("admin", "operator")),
is_admin=(get_active_role() == "admin"),
q=q,
)
@@ -184,56 +155,6 @@
for obj in MailObject.query.filter_by(mail_message_id=msg.id).order_by(MailObject.object_name.asc()).all()
]
# Optional run_id: if provided and the run is a Cloud Connect run, return
# per-run objects (from run_object_links) and a structured CC summary instead
# of the raw MailObject list which contains all tenants from the shared report email.
cloud_connect_summary = None
run_id_param = request.args.get("run_id", type=int)
if run_id_param:
try:
from ..models import JobRun, CloudConnectAccount
from ..database import db
from sqlalchemy import text as _sql_text
_run = JobRun.query.get(run_id_param)
if _run and getattr(_run, "source_type", None) == "cloud_connect" and _run.job_id:
_cc_acc = CloudConnectAccount.query.filter_by(job_id=_run.job_id).first()
if _cc_acc:
# Use the run's own stored status (not the account's latest status)
# so historical runs show their actual status, not today's.
cloud_connect_summary = {
"user": _cc_acc.user or "",
"section": _cc_acc.section or "",
"repo_name": _cc_acc.repo_name or "",
"repo_type": _cc_acc.repo_type or "",
"used_space": _cc_acc.used_space or "",
"total_quota": _cc_acc.total_quota or "",
"free_space": _cc_acc.free_space or "",
"last_active": _cc_acc.last_active_raw or "",
"status": _run.status or "",
}
# Replace MailObject list with per-run objects from run_object_links
cc_rows = db.session.execute(
_sql_text("""
SELECT co.object_name AS name, rol.status, rol.error_message
FROM run_object_links rol
JOIN customer_objects co ON co.id = rol.customer_object_id
WHERE rol.run_id = :run_id
ORDER BY co.object_name ASC
"""),
{"run_id": run_id_param},
).mappings().all()
objects = [
{
"name": r["name"] or "",
"type": "",
"status": r["status"] or "",
"error_message": r["error_message"] or "",
}
for r in cc_rows
]
except Exception:
pass # keep MailObject objects as fallback
# VSPC multi-company emails (e.g. "Active alarms summary") may not store parsed objects yet.
# Extract company names from the stored body so the UI can offer a dedicated mapping workflow.
vspc_companies: list[str] = []
@@ -277,7 +198,6 @@
"meta": meta,
"body_html": body_html,
"objects": objects,
"cloud_connect_summary": cloud_connect_summary,
"vspc_companies": vspc_companies,
"vspc_company_defaults": vspc_company_defaults,
})
@@ -1401,252 +1321,3 @@
)
return redirect(url_for("main.inbox"))
@main_bp.route("/inbox/reparse-batch", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def inbox_reparse_batch():
"""Process one batch of inbox messages and return JSON progress info.
Expects JSON body: {"last_id": <int|null>, "total": <int|null>}
Returns JSON: {processed, total, parsed_ok, auto_approved, no_match, errors, last_id, done}
"""
from flask import jsonify
data = request.get_json(silent=True) or {}
last_id = data.get("last_id") # keyset cursor (id < last_id)
total_known = data.get("total") # total passed from client so we don't recount
base_q = MailMessage.query
if hasattr(MailMessage, "location"):
base_q = base_q.filter(MailMessage.location == "inbox")
if total_known is None:
total_known = base_q.count()
batch_size = 50
time_budget_s = 8.0
started_at = time.monotonic()
processed = 0
parsed_ok = 0
auto_approved = 0
auto_approved_runs = []
no_match = 0
errors = 0
q = base_q
if last_id is not None:
q = q.filter(MailMessage.id < last_id)
batch = q.order_by(MailMessage.id.desc()).limit(batch_size).all()
new_last_id = last_id
for msg in batch:
if (time.monotonic() - started_at) >= time_budget_s:
break
new_last_id = msg.id
processed += 1
try:
parse_mail_message(msg)
try:
if (
getattr(msg, "location", "inbox") == "inbox"
and getattr(msg, "parse_result", None) == "ok"
and getattr(msg, "job_id", None) is None
):
bsw = (getattr(msg, "backup_software", "") or "").strip().lower()
btype = (getattr(msg, "backup_type", "") or "").strip().lower()
jname = (getattr(msg, "job_name", "") or "").strip().lower()
if bsw == "veeam" and btype == "service provider console" and jname == "active alarms summary":
raw = (getattr(msg, "text_body", None) or "").strip() or (getattr(msg, "html_body", None) or "")
companies = extract_vspc_active_alarms_companies(raw)
if companies:
def _is_error_status(value):
v = (value or "").strip().lower()
return v in {"error", "failed", "critical"} or v.startswith("fail")
first_job = None
mapped_count = 0
created_any = False
for company in companies:
tmp_msg = MailMessage(
from_address=msg.from_address,
backup_software=msg.backup_software,
backup_type=msg.backup_type,
job_name=f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip(),
)
with db.session.no_autoflush:
job = find_matching_job(tmp_msg)
if not job:
continue
if hasattr(job, "active") and not bool(job.active):
continue
if hasattr(job, "auto_approve") and not bool(job.auto_approve):
continue
mapped_count += 1
objs = (
MailObject.query.filter(MailObject.mail_message_id == msg.id)
.filter(MailObject.object_name.ilike(f"{company} | %"))
.all()
)
saw_error = any(_is_error_status(o.status) for o in objs)
saw_warning = any((o.status or "").strip().lower() == "warning" for o in objs)
status = "Error" if saw_error else ("Warning" if saw_warning else (msg.overall_status or "Success"))
run = JobRun(
job_id=job.id,
mail_message_id=msg.id,
run_at=(msg.received_at or getattr(msg, "parsed_at", None) or datetime.utcnow()),
status=status or None,
missed=False,
)
if hasattr(run, "remark"):
run.remark = getattr(msg, "overall_message", None)
if hasattr(run, "storage_used_bytes") and hasattr(msg, "storage_used_bytes"):
run.storage_used_bytes = msg.storage_used_bytes
if hasattr(run, "storage_capacity_bytes") and hasattr(msg, "storage_capacity_bytes"):
run.storage_capacity_bytes = msg.storage_capacity_bytes
if hasattr(run, "storage_free_bytes") and hasattr(msg, "storage_free_bytes"):
run.storage_free_bytes = msg.storage_free_bytes
if hasattr(run, "storage_free_percent") and hasattr(msg, "storage_free_percent"):
run.storage_free_percent = msg.storage_free_percent
db.session.add(run)
db.session.flush()
try:
link_open_internal_tickets_to_run(run=run, job=job)
except Exception:
pass
auto_approved_runs.append((job.customer_id, job.id, run.id, msg.id))
created_any = True
if not first_job:
first_job = job
if created_any and mapped_count == len(companies):
msg.job_id = first_job.id if first_job else None
if hasattr(msg, "approved"):
msg.approved = True
if hasattr(msg, "approved_at"):
msg.approved_at = datetime.utcnow()
if hasattr(msg, "approved_by_id"):
msg.approved_by_id = None
if hasattr(msg, "location"):
msg.location = "history"
auto_approved += 1
# Do not fall back to single-job matching for VSPC summary.
if msg.parse_result == "ok":
parsed_ok += 1
elif msg.parse_result == "no_match":
no_match += 1
else:
errors += 1
continue
with db.session.no_autoflush:
job = find_matching_job(msg)
if job:
if hasattr(job, "active") and not bool(job.active):
raise Exception("job not active")
if hasattr(job, "auto_approve") and not bool(job.auto_approve):
raise Exception("job auto_approve disabled")
run = JobRun(
job_id=job.id,
mail_message_id=msg.id,
run_at=(msg.received_at or getattr(msg, "parsed_at", None) or datetime.utcnow()),
status=msg.overall_status or None,
missed=False,
)
if hasattr(run, "remark"):
run.remark = getattr(msg, "overall_message", None)
if hasattr(run, "storage_used_bytes") and hasattr(msg, "storage_used_bytes"):
run.storage_used_bytes = msg.storage_used_bytes
if hasattr(run, "storage_capacity_bytes") and hasattr(msg, "storage_capacity_bytes"):
run.storage_capacity_bytes = msg.storage_capacity_bytes
if hasattr(run, "storage_free_bytes") and hasattr(msg, "storage_free_bytes"):
run.storage_free_bytes = msg.storage_free_bytes
if hasattr(run, "storage_free_percent") and hasattr(msg, "storage_free_percent"):
run.storage_free_percent = msg.storage_free_percent
db.session.add(run)
db.session.flush()
try:
link_open_internal_tickets_to_run(run=run, job=job)
except Exception:
pass
auto_approved_runs.append((job.customer_id, job.id, run.id, msg.id))
msg.job_id = job.id
if hasattr(msg, "approved"):
msg.approved = True
if hasattr(msg, "approved_at"):
msg.approved_at = datetime.utcnow()
if hasattr(msg, "approved_by_id"):
msg.approved_by_id = None
if hasattr(msg, "location"):
msg.location = "history"
auto_approved += 1
except Exception as _exc:
current_app.logger.exception(
f"Auto-approve during reparse-batch failed for message {getattr(msg,'id',None)}: {_exc}"
)
if msg.parse_result == "ok":
parsed_ok += 1
elif msg.parse_result == "no_match":
no_match += 1
else:
errors += 1
except Exception as exc:
errors += 1
msg.parse_result = "error"
msg.parse_error = str(exc)[:500]
try:
db.session.commit()
except Exception:
db.session.rollback()
# Persist objects for auto-approved runs
if auto_approved_runs:
for (customer_id, job_id, run_id, mail_message_id) in auto_approved_runs:
try:
persist_objects_for_auto_run(customer_id, job_id, run_id, mail_message_id)
except Exception as exc:
_log_admin_event(
"object_persist_error",
f"Object persistence failed for auto-approved message {mail_message_id} (job {job_id}, run {run_id}): {exc}",
)
# Determine if we are done: no more messages below new_last_id
done = False
if processed < batch_size:
done = True
elif new_last_id is not None:
remaining_q = base_q.filter(MailMessage.id < new_last_id)
done = remaining_q.count() == 0
return jsonify({
"processed": processed,
"total": total_known,
"parsed_ok": parsed_ok,
"auto_approved": auto_approved,
"no_match": no_match,
"errors": errors,
"last_id": new_last_id,
"done": done,
})
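The JSON payload above drives a client-side loop: each call processes at most `batch_size` messages with `id < last_id` and reports `done` once nothing remains below the new cursor. A minimal sketch of such a driver, assuming a `fetch_batch` callable that stands in for the HTTP POST (the name is illustrative, not part of the codebase):

```python
def drive_reparse(fetch_batch, start_last_id=None):
    """Call the batch endpoint until it reports done; aggregate the counters.

    fetch_batch(last_id) must return a dict shaped like the JSON response above.
    """
    totals = {"processed": 0, "parsed_ok": 0, "auto_approved": 0, "no_match": 0, "errors": 0}
    last_id = start_last_id
    while True:
        resp = fetch_batch(last_id)
        for key in totals:
            totals[key] += int(resp.get(key, 0))
        last_id = resp.get("last_id")  # cursor for the next page
        if resp.get("done"):
            return totals
```

Because the cursor only moves downward, a batch that fails mid-way can simply be retried with the same `last_id`.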


@ -13,56 +13,12 @@ from .routes_shared import (
@login_required
@roles_required("admin", "operator", "viewer")
def jobs():
selected_customer_id = None
selected_customer_name = ""
# Join with customers for display
jobs = (
q = (request.args.get("q") or "").strip()
customer_id_raw = (request.args.get("customer_id") or "").strip()
if customer_id_raw:
try:
selected_customer_id = int(customer_id_raw)
except ValueError:
selected_customer_id = None
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
base_query = (
Job.query
.filter(Job.archived.is_(False))
.outerjoin(Customer, Customer.id == Job.customer_id)
)
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
if selected_customer_id is not None:
base_query = base_query.filter(Job.customer_id == selected_customer_id)
selected_customer = Customer.query.filter(Customer.id == selected_customer_id).first()
if selected_customer is not None:
selected_customer_name = selected_customer.name or ""
else:
# Default listing hides jobs for inactive customers.
base_query = base_query.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
if q:
for pat in _patterns(q):
base_query = base_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
# Join with customers for display
jobs = (
base_query
.add_columns(
Job.id,
Job.backup_software,
@ -98,9 +54,6 @@ def jobs():
"main/jobs.html",
jobs=rows,
can_manage_jobs=can_manage_jobs,
selected_customer_id=selected_customer_id,
selected_customer_name=selected_customer_name,
q=q,
)
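The removed `_patterns` helper above turns a space-separated search string into SQL ILIKE patterns: backslashes and the literal wildcards `%`/`_` are escaped, a user-facing `*` becomes the SQL wildcard `%`, and every token is wrapped in `%...%` for substring matching. A standalone copy for illustration:

```python
def patterns(raw: str) -> list[str]:
    """Standalone copy of the removed _patterns helper, for illustration."""
    out = []
    for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
        p = tok.replace("\\", "\\\\")                   # escape backslashes first
        p = p.replace("%", "\\%").replace("_", "\\_")   # escape SQL LIKE wildcards
        p = p.replace("*", "%")                         # user wildcard -> SQL wildcard
        if not p.startswith("%"):
            p = "%" + p
        if not p.endswith("%"):
            p = p + "%"
        out.append(p)
    return out
```

Each pattern is then applied with `ilike(pat, escape="\\")`, so the escaped `\%` and `\_` match literally while the added `%` wildcards do not.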
@ -187,50 +140,6 @@ def unarchive_job(job_id: int):
return redirect(url_for("main.archived_jobs"))
@main_bp.route("/jobs/<int:job_id>/set-cove-account", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def job_set_cove_account(job_id: int):
"""Save or clear the Cove Account ID for this job."""
from ..models import CoveAccount
job = Job.query.get_or_404(job_id)
old_account_id = job.cove_account_id
account_id_raw = (request.form.get("cove_account_id") or "").strip()
if account_id_raw:
try:
job.cove_account_id = int(account_id_raw)
except (ValueError, TypeError):
flash("Invalid Cove Account ID: it must be a number.", "warning")
return redirect(url_for("main.job_detail", job_id=job_id))
else:
job.cove_account_id = None
# Keep staging-table link in sync when possible.
# Importer primarily links from cove_accounts.job_id.
if old_account_id and old_account_id != job.cove_account_id:
old_cove_acc = CoveAccount.query.filter_by(account_id=old_account_id, job_id=job.id).first()
if old_cove_acc:
old_cove_acc.job_id = None
if job.cove_account_id:
cove_acc = CoveAccount.query.filter_by(account_id=job.cove_account_id).first()
if cove_acc:
cove_acc.job_id = job.id
db.session.commit()
try:
log_admin_event(
"job_cove_account_set",
f"Set Cove Account ID for job {job.id} to {job.cove_account_id!r}",
details=f"job_name={job.job_name}",
)
except Exception:
pass
flash("Cove Account ID saved.", "success")
return redirect(url_for("main.job_detail", job_id=job_id))
@main_bp.route("/jobs/<int:job_id>")
@login_required
@roles_required("admin", "operator", "viewer")
@ -512,7 +421,6 @@ def job_detail(job_id: int):
"ticket_codes": ticket_codes,
"remark_items": remark_items,
"mail_message_id": r.mail_message_id,
"source_type": getattr(r, "source_type", "") or "",
"reviewed_by": (r.reviewed_by.username if getattr(r, "reviewed_by", None) else ""),
"reviewed_at": _format_datetime(r.reviewed_at) if r.reviewed_at else "",
}
@ -536,11 +444,6 @@ def job_detail(job_id: int):
if job.customer_id:
customer = Customer.query.get(job.customer_id)
# Load system settings for Cove integration display
from ..models import SystemSettings as _SystemSettings
_settings = _SystemSettings.query.first()
cove_enabled = bool(getattr(_settings, "cove_enabled", False)) if _settings else False
return render_template(
"main/job_detail.html",
job=job,
@ -557,7 +460,6 @@ def job_detail(job_id: int):
has_prev=has_prev,
has_next=has_next,
can_manage_jobs=can_manage_jobs,
cove_enabled=cove_enabled,
)


@ -11,16 +11,6 @@ _OVERRIDE_DEFAULT_START_AT = datetime(1970, 1, 1)
def overrides():
can_manage = get_active_role() in ("admin", "operator")
can_delete = get_active_role() == "admin"
q = (request.args.get("q") or "").strip()
def _match_query(text: str, raw_query: str) -> bool:
hay = (text or "").lower()
tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
for tok in tokens:
needle = tok.lower().replace("*", "")
if needle and needle not in hay:
return False
return True
overrides_q = Override.query.order_by(Override.level.asc(), Override.start_at.desc()).all()
@ -102,31 +92,16 @@ def overrides():
rows = []
for ov in overrides_q:
scope_text = _describe_scope(ov)
start_text = _format_datetime(ov.start_at)
end_text = _format_datetime(ov.end_at) if ov.end_at else ""
comment_text = ov.comment or ""
if q:
full_text = " | ".join([
ov.level or "",
scope_text,
start_text,
end_text,
comment_text,
])
if not _match_query(full_text, q):
continue
rows.append(
{
"id": ov.id,
"level": ov.level or "",
"scope": scope_text,
"scope": _describe_scope(ov),
"start_at": start_text,
"start_at": _format_datetime(ov.start_at),
"end_at": end_text,
"end_at": _format_datetime(ov.end_at) if ov.end_at else "",
"active": bool(ov.active),
"treat_as_success": bool(ov.treat_as_success),
"comment": comment_text,
"comment": ov.comment or "",
"match_status": ov.match_status or "",
"match_error_contains": ov.match_error_contains or "",
"match_error_mode": getattr(ov, "match_error_mode", None) or "",
@ -147,7 +122,6 @@ def overrides():
jobs_for_select=jobs_for_select,
backup_software_options=backup_software_options,
backup_type_options=backup_type_options,
q=q,
)
@ -424,3 +398,4 @@ def overrides_toggle(override_id: int):
flash("Override status updated.", "success")
return redirect(url_for("main.overrides"))


@ -1,6 +1,6 @@
from .routes_shared import * # noqa: F401,F403
from sqlalchemy import text, cast, String
from sqlalchemy import text
import json
import csv
import io
@ -101,33 +101,12 @@ def api_reports_list():
if err is not None:
return err
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)
rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
rows = (
db.session.query(ReportDefinition)
.order_by(ReportDefinition.created_at.desc())
.limit(200)
.all()
)
return {
"items": [
{


@ -1,7 +1,6 @@
from .routes_shared import * # noqa: F401,F403
from datetime import date, timedelta
from .routes_reporting_api import build_report_columns_meta, build_report_job_filters_meta
from sqlalchemy import cast, String
def get_default_report_period():
"""Return default report period (last 7 days)."""
@ -53,33 +52,13 @@ def _build_report_item(r):
@main_bp.route("/reports")
@login_required
def reports():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
# Pre-render items so the page is usable even if JS fails to load/execute.
query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)
rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
rows = (
db.session.query(ReportDefinition)
.order_by(ReportDefinition.created_at.desc())
.limit(200)
.all()
)
items = [_build_report_item(r) for r in rows]
period_start, period_end = get_default_report_period()
@ -91,7 +70,6 @@ def reports():
job_filters_meta=build_report_job_filters_meta(),
default_period_start=period_start.isoformat(),
default_period_end=period_end.isoformat(),
q=q,
)


@ -1,13 +1,11 @@
from __future__ import annotations
import calendar
import re
import threading
from datetime import date, datetime, time, timedelta, timezone
from flask import flash, jsonify, redirect, render_template, request, url_for
from flask import jsonify, render_template, request, url_for
from urllib.parse import urlencode, urljoin
from urllib.parse import urljoin
from flask_login import current_user, login_required
from sqlalchemy import and_, or_, func, text
@ -40,128 +38,9 @@ from ..models import (
TicketScope,
User,
)
from ..ticketing_utils import link_open_internal_tickets_to_run
AUTOTASK_TERMINAL_STATUS_IDS = {5}
RUN_CHECKS_SORT_MODES = {"customer", "status"}
# ---------------------------------------------------------------------------
# Background task helpers
# ---------------------------------------------------------------------------
# Throttle: track when we last ran missed-run generation per job.
# Key: job_id (int), Value: datetime of last run (UTC)
_missed_run_last_ran: dict[int, datetime] = {}
_missed_run_lock = threading.Lock()
_MISSED_RUN_THROTTLE = timedelta(minutes=10)
# Guard: only one background sweep at a time
_bg_sweep_lock = threading.Lock()
def _run_background_sweep(app, job_ids_and_dates: list[tuple[int, date, date]], run_ids_for_at: list[int]) -> None:
"""Heavy operations executed in a daemon thread so the page loads immediately.
- Missed-run generation for eligible jobs (throttled per job).
- Autotask ticket state polling.
"""
if not _bg_sweep_lock.acquire(blocking=False):
# A sweep is already running; skip to avoid pile-up.
return
try:
with app.app_context():
# 1. Missed-run generation
try:
for job_id, start_date, end_date in job_ids_and_dates:
try:
job = Job.query.get(job_id)
if job is None:
continue
_ensure_missed_runs_for_job(job, start_date, end_date)
with _missed_run_lock:
_missed_run_last_ran[job_id] = datetime.utcnow()
except Exception:
try:
db.session.rollback()
except Exception:
pass
except Exception:
pass
# 2. Autotask ticket state polling
try:
if run_ids_for_at:
_poll_autotask_ticket_states_for_runs(run_ids=run_ids_for_at)
except Exception:
pass
finally:
_bg_sweep_lock.release()
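The sweep above combines two common patterns: a module-level lock acquired with `blocking=False` so only one sweep runs at a time, and a per-job timestamp map that throttles regeneration. A simplified standalone sketch of the per-key throttle (the real code keys by job id and uses `datetime.utcnow()`; passing `now` explicitly here just makes the behaviour easy to test):

```python
import threading
from datetime import datetime, timedelta

_last_ran: dict[int, datetime] = {}
_lock = threading.Lock()
THROTTLE = timedelta(minutes=10)

def should_run(key: int, now: datetime) -> bool:
    """Return True and record the run, unless `key` already ran within THROTTLE."""
    with _lock:
        last = _last_ran.get(key)
        if last is not None and (now - last) < THROTTLE:
            return False
        _last_ran[key] = now
        return True
```

Note that a rejected call does not refresh the timestamp, so the key becomes eligible again exactly THROTTLE after its last accepted run.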
RUN_CHECKS_STATUS_FILTER_VALUES = ("critical", "missed", "warning", "success_override", "success")
def _parse_bool_flag(raw: str | None, default: bool = False) -> bool:
if raw is None:
return bool(default)
return str(raw).strip().lower() in {"1", "true", "yes", "on"}
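`_parse_bool_flag` above normalises the usual HTML-form truthy spellings and falls back to `default` when the parameter is absent. A standalone copy for illustration:

```python
def parse_bool_flag(raw, default=False):
    """Mirror of the helper above: None -> default, else match truthy spellings."""
    if raw is None:
        return bool(default)
    return str(raw).strip().lower() in {"1", "true", "yes", "on"}
```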
def _normalize_run_checks_sort_mode(raw: str | None) -> str:
v = (raw or "").strip().lower()
if v not in RUN_CHECKS_SORT_MODES:
return "customer"
return v
def _normalize_run_checks_status_filters(values: list[str] | tuple[str, ...] | None) -> list[str]:
selected = {(v or "").strip().lower() for v in (values or []) if (v or "").strip()}
return [k for k in RUN_CHECKS_STATUS_FILTER_VALUES if k in selected]
def _row_status_categories(status_counts: dict[str, int] | None) -> set[str]:
counts = status_counts or {}
failed_count = int(counts.get("Failed", 0) or 0) + int(counts.get("Error", 0) or 0)
warning_count = int(counts.get("Warning", 0) or 0)
missed_count = int(counts.get("Missed", 0) or 0)
success_override_count = int(counts.get("Success (override)", 0) or 0)
success_count = int(counts.get("Success", 0) or 0)
cats: set[str] = set()
if failed_count > 0:
cats.add("critical")
if missed_count > 0:
cats.add("missed")
if warning_count > 0:
cats.add("warning")
if success_override_count > 0:
cats.add("success_override")
if success_count > 0:
cats.add("success")
return cats
def _row_primary_status_rank(status_counts: dict[str, int] | None) -> int:
cats = _row_status_categories(status_counts)
if "critical" in cats:
return 0
if "missed" in cats:
return 1
if "warning" in cats:
return 2
if "success_override" in cats:
return 3
if "success" in cats:
return 4
return 9
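Together, `_row_status_categories` and `_row_primary_status_rank` collapse a row's per-status counts into a set of filter categories and a single sort rank, with critical first and success last. Illustrative standalone copies (assuming plain integer counts):

```python
RANK_ORDER = ["critical", "missed", "warning", "success_override", "success"]

def status_categories(counts: dict) -> set:
    """Map raw status counts to filter categories (assumes int values)."""
    cats = set()
    if counts.get("Failed", 0) + counts.get("Error", 0) > 0:
        cats.add("critical")
    if counts.get("Missed", 0) > 0:
        cats.add("missed")
    if counts.get("Warning", 0) > 0:
        cats.add("warning")
    if counts.get("Success (override)", 0) > 0:
        cats.add("success_override")
    if counts.get("Success", 0) > 0:
        cats.add("success")
    return cats

def primary_rank(counts: dict) -> int:
    """Lowest index in RANK_ORDER wins; 9 means no known status at all."""
    cats = status_categories(counts)
    for i, name in enumerate(RANK_ORDER):
        if name in cats:
            return i
    return 9
```

Sorting rows by this rank puts every job with at least one failed run ahead of jobs that only have missed, warning, or successful runs.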
def _is_hidden_3cx_non_backup(backup_software: str | None, backup_type: str | None) -> bool:
"""Hide non-backup 3CX informational jobs from Run Checks."""
bs = (backup_software or "").strip().lower()
bt = (backup_type or "").strip().lower()
return bs == "3cx" and bt in {"update", "ssl certificate"}
def _ensure_internal_ticket_for_autotask(
@ -736,12 +615,6 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
Returns number of inserted missed runs.
"""
# Cloud Connect jobs get one run per day whenever Veeam sends the report — the
# delivery time can shift (e.g. from 18:55 to 10:24) without indicating a missed run.
# Schedule inference would wrongly produce two expected slots, so skip entirely.
if getattr(job, "backup_type", "").lower() in ("cloud connect backup", "cloud connect agent"):
return 0
tz = _get_ui_timezone()
schedule_map = _infer_schedule_map_from_runs(job.id) or {}
has_weekly_times = any((schedule_map.get(i) or []) for i in range(7))
@ -852,8 +725,6 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.flush() # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1
d = d + timedelta(days=1)
@ -935,8 +806,6 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.flush() # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1
# Next month
@ -956,63 +825,13 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
def run_checks_page():
"""Run Checks page: list jobs that have runs to review (including generated missed runs)."""
default_q = (getattr(current_user, "run_checks_filter_q", None) or "").strip()
default_sort_mode = _normalize_run_checks_sort_mode(getattr(current_user, "run_checks_sort_mode", "customer"))
default_status_filters = _normalize_run_checks_status_filters(
(getattr(current_user, "run_checks_filter_statuses", "") or "").split(",")
)
default_has_ticket = bool(getattr(current_user, "run_checks_filter_has_ticket", False))
default_has_remark = bool(getattr(current_user, "run_checks_filter_has_remark", False))
q_arg = request.args.get("q")
q = (q_arg if q_arg is not None else default_q).strip()
sort_mode_arg = request.args.get("sort")
sort_mode = _normalize_run_checks_sort_mode(sort_mode_arg if sort_mode_arg is not None else default_sort_mode)
status_args = [(x or "").strip().lower() for x in request.args.getlist("status") if (x or "").strip()]
if status_args:
selected_status_filters = _normalize_run_checks_status_filters(status_args)
else:
statuses_csv_arg = request.args.get("statuses")
if statuses_csv_arg is not None:
selected_status_filters = _normalize_run_checks_status_filters(statuses_csv_arg.split(","))
else:
selected_status_filters = default_status_filters
has_ticket_arg = request.args.get("has_ticket")
has_ticket = _parse_bool_flag(has_ticket_arg, default=default_has_ticket)
has_remark_arg = request.args.get("has_remark")
has_remark = _parse_bool_flag(has_remark_arg, default=default_has_remark)
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
include_reviewed = False
if get_active_role() == "admin":
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
# ---------------------------------------------------------------------------
# Background sweep: missed-run generation + Autotask ticket polling.
# Both operations are dispatched to a daemon thread so this page loads
# immediately. Results will be visible on the *next* page load.
# Missed-run generation is throttled per job (max once per 10 minutes).
# ---------------------------------------------------------------------------
# Generate missed runs since the last review per job so they show up in Run Checks.
# This is intentionally best-effort; any errors should not block page load.
try:
from flask import current_app
app = current_app._get_current_object() # noqa: SLF001 — safe proxy unwrap
settings_start = _get_default_missed_start_date()
last_reviewed_rows = (
@ -1022,59 +841,34 @@ def run_checks_page():
)
last_reviewed_map = {int(jid): (dt if dt else None) for jid, dt in last_reviewed_rows}
jobs_all = (
jobs = (
Job.query.outerjoin(Customer)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
.all()
)
today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()
now_utc = datetime.utcnow()
# Build list of jobs that need a missed-run sweep (throttled).
jobs_to_sweep: list[tuple[int, date, date]] = []
with _missed_run_lock:
for job in jobs_all:
if _is_hidden_3cx_non_backup(
getattr(job, "backup_software", None), getattr(job, "backup_type", None)
):
continue
last_ran = _missed_run_last_ran.get(int(job.id))
if last_ran and (now_utc - last_ran) < _MISSED_RUN_THROTTLE:
continue # ran recently enough
last_rev = last_reviewed_map.get(int(job.id))
if last_rev:
start_date = _to_amsterdam_date(last_rev) or settings_start
else:
start_date = settings_start
if start_date and start_date > today_local:
continue
jobs_to_sweep.append((int(job.id), start_date, today_local))
# Collect Autotask run ids for Phase 2 polling.
at_run_ids: list[int] = []
try:
at_run_ids = [
int(x)
for (x,) in JobRun.query
.filter(JobRun.reviewed_at.is_(None), JobRun.autotask_ticket_id.isnot(None))
.with_entities(JobRun.id)
.limit(800)
.all()
]
except Exception:
at_run_ids = []
if jobs_to_sweep or at_run_ids:
t = threading.Thread(
target=_run_background_sweep,
args=(app, jobs_to_sweep, at_run_ids),
daemon=True,
)
t.start()
for job in jobs:
last_rev = last_reviewed_map.get(int(job.id))
if last_rev:
start_date = _to_amsterdam_date(last_rev) or settings_start
else:
start_date = settings_start
if start_date and start_date > today_local:
continue
_ensure_missed_runs_for_job(job, start_date, today_local)
except Exception:
# Don't block the page if missed-run generation fails.
pass
# Phase 2 (read-only PSA driven): sync internal ticket resolved state based on PSA ticket status.
# Best-effort: never blocks page load.
try:
run_q = JobRun.query.filter(JobRun.reviewed_at.is_(None), JobRun.autotask_ticket_id.isnot(None))
run_ids = [int(x) for (x,) in run_q.with_entities(JobRun.id).limit(800).all()]
_poll_autotask_ticket_states_for_runs(run_ids=run_ids)
except Exception: except Exception:
# Never block the page load.
pass pass
# Aggregated per-job rows
@ -1090,14 +884,6 @@ def run_checks_page():
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
)
if q:
for pat in _patterns(q):
base = base.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
# Runs to show in the overview: unreviewed (or all if admin toggle enabled)
run_filter = []
@ -1153,7 +939,7 @@ def run_checks_page():
agg = agg.subquery()
rows_query = (
q = (
base.join(agg, agg.c.job_id == Job.id) base.join(agg, agg.c.job_id == Job.id)
.outerjoin(last_reviewed, last_reviewed.c.job_id == Job.id) .outerjoin(last_reviewed, last_reviewed.c.job_id == Job.id)
.add_columns( .add_columns(
@ -1162,7 +948,7 @@ def run_checks_page():
)
) )
# Sort for operational review: Customer > Backup > Type > Job
rows_query = rows_query.order_by(
q = q.order_by(
Customer.name.asc().nullslast(),
Job.backup_software.asc().nullslast(),
Job.backup_type.asc().nullslast(),
@ -1170,7 +956,7 @@ def run_checks_page():
Job.id.asc(),
)
rows = [r for r in rows_query.limit(2000).all() if not _is_hidden_3cx_non_backup(r.backup_software, r.backup_type)]
rows = q.limit(2000).all()
# Ensure override flags are up-to-date for the runs shown in this overview.
# The Run Checks modal computes override status on-the-fly, but the overview
@ -1336,41 +1122,6 @@ def run_checks_page():
}
)
if selected_status_filters:
selected_set = set(selected_status_filters)
payload = [
row for row in payload
if bool(_row_status_categories(row.get("status_counts", {})) & selected_set)
]
if has_ticket:
payload = [row for row in payload if bool(row.get("has_active_ticket", False))]
if has_remark:
payload = [row for row in payload if bool(row.get("has_active_remark", False))]
if sort_mode == "status":
payload.sort(
key=lambda row: (
_row_primary_status_rank(row.get("status_counts", {})),
str(row.get("customer_name") or "").lower(),
str(row.get("backup_software") or "").lower(),
str(row.get("backup_type") or "").lower(),
str(row.get("job_name") or "").lower(),
int(row.get("job_id") or 0),
)
)
else:
payload.sort(
key=lambda row: (
str(row.get("customer_name") or "").lower(),
str(row.get("backup_software") or "").lower(),
str(row.get("backup_type") or "").lower(),
str(row.get("job_name") or "").lower(),
int(row.get("job_id") or 0),
)
)
settings = _get_or_create_settings()
autotask_enabled = bool(getattr(settings, "autotask_enabled", False))
@ -1380,55 +1131,9 @@ def run_checks_page():
is_admin=(get_active_role() == "admin"),
include_reviewed=include_reviewed,
autotask_enabled=autotask_enabled,
q=q,
sort_mode=sort_mode,
selected_status_filters=selected_status_filters,
has_ticket=has_ticket,
has_remark=has_remark,
)
@main_bp.route("/run-checks/preferences", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def run_checks_save_preferences():
q = (request.form.get("q") or "").strip()[:255]
sort_mode = _normalize_run_checks_sort_mode(request.form.get("sort"))
selected_status_filters = _normalize_run_checks_status_filters(request.form.getlist("status"))
has_ticket = _parse_bool_flag(request.form.get("has_ticket"), default=False)
has_remark = _parse_bool_flag(request.form.get("has_remark"), default=False)
current_user.run_checks_sort_mode = sort_mode
current_user.run_checks_filter_statuses = ",".join(selected_status_filters)
current_user.run_checks_filter_has_ticket = has_ticket
current_user.run_checks_filter_has_remark = has_remark
current_user.run_checks_filter_q = q or None
db.session.commit()
flash("Run Checks preferences saved.", "success")
include_reviewed = False
if get_active_role() == "admin":
include_reviewed = _parse_bool_flag(request.form.get("include_reviewed"), default=False)
params: list[tuple[str, str]] = [("sort", sort_mode)]
if q:
params.append(("q", q))
for status in selected_status_filters:
params.append(("status", status))
if has_ticket:
params.append(("has_ticket", "1"))
if has_remark:
params.append(("has_remark", "1"))
if include_reviewed:
params.append(("include_reviewed", "1"))
target = url_for("main.run_checks_page")
if params:
target = f"{target}?{urlencode(params, doseq=True)}"
return redirect(target)
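The redirect above rebuilds the query string from the saved preferences; `urlencode(..., doseq=True)` encodes a list of pairs in order, so the repeated `status` key survives the round trip:

```python
from urllib.parse import urlencode

# Example parameter list shaped like the one the preferences handler builds.
params = [("sort", "status"), ("status", "critical"), ("status", "missed"), ("has_ticket", "1")]
query = urlencode(params, doseq=True)
```

On the receiving side, `request.args.getlist("status")` recovers both values.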
@main_bp.route("/api/run-checks/details")
@login_required
@roles_required("admin", "operator")
@ -1446,15 +1151,6 @@ def run_checks_details():
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
job = Job.query.get_or_404(job_id)
if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
job_payload = {
"id": job.id,
"customer_name": job.customer.name if job.customer else "",
"backup_software": job.backup_software or "",
"backup_type": job.backup_type or "",
"job_name": job.job_name or "",
}
return jsonify({"status": "ok", "job": job_payload, "runs": [], "message": "This 3CX informational type is hidden from Run Checks."})
q = JobRun.query.filter(JobRun.job_id == job.id)
if not include_reviewed:
@ -1482,66 +1178,12 @@ def run_checks_details():
mail_meta = None
has_eml = False
body_html = ""
cloud_connect_summary = None
if msg:
cove_summary = None
# For Cove API runs, suppress the mail section entirely and show
# structured Cove account details instead.
if getattr(run, "source_type", None) == "cove_api":
from ..models import CoveAccount
from ..cove_importer import DATASOURCE_LABELS as _COVE_DS_LABELS
_cove_acc = CoveAccount.query.filter_by(job_id=job.id).first()
if _cove_acc:
# Translate raw datasource code string (e.g. "D01D19") to labels
_raw_ds = (_cove_acc.datasource_types or "").strip().upper()
_ds_codes = re.findall(r"D\d{1,2}", _raw_ds) if _raw_ds else []
_ds_labels: list[str] = []
for _code in _ds_codes:
_lbl = _COVE_DS_LABELS.get(_code, _code)
if _lbl not in _ds_labels:
_ds_labels.append(_lbl)
cove_summary = {
"account_name": _cove_acc.account_name or "",
"computer_name": _cove_acc.computer_name or "",
"customer_name": _cove_acc.customer_name or "",
"datasources": ", ".join(_ds_labels) if _ds_labels else (_raw_ds or ""),
"last_run_at": _format_datetime(_cove_acc.last_run_at) if _cove_acc.last_run_at else "",
"status": run.status or "",
}
# No mail_meta / body_html for Cove runs — Cove has no email
# For Cloud Connect runs, suppress the raw report email (it contains all
# tenants) and replace it with a structured summary from the staging account.
elif getattr(run, "source_type", None) == "cloud_connect":
from ..models import CloudConnectAccount
_cc_acc = CloudConnectAccount.query.filter_by(job_id=job.id).first()
if _cc_acc:
cloud_connect_summary = {
"user": _cc_acc.user or "",
"section": _cc_acc.section or "",
"repo_name": _cc_acc.repo_name or "",
"repo_type": _cc_acc.repo_type or "",
"used_space": _cc_acc.used_space or "",
"total_quota": _cc_acc.total_quota or "",
"free_space": _cc_acc.free_space or "",
"last_active": _cc_acc.last_active_raw or "",
"status": run.status or "",
}
# Keep mail meta, EML link, and body for the collapsible source panel.
if msg:
mail_meta = {
"from_address": msg.from_address or "",
"subject": msg.subject or "",
"received_at": _format_datetime(msg.received_at),
}
has_eml = bool(getattr(msg, "eml_stored_at", None))
elif msg:
    mail_meta = {
        "from_address": msg.from_address or "",
        "subject": msg.subject or "",
        "received_at": _format_datetime(msg.received_at),
    }
if msg:
def _is_blank_text(s):
    return s is None or (isinstance(s, str) and s.strip() == "")
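The Cove datasource translation above splits a concatenated code string such as `"D01D19"` into individual `D`-codes with a regex, then de-duplicates while preserving first-seen order. A standalone sketch of that step (the raw value is a made-up example, not taken from a real `CoveAccount` row):

```python
import re

raw = "D01D19D01"  # hypothetical CoveAccount.datasource_types value
codes = re.findall(r"D\d{1,2}", raw.strip().upper())

labels = []
for code in codes:
    if code not in labels:  # de-duplicate, keep first-seen order
        labels.append(code)

assert codes == ["D01", "D19", "D01"]
assert labels == ["D01", "D19"]
```

In the real view each code is additionally mapped through `DATASOURCE_LABELS` before joining; unknown codes fall back to the code itself.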
@@ -1617,8 +1259,7 @@ def run_checks_details():
)
# If no run-linked objects exist yet, fall back to objects parsed/stored on the mail message.
# Skip this fallback for cloud_connect runs: the mail contains all tenants, not just this one.
if (not objects_payload) and msg and getattr(run, "source_type", None) != "cloud_connect":
try:
    for mo in (
        MailObject.query.filter_by(mail_message_id=msg.id)
@@ -1670,8 +1311,6 @@ def run_checks_details():
"has_eml": bool(has_eml), "has_eml": bool(has_eml),
"mail": mail_meta, "mail": mail_meta,
"body_html": body_html, "body_html": body_html,
"cloud_connect_summary": cloud_connect_summary,
"cove_summary": cove_summary,
"objects": objects_payload, "objects": objects_payload,
"autotask_ticket_id": getattr(run, "autotask_ticket_id", None), "autotask_ticket_id": getattr(run, "autotask_ticket_id", None),
"autotask_ticket_number": getattr(run, "autotask_ticket_number", None) or "", "autotask_ticket_number": getattr(run, "autotask_ticket_number", None) or "",


@@ -1,963 +0,0 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import (
_apply_overrides_to_run,
_format_datetime,
_get_or_create_settings,
_get_ui_timezone,
_infer_monthly_schedule_from_runs,
_infer_schedule_map_from_runs,
)
from sqlalchemy import and_, cast, func, or_, String
import math
SEARCH_LIMIT_PER_SECTION = 10
SEARCH_SECTION_KEYS = [
"inbox",
"customers",
"jobs",
"daily_jobs",
"run_checks",
"tickets",
"remarks",
"overrides",
"reports",
]
def _is_section_allowed(section: str) -> bool:
role = get_active_role()
allowed = {
"inbox": {"admin", "operator", "viewer"},
"customers": {"admin", "operator", "viewer"},
"jobs": {"admin", "operator", "viewer"},
"daily_jobs": {"admin", "operator", "viewer"},
"run_checks": {"admin", "operator"},
"tickets": {"admin", "operator", "viewer"},
"remarks": {"admin", "operator", "viewer"},
"overrides": {"admin", "operator", "viewer"},
"reports": {"admin", "operator", "viewer", "reporter"},
}
return role in allowed.get(section, set())
def _build_patterns(raw_query: str) -> list[str]:
tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
patterns: list[str] = []
for token in tokens:
p = token.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = f"%{p}"
if not p.endswith("%"):
p = f"{p}%"
patterns.append(p)
return patterns
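`_build_patterns` is the only place user input is turned into SQL LIKE patterns, so it is worth seeing concretely: whitespace splits the query into terms, literal `%`/`_`/`\` are escaped, `*` becomes the SQL wildcard, and every term is wrapped in `%...%` for substring matching. The function is restated here so the snippet runs standalone:

```python
def _build_patterns(raw_query: str) -> list[str]:
    # split on whitespace, escape SQL wildcards, map * -> %, wrap in %...%
    tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
    patterns: list[str] = []
    for token in tokens:
        p = token.replace("\\", "\\\\")
        p = p.replace("%", "\\%").replace("_", "\\_")
        p = p.replace("*", "%")
        if not p.startswith("%"):
            p = f"%{p}"
        if not p.endswith("%"):
            p = f"{p}%"
        patterns.append(p)
    return patterns

assert _build_patterns("ver*") == ["%ver%"]                      # * becomes a SQL wildcard
assert _build_patterns("veeam daily") == ["%veeam%", "%daily%"]  # one pattern per term
assert _build_patterns("   ") == []                              # blank input yields no patterns
```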
def _contains_all_terms(columns: list, patterns: list[str]):
if not patterns or not columns:
return None
term_filters = []
for pattern in patterns:
per_term = [col.ilike(pattern, escape="\\") for col in columns]
term_filters.append(or_(*per_term))
return and_(*term_filters)
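`_contains_all_terms` builds an AND over terms of an OR over columns: every search term must match at least one column, but different terms may hit different columns. The same semantics in plain Python (a hypothetical `matches` helper standing in for the `ilike` filters):

```python
def matches(row_values, terms):
    # AND over terms, OR over columns -- mirrors and_(or_(col.ilike(t) ...) ...)
    return all(
        any(term.lower() in (value or "").lower() for value in row_values)
        for term in terms
    )

assert matches(["Veeam", "Daily backup"], ["veeam", "daily"])       # terms may hit different columns
assert not matches(["Veeam", "Daily backup"], ["veeam", "nakivo"])  # every term must match somewhere
```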
def _parse_page(value: str | None) -> int:
try:
page = int((value or "").strip())
except Exception:
page = 1
return page if page > 0 else 1
def _paginate_query(query, page: int, order_by_cols: list):
total = query.count()
total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
current_page = min(max(page, 1), total_pages)
rows = (
query.order_by(*order_by_cols)
.offset((current_page - 1) * SEARCH_LIMIT_PER_SECTION)
.limit(SEARCH_LIMIT_PER_SECTION)
.all()
)
return total, current_page, total_pages, rows
def _enrich_paging(section: dict, total: int, current_page: int, total_pages: int) -> None:
section["total"] = int(total or 0)
section["current_page"] = int(current_page or 1)
section["total_pages"] = int(total_pages or 1)
section["has_prev"] = section["current_page"] > 1
section["has_next"] = section["current_page"] < section["total_pages"]
section["prev_url"] = ""
section["next_url"] = ""
def _build_inbox_results(patterns: list[str], page: int) -> dict:
section = {
"key": "inbox",
"title": "Inbox",
"view_all_url": url_for("main.inbox"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("inbox"):
return section
query = MailMessage.query
if hasattr(MailMessage, "location"):
query = query.filter(MailMessage.location == "inbox")
match_expr = _contains_all_terms(
[
func.coalesce(MailMessage.from_address, ""),
func.coalesce(MailMessage.subject, ""),
cast(MailMessage.received_at, String),
func.coalesce(MailMessage.backup_software, ""),
func.coalesce(MailMessage.backup_type, ""),
func.coalesce(MailMessage.job_name, ""),
func.coalesce(MailMessage.parse_result, ""),
cast(MailMessage.parsed_at, String),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[MailMessage.received_at.desc().nullslast(), MailMessage.id.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
for msg in rows:
parsed_flag = bool(getattr(msg, "parsed_at", None) or (msg.parse_result or ""))
section["items"].append(
{
"title": msg.subject or f"Message #{msg.id}",
"subtitle": f"{msg.from_address or '-'} | {_format_datetime(msg.received_at)}",
"meta": f"{msg.backup_software or '-'} / {msg.backup_type or '-'} / {msg.job_name or '-'} | Parsed: {'Yes' if parsed_flag else 'No'}",
"link": url_for("main.inbox"),
}
)
return section
def _build_customers_results(patterns: list[str], page: int) -> dict:
section = {
"key": "customers",
"title": "Customers",
"view_all_url": url_for("main.customers"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("customers"):
return section
query = Customer.query
match_expr = _contains_all_terms([func.coalesce(Customer.name, "")], patterns)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Customer.name.asc()],
)
_enrich_paging(section, total, current_page, total_pages)
for c in rows:
try:
job_count = c.jobs.count()
except Exception:
job_count = 0
section["items"].append(
{
"title": c.name or f"Customer #{c.id}",
"subtitle": f"Jobs: {job_count}",
"meta": "Active" if c.active else "Inactive",
"link": url_for("main.jobs", customer_id=c.id),
}
)
return section
def _build_jobs_results(patterns: list[str], page: int) -> dict:
section = {
"key": "jobs",
"title": "Jobs",
"view_all_url": url_for("main.jobs"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("jobs"):
return section
query = (
db.session.query(
Job.id.label("job_id"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Job.job_name.label("job_name"),
Customer.name.label("customer_name"),
)
.select_from(Job)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc(),
Job.backup_type.asc(),
Job.job_name.asc(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": "",
"link": url_for("main.job_detail", job_id=row.job_id),
}
)
return section
def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
section = {
"key": "daily_jobs",
"title": "Daily Jobs",
"view_all_url": url_for("main.daily_jobs"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("daily_jobs"):
return section
try:
tz = _get_ui_timezone()
except Exception:
tz = None
try:
target_date = datetime.now(tz).date() if tz else datetime.utcnow().date()
except Exception:
target_date = datetime.utcnow().date()
settings = _get_or_create_settings()
missed_start_date = getattr(settings, "daily_jobs_start_date", None)
if tz:
local_midnight = datetime(
year=target_date.year,
month=target_date.month,
day=target_date.day,
hour=0,
minute=0,
second=0,
tzinfo=tz,
)
start_of_day = local_midnight.astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
end_of_day = (local_midnight + timedelta(days=1)).astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
else:
start_of_day = datetime(
year=target_date.year,
month=target_date.month,
day=target_date.day,
hour=0,
minute=0,
second=0,
)
end_of_day = start_of_day + timedelta(days=1)
def _to_local(dt_utc):
if not dt_utc or not tz:
return dt_utc
try:
if dt_utc.tzinfo is None:
dt_utc = dt_utc.replace(tzinfo=datetime_module.timezone.utc)
return dt_utc.astimezone(tz)
except Exception:
return dt_utc
def _bucket_15min(dt_utc):
d = _to_local(dt_utc)
if not d:
return None
minute_bucket = (d.minute // 15) * 15
return f"{d.hour:02d}:{minute_bucket:02d}"
def _is_success_status(value: str) -> bool:
s = (value or "").strip().lower()
if not s:
return False
return ("success" in s) or ("override" in s)
query = (
db.session.query(
Job.id.label("job_id"),
Job.job_name.label("job_name"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(Job)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc(),
Job.backup_type.asc(),
Job.job_name.asc(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
expected_times = (_infer_schedule_map_from_runs(row.job_id).get(target_date.weekday()) or [])
if not expected_times:
monthly = _infer_monthly_schedule_from_runs(row.job_id)
if monthly:
try:
dom = int(monthly.get("day_of_month") or 0)
except Exception:
dom = 0
mtimes = monthly.get("times") or []
try:
import calendar as _calendar
last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
except Exception:
last_dom = target_date.day
scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
if target_date.day == scheduled_dom:
expected_times = list(mtimes)
runs_for_day = (
JobRun.query.filter(
JobRun.job_id == row.job_id,
JobRun.run_at >= start_of_day,
JobRun.run_at < end_of_day,
)
.order_by(JobRun.run_at.asc())
.all()
)
run_count = len(runs_for_day)
last_status = "-"
expected_display = expected_times[-1] if expected_times else "-"
if run_count > 0:
last_run = runs_for_day[-1]
try:
job_obj = Job.query.get(int(row.job_id))
status_display, _override_applied, _override_level, _ov_id, _ov_reason = _apply_overrides_to_run(job_obj, last_run)
if getattr(last_run, "missed", False):
last_status = status_display or "Missed"
else:
last_status = status_display or (last_run.status or "-")
except Exception:
last_status = last_run.status or "-"
expected_display = _bucket_15min(last_run.run_at) or expected_display
else:
try:
today_local = datetime.now(tz).date() if tz else datetime.utcnow().date()
except Exception:
today_local = datetime.utcnow().date()
if target_date > today_local:
last_status = "Expected"
elif target_date == today_local:
last_status = "Expected"
else:
if missed_start_date and target_date < missed_start_date:
last_status = "-"
else:
last_status = "Missed"
success_text = "Yes" if _is_success_status(last_status) else "No"
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": f"Expected: {expected_display} | Successful: {success_text} | Runs: {run_count}",
"link": url_for("main.daily_jobs", date=target_date.strftime("%Y-%m-%d"), open_job_id=row.job_id),
}
)
return section
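Two nested helpers in the builder above carry the scheduling arithmetic: `_bucket_15min` rounds a run time down to its quarter-hour slot, and the monthly fallback clamps a configured day-of-month to the target month's length. A standalone sketch under hypothetical names:

```python
import calendar

def bucket_15min(hour: int, minute: int) -> str:
    # round down to the start of the 15-minute slot, e.g. 09:44 -> "09:30"
    return f"{hour:02d}:{(minute // 15) * 15:02d}"

def scheduled_day(year: int, month: int, day_of_month: int) -> int:
    # clamp e.g. "run on the 31st" to the last real day of shorter months
    last_dom = calendar.monthrange(year, month)[1]
    return day_of_month if (day_of_month and day_of_month <= last_dom) else last_dom

assert bucket_15min(9, 44) == "09:30"
assert bucket_15min(23, 59) == "23:45"
assert scheduled_day(2026, 2, 31) == 28   # February 2026 has 28 days
assert scheduled_day(2026, 4, 15) == 15
```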
def _build_run_checks_results(patterns: list[str], page: int) -> dict:
section = {
"key": "run_checks",
"title": "Run Checks",
"view_all_url": url_for("main.run_checks_page"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("run_checks"):
return section
agg = (
db.session.query(
JobRun.job_id.label("job_id"),
func.count(JobRun.id).label("run_count"),
)
.filter(JobRun.reviewed_at.is_(None))
.group_by(JobRun.job_id)
.subquery()
)
query = (
db.session.query(
Job.id.label("job_id"),
Job.job_name.label("job_name"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
agg.c.run_count.label("run_count"),
)
.select_from(Job)
.join(agg, agg.c.job_id == Job.id)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc().nullslast(),
Job.backup_type.asc().nullslast(),
Job.job_name.asc().nullslast(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": f"Unreviewed runs: {int(row.run_count or 0)}",
"link": url_for("main.run_checks_page"),
}
)
return section
def _build_tickets_results(patterns: list[str], page: int) -> dict:
section = {
"key": "tickets",
"title": "Tickets",
"view_all_url": url_for("main.tickets_page"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("tickets"):
return section
query = (
db.session.query(Ticket)
.select_from(Ticket)
.outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
.outerjoin(Customer, Customer.id == TicketScope.customer_id)
.outerjoin(Job, Job.id == TicketScope.job_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Ticket.ticket_code, ""),
func.coalesce(Customer.name, ""),
func.coalesce(TicketScope.scope_type, ""),
func.coalesce(TicketScope.backup_software, ""),
func.coalesce(TicketScope.backup_type, ""),
func.coalesce(TicketScope.job_name_match, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
query = query.distinct()
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Ticket.start_date.desc().nullslast()],
)
_enrich_paging(section, total, current_page, total_pages)
for t in rows:
customer_display = "-"
scope_summary = "-"
try:
scope_rows = (
db.session.query(
TicketScope.scope_type.label("scope_type"),
TicketScope.backup_software.label("backup_software"),
TicketScope.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(TicketScope)
.outerjoin(Customer, Customer.id == TicketScope.customer_id)
.filter(TicketScope.ticket_id == t.id)
.all()
)
customer_names = []
for s in scope_rows:
cname = getattr(s, "customer_name", None)
if cname and cname not in customer_names:
customer_names.append(cname)
if customer_names:
customer_display = customer_names[0]
if len(customer_names) > 1:
customer_display = f"{customer_display} +{len(customer_names)-1}"
if scope_rows:
s = scope_rows[0]
bits = []
if getattr(s, "scope_type", None):
bits.append(str(getattr(s, "scope_type")))
if getattr(s, "backup_software", None):
bits.append(str(getattr(s, "backup_software")))
if getattr(s, "backup_type", None):
bits.append(str(getattr(s, "backup_type")))
scope_summary = " / ".join(bits) if bits else "-"
except Exception:
customer_display = "-"
scope_summary = "-"
section["items"].append(
{
"title": t.ticket_code or f"Ticket #{t.id}",
"subtitle": f"{customer_display} | {scope_summary}",
"meta": _format_datetime(t.start_date),
"link": url_for("main.ticket_detail", ticket_id=t.id),
}
)
return section
def _build_remarks_results(patterns: list[str], page: int) -> dict:
section = {
"key": "remarks",
"title": "Remarks",
"view_all_url": url_for("main.tickets_page", tab="remarks"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("remarks"):
return section
query = (
db.session.query(Remark)
.select_from(Remark)
.outerjoin(RemarkScope, RemarkScope.remark_id == Remark.id)
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
.outerjoin(Job, Job.id == RemarkScope.job_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Remark.title, ""),
func.coalesce(Remark.body, ""),
func.coalesce(Customer.name, ""),
func.coalesce(RemarkScope.scope_type, ""),
func.coalesce(RemarkScope.backup_software, ""),
func.coalesce(RemarkScope.backup_type, ""),
func.coalesce(RemarkScope.job_name_match, ""),
func.coalesce(Job.job_name, ""),
cast(Remark.start_date, String),
cast(Remark.resolved_at, String),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
query = query.distinct()
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Remark.start_date.desc().nullslast()],
)
_enrich_paging(section, total, current_page, total_pages)
for r in rows:
customer_display = "-"
scope_summary = "-"
try:
scope_rows = (
db.session.query(
RemarkScope.scope_type.label("scope_type"),
RemarkScope.backup_software.label("backup_software"),
RemarkScope.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(RemarkScope)
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
.filter(RemarkScope.remark_id == r.id)
.all()
)
customer_names = []
for s in scope_rows:
cname = getattr(s, "customer_name", None)
if cname and cname not in customer_names:
customer_names.append(cname)
if customer_names:
customer_display = customer_names[0]
if len(customer_names) > 1:
customer_display = f"{customer_display} +{len(customer_names)-1}"
if scope_rows:
s = scope_rows[0]
bits = []
if getattr(s, "scope_type", None):
bits.append(str(getattr(s, "scope_type")))
if getattr(s, "backup_software", None):
bits.append(str(getattr(s, "backup_software")))
if getattr(s, "backup_type", None):
bits.append(str(getattr(s, "backup_type")))
scope_summary = " / ".join(bits) if bits else "-"
except Exception:
customer_display = "-"
scope_summary = "-"
preview = (r.title or r.body or "").strip()
if len(preview) > 80:
preview = preview[:77] + "..."
section["items"].append(
{
"title": preview or f"Remark #{r.id}",
"subtitle": f"{customer_display} | {scope_summary}",
"meta": _format_datetime(r.start_date),
"link": url_for("main.remark_detail", remark_id=r.id),
}
)
return section
def _build_overrides_results(patterns: list[str], page: int) -> dict:
section = {
"key": "overrides",
"title": "Existing overrides",
"view_all_url": url_for("main.overrides"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("overrides"):
return section
query = (
db.session.query(
Override.id.label("id"),
Override.level.label("level"),
Override.backup_software.label("backup_software"),
Override.backup_type.label("backup_type"),
Override.object_name.label("object_name"),
Override.start_at.label("start_at"),
Override.end_at.label("end_at"),
Override.comment.label("comment"),
Customer.name.label("customer_name"),
Job.job_name.label("job_name"),
)
.select_from(Override)
.outerjoin(Job, Job.id == Override.job_id)
.outerjoin(Customer, Customer.id == Job.customer_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Override.level, ""),
func.coalesce(Customer.name, ""),
func.coalesce(Override.backup_software, ""),
func.coalesce(Override.backup_type, ""),
func.coalesce(Job.job_name, ""),
func.coalesce(Override.object_name, ""),
cast(Override.start_at, String),
cast(Override.end_at, String),
func.coalesce(Override.comment, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Override.level.asc(), Override.start_at.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
scope_bits = []
if row.customer_name:
scope_bits.append(row.customer_name)
if row.backup_software:
scope_bits.append(row.backup_software)
if row.backup_type:
scope_bits.append(row.backup_type)
if row.job_name:
scope_bits.append(row.job_name)
if row.object_name:
scope_bits.append(f"object: {row.object_name}")
scope_text = " / ".join(scope_bits) if scope_bits else "All jobs"
section["items"].append(
{
"title": (row.level or "override").capitalize(),
"subtitle": scope_text,
"meta": f"From {_format_datetime(row.start_at)} to {_format_datetime(row.end_at) if row.end_at else '-'} | {row.comment or ''}",
"link": url_for("main.overrides"),
}
)
return section
def _build_reports_results(patterns: list[str], page: int) -> dict:
section = {
"key": "reports",
"title": "Reports",
"view_all_url": url_for("main.reports"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("reports"):
return section
query = ReportDefinition.query
match_expr = _contains_all_terms(
[
func.coalesce(ReportDefinition.name, ""),
func.coalesce(ReportDefinition.report_type, ""),
cast(ReportDefinition.period_start, String),
cast(ReportDefinition.period_end, String),
func.coalesce(ReportDefinition.output_format, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[ReportDefinition.created_at.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
can_edit = get_active_role() in ("admin", "operator", "reporter")
for r in rows:
section["items"].append(
{
"title": r.name or f"Report #{r.id}",
"subtitle": f"{r.report_type or '-'} | {r.output_format or '-'}",
"meta": f"{_format_datetime(r.period_start)} -> {_format_datetime(r.period_end)}",
"link": (url_for("main.reports_edit", report_id=r.id) if can_edit else url_for("main.reports")),
}
)
return section
@main_bp.route("/search")
@login_required
def search_page():
query = (request.args.get("q") or "").strip()
patterns = _build_patterns(query)
requested_pages = {
key: _parse_page(request.args.get(f"p_{key}"))
for key in SEARCH_SECTION_KEYS
}
sections = []
if patterns:
sections.append(_build_inbox_results(patterns, requested_pages["inbox"]))
sections.append(_build_customers_results(patterns, requested_pages["customers"]))
sections.append(_build_jobs_results(patterns, requested_pages["jobs"]))
sections.append(_build_daily_jobs_results(patterns, requested_pages["daily_jobs"]))
sections.append(_build_run_checks_results(patterns, requested_pages["run_checks"]))
sections.append(_build_tickets_results(patterns, requested_pages["tickets"]))
sections.append(_build_remarks_results(patterns, requested_pages["remarks"]))
sections.append(_build_overrides_results(patterns, requested_pages["overrides"]))
sections.append(_build_reports_results(patterns, requested_pages["reports"]))
else:
sections = [
{"key": "inbox", "title": "Inbox", "view_all_url": url_for("main.inbox"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "customers", "title": "Customers", "view_all_url": url_for("main.customers"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "jobs", "title": "Jobs", "view_all_url": url_for("main.jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "daily_jobs", "title": "Daily Jobs", "view_all_url": url_for("main.daily_jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "run_checks", "title": "Run Checks", "view_all_url": url_for("main.run_checks_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "tickets", "title": "Tickets", "view_all_url": url_for("main.tickets_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "remarks", "title": "Remarks", "view_all_url": url_for("main.tickets_page", tab="remarks"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "overrides", "title": "Existing overrides", "view_all_url": url_for("main.overrides"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "reports", "title": "Reports", "view_all_url": url_for("main.reports"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
]
visible_sections = [s for s in sections if _is_section_allowed(s["key"])]
current_pages = {
s["key"]: int(s.get("current_page", 1) or 1)
for s in sections
}
def _build_search_url(page_overrides: dict[str, int]) -> str:
args = {"q": query}
for key in SEARCH_SECTION_KEYS:
args[f"p_{key}"] = int(page_overrides.get(key, current_pages.get(key, 1)))
return url_for("main.search_page", **args)
for s in visible_sections:
key = s["key"]
cur = int(s.get("current_page", 1) or 1)
if query:
if key == "inbox":
s["view_all_url"] = url_for("main.inbox", q=query)
elif key == "customers":
s["view_all_url"] = url_for("main.customers", q=query)
elif key == "jobs":
s["view_all_url"] = url_for("main.jobs", q=query)
elif key == "daily_jobs":
s["view_all_url"] = url_for("main.daily_jobs", q=query)
elif key == "run_checks":
s["view_all_url"] = url_for("main.run_checks_page", q=query)
elif key == "tickets":
s["view_all_url"] = url_for("main.tickets_page", q=query)
elif key == "remarks":
s["view_all_url"] = url_for("main.tickets_page", tab="remarks", q=query)
elif key == "overrides":
s["view_all_url"] = url_for("main.overrides", q=query)
elif key == "reports":
s["view_all_url"] = url_for("main.reports", q=query)
if s.get("has_prev"):
prev_pages = dict(current_pages)
prev_pages[key] = cur - 1
s["prev_url"] = _build_search_url(prev_pages)
if s.get("has_next"):
next_pages = dict(current_pages)
next_pages[key] = cur + 1
s["next_url"] = _build_search_url(next_pages)
total_hits = sum(int(s.get("total", 0) or 0) for s in visible_sections)
return render_template(
"main/search.html",
query=query,
sections=visible_sections,
total_hits=total_hits,
limit_per_section=SEARCH_LIMIT_PER_SECTION,
)
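`search_page` keeps every section's page number in the query string (`p_inbox`, `p_jobs`, and so on), so paging one section preserves the position of all the others. A minimal sketch of that URL-building idea without Flask's `url_for` (hypothetical `build_search_url`, fixed parameter ordering for determinism):

```python
def build_search_url(query, current_pages, override_key, override_page):
    # keep all sections' current pages, override only the section being paged
    pages = dict(current_pages)
    pages[override_key] = override_page
    params = [f"q={query}"] + [f"p_{k}={v}" for k, v in sorted(pages.items())]
    return "/search?" + "&".join(params)

assert build_search_url("veeam", {"inbox": 2, "jobs": 1}, "jobs", 3) == \
    "/search?q=veeam&p_inbox=2&p_jobs=3"
```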


@@ -2,65 +2,124 @@ from .routes_shared import * # noqa: F401,F403
from .routes_shared import _get_database_size_bytes, _get_or_create_settings, _format_bytes, _get_free_disk_bytes, _log_admin_event
import json
from datetime import datetime
-from ..models import CoveAccount, CloudConnectAccount
@main_bp.route("/settings/jobs/delete-all", methods=["POST"])
@login_required
@roles_required("admin")
def settings_jobs_delete_all():
    try:
-        job_count = db.session.execute(text("SELECT COUNT(*) FROM jobs")).scalar() or 0
-        if not job_count:
+        jobs = Job.query.all()
+        if not jobs:
            flash("No jobs to delete.", "info")
-            return redirect(url_for("main.settings"))
+            return redirect(url_for("main.settings", section="general"))
-        def _try(stmt):
-            """Best-effort: skip if table/column doesn't exist in this schema version."""
+        # Collect run ids for FK cleanup in auxiliary tables that may not have ON DELETE CASCADE
+        run_ids = []
+        mail_message_ids = []
+        for job in jobs:
+            for run in job.runs:
+                if run.id is not None:
+                    run_ids.append(run.id)
+                if run.mail_message_id:
+                    mail_message_ids.append(run.mail_message_id)
+        # Return related mails back to inbox and unlink from job
+        if mail_message_ids:
+            msgs = MailMessage.query.filter(MailMessage.id.in_(mail_message_ids)).all()
+            for msg in msgs:
+                if hasattr(msg, "location"):
+                    msg.location = "inbox"
+                msg.job_id = None
+        def _safe_execute(stmt, params):
            try:
-                db.session.execute(text(stmt))
+                db.session.execute(stmt, params)
-            except Exception as exc:
-                print(f"[settings-jobs] Skipped: {exc}")
+            except Exception as cleanup_exc:
+                # Best-effort cleanup for differing DB schemas
+                print(f"[settings-jobs] Cleanup skipped: {cleanup_exc}")
-        # All deletions via direct SQL — no ORM loading into Python memory.
-        # Order: unlink → FK-dependent tables → job_runs → jobs.
-        # 1. Unlink staging accounts that have a nullable FK to jobs.
-        _try("UPDATE cove_accounts SET job_id = NULL WHERE job_id IS NOT NULL")
-        _try("UPDATE cloud_connect_accounts SET job_id = NULL WHERE job_id IS NOT NULL")
-        # 2. Return mail messages linked to jobs back to the inbox.
-        db.session.execute(text(
-            "UPDATE mail_messages SET job_id = NULL, location = 'inbox' WHERE job_id IS NOT NULL"
-        ))
-        # 3. Delete tables that FK → job_runs.
-        _try("DELETE FROM remark_job_runs")
-        _try("DELETE FROM ticket_job_runs")
-        _try("DELETE FROM run_object_links")
-        _try("DELETE FROM job_run_review_events")
-        # 4. Delete tables that FK → jobs.
-        _try("DELETE FROM job_object_links")
-        _try("DELETE FROM ticket_scopes WHERE job_id IS NOT NULL")
-        _try("DELETE FROM remark_scopes WHERE job_id IS NOT NULL OR job_run_id IS NOT NULL")
-        _try("DELETE FROM overrides WHERE job_id IS NOT NULL")
-        # 5. Delete runs, then jobs.
-        db.session.execute(text("DELETE FROM job_runs"))
-        db.session.execute(text("DELETE FROM jobs"))
+        # Ensure run_object_links doesn't block job_runs deletion (older schemas may miss ON DELETE CASCADE)
+        if run_ids:
+            db.session.execute(
+                text("DELETE FROM run_object_links WHERE run_id IN :run_ids").bindparams(
+                    bindparam("run_ids", expanding=True)
+                ),
+                {"run_ids": run_ids},
+            )
+        # Ensure job_object_links doesn't block jobs deletion (older schemas may miss ON DELETE CASCADE)
+        job_ids = [j.id for j in jobs]
+        if job_ids:
+            db.session.execute(
+                text("DELETE FROM job_object_links WHERE job_id IN :job_ids").bindparams(
+                    bindparam("job_ids", expanding=True)
+                ),
+                {"job_ids": job_ids},
+            )
+        # Clean up auxiliary FK tables that may reference job_runs/jobs without ON DELETE CASCADE (older schemas)
+        if run_ids:
+            _safe_execute(
+                text("DELETE FROM remark_job_runs WHERE job_run_id IN :run_ids").bindparams(
+                    bindparam("run_ids", expanding=True)
+                ),
+                {"run_ids": run_ids},
+            )
+            _safe_execute(
+                text("DELETE FROM ticket_job_runs WHERE job_run_id IN :run_ids").bindparams(
+                    bindparam("run_ids", expanding=True)
+                ),
+                {"run_ids": run_ids},
+            )
+            # Some schemas use remark_scopes for per-run remarks
+            _safe_execute(
+                text("DELETE FROM remark_scopes WHERE job_run_id IN :run_ids").bindparams(
+                    bindparam("run_ids", expanding=True)
+                ),
+                {"run_ids": run_ids},
+            )
+        if job_ids:
+            # ticket_scopes.job_id is a FK without ON DELETE CASCADE in some schemas
+            _safe_execute(
+                text("DELETE FROM ticket_scopes WHERE job_id IN :job_ids").bindparams(
+                    bindparam("job_ids", expanding=True)
+                ),
+                {"job_ids": job_ids},
+            )
+            # Some schemas use remark_scopes for per-job remarks
+            _safe_execute(
+                text("DELETE FROM remark_scopes WHERE job_id IN :job_ids").bindparams(
+                    bindparam("job_ids", expanding=True)
+                ),
+                {"job_ids": job_ids},
+            )
+            # Overrides may reference jobs directly
+            _safe_execute(
+                text("DELETE FROM overrides WHERE job_id IN :job_ids").bindparams(
+                    bindparam("job_ids", expanding=True)
+                ),
+                {"job_ids": job_ids},
+            )
+        # Delete all jobs (runs/objects are cascaded via ORM relationships)
+        for job in jobs:
+            db.session.delete(job)
        db.session.commit()
-        flash("All jobs deleted. Related mails are returned to the inbox.", "success")
-        _log_admin_event(
-            event_type="jobs_delete_all",
-            message=f"Deleted all {job_count} job(s) and all related data.",
)
flash(f"All {job_count} jobs deleted. Related mails are returned to the inbox.", "success")
except Exception as exc: except Exception as exc:
db.session.rollback() db.session.rollback()
print(f"[settings-jobs] Failed to delete all jobs: {exc}") print(f"[settings-jobs] Failed to delete all jobs: {exc}")
flash(f"Failed to delete all jobs: {exc}", "danger") flash("Failed to delete all jobs.", "danger")
return redirect(url_for("main.settings")) return redirect(url_for("main.settings"))
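The `bindparam(..., expanding=True)` calls on one side of this hunk expand a single `IN :run_ids` placeholder into one bound parameter per id at execution time. With the stdlib `sqlite3` driver the same idea can be sketched by expanding the placeholders manually (table and column names are borrowed from the diff; `delete_in` is a hypothetical helper, not part of the codebase):

```python
import sqlite3

def delete_in(conn, table, column, ids):
    # Expand one "?" per id, mirroring what SQLAlchemy's
    # bindparam(..., expanding=True) does for "IN :ids" clauses.
    if not ids:
        return 0
    marks = ",".join("?" for _ in ids)
    cur = conn.execute(f"DELETE FROM {table} WHERE {column} IN ({marks})", list(ids))
    return cur.rowcount

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE run_object_links (run_id INTEGER)")
conn.executemany("INSERT INTO run_object_links VALUES (?)", [(1,), (2,), (3,)])
deleted = delete_in(conn, "run_object_links", "run_id", [1, 3])
print(deleted)  # 2
```

The empty-list guard matters: `IN ()` is invalid SQL in most dialects, which is also why the diff wraps each delete in `if run_ids:` / `if job_ids:`.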
@@ -452,7 +511,7 @@ def settings_jobs_export():
     try:
         jobs = Job.query.all()
         payload = {
-            "schema": "approved_jobs_export_v2",
+            "schema": "approved_jobs_export_v1",
             "exported_at": datetime.utcnow().isoformat() + "Z",
             "counts": {"customers": 0, "jobs": 0},
             "customers": [],
@@ -476,25 +535,6 @@ def settings_jobs_export():
     for job in jobs:
         customer = customer_by_id.get(job.customer_id)
-        cove_acc = getattr(job, "cove_account", None)
-        cove_data = None
-        if cove_acc:
-            cove_data = {
-                "account_id": cove_acc.account_id,
-                "account_name": cove_acc.account_name or "",
-                "computer_name": cove_acc.computer_name or "",
-            }
-        cc_acc = getattr(job, "cloud_connect_account", None)
-        cc_data = None
-        if cc_acc:
-            cc_data = {
-                "user": cc_acc.user or "",
-                "section": cc_acc.section or "",
-                "repo_name": cc_acc.repo_name or "",
-            }
         payload["jobs"].append(
             {
                 "customer_name": customer.name if customer else None,
@@ -508,8 +548,6 @@ def settings_jobs_export():
                 "schedule_times": job.schedule_times,
                 "auto_approve": bool(job.auto_approve),
                 "active": bool(job.active),
-                "cove_account": cove_data,
-                "cloud_connect_account": cc_data,
             }
         )
@@ -522,7 +560,7 @@ def settings_jobs_export():
             f"Exported jobs configuration",
             details=json.dumps({
                 "format": "JSON",
-                "schema": "approved_jobs_export_v2",
+                "schema": "approved_jobs_export_v1",
                 "customers_count": len(payload["customers"]),
                 "jobs_count": len(payload["jobs"])
             }, indent=2)
@@ -547,7 +585,6 @@ def settings_jobs_export():
 @roles_required("admin")
 def settings_jobs_import():
     upload = request.files.get("jobs_file")
-    include_autotask_ids = bool(request.form.get("include_autotask_ids"))
     if not upload or not upload.filename:
         flash("No import file was provided.", "danger")
         return redirect(url_for("main.settings", section="general"))
@@ -559,7 +596,7 @@ def settings_jobs_import():
         flash("Invalid JSON file.", "danger")
         return redirect(url_for("main.settings", section="general"))
-    if not isinstance(payload, dict) or payload.get("schema") not in ("approved_jobs_export_v1", "approved_jobs_export_v2"):
+    if not isinstance(payload, dict) or payload.get("schema") != "approved_jobs_export_v1":
         flash("Unsupported import file schema.", "danger")
         return redirect(url_for("main.settings", section="general"))
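One side of this hunk accepts both export schema versions while the other accepts only `approved_jobs_export_v1`. A standalone sketch of such a version guard (function name and return shape are illustrative, not taken from the codebase):

```python
ACCEPTED_SCHEMAS = {"approved_jobs_export_v1", "approved_jobs_export_v2"}

def validate_export_payload(payload):
    # Reject anything that is not a JSON object with a known schema tag.
    if not isinstance(payload, dict):
        return False, "not a JSON object"
    schema = payload.get("schema")
    if schema not in ACCEPTED_SCHEMAS:
        return False, f"unsupported schema: {schema!r}"
    return True, "ok"

print(validate_export_payload({"schema": "approved_jobs_export_v2"}))  # (True, 'ok')
```

Keeping the accepted set as data (rather than an inline `!=` comparison) makes adding a future `_v3` a one-line change.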
@ -584,17 +621,14 @@ def settings_jobs_import():
if not cust_name: if not cust_name:
continue continue
autotask_company_id = None # Read Autotask fields (backwards compatible - optional)
autotask_company_name = None autotask_company_id = cust_item.get("autotask_company_id")
if include_autotask_ids: autotask_company_name = cust_item.get("autotask_company_name")
# Read Autotask fields (backwards compatible - optional)
autotask_company_id = cust_item.get("autotask_company_id")
autotask_company_name = cust_item.get("autotask_company_name")
existing_customer = Customer.query.filter_by(name=cust_name).first() existing_customer = Customer.query.filter_by(name=cust_name).first()
if existing_customer: if existing_customer:
# Update Autotask mapping only when explicitly allowed by import option. # Update Autotask mapping if provided
if include_autotask_ids and autotask_company_id is not None: if autotask_company_id is not None:
existing_customer.autotask_company_id = autotask_company_id existing_customer.autotask_company_id = autotask_company_id
existing_customer.autotask_company_name = autotask_company_name existing_customer.autotask_company_name = autotask_company_name
existing_customer.autotask_mapping_status = None # Will be resynced existing_customer.autotask_mapping_status = None # Will be resynced
@@ -691,7 +725,6 @@ def settings_jobs_import():
                 existing.auto_approve = auto_approve
                 existing.active = active
                 updated_jobs += 1
-                job_record = existing
             else:
                 job_kwargs = {
                     "customer_id": (customer.id if customer else None),
@@ -710,45 +743,11 @@ def settings_jobs_import():
                     job_kwargs["from_address"] = from_address
                 new_job = Job(**job_kwargs)
                 db.session.add(new_job)
-                db.session.flush()
                 created_jobs += 1
-                job_record = new_job
-
-            # Link Cove account if present in export and not already linked to another job
-            cove_data = item.get("cove_account")
-            if cove_data and isinstance(cove_data, dict):
-                try:
-                    cove_acc = None
-                    if cove_data.get("account_id") is not None:
-                        cove_acc = CoveAccount.query.filter_by(account_id=int(cove_data["account_id"])).first()
-                    if cove_acc is None and cove_data.get("account_name") and cove_data.get("computer_name"):
-                        cove_acc = CoveAccount.query.filter_by(
-                            account_name=cove_data["account_name"],
-                            computer_name=cove_data["computer_name"],
-                        ).first()
-                    if cove_acc and (cove_acc.job_id is None or cove_acc.job_id == job_record.id):
-                        cove_acc.job_id = job_record.id
-                        job_record.cove_account_id = cove_acc.account_id
-                except Exception:
-                    pass
-
-            # Link Cloud Connect account if present in export and not already linked to another job
-            cc_data = item.get("cloud_connect_account")
-            if cc_data and isinstance(cc_data, dict):
-                try:
-                    cc_acc = CloudConnectAccount.query.filter_by(
-                        user=cc_data.get("user", ""),
-                        section=cc_data.get("section", ""),
-                        repo_name=cc_data.get("repo_name", ""),
-                    ).first()
-                    if cc_acc and (cc_acc.job_id is None or cc_acc.job_id == job_record.id):
-                        cc_acc.job_id = job_record.id
-                except Exception:
-                    pass
-
         db.session.commit()
         flash(
-            f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
+            f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}.",
             "success",
         )
@@ -759,7 +758,6 @@ def settings_jobs_import():
             details=json.dumps({
                 "format": "JSON",
                 "schema": payload.get("schema"),
-                "include_autotask_ids": include_autotask_ids,
                 "customers_created": created_customers,
                 "customers_updated": updated_customers,
                 "jobs_created": created_jobs,
@@ -783,8 +781,6 @@ def settings():
     if request.method == "POST":
         autotask_form_touched = any(str(k).startswith("autotask_") for k in (request.form or {}).keys())
-        cove_form_touched = any(str(k).startswith("cove_") for k in (request.form or {}).keys())
-        entra_form_touched = any(str(k).startswith("entra_") for k in (request.form or {}).keys())
         import_form_touched = any(str(k).startswith("auto_import_") or str(k).startswith("manual_import_") or str(k).startswith("ingest_eml_") for k in (request.form or {}).keys())
         general_form_touched = "ui_timezone" in request.form
         mail_form_touched = any(k in request.form for k in ["graph_tenant_id", "graph_client_id", "graph_mailbox", "incoming_folder", "processed_folder"])
@@ -798,7 +794,6 @@ def settings():
         old_ui_timezone = settings.ui_timezone
         old_require_daily_dashboard_visit = settings.require_daily_dashboard_visit
         old_is_sandbox_environment = settings.is_sandbox_environment
-        old_login_captcha_enabled = getattr(settings, "login_captcha_enabled", True)
         old_graph_tenant_id = settings.graph_tenant_id
         old_graph_client_id = settings.graph_client_id
         old_graph_mailbox = settings.graph_mailbox
@@ -845,9 +840,6 @@ def settings():
             # Checkbox: present in form = checked, absent = unchecked.
             settings.is_sandbox_environment = bool(request.form.get("is_sandbox_environment"))
 
-            # Login captcha toggle — same form (General tab).
-            settings.login_captcha_enabled = bool(request.form.get("login_captcha_enabled"))
-
         # Autotask integration
         if "autotask_enabled" in request.form:
             settings.autotask_enabled = bool(request.form.get("autotask_enabled"))
@@ -911,51 +903,6 @@ def settings():
             except (ValueError, TypeError):
                 pass
 
-        # Cove Data Protection integration
-        if cove_form_touched:
-            settings.cove_enabled = bool(request.form.get("cove_enabled"))
-            settings.cove_import_enabled = bool(request.form.get("cove_import_enabled"))
-            if "cove_api_url" in request.form:
-                settings.cove_api_url = (request.form.get("cove_api_url") or "").strip() or None
-            if "cove_api_username" in request.form:
-                settings.cove_api_username = (request.form.get("cove_api_username") or "").strip() or None
-            if "cove_api_password" in request.form:
-                pw = (request.form.get("cove_api_password") or "").strip()
-                if pw:
-                    settings.cove_api_password = pw
-            if "cove_import_interval_minutes" in request.form:
-                try:
-                    interval = int(request.form.get("cove_import_interval_minutes") or 30)
-                    if interval < 1:
-                        interval = 1
-                    settings.cove_import_interval_minutes = interval
-                except (ValueError, TypeError):
-                    pass
-
-        # Microsoft Entra SSO
-        if entra_form_touched:
-            settings.entra_sso_enabled = bool(request.form.get("entra_sso_enabled"))
-            settings.entra_auto_provision_users = bool(request.form.get("entra_auto_provision_users"))
-            if "entra_tenant_id" in request.form:
-                settings.entra_tenant_id = (request.form.get("entra_tenant_id") or "").strip() or None
-            if "entra_client_id" in request.form:
-                settings.entra_client_id = (request.form.get("entra_client_id") or "").strip() or None
-            if "entra_redirect_uri" in request.form:
-                settings.entra_redirect_uri = (request.form.get("entra_redirect_uri") or "").strip() or None
-            if "entra_allowed_domain" in request.form:
-                settings.entra_allowed_domain = (request.form.get("entra_allowed_domain") or "").strip() or None
-            if "entra_allowed_group_ids" in request.form:
-                settings.entra_allowed_group_ids = (request.form.get("entra_allowed_group_ids") or "").strip() or None
-            if "entra_client_secret" in request.form:
-                pw = (request.form.get("entra_client_secret") or "").strip()
-                if pw:
-                    settings.entra_client_secret = pw
-
         # Daily Jobs
         if "daily_jobs_start_date" in request.form:
             daily_jobs_start_date_str = (request.form.get("daily_jobs_start_date") or "").strip()
@ -1042,8 +989,6 @@ def settings():
changes_general["require_daily_dashboard_visit"] = {"old": old_require_daily_dashboard_visit, "new": settings.require_daily_dashboard_visit} changes_general["require_daily_dashboard_visit"] = {"old": old_require_daily_dashboard_visit, "new": settings.require_daily_dashboard_visit}
if old_is_sandbox_environment != settings.is_sandbox_environment: if old_is_sandbox_environment != settings.is_sandbox_environment:
changes_general["is_sandbox_environment"] = {"old": old_is_sandbox_environment, "new": settings.is_sandbox_environment} changes_general["is_sandbox_environment"] = {"old": old_is_sandbox_environment, "new": settings.is_sandbox_environment}
if old_login_captcha_enabled != settings.login_captcha_enabled:
changes_general["login_captcha_enabled"] = {"old": old_login_captcha_enabled, "new": settings.login_captcha_enabled}
if changes_general: if changes_general:
_log_admin_event( _log_admin_event(
@ -1169,8 +1114,6 @@ def settings():
has_client_secret = bool(settings.graph_client_secret) has_client_secret = bool(settings.graph_client_secret)
has_autotask_password = bool(getattr(settings, "autotask_api_password", None)) has_autotask_password = bool(getattr(settings, "autotask_api_password", None))
has_cove_password = bool(getattr(settings, "cove_api_password", None))
has_entra_secret = bool(getattr(settings, "entra_client_secret", None))
# Common UI timezones (IANA names) # Common UI timezones (IANA names)
tz_options = [ tz_options = [
@ -1296,8 +1239,6 @@ def settings():
free_disk_warning=free_disk_warning, free_disk_warning=free_disk_warning,
has_client_secret=has_client_secret, has_client_secret=has_client_secret,
has_autotask_password=has_autotask_password, has_autotask_password=has_autotask_password,
has_cove_password=has_cove_password,
has_entra_secret=has_entra_secret,
tz_options=tz_options, tz_options=tz_options,
users=users, users=users,
admin_users_count=admin_users_count, admin_users_count=admin_users_count,
@@ -1312,90 +1253,6 @@ def settings():
     )
 
-
-@main_bp.route("/settings/cove/test-connection", methods=["POST"])
-@login_required
-@roles_required("admin")
-def settings_cove_test_connection():
-    """Test the Cove Data Protection API connection and return JSON result."""
-    from flask import jsonify
-
-    from ..cove_importer import CoveImportError, _cove_login, COVE_DEFAULT_URL
-
-    settings = _get_or_create_settings()
-    username = (getattr(settings, "cove_api_username", None) or "").strip()
-    password = (getattr(settings, "cove_api_password", None) or "").strip()
-    url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL
-    if not username or not password:
-        return jsonify({"ok": False, "message": "Cove API username and password must be saved first."})
-    try:
-        visa, partner_id = _cove_login(url, username, password)
-        # Store the partner_id
-        settings.cove_partner_id = partner_id
-        db.session.commit()
-        _log_admin_event(
-            "cove_test_connection",
-            f"Cove connection test succeeded. Partner ID: {partner_id}",
-        )
-        return jsonify({
-            "ok": True,
-            "partner_id": partner_id,
-            "message": f"Connected Partner ID: {partner_id}",
-        })
-    except CoveImportError as exc:
-        db.session.rollback()
-        return jsonify({"ok": False, "message": str(exc)})
-    except Exception as exc:
-        db.session.rollback()
-        return jsonify({"ok": False, "message": f"Unexpected error: {exc}"})
-
-
-@main_bp.route("/settings/cove/run-now", methods=["POST"])
-@login_required
-@roles_required("admin")
-def settings_cove_run_now():
-    """Manually trigger a Cove import and show the result as a flash message."""
-    from ..cove_importer import CoveImportError, run_cove_import
-
-    settings = _get_or_create_settings()
-    if not getattr(settings, "cove_enabled", False):
-        flash("Cove integration is not enabled.", "warning")
-        return redirect(url_for("main.settings", section="integrations"))
-    username = (getattr(settings, "cove_api_username", None) or "").strip()
-    password = (getattr(settings, "cove_api_password", None) or "").strip()
-    if not username or not password:
-        flash("Cove API credentials not configured.", "warning")
-        return redirect(url_for("main.settings", section="integrations"))
-    try:
-        total, created, skipped, errors, reasons = run_cove_import(settings, include_reasons=True)
-        reason_text = ", ".join(f"{k}={v}" for k, v in sorted(reasons.items())) or "none"
-        _log_admin_event(
-            "cove_import_manual",
-            (
-                "Manual Cove import finished. "
-                f"accounts={total}, created={created}, skipped={skipped}, errors={errors}, reasons={reason_text}"
-            ),
-        )
-        flash(
-            (
-                f"Cove import finished. Accounts: {total}, new runs: {created}, "
-                f"skipped: {skipped}, errors: {errors}. Skip reasons: {reason_text}."
-            ),
-            "success" if errors == 0 else "warning",
-        )
-    except CoveImportError as exc:
-        _log_admin_event("cove_import_manual_error", f"Manual Cove import failed: {exc}")
-        flash(f"Cove import failed: {exc}", "danger")
-    except Exception as exc:
-        _log_admin_event("cove_import_manual_error", f"Unexpected error during manual Cove import: {exc}")
-        flash(f"Unexpected error: {exc}", "danger")
-    return redirect(url_for("main.settings", section="integrations"))
-
-
 @main_bp.route("/settings/news/create", methods=["POST"])
 @login_required
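The removed routes call `_cove_login`, which per the research notes speaks JSON-RPC 2.0 against `https://api.backup.management/jsonapi` and returns a `visa` token that must accompany every subsequent call. A minimal offline sketch of that request/response flow; the exact response shape here is an assumption, the sample response is fabricated for illustration, and no real network request is made:

```python
import json

COVE_DEFAULT_URL = "https://api.backup.management/jsonapi"  # from the integration notes

def build_login_request(username, password):
    # JSON-RPC 2.0 "Login" call posted with Content-Type: application/json.
    return {
        "jsonrpc": "2.0",
        "id": "1",
        "method": "Login",
        "params": {"username": username, "password": password},
    }

def with_visa(request_payload, visa):
    # Subsequent calls carry the visa alongside their own params.
    payload = dict(request_payload)
    payload["visa"] = visa
    return payload

def extract_visa(response_body):
    # Assumed response shape: a top-level "visa" field on success.
    doc = json.loads(response_body)
    if doc.get("error"):
        raise RuntimeError(str(doc["error"]))
    return doc["visa"]

# Illustrative response only — not captured from the real API.
fake_response = json.dumps({"jsonrpc": "2.0", "id": "1", "visa": "abc123"})
visa = extract_visa(fake_response)
stats_call = with_visa(
    {"jsonrpc": "2.0", "id": "2", "method": "EnumerateAccountStatistics", "params": {}},
    visa,
)
print(stats_call["visa"])  # abc123
```

In the deleted code this flow is wrapped by `_cove_login(url, username, password)`, which additionally returns the partner id that the test-connection route persists.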

View File

@@ -52,7 +52,6 @@ from ..models import (
     FeedbackItem,
     FeedbackVote,
     FeedbackReply,
-    FeedbackAttachment,
     NewsItem,
     NewsRead,
     ReportDefinition,
@@ -659,9 +658,7 @@ def _infer_schedule_map_from_runs(job_id: int):
     - Synthetic missed rows never influence schedule inference.
     - To reduce noise, a weekday/time bucket must occur at least MIN_OCCURRENCES times.
     """
-    # Higher threshold reduces false positives from short-lived patterns
-    # (e.g. a time-of-day shift that briefly leaves two active slots).
-    MIN_OCCURRENCES = 5
+    MIN_OCCURRENCES = 3
     schedule = {i: [] for i in range(7)}  # 0=Mon .. 6=Sun
 
     # Certain job types are informational and should never participate in schedule
@@ -681,10 +678,6 @@ def _infer_schedule_map_from_runs(job_id: int):
             return schedule
         if bs == 'qnap' and bt == 'firmware update':
             return schedule
-        if bs == '3cx' and bt == 'update':
-            return schedule
-        if bs == '3cx' and bt == 'ssl certificate':
-            return schedule
         if bs == 'syncovery' and bt == 'syncovery':
             return schedule
     except Exception:
@@ -693,10 +686,6 @@ def _infer_schedule_map_from_runs(job_id: int):
     try:
         # Only infer schedules from real runs that came from mail reports.
         # Synthetic "Missed" rows must never influence schedule inference.
-        # Limit to the last 90 days so that schedule changes (different day,
-        # time, or frequency) take effect quickly and do not leave stale slots
-        # generating false missed runs.
-        cutoff_utc = datetime.utcnow() - timedelta(days=90)
         runs = (
             JobRun.query
             .filter(
@@ -704,7 +693,6 @@ def _infer_schedule_map_from_runs(job_id: int):
                 JobRun.run_at.isnot(None),
                 JobRun.missed.is_(False),
                 JobRun.mail_message_id.isnot(None),
-                JobRun.run_at >= cutoff_utc,
             )
             .order_by(JobRun.run_at.desc())
             .limit(500)
@@ -723,7 +711,6 @@ def _infer_schedule_map_from_runs(job_id: int):
         tz = None
 
     counts = {i: {} for i in range(7)}  # weekday -> { "HH:MM": count }
-    run_dts = []  # Collected for cadence guard below
     for r in runs:
         if not r.run_at:
             continue
@@ -738,32 +725,12 @@ def _infer_schedule_map_from_runs(job_id: int):
         except Exception:
             pass
-        run_dts.append(dt)
         wd = dt.weekday()
         minute_bucket = (dt.minute // 15) * 15
         hh = dt.hour
         tstr = f"{hh:02d}:{minute_bucket:02d}"
         counts[wd][tstr] = int(counts[wd].get(tstr, 0)) + 1
 
-    # Cadence guard: if the median gap between runs is >= 20 days the job has a
-    # monthly (or lower) cadence. Return an empty weekly schedule so that
-    # _infer_monthly_schedule_from_runs() handles it instead.
-    if len(run_dts) >= 2:
-        sorted_dts = sorted(run_dts)
-        gaps = []
-        for i in range(1, len(sorted_dts)):
-            try:
-                delta_days = (sorted_dts[i] - sorted_dts[i - 1]).total_seconds() / 86400.0
-                if delta_days > 0:
-                    gaps.append(delta_days)
-            except Exception:
-                continue
-        if gaps:
-            gaps_sorted = sorted(gaps)
-            median_gap = gaps_sorted[len(gaps_sorted) // 2]
-            if median_gap >= 20.0:
-                return schedule  # empty — defer to monthly inference
-
     for wd in range(7):
         # Keep only buckets that occur frequently enough.
         keep = [t for t, c in counts[wd].items() if int(c) >= MIN_OCCURRENCES]
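The removed cadence guard decides weekly vs monthly inference from the median gap between consecutive runs: a median of 20 days or more means the weekly inference bows out in favour of `_infer_monthly_schedule_from_runs()`. A self-contained sketch of that computation (the helper name is illustrative):

```python
from datetime import datetime, timedelta

def median_gap_days(run_dts):
    # Median gap between consecutive runs, in days, as in the removed guard.
    dts = sorted(run_dts)
    gaps = [
        (b - a).total_seconds() / 86400.0
        for a, b in zip(dts, dts[1:])
        if (b - a).total_seconds() > 0
    ]
    if not gaps:
        return None
    gaps.sort()
    return gaps[len(gaps) // 2]

monthly = [datetime(2026, 1, 1) + timedelta(days=30 * i) for i in range(6)]
weekly = [datetime(2026, 1, 1) + timedelta(days=7 * i) for i in range(6)]
print(median_gap_days(monthly) >= 20.0)  # True  -> defer to monthly inference
print(median_gap_days(weekly) >= 20.0)   # False -> keep weekly schedule
```

Using the median rather than the mean keeps one ad-hoc extra run (a tiny gap) or one long outage (a huge gap) from flipping the classification.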
@@ -790,9 +757,6 @@ def _infer_monthly_schedule_from_runs(job_id: int):
     try:
         # Same "real run" rule as weekly inference.
-        # 180 days gives ~6 occurrences for a monthly job (enough for
-        # MIN_OCCURRENCES=3) while still discarding stale schedule data.
-        cutoff_utc = datetime.utcnow() - timedelta(days=180)
         runs = (
             JobRun.query
             .filter(
@@ -800,7 +764,6 @@ def _infer_monthly_schedule_from_runs(job_id: int):
                 JobRun.run_at.isnot(None),
                 JobRun.missed.is_(False),
                 JobRun.mail_message_id.isnot(None),
-                JobRun.run_at >= cutoff_utc,
             )
             .order_by(JobRun.run_at.asc())
             .limit(500)
@@ -1030,3 +993,4 @@ def _next_ticket_code(now_utc: datetime) -> str:
             seq = 1
     return f"{prefix}{seq:04d}"

View File

@@ -28,33 +28,17 @@ def tickets_page():
     if tab == "tickets":
         query = Ticket.query
-        joined_scope = False
         if active_only:
             query = query.filter(Ticket.resolved_at.is_(None))
         if q:
             like_q = f"%{q}%"
-            query = (
-                query
-                .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
-                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
-                .outerjoin(Job, Job.id == TicketScope.job_id)
-            )
-            joined_scope = True
             query = query.filter(
                 (Ticket.ticket_code.ilike(like_q))
                 | (Ticket.description.ilike(like_q))
-                | (Customer.name.ilike(like_q))
-                | (TicketScope.scope_type.ilike(like_q))
-                | (TicketScope.backup_software.ilike(like_q))
-                | (TicketScope.backup_type.ilike(like_q))
-                | (TicketScope.job_name_match.ilike(like_q))
-                | (Job.job_name.ilike(like_q))
             )
-            query = query.distinct()
         if customer_id or backup_software or backup_type:
-            if not joined_scope:
-                query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
+            query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
             if customer_id:
                 query = query.filter(TicketScope.customer_id == customer_id)
             if backup_software:
@@ -338,3 +322,4 @@ def ticket_detail(ticket_id: int):
         scopes=scopes,
         runs=runs,
     )
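The wider search in this hunk outer-joins `ticket_scopes` (plus `customers` and `jobs`), so one ticket with several matching scopes would come back once per scope without the trailing `.distinct()`. The effect is easy to reproduce in plain SQL (toy schema, not the application's real one):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, code TEXT);
CREATE TABLE ticket_scopes (ticket_id INTEGER, customer TEXT);
INSERT INTO tickets VALUES (1, 'T-0001');
INSERT INTO ticket_scopes VALUES (1, 'Acme'), (1, 'Acme BV');
""")

# One ticket, two scopes: the bare join yields two rows for the same ticket.
plain = conn.execute(
    "SELECT t.id FROM tickets t LEFT JOIN ticket_scopes s ON s.ticket_id = t.id"
).fetchall()
# DISTINCT collapses the fan-out back to one row per ticket.
distinct = conn.execute(
    "SELECT DISTINCT t.id FROM tickets t LEFT JOIN ticket_scopes s ON s.ticket_id = t.id"
).fetchall()
print(len(plain), len(distinct))  # 2 1
```

The `joined_scope` flag in the same hunk exists for a related reason: joining `ticket_scopes` twice (once for search, once for the filters) would raise or duplicate rows, so the second join must be conditional.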

View File

@@ -10,74 +10,29 @@ from .routes_shared import main_bp
 def user_settings():
     """User self-service settings.
 
-    Allows the logged-in user to manage password and Run Checks preferences.
+    Currently allows the logged-in user to change their own password.
     """
-
-    def _parse_bool_flag(raw: str | None, default: bool = False) -> bool:
-        if raw is None:
-            return bool(default)
-        return str(raw).strip().lower() in ("1", "true", "yes", "on")
-
-    def _normalize_sort_mode(raw: str | None) -> str:
-        v = (raw or "").strip().lower()
-        return v if v in ("customer", "status") else "customer"
-
-    def _normalize_status_filters(values: list[str] | None) -> list[str]:
-        allowed = ("critical", "missed", "warning", "success_override", "success")
-        selected = {(x or "").strip().lower() for x in (values or []) if (x or "").strip()}
-        return [k for k in allowed if k in selected]
-
-    def _prefs_payload():
-        selected = [x.strip().lower() for x in (current_user.run_checks_filter_statuses or "").split(",") if x.strip()]
-        return {
-            "run_checks_sort_mode": _normalize_sort_mode(current_user.run_checks_sort_mode),
-            "run_checks_selected_statuses": _normalize_status_filters(selected),
-            "run_checks_filter_has_ticket": bool(current_user.run_checks_filter_has_ticket),
-            "run_checks_filter_has_remark": bool(current_user.run_checks_filter_has_remark),
-            "run_checks_filter_q": (current_user.run_checks_filter_q or ""),
-        }
-
     if request.method == "POST":
-        form_name = (request.form.get("form_name") or "password").strip().lower()
-        if form_name == "run_checks_preferences":
-            current_user.run_checks_sort_mode = _normalize_sort_mode(request.form.get("run_checks_sort_mode"))
-            current_user.run_checks_filter_statuses = ",".join(
-                _normalize_status_filters(request.form.getlist("run_checks_status"))
-            )
-            current_user.run_checks_filter_has_ticket = _parse_bool_flag(
-                request.form.get("run_checks_filter_has_ticket"),
-                default=False,
-            )
-            current_user.run_checks_filter_has_remark = _parse_bool_flag(
-                request.form.get("run_checks_filter_has_remark"),
-                default=False,
-            )
-            q = (request.form.get("run_checks_filter_q") or "").strip()[:255]
-            current_user.run_checks_filter_q = q or None
-            db.session.commit()
-            flash("Run Checks preferences updated.", "success")
-            return redirect(url_for("main.user_settings"))
-
         current_password = request.form.get("current_password") or ""
         new_password = (request.form.get("new_password") or "").strip()
         confirm_password = (request.form.get("confirm_password") or "").strip()
         if not current_user.check_password(current_password):
             flash("Current password is incorrect.", "danger")
-            return render_template("main/user_settings.html", **_prefs_payload())
+            return render_template("main/user_settings.html")
         if not new_password:
             flash("New password is required.", "danger")
-            return render_template("main/user_settings.html", **_prefs_payload())
+            return render_template("main/user_settings.html")
         if new_password != confirm_password:
             flash("Passwords do not match.", "danger")
-            return render_template("main/user_settings.html", **_prefs_payload())
+            return render_template("main/user_settings.html")
         current_user.set_password(new_password)
         db.session.commit()
         flash("Password updated.", "success")
         return redirect(url_for("main.user_settings"))
-    return render_template("main/user_settings.html", **_prefs_payload())
+    return render_template("main/user_settings.html")
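The `_parse_bool_flag` and `_normalize_status_filters` helpers in this hunk normalise checkbox-style form input: HTML checkboxes arrive as strings (or not at all), and the status list must be filtered against an allow-list while keeping a stable order. Extracted as standalone functions they behave like this:

```python
def parse_bool_flag(raw, default=False):
    # Checkboxes and query flags arrive as strings; accept common truthy spellings.
    if raw is None:
        return bool(default)
    return str(raw).strip().lower() in ("1", "true", "yes", "on")

ALLOWED_STATUSES = ("critical", "missed", "warning", "success_override", "success")

def normalize_status_filters(values):
    # Lowercase, strip, drop unknown values, and emit in canonical order.
    selected = {(x or "").strip().lower() for x in (values or []) if (x or "").strip()}
    return [k for k in ALLOWED_STATUSES if k in selected]

print(parse_bool_flag("Yes"))                                   # True
print(normalize_status_filters(["success", "critical", "bogus"]))  # ['critical', 'success']
```

Iterating over the allow-list tuple rather than the submitted set is what guarantees a deterministic order regardless of how the browser serialises the form.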

View File

@ -715,71 +715,6 @@ def migrate_users_theme_preference() -> None:
print("[migrations] migrate_users_theme_preference completed.") print("[migrations] migrate_users_theme_preference completed.")
def migrate_users_run_checks_preferences() -> None:
"""Add Run Checks preference columns to users if missing."""
table = "users"
columns = [
("run_checks_sort_mode", "VARCHAR(32) NOT NULL DEFAULT 'customer'"),
("run_checks_filter_statuses", "TEXT NOT NULL DEFAULT ''"),
("run_checks_filter_has_ticket", "BOOLEAN NOT NULL DEFAULT FALSE"),
("run_checks_filter_has_remark", "BOOLEAN NOT NULL DEFAULT FALSE"),
("run_checks_filter_q", "VARCHAR(255) NULL"),
]
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for users Run Checks preferences migration: {exc}")
return
try:
with engine.begin() as conn:
for column, ddl in columns:
if _column_exists_on_conn(conn, table, column):
continue
conn.execute(text(f'ALTER TABLE "{table}" ADD COLUMN {column} {ddl}'))
conn.execute(
text(
"""
UPDATE "users"
SET run_checks_sort_mode = 'customer'
WHERE run_checks_sort_mode IS NULL OR run_checks_sort_mode = '';
"""
)
)
conn.execute(
text(
"""
UPDATE "users"
SET run_checks_filter_statuses = ''
WHERE run_checks_filter_statuses IS NULL;
"""
)
)
conn.execute(
text(
"""
UPDATE "users"
SET run_checks_filter_has_ticket = FALSE
WHERE run_checks_filter_has_ticket IS NULL;
"""
)
)
conn.execute(
text(
"""
UPDATE "users"
SET run_checks_filter_has_remark = FALSE
WHERE run_checks_filter_has_remark IS NULL;
"""
)
)
print("[migrations] migrate_users_run_checks_preferences completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate users Run Checks preferences: {exc}")
def migrate_system_settings_eml_retention() -> None:
"""Add ingest_eml_retention_days to system_settings if missing.
@ -1135,275 +1070,11 @@ def migrate_rename_admin_logs_to_audit_logs() -> None:
print("[migrations] audit_logs table will be created by db.create_all()")
def migrate_cove_accounts_table() -> None:
"""Create the cove_accounts staging table if it does not exist.
This table stores all accounts returned by Cove EnumerateAccountStatistics.
Unlinked accounts (job_id IS NULL) appear in the Cove Accounts review page.
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for cove_accounts migration: {exc}")
return
try:
with engine.begin() as conn:
conn.execute(text("""
CREATE TABLE IF NOT EXISTS cove_accounts (
id SERIAL PRIMARY KEY,
account_id INTEGER NOT NULL UNIQUE,
account_name VARCHAR(512) NULL,
computer_name VARCHAR(512) NULL,
customer_name VARCHAR(255) NULL,
datasource_types VARCHAR(255) NULL,
last_status_code INTEGER NULL,
last_run_at TIMESTAMP NULL,
colorbar_28d VARCHAR(64) NULL,
job_id INTEGER NULL REFERENCES jobs(id) ON DELETE SET NULL,
first_seen_at TIMESTAMP NOT NULL DEFAULT NOW(),
last_seen_at TIMESTAMP NOT NULL DEFAULT NOW()
)
"""))
conn.execute(text(
"CREATE INDEX IF NOT EXISTS idx_cove_accounts_account_id ON cove_accounts (account_id)"
))
conn.execute(text(
"CREATE INDEX IF NOT EXISTS idx_cove_accounts_job_id ON cove_accounts (job_id)"
))
print("[migrations] migrate_cove_accounts_table completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate cove_accounts table: {exc}")
def migrate_cove_integration() -> None:
"""Add Cove Data Protection integration columns if missing.
Adds to system_settings:
- cove_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
- cove_api_url (VARCHAR(255) NULL)
- cove_api_username (VARCHAR(255) NULL)
- cove_api_password (VARCHAR(255) NULL)
- cove_import_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
- cove_import_interval_minutes (INTEGER NOT NULL DEFAULT 30)
- cove_partner_id (INTEGER NULL)
- cove_last_import_at (TIMESTAMP NULL)
Adds to jobs:
- cove_account_id (INTEGER NULL)
Adds to job_runs:
- source_type (VARCHAR(20) NULL)
- external_id (VARCHAR(100) NULL)
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for Cove integration migration: {exc}")
return
try:
with engine.begin() as conn:
# system_settings columns
ss_columns = [
("cove_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("cove_api_url", "VARCHAR(255) NULL"),
("cove_api_username", "VARCHAR(255) NULL"),
("cove_api_password", "VARCHAR(255) NULL"),
("cove_import_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("cove_import_interval_minutes", "INTEGER NOT NULL DEFAULT 30"),
("cove_partner_id", "INTEGER NULL"),
("cove_last_import_at", "TIMESTAMP NULL"),
]
for column, ddl in ss_columns:
if _column_exists_on_conn(conn, "system_settings", column):
continue
conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))
# jobs column
if not _column_exists_on_conn(conn, "jobs", "cove_account_id"):
conn.execute(text('ALTER TABLE "jobs" ADD COLUMN cove_account_id INTEGER NULL'))
# job_runs columns
if not _column_exists_on_conn(conn, "job_runs", "source_type"):
conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN source_type VARCHAR(20) NULL'))
if not _column_exists_on_conn(conn, "job_runs", "external_id"):
conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN external_id VARCHAR(100) NULL'))
# Index for deduplication lookups
conn.execute(text(
'CREATE INDEX IF NOT EXISTS idx_job_runs_external_id ON "job_runs" (external_id)'
))
print("[migrations] migrate_cove_integration completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate Cove integration columns: {exc}")
def migrate_entra_sso_settings() -> None:
"""Add Microsoft Entra SSO columns to system_settings if missing."""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for Entra SSO migration: {exc}")
return
columns = [
("entra_sso_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("entra_tenant_id", "VARCHAR(128) NULL"),
("entra_client_id", "VARCHAR(128) NULL"),
("entra_client_secret", "VARCHAR(255) NULL"),
("entra_redirect_uri", "VARCHAR(512) NULL"),
("entra_allowed_domain", "VARCHAR(255) NULL"),
("entra_allowed_group_ids", "TEXT NULL"),
("entra_auto_provision_users", "BOOLEAN NOT NULL DEFAULT FALSE"),
]
try:
with engine.begin() as conn:
for column, ddl in columns:
if _column_exists_on_conn(conn, "system_settings", column):
continue
conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))
print("[migrations] migrate_entra_sso_settings completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate Entra SSO columns: {exc}")
def migrate_cloud_connect_accounts_table() -> None:
"""Create the cloud_connect_accounts staging table if it does not exist."""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for cloud_connect_accounts migration: {exc}")
return
try:
with engine.begin() as conn:
conn.execute(text("""
CREATE TABLE IF NOT EXISTS cloud_connect_accounts (
id SERIAL PRIMARY KEY,
"user" VARCHAR(255) NOT NULL,
section VARCHAR(32) NOT NULL,
repo_name VARCHAR(512) NULL,
repo_type VARCHAR(255) NULL,
num_items VARCHAR(64) NULL,
total_quota VARCHAR(32) NULL,
used_space VARCHAR(32) NULL,
free_space VARCHAR(32) NULL,
last_active_raw VARCHAR(64) NULL,
last_active_dt TIMESTAMP NULL,
last_status VARCHAR(32) NULL,
last_mail_message_id INTEGER NULL REFERENCES mail_messages(id) ON DELETE SET NULL,
job_id INTEGER NULL REFERENCES jobs(id) ON DELETE SET NULL,
first_seen_at TIMESTAMP NOT NULL DEFAULT NOW(),
last_seen_at TIMESTAMP NOT NULL DEFAULT NOW(),
CONSTRAINT uq_cloud_connect_accounts_user_section UNIQUE ("user", section)
)
"""))
conn.execute(text(
'CREATE INDEX IF NOT EXISTS idx_cc_accounts_user ON cloud_connect_accounts ("user")'
))
conn.execute(text(
"CREATE INDEX IF NOT EXISTS idx_cc_accounts_job_id ON cloud_connect_accounts (job_id)"
))
print("[migrations] migrate_cloud_connect_accounts_table completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate cloud_connect_accounts table: {exc}")
def migrate_cc_accounts_repo_unique_key() -> None:
"""Extend the cloud_connect_accounts unique key to include repo_name.
Old key: (user, section)
New key: (user, section, repo_name)
This allows a single user to have multiple repository entries in the Cloud Connect
daily report (e.g. both a Veeam Cloud Connect Repository and an Immutable repository),
each linked to a separate Backupchecks job.
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for cc repo key migration: {exc}")
return
try:
with engine.begin() as conn:
# Make repo_name NOT NULL with default '' (required for unique constraint).
conn.execute(text(
"UPDATE cloud_connect_accounts SET repo_name = '' WHERE repo_name IS NULL"
))
conn.execute(text(
"ALTER TABLE cloud_connect_accounts ALTER COLUMN repo_name SET NOT NULL"
))
conn.execute(text(
"ALTER TABLE cloud_connect_accounts ALTER COLUMN repo_name SET DEFAULT ''"
))
# Drop old (user, section) constraint if it still exists.
conn.execute(text(
"ALTER TABLE cloud_connect_accounts "
"DROP CONSTRAINT IF EXISTS uq_cloud_connect_accounts_user_section"
))
# Add new (user, section, repo_name) constraint if not already present.
conn.execute(text("""
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM pg_constraint
WHERE conname = 'uq_cloud_connect_accounts_user_section_repo'
) THEN
ALTER TABLE cloud_connect_accounts
ADD CONSTRAINT uq_cloud_connect_accounts_user_section_repo
UNIQUE ("user", section, repo_name);
END IF;
END $$;
"""))
print("[migrations] migrate_cc_accounts_repo_unique_key completed.")
except Exception as exc:
print(f"[migrations] Failed migrate_cc_accounts_repo_unique_key: {exc}")
def migrate_cc_remove_synthetic_missed_runs() -> None:
"""Remove synthetic missed runs that were incorrectly generated for Cloud Connect jobs.
Cloud Connect jobs do not have a fixed schedule the daily report email can arrive at
different times of day. The schedule-inference + missed-run generator would create phantom
'missed' entries when the delivery time shifted (e.g. from 18:55 to 10:24). These are now
suppressed in code; this migration cleans up any that were already stored.
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for cc missed-run cleanup: {exc}")
return
try:
with engine.begin() as conn:
result = conn.execute(text("""
DELETE FROM job_runs
WHERE missed = true
AND mail_message_id IS NULL
AND external_id IS NULL
AND job_id IN (
SELECT id FROM jobs
WHERE LOWER(backup_type) IN ('cloud connect backup', 'cloud connect agent')
)
"""))
deleted = result.rowcount
print(f"[migrations] migrate_cc_remove_synthetic_missed_runs completed ({deleted} rows removed).")
except Exception as exc:
print(f"[migrations] Failed migrate_cc_remove_synthetic_missed_runs: {exc}")
def run_migrations() -> None:
print("[migrations] Starting migrations...")
migrate_add_username_to_users()
migrate_make_email_nullable()
migrate_users_theme_preference()
migrate_users_run_checks_preferences()
migrate_system_settings_eml_retention()
migrate_system_settings_auto_import_cutoff_date()
migrate_system_settings_daily_jobs_start_date()
@ -1424,7 +1095,6 @@ def run_migrations() -> None:
migrate_object_persistence_tables()
migrate_feedback_tables()
migrate_feedback_replies_table()
migrate_feedback_attachments_table()
migrate_tickets_active_from_date()
migrate_tickets_resolved_origin()
migrate_remarks_active_from_date()
@ -1441,13 +1111,6 @@ def run_migrations() -> None:
migrate_performance_indexes()
migrate_system_settings_require_daily_dashboard_visit()
migrate_rename_admin_logs_to_audit_logs()
migrate_cove_integration()
migrate_cove_accounts_table()
migrate_cloud_connect_accounts_table()
migrate_cc_accounts_repo_unique_key()
migrate_cc_remove_synthetic_missed_runs()
migrate_entra_sso_settings()
migrate_system_settings_login_captcha()
print("[migrations] All migrations completed.")
@ -1555,37 +1218,6 @@ def migrate_system_settings_sandbox_environment() -> None:
print(f"[migrations] Failed to migrate system_settings.is_sandbox_environment: {exc}")
def migrate_system_settings_login_captcha() -> None:
"""Add login_captcha_enabled column to system_settings if missing.
Default TRUE so existing installs keep captcha enabled after the upgrade.
"""
table = "system_settings"
column = "login_captcha_enabled"
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for login_captcha_enabled migration: {exc}")
return
try:
if _column_exists(table, column):
print("[migrations] system_settings.login_captcha_enabled already exists.")
return
with engine.begin() as conn:
conn.execute(
text(
f'ALTER TABLE "{table}" ADD COLUMN {column} BOOLEAN NOT NULL DEFAULT TRUE'
)
)
print("[migrations] migrate_system_settings_login_captcha completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate system_settings.login_captcha_enabled: {exc}")
def migrate_performance_indexes() -> None:
"""Add performance indexes for frequently queried foreign key columns.
@ -1814,49 +1446,6 @@ def migrate_feedback_replies_table() -> None:
print("[migrations] Feedback replies table ensured.")
def migrate_feedback_attachments_table() -> None:
"""Ensure feedback attachments table exists.
Table:
- feedback_attachments (screenshots/images for feedback items and replies)
"""
engine = db.get_engine()
with engine.begin() as conn:
conn.execute(
text(
"""
CREATE TABLE IF NOT EXISTS feedback_attachments (
id SERIAL PRIMARY KEY,
feedback_item_id INTEGER NOT NULL REFERENCES feedback_items(id) ON DELETE CASCADE,
feedback_reply_id INTEGER REFERENCES feedback_replies(id) ON DELETE CASCADE,
filename VARCHAR(255) NOT NULL,
file_data BYTEA NOT NULL,
mime_type VARCHAR(64) NOT NULL,
file_size INTEGER NOT NULL,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
"""
)
)
conn.execute(
text(
"""
CREATE INDEX IF NOT EXISTS idx_feedback_attachments_item
ON feedback_attachments (feedback_item_id);
"""
)
)
conn.execute(
text(
"""
CREATE INDEX IF NOT EXISTS idx_feedback_attachments_reply
ON feedback_attachments (feedback_reply_id);
"""
)
)
print("[migrations] Feedback attachments table ensured.")
def migrate_tickets_active_from_date() -> None:
"""Ensure tickets.active_from_date exists and is populated.


@ -21,12 +21,6 @@ class User(db.Model, UserMixin):
role = db.Column(db.String(50), nullable=False, default="viewer")
# UI theme preference: 'auto' (follow OS), 'light', 'dark'
theme_preference = db.Column(db.String(16), nullable=False, default="auto")
# Run Checks user preferences
run_checks_sort_mode = db.Column(db.String(32), nullable=False, default="customer")
run_checks_filter_statuses = db.Column(db.Text, nullable=False, default="")
run_checks_filter_has_ticket = db.Column(db.Boolean, nullable=False, default=False)
run_checks_filter_has_remark = db.Column(db.Boolean, nullable=False, default=False)
run_checks_filter_q = db.Column(db.String(255), nullable=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
def set_password(self, password: str) -> None:
@ -123,29 +117,6 @@ class SystemSettings(db.Model):
# this is not a production environment.
is_sandbox_environment = db.Column(db.Boolean, nullable=False, default=False)
# Login page captcha (simple math question). Default True for new installs.
login_captcha_enabled = db.Column(db.Boolean, nullable=False, default=True)
# Cove Data Protection integration settings
cove_enabled = db.Column(db.Boolean, nullable=False, default=False)
cove_api_url = db.Column(db.String(255), nullable=True) # default: https://api.backup.management/jsonapi
cove_api_username = db.Column(db.String(255), nullable=True)
cove_api_password = db.Column(db.String(255), nullable=True)
cove_import_enabled = db.Column(db.Boolean, nullable=False, default=False)
cove_import_interval_minutes = db.Column(db.Integer, nullable=False, default=30)
cove_partner_id = db.Column(db.Integer, nullable=True) # stored after successful login
cove_last_import_at = db.Column(db.DateTime, nullable=True)
# Microsoft Entra SSO settings
entra_sso_enabled = db.Column(db.Boolean, nullable=False, default=False)
entra_tenant_id = db.Column(db.String(128), nullable=True)
entra_client_id = db.Column(db.String(128), nullable=True)
entra_client_secret = db.Column(db.String(255), nullable=True)
entra_redirect_uri = db.Column(db.String(512), nullable=True)
entra_allowed_domain = db.Column(db.String(255), nullable=True)
entra_allowed_group_ids = db.Column(db.Text, nullable=True) # comma/newline separated Entra Group Object IDs
entra_auto_provision_users = db.Column(db.Boolean, nullable=False, default=False)
# Autotask integration settings
autotask_enabled = db.Column(db.Boolean, nullable=False, default=False)
autotask_environment = db.Column(db.String(32), nullable=True) # sandbox | production
@ -271,9 +242,6 @@ class Job(db.Model):
auto_approve = db.Column(db.Boolean, nullable=False, default=True)
active = db.Column(db.Boolean, nullable=False, default=True)
# Cove Data Protection integration (legacy: account ID stored directly on job)
cove_account_id = db.Column(db.Integer, nullable=True) # kept for backwards compat
# Archived jobs are excluded from Daily Jobs and Run Checks.
# JobRuns remain in the database and are still included in reporting.
archived = db.Column(db.Boolean, nullable=False, default=False)
@ -322,10 +290,6 @@ class JobRun(db.Model):
reviewed_at = db.Column(db.DateTime, nullable=True)
reviewed_by_user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True)
# Import source tracking
source_type = db.Column(db.String(20), nullable=True) # NULL = email (backwards compat), "cove_api"
external_id = db.Column(db.String(100), nullable=True) # e.g. "cove-{account_id}-{run_ts}" for deduplication
# Autotask integration (Phase 4: ticket creation from Run Checks)
autotask_ticket_id = db.Column(db.Integer, nullable=True)
autotask_ticket_number = db.Column(db.String(64), nullable=True)
@ -350,81 +314,6 @@ class JobRun(db.Model):
autotask_ticket_created_by = db.relationship("User", foreign_keys=[autotask_ticket_created_by_user_id])
class CoveAccount(db.Model):
"""Staging table for Cove Data Protection accounts.
All accounts returned by EnumerateAccountStatistics are upserted here.
Unlinked accounts (job_id IS NULL) appear in the Cove Accounts page
where an admin can create or link a job the same flow as the mail Inbox.
Once linked, the importer creates JobRuns for each new session.
"""
__tablename__ = "cove_accounts"
id = db.Column(db.Integer, primary_key=True)
# Cove account identifier (unique, from AccountId field)
account_id = db.Column(db.Integer, nullable=False, unique=True)
# Account/device info from Cove columns
account_name = db.Column(db.String(512), nullable=True) # I1 device/backup name
computer_name = db.Column(db.String(512), nullable=True) # I18 computer name
customer_name = db.Column(db.String(255), nullable=True) # I8 Cove customer/partner name
datasource_types = db.Column(db.String(255), nullable=True) # I78 active datasource label
# Last known status
last_status_code = db.Column(db.Integer, nullable=True) # D09F00
last_run_at = db.Column(db.DateTime, nullable=True) # D09F15 (converted from Unix ts)
colorbar_28d = db.Column(db.String(64), nullable=True) # D09F08
# Link to a Backupchecks job (NULL = unmatched, needs review)
job_id = db.Column(db.Integer, db.ForeignKey("jobs.id"), nullable=True)
first_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
last_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
job = db.relationship("Job", backref=db.backref("cove_account", uselist=False))
class CloudConnectAccount(db.Model):
"""Staging table for Veeam Cloud Connect tenant accounts.
Each row represents one User × section (Backup / Agent) combination
as found in the Veeam Cloud Connect daily report email.
Unlinked accounts (job_id IS NULL) appear on the Cloud Connect Accounts
review page where an admin can create or link a Backupchecks job
identical to the Cove Accounts flow.
"""
__tablename__ = "cloud_connect_accounts"
id = db.Column(db.Integer, primary_key=True)
user = db.Column(db.String(255), nullable=False)
section = db.Column(db.String(32), nullable=False)
repo_name = db.Column(db.String(512), nullable=False, default="")
repo_type = db.Column(db.String(255), nullable=True)
num_items = db.Column(db.String(64), nullable=True)
total_quota = db.Column(db.String(32), nullable=True)
used_space = db.Column(db.String(32), nullable=True)
free_space = db.Column(db.String(32), nullable=True)
last_active_raw = db.Column(db.String(64), nullable=True)
last_active_dt = db.Column(db.DateTime, nullable=True)
last_status = db.Column(db.String(32), nullable=True)
last_mail_message_id = db.Column(db.Integer, db.ForeignKey("mail_messages.id"), nullable=True)
job_id = db.Column(db.Integer, db.ForeignKey("jobs.id"), nullable=True)
first_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
last_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
job = db.relationship("Job", backref=db.backref("cloud_connect_account", uselist=False))
__table_args__ = (
db.UniqueConstraint("user", "section", "repo_name", name="uq_cloud_connect_accounts_user_section_repo"),
)
class JobRunReviewEvent(db.Model):
__tablename__ = "job_run_review_events"
@ -678,23 +567,6 @@ class FeedbackReply(db.Model):
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
class FeedbackAttachment(db.Model):
__tablename__ = "feedback_attachments"
id = db.Column(db.Integer, primary_key=True)
feedback_item_id = db.Column(
db.Integer, db.ForeignKey("feedback_items.id", ondelete="CASCADE"), nullable=False
)
feedback_reply_id = db.Column(
db.Integer, db.ForeignKey("feedback_replies.id", ondelete="CASCADE"), nullable=True
)
filename = db.Column(db.String(255), nullable=False)
file_data = db.Column(db.LargeBinary, nullable=False)
mime_type = db.Column(db.String(64), nullable=False)
file_size = db.Column(db.Integer, nullable=False)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
class NewsItem(db.Model):
__tablename__ = "news_items"


@ -24,10 +24,6 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
- SSL Certificate Renewal (informational)
Subject: '3CX Notification: SSL Certificate Renewal - <host>'
Body contains an informational message about the renewal.
- Update Successful (informational)
Subject: '3CX Notification: Update Successful - <host>'
Body confirms update completion and healthy services.
"""
subject = (msg.subject or "").strip()
if not subject:
@ -42,16 +38,11 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
subject,
flags=re.IGNORECASE,
)
m_update = re.match(
r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$",
subject,
flags=re.IGNORECASE,
)
-if not m_backup and not m_ssl and not m_update:
+if not m_backup and not m_ssl:
return False, {}, []
-job_name = (m_backup or m_ssl or m_update).group(1).strip()
+job_name = (m_backup or m_ssl).group(1).strip()
body = (getattr(msg, "text_body", None) or getattr(msg, "body", None) or "")
if not body:
@ -69,17 +60,6 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
}
return True, result, []
# Update successful: store as tracked informational run
if m_update:
result = {
"backup_software": "3CX",
"backup_type": "Update",
"job_name": job_name,
"overall_status": "Success",
"overall_message": body or None,
}
return True, result, []
# Backup complete
backup_file = None
m_file = re.search(r"^\s*Backup\s+name\s*:\s*(.+?)\s*$", body, flags=re.IGNORECASE | re.MULTILINE)


@ -1,430 +1,25 @@
+/* Global layout constraints
+   - Consistent content width across all pages
+   - Optimized for 1080p while preventing further widening on higher resolutions
+*/
+/* Default pages: use more horizontal space on 1920x1080 */
+main.content-container {
+  width: min(96vw, 1840px);
+  max-width: 1840px;
+}
+/* Dashboard: keep the original width */
+main.dashboard-container {
+  width: min(90vw, 1728px);
+  max-width: 1728px;
+}
+/* Prevent long detail values (e.g., email addresses) from overlapping other fields */
+.dl-compact dt {
+  white-space: nowrap;
+}
/* ============================================================
Backupchecks Layout v2
Sidebar-first, IBM Plex, token-based design system
============================================================ */
/* ---- Design tokens ---- */
:root {
--bc-sidebar-w: 220px;
--bc-sidebar-bg: #0f1117;
--bc-sidebar-border: rgba(255,255,255,0.07);
--bc-sidebar-text: rgba(255,255,255,0.65);
--bc-sidebar-text-hover: rgba(255,255,255,0.95);
--bc-sidebar-active-bg: rgba(255,255,255,0.08);
--bc-sidebar-active-text: #fff;
--bc-sidebar-label-color: rgba(255,255,255,0.3);
--bc-logo-color: #fff;
--bc-accent: #3b82f6;
--bc-accent-dim: rgba(59,130,246,0.15);
--bc-main-bg: #f5f6f8;
--bc-main-bg-dark: #181c24;
--bc-content-max: 1720px;
--bc-radius: 8px;
--bc-font: 'IBM Plex Sans', system-ui, sans-serif;
--bc-mono: 'IBM Plex Mono', 'Cascadia Code', monospace;
--bc-transition: 150ms ease;
}
/* ---- Global resets ---- */
*, *::before, *::after { box-sizing: border-box; }
body.bc-body {
font-family: var(--bc-font);
font-size: 14px;
line-height: 1.55;
margin: 0;
padding: 0;
display: flex;
min-height: 100vh;
}
/* ============================================================
SIDEBAR
============================================================ */
.bc-sidebar {
position: fixed;
top: 0;
left: 0;
bottom: 0;
width: var(--bc-sidebar-w);
background: var(--bc-sidebar-bg);
border-right: 1px solid var(--bc-sidebar-border);
display: flex;
flex-direction: column;
z-index: 200;
overflow: hidden;
}
/* ---- Header / Logo ---- */
.bc-sidebar-header {
display: flex;
align-items: center;
justify-content: space-between;
padding: 0 14px;
height: 52px;
flex-shrink: 0;
border-bottom: 1px solid var(--bc-sidebar-border);
}
.bc-logo {
display: flex;
align-items: center;
gap: 9px;
text-decoration: none;
color: var(--bc-logo-color);
}
.bc-logo-icon { display: flex; align-items: center; color: var(--bc-accent); flex-shrink: 0; }
.bc-logo-text {
font-size: 13.5px;
font-weight: 600;
letter-spacing: -.01em;
white-space: nowrap;
}
.bc-sidebar-toggle {
display: flex;
align-items: center;
justify-content: center;
background: none;
border: none;
color: var(--bc-sidebar-text);
cursor: pointer;
padding: 4px;
border-radius: 4px;
}
.bc-sidebar-toggle:hover { color: var(--bc-sidebar-text-hover); }
/* ---- Search ---- */
.bc-sidebar-search {
padding: 10px 12px;
border-bottom: 1px solid var(--bc-sidebar-border);
flex-shrink: 0;
}
.bc-search-form {
position: relative;
display: flex;
align-items: center;
}
.bc-search-icon {
position: absolute;
left: 9px;
color: var(--bc-sidebar-label-color);
pointer-events: none;
flex-shrink: 0;
}
.bc-search-input {
width: 100%;
background: rgba(255,255,255,0.06);
border: 1px solid rgba(255,255,255,0.1);
border-radius: 6px;
color: rgba(255,255,255,0.85);
font-size: 12.5px;
font-family: var(--bc-font);
padding: 5px 9px 5px 29px;
outline: none;
transition: border-color var(--bc-transition);
}
.bc-search-input::placeholder { color: var(--bc-sidebar-label-color); }
.bc-search-input:focus { border-color: rgba(59,130,246,0.5); background: rgba(255,255,255,0.08); }
/* ---- Nav ---- */
.bc-nav-section {
flex: 1;
overflow-y: auto;
overflow-x: hidden;
padding: 8px 0;
scrollbar-width: thin;
scrollbar-color: rgba(255,255,255,0.1) transparent;
}
.bc-nav-section::-webkit-scrollbar { width: 4px; }
.bc-nav-section::-webkit-scrollbar-track { background: transparent; }
.bc-nav-section::-webkit-scrollbar-thumb { background: rgba(255,255,255,0.1); border-radius: 2px; }
.bc-nav-item {
display: flex;
align-items: center;
gap: 9px;
padding: 6px 12px;
margin: 1px 8px;
border-radius: 6px;
text-decoration: none;
color: var(--bc-sidebar-text);
font-size: 13px;
font-weight: 500;
transition: background var(--bc-transition), color var(--bc-transition);
white-space: nowrap;
}
.bc-nav-item:hover {
background: rgba(255,255,255,0.06);
color: var(--bc-sidebar-text-hover);
}
.bc-nav-item.is-active {
background: var(--bc-sidebar-active-bg);
color: var(--bc-sidebar-active-text);
}
.bc-nav-icon { display: flex; align-items: center; flex-shrink: 0; }
.bc-nav-label-text { flex: 1; overflow: hidden; text-overflow: ellipsis; }
.bc-nav-badge {
background: var(--bc-accent);
color: #fff;
font-size: 10px;
font-weight: 600;
padding: 1px 6px;
border-radius: 99px;
min-width: 18px;
text-align: center;
flex-shrink: 0;
}
.bc-nav-divider {
height: 1px;
background: var(--bc-sidebar-border);
margin: 6px 12px;
}
.bc-nav-label {
font-size: 10.5px;
font-weight: 600;
letter-spacing: .08em;
text-transform: uppercase;
color: var(--bc-sidebar-label-color);
padding: 4px 20px 2px;
}
/* ---- Footer ---- */
.bc-sidebar-footer {
border-top: 1px solid var(--bc-sidebar-border);
padding: 10px 12px;
flex-shrink: 0;
display: flex;
flex-direction: column;
gap: 8px;
}
.bc-user-block {
display: flex;
align-items: center;
justify-content: space-between;
gap: 8px;
}
.bc-user-name {
font-size: 12.5px;
font-weight: 500;
color: rgba(255,255,255,0.75);
text-decoration: none;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
}
.bc-user-name:hover { color: #fff; }
.bc-user-role {
font-size: 10.5px;
color: var(--bc-sidebar-label-color);
background: rgba(255,255,255,0.06);
border-radius: 4px;
padding: 1px 6px;
white-space: nowrap;
flex-shrink: 0;
}
.bc-footer-controls {
display: flex;
align-items: center;
gap: 6px;
}
.bc-select {
flex: 1;
background: rgba(255,255,255,0.06);
border: 1px solid rgba(255,255,255,0.1);
border-radius: 5px;
color: rgba(255,255,255,0.75);
font-size: 11.5px;
font-family: var(--bc-font);
padding: 3px 6px;
outline: none;
cursor: pointer;
min-width: 0;
}
.bc-select:focus { border-color: rgba(59,130,246,0.5); }
.bc-logout {
display: flex;
align-items: center;
color: var(--bc-sidebar-text);
flex-shrink: 0;
padding: 4px;
border-radius: 4px;
transition: color var(--bc-transition);
}
.bc-logout:hover { color: #ef4444; }
/* ============================================================
MAIN CONTENT AREA
============================================================ */
.bc-main {
flex: 1;
margin-left: var(--bc-sidebar-w);
min-width: 0;
display: flex;
flex-direction: column;
min-height: 100vh;
}
.bc-main-auth {
margin-left: 0;
}
/* ---- Topbar (mobile only) ---- */
.bc-topbar {
position: sticky;
top: 0;
z-index: 100;
height: 48px;
display: flex;
align-items: center;
gap: 12px;
padding: 0 16px;
border-bottom: 1px solid var(--bs-border-color);
background: var(--bs-body-bg);
}
.bc-hamburger {
background: none;
border: none;
color: var(--bs-body-color);
cursor: pointer;
display: flex;
align-items: center;
padding: 4px;
border-radius: 4px;
}
.bc-topbar-logo {
font-size: 14px;
font-weight: 600;
text-decoration: none;
color: var(--bs-body-color);
}
/* ---- Content ---- */
.bc-content {
flex: 1;
padding: 28px 32px;
max-width: var(--bc-content-max);
width: 100%;
}
/* ---- Overlay (mobile) ---- */
.bc-overlay {
display: none;
position: fixed;
inset: 0;
background: rgba(0,0,0,0.5);
z-index: 150;
}
.bc-overlay.is-visible { display: block; }
/* ============================================================
AUTH PAGES (login, setup)
============================================================ */
.bc-main-auth .bc-content {
display: flex;
flex-direction: column;
justify-content: center;
min-height: 100vh;
padding: 32px 16px;
max-width: 480px;
margin: 0 auto;
width: 100%;
}
/* ============================================================
MOBILE SIDEBAR
============================================================ */
@media (max-width: 991px) {
.bc-sidebar {
transform: translateX(-100%);
transition: transform 220ms cubic-bezier(.4,0,.2,1);
}
.bc-sidebar.is-open {
transform: translateX(0);
box-shadow: 4px 0 32px rgba(0,0,0,0.3);
}
.bc-main {
margin-left: 0;
}
}
.bc-main {
background: var(--bc-main-bg-dark);
}
/* ============================================================
CONTENT TYPOGRAPHY & UTILITIES
============================================================ */
/* Page headings */
.bc-content h2 {
font-size: 20px;
font-weight: 600;
margin-bottom: 20px;
letter-spacing: -.02em;
}
.bc-content h3 {
font-size: 16px;
font-weight: 600;
}
/* Tables */
.table {
font-size: 13.5px;
}
.table thead th {
font-size: 11.5px;
font-weight: 600;
letter-spacing: .04em;
text-transform: uppercase;
opacity: .6;
border-bottom-width: 1px;
white-space: nowrap;
}
/* Cards */
.card {
border-radius: var(--bc-radius);
}
/* Mono values (log lines, codes, etc.) */
code, kbd, pre, .bc-mono {
font-family: var(--bc-mono);
font-size: 12.5px;
}
/* Compact definition lists */
.dl-compact dt { white-space: nowrap; }
.dl-compact .ellipsis-field {
  min-width: 0;
  overflow: hidden;
@@ -432,6 +27,7 @@ code, kbd, pre, .bc-mono {
  white-space: nowrap;
  cursor: pointer;
}
.dl-compact .ellipsis-field.is-expanded {
  overflow: visible;
  text-overflow: clip;
@@ -439,100 +35,55 @@ code, kbd, pre, .bc-mono {
  cursor: text;
}
-/* Markdown content */
-.markdown-content { overflow-wrap: anywhere; }
-.markdown-content h1,.markdown-content h2,.markdown-content h3,
-.markdown-content h4,.markdown-content h5,.markdown-content h6 {
-  margin-top: 1.25rem;
-  margin-bottom: .75rem;
-}
-.markdown-content p { margin-bottom: .75rem; }
-.markdown-content ul, .markdown-content ol { margin-bottom: .75rem; }
+/* Markdown rendering (e.g., changelog page) */
+.markdown-content {
+  overflow-wrap: anywhere;
+}
+.markdown-content h1,
+.markdown-content h2,
+.markdown-content h3,
+.markdown-content h4,
+.markdown-content h5,
+.markdown-content h6 {
+  margin-top: 1.25rem;
+  margin-bottom: 0.75rem;
+}
+.markdown-content p {
+  margin-bottom: 0.75rem;
+}
+.markdown-content ul,
+.markdown-content ol {
+  margin-bottom: 0.75rem;
+}
.markdown-content pre {
-  padding: .75rem;
-  border-radius: .5rem;
-  background: rgba(0,0,0,.05);
+  padding: 0.75rem;
+  border-radius: 0.5rem;
+  background: rgba(0, 0, 0, 0.05);
  overflow: auto;
}
-.markdown-content code { font-size: .95em; }
-.markdown-content table { width: 100%; margin-bottom: 1rem; }
-.markdown-content table th, .markdown-content table td {
-  padding: .5rem;
-  border-top: 1px solid rgba(0,0,0,.15);
-}
+.markdown-content code {
+  font-size: 0.95em;
+}
+.markdown-content table {
+  width: 100%;
+  margin-bottom: 1rem;
+}
+.markdown-content table th,
+.markdown-content table td {
+  padding: 0.5rem;
+  border-top: 1px solid rgba(0, 0, 0, 0.15);
+}
.markdown-content blockquote {
-  border-left: .25rem solid rgba(0,0,0,.15);
-  padding-left: .75rem;
+  border-left: 0.25rem solid rgba(0, 0, 0, 0.15);
+  padding-left: 0.75rem;
  margin-left: 0;
-  color: rgba(0,0,0,.7);
+  color: rgba(0, 0, 0, 0.7);
}
/* Alerts */
.bc-alerts { max-width: 800px; }
/* Info block (was used in run_checks) */
.info-block {
font-size: 13.5px;
color: var(--bs-secondary-color);
max-width: 860px;
}
/* Legacy compatibility: content-container / dashboard-container */
main.content-container,
main.dashboard-container {
/* Replaced by .bc-content — these are no-ops now but kept for safety */
}
/* ============================================================
DASHBOARD STAT CARDS
============================================================ */
.bc-stat-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(120px, 1fr));
gap: 12px;
}
.bc-stat-card {
background: var(--bs-card-bg);
border: 1px solid var(--bs-border-color);
border-radius: var(--bc-radius);
padding: 14px 16px;
}
.bc-stat-card.bc-stat-large {
grid-column: span 2;
}
.bc-stat-label {
display: flex;
align-items: center;
gap: 5px;
font-size: 11.5px;
font-weight: 500;
opacity: .55;
margin-bottom: 5px;
}
.bc-stat-value {
font-size: 30px;
font-weight: 600;
letter-spacing: -.03em;
line-height: 1;
}
.bc-stat-sub {
font-size: 11px;
opacity: .4;
margin-top: 4px;
}
.bc-stat-success { color: var(--bs-success); }
.bc-stat-warning { color: var(--bs-warning); }
.bc-stat-failed { color: var(--bs-danger); }
.bc-stat-override{ color: var(--bs-primary); }
.bc-stat-muted { color: var(--bs-secondary); }
@media (max-width: 767px) {
.bc-stat-card.bc-stat-large { grid-column: span 1; }
.bc-stat-grid { grid-template-columns: repeat(2, 1fr); }
}

View File

@@ -8,13 +8,13 @@
  top: 30px;
  left: -60px;
  width: 250px;
-  background-color: rgba(220, 53, 69, 0.45);
-  color: rgba(255, 255, 255, 0.9);
+  background-color: #dc3545;
+  color: white;
  text-align: center;
  transform: rotate(-45deg);
  z-index: 9999;
  pointer-events: none; /* Banner itself is not clickable, elements behind it are */
-  box-shadow: 0 4px 8px rgba(0, 0, 0, 0.12);
+  box-shadow: 0 4px 8px rgba(0, 0, 0, 0.3);
}
.sandbox-banner-text {

View File

@@ -1,6 +1,7 @@
{% extends "layout/base.html" %}
{% block content %}
-<div>
+<div class="row justify-content-center">
+  <div class="col-md-4">
  <h2 class="mb-3">Login</h2>
  <form method="post">
    <div class="mb-3">
@@ -10,7 +11,7 @@
        class="form-control"
        id="username"
        name="username"
-        value="{{ username or '' }}"
+        value="{{ email or '' }}"
        required
      />
    </div>
@@ -24,7 +25,6 @@
        required
      />
    </div>
-    {% if captcha_enabled %}
    <div class="mb-3">
      <label class="form-label">Captcha: {{ captcha_question }}</label>
      <input
@@ -34,17 +34,11 @@
        required
      />
    </div>
-    {% endif %}
    <button type="submit" class="btn btn-primary w-100">Login</button>
    <div class="mt-3 text-center">
      <a class="btn btn-link" href="{{ url_for('auth.password_reset_request') }}">Forgot password?</a>
    </div>
  </form>
-  {% if entra_sso_enabled %}
-  <div class="my-3"><hr /></div>
-  <a class="btn btn-outline-secondary w-100" href="{{ url_for('auth.entra_login') }}">
-    Sign in with Microsoft
-  </a>
-  {% endif %}
+  </div>
</div>
{% endblock %}

View File

@@ -4,6 +4,8 @@
<link rel="stylesheet" href="{{ url_for('static', filename='css/documentation.css') }}">
{% endblock %}
+{% block main_class %}container-fluid content-container{% endblock %}
{% block content %}
<div class="documentation-container">
  <div class="row g-0">

View File

@@ -1,121 +0,0 @@
{% extends "documentation/base.html" %}
{% block doc_content %}
<h1>Microsoft Entra SSO</h1>
<p>Use Microsoft Entra ID (Azure AD) to let users sign in with their Microsoft account.</p>
<div class="doc-callout doc-callout-warning">
<strong>Status: Untested in Backupchecks.</strong>
This SSO implementation has not yet been end-to-end validated in Backupchecks itself.
Treat this page as implementation guidance for future rollout, not as a confirmed production setup.
</div>
<div class="doc-callout doc-callout-info">
<strong>Scope:</strong> this page explains the setup for Backupchecks and Microsoft Entra.
It does not replace your internal identity/security policies.
</div>
<h2>Prerequisites</h2>
<ul>
<li>Admin access to your Microsoft Entra tenant.</li>
<li>Admin access to Backupchecks <strong>Settings → Integrations</strong>.</li>
<li>A stable HTTPS URL for Backupchecks (recommended for production).</li>
</ul>
<h2>Step 1: Register an app in Microsoft Entra</h2>
<ol>
<li>Open <strong>Microsoft Entra admin center</strong><strong>App registrations</strong>.</li>
<li>Create a new registration (single-tenant is typical for internal use).</li>
<li>Set a name, for example <code>Backupchecks SSO</code>.</li>
<li>After creation, copy:
<ul>
<li><strong>Application (client) ID</strong></li>
<li><strong>Directory (tenant) ID</strong></li>
</ul>
</li>
</ol>
<h2>Step 2: Configure redirect URI</h2>
<ol>
<li>In the app registration, open <strong>Authentication</strong>.</li>
<li>Add a <strong>Web</strong> redirect URI:
<ul>
<li><code>https://your-backupchecks-domain/auth/entra/callback</code></li>
</ul>
</li>
<li>Save the authentication settings.</li>
</ol>
<h2>Step 3: Create client secret</h2>
<ol>
<li>Open <strong>Certificates &amp; secrets</strong> in the app registration.</li>
<li>Create a new client secret.</li>
<li>Copy the secret value immediately (it is shown only once).</li>
</ol>
<h2>Step 4: Configure Backupchecks</h2>
<ol>
<li>Open <strong>Settings → Integrations → Microsoft Entra SSO</strong>.</li>
<li>Enable <strong>Microsoft sign-in</strong>.</li>
<li>Fill in:
<ul>
<li><strong>Tenant ID</strong></li>
<li><strong>Client ID</strong></li>
<li><strong>Client Secret</strong></li>
<li><strong>Redirect URI</strong> (optional override, leave empty to auto-use callback URL)</li>
<li><strong>Allowed domain/tenant</strong> (optional restriction)</li>
<li><strong>Allowed Entra Group Object ID(s)</strong> (optional but recommended)</li>
</ul>
</li>
<li>Optional: enable <strong>Auto-provision unknown users as Viewer</strong>.</li>
<li>Save settings.</li>
</ol>
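For reference, the Tenant ID, Client ID and Redirect URI you enter here end up in a standard Microsoft identity platform (v2.0) authorization-code request. A minimal sketch of how that sign-in URL is composed — the function name and parameter values are illustrative, not Backupchecks' actual implementation:

```python
from urllib.parse import urlencode

def build_authorize_url(tenant_id: str, client_id: str,
                        redirect_uri: str, state: str) -> str:
    """Compose a Microsoft identity platform (v2.0) authorize request."""
    params = {
        "client_id": client_id,          # Application (client) ID
        "response_type": "code",         # authorization-code flow
        "redirect_uri": redirect_uri,    # must match the Entra app exactly
        "response_mode": "query",
        "scope": "openid profile email",
        "state": state,                  # CSRF protection token
    }
    return (
        "https://login.microsoftonline.com/"
        + tenant_id
        + "/oauth2/v2.0/authorize?"
        + urlencode(params)
    )
```

If any of these values differ from the app registration (most commonly the redirect URI), Entra rejects the request before the user ever reaches Backupchecks.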
<h2>Security Group Restriction (recommended)</h2>
<p>You can enforce that only members of one or more specific Entra security groups can sign in.</p>
<ol>
<li>Create or choose a security group in Entra (for example <code>Backupchecks-Users</code>).</li>
<li>Add the allowed users to that group.</li>
<li>Copy the group <strong>Object ID</strong> (not display name).</li>
<li>Paste one or more group object IDs in:
<ul>
<li><strong>Settings → Integrations → Microsoft Entra SSO → Allowed Entra Group Object ID(s)</strong></li>
</ul>
</li>
<li>In the Entra app registration, configure <strong>Token configuration</strong> to include the <code>groups</code> claim in ID tokens.</li>
</ol>
<div class="doc-callout doc-callout-warning">
<strong>Important:</strong> if users are members of many groups, Entra may return a "group overage" token without an inline
<code>groups</code> list. In that case Backupchecks cannot verify membership, and login is blocked by design.
</div>
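The group check above boils down to inspecting the <code>groups</code> claim of the ID token and denying when it is absent. A minimal sketch of that logic — function names are hypothetical, and signature verification (mandatory in production) is deliberately omitted:

```python
import base64
import json

def decode_claims(id_token: str) -> dict:
    """Decode the JWT payload only. Production code must first verify
    the token signature against the tenant's published signing keys."""
    payload = id_token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def group_allowed(id_token: str, allowed_group_ids: set) -> bool:
    claims = decode_claims(id_token)
    groups = claims.get("groups")
    if groups is None:
        # Group overage: Entra omitted the inline "groups" claim, so
        # membership cannot be verified from the token alone -> deny.
        return False
    # Allow when the user is in at least one configured group.
    return bool(set(groups) & allowed_group_ids)
```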
<h2>Step 5: Test sign-in</h2>
<ol>
<li>Open <strong>/auth/login</strong> in a private/incognito browser session.</li>
<li>Click <strong>Sign in with Microsoft</strong>.</li>
<li>Authenticate with an allowed account.</li>
<li>Confirm you are redirected back into Backupchecks.</li>
</ol>
<h2>User mapping behavior</h2>
<ul>
<li>Backupchecks first tries to match the Entra user to a local user by username/email.</li>
<li>If no match exists:
<ul>
<li>With auto-provision disabled: login is rejected.</li>
<li>With auto-provision enabled: a new local user is created with <strong>Viewer</strong> role.</li>
</ul>
</li>
</ul>
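The mapping rules above amount to a small decision function. A hedged sketch — the helpers <code>find_user</code> and <code>create_viewer</code> are hypothetical stand-ins for the real lookup/provisioning code:

```python
def resolve_local_user(entra_email, find_user, create_viewer, auto_provision):
    """Map an authenticated Entra identity onto a local account."""
    user = find_user(entra_email)       # match by username/email
    if user is not None:
        return user
    if not auto_provision:
        return None                     # no match -> login rejected
    return create_viewer(entra_email)   # provision new user with Viewer role
```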
<h2>Troubleshooting</h2>
<ul>
<li><strong>Redirect URI mismatch:</strong> ensure the Entra app redirect URI exactly matches the Backupchecks callback URI.</li>
<li><strong>SSO button not visible:</strong> check that SSO is enabled and Tenant/Client/Secret are saved.</li>
<li><strong>Account not allowed:</strong> verify tenant/domain restriction in <em>Allowed domain/tenant</em>.</li>
<li><strong>Group restricted login fails:</strong> verify group object IDs and ensure the ID token includes a <code>groups</code> claim.</li>
<li><strong>No local user mapping:</strong> create a matching local user or enable auto-provision.</li>
</ul>
{% endblock %}

View File

@@ -1,276 +1,364 @@
{# ===== ICON MACROS ===== #}
{% macro icon_grid() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="1" width="5.5" height="5.5" rx="1" fill="currentColor"/><rect x="8.5" y="1" width="5.5" height="5.5" rx="1" fill="currentColor" opacity=".5"/><rect x="1" y="8.5" width="5.5" height="5.5" rx="1" fill="currentColor" opacity=".5"/><rect x="8.5" y="8.5" width="5.5" height="5.5" rx="1" fill="currentColor" opacity=".25"/></svg>{% endmacro %}
{% macro icon_inbox() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M1 9.5l2.5-5h8L14 9.5V13a1 1 0 01-1 1H2a1 1 0 01-1-1V9.5z" stroke="currentColor" stroke-width="1.3"/><path d="M1 9.5h3.5a2 2 0 004 0H14" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_check() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M2.5 7.5l3.5 3.5 6.5-6.5" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/></svg>{% endmacro %}
{% macro icon_calendar() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="3" width="13" height="11" rx="1.5" stroke="currentColor" stroke-width="1.3"/><path d="M5 1v4M10 1v4M1 7h13" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_bars() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M2 4h11M2 7.5h8M2 11h5" stroke="currentColor" stroke-width="1.5" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_building() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="3" width="9" height="11" rx="1" stroke="currentColor" stroke-width="1.3"/><path d="M10 6h3a1 1 0 011 1v7H10" stroke="currentColor" stroke-width="1.3"/><rect x="3" y="6" width="2" height="2" rx=".5" fill="currentColor"/><rect x="7" y="6" width="2" height="2" rx=".5" fill="currentColor"/><rect x="3" y="10" width="2" height="2" rx=".5" fill="currentColor"/><rect x="7" y="10" width="2" height="2" rx=".5" fill="currentColor"/></svg>{% endmacro %}
{% macro icon_server() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="2" width="13" height="4" rx="1" stroke="currentColor" stroke-width="1.3"/><rect x="1" y="9" width="13" height="4" rx="1" stroke="currentColor" stroke-width="1.3"/><circle cx="11.5" cy="4" r=".8" fill="currentColor"/><circle cx="11.5" cy="11" r=".8" fill="currentColor"/></svg>{% endmacro %}
{% macro icon_ticket() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M1 5a1 1 0 011-1h11a1 1 0 011 1v1.5a2 2 0 000 4V12a1 1 0 01-1 1H2a1 1 0 01-1-1v-1.5a2 2 0 000-4V5z" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_shield() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M7.5 1.5l5 2v4c0 2.5-2 4.5-5 6-3-1.5-5-3.5-5-6v-4l5-2z" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_cloud() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M3.5 10a3.5 3.5 0 01-.5-7 4.5 4.5 0 018.5 1.5A3 3 0 0112.5 10H3.5z" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_book() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M3 2h8a1 1 0 011 1v10a1 1 0 01-1 1H3a1 1 0 01-1-1V3a1 1 0 011-1z" stroke="currentColor" stroke-width="1.3"/><path d="M5 5.5h5M5 8h5M5 10.5h3" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_clock() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><circle cx="7.5" cy="7.5" r="5.5" stroke="currentColor" stroke-width="1.3"/><path d="M7.5 4.5v3l2 1.5" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_chat() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M1 3a1 1 0 011-1h11a1 1 0 011 1v7a1 1 0 01-1 1H5l-3 3V3z" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_mail() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="3" width="13" height="9" rx="1" stroke="currentColor" stroke-width="1.3"/><path d="M1 4l6.5 5L14 4" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_trash() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M3 4h9M5 4V2.5a.5.5 0 01.5-.5h4a.5.5 0 01.5.5V4M6 7v4M9 7v4" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/><rect x="2" y="4" width="11" height="9" rx="1" stroke="currentColor" stroke-width="1.3"/></svg>{% endmacro %}
{% macro icon_archive() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="1" y="2" width="13" height="3" rx="1" stroke="currentColor" stroke-width="1.3"/><path d="M2 5v7a1 1 0 001 1h9a1 1 0 001-1V5" stroke="currentColor" stroke-width="1.3"/><path d="M5.5 8.5h4" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_cog() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><circle cx="7.5" cy="7.5" r="2" stroke="currentColor" stroke-width="1.3"/><path d="M7.5 1.5v1.3M7.5 12.2v1.3M1.5 7.5h1.3M12.2 7.5h1.3M3.3 3.3l.9.9M10.8 10.8l.9.9M3.3 11.7l.9-.9M10.8 4.2l.9-.9" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_log() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><rect x="2" y="2" width="11" height="11" rx="1" stroke="currentColor" stroke-width="1.3"/><path d="M5 5h5M5 7.5h5M5 10h3" stroke="currentColor" stroke-width="1.3" stroke-linecap="round"/></svg>{% endmacro %}
{% macro icon_code() %}<svg width="15" height="15" viewBox="0 0 15 15" fill="none"><path d="M4.5 5L1 7.5 4.5 10M10.5 5L14 7.5 10.5 10M8.5 3l-2 9" stroke="currentColor" stroke-width="1.3" stroke-linecap="round" stroke-linejoin="round"/></svg>{% endmacro %}
{# ===== NAV ITEM MACRO ===== #}
{% macro bc_nav_item(endpoint, label, icon_svg, badge=none, startswith=none) %}
{% set _active = (startswith and request.path.startswith(startswith)) or (not startswith and request.path == url_for(endpoint)) %}
<a class="bc-nav-item {% if _active %}is-active{% endif %}" href="{{ url_for(endpoint) }}">
<span class="bc-nav-icon">{{ icon_svg }}</span>
<span class="bc-nav-label-text">{{ label }}</span>
{% if badge %}<span class="bc-nav-badge">{{ badge }}</span>{% endif %}
</a>
{% endmacro %}
<!doctype html>
-<html lang="en" data-bs-theme="dark" data-theme="dark">
+{% set _theme_pref = (current_user.theme_preference if current_user.is_authenticated else 'auto') %}
+<html lang="en" data-theme-preference="{{ _theme_pref }}">
<head>
  <meta charset="utf-8" />
-  <title>{% block title %}Backupchecks{% endblock %}</title>
+  <title>Backupchecks</title>
  <meta name="viewport" content="width=device-width, initial-scale=1" />
-  <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" rel="stylesheet" />
-  <link rel="preconnect" href="https://fonts.googleapis.com" />
-  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
-  <link href="https://fonts.googleapis.com/css2?family=IBM+Plex+Mono:wght@400;500&family=IBM+Plex+Sans:wght@400;500;600&display=swap" rel="stylesheet" />
+  <link
+    href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css"
+    rel="stylesheet"
+  />
  <link rel="stylesheet" href="{{ url_for('static', filename='css/layout.css') }}" />
  <link rel="stylesheet" href="{{ url_for('static', filename='css/status-text.css') }}" />
  <link rel="stylesheet" href="{{ url_for('static', filename='css/sandbox.css') }}" />
  <link rel="icon" type="image/x-icon" href="{{ url_for('static', filename='favicon.ico') }}" />
  {% block head %}{% endblock %}
</head>
<body class="bc-body">
<script>
(function () {
try {
var root = document.documentElement;
var pref = root.getAttribute('data-theme-preference') || 'auto';
var mq = window.matchMedia ? window.matchMedia('(prefers-color-scheme: dark)') : null;
function applyTheme() {
var theme = pref;
if (pref === 'auto') {
theme = (mq && mq.matches) ? 'dark' : 'light';
}
root.setAttribute('data-bs-theme', theme);
}
applyTheme();
if (mq && typeof mq.addEventListener === 'function') {
mq.addEventListener('change', function () {
if ((root.getAttribute('data-theme-preference') || 'auto') === 'auto') {
applyTheme();
}
});
} else if (mq && typeof mq.addListener === 'function') {
// Safari fallback
mq.addListener(function () {
if ((root.getAttribute('data-theme-preference') || 'auto') === 'auto') {
applyTheme();
}
});
}
} catch (e) {
// no-op
}
})();
</script>
</head>
<body>
{% if system_settings and system_settings.is_sandbox_environment %}
-<div class="sandbox-banner"><span class="sandbox-banner-text">SANDBOX</span></div>
+<div class="sandbox-banner">
+  <span class="sandbox-banner-text">SANDBOX</span>
+</div>
{% endif %}
-{% if current_user.is_authenticated %}
-<!-- SIDEBAR -->
-<nav class="bc-sidebar" id="bcSidebar">
-  <div class="bc-sidebar-header">
-    <a class="bc-logo" href="{{ url_for('main.dashboard') }}">
-      <span class="bc-logo-icon">
-        <img src="{{ url_for('static', filename='favicon.ico') }}" width="20" height="20" alt="" />
-      </span>
-      <span class="bc-logo-text">Backupchecks</span>
-    </a>
-    <button class="bc-sidebar-toggle d-lg-none" id="bcSidebarClose" aria-label="Close menu">
-      <svg width="18" height="18" viewBox="0 0 18 18" fill="none"><path d="M2 2l14 14M16 2L2 16" stroke="currentColor" stroke-width="2" stroke-linecap="round"/></svg>
-    </button>
-  </div>
+<nav class="navbar navbar-expand-lg fixed-top bg-body-tertiary border-bottom">
+  <div class="container-fluid">
+    <a class="navbar-brand" href="{{ url_for('main.dashboard') }}">Backupchecks</a>
+    <button
+      class="navbar-toggler"
+      type="button"
+      data-bs-toggle="collapse"
+      data-bs-target="#navbarNav"
+      aria-controls="navbarNav"
+      aria-expanded="false"
+      aria-label="Toggle navigation"
+    >
+      <span class="navbar-toggler-icon"></span>
+    </button>
+    <div class="collapse navbar-collapse" id="navbarNav">
{% if current_user.is_authenticated %}
<ul class="navbar-nav me-auto mb-2 mb-lg-0">
{% if active_role == 'reporter' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.reports') }}">Reports</a>
</li>
<li class="nav-item">
<a class="nav-link {% if request.path.startswith('/documentation') %}active{% endif %}" href="{{ url_for('documentation.index') }}">
<span class="nav-icon">📖</span> Documentation
</a>
</li>
<li class="nav-item">
<a class="nav-link" href='{{ url_for("main.changelog_page") }}'>Changelog</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.feedback_page') }}">Feedback</a>
</li>
{% else %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.inbox') }}">Inbox</a>
</li>
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.admin_all_mails') }}">All Mail</a>
</li>
{% endif %}
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.inbox_deleted_mails') }}">Deleted mails</a>
</li>
{% endif %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.customers') }}">Customers</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.jobs') }}">Jobs</a>
</li>
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.archived_jobs') }}">Archived Jobs</a>
</li>
{% endif %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.daily_jobs') }}">Daily Jobs</a>
</li>
{% if active_role in ('admin', 'operator') %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.run_checks_page') }}">Run Checks</a>
</li>
{% endif %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.tickets_page') }}">Tickets</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.overrides') }}">Overrides</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.reports') }}">Reports</a>
</li>
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.settings') }}">Settings</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.logging_page') }}">Logging</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.parsers_overview') }}">Parsers</a>
</li>
{% endif %}
<li class="nav-item">
<a class="nav-link {% if request.path.startswith('/documentation') %}active{% endif %}" href="{{ url_for('documentation.index') }}">
<span class="nav-icon">📖</span> Documentation
</a>
</li>
<li class="nav-item">
<a class="nav-link" href='{{ url_for("main.changelog_page") }}'>Changelog</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.feedback_page') }}">Feedback</a>
</li>
{% endif %}
</ul>
<span class="navbar-text me-3">
<a class="text-decoration-none" href="{{ url_for('main.user_settings') }}">
{{ current_user.username }} ({{ active_role }})
</a>
</span>
<div class="bc-sidebar-search">
<form method="get" action="{{ url_for('main.search_page') }}" autocomplete="off" class="bc-search-form">
<svg class="bc-search-icon" width="14" height="14" viewBox="0 0 14 14" fill="none"><circle cx="6" cy="6" r="4.5" stroke="currentColor" stroke-width="1.5"/><path d="M10 10l2.5 2.5" stroke="currentColor" stroke-width="1.5" stroke-linecap="round"/></svg>
<input
class="bc-search-input"
type="search"
name="q"
placeholder="Search..."
aria-label="Search"
value="{{ request.args.get('q','') if request.path == url_for('main.search_page') else '' }}"
/>
</form>
</div>
<div class="bc-nav-section">
{% if active_role == 'reporter' %}
{{ bc_nav_item('main.reports', 'Reports', icon_bars()) }}
{{ bc_nav_item('documentation.index', 'Documentation', icon_book(), startswith='/documentation') }}
{{ bc_nav_item('main.changelog_page', 'Changelog', icon_clock()) }}
{{ bc_nav_item('main.feedback_page', 'Feedback', icon_chat()) }}
{% else %}
{{ bc_nav_item('main.dashboard', 'Dashboard', icon_grid()) }}
{{ bc_nav_item('main.inbox', 'Inbox', icon_inbox(), badge=inbox_count if inbox_count is defined and inbox_count > 0 else none) }}
{% if active_role in ('admin', 'operator') %}
{{ bc_nav_item('main.run_checks_page', 'Run Checks', icon_check()) }}
{% endif %}
{{ bc_nav_item('main.daily_jobs', 'Daily Jobs', icon_calendar()) }}
{{ bc_nav_item('main.reports', 'Reports', icon_bars()) }}
<div class="bc-nav-divider"></div>
<div class="bc-nav-label">Manage</div>
{% if active_role != 'viewer' %}
{{ bc_nav_item('main.customers', 'Customers', icon_building()) }}
{{ bc_nav_item('main.jobs', 'Jobs', icon_server()) }}
{% else %}
{{ bc_nav_item('main.customers', 'Customers', icon_building()) }}
{{ bc_nav_item('main.jobs', 'Jobs', icon_server()) }}
{% endif %}
{{ bc_nav_item('main.tickets_page', 'Tickets', icon_ticket()) }}
{{ bc_nav_item('main.overrides', 'Overrides', icon_shield()) }}
{% if system_settings and system_settings.cove_enabled and active_role in ('admin', 'operator') %}
{{ bc_nav_item('main.cove_accounts', 'Cove Accounts', icon_cloud()) }}
{% endif %}
{% if active_role in ('admin', 'operator') %}
{{ bc_nav_item('main.cloud_connect_accounts', 'Cloud Connect', icon_server()) }}
{% endif %}
<div class="bc-nav-divider"></div>
<div class="bc-nav-label">Info</div>
{{ bc_nav_item('documentation.index', 'Documentation', icon_book(), startswith='/documentation') }}
{{ bc_nav_item('main.changelog_page', 'Changelog', icon_clock()) }}
{{ bc_nav_item('main.feedback_page', 'Feedback', icon_chat()) }}
{% if active_role == 'admin' %}
<div class="bc-nav-divider"></div>
<div class="bc-nav-label">Admin</div>
{{ bc_nav_item('main.admin_all_mails', 'All Mail', icon_mail()) }}
{{ bc_nav_item('main.inbox_deleted_mails', 'Deleted Mail', icon_trash()) }}
{{ bc_nav_item('main.archived_jobs', 'Archived Jobs', icon_archive()) }}
{{ bc_nav_item('main.settings', 'Settings', icon_cog()) }}
{{ bc_nav_item('main.logging_page', 'Logging', icon_log()) }}
{{ bc_nav_item('main.parsers_overview', 'Parsers', icon_code()) }}
{% endif %}
{% endif %}
</div>
<div class="bc-sidebar-footer">
<div class="bc-user-block">
<a class="bc-user-name" href="{{ url_for('main.user_settings') }}">{{ current_user.username }}</a>
<span class="bc-user-role">{{ active_role }}</span>
</div>
<div class="bc-footer-controls">
 {% if current_user.is_authenticated and user_roles|length > 1 %}
-<form method="post" action="{{ url_for('main.set_active_role_route') }}" class="bc-role-form">
-  <select class="bc-select" name="active_role" aria-label="Role" onchange="this.form.submit()">
+<form method="post" action="{{ url_for('main.set_active_role_route') }}" class="me-2">
+  <select
+    class="form-select form-select-sm"
+    name="active_role"
+    aria-label="Role"
+    onchange="this.form.submit()"
+    style="min-width: 10rem; width: auto;"
+  >
   {% for r in user_roles %}
   <option value="{{ r }}" {% if r == active_role %}selected{% endif %}>{{ r|capitalize }}</option>
   {% endfor %}
   </select>
 </form>
 {% endif %}
-<a class="bc-logout" href="{{ url_for('auth.logout') }}" title="Logout">
-  <svg width="16" height="16" viewBox="0 0 16 16" fill="none"><path d="M6 2H3a1 1 0 00-1 1v10a1 1 0 001 1h3M10 11l3-3-3-3M13 8H6" stroke="currentColor" stroke-width="1.5" stroke-linecap="round" stroke-linejoin="round"/></svg>
-</a>
+<form method="post" action="{{ url_for('main.set_theme_preference') }}" class="me-2">
+  <select
+    class="form-select form-select-sm"
+    name="theme"
+    aria-label="Theme"
+    onchange="this.form.submit()"
+    style="width: auto;"
+  >
+    <option value="light" {% if _theme_pref == 'light' %}selected{% endif %}>Light</option>
+    <option value="dark" {% if _theme_pref == 'dark' %}selected{% endif %}>Dark</option>
+    <option value="auto" {% if _theme_pref == 'auto' %}selected{% endif %}>Auto</option>
+  </select>
+</form>
+<a class="btn btn-outline-secondary" href="{{ url_for('auth.logout') }}">Logout</a>
+{% endif %}
 </div>
 </div>
 </nav>
-<!-- Mobile overlay -->
-<div class="bc-overlay" id="bcOverlay"></div>
-{% endif %}
-<!-- MAIN CONTENT -->
-<div class="bc-main {% if not current_user.is_authenticated %}bc-main-auth{% endif %}">
-  {% if current_user.is_authenticated %}
-  <header class="bc-topbar d-lg-none">
-    <button class="bc-hamburger" id="bcHamburger" aria-label="Open menu">
-      <svg width="20" height="20" viewBox="0 0 20 20" fill="none"><path d="M3 5h14M3 10h14M3 15h14" stroke="currentColor" stroke-width="1.8" stroke-linecap="round"/></svg>
-    </button>
-    <a class="bc-topbar-logo" href="{{ url_for('main.dashboard') }}">Backupchecks</a>
-  </header>
-  {% endif %}
-  <div class="bc-content {% block content_class %}{% endblock %}">
-    {% with messages = get_flashed_messages(with_categories=true) %}
-    {% if messages %}
-    <div class="bc-alerts mb-4">
-      {% for category, message in messages %}
-      <div class="alert alert-{{ category }} alert-dismissible fade show" role="alert">
-        {{ message }}
-        <button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
-      </div>
-      {% endfor %}
-    </div>
-    {% endif %}
-    {% endwith %}
-    {% block content %}{% endblock %}
-  </div>
-</div>
+<main class="{% block main_class %}container content-container{% endblock %}" id="main-content">
+  {% with messages = get_flashed_messages(with_categories=true) %}
+  {% if messages %}
+  <div class="mb-3">
+    {% for category, message in messages %}
+    <div class="alert alert-{{ category }} alert-dismissible fade show" role="alert">
+      {{ message }}
+      <button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
+    </div>
+    {% endfor %}
+  </div>
+  {% endif %}
+  {% endwith %}
+  {% block content %}{% endblock %}
+</main>
 <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js"></script>
 <script>
-(function () {
-  // Sidebar mobile toggle
-  var sidebar = document.getElementById('bcSidebar');
-  var overlay = document.getElementById('bcOverlay');
-  var hamburger = document.getElementById('bcHamburger');
-  var closeBtn = document.getElementById('bcSidebarClose');
-  function openSidebar() {
-    if (!sidebar) return;
-    sidebar.classList.add('is-open');
-    if (overlay) overlay.classList.add('is-visible');
-    document.body.classList.add('sidebar-open');
-  }
-  function closeSidebar() {
-    if (!sidebar) return;
-    sidebar.classList.remove('is-open');
-    if (overlay) overlay.classList.remove('is-visible');
-    document.body.classList.remove('sidebar-open');
-  }
-  if (hamburger) hamburger.addEventListener('click', openSidebar);
-  if (closeBtn) closeBtn.addEventListener('click', closeSidebar);
-  if (overlay) overlay.addEventListener('click', closeSidebar);
-  // Ellipsis field behavior (preserved from original)
-  function isOverflowing(el) {
-    try { return el && el.scrollWidth > el.clientWidth; } catch(e) { return false; }
-  }
-  function setEllipsisTitle(el) {
-    if (!el || el.classList.contains('is-expanded')) return;
-    var txt = (el.textContent || '').trim();
-    if (!txt) { el.removeAttribute('title'); return; }
-    if (isOverflowing(el)) { el.setAttribute('title', txt); } else { el.removeAttribute('title'); }
-  }
-  function collapseExpandedEllipsis(root) {
-    try {
-      if (!root || !root.querySelectorAll) return;
-      root.querySelectorAll('.ellipsis-field.is-expanded').forEach(function(el) {
-        el.classList.remove('is-expanded'); setEllipsisTitle(el);
-      });
-    } catch(e) {}
-  }
-  document.addEventListener('click', function(e) {
-    var el = e.target;
-    if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
-    if (e.target.closest && e.target.closest('a, button, input, select, textarea, label')) return;
-    el.classList.toggle('is-expanded');
-    if (el.classList.contains('is-expanded')) { el.removeAttribute('title'); } else { setEllipsisTitle(el); }
-  });
-  document.addEventListener('dblclick', function(e) {
-    var el = e.target;
-    if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
-    el.classList.add('is-expanded'); el.removeAttribute('title');
-    try {
-      var range = document.createRange(); range.selectNodeContents(el);
-      var sel = window.getSelection(); sel.removeAllRanges(); sel.addRange(range);
-    } catch(err) {}
-  });
-  document.addEventListener('mouseover', function(e) {
-    var el = e.target;
-    if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
-    setEllipsisTitle(el);
-  });
-  document.addEventListener('show.bs.modal', function(e) { collapseExpandedEllipsis(e.target); });
-  document.addEventListener('hidden.bs.modal', function(e) { collapseExpandedEllipsis(e.target); });
-  document.addEventListener('show.bs.offcanvas', function(e) { collapseExpandedEllipsis(e.target); });
-  document.addEventListener('hidden.bs.offcanvas', function(e) { collapseExpandedEllipsis(e.target); });
-})();
+// Dynamic navbar height adjustment
+(function () {
+  function adjustContentPadding() {
+    try {
+      var navbar = document.querySelector('.navbar.fixed-top');
+      var mainContent = document.getElementById('main-content');
+      if (!navbar || !mainContent) return;
+      // Get actual navbar height
+      var navbarHeight = navbar.offsetHeight;
+      // Add small buffer (20px) for visual spacing
+      var paddingTop = navbarHeight + 20;
+      // Apply padding to main content
+      mainContent.style.paddingTop = paddingTop + 'px';
+    } catch (e) {
+      // Fallback to 80px if something goes wrong
+      var mainContent = document.getElementById('main-content');
+      if (mainContent) {
+        mainContent.style.paddingTop = '80px';
+      }
+    }
+  }
+  // Run on page load
+  if (document.readyState === 'loading') {
+    document.addEventListener('DOMContentLoaded', adjustContentPadding);
+  } else {
+    adjustContentPadding();
+  }
+  // Run after navbar is fully rendered
+  window.addEventListener('load', adjustContentPadding);
+  // Run on window resize
+  var resizeTimeout;
+  window.addEventListener('resize', function () {
+    clearTimeout(resizeTimeout);
+    resizeTimeout = setTimeout(adjustContentPadding, 100);
+  });
+  // Run when navbar collapse is toggled
+  var navbarCollapse = document.getElementById('navbarNav');
+  if (navbarCollapse) {
+    navbarCollapse.addEventListener('shown.bs.collapse', adjustContentPadding);
+    navbarCollapse.addEventListener('hidden.bs.collapse', adjustContentPadding);
+  }
+})();
 </script>
-{% block scripts %}{% endblock %}
+<script>
(function () {
function isOverflowing(el) {
try {
return el && el.scrollWidth > el.clientWidth;
} catch (e) {
return false;
}
}
function collapseExpandedEllipsis(root) {
try {
if (!root || !root.querySelectorAll) return;
var expanded = root.querySelectorAll('.ellipsis-field.is-expanded');
if (!expanded || !expanded.length) return;
expanded.forEach(function (el) {
el.classList.remove('is-expanded');
setEllipsisTitle(el);
});
} catch (e) {
// no-op
}
}
function setEllipsisTitle(el) {
if (!el || el.classList.contains('is-expanded')) {
return;
}
var txt = (el.textContent || '').trim();
if (!txt) {
el.removeAttribute('title');
return;
}
if (isOverflowing(el)) {
el.setAttribute('title', txt);
} else {
el.removeAttribute('title');
}
}
document.addEventListener('click', function (e) {
var el = e.target;
if (!el) return;
if (!el.classList || !el.classList.contains('ellipsis-field')) return;
// Ignore clicks on interactive children
if (e.target.closest && e.target.closest('a, button, input, select, textarea, label')) return;
el.classList.toggle('is-expanded');
if (el.classList.contains('is-expanded')) {
el.removeAttribute('title');
} else {
setEllipsisTitle(el);
}
});
document.addEventListener('dblclick', function (e) {
var el = e.target;
if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
// Expand on double click and select all text
el.classList.add('is-expanded');
el.removeAttribute('title');
try {
var range = document.createRange();
range.selectNodeContents(el);
var sel = window.getSelection();
sel.removeAllRanges();
sel.addRange(range);
} catch (err) {
// no-op
}
});
document.addEventListener('mouseover', function (e) {
var el = e.target;
if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
setEllipsisTitle(el);
});
// Ensure expanded fields do not persist between popup/modal openings.
document.addEventListener('show.bs.modal', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('hidden.bs.modal', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('show.bs.offcanvas', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('hidden.bs.offcanvas', function (e) {
collapseExpandedEllipsis(e.target);
});
})();
</script>
 </body>
 </html>


@ -1,14 +1,14 @@
 {% extends "layout/base.html" %}
-{% block head %}
 <style>
 .modal-xxl { max-width: 98vw; }
 @media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } }
 #msg_body_container_iframe { height: 55vh; }
 #msg_objects_container { max-height: 25vh; overflow: auto; }
 .filter-card .form-label { font-size: 0.85rem; }
 </style>
-{% endblock %}
 {# Pager macro must be defined before it is used #}
 {% macro pager(position, page, total_pages, has_prev, has_next, filter_params) -%}


@ -9,7 +9,7 @@
 <div class="row">
 <!-- Sidebar with version navigation -->
 <div class="col-lg-3 col-md-4 d-none d-md-block">
-<div class="changelog-nav sticky-top" style="top: 24px;">
+<div class="changelog-nav sticky-top" style="top: 80px;">
 <h6 class="text-body-secondary text-uppercase mb-3">Versions</h6>
 <nav class="nav flex-column">
 {% for version_data in changelog_versions %}


@ -1,278 +0,0 @@
{% extends "layout/base.html" %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">Cloud Connect Accounts</h2>
<form method="post" action="{{ url_for('main.cloud_connect_scan_inbox') }}">
<button type="submit" class="btn btn-sm btn-outline-secondary">Scan inbox mails</button>
</form>
</div>
{# ── Unmatched accounts ─────────────────────────────────────────────────── #}
{% if unmatched %}
<h4 class="mb-2">Unmatched <span class="badge bg-warning text-dark">{{ unmatched|length }}</span></h4>
<p class="text-muted small mb-3">Click a row to create a new job or link to an existing one.</p>
<div class="table-responsive mb-4">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>User</th>
<th>Section</th>
<th>Repository</th>
<th>Used / Quota</th>
<th>Free</th>
<th>Last active</th>
<th>Status</th>
<th>First seen</th>
</tr>
</thead>
<tbody>
{% for acc in unmatched %}
<tr class="cc-unmatched-row"
style="cursor: pointer;"
data-id="{{ acc.id }}"
data-user="{{ acc.user | e }}"
data-section="{{ acc.section | e }}"
data-backup-type="{{ acc.derived_backup_type | e }}"
data-job-name="{{ acc.derived_job_name | e }}">
<td class="fw-semibold">{{ acc.user }}</td>
<td><span class="badge bg-secondary">{{ acc.section }}</span></td>
<td class="text-muted small">{{ acc.repo_name or '—' }}<br><span class="text-muted" style="font-size:11px;">{{ acc.repo_type or '' }}</span></td>
<td class="text-muted small">{{ acc.used_space or '—' }} / {{ acc.total_quota or '—' }}</td>
<td class="text-muted small">{{ acc.free_space or '—' }}</td>
<td class="text-muted small">{{ acc.last_active_raw or '—' }}</td>
<td>
{% if acc.last_status == 'Failed' %}
<span class="badge bg-danger">Failed</span>
{% elif acc.last_status == 'Warning' %}
<span class="badge bg-warning text-dark">Warning</span>
{% else %}
<span class="badge bg-success">Success</span>
{% endif %}
</td>
<td class="text-muted small">{{ acc.first_seen_at|local_datetime }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="alert alert-success mb-4">
<strong>All accounts matched.</strong> No unmatched Cloud Connect accounts.
</div>
{% endif %}
{# ── Matched accounts ───────────────────────────────────────────────────── #}
{% if matched %}
<h4 class="mb-2">Linked <span class="badge bg-success">{{ matched|length }}</span></h4>
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>User</th>
<th>Section</th>
<th>Repository</th>
<th>Used / Quota</th>
<th>Free</th>
<th>Last active</th>
<th>Status</th>
<th>Linked job</th>
<th></th>
</tr>
</thead>
<tbody>
{% for acc in matched %}
<tr>
<td class="fw-semibold">{{ acc.user }}</td>
<td><span class="badge bg-secondary">{{ acc.section }}</span></td>
<td class="text-muted small">{{ acc.repo_name or '—' }}<br><span style="font-size:11px;">{{ acc.repo_type or '' }}</span></td>
<td class="text-muted small">{{ acc.used_space or '—' }} / {{ acc.total_quota or '—' }}</td>
<td class="text-muted small">{{ acc.free_space or '—' }}</td>
<td class="text-muted small">{{ acc.last_active_raw or '—' }}</td>
<td>
{% if acc.last_status == 'Failed' %}
<span class="badge bg-danger">Failed</span>
{% elif acc.last_status == 'Warning' %}
<span class="badge bg-warning text-dark">Warning</span>
{% else %}
<span class="badge bg-success">Success</span>
{% endif %}
</td>
<td>
{% if acc.job %}
<a href="{{ url_for('main.job_detail', job_id=acc.job.id) }}">
{{ acc.job.customer.name ~ ' ' if acc.job.customer else '' }}{{ acc.job.job_name }}
</a>
{% else %}—{% endif %}
</td>
<td>
<form method="post"
action="{{ url_for('main.cloud_connect_account_unlink', cc_account_db_id=acc.id) }}"
onsubmit="return confirm('Remove link for {{ acc.user | e }} ({{ acc.section | e }})?');"
class="mb-0">
<button type="submit" class="btn btn-sm btn-outline-secondary">Unlink</button>
</form>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
{% if not unmatched and not matched %}
<div class="alert alert-info">
No Cloud Connect accounts found yet. They appear here automatically after the first daily report email is imported.
</div>
{% endif %}
{# ── Shared link/create modal ───────────────────────────────────────────── #}
<div class="modal fade" id="ccLinkModal" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="ccLinkModalTitle">Link account</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<ul class="nav nav-tabs mb-3" role="tablist">
<li class="nav-item" role="presentation">
<button class="nav-link active" data-bs-toggle="tab"
data-bs-target="#ccTabCreate" type="button">
Create new job
</button>
</li>
<li class="nav-item" role="presentation">
<button class="nav-link" data-bs-toggle="tab"
data-bs-target="#ccTabExisting" type="button">
Link to existing job
</button>
</li>
</ul>
<div class="tab-content">
{# Tab 1: Create new job #}
<div class="tab-pane fade show active" id="ccTabCreate">
<form id="ccCreateForm" method="post" action="">
<input type="hidden" name="action" value="create" />
<input type="hidden" id="ccCustomerId" name="customer_id" />
<div class="mb-3">
<label class="form-label">Customer <span class="text-danger">*</span></label>
<input type="text" id="ccCustomerInput" class="form-control"
list="ccCustomerList" placeholder="Select customer…"
autocomplete="off" />
<datalist id="ccCustomerList">
{% for c in customers %}
<option value="{{ c.name | e }}"></option>
{% endfor %}
</datalist>
</div>
<dl class="row mb-3">
<dt class="col-5">Backup software</dt>
<dd class="col-7">Veeam</dd>
<dt class="col-5">Backup type</dt>
<dd class="col-7" id="ccDisplayBackupType"></dd>
<dt class="col-5">Job name</dt>
<dd class="col-7" id="ccDisplayJobName"></dd>
</dl>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Create job &amp; link</button>
</div>
</form>
</div>
{# Tab 2: Link to existing job #}
<div class="tab-pane fade" id="ccTabExisting">
<form id="ccLinkExistingForm" method="post" action="">
<input type="hidden" name="action" value="link" />
<div class="mb-3">
<label class="form-label">Job <span class="text-danger">*</span></label>
<select class="form-select" name="job_id" required>
<option value="">Select job…</option>
{% for j in jobs %}
<option value="{{ j.id }}">
{{ j.customer.name ~ ' ' if j.customer else '' }}{{ j.backup_software }} / {{ j.job_name }}
</option>
{% endfor %}
</select>
</div>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Link to job</button>
</div>
</form>
</div>
</div>
</div>
</div>
</div>
</div>
<script>
(function () {
var customers = {{ customers | tojson | safe }};
function findCustomerIdByName(name) {
for (var i = 0; i < customers.length; i++) {
if (customers[i].name === name) return customers[i].id;
}
return null;
}
function attachHandlers() {
var rows = document.querySelectorAll('.cc-unmatched-row');
var modalEl = document.getElementById('ccLinkModal');
if (!modalEl) return;
var modal = new bootstrap.Modal(modalEl);
var linkUrlTpl = "{{ url_for('main.cloud_connect_account_link', cc_account_db_id=0) }}";
rows.forEach(function (row) {
row.addEventListener('click', function () {
var id = row.getAttribute('data-id');
var user = row.getAttribute('data-user');
var section = row.getAttribute('data-section');
var backupType = row.getAttribute('data-backup-type');
var jobName = row.getAttribute('data-job-name') || user;
var linkUrl = linkUrlTpl.replace('0', id);
document.getElementById('ccLinkModalTitle').textContent = user + ' (' + section + ')';
document.getElementById('ccDisplayBackupType').textContent = backupType;
document.getElementById('ccDisplayJobName').textContent = jobName;
var customerInput = document.getElementById('ccCustomerInput');
var customerIdField = document.getElementById('ccCustomerId');
if (customerInput) customerInput.value = '';
if (customerIdField) customerIdField.value = '';
var createForm = document.getElementById('ccCreateForm');
var linkForm = document.getElementById('ccLinkExistingForm');
if (createForm) {
createForm.action = linkUrl;
createForm.onsubmit = function (ev) {
var cid = findCustomerIdByName(customerInput ? customerInput.value : '');
if (!cid) {
ev.preventDefault();
alert('Please select an existing customer name from the list.');
return false;
}
if (customerIdField) customerIdField.value = String(cid);
};
}
if (linkForm) linkForm.action = linkUrl;
modal.show();
});
});
}
document.addEventListener('DOMContentLoaded', attachHandlers);
})();
</script>
{% endblock %}


@ -1,299 +0,0 @@
{% extends "layout/base.html" %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">Cove Accounts</h2>
<div class="d-flex gap-2">
{% if settings.cove_partner_id %}
<form method="post" action="{{ url_for('main.settings_cove_run_now') }}" class="mb-0">
<button type="submit" class="btn btn-sm btn-outline-primary">Run import now</button>
</form>
{% endif %}
<a href="{{ url_for('main.settings', section='integrations') }}" class="btn btn-sm btn-outline-secondary">Cove settings</a>
</div>
</div>
{% if settings.cove_last_import_at %}
<p class="text-muted small mb-3">Last import: {{ settings.cove_last_import_at|local_datetime }}</p>
{% else %}
<p class="text-muted small mb-3">No import has run yet. Click <strong>Run import now</strong> to fetch Cove accounts.</p>
{% endif %}
{# ── Unmatched accounts (need a job) ─────────────────────────────────────── #}
{% if unmatched %}
<h4 class="mb-2">Unmatched <span class="badge bg-warning text-dark">{{ unmatched|length }}</span></h4>
<p class="text-muted small mb-3">Click a row to create a new job or link to an existing one.</p>
<div class="table-responsive mb-4">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>Backup software</th>
<th>Type</th>
<th>Job name</th>
<th>Computer</th>
<th>Customer (Cove)</th>
<th>Datasources</th>
<th>Last status</th>
<th>Last run</th>
<th>First seen</th>
</tr>
</thead>
<tbody>
{% for acc in unmatched %}
<tr class="cove-unmatched-row"
style="cursor: pointer;"
data-id="{{ acc.id }}"
data-job-name="{{ acc.derived_job_name | e }}"
data-backup-software="{{ acc.derived_backup_software | e }}"
data-backup-type="{{ acc.derived_backup_type | e }}"
data-cove-customer="{{ acc.customer_name | e if acc.customer_name else '' }}">
<td>{{ acc.derived_backup_software }}</td>
<td>{{ acc.derived_backup_type }}</td>
<td>{{ acc.derived_job_name }}</td>
<td class="text-muted small">{{ acc.computer_name or '—' }}</td>
<td>{{ acc.customer_name or '—' }}</td>
<td class="text-muted small">{{ acc.datasource_display }}</td>
<td>
{% if acc.last_status_code is not none %}
<span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
{{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
</span>
{% else %}—{% endif %}
</td>
<td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
<td class="text-muted small">{{ acc.first_seen_at|local_datetime }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="alert alert-success mb-4">
<strong>All accounts matched.</strong>
{% if not settings.cove_last_import_at %}
Run an import first to see Cove accounts here.
{% else %}
No unmatched Cove accounts.
{% endif %}
</div>
{% endif %}
{# ── Matched accounts ────────────────────────────────────────────────────── #}
{% if matched %}
<h4 class="mb-2">Linked <span class="badge bg-success">{{ matched|length }}</span></h4>
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>Backup software</th>
<th>Type</th>
<th>Job name</th>
<th>Computer</th>
<th>Customer (Cove)</th>
<th>Datasources</th>
<th>Last status</th>
<th>Last run</th>
<th>Linked job</th>
<th></th>
</tr>
</thead>
<tbody>
{% for acc in matched %}
<tr>
<td>{{ acc.derived_backup_software }}</td>
<td>{{ acc.derived_backup_type }}</td>
<td>{{ acc.derived_job_name }}</td>
<td class="text-muted small">{{ acc.computer_name or '—' }}</td>
<td>{{ acc.customer_name or '—' }}</td>
<td class="text-muted small">{{ acc.datasource_display }}</td>
<td>
{% if acc.last_status_code is not none %}
<span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
{{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
</span>
{% else %}—{% endif %}
</td>
<td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
<td>
{% if acc.job %}
<a href="{{ url_for('main.job_detail', job_id=acc.job.id) }}">
{{ acc.job.customer.name ~ ' ' if acc.job.customer else '' }}{{ acc.job.job_name }}
</a>
{% else %}—{% endif %}
</td>
<td>
<form method="post"
action="{{ url_for('main.cove_account_unlink', cove_account_db_id=acc.id) }}"
onsubmit="return confirm('Remove link between this Cove account and the job?');"
class="mb-0">
<button type="submit" class="btn btn-sm btn-outline-secondary">Unlink</button>
</form>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
{% if not unmatched and not matched %}
<div class="alert alert-info">
No Cove accounts found. Run an import first via the button above or via Settings → Integrations → Cove.
</div>
{% endif %}
{# ── Shared link/create modal ─────────────────────────────────────────────── #}
<div class="modal fade" id="coveLinkModal" tabindex="-1" aria-hidden="true">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="coveLinkModalTitle">Link Cove account</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<ul class="nav nav-tabs mb-3" id="coveLinkTabs" role="tablist">
<li class="nav-item" role="presentation">
<button class="nav-link active" data-bs-toggle="tab"
data-bs-target="#coveCreateTab" type="button">Create new job</button>
</li>
<li class="nav-item" role="presentation">
<button class="nav-link" data-bs-toggle="tab"
data-bs-target="#coveExistingTab" type="button">Link to existing</button>
</li>
</ul>
<div class="tab-content">
{# Tab 1: Create new job #}
<div class="tab-pane fade show active" id="coveCreateTab">
<form method="post" id="coveCreateForm">
<input type="hidden" name="action" value="create" />
<input type="hidden" name="customer_id" id="coveCustomerId" />
<div class="mb-3">
<label class="form-label">Customer <span class="text-danger">*</span></label>
<input type="text" class="form-control" id="coveCustomerInput"
list="coveCustomerList" placeholder="Type customer name…" autocomplete="off" />
<datalist id="coveCustomerList">
{% for c in customers %}
<option value="{{ c.name | e }}"></option>
{% endfor %}
</datalist>
</div>
<dl class="row mb-3 dl-compact">
<dt class="col-5">Job name</dt>
<dd class="col-7" id="coveDisplayJobName"></dd>
<dt class="col-5">Backup software</dt>
<dd class="col-7" id="coveDisplayBackupSoftware"></dd>
<dt class="col-5">Backup type</dt>
<dd class="col-7" id="coveDisplayBackupType"></dd>
</dl>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Create job &amp; link</button>
</div>
</form>
</div>
{# Tab 2: Link to existing job #}
<div class="tab-pane fade" id="coveExistingTab">
<form method="post" id="coveLinkExistingForm">
<input type="hidden" name="action" value="link" />
<div class="mb-3">
<label class="form-label">Job <span class="text-danger">*</span></label>
<select class="form-select" name="job_id" required>
<option value="">Select job…</option>
{% for j in jobs %}
<option value="{{ j.id }}">
{{ j.customer.name ~ ' ' if j.customer else '' }}{{ j.backup_software }} / {{ j.job_name }}
</option>
{% endfor %}
</select>
</div>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Link to job</button>
</div>
</form>
</div>
</div>{# /tab-content #}
</div>
</div>
</div>
</div>
<script>
document.addEventListener('DOMContentLoaded', function () {
var modalEl = document.getElementById('coveLinkModal');
if (!modalEl) return;
var modal = new bootstrap.Modal(modalEl);
var customers = {{ customers | tojson | safe }};
function findCustomerIdByName(name) {
var n = (name || '').trim().toLowerCase();
for (var i = 0; i < customers.length; i++) {
if (customers[i].name.toLowerCase() === n) return customers[i].id;
}
return null;
}
var linkUrlTpl = "{{ url_for('main.cove_account_link', cove_account_db_id=0) }}";
document.querySelectorAll('tr.cove-unmatched-row').forEach(function (row) {
row.addEventListener('click', function () {
var id = row.getAttribute('data-id');
var jobName = row.getAttribute('data-job-name') || '';
var backupSoftware = row.getAttribute('data-backup-software') || '';
var backupType = row.getAttribute('data-backup-type') || '';
var coveCustomer = row.getAttribute('data-cove-customer') || '';
var linkUrl = linkUrlTpl.replace('0', id);
document.getElementById('coveLinkModalTitle').textContent = jobName;
document.getElementById('coveDisplayJobName').textContent = jobName;
document.getElementById('coveDisplayBackupSoftware').textContent = backupSoftware;
document.getElementById('coveDisplayBackupType').textContent = backupType;
// Pre-fill customer if Cove's customer name matches an existing customer
var customerInput = document.getElementById('coveCustomerInput');
var customerIdField = document.getElementById('coveCustomerId');
if (customerInput) customerInput.value = '';
if (customerIdField) customerIdField.value = '';
if (coveCustomer && customerInput) {
var matchedId = findCustomerIdByName(coveCustomer);
if (matchedId) {
customerInput.value = coveCustomer;
customerIdField.value = String(matchedId);
} else {
customerInput.value = coveCustomer;
}
}
var createForm = document.getElementById('coveCreateForm');
var linkForm = document.getElementById('coveLinkExistingForm');
if (createForm) {
createForm.action = linkUrl;
createForm.onsubmit = function (ev) {
var cid = findCustomerIdByName(customerInput ? customerInput.value : '');
if (!cid) {
ev.preventDefault();
alert('Please select an existing customer name from the list.');
return false;
}
if (customerIdField) customerIdField.value = String(cid);
};
}
if (linkForm) linkForm.action = linkUrl;
modal.show();
});
});
});
</script>
{% endblock %}


@ -15,10 +15,6 @@
 <form method="post" action="{{ url_for('main.customers_import') }}" enctype="multipart/form-data" class="d-flex align-items-center gap-2 mb-0">
   <input type="file" name="file" accept=".csv,text/csv" class="form-control form-control-sm" required style="max-width: 420px;" />
-  <div class="form-check mb-0">
-    <input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_customers" name="include_autotask_ids" />
-    <label class="form-check-label small" for="include_autotask_ids_customers">Include Autotask IDs</label>
-  </div>
   <button type="submit" class="btn btn-outline-secondary btn-sm" style="white-space: nowrap;">Import CSV</button>
 </form>
@ -49,11 +45,7 @@
 {% if customers %}
 {% for c in customers %}
 <tr>
-  <td>
-    <a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="link-primary text-decoration-none">
-      {{ c.name }}
-    </a>
-  </td>
+  <td>{{ c.name }}</td>
   <td>
   {% if c.active %}
   <span class="badge bg-success">Active</span>


@ -4,9 +4,6 @@
 <h2 class="mb-3">Daily Jobs</h2>
 <form method="get" class="row g-3 mb-3">
-  {% if q %}
-  <input type="hidden" name="q" value="{{ q }}" />
-  {% endif %}
   <div class="col-auto">
     <label for="dj_date" class="form-label">Date</label>
     <input
@ -668,7 +665,7 @@ if (tStatus) tStatus.textContent = '';
 });
 }
 function attachDailyJobsHandlers() {
 var rows = document.querySelectorAll(".daily-job-row");
 if (!rows.length) {
 return;
@ -774,43 +771,9 @@
 });
 }
-function autoOpenJobFromQuery() {
-  try {
-    var params = new URLSearchParams(window.location.search || "");
-    var openJobId = (params.get("open_job_id") || "").trim();
-    if (!openJobId) {
-      return;
-    }
-    var rows = document.querySelectorAll(".daily-job-row");
-    var targetRow = null;
-    rows.forEach(function (row) {
-      if ((row.getAttribute("data-job-id") || "") === openJobId) {
-        targetRow = row;
-      }
-    });
-    if (!targetRow) {
-      return;
-    }
-    targetRow.click();
-    params.delete("open_job_id");
-    var nextQuery = params.toString();
-    var nextUrl = window.location.pathname + (nextQuery ? ("?" + nextQuery) : "");
-    if (window.history && window.history.replaceState) {
-      window.history.replaceState({}, document.title, nextUrl);
-    }
-  } catch (e) {
-    // no-op
-  }
-}
 document.addEventListener("DOMContentLoaded", function () {
   bindInlineCreateForms();
   attachDailyJobsHandlers();
-  autoOpenJobFromQuery();
 });
 })();
 </script>

View File

@ -1,71 +1,97 @@
{% extends "layout/base.html" %} {% extends "layout/base.html" %}
{% block main_class %}container dashboard-container{% endblock %}
{% block content %} {% block content %}
<h2 class="mb-4">Dashboard</h2> <h2 class="mb-4">Dashboard</h2>
<div class="bc-stat-grid mb-4"> <div class="row g-3 mb-4">
<div class="bc-stat-card bc-stat-large"> <div class="col-12 col-md-3">
<div class="bc-stat-label"> <div class="card h-100">
<svg width="13" height="13" viewBox="0 0 15 15" fill="none"><path d="M1 9.5l2.5-5h8L14 9.5V13a1 1 0 01-1 1H2a1 1 0 01-1-1V9.5z" stroke="currentColor" stroke-width="1.3"/><path d="M1 9.5h3.5a2 2 0 004 0H14" stroke="currentColor" stroke-width="1.3"/></svg> <div class="card-body">
Inbox <div class="text-muted">Inbox</div>
<div class="display-6 mb-0">{{ inbox_count }}</div>
<div class="text-muted small mt-2">Open items</div>
</div>
</div> </div>
<div class="bc-stat-value">{{ inbox_count }}</div>
<div class="bc-stat-sub">Open items</div>
</div> </div>
<div class="bc-stat-card"> <div class="col-12 col-md-9">
<div class="bc-stat-label"><span class="status-dot dot-success me-1" aria-hidden="true"></span>Success</div> <div class="row g-3">
<div class="bc-stat-value bc-stat-success">{{ jobs_success_count }}</div> <div class="col-6 col-lg-2">
</div> <div class="card h-100">
<div class="card-body">
<div class="bc-stat-card"> <div class="text-muted"><span class="status-dot dot-success me-2" aria-hidden="true"></span>Success</div>
<div class="bc-stat-label"><span class="status-dot dot-override me-1" aria-hidden="true"></span>Success (override)</div> <div class="display-6 mb-0">{{ jobs_success_count }}</div>
<div class="bc-stat-value bc-stat-override">{{ jobs_success_override_count }}</div> </div>
</div> </div>
</div>
<div class="bc-stat-card"> <div class="col-6 col-lg-2">
<div class="bc-stat-label"><span class="status-dot dot-expected me-1" aria-hidden="true"></span>Expected</div> <div class="card h-100">
<div class="bc-stat-value bc-stat-muted">{{ jobs_expected_count }}</div> <div class="card-body">
</div> <div class="text-muted"><span class="status-dot dot-override me-2" aria-hidden="true"></span><span class="text-nowrap">Success (override)</span></div>
<div class="display-6 mb-0">{{ jobs_success_override_count }}</div>
<div class="bc-stat-card"> </div>
<div class="bc-stat-label"><span class="status-dot dot-warning me-1" aria-hidden="true"></span>Warning</div> </div>
<div class="bc-stat-value bc-stat-warning">{{ jobs_warning_count }}</div> </div>
</div> <div class="col-6 col-lg-2">
<div class="card h-100">
<div class="bc-stat-card"> <div class="card-body">
<div class="bc-stat-label"><span class="status-dot dot-failed me-1" aria-hidden="true"></span>Failed</div> <div class="text-muted"><span class="status-dot dot-expected me-2" aria-hidden="true"></span>Expected</div>
<div class="bc-stat-value bc-stat-failed">{{ jobs_error_count }}</div> <div class="display-6 mb-0">{{ jobs_expected_count }}</div>
</div> </div>
</div>
<div class="bc-stat-card"> </div>
<div class="bc-stat-label"><span class="status-dot dot-missed me-1" aria-hidden="true"></span>Missed</div> <div class="col-6 col-lg-2">
<div class="bc-stat-value bc-stat-muted">{{ jobs_missed_count }}</div> <div class="card h-100">
</div> <div class="card-body">
</div> <div class="text-muted"><span class="status-dot dot-warning me-2" aria-hidden="true"></span>Warning</div>
<div class="display-6 mb-0">{{ jobs_warning_count }}</div>
<div class="card mb-3"> </div>
<div class="card-header py-2"><span class="fw-semibold" style="font-size:13px;">Legend</span></div> </div>
<div class="card-body py-2"> </div>
<div class="d-flex flex-wrap gap-x-4 gap-y-1" style="column-gap:2rem;row-gap:.35rem;"> <div class="col-6 col-lg-2">
<div class="small"><span class="status-dot dot-success me-2" aria-hidden="true"></span><strong>Success</strong> — job run completed successfully</div> <div class="card h-100">
<div class="small"><span class="status-dot dot-failed me-2" aria-hidden="true"></span><strong>Failed</strong> — job run failed, action required</div> <div class="card-body">
<div class="small"><span class="status-dot dot-warning me-2" aria-hidden="true"></span><strong>Warning</strong> — job run completed with a warning</div> <div class="text-muted"><span class="status-dot dot-failed me-2" aria-hidden="true"></span>Failed</div>
<div class="small"><span class="status-dot dot-missed me-2" aria-hidden="true"></span><strong>Missed</strong> — job run expected but did not execute</div> <div class="display-6 mb-0">{{ jobs_error_count }}</div>
<div class="small"><span class="status-dot dot-expected me-2" aria-hidden="true"></span><strong>Expected</strong> — job run not yet due</div> </div>
<div class="small"><span class="status-dot dot-override me-2" aria-hidden="true"></span><strong>Success (override)</strong> — marked as successful via override</div> </div>
</div>
<div class="col-6 col-lg-2">
<div class="card h-100">
<div class="card-body">
<div class="text-muted"><span class="status-dot dot-missed me-2" aria-hidden="true"></span>Missed</div>
<div class="display-6 mb-0">{{ jobs_missed_count }}</div>
</div>
</div>
</div>
</div> </div>
</div> </div>
</div> </div>
</div>
<div class="card mb-4">
<div class="card-header">Legend</div>
<div class="card-body">
<div class="d-flex flex-column gap-2 small">
<div><span class="status-dot dot-success me-2" aria-hidden="true"></span><strong>Success</strong> — job run completed successfully</div>
<div><span class="status-dot dot-failed me-2" aria-hidden="true"></span><strong>Failed</strong> — job run failed, action required</div>
<div><span class="status-dot dot-warning me-2" aria-hidden="true"></span><strong>Warning</strong> — job run completed with a warning</div>
<div><span class="status-dot dot-missed me-2" aria-hidden="true"></span><strong>Missed</strong> — job run expected but did not execute</div>
<div><span class="status-dot dot-expected me-2" aria-hidden="true"></span><strong>Expected</strong> — job run not yet due</div>
<div><span class="status-dot dot-override me-2" aria-hidden="true"></span><strong>Success (override)</strong> — marked as successful via override</div>
</div>
</div>
</div>
{% if news_items %} {% if news_items %}
<div class="card mb-3"> <div class="card mb-4">
<div class="card-header d-flex align-items-center justify-content-between py-2"> <div class="card-header d-flex align-items-center justify-content-between">
<span class="fw-semibold" style="font-size:13px;">News</span> <span>News</span>
{% if active_role == 'admin' %} {% if active_role == 'admin' %}
<a class="btn btn-sm btn-outline-secondary" href="{{ url_for('main.settings', section='news') }}">Manage</a> <a class="btn btn-sm btn-outline-secondary" href="{{ url_for('main.settings', section='news') }}">Manage</a>
{% endif %} {% endif %}
</div> </div>
<div class="card-body py-3"> <div class="card-body">
<div class="d-flex flex-column gap-3"> <div class="d-flex flex-column gap-3">
{% for item in news_items %} {% for item in news_items %}
<div class="border rounded p-3"> <div class="border rounded p-3">
@ -97,20 +123,46 @@
</div> </div>
</div> </div>
{% endif %} {% endif %}
<div class="mt-3 small text-muted">
<p>Backupchecks provides a centralized and consistent overview of the health and reliability of all backups within your environment. The platform collects backup results from multiple backup solutions and normalizes them into a single, clear and consistent status model. This enables teams to monitor backup quality across different vendors and environments in a predictable and uniform way.</p>
<p>Backup results are imported and evaluated automatically. Each backup run is analyzed and assigned a status such as Success, Warning, Failed, or Success (override). These statuses are determined by interpreting exit codes, detected error messages, log content, and configured rules, ensuring that the reported outcome reflects the real operational impact rather than raw technical output alone.</p>
<p>The dashboard provides an at-a-glance overview of the current backup situation:</p>
<ul>
<li>A consolidated summary of all monitored backup jobs and their latest known results.</li>
<li>Clear counters for successful, warning, failed, and overridden runs.</li>
<li>Immediate visibility into environments that require attention or follow-up.</li>
</ul>
<p>The Daily Jobs view shows the most recent run per backup job, grouped by customer, backup software, and backup type. This view is intended for high-level monitoring and trend awareness. It reflects the latest state of each job, but it does not replace the daily operational review process.</p>
<p>Daily operational validation is performed from the Run Checks page. This page acts as the primary workspace for reviewing backup runs. All runs that require attention are listed here, allowing operators to systematically review results and decide on the appropriate next step. The main objective of this process is to actively review backup runs and keep the Run Checks page clear.</p>
<p>When reviewing a run, operators assess whether the result is acceptable, requires follow-up, or can be treated as successful. A run can be marked as reviewed once it has been checked, even if additional actions are required. Marking a run as reviewed confirms that the result has been acknowledged and assessed, and prevents it from repeatedly appearing as unprocessed.</p>
<p>If a backup run requires further investigation or corrective action, operators can add a remark or reference an external ticket number. After adding this information, the run can still be marked as reviewed, ensuring that it no longer blocks daily checks.</p>
<p>Reviewed runs that require follow-up retain their status until they are explicitly marked as resolved. The reviewed state remains in place to indicate that the run has been handled operationally, while the resolved state confirms that the underlying issue has been fully addressed.</p>
<p>Overrides can be applied during this process when a warning or error is known, accepted, or considered non-critical. Overrides allow such runs to be treated as successful for reporting and dashboard purposes, while preserving the original messages and maintaining a full audit trail.</p>
<p>The ultimate goal of the Run Checks workflow is to maintain an empty or near-empty Run Checks page.</p>
<p>Backupchecks is designed as a monitoring, validation, and control platform. It does not replace your backup software, but enhances it by adding structured review workflows, consistent reporting, and operational clarity across all backup solutions.</p>
</div>
{% if active_role == 'admin' %} {% if active_role == 'admin' %}
<div class="card mb-3"> <div class="card mb-4">
<div class="card-header py-2"><span class="fw-semibold" style="font-size:13px;">System status</span></div> <div class="card-header">
<div class="card-body py-3"> System status
<div class="row g-2 small"> </div>
<div class="col-md-6"><span class="text-muted">Database size:</span> <strong>{{ db_size_human }}</strong></div> <div class="card-body">
<div class="row mb-2">
<div class="col-md-6"> <div class="col-md-6">
<span class="text-muted">Free disk space:</span> <strong>Database size:</strong> {{ db_size_human }}
</div>
<div class="col-md-6">
<strong>Free disk space:</strong>
{% if free_disk_warning %} {% if free_disk_warning %}
<strong class="text-danger">{{ free_disk_human }}</strong> <span class="text-danger fw-bold">{{ free_disk_human }}</span>
<span class="text-danger">(mail import will be blocked below 2 GB)</span> <span class="text-danger">(mail import will be blocked below 2 GB)</span>
{% else %} {% else %}
<strong>{{ free_disk_human }}</strong> {{ free_disk_human }}
{% endif %} {% endif %}
</div> </div>
</div> </div>
@ -118,10 +170,4 @@
</div> </div>
{% endif %} {% endif %}
<div class="mt-3 small text-muted" style="max-width:900px;">
<p>Backupchecks provides a centralized and consistent overview of the health and reliability of all backups within your environment. The platform collects backup results from multiple backup solutions and normalizes them into a single, clear and consistent status model. This enables teams to monitor backup quality across different vendors and environments in a predictable and uniform way.</p>
<p>Backup results are imported and evaluated automatically. Each backup run is analyzed and assigned a status such as Success, Warning, Failed, or Success (override). These statuses are determined by interpreting exit codes, detected error messages, log content, and configured rules, ensuring that the reported outcome reflects the real operational impact rather than raw technical output alone.</p>
<p>Daily operational validation is performed from the Run Checks page. This page acts as the primary workspace for reviewing backup runs. All runs that require attention are listed here, allowing operators to systematically review results and decide on the appropriate next step. The main objective of this process is to actively review backup runs and keep the Run Checks page clear.</p>
</div>
{% endblock %} {% endblock %}

View File

@ -2,7 +2,7 @@
{% block content %} {% block content %}
<div class="d-flex justify-content-between align-items-center mb-3"> <div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">Feedback</h2> <h1 class="h4 mb-0">Feedback</h1>
<a class="btn btn-primary" href="{{ url_for('main.feedback_new') }}">New</a> <a class="btn btn-primary" href="{{ url_for('main.feedback_new') }}">New</a>
</div> </div>
@ -34,16 +34,6 @@
<div class="col-6 col-md-3"> <div class="col-6 col-md-3">
<button class="btn btn-outline-secondary" type="submit">Apply</button> <button class="btn btn-outline-secondary" type="submit">Apply</button>
</div> </div>
{% if active_role == 'admin' %}
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" name="show_deleted" value="1" id="show_deleted" {% if show_deleted %}checked{% endif %} onchange="this.form.submit()">
<label class="form-check-label" for="show_deleted">
Show deleted items
</label>
</div>
</div>
{% endif %}
</form> </form>
<div class="table-responsive"> <div class="table-responsive">
@ -56,9 +46,6 @@
<th style="width: 160px;">Component</th> <th style="width: 160px;">Component</th>
<th style="width: 120px;">Status</th> <th style="width: 120px;">Status</th>
<th style="width: 170px;">Created</th> <th style="width: 170px;">Created</th>
{% if active_role == 'admin' and show_deleted %}
<th style="width: 140px;">Actions</th>
{% endif %}
</tr> </tr>
</thead> </thead>
<tbody> <tbody>
@ -69,30 +56,20 @@
{% endif %} {% endif %}
{% for i in items %} {% for i in items %}
<tr {% if i.is_deleted %}style="opacity: 0.6; background-color: var(--bs-secondary-bg);"{% endif %}> <tr>
<td> <td>
{% if not i.is_deleted %}
<form method="post" action="{{ url_for('main.feedback_vote', item_id=i.id) }}"> <form method="post" action="{{ url_for('main.feedback_vote', item_id=i.id) }}">
<input type="hidden" name="ref" value="list" /> <input type="hidden" name="ref" value="list" />
<button type="submit" class="btn btn-sm {% if i.user_voted %}btn-success{% else %}btn-outline-secondary{% endif %}"> <button type="submit" class="btn btn-sm {% if i.user_voted %}btn-success{% else %}btn-outline-secondary{% endif %}">
+ {{ i.vote_count }} + {{ i.vote_count }}
</button> </button>
</form> </form>
{% else %}
<span class="text-muted">+ {{ i.vote_count }}</span>
{% endif %}
</td> </td>
<td> <td>
<a href="{{ url_for('main.feedback_detail', item_id=i.id) }}">{{ i.title }}</a> <a href="{{ url_for('main.feedback_detail', item_id=i.id) }}">{{ i.title }}</a>
{% if i.is_deleted %}
<span class="badge text-bg-dark ms-2">Deleted</span>
{% endif %}
{% if i.created_by %} {% if i.created_by %}
<div class="text-muted" style="font-size: 0.85rem;">by {{ i.created_by }}</div> <div class="text-muted" style="font-size: 0.85rem;">by {{ i.created_by }}</div>
{% endif %} {% endif %}
{% if i.is_deleted and i.deleted_at %}
<div class="text-muted" style="font-size: 0.85rem;">Deleted {{ i.deleted_at|local_datetime }}</div>
{% endif %}
</td> </td>
<td> <td>
{% if i.item_type == 'bug' %} {% if i.item_type == 'bug' %}
@ -113,15 +90,6 @@
<div>{{ i.created_at|local_datetime }}</div> <div>{{ i.created_at|local_datetime }}</div>
<div class="text-muted" style="font-size: 0.85rem;">Updated {{ i.updated_at|local_datetime }}</div> <div class="text-muted" style="font-size: 0.85rem;">Updated {{ i.updated_at|local_datetime }}</div>
</td> </td>
{% if active_role == 'admin' and show_deleted %}
<td>
{% if i.is_deleted %}
<form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=i.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
<button type="submit" class="btn btn-sm btn-danger">Permanent Delete</button>
</form>
{% endif %}
</td>
{% endif %}
</tr> </tr>
{% endfor %} {% endfor %}
</tbody> </tbody>

View File

@ -3,7 +3,7 @@
{% block content %} {% block content %}
<div class="d-flex justify-content-between align-items-center mb-3"> <div class="d-flex justify-content-between align-items-center mb-3">
<div> <div>
<h2 class="mb-1">{{ item.title }}</h2> <h1 class="h4 mb-1">{{ item.title }}</h1>
<div class="text-muted" style="font-size: 0.9rem;"> <div class="text-muted" style="font-size: 0.9rem;">
{% if item.item_type == 'bug' %} {% if item.item_type == 'bug' %}
<span class="badge text-bg-danger">Bug</span> <span class="badge text-bg-danger">Bug</span>
@ -15,9 +15,6 @@
{% else %} {% else %}
<span class="badge text-bg-warning">Open</span> <span class="badge text-bg-warning">Open</span>
{% endif %} {% endif %}
{% if item.deleted_at %}
<span class="badge text-bg-dark">Deleted</span>
{% endif %}
<span class="ms-2">by {{ created_by_name }}</span> <span class="ms-2">by {{ created_by_name }}</span>
</div> </div>
</div> </div>
@ -32,23 +29,6 @@
<div class="mb-2"><strong>Component:</strong> {{ item.component }}</div> <div class="mb-2"><strong>Component:</strong> {{ item.component }}</div>
{% endif %} {% endif %}
<div style="white-space: pre-wrap;">{{ item.description }}</div> <div style="white-space: pre-wrap;">{{ item.description }}</div>
{% if item_attachments %}
<div class="mt-3">
<strong>Screenshots:</strong>
<div class="d-flex flex-wrap gap-2 mt-2">
{% for att in item_attachments %}
<a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
<img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
alt="{{ att.filename }}"
class="img-thumbnail"
style="max-height: 200px; max-width: 300px; cursor: pointer;"
title="Click to view full size" />
</a>
{% endfor %}
</div>
</div>
{% endif %}
</div> </div>
<div class="card-footer d-flex justify-content-between align-items-center"> <div class="card-footer d-flex justify-content-between align-items-center">
<div class="text-muted" style="font-size: 0.9rem;"> <div class="text-muted" style="font-size: 0.9rem;">
@ -83,22 +63,6 @@
</span> </span>
</div> </div>
<div style="white-space: pre-wrap;">{{ r.message }}</div> <div style="white-space: pre-wrap;">{{ r.message }}</div>
{% if r.id in reply_attachments_map %}
<div class="mt-2">
<div class="d-flex flex-wrap gap-2">
{% for att in reply_attachments_map[r.id] %}
<a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
<img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
alt="{{ att.filename }}"
class="img-thumbnail"
style="max-height: 150px; max-width: 200px; cursor: pointer;"
title="Click to view full size" />
</a>
{% endfor %}
</div>
</div>
{% endif %}
</div> </div>
{% endfor %} {% endfor %}
</div> </div>
@ -112,15 +76,10 @@
<div class="card-body"> <div class="card-body">
<h5 class="card-title mb-3">Add reply</h5> <h5 class="card-title mb-3">Add reply</h5>
{% if item.status == 'open' %} {% if item.status == 'open' %}
<form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}" enctype="multipart/form-data"> <form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}">
<div class="mb-2"> <div class="mb-2">
<textarea class="form-control" name="message" rows="4" required></textarea> <textarea class="form-control" name="message" rows="4" required></textarea>
</div> </div>
<div class="mb-2">
<label class="form-label">Screenshots (optional)</label>
<input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
<div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
</div>
<button type="submit" class="btn btn-primary">Post reply</button> <button type="submit" class="btn btn-primary">Post reply</button>
</form> </form>
{% else %} {% else %}
@ -136,32 +95,21 @@
<h2 class="h6">Actions</h2> <h2 class="h6">Actions</h2>
{% if active_role == 'admin' %} {% if active_role == 'admin' %}
{% if item.deleted_at %} {% if item.status == 'resolved' %}
{# Item is deleted - show permanent delete option #} <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
<div class="alert alert-warning mb-2" style="font-size: 0.9rem;"> <input type="hidden" name="action" value="reopen" />
This item is deleted. <button type="submit" class="btn btn-outline-secondary w-100">Reopen</button>
</div> </form>
<form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=item.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
<button type="submit" class="btn btn-danger w-100">Permanent Delete</button>
</form>
{% else %} {% else %}
{# Item is not deleted - show normal actions #} <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
{% if item.status == 'resolved' %} <input type="hidden" name="action" value="resolve" />
<form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2"> <button type="submit" class="btn btn-success w-100">Mark as resolved</button>
<input type="hidden" name="action" value="reopen" /> </form>
<button type="submit" class="btn btn-outline-secondary w-100">Reopen</button>
</form>
{% else %}
<form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
<input type="hidden" name="action" value="resolve" />
<button type="submit" class="btn btn-success w-100">Mark as resolved</button>
</form>
{% endif %}
<form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
<button type="submit" class="btn btn-danger w-100">Delete</button>
</form>
{% endif %} {% endif %}
<form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
<button type="submit" class="btn btn-danger w-100">Delete</button>
</form>
{% else %} {% else %}
<div class="text-muted">Only administrators can resolve or delete items.</div> <div class="text-muted">Only administrators can resolve or delete items.</div>
{% endif %} {% endif %}

View File

@ -2,11 +2,11 @@
{% block content %} {% block content %}
<div class="d-flex justify-content-between align-items-center mb-3"> <div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">New Feedback</h2> <h1 class="h4 mb-0">New Feedback</h1>
<a class="btn btn-outline-secondary" href="{{ url_for('main.feedback_page') }}">Back</a> <a class="btn btn-outline-secondary" href="{{ url_for('main.feedback_page') }}">Back</a>
</div> </div>
<form method="post" enctype="multipart/form-data" class="card"> <form method="post" class="card">
<div class="card-body"> <div class="card-body">
<div class="row g-3"> <div class="row g-3">
<div class="col-12 col-md-3"> <div class="col-12 col-md-3">
@ -28,11 +28,6 @@
<label class="form-label">Component (optional)</label> <label class="form-label">Component (optional)</label>
<input type="text" name="component" class="form-control" /> <input type="text" name="component" class="form-control" />
</div> </div>
<div class="col-12">
<label class="form-label">Screenshots (optional)</label>
<input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
<div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
</div>
</div> </div>
</div> </div>
<div class="card-footer d-flex justify-content-end"> <div class="card-footer d-flex justify-content-end">

View File

@ -1,32 +1,34 @@
{% extends "layout/base.html" %} {% extends "layout/base.html" %}
{% block head %}
<style> <style>
/* Inbox popup: wider + internal scroll areas */
.modal-xxl { max-width: 98vw; } .modal-xxl { max-width: 98vw; }
@media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } } @media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } }
#msg_body_container_iframe { height: 55vh; } #msg_body_container_iframe { height: 55vh; }
#msg_objects_container { max-height: 25vh; overflow: auto; } #msg_objects_container { max-height: 25vh; overflow: auto; }
</style> </style>
{% endblock %}
{# Pager macro must be defined before it is used #} {# Pager macro must be defined before it is used #}
{% macro pager(position, page, total_pages, has_prev, has_next) -%} {% macro pager(position, page, total_pages, has_prev, has_next) -%}
<div class="d-flex justify-content-between align-items-center my-2"> <div class="d-flex justify-content-between align-items-center my-2">
<div> <div>
{% if has_prev %} {% if has_prev %}
<a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1, q=q) }}">Previous</a> <a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1) }}">Previous</a>
{% else %} {% else %}
<button class="btn btn-outline-secondary btn-sm" disabled>Previous</button> <button class="btn btn-outline-secondary btn-sm" disabled>Previous</button>
{% endif %} {% endif %}
{% if has_next %} {% if has_next %}
<a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1, q=q) }}">Next</a> <a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1) }}">Next</a>
{% else %} {% else %}
<button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button> <button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button>
{% endif %} {% endif %}
</div> </div>
{% if current_user.is_authenticated and active_role in ["admin", "operator"] %} {% if current_user.is_authenticated and active_role in ["admin", "operator"] %}
<button type="button" class="btn btn-outline-secondary btn-sm me-3" id="btnReparseAll" data-bs-toggle="modal" data-bs-target="#reparseProgressModal">Re-parse all</button> <form method="POST" action="{{ url_for('main.inbox_reparse_all') }}" class="me-3 mb-0">
<button type="submit" class="btn btn-outline-secondary btn-sm">Re-parse all</button>
</form>
{% endif %} {% endif %}
<div class="d-flex align-items-center"> <div class="d-flex align-items-center">
@ -71,7 +73,7 @@
<tr> <tr>
{% if can_bulk_delete %} {% if can_bulk_delete %}
<th scope="col" style="width: 34px;"> <th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="inbox_select_all" autocomplete="off" /> <input class="form-check-input" type="checkbox" id="inbox_select_all" />
</th> </th>
{% endif %} {% endif %}
<th scope="col">From</th> <th scope="col">From</th>
@ -91,7 +93,7 @@
<tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;"> <tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
{% if can_bulk_delete %} {% if can_bulk_delete %}
<td onclick="event.stopPropagation();"> <td onclick="event.stopPropagation();">
<input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" autocomplete="off" /> <input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" />
</td> </td>
{% endif %} {% endif %}
<td>{{ row.from_address }}</td> <td>{{ row.from_address }}</td>
@ -207,27 +209,6 @@
</div> </div>
<!-- Re-parse progress modal -->
<div class="modal fade" id="reparseProgressModal" tabindex="-1" aria-labelledby="reparseProgressModalLabel" aria-hidden="true" data-bs-backdrop="static" data-bs-keyboard="false">
<div class="modal-dialog modal-dialog-centered">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="reparseProgressModalLabel">Re-parse all messages</h5>
</div>
<div class="modal-body">
<div id="reparseStatusText" class="mb-2 text-center">Starting…</div>
<div class="progress mb-2" style="height: 22px;">
<div id="reparseProgressBar" class="progress-bar progress-bar-striped progress-bar-animated" role="progressbar" style="width: 0%;" aria-valuenow="0" aria-valuemin="0" aria-valuemax="100">0%</div>
</div>
<div id="reparseStatsText" class="small text-muted text-center"></div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary d-none" id="reparseCloseBtn" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
<!-- VSPC company mapping modal (for multi-company summary emails) --> <!-- VSPC company mapping modal (for multi-company summary emails) -->
<div class="modal fade" id="vspcCompanyMapModal" tabindex="-1" aria-labelledby="vspcCompanyMapModalLabel" aria-hidden="true"> <div class="modal fade" id="vspcCompanyMapModal" tabindex="-1" aria-labelledby="vspcCompanyMapModalLabel" aria-hidden="true">
<div class="modal-dialog modal-lg modal-dialog-scrollable"> <div class="modal-dialog modal-lg modal-dialog-scrollable">
@ -699,88 +680,4 @@ function findCustomerIdByName(name) {
})(); })();
</script> </script>
<script>
(function () {
var reparseModal = null;
var reparseRunning = false;
function initReparseModal() {
var modalEl = document.getElementById("reparseProgressModal");
if (!modalEl) return;
reparseModal = new bootstrap.Modal(modalEl);
modalEl.addEventListener("show.bs.modal", function () {
if (reparseRunning) return;
resetReparseUI();
startReparse();
});
}
function resetReparseUI() {
setProgress(0, 0);
document.getElementById("reparseStatusText").textContent = "Starting…";
document.getElementById("reparseStatsText").textContent = "";
document.getElementById("reparseCloseBtn").classList.add("d-none");
}
function setProgress(done, total) {
var bar = document.getElementById("reparseProgressBar");
var pct = total > 0 ? Math.round((done / total) * 100) : 0;
bar.style.width = pct + "%";
bar.setAttribute("aria-valuenow", pct);
bar.textContent = pct + "%";
}
function startReparse() {
reparseRunning = true;
runBatch(null, null, 0, 0, 0, 0, 0);
}
function runBatch(lastId, total, totalProcessed, totalOk, totalApproved, totalNoMatch, totalErrors) {
var payload = { last_id: lastId, total: total };
fetch("{{ url_for('main.inbox_reparse_batch') }}", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
})
.then(function (r) { return r.json(); })
.then(function (data) {
var newTotal = data.total || total || 0;
var newProcessed = totalProcessed + (data.processed || 0);
var newOk = totalOk + (data.parsed_ok || 0);
var newApproved = totalApproved + (data.auto_approved || 0);
var newNoMatch = totalNoMatch + (data.no_match || 0);
var newErrors = totalErrors + (data.errors || 0);
setProgress(newProcessed, newTotal);
document.getElementById("reparseStatusText").textContent =
"Processing… " + newProcessed + " / " + newTotal;
document.getElementById("reparseStatsText").textContent =
"Parsed: " + newOk + " | Auto-approved: " + newApproved +
" | No match: " + newNoMatch + " | Errors: " + newErrors;
if (data.done) {
reparseRunning = false;
setProgress(newTotal, newTotal);
document.getElementById("reparseStatusText").textContent = "Finished!";
document.getElementById("reparseCloseBtn").classList.remove("d-none");
var bar = document.getElementById("reparseProgressBar");
bar.classList.remove("progress-bar-animated");
} else {
setTimeout(function () {
runBatch(data.last_id, newTotal, newProcessed, newOk, newApproved, newNoMatch, newErrors);
}, 100);
}
})
.catch(function (err) {
reparseRunning = false;
document.getElementById("reparseStatusText").textContent = "Error: " + err;
document.getElementById("reparseCloseBtn").classList.remove("d-none");
});
}
document.addEventListener("DOMContentLoaded", initReparseModal);
})();
</script>
{% endblock %}
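The progress modal above drives an incremental re-parse by POSTing `{last_id, total}` to `main.inbox_reparse_batch` and looping until the server reports `done`. A minimal Python sketch of that keyset-pagination contract, using the response fields the client script reads (`processed`, `parsed_ok`, `auto_approved`, `no_match`, `errors`, `total`, `last_id`, `done`); the `BATCH_SIZE` value and the counter logic are hypothetical stand-ins, not the actual server implementation:

```python
# Hypothetical sketch of the contract behind main.inbox_reparse_batch.
# The real endpoint parses inbox messages; here the per-message work is stubbed.

BATCH_SIZE = 50  # assumed batch size, not taken from the source

def reparse_batch(message_ids, last_id=None, total=None):
    """Process one batch of message ids and return the JSON payload the poller expects."""
    if total is None:
        total = len(message_ids)  # computed on the first call, echoed back afterwards
    # Keyset pagination: only ids strictly greater than the last processed id.
    remaining = [m for m in message_ids if last_id is None or m > last_id]
    batch = remaining[:BATCH_SIZE]
    # ... real code would re-parse each message here; counters are stand-ins ...
    return {
        "processed": len(batch),
        "parsed_ok": len(batch),   # stand-in: real code counts parser outcomes
        "auto_approved": 0,
        "no_match": 0,
        "errors": 0,
        "total": total,
        "last_id": batch[-1] if batch else last_id,
        "done": len(remaining) <= BATCH_SIZE,
    }
```

Echoing `total` and the highest processed id back to the client lets it resume from `last_id` without offset drift, even if rows are inserted or deleted mid-run.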


@@ -59,34 +59,6 @@
</div>
{% endif %}
{% if cove_enabled and can_manage_jobs %}
<div class="card mb-3">
<div class="card-header">Cove Integration</div>
<div class="card-body">
<form method="post" action="{{ url_for('main.job_set_cove_account', job_id=job.id) }}" class="row g-2 align-items-end mb-0">
<div class="col-auto">
<label for="cove_account_id" class="form-label mb-1">Cove Account ID</label>
<input type="number" class="form-control form-control-sm" id="cove_account_id" name="cove_account_id"
value="{{ job.cove_account_id or '' }}" placeholder="e.g. 4504627" style="width: 180px;" />
</div>
<div class="col-auto">
<button type="submit" class="btn btn-sm btn-primary">Save</button>
{% if job.cove_account_id %}
<button type="submit" name="cove_account_id" value="" class="btn btn-sm btn-outline-secondary ms-1">Clear</button>
{% endif %}
</div>
<div class="col-auto text-muted small">
{% if job.cove_account_id %}
Linked to Cove account <strong>{{ job.cove_account_id }}</strong>
{% else %}
Not linked to a Cove account; runs will not be imported automatically.
{% endif %}
</div>
</form>
</div>
</div>
{% endif %}
<h3 class="mt-4 mb-3">Job history</h3>
<div class="table-responsive">
@@ -108,7 +80,7 @@
<tbody>
{% if history_rows %}
{% for r in history_rows %}
<tr{% if r.mail_message_id or r.source_type == 'cove_api' %} class="jobrun-row" {% if r.mail_message_id %}data-message-id="{{ r.mail_message_id }}" {% endif %}data-run-id="{{ r.id }}" data-source-type="{{ r.source_type }}" data-ticket-codes="{{ (r.ticket_codes or [])|tojson|forceescape }}" data-remark-items="{{ (r.remark_items or [])|tojson|forceescape }}" style="cursor: pointer;"{% endif %}>
<tr{% if r.mail_message_id %} class="jobrun-row" data-message-id="{{ r.mail_message_id }}" data-run-id="{{ r.id }}" data-ticket-codes="{{ (r.ticket_codes or [])|tojson|forceescape }}" data-remark-items="{{ (r.remark_items or [])|tojson|forceescape }}" style="cursor: pointer;"{% endif %}>
<td>{{ r.run_day }}</td>
<td>{{ r.run_at|local_datetime }}</td>
{% set _s = (r.status or "")|lower %}
@@ -258,49 +230,8 @@
<h6 class="mb-1">Details</h6>
<div id="run_msg_overall_message" class="border rounded p-2" style="white-space: pre-wrap; max-height: 20vh; overflow: auto;"></div>
</div>
<div class="border rounded p-2 p-0" style="overflow:hidden;">
<!-- Cloud Connect summary panel (shown instead of raw email for CC runs) -->
<iframe id="run_msg_body_container_iframe" class="w-100" style="height:55vh; border:0; background:transparent;" sandbox="allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation"></iframe>
<div id="jdm_cc_summary_panel" class="mb-3" style="display:none;">
<h6>Cloud Connect</h6>
<dl class="row mb-0 dl-compact">
<dt class="col-4">User</dt> <dd class="col-8" id="jdm_cc_user"></dd>
<dt class="col-4">Section</dt> <dd class="col-8" id="jdm_cc_section"></dd>
<dt class="col-4">Repository</dt> <dd class="col-8" id="jdm_cc_repo"></dd>
<dt class="col-4">Used / Quota</dt><dd class="col-8" id="jdm_cc_used"></dd>
<dt class="col-4">Free</dt> <dd class="col-8" id="jdm_cc_free"></dd>
<dt class="col-4">Last active</dt> <dd class="col-8" id="jdm_cc_last_active"></dd>
<dt class="col-4">Status</dt> <dd class="col-8" id="jdm_cc_status"></dd>
</dl>
</div>
<!-- Cove summary panel (shown for cove_api runs, no mail involved) -->
<div id="jdm_cove_summary_panel" class="mb-3" style="display:none;">
<h6>Cove Data Protection</h6>
<dl class="row mb-0 dl-compact">
<dt class="col-4">Account</dt> <dd class="col-8" id="jdm_cove_account"></dd>
<dt class="col-4">Computer</dt> <dd class="col-8" id="jdm_cove_computer"></dd>
<dt class="col-4">Customer</dt> <dd class="col-8" id="jdm_cove_customer"></dd>
<dt class="col-4">Datasources</dt> <dd class="col-8" id="jdm_cove_datasources"></dd>
<dt class="col-4">Last run</dt> <dd class="col-8" id="jdm_cove_last_run"></dd>
<dt class="col-4">Status</dt> <dd class="col-8" id="jdm_cove_status"></dd>
</dl>
</div>
<div class="mb-3">
<div class="d-flex align-items-center gap-2 mb-1">
<h6 class="mb-0" id="jdm_mail_heading">Mail</h6>
<a href="#" class="small text-muted" id="jdm_mail_toggle" style="display:none;"
onclick="(function(){
var b=document.getElementById('jdm_mail_iframe_body');
var lnk=document.getElementById('jdm_mail_toggle');
var collapsed=b.style.display==='none';
b.style.display=collapsed?'':'none';
lnk.textContent=collapsed?'hide':'show';
})();return false;">show</a>
</div>
<div id="jdm_mail_iframe_body" class="border rounded p-0" style="overflow:hidden;">
<iframe id="run_msg_body_container_iframe" class="w-100" style="height:55vh; border:0; background:transparent;" sandbox="allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation"></iframe>
</div>
</div>
<div class="mt-3">
@@ -356,60 +287,6 @@
(function () {
var currentRunId = null;
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
// Method 1: Modern Clipboard API (works in most browsers with HTTPS)
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(text)
.then(function () {
showCopyFeedback(button);
})
.catch(function () {
// Fallback to method 2 if clipboard API fails
fallbackCopy(text, button);
});
} else {
// Method 2: Legacy execCommand method
fallbackCopy(text, button);
}
}
function fallbackCopy(text, button) {
var textarea = document.createElement('textarea');
textarea.value = text;
textarea.style.position = 'fixed';
textarea.style.opacity = '0';
textarea.style.top = '0';
textarea.style.left = '0';
document.body.appendChild(textarea);
textarea.focus();
textarea.select();
try {
var successful = document.execCommand('copy');
if (successful) {
showCopyFeedback(button);
} else {
// If execCommand fails, use prompt as last resort
window.prompt('Copy ticket number:', text);
}
} catch (err) {
// If all else fails, show prompt
window.prompt('Copy ticket number:', text);
}
document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
if (!button) return;
var original = button.textContent;
button.textContent = '✓';
setTimeout(function () {
button.textContent = original;
}, 800);
}
function apiJson(url, opts) {
opts = opts || {};
opts.headers = opts.headers || {};
@@ -442,14 +319,12 @@
html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
tickets.forEach(function (t) {
var status = t.resolved_at ? 'Resolved' : 'Active';
var ticketCode = (t.ticket_code || '').toString();
html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Ticket">🎫</span>' +
'<span class="fw-semibold">' + escapeHtml(ticketCode) + '</span>' +
'<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
'<button type="button" class="btn btn-sm btn-outline-secondary ms-2 py-0 px-1" title="Copy ticket number" data-action="copy-ticket" data-code="' + escapeHtml(ticketCode) + '"></button>' +
'<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
'</div>' +
@@ -496,16 +371,7 @@
ev.preventDefault();
var action = btn.getAttribute('data-action');
var id = btn.getAttribute('data-id');
if (!action) return;
if (!action || !id) return;
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
copyToClipboard(code, btn);
return;
}
if (!id) return;
if (action === 'resolve-ticket') {
if (!confirm('Mark ticket as resolved?')) return;
apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
@@ -645,8 +511,8 @@ function renderObjects(objects) {
function objectSeverityRank(o) {
var st = String((o && o.status) || "").trim().toLowerCase();
var err = String((o && o.error_message) || "").trim();
if (st === "error" || st === "failed" || st === "failure") return 0;
if (st === "error" || st === "failed" || st === "failure" || err) return 0;
if (st === "warning" || err) return 1;
if (st === "warning") return 1;
return 2;
}
@@ -701,20 +567,12 @@ function renderObjects(objects) {
rows.forEach(function (row) {
row.addEventListener("click", function () {
var messageId = row.getAttribute("data-message-id");
var runId = row.getAttribute("data-run-id");
var sourceType = row.getAttribute("data-source-type") || "";
if (!messageId) return;
if (!messageId && !runId) return;
currentRunId = runId ? parseInt(runId, 10) : null;
var detailUrl;
fetch("{{ url_for('main.inbox_message_detail', message_id=0) }}".replace("0", messageId))
if (sourceType === "cove_api" && runId && !messageId) {
detailUrl = "{{ url_for('main.cove_run_detail', run_id=0) }}".replace("0", runId);
} else {
detailUrl = "{{ url_for('main.inbox_message_detail', message_id=0) }}".replace("0", messageId);
if (runId) detailUrl += "?run_id=" + encodeURIComponent(runId);
}
fetch(detailUrl)
.then(function (resp) {
if (!resp.ok) throw new Error("Failed to load message details");
return resp.json();
@@ -757,14 +615,8 @@ function renderObjects(objects) {
document.getElementById("run_msg_job").textContent = meta.job_name || "";
var overallEl = document.getElementById("run_msg_overall");
if (overallEl) {
// For CC runs use the run's own status; for regular runs use the mail's overall_status.
var overallStatus = (
data.cove_summary ? (data.cove_summary.status || "") :
data.cloud_connect_summary ? (data.cloud_connect_summary.status || "") :
(meta.overall_status || "")
);
var d = statusDotClass(overallStatus);
overallEl.innerHTML = (d ? ('<span class="status-dot ' + d + ' me-2" aria-hidden="true"></span>') : '') + escapeHtml(overallStatus);
var d = statusDotClass(meta.overall_status);
overallEl.innerHTML = (d ? ('<span class="status-dot ' + d + ' me-2" aria-hidden="true"></span>') : '') + escapeHtml(meta.overall_status || "");
}
document.getElementById("run_msg_overall_message").textContent = meta.overall_message || "";
document.getElementById("run_msg_customer").textContent = meta.customer_name || "";
@@ -782,49 +634,8 @@ function renderObjects(objects) {
}
}
var ccPanel = document.getElementById("jdm_cc_summary_panel");
var covePanel = document.getElementById("jdm_cove_summary_panel");
var bodyFrame = document.getElementById("run_msg_body_container_iframe");
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
var mailHeading = document.getElementById("jdm_mail_heading");
var mailToggle = document.getElementById("jdm_mail_toggle");
var mailBody = document.getElementById("jdm_mail_iframe_body");
var bodyFrame = document.getElementById("run_msg_body_container_iframe");
if (data.cove_summary) {
var cs = data.cove_summary;
document.getElementById("jdm_cove_account").textContent = cs.account_name || "—";
document.getElementById("jdm_cove_computer").textContent = cs.computer_name || "—";
document.getElementById("jdm_cove_customer").textContent = cs.customer_name || "—";
document.getElementById("jdm_cove_datasources").textContent = cs.datasources || "—";
document.getElementById("jdm_cove_last_run").textContent = cs.last_run_at || "—";
document.getElementById("jdm_cove_status").textContent = cs.status || "—";
if (covePanel) covePanel.style.display = "";
if (ccPanel) ccPanel.style.display = "none";
if (mailHeading) mailHeading.style.display = "none";
if (mailToggle) mailToggle.style.display = "none";
if (mailBody) mailBody.style.display = "none";
} else if (data.cloud_connect_summary) {
var s = data.cloud_connect_summary;
document.getElementById("jdm_cc_user").textContent = s.user || "";
document.getElementById("jdm_cc_section").textContent = s.section || "";
document.getElementById("jdm_cc_repo").textContent = s.repo_name + (s.repo_type ? " (" + s.repo_type + ")" : "");
document.getElementById("jdm_cc_used").textContent = (s.used_space || "—") + " / " + (s.total_quota || "—");
document.getElementById("jdm_cc_free").textContent = s.free_space || "—";
document.getElementById("jdm_cc_last_active").textContent = s.last_active || "—";
document.getElementById("jdm_cc_status").textContent = s.status || "—";
if (covePanel) covePanel.style.display = "none";
if (ccPanel) ccPanel.style.display = "";
if (mailHeading) { mailHeading.style.display = ""; mailHeading.textContent = "Source report email"; }
if (mailToggle) { mailToggle.style.display = ""; mailToggle.textContent = "show"; }
if (mailBody) mailBody.style.display = "none";
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
} else {
if (covePanel) covePanel.style.display = "none";
if (ccPanel) ccPanel.style.display = "none";
if (mailHeading) { mailHeading.style.display = ""; mailHeading.textContent = "Mail"; }
if (mailToggle) mailToggle.style.display = "none";
if (mailBody) mailBody.style.display = "";
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
}
renderObjects(data.objects || []);


@@ -2,16 +2,6 @@
{% block content %}
<h2 class="mb-3">Jobs</h2>
{% if selected_customer_id %}
<div class="alert alert-info d-flex justify-content-between align-items-center py-2" role="alert">
<span>
Filtered on customer:
<strong>{{ selected_customer_name or ('#' ~ selected_customer_id) }}</strong>
</span>
<a href="{{ url_for('main.jobs') }}" class="btn btn-sm btn-outline-primary">Clear filter</a>
</div>
{% endif %}
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">


@@ -422,10 +422,7 @@ function loadRawData() {
function loadReports() {
setTableLoading('Loading…');
var params = new URLSearchParams(window.location.search || '');
fetch('/api/reports', { credentials: 'same-origin' })
var q = (params.get('q') || '').trim();
var apiUrl = '/api/reports' + (q ? ('?q=' + encodeURIComponent(q)) : '');
fetch(apiUrl, { credentials: 'same-origin' })
.then(function (r) { return r.json(); })
.then(function (data) {
renderTable((data && data.items) ? data.items : []);


@@ -41,70 +41,6 @@
</div>
</div>
<form method="get" class="border rounded p-3 mb-3">
<div class="row g-2 align-items-end">
<div class="col-12 col-sm-6 col-lg-4">
<label class="form-label mb-1" for="rc_filter_sort">Sort order</label>
<select class="form-select form-select-sm" id="rc_filter_sort" name="sort">
<option value="customer" {% if sort_mode == 'customer' %}selected{% endif %}>Customer &gt; Backup &gt; Type &gt; Job</option>
<option value="status" {% if sort_mode == 'status' %}selected{% endif %}>Critical &gt; Missed &gt; Warning &gt; Success (override) &gt; Success</option>
</select>
</div>
<div class="col-12 col-sm-6 col-lg-8">
<label class="form-label mb-1 d-block">Status filter</label>
<div class="d-flex flex-wrap gap-3">
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="rc_status_critical" name="status" value="critical" {% if 'critical' in selected_status_filters %}checked{% endif %}>
<label class="form-check-label" for="rc_status_critical">Critical</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="rc_status_missed" name="status" value="missed" {% if 'missed' in selected_status_filters %}checked{% endif %}>
<label class="form-check-label" for="rc_status_missed">Missed</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="rc_status_warning" name="status" value="warning" {% if 'warning' in selected_status_filters %}checked{% endif %}>
<label class="form-check-label" for="rc_status_warning">Warning</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="rc_status_success_override" name="status" value="success_override" {% if 'success_override' in selected_status_filters %}checked{% endif %}>
<label class="form-check-label" for="rc_status_success_override">Success (override)</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="rc_status_success" name="status" value="success" {% if 'success' in selected_status_filters %}checked{% endif %}>
<label class="form-check-label" for="rc_status_success">Success</label>
</div>
</div>
</div>
<div class="col-12 col-lg-8">
<div class="d-flex flex-wrap gap-3">
<div class="form-check">
<input class="form-check-input" type="checkbox" id="rc_has_ticket" name="has_ticket" value="1" {% if has_ticket %}checked{% endif %}>
<label class="form-check-label" for="rc_has_ticket">Only jobs with active ticket</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="rc_has_remark" name="has_remark" value="1" {% if has_remark %}checked{% endif %}>
<label class="form-check-label" for="rc_has_remark">Only jobs with active remark</label>
</div>
</div>
</div>
<div class="col-12 col-lg-4 d-flex gap-2 justify-content-lg-end">
{% if is_admin and include_reviewed %}
<input type="hidden" name="include_reviewed" value="1" />
{% endif %}
<button type="submit" class="btn btn-sm btn-primary">Apply</button>
<a class="btn btn-sm btn-outline-secondary" href="{{ url_for('main.run_checks_page') }}">Reset</a>
<button
type="submit"
class="btn btn-sm btn-outline-primary"
formmethod="post"
formaction="{{ url_for('main.run_checks_save_preferences') }}"
>
Save as my default
</button>
</div>
</div>
</form>
<div class="small text-muted mb-2" id="rc_status"></div>
<div class="table-responsive">
@@ -112,7 +48,7 @@
<thead class="table-light">
<tr>
<th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="rc_select_all" autocomplete="off" />
<input class="form-check-input" type="checkbox" id="rc_select_all" />
</th>
<th scope="col">Customer</th>
<th scope="col">Backup</th>
@@ -127,7 +63,7 @@
{% for r in rows %}
<tr class="rc-job-row" data-job-id="{{ r.job_id }}" style="cursor: pointer;">
<td onclick="event.stopPropagation();">
<input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" autocomplete="off" />
<input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" />
</td>
<td>{{ r.customer_name }}</td>
<td>{{ r.backup_software }}</td>
@@ -208,21 +144,17 @@
overflow: auto;
}
#runChecksModal #rcm_body_iframe {
flex: 1 1 auto;
min-height: 0;
height: auto;
}
#runChecksModal .rcm-mail-panel {
display: flex;
flex-direction: column;
flex: 1 1 auto;
min-height: 0;
}
#runChecksModal #rcm_mail_iframe_body {
flex: 1 1 auto;
min-height: 0;
overflow: hidden;
}
#runChecksModal #rcm_body_iframe {
height: 100%;
display: block;
}
#runChecksModal .rcm-objects-scroll {
max-height: 25vh;
overflow: auto;
@@ -322,53 +254,15 @@
</dd>
</dl>
<div class="mb-3" id="rcm_cc_summary_panel" style="display:none;">
<h6>Cloud Connect</h6>
<dl class="row mb-0 dl-compact">
<dt class="col-4">User</dt> <dd class="col-8" id="rcc_user"></dd>
<dt class="col-4">Section</dt> <dd class="col-8" id="rcc_section"></dd>
<dt class="col-4">Repository</dt> <dd class="col-8" id="rcc_repo"></dd>
<dt class="col-4">Used / Quota</dt><dd class="col-8" id="rcc_used"></dd>
<dt class="col-4">Free</dt> <dd class="col-8" id="rcc_free"></dd>
<dt class="col-4">Last active</dt> <dd class="col-8" id="rcc_last_active"></dd>
<dt class="col-4">Status</dt> <dd class="col-8" id="rcc_status"></dd>
</dl>
</div>
<div class="mb-3 rcm-mail-panel">
<h6>Mail</h6>
<iframe
id="rcm_body_iframe"
class="border rounded"
style="width:100%;"
sandbox="allow-popups allow-popups-to-escape-sandbox allow-same-origin"
referrerpolicy="no-referrer"
></iframe>
<!-- Cove summary panel (shown for cove_api runs, no mail involved) -->
<div class="mb-3" id="rcm_cove_summary_panel" style="display:none;">
<h6>Cove Data Protection</h6>
<dl class="row mb-0 dl-compact">
<dt class="col-4">Account</dt> <dd class="col-8" id="rcm_cove_account"></dd>
<dt class="col-4">Computer</dt> <dd class="col-8" id="rcm_cove_computer"></dd>
<dt class="col-4">Customer</dt> <dd class="col-8" id="rcm_cove_customer"></dd>
<dt class="col-4">Datasources</dt> <dd class="col-8" id="rcm_cove_datasources"></dd>
<dt class="col-4">Last run</dt> <dd class="col-8" id="rcm_cove_last_run"></dd>
<dt class="col-4">Status</dt> <dd class="col-8" id="rcm_cove_status"></dd>
</dl>
</div>
<div class="mb-3 rcm-mail-panel" id="rcm_mail_iframe_panel">
<div class="d-flex align-items-center gap-2 mb-1">
<h6 class="mb-0" id="rcm_mail_heading">Mail</h6>
<a href="#" class="small text-muted" id="rcm_mail_toggle" style="display:none;"
onclick="(function(){
var b=document.getElementById('rcm_mail_iframe_body');
var lnk=document.getElementById('rcm_mail_toggle');
var collapsed=b.style.display==='none';
b.style.display=collapsed?'':'none';
lnk.textContent=collapsed?'hide':'show';
})();return false;">show</a>
</div>
<div id="rcm_mail_iframe_body">
<iframe
id="rcm_body_iframe"
class="border rounded"
style="width:100%;"
sandbox="allow-popups allow-popups-to-escape-sandbox allow-same-origin"
referrerpolicy="no-referrer"
></iframe>
</div>
</div>
<div>
@@ -497,8 +391,8 @@ function statusClass(status) {
function objectSeverityRank(o) {
var st = String((o && o.status) || '').trim().toLowerCase();
var err = String((o && o.error_message) || '').trim();
if (st === 'error' || st === 'failed' || st === 'failure') return 0;
if (st === 'error' || st === 'failed' || st === 'failure' || err) return 0;
if (st === 'warning' || err) return 1;
if (st === 'warning') return 1;
return 2;
}
@@ -553,60 +447,6 @@ function escapeHtml(s) {
.replace(/'/g, "&#39;");
}
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
// Method 1: Modern Clipboard API (works in most browsers with HTTPS)
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(text)
.then(function () {
showCopyFeedback(button);
})
.catch(function () {
// Fallback to method 2 if clipboard API fails
fallbackCopy(text, button);
});
} else {
// Method 2: Legacy execCommand method
fallbackCopy(text, button);
}
}
function fallbackCopy(text, button) {
var textarea = document.createElement('textarea');
textarea.value = text;
textarea.style.position = 'fixed';
textarea.style.opacity = '0';
textarea.style.top = '0';
textarea.style.left = '0';
document.body.appendChild(textarea);
textarea.focus();
textarea.select();
try {
var successful = document.execCommand('copy');
if (successful) {
showCopyFeedback(button);
} else {
// If execCommand fails, use prompt as last resort
window.prompt('Copy ticket number:', text);
}
} catch (err) {
// If all else fails, show prompt
window.prompt('Copy ticket number:', text);
}
document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
if (!button) return;
var original = button.textContent;
button.textContent = '✓';
setTimeout(function () {
button.textContent = original;
}, 800);
}
function getSelectedJobIds() {
var cbs = table.querySelectorAll('tbody .rc_row_cb');
var ids = [];
@@ -1000,7 +840,20 @@ table.addEventListener('change', function (e) {
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
copyToClipboard(code, btn);
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(code)
.then(function () {
var original = btn.textContent;
btn.textContent = '✓';
setTimeout(function () { btn.textContent = original; }, 800);
})
.catch(function () {
// Fallback: select/copy via prompt
window.prompt('Copy ticket number:', code);
});
} else {
window.prompt('Copy ticket number:', code);
}
return;
}
@@ -1463,50 +1316,9 @@ table.addEventListener('change', function (e) {
document.getElementById('rcm_received').textContent = '';
}
var ccPanel = document.getElementById('rcm_cc_summary_panel');
var covePanel = document.getElementById('rcm_cove_summary_panel');
var mailHeading = document.getElementById('rcm_mail_heading');
var mailToggle = document.getElementById('rcm_mail_toggle');
var mailBody = document.getElementById('rcm_mail_iframe_body');
var bodyFrame = document.getElementById('rcm_body_iframe');
var bodyFrame = document.getElementById('rcm_body_iframe');
if (bodyFrame) {
bodyFrame.srcdoc = wrapMailHtml(run.body_html || (run.missed ? '<div class="text-muted">No email for missed run.</div>' : ''));
if (run.cove_summary) {
var cs = run.cove_summary;
document.getElementById('rcm_cove_account').textContent = cs.account_name || '—';
document.getElementById('rcm_cove_computer').textContent = cs.computer_name || '—';
document.getElementById('rcm_cove_customer').textContent = cs.customer_name || '—';
document.getElementById('rcm_cove_datasources').textContent = cs.datasources || '—';
document.getElementById('rcm_cove_last_run').textContent = cs.last_run_at || '—';
document.getElementById('rcm_cove_status').textContent = cs.status || '—';
if (covePanel) covePanel.style.display = '';
if (ccPanel) ccPanel.style.display = 'none';
if (mailHeading) mailHeading.style.display = 'none';
if (mailToggle) mailToggle.style.display = 'none';
if (mailBody) mailBody.style.display = 'none';
} else if (run.cloud_connect_summary) {
var s = run.cloud_connect_summary;
document.getElementById('rcc_user').textContent = s.user || '';
document.getElementById('rcc_section').textContent = s.section || '';
document.getElementById('rcc_repo').textContent = s.repo_name + (s.repo_type ? ' (' + s.repo_type + ')' : '');
document.getElementById('rcc_used').textContent = (s.used_space || '—') + ' / ' + (s.total_quota || '—');
document.getElementById('rcc_free').textContent = s.free_space || '—';
document.getElementById('rcc_last_active').textContent = s.last_active || '—';
document.getElementById('rcc_status').textContent = s.status || '—';
if (covePanel) covePanel.style.display = 'none';
if (ccPanel) ccPanel.style.display = '';
if (mailHeading) { mailHeading.style.display = ''; mailHeading.textContent = 'Source report email'; }
if (mailToggle) { mailToggle.style.display = ''; mailToggle.textContent = 'show'; }
if (mailBody) mailBody.style.display = 'none';
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(run.body_html || '');
} else {
if (covePanel) covePanel.style.display = 'none';
if (ccPanel) ccPanel.style.display = 'none';
if (mailHeading) { mailHeading.style.display = ''; mailHeading.textContent = 'Mail'; }
if (mailToggle) mailToggle.style.display = 'none';
if (mailBody) mailBody.style.display = '';
if (bodyFrame) {
bodyFrame.srcdoc = wrapMailHtml(run.body_html || (run.missed ? '<div class="text-muted">No email for missed run.</div>' : ''));
}
}
var emlBtn = document.getElementById('rcm_eml_btn');


@@ -1,75 +0,0 @@
{% extends "layout/base.html" %}
{% block content %}
<h2 class="mb-3">Search</h2>
{% if query %}
<p class="text-muted mb-3">
Query: <strong>{{ query }}</strong> | Total hits: <strong>{{ total_hits }}</strong>
</p>
{% else %}
<div class="alert alert-secondary py-2">
Enter a search term in the top navigation bar.
</div>
{% endif %}
{% for section in sections %}
<div class="card mb-3" id="search-section-{{ section['key'] }}" style="scroll-margin-top: 96px;">
<div class="card-header d-flex justify-content-between align-items-center">
<span>{{ section['title'] }} ({{ section['total'] }})</span>
<a href="{{ section['view_all_url'] }}" class="btn btn-sm btn-outline-secondary">Open {{ section['title'] }}</a>
</div>
{% if section['key'] == 'daily_jobs' %}
<div class="px-3 py-2 small text-muted border-bottom">
Note: The Daily Jobs page itself only shows results for the selected day. Search results can include matches that relate to jobs across other days.
</div>
{% endif %}
<div class="card-body p-0">
{% if section['items'] %}
<div class="table-responsive">
<table class="table table-sm mb-0 align-middle">
<thead class="table-light">
<tr>
<th>Result</th>
<th>Details</th>
<th>Meta</th>
</tr>
</thead>
<tbody>
{% for item in section['items'] %}
<tr>
<td>
{% if item.link %}
<a href="{{ item.link }}">{{ item.title }}</a>
{% else %}
{{ item.title }}
{% endif %}
</td>
<td>{{ item.subtitle }}</td>
<td>{{ item.meta }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="p-3 text-muted">No results in this section.</div>
{% endif %}
</div>
{% if section['total_pages'] > 1 %}
<div class="card-footer d-flex justify-content-between align-items-center small">
<span class="text-muted">
Page {{ section['current_page'] }} of {{ section['total_pages'] }} ({{ section['total'] }} results)
</span>
<div class="d-flex gap-2">
{% if section['has_prev'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['prev_url'] }}#search-section-{{ section['key'] }}">Previous</a>
{% endif %}
{% if section['has_next'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['next_url'] }}#search-section-{{ section['key'] }}">Next</a>
{% endif %}
</div>
</div>
{% endif %}
</div>
{% endfor %}
{% endblock %}

View File

@@ -161,17 +161,6 @@
</div>
</div>
<div class="card mb-3">
<div class="card-header">Security</div>
<div class="card-body">
<div class="form-check form-switch">
<input class="form-check-input" type="checkbox" id="login_captcha_enabled" name="login_captcha_enabled" {% if settings.login_captcha_enabled %}checked{% endif %} />
<label class="form-check-label" for="login_captcha_enabled">Enable login captcha</label>
</div>
<div class="form-text">When enabled, users must solve a simple math question before logging in. Enabled by default.</div>
</div>
</div>
<div class="d-flex justify-content-end mt-3">
<button type="submit" class="btn btn-primary">Save settings</button>
</div>
@@ -515,178 +504,6 @@
</div>
{% endif %}
{% if section == 'integrations' %}
<form method="post" class="mb-4" id="cove-settings-form">
<div class="card mb-3">
<div class="card-header">Cove Data Protection (N-able)</div>
<div class="card-body">
<div class="form-check form-switch mb-3">
<input class="form-check-input" type="checkbox" id="cove_enabled" name="cove_enabled" {% if settings.cove_enabled %}checked{% endif %} />
<label class="form-check-label" for="cove_enabled">Enable Cove integration</label>
</div>
<div class="row g-3">
<div class="col-md-12">
<label for="cove_api_url" class="form-label">API URL</label>
<input type="url" class="form-control" id="cove_api_url" name="cove_api_url"
value="{{ settings.cove_api_url or '' }}"
placeholder="https://api.backup.management/jsonapi" />
<div class="form-text">Leave empty to use the default Cove API endpoint.</div>
</div>
<div class="col-md-6">
<label for="cove_api_username" class="form-label">API Username <span class="text-danger">*</span></label>
<input type="text" class="form-control" id="cove_api_username" name="cove_api_username"
value="{{ settings.cove_api_username or '' }}" />
</div>
<div class="col-md-6">
<label for="cove_api_password" class="form-label">API Password {% if not has_cove_password %}<span class="text-danger">*</span>{% endif %}</label>
<input type="password" class="form-control" id="cove_api_password" name="cove_api_password"
placeholder="{% if has_cove_password %}******** (stored){% else %}enter password{% endif %}" />
<div class="form-text">Leave empty to keep the existing password.</div>
</div>
<div class="col-md-6">
<div class="form-check form-switch mt-2">
<input class="form-check-input" type="checkbox" id="cove_import_enabled" name="cove_import_enabled" {% if settings.cove_import_enabled %}checked{% endif %} />
<label class="form-check-label" for="cove_import_enabled">Enable automatic import</label>
</div>
</div>
<div class="col-md-6">
<label for="cove_import_interval_minutes" class="form-label">Import interval (minutes)</label>
<input type="number" class="form-control" id="cove_import_interval_minutes" name="cove_import_interval_minutes"
value="{{ settings.cove_import_interval_minutes or 30 }}" min="1" max="1440" />
<div class="form-text">How often (in minutes) to fetch new data from the Cove API.</div>
</div>
</div>
<div class="d-flex justify-content-between align-items-center mt-3">
<div id="cove-test-result" class="small"></div>
<div class="d-flex gap-2">
<button type="button" class="btn btn-outline-secondary" id="cove-test-btn">Test connection</button>
<button type="submit" class="btn btn-primary">Save Cove Settings</button>
</div>
</div>
{% if settings.cove_partner_id %}
<div class="mt-2 d-flex justify-content-between align-items-center flex-wrap gap-2">
<div class="text-muted small">
Connected Partner ID: <strong>{{ settings.cove_partner_id }}</strong>
{% if settings.cove_last_import_at %}
&nbsp;·&nbsp; Last import: {{ settings.cove_last_import_at|local_datetime }}
{% else %}
&nbsp;·&nbsp; No import yet
{% endif %}
</div>
<button
type="submit"
class="btn btn-sm btn-outline-primary"
formaction="{{ url_for('main.settings_cove_run_now') }}"
formmethod="post"
>
Run import now
</button>
</div>
{% endif %}
</div>
</div>
</form>
<script>
(function () {
var btn = document.getElementById('cove-test-btn');
var resultDiv = document.getElementById('cove-test-result');
if (!btn) return;
btn.addEventListener('click', function () {
btn.disabled = true;
resultDiv.textContent = 'Testing…';
resultDiv.className = 'small text-muted';
fetch('{{ url_for("main.settings_cove_test_connection") }}', {
method: 'POST',
headers: { 'X-CSRFToken': document.querySelector('meta[name="csrf-token"]') ? document.querySelector('meta[name="csrf-token"]').content : '' },
credentials: 'same-origin',
})
.then(function (r) { return r.json(); })
.then(function (data) {
if (data.ok) {
resultDiv.textContent = data.message;
resultDiv.className = 'small text-success';
} else {
resultDiv.textContent = data.message;
resultDiv.className = 'small text-danger';
}
})
.catch(function (err) {
resultDiv.textContent = 'Request failed: ' + err;
resultDiv.className = 'small text-danger';
})
.finally(function () { btn.disabled = false; });
});
})();
</script>
<form method="post" class="mb-4" id="entra-settings-form">
<div class="card mb-3">
<div class="card-header">Microsoft Entra SSO</div>
<div class="card-body">
<div class="form-check form-switch mb-3">
<input class="form-check-input" type="checkbox" id="entra_sso_enabled" name="entra_sso_enabled" {% if settings.entra_sso_enabled %}checked{% endif %} />
<label class="form-check-label" for="entra_sso_enabled">Enable Microsoft sign-in</label>
</div>
<div class="row g-3">
<div class="col-md-6">
<label for="entra_tenant_id" class="form-label">Tenant ID</label>
<input type="text" class="form-control" id="entra_tenant_id" name="entra_tenant_id"
value="{{ settings.entra_tenant_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
</div>
<div class="col-md-6">
<label for="entra_client_id" class="form-label">Client ID</label>
<input type="text" class="form-control" id="entra_client_id" name="entra_client_id"
value="{{ settings.entra_client_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
</div>
<div class="col-md-12">
<label for="entra_client_secret" class="form-label">Client Secret {% if not has_entra_secret %}<span class="text-danger">*</span>{% endif %}</label>
<input type="password" class="form-control" id="entra_client_secret" name="entra_client_secret"
placeholder="{% if has_entra_secret %}******** (stored){% else %}enter secret{% endif %}" />
<div class="form-text">Leave empty to keep the existing secret.</div>
</div>
<div class="col-md-12">
<label for="entra_redirect_uri" class="form-label">Redirect URI (optional override)</label>
<input type="url" class="form-control" id="entra_redirect_uri" name="entra_redirect_uri"
value="{{ settings.entra_redirect_uri or '' }}"
placeholder="https://your-domain.example/auth/entra/callback" />
<div class="form-text">If empty, Backupchecks uses its own external callback URL.</div>
</div>
<div class="col-md-6">
<label for="entra_allowed_domain" class="form-label">Allowed domain/tenant (optional)</label>
<input type="text" class="form-control" id="entra_allowed_domain" name="entra_allowed_domain"
value="{{ settings.entra_allowed_domain or '' }}" placeholder="contoso.com or tenant-id" />
<div class="form-text">Restrict sign-ins to one tenant id or one email domain.</div>
</div>
<div class="col-md-12">
<label for="entra_allowed_group_ids" class="form-label">Allowed Entra Group Object ID(s) (optional)</label>
<textarea class="form-control" id="entra_allowed_group_ids" name="entra_allowed_group_ids" rows="3"
placeholder="group-object-id-1&#10;group-object-id-2">{{ settings.entra_allowed_group_ids or '' }}</textarea>
<div class="form-text">Optional hard access gate. Enter one or more Entra security group object IDs (comma or newline separated). User must be member of at least one.</div>
</div>
<div class="col-md-6">
<div class="form-check form-switch mt-4">
<input class="form-check-input" type="checkbox" id="entra_auto_provision_users" name="entra_auto_provision_users" {% if settings.entra_auto_provision_users %}checked{% endif %} />
<label class="form-check-label" for="entra_auto_provision_users">Auto-provision unknown users as Viewer</label>
</div>
</div>
</div>
<div class="d-flex justify-content-end mt-3">
<button type="submit" class="btn btn-primary">Save Entra Settings</button>
</div>
</div>
</div>
</form>
{% endif %}
{% if section == 'maintenance' %}
<div class="row g-3 mb-4">
@@ -711,16 +528,8 @@
<div class="col-md-4 d-flex align-items-end">
<button type="submit" class="btn btn-primary w-100">Import jobs</button>
</div>
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_jobs" name="include_autotask_ids" />
<label class="form-check-label" for="include_autotask_ids_jobs">
Include Autotask IDs from import file
</label>
</div>
</div>
<div class="col-md-8">
<div class="form-text">Use a JSON export created by this application. Leave Autotask IDs unchecked for sandbox/development environments with a different Autotask database.</div> <div class="form-text">Use a JSON export created by this application.</div>
</div>
</div>
</form> </form>

View File

@@ -3,7 +3,8 @@
{% block title %}Orphaned Jobs Preview{% endblock %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-4"> <div class="container-fluid py-4">
<div class="d-flex justify-content-between align-items-center mb-4">
<div>
<h2>Orphaned Jobs Preview</h2>
<p class="text-muted mb-0">Jobs without a valid customer link</p>
@@ -82,4 +83,5 @@
<p class="mb-0">All jobs are properly linked to existing customers.</p>
</div>
{% endif %}
</div>
{% endblock %}

View File

@@ -2,14 +2,13 @@
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">User Settings</h2> <h1 class="h4 mb-0">User Settings</h1>
</div>
<div class="card" style="max-width: 40rem;">
<div class="card-body">
<h2 class="h6">Change password</h2>
<form method="post" class="row g-3">
<input type="hidden" name="form_name" value="password" />
<div class="col-12">
<label class="form-label" for="current_password">Current password</label>
<input class="form-control" type="password" id="current_password" name="current_password" autocomplete="current-password" required />
@@ -31,64 +30,4 @@
</form>
</div>
</div>
<div class="card mt-3" style="max-width: 50rem;">
<div class="card-body">
<h2 class="h6">Run Checks preferences</h2>
<form method="post" class="row g-3">
<input type="hidden" name="form_name" value="run_checks_preferences" />
<div class="col-12 col-md-8">
<label class="form-label" for="run_checks_sort_mode">Default sort order</label>
<select class="form-select" id="run_checks_sort_mode" name="run_checks_sort_mode">
<option value="customer" {% if run_checks_sort_mode == 'customer' %}selected{% endif %}>Customer &gt; Backup &gt; Type &gt; Job</option>
<option value="status" {% if run_checks_sort_mode == 'status' %}selected{% endif %}>Critical &gt; Missed &gt; Warning &gt; Success (override) &gt; Success</option>
</select>
</div>
<div class="col-12">
<label class="form-label d-block">Default status filter</label>
<div class="d-flex flex-wrap gap-3">
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_status_critical" name="run_checks_status" value="critical" {% if 'critical' in run_checks_selected_statuses %}checked{% endif %}>
<label class="form-check-label" for="run_checks_status_critical">Critical</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_status_missed" name="run_checks_status" value="missed" {% if 'missed' in run_checks_selected_statuses %}checked{% endif %}>
<label class="form-check-label" for="run_checks_status_missed">Missed</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_status_warning" name="run_checks_status" value="warning" {% if 'warning' in run_checks_selected_statuses %}checked{% endif %}>
<label class="form-check-label" for="run_checks_status_warning">Warning</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_status_success_override" name="run_checks_status" value="success_override" {% if 'success_override' in run_checks_selected_statuses %}checked{% endif %}>
<label class="form-check-label" for="run_checks_status_success_override">Success (override)</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_status_success" name="run_checks_status" value="success" {% if 'success' in run_checks_selected_statuses %}checked{% endif %}>
<label class="form-check-label" for="run_checks_status_success">Success</label>
</div>
</div>
</div>
<div class="col-12">
<div class="d-flex flex-wrap gap-4">
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_filter_has_ticket" name="run_checks_filter_has_ticket" value="1" {% if run_checks_filter_has_ticket %}checked{% endif %}>
<label class="form-check-label" for="run_checks_filter_has_ticket">Only jobs with active ticket</label>
</div>
<div class="form-check">
<input class="form-check-input" type="checkbox" id="run_checks_filter_has_remark" name="run_checks_filter_has_remark" value="1" {% if run_checks_filter_has_remark %}checked{% endif %}>
<label class="form-check-label" for="run_checks_filter_has_remark">Only jobs with active remark</label>
</div>
</div>
</div>
<div class="col-12">
<button class="btn btn-primary" type="submit">Save Run Checks preferences</button>
</div>
</form>
</div>
</div>
{% endblock %}

View File

@@ -1,310 +0,0 @@
#!/usr/bin/env python3
"""
Cove Data Protection API Test Script
=======================================
Verified working via Postman (2026-02-23). Uses confirmed column codes.
Usage:
python3 cove_api_test.py --username "api-user" --password "secret"
Or via environment variables:
COVE_USERNAME="api-user" COVE_PASSWORD="secret" python3 cove_api_test.py
Optional:
--url API endpoint (default: https://api.backup.management/jsonapi)
--records Max records to fetch (default: 50)
"""
import argparse
import json
import os
import sys
from datetime import datetime, timezone
import requests
API_URL = "https://api.backup.management/jsonapi"
# Session status codes (F00 / F15 / F09)
SESSION_STATUS = {
1: "In process",
2: "Failed",
3: "Aborted",
5: "Completed",
6: "Interrupted",
7: "NotStarted",
8: "CompletedWithErrors",
9: "InProgressWithFaults",
10: "OverQuota",
11: "NoSelection",
12: "Restarted",
}
# Backupchecks status mapping
STATUS_MAP = {
1: "Warning", # In process
2: "Error", # Failed
3: "Error", # Aborted
5: "Success", # Completed
6: "Error", # Interrupted
7: "Warning", # NotStarted
8: "Warning", # CompletedWithErrors
9: "Warning", # InProgressWithFaults
10: "Error", # OverQuota
11: "Warning", # NoSelection
12: "Warning", # Restarted
}
# Confirmed working columns (verified via Postman 2026-02-23)
COLUMNS = [
"I1", "I18", "I8", "I78",
"D09F00", "D09F09", "D09F15", "D09F08",
"D1F00", "D1F15",
"D10F00", "D10F15",
"D11F00", "D11F15",
"D19F00", "D19F15",
"D20F00", "D20F15",
"D5F00", "D5F15",
"D23F00", "D23F15",
]
# Datasource labels
DATASOURCE_LABELS = {
"D09": "Total",
"D1": "Files & Folders",
"D2": "System State",
"D10": "VssMsSql (SQL Server)",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
def _post(url: str, payload: dict, timeout: int = 30) -> dict:
headers = {"Content-Type": "application/json"}
resp = requests.post(url, json=payload, headers=headers, timeout=timeout)
resp.raise_for_status()
return resp.json()
def login(url: str, username: str, password: str) -> tuple[str, int]:
"""Authenticate and return (visa, partner_id)."""
payload = {
"jsonrpc": "2.0",
"id": "jsonrpc",
"method": "Login",
"params": {
"username": username,
"password": password,
},
}
data = _post(url, payload)
if "error" in data:
raise RuntimeError(f"Login failed: {data['error']}")
visa = data.get("visa")
if not visa:
raise RuntimeError(f"No visa token in response: {data}")
result = data.get("result", {})
partner_id = result.get("PartnerId") or result.get("result", {}).get("PartnerId")
if not partner_id:
raise RuntimeError(f"Could not find PartnerId in response: {data}")
return visa, int(partner_id)
def enumerate_statistics(url: str, visa: str, partner_id: int, columns: list[str], records: int = 50) -> dict:
payload = {
"jsonrpc": "2.0",
"visa": visa,
"id": "jsonrpc",
"method": "EnumerateAccountStatistics",
"params": {
"query": {
"PartnerId": partner_id,
"StartRecordNumber": 0,
"RecordsCount": records,
"Columns": columns,
}
},
}
return _post(url, payload)
def fmt_ts(value) -> str:
if not value:
return "(none)"
try:
ts = int(value)
if ts == 0:
return "(none)"
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
return dt.strftime("%Y-%m-%d %H:%M UTC")
except (ValueError, TypeError, OSError):
return str(value)
def fmt_status(value) -> str:
if value is None:
return "(none)"
try:
code = int(value)
bc = STATUS_MAP.get(code, "?")
label = SESSION_STATUS.get(code, "Unknown")
return f"{code} ({label}) → {bc}"
except (ValueError, TypeError):
return str(value)
def fmt_colorbar(value: str) -> str:
if not value:
return "(none)"
icons = {"5": "✅", "8": "⚠️", "2": "❌", "1": "🔄", "0": "·"}
return "".join(icons.get(c, c) for c in str(value))
def print_header(title: str) -> None:
print()
print("=" * 70)
print(f" {title}")
print("=" * 70)
def run(url: str, username: str, password: str, records: int, debug: bool = False) -> None:
print_header("Cove Data Protection API Test")
print(f" URL: {url}")
print(f" Username: {username}")
# Login
print_header("Step 1: Login")
visa, partner_id = login(url, username, password)
print(f" ✅ Login OK")
print(f" PartnerId: {partner_id}")
print(f" Visa: {visa[:40]}...")
# Fetch statistics
print_header("Step 2: EnumerateAccountStatistics")
print(f" Columns: {', '.join(COLUMNS)}")
print(f" Records: max {records}")
data = enumerate_statistics(url, visa, partner_id, COLUMNS, records)
if debug:
print(f"\n RAW response (first 2000 chars):")
print(json.dumps(data, indent=2)[:2000])
if "error" in data:
err = data["error"]
print(f" ❌ FAILED: error {err.get('code')}: {err.get('message')}")
print(f" Data: {err.get('data')}")
sys.exit(1)
result = data.get("result")
if result is None:
print(" ⚠️ result is null; raw response:")
print(json.dumps(data, indent=2)[:1000])
sys.exit(0)
if debug:
print(f"\n result type: {type(result).__name__}")
if isinstance(result, dict):
print(f" result keys: {list(result.keys())}")
# Unwrap possible nested result
if isinstance(result, dict) and "result" in result:
result = result["result"]
# Result can be a list directly or wrapped in Accounts key
accounts = result if isinstance(result, list) else result.get("Accounts", []) if isinstance(result, dict) else []
total = len(accounts)
print(f" ✅ SUCCESS {total} account(s) returned")
# Per-account output
print_header(f"Step 3: Account Details ({total} total)")
for i, acc in enumerate(accounts):
# Settings is a list of single-key dicts: [{"D09F00": "5"}, {"I1": "name"}, ...]
# Flatten to a single dict for easy lookup.
s: dict = {}
for item in acc.get("Settings", []):
s.update(item)
account_id = acc.get("AccountId", "?")
device_name = s.get("I1", "(no name)")
computer = s.get("I18") or "(M365 tenant)"
customer = s.get("I8", "")
active_ds = s.get("I78", "")
print(f"\n [{i+1}/{total}] {device_name} (AccountId: {account_id})")
print(f" Computer : {computer}")
print(f" Customer : {customer}")
print(f" Datasrc : {active_ds}")
# Total (D09)
print(f" Total:")
print(f" Status : {fmt_status(s.get('D09F00'))}")
print(f" Last session: {fmt_ts(s.get('D09F15'))}")
print(f" Last success: {fmt_ts(s.get('D09F09'))}")
print(f" 28-day bar : {fmt_colorbar(s.get('D09F08'))}")
# Per-datasource (only if present in response)
ds_pairs = [
("D1", "D1F00", "D1F15"),
("D10", "D10F00", "D10F15"),
("D11", "D11F00", "D11F15"),
("D19", "D19F00", "D19F15"),
("D20", "D20F00", "D20F15"),
("D5", "D5F00", "D5F15"),
("D23", "D23F00", "D23F15"),
]
for ds_code, f00_col, f15_col in ds_pairs:
f00 = s.get(f00_col)
f15 = s.get(f15_col)
if f00 is None and f15 is None:
continue
label = DATASOURCE_LABELS.get(ds_code, ds_code)
print(f" {label}:")
print(f" Status : {fmt_status(f00)}")
print(f" Last session: {fmt_ts(f15)}")
# Summary
print_header("Summary")
status_counts: dict[str, int] = {}
for acc in accounts:
flat: dict = {}
for item in acc.get("Settings", []):
flat.update(item)
raw = flat.get("D09F00")
bc = STATUS_MAP.get(int(raw), "Unknown") if raw is not None else "No data"
status_counts[bc] = status_counts.get(bc, 0) + 1
for status, count in sorted(status_counts.items()):
icon = {"Success": "✅", "Warning": "⚠️", "Error": "❌"}.get(status, " ")
print(f" {icon} {status}: {count}")
print(f"\n Total accounts: {total}")
print()
def main() -> None:
parser = argparse.ArgumentParser(description="Test Cove Data Protection API")
parser.add_argument("--url", default=os.environ.get("COVE_URL", API_URL))
parser.add_argument("--username", default=os.environ.get("COVE_USERNAME", ""))
parser.add_argument("--password", default=os.environ.get("COVE_PASSWORD", ""))
parser.add_argument("--records", type=int, default=50, help="Max accounts to fetch")
parser.add_argument("--debug", action="store_true", help="Print raw API responses")
args = parser.parse_args()
if not args.username or not args.password:
print("Error: --username and --password are required.")
print("Or set COVE_USERNAME and COVE_PASSWORD environment variables.")
sys.exit(1)
run(args.url, args.username, args.password, args.records, args.debug)
if __name__ == "__main__":
main()

View File

@@ -2,368 +2,9 @@
This file documents all changes made to this project via Claude Code.
## [2026-03-20] (9)
### Added
- Inbox: "Re-parse all" now shows a progress modal with a live progress bar instead of blocking the page:
- New `POST /inbox/reparse-batch` JSON endpoint processes 50 messages per call (8 s time budget) and returns `{processed, total, parsed_ok, auto_approved, no_match, errors, last_id, done}`
- The Re-parse all button now opens a Bootstrap modal that calls the batch endpoint in a loop (via `fetch`) until `done: true`, updating a progress bar and live stats (Parsed / Auto-approved / No match / Errors) after each batch
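The batching contract above can be sketched as follows. This is an illustrative Python sketch, not the actual endpoint code: the function name, the `parse` callback, and the message shape are assumptions; only the batch size, time budget, and response keys come from the entry above.

```python
import time

BATCH_SIZE = 50
TIME_BUDGET_S = 8.0

def reparse_batch(messages, last_id=0, parse=lambda m: True):
    """Process up to BATCH_SIZE messages with id > last_id inside a time budget.

    Returns a progress dict shaped like the endpoint's JSON response, so the
    client can loop: call again with the returned last_id until done is True.
    """
    started = time.monotonic()
    pending = [m for m in messages if m["id"] > last_id]
    processed = parsed_ok = errors = 0
    for msg in pending[:BATCH_SIZE]:
        if time.monotonic() - started > TIME_BUDGET_S:
            break  # stop early; the client will call again with last_id
        try:
            if parse(msg):
                parsed_ok += 1
        except Exception:
            errors += 1
        last_id = msg["id"]
        processed += 1
    return {
        "processed": processed,
        "total": len(messages),
        "parsed_ok": parsed_ok,
        "errors": errors,
        "last_id": last_id,
        "done": processed == len(pending),
    }
```

Because progress is keyed on `last_id` rather than an offset, a batch that stops early (time budget) or a retried request simply resumes from the last processed message.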
## [2026-03-20] (8)
### Added
- Jobs export/import now includes Cove and Cloud Connect account links (schema bumped to `approved_jobs_export_v2`):
- **Export**: each job entry now contains a `cove_account` object (`account_id`, `account_name`, `computer_name`) and a `cloud_connect_account` object (`user`, `section`, `repo_name`) when linked; `null` when not linked
- **Import**: accepts both `v1` (no linking) and `v2` files; for `v2` files, after creating/updating the job the importer looks up the matching `CoveAccount` by `account_id` (fallback: `account_name` + `computer_name`) and the matching `CloudConnectAccount` by `user` + `section` + `repo_name`, and links them to the job — only if the account is not yet linked to a different job
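The v2 lookup order described above (primary key first, name fallback second, never stealing a linked account) can be sketched like this. The function name and the plain-dict stand-ins for `CoveAccount` rows are illustrative assumptions:

```python
def resolve_cove_account(accounts, entry, job_id):
    """Find the CoveAccount to link for a v2 import entry.

    accounts: iterable of dicts standing in for CoveAccount rows.
    Returns the account to link, or None when there is no safe match.
    """
    if not entry:
        return None  # v1 files carry no cove_account object
    # Primary match on the stable Cove account id.
    match = next((a for a in accounts if a["account_id"] == entry["account_id"]), None)
    if match is None:
        # Fallback: account_name + computer_name (ids can differ between environments).
        match = next(
            (a for a in accounts
             if a["account_name"] == entry["account_name"]
             and a["computer_name"] == entry["computer_name"]),
            None,
        )
    # Never relink an account that already belongs to a different job.
    if match and match.get("job_id") not in (None, job_id):
        return None
    return match
```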
## [2026-03-20] (7)
### Fixed
- Inbox badge in sidebar now visible on all pages (not only the dashboard):
- Added a Flask `context_processor` in `__init__.py` that injects `inbox_count` into every template for authenticated users
- Previously `inbox_count` was only passed in the dashboard route, causing the badge to disappear on all other pages
## [2026-03-20] (6)
### Fixed
- Run Checks and Job Detail modals — objects list sorting:
- `objectSeverityRank`: Warning items with an `error_message` (e.g. "Processing mailbox: MT completed with warning: Cannot process") were incorrectly ranked as Critical (rank 0) due to `|| err` on the rank-0 check; they are now correctly ranked as Warning (rank 1); only status `error`/`failed`/`failure` triggers rank 0
- Success objects that do have an `error_message` are still promoted to Warning rank (rank 1) to keep them visible
- Run Checks modal — mail iframe no longer collapses to near-zero height:
- `#rcm_mail_iframe_body` was missing flex rules so the `flex: 1 1 auto` on `#rcm_body_iframe` had no effect (the iframe is not a direct flex child of `.rcm-mail-panel`)
- Fixed: `#rcm_mail_iframe_body` now gets `flex: 1 1 auto; min-height: 0; overflow: hidden` so it fills the available panel space; `#rcm_body_iframe` gets `height: 100%; display: block`
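The corrected `objectSeverityRank` logic from the first fix above, rendered as a Python sketch (the frontend implements this in JavaScript; statuses and ranks are as described in the entry):

```python
def object_severity_rank(status, error_message=None):
    """Rank an object for sorting: lower ranks sort first.

    Only hard failure statuses are Critical. A Warning with an error
    message stays Warning (the old bug promoted it to Critical), and a
    Success with an error message is promoted to Warning for visibility.
    """
    s = (status or "").lower()
    if s in ("error", "failed", "failure"):
        return 0  # Critical
    if s == "warning" or error_message:
        return 1  # Warning (including Success with an error_message)
    return 2      # Success
```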
## [2026-03-20] (5)
### Fixed
- Missed run detection: false positives caused by stale schedule data:
- **Time-of-day changes**: old time slot stayed active until 500 historical runs were "used up"; now weekly inference only looks at the last 90 days, so a changed run time no longer generates missed runs on the old slot after 90 days
- **Frequency changes** (e.g. daily → weekly): same 90-day window ensures old patterns stop influencing inference within 3 months
- **Monthly jobs falsely detected as weekly**: after ~21 months a monthly job at a fixed time accumulated 3+ hits per weekday, triggering daily missed runs; fixed by a cadence guard — if the median gap between runs ≥ 20 days, weekly inference is skipped and monthly inference handles the job instead
- **Monthly inference**: limited to the last 180 days so schedule changes are forgotten within 6 months while still providing enough data (≥ 3 occurrences) for detection
- `MIN_OCCURRENCES` for weekly inference raised from 3 → 5 to reduce false positives from transitional patterns (two overlapping slots during a time shift)
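The windowing and cadence guard above can be condensed into one decision function. This is a simplified sketch, not the actual inference code; the function name and signature are assumptions, while the 90-day window, the 20-day median-gap guard, and `MIN_OCCURRENCES = 5` come from the entry:

```python
from datetime import timedelta
from statistics import median

WEEKLY_WINDOW_DAYS = 90
CADENCE_GUARD_DAYS = 20
MIN_OCCURRENCES = 5

def should_infer_weekly(run_dates, today):
    """Decide whether weekly schedule inference applies to a job.

    Only runs inside the 90-day window count, and jobs whose median gap
    between runs is >= 20 days are left to monthly inference instead.
    """
    cutoff = today - timedelta(days=WEEKLY_WINDOW_DAYS)
    recent = sorted(d for d in run_dates if d >= cutoff)
    if len(recent) < MIN_OCCURRENCES:
        return False
    gaps = [(b - a).days for a, b in zip(recent, recent[1:])]
    return median(gaps) < CADENCE_GUARD_DAYS
```

A monthly job fails both gates: inside 90 days it has too few occurrences, and even a dense history of ~21-day gaps trips the cadence guard.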
## [2026-03-20] (4)
### Added
- Settings → General: "Security" card with login captcha toggle (`login_captcha_enabled`):
- When disabled, the math captcha is hidden on the login page and not validated
- Default `TRUE` for new installs and existing installs after migration
- Migration `migrate_system_settings_login_captcha()` adds the column with `DEFAULT TRUE`
- Audit-logged when changed (same pattern as other General settings)
### Fixed
- Login page layout broken when flash messages were present (e.g. "You have been logged out"):
- `bc-main-auth .bc-content` now uses `max-width: 480px; margin: 0 auto` as a centered column instead of `align-items: center` on a flex container (which caused Bootstrap `col-md-4` percentage widths to collapse)
- `login.html`: replaced `row justify-content-center / col-md-4` with a plain `<div>` — the parent CSS column handles centering
### Changed
- Sandbox banner: semi-transparent (`rgba(220,53,69,0.45)`) instead of solid red
## [2026-03-20] (3)
### Added
- Cove importer: historical run backfill from 28-day colorbar (`D09F08`):
- When a new run is created for a linked job, `_backfill_colorbar_runs()` reconstructs up to 27 days of history from the colorbar field
- Each non-zero colorbar position creates a `JobRun` with `source_type="cove_api"` and `external_id="cove-colorbar-{account_id}-{date}"`
- Uses same time-of-day as the real run for approximate `run_at` timestamps
- Fully idempotent: `external_id` deduplication prevents duplicates on subsequent imports
- Resolves the issue where only the most-recent session was visible after first linking an account
- Cove run details popup in job detail page:
- Cove run rows in job detail history table are now clickable (even without a mail message)
- New endpoint `GET /cove/run/<run_id>/detail` returns structured Cove account info and per-datasource objects
- Popup shows: account name, computer, customer (Cove), readable datasource labels, last run, status
- Mail section is hidden entirely for Cove runs (no email involved)
- `routes_jobs.py`: `source_type` added to `history_rows` dict so JS can detect Cove runs
- `job_detail.html`: rows with `source_type=cove_api` get `data-source-type` attribute and are made clickable; JS routes to `/cove/run/<id>/detail` instead of `inbox_message_detail`
- Run Checks popup (`routes_run_checks.py`): `cove_summary` added to run payload for `source_type=cove_api` runs with same structured details; mail section hidden for Cove runs
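The colorbar backfill described at the top of this entry can be sketched as below. Assumptions are flagged in comments: the rightmost colorbar character is taken to be the day of the real run, and the digit-to-status mapping mirrors the session-status codes used elsewhere; the function name and dict shape are illustrative, not the importer's actual code.

```python
from datetime import datetime, timedelta

# Assumed mapping of colorbar digits to run statuses; '0' = no session.
COLORBAR_STATUS = {"5": "Success", "8": "Warning", "2": "Error", "1": "Warning"}

def backfill_from_colorbar(colorbar, last_run_at, account_id):
    """Reconstruct approximate historical runs from the 28-day colorbar.

    Position -1 (rightmost) is assumed to be the day of the real run and is
    skipped; run_at reuses the real run's time of day as an approximation.
    The stable external_id makes repeated imports idempotent.
    """
    runs = []
    bar = (colorbar or "")[-28:]
    last_day = last_run_at.date()
    for pos, ch in enumerate(reversed(bar)):
        day = last_day - timedelta(days=pos)
        if pos == 0 or ch not in COLORBAR_STATUS:
            continue  # skip today (real run exists) and empty/unknown days
        runs.append({
            "run_at": datetime.combine(day, last_run_at.time()),
            "status": COLORBAR_STATUS[ch],
            "external_id": f"cove-colorbar-{account_id}-{day.isoformat()}",
        })
    return runs
```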
## [2026-03-20] (2)
### Changed
- Cove Accounts page: same clickable-row UX as Cloud Connect — removed per-row "Link / Create Job" button and inline modals (N modals → 1 shared modal):
- Unmatched rows are clickable; modal pre-fills job name, backup software, backup type as read-only
- Customer via datalist auto-complete; Cove's own customer name is used as pre-fill suggestion when it matches an existing customer
- `routes_cove.py`: `customers` passed as dicts for `tojson`; `job_name` and `backup_type` now derived server-side (no longer read from form fields)
## [2026-03-20]
### Fixed
- Cloud Connect accounts: users with multiple repositories (e.g. Veeam Cloud Connect Repository + Cloud Connect Immutable) now get a separate staging account entry per repository instead of overwriting each other:
- `CloudConnectAccount` unique key changed from `(user, section)` to `(user, section, repo_name)`
- Migration `migrate_cc_accounts_repo_unique_key`: drops old constraint, makes `repo_name` NOT NULL (default `''`), adds new constraint
- Importer: upserts on `(user, section, repo_name)`; `external_id` now includes repo slug so each repo gets its own `JobRun`
- Job creation suggestion uses `repo_name` as the default job name (falls back to user when repo_name is empty)
- Cloud Connect accounts page: `data-job-name` attribute on row, modal reads it correctly
- Cloud Connect runs in job detail page popup now show a structured CC summary instead of the raw report email with all tenants:
- `routes_inbox.py` (`inbox_message_detail`): accepts optional `?run_id=` parameter; when the run has `source_type = "cloud_connect"`, returns `cloud_connect_summary` dict and per-run objects from `run_object_links` instead of MailObjects
- `job_detail.html`: passes `run_id` to the detail API; if `cloud_connect_summary` is returned, shows the CC summary panel, collapses the raw email (accessible via "show" toggle), and shows only the single per-run repository object
- "Delete all jobs" in Settings → Maintenance no longer times out on large datasets:
- Replaced ORM-based deletion (loaded all jobs/runs into Python memory, deleted object by object) with direct SQL `DELETE FROM` statements in FK order — handles 650K+ rows in seconds
- Added `job_run_review_events` to the FK cleanup sequence (was causing a FK violation)
- Added `cove_accounts` and `cloud_connect_accounts` unlinking before job deletion
- Gunicorn worker timeout raised from default 30 s to 120 s (`Dockerfile`)
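The replacement deletion strategy above boils down to issuing plain `DELETE` statements with children before parents. A minimal sketch using stdlib `sqlite3` (the app itself runs SQLAlchemy on PostgreSQL; the table order and unlink steps are inferred from the bullets above, everything else is illustrative):

```python
import sqlite3

# Children before parents so no DELETE violates a foreign key.
# Order is inferred from the FK cleanup sequence described above.
DELETE_ORDER = ["job_run_review_events", "run_object_links", "job_runs", "jobs"]

def delete_all_jobs(conn: sqlite3.Connection) -> None:
    """Bulk-delete via plain SQL instead of iterating ORM objects in memory."""
    cur = conn.cursor()
    # Unlink staging tables first so their FK columns stop referencing jobs.
    cur.execute("UPDATE cove_accounts SET job_id = NULL")
    cur.execute("UPDATE cloud_connect_accounts SET job_id = NULL")
    for table in DELETE_ORDER:
        cur.execute(f"DELETE FROM {table}")
    conn.commit()
```

Direct `DELETE FROM` keeps all the work inside the database engine, which is why 650K+ rows finish in seconds instead of exhausting a 30 s worker timeout.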
### Changed
- Cloud Connect Accounts page: replaced per-row "Link / Create Job" button and inline modals with clickable rows and a single shared modal (mirrors Inbox UX):
- Clicking an unmatched row opens a modal pre-filled with the account's user and section
- "Create new job" tab: customer via datalist input (auto-complete); Job name (from User) and Backup type (from Section) shown as read-only — not editable
- "Link to existing job" tab: unchanged
- `routes_cloud_connect.py`: job name and backup type are now derived server-side from `acc.user` and `acc.section` instead of reading from form fields
## [2026-03-19] (2)
### Added
- Veeam Cloud Connect importer — same inbox-style staging flow as Cove Data Protection:
- `app/cloud_connect_importer.py` — HTML parser for Cloud Connect daily report emails, upserts tenant rows into `cloud_connect_accounts`, creates `JobRun` records for linked accounts
- `app/main/routes_cloud_connect.py``/cloud-connect/accounts` page with link/unlink actions (create new job or link to existing)
- `templates/main/cloud_connect_accounts.html` — inbox-style page: unmatched accounts first, matched accounts below
- `CloudConnectAccount` model added to `models.py` (staging table, unique on user × section)
- `migrate_cloud_connect_accounts_table()` added to `migrations.py`, registered in `run_all_migrations()`
- `mail_importer.py` — Cloud Connect hook: detects `backup_type == "cloud connect report"`, calls `upsert_cloud_connect_report()`, auto-approves mail on success
- Sidebar link "Cloud Connect" added for admin/operator roles
## [2026-03-19]
### Changed
- Redesigned application layout to sidebar-first design system (Layout v2):
- `static/css/layout.css` completely rewritten with IBM Plex Sans/Mono fonts, CSS design tokens, and a fixed dark sidebar (`--bc-sidebar-w: 220px`)
- `templates/layout/base.html` updated: Google Fonts preload for IBM Plex, `bc-body` class, sidebar-aware structure, simplified dark-mode JS
- `templates/documentation/base.html` and `templates/main/dashboard.html` aligned to new layout structure
- Missed-run generation and Autotask ticket polling now run in a background daemon thread on Run Checks page load:
- Prevents page-load delay caused by heavy DB operations
- Throttled per job (10-minute minimum interval) to avoid pile-up
- Single `_bg_sweep_lock` guard prevents concurrent sweeps
- Cove import manual run now logs and displays per-skip-reason breakdown (`reasons=` in audit log and flash message)
- Cove timestamp parsing now supports additional formats: epoch milliseconds, epoch microseconds/nanoseconds, and .NET JSON Date strings (`/Date(ms)/`)
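Those timestamp formats can be normalized with a magnitude heuristic. A sketch (the unit thresholds are an assumption, not the app's actual implementation):

```python
import re
from datetime import datetime, timezone

def parse_cove_timestamp(value):
    """Parse epoch seconds/ms/us/ns and .NET '/Date(ms)/' strings to UTC datetimes.

    The magnitude thresholds below are a heuristic (an assumption), chosen so
    that plausible dates never overlap between units.
    """
    if isinstance(value, str):
        m = re.match(r"^/Date\((\d+)([+-]\d{4})?\)/$", value.strip())
        if m:
            value = int(m.group(1))  # .NET JSON dates carry epoch milliseconds
        elif value.strip().isdigit():
            value = int(value.strip())
        else:
            return None
    ts = float(value)
    for limit, divisor in ((1e11, 1), (1e14, 1e3), (1e17, 1e6), (float("inf"), 1e9)):
        if abs(ts) < limit:
            return datetime.fromtimestamp(ts / divisor, tz=timezone.utc)
```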
## [2026-03-12]
### Fixed
- Prevented automatic mail-to-job matching from selecting archived jobs:
- Updated `app/job_matching.py` so `find_matching_job()` excludes `jobs.archived = true`.
- Applied the same archived filter in the VSPC normalized fallback match path.
- This prevents new imports/re-parse auto-approval from attaching messages/runs to archived jobs.
## [2026-03-02]
### Fixed
- Cove run creation after account linking/import:
- Fixed transaction scope in `app/cove_importer.py` for datasource object persistence.
- `run_object_links` / related upserts now use the same SQLAlchemy session transaction as `JobRun` creation instead of a separate engine connection.
- Prevents FK/visibility issues where a new uncommitted `JobRun` was not visible to a second connection, causing run creation to roll back and resulting in no Cove runs appearing.
- Cove link compatibility between two link paths:
- `app/cove_importer.py` now falls back to `jobs.cove_account_id` when `cove_accounts.job_id` is not set yet.
- `main/routes_jobs.py` (`POST /jobs/<id>/set-cove-account`) now synchronizes `cove_accounts.job_id` when the staged Cove account already exists.
- Fixes scenario where Cove import showed many skipped accounts and zero new runs because links were saved only on `jobs.cove_account_id`.
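The transaction-scope fix above can be reproduced with two stdlib `sqlite3` connections: a row inserted but not yet committed on connection A is invisible to connection B, which is why the datasource upserts must share the same session/transaction as the `JobRun` insert (minimal illustration; the app uses SQLAlchemy sessions on PostgreSQL):

```python
import os
import sqlite3
import tempfile

def count_runs(conn):
    return conn.execute("SELECT COUNT(*) FROM job_runs").fetchone()[0]

path = os.path.join(tempfile.mkdtemp(), "demo.db")
a = sqlite3.connect(path)
a.execute("CREATE TABLE job_runs (id INTEGER PRIMARY KEY)")
a.commit()

a.execute("INSERT INTO job_runs VALUES (1)")  # new JobRun, not committed yet
b = sqlite3.connect(path)                     # the "second engine connection"
same_conn_sees = count_runs(a)                # visible inside its own transaction
other_conn_sees = count_runs(b)               # invisible to the second connection
a.commit()                                    # only now can connection B see the row
```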
## [2026-02-27]
### Added
- Run Checks user preferences (per user) stored on `users`:
- `run_checks_sort_mode`
- `run_checks_filter_statuses`
- `run_checks_filter_has_ticket`
- `run_checks_filter_has_remark`
- `run_checks_filter_q`
- DB migration `migrate_users_run_checks_preferences()` for the new user preference columns.
- New route `POST /run-checks/preferences` to save current Run Checks filter/sort controls as user defaults.
- User Settings page now includes a dedicated "Run Checks preferences" section (next to password self-service).
### Changed
- Run Checks now supports user-configurable sorting:
- default: Customer > Backup > Type > Job (existing behavior)
- optional: `Critical > Missed > Warning > Success (override) > Success`
- Run Checks now supports user-configurable filtering:
- status filter (`Critical`, `Missed`, `Warning`, `Success (override)`, `Success`)
- only jobs with active ticket
- only jobs with active remark
- Run Checks page loads with the logged-in user's saved defaults when no explicit query parameters are provided.
- Removed the extra Run Checks-specific search field from the Run Checks and User Settings UI to keep the filtering scope aligned with the original request
### Fixed
- Fixed a Run Checks regression where the search string variable was overwritten by the SQLAlchemy query object, which could cause empty results after changing sort/filter options.
## [2026-02-23]
### Added
- Cove Data Protection full integration into Backupchecks:
- `app/cove_importer.py` Cove API client: login, paginated EnumerateAccountStatistics, status mapping, deduplication, per-datasource object persistence
- `app/cove_importer_service.py` background thread that polls Cove API on configurable interval
- `SystemSettings` model: 8 new Cove fields (`cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`)
- `Job` model: `cove_account_id` column to link a job to a Cove account
- `JobRun` model: `source_type` (NULL = email, "cove_api") and `external_id` (deduplication key) columns
- DB migration `migrate_cove_integration()` for all new columns + deduplication index
- Settings > Integrations tab: new Cove section with enable toggle, API URL/username/password, import interval, and Test Connection button (AJAX → JSON response with partner ID)
- Job Detail page: Cove Integration card showing Account ID input (only when `cove_enabled`)
- Route `POST /settings/cove/test-connection` verifies Cove credentials and stores partner ID
- Route `POST /settings/cove/run-now` manually triggers a Cove import from the Settings page
- Route `POST /jobs/<id>/set-cove-account` saves or clears Cove Account ID on a job
- Cove Accounts inbox-style flow:
- `CoveAccount` model (staging table): stores all Cove accounts from API, with optional `job_id` link
- DB migration `migrate_cove_accounts_table()` creates `cove_accounts` table with indexes
- `cove_importer.py` updated: always upserts all accounts into staging table; JobRuns only created for accounts with a linked job
- `routes_cove.py` new routes: `GET /cove/accounts`, `POST /cove/accounts/<id>/link`, `POST /cove/accounts/<id>/unlink`
- `cove_accounts.html` inbox-style page: unmatched accounts shown first with "Link / Create job" modals (two tabs: create new job or link to existing), matched accounts listed below with Unlink button
- Nav bar: "Cove Accounts" link added for admin/operator roles when `cove_enabled`
- Route `POST /settings/cove/run-now` triggers manual import (button also shown on Cove Accounts page)
- `cove_api_test.py` standalone Python test script to verify Cove Data Protection API column codes
- Tests D9Fxx (Total), D10Fxx (VssMsSql), D11Fxx (VssSharePoint), and D1Fxx (Files&Folders)
- Displays backup status (F00), timestamps (F09/F15/F18), error counts (F06) per account
- Accepts credentials via CLI args or environment variables
- Summary output showing which column sets work
- Updated `docs/cove_data_protection_api_calls_known_info.md` with N-able support feedback:
- D02/D03 are legacy; use D10/D11 or D9 (Total) instead
- All users have the same API access (no MSP-level restriction)
- Session status codes documented (D9F00: 2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- Updated `TODO-cove-data-protection.md` with breakthrough status and next steps
### Changed
- Cove import/link behavior and visibility refinements:
- Fixed Settings → Cove "Run import now" button submission issue caused by nested form markup
- `/cove/accounts` now shows derived fields for faster linking decisions:
- `backup_software` (Cove Data Protection)
- derived `backup_type` (Server, Workstation, Microsoft 365)
- derived `job_name`
- human-readable datasource labels
- `computer_name` visible in matched and unmatched sections
- Linking a Cove account now triggers an immediate import attempt, so latest runs can appear without waiting for interval
- Improved feedback after linking with per-linked-job run delta and clearer reason when no run is created
- Cove run enrichment:
- `JobRun.remark` now stores account/computer/customer/status/last-run summary
- per-datasource run object records now include status detail text and datasource session timestamp
- Cove timestamp fallback for run creation: use `D09F15`, fallback to `D09F09` when needed
### Fixed
- Cove deduplication scope:
- Dedup check changed from global `external_id` to per-job (`job_id + external_id`) to prevent newly linked/relinked jobs from being blocked by sessions imported under another job
- Navbar compression for split-screen usage:
- Reworked top navigation to reduce horizontal pressure without forced full collapse
- Moved admin-only links into `Admin` dropdown
- Added `More` dropdown for secondary non-admin navigation links
- Kept primary daily operational link (`Run Checks`) directly visible
- Adjusted role-specific visibility: Viewer now has `Customers` and `Jobs` directly visible in navbar
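The deduplication-scope fix above amounts to widening the lookup key from `external_id` alone to the pair `(job_id, external_id)`. A minimal sketch with hypothetical names:

```python
# Old key: external_id alone -> a session imported under job 1 blocked job 2.
# New key: (job_id, external_id) -> each linked job deduplicates independently.
def should_create_run(existing_keys, job_id, external_id):
    """existing_keys: set of (job_id, external_id) tuples already imported."""
    return (job_id, external_id) not in existing_keys

existing = {(1, "sess-100")}
should_create_run(existing, 1, "sess-100")  # False: already imported for job 1
should_create_run(existing, 2, "sess-100")  # True: same session, newly linked job 2
```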
### Added
- Microsoft Entra SSO implementation (branch `v20260223-07-entra-sso`):
- Authorization code flow routes:
- `GET /auth/entra/login`
- `GET /auth/entra/callback`
- Entra-aware logout flow (redirects through Entra logout endpoint when applicable)
- Login page support for "Sign in with Microsoft" button when SSO is configured/enabled
- Settings-backed Entra configuration fields in `SystemSettings` + migration support:
- `entra_sso_enabled`
- `entra_tenant_id`
- `entra_client_id`
- `entra_client_secret`
- `entra_redirect_uri`
- `entra_allowed_domain`
- `entra_allowed_group_ids`
- `entra_auto_provision_users`
- Optional local auto-provisioning of unknown Entra users as Viewer
- Security-group gate for SSO:
- allowlist by Entra Group Object IDs
- login blocked when token lacks required group context or group overage prevents reliable evaluation
- Documentation updates for Entra SSO:
- Added page `documentation/settings/entra-sso`
- Added navigation entry under Settings
- Marked explicitly as **Untested in Backupchecks**
- Included setup instructions for tenant/client/secret/redirect, group-based access, and troubleshooting
## [2026-02-19]
### Added
- Explicit `Include Autotask IDs` import option in the Approved Jobs JSON import form (Settings -> Maintenance)
- Explicit `Include Autotask IDs` import option in the Customers CSV import form
### Changed
- Approved Jobs import now only applies `autotask_company_id` and `autotask_company_name` when the import option is checked
- Customers CSV import now only applies Autotask mapping fields when the import option is checked
- Import success and audit output now includes whether Autotask IDs were imported
- 3CX parser now recognizes `3CX Notification: Update Successful - <host>` as an informational run with `backup_software: 3CX`, `backup_type: Update`, and `overall_status: Success`, and excludes this type from schedule inference (no Expected/Missed generation)
- Run Checks now hides only non-backup 3CX informational types (`Update`, `SSL Certificate`), while other backup software/types remain visible
- Restored remark visibility in Run Checks and Job Details alerts by loading remarks from both sources: explicit run links (`remark_job_runs`) and active job scopes (`remark_scopes`) with duplicate prevention
## [2026-02-16]
### Added
- Customer-to-jobs navigation by making customer names clickable on the Customers page (`/jobs?customer_id=<id>`)
- Jobs page customer filter context UI with an active filter banner and a "Clear filter" action
- Global search page (`/search`) with grouped results for Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Existing overrides, and Reports
- Navbar search form to trigger global search from all authenticated pages
- Dedicated Remarks section in global search results (with paging and detail links), so remark records are searchable alongside tickets
### Changed
- `/jobs` route now accepts optional `customer_id` and returns only jobs for that customer when provided
- Default Jobs listing keeps inactive-customer filtering only when no `customer_id` filter is applied
- Updated `docs/technical-notes-codex.md` with a new "Last updated" date, Customers->Jobs navigation notes, and test build/push validation snapshot
- Search matching is now case-insensitive with wildcard support (`*`) and automatic contains behavior (`*term*`) per search term
- Global search visibility now only includes sections accessible to the currently active role
- Updated `docs/technical-notes-codex.md` with a dedicated Global Grouped Search section (route/UI/behavior/access rules) and latest test build digest for `v20260216-02-global-search`
- Global search now supports per-section pagination (previous/next), so results beyond the first 10 can be browsed per section while preserving current query/state
- Daily Jobs search result metadata now includes expected run time, success indicator, and run count for the selected day
- Daily Jobs search result links now open the same Daily Jobs modal flow via `open_job_id` (instead of only navigating to the overview page)
- Updated `docs/technical-notes-codex.md` with search pagination query params, Daily Jobs modal-open search behavior, and latest successful test-build digest
- Search pagination buttons now preserve scroll position by linking back to the active section anchor after page navigation
- "Open <section>" behavior now passes `q` into destination pages and applies page-level filtering, so opened overviews reflect the same search term
- Filtering support on Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Overrides, and Reports now accepts wildcard-enabled `q` terms from search
- Reports frontend loading (`/api/reports`) now forwards URL `q` so client-side refresh keeps the same filtered result set
- Daily Jobs search section UI now shows an explicit English note that the Daily Jobs page itself is day-scoped while search matches can reflect jobs across other days
- Updated `docs/technical-notes-codex.md` to include remarks in grouped search sections, `p_remarks` pagination key, q-forwarding to overview pages, and latest test-build digest
### Fixed
- `/search` page crash (`TypeError: 'builtin_function_or_method' object is not iterable`) by replacing Jinja dict access from `section.items` to `section['items']` in `templates/main/search.html`
## [2026-02-13]
### Added
- Added internal technical reference document `docs/technical-notes-codex.md` with repository structure, application architecture, processing flow, parser system rules, ticketing/Autotask constraints, feedback attachment notes, deployment/build workflow, and operational attention points
### Changed
- Changed `docs/technical-notes-codex.md` language from Dutch to English to align with project language rules for documentation
### Fixed
- Fixed Autotask tickets and internal tickets not being linked to missed runs by calling `link_open_internal_tickets_to_run` after creating missed JobRun records in `_ensure_missed_runs_for_job` (both weekly and monthly schedules), ensuring missed runs now receive the same ticket propagation as email-based runs
- Fixed checkboxes being automatically re-selected after delete actions on Inbox and Run Checks pages by adding `autocomplete="off"` attribute to all checkboxes, preventing browser from restoring previous checkbox states after page reload
## [2026-02-12]
### Fixed
- Fixed tickets not being displayed in Run Checks modal detail view (Meldingen section) by extending `/api/job-runs/<run_id>/alerts` endpoint to include both run-specific tickets (via ticket_job_runs) and job-level tickets (via ticket_scopes), ensuring newly created tickets are visible immediately in the modal instead of only after being resolved
- Fixed copy ticket button not working in Edge browser on Job Details page by moving clipboard functions (copyToClipboard, fallbackCopy, showCopyFeedback) inside IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution)
## [2026-02-10]
### Added
- Added screenshot attachment support to Feedback/Bug system (user request: allow screenshots for bugs/features)
- New database model: `FeedbackAttachment` with file_data (BYTEA), filename, mime_type, file_size
- Upload support on feedback creation form (multiple files, PNG/JPG/GIF/WEBP, max 5MB each)
- Upload support on reply forms (attach screenshots when replying)
- Inline image display on feedback detail page (thumbnails with click-to-view-full-size)
- Screenshot display for both main feedback items and replies
- File validation: image type verification using imghdr (not just extension), size limits, secure filename handling
- New route: `/feedback/attachment/<id>` to serve images (access-controlled, admins can view deleted item attachments)
- Database migration: auto-creates `feedback_attachments` table with indexes on startup
- Automatic CASCADE delete: removing feedback item or reply automatically removes associated attachments
- Added admin-only deleted items view and permanent delete functionality to Feedback system
- "Show deleted items" checkbox on feedback list page (admin only)
- Deleted items shown with reduced opacity + background color and "Deleted" badge
- Permanent delete action removes item + all attachments from database (hard delete with CASCADE)
- Attachment count shown in deletion confirmation message
- Admins can view detail pages of deleted items including their screenshots
- Two-stage delete: soft delete (audit trail) → permanent delete (database cleanup)
- Prevents accidental permanent deletion (requires item to be soft-deleted first)
- Security: non-admin users cannot view deleted items or their attachments (404 response)
- Added copy ticket button (⧉) to Job Details page modal for quickly copying ticket numbers to clipboard (previously only available on Run Checks page)
### Fixed
- Fixed cross-browser clipboard copy functionality for ticket numbers (previously required manual copy popup in Edge browser)
- Implemented three-tier fallback mechanism: modern Clipboard API → legacy execCommand('copy') → prompt fallback
- Copy button now works directly in all browsers (Firefox, Edge, Chrome) without requiring user interaction
- Applied improved copy mechanism to both Run Checks and Job Details pages
- Fixed Autotask ticket not being automatically linked to new runs when internal ticket is resolved by implementing independent Autotask propagation strategy (now checks for most recent non-deleted and non-resolved Autotask ticket on job regardless of internal ticket status, ensuring PSA ticket reference persists across runs until explicitly resolved or deleted)
- Fixed internal and Autotask tickets being linked to new runs even after being resolved by removing date-based "open" logic from ticket query (tickets now only link to new runs if they are genuinely unresolved, not based on run date comparisons)
- Fixed Job Details page showing resolved tickets for ALL runs by implementing two-source ticket display: directly linked tickets (via ticket_job_runs) are always shown for audit trail, while active window tickets (via scope query) are only shown if unresolved, preserving historical ticket links while preventing resolved tickets from appearing on new runs

## v0.2.1
### Added
- **Inbox: Re-parse all progress indicator** — "Re-parse all" now opens a progress modal with a live progress bar instead of blocking the browser; a new `POST /inbox/reparse-batch` endpoint processes 50 messages per call (8 s time budget) and the browser loops automatically until all messages are done, showing parsed / auto-approved / no match / errors stats in real time
- **Jobs export/import: Cove and Cloud Connect account links** — export schema bumped to `v2`; each job entry now includes the linked `cove_account` and `cloud_connect_account` objects; import accepts both `v1` and `v2` files and automatically re-links accounts on import (by `account_id` / `user+section+repo_name`) if not already linked to a different job
### Fixed
- **Missed run false positives** — stale schedule inference caused jobs to be marked missed after a schedule change or after long operation:
- Weekly inference now only looks at the last 90 days, so old time slots are forgotten within 3 months
- Monthly jobs that accumulated enough weekly hits (after ~21 months) no longer trigger daily missed runs; a cadence guard (median gap ≥ 20 days) routes them to monthly inference instead
- Monthly inference limited to the last 180 days so schedule changes are reflected within 6 months
- `MIN_OCCURRENCES` for weekly inference raised from 3 → 5 to reduce false positives during transitional periods
- **Objects list sort order in Run Checks and Job Detail modals** — items with a warning status and a non-empty `error_message` were ranked as Critical instead of Warning; fixed so only `error`/`failed`/`failure` status triggers rank 0
- **Mail iframe height in Run Checks modal** — iframe collapsed to near-zero height due to missing flex rules on the wrapping element; fixed by moving `flex: 1 1 auto; min-height: 0` to `#rcm_mail_iframe_body` and setting `height: 100%` on the iframe itself
- **Inbox badge disappearing on non-dashboard pages**`inbox_count` was only injected in the dashboard route; fixed by adding a Flask `context_processor` that injects it globally for all authenticated requests
## v0.2.0
This release is a significant update since `v0.1.27` (released on February 23, 2026).
It introduces a completely redesigned sidebar-first UI, a new Veeam Cloud Connect importer, Run Checks user preferences, and multiple Cove Data Protection improvements including historical run backfill and a run detail popup. Several bug fixes and UX refinements are included across the board.
### Added
- **Veeam Cloud Connect importer** — inbox-style staging flow matching the Cove integration:
- HTML parser for Cloud Connect daily report emails; upserts tenant rows into `cloud_connect_accounts`; creates `JobRun` records for linked accounts
- Cloud Connect Accounts page (`/cloud-connect/accounts`): unmatched accounts first, matched accounts below; clickable-row UX with a single shared modal (pre-fills job name, backup type, customer)
- `CloudConnectAccount` model and migration
- Sidebar link added for admin/operator roles
- **Cove: historical run backfill** — when a new run is created for a linked job, up to 27 days of history are reconstructed from the colorbar field (`D09F08`); fully idempotent via `external_id` deduplication
- **Cove run details popup** — Cove run rows in the job detail history table are now clickable; popup shows account name, computer, customer, datasource labels, last run, and status; mail section hidden for Cove runs
- **Cove Accounts page: clickable-row UX** — same shared-modal pattern as Cloud Connect; customer via datalist auto-complete with pre-fill suggestion
- **Run Checks user preferences** — per-user sort mode and filter defaults (status, ticket, remark, search query) stored in DB; saved via `POST /run-checks/preferences`; User Settings page includes a dedicated section
- **Login captcha toggle** — Settings → General "Security" card; when disabled, the math captcha is hidden and not validated; migration adds column with `DEFAULT TRUE`
### Changed
- **Layout v2** — complete sidebar-first redesign:
- `layout.css` rewritten with IBM Plex Sans/Mono fonts and CSS design tokens; fixed dark sidebar (220 px)
- `base.html` updated with Google Fonts preload and sidebar-aware structure
- **Run Checks background sweep** — missed-run generation and Autotask ticket polling now run in a background daemon thread on page load, throttled per job (10-minute minimum interval), preventing page-load delays
- **Cloud Connect run detail popup** — shows structured CC summary (account, repository, objects) instead of the raw report email; raw email accessible via a "show" toggle
- **Cloud Connect unique key** — changed from `(user, section)` to `(user, section, repo_name)` so users with multiple repositories each get a separate staging entry
- **Cove timestamp parsing** — now supports epoch milliseconds/microseconds/nanoseconds and .NET JSON Date strings
- **Sandbox banner** — semi-transparent (`rgba(220,53,69,0.45)`) instead of solid red
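The Run Checks background sweep described above (daemon thread, single lock guard, 10-minute per-job throttle) can be sketched as follows; `_bg_sweep_lock` and the interval come from the changelog, everything else is illustrative:

```python
import threading
import time

_bg_sweep_lock = threading.Lock()  # single guard: at most one sweep at a time
_last_sweep_at = {}                # job_id -> monotonic timestamp of last sweep
MIN_INTERVAL_S = 600               # 10-minute per-job throttle

def maybe_start_sweep(job_ids, sweep_fn, min_interval=MIN_INTERVAL_S):
    """Start one background sweep unless another is already running."""
    if not _bg_sweep_lock.acquire(blocking=False):
        return False  # a sweep is in progress; the page load is not delayed

    def _run():
        try:
            now = time.monotonic()
            for job_id in job_ids:
                if now - _last_sweep_at.get(job_id, 0.0) < min_interval:
                    continue  # swept recently; skip to avoid pile-up
                sweep_fn(job_id)  # e.g. missed-run generation + ticket polling
                _last_sweep_at[job_id] = now
        finally:
            _bg_sweep_lock.release()

    threading.Thread(target=_run, daemon=True).start()
    return True
```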
### Fixed
- Login page layout no longer breaks when flash messages are present
- "Delete all jobs" in Settings → Maintenance no longer times out on large datasets (replaced ORM deletion with direct SQL `DELETE FROM` statements; handles 650K+ rows in seconds)
- Automatic mail-to-job matching no longer selects archived jobs
- Cove run creation transaction scope fixed (FK/visibility issue with second DB connection)
- Cove link sync between both link paths (`cove_accounts.job_id` ↔ `jobs.cove_account_id`)
## v0.1.27
This release is a major functional update since `v0.1.26` (released on February 10, 2026).
It introduces full Cove Data Protection integration, broad search and navigation improvements, and multiple workflow/ticketing fixes. It also adds Microsoft Entra SSO foundations (currently marked as untested in Backupchecks), along with extensive documentation updates and UI refinements.
### Added
- Full Cove Data Protection integration:
- API importer + background polling
- Cove Accounts staging/linking page
- Manual import trigger
- JobRun source tracking + external IDs
- CoveAccount model + migrations
- Per-datasource object persistence
- Cove API test script (`cove_api_test.py`)
- Global grouped search with role-aware results
- Per-section pagination in search
- Remarks included in search results
- Customer → Jobs quick filter navigation
- Optional Autotask ID import toggle (jobs/customers import)
- Microsoft Entra SSO implementation (marked as **untested in Backupchecks**):
- login/callback/logout flow
- settings + migrations
- optional auto-provisioning
- tenant/domain restriction
- security-group gate (allowed Entra group IDs)
- New documentation page for Entra SSO setup
### Changed
- Cove import and linking flow refinements:
- Immediate import after linking
- Type derivation (Server/Workstation/Microsoft 365)
- More readable Cove Accounts display
- Richer run details
- Timestamp fallback (`D09F15` → `D09F09`)
- Navbar restructuring:
- Admin links grouped under `Admin`
- Secondary links grouped under `More`
- Cove Accounts moved back to main bar
- Daily Jobs moved under `More`
- Viewer role now has Customers/Jobs directly on navbar
- Search UX improvements:
- wildcard/contains filtering
- query forwarding to overview pages
- section-anchor and pagination state preservation
- Parser/Run Checks behavior updates:
- 3CX update emails handled as informational
- Non-backup 3CX types hidden from Run Checks
- Documentation expanded and corrected (workflow, run review, mail import, settings, etc.)
### Fixed
- Tickets not shown in Run Checks modal fixed
- Copy ticket button works in Edge (scope/clipboard fallback)
- Resolved tickets incorrectly shown on new runs fixed (explicit link-based logic)
- Duplicate tickets in Run Checks popup fixed
- Missed-run ticket linking with Autotask/internal tickets fixed
- Cove run deduplication corrected to per-job scope
- Cove “Run import now” submit issue fixed
- Checkbox auto-reselect after reload fixed
- Search template crash fixed (`section.items`)
- Cleanup: Python cache artifacts are no longer tracked
## v0.1.26
This critical bug fix release resolves ticket system display issues where resolved tickets were incorrectly appearing on new runs across multiple pages. The ticket system has been completely transitioned from date-based logic to explicit link-based queries, ensuring resolved tickets stop appearing immediately after resolution while preserving audit trail for historical runs.

# Cove Data Protection (N-able Backup) Known Information on API Calls
Date: 2026-02-10 (updated 2026-02-23)
Status: Pending re-test with corrected column codes
## ⚠️ Important Update (2026-02-23)
**N-able support (Andrew Robinson, Applications Engineer) confirmed:**
1. **D02 and D03 are legacy column codes**; use **D10 and D11** instead.
2. **There is no MSP-level restriction**; all API users have the same access level.
3. New documentation: https://developer.n-able.com/n-able-cove/docs/getting-started
4. Column code reference: https://developer.n-able.com/n-able-cove/docs/column-codes
**Impact:** The security error 13501 was caused by using legacy D02Fxx/D03Fxx codes.
Using D9Fxx (Total aggregate), D10Fxx (VssMsSql), D11Fxx (VssSharePoint) should work.
**Key newly available columns (pending re-test):**
- `D9F00` = Last Session Status (2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- `D9F06` = Last Session Errors Count
- `D9F09` = Last Successful Session Timestamp (Unix)
- `D9F12` = Session Duration
- `D9F15` = Last Session Timestamp (Unix)
- `D9F17` = Last Completed Session Status
- `D9F18` = Last Completed Session Timestamp (Unix)
**Session status codes (F00):**
1=In process, 2=Failed, 3=Aborted, 5=Completed, 6=Interrupted,
7=NotStarted, 8=CompletedWithErrors, 9=InProgressWithFaults,
10=OverQuota, 11=NoSelection, 12=Restarted
**Test script:** `cove_api_test.py` in the project root; run this to verify the new column codes.
---
## Summary of original findings (2026-02-10)
API access to Cove Data Protection via JSON-RPC **works**, but was **heavily restricted**
because legacy column codes (D02Fxx, D03Fxx) were being used. Now resolved.
Previous error:
```
Operation failed because of security reasons (error 13501)
```
---
## Authentication model (confirmed)
- Endpoint: https://api.backup.management/jsonapi
- Protocol: JSON-RPC 2.0
- Method: POST only
- Authentication flow:
1. Login method is called
2. Response returns a **visa** token (top-level field)
3. The visa **must be included in every subsequent call**
4. Cove may return a new visa in later responses (token chaining)
### Login request (working)
```json
{
"jsonrpc": "2.0",
"method": "Login",
"params": {
"partner": "<EXACT customer/partner name>",
"username": "<api login name>",
"password": "<password>"
},
"id": "1"
}
```
### Login response structure (important)
```json
{
"result": {
"result": {
"PartnerId": <number>,
"Name": "<login name>",
"Flags": ["SecurityOfficer","NonInteractive"]
}
},
"visa": "<visa token>"
}
```
Notes:
- `visa` is **not** inside `result`, but at top level
- `PartnerId` is found at `result.result.PartnerId`
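The handshake above can be sketched in Python. The helper names are illustrative (not from the Backupchecks codebase), and the HTTP POST itself is left as a comment:

```python
# Sketch of the Login handshake described above; helper names are illustrative.

def build_login_payload(partner: str, username: str, password: str) -> dict:
    """JSON-RPC 2.0 Login request body (POST to the jsonapi endpoint)."""
    return {
        "jsonrpc": "2.0",
        "method": "Login",
        "params": {"partner": partner, "username": username, "password": password},
        "id": "1",
    }

def extract_session(response: dict) -> tuple[str, int]:
    """Pull the visa (top level, NOT inside 'result') and the nested PartnerId."""
    visa = response["visa"]
    partner_id = response["result"]["result"]["PartnerId"]
    return visa, partner_id

# The actual call would look roughly like:
#   data = requests.post("https://api.backup.management/jsonapi",
#                        json=build_login_payload(...)).json()
#   visa, partner_id = extract_session(data)
```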
---
## API user scope (critical finding)
- API users are **always bound to a single Partner (customer)** unless created at MSP/root level
- In this environment, it is **not possible to create an MSP-level API user**
- All testing was therefore done with **customer-scoped API users**
Impact:
- Cross-customer enumeration is impossible
- Only data belonging to the linked customer can be queried
- Some enumerate/reporting calls are blocked regardless of role
---
## EnumerateAccountStatistics what works and what does not
### Method
```json
{
"jsonrpc": "2.0",
"method": "EnumerateAccountStatistics",
"visa": "<visa>",
"params": {
"query": {
"PartnerId": <partner_id>,
"SelectionMode": "Merged",
"StartRecordNumber": 0,
"RecordsCount": 50,
"Columns": [ ... ]
}
}
}
```
### Mandatory behavior
- **Columns are required**; omitting them returns `result: null`
- The API behaves as an **allowlist**:
- If *any* requested column is restricted, the **entire call fails** with error 13501
### Confirmed working (safe) column set
The following column set works reliably:
- I1 → account / device / tenant identifier
- I14 → used storage (bytes)
- I18 → computer name (if applicable)
- D01F00–D01F07 → numeric metrics (exact semantics TBD)
- D09F00 → numeric status/category code
Example (validated working):
```json
"Columns": [
"I1","I14","I18",
"D01F00","D01F01","D01F02","D01F03",
"D01F04","D01F05","D01F06","D01F07",
"D09F00"
]
```
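A sketch of how a request using only the safe column set could be assembled (pure payload construction; the visa comes from a prior Login):

```python
# Safe column set validated above. Adding any restricted column to this
# list would make the entire call fail with security error 13501.
SAFE_COLUMNS = [
    "I1", "I14", "I18",
    "D01F00", "D01F01", "D01F02", "D01F03",
    "D01F04", "D01F05", "D01F06", "D01F07",
    "D09F00",
]

def build_statistics_payload(visa: str, partner_id: int,
                             start: int = 0, count: int = 50) -> dict:
    """EnumerateAccountStatistics request body with the validated column set."""
    return {
        "jsonrpc": "2.0",
        "method": "EnumerateAccountStatistics",
        "visa": visa,
        "params": {
            "query": {
                "PartnerId": partner_id,
                "SelectionMode": "Merged",
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": list(SAFE_COLUMNS),
            }
        },
    }
```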
### Confirmed restricted (cause security error 13501)
- Entire D02Fxx range
- Entire D03Fxx range
- Broad I-ranges (e.g. I1–I10 batches)
- Many individually tested I-codes not in the safe set
Even adding **one restricted code** causes the entire call to fail.
---
## EnumerateAccounts
- Method consistently fails with `Operation failed because of security reasons`
- This applies even with:
- SuperUser role
- SecurityOfficer flag enabled
Conclusion:
- EnumerateAccounts is **not usable** in this tenant for customer-scoped API users
---
## Other tested methods
- EnumerateStatistics → Method not found
- GetPartnerInfo → works only for basic partner metadata (not statistics)
---
## Practical implications for BackupChecks
What **is possible**:
- Enumerate accounts implicitly via EnumerateAccountStatistics
- Identify devices/accounts via AccountId + I1/I18
- Collect storage usage (I14)
- Collect numeric status/metrics via D01Fxx and D09F00
What is **not possible (via this API scope)**:
- Reliable last backup timestamp
- Explicit success / failure / warning text
- Error messages
- Enumerating devices via EnumerateAccounts
- Crosscustomer aggregation
### Suggested internal model mapping
- Customer
- external_id = PartnerId
- Job
- external_id = AccountId
- display_name = I1
- hostname = I18 (if present)
- Run (limited)
- metrics only (bytes, counters)
- status must be **derived heuristically** from numeric fields (if possible)
---
## Open questions / next steps
1. Confirm official meaning of:
- D01F00–D01F07
- D09F00
2. Investigate whether:
- A token-based (non-JSON-RPC) reporting endpoint exists
- N-able support can enable additional reporting columns
- An MSP-level API user can be provisioned by N-able
3. Decide whether Cove integration in BackupChecks will be:
- Metrics-only (no run result semantics)
- Or require vendor cooperation for expanded API access


@ -1,808 +0,0 @@
# Technical Notes (Internal)
Last updated: 2026-03-20
## Purpose
Internal technical snapshot of the `backupchecks` repository for faster onboarding, troubleshooting, and change impact analysis.
## Repository Overview
- Application: Flask web app with SQLAlchemy and Flask-Migrate.
- Runtime: Containerized (Docker), deployed via Docker Compose stack.
- Primary source code location: `containers/backupchecks/src`.
- The project also contains extensive functional documentation in `docs/` and multiple roadmap TODO files at repository root.
## Main Structure
- `containers/backupchecks/Dockerfile`: Python 3.12-slim image, starts `gunicorn` with `backend.app:create_app()`.
- `containers/backupchecks/requirements.txt`: Flask stack + PostgreSQL driver + reporting libraries (`reportlab`, `Markdown`).
- `containers/backupchecks/src/backend/app`: backend domain logic, routes, parsers, models, migrations.
- `containers/backupchecks/src/templates`: Jinja templates for auth/main/documentation pages.
- `containers/backupchecks/src/static`: CSS, images, favicon.
- `deploy/backupchecks-stack.yml`: compose stack with `backupchecks`, `postgres`, `adminer`.
- `build-and-push.sh`: release/test build script with version bumping, tags, and image push.
- `docs/`: functional design, changelogs, migration notes, API notes.
## Application Architecture (Current Observation)
- Factory pattern: `create_app()` in `containers/backupchecks/src/backend/app/__init__.py`.
- Blueprints:
- `auth_bp` for authentication.
- `main_bp` for core functionality.
- `doc_bp` for internal documentation pages.
- Database initialization at startup:
- `db.create_all()`
- `run_migrations()`
- Background tasks:
- `start_auto_importer(app)` starts the automatic mail importer thread.
- `start_cove_importer(app)` starts the Cove Data Protection polling thread (started only when `cove_import_enabled` is set).
- Global template context:
- `inject_inbox_count()` context processor injects `inbox_count` into every template for authenticated users (sidebar badge).
- Health endpoint:
- `GET /health` returns `{ "status": "ok" }`.
## Functional Processing Flow
- Import:
- Email is fetched via Microsoft Graph API.
- Parse:
- Parser selection through registry + software-specific parser implementations.
- Approve:
- New jobs first appear in Inbox for initial customer assignment.
- Auto-process:
- Subsequent emails for known jobs automatically create `JobRun` records.
- Monitor:
- Runs appear in Daily Jobs and Run Checks.
- Review:
- Manual review removes items from the unreviewed operational queue.
## Configuration and Runtime
- Config is built from environment variables in `containers/backupchecks/src/backend/app/config.py`.
- Important variables:
- `APP_SECRET_KEY`
- `APP_ENV`
- `APP_PORT`
- `POSTGRES_DB`
- `POSTGRES_USER`
- `POSTGRES_PASSWORD`
- `DB_HOST`
- `DB_PORT`
- Database URI pattern:
- `postgresql+psycopg2://<user>:<pass>@<host>:<port>/<db>`
- Default timezone in config: `Europe/Amsterdam`.
## Data Model (High-level)
File: `containers/backupchecks/src/backend/app/models.py`
- Auth/users:
- `User` with role(s), active role in session.
- System settings:
- `SystemSettings` with Graph/mail settings, import settings, UI timezone, dashboard policy, sandbox flag.
- Autotask configuration and cache fields are present.
- Cove Data Protection fields: `cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`.
- Microsoft Entra SSO fields: `entra_sso_enabled`, `entra_tenant_id`, `entra_client_id`, `entra_client_secret`, `entra_redirect_uri`, `entra_allowed_domain`, `entra_allowed_group_ids`, `entra_auto_provision_users`.
- Logging:
- `AuditLog` (legacy alias `AdminLog`).
- Domain:
- `Customer`, `Job`, `JobRun`, `Override`
- `MailMessage`, `MailObject`
- `CoveAccount` (Cove staging table — see Cove integration section)
- `CloudConnectAccount` (Cloud Connect staging table — see Cloud Connect integration section)
- `Ticket`, `TicketScope`, `TicketJobRun`
- `Remark`, `RemarkScope`, `RemarkJobRun`
- `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`
### Foreign Key Relationships & Deletion Order
Critical deletion order to avoid constraint violations (used in "Delete all jobs" maintenance route):
1. Unlink staging accounts: `UPDATE cove_accounts SET job_id = NULL`, `UPDATE cloud_connect_accounts SET job_id = NULL`
2. Unlink mails: `UPDATE mail_messages SET job_id = NULL, location = 'inbox'`
3. Delete FK tables referencing `job_runs`: `remark_job_runs`, `ticket_job_runs`, `run_object_links`, `job_run_review_events`
4. Delete FK tables referencing `jobs`: `job_object_links`, `ticket_scopes`, `remark_scopes`, `overrides`
5. `DELETE FROM job_runs`
6. `DELETE FROM jobs`
Note: always use direct SQL (`DELETE FROM`) for bulk deletions — ORM-level deletes load all objects into Python memory and time out on large datasets.
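The ordering above, expressed as a sketch (the statement list follows the steps; executing each statement via SQLAlchemy `text()` is an assumption about the surrounding maintenance route):

```python
# Ordered raw-SQL statements for the "Delete all jobs" flow. Raw SQL keeps
# memory flat; ORM-level deletes would load every row into Python first.
DELETE_ALL_JOBS_ORDER = [
    "UPDATE cove_accounts SET job_id = NULL",
    "UPDATE cloud_connect_accounts SET job_id = NULL",
    "UPDATE mail_messages SET job_id = NULL, location = 'inbox'",
    "DELETE FROM remark_job_runs",
    "DELETE FROM ticket_job_runs",
    "DELETE FROM run_object_links",
    "DELETE FROM job_run_review_events",
    "DELETE FROM job_object_links",
    "DELETE FROM ticket_scopes",
    "DELETE FROM remark_scopes",
    "DELETE FROM overrides",
    "DELETE FROM job_runs",
    "DELETE FROM jobs",
]
# Assumed execution pattern:
#   for stmt in DELETE_ALL_JOBS_ORDER:
#       db.session.execute(sqlalchemy.text(stmt))
#   db.session.commit()
```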
### Key Model Fields
**MailMessage model:**
- `from_address` (NOT `sender`!) - sender email
- `subject` - email subject
- `text_body` - plain text content
- `html_body` - HTML content
- `received_at` - timestamp
- `location` - inbox/processed/deleted
- `job_id` - link to Job (nullable)
**Job model:**
- `customer_id` - FK to Customer
- `job_name` - parsed from email
- `backup_software` - e.g., "Veeam", "Synology", "Cove Data Protection"
- `backup_type` - e.g., "Backup Job", "Active Backup"
- `cove_account_id` - (nullable int) links this job to a Cove AccountId
- Cloud Connect accounts link back via `CloudConnectAccount.job_id` (no FK column on `jobs` — the link is on the staging table side)
**JobRun model:**
- `source_type` - NULL = email (backwards compat), `"cove_api"` for Cove-imported runs
- `external_id` - deduplication key for Cove runs: `"cove-{account_id}-{run_ts}"`
## Parser Architecture
- Folder: `containers/backupchecks/src/backend/app/parsers/`
- Two layers:
- `registry.py`:
- matching/documentation/visibility on `/parsers`.
- examples must stay generic (no customer names).
- parser files (`veeam.py`, `synology.py`, etc.):
- actual detection and parsing logic.
- return structured output: software, type, job name, status, objects.
- Practical rule:
- extend patterns by adding, not replacing (backward compatibility).
### Parser Types
**Informational Parsers:**
- DSM Updates, Account Protection, Firmware Updates
- Set appropriate backup_type (e.g., "Updates", "Firmware Update")
- Do NOT participate in schedule learning
- Usually still visible in Run Checks for awareness
- Exception: non-backup 3CX informational types (`Update`, `SSL Certificate`) are hidden from Run Checks
**Regular Parsers:**
- Backup jobs (Veeam, Synology Active Backup, NAKIVO, etc.)
- Participate in schedule learning (daily/weekly/monthly detection)
- Generate missed runs when expected runs don't occur
**Example: Synology Updates Parser (synology.py)**
- Handles multiple update notification types under same job:
- DSM automatic update cancelled
- Packages out-of-date
- Combined notifications (DSM + packages)
- Detection patterns:
- DSM: "Automatische DSM-update", "DSM-update op", "automatic DSM update"
- Packages: "Packages on", "out-of-date", "Package Center"
- Hostname extraction from multiple patterns
- Returns: backup_type "Updates", job_name "Synology Automatic Update"
## Schedule Inference and Missed Run Detection
### Overview
File: `containers/backupchecks/src/backend/app/main/routes_shared.py`
Missed runs are detected via `_ensure_missed_runs_for_job()` which is called from Run Checks on page load (throttled: max once per 10 minutes per job via in-memory dict). It infers the expected schedule from recent run history and creates `JobRun` records with `missed=True` for any slots that are overdue.
### Weekly Schedule Inference (`_infer_schedule_map_from_runs`)
- **Window**: last **90 days** only (older runs are excluded to handle schedule changes)
- **MIN_OCCURRENCES**: **5** hits on a weekday+time slot to count as expected (raised from 3 to reduce false positives during transitional periods)
- **Cadence guard**: if median gap between runs ≥ 20 days, weekly inference is **skipped** entirely → monthly inference handles the job instead. Prevents monthly jobs from accumulating enough weekly hits after long operation.
- **Key rule**: time-of-day changes or frequency changes stop generating missed runs on old slots within 90 days (no more stale slot false positives)
### Monthly Schedule Inference (`_infer_monthly_schedule_from_runs`)
- **Window**: last **180 days** (enough for ≥ 3 monthly occurrences, but forgotten within 6 months after a schedule change)
- Infers day-of-month + time-of-day from historical runs
- Used when weekly cadence guard fires (median gap ≥ 20 days)
### Important Rules
- **Never** extend the window without considering stale slot false positives
- Schedule changes (time, frequency) take effect in missed run detection within the window period (90d weekly, 180d monthly)
- Informational parsers (`3CX / Update`, `3CX / SSL Certificate`) are excluded from all schedule inference
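A condensed sketch of the cadence guard plus weekly slot counting. This simplifies the real `_infer_schedule_map_from_runs`: treating a slot as (weekday, hour) and ignoring the 90-day window are approximations for illustration.

```python
from collections import Counter
from datetime import datetime
from statistics import median

MIN_OCCURRENCES = 5      # raised from 3 to reduce false positives
CADENCE_GUARD_DAYS = 20  # median gap >= 20 days => treat job as monthly

def infer_weekly_slots(run_times: list[datetime]) -> set[tuple[int, int]]:
    """Return expected (weekday, hour) slots, or an empty set when the
    cadence guard fires so monthly inference handles the job instead."""
    runs = sorted(run_times)
    if len(runs) >= 2:
        gaps = [(b - a).days for a, b in zip(runs, runs[1:])]
        if median(gaps) >= CADENCE_GUARD_DAYS:
            return set()  # cadence guard: skip weekly inference entirely
    counts = Counter((r.weekday(), r.hour) for r in runs)
    return {slot for slot, n in counts.items() if n >= MIN_OCCURRENCES}
```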
---
## Cove Data Protection Integration
### Overview
Cove (N-able) Data Protection is a cloud backup platform. Backupchecks integrates with it via the Cove JSON-RPC API, following the same inbox-style staging flow as email imports.
### Files
- `containers/backupchecks/src/backend/app/cove_importer.py` API client, account processing, JobRun creation
- `containers/backupchecks/src/backend/app/cove_importer_service.py` background polling thread
- `containers/backupchecks/src/backend/app/main/routes_cove.py` `/cove/accounts` routes
- `containers/backupchecks/src/templates/main/cove_accounts.html` inbox-style accounts page
### API Details
- Endpoint: `https://api.backup.management/jsonapi` (JSON-RPC 2.0)
- **Login**: `POST` with `{"jsonrpc":"2.0","id":"jsonrpc","method":"Login","params":{"username":"...","password":"..."}}`
- Returns `visa` at top level (`data["visa"]`), **not** inside `result`
- Returns `PartnerId` inside `result`
- **EnumerateAccountStatistics**: `POST` with visa in payload, `query` (lowercase) with `PartnerId`, `StartRecordNumber`, `RecordsCount`, `Columns`
- Settings format per account: `[{"D09F00": "5"}, {"I1": "device name"}, ...]` — list of single-key dicts, flatten with `dict.update(item)`
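Flattening the per-account settings list can be sketched as:

```python
def flatten_settings(settings: list[dict]) -> dict:
    """Cove returns a list of single-key dicts; merge them into one mapping."""
    flat: dict = {}
    for item in settings:
        flat.update(item)
    return flat
```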
### Column Codes
| Code | Meaning |
|------|---------|
| `I1` | Account/device name |
| `I18` | Computer name |
| `I8` | Customer/partner name |
| `I78` | Active datasource label |
| `D09F00` | Overall last session status code |
| `D09F09` | Last successful session timestamp (Unix) |
| `D09F15` | Last session end timestamp (Unix) |
| `D09F08` | 28-day colorbar string |
| `D1F00/F15` | Files & Folders status/timestamp |
| `D10F00/F15` | VssMsSql |
| `D11F00/F15` | VssSharePoint |
| `D19F00/F15` | M365 Exchange |
| `D20F00/F15` | M365 OneDrive |
| `D5F00/F15` | M365 SharePoint |
| `D23F00/F15` | M365 Teams |
### Status Code Mapping
| Cove code | Meaning | Backupchecks status |
|-----------|---------|---------------------|
| 1 | In process | Warning |
| 2 | Failed | Error |
| 3 | Aborted | Error |
| 5 | Completed | Success |
| 6 | Interrupted | Error |
| 7 | Not started | Warning |
| 8 | Completed with errors | Warning |
| 9 | In progress with faults | Warning |
| 10 | Over quota | Error |
| 11 | No selection | Warning |
| 12 | Restarted | Warning |
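The table can be expressed as a lookup; defaulting unknown codes to Warning is an assumption, not confirmed importer behaviour:

```python
COVE_STATUS_MAP = {
    1: "Warning",   # In process
    2: "Error",     # Failed
    3: "Error",     # Aborted
    5: "Success",   # Completed
    6: "Error",     # Interrupted
    7: "Warning",   # Not started
    8: "Warning",   # Completed with errors
    9: "Warning",   # In progress with faults
    10: "Error",    # Over quota
    11: "Warning",  # No selection
    12: "Warning",  # Restarted
}

def map_cove_status(code: int) -> str:
    # Assumption: unknown/future codes fall back to Warning for visibility.
    return COVE_STATUS_MAP.get(code, "Warning")
```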
### Inbox-Style Flow (mirrors email import)
1. Cove importer fetches all accounts via paginated `EnumerateAccountStatistics` (250/page).
2. Every account is upserted into the `cove_accounts` staging table (always, regardless of job link).
3. Accounts without a `job_id` appear on `/cove/accounts` ("Cove Accounts" page) for admin action.
4. Admin can:
- **Create new job** creates a `Job` with `backup_software="Cove Data Protection"` and links it.
- **Link to existing job** sets `job.cove_account_id` and `cove_acc.job_id`.
5. Linking an account triggers an immediate import attempt; linked accounts then generate `JobRun` records (deduplicated per job via `job_id + external_id`).
6. Per-datasource objects are persisted to `customer_objects`, `job_object_links`, `run_object_links`.
### CoveAccount Model
```python
class CoveAccount(db.Model):
__tablename__ = "cove_accounts"
id # PK
account_id # Cove AccountId (unique)
account_name # I1
computer_name # I18
customer_name # I8
datasource_types # I78
last_status_code # D09F00 (int)
last_run_at # D09F15 (datetime)
colorbar_28d # D09F08
job_id # FK → jobs.id (nullable — None = unmatched)
first_seen_at
last_seen_at
job # relationship → Job
```
### Deduplication
`external_id = f"cove-{account_id}-{run_ts}"` where `run_ts` is Unix timestamp from `D09F15` (fallback to `D09F09` when needed).
Deduplication is enforced per linked job:
- check `JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()`
- this prevents cross-job collisions when accounts are relinked.
Historical (colorbar) runs use `external_id = f"cove-colorbar-{account_id}-{date_str}"` (e.g. `cove-colorbar-4378343-2026-03-15`).
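The two dedup key shapes, sketched as helpers (names illustrative):

```python
from datetime import date

def cove_external_id(account_id: int, run_ts: int) -> str:
    """Key for a real run; run_ts is the Unix timestamp from D09F15 (or D09F09)."""
    return f"cove-{account_id}-{run_ts}"

def colorbar_external_id(account_id: int, day: date) -> str:
    """Key for a reconstructed historical run from the 28-day colorbar."""
    return f"cove-colorbar-{account_id}-{day.isoformat()}"

# Before creating a run, the importer checks per linked job:
#   JobRun.query.filter_by(job_id=job.id, external_id=ext_id).first()
```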
### Historical Backfill (28-day colorbar)
When a new run is created for a job, `_backfill_colorbar_runs()` is called to reconstruct up to 27 additional days of history from the `D09F08` colorbar field.
- Each character in the colorbar = one day's status (oldest first, position 0 = 27 days ago, last position = today)
- Status 0 = no backup that day → skipped
- `run_at` = same time-of-day as the real run, but on the historical date
- Idempotent: `external_id` deduplication prevents duplicates on subsequent imports
- Only creates runs for days where a backup actually ran (non-zero status code)
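The positional decoding can be sketched as follows; treating each character as a one-digit status is an assumption about the D09F08 encoding:

```python
from datetime import datetime, timedelta

def colorbar_days(colorbar: str, run_at: datetime) -> list[tuple[datetime, str]]:
    """Map each colorbar character to a historical datetime + status character.
    The last position is the day of `run_at`; '0' (no backup) days are skipped.
    Each result keeps the same time-of-day as the real run."""
    last = len(colorbar) - 1
    out = []
    for pos, ch in enumerate(colorbar):
        if ch == "0":
            continue  # status 0 = no backup that day
        out.append((run_at - timedelta(days=last - pos), ch))
    return out
```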
### Run Enrichment
- Cove-created `JobRun.remark` contains account/computer/customer and last status/timestamp summary.
- Per-datasource run object records include:
- mapped Backupchecks status
- readable status details in `error_message`
- datasource-level session timestamp in `observed_at`
### Cove Accounts UI Notes
- `/cove/accounts` derives display fields to align with existing job logic:
- `backup_software`: `Cove Data Protection`
- `backup_type`: `Server`, `Workstation`, or `Microsoft 365`
- `job_name`: based on Cove account/computer fallback
- readable datasource labels instead of raw `I78` code stream
- `computer_name` is shown in both unmatched and matched account tables.
### Background Thread
`cove_importer_service.py` — same pattern as `auto_importer_service.py`:
- Thread name: `"cove_importer"`
- Checks `settings.cove_import_enabled`
- Interval: `settings.cove_import_interval_minutes` (default 30)
- Calls `run_cove_import(settings)` which returns `(total, created, skipped, errors)`
### Settings UI
Settings → Integrations → Cove section:
- Enable toggle, API URL, username, password (masked, only overwritten if non-empty)
- Import enabled + interval
- "Test Connection" button (AJAX → `POST /settings/cove/test-connection`) returns `{ok, partner_id, message}`
- "Run import now" button (→ `POST /settings/cove/run-now`) triggers manual import
### Routes
| Route | Method | Description |
|-------|--------|-------------|
| `/cove/accounts` | GET | Inbox-style page: unmatched + matched accounts |
| `/cove/accounts/<id>/link` | POST | `action=create` or `action=link` |
| `/cove/accounts/<id>/unlink` | POST | Removes job link, puts account back in unmatched |
| `/cove/run/<run_id>/detail` | GET | JSON: structured Cove run details for job detail popup |
| `/settings/cove/test-connection` | POST | AJAX: verify credentials, save partner_id |
| `/settings/cove/run-now` | POST | Manual import trigger |
### Job Detail Popup (Cove runs)
Cove run rows in the job detail history table are clickable even without a mail message:
- Row has `data-source-type="cove_api"` and `data-run-id`
- JS detects `source_type === "cove_api"` and fetches `/cove/run/<run_id>/detail` instead of `inbox_message_detail`
- Response includes: `meta` (account name as subject, backup_software, run_at as received_at), `cove_summary` (account, computer, customer, datasources, last run, status), `objects` (per-datasource run_object_links)
- Mail section hidden entirely; Cove summary panel shown instead
### Run Checks Popup (Cove runs)
- `routes_run_checks.py` returns `cove_summary` in the run payload for `source_type="cove_api"` runs
- Includes: account_name, computer_name, customer_name, readable datasource labels, last_run_at, status
- `run_checks.html` shows the Cove summary panel and hides the mail section
### Migrations
- `migrate_cove_integration()` — adds 8 columns to `system_settings`, `cove_account_id` to `jobs`, `source_type` + `external_id` to `job_runs`, dedup index on `job_runs.external_id`
- `migrate_cove_accounts_table()` — creates `cove_accounts` table with indexes
---
## Veeam Cloud Connect Integration
### Overview
Veeam Cloud Connect sends a daily HTML report email (one email per provider, covering all tenants). The importer parses the HTML table and upserts each tenant row into the `cloud_connect_accounts` staging table. Linked accounts create `JobRun` records; unlinked accounts appear on the Cloud Connect Accounts review page.
### Files
- `app/cloud_connect_importer.py` — HTML parser + upsert logic
- `app/main/routes_cloud_connect.py``/cloud-connect/accounts` page, link/unlink/scan-inbox routes
- `templates/main/cloud_connect_accounts.html` — accounts review page
### Email Structure
- One email covers all tenants (unlike Cove, which sends one email per account)
- Sections: `Backup`, `Replication`, `Agent` (detected from `<p>` tags with `font-size: 18px`)
- Each section has a table with columns: User, #VM / #WS+#Server (Agent), Repo Name, Repository, Total quota, Used space, Free space, Last active, Expiry
### Status Mapping (row background colour)
| Colour | Status |
|--------|--------|
| `#fb9895` / `#ff9999` / `#f4cccc` / `#ffb3b3` | Failed |
| `#ffd96c` / `#fff2cc` / `#ffe599` / `#f9cb9c` | Warning |
| white / no background | Success |
**⚠ TODO — Last active detection logic**: The importer currently trusts Veeam's row colour as the sole status indicator (white = Success). An earlier version downgraded white rows to Warning when "Last active" exceeded 3 days, but this was removed because Veeam itself determines row colour. It is still an open question whether backupchecks should apply its own independent "last active" threshold on top of Veeam's colour — e.g. to catch cases where Veeam shows a white row for a backup that hasn't run in a long time. **Needs review before production use.**
### Staging Table: `cloud_connect_accounts`
Unique key: `(user, section, repo_name)` — one row per tenant × section × repository.
A single user can have multiple repositories (e.g. a standard repo + an immutable repo), each stored as a separate account row and each linkable to a separate Backupchecks job.
### Deduplication
`external_id = f"vcc-{user}-{section}-{repo_slug}-{report_date}"` — one `JobRun` per job per repository per report date. Re-importing the same email updates the run status and refreshes `run_object_links`.
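Key construction sketch; the slug rule (lowercase, runs of non-alphanumerics collapsed to `-`) is an assumption about how `repo_slug` is derived:

```python
import re

def slugify_repo(name: str) -> str:
    # Assumed slug rule: lowercase, non-alphanumeric runs become "-".
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

def vcc_external_id(user: str, section: str, repo_name: str, report_date: str) -> str:
    """One JobRun per job per repository per report date."""
    return f"vcc-{user}-{section}-{slugify_repo(repo_name)}-{report_date}"
```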
### Run Enrichment
- `source_type = "cloud_connect"` on every `JobRun` created by the importer
- `_persist_cc_objects()` upserts the repository as a `customer_object` (type `cloud_connect_repo`) and links it via `run_object_links` — mirrors the Cove datasource pattern and enables per-run reporting
- The `run_at` is set to the mail's `received_at` (not today) so historical re-imports land on the correct date
### Job Detail Popup (job_detail.html)
For CC runs the popup shows a structured Cloud Connect summary panel (User/Section/Repository/Used/Quota/Free/Last active/Status) instead of the raw report email. The report email is still accessible via a collapsible "Source report email" toggle. Only the single per-run repository object is shown (from `run_object_links`), not all tenants from the shared mail.
### Scan Inbox
`POST /cloud-connect/accounts/scan-inbox` — re-processes all stored CC report emails (location ≠ deleted). Safe to run multiple times; deduplication prevents duplicate runs.
### Migrations
- `migrate_cloud_connect_accounts_table()` — creates `cloud_connect_accounts` table with `(user, section)` unique key
- `migrate_cc_accounts_repo_unique_key()` — extends unique key to `(user, section, repo_name)`, makes `repo_name` NOT NULL DEFAULT `''`
---
## Ticketing and Autotask (Critical Rules)
### Two Ticket Types
1. **Internal Tickets** (tickets table)
- Created manually or via Autotask integration
- Stored in `tickets` table with `ticket_code` (e.g., "T20250123.0001")
- Linked to runs via `ticket_job_runs` many-to-many table
- Scoped to jobs via `ticket_scopes` table
- Have `resolved_at` field for resolution tracking
- **Auto-propagation**: Automatically linked to new runs via `link_open_internal_tickets_to_run`
2. **Autotask Tickets** (job_runs columns)
- Created via Run Checks modal → "Create Autotask Ticket"
- Stored directly in JobRun columns: `autotask_ticket_id`, `autotask_ticket_number`, etc.
- When created, also creates matching internal ticket for legacy UI compatibility
- Have `autotask_ticket_deleted_at` field for deletion tracking
- Resolution tracked via matching internal ticket's `resolved_at` field
- **Auto-propagation**: Linked to new runs via two-strategy approach
## Microsoft Entra SSO (Current State)
### Status
- Implemented but marked **Untested in Backupchecks**.
### Routes
- `GET /auth/entra/login` starts Entra auth code flow.
- `GET /auth/entra/callback` exchanges code, maps/provisions local user, logs in session.
- `/auth/logout` Entra-aware logout redirect when user authenticated via Entra.
### Access Controls
- Optional tenant/domain restriction (`entra_allowed_domain`).
- Optional Entra security-group allowlist (`entra_allowed_group_ids`) based on group object IDs.
- Group overage / missing groups claim blocks login intentionally when group gate is enabled.
### Local User Mapping
- Primary mapping by `preferred_username`/UPN/email.
- Optional auto-provision (`entra_auto_provision_users`) creates local Viewer users for unknown identities.
### Documentation
- Built-in docs page: `/documentation/settings/entra-sso`
- Includes configuration steps and explicit untested warning.
## Navbar Notes (Latest)
- To reduce split-screen overflow, nav is compacted by grouping:
  - admin-only links under `Admin` dropdown
  - secondary non-admin links under `More` dropdown
- Primary operational links remain visible (notably `Run Checks`).
- Viewer role now exposes `Customers` and `Jobs` directly in navbar.
### Ticket Propagation to New Runs
When a new JobRun is created (via email import OR missed run generation), `link_open_internal_tickets_to_run` ensures:
**Strategy 1: Internal ticket linking**
- Query finds tickets where: `COALESCE(ts.resolved_at, t.resolved_at) IS NULL`
- Creates `ticket_job_runs` links automatically
- Tickets remain visible until explicitly resolved
- **NO date-based logic** - resolved = immediately hidden from new runs
**Strategy 2: Autotask ticket propagation (independent)**
1. Check if internal ticket code exists → find matching Autotask run → copy ticket info
2. If no match, directly search for most recent Autotask ticket on job where:
- `autotask_ticket_deleted_at IS NULL` (not deleted in PSA)
- Internal ticket `resolved_at IS NULL` (not resolved in PSA)
3. Copy `autotask_ticket_id`, `autotask_ticket_number`, `created_at`, `created_by_user_id` to new run
### Where Ticket Linking is Called
`link_open_internal_tickets_to_run` is invoked in three locations:
1. **Email-based runs**: `routes_inbox.py` and `mail_importer.py` - after creating JobRun from parsed email
2. **Missed runs**: `routes_run_checks.py` in `_ensure_missed_runs_for_job` - after creating missed JobRun records
- Weekly schedule: After creating weekly missed run (with flush to get run.id)
- Monthly schedule: After creating monthly missed run (with flush to get run.id)
- **Critical**: Without this call, missed runs don't get ticket propagation!
### Display Logic - Link-Based System
All pages use **explicit link-based queries** (no date-based logic):
**Job Details Page:**
- **Two sources** for ticket display:
1. Direct links (`ticket_job_runs WHERE job_run_id = X`) → always show (audit trail)
2. Active window (`ticket_scopes WHERE job_id = Y AND resolved_at IS NULL`) → only unresolved
- Result: Old runs keep their ticket references, new runs don't get resolved tickets
**Run Checks Main Page (Indicators 🎫):**
- Query: `ticket_scopes JOIN tickets WHERE job_id = X AND resolved_at IS NULL`
- Only shows indicator if unresolved tickets exist for the job
**Run Checks Popup Modal:**
- API: `/api/job-runs/<run_id>/alerts`
- **Two-source ticket display**:
1. Direct links: `tickets JOIN ticket_job_runs WHERE job_run_id = X`
2. Job-level scope: `tickets JOIN ticket_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date`
- Prevents duplicates by tracking seen ticket IDs
- Shows newly created tickets immediately (via scope) without waiting for resolve action
- **Two-source remark display**:
1. Direct links: `remarks JOIN remark_job_runs WHERE job_run_id = X`
2. Job-level scope: `remarks JOIN remark_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date` (with timezone-safe fallback from `start_date`)
- Prevents duplicates by tracking seen remark IDs
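The seen-ID deduplication across the two sources can be sketched generically:

```python
def merge_two_sources(direct: list[dict], scoped: list[dict]) -> list[dict]:
    """Combine directly linked and scope-derived items, keeping the first
    occurrence per id. Direct links come first, so audit-trail entries
    are never dropped in favour of scope-derived duplicates."""
    seen: set = set()
    merged = []
    for item in direct + scoped:
        if item["id"] not in seen:
            seen.add(item["id"])
            merged.append(item)
    return merged
```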
### Debug Logging for Ticket Linking (Reference)
If you need to debug ticket linking issues, add this to `link_open_internal_tickets_to_run` in `ticketing_utils.py` after the rows query:
```python
try:
from .models import AuditLog
details = []
if rows:
for tid, code, t_resolved, ts_resolved in rows:
details.append(f"ticket_id={tid}, code={code}, t.resolved_at={t_resolved}, ts.resolved_at={ts_resolved}")
else:
details.append("No open tickets found for this job")
audit = AuditLog(
user="system", event_type="ticket_link_debug",
message=f"link_open_internal_tickets_to_run called: run_id={run.id}, job_id={job.id}, found={len(rows)} ticket(s)",
details="\n".join(details)
)
db.session.add(audit)
db.session.commit()
except Exception:
pass
```
Visible on Logging page under `event_type = "ticket_link_debug"`. Remove after debugging.
### Resolved vs Deleted
- **Resolved**: Ticket completed in Autotask (tracked in internal `tickets.resolved_at`)
- Stops propagating to new runs
- Ticket still exists in PSA
- Synced via PSA polling
- **Deleted**: Ticket removed from Autotask (tracked in `job_runs.autotask_ticket_deleted_at`)
- Also stops propagating
- Ticket no longer exists in PSA
- Rare operation
### Critical Rules
- ❌ **NEVER** use date-based resolved logic: `resolved_at >= run_date` OR `active_from_date <= run_date`
- ✅ Only show tickets that are ACTUALLY LINKED via `ticket_job_runs` table
- ✅ Resolved tickets stop linking immediately when resolved
- ✅ Old links preserved for audit trail (visible on old runs)
- ✅ All queries must use explicit JOIN to link tables
- ✅ Consistency: All pages use same "resolved = NULL" logic
- ✅ **CRITICAL**: Preserve description field during Autotask updates - must include "description" in optional_fields list
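A minimal sketch of the link-based rule, with in-memory dicts standing in for the `tickets` and `ticket_job_runs` tables (illustration only, not the production query):

```python
def tickets_for_run(run_id, tickets, ticket_job_runs):
    """Return only tickets explicitly linked to the run via the
    ticket_job_runs link table -- no date arithmetic against run_date."""
    linked = {row["ticket_id"] for row in ticket_job_runs
              if row["job_run_id"] == run_id}
    return [t for t in tickets if t["id"] in linked]
```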
## UI and UX Notes
### Layout v2 (2026-03-20)
- Complete sidebar-first redesign replacing the top navbar layout:
- `layout.css` rewritten with IBM Plex Sans/Mono fonts and CSS custom properties (design tokens)
- Fixed dark sidebar (220 px wide)
- `base.html` updated with Google Fonts preload and sidebar-aware structure
- Sandbox banner: semi-transparent (`rgba(220,53,69,0.45)`) instead of solid red
### Navbar (pre-v0.2.0 reference — replaced by sidebar in v0.2.0)
- Fixed-top positioning
- Collapses on mobile (hamburger menu)
- Dynamic padding adjustment via JavaScript (measures navbar height, adjusts main content padding-top)
- Role-based menu items (Admin sees more than Operator/Viewer)
### Status Badges
- Success: Green
- Warning: Yellow/Orange
- Failed/Error: Red
- Override applied: Blue badge
- Reviewed: Checkmark indicator
### Ticket Copy Functionality
- Copy button (⧉) available on both Run Checks and Job Details pages
- Allows quick copying of ticket numbers to clipboard
- Cross-browser compatible with three-tier fallback mechanism:
1. **Modern Clipboard API**: `navigator.clipboard.writeText()` - works in modern browsers with HTTPS
2. **Legacy execCommand**: `document.execCommand('copy')` - fallback for older browsers and Edge
3. **Prompt fallback**: `window.prompt()` - last resort if clipboard access fails
- Visual feedback: button changes to ✓ checkmark for 800ms after successful copy
- Implementation uses hidden textarea for execCommand method to ensure compatibility
- No user interaction required in modern browsers (direct copy)
### Checkbox Behavior
- All checkboxes on Inbox and Run Checks pages use `autocomplete="off"`
- Browser form-state restoration is the trigger: after a reload, the browser re-applies the previous checked states to checkboxes at the same positions.
- Prevents browser from auto-selecting checkboxes after page reload
- Fixes an issue where deleting items would cause the same number of new items to be selected after the page reload
### Customers to Jobs Navigation (2026-02-16)
- Customers page links each customer name to filtered Jobs view:
- `GET /jobs?customer_id=<customer_id>`
- Jobs route behavior:
- Accepts optional `customer_id` query parameter in `routes_jobs.py`.
- If set: returns jobs for that customer only.
- If not set: keeps default filter that hides jobs linked to inactive customers.
- Jobs UI behavior:
- Shows active filter banner with selected customer name.
- Provides "Clear filter" action back to unfiltered `/jobs`.
- Templates touched:
- `templates/main/customers.html`
- `templates/main/jobs.html`
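The route's filter logic reduces to the following sketch (dicts stand in for Job rows joined to Customer; the actual query lives in `routes_jobs.py`):

```python
def filter_jobs(jobs, customer_id=None):
    """With customer_id: return only that customer's jobs.
    Without it: default view, hiding jobs of inactive customers."""
    if customer_id is not None:
        return [j for j in jobs if j["customer_id"] == customer_id]
    return [j for j in jobs if j["customer_active"]]
```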
### Global Grouped Search (2026-02-16)
- New route:
- `GET /search` in `main/routes_search.py`
- New UI:
- Navbar search form in `templates/layout/base.html`
- Grouped result page in `templates/main/search.html`
- Search behavior:
- Case-insensitive matching (`ILIKE`).
- `*` wildcard is supported and translated to SQL `%`.
- Automatic contains behavior (`*term*`) is applied per term when a wildcard is not explicitly set.
- Multi-term queries use AND across terms and OR across configured columns within each section.
- Per-section pagination is supported via query params: `p_inbox`, `p_customers`, `p_jobs`, `p_daily_jobs`, `p_run_checks`, `p_tickets`, `p_remarks`, `p_overrides`, `p_reports`.
- Pagination keeps search state for all sections while browsing one section.
- "Open <section>" links pass `q` to destination overview pages so page-level filtering matches the search term.
- Grouped sections:
- Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Remarks, Existing overrides, Reports.
- Daily Jobs search result details:
- Meta now includes expected run time, success indicator, and run count for the selected day.
- Link now opens Daily Jobs with modal auto-open using `open_job_id` query parameter (same modal flow as clicking a row in Daily Jobs).
- Access control:
- Search results are role-aware and only show sections/data the active role can access.
- `run_checks` results are restricted to `admin`/`operator`.
- `reports` supports `admin`/`operator`/`viewer`/`reporter`.
- Current performance strategy:
- Per-section limit (`SEARCH_LIMIT_PER_SECTION = 10`), with total count per section.
- No schema migration required for V1.
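The wildcard translation can be sketched as follows (term-level behavior only; the OR across columns and the `ILIKE` matching happen in the SQLAlchemy query):

```python
def build_patterns(query):
    """One ILIKE pattern per whitespace-separated term (ANDed together).
    '*' maps to SQL '%'; terms without an explicit wildcard get
    contains semantics ('%term%')."""
    patterns = []
    for term in query.split():
        if "*" in term:
            patterns.append(term.replace("*", "%"))
        else:
            patterns.append(f"%{term}%")
    return patterns
```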
## Jobs Export / Import
### Schema Versions
- **v1** (`approved_jobs_export_v1`): job fields only, no account links
- **v2** (`approved_jobs_export_v2`): same as v1 plus per-job `cove_account` and `cloud_connect_account` objects
### Export (v2)
Each job entry contains:
```json
{
"cove_account": {"account_id": 1234, "account_name": "...", "computer_name": "..."},
"cloud_connect_account": {"user": "...", "section": "Backup", "repo_name": "..."}
}
```
`null` when not linked. File: `routes_settings.py``export_jobs()`.
### Import (v1 + v2)
- Accepts both schema versions (detected via `export_type` field)
- For v2 files: after creating/updating the job, the importer:
1. Looks up `CoveAccount` by `account_id` (fallback: `account_name` + `computer_name`)
2. Looks up `CloudConnectAccount` by `user` + `section` + `repo_name`
3. Links the account to the job — only if the account is not yet linked to a different job
- File: `routes_settings.py``import_jobs()`
## Inbox Batch Re-parse
### Endpoint
`POST /inbox/reparse-batch` — JSON in/out, login required, admin/operator only.
**Request body:**
```json
{"last_id": <int|null>, "total": <int|null>}
```
- `last_id`: keyset cursor from previous batch (process messages with `id < last_id`)
- `total`: total count from first call (avoids re-counting on every batch)
**Response:**
```json
{"processed": 50, "total": 847, "parsed_ok": 42, "auto_approved": 12, "no_match": 8, "errors": 0, "last_id": 1234, "done": false}
```
### Batch Parameters
- `batch_size`: 50 messages per call
- `time_budget_s`: 8 seconds per call (stops processing mid-batch if exceeded)
- Auto-approve logic is identical to `inbox_reparse_all` (including VSPC multi-company handling)
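Server-side, the keyset batching reduces to this sketch (a list of dicts stands in for the inbox query; the real endpoint also runs the parser and auto-approve logic per message):

```python
import time

def reparse_batch(messages, last_id=None, batch_size=50, time_budget_s=8.0):
    """Process up to batch_size messages with id < last_id, newest first,
    stopping early once the time budget is spent."""
    deadline = time.monotonic() + time_budget_s
    candidates = sorted(
        (m for m in messages if last_id is None or m["id"] < last_id),
        key=lambda m: m["id"], reverse=True)
    processed = []
    for msg in candidates[:batch_size]:
        if time.monotonic() > deadline:
            break
        processed.append(msg["id"])  # real code: parse + auto-approve here
    new_last_id = processed[-1] if processed else last_id
    return {"processed": len(processed),
            "last_id": new_last_id,
            "done": len(processed) == len(candidates)}
```

The returned `last_id` feeds the next call as the keyset cursor, so no offset scans are needed regardless of inbox size.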
### Frontend (inbox.html)
- "Re-parse all" button opens a Bootstrap modal (`data-bs-backdrop="static"` — cannot close while running)
- JS loop: `fetch → process response → setTimeout(100ms) → repeat` until `done: true`
- Progress bar + live counters update after each batch
- Close button appears only when `done: true` or on error
## Feedback Module with Screenshots
- Models: `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`.
- Attachments:
- multiple uploads, type validation, per-file size limits, storage in database (BYTEA).
- Delete strategy:
  - soft delete by default,
  - permanent delete only for admins and only after soft delete.
## Validation Snapshot
- 2026-02-16: Test build + push succeeded via `update-and-build.sh t`.
  - Pushed image: `gitea.oskamp.info/ivooskamp/backupchecks:dev`.
- 2026-02-16: Test build + push succeeded on branch `v20260216-02-global-search`.
  - Pushed image digest: `sha256:6996675b9529426fe2ad58b5f353479623f3ebe24b34552c17ad0421d8a7ee0f`.
- 2026-02-16: Additional test build + push cycles succeeded on `v20260216-02-global-search`.
  - Latest pushed image digest: `sha256:8ec8bfcbb928e282182fa223ce8bf7f92112d20e79f4a8602d015991700df5d7`.
- 2026-02-16: Additional test build + push cycles succeeded after search enhancements.
  - Latest pushed image digest: `sha256:b36b5cdd4bc7c4dadedca0534f1904a6e12b5b97abc4f12bc51e42921976f061`.
## Deployment and Operations
- Stack exposes:
- app on `8080`
- adminer on `8081`
- PostgreSQL persistent volume:
- `/docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data`
- `deploy/backupchecks-stack.yml` also contains example `.env` variables at the bottom.
## Build/Release Flow
File: `build-and-push.sh`
- Bump options:
- `1` patch, `2` minor, `3` major, `t` test.
- Release build:
- update `version.txt`
- commit + tag + push
- docker push of `:<version>`, `:dev`, `:latest`
- Test build:
- only `:dev`
- no commit/tag.
- Services are discovered under `containers/*` with Dockerfile-per-service.
## Technical Observations / Attention Points
- `README.md` is currently empty; quick-start entry context is missing.
- `LICENSE` is currently empty.
- `docs/architecture.md` is currently empty.
- `deploy/backupchecks-stack.yml` contains hardcoded example values (`Changeme`), which is a risk if the file is deployed without proper secrets management.
- The app performs DB initialization + migrations at startup; for larger schema changes this can impact startup time/robustness.
- There is significant parser and ticketing complexity; route changes carry regression risk without targeted testing.
- For Autotask update calls, the `description` field must be explicitly preserved to prevent unintended NULL overwrite.
- Security hygiene remains important:
- no customer names in parser examples/source,
- no hardcoded credentials.
## Quick References
- App entrypoint: `containers/backupchecks/src/backend/app/main.py`
- App factory: `containers/backupchecks/src/backend/app/__init__.py`
- Config: `containers/backupchecks/src/backend/app/config.py`
- Models: `containers/backupchecks/src/backend/app/models.py`
- Parsers: `containers/backupchecks/src/backend/app/parsers/registry.py`
- Ticketing utilities: `containers/backupchecks/src/backend/app/ticketing_utils.py`
- Run Checks routes: `containers/backupchecks/src/backend/app/main/routes_run_checks.py`
- Cove importer: `containers/backupchecks/src/backend/app/cove_importer.py`
- Cove routes: `containers/backupchecks/src/backend/app/main/routes_cove.py`
- Cloud Connect importer: `containers/backupchecks/src/backend/app/cloud_connect_importer.py`
- Cloud Connect routes: `containers/backupchecks/src/backend/app/main/routes_cloud_connect.py`
- Inbox routes: `containers/backupchecks/src/backend/app/main/routes_inbox.py`
- Settings routes: `containers/backupchecks/src/backend/app/main/routes_settings.py`
- Compose stack: `deploy/backupchecks-stack.yml`
- Build script: `build-and-push.sh`
## Recent Changes
### 2026-03-20 (v0.2.1)
- **Missed run false positive fix** (`routes_shared.py`):
- Weekly inference window: last 90 days only (was unbounded). Eliminates stale slot false positives after time-of-day or frequency changes.
- Cadence guard: if median gap between runs ≥ 20 days, skip weekly inference and let monthly inference handle the job. Fixes monthly jobs accumulating enough weekly hits after ~21 months.
- Monthly inference window: last 180 days (was unbounded).
- `MIN_OCCURRENCES` raised from 3 → 5 for weekly inference.
- **Objects sort fix** (`run_checks.html`, `job_detail.html`):
- `objectSeverityRank`: `|| err` on rank-0 check caused Warning items with `error_message` to rank as Critical. Fixed: only `error`/`failed`/`failure` status → rank 0; `|| err` moved to rank 1.
- **Mail iframe height fix** (`run_checks.html`):
- Flex rules were on `#rcm_body_iframe` but the iframe is not a direct flex child of `.rcm-mail-panel`. Fixed by moving `flex: 1 1 auto; min-height: 0` to the wrapper `#rcm_mail_iframe_body` and setting `height: 100%` on the iframe itself.
- **Inbox sidebar badge on all pages** (`__init__.py`):
- Added `inject_inbox_count()` Flask context processor — injects `inbox_count` into every template for authenticated users. Previously only injected in the dashboard route.
- **Jobs export/import schema v2** (`routes_settings.py`):
- Export: includes `cove_account` and `cloud_connect_account` per job.
- Import: accepts v1 and v2; links Cove/CC accounts on import if not yet linked to a different job.
- **Inbox re-parse progress modal** (`routes_inbox.py`, `inbox.html`):
- New `POST /inbox/reparse-batch` endpoint: 50 messages per call, 8 s time budget, keyset pagination, full auto-approve logic (including VSPC multi-company). Returns JSON progress.
- "Re-parse all" button replaced with modal trigger; JS loop calls batch endpoint until `done: true` and updates live progress bar + stats.
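The cadence guard from the missed-run fix above can be sketched as follows (the 20-day threshold comes from the change; `run_dates` are `datetime.date` objects sorted ascending):

```python
import statistics

def weekly_inference_allowed(run_dates):
    """Skip weekly inference when the median gap between successive
    runs is >= 20 days, deferring to monthly inference instead."""
    if len(run_dates) < 2:
        return True
    gaps = [(b - a).days for a, b in zip(run_dates, run_dates[1:])]
    return statistics.median(gaps) < 20
```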
### 2026-03-20 (v0.2.0)
- **Layout v2**: complete sidebar-first redesign (`layout.css`, `base.html`). IBM Plex Sans/Mono fonts, CSS design tokens, fixed 220 px dark sidebar.
- **Veeam Cloud Connect importer**: HTML parser for daily report emails → `cloud_connect_accounts` staging table → `JobRun` records for linked accounts. `CloudConnectAccount` model + migrations. `/cloud-connect/accounts` review page. Sidebar link for admin/operator.
- **Cove historical run backfill**: `_backfill_colorbar_runs()` reconstructs up to 27 days of history from the `D09F08` colorbar on first run creation. Idempotent via `external_id = "cove-colorbar-{account_id}-{date}"`.
- **Cove run details popup**: Cove runs in job detail are clickable; popup fetches `/cove/run/<run_id>/detail` (structured Cove summary, per-datasource objects, mail section hidden).
- **Run Checks user preferences**: per-user sort mode + filter defaults stored in DB; `POST /run-checks/preferences`; User Settings page section.
- **Login captcha toggle**: Settings → General → Security card; `login_captcha_enabled` column with `DEFAULT TRUE`.
- **Cloud Connect unique key**: changed from `(user, section)` to `(user, section, repo_name)` — supports multiple repos per user.
- **Cloud Connect run detail popup**: shows structured CC summary instead of raw email; raw email accessible via toggle.
- **Entra SSO**: implemented (marked untested). Login/callback/logout flow, optional auto-provisioning, tenant/domain + security-group restrictions.
- **Fixes**: login page layout with flash messages; "Delete all jobs" timeout (replaced ORM with direct SQL); archived job auto-matching; Cove link sync between `cove_accounts.job_id``jobs.cove_account_id`; Cove run creation transaction scope.
### 2026-02-23
- **Cove Data Protection full integration**:
- `cove_importer.py` Cove API client (login, paginated enumeration, status mapping, deduplication, per-datasource object persistence)
- `cove_importer_service.py` background polling thread (same pattern as `auto_importer_service.py`)
- `CoveAccount` staging model + `migrate_cove_accounts_table()` migration
- `SystemSettings` 8 new Cove fields, `Job` `cove_account_id`, `JobRun` `source_type` + `external_id`
- `routes_cove.py` inbox-style `/cove/accounts` with link/unlink routes
- `cove_accounts.html` unmatched accounts shown first with Bootstrap modals (create job / link to existing), matched accounts with Unlink
- Settings > Integrations: Cove section with test connection (AJAX) and manual import trigger
- Navbar: "Cove Accounts" link for admin/operator when `cove_enabled`
- **Cove API key findings** (from test script + N-able support):
- Visa is returned at top level of login response, not inside `result`
- Settings per account are a list of single-key dicts `[{"D09F00":"5"}, ...]` — flatten with `flat.update(item)`
- EnumerateAccountStatistics params must use lowercase `query` key and `RecordsCount` (not `RecordCount`)
- Login params must use lowercase `username`/`password`
- D02/D03 are legacy; use D10/D11 or D09 (Total) instead
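The settings-flattening finding translates directly to (example keys are illustrative):

```python
def flatten_settings(settings_list):
    """Cove returns per-account settings as a list of single-key dicts,
    e.g. [{"D09F00": "5"}, {"D10": "OK"}]; merge them into one dict."""
    flat = {}
    for item in settings_list:
        flat.update(item)
    return flat
```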
### 2026-02-19
- **Added 3CX Update parser support**: `threecx.py` now recognizes subject `3CX Notification: Update Successful - <host>` and stores it as informational with:
- `backup_software = 3CX`
- `backup_type = Update`
- `overall_status = Success`
- **3CX informational schedule behavior**:
- `3CX / Update` and `3CX / SSL Certificate` are excluded from schedule inference in `routes_shared.py` (no Expected/Missed generation).
- **Run Checks visibility scope (3CX-only)**:
- Run Checks now hides only non-backup 3CX informational jobs (`Update`, `SSL Certificate`).
- Other backup software/types remain visible and unchanged.
- **Fixed remark visibility mismatch**:
- `/api/job-runs/<run_id>/alerts` now loads remarks from both:
1. `remark_job_runs` (explicit run links),
2. `remark_scopes` (active job-scoped remarks),
- with duplicate prevention by remark ID.
- This resolves cases where the remark indicator appeared but remarks were not shown in Run Checks modal or Job Details modal.
### 2026-02-13
- **Fixed missed runs ticket propagation**: Added `link_open_internal_tickets_to_run` calls in `_ensure_missed_runs_for_job` (routes_run_checks.py) after creating both weekly and monthly missed JobRun records. Previously only email-based runs got ticket linking, causing missed runs to not show internal tickets or Autotask tickets. Required `db.session.flush()` before linking to ensure run.id is available.
- **Fixed checkbox auto-selection**: Added `autocomplete="off"` to all checkboxes on Inbox and Run Checks pages. Prevents browser from automatically re-selecting checkboxes after page reload following delete actions.
### 2026-02-12
- **Fixed Run Checks modal ticket display**: Implemented two-source display logic (ticket_job_runs + ticket_scopes). Previously only showed tickets after they were resolved (when ticket_job_runs entry was created). Now shows tickets immediately upon creation via scope query.
- **Fixed copy button in Edge**: Moved clipboard functions inside IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution).
### 2026-02-10
- **Added screenshot support to Feedback system**: Multiple file upload, inline display, two-stage delete (soft delete for audit trail, permanent delete for cleanup).
- **Completed transition to link-based ticket system**: All pages now use JOIN queries, no date-based logic. Added cross-browser copy ticket functionality with three-tier fallback mechanism to both Run Checks and Job Details pages.
