Compare commits
88 Commits
| SHA1 |
|---|
| e7ccff89ee |
| 10f4754a83 |
| ea7c5e29c5 |
| 9ecbbdd523 |
| a450817b78 |
| 03de2d294c |
| 7905e988c7 |
| abd98f5bc6 |
| 2cd704182c |
| 40ffe3656e |
| 8e847e802b |
| b992d6382a |
| 6bf81bd730 |
| 5274286c04 |
| 47bb4ee4f0 |
| 8fe3f99e40 |
| f68f92e63a |
| 06abd8c7a3 |
| a0abd3d58e |
| 7803b7647c |
| 0c5adf17ab |
| 8deecd4c11 |
| 9b19283c97 |
| c045240001 |
| 3bd8178464 |
| 467f350184 |
| 2f1cc20263 |
| dde2ccbb5d |
| a30d51bed0 |
| 6d086a883f |
| f35ec25163 |
| 6f2f7b593b |
| 38f0f8954e |
| 2ee5db8882 |
| ea244193e0 |
| c6ff104767 |
| 441f5a8e50 |
| 3c629bb664 |
| e0e8ed2b0d |
| 53b028ef78 |
| 1fb99dc6e7 |
| f2c0d0b36a |
| 652da5e117 |
| c8e7491c94 |
| e5da01cfbb |
| b46010dbc2 |
| f90b2bdcf6 |
| fcbf67aeb3 |
| 2beba3bc9d |
| ded71cb50f |
| dc3eb2f73c |
| 8a8f957c9f |
| 8c29f527c6 |
| fcce3b8854 |
| 79933c2ecd |
| d84d2142ec |
| 7476ebcbe3 |
| 189dc4ed37 |
| f4384086f2 |
| dca117ed79 |
| ecdb331c9b |
| 084c91945a |
| d2cdd34541 |
| b5cf91d5f2 |
| 385aeb901c |
| 6468cbbc74 |
| 0e1e7e053d |
| bd72f91598 |
| 2e0baa4e35 |
| 9dee9c300a |
| c5cf07f4e5 |
| 91755c6e85 |
| 6674d40f4b |
| 32e68d7209 |
| 23e59ab459 |
| b2992acc56 |
| 200dd23285 |
| d1023f9e52 |
| 1de1b032e7 |
| 661a5783cf |
| dfe86a6ed1 |
| 35ec337c54 |
| c777728c91 |
| 0510613708 |
| fc99f17db3 |
| 1a506c0713 |
| 85798a07ae |
| 451ce1ab22 |
### .gitignore (vendored, 7 changed lines)
```diff
@@ -1,2 +1,9 @@
 # Claude Code confidential files
 .claude/
+
+# Codex local workspace files
+.codex/
+
+# Python cache artifacts
+__pycache__/
+*.pyc
```
```diff
@@ -1 +1 @@
-main
+v20260223-08-fix-navbar-mobile-overflow
```
### TODO-cove-data-protection.md (new file, 249 lines)
# TODO: Cove Data Protection Integration

**Date:** 2026-02-23
**Status:** Research COMPLETED — Ready for implementation
**Priority:** Medium

---

## 🎯 Goal

Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring via scheduled API polling. The integration runs server-side within the Backupchecks web application.

**Challenge:** Cove does NOT work with email notifications like other backup systems (Veeam, Synology, NAKIVO). We use the JSON-RPC API instead.

---

## ✅ Research Phase — COMPLETED (2026-02-23)

### Confirmed findings

- **API endpoint:** `https://api.backup.management/jsonapi`
- **Protocol:** JSON-RPC 2.0, POST requests, `Content-Type: application/json`
- **Authentication:** Login method returns a `visa` token — include in all subsequent calls
- **PartnerId:** `139124` (MCC Automatisering) — required for all queries; the partner name is NOT needed
- **All required data is available** — earlier blockers (D02/D03 errors) were caused by the use of legacy column codes. Replaced by D10/D11.
- **No MSP-level restriction** — every API user has the same access. Access to all sub-customers via the top-level account.
- **No EnumerateAccounts needed** — `EnumerateAccountStatistics` with the right columns gives us everything we need.

### Official documentation (from N-able support, Andrew Robinson)
- **Getting Started:** https://developer.n-able.com/n-able-cove/docs/getting-started
- **Column Codes:** https://developer.n-able.com/n-able-cove/docs/column-codes
- **Construct a Call:** https://developer.n-able.com/n-able-cove/docs/construct-a-json-rpc-api-call
- **Authorization:** https://developer.n-able.com/n-able-cove/docs/authorization

---

## 📡 API — Confirmed behavior

### Step 1: Login
```http
POST https://api.backup.management/jsonapi
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": "jsonrpc",
  "method": "Login",
  "params": {
    "username": "{{cove_api_username}}",
    "password": "{{cove_api_password}}"
  }
}
```
**Response contains:**
- `visa` — session token (include in all subsequent calls)
- `result.PartnerId` — the partner ID (139124 for MCC Automatisering)
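The Step 1 call above can be scripted directly. The sketch below uses only the Python standard library (the project itself plans a `requests`-based client); the response field paths (`visa` at the top level, `PartnerId` under `result`) follow the notes above and should be verified against a live response.

```python
import json
import urllib.request

COVE_URL = "https://api.backup.management/jsonapi"


def cove_login(username: str, password: str) -> tuple[str, int]:
    """Log in to the Cove JSON-RPC API and return (visa, partner_id)."""
    payload = json.dumps({
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "method": "Login",
        "params": {"username": username, "password": password},
    }).encode("utf-8")
    req = urllib.request.Request(
        COVE_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    if data.get("error"):
        raise RuntimeError(f"Cove login failed: {data['error']}")
    # Field paths per the notes above; assumed, not verified here.
    return data["visa"], data["result"]["PartnerId"]
```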
### Step 2: EnumerateAccountStatistics
```json
{
  "jsonrpc": "2.0",
  "visa": "{{visa}}",
  "id": "jsonrpc",
  "method": "EnumerateAccountStatistics",
  "params": {
    "query": {
      "PartnerId": 139124,
      "StartRecordNumber": 0,
      "RecordsCount": 250,
      "Columns": [
        "I1", "I18", "I8", "I78",
        "D09F00", "D09F09", "D09F15", "D09F08",
        "D1F00", "D1F15",
        "D10F00", "D10F15",
        "D11F00", "D11F15",
        "D19F00", "D19F15",
        "D20F00", "D20F15",
        "D5F00", "D5F15",
        "D23F00", "D23F15"
      ]
    }
  }
}
```
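Paging with `StartRecordNumber`/`RecordsCount` can be sketched as follows (standard library only). The nested `result.result` shape of the response is an assumption to be checked against a live call, and the helper names are illustrative.

```python
import json
import urllib.request

COVE_URL = "https://api.backup.management/jsonapi"


def _rpc(payload: dict) -> dict:
    """POST one JSON-RPC payload and decode the response."""
    req = urllib.request.Request(
        COVE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read().decode("utf-8"))


def enumerate_account_statistics(visa: str, partner_id: int, columns: list[str],
                                 page_size: int = 250) -> list[dict]:
    """Fetch every account row, paging via StartRecordNumber/RecordsCount."""
    rows: list[dict] = []
    start = 0
    while True:
        data = _rpc({
            "jsonrpc": "2.0",
            "visa": visa,
            "id": "jsonrpc",
            "method": "EnumerateAccountStatistics",
            "params": {"query": {
                "PartnerId": partner_id,
                "StartRecordNumber": start,
                "RecordsCount": page_size,
                "Columns": columns,
            }},
        })
        batch = data.get("result", {}).get("result", []) or []  # assumed shape
        rows.extend(batch)
        if len(batch) < page_size:  # a short page means we are done
            break
        start += page_size
    return rows
```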
---

## 📋 Column codes — what they mean

### Device info

| Column | Meaning | Type |
|--------|---------|------|
| `I1` | Device name (internal, unique) | String |
| `I18` | Computer name (readable) — empty for M365 | String |
| `I8` | Customer name | String |
| `I78` | Active datasources, e.g. `D01D02D10` | String |

### Datasource status (repeatable per datasource)

| Suffix | Meaning | Type |
|--------|---------|------|
| `F00` | Status of the last session | Int (see table) |
| `F09` | Time of the last **successful** session | Unix timestamp |
| `F15` | Time of the last session (regardless of status) | Unix timestamp |
| `F08` | Color bar for the last 28 days (28 digits) | String |
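Rows come back keyed by these column codes. A small helper flattens a row's settings list for lookup; it assumes each entry is a one-key `{column: [value]}` mapping, which should be verified against a live response.

```python
def settings_to_dict(settings: list[dict]) -> dict:
    """Flatten a row's settings list (assumed one {column: [value]} per entry)."""
    flat: dict = {}
    for entry in settings or []:
        for column, value in entry.items():
            # Single-element value lists are unwrapped for convenience.
            flat[column] = value[0] if isinstance(value, list) and value else value
    return flat
```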
### Status values (F00)

| Value | Meaning |
|--------|-----------|
| `1` | In process |
| `2` | Failed ❌ |
| `3` | Aborted |
| `5` | Completed ✅ |
| `6` | Interrupted |
| `8` | CompletedWithErrors ⚠️ |
| `9` | InProgressWithFaults |
| `10` | OverQuota |
| `11` | NoSelection (configured but nothing selected) |
| `12` | Restarted |
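A possible mapping from these F00 values onto Backupchecks' success/warning/failed buckets. The bucket choices (e.g. Aborted as failed, OverQuota as warning) are suggestions, not decisions from the research above.

```python
# Suggested, not confirmed: how F00 codes could map to internal statuses.
COVE_STATUS_MAP = {
    1: "running",    # In process
    2: "failed",     # Failed
    3: "failed",     # Aborted
    5: "success",    # Completed
    6: "warning",    # Interrupted
    8: "warning",    # CompletedWithErrors
    9: "running",    # InProgressWithFaults
    10: "warning",   # OverQuota
    11: "warning",   # NoSelection
    12: "running",   # Restarted
}


def map_cove_status(f00) -> str:
    """Translate a raw F00 value (often delivered as a string) to an internal status."""
    try:
        code = int(f00)
    except (TypeError, ValueError):
        return "unknown"
    return COVE_STATUS_MAP.get(code, "unknown")
```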
### Datasources

| Code | Name | Usage |
|-------|------|---------|
| `D09` | Total (all datasources combined) | Always present, best for overall status |
| `D1` | Files & Folders | Servers/workstations |
| `D2` | System State | Servers/workstations |
| `D10` | VssMsSql (SQL Server) | Servers with SQL |
| `D11` | VssSharePoint | Servers with SharePoint |
| `D19` | Microsoft 365 Exchange | M365 tenants |
| `D20` | Microsoft 365 OneDrive | M365 tenants |
| `D5` | Microsoft 365 SharePoint | M365 tenants |
| `D23` | Microsoft 365 Teams | M365 tenants |

**Note:** D02 and D03 are legacy codes — use D10 and D11.
### Recognizing device types via I78

- `I78` contains values such as `D01D02`, `D01D02D10`, `D19D20D05D23`
- Empty `I18` field = Microsoft 365 tenant
- Non-empty `I18` field = server or workstation
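The heuristics above can be captured in a small helper; the return labels are illustrative, not the project's real type names.

```python
def derive_device_type(i18, i78) -> str:
    """Classify a Cove account using the I18/I78 heuristics above."""
    if not (i18 or "").strip():
        return "microsoft365"  # empty computer name means an M365 tenant
    datasources = i78 or ""
    # Non-empty I18 means a machine; SQL/SharePoint datasources suggest a server.
    if "D10" in datasources or "D11" in datasources:
        return "server"
    return "server_or_workstation"
```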
### D09F08 — decoding the color bar

28 characters, each character = 1 day (oldest first):

- `5` = Completed ✅
- `8` = CompletedWithErrors ⚠️
- `2` = Failed ❌
- `1` = In progress
- `0` = No backup
---

## 🏗️ Architecture decision

**Chosen: Option 2 — Parallel Import System**

```
API Poller → Cove API Parser → JobRun (direct, without MailMessage)
```

Rationale:
- Clean separation of email-based and API-based imports
- No misuse of the MailMessage model for data without an email context
- Future-proof for other API-based backup systems

### Database changes needed
- `JobRun.source_type` — new field: `"email"` or `"api"`
- `JobRun.external_id` — Cove `AccountId` as external reference
- `JobRun.mail_message` — must become nullable (or a separate table)
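The schema change can be sketched with an in-memory SQLite table; the table and column names are illustrative, not the real Backupchecks schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE job_run (
        id INTEGER PRIMARY KEY,
        source_type TEXT NOT NULL DEFAULT 'email',  -- 'email' or 'api'
        external_id TEXT,                           -- Cove AccountId for API runs
        mail_message_id INTEGER                     -- nullable: API runs have no mail
    )
""")
# An API-sourced run needs no mail_message_id:
conn.execute(
    "INSERT INTO job_run (source_type, external_id) VALUES ('api', '123456')"
)
row = conn.execute("SELECT source_type, mail_message_id FROM job_run").fetchone()
```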
---

## 🔧 Implementation phases

### Phase 1: Database migration
- [ ] Add a `source_type` field to JobRun (`email` / `api`)
- [ ] Add an `external_id` field to JobRun (for the Cove AccountId)
- [ ] Make the `mail_message` FK nullable for API-based runs
- [ ] Write and test the migration

### Phase 2: Cove API client
- [ ] New file: `app/services/cove_client.py`
- [ ] Login method (obtain the visa token)
- [ ] `enumerate_account_statistics()` method
- [ ] Handle pagination (RecordsCount / StartRecordNumber)
- [ ] Handle token expiry (log in again)
- [ ] Error handling & retry logic

### Phase 3: Data transformation
- [ ] New file: `app/services/cove_importer.py`
- [ ] Convert the settings list into a dict for easy lookup
- [ ] Convert Unix timestamps to datetime
- [ ] Map datasource status to Backupchecks status (success/warning/failed)
- [ ] Determine the device type (server vs M365) via `I18` and `I78`
- [ ] Create JobRun records per device

### Phase 4: Scheduled polling
- [ ] Cron job or scheduled task (every 15-60 minutes?)
- [ ] Duplicate detection based on `external_id` + timestamp
- [ ] Logging & audit trail
- [ ] Respect rate limits
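The Phase 4 duplicate detection can be sketched as a set of seen keys. The field names follow the column notes (`AccountId`, `D09F15`); the helper names are hypothetical.

```python
def make_dedupe_key(account_id, session_ts) -> str:
    """Duplicate-detection key: Cove AccountId plus last-session timestamp."""
    return f"{account_id}:{session_ts}"


def filter_new_sessions(rows: list[dict], known_keys: set[str]) -> list[dict]:
    """Keep only sessions not seen in a previous polling round."""
    fresh = []
    for row in rows:
        key = make_dedupe_key(row["AccountId"], row["D09F15"])
        if key not in known_keys:
            known_keys.add(key)
            fresh.append(row)
    return fresh
```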
### Phase 5: UI changes
- [ ] Job Details: no "Download EML" button for API-based runs
- [ ] Indicate that a job originates from the Cove API (not email)
- [ ] Optionally show the 28-day color bar

### Phase 6: Configuration
- [ ] Store the Cove API credentials in SystemSettings
- [ ] Make the PartnerId configurable
- [ ] Make the polling interval configurable

---
## 🔑 API Credentials

- **API User:** `backupchecks-cove-01`
- **User ID:** `1665555`
- **PartnerId:** `139124`
- **Role:** SuperUser + SecurityOfficer
- **Portal:** https://backup.management/#/api-users

**IMPORTANT:** Store the token in the password manager — it cannot be retrieved again!

---
## ❓ Open questions for implementation

1. How do we store the Cove API credentials securely in Backupchecks? (SystemSettings? Environment variable?)
2. What is the desired polling frequency? (15 min / 30 min / 1 hour?)
3. Do we want to import historical data on the first run, or only new sessions?
4. Do we want to show the 28-day color bar (`D09F08`) in the UI?
5. Do we support multiple Cove accounts (multiple MSPs)?

---
## 🎯 Success Criteria (MVP)

- [ ] Backup status (success/warning/failed) per device visible in Backupchecks
- [ ] Customer name and device name correctly linked
- [ ] Time of the last backup available
- [ ] Visible in Daily Jobs & Run Checks
- [ ] Both servers and Microsoft 365 tenants are supported
- [ ] No duplicates on repeated polling

### Nice to Have
- [ ] 28-day history graph
- [ ] Per-datasource status (SQL, Exchange, etc.)
- [ ] Polling frequency configurable per customer
```diff
@@ -13,6 +13,7 @@ from .main.routes import main_bp
 from .main.routes_documentation import doc_bp
 from .migrations import run_migrations
 from .auto_importer_service import start_auto_importer
+from .cove_importer_service import start_cove_importer


 def _get_today_ui_date() -> str:
@@ -212,4 +213,7 @@ def create_app():
     # Start automatic mail importer background thread
     start_auto_importer(app)

+    # Start Cove Data Protection importer background thread
+    start_cove_importer(app)
+
     return app
```
```
@@ -1,5 +1,11 @@
import base64
import binascii
import hashlib
import os
import random
import secrets
from functools import wraps
from urllib.parse import urlencode

from flask import (
    Blueprint,
@@ -11,9 +17,10 @@ from flask import (
    session,
)
from flask_login import login_user, logout_user, login_required, current_user
import requests

from ..database import db
from ..models import User
from ..models import SystemSettings, User

auth_bp = Blueprint("auth", __name__, url_prefix="/auth")

@@ -31,6 +38,131 @@ def generate_captcha():
    return question, answer


def _entra_effective_config() -> dict:
    """Return effective Entra SSO config from DB settings with env fallback."""
    settings = SystemSettings.query.first()

    enabled = bool(getattr(settings, "entra_sso_enabled", False)) if settings else False
    tenant_id = (getattr(settings, "entra_tenant_id", None) or "").strip() if settings else ""
    client_id = (getattr(settings, "entra_client_id", None) or "").strip() if settings else ""
    client_secret = (getattr(settings, "entra_client_secret", None) or "").strip() if settings else ""
    redirect_uri = (getattr(settings, "entra_redirect_uri", None) or "").strip() if settings else ""
    allowed_domain = (getattr(settings, "entra_allowed_domain", None) or "").strip().lower() if settings else ""
    allowed_group_ids = (getattr(settings, "entra_allowed_group_ids", None) or "").strip() if settings else ""
    auto_provision = bool(getattr(settings, "entra_auto_provision_users", False)) if settings else False

    if not tenant_id:
        tenant_id = (os.environ.get("ENTRA_TENANT_ID", "") or "").strip()
    if not client_id:
        client_id = (os.environ.get("ENTRA_CLIENT_ID", "") or "").strip()
    if not client_secret:
        client_secret = (os.environ.get("ENTRA_CLIENT_SECRET", "") or "").strip()
    if not redirect_uri:
        redirect_uri = (os.environ.get("ENTRA_REDIRECT_URI", "") or "").strip()
    if not allowed_domain:
        allowed_domain = (os.environ.get("ENTRA_ALLOWED_DOMAIN", "") or "").strip().lower()
    if not enabled:
        env_enabled = (os.environ.get("ENTRA_SSO_ENABLED", "") or "").strip().lower()
        enabled = env_enabled in ("1", "true", "yes", "on")
    if not auto_provision:
        env_auto = (os.environ.get("ENTRA_AUTO_PROVISION_USERS", "") or "").strip().lower()
        auto_provision = env_auto in ("1", "true", "yes", "on")
    if not allowed_group_ids:
        allowed_group_ids = (os.environ.get("ENTRA_ALLOWED_GROUP_IDS", "") or "").strip()

    return {
        "enabled": enabled,
        "tenant_id": tenant_id,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
        "allowed_domain": allowed_domain,
        "allowed_group_ids": allowed_group_ids,
        "auto_provision": auto_provision,
    }


def _parse_group_ids(raw: str | None) -> set[str]:
    if not raw:
        return set()
    normalized = raw.replace("\n", ",").replace(";", ",")
    out = set()
    for item in normalized.split(","):
        value = (item or "").strip()
        if value:
            out.add(value.lower())
    return out


def _b64url_decode(data: str) -> bytes:
    pad = "=" * (-len(data) % 4)
    return base64.urlsafe_b64decode((data + pad).encode("ascii"))


def _decode_id_token_payload(id_token: str) -> dict:
    """Decode JWT payload without signature verification (token comes from Entra token endpoint)."""
    if not id_token or "." not in id_token:
        return {}
    parts = id_token.split(".")
    if len(parts) < 2:
        return {}
    try:
        payload_raw = _b64url_decode(parts[1])
        import json
        payload = json.loads(payload_raw.decode("utf-8"))
        if isinstance(payload, dict):
            return payload
    except (binascii.Error, ValueError, UnicodeDecodeError):
        return {}
    return {}


def _resolve_sso_user(claims: dict, auto_provision: bool) -> User | None:
    """Resolve or optionally create a local user from Entra claims."""
    username = (
        (claims.get("preferred_username") or "")
        or (claims.get("upn") or "")
        or (claims.get("email") or "")
    ).strip()
    email = ((claims.get("email") or claims.get("preferred_username") or "") or "").strip() or None

    if not username:
        return None

    user = User.query.filter_by(username=username).first()
    if not user and email:
        user = User.query.filter_by(email=email).first()
    if user:
        return user
    if not auto_provision:
        return None

    new_username = username
    if User.query.filter_by(username=new_username).first():
        base = new_username
        idx = 1
        while User.query.filter_by(username=f"{base}.{idx}").first():
            idx += 1
        new_username = f"{base}.{idx}"

    # Random local password as fallback; SSO users authenticate via Entra.
    random_password = secrets.token_urlsafe(32)
    new_user = User(username=new_username, email=email, role="viewer")
    new_user.set_password(random_password)
    db.session.add(new_user)
    db.session.commit()
    return new_user


def _groups_from_claims(claims: dict) -> set[str]:
    groups = claims.get("groups")
    if isinstance(groups, list):
        return {str(x).strip().lower() for x in groups if str(x).strip()}
    if isinstance(groups, str) and groups.strip():
        return {groups.strip().lower()}
    return set()


def captcha_required(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
@@ -42,10 +174,18 @@ def captcha_required(func):
            # regenerate captcha for re-render
            question, answer = generate_captcha()
            session["captcha_answer"] = answer
            cfg = _entra_effective_config()
            entra_ready = bool(
                cfg.get("enabled")
                and cfg.get("tenant_id")
                and cfg.get("client_id")
                and cfg.get("client_secret")
            )
            return render_template(
                "auth/login.html",
                captcha_question=question,
                username=request.form.get("username", ""),
                entra_sso_enabled=entra_ready,
            )
        return func(*args, **kwargs)

@@ -61,7 +201,18 @@ def login():

        question, answer = generate_captcha()
        session["captcha_answer"] = answer
        return render_template("auth/login.html", captcha_question=question)
        cfg = _entra_effective_config()
        entra_ready = bool(
            cfg.get("enabled")
            and cfg.get("tenant_id")
            and cfg.get("client_id")
            and cfg.get("client_secret")
        )
        return render_template(
            "auth/login.html",
            captcha_question=question,
            entra_sso_enabled=entra_ready,
        )

    # POST
    username = (request.form.get("username") or "").strip()
@@ -72,8 +223,18 @@ def login():
        flash("Invalid username or password.", "danger")
        question, answer = generate_captcha()
        session["captcha_answer"] = answer
        cfg = _entra_effective_config()
        entra_ready = bool(
            cfg.get("enabled")
            and cfg.get("tenant_id")
            and cfg.get("client_id")
            and cfg.get("client_secret")
        )
        return render_template(
            "auth/login.html", captcha_question=question, username=username
            "auth/login.html",
            captcha_question=question,
            username=username,
            entra_sso_enabled=entra_ready,
        )

    login_user(user)
@@ -81,18 +242,180 @@
        session["active_role"] = user.roles[0]
    except Exception:
        session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
    session["auth_provider"] = "local"
    flash("You are now logged in.", "success")
    return redirect(url_for("main.dashboard"))


@auth_bp.route("/entra/login")
def entra_login():
    """Start Microsoft Entra ID authorization code flow."""
    cfg = _entra_effective_config()
    if not cfg.get("enabled"):
        flash("Microsoft Entra SSO is not enabled.", "warning")
        return redirect(url_for("auth.login"))
    if not cfg.get("tenant_id") or not cfg.get("client_id") or not cfg.get("client_secret"):
        flash("Microsoft Entra SSO is not fully configured.", "danger")
        return redirect(url_for("auth.login"))

    redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
    state = secrets.token_urlsafe(24)
    nonce = hashlib.sha256(secrets.token_bytes(32)).hexdigest()
    session["entra_state"] = state
    session["entra_nonce"] = nonce

    params = {
        "client_id": cfg["client_id"],
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "response_mode": "query",
        "scope": "openid profile email",
        "state": state,
        "nonce": nonce,
        "prompt": "select_account",
    }
    auth_url = (
        f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/authorize?"
        f"{urlencode(params)}"
    )
    return redirect(auth_url)


@auth_bp.route("/entra/callback")
def entra_callback():
    """Handle Microsoft Entra ID callback and log in mapped local user."""
    cfg = _entra_effective_config()
    if not cfg.get("enabled"):
        flash("Microsoft Entra SSO is not enabled.", "warning")
        return redirect(url_for("auth.login"))

    error = (request.args.get("error") or "").strip()
    if error:
        desc = (request.args.get("error_description") or "").strip()
        flash(f"Microsoft Entra login failed: {error} {desc}".strip(), "danger")
        return redirect(url_for("auth.login"))

    state = (request.args.get("state") or "").strip()
    expected_state = (session.get("entra_state") or "").strip()
    if not state or not expected_state or state != expected_state:
        flash("Invalid SSO state. Please try again.", "danger")
        return redirect(url_for("auth.login"))

    code = (request.args.get("code") or "").strip()
    if not code:
        flash("No authorization code returned by Microsoft Entra.", "danger")
        return redirect(url_for("auth.login"))

    redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
    token_url = f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/token"
    token_payload = {
        "client_id": cfg["client_id"],
        "client_secret": cfg["client_secret"],
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": redirect_uri,
        "scope": "openid profile email",
    }
    try:
        token_resp = requests.post(token_url, data=token_payload, timeout=30)
        token_resp.raise_for_status()
        token_data = token_resp.json()
    except Exception as exc:
        flash(f"Failed to fetch token from Microsoft Entra: {exc}", "danger")
        return redirect(url_for("auth.login"))

    id_token = token_data.get("id_token")
    claims = _decode_id_token_payload(id_token or "")
    if not claims:
        flash("Could not read Microsoft Entra ID token.", "danger")
        return redirect(url_for("auth.login"))

    expected_nonce = (session.get("entra_nonce") or "").strip()
    token_nonce = (claims.get("nonce") or "").strip()
    if expected_nonce and token_nonce and token_nonce != expected_nonce:
        flash("Invalid SSO nonce. Please try again.", "danger")
        return redirect(url_for("auth.login"))

    allowed_domain = (cfg.get("allowed_domain") or "").strip().lower()
    if allowed_domain:
        token_tid = (claims.get("tid") or "").strip().lower()
        token_domain = ""
        upn = (claims.get("preferred_username") or claims.get("email") or "").strip().lower()
        if "@" in upn:
            token_domain = upn.split("@", 1)[1]
        if allowed_domain not in {token_tid, token_domain}:
            flash("Your Microsoft account is not allowed for this instance.", "danger")
            return redirect(url_for("auth.login"))

    allowed_groups = _parse_group_ids(cfg.get("allowed_group_ids"))
    if allowed_groups:
        claim_names = claims.get("_claim_names") or {}
        groups_overage = isinstance(claim_names, dict) and "groups" in claim_names
        token_groups = _groups_from_claims(claims)

        if groups_overage:
            flash(
                "Group-based access check could not be completed because token group overage is active. "
                "Limit group claims to assigned groups or reduce memberships.",
                "danger",
            )
            return redirect(url_for("auth.login"))

        if not token_groups:
            flash(
                "Group-based access is enabled, but no groups claim was received from Microsoft Entra. "
                "Configure group claims in the Entra app token settings.",
                "danger",
            )
            return redirect(url_for("auth.login"))

        if token_groups.isdisjoint(allowed_groups):
            flash("Your Microsoft account is not in an allowed security group.", "danger")
            return redirect(url_for("auth.login"))

    user = _resolve_sso_user(claims, auto_provision=bool(cfg.get("auto_provision")))
    if not user:
        flash(
            "No local Backupchecks user is mapped to this Microsoft account. "
            "Ask an admin to create or map your account.",
            "danger",
        )
        return redirect(url_for("auth.login"))

    login_user(user)
    try:
        session["active_role"] = user.roles[0]
    except Exception:
        session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
    session["auth_provider"] = "entra"
    session.pop("entra_state", None)
    session.pop("entra_nonce", None)
    flash("You are now logged in with Microsoft Entra.", "success")
    return redirect(url_for("main.dashboard"))


@auth_bp.route("/logout")
@login_required
def logout():
    cfg = _entra_effective_config()
    auth_provider = (session.get("auth_provider") or "").strip()
    logout_user()
    try:
        session.pop("active_role", None)
        session.pop("auth_provider", None)
        session.pop("entra_state", None)
        session.pop("entra_nonce", None)
    except Exception:
        pass

    if auth_provider == "entra" and cfg.get("enabled") and cfg.get("tenant_id"):
        post_logout = url_for("auth.login", _external=True)
        logout_url = (
            f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/logout?"
            f"{urlencode({'post_logout_redirect_uri': post_logout})}"
        )
        return redirect(logout_url)

    flash("You have been logged out.", "info")
    return redirect(url_for("auth.login"))
```
@ -3,6 +3,106 @@ Changelog data structure for Backupchecks
|
||||
"""
|
||||
|
||||
CHANGELOG = [
|
||||
{
|
||||
"version": "v0.1.27",
|
||||
"date": "2026-02-23",
|
||||
"summary": "This release is a major functional update since v0.1.26 (released on February 10, 2026). It introduces full Cove Data Protection integration, broad search and navigation improvements, and multiple workflow/ticketing fixes. It also adds Microsoft Entra SSO foundations (currently marked as untested in Backupchecks), along with extensive documentation updates and UI refinements.",
|
||||
"sections": [
|
||||
{
|
||||
"title": "Added",
|
||||
"type": "feature",
|
||||
"subsections": [
|
||||
{
|
||||
"subtitle": "Cove Data Protection",
|
||||
"changes": [
|
||||
"Full Cove Data Protection integration with API importer and background polling",
|
||||
"Cove Accounts staging/linking page for unmatched and matched account workflow",
|
||||
"Manual import trigger and JobRun source tracking via source_type and external_id",
|
||||
"CoveAccount model and migrations for staging and account linkage",
|
||||
"Per-datasource object persistence for reporting and run inspection"
|
||||
]
|
||||
},
|
||||
{
|
||||
"subtitle": "Search and Navigation",
|
||||
"changes": [
|
||||
"Global grouped search with role-aware results",
|
||||
"Per-section pagination in search",
|
||||
"Remarks included in search results",
|
||||
"Customer-to-Jobs quick filter navigation"
|
||||
]
|
||||
},
|
||||
{
|
||||
"subtitle": "Microsoft Entra SSO (Untested)",
|
||||
"changes": [
|
||||
"Added Microsoft Entra SSO login/callback/logout flow",
|
||||
"Added settings and migrations for tenant/client/secret/redirect configuration",
|
||||
"Added optional auto-provisioning for unknown users as Viewer",
|
||||
"Added optional tenant/domain restriction",
|
||||
"Added security-group gate using allowed Entra group IDs",
|
||||
"Added dedicated Entra SSO documentation page"
|
||||
]
|
||||
},
|
||||
{
|
||||
"subtitle": "Other",
|
||||
"changes": [
|
||||
"Added Cove API test script (cove_api_test.py)",
|
||||
"Added optional Autotask ID import toggle for jobs/customers import"
|
||||
]
|
||||
}
|
||||
]
|
||||
},
|
||||
{
|
||||
"title": "Changed",
"type": "improvement",
"subsections": [
    {
        "subtitle": "Cove Import and Linking",
        "changes": [
            "Immediate import after linking a Cove account",
            "Type derivation refined to Server/Workstation/Microsoft 365",
            "Cove Accounts display improved with clearer derived fields and readable datasource labels",
            "Richer run details in Cove-created runs and datasource object records",
            "Timestamp fallback for run creation from D09F15 to D09F09 when needed"
        ]
    },
    {
        "subtitle": "Navbar Restructuring",
        "changes": [
            "Admin-only links grouped under an Admin dropdown",
            "Secondary non-admin links grouped under a More dropdown",
            "Cove Accounts moved back to main bar and Daily Jobs moved under More",
            "Viewer role now has Customers and Jobs directly visible on navbar",
            "Run Checks remains directly visible for daily operations"
        ]
    },
    {
        "subtitle": "Documentation and UX",
        "changes": [
            "Documentation expanded and corrected across workflow and settings topics",
            "Search UX improved with wildcard/contains filtering and section/pagination state preservation",
            "Parser/Run Checks behavior updated for informational 3CX update handling"
        ]
    }
]
},
{
"title": "Fixed",
"type": "bugfix",
"changes": [
    "Fixed tickets not showing in Run Checks modal",
    "Fixed copy ticket button behavior in Edge via improved clipboard fallback",
    "Fixed resolved tickets incorrectly appearing on new runs using explicit link-based logic",
    "Fixed duplicate tickets in Run Checks popup",
    "Fixed missed-run ticket linking with internal and Autotask tickets",
    "Fixed Cove run deduplication by scoping dedupe per job",
    "Fixed the Cove run 'Import now' submit issue in the settings UI",
    "Fixed checkbox auto-reselect behavior after reload",
    "Fixed search template crash caused by section.items access",
    "Stopped tracking Python cache artifacts in version control"
]
},
{
"version": "v0.1.26",
"date": "2026-02-10",
536  containers/backupchecks/src/backend/app/cove_importer.py  (Normal file)
@@ -0,0 +1,536 @@
"""Cove Data Protection API importer.

Fetches backup job run data from the Cove (N-able) API.

Flow (mirrors the mail Inbox flow):
1. All Cove accounts are upserted into the `cove_accounts` staging table.
2. Accounts without a linked job appear on the Cove Accounts page where
   an admin can create or link a job (same as approving a mail from Inbox).
3. For accounts that have a linked job, a JobRun is created per new session
   (deduplicated via external_id).
"""
from __future__ import annotations

import logging
from datetime import datetime, timezone
from typing import Any

import requests
from sqlalchemy import text

from .database import db

logger = logging.getLogger(__name__)

COVE_DEFAULT_URL = "https://api.backup.management/jsonapi"

# Columns to request from EnumerateAccountStatistics
COVE_COLUMNS = [
    "I1",      # Account/device name
    "I18",     # Computer name
    "I8",      # Customer / partner name
    "I78",     # Active datasource label
    "D09F00",  # Overall last session status
    "D09F09",  # Last successful session timestamp
    "D09F15",  # Last session end timestamp
    "D09F08",  # 28-day colorbar
    # Datasource-specific status (F00) and last session time (F15)
    "D1F00", "D1F15",    # Files & Folders
    "D10F00", "D10F15",  # VssMsSql
    "D11F00", "D11F15",  # VssSharePoint
    "D19F00", "D19F15",  # M365 Exchange
    "D20F00", "D20F15",  # M365 OneDrive
    "D5F00", "D5F15",    # M365 SharePoint
    "D23F00", "D23F15",  # M365 Teams
]

# Mapping from Cove status code to Backupchecks status string
STATUS_MAP: dict[int, str] = {
    1: "Warning",   # In process
    2: "Error",     # Failed
    3: "Error",     # Aborted
    5: "Success",   # Completed
    6: "Error",     # Interrupted
    7: "Warning",   # NotStarted
    8: "Warning",   # CompletedWithErrors
    9: "Warning",   # InProgressWithFaults
    10: "Error",    # OverQuota
    11: "Warning",  # NoSelection
    12: "Warning",  # Restarted
}

# Mapping from Cove status code to readable label
STATUS_LABELS: dict[int, str] = {
    1: "In process",
    2: "Failed",
    3: "Aborted",
    5: "Completed",
    6: "Interrupted",
    7: "Not started",
    8: "Completed with errors",
    9: "In progress with faults",
    10: "Over quota",
    11: "No selection",
    12: "Restarted",
}

# Datasource label mapping (column prefix → human-readable label)
DATASOURCE_LABELS: dict[str, str] = {
    "D1": "Files & Folders",
    "D10": "VssMsSql",
    "D11": "VssSharePoint",
    "D19": "M365 Exchange",
    "D20": "M365 OneDrive",
    "D5": "M365 SharePoint",
    "D23": "M365 Teams",
}


class CoveImportError(Exception):
    """Raised when Cove API interaction fails."""


def _cove_login(url: str, username: str, password: str) -> tuple[str, int]:
    """Login to the Cove API and return (visa, partner_id).

    Raises CoveImportError on failure.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "method": "Login",
        "params": {
            "username": username,
            "password": password,
        },
    }
    try:
        resp = requests.post(
            url,
            json=payload,
            headers={"Content-Type": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
    except requests.RequestException as exc:
        raise CoveImportError(f"Cove login request failed: {exc}") from exc
    except ValueError as exc:
        raise CoveImportError(f"Cove login response is not valid JSON: {exc}") from exc

    if "error" in data and data["error"]:
        error = data["error"]
        msg = (error.get("message") or str(error)) if isinstance(error, dict) else str(error)
        raise CoveImportError(f"Cove login failed: {msg}")

    # Visa is returned at the top level of the response (not inside result)
    visa = data.get("visa") or ""
    if not visa:
        raise CoveImportError("Cove login succeeded but no visa token returned")

    # PartnerId is inside result
    result = data.get("result") or {}
    partner_id = (
        result.get("PartnerId")
        or result.get("PartnerID")
        or result.get("result", {}).get("PartnerId")
        or 0
    )

    return visa, int(partner_id)


def _cove_enumerate(
    url: str,
    visa: str,
    partner_id: int,
    start: int,
    count: int,
) -> list[dict]:
    """Call EnumerateAccountStatistics and return a list of account dicts.

    Returns an empty list when there are no more results.
    """
    payload = {
        "jsonrpc": "2.0",
        "visa": visa,
        "id": "jsonrpc",
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": COVE_COLUMNS,
            }
        },
    }
    try:
        resp = requests.post(
            url,
            json=payload,
            headers={"Content-Type": "application/json"},
            timeout=60,
        )
        resp.raise_for_status()
        data = resp.json()
    except requests.RequestException as exc:
        raise CoveImportError(f"Cove EnumerateAccountStatistics request failed: {exc}") from exc
    except ValueError as exc:
        raise CoveImportError(f"Cove EnumerateAccountStatistics response is not valid JSON: {exc}") from exc

    if "error" in data and data["error"]:
        error = data["error"]
        msg = (error.get("message") or str(error)) if isinstance(error, dict) else str(error)
        raise CoveImportError(f"Cove EnumerateAccountStatistics failed: {msg}")

    result = data.get("result")
    if result is None:
        return []

    # Unwrap possible nested result
    if isinstance(result, dict) and "result" in result:
        result = result["result"]

    # Accounts can be a list directly or wrapped in an "Accounts" key
    if isinstance(result, list):
        return result
    if isinstance(result, dict):
        return result.get("Accounts", []) or []
    return []


def _flatten_settings(account: dict) -> dict:
    """Convert the Settings array in an account dict to a flat key→value dict.

    Cove returns settings as a list of single-key dicts, e.g.:
        [{"D09F00": "5"}, {"I1": "device name"}, ...]
    """
    flat: dict[str, Any] = {}
    settings_list = account.get("Settings") or []
    if isinstance(settings_list, list):
        for item in settings_list:
            if isinstance(item, dict):
                flat.update(item)
    return flat


def _map_status(code: Any) -> str:
    """Map a Cove status code (int) to a Backupchecks status string."""
    if code is None:
        return "Warning"
    try:
        return STATUS_MAP.get(int(code), "Warning")
    except (ValueError, TypeError):
        return "Warning"


def _status_label(code: Any) -> str:
    """Map a Cove status code (int) to a human-readable label."""
    if code is None:
        return "Unknown"
    try:
        return STATUS_LABELS.get(int(code), f"Code {int(code)}")
    except (ValueError, TypeError):
        return "Unknown"


def _ts_to_dt(value: Any) -> datetime | None:
    """Convert a Unix timestamp (int or str) to a naive UTC datetime."""
    if value is None:
        return None
    try:
        ts = int(value)
        if ts <= 0:
            return None
        return datetime.fromtimestamp(ts, tz=timezone.utc).replace(tzinfo=None)
    except (ValueError, TypeError, OSError):
        return None


def _fmt_utc(dt: datetime | None) -> str:
    """Format a naive UTC datetime to readable text for run object messages."""
    if not dt:
        return "unknown"
    return dt.strftime("%Y-%m-%d %H:%M UTC")


def run_cove_import(settings) -> tuple[int, int, int, int]:
    """Fetch Cove account statistics and update the staging table + JobRuns.

    For every account:
    - Upsert into cove_accounts (always)
    - If the account has a linked job → create a JobRun if not already seen

    Args:
        settings: SystemSettings ORM object with cove_* fields.

    Returns:
        Tuple of (total_accounts, created_runs, skipped_runs, error_count).

    Raises:
        CoveImportError if the API login fails.
    """
    url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL
    username = (getattr(settings, "cove_api_username", None) or "").strip()
    password = (getattr(settings, "cove_api_password", None) or "").strip()

    if not username or not password:
        raise CoveImportError("Cove API username or password not configured")

    visa, partner_id = _cove_login(url, username, password)

    # Save partner_id back to settings
    if partner_id and partner_id != getattr(settings, "cove_partner_id", None):
        settings.cove_partner_id = partner_id
        try:
            db.session.commit()
        except Exception:
            db.session.rollback()

    total = 0
    created = 0
    skipped = 0
    errors = 0

    page_size = 250
    start = 0

    while True:
        try:
            accounts = _cove_enumerate(url, visa, partner_id, start, page_size)
        except CoveImportError:
            raise
        except Exception as exc:
            raise CoveImportError(f"Unexpected error fetching accounts at offset {start}: {exc}") from exc

        if not accounts:
            break

        for account in accounts:
            total += 1
            try:
                run_created = _process_account(account)
                if run_created:
                    created += 1
                else:
                    skipped += 1
            except Exception as exc:
                errors += 1
                logger.warning("Cove import: error processing account: %s", exc)
                try:
                    db.session.rollback()
                except Exception:
                    pass

        if len(accounts) < page_size:
            break
        start += page_size

    # Update last import timestamp
    settings.cove_last_import_at = datetime.utcnow()
    try:
        db.session.commit()
    except Exception:
        db.session.rollback()

    return total, created, skipped, errors


def _process_account(account: dict) -> bool:
    """Upsert a Cove account into the staging table and create a JobRun if linked.

    Returns True if a new JobRun was created, False otherwise.
    """
    from .models import CoveAccount, JobRun

    flat = _flatten_settings(account)

    # AccountId is a top-level field
    account_id = account.get("AccountId") or account.get("AccountID")
    if not account_id:
        return False
    try:
        account_id = int(account_id)
    except (ValueError, TypeError):
        return False

    # Extract metadata from flat settings
    account_name = (flat.get("I1") or "").strip() or None
    computer_name = (flat.get("I18") or "").strip() or None
    customer_name = (flat.get("I8") or "").strip() or None
    datasource_types = (flat.get("I78") or "").strip() or None
    # Prefer "last session end" (D09F15); fall back to "last successful session" (D09F09)
    # so accounts without D09F15 can still produce an initial run.
    last_run_ts_raw = flat.get("D09F15")
    last_run_at = _ts_to_dt(last_run_ts_raw)
    if last_run_at is None:
        last_run_ts_raw = flat.get("D09F09")
        last_run_at = _ts_to_dt(last_run_ts_raw)
    colorbar_28d = (flat.get("D09F08") or "").strip() or None
    try:
        last_status_code = int(flat["D09F00"]) if flat.get("D09F00") is not None else None
    except (ValueError, TypeError):
        last_status_code = None

    # Upsert into cove_accounts staging table
    cove_acc = CoveAccount.query.filter_by(account_id=account_id).first()
    if cove_acc is None:
        cove_acc = CoveAccount(
            account_id=account_id,
            first_seen_at=datetime.utcnow(),
        )
        db.session.add(cove_acc)

    cove_acc.account_name = account_name
    cove_acc.computer_name = computer_name
    cove_acc.customer_name = customer_name
    cove_acc.datasource_types = datasource_types
    cove_acc.last_status_code = last_status_code
    cove_acc.last_run_at = last_run_at
    cove_acc.colorbar_28d = colorbar_28d
    cove_acc.last_seen_at = datetime.utcnow()

    db.session.flush()  # ensure cove_acc.id is set

    # If not linked to a job yet, nothing more to do (shows up in Cove Accounts page)
    if not cove_acc.job_id:
        db.session.commit()
        return False

    # Account is linked: create a JobRun if the last session is new
    if not last_run_at:
        db.session.commit()
        return False

    try:
        run_ts = int(last_run_ts_raw or 0)
    except (TypeError, ValueError):
        run_ts = 0

    # Fetch the linked job
    from .models import Job
    job = Job.query.get(cove_acc.job_id)
    if not job:
        db.session.commit()
        return False

    external_id = f"cove-{account_id}-{run_ts}"

    # Deduplicate per job + session, not globally.
    # This avoids blocking a run on a newly linked/relinked job when the same
    # Cove session was previously stored under another job.
    existing = JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()
    if existing:
        db.session.commit()
        return False

    status = _map_status(last_status_code)
    run_remark = (
        f"Cove account: {account_name or account_id} | "
        f"Computer: {computer_name or '-'} | "
        f"Customer: {customer_name or '-'} | "
        f"Last status: {_status_label(last_status_code)} ({last_status_code if last_status_code is not None else '-'}) | "
        f"Last run: {_fmt_utc(last_run_at)}"
    )

    run = JobRun(
        job_id=job.id,
        mail_message_id=None,
        run_at=last_run_at,
        status=status,
        remark=run_remark,
        missed=False,
        override_applied=False,
        source_type="cove_api",
        external_id=external_id,
    )
    db.session.add(run)
    db.session.flush()  # get run.id

    # Persist per-datasource objects
    if job.customer_id:
        _persist_datasource_objects(flat, job.customer_id, job.id, run.id, last_run_at)

    db.session.commit()
    return True


def _persist_datasource_objects(
    flat: dict,
    customer_id: int,
    job_id: int,
    run_id: int,
    observed_at: datetime,
) -> None:
    """Create run_object_links for each active datasource found in the account stats."""
    engine = db.get_engine()

    with engine.begin() as conn:
        for ds_prefix, ds_label in DATASOURCE_LABELS.items():
            status_key = f"{ds_prefix}F00"
            status_code = flat.get(status_key)
            if status_code is None:
                continue

            status = _map_status(status_code)
            ds_last_ts = _ts_to_dt(flat.get(f"{ds_prefix}F15"))
            status_msg = (
                f"Cove datasource status: {_status_label(status_code)} "
                f"({status_code}); last session: {_fmt_utc(ds_last_ts)}"
            )

            # Upsert customer_objects
            customer_object_id = conn.execute(
                text(
                    """
                    INSERT INTO customer_objects (customer_id, object_name, object_type, first_seen_at, last_seen_at)
                    VALUES (:customer_id, :object_name, :object_type, NOW(), NOW())
                    ON CONFLICT (customer_id, object_name)
                    DO UPDATE SET
                        last_seen_at = NOW(),
                        object_type = COALESCE(EXCLUDED.object_type, customer_objects.object_type)
                    RETURNING id
                    """
                ),
                {
                    "customer_id": customer_id,
                    "object_name": ds_label,
                    "object_type": "cove_datasource",
                },
            ).scalar()

            # Upsert job_object_links
            conn.execute(
                text(
                    """
                    INSERT INTO job_object_links (job_id, customer_object_id, first_seen_at, last_seen_at)
                    VALUES (:job_id, :customer_object_id, NOW(), NOW())
                    ON CONFLICT (job_id, customer_object_id)
                    DO UPDATE SET last_seen_at = NOW()
                    """
                ),
                {"job_id": job_id, "customer_object_id": customer_object_id},
            )

            # Upsert run_object_links
            conn.execute(
                text(
                    """
                    INSERT INTO run_object_links (run_id, customer_object_id, status, error_message, observed_at)
                    VALUES (:run_id, :customer_object_id, :status, :error_message, :observed_at)
                    ON CONFLICT (run_id, customer_object_id)
                    DO UPDATE SET
                        status = EXCLUDED.status,
                        error_message = EXCLUDED.error_message,
                        observed_at = EXCLUDED.observed_at
                    """
                ),
                {
                    "run_id": run_id,
                    "customer_object_id": customer_object_id,
                    "status": status,
                    "error_message": status_msg,
                    "observed_at": ds_last_ts or observed_at,
                },
            )
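Taken together, the pure helpers in `cove_importer.py` (`_flatten_settings`, `_map_status`, `_ts_to_dt`) and the `cove-{account_id}-{run_ts}` external-id scheme determine how one EnumerateAccountStatistics row becomes a deduplicated run. A minimal standalone sketch of that transformation; the `account` payload below is an illustrative shape, not real API output, and the function names are local restatements rather than imports from the app:

```python
from datetime import datetime, timezone
from typing import Any

# Standalone copy of the status mapping above, for illustration only.
STATUS_MAP = {1: "Warning", 2: "Error", 3: "Error", 5: "Success", 6: "Error",
              7: "Warning", 8: "Warning", 9: "Warning", 10: "Error",
              11: "Warning", 12: "Warning"}

def flatten_settings(account: dict) -> dict:
    """Merge Cove's list of single-key dicts into one flat dict."""
    flat: dict[str, Any] = {}
    for item in account.get("Settings") or []:
        if isinstance(item, dict):
            flat.update(item)
    return flat

def map_status(code: Any) -> str:
    """Codes arrive as strings; anything unparseable degrades to Warning."""
    try:
        return STATUS_MAP.get(int(code), "Warning")
    except (ValueError, TypeError):
        return "Warning"

def ts_to_dt(value: Any):
    """Unix timestamp (int or str) to naive UTC datetime, or None."""
    try:
        ts = int(value)
    except (ValueError, TypeError):
        return None
    if ts <= 0:
        return None
    return datetime.fromtimestamp(ts, tz=timezone.utc).replace(tzinfo=None)

# Hypothetical row shaped like an EnumerateAccountStatistics result entry.
account = {
    "AccountId": 4711,
    "Settings": [{"I1": "SRV-01"}, {"D09F00": "5"}, {"D09F15": "1739000000"}],
}
flat = flatten_settings(account)
print(map_status(flat["D09F00"]))  # Success
# The per-job dedupe key the importer would store as external_id:
print(f"cove-{account['AccountId']}-{int(flat['D09F15'])}")  # cove-4711-1739000000
```

Because the key embeds both the account and the session timestamp, a repeated poll of the same session is skipped, while the same session relinked under a different job still produces a run (the lookup above is additionally scoped by `job_id`).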
101  containers/backupchecks/src/backend/app/cove_importer_service.py  (Normal file)
@@ -0,0 +1,101 @@
"""Cove Data Protection importer background service.

Runs a background thread that periodically fetches backup job run data
from the Cove API and creates JobRun records in the local database.
"""
from __future__ import annotations

import threading
import time
from datetime import datetime

from .admin_logging import log_admin_event
from .cove_importer import CoveImportError, run_cove_import
from .models import SystemSettings


_COVE_IMPORTER_THREAD_NAME = "cove_importer"


def start_cove_importer(app) -> None:
    """Start the Cove importer background thread.

    The thread checks settings on every loop and only runs imports when
    enabled and the configured interval has elapsed.
    """

    # Avoid starting multiple threads if create_app() is called more than once.
    if any(t.name == _COVE_IMPORTER_THREAD_NAME for t in threading.enumerate()):
        return

    def _worker() -> None:
        last_run_at: datetime | None = None

        while True:
            try:
                with app.app_context():
                    settings = SystemSettings.query.first()
                    if settings is None:
                        time.sleep(10)
                        continue

                    enabled = bool(getattr(settings, "cove_import_enabled", False))
                    try:
                        interval_minutes = int(getattr(settings, "cove_import_interval_minutes", 30) or 30)
                    except (TypeError, ValueError):
                        interval_minutes = 30
                    if interval_minutes < 1:
                        interval_minutes = 1

                    now = datetime.utcnow()
                    due = False
                    if enabled:
                        if last_run_at is None:
                            due = True
                        else:
                            due = (now - last_run_at).total_seconds() >= (interval_minutes * 60)

                    if not due:
                        time.sleep(5)
                        continue

                    try:
                        total, created, skipped, errors = run_cove_import(settings)
                    except CoveImportError as exc:
                        log_admin_event(
                            "cove_import_error",
                            f"Cove import failed: {exc}",
                        )
                        last_run_at = now
                        time.sleep(5)
                        continue
                    except Exception as exc:
                        log_admin_event(
                            "cove_import_error",
                            f"Unexpected error during Cove import: {exc}",
                        )
                        last_run_at = now
                        time.sleep(5)
                        continue

                    log_admin_event(
                        "cove_import",
                        f"Cove import finished. accounts={total}, created={created}, skipped={skipped}, errors={errors}",
                    )
                    last_run_at = now

            except Exception:
                # Never let the thread die.
                try:
                    with app.app_context():
                        log_admin_event(
                            "cove_import_error",
                            "Cove importer thread recovered from an unexpected exception.",
                        )
                except Exception:
                    pass

                time.sleep(5)

    t = threading.Thread(target=_worker, name=_COVE_IMPORTER_THREAD_NAME, daemon=True)
    t.start()
Binary file not shown.
@@ -26,5 +26,7 @@ from . import routes_feedback  # noqa: F401
 from . import routes_api  # noqa: F401
 from . import routes_reporting_api  # noqa: F401
 from . import routes_user_settings  # noqa: F401
+from . import routes_search  # noqa: F401
+from . import routes_cove  # noqa: F401
 
 __all__ = ["main_bp", "roles_required"]
@@ -16,9 +16,11 @@ def api_job_run_alerts(run_id: int):
     tickets = []
     remarks = []
 
-    # Tickets linked to this specific run
-    # Only show tickets that were explicitly linked via ticket_job_runs
+    # Tickets linked to this run:
+    # 1. Explicitly linked via ticket_job_runs (audit trail when resolved)
+    # 2. Linked to the job via ticket_scopes (active on run date)
     try:
+        # First, get tickets explicitly linked to this run via ticket_job_runs
         rows = (
             db.session.execute(
                 text(
@@ -43,7 +45,11 @@ def api_job_run_alerts(run_id: int):
             .all()
         )
 
+        ticket_ids_seen = set()
         for r in rows:
+            ticket_id = int(r.get("id"))
+            ticket_ids_seen.add(ticket_id)
+
             resolved_at = r.get("resolved_at")
             resolved_same_day = False
             if resolved_at and run_date:
@@ -52,7 +58,62 @@ def api_job_run_alerts(run_id: int):
 
             tickets.append(
                 {
-                    "id": int(r.get("id")),
+                    "id": ticket_id,
                     "ticket_code": r.get("ticket_code") or "",
                     "description": r.get("description") or "",
                     "start_date": _format_datetime(r.get("start_date")),
                     "active_from_date": str(r.get("active_from_date")) if r.get("active_from_date") else "",
                     "resolved_at": _format_datetime(r.get("resolved_at")) if r.get("resolved_at") else "",
                     "active": bool(active_now),
                     "resolved_same_day": bool(resolved_same_day),
                 }
             )
 
+        # Second, get tickets linked to the job via ticket_scopes
+        # These are tickets that apply to the whole job (not just a specific run)
+        rows = (
+            db.session.execute(
+                text(
+                    """
+                    SELECT DISTINCT t.id,
+                                    t.ticket_code,
+                                    t.description,
+                                    t.start_date,
+                                    t.resolved_at,
+                                    t.active_from_date
+                    FROM tickets t
+                    JOIN ticket_scopes ts ON ts.ticket_id = t.id
+                    WHERE ts.job_id = :job_id
+                      AND t.active_from_date <= :run_date
+                      AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
+                    ORDER BY t.start_date DESC
+                    """
+                ),
+                {
+                    "job_id": job.id if job else 0,
+                    "run_date": run_date,
+                },
+            )
+            .mappings()
+            .all()
+        )
+
+        for r in rows:
+            ticket_id = int(r.get("id"))
+            # Skip if already added via ticket_job_runs
+            if ticket_id in ticket_ids_seen:
+                continue
+            ticket_ids_seen.add(ticket_id)
+
+            resolved_at = r.get("resolved_at")
+            resolved_same_day = False
+            if resolved_at and run_date:
+                resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
+            active_now = r.get("resolved_at") is None
+
+            tickets.append(
+                {
+                    "id": ticket_id,
+                    "ticket_code": r.get("ticket_code") or "",
+                    "description": r.get("description") or "",
+                    "start_date": _format_datetime(r.get("start_date")),
@@ -65,9 +126,13 @@ def api_job_run_alerts(run_id: int):
     except Exception as exc:
         return jsonify({"status": "error", "message": str(exc) or "Failed to load tickets."}), 500
 
-    # Remarks linked to this specific run
-    # Only show remarks that were explicitly linked via remark_job_runs
+    # Remarks linked to this run:
+    # 1. Explicitly linked via remark_job_runs (audit trail when resolved)
+    # 2. Linked to the job via remark_scopes (active on run date)
     try:
+        remark_ids_seen = set()
+
+        # First, remarks explicitly linked to this run.
         rows = (
             db.session.execute(
                 text(
@@ -88,6 +153,9 @@ def api_job_run_alerts(run_id: int):
         )
 
         for rr in rows:
+            remark_id = int(rr.get("id"))
+            remark_ids_seen.add(remark_id)
+
             body = (rr.get("body") or "").strip()
             if len(body) > 180:
                 body = body[:177] + "..."
@@ -101,7 +169,64 @@ def api_job_run_alerts(run_id: int):
 
             remarks.append(
                 {
-                    "id": int(rr.get("id")),
+                    "id": remark_id,
                     "body": body,
                     "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
                     "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
                     "resolved_at": _format_datetime(rr.get("resolved_at")) if rr.get("resolved_at") else "",
                     "active": bool(active_now),
                     "resolved_same_day": bool(resolved_same_day),
                 }
             )
 
+        # Second, active job-level remarks from scope (not yet explicitly linked to this run).
+        ui_tz = _get_ui_timezone_name()
+        rows = (
+            db.session.execute(
+                text(
+                    """
+                    SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
+                    FROM remarks r
+                    JOIN remark_scopes rs ON rs.remark_id = r.id
+                    WHERE rs.job_id = :job_id
+                      AND COALESCE(
+                            r.active_from_date,
+                            ((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
+                          ) <= :run_date
+                      AND r.resolved_at IS NULL
+                    ORDER BY r.start_date DESC
+                    """
+                ),
+                {
+                    "job_id": job.id if job else 0,
+                    "run_date": run_date,
+                    "ui_tz": ui_tz,
+                },
+            )
+            .mappings()
+            .all()
+        )
+
+        for rr in rows:
+            remark_id = int(rr.get("id"))
+            if remark_id in remark_ids_seen:
+                continue
+            remark_ids_seen.add(remark_id)
+
+            body = (rr.get("body") or "").strip()
+            if len(body) > 180:
+                body = body[:177] + "..."
+
+            resolved_at = rr.get("resolved_at")
+            resolved_same_day = False
+            if resolved_at and run_date:
+                resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
+
+            active_now = resolved_at is None or (not resolved_same_day)
+
+            remarks.append(
+                {
+                    "id": remark_id,
+                    "body": body,
+                    "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
+                    "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
313  containers/backupchecks/src/backend/app/main/routes_cove.py  (Normal file)
@@ -0,0 +1,313 @@
"""Cove Data Protection – account review routes.
|
||||
|
||||
Mirrors the Inbox flow for mail messages:
|
||||
/cove/accounts – list all Cove accounts (unmatched first)
|
||||
/cove/accounts/<id>/link – link an account to an existing or new job
|
||||
/cove/accounts/<id>/unlink – remove the job link
|
||||
"""
|
||||
import re
|
||||
|
||||
from .routes_shared import * # noqa: F401,F403
|
||||
from .routes_shared import _log_admin_event
|
||||
from ..cove_importer import CoveImportError, run_cove_import
|
||||
|
||||
from ..models import CoveAccount, Customer, Job, JobRun, SystemSettings
|
||||
|
||||
|
||||
_COVE_DATASOURCE_LABELS = {
|
||||
"D01": "Files & Folders",
|
||||
"D1": "Files & Folders",
|
||||
"D02": "System State",
|
||||
"D2": "System State",
|
||||
"D10": "VssMsSql",
|
||||
"D11": "VssSharePoint",
|
||||
"D19": "M365 Exchange",
|
||||
"D20": "M365 OneDrive",
|
||||
"D05": "M365 SharePoint",
|
||||
"D5": "M365 SharePoint",
|
||||
"D23": "M365 Teams",
|
||||
}
|
||||
|
||||
_COVE_M365_CODES = {"D19", "D20", "D05", "D5", "D23"}
|
||||
_COVE_SERVER_CODES = {"D10", "D11"}
|
||||
|
||||
|
||||
def _parse_cove_datasource_codes(raw: str | None) -> list[str]:
|
||||
"""Extract datasource codes from Cove I78 strings like 'D01D02D10'."""
|
||||
text = (raw or "").strip().upper()
|
||||
if not text:
|
||||
return []
|
||||
return re.findall(r"D\d{1,2}", text)
|
||||
|
||||
|
||||
def _derive_backup_type_for_account(cove_acc: CoveAccount) -> str:
|
||||
"""Return Backupchecks-style backup type for a Cove account.
|
||||
|
||||
Heuristic:
|
||||
- M365 datasource present -> Microsoft 365
|
||||
- Server-specific datasource -> Server
|
||||
- Otherwise -> Workstation
|
||||
"""
|
||||
codes = set(_parse_cove_datasource_codes(getattr(cove_acc, "datasource_types", None)))
|
||||
if codes.intersection(_COVE_M365_CODES):
|
||||
return "Microsoft 365"
|
||||
if codes.intersection(_COVE_SERVER_CODES):
|
||||
return "Server"
|
||||
return "Workstation"
|
||||
|
||||
|
||||
def _humanize_datasources(raw: str | None) -> str:
|
||||
"""Return readable datasource labels from Cove I78 code string."""
|
||||
labels: list[str] = []
|
||||
for code in _parse_cove_datasource_codes(raw):
|
||||
label = _COVE_DATASOURCE_LABELS.get(code, code)
|
||||
if label not in labels:
|
||||
labels.append(label)
|
||||
return ", ".join(labels)

@main_bp.route("/cove/accounts")
@login_required
@roles_required("admin", "operator")
def cove_accounts():
    settings = SystemSettings.query.first()
    if not settings or not getattr(settings, "cove_enabled", False):
        flash("Cove integration is not enabled.", "warning")
        return redirect(url_for("main.settings", section="integrations"))

    # Unmatched accounts (no job linked) – shown first, like Inbox items
    unmatched = (
        CoveAccount.query
        .filter(CoveAccount.job_id.is_(None))
        .order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
        .all()
    )

    # Matched accounts
    matched = (
        CoveAccount.query
        .filter(CoveAccount.job_id.isnot(None))
        .order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
        .all()
    )

    customers = Customer.query.filter_by(active=True).order_by(Customer.name.asc()).all()
    jobs = Job.query.filter_by(archived=False).order_by(Job.job_name.asc()).all()

    for acc in unmatched + matched:
        acc.derived_backup_software = "Cove Data Protection"
        acc.derived_backup_type = _derive_backup_type_for_account(acc)
        acc.derived_job_name = (acc.account_name or acc.computer_name or f"Cove account {acc.account_id}").strip()
        acc.datasource_display = _humanize_datasources(acc.datasource_types) or "—"

    return render_template(
        "main/cove_accounts.html",
        unmatched=unmatched,
        matched=matched,
        customers=customers,
        jobs=jobs,
        settings=settings,
        STATUS_LABELS={
            1: "In process", 2: "Failed", 3: "Aborted", 5: "Completed",
            6: "Interrupted", 7: "Not started", 8: "Completed with errors",
            9: "In progress with faults", 10: "Over quota",
            11: "No selection", 12: "Restarted",
        },
        STATUS_CLASS={
            1: "warning", 2: "danger", 3: "danger", 5: "success",
            6: "danger", 7: "secondary", 8: "warning", 9: "warning",
            10: "danger", 11: "warning", 12: "warning",
        },
    )


@main_bp.route("/cove/accounts/<int:cove_account_db_id>/link", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_link(cove_account_db_id: int):
    """Link a Cove account to a job (create a new one or select existing)."""
    cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)

    action = (request.form.get("action") or "").strip()  # "create" or "link"

    linked_job_name = ""

    if action == "create":
        # Create a new job from the Cove account data
        customer_id_raw = (request.form.get("customer_id") or "").strip()
        if not customer_id_raw:
            flash("Please select a customer.", "danger")
            return redirect(url_for("main.cove_accounts"))

        try:
            customer_id = int(customer_id_raw)
        except ValueError:
            flash("Invalid customer selection.", "danger")
            return redirect(url_for("main.cove_accounts"))

        customer = Customer.query.get(customer_id)
        if not customer:
            flash("Customer not found.", "danger")
            return redirect(url_for("main.cove_accounts"))

        default_job_name = (cove_acc.account_name or cove_acc.computer_name or f"Cove account {cove_acc.account_id}").strip()
        job_name = (request.form.get("job_name") or default_job_name).strip()
        backup_type = (request.form.get("backup_type") or _derive_backup_type_for_account(cove_acc)).strip()

        job = Job(
            customer_id=customer.id,
            backup_software="Cove Data Protection",
            backup_type=backup_type,
            job_name=job_name,
            cove_account_id=cove_acc.account_id,
            active=True,
            auto_approve=True,
        )
        db.session.add(job)
        db.session.flush()

        cove_acc.job_id = job.id
        db.session.commit()

        _log_admin_event(
            "cove_account_linked",
            f"Created job {job.id} and linked Cove account {cove_acc.account_id} ({cove_acc.account_name})",
            details=f"customer={customer.name}, job_name={job_name}",
        )
        linked_job_name = job_name
        flash(f"Job '{job_name}' created for customer '{customer.name}'.", "success")

    elif action == "link":
        # Link to an existing job
        job_id_raw = (request.form.get("job_id") or "").strip()
        if not job_id_raw:
            flash("Please select a job.", "danger")
            return redirect(url_for("main.cove_accounts"))

        try:
            job_id = int(job_id_raw)
        except ValueError:
            flash("Invalid job selection.", "danger")
            return redirect(url_for("main.cove_accounts"))

        job = Job.query.get(job_id)
        if not job:
            flash("Job not found.", "danger")
            return redirect(url_for("main.cove_accounts"))

        job.cove_account_id = cove_acc.account_id
        cove_acc.job_id = job.id
        db.session.commit()

        _log_admin_event(
            "cove_account_linked",
            f"Linked Cove account {cove_acc.account_id} ({cove_acc.account_name}) to existing job {job.id}",
            details=f"job_name={job.job_name}",
        )
        linked_job_name = job.job_name or ""
        flash(f"Cove account linked to job '{job.job_name}'.", "success")

    else:
        flash("Unknown action.", "warning")
        return redirect(url_for("main.cove_accounts"))

    # Trigger an immediate import so the latest Cove run appears right away
    # after linking (instead of waiting for the next scheduled/manual import).
    settings = SystemSettings.query.first()
    if settings and getattr(settings, "cove_enabled", False):
        linked_job_id = cove_acc.job_id
        before_count = 0
        if linked_job_id:
            before_count = (
                JobRun.query
                .filter_by(job_id=linked_job_id, source_type="cove_api")
                .count()
            )
        try:
            total, created, skipped, errors = run_cove_import(settings)
            after_count = 0
            if linked_job_id:
                after_count = (
                    JobRun.query
                    .filter_by(job_id=linked_job_id, source_type="cove_api")
                    .count()
                )
            linked_created = max(after_count - before_count, 0)

            _log_admin_event(
                "cove_import_after_link",
                (
                    "Triggered immediate Cove import after account link. "
                    f"accounts={total}, created={created}, skipped={skipped}, errors={errors}"
                ),
            )
            if linked_created > 0:
                flash(
                    (
                        f"Immediate import complete for '{linked_job_name}'. "
                        f"New linked runs: {linked_created} (accounts: {total}, skipped: {skipped}, errors: {errors})."
                    ),
                    "success" if errors == 0 else "warning",
                )
            else:
                latest_cove = CoveAccount.query.get(cove_acc.id)
                if latest_cove and latest_cove.last_run_at:
                    reason = (
                        "latest run seems unchanged (already imported) "
                        "or Cove has not published a newer session yet"
                    )
                else:
                    reason = "Cove returned no usable last-session timestamp yet for this account"
                flash(
                    (
                        f"Immediate import complete for '{linked_job_name}', but no new run was found yet. "
                        f"Reason: {reason}. (accounts: {total}, skipped: {skipped}, errors: {errors})"
                    ),
                    "info" if errors == 0 else "warning",
                )
        except CoveImportError as exc:
            _log_admin_event(
                "cove_import_after_link_error",
                f"Immediate Cove import after account link failed: {exc}",
            )
            flash(
                "Account linked, but immediate import failed. "
                "You can run the import again from Cove settings.",
                "warning",
            )
        except Exception as exc:
            _log_admin_event(
                "cove_import_after_link_error",
                f"Unexpected immediate Cove import error after account link: {exc}",
            )
            flash(
                "Account linked, but immediate import encountered an unexpected error. "
                "You can run the import again from Cove settings.",
                "warning",
            )

    return redirect(url_for("main.cove_accounts"))


@main_bp.route("/cove/accounts/<int:cove_account_db_id>/unlink", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_unlink(cove_account_db_id: int):
    """Remove the job link from a Cove account (puts it back in the unmatched list)."""
    cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)

    old_job_id = cove_acc.job_id
    if old_job_id:
        job = Job.query.get(old_job_id)
        if job and job.cove_account_id == cove_acc.account_id:
            job.cove_account_id = None

    cove_acc.job_id = None
    db.session.commit()

    _log_admin_event(
        "cove_account_unlinked",
        f"Unlinked Cove account {cove_acc.account_id} ({cove_acc.account_name}) from job {old_job_id}",
    )
    flash("Cove account unlinked.", "success")
    return redirect(url_for("main.cove_accounts"))

@@ -63,7 +63,27 @@ def _get_or_create_settings_local():
@login_required
@roles_required("admin", "operator", "viewer")
def customers():
    items = Customer.query.order_by(Customer.name.asc()).all()
    q = (request.args.get("q") or "").strip()

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    query = Customer.query
    if q:
        for pat in _patterns(q):
            query = query.filter(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))

    items = query.order_by(Customer.name.asc()).all()

    settings = _get_or_create_settings_local()
    autotask_enabled = bool(getattr(settings, "autotask_enabled", False))
@@ -105,6 +125,7 @@ def customers():
        can_manage=can_manage,
        autotask_enabled=autotask_enabled,
        autotask_configured=autotask_configured,
        q=q,
    )


@@ -484,6 +505,7 @@ def customers_export():
@roles_required("admin", "operator")
def customers_import():
    file = request.files.get("file")
    include_autotask_ids = bool(request.form.get("include_autotask_ids"))
    if not file or not getattr(file, "filename", ""):
        flash("No file selected.", "warning")
        return redirect(url_for("main.customers"))
@@ -520,10 +542,11 @@ def customers_import():
    # Detect Autotask columns (backwards compatible - these are optional)
    autotask_id_idx = None
    autotask_name_idx = None
    if "autotask_company_id" in header:
        autotask_id_idx = header.index("autotask_company_id")
    if "autotask_company_name" in header:
        autotask_name_idx = header.index("autotask_company_name")
    if include_autotask_ids:
        if "autotask_company_id" in header:
            autotask_id_idx = header.index("autotask_company_id")
        if "autotask_company_name" in header:
            autotask_name_idx = header.index("autotask_company_name")

    for r in rows[start_idx:]:
        if not r:
@@ -561,7 +584,7 @@ def customers_import():
            if active_val is not None:
                existing.active = active_val
            # Update Autotask mapping if provided in CSV
            if autotask_company_id is not None:
            if include_autotask_ids and autotask_company_id is not None:
                existing.autotask_company_id = autotask_company_id
                existing.autotask_company_name = autotask_company_name
                existing.autotask_mapping_status = None  # Will be resynced
@@ -579,7 +602,10 @@ def customers_import():

    try:
        db.session.commit()
        flash(f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}.", "success")
        flash(
            f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
            "success",
        )

        # Audit logging
        import json
@@ -588,6 +614,7 @@ def customers_import():
            f"Imported customers from CSV",
            details=json.dumps({
                "format": "CSV",
                "include_autotask_ids": include_autotask_ids,
                "created": created,
                "updated": updated,
                "skipped": skipped
@@ -599,5 +626,3 @@ def customers_import():
        flash("Failed to import customers.", "danger")

    return redirect(url_for("main.customers"))

@@ -9,6 +9,21 @@ MISSED_GRACE_WINDOW = timedelta(hours=1)
@login_required
@roles_required("admin", "operator", "viewer")
def daily_jobs():
    q = (request.args.get("q") or "").strip()

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    # Determine target date (default: today) in Europe/Amsterdam
    date_str = request.args.get("date")
    try:
@@ -74,10 +89,21 @@ def daily_jobs():

    weekday_idx = target_date.weekday()  # 0=Mon..6=Sun

    jobs = (
    jobs_query = (
        Job.query.join(Customer, isouter=True)
        .filter(Job.archived.is_(False))
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )
    if q:
        for pat in _patterns(q):
            jobs_query = jobs_query.filter(
                (func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
            )
    jobs = (
        jobs_query
        .order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
        .all()
    )
@@ -306,7 +332,7 @@ def daily_jobs():
    )

    target_date_str = target_date.strftime("%Y-%m-%d")
    return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str)
    return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str, q=q)


@main_bp.route("/daily-jobs/details")

@@ -89,6 +89,7 @@ DOCUMENTATION_STRUCTURE = {
    {'slug': 'general', 'title': 'General Settings'},
    {'slug': 'mail-configuration', 'title': 'Mail Configuration'},
    {'slug': 'autotask-integration', 'title': 'Autotask Integration'},
    {'slug': 'entra-sso', 'title': 'Microsoft Entra SSO'},
    {'slug': 'reporting-settings', 'title': 'Reporting Settings'},
    {'slug': 'user-management', 'title': 'User Management'},
    {'slug': 'maintenance', 'title': 'Maintenance'},

@@ -1,5 +1,53 @@
from .routes_shared import *  # noqa: F401,F403
from .routes_shared import _format_datetime
from werkzeug.utils import secure_filename
import imghdr


# Allowed image extensions and max file size
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif', 'webp'}
MAX_FILE_SIZE = 5 * 1024 * 1024  # 5 MB


def _validate_image_file(file):
    """Validate uploaded image file.

    Returns (is_valid, error_message, mime_type)
    """
    if not file or not file.filename:
        return False, "No file selected", None

    # Check file size
    file.seek(0, 2)  # Seek to end
    size = file.tell()
    file.seek(0)  # Reset to beginning

    if size > MAX_FILE_SIZE:
        return False, f"File too large (max {MAX_FILE_SIZE // (1024*1024)}MB)", None

    if size == 0:
        return False, "Empty file", None

    # Check extension
    filename = secure_filename(file.filename)
    if '.' not in filename:
        return False, "File must have an extension", None

    ext = filename.rsplit('.', 1)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"Only images allowed ({', '.join(ALLOWED_EXTENSIONS)})", None

    # Verify it's actually an image by reading header
    file_data = file.read()
    file.seek(0)

    image_type = imghdr.what(None, h=file_data)
    if image_type is None:
        return False, "Invalid image file", None

    mime_type = f"image/{image_type}"

    return True, None, mime_type
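One caveat on the header check above: the stdlib `imghdr` module it relies on was deprecated in Python 3.11 and removed in 3.13, so on newer interpreters this import fails. If the app moves to a newer Python, a minimal magic-byte sniff covering the allowed formats could look like this (a sketch, not the project's code; the function name is invented):

```python
def sniff_image_type(data: bytes):
    """Return 'png' | 'jpeg' | 'gif' | 'webp' based on magic bytes, else None."""
    if data.startswith(b"\x89PNG\r\n\x1a\n"):          # PNG signature
        return "png"
    if data.startswith(b"\xff\xd8\xff"):               # JPEG SOI marker
        return "jpeg"
    if data[:6] in (b"GIF87a", b"GIF89a"):             # GIF headers
        return "gif"
    if data[:4] == b"RIFF" and data[8:12] == b"WEBP":  # WebP inside RIFF
        return "webp"
    return None
```

The result could replace the `imghdr.what(None, h=file_data)` call, with `mime_type = f"image/{image_type}"` working unchanged.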


@main_bp.route("/feedback")
@@ -21,7 +69,14 @@ def feedback_page():
    if sort not in ("votes", "newest", "updated"):
        sort = "votes"

    where = ["fi.deleted_at IS NULL"]
    # Admin-only: show deleted items
    show_deleted = False
    if get_active_role() == "admin":
        show_deleted = request.args.get("show_deleted", "0") in ("1", "true", "yes", "on")

    where = []
    if not show_deleted:
        where.append("fi.deleted_at IS NULL")
    params = {"user_id": int(current_user.id)}

    if item_type:
@@ -58,6 +113,8 @@ def feedback_page():
            fi.status,
            fi.created_at,
            fi.updated_at,
            fi.deleted_at,
            fi.deleted_by_user_id,
            u.username AS created_by,
            COALESCE(v.vote_count, 0) AS vote_count,
            EXISTS (
@@ -95,6 +152,8 @@ def feedback_page():
                "created_by": r["created_by"] or "-",
                "vote_count": int(r["vote_count"] or 0),
                "user_voted": bool(r["user_voted"]),
                "is_deleted": bool(r["deleted_at"]),
                "deleted_at": _format_datetime(r["deleted_at"]) if r["deleted_at"] else "",
            }
        )

@@ -105,6 +164,7 @@ def feedback_page():
        status=status,
        q=q,
        sort=sort,
        show_deleted=show_deleted,
    )


@@ -135,6 +195,31 @@ def feedback_new():
        created_by_user_id=int(current_user.id),
    )
    db.session.add(item)
    db.session.flush()  # Get item.id for attachments

    # Handle file uploads (multiple files allowed)
    files = request.files.getlist('screenshots')
    for file in files:
        if file and file.filename:
            is_valid, error_msg, mime_type = _validate_image_file(file)
            if not is_valid:
                db.session.rollback()
                flash(f"Screenshot error: {error_msg}", "danger")
                return redirect(url_for("main.feedback_new"))

            filename = secure_filename(file.filename)
            file_data = file.read()

            attachment = FeedbackAttachment(
                feedback_item_id=item.id,
                feedback_reply_id=None,
                filename=filename,
                file_data=file_data,
                mime_type=mime_type,
                file_size=len(file_data),
            )
            db.session.add(attachment)

    db.session.commit()

    flash("Feedback item created.", "success")
@@ -148,7 +233,8 @@ def feedback_new():
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_detail(item_id: int):
    item = FeedbackItem.query.get_or_404(item_id)
    if item.deleted_at is not None:
    # Allow admins to view deleted items
    if item.deleted_at is not None and get_active_role() != "admin":
        abort(404)

    vote_count = (
@@ -174,6 +260,15 @@ def feedback_detail(item_id: int):
        resolved_by = User.query.get(item.resolved_by_user_id)
        resolved_by_name = resolved_by.username if resolved_by else ""

    # Get attachments for the main item (not linked to a reply)
    item_attachments = (
        FeedbackAttachment.query.filter(
            FeedbackAttachment.feedback_item_id == item.id,
            FeedbackAttachment.feedback_reply_id.is_(None),
        )
        .order_by(FeedbackAttachment.created_at.asc())
        .all()
    )

    replies = (
        FeedbackReply.query.filter(FeedbackReply.feedback_item_id == item.id)
@@ -181,6 +276,25 @@ def feedback_detail(item_id: int):
        .all()
    )

    # Get attachments for each reply
    reply_ids = [r.id for r in replies]
    reply_attachments_list = []
    if reply_ids:
        reply_attachments_list = (
            FeedbackAttachment.query.filter(
                FeedbackAttachment.feedback_reply_id.in_(reply_ids)
            )
            .order_by(FeedbackAttachment.created_at.asc())
            .all()
        )

    # Map reply_id -> list of attachments
    reply_attachments_map = {}
    for att in reply_attachments_list:
        if att.feedback_reply_id not in reply_attachments_map:
            reply_attachments_map[att.feedback_reply_id] = []
        reply_attachments_map[att.feedback_reply_id].append(att)

    reply_user_ids = sorted({int(r.user_id) for r in replies})
    reply_users = (
        User.query.filter(User.id.in_(reply_user_ids)).all() if reply_user_ids else []
@@ -196,6 +310,8 @@ def feedback_detail(item_id: int):
        user_voted=bool(user_voted),
        replies=replies,
        reply_user_map=reply_user_map,
        item_attachments=item_attachments,
        reply_attachments_map=reply_attachments_map,
    )

@main_bp.route("/feedback/<int:item_id>/reply", methods=["POST"])
@@ -222,6 +338,31 @@ def feedback_reply(item_id: int):
        created_at=datetime.utcnow(),
    )
    db.session.add(reply)
    db.session.flush()  # Get reply.id for attachments

    # Handle file uploads (multiple files allowed)
    files = request.files.getlist('screenshots')
    for file in files:
        if file and file.filename:
            is_valid, error_msg, mime_type = _validate_image_file(file)
            if not is_valid:
                db.session.rollback()
                flash(f"Screenshot error: {error_msg}", "danger")
                return redirect(url_for("main.feedback_detail", item_id=item.id))

            filename = secure_filename(file.filename)
            file_data = file.read()

            attachment = FeedbackAttachment(
                feedback_item_id=item.id,
                feedback_reply_id=reply.id,
                filename=filename,
                file_data=file_data,
                mime_type=mime_type,
                file_size=len(file_data),
            )
            db.session.add(attachment)

    db.session.commit()

    flash("Reply added.", "success")
@@ -308,3 +449,60 @@ def feedback_delete(item_id: int):

    flash("Feedback item deleted.", "success")
    return redirect(url_for("main.feedback_page"))


@main_bp.route("/feedback/<int:item_id>/permanent-delete", methods=["POST"])
@login_required
@roles_required("admin")
def feedback_permanent_delete(item_id: int):
    """Permanently delete a feedback item and all its attachments from the database.

    This is a hard delete - the item and all associated data will be removed permanently.
    Only available for items that are already soft-deleted.
    """
    item = FeedbackItem.query.get_or_404(item_id)

    # Only allow permanent delete on already soft-deleted items
    if item.deleted_at is None:
        flash("Item must be deleted first before permanent deletion.", "warning")
        return redirect(url_for("main.feedback_detail", item_id=item.id))

    # Get attachment count for feedback message
    attachment_count = FeedbackAttachment.query.filter_by(feedback_item_id=item.id).count()

    # Hard delete - CASCADE will automatically delete:
    # - feedback_votes
    # - feedback_replies
    # - feedback_attachments (via replies CASCADE)
    # - feedback_attachments (direct, via item CASCADE)
    db.session.delete(item)
    db.session.commit()

    flash(f"Feedback item permanently deleted ({attachment_count} screenshot(s) removed).", "success")
    return redirect(url_for("main.feedback_page", show_deleted="1"))


@main_bp.route("/feedback/attachment/<int:attachment_id>")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_attachment(attachment_id: int):
    """Serve a feedback attachment image."""
    attachment = FeedbackAttachment.query.get_or_404(attachment_id)

    # Check if the feedback item is deleted - allow admins to view
    item = FeedbackItem.query.get(attachment.feedback_item_id)
    if not item:
        abort(404)
    if item.deleted_at is not None and get_active_role() != "admin":
        abort(404)

    # Serve the image
    from flask import send_file
    import io

    return send_file(
        io.BytesIO(attachment.file_data),
        mimetype=attachment.mime_type,
        as_attachment=False,
        download_name=attachment.filename,
    )

@@ -9,12 +9,28 @@ from ..ticketing_utils import link_open_internal_tickets_to_run
import time
import re
import html as _html
from sqlalchemy import cast, String


@main_bp.route("/inbox")
@login_required
@roles_required("admin", "operator", "viewer")
def inbox():
    q = (request.args.get("q") or "").strip()

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    try:
        page = int(request.args.get("page", "1"))
    except ValueError:
@@ -28,6 +44,18 @@ def inbox():
    # Use location column if available; otherwise just return all
    if hasattr(MailMessage, "location"):
        query = query.filter(MailMessage.location == "inbox")
    if q:
        for pat in _patterns(q):
            query = query.filter(
                (func.coalesce(MailMessage.from_address, "").ilike(pat, escape="\\"))
                | (func.coalesce(MailMessage.subject, "").ilike(pat, escape="\\"))
                | (cast(MailMessage.received_at, String).ilike(pat, escape="\\"))
                | (func.coalesce(MailMessage.backup_software, "").ilike(pat, escape="\\"))
                | (func.coalesce(MailMessage.backup_type, "").ilike(pat, escape="\\"))
                | (func.coalesce(MailMessage.job_name, "").ilike(pat, escape="\\"))
                | (func.coalesce(MailMessage.parse_result, "").ilike(pat, escape="\\"))
                | (cast(MailMessage.parsed_at, String).ilike(pat, escape="\\"))
            )

    total_items = query.count()
    total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
@@ -79,6 +107,7 @@ def inbox():
        customers=customer_rows,
        can_bulk_delete=(get_active_role() in ("admin", "operator")),
        is_admin=(get_active_role() == "admin"),
        q=q,
    )

@@ -13,12 +13,56 @@ from .routes_shared import (
@login_required
@roles_required("admin", "operator", "viewer")
def jobs():
    # Join with customers for display
    jobs = (
    selected_customer_id = None
    selected_customer_name = ""
    q = (request.args.get("q") or "").strip()
    customer_id_raw = (request.args.get("customer_id") or "").strip()
    if customer_id_raw:
        try:
            selected_customer_id = int(customer_id_raw)
        except ValueError:
            selected_customer_id = None

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    base_query = (
        Job.query
        .filter(Job.archived.is_(False))
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )

    if selected_customer_id is not None:
        base_query = base_query.filter(Job.customer_id == selected_customer_id)
        selected_customer = Customer.query.filter(Customer.id == selected_customer_id).first()
        if selected_customer is not None:
            selected_customer_name = selected_customer.name or ""
    else:
        # Default listing hides jobs for inactive customers.
        base_query = base_query.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))

    if q:
        for pat in _patterns(q):
            base_query = base_query.filter(
                (func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
            )

    # Join with customers for display
    jobs = (
        base_query
        .add_columns(
            Job.id,
            Job.backup_software,
@@ -54,6 +98,9 @@ def jobs():
        "main/jobs.html",
        jobs=rows,
        can_manage_jobs=can_manage_jobs,
        selected_customer_id=selected_customer_id,
        selected_customer_name=selected_customer_name,
        q=q,
    )


@@ -140,6 +187,35 @@ def unarchive_job(job_id: int):
    return redirect(url_for("main.archived_jobs"))


@main_bp.route("/jobs/<int:job_id>/set-cove-account", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def job_set_cove_account(job_id: int):
    """Save or clear the Cove Account ID for this job."""
    job = Job.query.get_or_404(job_id)
    account_id_raw = (request.form.get("cove_account_id") or "").strip()
    if account_id_raw:
        try:
            job.cove_account_id = int(account_id_raw)
        except (ValueError, TypeError):
            flash("Invalid Cove Account ID – must be a number.", "warning")
            return redirect(url_for("main.job_detail", job_id=job_id))
    else:
        job.cove_account_id = None

    db.session.commit()
    try:
        log_admin_event(
            "job_cove_account_set",
            f"Set Cove Account ID for job {job.id} to {job.cove_account_id!r}",
            details=f"job_name={job.job_name}",
        )
    except Exception:
        pass
    flash("Cove Account ID saved.", "success")
    return redirect(url_for("main.job_detail", job_id=job_id))


@main_bp.route("/jobs/<int:job_id>")
@login_required
@roles_required("admin", "operator", "viewer")
@@ -444,6 +520,11 @@ def job_detail(job_id: int):
    if job.customer_id:
        customer = Customer.query.get(job.customer_id)

    # Load system settings for Cove integration display
    from ..models import SystemSettings as _SystemSettings
    _settings = _SystemSettings.query.first()
    cove_enabled = bool(getattr(_settings, "cove_enabled", False)) if _settings else False

    return render_template(
        "main/job_detail.html",
        job=job,
@@ -460,6 +541,7 @@ def job_detail(job_id: int):
        has_prev=has_prev,
        has_next=has_next,
        can_manage_jobs=can_manage_jobs,
        cove_enabled=cove_enabled,
    )

@@ -11,6 +11,16 @@ _OVERRIDE_DEFAULT_START_AT = datetime(1970, 1, 1)
def overrides():
    can_manage = get_active_role() in ("admin", "operator")
    can_delete = get_active_role() == "admin"
    q = (request.args.get("q") or "").strip()

    def _match_query(text: str, raw_query: str) -> bool:
        hay = (text or "").lower()
        tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
        for tok in tokens:
            needle = tok.lower().replace("*", "")
            if needle and needle not in hay:
                return False
        return True

    overrides_q = Override.query.order_by(Override.level.asc(), Override.start_at.desc()).all()

@@ -92,16 +102,31 @@ def overrides():

    rows = []
    for ov in overrides_q:
        scope_text = _describe_scope(ov)
        start_text = _format_datetime(ov.start_at)
        end_text = _format_datetime(ov.end_at) if ov.end_at else ""
        comment_text = ov.comment or ""
        if q:
            full_text = " | ".join([
                ov.level or "",
                scope_text,
                start_text,
                end_text,
                comment_text,
            ])
            if not _match_query(full_text, q):
                continue

        rows.append(
            {
                "id": ov.id,
                "level": ov.level or "",
                "scope": _describe_scope(ov),
                "start_at": _format_datetime(ov.start_at),
                "end_at": _format_datetime(ov.end_at) if ov.end_at else "",
                "scope": scope_text,
                "start_at": start_text,
                "end_at": end_text,
                "active": bool(ov.active),
                "treat_as_success": bool(ov.treat_as_success),
                "comment": ov.comment or "",
                "comment": comment_text,
                "match_status": ov.match_status or "",
                "match_error_contains": ov.match_error_contains or "",
                "match_error_mode": getattr(ov, "match_error_mode", None) or "",
@@ -122,6 +147,7 @@ def overrides():
        jobs_for_select=jobs_for_select,
        backup_software_options=backup_software_options,
        backup_type_options=backup_type_options,
        q=q,
    )


@@ -398,4 +424,3 @@ def overrides_toggle(override_id: int):

    flash("Override status updated.", "success")
    return redirect(url_for("main.overrides"))
|
||||
|
||||
|
||||
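The in-memory `_match_query` filter added above ANDs the whitespace-separated tokens: every token must occur as a case-insensitive substring of the row's concatenated text, and `*` characters are simply stripped, since plain substring search already matches anywhere. A standalone copy of the helper for illustration:

```python
def match_query(text: str, raw_query: str) -> bool:
    """AND-match: every whitespace token in raw_query must occur as a
    case-insensitive substring of text. '*' wildcards are stripped, because
    substring search already matches at any position."""
    hay = (text or "").lower()
    tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
    for tok in tokens:
        needle = tok.lower().replace("*", "")
        if needle and needle not in hay:
            return False
    return True
```

An empty query matches everything, which keeps the list unfiltered when the search box is blank.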
@@ -1,6 +1,6 @@
from .routes_shared import *  # noqa: F401,F403

-from sqlalchemy import text
+from sqlalchemy import text, cast, String
import json
import csv
import io
@@ -101,12 +101,33 @@ def api_reports_list():
    if err is not None:
        return err

-    rows = (
-        db.session.query(ReportDefinition)
-        .order_by(ReportDefinition.created_at.desc())
-        .limit(200)
-        .all()
-    )
+    q = (request.args.get("q") or "").strip()
+
+    def _patterns(raw: str) -> list[str]:
+        out = []
+        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
+            p = tok.replace("\\", "\\\\")
+            p = p.replace("%", "\\%").replace("_", "\\_")
+            p = p.replace("*", "%")
+            if not p.startswith("%"):
+                p = "%" + p
+            if not p.endswith("%"):
+                p = p + "%"
+            out.append(p)
+        return out
+
+    query = db.session.query(ReportDefinition)
+    if q:
+        for pat in _patterns(q):
+            query = query.filter(
+                (func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
+                | (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
+                | (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
+                | (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
+                | (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
+            )
+
+    rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
    return {
        "items": [
            {

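The `_patterns` helper introduced here converts the free-text query into SQL LIKE patterns: literal backslash, `%` and `_` are escaped (matching the `escape="\\"` the queries pass to `ilike()`), `*` becomes the SQL wildcard `%`, and each token is wrapped in `%...%` so it matches anywhere in a column. A standalone copy of the helper:

```python
def like_patterns(raw: str) -> list[str]:
    """Turn a whitespace-separated search string into SQL LIKE patterns.
    Escapes literal '\\', '%' and '_', maps '*' to the SQL wildcard '%',
    and wraps each token in '%...%' for substring matching."""
    out = []
    for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
        p = tok.replace("\\", "\\\\")
        p = p.replace("%", "\\%").replace("_", "\\_")
        p = p.replace("*", "%")
        if not p.startswith("%"):
            p = "%" + p
        if not p.endswith("%"):
            p = p + "%"
        out.append(p)
    return out
```

Each resulting pattern is applied as a separate `filter()`, so multiple tokens combine as AND while `*` expands within a token.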
@@ -1,6 +1,7 @@
from .routes_shared import *  # noqa: F401,F403
from datetime import date, timedelta
from .routes_reporting_api import build_report_columns_meta, build_report_job_filters_meta
+from sqlalchemy import cast, String

def get_default_report_period():
    """Return default report period (last 7 days)."""
@@ -52,13 +53,33 @@ def _build_report_item(r):
@main_bp.route("/reports")
@login_required
def reports():
    q = (request.args.get("q") or "").strip()

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    # Pre-render items so the page is usable even if JS fails to load/execute.
-    rows = (
-        db.session.query(ReportDefinition)
-        .order_by(ReportDefinition.created_at.desc())
-        .limit(200)
-        .all()
-    )
+    query = db.session.query(ReportDefinition)
+    if q:
+        for pat in _patterns(q):
+            query = query.filter(
+                (func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
+                | (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
+                | (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
+                | (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
+                | (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
+            )
+    rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
    items = [_build_report_item(r) for r in rows]

    period_start, period_end = get_default_report_period()
@@ -70,6 +91,7 @@ def reports():
        job_filters_meta=build_report_job_filters_meta(),
        default_period_start=period_start.isoformat(),
        default_period_end=period_end.isoformat(),
        q=q,
    )

@@ -38,11 +38,19 @@ from ..models import (
    TicketScope,
    User,
)
from ..ticketing_utils import link_open_internal_tickets_to_run


AUTOTASK_TERMINAL_STATUS_IDS = {5}


def _is_hidden_3cx_non_backup(backup_software: str | None, backup_type: str | None) -> bool:
    """Hide non-backup 3CX informational jobs from Run Checks."""
    bs = (backup_software or "").strip().lower()
    bt = (backup_type or "").strip().lower()
    return bs == "3cx" and bt in {"update", "ssl certificate"}


def _ensure_internal_ticket_for_autotask(
    *,
    ticket_number: str,
@@ -725,6 +733,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
                mail_message_id=None,
            )
            db.session.add(miss)
            db.session.flush()  # Ensure miss.id is available for ticket linking
            link_open_internal_tickets_to_run(run=miss, job=job)
            inserted += 1

        d = d + timedelta(days=1)
@@ -806,6 +816,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
                mail_message_id=None,
            )
            db.session.add(miss)
            db.session.flush()  # Ensure miss.id is available for ticket linking
            link_open_internal_tickets_to_run(run=miss, job=job)
            inserted += 1

        # Next month
@@ -825,6 +837,21 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
def run_checks_page():
    """Run Checks page: list jobs that have runs to review (including generated missed runs)."""

    q = (request.args.get("q") or "").strip()

    def _patterns(raw: str) -> list[str]:
        out = []
        for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
            p = tok.replace("\\", "\\\\")
            p = p.replace("%", "\\%").replace("_", "\\_")
            p = p.replace("*", "%")
            if not p.startswith("%"):
                p = "%" + p
            if not p.endswith("%"):
                p = p + "%"
            out.append(p)
        return out

    include_reviewed = False
    if get_active_role() == "admin":
        include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
@@ -850,6 +877,8 @@ def run_checks_page():
    today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()

    for job in jobs:
        if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
            continue
        last_rev = last_reviewed_map.get(int(job.id))
        if last_rev:
            start_date = _to_amsterdam_date(last_rev) or settings_start
@@ -884,6 +913,14 @@ def run_checks_page():
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
    )
    if q:
        for pat in _patterns(q):
            base = base.filter(
                (func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
                | (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
            )

    # Runs to show in the overview: unreviewed (or all if admin toggle enabled)
    run_filter = []
@@ -956,7 +993,7 @@ def run_checks_page():
        Job.id.asc(),
    )

-    rows = q.limit(2000).all()
+    rows = [r for r in q.limit(2000).all() if not _is_hidden_3cx_non_backup(r.backup_software, r.backup_type)]

    # Ensure override flags are up-to-date for the runs shown in this overview.
    # The Run Checks modal computes override status on-the-fly, but the overview
@@ -1131,6 +1168,7 @@ def run_checks_page():
        is_admin=(get_active_role() == "admin"),
        include_reviewed=include_reviewed,
        autotask_enabled=autotask_enabled,
        q=q,
    )


@@ -1151,6 +1189,15 @@ def run_checks_details():
    include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")

    job = Job.query.get_or_404(job_id)
    if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
        job_payload = {
            "id": job.id,
            "customer_name": job.customer.name if job.customer else "",
            "backup_software": job.backup_software or "",
            "backup_type": job.backup_type or "",
            "job_name": job.job_name or "",
        }
        return jsonify({"status": "ok", "job": job_payload, "runs": [], "message": "This 3CX informational type is hidden from Run Checks."})

    q = JobRun.query.filter(JobRun.job_id == job.id)
    if not include_reviewed:
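The `_is_hidden_3cx_non_backup` predicate added in this file is pure string normalization, so its behavior is easy to pin down in isolation. A standalone copy:

```python
def is_hidden_3cx_non_backup(backup_software, backup_type):
    """True for 3CX 'Update' / 'SSL Certificate' notifications, which are
    informational and should not appear as backup runs to review.
    Comparison is whitespace- and case-insensitive."""
    bs = (backup_software or "").strip().lower()
    bt = (backup_type or "").strip().lower()
    return bs == "3cx" and bt in {"update", "ssl certificate"}
```

Normalizing both fields first means the filter holds regardless of how the source mail capitalizes the software or type name.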
containers/backupchecks/src/backend/app/main/routes_search.py (new file, 963 lines)
@@ -0,0 +1,963 @@
from .routes_shared import *  # noqa: F401,F403
from .routes_shared import (
    _apply_overrides_to_run,
    _format_datetime,
    _get_or_create_settings,
    _get_ui_timezone,
    _infer_monthly_schedule_from_runs,
    _infer_schedule_map_from_runs,
)

from sqlalchemy import and_, cast, func, or_, String
import math


SEARCH_LIMIT_PER_SECTION = 10
SEARCH_SECTION_KEYS = [
    "inbox",
    "customers",
    "jobs",
    "daily_jobs",
    "run_checks",
    "tickets",
    "remarks",
    "overrides",
    "reports",
]

def _is_section_allowed(section: str) -> bool:
    role = get_active_role()
    allowed = {
        "inbox": {"admin", "operator", "viewer"},
        "customers": {"admin", "operator", "viewer"},
        "jobs": {"admin", "operator", "viewer"},
        "daily_jobs": {"admin", "operator", "viewer"},
        "run_checks": {"admin", "operator"},
        "tickets": {"admin", "operator", "viewer"},
        "remarks": {"admin", "operator", "viewer"},
        "overrides": {"admin", "operator", "viewer"},
        "reports": {"admin", "operator", "viewer", "reporter"},
    }
    return role in allowed.get(section, set())

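The section gate is a static role-to-section table: only `run_checks` excludes viewers, only `reports` admits the `reporter` role, and unknown sections deny everyone. A sketch with the role passed in explicitly (the real helper reads it from `get_active_role()`):

```python
def is_section_allowed(section: str, role: str) -> bool:
    """Role gate for search sections. run_checks is admin/operator only;
    reports additionally admits 'reporter'; unknown sections deny all."""
    allowed = {
        "inbox": {"admin", "operator", "viewer"},
        "customers": {"admin", "operator", "viewer"},
        "jobs": {"admin", "operator", "viewer"},
        "daily_jobs": {"admin", "operator", "viewer"},
        "run_checks": {"admin", "operator"},
        "tickets": {"admin", "operator", "viewer"},
        "remarks": {"admin", "operator", "viewer"},
        "overrides": {"admin", "operator", "viewer"},
        "reports": {"admin", "operator", "viewer", "reporter"},
    }
    return role in allowed.get(section, set())
```

Denying by default (`set()` for unknown keys) keeps a typo in a section key from accidentally widening access.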
def _build_patterns(raw_query: str) -> list[str]:
    tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
    patterns: list[str] = []
    for token in tokens:
        p = token.replace("\\", "\\\\")
        p = p.replace("%", "\\%").replace("_", "\\_")
        p = p.replace("*", "%")
        if not p.startswith("%"):
            p = f"%{p}"
        if not p.endswith("%"):
            p = f"{p}%"
        patterns.append(p)
    return patterns

def _contains_all_terms(columns: list, patterns: list[str]):
    if not patterns or not columns:
        return None
    term_filters = []
    for pattern in patterns:
        per_term = [col.ilike(pattern, escape="\\") for col in columns]
        term_filters.append(or_(*per_term))
    return and_(*term_filters)

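`_contains_all_terms` builds an AND over search terms of an OR over columns: each pattern must match at least one column, and all patterns must match. A pure-Python analog of that semantics; `sql_like_to_regex` here is a hypothetical helper for illustration only, since the real code delegates matching to the database via `ilike`:

```python
import re

def sql_like_to_regex(pattern: str) -> str:
    """Translate a LIKE pattern (with backslash escapes, as _build_patterns
    produces) into an anchored regex. Illustrative helper, not from the app."""
    out = []
    i = 0
    while i < len(pattern):
        ch = pattern[i]
        if ch == "\\" and i + 1 < len(pattern):
            out.append(re.escape(pattern[i + 1]))  # escaped literal %, _ or \
            i += 2
            continue
        if ch == "%":
            out.append(".*")
        elif ch == "_":
            out.append(".")
        else:
            out.append(re.escape(ch))
        i += 1
    return "^" + "".join(out) + "$"

def contains_all_terms(columns, patterns) -> bool:
    """In-memory analog of _contains_all_terms: every pattern (AND) must
    match at least one column (OR), case-insensitively like ilike()."""
    return all(
        any(re.match(sql_like_to_regex(p), col, re.IGNORECASE) for col in columns)
        for p in patterns
    )
```

Note one difference: the real function returns `None` for empty inputs so callers can skip `filter()` entirely, whereas `all([])` is simply `True`.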
def _parse_page(value: str | None) -> int:
    try:
        page = int((value or "").strip())
    except Exception:
        page = 1
    return page if page > 0 else 1

def _paginate_query(query, page: int, order_by_cols: list):
    total = query.count()
    total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
    current_page = min(max(page, 1), total_pages)
    rows = (
        query.order_by(*order_by_cols)
        .offset((current_page - 1) * SEARCH_LIMIT_PER_SECTION)
        .limit(SEARCH_LIMIT_PER_SECTION)
        .all()
    )
    return total, current_page, total_pages, rows

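`_paginate_query` derives the page count with a ceiling division floored at 1, clamps the requested page into range, and computes the OFFSET from the clamped page, so an out-of-range page request degrades to the last page instead of an empty result. The arithmetic in isolation:

```python
import math

SEARCH_LIMIT_PER_SECTION = 10

def paging(total: int, page: int, per_page: int = SEARCH_LIMIT_PER_SECTION):
    """Mirror _paginate_query's arithmetic: total_pages = ceil(total/per_page)
    with a floor of 1, the requested page clamped into [1, total_pages], and
    the SQL OFFSET derived from the clamped page."""
    total_pages = max(1, math.ceil(total / per_page)) if total else 1
    current_page = min(max(page, 1), total_pages)
    offset = (current_page - 1) * per_page
    return current_page, total_pages, offset
```

Clamping before computing the offset is what guarantees the last page is returned for any page number past the end.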
def _enrich_paging(section: dict, total: int, current_page: int, total_pages: int) -> None:
    section["total"] = int(total or 0)
    section["current_page"] = int(current_page or 1)
    section["total_pages"] = int(total_pages or 1)
    section["has_prev"] = section["current_page"] > 1
    section["has_next"] = section["current_page"] < section["total_pages"]
    section["prev_url"] = ""
    section["next_url"] = ""

def _build_inbox_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "inbox",
        "title": "Inbox",
        "view_all_url": url_for("main.inbox"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("inbox"):
        return section

    query = MailMessage.query
    if hasattr(MailMessage, "location"):
        query = query.filter(MailMessage.location == "inbox")

    match_expr = _contains_all_terms(
        [
            func.coalesce(MailMessage.from_address, ""),
            func.coalesce(MailMessage.subject, ""),
            cast(MailMessage.received_at, String),
            func.coalesce(MailMessage.backup_software, ""),
            func.coalesce(MailMessage.backup_type, ""),
            func.coalesce(MailMessage.job_name, ""),
            func.coalesce(MailMessage.parse_result, ""),
            cast(MailMessage.parsed_at, String),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [MailMessage.received_at.desc().nullslast(), MailMessage.id.desc()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    for msg in rows:
        parsed_flag = bool(getattr(msg, "parsed_at", None) or (msg.parse_result or ""))
        section["items"].append(
            {
                "title": msg.subject or f"Message #{msg.id}",
                "subtitle": f"{msg.from_address or '-'} | {_format_datetime(msg.received_at)}",
                "meta": f"{msg.backup_software or '-'} / {msg.backup_type or '-'} / {msg.job_name or '-'} | Parsed: {'Yes' if parsed_flag else 'No'}",
                "link": url_for("main.inbox"),
            }
        )

    return section

def _build_customers_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "customers",
        "title": "Customers",
        "view_all_url": url_for("main.customers"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("customers"):
        return section

    query = Customer.query
    match_expr = _contains_all_terms([func.coalesce(Customer.name, "")], patterns)
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Customer.name.asc()],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for c in rows:
        try:
            job_count = c.jobs.count()
        except Exception:
            job_count = 0
        section["items"].append(
            {
                "title": c.name or f"Customer #{c.id}",
                "subtitle": f"Jobs: {job_count}",
                "meta": "Active" if c.active else "Inactive",
                "link": url_for("main.jobs", customer_id=c.id),
            }
        )

    return section

def _build_jobs_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "jobs",
        "title": "Jobs",
        "view_all_url": url_for("main.jobs"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("jobs"):
        return section

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Job.job_name.label("job_name"),
            Customer.name.label("customer_name"),
        )
        .select_from(Job)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc(),
            Job.backup_type.asc(),
            Job.job_name.asc(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": "",
                "link": url_for("main.job_detail", job_id=row.job_id),
            }
        )

    return section

def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "daily_jobs",
        "title": "Daily Jobs",
        "view_all_url": url_for("main.daily_jobs"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("daily_jobs"):
        return section

    try:
        tz = _get_ui_timezone()
    except Exception:
        tz = None

    try:
        target_date = datetime.now(tz).date() if tz else datetime.utcnow().date()
    except Exception:
        target_date = datetime.utcnow().date()

    settings = _get_or_create_settings()
    missed_start_date = getattr(settings, "daily_jobs_start_date", None)

    if tz:
        local_midnight = datetime(
            year=target_date.year,
            month=target_date.month,
            day=target_date.day,
            hour=0,
            minute=0,
            second=0,
            tzinfo=tz,
        )
        start_of_day = local_midnight.astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
        end_of_day = (local_midnight + timedelta(days=1)).astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
    else:
        start_of_day = datetime(
            year=target_date.year,
            month=target_date.month,
            day=target_date.day,
            hour=0,
            minute=0,
            second=0,
        )
        end_of_day = start_of_day + timedelta(days=1)

    def _to_local(dt_utc):
        if not dt_utc or not tz:
            return dt_utc
        try:
            if dt_utc.tzinfo is None:
                dt_utc = dt_utc.replace(tzinfo=datetime_module.timezone.utc)
            return dt_utc.astimezone(tz)
        except Exception:
            return dt_utc

    def _bucket_15min(dt_utc):
        d = _to_local(dt_utc)
        if not d:
            return None
        minute_bucket = (d.minute // 15) * 15
        return f"{d.hour:02d}:{minute_bucket:02d}"

    def _is_success_status(value: str) -> bool:
        s = (value or "").strip().lower()
        if not s:
            return False
        return ("success" in s) or ("override" in s)

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.job_name.label("job_name"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Customer.name.label("customer_name"),
        )
        .select_from(Job)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc(),
            Job.backup_type.asc(),
            Job.job_name.asc(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        expected_times = (_infer_schedule_map_from_runs(row.job_id).get(target_date.weekday()) or [])
        if not expected_times:
            monthly = _infer_monthly_schedule_from_runs(row.job_id)
            if monthly:
                try:
                    dom = int(monthly.get("day_of_month") or 0)
                except Exception:
                    dom = 0
                mtimes = monthly.get("times") or []
                try:
                    import calendar as _calendar
                    last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
                except Exception:
                    last_dom = target_date.day
                scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
                if target_date.day == scheduled_dom:
                    expected_times = list(mtimes)

        runs_for_day = (
            JobRun.query.filter(
                JobRun.job_id == row.job_id,
                JobRun.run_at >= start_of_day,
                JobRun.run_at < end_of_day,
            )
            .order_by(JobRun.run_at.asc())
            .all()
        )
        run_count = len(runs_for_day)

        last_status = "-"
        expected_display = expected_times[-1] if expected_times else "-"
        if run_count > 0:
            last_run = runs_for_day[-1]
            try:
                job_obj = Job.query.get(int(row.job_id))
                status_display, _override_applied, _override_level, _ov_id, _ov_reason = _apply_overrides_to_run(job_obj, last_run)
                if getattr(last_run, "missed", False):
                    last_status = status_display or "Missed"
                else:
                    last_status = status_display or (last_run.status or "-")
            except Exception:
                last_status = last_run.status or "-"
            expected_display = _bucket_15min(last_run.run_at) or expected_display
        else:
            try:
                today_local = datetime.now(tz).date() if tz else datetime.utcnow().date()
            except Exception:
                today_local = datetime.utcnow().date()
            if target_date > today_local:
                last_status = "Expected"
            elif target_date == today_local:
                last_status = "Expected"
            else:
                if missed_start_date and target_date < missed_start_date:
                    last_status = "-"
                else:
                    last_status = "Missed"

        success_text = "Yes" if _is_success_status(last_status) else "No"
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": f"Expected: {expected_display} | Successful: {success_text} | Runs: {run_count}",
                "link": url_for("main.daily_jobs", date=target_date.strftime("%Y-%m-%d"), open_job_id=row.job_id),
            }
        )

    return section

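The `_bucket_15min` helper above rounds a run's local timestamp down to the quarter-hour that contains it, which the Daily Jobs section uses as the "expected time" display. The core arithmetic, separated from the timezone handling:

```python
from datetime import datetime

def bucket_15min(d: datetime) -> str:
    """Collapse a timestamp to the quarter-hour that started it, as 'HH:MM'.
    Integer division by 15 floors the minute to 0, 15, 30 or 45."""
    minute_bucket = (d.minute // 15) * 15
    return f"{d.hour:02d}:{minute_bucket:02d}"
```

In the real helper the datetime is first converted from UTC to the configured UI timezone via `_to_local`, so the bucket reflects wall-clock time.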
def _build_run_checks_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "run_checks",
        "title": "Run Checks",
        "view_all_url": url_for("main.run_checks_page"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("run_checks"):
        return section

    agg = (
        db.session.query(
            JobRun.job_id.label("job_id"),
            func.count(JobRun.id).label("run_count"),
        )
        .filter(JobRun.reviewed_at.is_(None))
        .group_by(JobRun.job_id)
        .subquery()
    )

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.job_name.label("job_name"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Customer.name.label("customer_name"),
            agg.c.run_count.label("run_count"),
        )
        .select_from(Job)
        .join(agg, agg.c.job_id == Job.id)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc().nullslast(),
            Job.backup_type.asc().nullslast(),
            Job.job_name.asc().nullslast(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": f"Unreviewed runs: {int(row.run_count or 0)}",
                "link": url_for("main.run_checks_page"),
            }
        )

    return section

def _build_tickets_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "tickets",
        "title": "Tickets",
        "view_all_url": url_for("main.tickets_page"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("tickets"):
        return section

    query = (
        db.session.query(Ticket)
        .select_from(Ticket)
        .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
        .outerjoin(Customer, Customer.id == TicketScope.customer_id)
        .outerjoin(Job, Job.id == TicketScope.job_id)
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Ticket.ticket_code, ""),
            func.coalesce(Customer.name, ""),
            func.coalesce(TicketScope.scope_type, ""),
            func.coalesce(TicketScope.backup_software, ""),
            func.coalesce(TicketScope.backup_type, ""),
            func.coalesce(TicketScope.job_name_match, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    query = query.distinct()
    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Ticket.start_date.desc().nullslast()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    for t in rows:
        customer_display = "-"
        scope_summary = "-"
        try:
            scope_rows = (
                db.session.query(
                    TicketScope.scope_type.label("scope_type"),
                    TicketScope.backup_software.label("backup_software"),
                    TicketScope.backup_type.label("backup_type"),
                    Customer.name.label("customer_name"),
                )
                .select_from(TicketScope)
                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
                .filter(TicketScope.ticket_id == t.id)
                .all()
            )
            customer_names = []
            for s in scope_rows:
                cname = getattr(s, "customer_name", None)
                if cname and cname not in customer_names:
                    customer_names.append(cname)
            if customer_names:
                customer_display = customer_names[0]
                if len(customer_names) > 1:
                    customer_display = f"{customer_display} +{len(customer_names)-1}"

            if scope_rows:
                s = scope_rows[0]
                bits = []
                if getattr(s, "scope_type", None):
                    bits.append(str(getattr(s, "scope_type")))
                if getattr(s, "backup_software", None):
                    bits.append(str(getattr(s, "backup_software")))
                if getattr(s, "backup_type", None):
                    bits.append(str(getattr(s, "backup_type")))
                scope_summary = " / ".join(bits) if bits else "-"
        except Exception:
            customer_display = "-"
            scope_summary = "-"

        section["items"].append(
            {
                "title": t.ticket_code or f"Ticket #{t.id}",
                "subtitle": f"{customer_display} | {scope_summary}",
                "meta": _format_datetime(t.start_date),
                "link": url_for("main.ticket_detail", ticket_id=t.id),
            }
        )

    return section

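Both the tickets section above and the remarks section below summarize scope customers as the first distinct name plus a `+N` suffix for the rest, or `-` when no customer is attached. That display rule in isolation:

```python
def summarize_customers(names):
    """First distinct customer plus '+N' for the remaining distinct names,
    '-' when there are none; duplicates and empty entries are ignored."""
    distinct = []
    for name in names:
        if name and name not in distinct:
            distinct.append(name)
    if not distinct:
        return "-"
    display = distinct[0]
    if len(distinct) > 1:
        display = f"{display} +{len(distinct) - 1}"
    return display
```

Deduplicating first means a ticket scoped to the same customer several times still reads as a single name.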
def _build_remarks_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "remarks",
        "title": "Remarks",
        "view_all_url": url_for("main.tickets_page", tab="remarks"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("remarks"):
        return section

    query = (
        db.session.query(Remark)
        .select_from(Remark)
        .outerjoin(RemarkScope, RemarkScope.remark_id == Remark.id)
        .outerjoin(Customer, Customer.id == RemarkScope.customer_id)
        .outerjoin(Job, Job.id == RemarkScope.job_id)
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Remark.title, ""),
            func.coalesce(Remark.body, ""),
            func.coalesce(Customer.name, ""),
            func.coalesce(RemarkScope.scope_type, ""),
            func.coalesce(RemarkScope.backup_software, ""),
            func.coalesce(RemarkScope.backup_type, ""),
            func.coalesce(RemarkScope.job_name_match, ""),
            func.coalesce(Job.job_name, ""),
            cast(Remark.start_date, String),
            cast(Remark.resolved_at, String),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    query = query.distinct()
    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Remark.start_date.desc().nullslast()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    for r in rows:
        customer_display = "-"
        scope_summary = "-"
        try:
            scope_rows = (
                db.session.query(
                    RemarkScope.scope_type.label("scope_type"),
                    RemarkScope.backup_software.label("backup_software"),
                    RemarkScope.backup_type.label("backup_type"),
                    Customer.name.label("customer_name"),
                )
                .select_from(RemarkScope)
                .outerjoin(Customer, Customer.id == RemarkScope.customer_id)
                .filter(RemarkScope.remark_id == r.id)
                .all()
            )
            customer_names = []
            for s in scope_rows:
                cname = getattr(s, "customer_name", None)
                if cname and cname not in customer_names:
                    customer_names.append(cname)
            if customer_names:
                customer_display = customer_names[0]
                if len(customer_names) > 1:
                    customer_display = f"{customer_display} +{len(customer_names)-1}"

            if scope_rows:
                s = scope_rows[0]
                bits = []
                if getattr(s, "scope_type", None):
                    bits.append(str(getattr(s, "scope_type")))
                if getattr(s, "backup_software", None):
                    bits.append(str(getattr(s, "backup_software")))
                if getattr(s, "backup_type", None):
                    bits.append(str(getattr(s, "backup_type")))
                scope_summary = " / ".join(bits) if bits else "-"
        except Exception:
            customer_display = "-"
            scope_summary = "-"

        preview = (r.title or r.body or "").strip()
        if len(preview) > 80:
            preview = preview[:77] + "..."

        section["items"].append(
            {
                "title": preview or f"Remark #{r.id}",
                "subtitle": f"{customer_display} | {scope_summary}",
                "meta": _format_datetime(r.start_date),
                "link": url_for("main.remark_detail", remark_id=r.id),
            }
        )

    return section

def _build_overrides_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "overrides",
        "title": "Existing overrides",
        "view_all_url": url_for("main.overrides"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("overrides"):
        return section

    query = (
        db.session.query(
            Override.id.label("id"),
            Override.level.label("level"),
            Override.backup_software.label("backup_software"),
            Override.backup_type.label("backup_type"),
            Override.object_name.label("object_name"),
            Override.start_at.label("start_at"),
            Override.end_at.label("end_at"),
            Override.comment.label("comment"),
            Customer.name.label("customer_name"),
            Job.job_name.label("job_name"),
        )
        .select_from(Override)
        .outerjoin(Job, Job.id == Override.job_id)
        .outerjoin(Customer, Customer.id == Job.customer_id)
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Override.level, ""),
            func.coalesce(Customer.name, ""),
            func.coalesce(Override.backup_software, ""),
            func.coalesce(Override.backup_type, ""),
            func.coalesce(Job.job_name, ""),
            func.coalesce(Override.object_name, ""),
            cast(Override.start_at, String),
            cast(Override.end_at, String),
            func.coalesce(Override.comment, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Override.level.asc(), Override.start_at.desc()],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        scope_bits = []
        if row.customer_name:
            scope_bits.append(row.customer_name)
        if row.backup_software:
            scope_bits.append(row.backup_software)
        if row.backup_type:
            scope_bits.append(row.backup_type)
        if row.job_name:
            scope_bits.append(row.job_name)
        if row.object_name:
            scope_bits.append(f"object: {row.object_name}")
        scope_text = " / ".join(scope_bits) if scope_bits else "All jobs"

        section["items"].append(
            {
                "title": (row.level or "override").capitalize(),
                "subtitle": scope_text,
                "meta": f"From {_format_datetime(row.start_at)} to {_format_datetime(row.end_at) if row.end_at else '-'} | {row.comment or ''}",
                "link": url_for("main.overrides"),
            }
        )

    return section

def _build_reports_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "reports",
        "title": "Reports",
        "view_all_url": url_for("main.reports"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("reports"):
        return section

    query = ReportDefinition.query
    match_expr = _contains_all_terms(
        [
            func.coalesce(ReportDefinition.name, ""),
            func.coalesce(ReportDefinition.report_type, ""),
            cast(ReportDefinition.period_start, String),
            cast(ReportDefinition.period_end, String),
            func.coalesce(ReportDefinition.output_format, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [ReportDefinition.created_at.desc()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    can_edit = get_active_role() in ("admin", "operator", "reporter")
    for r in rows:
        section["items"].append(
            {
                "title": r.name or f"Report #{r.id}",
                "subtitle": f"{r.report_type or '-'} | {r.output_format or '-'}",
                "meta": f"{_format_datetime(r.period_start)} -> {_format_datetime(r.period_end)}",
                "link": (url_for("main.reports_edit", report_id=r.id) if can_edit else url_for("main.reports")),
            }
        )

    return section

@main_bp.route("/search")
@login_required
def search_page():
    query = (request.args.get("q") or "").strip()
    patterns = _build_patterns(query)

    requested_pages = {
        key: _parse_page(request.args.get(f"p_{key}"))
        for key in SEARCH_SECTION_KEYS
    }

    sections = []
    if patterns:
        sections.append(_build_inbox_results(patterns, requested_pages["inbox"]))
        sections.append(_build_customers_results(patterns, requested_pages["customers"]))
        sections.append(_build_jobs_results(patterns, requested_pages["jobs"]))
        sections.append(_build_daily_jobs_results(patterns, requested_pages["daily_jobs"]))
        sections.append(_build_run_checks_results(patterns, requested_pages["run_checks"]))
        sections.append(_build_tickets_results(patterns, requested_pages["tickets"]))
        sections.append(_build_remarks_results(patterns, requested_pages["remarks"]))
        sections.append(_build_overrides_results(patterns, requested_pages["overrides"]))
        sections.append(_build_reports_results(patterns, requested_pages["reports"]))
    else:
        sections = [
            {"key": "inbox", "title": "Inbox", "view_all_url": url_for("main.inbox"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "customers", "title": "Customers", "view_all_url": url_for("main.customers"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "jobs", "title": "Jobs", "view_all_url": url_for("main.jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "daily_jobs", "title": "Daily Jobs", "view_all_url": url_for("main.daily_jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "run_checks", "title": "Run Checks", "view_all_url": url_for("main.run_checks_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "tickets", "title": "Tickets", "view_all_url": url_for("main.tickets_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "remarks", "title": "Remarks", "view_all_url": url_for("main.tickets_page", tab="remarks"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "overrides", "title": "Existing overrides", "view_all_url": url_for("main.overrides"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "reports", "title": "Reports", "view_all_url": url_for("main.reports"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
        ]

    visible_sections = [s for s in sections if _is_section_allowed(s["key"])]
    current_pages = {
        s["key"]: int(s.get("current_page", 1) or 1)
        for s in sections
    }

    def _build_search_url(page_overrides: dict[str, int]) -> str:
        args = {"q": query}
        for key in SEARCH_SECTION_KEYS:
            args[f"p_{key}"] = int(page_overrides.get(key, current_pages.get(key, 1)))
        return url_for("main.search_page", **args)

    for s in visible_sections:
        key = s["key"]
        cur = int(s.get("current_page", 1) or 1)
        if query:
            if key == "inbox":
                s["view_all_url"] = url_for("main.inbox", q=query)
            elif key == "customers":
                s["view_all_url"] = url_for("main.customers", q=query)
            elif key == "jobs":
                s["view_all_url"] = url_for("main.jobs", q=query)
            elif key == "daily_jobs":
                s["view_all_url"] = url_for("main.daily_jobs", q=query)
            elif key == "run_checks":
                s["view_all_url"] = url_for("main.run_checks_page", q=query)
            elif key == "tickets":
                s["view_all_url"] = url_for("main.tickets_page", q=query)
            elif key == "remarks":
                s["view_all_url"] = url_for("main.tickets_page", tab="remarks", q=query)
            elif key == "overrides":
                s["view_all_url"] = url_for("main.overrides", q=query)
            elif key == "reports":
                s["view_all_url"] = url_for("main.reports", q=query)
        if s.get("has_prev"):
            prev_pages = dict(current_pages)
            prev_pages[key] = cur - 1
            s["prev_url"] = _build_search_url(prev_pages)
        if s.get("has_next"):
            next_pages = dict(current_pages)
            next_pages[key] = cur + 1
            s["next_url"] = _build_search_url(next_pages)

    total_hits = sum(int(s.get("total", 0) or 0) for s in visible_sections)

    return render_template(
        "main/search.html",
        query=query,
        sections=visible_sections,
        total_hits=total_hits,
        limit_per_section=SEARCH_LIMIT_PER_SECTION,
    )
@@ -585,6 +585,7 @@ def settings_jobs_export():
@roles_required("admin")
def settings_jobs_import():
    upload = request.files.get("jobs_file")
    include_autotask_ids = bool(request.form.get("include_autotask_ids"))
    if not upload or not upload.filename:
        flash("No import file was provided.", "danger")
        return redirect(url_for("main.settings", section="general"))
@@ -621,14 +622,17 @@ def settings_jobs_import():
            if not cust_name:
                continue

            # Read Autotask fields (backwards compatible - optional)
            autotask_company_id = cust_item.get("autotask_company_id")
            autotask_company_name = cust_item.get("autotask_company_name")
            autotask_company_id = None
            autotask_company_name = None
            if include_autotask_ids:
                # Read Autotask fields (backwards compatible - optional)
                autotask_company_id = cust_item.get("autotask_company_id")
                autotask_company_name = cust_item.get("autotask_company_name")

            existing_customer = Customer.query.filter_by(name=cust_name).first()
            if existing_customer:
                # Update Autotask mapping if provided
                if autotask_company_id is not None:
                # Update Autotask mapping only when explicitly allowed by import option.
                if include_autotask_ids and autotask_company_id is not None:
                    existing_customer.autotask_company_id = autotask_company_id
                    existing_customer.autotask_company_name = autotask_company_name
                    existing_customer.autotask_mapping_status = None  # Will be resynced
@@ -747,7 +751,7 @@ def settings_jobs_import():

        db.session.commit()
        flash(
            f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}.",
            f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
            "success",
        )

@@ -758,6 +762,7 @@ def settings_jobs_import():
            details=json.dumps({
                "format": "JSON",
                "schema": payload.get("schema"),
                "include_autotask_ids": include_autotask_ids,
                "customers_created": created_customers,
                "customers_updated": updated_customers,
                "jobs_created": created_jobs,
@@ -781,6 +786,8 @@ def settings():

    if request.method == "POST":
        autotask_form_touched = any(str(k).startswith("autotask_") for k in (request.form or {}).keys())
        cove_form_touched = any(str(k).startswith("cove_") for k in (request.form or {}).keys())
        entra_form_touched = any(str(k).startswith("entra_") for k in (request.form or {}).keys())
        import_form_touched = any(str(k).startswith("auto_import_") or str(k).startswith("manual_import_") or str(k).startswith("ingest_eml_") for k in (request.form or {}).keys())
        general_form_touched = "ui_timezone" in request.form
        mail_form_touched = any(k in request.form for k in ["graph_tenant_id", "graph_client_id", "graph_mailbox", "incoming_folder", "processed_folder"])
@@ -903,6 +910,51 @@ def settings():
            except (ValueError, TypeError):
                pass

        # Cove Data Protection integration
        if cove_form_touched:
            settings.cove_enabled = bool(request.form.get("cove_enabled"))
            settings.cove_import_enabled = bool(request.form.get("cove_import_enabled"))

            if "cove_api_url" in request.form:
                settings.cove_api_url = (request.form.get("cove_api_url") or "").strip() or None

            if "cove_api_username" in request.form:
                settings.cove_api_username = (request.form.get("cove_api_username") or "").strip() or None

            if "cove_api_password" in request.form:
                pw = (request.form.get("cove_api_password") or "").strip()
                if pw:
                    settings.cove_api_password = pw

            if "cove_import_interval_minutes" in request.form:
                try:
                    interval = int(request.form.get("cove_import_interval_minutes") or 30)
                    if interval < 1:
                        interval = 1
                    settings.cove_import_interval_minutes = interval
                except (ValueError, TypeError):
                    pass

        # Microsoft Entra SSO
        if entra_form_touched:
            settings.entra_sso_enabled = bool(request.form.get("entra_sso_enabled"))
            settings.entra_auto_provision_users = bool(request.form.get("entra_auto_provision_users"))

            if "entra_tenant_id" in request.form:
                settings.entra_tenant_id = (request.form.get("entra_tenant_id") or "").strip() or None
            if "entra_client_id" in request.form:
                settings.entra_client_id = (request.form.get("entra_client_id") or "").strip() or None
            if "entra_redirect_uri" in request.form:
                settings.entra_redirect_uri = (request.form.get("entra_redirect_uri") or "").strip() or None
            if "entra_allowed_domain" in request.form:
                settings.entra_allowed_domain = (request.form.get("entra_allowed_domain") or "").strip() or None
            if "entra_allowed_group_ids" in request.form:
                settings.entra_allowed_group_ids = (request.form.get("entra_allowed_group_ids") or "").strip() or None
            if "entra_client_secret" in request.form:
                pw = (request.form.get("entra_client_secret") or "").strip()
                if pw:
                    settings.entra_client_secret = pw

        # Daily Jobs
        if "daily_jobs_start_date" in request.form:
            daily_jobs_start_date_str = (request.form.get("daily_jobs_start_date") or "").strip()
@@ -1114,6 +1166,8 @@ def settings():

    has_client_secret = bool(settings.graph_client_secret)
    has_autotask_password = bool(getattr(settings, "autotask_api_password", None))
    has_cove_password = bool(getattr(settings, "cove_api_password", None))
    has_entra_secret = bool(getattr(settings, "entra_client_secret", None))

    # Common UI timezones (IANA names)
    tz_options = [
@@ -1239,6 +1293,8 @@ def settings():
        free_disk_warning=free_disk_warning,
        has_client_secret=has_client_secret,
        has_autotask_password=has_autotask_password,
        has_cove_password=has_cove_password,
        has_entra_secret=has_entra_secret,
        tz_options=tz_options,
        users=users,
        admin_users_count=admin_users_count,
@@ -1253,6 +1309,83 @@ def settings():
    )

@main_bp.route("/settings/cove/test-connection", methods=["POST"])
@login_required
@roles_required("admin")
def settings_cove_test_connection():
    """Test the Cove Data Protection API connection and return JSON result."""
    from flask import jsonify
    from ..cove_importer import CoveImportError, _cove_login, COVE_DEFAULT_URL

    settings = _get_or_create_settings()

    username = (getattr(settings, "cove_api_username", None) or "").strip()
    password = (getattr(settings, "cove_api_password", None) or "").strip()
    url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL

    if not username or not password:
        return jsonify({"ok": False, "message": "Cove API username and password must be saved first."})

    try:
        visa, partner_id = _cove_login(url, username, password)
        # Store the partner_id
        settings.cove_partner_id = partner_id
        db.session.commit()
        _log_admin_event(
            "cove_test_connection",
            f"Cove connection test succeeded. Partner ID: {partner_id}",
        )
        return jsonify({
            "ok": True,
            "partner_id": partner_id,
            "message": f"Connected – Partner ID: {partner_id}",
        })
    except CoveImportError as exc:
        db.session.rollback()
        return jsonify({"ok": False, "message": str(exc)})
    except Exception as exc:
        db.session.rollback()
        return jsonify({"ok": False, "message": f"Unexpected error: {exc}"})


@main_bp.route("/settings/cove/run-now", methods=["POST"])
@login_required
@roles_required("admin")
def settings_cove_run_now():
    """Manually trigger a Cove import and show the result as a flash message."""
    from ..cove_importer import CoveImportError, run_cove_import

    settings = _get_or_create_settings()

    if not getattr(settings, "cove_enabled", False):
        flash("Cove integration is not enabled.", "warning")
        return redirect(url_for("main.settings", section="integrations"))

    username = (getattr(settings, "cove_api_username", None) or "").strip()
    password = (getattr(settings, "cove_api_password", None) or "").strip()
    if not username or not password:
        flash("Cove API credentials not configured.", "warning")
        return redirect(url_for("main.settings", section="integrations"))

    try:
        total, created, skipped, errors = run_cove_import(settings)
        _log_admin_event(
            "cove_import_manual",
            f"Manual Cove import finished. accounts={total}, created={created}, skipped={skipped}, errors={errors}",
        )
        flash(
            f"Cove import finished. Accounts: {total}, new runs: {created}, skipped: {skipped}, errors: {errors}.",
            "success" if errors == 0 else "warning",
        )
    except CoveImportError as exc:
        _log_admin_event("cove_import_manual_error", f"Manual Cove import failed: {exc}")
        flash(f"Cove import failed: {exc}", "danger")
    except Exception as exc:
        _log_admin_event("cove_import_manual_error", f"Unexpected error during manual Cove import: {exc}")
        flash(f"Unexpected error: {exc}", "danger")

    return redirect(url_for("main.settings", section="integrations"))


@main_bp.route("/settings/news/create", methods=["POST"])
@login_required

@@ -52,6 +52,7 @@ from ..models import (
    FeedbackItem,
    FeedbackVote,
    FeedbackReply,
    FeedbackAttachment,
    NewsItem,
    NewsRead,
    ReportDefinition,
@@ -678,6 +679,10 @@ def _infer_schedule_map_from_runs(job_id: int):
            return schedule
        if bs == 'qnap' and bt == 'firmware update':
            return schedule
        if bs == '3cx' and bt == 'update':
            return schedule
        if bs == '3cx' and bt == 'ssl certificate':
            return schedule
        if bs == 'syncovery' and bt == 'syncovery':
            return schedule
    except Exception:
@@ -993,4 +998,3 @@ def _next_ticket_code(now_utc: datetime) -> str:
        seq = 1

    return f"{prefix}{seq:04d}"

@@ -28,17 +28,33 @@ def tickets_page():

    if tab == "tickets":
        query = Ticket.query
        joined_scope = False
        if active_only:
            query = query.filter(Ticket.resolved_at.is_(None))
        if q:
            like_q = f"%{q}%"
            query = (
                query
                .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
                .outerjoin(Job, Job.id == TicketScope.job_id)
            )
            joined_scope = True
            query = query.filter(
                (Ticket.ticket_code.ilike(like_q))
                | (Ticket.description.ilike(like_q))
                | (Customer.name.ilike(like_q))
                | (TicketScope.scope_type.ilike(like_q))
                | (TicketScope.backup_software.ilike(like_q))
                | (TicketScope.backup_type.ilike(like_q))
                | (TicketScope.job_name_match.ilike(like_q))
                | (Job.job_name.ilike(like_q))
            )
            query = query.distinct()

        if customer_id or backup_software or backup_type:
            query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
            if not joined_scope:
                query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
            if customer_id:
                query = query.filter(TicketScope.customer_id == customer_id)
            if backup_software:
@@ -322,4 +338,3 @@ def ticket_detail(ticket_id: int):
        scopes=scopes,
        runs=runs,
    )

@@ -1070,6 +1070,141 @@ def migrate_rename_admin_logs_to_audit_logs() -> None:
        print("[migrations] audit_logs table will be created by db.create_all()")


def migrate_cove_accounts_table() -> None:
    """Create the cove_accounts staging table if it does not exist.

    This table stores all accounts returned by Cove EnumerateAccountStatistics.
    Unlinked accounts (job_id IS NULL) appear in the Cove Accounts review page.
    """
    try:
        engine = db.get_engine()
    except Exception as exc:
        print(f"[migrations] Could not get engine for cove_accounts migration: {exc}")
        return

    try:
        with engine.begin() as conn:
            conn.execute(text("""
                CREATE TABLE IF NOT EXISTS cove_accounts (
                    id SERIAL PRIMARY KEY,
                    account_id INTEGER NOT NULL UNIQUE,
                    account_name VARCHAR(512) NULL,
                    computer_name VARCHAR(512) NULL,
                    customer_name VARCHAR(255) NULL,
                    datasource_types VARCHAR(255) NULL,
                    last_status_code INTEGER NULL,
                    last_run_at TIMESTAMP NULL,
                    colorbar_28d VARCHAR(64) NULL,
                    job_id INTEGER NULL REFERENCES jobs(id) ON DELETE SET NULL,
                    first_seen_at TIMESTAMP NOT NULL DEFAULT NOW(),
                    last_seen_at TIMESTAMP NOT NULL DEFAULT NOW()
                )
            """))
            conn.execute(text(
                "CREATE INDEX IF NOT EXISTS idx_cove_accounts_account_id ON cove_accounts (account_id)"
            ))
            conn.execute(text(
                "CREATE INDEX IF NOT EXISTS idx_cove_accounts_job_id ON cove_accounts (job_id)"
            ))
        print("[migrations] migrate_cove_accounts_table completed.")
    except Exception as exc:
        print(f"[migrations] Failed to migrate cove_accounts table: {exc}")


def migrate_cove_integration() -> None:
    """Add Cove Data Protection integration columns if missing.

    Adds to system_settings:
    - cove_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
    - cove_api_url (VARCHAR(255) NULL)
    - cove_api_username (VARCHAR(255) NULL)
    - cove_api_password (VARCHAR(255) NULL)
    - cove_import_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
    - cove_import_interval_minutes (INTEGER NOT NULL DEFAULT 30)
    - cove_partner_id (INTEGER NULL)
    - cove_last_import_at (TIMESTAMP NULL)

    Adds to jobs:
    - cove_account_id (INTEGER NULL)

    Adds to job_runs:
    - source_type (VARCHAR(20) NULL)
    - external_id (VARCHAR(100) NULL)
    """
    try:
        engine = db.get_engine()
    except Exception as exc:
        print(f"[migrations] Could not get engine for Cove integration migration: {exc}")
        return

    try:
        with engine.begin() as conn:
            # system_settings columns
            ss_columns = [
                ("cove_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
                ("cove_api_url", "VARCHAR(255) NULL"),
                ("cove_api_username", "VARCHAR(255) NULL"),
                ("cove_api_password", "VARCHAR(255) NULL"),
                ("cove_import_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
                ("cove_import_interval_minutes", "INTEGER NOT NULL DEFAULT 30"),
                ("cove_partner_id", "INTEGER NULL"),
                ("cove_last_import_at", "TIMESTAMP NULL"),
            ]
            for column, ddl in ss_columns:
                if _column_exists_on_conn(conn, "system_settings", column):
                    continue
                conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))

            # jobs column
            if not _column_exists_on_conn(conn, "jobs", "cove_account_id"):
                conn.execute(text('ALTER TABLE "jobs" ADD COLUMN cove_account_id INTEGER NULL'))

            # job_runs columns
            if not _column_exists_on_conn(conn, "job_runs", "source_type"):
                conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN source_type VARCHAR(20) NULL'))
            if not _column_exists_on_conn(conn, "job_runs", "external_id"):
                conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN external_id VARCHAR(100) NULL'))

            # Index for deduplication lookups
            conn.execute(text(
                'CREATE INDEX IF NOT EXISTS idx_job_runs_external_id ON "job_runs" (external_id)'
            ))

        print("[migrations] migrate_cove_integration completed.")
    except Exception as exc:
        print(f"[migrations] Failed to migrate Cove integration columns: {exc}")


def migrate_entra_sso_settings() -> None:
    """Add Microsoft Entra SSO columns to system_settings if missing."""
    try:
        engine = db.get_engine()
    except Exception as exc:
        print(f"[migrations] Could not get engine for Entra SSO migration: {exc}")
        return

    columns = [
        ("entra_sso_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
        ("entra_tenant_id", "VARCHAR(128) NULL"),
        ("entra_client_id", "VARCHAR(128) NULL"),
        ("entra_client_secret", "VARCHAR(255) NULL"),
        ("entra_redirect_uri", "VARCHAR(512) NULL"),
        ("entra_allowed_domain", "VARCHAR(255) NULL"),
        ("entra_allowed_group_ids", "TEXT NULL"),
        ("entra_auto_provision_users", "BOOLEAN NOT NULL DEFAULT FALSE"),
    ]

    try:
        with engine.begin() as conn:
            for column, ddl in columns:
                if _column_exists_on_conn(conn, "system_settings", column):
                    continue
                conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))
        print("[migrations] migrate_entra_sso_settings completed.")
    except Exception as exc:
        print(f"[migrations] Failed to migrate Entra SSO columns: {exc}")


def run_migrations() -> None:
    print("[migrations] Starting migrations...")
    migrate_add_username_to_users()
@@ -1095,6 +1230,7 @@ def run_migrations() -> None:
    migrate_object_persistence_tables()
    migrate_feedback_tables()
    migrate_feedback_replies_table()
    migrate_feedback_attachments_table()
    migrate_tickets_active_from_date()
    migrate_tickets_resolved_origin()
    migrate_remarks_active_from_date()
@@ -1111,6 +1247,9 @@ def run_migrations() -> None:
    migrate_performance_indexes()
    migrate_system_settings_require_daily_dashboard_visit()
    migrate_rename_admin_logs_to_audit_logs()
    migrate_cove_integration()
    migrate_cove_accounts_table()
    migrate_entra_sso_settings()
    print("[migrations] All migrations completed.")

@@ -1446,6 +1585,49 @@ def migrate_feedback_replies_table() -> None:
    print("[migrations] Feedback replies table ensured.")


def migrate_feedback_attachments_table() -> None:
    """Ensure feedback attachments table exists.

    Table:
    - feedback_attachments (screenshots/images for feedback items and replies)
    """
    engine = db.get_engine()
    with engine.begin() as conn:
        conn.execute(
            text(
                """
                CREATE TABLE IF NOT EXISTS feedback_attachments (
                    id SERIAL PRIMARY KEY,
                    feedback_item_id INTEGER NOT NULL REFERENCES feedback_items(id) ON DELETE CASCADE,
                    feedback_reply_id INTEGER REFERENCES feedback_replies(id) ON DELETE CASCADE,
                    filename VARCHAR(255) NOT NULL,
                    file_data BYTEA NOT NULL,
                    mime_type VARCHAR(64) NOT NULL,
                    file_size INTEGER NOT NULL,
                    created_at TIMESTAMP NOT NULL DEFAULT NOW()
                );
                """
            )
        )
        conn.execute(
            text(
                """
                CREATE INDEX IF NOT EXISTS idx_feedback_attachments_item
                ON feedback_attachments (feedback_item_id);
                """
            )
        )
        conn.execute(
            text(
                """
                CREATE INDEX IF NOT EXISTS idx_feedback_attachments_reply
                ON feedback_attachments (feedback_reply_id);
                """
            )
        )
    print("[migrations] Feedback attachments table ensured.")


def migrate_tickets_active_from_date() -> None:
    """Ensure tickets.active_from_date exists and is populated.

@@ -117,6 +117,26 @@ class SystemSettings(db.Model):
    # this is not a production environment.
    is_sandbox_environment = db.Column(db.Boolean, nullable=False, default=False)

    # Cove Data Protection integration settings
    cove_enabled = db.Column(db.Boolean, nullable=False, default=False)
    cove_api_url = db.Column(db.String(255), nullable=True)  # default: https://api.backup.management/jsonapi
    cove_api_username = db.Column(db.String(255), nullable=True)
    cove_api_password = db.Column(db.String(255), nullable=True)
    cove_import_enabled = db.Column(db.Boolean, nullable=False, default=False)
    cove_import_interval_minutes = db.Column(db.Integer, nullable=False, default=30)
    cove_partner_id = db.Column(db.Integer, nullable=True)  # stored after successful login
    cove_last_import_at = db.Column(db.DateTime, nullable=True)

    # Microsoft Entra SSO settings
    entra_sso_enabled = db.Column(db.Boolean, nullable=False, default=False)
    entra_tenant_id = db.Column(db.String(128), nullable=True)
    entra_client_id = db.Column(db.String(128), nullable=True)
    entra_client_secret = db.Column(db.String(255), nullable=True)
    entra_redirect_uri = db.Column(db.String(512), nullable=True)
    entra_allowed_domain = db.Column(db.String(255), nullable=True)
    entra_allowed_group_ids = db.Column(db.Text, nullable=True)  # comma/newline separated Entra Group Object IDs
    entra_auto_provision_users = db.Column(db.Boolean, nullable=False, default=False)

    # Autotask integration settings
    autotask_enabled = db.Column(db.Boolean, nullable=False, default=False)
    autotask_environment = db.Column(db.String(32), nullable=True)  # sandbox | production
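The `entra_allowed_group_ids` column stores comma/newline separated group object IDs in a single text field. A hedged sketch of how such a field could be parsed into a set (the helper name is an assumption, not taken from the source):

```python
def parse_allowed_group_ids(raw):
    # Accept comma- and/or newline-separated Entra Group Object IDs,
    # ignoring blanks and surrounding whitespace.
    if not raw:
        return set()
    parts = raw.replace(",", "\n").splitlines()
    return {p.strip() for p in parts if p.strip()}

ids = parse_allowed_group_ids("aaa-111, bbb-222\n ccc-333\n\n")
print(sorted(ids))
```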
@@ -242,6 +262,9 @@ class Job(db.Model):
    auto_approve = db.Column(db.Boolean, nullable=False, default=True)
    active = db.Column(db.Boolean, nullable=False, default=True)

    # Cove Data Protection integration (legacy: account ID stored directly on job)
    cove_account_id = db.Column(db.Integer, nullable=True)  # kept for backwards compat

    # Archived jobs are excluded from Daily Jobs and Run Checks.
    # JobRuns remain in the database and are still included in reporting.
    archived = db.Column(db.Boolean, nullable=False, default=False)
@@ -290,6 +313,10 @@ class JobRun(db.Model):
    reviewed_at = db.Column(db.DateTime, nullable=True)
    reviewed_by_user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True)

    # Import source tracking
    source_type = db.Column(db.String(20), nullable=True)  # NULL = email (backwards compat), "cove_api"
    external_id = db.Column(db.String(100), nullable=True)  # e.g. "cove-{account_id}-{run_ts}" for deduplication

    # Autotask integration (Phase 4: ticket creation from Run Checks)
    autotask_ticket_id = db.Column(db.Integer, nullable=True)
    autotask_ticket_number = db.Column(db.String(64), nullable=True)
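The `external_id` comment documents a `"cove-{account_id}-{run_ts}"` convention used for deduplication. A small sketch of how an importer could use that key to skip already-imported sessions (the helper name and loop are illustrative assumptions):

```python
def cove_external_id(account_id, run_ts):
    # Mirrors the "cove-{account_id}-{run_ts}" convention noted on
    # JobRun.external_id; the exact importer helper name is assumed.
    return f"cove-{account_id}-{run_ts}"

seen = set()
created = []
for acc_id, ts in [(42, 1700000000), (42, 1700000000), (42, 1700086400)]:
    ext = cove_external_id(acc_id, ts)
    if ext in seen:
        continue  # duplicate session: skip instead of creating a second JobRun
    seen.add(ext)
    created.append(ext)
print(created)
```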
@@ -314,6 +341,41 @@ class JobRun(db.Model):
    autotask_ticket_created_by = db.relationship("User", foreign_keys=[autotask_ticket_created_by_user_id])

class CoveAccount(db.Model):
    """Staging table for Cove Data Protection accounts.

    All accounts returned by EnumerateAccountStatistics are upserted here.
    Unlinked accounts (job_id IS NULL) appear in the Cove Accounts page
    where an admin can create or link a job – the same flow as the mail Inbox.
    Once linked, the importer creates JobRuns for each new session.
    """
    __tablename__ = "cove_accounts"

    id = db.Column(db.Integer, primary_key=True)

    # Cove account identifier (unique, from AccountId field)
    account_id = db.Column(db.Integer, nullable=False, unique=True)

    # Account/device info from Cove columns
    account_name = db.Column(db.String(512), nullable=True)  # I1 – device/backup name
    computer_name = db.Column(db.String(512), nullable=True)  # I18 – computer name
    customer_name = db.Column(db.String(255), nullable=True)  # I8 – Cove customer/partner name
    datasource_types = db.Column(db.String(255), nullable=True)  # I78 – active datasource label

    # Last known status
    last_status_code = db.Column(db.Integer, nullable=True)  # D09F00
    last_run_at = db.Column(db.DateTime, nullable=True)  # D09F15 (converted from Unix ts)
    colorbar_28d = db.Column(db.String(64), nullable=True)  # D09F08

    # Link to a Backupchecks job (NULL = unmatched, needs review)
    job_id = db.Column(db.Integer, db.ForeignKey("jobs.id"), nullable=True)

    first_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
    last_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)

    job = db.relationship("Job", backref=db.backref("cove_account", uselist=False))
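The docstring says all accounts from EnumerateAccountStatistics are upserted into this staging table. A minimal sketch of that upsert step using sqlite3 and a reduced schema (the real model has more columns; the important property is that re-imports refresh account data without disturbing an existing `job_id` link):

```python
import sqlite3

# Reduced staging table: account_id is unique, job_id holds the link.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE cove_accounts ("
    " account_id INTEGER UNIQUE, account_name TEXT, job_id INTEGER)"
)

def upsert_account(account_id, account_name):
    # Insert new accounts; refresh the name on re-import without
    # touching job_id, so existing job links survive.
    conn.execute(
        "INSERT INTO cove_accounts (account_id, account_name) VALUES (?, ?) "
        "ON CONFLICT(account_id) DO UPDATE SET account_name = excluded.account_name",
        (account_id, account_name),
    )

upsert_account(1001, "SERVER01")
upsert_account(1001, "SERVER01-renamed")  # second import updates in place
rows = conn.execute("SELECT account_id, account_name FROM cove_accounts").fetchall()
print(rows)
```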


class JobRunReviewEvent(db.Model):
    __tablename__ = "job_run_review_events"

@@ -567,6 +629,23 @@ class FeedbackReply(db.Model):
    created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)


class FeedbackAttachment(db.Model):
    __tablename__ = "feedback_attachments"

    id = db.Column(db.Integer, primary_key=True)
    feedback_item_id = db.Column(
        db.Integer, db.ForeignKey("feedback_items.id", ondelete="CASCADE"), nullable=False
    )
    feedback_reply_id = db.Column(
        db.Integer, db.ForeignKey("feedback_replies.id", ondelete="CASCADE"), nullable=True
    )
    filename = db.Column(db.String(255), nullable=False)
    file_data = db.Column(db.LargeBinary, nullable=False)
    mime_type = db.Column(db.String(64), nullable=False)
    file_size = db.Column(db.Integer, nullable=False)
    created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
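`FeedbackAttachment` stores the raw bytes plus `mime_type` and `file_size` metadata. A hedged sketch of deriving those fields before insert; the size limit, fallback type, and helper name are assumptions for illustration, not taken from the source:

```python
import mimetypes

MAX_ATTACHMENT_BYTES = 5 * 1024 * 1024  # illustrative limit, not from the source

def attachment_metadata(filename, data):
    # Derive the mime_type/file_size fields the model stores.
    if len(data) > MAX_ATTACHMENT_BYTES:
        raise ValueError("attachment too large")
    mime, _ = mimetypes.guess_type(filename)
    return {
        "filename": filename,
        "mime_type": mime or "application/octet-stream",
        "file_size": len(data),
    }

meta = attachment_metadata("screenshot.png", b"\x89PNG....")
print(meta)
```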


class NewsItem(db.Model):
    __tablename__ = "news_items"

@@ -24,6 +24,10 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
    - SSL Certificate Renewal (informational)
      Subject: '3CX Notification: SSL Certificate Renewal - <host>'
      Body contains an informational message about the renewal.

    - Update Successful (informational)
      Subject: '3CX Notification: Update Successful - <host>'
      Body confirms update completion and healthy services.
    """
    subject = (msg.subject or "").strip()
    if not subject:
@@ -38,11 +42,16 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
        subject,
        flags=re.IGNORECASE,
    )
    m_update = re.match(
        r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$",
        subject,
        flags=re.IGNORECASE,
    )

    if not m_backup and not m_ssl:
    if not m_backup and not m_ssl and not m_update:
        return False, {}, []

    job_name = (m_backup or m_ssl).group(1).strip()
    job_name = (m_backup or m_ssl or m_update).group(1).strip()

    body = (getattr(msg, "text_body", None) or getattr(msg, "body", None) or "")
    if not body:
@@ -60,6 +69,17 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
        }
        return True, result, []

    # Update successful: store as tracked informational run
    if m_update:
        result = {
            "backup_software": "3CX",
            "backup_type": "Update",
            "job_name": job_name,
            "overall_status": "Success",
            "overall_message": body or None,
        }
        return True, result, []

    # Backup complete
    backup_file = None
    m_file = re.search(r"^\s*Backup\s+name\s*:\s*(.+?)\s*$", body, flags=re.IGNORECASE | re.MULTILINE)
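The parser routes on the mail subject: each notification type gets its own anchored regex, and `group(1)` captures the host/job name after the trailing dash. A standalone sketch of that routing using only the patterns visible in this diff (the SSL pattern is reconstructed from the documented subject format and is an assumption; the backup-complete pattern is outside the hunk and omitted):

```python
import re

M_SSL = re.compile(
    r"^3CX Notification:\s*SSL Certificate Renewal\s*-\s*(.+)$", re.IGNORECASE
)
M_UPDATE = re.compile(
    r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$", re.IGNORECASE
)

def classify_subject(subject):
    # Returns (kind, job_name) for a recognized subject, else (None, None).
    subject = (subject or "").strip()
    for kind, rx in (("SSL", M_SSL), ("Update", M_UPDATE)):
        m = rx.match(subject)
        if m:
            return kind, m.group(1).strip()
    return None, None

print(classify_subject("3CX Notification: Update Successful - pbx.example.com"))
```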

@@ -11,7 +11,7 @@
            class="form-control"
            id="username"
            name="username"
            value="{{ email or '' }}"
            value="{{ username or '' }}"
            required
          />
        </div>
@@ -39,6 +39,12 @@
          <a class="btn btn-link" href="{{ url_for('auth.password_reset_request') }}">Forgot password?</a>
        </div>
      </form>
      {% if entra_sso_enabled %}
      <div class="my-3"><hr /></div>
      <a class="btn btn-outline-secondary w-100" href="{{ url_for('auth.entra_login') }}">
        Sign in with Microsoft
      </a>
      {% endif %}
    </div>
  </div>
{% endblock %}
@@ -0,0 +1,121 @@
{% extends "documentation/base.html" %}

{% block doc_content %}
<h1>Microsoft Entra SSO</h1>
<p>Use Microsoft Entra ID (Azure AD) to let users sign in with their Microsoft account.</p>

<div class="doc-callout doc-callout-warning">
  <strong>Status: Untested in Backupchecks.</strong>
  This SSO implementation has not yet been end-to-end validated in Backupchecks itself.
  Treat this page as implementation guidance for a future rollout, not as a confirmed production setup.
</div>

<div class="doc-callout doc-callout-info">
  <strong>Scope:</strong> this page explains the setup for Backupchecks and Microsoft Entra.
  It does not replace your internal identity/security policies.
</div>

<h2>Prerequisites</h2>
<ul>
  <li>Admin access to your Microsoft Entra tenant.</li>
  <li>Admin access to Backupchecks <strong>Settings → Integrations</strong>.</li>
  <li>A stable HTTPS URL for Backupchecks (recommended for production).</li>
</ul>

<h2>Step 1: Register an app in Microsoft Entra</h2>
<ol>
  <li>Open <strong>Microsoft Entra admin center</strong> → <strong>App registrations</strong>.</li>
  <li>Create a new registration (single-tenant is typical for internal use).</li>
  <li>Set a name, for example <code>Backupchecks SSO</code>.</li>
  <li>After creation, copy:
    <ul>
      <li><strong>Application (client) ID</strong></li>
      <li><strong>Directory (tenant) ID</strong></li>
    </ul>
  </li>
</ol>

<h2>Step 2: Configure redirect URI</h2>
<ol>
  <li>In the app registration, open <strong>Authentication</strong>.</li>
  <li>Add a <strong>Web</strong> redirect URI:
    <ul>
      <li><code>https://your-backupchecks-domain/auth/entra/callback</code></li>
    </ul>
  </li>
  <li>Save the authentication settings.</li>
</ol>

<h2>Step 3: Create a client secret</h2>
<ol>
  <li>Open <strong>Certificates & secrets</strong> in the app registration.</li>
  <li>Create a new client secret.</li>
  <li>Copy the secret value immediately (it is shown only once).</li>
</ol>

<h2>Step 4: Configure Backupchecks</h2>
<ol>
  <li>Open <strong>Settings → Integrations → Microsoft Entra SSO</strong>.</li>
  <li>Enable <strong>Microsoft sign-in</strong>.</li>
  <li>Fill in:
    <ul>
      <li><strong>Tenant ID</strong></li>
      <li><strong>Client ID</strong></li>
      <li><strong>Client Secret</strong></li>
      <li><strong>Redirect URI</strong> (optional override; leave empty to use the callback URL automatically)</li>
      <li><strong>Allowed domain/tenant</strong> (optional restriction)</li>
      <li><strong>Allowed Entra Group Object ID(s)</strong> (optional but recommended)</li>
    </ul>
  </li>
  <li>Optional: enable <strong>Auto-provision unknown users as Viewer</strong>.</li>
  <li>Save settings.</li>
</ol>

<h2>Security Group Restriction (recommended)</h2>
<p>You can enforce that only members of one or more specific Entra security groups can sign in.</p>

<ol>
  <li>Create or choose a security group in Entra (for example <code>Backupchecks-Users</code>).</li>
  <li>Add the allowed users to that group.</li>
  <li>Copy the group <strong>Object ID</strong> (not the display name).</li>
  <li>Paste one or more group object IDs in:
    <ul>
      <li><strong>Settings → Integrations → Microsoft Entra SSO → Allowed Entra Group Object ID(s)</strong></li>
    </ul>
  </li>
  <li>In the Entra app registration, configure <strong>Token configuration</strong> to include the <code>groups</code> claim in ID tokens.</li>
</ol>

<div class="doc-callout doc-callout-warning">
  <strong>Important:</strong> if users are members of many groups, Entra may return a "group overage" token without an inline
  <code>groups</code> list. In that case Backupchecks cannot verify membership, and login is blocked by design.
</div>
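The group restriction described above can be sketched as a claim check: verify that the ID token carries an inline `groups` claim and that it intersects the configured allow-list, refusing sign-in otherwise. This is a simplified illustration of the documented behavior, not the actual Backupchecks implementation; the `_claim_names` key is how Azure AD signals group overage:

```python
def groups_allowed(id_token_claims, allowed_ids):
    # Overage or missing-claim tokens have no inline "groups" list,
    # so membership cannot be verified and sign-in is refused.
    if "groups" not in id_token_claims:
        return False
    return bool(set(id_token_claims["groups"]) & set(allowed_ids))

allowed = {"11111111-aaaa-bbbb-cccc-222222222222"}
ok = groups_allowed({"groups": ["11111111-aaaa-bbbb-cccc-222222222222"]}, allowed)
blocked = groups_allowed({"_claim_names": {"groups": "src1"}}, allowed)  # overage token
print(ok, blocked)
```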

<h2>Step 5: Test sign-in</h2>
<ol>
  <li>Open <strong>/auth/login</strong> in a private/incognito browser session.</li>
  <li>Click <strong>Sign in with Microsoft</strong>.</li>
  <li>Authenticate with an allowed account.</li>
  <li>Confirm you are redirected back into Backupchecks.</li>
</ol>

<h2>User mapping behavior</h2>
<ul>
  <li>Backupchecks first tries to match the Entra user to a local user by username/email.</li>
  <li>If no match exists:
    <ul>
      <li>With auto-provision disabled: login is rejected.</li>
      <li>With auto-provision enabled: a new local user is created with the <strong>Viewer</strong> role.</li>
    </ul>
  </li>
</ul>
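The mapping rules above can be sketched as a small resolver: match by email first, then either reject or auto-provision a Viewer. Names and the dict-based user store are illustrative assumptions:

```python
def resolve_user(email, local_users, auto_provision):
    # local_users maps lowercased email -> user dict.
    key = email.lower()
    user = local_users.get(key)
    if user:
        return user
    if not auto_provision:
        return None  # no match and auto-provision disabled: login rejected
    user = {"email": key, "role": "viewer"}  # auto-provisioned Viewer
    local_users[key] = user
    return user

users = {"admin@example.com": {"email": "admin@example.com", "role": "admin"}}
assert resolve_user("Admin@example.com", users, False)["role"] == "admin"
assert resolve_user("new@example.com", users, False) is None
print(resolve_user("new@example.com", users, True)["role"])
```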

<h2>Troubleshooting</h2>
<ul>
  <li><strong>Redirect URI mismatch:</strong> ensure the Entra app URI exactly matches the Backupchecks callback URI.</li>
  <li><strong>SSO button not visible:</strong> check that SSO is enabled and Tenant/Client/Secret are saved.</li>
  <li><strong>Account not allowed:</strong> verify the tenant/domain restriction in <em>Allowed domain/tenant</em>.</li>
  <li><strong>Group-restricted login fails:</strong> verify the group object IDs and ensure the ID token includes a <code>groups</code> claim.</li>
  <li><strong>No local user mapping:</strong> create a matching local user or enable auto-provision.</li>
</ul>
{% endblock %}
@@ -95,68 +95,73 @@
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.inbox') }}">Inbox</a>
          </li>
          {% if active_role == 'admin' %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.admin_all_mails') }}">All Mail</a>
          </li>
          {% endif %}
          {% if active_role == 'admin' %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.inbox_deleted_mails') }}">Deleted mails</a>
          </li>
          {% endif %}
          {% if active_role == 'viewer' %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.customers') }}">Customers</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.jobs') }}">Jobs</a>
          </li>
          {% if active_role == 'admin' %}
          {% endif %}
          {% if system_settings and system_settings.cove_enabled and active_role in ('admin', 'operator') %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.archived_jobs') }}">Archived Jobs</a>
            <a class="nav-link" href="{{ url_for('main.cove_accounts') }}">Cove Accounts</a>
          </li>
          {% endif %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.daily_jobs') }}">Daily Jobs</a>
          </li>
          {% if active_role in ('admin', 'operator') %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.run_checks_page') }}">Run Checks</a>
          </li>
          {% endif %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.tickets_page') }}">Tickets</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.overrides') }}">Overrides</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.reports') }}">Reports</a>
          </li>
          <li class="nav-item dropdown">
            <a class="nav-link dropdown-toggle" href="#" id="moreMenu" role="button" data-bs-toggle="dropdown" aria-expanded="false">
              More
            </a>
            <ul class="dropdown-menu" aria-labelledby="moreMenu">
              <li><a class="dropdown-item" href="{{ url_for('main.daily_jobs') }}">Daily Jobs</a></li>
              {% if active_role != 'viewer' %}
              <li><a class="dropdown-item" href="{{ url_for('main.customers') }}">Customers</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.jobs') }}">Jobs</a></li>
              {% endif %}
              <li><a class="dropdown-item" href="{{ url_for('main.tickets_page') }}">Tickets</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.overrides') }}">Overrides</a></li>
              <li><a class="dropdown-item {% if request.path.startswith('/documentation') %}active{% endif %}" href="{{ url_for('documentation.index') }}">Documentation</a></li>
              <li><a class="dropdown-item" href='{{ url_for("main.changelog_page") }}'>Changelog</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.feedback_page') }}">Feedback</a></li>
            </ul>
          </li>
          {% if active_role == 'admin' %}
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.settings') }}">Settings</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.logging_page') }}">Logging</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.parsers_overview') }}">Parsers</a>
          <li class="nav-item dropdown">
            <a class="nav-link dropdown-toggle" href="#" id="adminMenu" role="button" data-bs-toggle="dropdown" aria-expanded="false">
              Admin
            </a>
            <ul class="dropdown-menu" aria-labelledby="adminMenu">
              <li><a class="dropdown-item" href="{{ url_for('main.admin_all_mails') }}">All Mail</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.inbox_deleted_mails') }}">Deleted mails</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.archived_jobs') }}">Archived Jobs</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.settings') }}">Settings</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.logging_page') }}">Logging</a></li>
              <li><a class="dropdown-item" href="{{ url_for('main.parsers_overview') }}">Parsers</a></li>
            </ul>
          </li>
          {% endif %}
          <li class="nav-item">
            <a class="nav-link {% if request.path.startswith('/documentation') %}active{% endif %}" href="{{ url_for('documentation.index') }}">
              <span class="nav-icon">📖</span> Documentation
            </a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href='{{ url_for("main.changelog_page") }}'>Changelog</a>
          </li>
          <li class="nav-item">
            <a class="nav-link" href="{{ url_for('main.feedback_page') }}">Feedback</a>
          </li>
          {% endif %}
        </ul>
        <form method="get" action="{{ url_for('main.search_page') }}" class="d-flex me-3 mb-2 mb-lg-0" role="search" autocomplete="off">
          <input
            class="form-control form-control-sm me-2"
            type="search"
            name="q"
            placeholder="Search"
            aria-label="Search"
            value="{{ request.args.get('q','') if request.path == url_for('main.search_page') else '' }}"
            style="min-width: 220px;"
          />
          <button class="btn btn-outline-secondary btn-sm" type="submit">Search</button>
        </form>
        <span class="navbar-text me-3">
          <a class="text-decoration-none" href="{{ url_for('main.user_settings') }}">
            {{ current_user.username }} ({{ active_role }})
241 containers/backupchecks/src/templates/main/cove_accounts.html (new file)
@@ -0,0 +1,241 @@
{% extends "layout/base.html" %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-3">
  <h2 class="mb-0">Cove Accounts</h2>
  <div class="d-flex gap-2">
    {% if settings.cove_partner_id %}
    <form method="post" action="{{ url_for('main.settings_cove_run_now') }}" class="mb-0">
      <button type="submit" class="btn btn-sm btn-outline-primary">Run import now</button>
    </form>
    {% endif %}
    <a href="{{ url_for('main.settings', section='integrations') }}" class="btn btn-sm btn-outline-secondary">Cove settings</a>
  </div>
</div>

{% if settings.cove_last_import_at %}
<p class="text-muted small mb-3">Last import: {{ settings.cove_last_import_at|local_datetime }}</p>
{% else %}
<p class="text-muted small mb-3">No import has run yet. Click <strong>Run import now</strong> to fetch Cove accounts.</p>
{% endif %}

{# ── Unmatched accounts (need a job) ─────────────────────────────────────── #}
{% if unmatched %}
<h4 class="mb-2">Unmatched <span class="badge bg-warning text-dark">{{ unmatched|length }}</span></h4>
<p class="text-muted small mb-3">These accounts have no linked job yet. Create a new job or link to an existing one.</p>

<div class="table-responsive mb-4">
  <table class="table table-sm table-hover align-middle">
    <thead class="table-light">
      <tr>
        <th>Backup software</th>
        <th>Type</th>
        <th>Job name</th>
        <th>Computer</th>
        <th>Customer (Cove)</th>
        <th>Datasources</th>
        <th>Last status</th>
        <th>Last run</th>
        <th>First seen</th>
        <th></th>
      </tr>
    </thead>
    <tbody>
      {% for acc in unmatched %}
      <tr>
        <td>{{ acc.derived_backup_software }}</td>
        <td>{{ acc.derived_backup_type }}</td>
        <td>{{ acc.derived_job_name }}</td>
        <td class="text-muted small">{{ acc.computer_name or '—' }}</td>
        <td>{{ acc.customer_name or '—' }}</td>
        <td class="text-muted small">{{ acc.datasource_display }}</td>
        <td>
          {% if acc.last_status_code is not none %}
          <span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
            {{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
          </span>
          {% else %}—{% endif %}
        </td>
        <td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
        <td class="text-muted small">{{ acc.first_seen_at|local_datetime }}</td>
        <td>
          <button class="btn btn-sm btn-primary"
                  data-bs-toggle="modal"
                  data-bs-target="#link-modal-{{ acc.id }}">
            Link / Create job
          </button>
        </td>
      </tr>

      {# Link modal #}
      <div class="modal fade" id="link-modal-{{ acc.id }}" tabindex="-1">
        <div class="modal-dialog">
          <div class="modal-content">
            <div class="modal-header">
              <h5 class="modal-title">Link: {{ acc.account_name or acc.account_id }}</h5>
              <button type="button" class="btn-close" data-bs-dismiss="modal"></button>
            </div>
            <div class="modal-body">
              <p class="text-muted small mb-3">
                Cove account <strong>{{ acc.account_id }}</strong> –
                customer: <strong>{{ acc.customer_name or '?' }}</strong>
              </p>

              <ul class="nav nav-tabs mb-3" id="tab-{{ acc.id }}" role="tablist">
                <li class="nav-item" role="presentation">
                  <button class="nav-link active" data-bs-toggle="tab"
                          data-bs-target="#create-{{ acc.id }}" type="button">
                    Create new job
                  </button>
                </li>
                <li class="nav-item" role="presentation">
                  <button class="nav-link" data-bs-toggle="tab"
                          data-bs-target="#existing-{{ acc.id }}" type="button">
                    Link to existing job
                  </button>
                </li>
              </ul>

              <div class="tab-content">

                {# Tab 1: Create new job #}
                <div class="tab-pane fade show active" id="create-{{ acc.id }}">
                  <form method="post" action="{{ url_for('main.cove_account_link', cove_account_db_id=acc.id) }}">
                    <input type="hidden" name="action" value="create" />
                    <div class="mb-3">
                      <label class="form-label">Customer <span class="text-danger">*</span></label>
                      <select class="form-select" name="customer_id" required>
                        <option value="">Select customer…</option>
                        {% for c in customers %}
                        <option value="{{ c.id }}"
                                {% if acc.customer_name and acc.customer_name.lower() == c.name.lower() %}selected{% endif %}>
                          {{ c.name }}
                        </option>
                        {% endfor %}
                      </select>
                    </div>
                    <div class="mb-3">
                      <label class="form-label">Job name</label>
                      <input type="text" class="form-control" name="job_name"
                             value="{{ acc.derived_job_name }}" />
                      <div class="form-text">Defaults to the Cove account name.</div>
                    </div>
                    <div class="mb-3">
                      <label class="form-label">Backup type</label>
                      <input type="text" class="form-control" name="backup_type"
                             value="{{ acc.derived_backup_type }}" />
                      <div class="form-text">Derived from the Cove datasource profile.</div>
                    </div>
                    <div class="d-flex justify-content-end gap-2">
                      <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
                      <button type="submit" class="btn btn-primary">Create job & link</button>
                    </div>
                  </form>
                </div>

                {# Tab 2: Link to existing job #}
                <div class="tab-pane fade" id="existing-{{ acc.id }}">
                  <form method="post" action="{{ url_for('main.cove_account_link', cove_account_db_id=acc.id) }}">
                    <input type="hidden" name="action" value="link" />
                    <div class="mb-3">
                      <label class="form-label">Job <span class="text-danger">*</span></label>
                      <select class="form-select" name="job_id" required>
                        <option value="">Select job…</option>
                        {% for j in jobs %}
                        <option value="{{ j.id }}">
                          {{ j.customer.name ~ ' – ' if j.customer else '' }}{{ j.backup_software }} / {{ j.job_name }}
                        </option>
                        {% endfor %}
                      </select>
                    </div>
                    <div class="d-flex justify-content-end gap-2">
                      <button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
                      <button type="submit" class="btn btn-primary">Link to job</button>
                    </div>
                  </form>
                </div>

              </div>{# /tab-content #}
            </div>
          </div>
        </div>
      </div>{# /modal #}
      {% endfor %}
    </tbody>
  </table>
</div>
{% else %}
<div class="alert alert-success mb-4">
  <strong>All accounts matched.</strong>
  {% if not settings.cove_last_import_at %}
  Run an import first to see Cove accounts here.
  {% else %}
  No unmatched Cove accounts.
  {% endif %}
</div>
{% endif %}

{# ── Matched accounts ────────────────────────────────────────────────────── #}
{% if matched %}
<h4 class="mb-2">Linked <span class="badge bg-success">{{ matched|length }}</span></h4>
<div class="table-responsive">
  <table class="table table-sm table-hover align-middle">
    <thead class="table-light">
      <tr>
        <th>Backup software</th>
        <th>Type</th>
        <th>Job name</th>
        <th>Computer</th>
        <th>Customer (Cove)</th>
        <th>Datasources</th>
        <th>Last status</th>
        <th>Last run</th>
        <th>Linked job</th>
        <th></th>
      </tr>
    </thead>
    <tbody>
      {% for acc in matched %}
      <tr>
        <td>{{ acc.derived_backup_software }}</td>
        <td>{{ acc.derived_backup_type }}</td>
        <td>{{ acc.derived_job_name }}</td>
        <td class="text-muted small">{{ acc.computer_name or '—' }}</td>
        <td>{{ acc.customer_name or '—' }}</td>
        <td class="text-muted small">{{ acc.datasource_display }}</td>
        <td>
          {% if acc.last_status_code is not none %}
          <span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
            {{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
          </span>
          {% else %}—{% endif %}
        </td>
        <td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
        <td>
          {% if acc.job %}
          <a href="{{ url_for('main.job_detail', job_id=acc.job.id) }}">
            {{ acc.job.customer.name ~ ' – ' if acc.job.customer else '' }}{{ acc.job.job_name }}
          </a>
          {% else %}—{% endif %}
        </td>
        <td>
          <form method="post"
                action="{{ url_for('main.cove_account_unlink', cove_account_db_id=acc.id) }}"
                onsubmit="return confirm('Remove link between this Cove account and the job?');"
                class="mb-0">
            <button type="submit" class="btn btn-sm btn-outline-secondary">Unlink</button>
          </form>
        </td>
      </tr>
      {% endfor %}
    </tbody>
  </table>
</div>
{% endif %}

{% if not unmatched and not matched %}
<div class="alert alert-info">
  No Cove accounts found. Run an import first via the button above or via Settings → Integrations → Cove.
</div>
{% endif %}

{% endblock %}
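The "Create new job" tab above preselects a local customer by comparing the Cove customer name case-insensitively. That matching rule can be sketched outside the template; the helper name and dict shape are illustrative assumptions:

```python
def preselect_customer(cove_customer_name, customers):
    # Case-insensitive match of the Cove customer name against local
    # customers; returns the matched customer id, or None for no match.
    if not cove_customer_name:
        return None
    wanted = cove_customer_name.strip().lower()
    for c in customers:
        if c["name"].lower() == wanted:
            return c["id"]
    return None

customers = [{"id": 1, "name": "Acme BV"}, {"id": 2, "name": "Globex"}]
print(preselect_customer("acme bv", customers))
```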

@@ -15,6 +15,10 @@

  <form method="post" action="{{ url_for('main.customers_import') }}" enctype="multipart/form-data" class="d-flex align-items-center gap-2 mb-0">
    <input type="file" name="file" accept=".csv,text/csv" class="form-control form-control-sm" required style="max-width: 420px;" />
    <div class="form-check mb-0">
      <input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_customers" name="include_autotask_ids" />
      <label class="form-check-label small" for="include_autotask_ids_customers">Include Autotask IDs</label>
    </div>
    <button type="submit" class="btn btn-outline-secondary btn-sm" style="white-space: nowrap;">Import CSV</button>
  </form>

@@ -45,7 +49,11 @@
  {% if customers %}
  {% for c in customers %}
  <tr>
    <td>{{ c.name }}</td>
    <td>
      <a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="link-primary text-decoration-none">
        {{ c.name }}
      </a>
    </td>
    <td>
      {% if c.active %}
      <span class="badge bg-success">Active</span>
@@ -4,6 +4,9 @@
<h2 class="mb-3">Daily Jobs</h2>

<form method="get" class="row g-3 mb-3">
  {% if q %}
  <input type="hidden" name="q" value="{{ q }}" />
  {% endif %}
  <div class="col-auto">
    <label for="dj_date" class="form-label">Date</label>
    <input
@@ -665,7 +668,7 @@ if (tStatus) tStatus.textContent = '';
  });
}

function attachDailyJobsHandlers() {
function attachDailyJobsHandlers() {
  var rows = document.querySelectorAll(".daily-job-row");
  if (!rows.length) {
    return;
@@ -771,9 +774,43 @@ if (tStatus) tStatus.textContent = '';
  });
}

function autoOpenJobFromQuery() {
  try {
    var params = new URLSearchParams(window.location.search || "");
    var openJobId = (params.get("open_job_id") || "").trim();
    if (!openJobId) {
      return;
    }

    var rows = document.querySelectorAll(".daily-job-row");
    var targetRow = null;
    rows.forEach(function (row) {
      if ((row.getAttribute("data-job-id") || "") === openJobId) {
        targetRow = row;
      }
    });

    if (!targetRow) {
      return;
    }

    targetRow.click();

    params.delete("open_job_id");
    var nextQuery = params.toString();
    var nextUrl = window.location.pathname + (nextQuery ? ("?" + nextQuery) : "");
    if (window.history && window.history.replaceState) {
      window.history.replaceState({}, document.title, nextUrl);
    }
  } catch (e) {
    // no-op
  }
}

document.addEventListener("DOMContentLoaded", function () {
  bindInlineCreateForms();
  attachDailyJobsHandlers();
  autoOpenJobFromQuery();
});
})();
</script>
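`autoOpenJobFromQuery` above consumes `open_job_id` once and then rewrites the URL without it, so a refresh does not re-trigger the auto-open. The same "drop one parameter, keep the rest" rewrite can be sketched with the standard library (a Python illustration of the browser-side step, not code from the repository):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit

def strip_param(url, name):
    # Drop a single query parameter while preserving the others,
    # mirroring params.delete("open_job_id") + replaceState above.
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != name]
    qs = urlencode(query)
    return parts.path + (("?" + qs) if qs else "")

print(strip_param("/daily-jobs?date=2024-05-01&open_job_id=42", "open_job_id"))
```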

@@ -34,6 +34,16 @@
  <div class="col-6 col-md-3">
    <button class="btn btn-outline-secondary" type="submit">Apply</button>
  </div>
  {% if active_role == 'admin' %}
  <div class="col-12">
    <div class="form-check">
      <input class="form-check-input" type="checkbox" name="show_deleted" value="1" id="show_deleted" {% if show_deleted %}checked{% endif %} onchange="this.form.submit()">
      <label class="form-check-label" for="show_deleted">
        Show deleted items
      </label>
    </div>
  </div>
  {% endif %}
</form>

<div class="table-responsive">
@@ -46,6 +56,9 @@
        <th style="width: 160px;">Component</th>
        <th style="width: 120px;">Status</th>
        <th style="width: 170px;">Created</th>
        {% if active_role == 'admin' and show_deleted %}
        <th style="width: 140px;">Actions</th>
        {% endif %}
      </tr>
    </thead>
    <tbody>
@@ -56,20 +69,30 @@
      {% endif %}

      {% for i in items %}
      <tr>
      <tr {% if i.is_deleted %}style="opacity: 0.6; background-color: var(--bs-secondary-bg);"{% endif %}>
        <td>
          {% if not i.is_deleted %}
          <form method="post" action="{{ url_for('main.feedback_vote', item_id=i.id) }}">
            <input type="hidden" name="ref" value="list" />
            <button type="submit" class="btn btn-sm {% if i.user_voted %}btn-success{% else %}btn-outline-secondary{% endif %}">
              + {{ i.vote_count }}
            </button>
          </form>
          {% else %}
          <span class="text-muted">+ {{ i.vote_count }}</span>
          {% endif %}
        </td>
        <td>
          <a href="{{ url_for('main.feedback_detail', item_id=i.id) }}">{{ i.title }}</a>
          {% if i.is_deleted %}
          <span class="badge text-bg-dark ms-2">Deleted</span>
          {% endif %}
          {% if i.created_by %}
          <div class="text-muted" style="font-size: 0.85rem;">by {{ i.created_by }}</div>
          {% endif %}
          {% if i.is_deleted and i.deleted_at %}
          <div class="text-muted" style="font-size: 0.85rem;">Deleted {{ i.deleted_at|local_datetime }}</div>
          {% endif %}
        </td>
        <td>
          {% if i.item_type == 'bug' %}
@@ -90,6 +113,15 @@
          <div>{{ i.created_at|local_datetime }}</div>
          <div class="text-muted" style="font-size: 0.85rem;">Updated {{ i.updated_at|local_datetime }}</div>
        </td>
        {% if active_role == 'admin' and show_deleted %}
        <td>
          {% if i.is_deleted %}
          <form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=i.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
            <button type="submit" class="btn btn-sm btn-danger">Permanent Delete</button>
          </form>
          {% endif %}
        </td>
        {% endif %}
      </tr>
      {% endfor %}
    </tbody>
@@ -15,6 +15,9 @@
      {% else %}
      <span class="badge text-bg-warning">Open</span>
      {% endif %}
      {% if item.deleted_at %}
      <span class="badge text-bg-dark">Deleted</span>
      {% endif %}
      <span class="ms-2">by {{ created_by_name }}</span>
    </div>
  </div>
@@ -29,6 +32,23 @@
      <div class="mb-2"><strong>Component:</strong> {{ item.component }}</div>
      {% endif %}
      <div style="white-space: pre-wrap;">{{ item.description }}</div>

      {% if item_attachments %}
      <div class="mt-3">
        <strong>Screenshots:</strong>
        <div class="d-flex flex-wrap gap-2 mt-2">
          {% for att in item_attachments %}
          <a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
            <img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
                 alt="{{ att.filename }}"
                 class="img-thumbnail"
                 style="max-height: 200px; max-width: 300px; cursor: pointer;"
                 title="Click to view full size" />
          </a>
          {% endfor %}
        </div>
      </div>
      {% endif %}
    </div>
    <div class="card-footer d-flex justify-content-between align-items-center">
      <div class="text-muted" style="font-size: 0.9rem;">
@@ -63,6 +83,22 @@
          </span>
        </div>
        <div style="white-space: pre-wrap;">{{ r.message }}</div>

        {% if r.id in reply_attachments_map %}
        <div class="mt-2">
          <div class="d-flex flex-wrap gap-2">
            {% for att in reply_attachments_map[r.id] %}
            <a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
              <img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
                   alt="{{ att.filename }}"
                   class="img-thumbnail"
                   style="max-height: 150px; max-width: 200px; cursor: pointer;"
                   title="Click to view full size" />
            </a>
            {% endfor %}
          </div>
        </div>
        {% endif %}
      </div>
      {% endfor %}
    </div>
@@ -76,10 +112,15 @@
    <div class="card-body">
      <h5 class="card-title mb-3">Add reply</h5>
      {% if item.status == 'open' %}
      <form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}">
      <form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}" enctype="multipart/form-data">
        <div class="mb-2">
          <textarea class="form-control" name="message" rows="4" required></textarea>
        </div>
        <div class="mb-2">
          <label class="form-label">Screenshots (optional)</label>
          <input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
          <div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
        </div>
        <button type="submit" class="btn btn-primary">Post reply</button>
      </form>
      {% else %}
@@ -95,21 +136,32 @@
    <h2 class="h6">Actions</h2>

    {% if active_role == 'admin' %}
    {% if item.status == 'resolved' %}
    <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
      <input type="hidden" name="action" value="reopen" />
      <button type="submit" class="btn btn-outline-secondary w-100">Reopen</button>
    </form>
    {% if item.deleted_at %}
    {# Item is deleted - show permanent delete option #}
    <div class="alert alert-warning mb-2" style="font-size: 0.9rem;">
      This item is deleted.
    </div>
    <form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=item.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
      <button type="submit" class="btn btn-danger w-100">Permanent Delete</button>
    </form>
    {% else %}
    <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
      <input type="hidden" name="action" value="resolve" />
      <button type="submit" class="btn btn-success w-100">Mark as resolved</button>
    </form>
    {% endif %}
    {# Item is not deleted - show normal actions #}
    {% if item.status == 'resolved' %}
    <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
      <input type="hidden" name="action" value="reopen" />
      <button type="submit" class="btn btn-outline-secondary w-100">Reopen</button>
    </form>
    {% else %}
    <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
      <input type="hidden" name="action" value="resolve" />
      <button type="submit" class="btn btn-success w-100">Mark as resolved</button>
    </form>
    {% endif %}

    <form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
      <button type="submit" class="btn btn-danger w-100">Delete</button>
    </form>
    <form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
      <button type="submit" class="btn btn-danger w-100">Delete</button>
    </form>
    {% endif %}
    {% else %}
    <div class="text-muted">Only administrators can resolve or delete items.</div>
    {% endif %}

@@ -6,7 +6,7 @@
    <a class="btn btn-outline-secondary" href="{{ url_for('main.feedback_page') }}">Back</a>
  </div>

  <form method="post" class="card">
  <form method="post" enctype="multipart/form-data" class="card">
    <div class="card-body">
      <div class="row g-3">
        <div class="col-12 col-md-3">
@@ -28,6 +28,11 @@
          <label class="form-label">Component (optional)</label>
          <input type="text" name="component" class="form-control" />
        </div>
        <div class="col-12">
          <label class="form-label">Screenshots (optional)</label>
          <input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
          <div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
        </div>
      </div>
    </div>
    <div class="card-footer d-flex justify-content-end">

@@ -14,12 +14,12 @@
  <div class="d-flex justify-content-between align-items-center my-2">
    <div>
      {% if has_prev %}
      <a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1) }}">Previous</a>
      <a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1, q=q) }}">Previous</a>
      {% else %}
      <button class="btn btn-outline-secondary btn-sm" disabled>Previous</button>
      {% endif %}
      {% if has_next %}
      <a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1) }}">Next</a>
      <a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1, q=q) }}">Next</a>
      {% else %}
      <button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button>
      {% endif %}
@@ -73,7 +73,7 @@
      <tr>
        {% if can_bulk_delete %}
        <th scope="col" style="width: 34px;">
          <input class="form-check-input" type="checkbox" id="inbox_select_all" />
          <input class="form-check-input" type="checkbox" id="inbox_select_all" autocomplete="off" />
        </th>
        {% endif %}
        <th scope="col">From</th>
@@ -93,7 +93,7 @@
      <tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
        {% if can_bulk_delete %}
        <td onclick="event.stopPropagation();">
          <input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" />
          <input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" autocomplete="off" />
        </td>
        {% endif %}
        <td>{{ row.from_address }}</td>

@@ -59,6 +59,34 @@
  </div>
  {% endif %}

  {% if cove_enabled and can_manage_jobs %}
  <div class="card mb-3">
    <div class="card-header">Cove Integration</div>
    <div class="card-body">
      <form method="post" action="{{ url_for('main.job_set_cove_account', job_id=job.id) }}" class="row g-2 align-items-end mb-0">
        <div class="col-auto">
          <label for="cove_account_id" class="form-label mb-1">Cove Account ID</label>
          <input type="number" class="form-control form-control-sm" id="cove_account_id" name="cove_account_id"
                 value="{{ job.cove_account_id or '' }}" placeholder="e.g. 4504627" style="width: 180px;" />
        </div>
        <div class="col-auto">
          <button type="submit" class="btn btn-sm btn-primary">Save</button>
          {% if job.cove_account_id %}
          <button type="submit" name="cove_account_id" value="" class="btn btn-sm btn-outline-secondary ms-1">Clear</button>
          {% endif %}
        </div>
        <div class="col-auto text-muted small">
          {% if job.cove_account_id %}
          Linked to Cove account <strong>{{ job.cove_account_id }}</strong>
          {% else %}
          Not linked to a Cove account – runs will not be imported automatically.
          {% endif %}
        </div>
      </form>
    </div>
  </div>
  {% endif %}

  <h3 class="mt-4 mb-3">Job history</h3>

  <div class="table-responsive">
@@ -287,6 +315,60 @@
(function () {
  var currentRunId = null;

  // Cross-browser copy to clipboard function
  function copyToClipboard(text, button) {
    // Method 1: Modern Clipboard API (works in most browsers with HTTPS)
    if (navigator.clipboard && navigator.clipboard.writeText) {
      navigator.clipboard.writeText(text)
        .then(function () {
          showCopyFeedback(button);
        })
        .catch(function () {
          // Fallback to method 2 if clipboard API fails
          fallbackCopy(text, button);
        });
    } else {
      // Method 2: Legacy execCommand method
      fallbackCopy(text, button);
    }
  }

  function fallbackCopy(text, button) {
    var textarea = document.createElement('textarea');
    textarea.value = text;
    textarea.style.position = 'fixed';
    textarea.style.opacity = '0';
    textarea.style.top = '0';
    textarea.style.left = '0';
    document.body.appendChild(textarea);
    textarea.focus();
    textarea.select();

    try {
      var successful = document.execCommand('copy');
      if (successful) {
        showCopyFeedback(button);
      } else {
        // If execCommand fails, use prompt as last resort
        window.prompt('Copy ticket number:', text);
      }
    } catch (err) {
      // If all else fails, show prompt
      window.prompt('Copy ticket number:', text);
    }

    document.body.removeChild(textarea);
  }

  function showCopyFeedback(button) {
    if (!button) return;
    var original = button.textContent;
    button.textContent = '✓';
    setTimeout(function () {
      button.textContent = original;
    }, 800);
  }

  function apiJson(url, opts) {
    opts = opts || {};
    opts.headers = opts.headers || {};
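The `copyToClipboard` helper added above is tiered: the async Clipboard API first, a hidden-textarea `execCommand('copy')` fallback second, and `window.prompt` as a last resort. A minimal sketch of just the tier selection follows; the `env` argument is a hypothetical stand-in for the `navigator`/`document` feature checks, so the logic can run outside a browser:

```javascript
// Hypothetical sketch: decide which copy mechanism the tiered
// strategy above would use, given the available features.
function pickCopyStrategy(env) {
  if (env.clipboard && typeof env.clipboard.writeText === 'function') {
    return 'clipboard-api'; // modern async API (secure contexts)
  }
  if (env.execCommandSupported) {
    return 'execCommand';   // legacy hidden-textarea fallback
  }
  return 'prompt';          // last resort: manual copy via prompt
}

console.log(pickCopyStrategy({ clipboard: { writeText: function () {} } }));
// → clipboard-api
```

The same ordering appears twice in this changeset (job detail and report checks pages), which is why the second occurrence later refactors its inline clipboard code to call the shared helper.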
@@ -319,12 +401,14 @@
      html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
      tickets.forEach(function (t) {
        var status = t.resolved_at ? 'Resolved' : 'Active';
        var ticketCode = (t.ticket_code || '').toString();
        html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
          '<div class="d-flex align-items-start justify-content-between gap-2">' +
            '<div class="flex-grow-1 min-w-0">' +
              '<div class="text-truncate">' +
                '<span class="me-1" title="Ticket">🎫</span>' +
                '<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
                '<span class="fw-semibold">' + escapeHtml(ticketCode) + '</span>' +
                '<button type="button" class="btn btn-sm btn-outline-secondary ms-2 py-0 px-1" title="Copy ticket number" data-action="copy-ticket" data-code="' + escapeHtml(ticketCode) + '">⧉</button>' +
                '<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
              '</div>' +
            '</div>' +
@@ -371,7 +455,16 @@
      ev.preventDefault();
      var action = btn.getAttribute('data-action');
      var id = btn.getAttribute('data-id');
      if (!action || !id) return;
      if (!action) return;

      if (action === 'copy-ticket') {
        var code = btn.getAttribute('data-code') || '';
        if (!code) return;
        copyToClipboard(code, btn);
        return;
      }

      if (!id) return;
      if (action === 'resolve-ticket') {
        if (!confirm('Mark ticket as resolved?')) return;
        apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})

@@ -2,6 +2,16 @@
{% block content %}
<h2 class="mb-3">Jobs</h2>

{% if selected_customer_id %}
<div class="alert alert-info d-flex justify-content-between align-items-center py-2" role="alert">
  <span>
    Filtered on customer:
    <strong>{{ selected_customer_name or ('#' ~ selected_customer_id) }}</strong>
  </span>
  <a href="{{ url_for('main.jobs') }}" class="btn btn-sm btn-outline-primary">Clear filter</a>
</div>
{% endif %}

<div class="table-responsive">
  <table class="table table-sm table-hover align-middle">
    <thead class="table-light">

@@ -422,7 +422,10 @@ function loadRawData() {

function loadReports() {
  setTableLoading('Loading…');
  fetch('/api/reports', { credentials: 'same-origin' })
  var params = new URLSearchParams(window.location.search || '');
  var q = (params.get('q') || '').trim();
  var apiUrl = '/api/reports' + (q ? ('?q=' + encodeURIComponent(q)) : '');
  fetch(apiUrl, { credentials: 'same-origin' })
    .then(function (r) { return r.json(); })
    .then(function (data) {
      renderTable((data && data.items) ? data.items : []);

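The `loadReports` change above forwards the page's `q` query parameter to the API so the table respects the active search filter. The URL construction can be sketched standalone (the endpoint path is taken from the hunk above):

```javascript
// Sketch: carry the page's search filter over to the API request,
// as the patched loadReports() does. An empty or whitespace-only
// filter falls back to the unfiltered endpoint.
function buildReportsUrl(search) {
  var params = new URLSearchParams(search || '');
  var q = (params.get('q') || '').trim();
  return '/api/reports' + (q ? ('?q=' + encodeURIComponent(q)) : '');
}

console.log(buildReportsUrl('?q=web server'));
// → /api/reports?q=web%20server
```

`encodeURIComponent` matters here: a raw user query can contain spaces, `&`, or `#`, all of which would otherwise corrupt the request URL.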
@@ -48,7 +48,7 @@
    <thead class="table-light">
      <tr>
        <th scope="col" style="width: 34px;">
          <input class="form-check-input" type="checkbox" id="rc_select_all" />
          <input class="form-check-input" type="checkbox" id="rc_select_all" autocomplete="off" />
        </th>
        <th scope="col">Customer</th>
        <th scope="col">Backup</th>
@@ -63,7 +63,7 @@
      {% for r in rows %}
      <tr class="rc-job-row" data-job-id="{{ r.job_id }}" style="cursor: pointer;">
        <td onclick="event.stopPropagation();">
          <input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" />
          <input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" autocomplete="off" />
        </td>
        <td>{{ r.customer_name }}</td>
        <td>{{ r.backup_software }}</td>
@@ -447,6 +447,60 @@ function escapeHtml(s) {
    .replace(/'/g, "&#39;");
}

// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
  // Method 1: Modern Clipboard API (works in most browsers with HTTPS)
  if (navigator.clipboard && navigator.clipboard.writeText) {
    navigator.clipboard.writeText(text)
      .then(function () {
        showCopyFeedback(button);
      })
      .catch(function () {
        // Fallback to method 2 if clipboard API fails
        fallbackCopy(text, button);
      });
  } else {
    // Method 2: Legacy execCommand method
    fallbackCopy(text, button);
  }
}

function fallbackCopy(text, button) {
  var textarea = document.createElement('textarea');
  textarea.value = text;
  textarea.style.position = 'fixed';
  textarea.style.opacity = '0';
  textarea.style.top = '0';
  textarea.style.left = '0';
  document.body.appendChild(textarea);
  textarea.focus();
  textarea.select();

  try {
    var successful = document.execCommand('copy');
    if (successful) {
      showCopyFeedback(button);
    } else {
      // If execCommand fails, use prompt as last resort
      window.prompt('Copy ticket number:', text);
    }
  } catch (err) {
    // If all else fails, show prompt
    window.prompt('Copy ticket number:', text);
  }

  document.body.removeChild(textarea);
}

function showCopyFeedback(button) {
  if (!button) return;
  var original = button.textContent;
  button.textContent = '✓';
  setTimeout(function () {
    button.textContent = original;
  }, 800);
}

function getSelectedJobIds() {
  var cbs = table.querySelectorAll('tbody .rc_row_cb');
  var ids = [];
@@ -840,20 +894,7 @@ table.addEventListener('change', function (e) {
  if (action === 'copy-ticket') {
    var code = btn.getAttribute('data-code') || '';
    if (!code) return;
    if (navigator.clipboard && navigator.clipboard.writeText) {
      navigator.clipboard.writeText(code)
        .then(function () {
          var original = btn.textContent;
          btn.textContent = '✓';
          setTimeout(function () { btn.textContent = original; }, 800);
        })
        .catch(function () {
          // Fallback: select/copy via prompt
          window.prompt('Copy ticket number:', code);
        });
    } else {
      window.prompt('Copy ticket number:', code);
    }
    copyToClipboard(code, btn);
    return;
  }

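The `escapeHtml` helper extended above replaces the five HTML-significant characters with entities before the ticket markup is concatenated into an HTML string. An equivalent standalone sketch (a lookup-table variant rather than the chained `replace` calls in the page):

```javascript
// Sketch: escape the characters that matter in HTML text and
// double-quoted attribute contexts; equivalent in effect to the
// page's escapeHtml helper.
function escapeHtml(s) {
  var entities = { '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;' };
  return String(s).replace(/[&<>"']/g, function (ch) { return entities[ch]; });
}

console.log(escapeHtml('<a href="x">A & B</a>'));
// → &lt;a href=&quot;x&quot;&gt;A &amp; B&lt;/a&gt;
```

Escaping the single quote as `&#39;` (rather than the non-HTML4 `&apos;`) is the conventional choice, and it is what the hunk above appears to end with once the entity is restored from the rendered diff.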
75
containers/backupchecks/src/templates/main/search.html
Normal file
@@ -0,0 +1,75 @@
{% extends "layout/base.html" %}
{% block content %}
<h2 class="mb-3">Search</h2>

{% if query %}
<p class="text-muted mb-3">
  Query: <strong>{{ query }}</strong> | Total hits: <strong>{{ total_hits }}</strong>
</p>
{% else %}
<div class="alert alert-secondary py-2">
  Enter a search term in the top navigation bar.
</div>
{% endif %}

{% for section in sections %}
<div class="card mb-3" id="search-section-{{ section['key'] }}" style="scroll-margin-top: 96px;">
  <div class="card-header d-flex justify-content-between align-items-center">
    <span>{{ section['title'] }} ({{ section['total'] }})</span>
    <a href="{{ section['view_all_url'] }}" class="btn btn-sm btn-outline-secondary">Open {{ section['title'] }}</a>
  </div>
  {% if section['key'] == 'daily_jobs' %}
  <div class="px-3 py-2 small text-muted border-bottom">
    Note: The Daily Jobs page itself only shows results for the selected day. Search results can include matches that relate to jobs across other days.
  </div>
  {% endif %}
  <div class="card-body p-0">
    {% if section['items'] %}
    <div class="table-responsive">
      <table class="table table-sm mb-0 align-middle">
        <thead class="table-light">
          <tr>
            <th>Result</th>
            <th>Details</th>
            <th>Meta</th>
          </tr>
        </thead>
        <tbody>
          {% for item in section['items'] %}
          <tr>
            <td>
              {% if item.link %}
              <a href="{{ item.link }}">{{ item.title }}</a>
              {% else %}
              {{ item.title }}
              {% endif %}
            </td>
            <td>{{ item.subtitle }}</td>
            <td>{{ item.meta }}</td>
          </tr>
          {% endfor %}
        </tbody>
      </table>
    </div>
    {% else %}
    <div class="p-3 text-muted">No results in this section.</div>
    {% endif %}
  </div>
  {% if section['total_pages'] > 1 %}
  <div class="card-footer d-flex justify-content-between align-items-center small">
    <span class="text-muted">
      Page {{ section['current_page'] }} of {{ section['total_pages'] }} ({{ section['total'] }} results)
    </span>
    <div class="d-flex gap-2">
      {% if section['has_prev'] %}
      <a class="btn btn-sm btn-outline-secondary" href="{{ section['prev_url'] }}#search-section-{{ section['key'] }}">Previous</a>
      {% endif %}
      {% if section['has_next'] %}
      <a class="btn btn-sm btn-outline-secondary" href="{{ section['next_url'] }}#search-section-{{ section['key'] }}">Next</a>
      {% endif %}
    </div>
  </div>
  {% endif %}
</div>
{% endfor %}
{% endblock %}
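Each search section above renders `current_page`, `total_pages`, and `has_prev`/`has_next` flags. The computation those fields imply on the server side can be sketched like this; the function itself is hypothetical (the real backend is Python/Flask), with field names mirroring the template variables:

```javascript
// Hypothetical sketch: per-section pagination state of the kind the
// search template above consumes.
function paginate(total, page, perPage) {
  // At least one page even when there are no results.
  var totalPages = Math.max(1, Math.ceil(total / perPage));
  return {
    current_page: page,
    total_pages: totalPages,
    has_prev: page > 1,
    has_next: page < totalPages
  };
}

console.log(paginate(75, 1, 10));
// → { current_page: 1, total_pages: 8, has_prev: false, has_next: true }
```

Clamping `totalPages` to a minimum of 1 keeps "Page 1 of 1" renderable for empty sections, which matches the template only showing the footer when `total_pages > 1`.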
@@ -504,6 +504,178 @@
  </div>
  {% endif %}

  {% if section == 'integrations' %}
  <form method="post" class="mb-4" id="cove-settings-form">
    <div class="card mb-3">
      <div class="card-header">Cove Data Protection (N-able)</div>
      <div class="card-body">
        <div class="form-check form-switch mb-3">
          <input class="form-check-input" type="checkbox" id="cove_enabled" name="cove_enabled" {% if settings.cove_enabled %}checked{% endif %} />
          <label class="form-check-label" for="cove_enabled">Enable Cove integration</label>
        </div>

        <div class="row g-3">
          <div class="col-md-12">
            <label for="cove_api_url" class="form-label">API URL</label>
            <input type="url" class="form-control" id="cove_api_url" name="cove_api_url"
                   value="{{ settings.cove_api_url or '' }}"
                   placeholder="https://api.backup.management/jsonapi" />
            <div class="form-text">Leave empty to use the default Cove API endpoint.</div>
          </div>

          <div class="col-md-6">
            <label for="cove_api_username" class="form-label">API Username <span class="text-danger">*</span></label>
            <input type="text" class="form-control" id="cove_api_username" name="cove_api_username"
                   value="{{ settings.cove_api_username or '' }}" />
          </div>

          <div class="col-md-6">
            <label for="cove_api_password" class="form-label">API Password {% if not has_cove_password %}<span class="text-danger">*</span>{% endif %}</label>
            <input type="password" class="form-control" id="cove_api_password" name="cove_api_password"
                   placeholder="{% if has_cove_password %}******** (stored){% else %}enter password{% endif %}" />
            <div class="form-text">Leave empty to keep the existing password.</div>
          </div>

          <div class="col-md-6">
            <div class="form-check form-switch mt-2">
              <input class="form-check-input" type="checkbox" id="cove_import_enabled" name="cove_import_enabled" {% if settings.cove_import_enabled %}checked{% endif %} />
              <label class="form-check-label" for="cove_import_enabled">Enable automatic import</label>
            </div>
          </div>

          <div class="col-md-6">
            <label for="cove_import_interval_minutes" class="form-label">Import interval (minutes)</label>
            <input type="number" class="form-control" id="cove_import_interval_minutes" name="cove_import_interval_minutes"
                   value="{{ settings.cove_import_interval_minutes or 30 }}" min="1" max="1440" />
            <div class="form-text">How often (in minutes) to fetch new data from the Cove API.</div>
          </div>
        </div>

        <div class="d-flex justify-content-between align-items-center mt-3">
          <div id="cove-test-result" class="small"></div>
          <div class="d-flex gap-2">
            <button type="button" class="btn btn-outline-secondary" id="cove-test-btn">Test connection</button>
            <button type="submit" class="btn btn-primary">Save Cove Settings</button>
          </div>
        </div>

        {% if settings.cove_partner_id %}
        <div class="mt-2 d-flex justify-content-between align-items-center flex-wrap gap-2">
          <div class="text-muted small">
            Connected – Partner ID: <strong>{{ settings.cove_partner_id }}</strong>
            {% if settings.cove_last_import_at %}
            · Last import: {{ settings.cove_last_import_at|local_datetime }}
            {% else %}
            · No import yet
            {% endif %}
          </div>
          <button
            type="submit"
            class="btn btn-sm btn-outline-primary"
            formaction="{{ url_for('main.settings_cove_run_now') }}"
            formmethod="post"
          >
            Run import now
          </button>
        </div>
        {% endif %}
      </div>
    </div>
  </form>

  <script>
  (function () {
    var btn = document.getElementById('cove-test-btn');
    var resultDiv = document.getElementById('cove-test-result');
    if (!btn) return;
    btn.addEventListener('click', function () {
      btn.disabled = true;
      resultDiv.textContent = 'Testing…';
      resultDiv.className = 'small text-muted';
      fetch('{{ url_for("main.settings_cove_test_connection") }}', {
        method: 'POST',
        headers: { 'X-CSRFToken': document.querySelector('meta[name="csrf-token"]') ? document.querySelector('meta[name="csrf-token"]').content : '' },
        credentials: 'same-origin',
      })
        .then(function (r) { return r.json(); })
        .then(function (data) {
          if (data.ok) {
            resultDiv.textContent = data.message;
            resultDiv.className = 'small text-success';
          } else {
            resultDiv.textContent = data.message;
            resultDiv.className = 'small text-danger';
          }
        })
        .catch(function (err) {
          resultDiv.textContent = 'Request failed: ' + err;
          resultDiv.className = 'small text-danger';
        })
        .finally(function () { btn.disabled = false; });
    });
  })();
  </script>

  <form method="post" class="mb-4" id="entra-settings-form">
    <div class="card mb-3">
      <div class="card-header">Microsoft Entra SSO</div>
      <div class="card-body">
        <div class="form-check form-switch mb-3">
          <input class="form-check-input" type="checkbox" id="entra_sso_enabled" name="entra_sso_enabled" {% if settings.entra_sso_enabled %}checked{% endif %} />
          <label class="form-check-label" for="entra_sso_enabled">Enable Microsoft sign-in</label>
        </div>

        <div class="row g-3">
          <div class="col-md-6">
            <label for="entra_tenant_id" class="form-label">Tenant ID</label>
            <input type="text" class="form-control" id="entra_tenant_id" name="entra_tenant_id"
                   value="{{ settings.entra_tenant_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
          </div>
          <div class="col-md-6">
            <label for="entra_client_id" class="form-label">Client ID</label>
            <input type="text" class="form-control" id="entra_client_id" name="entra_client_id"
                   value="{{ settings.entra_client_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
          </div>
          <div class="col-md-12">
            <label for="entra_client_secret" class="form-label">Client Secret {% if not has_entra_secret %}<span class="text-danger">*</span>{% endif %}</label>
            <input type="password" class="form-control" id="entra_client_secret" name="entra_client_secret"
                   placeholder="{% if has_entra_secret %}******** (stored){% else %}enter secret{% endif %}" />
            <div class="form-text">Leave empty to keep the existing secret.</div>
          </div>
          <div class="col-md-12">
            <label for="entra_redirect_uri" class="form-label">Redirect URI (optional override)</label>
            <input type="url" class="form-control" id="entra_redirect_uri" name="entra_redirect_uri"
                   value="{{ settings.entra_redirect_uri or '' }}"
                   placeholder="https://your-domain.example/auth/entra/callback" />
            <div class="form-text">If empty, Backupchecks uses its own external callback URL.</div>
          </div>
          <div class="col-md-6">
            <label for="entra_allowed_domain" class="form-label">Allowed domain/tenant (optional)</label>
            <input type="text" class="form-control" id="entra_allowed_domain" name="entra_allowed_domain"
                   value="{{ settings.entra_allowed_domain or '' }}" placeholder="contoso.com or tenant-id" />
            <div class="form-text">Restrict sign-ins to one tenant id or one email domain.</div>
          </div>
          <div class="col-md-12">
            <label for="entra_allowed_group_ids" class="form-label">Allowed Entra Group Object ID(s) (optional)</label>
            <textarea class="form-control" id="entra_allowed_group_ids" name="entra_allowed_group_ids" rows="3"
                      placeholder="group-object-id-1 group-object-id-2">{{ settings.entra_allowed_group_ids or '' }}</textarea>
            <div class="form-text">Optional hard access gate. Enter one or more Entra security group object IDs (comma or newline separated). The user must be a member of at least one.</div>
          </div>
          <div class="col-md-6">
            <div class="form-check form-switch mt-4">
              <input class="form-check-input" type="checkbox" id="entra_auto_provision_users" name="entra_auto_provision_users" {% if settings.entra_auto_provision_users %}checked{% endif %} />
              <label class="form-check-label" for="entra_auto_provision_users">Auto-provision unknown users as Viewer</label>
            </div>
          </div>
        </div>

        <div class="d-flex justify-content-end mt-3">
          <button type="submit" class="btn btn-primary">Save Entra Settings</button>
        </div>
      </div>
    </div>
  </form>
  {% endif %}

  {% if section == 'maintenance' %}
  <div class="row g-3 mb-4">
@@ -528,8 +700,16 @@
      <div class="col-md-4 d-flex align-items-end">
        <button type="submit" class="btn btn-primary w-100">Import jobs</button>
      </div>
      <div class="col-12">
        <div class="form-check">
          <input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_jobs" name="include_autotask_ids" />
          <label class="form-check-label" for="include_autotask_ids_jobs">
            Include Autotask IDs from import file
          </label>
        </div>
      </div>
      <div class="col-md-8">
        <div class="form-text">Use a JSON export created by this application.</div>
        <div class="form-text">Use a JSON export created by this application. Leave Autotask IDs unchecked for sandbox/development environments with a different Autotask database.</div>
      </div>
    </div>
  </form>

cove_api_test.py (new file, 310 lines)
@@ -0,0 +1,310 @@
#!/usr/bin/env python3
"""
Cove Data Protection API – Test Script
=======================================
Verified working via Postman (2026-02-23). Uses confirmed column codes.

Usage:
    python3 cove_api_test.py --username "api-user" --password "secret"

Or via environment variables:
    COVE_USERNAME="api-user" COVE_PASSWORD="secret" python3 cove_api_test.py

Optional:
    --url      API endpoint (default: https://api.backup.management/jsonapi)
    --records  Max records to fetch (default: 50)
"""

import argparse
import json
import os
import sys
from datetime import datetime, timezone

import requests

API_URL = "https://api.backup.management/jsonapi"

# Session status codes (values of the F00 status columns)
SESSION_STATUS = {
    1: "In process",
    2: "Failed",
    3: "Aborted",
    5: "Completed",
    6: "Interrupted",
    7: "NotStarted",
    8: "CompletedWithErrors",
    9: "InProgressWithFaults",
    10: "OverQuota",
    11: "NoSelection",
    12: "Restarted",
}

# Backupchecks status mapping
STATUS_MAP = {
    1: "Warning",   # In process
    2: "Error",     # Failed
    3: "Error",     # Aborted
    5: "Success",   # Completed
    6: "Error",     # Interrupted
    7: "Warning",   # NotStarted
    8: "Warning",   # CompletedWithErrors
    9: "Warning",   # InProgressWithFaults
    10: "Error",    # OverQuota
    11: "Warning",  # NoSelection
    12: "Warning",  # Restarted
}

# Confirmed working columns (verified via Postman 2026-02-23)
COLUMNS = [
    "I1", "I18", "I8", "I78",
    "D09F00", "D09F09", "D09F15", "D09F08",
    "D1F00", "D1F15",
    "D10F00", "D10F15",
    "D11F00", "D11F15",
    "D19F00", "D19F15",
    "D20F00", "D20F15",
    "D5F00", "D5F15",
    "D23F00", "D23F15",
]

# Datasource labels
DATASOURCE_LABELS = {
    "D09": "Total",
    "D1": "Files & Folders",
    "D2": "System State",
    "D10": "VssMsSql (SQL Server)",
    "D11": "VssSharePoint",
    "D19": "M365 Exchange",
    "D20": "M365 OneDrive",
    "D5": "M365 SharePoint",
    "D23": "M365 Teams",
}


def _post(url: str, payload: dict, timeout: int = 30) -> dict:
    headers = {"Content-Type": "application/json"}
    resp = requests.post(url, json=payload, headers=headers, timeout=timeout)
    resp.raise_for_status()
    return resp.json()


def login(url: str, username: str, password: str) -> tuple[str, int]:
    """Authenticate and return (visa, partner_id)."""
    payload = {
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "method": "Login",
        "params": {
            "username": username,
            "password": password,
        },
    }
    data = _post(url, payload)

    if "error" in data:
        raise RuntimeError(f"Login failed: {data['error']}")

    visa = data.get("visa")
    if not visa:
        raise RuntimeError(f"No visa token in response: {data}")

    result = data.get("result", {})
    partner_id = result.get("PartnerId") or result.get("result", {}).get("PartnerId")
    if not partner_id:
        raise RuntimeError(f"Could not find PartnerId in response: {data}")

    return visa, int(partner_id)


def enumerate_statistics(url: str, visa: str, partner_id: int, columns: list[str], records: int = 50) -> dict:
    payload = {
        "jsonrpc": "2.0",
        "visa": visa,
        "id": "jsonrpc",
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,
                "StartRecordNumber": 0,
                "RecordsCount": records,
                "Columns": columns,
            }
        },
    }
    return _post(url, payload)


def fmt_ts(value) -> str:
    if not value:
        return "(none)"
    try:
        ts = int(value)
        if ts == 0:
            return "(none)"
        dt = datetime.fromtimestamp(ts, tz=timezone.utc)
        return dt.strftime("%Y-%m-%d %H:%M UTC")
    except (ValueError, TypeError, OSError):
        return str(value)


def fmt_status(value) -> str:
    if value is None:
        return "(none)"
    try:
        code = int(value)
        bc = STATUS_MAP.get(code, "?")
        label = SESSION_STATUS.get(code, "Unknown")
        return f"{code} ({label}) → {bc}"
    except (ValueError, TypeError):
        return str(value)


def fmt_colorbar(value: str) -> str:
    if not value:
        return "(none)"
    icons = {"5": "✅", "8": "⚠️", "2": "❌", "1": "🔄", "0": "·"}
    return "".join(icons.get(c, c) for c in str(value))


def print_header(title: str) -> None:
    print()
    print("=" * 70)
    print(f" {title}")
    print("=" * 70)


def run(url: str, username: str, password: str, records: int, debug: bool = False) -> None:
    print_header("Cove Data Protection API – Test")
    print(f" URL: {url}")
    print(f" Username: {username}")

    # Login
    print_header("Step 1: Login")
    visa, partner_id = login(url, username, password)
    print(" ✅ Login OK")
    print(f" PartnerId: {partner_id}")
    print(f" Visa: {visa[:40]}...")

    # Fetch statistics
    print_header("Step 2: EnumerateAccountStatistics")
    print(f" Columns: {', '.join(COLUMNS)}")
    print(f" Records: max {records}")

    data = enumerate_statistics(url, visa, partner_id, COLUMNS, records)

    if debug:
        print("\n RAW response (first 2000 chars):")
        print(json.dumps(data, indent=2)[:2000])

    if "error" in data:
        err = data["error"]
        print(f" ❌ FAILED – error {err.get('code')}: {err.get('message')}")
        print(f" Data: {err.get('data')}")
        sys.exit(1)

    result = data.get("result")
    if result is None:
        print(" ⚠️ result is null – raw response:")
        print(json.dumps(data, indent=2)[:1000])
        sys.exit(0)

    if debug:
        print(f"\n result type: {type(result).__name__}")
        if isinstance(result, dict):
            print(f" result keys: {list(result.keys())}")

    # Unwrap possible nested result
    if isinstance(result, dict) and "result" in result:
        result = result["result"]

    # Result can be a list directly or wrapped in an Accounts key
    accounts = result if isinstance(result, list) else result.get("Accounts", []) if isinstance(result, dict) else []
    total = len(accounts)
    print(f" ✅ SUCCESS – {total} account(s) returned")

    # Per-account output
    print_header(f"Step 3: Account Details ({total} total)")

    for i, acc in enumerate(accounts):
        # Settings is a list of single-key dicts: [{"D09F00": "5"}, {"I1": "name"}, ...]
        # Flatten to a single dict for easy lookup.
        s: dict = {}
        for item in acc.get("Settings", []):
            s.update(item)

        account_id = acc.get("AccountId", "?")
        device_name = s.get("I1", "(no name)")
        computer = s.get("I18") or "(M365 tenant)"
        customer = s.get("I8", "")
        active_ds = s.get("I78", "")

        print(f"\n [{i+1}/{total}] {device_name} (AccountId: {account_id})")
        print(f"   Computer : {computer}")
        print(f"   Customer : {customer}")
        print(f"   Datasrc  : {active_ds}")

        # Total (D09)
        print("   Total:")
        print(f"     Status      : {fmt_status(s.get('D09F00'))}")
        print(f"     Last session: {fmt_ts(s.get('D09F15'))}")
        print(f"     Last success: {fmt_ts(s.get('D09F09'))}")
        print(f"     28-day bar  : {fmt_colorbar(s.get('D09F08'))}")

        # Per-datasource (only if present in response)
        ds_pairs = [
            ("D1", "D1F00", "D1F15"),
            ("D10", "D10F00", "D10F15"),
            ("D11", "D11F00", "D11F15"),
            ("D19", "D19F00", "D19F15"),
            ("D20", "D20F00", "D20F15"),
            ("D5", "D5F00", "D5F15"),
            ("D23", "D23F00", "D23F15"),
        ]
        for ds_code, f00_col, f15_col in ds_pairs:
            f00 = s.get(f00_col)
            f15 = s.get(f15_col)
            if f00 is None and f15 is None:
                continue
            label = DATASOURCE_LABELS.get(ds_code, ds_code)
            print(f"   {label}:")
            print(f"     Status      : {fmt_status(f00)}")
            print(f"     Last session: {fmt_ts(f15)}")

    # Summary
    print_header("Summary")
    status_counts: dict[str, int] = {}
    for acc in accounts:
        flat: dict = {}
        for item in acc.get("Settings", []):
            flat.update(item)
        raw = flat.get("D09F00")
        bc = STATUS_MAP.get(int(raw), "Unknown") if raw is not None else "No data"
        status_counts[bc] = status_counts.get(bc, 0) + 1

    for status, count in sorted(status_counts.items()):
        icon = {"Success": "✅", "Warning": "⚠️", "Error": "❌"}.get(status, " ")
        print(f" {icon} {status}: {count}")
    print(f"\n Total accounts: {total}")
    print()


def main() -> None:
    parser = argparse.ArgumentParser(description="Test Cove Data Protection API")
    parser.add_argument("--url", default=os.environ.get("COVE_URL", API_URL))
    parser.add_argument("--username", default=os.environ.get("COVE_USERNAME", ""))
    parser.add_argument("--password", default=os.environ.get("COVE_PASSWORD", ""))
    parser.add_argument("--records", type=int, default=50, help="Max accounts to fetch")
    parser.add_argument("--debug", action="store_true", help="Print raw API responses")
    args = parser.parse_args()

    if not args.username or not args.password:
        print("Error: --username and --password are required.")
        print("Or set COVE_USERNAME and COVE_PASSWORD environment variables.")
        sys.exit(1)

    run(args.url, args.username, args.password, args.records, args.debug)


if __name__ == "__main__":
    main()
@@ -2,9 +2,183 @@

This file documents all changes made to this project via Claude Code.

## [2026-02-10]
## [2026-02-23]

### Added
- Cove Data Protection full integration into Backupchecks:
  - `app/cove_importer.py` – Cove API client: login, paginated EnumerateAccountStatistics, status mapping, deduplication, per-datasource object persistence
  - `app/cove_importer_service.py` – background thread that polls the Cove API on a configurable interval
  - `SystemSettings` model: 8 new Cove fields (`cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`)
  - `Job` model: `cove_account_id` column to link a job to a Cove account
  - `JobRun` model: `source_type` (NULL = email, "cove_api") and `external_id` (deduplication key) columns
  - DB migration `migrate_cove_integration()` for all new columns + deduplication index
  - Settings > Integrations tab: new Cove section with enable toggle, API URL/username/password, import interval, and Test Connection button (AJAX → JSON response with partner ID)
  - Job Detail page: Cove Integration card showing an Account ID input (only when `cove_enabled`)
  - Route `POST /settings/cove/test-connection` – verifies Cove credentials and stores the partner ID
  - Route `POST /settings/cove/run-now` – manually triggers a Cove import from the Settings page
  - Route `POST /jobs/<id>/set-cove-account` – saves or clears the Cove Account ID on a job
- Cove Accounts inbox-style flow:
  - `CoveAccount` model (staging table): stores all Cove accounts from the API, with an optional `job_id` link
  - DB migration `migrate_cove_accounts_table()` creates the `cove_accounts` table with indexes
  - `cove_importer.py` updated: always upserts all accounts into the staging table; JobRuns are only created for accounts with a linked job
  - `routes_cove.py` – new routes: `GET /cove/accounts`, `POST /cove/accounts/<id>/link`, `POST /cove/accounts/<id>/unlink`
  - `cove_accounts.html` – inbox-style page: unmatched accounts shown first with "Link / Create job" modals (two tabs: create a new job or link to an existing one), matched accounts listed below with an Unlink button
  - Nav bar: "Cove Accounts" link added for admin/operator roles when `cove_enabled`
  - Route `POST /settings/cove/run-now` triggers a manual import (button also shown on the Cove Accounts page)
- `cove_api_test.py` – standalone Python test script to verify Cove Data Protection API column codes
  - Tests D9Fxx (Total), D10Fxx (VssMsSql), D11Fxx (VssSharePoint), and D1Fxx (Files & Folders)
  - Displays backup status (F00), timestamps (F09/F15/F18), and error counts (F06) per account
  - Accepts credentials via CLI args or environment variables
  - Summary output showing which column sets work
- Updated `docs/cove_data_protection_api_calls_known_info.md` with N-able support feedback:
  - D02/D03 are legacy – use D10/D11 or D9 (Total) instead
  - All users have the same API access (no MSP-level restriction)
  - Session status codes documented (D9F00: 2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- Updated `TODO-cove-data-protection.md` with breakthrough status and next steps

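The paginated EnumerateAccountStatistics polling mentioned above can be sketched as follows. This is a minimal illustration, not the actual `app/cove_importer.py` code: the request shape follows the verified `cove_api_test.py` payload, while `build_query` and `paginate` are hypothetical helper names.

```python
def build_query(visa: str, partner_id: int, columns: list, start: int, count: int) -> dict:
    """Build one EnumerateAccountStatistics JSON-RPC request (shape per cove_api_test.py)."""
    return {
        "jsonrpc": "2.0",
        "visa": visa,
        "id": "jsonrpc",
        "method": "EnumerateAccountStatistics",
        "params": {"query": {
            "PartnerId": partner_id,
            "StartRecordNumber": start,
            "RecordsCount": count,
            "Columns": columns,
        }},
    }


def paginate(fetch_page, page_size: int = 100) -> list:
    """Collect all records by advancing StartRecordNumber until a short page is returned."""
    records, start = [], 0
    while True:
        page = fetch_page(start, page_size)
        records.extend(page)
        if len(page) < page_size:  # short page means no more records
            return records
        start += page_size
```

In the importer, `fetch_page` would POST `build_query(...)` to the API and unwrap `result`; here it is injected as a callable so the paging logic can be exercised in isolation.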
### Changed
- Cove import/link behavior and visibility refinements:
  - Fixed Settings → Cove "Run import now" button submission issue caused by nested form markup
  - `/cove/accounts` now shows derived fields for faster linking decisions:
    - `backup_software` (Cove Data Protection)
    - derived `backup_type` (Server, Workstation, Microsoft 365)
    - derived `job_name`
    - human-readable datasource labels
    - `computer_name` visible in both the matched and unmatched sections
  - Linking a Cove account now triggers an immediate import attempt, so the latest runs can appear without waiting for the interval
  - Improved feedback after linking, with a per-linked-job run delta and a clearer reason when no run is created
- Cove run enrichment:
  - `JobRun.remark` now stores an account/computer/customer/status/last-run summary
  - Per-datasource run object records now include status detail text and the datasource session timestamp
- Cove timestamp fallback for run creation: use `D09F15`, falling back to `D09F09` when needed

### Fixed
- Cove deduplication scope:
  - Dedup check changed from global `external_id` to per-job (`job_id` + `external_id`) to prevent newly linked/relinked jobs from being blocked by sessions imported under another job
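The per-job deduplication scope described above can be illustrated with a small sketch. This is a hypothetical helper, not the application's actual query; the real check runs against the `job_id` + `external_id` deduplication index in the database:

```python
def should_create_run(seen: set, job_id: int, external_id: str) -> bool:
    """Return True when this Cove session has not yet produced a run for this job.

    Keying on (job_id, external_id) instead of external_id alone means a session
    already imported under another job no longer blocks a newly linked job.
    """
    key = (job_id, external_id)
    if key in seen:
        return False
    seen.add(key)
    return True
```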
- Navbar compression for split-screen usage:
  - Reworked the top navigation to reduce horizontal pressure without a forced full collapse
  - Moved admin-only links into the `Admin` dropdown
  - Added a `More` dropdown for secondary non-admin navigation links
  - Kept the primary daily operational link (`Run Checks`) directly visible
  - Adjusted role-specific visibility: Viewer now has `Customers` and `Jobs` directly visible in the navbar

### Added
- Microsoft Entra SSO implementation (branch `v20260223-07-entra-sso`):
  - Authorization code flow routes:
    - `GET /auth/entra/login`
    - `GET /auth/entra/callback`
  - Entra-aware logout flow (redirects through the Entra logout endpoint when applicable)
  - Login page support for a "Sign in with Microsoft" button when SSO is configured/enabled
  - Settings-backed Entra configuration fields in `SystemSettings` + migration support:
    - `entra_sso_enabled`
    - `entra_tenant_id`
    - `entra_client_id`
    - `entra_client_secret`
    - `entra_redirect_uri`
    - `entra_allowed_domain`
    - `entra_allowed_group_ids`
    - `entra_auto_provision_users`
  - Optional local auto-provisioning of unknown Entra users as Viewer
  - Security-group gate for SSO:
    - allowlist by Entra Group Object IDs
    - login blocked when the token lacks the required group context, or when group overage prevents reliable evaluation
- Documentation updates for Entra SSO:
  - Added page `documentation/settings/entra-sso`
  - Added a navigation entry under Settings
  - Marked explicitly as **Untested in Backupchecks**
  - Included setup instructions for tenant/client/secret/redirect, group-based access, and troubleshooting

## [2026-02-19]

### Added
- Explicit `Include Autotask IDs` import option in the Approved Jobs JSON import form (Settings -> Maintenance)
- Explicit `Include Autotask IDs` import option in the Customers CSV import form

### Changed
- Approved Jobs import now only applies `autotask_company_id` and `autotask_company_name` when the import option is checked
- Customers CSV import now only applies Autotask mapping fields when the import option is checked
- Import success and audit output now includes whether Autotask IDs were imported
- 3CX parser now recognizes `3CX Notification: Update Successful - <host>` as an informational run with `backup_software: 3CX`, `backup_type: Update`, and `overall_status: Success`, and excludes this type from schedule inference (no Expected/Missed generation)
- Run Checks now hides only non-backup 3CX informational types (`Update`, `SSL Certificate`), while other backup software/types remain visible
- Restored remark visibility in Run Checks and Job Details alerts by loading remarks from both sources: explicit run links (`remark_job_runs`) and active job scopes (`remark_scopes`), with duplicate prevention

## [2026-02-16]

### Added
- Customer-to-jobs navigation by making customer names clickable on the Customers page (`/jobs?customer_id=<id>`)
- Jobs page customer filter context UI with an active filter banner and a "Clear filter" action
- Global search page (`/search`) with grouped results for Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Existing overrides, and Reports
- Navbar search form to trigger global search from all authenticated pages
- Dedicated Remarks section in global search results (with paging and detail links), so remark records are searchable alongside tickets

### Changed
- `/jobs` route now accepts an optional `customer_id` and returns only jobs for that customer when provided
- Default Jobs listing keeps inactive-customer filtering only when no `customer_id` filter is applied
- Updated `docs/technical-notes-codex.md` with a new "Last updated" date, Customers->Jobs navigation notes, and a test build/push validation snapshot
- Search matching is now case-insensitive with wildcard support (`*`) and automatic contains behavior (`*term*`) per search term
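One way to implement the wildcard/contains matching described above is sketched below. This is illustrative only (the function name is an assumption, and the application applies the equivalent filtering in its own queries):

```python
import re


def term_to_pattern(term: str):
    """Case-insensitive matcher: '*' is a wildcard, and a bare term
    behaves as contains, i.e. 'cove' is treated as '*cove*'."""
    if "*" not in term:
        term = f"*{term}*"
    # Escape literal parts, join with '.*' where the wildcards were.
    regex = ".*".join(re.escape(part) for part in term.split("*"))
    return re.compile(f"^{regex}$", re.IGNORECASE)
```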
- Global search visibility now only includes sections accessible to the currently active role
- Updated `docs/technical-notes-codex.md` with a dedicated Global Grouped Search section (route/UI/behavior/access rules) and the latest test build digest for `v20260216-02-global-search`
- Global search now supports per-section pagination (previous/next), so results beyond the first 10 can be browsed per section while preserving the current query/state
- Daily Jobs search result metadata now includes the expected run time, a success indicator, and the run count for the selected day
- Daily Jobs search result links now open the same Daily Jobs modal flow via `open_job_id` (instead of only navigating to the overview page)
- Updated `docs/technical-notes-codex.md` with search pagination query params, Daily Jobs modal-open search behavior, and the latest successful test-build digest
- Search pagination buttons now preserve scroll position by linking back to the active section anchor after page navigation
- "Open <section>" behavior now passes `q` into destination pages and applies page-level filtering, so opened overviews reflect the same search term
- Filtering support on Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Overrides, and Reports now accepts wildcard-enabled `q` terms from search
- Reports frontend loading (`/api/reports`) now forwards the URL `q`, so client-side refresh keeps the same filtered result set
- Daily Jobs search section UI now shows an explicit English note that the Daily Jobs page itself is day-scoped, while search matches can reflect jobs across other days
- Updated `docs/technical-notes-codex.md` to include remarks in grouped search sections, the `p_remarks` pagination key, q-forwarding to overview pages, and the latest test-build digest

### Fixed
- `/search` page crash (`TypeError: 'builtin_function_or_method' object is not iterable`) by replacing Jinja dict access from `section.items` to `section['items']` in `templates/main/search.html`

## [2026-02-13]

### Added
- Added internal technical reference document `docs/technical-notes-codex.md` with repository structure, application architecture, processing flow, parser system rules, ticketing/Autotask constraints, feedback attachment notes, deployment/build workflow, and operational attention points

### Changed
- Changed `docs/technical-notes-codex.md` language from Dutch to English to align with project language rules for documentation

### Fixed
- Fixed Autotask tickets and internal tickets not being linked to missed runs by calling `link_open_internal_tickets_to_run` after creating missed JobRun records in `_ensure_missed_runs_for_job` (both weekly and monthly schedules), ensuring missed runs now receive the same ticket propagation as email-based runs
- Fixed checkboxes being automatically re-selected after delete actions on the Inbox and Run Checks pages by adding the `autocomplete="off"` attribute to all checkboxes, preventing the browser from restoring previous checkbox states after page reload

## [2026-02-12]

### Fixed
- Fixed tickets not being displayed in the Run Checks modal detail view (Meldingen section) by extending the `/api/job-runs/<run_id>/alerts` endpoint to include both run-specific tickets (via ticket_job_runs) and job-level tickets (via ticket_scopes), ensuring newly created tickets are visible immediately in the modal instead of only after being resolved
- Fixed the copy ticket button not working in Edge on the Job Details page by moving the clipboard functions (copyToClipboard, fallbackCopy, showCopyFeedback) inside the IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution)

## [2026-02-10]

### Added
- Added screenshot attachment support to the Feedback/Bug system (user request: allow screenshots for bugs/features)
  - New database model: `FeedbackAttachment` with file_data (BYTEA), filename, mime_type, file_size
  - Upload support on the feedback creation form (multiple files, PNG/JPG/GIF/WEBP, max 5 MB each)
  - Upload support on reply forms (attach screenshots when replying)
  - Inline image display on the feedback detail page (thumbnails with click-to-view-full-size)
  - Screenshot display for both main feedback items and replies
  - File validation: image type verification using imghdr (not just the extension), size limits, secure filename handling
  - New route: `/feedback/attachment/<id>` to serve images (access-controlled; admins can view deleted items' attachments)
  - Database migration: auto-creates the `feedback_attachments` table with indexes on startup
  - Automatic CASCADE delete: removing a feedback item or reply automatically removes its associated attachments
- Added an admin-only deleted items view and permanent delete functionality to the Feedback system
  - "Show deleted items" checkbox on the feedback list page (admin only)
  - Deleted items shown with reduced opacity + background color and a "Deleted" badge
  - Permanent delete action removes the item + all attachments from the database (hard delete with CASCADE)
  - Attachment count shown in the deletion confirmation message
  - Admins can view detail pages of deleted items, including their screenshots
  - Two-stage delete: soft delete (audit trail) → permanent delete (database cleanup)
  - Prevents accidental permanent deletion (requires the item to be soft-deleted first)
  - Security: non-admin users cannot view deleted items or their attachments (404 response)
- Added a copy ticket button (⧉) to the Job Details page modal for quickly copying ticket numbers to the clipboard (previously only available on the Run Checks page)

### Fixed
- Fixed cross-browser clipboard copy functionality for ticket numbers (previously required a manual copy popup in Edge)
  - Implemented a three-tier fallback mechanism: modern Clipboard API → legacy execCommand('copy') → prompt fallback
  - Copy button now works directly in all browsers (Firefox, Edge, Chrome) without requiring extra user interaction
  - Applied the improved copy mechanism to both the Run Checks and Job Details pages
- Fixed the Autotask ticket not being automatically linked to new runs when an internal ticket is resolved, by implementing an independent Autotask propagation strategy (now checks for the most recent non-deleted and non-resolved Autotask ticket on the job regardless of internal ticket status, ensuring the PSA ticket reference persists across runs until explicitly resolved or deleted)
- Fixed internal and Autotask tickets being linked to new runs even after being resolved, by removing the date-based "open" logic from the ticket query (tickets now only link to new runs if they are genuinely unresolved, not based on run date comparisons)
- Fixed the Job Details page showing resolved tickets for ALL runs by implementing a two-source ticket display: directly linked tickets (via ticket_job_runs) are always shown for the audit trail, while active-window tickets (via the scope query) are only shown if unresolved, preserving historical ticket links while preventing resolved tickets from appearing on new runs

@@ -1,3 +1,64 @@
## v0.1.27

This release is a major functional update since `v0.1.26` (released on February 10, 2026).
It introduces full Cove Data Protection integration, broad search and navigation improvements, and multiple workflow/ticketing fixes. It also adds Microsoft Entra SSO foundations (currently marked as untested in Backupchecks), along with extensive documentation updates and UI refinements.

### Added
- Full Cove Data Protection integration:
  - API importer + background polling
  - Cove Accounts staging/linking page
  - Manual import trigger
  - JobRun source tracking + external IDs
  - CoveAccount model + migrations
  - Per-datasource object persistence
- Cove API test script (`cove_api_test.py`)
- Global grouped search with role-aware results
- Per-section pagination in search
- Remarks included in search results
- Customer → Jobs quick filter navigation
- Optional Autotask ID import toggle (jobs/customers import)
- Microsoft Entra SSO implementation (marked as **untested in Backupchecks**):
  - login/callback/logout flow
  - settings + migrations
  - optional auto-provisioning
  - tenant/domain restriction
  - security-group gate (allowed Entra group IDs)
- New documentation page for Entra SSO setup

### Changed
- Cove import and linking flow refinements:
  - Immediate import after linking
  - Type derivation (Server/Workstation/Microsoft 365)
  - More readable Cove Accounts display
  - Richer run details
  - Timestamp fallback (`D09F15` → `D09F09`)
- Navbar restructuring:
  - Admin links grouped under `Admin`
  - Secondary links grouped under `More`
  - Cove Accounts moved back to the main bar
  - Daily Jobs moved under `More`
  - Viewer role now has Customers/Jobs directly on the navbar
- Search UX improvements:
  - wildcard/contains filtering
  - query forwarding to overview pages
  - section-anchor and pagination state preservation
- Parser/Run Checks behavior updates:
  - 3CX update emails handled as informational
  - Non-backup 3CX types hidden from Run Checks
- Documentation expanded and corrected (workflow, run review, mail import, settings, etc.)

### Fixed
- Tickets not shown in the Run Checks modal
- Copy ticket button not working in Edge (scope/clipboard fallback)
- Resolved tickets incorrectly shown on new runs (now explicit link-based logic)
- Duplicate tickets in the Run Checks popup
- Missed-run ticket linking with Autotask/internal tickets
- Cove run deduplication corrected to per-job scope
- Cove "Run import now" submit issue
- Checkbox auto-reselect after reload
- Search template crash (`section.items`)
- Cleanup: Python cache artifacts are no longer tracked

## v0.1.26

This critical bug fix release resolves ticket system display issues where resolved tickets were incorrectly appearing on new runs across multiple pages. The ticket system has been fully transitioned from date-based logic to explicit link-based queries, ensuring resolved tickets stop appearing immediately after resolution while preserving the audit trail for historical runs.

docs/cove_data_protection_api_calls_known_info.md (new file, 230 lines)
@@ -0,0 +1,230 @@
# Cove Data Protection (N-able Backup) – Known Information on API Calls
|
||||
|
||||
Date: 2026-02-10 (updated 2026-02-23)
|
||||
Status: Pending re-test with corrected column codes
|
||||
|
||||
## ⚠️ Important Update (2026-02-23)
|
||||
|
||||
**N-able support (Andrew Robinson, Applications Engineer) confirmed:**
|
||||
|
||||
1. **D02 and D03 are legacy column codes** – use **D10 and D11** instead.
|
||||
2. **There is no MSP-level restriction** – all API users have the same access level.
|
||||
3. New documentation: https://developer.n-able.com/n-able-cove/docs/getting-started
|
||||
4. Column code reference: https://developer.n-able.com/n-able-cove/docs/column-codes
|
||||
|
||||
**Impact:** The security error 13501 was caused by using legacy D02Fxx/D03Fxx codes.
|
||||
Using D9Fxx (Total aggregate), D10Fxx (VssMsSql), D11Fxx (VssSharePoint) should work.
|
||||
|
||||
**Key newly available columns (pending re-test):**

- `D9F00` = Last Session Status (2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- `D9F06` = Last Session Errors Count
- `D9F09` = Last Successful Session Timestamp (Unix)
- `D9F12` = Session Duration
- `D9F15` = Last Session Timestamp (Unix)
- `D9F17` = Last Completed Session Status
- `D9F18` = Last Completed Session Timestamp (Unix)

**Session status codes (F00):**
1=In process, 2=Failed, 3=Aborted, 5=Completed, 6=Interrupted,
7=NotStarted, 8=CompletedWithErrors, 9=InProgressWithFaults,
10=OverQuota, 11=NoSelection, 12=Restarted

**Test script:** `cove_api_test.py` in the project root – run it to verify the new column codes.
---
## Summary of original findings (2026-02-10)

API access to Cove Data Protection via JSON-RPC **works**, but was **heavily restricted** because legacy column codes (D02Fxx, D03Fxx) were being used. Now resolved.

Previous error:

```
Operation failed because of security reasons (error 13501)
```
---
## Authentication model (confirmed)

- Endpoint: https://api.backup.management/jsonapi
- Protocol: JSON-RPC 2.0
- Method: POST only
- Authentication flow:
  1. The Login method is called
  2. The response returns a **visa** token (top-level field)
  3. The visa **must be included in every subsequent call**
  4. Cove may return a new visa in later responses (token chaining)
### Login request (working)

```json
{
  "jsonrpc": "2.0",
  "method": "Login",
  "params": {
    "partner": "<EXACT customer/partner name>",
    "username": "<api login name>",
    "password": "<password>"
  },
  "id": "1"
}
```
### Login response structure (important)

```json
{
  "result": {
    "result": {
      "PartnerId": <number>,
      "Name": "<login name>",
      "Flags": ["SecurityOfficer", "NonInteractive"]
    }
  },
  "visa": "<visa token>"
}
```

Notes:

- `visa` is **not** inside `result`, but at top level
- `PartnerId` is found at `result.result.PartnerId`
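The notes above can be captured in a minimal parsing helper. This is an illustrative sketch, not the project's actual client code; it only shows where the two session fields live in the response:

```python
def extract_session(data: dict) -> tuple:
    """Pull the visa token and PartnerId out of a Cove Login response.

    The visa lives at the TOP level of the response, not inside
    `result`; PartnerId is nested at result.result.PartnerId.
    """
    visa = data["visa"]  # top-level field, easy to miss
    partner_id = data["result"]["result"]["PartnerId"]
    return visa, partner_id


# Response shaped like the structure documented above (sample values)
sample = {
    "result": {
        "result": {
            "PartnerId": 12345,
            "Name": "api-user",
            "Flags": ["SecurityOfficer", "NonInteractive"],
        }
    },
    "visa": "abc-visa-token",
}
print(extract_session(sample))  # ('abc-visa-token', 12345)
```

Remember token chaining: later responses may carry a fresh `visa`, so a real client should update its stored token after every call.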
---
## API user scope (critical finding)

- API users are **always bound to a single Partner (customer)** unless created at MSP/root level
- In this environment, it is **not possible to create an MSP-level API user**
- All testing was therefore done with **customer-scoped API users**

Impact:

- Cross-customer enumeration is impossible
- Only data belonging to the linked customer can be queried
- Some enumerate/reporting calls are blocked regardless of role
---
## EnumerateAccountStatistics – what works and what does not

### Method

```json
{
  "jsonrpc": "2.0",
  "method": "EnumerateAccountStatistics",
  "visa": "<visa>",
  "params": {
    "query": {
      "PartnerId": <partner_id>,
      "SelectionMode": "Merged",
      "StartRecordNumber": 0,
      "RecordsCount": 50,
      "Columns": [ ... ]
    }
  }
}
```
### Mandatory behavior

- **Columns are required**; omitting them returns `result: null`
- The API behaves as an **allow-list**:
  - If *any* requested column is restricted, the **entire call fails** with error 13501
### Confirmed working (safe) column set

The following column set works reliably:

- I1 → account / device / tenant identifier
- I14 → used storage (bytes)
- I18 → computer name (if applicable)
- D01F00 – D01F07 → numeric metrics (exact semantics TBD)
- D09F00 → numeric status/category code

Example (validated working):
```json
"Columns": [
  "I1","I14","I18",
  "D01F00","D01F01","D01F02","D01F03",
  "D01F04","D01F05","D01F06","D01F07",
  "D09F00"
]
```
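A full request payload using this safe set can be assembled like so. This is a sketch under the payload shape documented above; `visa` and `partner_id` are placeholders:

```python
# Column codes confirmed safe in this tenant (from the list above)
SAFE_COLUMNS = [
    "I1", "I14", "I18",
    "D01F00", "D01F01", "D01F02", "D01F03",
    "D01F04", "D01F05", "D01F06", "D01F07",
    "D09F00",
]


def build_statistics_request(visa: str, partner_id: int,
                             start: int = 0, count: int = 50) -> dict:
    """Assemble an EnumerateAccountStatistics payload with the safe columns."""
    return {
        "jsonrpc": "2.0",
        "id": "1",
        "method": "EnumerateAccountStatistics",
        "visa": visa,
        "params": {
            "query": {
                "PartnerId": partner_id,
                "SelectionMode": "Merged",
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": SAFE_COLUMNS,
            }
        },
    }


payload = build_statistics_request("demo-visa", 12345)
print(payload["params"]["query"]["Columns"][-1])  # D09F00
```

Because the API is an allow-list, keeping the column set in one constant makes it harder to accidentally mix in a restricted code and break the whole call.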
### Confirmed restricted (cause security error 13501)

- The entire D02Fxx range
- The entire D03Fxx range
- Broad I-ranges (e.g. I1–I10 batches)
- Many individually tested I-codes not in the safe set

Even adding **one restricted code** causes the entire call to fail.
---
## EnumerateAccounts

- The method consistently fails with `Operation failed because of security reasons`
- This applies even with:
  - SuperUser role
  - SecurityOfficer flag enabled

Conclusion:

- EnumerateAccounts is **not usable** in this tenant for customer-scoped API users
---
## Other tested methods

- EnumerateStatistics → Method not found
- GetPartnerInfo → works only for basic partner metadata (not statistics)
---
## Practical implications for BackupChecks

What **is possible**:

- Enumerate accounts implicitly via EnumerateAccountStatistics
- Identify devices/accounts via AccountId + I1/I18
- Collect storage usage (I14)
- Collect numeric status/metrics via D01Fxx and D09F00

What is **not possible (via this API scope)**:

- Reliable last backup timestamp
- Explicit success / failure / warning text
- Error messages
- Enumerating devices via EnumerateAccounts
- Cross-customer aggregation
### Suggested internal model mapping

- Customer
  - external_id = PartnerId
- Job
  - external_id = AccountId
  - display_name = I1
  - hostname = I18 (if present)
- Run (limited)
  - metrics only (bytes, counters)
  - status must be **derived heuristically** from numeric fields (if possible)
---
## Open questions / next steps

1. Confirm the official meaning of:
   - D01F00 – D01F07
   - D09F00

2. Investigate whether:
   - A token-based (non-JSON-RPC) reporting endpoint exists
   - N-able support can enable additional reporting columns
   - An MSP-level API user can be provisioned by N-able

3. Decide whether the Cove integration in BackupChecks will be:
   - Metrics-only (no run result semantics)
   - Or require vendor cooperation for expanded API access
docs/technical-notes-codex.md (new file, 609 lines)
# Technical Notes (Internal)

Last updated: 2026-02-23 (late)

## Purpose

An internal technical snapshot of the `backupchecks` repository for faster onboarding, troubleshooting, and change-impact analysis.

## Repository Overview

- Application: Flask web app with SQLAlchemy and Flask-Migrate.
- Runtime: containerized (Docker), deployed via a Docker Compose stack.
- Primary source code location: `containers/backupchecks/src`.
- The project also contains extensive functional documentation in `docs/` and multiple roadmap TODO files at the repository root.
## Main Structure

- `containers/backupchecks/Dockerfile`: Python 3.12-slim image, starts `gunicorn` with `backend.app:create_app()`.
- `containers/backupchecks/requirements.txt`: Flask stack + PostgreSQL driver + reporting libraries (`reportlab`, `Markdown`).
- `containers/backupchecks/src/backend/app`: backend domain logic, routes, parsers, models, migrations.
- `containers/backupchecks/src/templates`: Jinja templates for auth/main/documentation pages.
- `containers/backupchecks/src/static`: CSS, images, favicon.
- `deploy/backupchecks-stack.yml`: compose stack with `backupchecks`, `postgres`, `adminer`.
- `build-and-push.sh`: release/test build script with version bumping, tags, and image push.
- `docs/`: functional design, changelogs, migration notes, API notes.
## Application Architecture (Current Observation)

- Factory pattern: `create_app()` in `containers/backupchecks/src/backend/app/__init__.py`.
- Blueprints:
  - `auth_bp` for authentication.
  - `main_bp` for core functionality.
  - `doc_bp` for internal documentation pages.
- Database initialization at startup:
  - `db.create_all()`
  - `run_migrations()`
- Background tasks:
  - `start_auto_importer(app)` starts the automatic mail importer thread.
  - `start_cove_importer(app)` starts the Cove Data Protection polling thread (only when `cove_import_enabled` is set).
- Health endpoint:
  - `GET /health` returns `{ "status": "ok" }`.
## Functional Processing Flow

- Import: email is fetched via the Microsoft Graph API.
- Parse: parser selection through the registry + software-specific parser implementations.
- Approve: new jobs first appear in the Inbox for initial customer assignment.
- Auto-process: subsequent emails for known jobs automatically create `JobRun` records.
- Monitor: runs appear in Daily Jobs and Run Checks.
- Review: manual review removes items from the unreviewed operational queue.
## Configuration and Runtime

- Config is built from environment variables in `containers/backupchecks/src/backend/app/config.py`.
- Important variables:
  - `APP_SECRET_KEY`
  - `APP_ENV`
  - `APP_PORT`
  - `POSTGRES_DB`
  - `POSTGRES_USER`
  - `POSTGRES_PASSWORD`
  - `DB_HOST`
  - `DB_PORT`
- Database URI pattern:
  - `postgresql+psycopg2://<user>:<pass>@<host>:<port>/<db>`
- Default timezone in config: `Europe/Amsterdam`.
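The URI pattern can be assembled from the environment like this. A minimal sketch: the variable names match the list above, but the fallback defaults are illustrative, not necessarily what `config.py` actually uses:

```python
import os


def database_uri(env=None) -> str:
    """Build the SQLAlchemy database URI from the environment variables above."""
    env = os.environ if env is None else env
    user = env.get("POSTGRES_USER", "backupchecks")      # defaults are
    password = env.get("POSTGRES_PASSWORD", "")          # illustrative only
    host = env.get("DB_HOST", "postgres")
    port = env.get("DB_PORT", "5432")
    db = env.get("POSTGRES_DB", "backupchecks")
    return f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{db}"


uri = database_uri({"POSTGRES_USER": "bc", "POSTGRES_PASSWORD": "s3cret",
                    "DB_HOST": "db", "DB_PORT": "5432", "POSTGRES_DB": "bc"})
print(uri)  # postgresql+psycopg2://bc:s3cret@db:5432/bc
```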
## Data Model (High-level)

File: `containers/backupchecks/src/backend/app/models.py`

- Auth/users:
  - `User` with role(s) and an active role in the session.
- System settings:
  - `SystemSettings` with Graph/mail settings, import settings, UI timezone, dashboard policy, sandbox flag.
  - Autotask configuration and cache fields are present.
  - Cove Data Protection fields: `cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`.
  - Microsoft Entra SSO fields: `entra_sso_enabled`, `entra_tenant_id`, `entra_client_id`, `entra_client_secret`, `entra_redirect_uri`, `entra_allowed_domain`, `entra_allowed_group_ids`, `entra_auto_provision_users`.
- Logging:
  - `AuditLog` (legacy alias `AdminLog`).
- Domain:
  - `Customer`, `Job`, `JobRun`, `Override`
  - `MailMessage`, `MailObject`
  - `CoveAccount` (Cove staging table; see the Cove integration section)
  - `Ticket`, `TicketScope`, `TicketJobRun`
  - `Remark`, `RemarkScope`, `RemarkJobRun`
  - `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`
### Foreign Key Relationships & Deletion Order

Critical deletion order to avoid constraint violations:

1. Clean auxiliary tables (`ticket_job_runs`, `remark_job_runs`, scopes, overrides)
2. Unlink mails from jobs (`UPDATE mail_messages SET job_id = NULL`)
3. Delete `mail_objects`
4. Delete jobs (cascades to `job_runs`)
5. Delete mails
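The unlink-then-cascade behavior can be demonstrated with a minimal in-memory schema. This is an illustrative subset with made-up column lists; the real schema lives in `models.py`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs FK enforcement opted in
conn.executescript("""
    CREATE TABLE jobs (id INTEGER PRIMARY KEY);
    CREATE TABLE job_runs (
        id INTEGER PRIMARY KEY,
        job_id INTEGER NOT NULL REFERENCES jobs(id) ON DELETE CASCADE);
    CREATE TABLE mail_messages (
        id INTEGER PRIMARY KEY,
        job_id INTEGER REFERENCES jobs(id));
    INSERT INTO jobs VALUES (1);
    INSERT INTO job_runs VALUES (1, 1);
    INSERT INTO mail_messages VALUES (1, 1);
""")

conn.execute("UPDATE mail_messages SET job_id = NULL")  # step 2: unlink mails
conn.execute("DELETE FROM jobs")                        # step 4: cascades to job_runs
conn.execute("DELETE FROM mail_messages")               # step 5: delete mails
print(conn.execute("SELECT COUNT(*) FROM job_runs").fetchone()[0])  # 0
```

Deleting jobs before unlinking the mails would violate the `mail_messages.job_id` reference, which is exactly why the order above matters.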
### Key Model Fields

**MailMessage model:**

- `from_address` (NOT `sender`!) – sender email address
- `subject` – email subject
- `text_body` – plain-text content
- `html_body` – HTML content
- `received_at` – timestamp
- `location` – inbox/processed/deleted
- `job_id` – link to Job (nullable)
**Job model:**

- `customer_id` – FK to Customer
- `job_name` – parsed from email
- `backup_software` – e.g. "Veeam", "Synology", "Cove Data Protection"
- `backup_type` – e.g. "Backup Job", "Active Backup"
- `cove_account_id` – nullable int; links this job to a Cove AccountId

**JobRun model:**

- `source_type` – NULL = email (backwards compatible), `"cove_api"` for Cove-imported runs
- `external_id` – deduplication key for Cove runs: `"cove-{account_id}-{run_ts}"`
## Parser Architecture

- Folder: `containers/backupchecks/src/backend/app/parsers/`
- Two layers:
  - `registry.py`:
    - matching/documentation/visibility on `/parsers`.
    - examples must stay generic (no customer names).
  - parser files (`veeam.py`, `synology.py`, etc.):
    - actual detection and parsing logic.
    - return structured output: software, type, job name, status, objects.
- Practical rule: extend patterns by adding, not replacing (backward compatibility).
### Parser Types

**Informational parsers:**

- DSM Updates, Account Protection, Firmware Updates
- Set an appropriate `backup_type` (e.g. "Updates", "Firmware Update")
- Do NOT participate in schedule learning
- Usually still visible in Run Checks for awareness
- Exception: non-backup 3CX informational types (`Update`, `SSL Certificate`) are hidden from Run Checks

**Regular parsers:**

- Backup jobs (Veeam, Synology Active Backup, NAKIVO, etc.)
- Participate in schedule learning (daily/weekly/monthly detection)
- Generate missed runs when expected runs don't occur
**Example: Synology Updates Parser (`synology.py`)**

- Handles multiple update notification types under the same job:
  - DSM automatic update cancelled
  - Packages out-of-date
  - Combined notifications (DSM + packages)
- Detection patterns:
  - DSM: "Automatische DSM-update", "DSM-update op", "automatic DSM update"
  - Packages: "Packages on", "out-of-date", "Package Center"
- Hostname extraction from multiple patterns
- Returns: `backup_type` "Updates", `job_name` "Synology Automatic Update"
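The detection described above can be sketched as a substring match over the listed patterns. This is a simplified illustration, not the real `synology.py` (which also extracts hostnames and handles more cases):

```python
# Pattern lists taken from the detection patterns documented above
DSM_PATTERNS = ("Automatische DSM-update", "DSM-update op", "automatic DSM update")
PACKAGE_PATTERNS = ("Packages on", "out-of-date", "Package Center")


def parse_synology_update(subject: str, body: str):
    """Classify a Synology update notification; returns None if it is not one."""
    text = f"{subject}\n{body}"
    is_dsm = any(p in text for p in DSM_PATTERNS)
    is_packages = any(p in text for p in PACKAGE_PATTERNS)
    if not (is_dsm or is_packages):
        return None
    return {
        "backup_software": "Synology",
        "backup_type": "Updates",
        "job_name": "Synology Automatic Update",
        # Combined DSM + packages notifications simply match both lists
        "kinds": [k for k, hit in (("dsm", is_dsm), ("packages", is_packages)) if hit],
    }


print(parse_synology_update("DSM-update op NAS01", "")["kinds"])  # ['dsm']
```

Note how a combined notification naturally yields both kinds, matching the "under the same job" behavior: one job, multiple notification types.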
## Cove Data Protection Integration

### Overview

Cove (N-able) Data Protection is a cloud backup platform. Backupchecks integrates with it via the Cove JSON-RPC API, following the same inbox-style staging flow as email imports.

### Files

- `containers/backupchecks/src/backend/app/cove_importer.py` – API client, account processing, JobRun creation
- `containers/backupchecks/src/backend/app/cove_importer_service.py` – background polling thread
- `containers/backupchecks/src/backend/app/main/routes_cove.py` – `/cove/accounts` routes
- `containers/backupchecks/src/templates/main/cove_accounts.html` – inbox-style accounts page
### API Details

- Endpoint: `https://api.backup.management/jsonapi` (JSON-RPC 2.0)
- **Login**: `POST` with `{"jsonrpc":"2.0","id":"jsonrpc","method":"Login","params":{"username":"...","password":"..."}}`
  - Returns `visa` at top level (`data["visa"]`), **not** inside `result`
  - Returns `PartnerId` inside `result`
- **EnumerateAccountStatistics**: `POST` with the visa in the payload and a `query` (lowercase) containing `PartnerId`, `StartRecordNumber`, `RecordsCount`, `Columns`
- Settings format per account: `[{"D09F00": "5"}, {"I1": "device name"}, ...]` – a list of single-key dicts; flatten with `dict.update(item)`
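The flatten step above in full, as a small sketch:

```python
def flatten_settings(items: list) -> dict:
    """Flatten Cove's per-account settings list of single-key dicts."""
    flat = {}
    for item in items:
        flat.update(item)  # each item contributes exactly one key/value
    return flat


print(flatten_settings([{"D09F00": "5"}, {"I1": "device name"}]))
# {'D09F00': '5', 'I1': 'device name'}
```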
### Column Codes

| Code | Meaning |
|------|---------|
| `I1` | Account/device name |
| `I18` | Computer name |
| `I8` | Customer/partner name |
| `I78` | Active datasource label |
| `D09F00` | Overall last session status code |
| `D09F09` | Last successful session timestamp (Unix) |
| `D09F15` | Last session end timestamp (Unix) |
| `D09F08` | 28-day colorbar string |
| `D1F00/F15` | Files & Folders status/timestamp |
| `D10F00/F15` | VssMsSql |
| `D11F00/F15` | VssSharePoint |
| `D19F00/F15` | M365 Exchange |
| `D20F00/F15` | M365 OneDrive |
| `D5F00/F15` | M365 SharePoint |
| `D23F00/F15` | M365 Teams |
### Status Code Mapping

| Cove code | Meaning | Backupchecks status |
|-----------|---------|---------------------|
| 1 | In process | Warning |
| 2 | Failed | Error |
| 3 | Aborted | Error |
| 5 | Completed | Success |
| 6 | Interrupted | Error |
| 7 | Not started | Warning |
| 8 | Completed with errors | Warning |
| 9 | In progress with faults | Warning |
| 10 | Over quota | Error |
| 11 | No selection | Warning |
| 12 | Restarted | Warning |
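The table above as a lookup, for quick reference. The default for unknown codes is an assumption here (the real importer may handle them differently):

```python
# Cove status code → Backupchecks status, per the mapping table above
COVE_STATUS_MAP = {
    2: "Error", 3: "Error", 6: "Error", 10: "Error",
    5: "Success",
    1: "Warning", 7: "Warning", 8: "Warning", 9: "Warning",
    11: "Warning", 12: "Warning",
}


def map_cove_status(code: int) -> str:
    # Unknown codes default to Warning so they surface for review
    # (assumption; not confirmed against the real importer).
    return COVE_STATUS_MAP.get(code, "Warning")


print(map_cove_status(5))  # Success
print(map_cove_status(8))  # Warning
```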
### Inbox-Style Flow (mirrors email import)

1. The Cove importer fetches all accounts via paginated `EnumerateAccountStatistics` (250 per page).
2. Every account is upserted into the `cove_accounts` staging table (always, regardless of job link).
3. Accounts without a `job_id` appear on `/cove/accounts` (the "Cove Accounts" page) for admin action.
4. The admin can:
   - **Create new job** – creates a `Job` with `backup_software="Cove Data Protection"` and links it.
   - **Link to existing job** – sets `job.cove_account_id` and `cove_acc.job_id`.
5. Linking an account triggers an immediate import attempt; linked accounts then generate `JobRun` records (deduplicated per job via `job_id + external_id`).
6. Per-datasource objects are persisted to `customer_objects`, `job_object_links`, `run_object_links`.
### CoveAccount Model

```python
class CoveAccount(db.Model):
    __tablename__ = "cove_accounts"
    id                 # PK
    account_id         # Cove AccountId (unique)
    account_name       # I1
    computer_name      # I18
    customer_name      # I8
    datasource_types   # I78
    last_status_code   # D09F00 (int)
    last_run_at        # D09F15 (datetime)
    colorbar_28d       # D09F08
    job_id             # FK → jobs.id (nullable; None = unmatched)
    first_seen_at
    last_seen_at
    job                # relationship → Job
```
### Deduplication

`external_id = f"cove-{account_id}-{run_ts}"`, where `run_ts` is the Unix timestamp from `D09F15` (falling back to `D09F09` when needed).

Deduplication is enforced per linked job:

- check `JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()`
- this prevents cross-job collisions when accounts are relinked.
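The key construction with its fallback can be sketched as follows. The helper name and the `None` return for a missing timestamp are illustrative choices, not the importer's confirmed behavior:

```python
def cove_external_id(account_id: int, settings: dict):
    """Build the per-run dedup key; fall back to D09F09 when D09F15 is absent.

    `settings` is the flattened column dict for one account (illustrative;
    the real importer derives it from EnumerateAccountStatistics).
    """
    run_ts = settings.get("D09F15") or settings.get("D09F09")
    if not run_ts:
        return None  # no usable timestamp, so no stable key can be built
    return f"cove-{account_id}-{run_ts}"


print(cove_external_id(4711, {"D09F15": "1760000000"}))  # cove-4711-1760000000
print(cove_external_id(4711, {"D09F09": "1759990000"}))  # cove-4711-1759990000
```

Because the key embeds the account ID, the same session timestamp on two different accounts never collides; the per-job `filter_by` check covers the relinking case.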
### Run Enrichment

- The Cove-created `JobRun.remark` contains an account/computer/customer and last status/timestamp summary.
- Per-datasource run object records include:
  - the mapped Backupchecks status
  - readable status details in `error_message`
  - the datasource-level session timestamp in `observed_at`
### Cove Accounts UI Notes

- `/cove/accounts` derives display fields to align with existing job logic:
  - `backup_software`: `Cove Data Protection`
  - `backup_type`: `Server`, `Workstation`, or `Microsoft 365`
  - `job_name`: based on Cove account/computer fallback
  - readable datasource labels instead of the raw `I78` code stream
- `computer_name` is shown in both the unmatched and matched account tables.
### Background Thread

`cove_importer_service.py` follows the same pattern as `auto_importer_service.py`:

- Thread name: `"cove_importer"`
- Checks `settings.cove_import_enabled`
- Interval: `settings.cove_import_interval_minutes` (default 30)
- Calls `run_cove_import(settings)`, which returns `(total, created, skipped, errors)`
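The polling-thread pattern itself can be sketched with the standard library. This is an illustrative reimplementation, not the real `start_cove_importer(app)`: the function name, stop event, and callable parameter are all stand-ins, and the real service additionally handles the app context, settings checks, and errors:

```python
import threading
import time


def start_polling_thread(run_import, interval_seconds=1800, stop_event=None):
    """Start a daemon thread that calls run_import() on a fixed interval.

    run_import stands in for run_cove_import(settings); the stop event
    exists so a test can end the loop (the real thread runs for the
    application's lifetime).
    """
    stop_event = stop_event or threading.Event()

    def loop():
        # Event.wait doubles as the sleep: it returns False on timeout
        # (run another import) and True once the event is set (shut down).
        while not stop_event.wait(interval_seconds):
            run_import()

    t = threading.Thread(target=loop, name="cove_importer", daemon=True)
    t.start()
    return t, stop_event
```

Using `Event.wait` instead of `time.sleep` makes shutdown responsive: setting the event wakes the thread immediately instead of waiting out the remaining interval.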
### Settings UI

Settings → Integrations → Cove section:

- Enable toggle, API URL, username, password (masked; only overwritten if non-empty)
- Import enabled + interval
- "Test Connection" button (AJAX → `POST /settings/cove/test-connection`) returns `{ok, partner_id, message}`
- "Run import now" button (→ `POST /settings/cove/run-now`) triggers a manual import
### Routes

| Route | Method | Description |
|-------|--------|-------------|
| `/cove/accounts` | GET | Inbox-style page: unmatched + matched accounts |
| `/cove/accounts/<id>/link` | POST | `action=create` or `action=link` |
| `/cove/accounts/<id>/unlink` | POST | Removes the job link, puts the account back in unmatched |
| `/settings/cove/test-connection` | POST | AJAX: verify credentials, save partner_id |
| `/settings/cove/run-now` | POST | Manual import trigger |
### Migrations

- `migrate_cove_integration()` – adds 8 columns to `system_settings`, `cove_account_id` to `jobs`, `source_type` + `external_id` to `job_runs`, and a dedup index on `job_runs.external_id`
- `migrate_cove_accounts_table()` – creates the `cove_accounts` table with indexes
---
## Ticketing and Autotask (Critical Rules)

### Two Ticket Types

1. **Internal tickets** (`tickets` table)
   - Created manually or via the Autotask integration
   - Stored in the `tickets` table with a `ticket_code` (e.g. "T20250123.0001")
   - Linked to runs via the `ticket_job_runs` many-to-many table
   - Scoped to jobs via the `ticket_scopes` table
   - Have a `resolved_at` field for resolution tracking
   - **Auto-propagation**: automatically linked to new runs via `link_open_internal_tickets_to_run`

2. **Autotask tickets** (`job_runs` columns)
   - Created via the Run Checks modal → "Create Autotask Ticket"
   - Stored directly in JobRun columns: `autotask_ticket_id`, `autotask_ticket_number`, etc.
   - When created, a matching internal ticket is also created for legacy UI compatibility
## Microsoft Entra SSO (Current State)

### Status

- Implemented but marked **untested in Backupchecks**.

### Routes

- `GET /auth/entra/login` – starts the Entra auth-code flow.
- `GET /auth/entra/callback` – exchanges the code, maps/provisions the local user, logs in the session.
- `/auth/logout` – Entra-aware logout redirect when the user authenticated via Entra.

### Access Controls

- Optional tenant/domain restriction (`entra_allowed_domain`).
- Optional Entra security-group allowlist (`entra_allowed_group_ids`) based on group object IDs.
- A group overage or missing groups claim intentionally blocks login when the group gate is enabled.

### Local User Mapping

- Primary mapping by `preferred_username`/UPN/email.
- Optional auto-provisioning (`entra_auto_provision_users`) creates local Viewer users for unknown identities.

### Documentation

- Built-in docs page: `/documentation/settings/entra-sso`
- Includes configuration steps and an explicit untested warning.
## Navbar Notes (Latest)

- To reduce split-screen overflow, the nav is compacted by grouping:
  - admin-only links under an `Admin` dropdown
  - secondary non-admin links under a `More` dropdown
- Primary operational links remain visible (notably `Run Checks`).
- The Viewer role now exposes `Customers` and `Jobs` directly in the navbar.
Additional Autotask ticket details (continuing the "Two Ticket Types" list above):

- Have an `autotask_ticket_deleted_at` field for deletion tracking
- Resolution is tracked via the matching internal ticket's `resolved_at` field
- **Auto-propagation**: linked to new runs via a two-strategy approach
### Ticket Propagation to New Runs

When a new JobRun is created (via email import OR missed-run generation), `link_open_internal_tickets_to_run` ensures:

**Strategy 1: internal ticket linking**

- The query finds tickets where `COALESCE(ts.resolved_at, t.resolved_at) IS NULL`
- Creates `ticket_job_runs` links automatically
- Tickets remain visible until explicitly resolved
- **NO date-based logic**: resolved means immediately hidden from new runs
**Strategy 2: Autotask ticket propagation (independent)**

1. Check whether an internal ticket code exists → find the matching Autotask run → copy the ticket info
2. If there is no match, directly search for the most recent Autotask ticket on the job where:
   - `autotask_ticket_deleted_at IS NULL` (not deleted in the PSA)
   - the internal ticket's `resolved_at IS NULL` (not resolved in the PSA)
3. Copy `autotask_ticket_id`, `autotask_ticket_number`, `created_at`, `created_by_user_id` to the new run
### Where Ticket Linking is Called

`link_open_internal_tickets_to_run` is invoked in three locations:

1. **Email-based runs**: `routes_inbox.py` and `mail_importer.py`, after creating a JobRun from a parsed email
2. **Missed runs**: `routes_run_checks.py` in `_ensure_missed_runs_for_job`, after creating missed JobRun records
   - Weekly schedule: after creating the weekly missed run (with a flush to get `run.id`)
   - Monthly schedule: after creating the monthly missed run (with a flush to get `run.id`)
   - **Critical**: without this call, missed runs don't get ticket propagation!
### Display Logic - Link-Based System

All pages use **explicit link-based queries** (no date-based logic):

**Job Details page:**

- **Two sources** for ticket display:
  1. Direct links (`ticket_job_runs WHERE job_run_id = X`) → always shown (audit trail)
  2. Active window (`ticket_scopes WHERE job_id = Y AND resolved_at IS NULL`) → only unresolved
- Result: old runs keep their ticket references; new runs don't get resolved tickets

**Run Checks main page (🎫 indicators):**

- Query: `ticket_scopes JOIN tickets WHERE job_id = X AND resolved_at IS NULL`
- Only shows the indicator if unresolved tickets exist for the job

**Run Checks popup modal:**

- API: `/api/job-runs/<run_id>/alerts`
- **Two-source ticket display**:
  1. Direct links: `tickets JOIN ticket_job_runs WHERE job_run_id = X`
  2. Job-level scope: `tickets JOIN ticket_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date`
- Prevents duplicates by tracking seen ticket IDs
- Shows newly created tickets immediately (via scope) without waiting for a resolve action
- **Two-source remark display**:
  1. Direct links: `remarks JOIN remark_job_runs WHERE job_run_id = X`
  2. Job-level scope: `remarks JOIN remark_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date` (with a timezone-safe fallback from `start_date`)
- Prevents duplicates by tracking seen remark IDs
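The two-source merge with seen-ID deduplication can be sketched in a few lines. This is illustrative: tickets are plain dicts here, dates are ISO strings (which compare correctly as strings), and the real code issues the SQL joins described above:

```python
def collect_modal_tickets(direct_links, scoped, run_date):
    """Merge the two ticket sources for the Run Checks modal.

    direct_links: tickets already linked to this run (source 1).
    scoped: tickets in the job-level scope (source 2); only unresolved
    ones whose active window has started are included.
    """
    seen, out = set(), []
    for t in direct_links:                  # source 1: explicit run links
        if t["id"] not in seen:
            seen.add(t["id"])
            out.append(t)
    for t in scoped:                        # source 2: unresolved job scope
        if (t["resolved_at"] is None
                and t["active_from_date"] <= run_date
                and t["id"] not in seen):   # dedupe against source 1
            seen.add(t["id"])
            out.append(t)
    return out


demo = collect_modal_tickets(
    [{"id": 1, "resolved_at": None, "active_from_date": "2026-01-01"}],
    [{"id": 2, "resolved_at": None, "active_from_date": "2026-02-01"}],
    "2026-02-16",
)
print([t["id"] for t in demo])  # [1, 2]
```

Direct links always win, which is what preserves the audit trail: a resolved ticket stays visible on the runs it was actually linked to, but never enters via the scope query.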
### Debug Logging for Ticket Linking (Reference)

If you need to debug ticket-linking issues, add this to `link_open_internal_tickets_to_run` in `ticketing_utils.py` after the rows query:
```python
try:
    from .models import AuditLog

    details = []
    if rows:
        for tid, code, t_resolved, ts_resolved in rows:
            details.append(
                f"ticket_id={tid}, code={code}, "
                f"t.resolved_at={t_resolved}, ts.resolved_at={ts_resolved}"
            )
    else:
        details.append("No open tickets found for this job")
    audit = AuditLog(
        user="system",
        event_type="ticket_link_debug",
        message=(
            f"link_open_internal_tickets_to_run called: "
            f"run_id={run.id}, job_id={job.id}, found={len(rows)} ticket(s)"
        ),
        details="\n".join(details),
    )
    db.session.add(audit)
    db.session.commit()
except Exception:
    # Debug logging must never break the ticket linking itself
    pass
```
Visible on the Logging page under `event_type = "ticket_link_debug"`. Remove after debugging.
### Resolved vs Deleted

- **Resolved**: the ticket was completed in Autotask (tracked in the internal `tickets.resolved_at`)
  - Stops propagating to new runs
  - The ticket still exists in the PSA
  - Synced via PSA polling
- **Deleted**: the ticket was removed from Autotask (tracked in `job_runs.autotask_ticket_deleted_at`)
  - Also stops propagating
  - The ticket no longer exists in the PSA
  - A rare operation
### Critical Rules

- ❌ **NEVER** use date-based resolved logic such as `resolved_at >= run_date` OR `active_from_date <= run_date`
- ✅ Only show tickets that are ACTUALLY LINKED via the `ticket_job_runs` table
- ✅ Resolved tickets stop linking immediately when resolved
- ✅ Old links are preserved for the audit trail (visible on old runs)
- ✅ All queries must use an explicit JOIN to the link tables
- ✅ Consistency: all pages use the same "resolved = NULL" logic
- ✅ **CRITICAL**: preserve the description field during Autotask updates; "description" must be included in the `optional_fields` list
## UI and UX Notes

### Navbar

- Fixed-top positioning
- Collapses on mobile (hamburger menu)
- Dynamic padding adjustment via JavaScript (measures the navbar height, adjusts the main content's padding-top)
- Role-based menu items (Admin sees more than Operator/Viewer)
### Status Badges

- Success: green
- Warning: yellow/orange
- Failed/Error: red
- Override applied: blue badge
- Reviewed: checkmark indicator
### Ticket Copy Functionality

- A copy button (⧉) is available on both the Run Checks and Job Details pages
- Allows quick copying of ticket numbers to the clipboard
- Cross-browser compatible with a three-tier fallback mechanism:
  1. **Modern Clipboard API**: `navigator.clipboard.writeText()` – works in modern browsers over HTTPS
  2. **Legacy execCommand**: `document.execCommand('copy')` – fallback for older browsers and Edge
  3. **Prompt fallback**: `window.prompt()` – last resort if clipboard access fails
- Visual feedback: the button changes to a ✓ checkmark for 800 ms after a successful copy
- The implementation uses a hidden textarea for the execCommand method to ensure compatibility
- No user interaction required in modern browsers (direct copy)
### Checkbox Behavior

- All checkboxes on the Inbox and Run Checks pages use `autocomplete="off"`
- Prevents the browser from auto-selecting checkboxes after a page reload
- Fixes an issue where deleting items would cause the same number of new items to become selected
### Customers to Jobs Navigation (2026-02-16)

- The Customers page links each customer name to a filtered Jobs view:
  - `GET /jobs?customer_id=<customer_id>`
- Jobs route behavior:
  - Accepts an optional `customer_id` query parameter in `routes_jobs.py`.
  - If set: returns jobs for that customer only.
  - If not set: keeps the default filter that hides jobs linked to inactive customers.
- Jobs UI behavior:
  - Shows an active-filter banner with the selected customer name.
  - Provides a "Clear filter" action back to the unfiltered `/jobs`.
- Templates touched:
  - `templates/main/customers.html`
  - `templates/main/jobs.html`
### Global Grouped Search (2026-02-16)

- New route:
  - `GET /search` in `main/routes_search.py`
- New UI:
  - Navbar search form in `templates/layout/base.html`
  - Grouped result page in `templates/main/search.html`
- Search behavior:
  - Case-insensitive matching (`ILIKE`).
  - The `*` wildcard is supported and translated to SQL `%`.
  - Automatic "contains" behavior is applied per term (`*term*`) when no wildcard is explicitly set.
  - Multi-term queries use AND across terms and OR across the configured columns within each section.
  - Per-section pagination is supported via query params: `p_inbox`, `p_customers`, `p_jobs`, `p_daily_jobs`, `p_run_checks`, `p_tickets`, `p_remarks`, `p_overrides`, `p_reports`.
  - Pagination keeps the search state for all sections while browsing one section.
  - "Open <section>" links pass `q` to the destination overview pages so that page-level filtering matches the search term.
- Grouped sections:
  - Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Remarks, Existing overrides, Reports.
- Daily Jobs search result details:
  - Meta now includes the expected run time, a success indicator, and the run count for the selected day.
  - The link now opens Daily Jobs with modal auto-open using the `open_job_id` query parameter (the same modal flow as clicking a row in Daily Jobs).
- Access control:
  - Search results are role-aware and only show sections/data the active role can access.
  - `run_checks` results are restricted to `admin`/`operator`.
  - `reports` supports `admin`/`operator`/`viewer`/`reporter`.
- Current performance strategy:
  - Per-section limit (`SEARCH_LIMIT_PER_SECTION = 10`), with a total count per section.
  - No schema migration required for V1.
## Feedback Module with Screenshots

- Models: `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`.
- Attachments:
  - Multiple uploads, type validation, per-file size limits, storage in the database (BYTEA).
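A minimal sketch of the per-file validation step; the allowed content types and the 5 MB limit here are illustrative assumptions, not the app's actual values:

```python
# Hypothetical attachment validation (allowed types and limit are assumptions).
ALLOWED_TYPES = {"image/png", "image/jpeg", "image/gif"}
MAX_BYTES = 5 * 1024 * 1024

def validate_attachment(filename: str, content_type: str, data: bytes) -> None:
    """Raise ValueError if the uploaded file fails type or size checks."""
    if content_type not in ALLOWED_TYPES:
        raise ValueError(f"{filename}: type {content_type} not allowed")
    if len(data) > MAX_BYTES:
        raise ValueError(f"{filename}: exceeds {MAX_BYTES} bytes")
    # On success, the raw bytes are stored in a BYTEA column.
```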
## Validation Snapshot

- 2026-02-16: Test build + push succeeded via `update-and-build.sh t`.
  - Pushed image: `gitea.oskamp.info/ivooskamp/backupchecks:dev`.
- 2026-02-16: Test build + push succeeded on branch `v20260216-02-global-search`.
  - Pushed image digest: `sha256:6996675b9529426fe2ad58b5f353479623f3ebe24b34552c17ad0421d8a7ee0f`.
- 2026-02-16: Additional test build + push cycles succeeded on `v20260216-02-global-search`.
  - Latest pushed image digest: `sha256:8ec8bfcbb928e282182fa223ce8bf7f92112d20e79f4a8602d015991700df5d7`.
- 2026-02-16: Additional test build + push cycles succeeded after search enhancements.
  - Latest pushed image digest: `sha256:b36b5cdd4bc7c4dadedca0534f1904a6e12b5b97abc4f12bc51e42921976f061`.
- Delete strategy:
  - Soft delete by default.
  - Permanent delete only for admins, and only after soft delete.

## Deployment and Operations

- Stack exposes:
  - app on `8080`
  - adminer on `8081`
- PostgreSQL persistent volume:
  - `/docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data`
- `deploy/backupchecks-stack.yml` also contains example `.env` variables at the bottom.
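A condensed compose sketch of the stack shape described above; only the ports and the volume path come from these notes, while the service names, image choices, and internal ports are assumptions, not the actual contents of `deploy/backupchecks-stack.yml`:

```yaml
# Illustrative sketch only; see deploy/backupchecks-stack.yml for the real stack.
services:
  app:
    image: gitea.oskamp.info/ivooskamp/backupchecks:dev
    ports:
      - "8080:8080"
  adminer:
    image: adminer
    ports:
      - "8081:8080"
  postgres:
    image: postgres
    volumes:
      - /docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data
```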
## Build/Release Flow

File: `build-and-push.sh`

- Bump options:
  - `1` patch, `2` minor, `3` major, `t` test.
- Release build:
  - Update `version.txt`.
  - Commit + tag + push.
  - Docker push of `:<version>`, `:dev`, `:latest`.
- Test build:
  - Only `:dev`.
  - No commit/tag.
- Services are discovered under `containers/*` with a Dockerfile per service.
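The bump options map onto standard semantic-version arithmetic; a Python restatement of that rule (the function is illustrative, the actual logic is shell in `build-and-push.sh`):

```python
def bump(version: str, kind: str) -> str:
    """Bump a vMAJOR.MINOR.PATCH string: '1' patch, '2' minor, '3' major.
    Test builds ('t') do not touch version.txt at all."""
    major, minor, patch = map(int, version.lstrip("v").split("."))
    if kind == "1":
        patch += 1
    elif kind == "2":
        minor, patch = minor + 1, 0
    elif kind == "3":
        major, minor, patch = major + 1, 0, 0
    else:
        raise ValueError("test builds do not bump the version")
    return f"v{major}.{minor}.{patch}"
```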
## Technical Observations / Attention Points

- `README.md` is currently empty; quick-start entry context is missing.
- `LICENSE` is currently empty.
- `docs/architecture.md` is currently empty.
- `deploy/backupchecks-stack.yml` contains hardcoded example values (`Changeme`), which is a risk if used without proper secrets management.
- The app performs DB initialization + migrations at startup; for larger schema changes this can impact startup time and robustness.
- There is significant parser and ticketing complexity; route changes carry regression risk without targeted testing.
- For Autotask update calls, the `description` field must be explicitly preserved to prevent an unintended NULL overwrite.
- Security hygiene remains important:
  - No customer names in parser examples/source.
  - No hardcoded credentials.
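The Autotask `description` caveat above amounts to always carrying the current value forward in update payloads; a minimal sketch under assumed payload shapes (the dict-based helper is hypothetical, not the actual Autotask client code):

```python
def build_update_payload(existing: dict, changes: dict) -> dict:
    """Merge an update over an existing ticket, always preserving
    `description` unless the update sets it explicitly; a missing value
    would otherwise be written back as NULL."""
    payload = dict(changes)
    payload.setdefault("description", existing.get("description"))
    return payload
```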
## Quick References

- App entrypoint: `containers/backupchecks/src/backend/app/main.py`
- App factory: `containers/backupchecks/src/backend/app/__init__.py`
- Config: `containers/backupchecks/src/backend/app/config.py`
- Models: `containers/backupchecks/src/backend/app/models.py`
- Parsers: `containers/backupchecks/src/backend/app/parsers/registry.py`
- Ticketing utilities: `containers/backupchecks/src/backend/app/ticketing_utils.py`
- Run Checks routes: `containers/backupchecks/src/backend/app/main/routes_run_checks.py`
- Cove importer: `containers/backupchecks/src/backend/app/cove_importer.py`
- Cove routes: `containers/backupchecks/src/backend/app/main/routes_cove.py`
- Compose stack: `deploy/backupchecks-stack.yml`
- Build script: `build-and-push.sh`

## Recent Changes

### 2026-02-23

- **Cove Data Protection full integration**:
  - `cove_importer.py` – Cove API client (login, paginated enumeration, status mapping, deduplication, per-datasource object persistence).
  - `cove_importer_service.py` – background polling thread (same pattern as `auto_importer_service.py`).
  - `CoveAccount` staging model + `migrate_cove_accounts_table()` migration.
  - `SystemSettings` – 8 new Cove fields; `Job` – `cove_account_id`; `JobRun` – `source_type` + `external_id`.
  - `routes_cove.py` – inbox-style `/cove/accounts` with link/unlink routes.
  - `cove_accounts.html` – unmatched accounts shown first with Bootstrap modals (create job / link to existing); matched accounts with an Unlink action.
  - Settings > Integrations: Cove section with test connection (AJAX) and a manual import trigger.
  - Navbar: "Cove Accounts" link for admin/operator when `cove_enabled`.
- **Cove API key findings** (from test script + N-able support):
  - The visa is returned at the top level of the login response, not inside `result`.
  - Settings per account are a list of single-key dicts `[{"D09F00":"5"}, ...]`; flatten with `flat.update(item)`.
  - EnumerateAccountStatistics params must use a lowercase `query` key and `RecordsCount` (not `RecordCount`).
  - Login params must use lowercase `username`/`password`.
  - D02/D03 are legacy; use D10/D11 or D09 (Total) instead.
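The settings-flattening finding in action (`D09F00` is from the response sample above; the other keys are illustrative placeholders):

```python
# Cove returns per-account settings as a list of single-key dicts;
# flatten them into one dict for straightforward key lookup.
settings = [{"D09F00": "5"}, {"EXAMPLE1": "1"}, {"EXAMPLE2": "0"}]

flat = {}
for item in settings:
    flat.update(item)
```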
### 2026-02-19

- **Added 3CX Update parser support**: `threecx.py` now recognizes the subject `3CX Notification: Update Successful - <host>` and stores it as informational with:
  - `backup_software = 3CX`
  - `backup_type = Update`
  - `overall_status = Success`
- **3CX informational schedule behavior**:
  - `3CX / Update` and `3CX / SSL Certificate` are excluded from schedule inference in `routes_shared.py` (no Expected/Missed generation).
- **Run Checks visibility scope (3CX-only)**:
  - Run Checks now hides only non-backup 3CX informational jobs (`Update`, `SSL Certificate`).
  - Other backup software/types remain visible and unchanged.
- **Fixed remark visibility mismatch**:
  - `/api/job-runs/<run_id>/alerts` now loads remarks from both:
    1. `remark_job_runs` (explicit run links), and
    2. `remark_scopes` (active job-scoped remarks),
    with duplicate prevention by remark ID.
  - This resolves cases where the remark indicator appeared but remarks were not shown in the Run Checks modal or the Job Details modal.
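The two-source merge with de-duplication by remark ID can be sketched with plain lists standing in for the `remark_job_runs` and `remark_scopes` query results (the function name is illustrative):

```python
def merge_remarks(run_linked, job_scoped):
    """Combine both remark sources, keeping the first occurrence per ID
    so a remark linked by run AND in scope appears only once."""
    seen, merged = set(), []
    for remark in list(run_linked) + list(job_scoped):
        if remark["id"] not in seen:
            seen.add(remark["id"])
            merged.append(remark)
    return merged
```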
### 2026-02-13

- **Fixed missed-runs ticket propagation**: Added `link_open_internal_tickets_to_run` calls in `_ensure_missed_runs_for_job` (`routes_run_checks.py`) after creating both weekly and monthly missed JobRun records. Previously only email-based runs got ticket linking, so missed runs showed neither internal tickets nor Autotask tickets. Required `db.session.flush()` before linking to ensure `run.id` is available.
- **Fixed checkbox auto-selection**: Added `autocomplete="off"` to all checkboxes on the Inbox and Run Checks pages. Prevents the browser from automatically re-selecting checkboxes after the page reload that follows delete actions.
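The flush-before-link requirement comes down to needing the autogenerated primary key before the transaction commits. The same idea with stdlib `sqlite3` (hypothetical table shapes; SQLAlchemy's `flush()` plays the role of executing the INSERT early):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_runs (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("CREATE TABLE ticket_job_runs (ticket_id INTEGER, run_id INTEGER)")

cur = conn.execute("INSERT INTO job_runs (status) VALUES ('Missed')")
run_id = cur.lastrowid  # available before commit, like run.id after flush()
conn.execute("INSERT INTO ticket_job_runs VALUES (?, ?)", (42, run_id))
conn.commit()
```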
### 2026-02-12

- **Fixed Run Checks modal ticket display**: Implemented two-source display logic (`ticket_job_runs` + `ticket_scopes`). Previously tickets only appeared after they were resolved (when the `ticket_job_runs` entry was created); now they show immediately upon creation via the scope query.
- **Fixed copy button in Edge**: Moved the clipboard functions inside the IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution).
### 2026-02-10

- **Added screenshot support to the Feedback system**: Multiple file upload, inline display, two-stage delete (soft delete for the audit trail, permanent delete for cleanup).
- **Completed the transition to the link-based ticket system**: All pages now use JOIN queries; no date-based logic remains. Added cross-browser copy-ticket functionality with a three-tier fallback mechanism to both the Run Checks and Job Details pages.
- `version.txt`: bumped from `v0.1.26` to `v0.1.27`.