Compare commits


No commits in common. "main" and "v0.2.1" have entirely different histories.
main ... v0.2.1

48 changed files with 367 additions and 2689 deletions

View File

@ -1 +1 @@
-v20260402-01
+25

View File

@ -2,7 +2,7 @@
**Branch:** `v20260206-10-audit-logging-expansion`
**Date:** 2026-02-07
-**Status:** Fully completed on 2026-03-26 (Part 1 + Part 2)
+**Status:** Part 1 complete, Part 2 still to do
---
@ -23,9 +23,7 @@
---
-## ✅ Part 2 completed
-All of the points below have been carried out and are included in the codebase and changelog of this release.
+## 🔄 Still to do (Part 2)
### 1. UI Updates

View File

@ -1,28 +1,11 @@
# TODO: Cove Data Protection Integration
**Date:** 2026-02-23
-**Status:** COMPLETED (implemented)
+**Status:** Research COMPLETED — Ready for implementation
**Priority:** Medium
---
-## ✅ Completion Update (2026-03-26)
-This TODO is now completed and superseded by the implemented Cove integration in the codebase.
-Implemented scope includes:
-- Cove API client + importer with deduplication and status mapping
-- Scheduled background polling service with configurable interval
-- Database/model support (`cove_accounts`, `job_runs.source_type`, `job_runs.external_id`, Cove settings fields)
-- Cove settings flows (`test-connection`, `run-now`, credentials + partner ID handling)
-- Cove Accounts inbox-style UI and link/unlink workflow
-- Run Checks + Job Detail Cove run presentation
-- Historical run backfill via 28-day colorbar (`D09F08`)
-Remaining items in this file under "Nice to Have" are optional future enhancements, not blockers for Cove integration completion.
----
## 🎯 Goal
Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring via scheduled API polling. The integration runs server-side within the Backupchecks web application.
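The goal above implies a recurring server-side poll. A minimal sketch of such a loop, assuming only the run_cove_import(settings) entry point visible in the importer diff further down; the thread-based scheduling and interval handling here are illustrative, not the app's actual service:

import threading
import time

def start_cove_polling(settings, interval_minutes: int = 15) -> threading.Thread:
    # Illustrative poller: call the importer, then sleep for the configured interval.
    def _loop() -> None:
        while True:
            try:
                run_cove_import(settings)  # importer entry point shown in this compare
            except Exception:
                pass  # a failed poll must not kill the loop; the next tick retries
            time.sleep(interval_minutes * 60)

    worker = threading.Thread(target=_loop, daemon=True)
    worker.start()
    return worker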

View File

@ -1,179 +1,9 @@
# TODO: Documentation System
-**Branch:** `v20260326-02-documentation-audit-batch3`
+**Branch:** `v20260207-02-wiki-documentation`
**Date Started:** 2026-02-07
-**Date Updated:** 2026-03-26 (Latest: Batch 3 Settings documentation completed)
-**Status:** In Progress - documentation audit batches 1-3 completed; remaining: Mail Import re-validation, Troubleshooting pages, Sidebar Layout v2 consistency pass
+**Date Updated:** 2026-02-08 (Latest: Per-job review corrections)
+**Status:** In Progress - 19 of 33 pages complete (58%)
----
-## ✅ Validation Update (2026-03-26)
-This TODO was re-checked against the current codebase and documentation templates.
-Findings:
-- The progress metrics in this file are outdated (`19/33`) and no longer reflect the repository state.
-- The documentation template tree currently contains **39 files** under `containers/backupchecks/src/templates/documentation`.
-- Because the app layout and multiple workflows changed after the original documentation wave, **all documentation pages require content review** for functional correctness (not only layout styling).
-Decision:
-- Keep this TODO active.
-- Treat this as a full documentation audit task: verify each page against current UI/routes/behavior and update screenshots/text where needed.
----
-## 🧾 Review Checklist (2026-03-26)
-Use this list for the full content review against the current application.
-**Legend**
-- `P1` = high priority (critical workflows / highest chance of deviations)
-- `P2` = normal priority
-- `P3` = low priority
-### Global checks (for every page)
-- [ ] Route, page title, and navigation path match the current UI
-- [ ] Terminology matches the current labels/buttons in the app
-- [ ] Screenshots are up to date (new layout) or replaced/removed
-- [ ] Text does not reference removed/changed functionality
-- [ ] Role-based behavior (admin/operator/viewer/reporter) is correct
-### Missing documentation topics (new pages required)
-- [x] `[P1]` Document the Cove Accounts + Cove run detail flow
-- [x] `[P1]` Document the Veeam Cloud Connect accounts/run flow
-- [x] `[P1]` Update the Run Checks Autotask "Link existing" behavior (incl. cross-company)
-- [ ] `[P1]` Apply Sidebar layout v2 consistently across all docs
-### Page-by-page review
-#### Getting Started
-- [ ] `[P2]` getting-started/what-is-backupchecks
-- [ ] `[P2]` getting-started/first-login
-- [ ] `[P2]` getting-started/quick-start
-#### User Management
-- [ ] `[P2]` users/users-and-roles
-- [ ] `[P2]` users/login-authentication
-- [ ] `[P2]` users/profile-settings
-#### Customers & Jobs
-- [ ] `[P2]` customers-jobs/managing-customers
-- [ ] `[P2]` customers-jobs/configuring-jobs
-- [ ] `[P2]` customers-jobs/approved-jobs
-- [ ] `[P2]` customers-jobs/job-schedules
-#### Mail & Import
-- [ ] `[P1]` mail-import/setup
-- [ ] `[P1]` mail-import/inbox-management
-- [ ] `[P1]` mail-import/mail-parsing
-- [ ] `[P1]` mail-import/auto-import
-#### Backup Review
-- [ ] `[P1]` backup-review/approving-backups
-- [ ] `[P1]` backup-review/daily-jobs
-- [ ] `[P1]` backup-review/run-checks-modal
-- [ ] `[P1]` backup-review/overrides
-- [ ] `[P1]` backup-review/remarks-tickets
-#### Reports
-- [ ] `[P1]` reports/creating-reports
-- [ ] `[P1]` reports/relative-periods
-- [ ] `[P1]` reports/scheduling
-- [ ] `[P1]` reports/exporting-data
-#### Autotask Integration
-- [x] `[P1]` autotask/setup-configuration
-- [x] `[P1]` autotask/company-mapping
-- [x] `[P1]` autotask/creating-tickets
-- [x] `[P1]` autotask/ticket-management
-#### Settings
-- [x] `[P1]` settings/general
-- [x] `[P1]` settings/mail-configuration
-- [x] `[P1]` settings/autotask-integration
-- [x] `[P1]` settings/entra-sso
-- [x] `[P1]` settings/reporting-settings
-- [x] `[P1]` settings/user-management
-- [x] `[P1]` settings/maintenance
-#### Troubleshooting
-- [ ] `[P2]` troubleshooting/common-issues
-- [ ] `[P2]` troubleshooting/faq
-- [ ] `[P2]` troubleshooting/support-contact
----
-## 🔎 Batch 1 Findings (P1) — 2026-03-26
-### A. Immediate correctness fixes (existing content)
-- [x] `documentation/backup-review/daily-jobs.html`
-  - Remove/replace the incorrect claim that successful jobs are automatically reviewed.
-  - Align workflow text with current behavior: review is handled via Run Checks job-level review.
-- [x] `documentation/backup-review/approving-backups.html`
-  - Replace the wording "select multiple runs" with "select multiple jobs" where bulk review is described.
-  - Re-verify the Daily Jobs vs Run Checks role split text for the current operational flow.
-- [x] `documentation/backup-review/run-checks-modal.html`
-  - Fix broken cross-link: `url_for('documentation.page', section='autotask', page='overview')` does not exist.
-  - Replace with valid links to existing Autotask pages.
-- [x] `documentation/backup-review/remarks-tickets.html`
-  - Fix the same broken `autotask/overview` link.
-  - Re-check the Autotask behavior section against the current link-existing/create/resolve-note flow.
-### B. Placeholder pages that require full rewrite (currently "Coming Soon")
-- [ ] `documentation/reports/creating-reports.html`
-- [ ] `documentation/reports/relative-periods.html`
-- [ ] `documentation/reports/scheduling.html`
-- [ ] `documentation/reports/exporting-data.html`
-- [x] `documentation/autotask/setup-configuration.html`
-- [x] `documentation/autotask/company-mapping.html`
-- [x] `documentation/autotask/creating-tickets.html`
-- [x] `documentation/autotask/ticket-management.html`
-- [x] `documentation/settings/general.html`
-- [x] `documentation/settings/mail-configuration.html`
-- [x] `documentation/settings/autotask-integration.html`
-- [x] `documentation/settings/reporting-settings.html`
-- [x] `documentation/settings/user-management.html`
-- [x] `documentation/settings/maintenance.html`
-### C. Pages with content present but requiring targeted re-validation
-- [x] `documentation/settings/entra-sso.html`
-  - Verify navigation path labels (Integrations wording/layout) against the current settings UI.
-  - Keep the "untested" warning unless production validation has been completed.
-- [ ] `documentation/mail-import/setup.html`
-  - Re-check the exact settings navigation wording and folder-browser flow against current UI labels.
-- [ ] `documentation/mail-import/auto-import.html`
-  - Re-check references to the Logging page path/wording and Imports section labels.
----
-## 🔎 Batch 2 Findings (P1) — 2026-03-26 (Completed)
-### A. Autotask docs rewritten from placeholders
-- [x] `documentation/autotask/setup-configuration.html`
-- [x] `documentation/autotask/company-mapping.html`
-- [x] `documentation/autotask/creating-tickets.html`
-- [x] `documentation/autotask/ticket-management.html`
-### B. Autotask behavior alignment fixes
-- [x] Documented `Link existing` cross-company behavior for shared/umbrella tickets.
-- [x] Removed references to the non-existent `autotask/overview` page and replaced broken links.
-- [x] Re-validated ticket lifecycle notes (create/link/resolve note) against current Run Checks behavior.
----
-## 🔎 Batch 3 Findings (P1) — 2026-03-26 (Completed)
-### A. Settings docs rewritten from placeholders
-- [x] `documentation/settings/general.html`
-- [x] `documentation/settings/mail-configuration.html`
-- [x] `documentation/settings/autotask-integration.html`
-- [x] `documentation/settings/reporting-settings.html`
-- [x] `documentation/settings/user-management.html`
-- [x] `documentation/settings/maintenance.html`
-### B. Settings content alignment notes
-- [x] Re-validated `documentation/settings/entra-sso.html` against current Settings -> Integrations navigation and field names.
-- [x] Reporting page updated to explicitly document the current status: no dedicated Reporting settings card is available in Settings.
-- [x] Removed all placeholder text from Settings documentation pages.
---
@ -237,17 +67,17 @@ Use this list for the full content review against the current application.
### Remaining Work 🚧
-**Phase 4: Advanced Features (10/14 pages complete)**
+**Phase 4: Advanced Features (0/14 pages - PLACEHOLDER)**
- Reports (0/4 pages)
-- Autotask Integration (4/4 pages - COMPLETE)
-- Settings (6/6 pages - COMPLETE)
+- Autotask Integration (0/4 pages)
+- Settings (0/6 pages)
- Troubleshooting (0/3 pages)
**Progress Summary:**
-- ✅ Batch 1 documentation updates completed (Integrations + critical Run Checks wording/link fixes).
-- ✅ Batch 2 documentation updates completed (Autotask section rewritten and aligned with current behavior).
-- ✅ Batch 3 documentation updates completed (Settings section rewritten and revalidated).
-- ⏳ Remaining focus: Mail Import re-validation pages, Troubleshooting pages, and final Sidebar Layout v2 consistency pass.
+- ✅ 19 of 33 pages complete (58%)
+- ✅ 10 screenshots added
+- ✅ All completed pages reviewed and corrected based on actual UI
+- ⏳ 14 pages remaining (placeholders created)
---
@ -1163,7 +993,7 @@ Add to navigation menu (after existing items):
- [x] Fix CSS image centering
- [x] Add dark mode support
-**Status:** In Progress - documentation audit batches 1-3 completed; remaining: Mail Import re-validation, Troubleshooting pages, Sidebar Layout v2 consistency pass
+**Status:** COMPLETE
**Time Spent:** ~4 hours
### Phase 2: Content Pages - Getting Started ✅ COMPLETE
@ -1172,7 +1002,7 @@ Add to navigation menu (after existing items):
- [x] Quick Start Checklist
- [x] Take screenshots for Getting Started section
-**Status:** In Progress - documentation audit batches 1-3 completed; remaining: Mail Import re-validation, Troubleshooting pages, Sidebar Layout v2 consistency pass
+**Status:** COMPLETE
**Time Spent:** ~6 hours
### Phase 3: Content Pages - Core Features ✅ COMPLETE (16 of 16 complete)
@ -1198,7 +1028,7 @@ Add to navigation menu (after existing items):
- [x] Remarks & Tickets
- [x] Take screenshots for core features (10 screenshots added)
-**Status:** In Progress - documentation audit batches 1-3 completed; remaining: Mail Import re-validation, Troubleshooting pages, Sidebar Layout v2 consistency pass
+**Status:** COMPLETE (16/16 pages)
**Time Spent:** ~21 hours
### Phase 4: Content Pages - Advanced Features (0/14 pages)

View File

@ -6,18 +6,6 @@
---
-## ✅ Validation Update (2026-03-26)
-Status re-checked against current codebase: this TODO is still open and not implemented yet.
-Confirmed as NOT present:
-- `operator_notifications` table/model
-- Notification inbox routes/UI (e.g. `/notifications`)
-- Notification lifecycle audit events (`notification_created`, `notification_read`, `notification_handled`)
-- Dedicated mailbox alias ingestion flow for `backups+notification@...`
----
## 🎯 Goal
Create a notification flow in which colleagues send an email to `backups+notification@...`, after which Backupchecks picks up these messages and informs the operator in the application (optionally also via email).
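A minimal sketch of the alias matching this flow would need; the helper name is hypothetical, and only the `backups+notification@...` address form comes from the TODO itself:

import re

NOTIFICATION_ALIAS_RE = re.compile(r"^backups\+notification@", re.IGNORECASE)

def is_notification_mail(recipients: list[str]) -> bool:
    # True when any recipient uses the +notification plus-address alias.
    return any(NOTIFICATION_ALIAS_RE.match(addr.strip()) for addr in recipients)

is_notification_mail(["backups+notification@example.com"])  # -> True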

View File

@ -6,20 +6,6 @@
---
-## ✅ Validation Update (2026-03-26)
-Status re-checked against current codebase: this TODO is still open and largely not implemented yet.
-Confirmed as NOT present (core planned scope):
-- Reporting settings extension in Settings (`reporting_*` fields, branding/email configuration)
-- Relative period engine (`period_type`, `relative_period`, timezone-aware calculator)
-- Report model extensions for scheduling metadata and per-report email template fields
-- Scheduling execution/retry flow for automatic report delivery
-Note: only small earlier UI items at the top of this TODO appear completed; the main roadmap sections remain open.
----
## ✅ What has already been done
- ✅ Scheduling placeholder removed from the reports overview page (reports.html)

View File

@ -3,109 +3,6 @@ Changelog data structure for Backupchecks
"""
CHANGELOG = [
-    {
-        "version": "v0.2.5",
-        "date": "2026-04-13",
-        "summary": "Consolidated release since v0.2.4 with manual schedule overrides, Autotask/remark synchronization improvements, Run Checks stability fixes, and refreshed operational documentation.",
-        "sections": [
-            {
-                "title": "Added",
-                "type": "feature",
-                "changes": [
-                    "Manual schedule override support in Job Details (daily/weekly/monthly) with save/clear endpoint POST /jobs/<job_id>/schedule",
-                    "Job Details now shows First backup detected based on earliest non-missed run",
-                    "Remarks now support source and ticket_id metadata with migration and indexes",
-                    "Autotask resolution text can be mirrored to active internal remarks with source=autotask_resolution and deduplication",
-                    "Documentation Integrations section added with dedicated Cove Data Protection and Veeam Cloud Connect pages"
-                ]
-            },
-            {
-                "title": "Changed",
-                "type": "improvement",
-                "changes": [
-                    "Effective schedule resolution is now manual-first across Daily Jobs, Dashboard, Search, Job Details, and Run Checks missed-run generation",
-                    "Missed-run grace window increased from +/- 1 hour to +/- 3 hours in Run Checks and Daily Jobs",
-                    "Schedule inference now also includes Cove API runs (source_type=cove_api) in addition to mail-based runs",
-                    "Run Checks now suppresses repeated Cove runs on the same local day after the first complete success run for that job/day",
-                    "Ticket API active-state logic now uses effective status from both ticket-level and scope-level resolution",
-                    "Settings/Autotask documentation pages were rewritten from placeholder content to current operational guidance"
-                ]
-            },
-            {
-                "title": "Fixed",
-                "type": "bugfix",
-                "changes": [
-                    "Run Checks modal mail visibility no longer remains hidden after navigating from Cove runs",
-                    "Run Checks modal responsive behavior improved for smaller viewports so scrolling/content access remains usable",
-                    "Run Checks Link existing Autotask ticket supports cross-company shared/umbrella tickets while preserving validation checks",
-                    "Ticket copy action in Run Checks and Job Detail hardened with improved click handling and clipboard fallback",
-                    "Autotask unresolved-ticket propagation to new runs fixed for edge cases where internal open-ticket rows are temporarily absent"
-                ]
-            }
-        ]
-    },
-    {
-        "version": "v0.2.4",
-        "date": "2026-03-26",
-        "summary": "Hotfix release that restores support for linking existing umbrella/shared Autotask tickets across companies from Run Checks.",
-        "sections": [
-            {
-                "title": "Fixed",
-                "type": "bugfix",
-                "changes": [
-                    "Run Checks Link existing Autotask ticket no longer blocks cross-company tickets when the selected ticket company differs from the mapped customer company",
-                    "Umbrella/shared Autotask tickets can be linked again while existing safeguards remain: ticket must exist, include a ticket number, and not be terminal/completed"
-                ]
-            }
-        ]
-    },
-    {
-        "version": "v0.2.3",
-        "date": "2026-03-23",
-        "summary": "Update release that improves Autotask existing-ticket linking transparency, Cove object detail propagation, and Customers page link styling consistency.",
-        "sections": [
-            {
-                "title": "Added",
-                "type": "feature",
-                "changes": [
-                    "Autotask Link existing ticket now also posts a ticket note when an additional Backupchecks alert/run is linked, including customer, job, run context and a Backupchecks deep-link"
-                ]
-            },
-            {
-                "title": "Changed",
-                "type": "improvement",
-                "changes": [
-                    "Link existing Autotask API response now includes note_posted and note_warning fields for operator visibility",
-                    "Customers page customer-name links to filtered Jobs now use sidebar-matching text and hover styling instead of default blue link styling"
-                ]
-            },
-            {
-                "title": "Fixed",
-                "type": "bugfix",
-                "changes": [
-                    "Autotask object retrieval for ticket composition now reads objects from run_object_links/customer_objects first (same source as Run Checks UI), then falls back to legacy job_objects/mail_objects, preventing missing object details",
-                    "Cove Accounts and Cove-run object visibility improved by consistently using persisted run object links as primary object source, aligning operator view and ticket details",
-                    "Autotask ticket affected-objects list now includes only problem objects (failed/error/warning/missed) and excludes completed/success objects"
-                ]
-            }
-        ]
-    },
-    {
-        "version": "v0.2.2",
-        "date": "2026-03-23",
-        "summary": "Hotfix release that corrects Synology Active Backup for Business parsing for completion mails using 'has been' wording, preventing wrong job-name extraction.",
-        "sections": [
-            {
-                "title": "Fixed",
-                "type": "bugfix",
-                "changes": [
-                    "Synology Active Backup for Business parser now recognizes completion mails using 'has been completed' wording",
-                    "Prevents fallback to the generic Synology Active Backup parser that could incorrectly take the bracketed subject prefix as job name",
-                    "ABB mails like 'backup task dc001 on DS220p has been completed' now keep the expected identity: backup_software=Synology, backup_type=Active Backup for Business, job_name=dc001"
-                ]
-            }
-        ]
-    },
    {
        "version": "v0.2.1",
        "date": "2026-03-20",

View File

@ -45,13 +45,6 @@ COVE_COLUMNS = [
    "D23F00", "D23F15", # M365 Teams
]
-# Optional datasource-specific columns for datasources that may be active in I78
-# but are not always available in every tenant/API scope.
-COVE_OPTIONAL_COLUMNS = [
-    "D2F00", "D2F15", # System State
-    "D6F00", "D6F15", # Network Shares
-]
# Mapping from Cove status code to Backupchecks status string
STATUS_MAP: dict[int, str] = {
    1: "Warning", # In process
@ -82,21 +75,14 @@ STATUS_LABELS: dict[int, str] = {
    12: "Restarted",
}
-# Datasource label mapping (Cove datasource code → human-readable label).
-# Keep both canonical and zero-padded aliases so UI summary and importer stay in sync.
+# Datasource label mapping (column prefix → human-readable label)
DATASOURCE_LABELS: dict[str, str] = {
    "D1": "Files & Folders",
-    "D01": "Files & Folders",
-    "D2": "System State",
-    "D02": "System State",
-    "D6": "Network Shares",
-    "D06": "Network Shares",
    "D10": "VssMsSql",
    "D11": "VssSharePoint",
    "D19": "M365 Exchange",
    "D20": "M365 OneDrive",
    "D5": "M365 SharePoint",
-    "D05": "M365 SharePoint",
    "D23": "M365 Teams",
}
@ -161,7 +147,6 @@ def _cove_enumerate(
    partner_id: int,
    start: int,
    count: int,
-    columns: list[str] | None = None,
) -> list[dict]:
    """Call EnumerateAccountStatistics and return a list of account dicts.
@ -177,7 +162,7 @@
                "PartnerId": partner_id,
                "StartRecordNumber": start,
                "RecordsCount": count,
-                "Columns": columns or COVE_COLUMNS,
+                "Columns": COVE_COLUMNS,
            }
        },
    }
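For orientation, the request this hunk edits plausibly assembles into a JSON-RPC envelope like the sketch below; only the query fields are confirmed by the diff, while the wrapper keys and example values are assumptions based on the Cove/N-able JSON-RPC style API:

request_body = {
    "jsonrpc": "2.0",                      # assumed envelope field
    "method": "EnumerateAccountStatistics",
    "visa": "<session token>",             # assumed; visa is passed into _cove_enumerate
    "params": {
        "query": {
            "PartnerId": 12345,            # partner_id
            "StartRecordNumber": 0,        # start
            "RecordsCount": 250,           # count (page_size)
            "Columns": COVE_COLUMNS,       # the main side may add COVE_OPTIONAL_COLUMNS
        }
    },
}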
@ -216,12 +201,6 @@ def _cove_enumerate(
    return []
-def _is_cove_security_column_error(exc: Exception) -> bool:
-    """Return True when Enumerate failed due to restricted columns/security scope."""
-    msg = str(exc or "").lower()
-    return ("security reasons" in msg) or ("13501" in msg)
def _flatten_settings(account: dict) -> dict:
    """Convert the Settings array in an account dict to a flat key→value dict.
@ -237,42 +216,6 @@ def _flatten_settings(account: dict) -> dict:
    return flat
-def _normalize_ds_code(code: str) -> str:
-    """Normalize datasource codes like D01 -> D1."""
-    m = re.fullmatch(r"D(\d{1,2})", (code or "").strip().upper())
-    if not m:
-        return (code or "").strip().upper()
-    return f"D{int(m.group(1))}"
-def _parse_active_datasource_codes(raw: Any) -> list[str]:
-    """Extract unique active datasource codes from I78 (e.g. D01D02D10)."""
-    text = str(raw or "").strip().upper()
-    if not text:
-        return []
-    seen: set[str] = set()
-    out: list[str] = []
-    for code in re.findall(r"D\d{1,2}", text):
-        norm = _normalize_ds_code(code)
-        if not norm or norm in seen:
-            continue
-        seen.add(norm)
-        out.append(norm)
-    return out
-def _label_for_ds_code(code: str) -> str:
-    """Resolve human label for a datasource code (supports canonical + padded aliases)."""
-    norm = _normalize_ds_code(code)
-    if norm in DATASOURCE_LABELS:
-        return DATASOURCE_LABELS[norm]
-    m = re.fullmatch(r"D(\d{1,2})", norm)
-    if m:
-        padded = f"D{int(m.group(1)):02d}"
-        return DATASOURCE_LABELS.get(padded, norm)
-    return DATASOURCE_LABELS.get(code, code)
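Taken together, the removed helpers make the I78 parsing alias-tolerant; the following values follow directly from the definitions above:

_normalize_ds_code("D01")                    # -> "D1"
_parse_active_datasource_codes("D01D02D10")  # -> ["D1", "D2", "D10"]
_label_for_ds_code("D02")                    # -> "System State" (canonical alias hit)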
def _map_status(code: Any) -> str:
    """Map a Cove status code (int) to a Backupchecks status string."""
    if code is None:
@ -493,26 +436,11 @@ def run_cove_import(settings, include_reasons: bool = False):
    page_size = 250
    start = 0
-    base_columns = list(COVE_COLUMNS)
-    extended_columns = base_columns + list(COVE_OPTIONAL_COLUMNS)
-    use_optional_columns = True
    while True:
-        columns = extended_columns if use_optional_columns else base_columns
        try:
-            accounts = _cove_enumerate(url, visa, partner_id, start, page_size, columns=columns)
-        except CoveImportError as exc:
-            # Some tenants block specific datasource columns; fall back safely.
-            if use_optional_columns and _is_cove_security_column_error(exc):
-                logger.warning(
-                    "Cove import: optional datasource columns blocked by API scope; "
-                    "falling back to base columns. Error: %s",
-                    exc,
-                )
-                use_optional_columns = False
-                accounts = _cove_enumerate(url, visa, partner_id, start, page_size, columns=base_columns)
-            else:
-                raise
+            accounts = _cove_enumerate(url, visa, partner_id, start, page_size)
+        except CoveImportError:
+            raise
        except Exception as exc:
            raise CoveImportError(f"Unexpected error fetching accounts at offset {start}: {exc}") from exc
@ -656,9 +584,6 @@ def _process_account(account: dict) -> str:
    # Cove session was previously stored under another job.
    existing = JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()
    if existing:
-        # Keep objects in sync even when the run itself is a duplicate session.
-        if job.customer_id:
-            _persist_datasource_objects(flat, job.customer_id, job.id, existing.id, last_run_at)
        db.session.commit()
        return "skip_duplicate"
@ -713,49 +638,18 @@ def _persist_datasource_objects(
    observed_at: datetime,
) -> None:
    """Create run_object_links for each active datasource found in the account stats."""
-    # Use I78 as source-of-truth for active datasources so object count matches Cove UI.
-    ds_codes = _parse_active_datasource_codes(flat.get("I78"))
-    # Fallback: when I78 is missing, derive from present DxxF00 keys to avoid empty objects.
-    if not ds_codes:
-        seen: set[str] = set()
-        for key in flat.keys():
-            m = re.fullmatch(r"(D\d{1,2})F00", str(key or "").upper())
-            if not m:
-                continue
-            norm = _normalize_ds_code(m.group(1))
-            if norm and norm not in seen:
-                seen.add(norm)
-                ds_codes.append(norm)
-    overall_status_code = flat.get("D09F00")
-    overall_last_ts = _ts_to_dt(flat.get("D09F15"))
-    for ds_code in ds_codes:
-        ds_label = _label_for_ds_code(ds_code)
-        status_key = f"{ds_code}F00"
-        ts_key = f"{ds_code}F15"
+    for ds_prefix, ds_label in DATASOURCE_LABELS.items():
+        status_key = f"{ds_prefix}F00"
        status_code = flat.get(status_key)
-        uses_overall_fallback = status_code is None
        if status_code is None:
-            status_code = overall_status_code
+            continue
        status = _map_status(status_code)
-        ds_last_ts = _ts_to_dt(flat.get(ts_key)) or overall_last_ts
-        if uses_overall_fallback:
-            status_msg = (
-                f"Cove datasource status: {_status_label(status_code)} "
-                f"({status_code}); datasource-specific status not returned by API columns; "
-                f"using overall account status; last session: {_fmt_utc(ds_last_ts)}"
-            )
-        else:
-            status_msg = (
-                f"Cove datasource status: {_status_label(status_code)} "
-                f"({status_code}); last session: {_fmt_utc(ds_last_ts)}"
-            )
+        ds_last_ts = _ts_to_dt(flat.get(f"{ds_prefix}F15"))
+        status_msg = (
+            f"Cove datasource status: {_status_label(status_code)} "
+            f"({status_code}); last session: {_fmt_utc(ds_last_ts)}"
+        )
        # Use the same SQLAlchemy session/transaction as JobRun creation.
        # A separate engine connection cannot reliably see the uncommitted run row.

View File

@ -137,7 +137,7 @@ def api_job_run_alerts(run_id: int):
        db.session.execute(
            text(
                """
-                SELECT DISTINCT r.id, r.body, r.source, r.ticket_id, r.start_date, r.resolved_at, r.active_from_date
+                SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
                FROM remarks r
                JOIN remark_job_runs rjr ON rjr.remark_id = r.id
                WHERE rjr.job_run_id = :run_id
@ -171,8 +171,6 @@ def api_job_run_alerts(run_id: int):
                {
                    "id": remark_id,
                    "body": body,
-                    "source": (rr.get("source") or "manual"),
-                    "ticket_id": rr.get("ticket_id"),
                    "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
                    "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
                    "resolved_at": _format_datetime(rr.get("resolved_at")) if rr.get("resolved_at") else "",
@ -187,7 +185,7 @@ def api_job_run_alerts(run_id: int):
        db.session.execute(
            text(
                """
-                SELECT DISTINCT r.id, r.body, r.source, r.ticket_id, r.start_date, r.resolved_at, r.active_from_date
+                SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
                FROM remarks r
                JOIN remark_scopes rs ON rs.remark_id = r.id
                WHERE rs.job_id = :job_id
@ -230,8 +228,6 @@ def api_job_run_alerts(run_id: int):
                {
                    "id": remark_id,
                    "body": body,
-                    "source": (rr.get("source") or "manual"),
-                    "ticket_id": rr.get("ticket_id"),
                    "start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
                    "active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
                    "resolved_at": _format_datetime(rr.get("resolved_at")) if rr.get("resolved_at") else "",
@ -266,6 +262,8 @@ def api_tickets():
    customer_id = 0
    query = Ticket.query
+    if active:
+        query = query.filter(Ticket.resolved_at.is_(None))
    if q:
        like_q = f"%{q}%"
        query = query.filter(
@ -275,43 +273,9 @@ def api_tickets():
    if customer_id:
        query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id).filter(TicketScope.customer_id == customer_id)
-    tickets_raw = query.order_by(Ticket.start_date.desc()).limit(500).all()
-    ticket_ids = [t.id for t in tickets_raw]
-    scope_total_map = {}
-    scope_open_map = {}
-    if ticket_ids:
-        try:
-            rows = (
-                db.session.execute(
-                    text(
-                        """
-                        SELECT
-                            ticket_id,
-                            COUNT(*) AS total_count,
-                            SUM(CASE WHEN resolved_at IS NULL THEN 1 ELSE 0 END) AS open_count
-                        FROM ticket_scopes
-                        WHERE ticket_id = ANY(:ids)
-                        GROUP BY ticket_id
-                        """
-                    ),
-                    {"ids": ticket_ids},
-                )
-                .fetchall()
-            )
-            for tid, total_cnt, open_cnt in rows:
-                scope_total_map[int(tid)] = int(total_cnt or 0)
-                scope_open_map[int(tid)] = int(open_cnt or 0)
-        except Exception:
-            scope_total_map = {}
-            scope_open_map = {}
+    query = query.order_by(Ticket.start_date.desc()).limit(500)
    items = []
-    for t in tickets_raw:
-        total_scopes = int(scope_total_map.get(t.id, 0) or 0)
-        open_scopes = int(scope_open_map.get(t.id, 0) or 0)
-        active_effective = (t.resolved_at is None) and (total_scopes == 0 or open_scopes > 0)
-        if active and not active_effective:
-            continue
+    for t in query.all():
        items.append(
            {
                "id": t.id,
@ -320,7 +284,7 @@
                "active_from_date": str(getattr(t, "active_from_date", "") or ""),
                "start_date": _format_datetime(t.start_date),
                "resolved_at": _format_datetime(t.resolved_at) if t.resolved_at else "",
-                "active": bool(active_effective),
+                "active": (t.resolved_at is None and TicketScope.query.filter_by(ticket_id=t.id, resolved_at=None).first() is not None),
            }
        )
    return jsonify({"status": "ok", "tickets": items})
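The main-side loop above replaces the per-ticket scope query with batched counts; its effective-active rule reduces to a small predicate (a sketch, with the scope counts as plain integers):

def active_effective(resolved_at, total_scopes: int, open_scopes: int) -> bool:
    # Active only while the ticket itself is unresolved AND it either has no
    # scope rows yet or still has at least one open scope.
    return resolved_at is None and (total_scopes == 0 or open_scopes > 0)

assert active_effective(None, 0, 0) is True           # unresolved, no scopes yet
assert active_effective(None, 3, 0) is False          # every scope resolved
assert active_effective("2026-03-25", 3, 2) is False  # ticket itself resolved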
@ -574,8 +538,6 @@ def api_remarks():
            {
                "id": r.id,
                "body": r.body or "",
-                "source": (getattr(r, "source", None) or "manual"),
-                "ticket_id": getattr(r, "ticket_id", None),
                "active_from_date": str(getattr(r, "active_from_date", "") or ""),
                "start_date": _format_datetime(r.start_date) if r.start_date else "-",
                "resolved_at": _format_datetime(r.resolved_at) if r.resolved_at else "",
@ -607,8 +569,6 @@ def api_remarks():
    remark = Remark(
        title=None,
        body=body,
-        source="manual",
-        ticket_id=None,
        active_from_date=_to_amsterdam_date(run.run_at) or _to_amsterdam_date(now) or now.date(),
        start_date=now,
        resolved_at=None,
@ -644,8 +604,6 @@ def api_remarks():
            "remark": {
                "id": remark.id,
                "body": remark.body or "",
-                "source": (getattr(remark, "source", None) or "manual"),
-                "ticket_id": getattr(remark, "ticket_id", None),
                "start_date": _format_datetime(remark.start_date),
                "resolved_at": "",
                "active": True,

View File

@ -1,12 +1,5 @@
from .routes_shared import *  # noqa: F401,F403
-from .routes_shared import (
-    _format_datetime,
-    _get_database_size_bytes,
-    _apply_overrides_to_run,
-    _format_bytes,
-    _get_expected_times_for_job_on_date,
-    _get_free_disk_bytes,
-)
+from .routes_shared import _format_datetime, _get_database_size_bytes, _apply_overrides_to_run, _format_bytes, _get_free_disk_bytes, _infer_schedule_map_from_runs
@main_bp.route("/")
@login_required
@ -63,6 +56,8 @@ def dashboard():
    )
    end_of_day = start_of_day + timedelta(days=1)
+    weekday_idx = today_date.weekday() # 0=Mon..6=Sun
    jobs_success_count = 0
    jobs_success_override_count = 0
    jobs_expected_count = 0
@ -76,7 +71,8 @@ def dashboard():
    jobs = Job.query.join(Customer, isouter=True).all()
    for job in jobs:
-        expected_times = _get_expected_times_for_job_on_date(job, today_date)
+        schedule_map = _infer_schedule_map_from_runs(job.id)
+        expected_times = schedule_map.get(weekday_idx) or []
        if not expected_times:
            continue

View File

@ -19,8 +19,6 @@ _COVE_DATASOURCE_LABELS = {
    "D1": "Files & Folders",
    "D02": "System State",
    "D2": "System State",
-    "D06": "Network Shares",
-    "D6": "Network Shares",
    "D10": "VssMsSql",
    "D11": "VssSharePoint",
    "D19": "M365 Exchange",

View File

@ -1,14 +1,9 @@
from .routes_shared import *  # noqa: F401,F403
-from .routes_shared import (
-    _format_datetime,
-    _get_or_create_settings,
-    _apply_overrides_to_run,
-    _get_expected_times_for_job_on_date,
-)
+from .routes_shared import _format_datetime, _get_or_create_settings, _apply_overrides_to_run, _infer_schedule_map_from_runs, _infer_monthly_schedule_from_runs
# Grace window for today's Expected/Missed transition.
# A job is only marked Missed after the latest expected time plus this grace.
-MISSED_GRACE_WINDOW = timedelta(hours=3)
+MISSED_GRACE_WINDOW = timedelta(hours=1)
@main_bp.route("/daily-jobs")
@login_required
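The grace window only shifts the Expected-to-Missed flip, as in this small check (dates are illustrative):

from datetime import datetime, timedelta

MISSED_GRACE_WINDOW = timedelta(hours=3)  # main side; v0.2.1 used hours=1

expected = datetime(2026, 3, 25, 13, 0)
now = datetime(2026, 3, 25, 15, 30)
assert not (now > expected + MISSED_GRACE_WINDOW)  # still Expected, not yet Missed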
@ -92,6 +87,8 @@ def daily_jobs():
        minute_bucket = (d.minute // 15) * 15
        return f"{d.hour:02d}:{minute_bucket:02d}"
+    weekday_idx = target_date.weekday() # 0=Mon..6=Sun
    jobs_query = (
        Job.query.join(Customer, isouter=True)
        .filter(Job.archived.is_(False))
@ -113,7 +110,24 @@ def daily_jobs():
    rows = []
    for job in jobs:
-        expected_times = _get_expected_times_for_job_on_date(job, target_date)
+        schedule_map = _infer_schedule_map_from_runs(job.id)
+        expected_times = schedule_map.get(weekday_idx) or []
+        # If no weekly schedule is inferred (e.g. monthly jobs), try monthly inference.
+        if not expected_times:
+            monthly = _infer_monthly_schedule_from_runs(job.id)
+            if monthly:
+                dom = int(monthly.get("day_of_month") or 0)
+                mtimes = monthly.get("times") or []
+                # For months shorter than dom, treat the last day of month as the scheduled day.
+                try:
+                    import calendar as _calendar
+                    last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
+                except Exception:
+                    last_dom = target_date.day
+                scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
+                if target_date.day == scheduled_dom:
+                    expected_times = list(mtimes)
        if not expected_times:
            continue
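The clamp above keeps a day-of-month schedule firing in short months; a worked case using only calendar.monthrange:

import calendar
from datetime import date

target_date = date(2026, 2, 28)
dom = 31  # inferred day of month
last_dom = calendar.monthrange(target_date.year, target_date.month)[1]  # 28
scheduled_dom = dom if (dom and dom <= last_dom) else last_dom          # 28
assert target_date.day == scheduled_dom  # the monthly job is expected on Feb 28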

View File

@ -51,14 +51,6 @@ DOCUMENTATION_STRUCTURE = {
            {'slug': 'auto-import', 'title': 'Auto-Import Configuration'},
        ]
    },
-    'integrations': {
-        'title': 'Integrations',
-        'icon': '🔌',
-        'pages': [
-            {'slug': 'cove-data-protection', 'title': 'Cove Data Protection'},
-            {'slug': 'veeam-cloud-connect', 'title': 'Veeam Cloud Connect'},
-        ]
-    },
    'backup-review': {
        'title': 'Backup Review',
        'icon': '',

View File

@ -3,16 +3,12 @@ from .routes_shared import (
    _apply_overrides_to_run,
    _describe_schedule,
    _format_datetime,
-    _get_effective_schedule_for_job,
    _get_ui_timezone_name,
    _infer_schedule_map_from_runs,
-    _parse_schedule_times_csv,
    _schedule_map_to_desc,
    _to_amsterdam_date,
)
-_WEEKDAY_LABELS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
@main_bp.route("/jobs")
@login_required
@roles_required("admin", "operator", "viewer")
@ -235,106 +231,12 @@ def job_set_cove_account(job_id: int):
    return redirect(url_for("main.job_detail", job_id=job_id))
-@main_bp.route("/jobs/<int:job_id>/schedule", methods=["POST"])
-@login_required
-@roles_required("admin", "operator")
-def job_set_schedule(job_id: int):
-    """Save or clear manual schedule override for this job."""
-    job = Job.query.get_or_404(job_id)
-    if (request.form.get("clear_schedule") or "").strip() == "1":
-        job.schedule_type = None
-        job.schedule_days_of_week = None
-        job.schedule_day_of_month = None
-        job.schedule_times = None
-        db.session.commit()
-        try:
-            log_admin_event(
-                "job_schedule_cleared",
-                f"Cleared manual schedule override for job {job.id}",
-                details=f"job_name={job.job_name}",
-            )
-        except Exception:
-            pass
-        flash("Manual schedule override removed. Inferred schedule is active again.", "success")
-        return redirect(url_for("main.job_detail", job_id=job_id))
-    schedule_type = (request.form.get("schedule_type") or "").strip().lower()
-    times_raw = (request.form.get("schedule_times") or "").strip()
-    times = _parse_schedule_times_csv(times_raw)
-    if not times:
-        flash("Invalid schedule times. Use comma-separated HH:MM values, e.g. 01:00,13:15.", "warning")
-        return redirect(url_for("main.job_detail", job_id=job_id))
-    if schedule_type not in ("daily", "weekly", "monthly"):
-        flash("Invalid schedule type. Choose Daily, Weekly, or Monthly.", "warning")
-        return redirect(url_for("main.job_detail", job_id=job_id))
-    day_labels: list[str] = []
-    day_of_month: int | None = None
-    if schedule_type == "weekly":
-        raw_days = request.form.getlist("schedule_weekdays")
-        idxs: list[int] = []
-        for value in raw_days:
-            try:
-                idx = int(str(value).strip())
-            except Exception:
-                continue
-            if 0 <= idx <= 6 and idx not in idxs:
-                idxs.append(idx)
-        idxs = sorted(idxs)
-        if not idxs:
-            flash("Weekly schedule requires at least one day.", "warning")
-            return redirect(url_for("main.job_detail", job_id=job_id))
-        day_labels = [_WEEKDAY_LABELS[i] for i in idxs]
-    if schedule_type == "monthly":
-        dom_raw = (request.form.get("schedule_day_of_month") or "").strip()
-        try:
-            day_of_month = int(dom_raw)
-        except Exception:
-            day_of_month = None
-        if day_of_month is None or day_of_month < 1 or day_of_month > 31:
-            flash("Monthly schedule requires a day of month between 1 and 31.", "warning")
-            return redirect(url_for("main.job_detail", job_id=job_id))
-    job.schedule_type = schedule_type
-    job.schedule_times = ",".join(times)
-    job.schedule_days_of_week = ",".join(day_labels) if day_labels else None
-    job.schedule_day_of_month = day_of_month if schedule_type == "monthly" else None
-    db.session.commit()
-    try:
-        details = f"type={schedule_type}; times={job.schedule_times}; days={job.schedule_days_of_week}; dom={job.schedule_day_of_month}"
-        log_admin_event(
-            "job_schedule_set",
-            f"Set manual schedule override for job {job.id}",
-            details=details,
-        )
-    except Exception:
-        pass
-    flash("Manual schedule override saved.", "success")
-    return redirect(url_for("main.job_detail", job_id=job_id))
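A call against this removed endpoint could look as follows; the base URL, job id, and session auth are placeholders, while the form field names come from the handler above:

import requests

requests.post(
    "https://backupchecks.example/jobs/42/schedule",
    data={
        "schedule_type": "weekly",
        "schedule_times": "01:00,13:15",
        "schedule_weekdays": ["0", "3"],  # 0=Mon .. 6=Sun; sent as a repeated form field
    },
    allow_redirects=False,  # the handler answers with a redirect to job detail
)
# Clearing the override re-activates the inferred schedule:
requests.post("https://backupchecks.example/jobs/42/schedule", data={"clear_schedule": "1"})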
@main_bp.route("/jobs/<int:job_id>")
@login_required
@roles_required("admin", "operator", "viewer")
def job_detail(job_id: int):
    job = Job.query.get_or_404(job_id)
-    first_detected_run_at = (
-        db.session.query(func.min(JobRun.run_at))
-        .filter(
-            JobRun.job_id == job.id,
-            JobRun.run_at.isnot(None),
-            JobRun.missed.is_(False),
-        )
-        .scalar()
-    )
    # History pagination
    try:
        page = int(request.args.get("page", "1"))
@ -621,24 +523,13 @@ def job_detail(job_id: int):
    can_manage_jobs = current_user.is_authenticated and get_active_role() in ("admin", "operator")
+    schedule_map = None
    schedule_desc = _describe_schedule(job)
-    inferred_schedule_map = _infer_schedule_map_from_runs(job.id)
-    effective_schedule = _get_effective_schedule_for_job(job)
-    effective_source = effective_schedule.get("source") or "none"
-    effective_weekly_map = effective_schedule.get("weekly_map") or {i: [] for i in range(7)}
-    effective_monthly = effective_schedule.get("monthly")
-    if effective_source == "inferred_weekly":
-        schedule_desc = _schedule_map_to_desc(effective_weekly_map)
-    elif effective_source == "inferred_monthly" and effective_monthly:
-        dom = effective_monthly.get("day_of_month")
-        mtimes = effective_monthly.get("times") or []
-        if mtimes:
-            schedule_desc = f"Inferred monthly on day {dom} at {', '.join(mtimes)}."
-        else:
-            schedule_desc = f"Inferred monthly on day {dom}."
-    elif effective_source == "none":
-        schedule_desc = _schedule_map_to_desc(inferred_schedule_map)
+    if schedule_desc.startswith("No schedule configured"):
+        schedule_map = _infer_schedule_map_from_runs(job.id)
+        schedule_desc = _schedule_map_to_desc(schedule_map)
+    else:
+        schedule_map = _infer_schedule_map_from_runs(job.id)
    # For convenience, also load customer
    customer = None
@ -654,12 +545,8 @@
        "main/job_detail.html",
        job=job,
        customer=customer,
-        first_detected_run_at=first_detected_run_at,
        schedule_desc=schedule_desc,
-        schedule_map=inferred_schedule_map,
-        effective_schedule_source=effective_source,
-        effective_weekly_schedule_map=effective_weekly_map,
-        effective_monthly_schedule=effective_monthly,
+        schedule_map=schedule_map,
        history_rows=history_rows,
        ticket_open_count=int(ticket_open_count or 0),
        ticket_total_count=int(ticket_total_count or 0),

View File

@ -9,15 +9,16 @@ from datetime import date, datetime, time, timedelta, timezone
from flask import flash, jsonify, redirect, render_template, request, url_for
from urllib.parse import urlencode, urljoin
from flask_login import current_user, login_required
-from sqlalchemy import and_, bindparam, or_, func, text
+from sqlalchemy import and_, or_, func, text
from .routes_shared import (
    _apply_overrides_to_run,
-    _get_effective_schedule_for_job,
    _format_datetime,
    _get_ui_timezone,
    _get_ui_timezone_name,
    _get_or_create_settings,
+    _infer_schedule_map_from_runs,
+    _infer_monthly_schedule_from_runs,
    _to_amsterdam_date,
    main_bp,
    roles_required,
@ -34,9 +35,6 @@ from ..models import (
    MailMessage,
    MailObject,
    Override,
-    Remark,
-    RemarkJobRun,
-    RemarkScope,
    Ticket,
    TicketJobRun,
    TicketScope,
@ -46,7 +44,6 @@ from ..ticketing_utils import link_open_internal_tickets_to_run
AUTOTASK_TERMINAL_STATUS_IDS = {5}
-BACKUPCHECKS_RESOLVE_MARKER = "[Backupchecks] Marked as resolved in Backupchecks"
RUN_CHECKS_SORT_MODES = {"customer", "status"}
# ---------------------------------------------------------------------------
@ -167,118 +164,6 @@ def _is_hidden_3cx_non_backup(backup_software: str | None, backup_type: str | No
    return bs == "3cx" and bt in {"update", "ssl certificate"}
-def _chunked(values: list[int], size: int = 500) -> list[list[int]]:
-    if not values:
-        return []
-    return [values[i:i + size] for i in range(0, len(values), size)]
-def _get_cove_complete_success_run_ids(run_ids: list[int]) -> set[int]:
-    """Return Cove run ids that have at least one object and all object statuses are Success."""
-    if not run_ids:
-        return set()
-    complete_success_ids: set[int] = set()
-    for chunk in _chunked(run_ids, size=500):
-        rows = db.session.execute(
-            text(
-                """
-                SELECT
-                    rol.run_id AS run_id,
-                    COUNT(*) AS obj_count,
-                    SUM(CASE WHEN LOWER(COALESCE(rol.status, '')) = 'success' THEN 1 ELSE 0 END) AS success_count
-                FROM run_object_links rol
-                WHERE rol.run_id IN :run_ids
-                GROUP BY rol.run_id
-                """
-            ).bindparams(bindparam("run_ids", expanding=True)),
-            {"run_ids": chunk},
-        ).mappings().all()
-        for rr in rows:
-            run_id = int(rr.get("run_id") or 0)
-            obj_count = int(rr.get("obj_count") or 0)
-            success_count = int(rr.get("success_count") or 0)
-            if run_id > 0 and obj_count > 0 and success_count == obj_count:
-                complete_success_ids.add(run_id)
-    return complete_success_ids
-def _collect_suppressed_cove_run_ids(
-    *,
-    job_ids: list[int] | None = None,
-    include_reviewed: bool = False,
-) -> set[int]:
-    """Suppress Cove runs that occur later on a day after the first complete success run."""
-    q = (
-        db.session.query(
-            JobRun.id.label("run_id"),
-            JobRun.job_id.label("job_id"),
-            func.coalesce(JobRun.run_at, JobRun.created_at).label("run_ts"),
-            JobRun.status.label("status"),
-        )
-        .filter(JobRun.source_type == "cove_api")
-    )
-    if job_ids:
-        q = q.filter(JobRun.job_id.in_(job_ids))
-    if not include_reviewed:
-        q = q.filter(JobRun.reviewed_at.is_(None))
-    run_rows = q.order_by(
-        JobRun.job_id.asc(),
-        func.coalesce(JobRun.run_at, JobRun.created_at).asc(),
-        JobRun.id.asc(),
-    ).all()
-    if not run_rows:
-        return set()
-    success_candidate_ids = [
-        int(r.run_id)
-        for r in run_rows
-        if ((r.status or "").strip().lower() == "success")
-    ]
-    complete_success_ids = _get_cove_complete_success_run_ids(success_candidate_ids)
-    if not complete_success_ids:
-        return set()
-    cutoff_by_job_day: dict[tuple[int, date], datetime] = {}
-    for r in run_rows:
-        run_id = int(r.run_id or 0)
-        run_ts = getattr(r, "run_ts", None)
-        if run_id <= 0 or run_ts is None or run_id not in complete_success_ids:
-            continue
-        local_day = _to_amsterdam_date(run_ts)
-        if local_day is None:
-            continue
-        key = (int(r.job_id), local_day)
-        prev_cutoff = cutoff_by_job_day.get(key)
-        if prev_cutoff is None or run_ts < prev_cutoff:
-            cutoff_by_job_day[key] = run_ts
-    if not cutoff_by_job_day:
-        return set()
-    suppressed: set[int] = set()
-    for r in run_rows:
-        run_id = int(r.run_id or 0)
-        run_ts = getattr(r, "run_ts", None)
-        if run_id <= 0 or run_ts is None:
-            continue
-        local_day = _to_amsterdam_date(run_ts)
-        if local_day is None:
-            continue
-        cutoff = cutoff_by_job_day.get((int(r.job_id), local_day))
-        if cutoff is None:
-            continue
-        if run_ts > cutoff:
-            suppressed.add(run_id)
-    return suppressed
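The suppression rule reduces to a per-job, per-local-day cutoff; a self-contained mock of the effect (ids and times are invented):

from datetime import datetime

# (run_id, run_ts, is_complete_success) for one job on one local day:
runs = [
    (101, datetime(2026, 3, 25, 6, 0), False),   # Warning run -> kept
    (102, datetime(2026, 3, 25, 8, 0), True),    # first complete success -> cutoff
    (103, datetime(2026, 3, 25, 12, 0), True),   # after cutoff -> suppressed
]
cutoff = min(ts for _, ts, ok in runs if ok)
suppressed = {run_id for run_id, ts, _ in runs if ts > cutoff}
assert suppressed == {103}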
def _ensure_internal_ticket_for_autotask(
    *,
    ticket_number: str,
@ -391,148 +276,6 @@ def _resolve_internal_ticket_for_job(
            db.session.add(TicketJobRun(ticket_id=ticket.id, job_run_id=rid, link_source="autotask"))
-def _extract_autotask_resolution_text(ticket_payload: dict | None) -> str:
-    if not isinstance(ticket_payload, dict):
-        return ""
-    preferred = [
-        "resolution",
-        "resolutionText",
-        "resolution_text",
-        "resolutionNote",
-        "resolutionNotes",
-    ]
-    for key in preferred:
-        val = ticket_payload.get(key)
-        txt = str(val or "").strip()
-        if txt:
-            return txt
-    for key, val in ticket_payload.items():
-        key_l = str(key or "").strip().lower()
-        if "resolution" not in key_l:
-            continue
-        txt = str(val or "").strip()
-        if txt:
-            return txt
-    return ""
-def _maybe_create_autotask_resolution_remark(
-    *,
-    ticket_payload: dict | None,
-    ticket_id: int,
-    runs_for_ticket: list[JobRun],
-    now: datetime,
-) -> None:
-    """Persist PSA resolution text as an active internal remark (deduplicated)."""
-    if not runs_for_ticket:
-        return
-    resolution_text = _extract_autotask_resolution_text(ticket_payload)
-    if not resolution_text:
-        return
-    if BACKUPCHECKS_RESOLVE_MARKER in resolution_text:
-        # Do not mirror Backupchecks-generated resolve notes back as remarks.
-        return
-    job = Job.query.get(runs_for_ticket[0].job_id) if runs_for_ticket else None
-    if not job:
-        return
-    ticket_number = ""
-    if isinstance(ticket_payload, dict):
-        ticket_number = str(
-            ticket_payload.get("ticketNumber")
-            or ticket_payload.get("number")
-            or ticket_payload.get("ticket_number")
-            or ""
-        ).strip()
-    if not ticket_number:
-        for rr in runs_for_ticket:
-            code = str(getattr(rr, "autotask_ticket_number", "") or "").strip()
-            if code:
-                ticket_number = code
-                break
-    active_from_dt = None
-    try:
-        dts = [getattr(x, "run_at", None) for x in runs_for_ticket if getattr(x, "run_at", None)]
-        active_from_dt = min(dts) if dts else None
-    except Exception:
-        active_from_dt = None
-    internal_ticket = _ensure_internal_ticket_for_autotask(
-        ticket_number=ticket_number,
-        job=job,
-        run_ids=[int(x.id) for x in runs_for_ticket if getattr(x, "id", None)],
-        now=now,
-        active_from_dt=active_from_dt,
-    )
-    if (getattr(internal_ticket, "resolved_origin", None) or "").strip().lower() == "backupchecks":
-        return
-    internal_ticket_id = getattr(internal_ticket, "id", None)
-    exists = (
-        db.session.query(Remark.id)
-        .join(RemarkScope, RemarkScope.remark_id == Remark.id)
-        .filter(Remark.source == "autotask_resolution")
-        .filter(Remark.body == resolution_text)
-        .filter(RemarkScope.job_id == job.id)
-    )
-    if internal_ticket_id:
-        exists = exists.filter(Remark.ticket_id == int(internal_ticket_id))
-    if exists.first():
-        return
-    active_from_date = _to_amsterdam_date(active_from_dt or now) or (active_from_dt or now).date()
-    title = (
-        f"Autotask resolution ({ticket_number})"
-        if ticket_number
-        else f"Autotask resolution (ID {int(ticket_id)})"
-    )
-    remark = Remark(
-        title=title,
-        body=resolution_text,
-        source="autotask_resolution",
-        ticket_id=(int(internal_ticket_id) if internal_ticket_id else None),
-        active_from_date=active_from_date,
-        start_date=now,
-        resolved_at=None,
-    )
-    db.session.add(remark)
-    db.session.flush()
-    db.session.add(
-        RemarkScope(
-            remark_id=remark.id,
-            scope_type="job",
-            customer_id=job.customer_id,
-            backup_software=job.backup_software,
-            backup_type=job.backup_type,
-            job_id=job.id,
-            job_name_match=job.job_name,
-            job_name_match_mode="exact",
-        )
-    )
-    for rr in runs_for_ticket:
-        rid = int(getattr(rr, "id", 0) or 0)
-        if rid <= 0:
-            continue
-        if not RemarkJobRun.query.filter_by(remark_id=remark.id, job_run_id=rid).first():
-            db.session.add(
-                RemarkJobRun(
-                    remark_id=remark.id,
-                    job_run_id=rid,
-                    link_source="autotask_resolution",
-                )
-            )
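Behavior of the removed extractor on a hypothetical Autotask payload (field values invented):

payload = {"ticketNumber": "T20260325.0042", "resolution": "Re-ran job after freeing disk space."}
_extract_autotask_resolution_text(payload)  # -> "Re-ran job after freeing disk space."
_extract_autotask_resolution_text({})       # -> "" (nothing to mirror)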
def _poll_autotask_ticket_states_for_runs(*, run_ids: list[int]) -> None:
    """Phase 2: Read-only PSA-driven ticket completion sync.
@ -827,27 +570,6 @@ def _poll_autotask_ticket_states_for_runs(*, run_ids: list[int]) -> None:
                origin="psa",
            )
-    # Mirror Autotask resolution text into active internal remarks for follow-up visibility.
-    for tid in ticket_ids:
-        if tid in deleted_map:
-            continue
-        runs_for_ticket = ticket_to_runs.get(tid) or []
-        if not runs_for_ticket:
-            continue
-        try:
-            ticket_payload = client.get_ticket(int(tid))
-        except Exception:
-            continue
-        try:
-            _maybe_create_autotask_resolution_remark(
-                ticket_payload=ticket_payload,
-                ticket_id=int(tid),
-                runs_for_ticket=runs_for_ticket,
-                now=now,
-            )
-        except Exception:
-            continue
    try:
        db.session.commit()
    except Exception:
@ -923,113 +645,39 @@ def _compose_autotask_ticket_description(
lines.append("Summary:") lines.append("Summary:")
lines.append(overall_message) lines.append(overall_message)
lines.append("") lines.append("")
lines.append("Multiple objects reported messages. See Backupchecks for full details.")
# Always include object-level details so technicians can immediately see else:
# which objects failed/warned from the ticket itself. # Fallback to object-level messages with a hard limit.
def _object_priority(obj: dict[str, str]) -> int:
combined = f"{obj.get('status', '')} {obj.get('error_message', '')}".strip().lower()
if any(x in combined for x in ("failed", "error", "missed")):
return 0
if "warning" in combined:
return 1
return 2
def _is_problem_object(status: str, err: str) -> bool:
s = (status or "").strip().lower()
e = (err or "").strip().lower()
combined = f"{s} {e}".strip()
if any(x in combined for x in ("failed", "error", "warning", "missed")):
return True
if s in ("success", "succeeded", "completed", "ok"):
return False
if "completed" in combined:
return False
# Keep uncommon non-success statuses visible.
return bool(s or e)
candidates: list[dict[str, str]] = []
for o in objects_payload or []:
name = (o.get("name") or "").strip()
err = (o.get("error_message") or "").strip()
st = (o.get("status") or "").strip()
if not name:
continue
if not _is_problem_object(st, err):
continue
candidates.append(
{
"name": name,
"status": st,
"error_message": err,
}
)
if candidates:
sorted_candidates = sorted(
candidates,
key=lambda x: (_object_priority(x), x["name"].lower()),
)
lines.append("Affected objects:")
limit = 10 limit = 10
shown = 0 shown = 0
for o in sorted_candidates: total = 0
for o in objects_payload or []:
name = (o.get("name") or "").strip()
err = (o.get("error_message") or "").strip()
st = (o.get("status") or "").strip()
if not name:
continue
if not err and not st:
continue
total += 1
if shown >= limit: if shown >= limit:
break continue
msg = o["error_message"] or o["status"] msg = err or st
lines.append(f"- {o['name']}: {msg}") lines.append(f"- {name}: {msg}")
shown += 1 shown += 1
if len(sorted_candidates) > shown:
lines.append(f"And {int(len(sorted_candidates) - shown)} additional objects reported messages.") if total == 0:
else: lines.append("No detailed object messages available. See Backupchecks for full details.")
lines.append("No detailed object messages available. See Backupchecks for full details.") elif total > shown:
lines.append(f"And {int(total - shown)} additional objects reported similar messages.")
lines.append("") lines.append("")
lines.append(f"Backupchecks details: {job_link}") lines.append(f"Backupchecks details: {job_link}")
return "\n".join(lines).strip() + "\n" return "\n".join(lines).strip() + "\n"
def _compose_autotask_link_existing_note(
*,
settings,
job: Job,
run: JobRun,
ticket_number: str,
linked_run_count: int,
) -> str:
tz_name = _get_ui_timezone_name() or "Europe/Amsterdam"
run_at_str = _format_datetime(run.run_at) if run.run_at else "-"
actor = (getattr(current_user, "email", None) or getattr(current_user, "username", None) or "operator")
base_url = (getattr(settings, "autotask_base_url", None) or "").strip()
job_rel = url_for("main.job_detail", job_id=job.id)
job_link = urljoin(base_url.rstrip("/") + "/", job_rel.lstrip("/"))
if run.id:
job_link = f"{job_link}?run_id={int(run.id)}"
lines: list[str] = []
lines.append("[Backupchecks] Additional alert linked to this ticket.")
lines.append(f"Customer: {job.customer.name if job.customer else ''}")
lines.append(f"Job: {job.job_name or ''}")
lines.append(f"Backup: {job.backup_software or ''} / {job.backup_type or ''}")
lines.append(f"Run ID: {int(run.id) if run.id else 0}")
lines.append(f"Run at ({tz_name}): {run_at_str}")
lines.append(f"Run status: {run.status or ''}")
lines.append(f"Linked active runs for this job: {int(linked_run_count or 0)}")
if ticket_number:
lines.append(f"Ticket: {ticket_number}")
lines.append(f"Linked by: {actor}")
lines.append("")
lines.append(f"Backupchecks details: {job_link}")
return "\n".join(lines).strip() + "\n"
# Grace window for matching real runs to an expected schedule slot.
# A run within +/- 3 hours of the inferred schedule time counts as fulfilling the slot.
MISSED_GRACE_WINDOW = timedelta(hours=3)
# A run within +/- 1 hour of the inferred schedule time counts as fulfilling the slot.
MISSED_GRACE_WINDOW = timedelta(hours=1)
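# Usage sketch: a real run fulfills an expected slot when it lands within the
# grace window. Note the window differs between the two versions shown above
# (3 hours on main, 1 hour on v0.2.1).
from datetime import datetime, timedelta

def _run_fulfills_slot(run_at: datetime, slot_at: datetime, window: timedelta = timedelta(hours=3)) -> bool:
    return abs(run_at - slot_at) <= window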
def _status_is_success(status: str | None) -> bool:
@@ -1095,10 +743,12 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
return 0
tz = _get_ui_timezone()
schedule_map = _infer_schedule_map_from_runs(job.id) or {}
resolved_schedule = _get_effective_schedule_for_job(job)
schedule_map = resolved_schedule.get("weekly_map") or {i: [] for i in range(7)}
has_weekly_times = any((schedule_map.get(i) or []) for i in range(7))
monthly = resolved_schedule.get("monthly")
monthly = None
if not has_weekly_times:
monthly = _infer_monthly_schedule_from_runs(job.id)
if (not has_weekly_times) and (not monthly):
return 0
@@ -1449,13 +1099,6 @@ def run_checks_page():
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
# Restrict Cove suppression calculation to currently relevant jobs.
candidate_job_ids = [int(x) for (x,) in base.with_entities(Job.id).limit(4000).all()]
suppressed_cove_run_ids = _collect_suppressed_cove_run_ids(
job_ids=candidate_job_ids,
include_reviewed=include_reviewed,
)
# Runs to show in the overview: unreviewed (or all if admin toggle enabled)
run_filter = []
if not include_reviewed:
@@ -1507,8 +1150,6 @@ def run_checks_page():
)
if run_filter:
agg = agg.filter(*run_filter)
if suppressed_cove_run_ids:
agg = agg.filter(~JobRun.id.in_(list(suppressed_cove_run_ids)))
agg = agg.subquery()
@@ -1560,8 +1201,6 @@ def run_checks_page():
)
if run_filter:
s_q = s_q.filter(*run_filter)
if suppressed_cove_run_ids:
s_q = s_q.filter(~JobRun.id.in_(list(suppressed_cove_run_ids)))
s_q = s_q.group_by(JobRun.job_id, JobRun.status, JobRun.missed, JobRun.override_applied)
for jid, status, missed, override_applied, cnt in s_q.all():
@@ -1821,13 +1460,6 @@ def run_checks_details():
if not include_reviewed:
q = q.filter(JobRun.reviewed_at.is_(None))
suppressed_cove_run_ids = _collect_suppressed_cove_run_ids(
job_ids=[int(job.id)],
include_reviewed=include_reviewed,
)
if suppressed_cove_run_ids:
q = q.filter(~JobRun.id.in_(list(suppressed_cove_run_ids)))
runs = q.order_by(func.coalesce(JobRun.run_at, JobRun.created_at).desc(), JobRun.id.desc()).limit(400).all()
# Prefetch internal ticket resolution info for Autotask-linked runs (Phase 2 UI).
@@ -2158,58 +1790,21 @@ def api_run_checks_create_autotask_ticket():
overall_message = (getattr(msg, "overall_message", None) or "") if msg else ""
objects_payload: list[dict[str, str]] = []
# Preferred source: run_object_links/customer_objects, also used by Run Checks UI
# for Cove/cloud sourced runs.
try:
objs = run.objects.order_by(JobObject.object_name.asc()).all()
rows = (
db.session.execute(
text(
"""
SELECT
co.object_name AS name,
rol.status AS status,
rol.error_message AS error_message
FROM run_object_links rol
JOIN customer_objects co ON co.id = rol.customer_object_id
WHERE rol.run_id = :run_id
ORDER BY co.object_name ASC
"""
),
{"run_id": run.id},
)
.mappings()
.all()
)
for rr in rows:
objects_payload.append(
{
"name": rr.get("name") or "",
"type": "",
"status": rr.get("status") or "",
"error_message": rr.get("error_message") or "",
}
)
except Exception:
pass
objs = list(run.objects or [])
for o in objs or []:
objects_payload.append(
{
"name": getattr(o, "object_name", "") or "",
"type": getattr(o, "object_type", "") or "",
"status": getattr(o, "status", "") or "",
"error_message": getattr(o, "error_message", "") or "",
}
)
if (not objects_payload) and msg:
# Fallback for legacy mail parser sourced runs.
if not objects_payload:
try:
objs = run.objects.order_by(JobObject.object_name.asc()).all()
except Exception:
objs = list(run.objects or [])
for o in objs or []:
objects_payload.append(
{
"name": getattr(o, "object_name", "") or "",
"type": getattr(o, "object_type", "") or "",
"status": getattr(o, "status", "") or "",
"error_message": getattr(o, "error_message", "") or "",
}
)
if (not objects_payload) and msg and getattr(run, "source_type", None) != "cloud_connect":
try:
mos = MailObject.query.filter_by(mail_message_id=msg.id).order_by(MailObject.object_name.asc()).all()
except Exception:
@@ -2534,13 +2129,15 @@ def api_run_checks_autotask_link_existing_ticket():
if not isinstance(t, dict):
return jsonify({"status": "error", "message": "Autotask did not return a ticket object."}), 400
# Enforce company scope.
# Allow cross-company linking for shared/umbrella Autotask tickets.
# Keep parsed company id for diagnostics/audit context, but do not block on mismatch.
try:
t_company = int(t.get("companyID") or 0)
except Exception:
t_company = 0
if t_company != int(customer.autotask_company_id):
return jsonify({"status": "error", "message": "Selected ticket does not belong to the mapped Autotask company."}), 400
tnum = (t.get("ticketNumber") or t.get("number") or "") tnum = (t.get("ticketNumber") or t.get("number") or "")
tnum = str(tnum or "").strip() tnum = str(tnum or "").strip()
if not tnum: if not tnum:
@ -2596,38 +2193,12 @@ def api_run_checks_autotask_link_existing_ticket():
db.session.rollback() db.session.rollback()
return jsonify({"status": "error", "message": "Failed to persist Autotask ticket link."}), 500 return jsonify({"status": "error", "message": "Failed to persist Autotask ticket link."}), 500
note_posted = False
note_warning = ""
try:
settings = _get_or_create_settings()
note_body = _compose_autotask_link_existing_note(
settings=settings,
job=job,
run=run,
ticket_number=tnum,
linked_run_count=len(run_ids),
)
client.create_ticket_note(
{
"ticketID": int(ticket_id),
"title": "Backupchecks",
"description": note_body,
"publish": 1,
}
)
note_posted = True
except Exception as exc:
note_posted = False
note_warning = f"Autotask ticket was linked, but posting the link-update note failed: {exc}"
return jsonify(
{
"status": "ok",
"ticket_id": int(ticket_id),
"ticket_number": tnum,
"internal_ticket_id": int(getattr(internal_ticket, "id", 0) or 0) if internal_ticket else 0,
"note_posted": bool(note_posted),
"note_warning": note_warning,
}
)
@@ -2698,7 +2269,7 @@ def api_run_checks_autotask_resolve_note():
ticket_number = str(getattr(run, "autotask_ticket_number", "") or "").strip()
# Build dynamic message based on time entry check
marker = BACKUPCHECKS_RESOLVE_MARKER
marker = "[Backupchecks] Marked as resolved in Backupchecks"
if has_time_entries:
status_note = "(ticket remains open in Autotask due to existing time entries)"
else:

View File

@@ -3,8 +3,9 @@ from .routes_shared import (
_apply_overrides_to_run,
_format_datetime,
_get_or_create_settings,
_get_expected_times_for_job_on_date,
_get_ui_timezone,
_infer_monthly_schedule_from_runs,
_infer_schedule_map_from_runs,
)
from sqlalchemy import and_, cast, func, or_, String
@@ -381,8 +382,23 @@ def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
job_obj = Job.query.get(int(row.job_id))
expected_times = _get_expected_times_for_job_on_date(job_obj, target_date)
expected_times = (_infer_schedule_map_from_runs(row.job_id).get(target_date.weekday()) or [])
if not expected_times:
monthly = _infer_monthly_schedule_from_runs(row.job_id)
if monthly:
try:
dom = int(monthly.get("day_of_month") or 0)
except Exception:
dom = 0
mtimes = monthly.get("times") or []
try:
import calendar as _calendar
last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
except Exception:
last_dom = target_date.day
scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
if target_date.day == scheduled_dom:
expected_times = list(mtimes)
runs_for_day = (
JobRun.query.filter(
@@ -400,6 +416,7 @@ def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
if run_count > 0:
last_run = runs_for_day[-1]
try:
job_obj = Job.query.get(int(row.job_id))
status_display, _override_applied, _override_level, _ov_id, _ov_reason = _apply_overrides_to_run(job_obj, last_run)
if getattr(last_run, "missed", False):
last_status = status_display or "Missed"

View File

@@ -655,9 +655,7 @@ def _infer_schedule_map_from_runs(job_id: int):
Returns dict weekday->sorted list of 'HH:MM' strings in configured UI local time.
Notes:
- Only considers real runs that came from mail reports (mail_message_id is not NULL).
- Considers real runs from:
- mail reports (mail_message_id is not NULL), and
- Cove API imports (source_type == "cove_api").
- Synthetic missed rows never influence schedule inference.
- To reduce noise, a weekday/time bucket must occur at least MIN_OCCURRENCES times.
"""
@@ -693,8 +691,7 @@ def _infer_schedule_map_from_runs(job_id: int):
pass
try:
# Only infer schedules from real runs that came from mail reports.
# Only infer schedules from real runs that came from mail reports
# or Cove API imports.
# Synthetic "Missed" rows must never influence schedule inference. # Synthetic "Missed" rows must never influence schedule inference.
# Limit to the last 90 days so that schedule changes (different day, # Limit to the last 90 days so that schedule changes (different day,
# time, or frequency) take effect quickly and do not leave stale slots # time, or frequency) take effect quickly and do not leave stale slots
@ -706,10 +703,7 @@ def _infer_schedule_map_from_runs(job_id: int):
JobRun.job_id == job_id, JobRun.job_id == job_id,
JobRun.run_at.isnot(None), JobRun.run_at.isnot(None),
JobRun.missed.is_(False), JobRun.missed.is_(False),
or_( JobRun.mail_message_id.isnot(None),
JobRun.mail_message_id.isnot(None),
JobRun.source_type == "cove_api",
),
JobRun.run_at >= cutoff_utc,
)
.order_by(JobRun.run_at.desc())
@@ -788,17 +782,14 @@ def _infer_monthly_schedule_from_runs(job_id: int):
or None if not enough evidence.
Rules:
- Uses only real mail-based runs (mail_message_id is not NULL) and excludes synthetic missed rows.
- Uses only real runs from mail and Cove API imports:
- mail-based runs (mail_message_id is not NULL)
- Cove API runs (source_type == "cove_api")
and excludes synthetic missed rows.
- Requires at least MIN_OCCURRENCES occurrences for the inferred day-of-month.
- Uses a simple cadence heuristic: typical gaps between runs must be >= 20 days to qualify as monthly.
"""
MIN_OCCURRENCES = 3
try:
# Same "real run" rule as weekly inference (mail + Cove API).
# Same "real run" rule as weekly inference.
# 180 days gives ~6 occurrences for a monthly job (enough for
# MIN_OCCURRENCES=3) while still discarding stale schedule data.
cutoff_utc = datetime.utcnow() - timedelta(days=180)
@@ -808,10 +799,7 @@ def _infer_monthly_schedule_from_runs(job_id: int):
JobRun.job_id == job_id,
JobRun.run_at.isnot(None),
JobRun.missed.is_(False),
JobRun.mail_message_id.isnot(None),
or_(
JobRun.mail_message_id.isnot(None),
JobRun.source_type == "cove_api",
),
JobRun.run_at >= cutoff_utc,
)
.order_by(JobRun.run_at.asc())
@@ -911,161 +899,6 @@ def _infer_monthly_schedule_from_runs(job_id: int):
return {"day_of_month": int(best_dom), "times": keep_times}
def _parse_schedule_times_csv(raw: str | None) -> list[str]:
out: list[str] = []
seen: set[str] = set()
for part in str(raw or "").split(","):
token = part.strip()
if not token:
continue
m = re.match(r"^(\d{1,2}):(\d{2})$", token)
if not m:
continue
hh = int(m.group(1))
mm = int(m.group(2))
if hh < 0 or hh > 23 or mm < 0 or mm > 59:
continue
norm = f"{hh:02d}:{mm:02d}"
if norm in seen:
continue
seen.add(norm)
out.append(norm)
return sorted(out)
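# Usage sketch, matching the validation above (one- or two-digit hours,
# exactly two-digit minutes, values range-checked, de-duplicated, sorted):
assert _parse_schedule_times_csv("22:00, 6:30, 24:00, 22:00, junk") == ["06:30", "22:00"]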
def _parse_schedule_days_csv(raw: str | None) -> list[int]:
day_map = {
"mon": 0,
"monday": 0,
"tue": 1,
"tues": 1,
"tuesday": 1,
"wed": 2,
"wednesday": 2,
"thu": 3,
"thurs": 3,
"thursday": 3,
"fri": 4,
"friday": 4,
"sat": 5,
"saturday": 5,
"sun": 6,
"sunday": 6,
}
out: list[int] = []
seen: set[int] = set()
for part in str(raw or "").split(","):
token = part.strip().lower()
if not token:
continue
wd = None
if token.isdigit():
v = int(token)
if 0 <= v <= 6:
wd = v
else:
wd = day_map.get(token)
if wd is None or wd in seen:
continue
seen.add(wd)
out.append(wd)
return sorted(out)
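# Usage sketch: day names and numeric weekday tokens mix freely, duplicates
# are dropped, and output is sorted (0 = Monday ... 6 = Sunday):
assert _parse_schedule_days_csv("Mon, tuesday, 5, sun, mon") == [0, 1, 5, 6]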
def _get_manual_schedule_for_job(job: Job | None):
"""Return normalized manual schedule payload or None when not configured/invalid."""
if not job:
return None
stype = str(getattr(job, "schedule_type", "") or "").strip().lower()
if not stype:
return None
times = _parse_schedule_times_csv(getattr(job, "schedule_times", None))
if not times:
return None
if stype == "daily":
weekly_map = {i: list(times) for i in range(7)}
return {"mode": "weekly", "weekly_map": weekly_map}
if stype == "weekly":
weekly_map = {i: [] for i in range(7)}
for wd in _parse_schedule_days_csv(getattr(job, "schedule_days_of_week", None)):
weekly_map[wd] = list(times)
if not any(weekly_map.get(i) for i in range(7)):
return None
return {"mode": "weekly", "weekly_map": weekly_map}
if stype == "monthly":
try:
dom = int(getattr(job, "schedule_day_of_month", None) or 0)
except Exception:
dom = 0
if dom < 1 or dom > 31:
return None
return {"mode": "monthly", "monthly": {"day_of_month": dom, "times": list(times)}}
return None
def _get_effective_schedule_for_job(job: Job | None):
"""Resolve schedule with precedence: manual override first, inferred fallback."""
empty_weekly = {i: [] for i in range(7)}
if not job:
return {"source": "none", "weekly_map": empty_weekly, "monthly": None}
manual = _get_manual_schedule_for_job(job)
if manual:
if manual.get("mode") == "weekly":
return {"source": "manual", "weekly_map": manual.get("weekly_map") or empty_weekly, "monthly": None}
return {"source": "manual", "weekly_map": empty_weekly, "monthly": manual.get("monthly")}
inferred_weekly = _infer_schedule_map_from_runs(job.id) or empty_weekly
has_weekly = any((inferred_weekly.get(i) or []) for i in range(7))
if has_weekly:
return {"source": "inferred_weekly", "weekly_map": inferred_weekly, "monthly": None}
monthly = _infer_monthly_schedule_from_runs(job.id)
if monthly:
return {"source": "inferred_monthly", "weekly_map": empty_weekly, "monthly": monthly}
return {"source": "none", "weekly_map": inferred_weekly, "monthly": None}
def _get_expected_times_for_job_on_date(job: Job | None, target_date) -> list[str]:
"""Return expected HH:MM slots for a job on a specific date using effective schedule."""
if not job or not target_date:
return []
resolved = _get_effective_schedule_for_job(job)
weekly_map = resolved.get("weekly_map") or {}
expected_times = list(weekly_map.get(int(target_date.weekday())) or [])
if expected_times:
return expected_times
monthly = resolved.get("monthly")
if not monthly:
return []
try:
dom = int(monthly.get("day_of_month") or 0)
except Exception:
dom = 0
mtimes = monthly.get("times") or []
if dom <= 0 or not mtimes:
return []
try:
last_dom = calendar.monthrange(target_date.year, target_date.month)[1]
except Exception:
last_dom = target_date.day
scheduled_dom = dom if dom <= last_dom else last_dom
if int(target_date.day) != int(scheduled_dom):
return []
return list(mtimes)
def _schedule_map_to_desc(schedule_map):
weekday_names = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
any_times = any(schedule_map.get(i) for i in range(7))

View File

@@ -29,6 +29,8 @@ def tickets_page():
if tab == "tickets":
query = Ticket.query
joined_scope = False
if active_only:
query = query.filter(Ticket.resolved_at.is_(None))
if q:
like_q = f"%{q}%"
query = (
@@ -66,8 +68,6 @@ def tickets_page():
ticket_ids = [t.id for t in tickets_raw]
customer_map = {}
run_count_map = {}
scope_total_map = {}
scope_open_map = {}
if ticket_ids:
try:
@@ -113,31 +113,6 @@ def tickets_page():
except Exception:
run_count_map = {}
try:
rows = (
db.session.execute(
text(
"""
SELECT
ticket_id,
COUNT(*) AS total_count,
SUM(CASE WHEN resolved_at IS NULL THEN 1 ELSE 0 END) AS open_count
FROM ticket_scopes
WHERE ticket_id = ANY(:ids)
GROUP BY ticket_id
"""
),
{"ids": ticket_ids},
)
.fetchall()
)
for tid, total_cnt, open_cnt in rows:
scope_total_map[int(tid)] = int(total_cnt or 0)
scope_open_map[int(tid)] = int(open_cnt or 0)
except Exception:
scope_total_map = {}
scope_open_map = {}
for t in tickets_raw:
customers_for_ticket = customer_map.get(t.id) or []
if customers_for_ticket:
@@ -166,11 +141,6 @@ def tickets_page():
scope_summary = " / ".join([p for p in parts if p]) or "-"
except Exception:
scope_summary = "-"
total_scopes = int(scope_total_map.get(t.id, 0) or 0)
open_scopes = int(scope_open_map.get(t.id, 0) or 0)
active_effective = (t.resolved_at is None) and (total_scopes == 0 or open_scopes > 0)
if active_only and not active_effective:
continue
tickets.append(
{
@@ -180,7 +150,7 @@ def tickets_page():
"active_from_date": str(getattr(t, "active_from_date", "") or ""),
"start_date": _format_datetime(t.start_date),
"resolved_at": _format_datetime(t.resolved_at) if t.resolved_at else "",
"active": bool(active_effective),
"active": t.resolved_at is None,
"customers": customer_display,
"scope_summary": scope_summary,
"linked_runs": run_count_map.get(t.id, 0),

View File

@@ -631,46 +631,6 @@ def migrate_remarks_active_from_date() -> None:
print("[migrations] remarks.active_from_date added and backfilled.")
def migrate_remarks_source_and_ticket_id() -> None:
"""Ensure remarks.source and remarks.ticket_id exist."""
table = "remarks"
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for remarks source/ticket migration: {exc}")
return
try:
with engine.begin() as conn:
if not _column_exists_on_conn(conn, table, "source"):
conn.execute(text('ALTER TABLE "remarks" ADD COLUMN source VARCHAR(64)'))
conn.execute(
text(
"""
UPDATE "remarks"
SET source = 'manual'
WHERE source IS NULL OR source = '';
"""
)
)
try:
conn.execute(text('ALTER TABLE "remarks" ALTER COLUMN source SET NOT NULL'))
except Exception:
pass
if not _column_exists_on_conn(conn, table, "ticket_id"):
conn.execute(text('ALTER TABLE "remarks" ADD COLUMN ticket_id INTEGER REFERENCES tickets(id)'))
conn.execute(text('CREATE INDEX IF NOT EXISTS idx_remarks_source ON remarks (source)'))
conn.execute(text('CREATE INDEX IF NOT EXISTS idx_remarks_ticket_id ON remarks (ticket_id)'))
except Exception as exc:
print(f"[migrations] Failed migrate_remarks_source_and_ticket_id: {exc}")
return
print("[migrations] migrate_remarks_source_and_ticket_id completed.")
def migrate_overrides_match_columns() -> None:
"""Add match_status / match_error columns to overrides table if missing."""
engine = db.get_engine()
@@ -1468,7 +1428,6 @@ def run_migrations() -> None:
migrate_tickets_active_from_date()
migrate_tickets_resolved_origin()
migrate_remarks_active_from_date()
migrate_remarks_source_and_ticket_id()
migrate_overrides_match_columns()
migrate_job_runs_review_tracking()
migrate_job_runs_override_metadata()
@@ -2261,8 +2220,6 @@ def migrate_object_persistence_tables() -> None:
id SERIAL PRIMARY KEY,
title VARCHAR(255),
body TEXT NOT NULL,
source VARCHAR(64) NOT NULL DEFAULT 'manual',
ticket_id INTEGER REFERENCES tickets(id),
start_date TIMESTAMP,
resolved_at TIMESTAMP,
created_at TIMESTAMP NOT NULL,

View File

@@ -584,8 +584,6 @@ class Remark(db.Model):
id = db.Column(db.Integer, primary_key=True)
title = db.Column(db.String(255))
body = db.Column(db.Text, nullable=False)
source = db.Column(db.String(64), nullable=False, default="manual")
ticket_id = db.Column(db.Integer, db.ForeignKey("tickets.id"), nullable=True)
# Date (Europe/Amsterdam) from which this remark should be considered active
# for the scoped job(s) in Daily Jobs / Job Details views.

View File

@@ -265,7 +265,7 @@ _ABB_SUBJECT_RE = re.compile(r"\bactive\s+backup\s+for\s+business\b", re.I)
# "backup task vSphere-Task-1 on KANTOOR-NEW was skipped"
_ABB_COMPLETED_RE = re.compile(
r"\b(?:virtuele\s+machine\s+)?(?:de\s+)?back-?up\s*(?:taak|job)\s+(?:van\s+deze\s+taak\s+)?(?P<job>.+?)\s+op\s+(?P<host>[A-Za-z0-9._-]+)\s+is\s+(?P<status>voltooid|gedeeltelijk\s+voltooid|genegeerd)\b"
r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+(?:task|job)\s+(?P<job_en>.+?)\s+on\s+(?P<host_en>[A-Za-z0-9._-]+)\s+(?:(?:is|was|has\s+been|has)\s+)?(?P<status_en>completed|finished|partially\s+completed|skipped|ignored)\b",
r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+(?:task|job)\s+(?P<job_en>.+?)\s+on\s+(?P<host_en>[A-Za-z0-9._-]+)\s+(?:is\s+|was\s+)?(?P<status_en>completed|finished|has\s+completed|partially\s+completed|skipped|ignored)\b",
re.I,
)
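# Quick check of the pattern against the English sample quoted in the comment
# above (group names as defined in the main-side alternative):
m = _ABB_COMPLETED_RE.search("backup task vSphere-Task-1 on KANTOOR-NEW was skipped")
assert m is not None and m.group("job_en") == "vSphere-Task-1"
assert m.group("host_en") == "KANTOOR-NEW" and m.group("status_en") == "skipped"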

View File

@@ -193,6 +193,9 @@ def link_open_internal_tickets_to_run(*, run: JobRun, job: Job) -> None:
except Exception:
rows = []
if not rows:
return
# Link all open tickets to this run (idempotent)
for tid, code, t_resolved, ts_resolved in rows:
if not TicketJobRun.query.filter_by(ticket_id=int(tid), job_run_id=int(run.id)).first():

View File

@@ -173,24 +173,6 @@ body.bc-body {
color: var(--bc-sidebar-active-text);
}
.bc-sidebar-link-inline {
display: inline-flex;
align-items: center;
padding: 2px 6px;
margin: -2px -6px;
border-radius: 6px;
text-decoration: none;
color: var(--bc-sidebar-text);
font-weight: 500;
transition: background var(--bc-transition), color var(--bc-transition);
}
.bc-sidebar-link-inline:hover {
background: rgba(255,255,255,0.06);
color: var(--bc-sidebar-text-hover);
}
.bc-nav-icon { display: flex; align-items: center; flex-shrink: 0; }
.bc-nav-label-text { flex: 1; overflow: hidden; text-overflow: ellipsis; }
@@ -430,25 +412,6 @@ body.bc-body {
white-space: nowrap;
}
/* Keep long object/datasource values fully visible in operational modals. */
#rcm_objects_table th,
#rcm_objects_table td,
#dj_objects_table th,
#dj_objects_table td,
#run_msg_objects_container table th,
#run_msg_objects_container table td {
white-space: normal;
overflow-wrap: anywhere;
word-break: break-word;
}
#rcm_cove_datasources,
#jdm_cove_datasources {
white-space: normal;
overflow-wrap: anywhere;
word-break: break-word;
}
/* Cards */
.card {
border-radius: var(--bc-radius);

View File

@@ -4,65 +4,16 @@
<h1>Company Mapping</h1>
<p class="lead">
Map each Backupchecks customer to the correct Autotask company.
Map customers to Autotask companies.
</p>
<div class="doc-callout doc-callout-warning">
<strong>Required for Autotask actions:</strong><br>
Creating or linking Autotask tickets from Run Checks requires a valid customer mapping status.
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
<h2>Where to Manage Mappings</h2>
<ol>
<li>Open <a href="{{ url_for('main.customers') }}"><strong>Customers</strong></a>.</li>
<li>Use <strong>Edit</strong> on a customer.</li>
<li>In the Autotask mapping section:
<ul>
<li>Search Autotask companies</li>
<li>Select result</li>
<li>Set mapping / Refresh status / Clear mapping</li>
</ul>
</li>
</ol>
<h2>Mapping Status Values</h2>
<ul>
<li><strong>OK</strong>: mapping is valid.</li>
<li><strong>Renamed</strong>: company still exists but name changed in Autotask.</li>
<li><strong>Missing</strong>: temporary lookup/API issue.</li>
<li><strong>Invalid</strong>: mapped company no longer exists (e.g. 404).</li>
<li><strong>Not mapped</strong>: no company ID set.</li>
</ul>
<h2>Refresh Operations</h2>
<ul>
<li>Per customer: <strong>Refresh status</strong> in the edit modal.</li>
<li>Bulk: <strong>Refresh all Autotask mappings</strong> on Customers page.</li>
</ul>
<h2>Import/Export Behavior</h2>
<ul>
<li>Customer CSV supports optional Autotask ID/name columns.</li>
<li>Import can include mapping IDs only when explicit option is enabled.</li>
<li>After import with IDs, mapping status is revalidated via refresh.</li>
</ul>
<h2>Troubleshooting</h2>
<ul>
<li>If company search fails, verify integration setup in Settings.</li>
<li>If status stays <strong>Missing</strong>, check API connectivity/permissions.</li>
<li>If status is <strong>Invalid</strong>, remap to the current Autotask company.</li>
</ul>
<h2>Next Steps</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='autotask', page='creating-tickets') }}">Creating Tickets</a></li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='ticket-management') }}">Ticket Management</a></li>
</ul>
{% endblock %}

View File

@@ -4,63 +4,16 @@
<h1>Creating Tickets</h1>
<p class="lead">
Use Run Checks to create new Autotask tickets or link existing ones to active runs.
Manually create tickets for failed backups.
</p>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Ticket actions are available in the Run Checks modal for admin/operator roles.
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
<h2>Before You Start</h2>
<ul>
<li>Autotask integration enabled and configured.</li>
<li>Customer has a valid Autotask mapping (<strong>OK</strong> or <strong>Renamed</strong>).</li>
<li>Run is selected in Run Checks modal.</li>
</ul>
<h2>Create New Ticket</h2>
<ol>
<li>Open Run Checks and select a run.</li>
<li>Click <strong>Create</strong> (or <strong>Create new</strong> when previous linked ticket is resolved/deleted).</li>
<li>Backupchecks creates ticket using configured queue/source/status/priority defaults.</li>
<li>Ticket info is stored on the run and visible in the modal.</li>
</ol>
<h2>Link Existing Ticket</h2>
<ol>
<li>Click <strong>Link existing</strong>.</li>
<li>Search tickets by title/number.</li>
<li>Select and confirm <strong>Link</strong>.</li>
</ol>
<p>Behavior:</p>
<ul>
<li>Link propagates to all active (unreviewed) runs of the same job.</li>
<li>Terminal/completed tickets are blocked from linking.</li>
<li>Cross-company linking is allowed for overarching/shared tickets.</li>
<li>Backupchecks attempts to post a note on the linked Autotask ticket.</li>
</ul>
<h2>Internal Ticket Relationship</h2>
<p>When Autotask tickets are created/linked, Backupchecks also keeps internal ticket linkage for consistent indicators and history behavior.</p>
<h2>Troubleshooting</h2>
<ul>
<li><strong>No tickets found:</strong> verify customer mapping and search terms.</li>
<li><strong>Link blocked:</strong> check whether ticket is terminal/completed.</li>
<li><strong>Create fails:</strong> validate default queue/source/status configuration in settings.</li>
</ul>
<h2>Next Steps</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='autotask', page='ticket-management') }}">Ticket Management</a></li>
<li><a href="{{ url_for('documentation.page', section='backup-review', page='run-checks-modal') }}">Run Checks Modal</a></li>
</ul>
{% endblock %}

View File

@@ -4,66 +4,16 @@
<h1>Setup & Configuration</h1>
<p class="lead">
Configure Autotask PSA integration so operators can create and link PSA tickets from Run Checks.
Configure Autotask PSA integration.
</p>
<div class="doc-callout doc-callout-info">
<strong>Scope:</strong><br>
This page covers technical setup in Backupchecks Settings. Customer-to-company mapping is documented separately.
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
<h2>Prerequisites</h2>
<ul>
<li>Autotask API user with access to tickets and companies.</li>
<li>Autotask API integration tracking identifier.</li>
<li>Backupchecks admin access to <strong>Settings</strong>.</li>
</ul>
<h2>Step 1: Enable Integration</h2>
<ol>
<li>Open <strong>Settings</strong>.</li>
<li>In the Autotask section, enable integration.</li>
<li>Select environment (<code>production</code> or <code>sandbox</code>).</li>
<li>Fill in username, password, and tracking identifier.</li>
<li>Optional: set <strong>Autotask Base URL</strong> for links in notes/details.</li>
</ol>
<h2>Step 2: Configure Ticket Defaults</h2>
<p>Backupchecks ticket creation needs default values from Autotask reference data.</p>
<ul>
<li>Default queue</li>
<li>Default ticket source</li>
<li>Default ticket status</li>
<li>Priority for warning and error conditions</li>
</ul>
<p>Use the reference-data refresh action if dropdowns are empty.</p>
<h2>Step 3: Validate Connection</h2>
<ol>
<li>Use the <strong>Test connection</strong> action in settings.</li>
<li>Confirm the API call succeeds.</li>
<li>If test fails, re-check credentials, environment, and integration code.</li>
</ol>
<h2>Operational Notes</h2>
<ul>
<li>Passwords are stored in settings but not shown back in plain text.</li>
<li>Settings changes are audit logged in Backupchecks logging.</li>
<li>Ticket creation in Run Checks depends on customer mapping being valid.</li>
</ul>
<h2>Next Steps</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='autotask', page='company-mapping') }}">Company Mapping</a> - map customers to Autotask companies.</li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='creating-tickets') }}">Creating Tickets</a> - create or link tickets from Run Checks.</li>
<li><a href="{{ url_for('main.settings') }}">Open Settings</a></li>
</ul>
{% endblock %}

View File

@@ -4,61 +4,16 @@
<h1>Ticket Management</h1>
<p class="lead">
Understand ticket lifecycle, synchronization behavior, and operator actions after linking.
Manage and track Autotask tickets.
</p>
<h2>Lifecycle in Run Checks</h2>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<ul>
<li>Linked ticket info is shown per run (ticket number/state).</li>
<li>If linked ticket is resolved or deleted in PSA, state is reflected in Backupchecks.</li>
<li>When a linked ticket is resolved/deleted, operators can create a new ticket for the same run.</li>
</ul>
<p>Detailed content will be added here in a future update.</p>
<h2>Resolve Note Action</h2>
<ol>
<li>In Run Checks modal, click <strong>Resolve</strong> on Autotask ticket section.</li>
<li>Backupchecks posts a user-visible update note to the PSA ticket.</li>
<li>If ticket has no time entries, Autotask can close it; otherwise it may stay open.</li>
</ol>
<h2>Propagation and Scope</h2>
<ul>
<li>Link-existing operation propagates to all active runs of the job.</li>
<li>New runs can inherit open ticket context through the link-based internal ticket model.</li>
<li>Resolved internal ticket state stops propagation to future runs.</li>
</ul>
<h2>Deleted vs Resolved</h2>
<ul>
<li><strong>Resolved:</strong> ticket still exists in PSA but is closed/completed.</li>
<li><strong>Deleted:</strong> ticket was removed in PSA; deletion metadata is stored on runs.</li>
</ul>
<h2>Operational Best Practices</h2>
<ul>
<li>Use <strong>Link existing</strong> for umbrella incidents affecting multiple customers/jobs.</li>
<li>Use <strong>Create</strong> when no suitable active PSA ticket exists.</li>
<li>Always complete Run Checks review after ticket actions.</li>
</ul>
<h2>Troubleshooting</h2>
<ul>
<li>If resolve note fails, ticket link can still remain valid; check warning message.</li>
<li>If state looks stale, refresh Run Checks details and verify PSA accessibility.</li>
<li>If no Autotask actions appear, verify role permissions and integration settings.</li>
</ul>
<h2>See Also</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='autotask', page='setup-configuration') }}">Setup & Configuration</a></li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='company-mapping') }}">Company Mapping</a></li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='creating-tickets') }}">Creating Tickets</a></li>
</ul>
{% endblock %}

View File

@@ -207,7 +207,7 @@
<div class="doc-callout doc-callout-warning">
<strong>⚠️ Always Mark as Reviewed:</strong><br>
Regardless of which action you take (override, remark, or ticket), you <strong>must always mark the job as reviewed</strong> afterwards. If you don't, the run will remain in the Run Checks list and can show the wrong status the next day (only the first unreviewed status is displayed).
Regardless of which action you take (override, remark, or ticket), you <strong>must always mark the run as reviewed</strong> afterwards. If you don't, the run will remain in the Run Checks list and can show the wrong status the next day (only the first unreviewed status is displayed).
</div>
<h2>Stage 7: Mark as Reviewed</h2>
@@ -217,7 +217,7 @@
<h3>What Happens</h3>
<ul>
<li>Click <strong>Mark as reviewed</strong> in the Run Checks modal (or select multiple jobs and mark all at once)</li>
<li>Click <strong>Mark as reviewed</strong> in the Run Checks modal (or select multiple runs and mark all at once)</li>
<li>The run is marked with a review timestamp</li>
<li>The run disappears from the Run Checks page</li>
<li>A review audit record is created (visible to Admins)</li>
@@ -225,7 +225,7 @@
<div class="doc-callout doc-callout-info">
<strong>💡 Goal: Empty Run Checks Page:</strong><br>
The objective is to have the <strong>Run Checks page completely empty</strong> - this means all runs have been reviewed. Even successful backups must be marked as reviewed, because they haven't been <em>looked at</em>, but they still need to be marked to clear the Run Checks page. You can select multiple jobs and mark them all as reviewed at once for efficiency.
The objective is to have the <strong>Run Checks page completely empty</strong> - this means all runs have been reviewed. Even successful backups must be marked as reviewed, because they haven't been <em>looked at</em>, but they still need to be marked to clear the Run Checks page. You can select multiple runs and mark them all as reviewed at once for efficiency.
</div>
<h3>Bulk Review</h3>
@@ -245,11 +245,11 @@
<ul>
<li><strong>Start with Run Checks every day:</strong> Open Run Checks page first thing every morning - this is your primary workflow</li>
<li><strong>Goal: Empty Run Checks page:</strong> Mark all jobs as reviewed (including successful ones) until the page is completely empty</li>
<li><strong>Goal: Empty Run Checks page:</strong> Mark all runs as reviewed (even successful ones) until the page is completely empty</li>
<li><strong>Approve inbox emails quickly:</strong> New customer emails won't appear in Run Checks until approved</li>
<li><strong>Triage by status:</strong> Review red (Failed) before yellow (Warning) before green (Success)</li>
<li><strong>Use bulk review for successes:</strong> Select multiple successful jobs and mark them all at once</li>
<li><strong>Use bulk review for successes:</strong> Select multiple successful runs and mark them all at once</li>
<li><strong>Always mark as reviewed:</strong> Even after creating tickets/remarks/overrides, you must still mark the job as reviewed</li>
<li><strong>Always mark as reviewed:</strong> Even after creating tickets/remarks/overrides, you must still mark the run as reviewed</li>
<li><strong>Use overrides for recurring acceptable warnings:</strong> Don't mark the same warning as reviewed every day - create an override</li>
<li><strong>Create tickets for customer action items:</strong> Formal tracking ensures followup</li>
<li><strong>Use remarks for temporary notes:</strong> Don't create tickets for short-term issues</li>

View File

@@ -235,7 +235,7 @@
<div class="doc-callout doc-callout-info">
<strong>💡 Review Workflow:</strong><br>
Jobs with warnings or failures typically need investigation first. Successful runs can also appear in Run Checks until they are explicitly marked as reviewed (often via bulk review). The Run Checks workflow helps you triage and clear the unreviewed queue.
Only jobs with warnings or failures need to be reviewed. Successful jobs are automatically considered reviewed. The Run Checks workflow is designed to help you quickly triage and acknowledge known issues.
</div>
<h2>Common Workflows</h2>

View File

@@ -384,7 +384,7 @@
<p>BackupChecks monitors Autotask for ticket status changes and can automatically resolve internal tickets when the corresponding Autotask ticket is resolved or deleted.</p>
<p>See <a href="{{ url_for('documentation.page', section='autotask', page='setup-configuration') }}">Autotask Integration</a> for details on setup and PSA ticket synchronization.</p>
<p>See <a href="{{ url_for('documentation.page', section='autotask', page='overview') }}">Autotask Integration</a> for details on setup and PSA ticket synchronization.</p>
<h2>Troubleshooting</h2>
@@ -423,7 +423,7 @@
<li><a href="{{ url_for('documentation.page', section='backup-review', page='daily-jobs') }}">Daily Jobs View</a> - See how ticket and remark indicators appear in daily monitoring</li>
<li><a href="{{ url_for('documentation.page', section='backup-review', page='run-checks-modal') }}">Run Checks Modal</a> - Create tickets and remarks while reviewing job runs</li>
<li><a href="{{ url_for('documentation.page', section='backup-review', page='overrides') }}">Overrides & Exceptions</a> - Handle known issues with automated rules</li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='setup-configuration') }}">Autotask Integration</a> - Learn about PSA integration and ticket synchronization</li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='overview') }}">Autotask Integration</a> - Learn about PSA integration and ticket synchronization</li>
</ul>
{% endblock %}

View File

@@ -170,7 +170,7 @@
<div class="doc-callout doc-callout-info">
<strong>💡 Autotask Integration:</strong><br>
Autotask ticket information only appears if the Autotask integration is enabled in Settings and a ticket was created for this run. See <a href="{{ url_for('documentation.page', section='autotask', page='setup-configuration') }}">Autotask Integration</a> for details.
Autotask ticket information only appears if the Autotask integration is enabled in Settings and a ticket was created for this run. See <a href="{{ url_for('documentation.page', section='autotask', page='overview') }}">Autotask Integration</a> for details.
</div>
<h2>Review Actions</h2>

View File

@@ -1,65 +0,0 @@
{% extends "documentation/base.html" %}
{% block doc_content %}
<h1>Cove Data Protection</h1>
<p class="lead">
Integrate N-able Cove backup accounts into Backupchecks using the Cove API.
</p>
<div class="doc-callout doc-callout-info">
<strong>Scope:</strong><br>
Cove integration is API-based (not email-based). Accounts are staged first, then linked to jobs.
</div>
<h2>How the Cove Flow Works</h2>
<ol>
<li>Cove import reads account statistics from the Cove API.</li>
<li>Accounts are upserted into the Cove staging table.</li>
<li>Unlinked accounts appear on <strong>Cove Accounts</strong>.</li>
<li>Operator links account to an existing job or creates a new job.</li>
<li>After linking, an immediate import runs so new Cove runs appear right away.</li>
</ol>
<h2>Open Cove Accounts</h2>
<ol>
<li>Open <a href="{{ url_for('main.cove_accounts') }}"><strong>Cove Accounts</strong></a>.</li>
<li>Review <strong>unmatched</strong> accounts first.</li>
<li>Link each account to a job (create or existing).</li>
</ol>
<h2>Settings and Operations</h2>
<ul>
<li>Configure Cove under <strong>Settings → Integrations</strong>.</li>
<li>Use <strong>Test connection</strong> to validate credentials and partner access.</li>
<li>Use <strong>Run import now</strong> for manual sync.</li>
<li>Background import runs on configured interval when enabled.</li>
</ul>
<h2>Run Visibility</h2>
<ul>
<li>Cove runs are stored as API runs (<code>source_type = cove_api</code>).</li>
<li>Cove run details are visible in Job Detail and Run Checks with a Cove summary panel.</li>
<li>No mail message is required for Cove runs.</li>
<li>Historical backfill can create runs from Cove 28-day colorbar data.</li>
</ul>
<h2>Troubleshooting</h2>
<ul>
<li>If <strong>Cove Accounts</strong> is empty, run a manual import and check credentials.</li>
<li>If a linked account shows no runs, verify link target job and import logs.</li>
<li>If statuses look outdated, run <strong>Run import now</strong> and refresh Run Checks/Job Detail.</li>
</ul>
<h2>See Also</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='backup-review', page='run-checks-modal') }}">Run Checks Modal</a></li>
<li><a href="{{ url_for('documentation.page', section='customers-jobs', page='approved-jobs') }}">Approved Jobs</a></li>
</ul>
{% endblock %}

View File

@@ -1,63 +0,0 @@
{% extends "documentation/base.html" %}
{% block doc_content %}
<h1>Veeam Cloud Connect</h1>
<p class="lead">
Review and link Cloud Connect tenant rows from shared report emails to Backupchecks jobs.
</p>
<div class="doc-callout doc-callout-info">
<strong>Scope:</strong><br>
Cloud Connect imports from report emails, stages accounts, then creates runs only for linked accounts.
</div>
<h2>How the Cloud Connect Flow Works</h2>
<ol>
<li>Cloud Connect report emails are parsed from Inbox mail storage.</li>
<li>Per-tenant rows are upserted into <strong>Cloud Connect Accounts</strong>.</li>
<li>Unlinked rows are reviewed and linked to jobs.</li>
<li>On link, historical report mails are re-processed for that account/job mapping.</li>
</ol>
<h2>Open Cloud Connect Accounts</h2>
<ol>
<li>Open <a href="{{ url_for('main.cloud_connect_accounts') }}"><strong>Cloud Connect Accounts</strong></a>.</li>
<li>Use <strong>Scan inbox</strong> to re-process stored Cloud Connect report mails.</li>
<li>Link each account row to an existing job or create a new one.</li>
</ol>
<h2>Important Mapping Notes</h2>
<ul>
<li>Rows are account-level; one user can have multiple repositories.</li>
<li>Repository-level linking is supported (separate rows/jobs where needed).</li>
<li>Unlinking removes mapping only; historical runs remain as records.</li>
</ul>
<h2>Run Visibility</h2>
<ul>
<li>Cloud Connect runs are stored with <code>source_type = cloud_connect</code>.</li>
<li>Run Checks and Job Detail show structured Cloud Connect summaries.</li>
<li>Shared source report email can still be inspected from detail UI.</li>
</ul>
<h2>Troubleshooting</h2>
<ul>
<li>If expected rows are missing, run <strong>Scan inbox</strong> and verify report mail type.</li>
<li>If runs are missing after link, re-link check + scan inbox again.</li>
<li>If one tenant has multiple repos, verify each repo row is linked as intended.</li>
</ul>
<h2>See Also</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='mail-import', page='inbox-management') }}">Inbox Management</a></li>
<li><a href="{{ url_for('documentation.page', section='backup-review', page='run-checks-modal') }}">Run Checks Modal</a></li>
<li><a href="{{ url_for('documentation.page', section='customers-jobs', page='approved-jobs') }}">Approved Jobs</a></li>
</ul>
{% endblock %}

View File

@@ -4,58 +4,16 @@
<h1>Autotask Integration</h1>
<p class="lead">
Configure Autotask connectivity and ticket defaults used by Run Checks ticket actions.
Configure Autotask API settings.
</p>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Open <a href="{{ url_for('main.settings', section='integrations') }}"><strong>Settings -> Integrations</strong></a> and use the <strong>Autotask</strong> cards.
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
<h2>Autotask Connection Settings</h2>
<ul>
<li><strong>Enable Autotask integration</strong></li>
<li><strong>Environment:</strong> Sandbox or Production</li>
<li><strong>API Username</strong></li>
<li><strong>API Password</strong> (leave empty to keep stored password)</li>
<li><strong>Tracking Identifier (Integration Code)</strong></li>
<li><strong>Backupchecks Base URL:</strong> used for links in ticket notes/details</li>
</ul>
<h2>Ticket Defaults</h2>
<p>Required for ticket creation from Run Checks:</p>
<ul>
<li><strong>Default Queue</strong></li>
<li><strong>Ticket Source</strong></li>
<li><strong>Default Ticket Status</strong></li>
<li><strong>Priority for Warning</strong></li>
<li><strong>Priority for Error</strong></li>
</ul>
<h2>Diagnostics & Reference Data</h2>
<ul>
<li><strong>Test connection:</strong> validates API credentials/access.</li>
<li><strong>Refresh reference data:</strong> reloads queues, sources, statuses, priorities.</li>
<li>When cache is missing and integration is valid, reference data can auto-load on Settings page open.</li>
</ul>
<h2>Operational Notes</h2>
<ul>
<li>Customer mappings are configured in Customers, not in Settings.</li>
<li>Settings updates are audit logged (password never logged).</li>
<li>If dropdowns are empty, refresh reference data first.</li>
</ul>
<h2>Related Pages</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='autotask', page='setup-configuration') }}">Autotask: Setup & Configuration</a></li>
<li><a href="{{ url_for('documentation.page', section='autotask', page='company-mapping') }}">Autotask: Company Mapping</a></li>
<li><a href="{{ url_for('main.customers') }}">Open Customers</a></li>
</ul>
{% endblock %}

View File

@@ -2,77 +2,120 @@
{% block doc_content %}
<h1>Microsoft Entra SSO</h1> <h1>Microsoft Entra SSO</h1>
<p>Use Microsoft Entra ID (Azure AD) to let users sign in with their Microsoft account.</p>
<p class="lead">
Configure Microsoft Entra sign-in so users can authenticate with their Microsoft account.
</p>
<div class="doc-callout doc-callout-warning"> <div class="doc-callout doc-callout-warning">
<strong>Status:</strong><br> <strong>Status: Untested in Backupchecks.</strong>
This flow is configurable in Settings and available in the application, but should still be validated in your own tenant before production rollout. This SSO implementation has not yet been end-to-end validated in Backupchecks itself.
Treat this page as implementation guidance for future rollout, not as a confirmed production setup.
</div> </div>
<div class="doc-callout doc-callout-info"> <div class="doc-callout doc-callout-info">
<strong>Where:</strong><br> <strong>Scope:</strong> this page explains the setup for Backupchecks and Microsoft Entra.
Open <a href="{{ url_for('main.settings', section='integrations') }}"><strong>Settings -> Integrations</strong></a> and use the <strong>Microsoft Entra SSO</strong> card. It does not replace your internal identity/security policies.
</div> </div>
<h2>Prerequisites</h2> <h2>Prerequisites</h2>
<ul> <ul>
<li>Admin access to Microsoft Entra (Azure AD).</li> <li>Admin access to your Microsoft Entra tenant.</li>
<li>Admin access to Backupchecks Settings.</li> <li>Admin access to Backupchecks <strong>Settings → Integrations</strong>.</li>
<li>Stable HTTPS URL for Backupchecks.</li> <li>A stable HTTPS URL for Backupchecks (recommended for production).</li>
</ul> </ul>
<h2>Entra App Registration</h2> <h2>Step 1: Register an app in Microsoft Entra</h2>
<ol> <ol>
<li>Create an app registration in Entra.</li> <li>Open <strong>Microsoft Entra admin center</strong><strong>App registrations</strong>.</li>
<li>Capture <strong>Tenant ID</strong> and <strong>Client ID</strong>.</li> <li>Create a new registration (single-tenant is typical for internal use).</li>
<li>Create and copy a <strong>Client Secret</strong>.</li> <li>Set a name, for example <code>Backupchecks SSO</code>.</li>
<li>Add redirect URI: <code>https://your-domain/auth/entra/callback</code>.</li> <li>After creation, copy:
<ul>
<li><strong>Application (client) ID</strong></li>
<li><strong>Directory (tenant) ID</strong></li>
</ul>
</li>
</ol> </ol>
<h2>Backupchecks Fields</h2> <h2>Step 2: Configure redirect URI</h2>
<ol>
<ul> <li>In the app registration, open <strong>Authentication</strong>.</li>
<li><strong>Enable Microsoft sign-in</strong></li> <li>Add a <strong>Web</strong> redirect URI:
<li><strong>Tenant ID</strong></li>
<li><strong>Client ID</strong></li>
<li><strong>Client Secret</strong> (leave empty to keep stored secret)</li>
<li><strong>Redirect URI</strong> (optional override)</li>
<li><strong>Allowed domain/tenant</strong> (optional restriction)</li>
<li><strong>Allowed Entra Group Object ID(s)</strong> (optional hard access gate)</li>
<li><strong>Auto-provision unknown users as Viewer</strong> (optional)</li>
</ul>
<h2>Group Restriction Notes</h2>
<ul>
<li>Use Entra security group object IDs (not display names).</li>
<li>User must be member of at least one configured group.</li>
<li>Ensure ID token includes <code>groups</code> claim.</li>
<li>When Entra returns group overage tokens without inline groups, access is denied by design.</li>
</ul>
<h2>User Mapping Behavior</h2>
<ul>
<li>Existing users are matched by username/email mapping logic.</li>
<li>If no local user exists:
<ul> <ul>
<li>Auto-provision disabled: login is rejected.</li> <li><code>https://your-backupchecks-domain/auth/entra/callback</code></li>
<li>Auto-provision enabled: user is created with Viewer role.</li> </ul>
</li>
<li>Save the authentication settings.</li>
</ol>
<h2>Step 3: Create client secret</h2>
<ol>
<li>Open <strong>Certificates &amp; secrets</strong> in the app registration.</li>
<li>Create a new client secret.</li>
<li>Copy the secret value immediately (it is shown only once).</li>
</ol>
<h2>Step 4: Configure Backupchecks</h2>
<ol>
<li>Open <strong>Settings → Integrations → Microsoft Entra SSO</strong>.</li>
<li>Enable <strong>Microsoft sign-in</strong>.</li>
<li>Fill in:
<ul>
<li><strong>Tenant ID</strong></li>
<li><strong>Client ID</strong></li>
<li><strong>Client Secret</strong></li>
<li><strong>Redirect URI</strong> (optional override, leave empty to auto-use callback URL)</li>
<li><strong>Allowed domain/tenant</strong> (optional restriction)</li>
<li><strong>Allowed Entra Group Object ID(s)</strong> (optional but recommended)</li>
</ul>
</li>
<li>Optional: enable <strong>Auto-provision unknown users as Viewer</strong>.</li>
<li>Save settings.</li>
</ol>
<h2>Security Group Restriction (recommended)</h2>
<p>You can enforce that only members of one or more specific Entra security groups can sign in.</p>
<ol>
<li>Create or choose a security group in Entra (for example <code>Backupchecks-Users</code>).</li>
<li>Add the allowed users to that group.</li>
<li>Copy the group <strong>Object ID</strong> (not display name).</li>
<li>Paste one or more group object IDs in:
<ul>
<li><strong>Settings → Integrations → Microsoft Entra SSO → Allowed Entra Group Object ID(s)</strong></li>
</ul>
</li>
<li>In the Entra app registration, configure <strong>Token configuration</strong> to include the <code>groups</code> claim in ID tokens.</li>
</ol>
<div class="doc-callout doc-callout-warning">
<strong>Important:</strong> if users are member of many groups, Entra may return a "group overage" token without inline
<code>groups</code> list. In that case Backupchecks cannot verify membership and login is blocked by design.
</div>
<h2>Step 5: Test sign-in</h2>
<ol>
<li>Open <strong>/auth/login</strong> in a private/incognito browser session.</li>
<li>Click <strong>Sign in with Microsoft</strong>.</li>
<li>Authenticate with an allowed account.</li>
<li>Confirm you are redirected back into Backupchecks.</li>
</ol>
<h2>User mapping behavior</h2>
<ul>
<li>Backupchecks first tries to match Entra user to local user by username/email.</li>
<li>If no match exists:
<ul>
<li>With auto-provision disabled: login is rejected.</li>
<li>With auto-provision enabled: a new local user is created with <strong>Viewer</strong> role.</li>
</ul> </ul>
</li> </li>
</ul> </ul>
<h2>Troubleshooting</h2> <h2>Troubleshooting</h2>
<ul> <ul>
<li><strong>Button not visible:</strong> verify SSO enabled and required IDs/secrets saved.</li> <li><strong>Redirect URI mismatch:</strong> ensure Entra app URI exactly matches Backupchecks callback URI.</li>
<li><strong>Redirect mismatch:</strong> ensure callback URL matches Entra app configuration exactly.</li> <li><strong>SSO button not visible:</strong> check that SSO is enabled and Tenant/Client/Secret are saved.</li>
<li><strong>Access denied:</strong> verify allowed domain/tenant and group membership settings.</li> <li><strong>Account not allowed:</strong> verify tenant/domain restriction in <em>Allowed domain/tenant</em>.</li>
<li><strong>Group restricted login fails:</strong> verify group object IDs and ensure the ID token includes a <code>groups</code> claim.</li>
<li><strong>No local user mapping:</strong> create a matching local user or enable auto-provision.</li>
</ul> </ul>
{% endblock %} {% endblock %}
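The group gate described above boils down to an intersection test on the ID token's `groups` claim, with the overage case treated as a denial. A minimal sketch, assuming the token has already been validated and decoded into a claims dict; names here are illustrative:

```python
def entra_group_allowed(claims: dict, allowed_group_ids: set) -> bool:
    """Allow sign-in only when group membership can be positively verified."""
    if not allowed_group_ids:
        return True  # no group restriction configured
    # Group overage: Entra omits "groups" and points to Graph via the
    # "_claim_names"/"_claim_sources" indirection. Membership cannot be
    # verified from the token alone, so this is denied by design.
    if "groups" not in claims:
        return False
    return bool(set(claims.get("groups") or []) & allowed_group_ids)
```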

View File

@ -4,61 +4,16 @@
<h1>General Settings</h1>
<p class="lead">
Use the General tab to configure core application behavior, display defaults, and safety features.
</p>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Open <a href="{{ url_for('main.settings', section='general') }}"><strong>Settings -> General</strong></a>.
</div>
<h2>System Status</h2>
<ul>
<li><strong>Database size:</strong> current database footprint.</li>
<li><strong>Free disk space:</strong> highlighted in red when low; mail import is blocked below 2 GB.</li>
</ul>
<h2>Daily Jobs</h2>
<ul>
<li><strong>Daily Jobs start date:</strong> defines when missed-run checks start.</li>
<li>Older runs are still used to learn schedules.</li>
</ul>
<h2>Display</h2>
<ul>
<li><strong>Timezone:</strong> controls timestamp rendering across UI pages (Logging, Jobs, Daily Jobs, Run Checks).</li>
</ul>
<h2>Navigation</h2>
<ul>
<li><strong>Require dashboard visit on first page view each day:</strong> forces first daily navigation to Dashboard.</li>
</ul>
<h2>Environment</h2>
<ul>
<li><strong>Sandbox/Development environment:</strong> shows a visual non-production banner across the app.</li>
</ul>
<h2>Security</h2>
<ul>
<li><strong>Enable login captcha:</strong> requires users to solve a simple math challenge on login.</li>
<li>Enabled by default for new and migrated installations.</li>
</ul>
<h2>Audit Logging</h2>
<p>General settings changes are written to the admin audit log for traceability.</p>
<h2>Related Pages</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='settings', page='mail-configuration') }}">Mail Configuration</a></li>
<li><a href="{{ url_for('documentation.page', section='settings', page='maintenance') }}">Maintenance</a></li>
</ul>
{% endblock %}

<h1>General Settings</h1>
<p class="lead">
Configure general system settings.
</p>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
{% endblock %}
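The low-disk safeguard mentioned under System Status is a plain free-space check before import work starts. A sketch, assuming a 2 GB floor and a data-volume path; both values are illustrative:

```python
import shutil

MIN_FREE_BYTES = 2 * 1024 ** 3  # 2 GB floor described above

def mail_import_allowed(data_path: str = "/data") -> bool:
    """Mail import is blocked while free disk space is below the floor."""
    return shutil.disk_usage(data_path).free >= MIN_FREE_BYTES
```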

View File

@ -4,68 +4,16 @@
<h1>Mail Configuration</h1>
<p class="lead">
Configure Microsoft Graph access and import behavior for backup report emails.
</p>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Credentials and folders are in <a href="{{ url_for('main.settings', section='general') }}"><strong>Settings -> General</strong></a>.<br>
Import behavior is in <a href="{{ url_for('main.settings', section='imports') }}"><strong>Settings -> Imports</strong></a>.
</div>
<h2>Graph Connection (General tab)</h2>
<ul>
<li><strong>Tenant ID</strong></li>
<li><strong>Client ID</strong></li>
<li><strong>Client secret</strong> (leave empty to keep stored secret)</li>
<li><strong>Mailbox address</strong></li>
</ul>
<h2>Folder Selection (General tab)</h2>
<ul>
<li><strong>Incoming folder:</strong> source folder read by import.</li>
<li><strong>Processed folder:</strong> destination after processing.</li>
<li>Use <strong>Browse...</strong> to load the folder tree via Graph and select the exact path.</li>
</ul>
<h2>Automatic Import (Imports tab)</h2>
<ul>
<li><strong>Enable automatic mail import</strong></li>
<li><strong>Interval (minutes)</strong></li>
<li><strong>Automatic importer cutoff date:</strong> older messages are ignored.</li>
<li><strong>Manual import batch size:</strong> configurable between 1 and 50.</li>
</ul>
<h2>EML Retention (Imports tab)</h2>
<ul>
<li><strong>Store EML for debugging:</strong> Off, 7 days, or 14 days.</li>
<li>When set to <strong>Off</strong>, stored EML blobs are cleared.</li>
</ul>
<h2>Manual Import</h2>
<ul>
<li>Use <strong>Run import</strong> in Settings -> Imports for one-time import.</li>
<li>Manual import is blocked when free disk space is below 2 GB.</li>
<li>Results and errors are shown via notifications and logged in Logging.</li>
</ul>
<h2>Operational Notes</h2>
<ul>
<li>Automatic importer batch size is fixed to 50 items internally.</li>
<li>Secret fields are write-only in the UI.</li>
<li>Mail/import setting changes are audit logged.</li>
</ul>
<h2>Related Pages</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='mail-import', page='setup') }}">Mail Import Setup</a></li>
<li><a href="{{ url_for('documentation.page', section='mail-import', page='auto-import') }}">Auto-Import Configuration</a></li>
</ul>
{% endblock %}

<h1>Mail Configuration</h1>
<p class="lead">
Configure Graph API mail settings.
</p>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
{% endblock %}
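The Graph connection fields above map onto the usual client-credentials flow plus a folder-scoped message read. A minimal sketch against the public Microsoft Graph endpoints; the real importer's batching, cutoff filtering, and error handling are richer than this, and the secret is taken as a parameter purely for illustration (in the application it is write-only and stored server-side):

```python
import requests

def graph_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Client-credentials token for Microsoft Graph (application permissions)."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data={
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": "https://graph.microsoft.com/.default",
            "grant_type": "client_credentials",
        },
        timeout=15,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def list_incoming(token: str, mailbox: str, folder_id: str, batch_size: int = 50):
    """Read up to batch_size messages from the configured incoming folder."""
    resp = requests.get(
        f"https://graph.microsoft.com/v1.0/users/{mailbox}"
        f"/mailFolders/{folder_id}/messages",
        headers={"Authorization": f"Bearer {token}"},
        params={"$top": batch_size, "$orderby": "receivedDateTime asc"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("value", [])
```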

View File

@ -4,67 +4,16 @@
<h1>Maintenance</h1>
<p class="lead">
Use Maintenance tools for data migration, cleanup, test data generation, and emergency reset actions.
</p>
<div class="doc-callout doc-callout-warning">
<strong>Admin only:</strong><br>
Several actions on this page are destructive and cannot be undone.
</div>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Open <a href="{{ url_for('main.settings', section='maintenance') }}"><strong>Settings -> Maintenance</strong></a>.
</div>
<h2>Approved Jobs Export / Import</h2>
<ul>
<li><strong>Download export (JSON):</strong> exports customers/jobs for migration or restore scenarios.</li>
<li><strong>Import jobs:</strong> updates existing jobs by key (Customer + Backup + Type + Job name) or creates missing jobs.</li>
<li><strong>Include Autotask IDs:</strong> keep disabled when importing into environments with different Autotask datasets.</li>
</ul>
<h2>Object Maintenance</h2>
<ul>
<li><strong>Backfill objects:</strong> rebuilds missing object link data for existing approved runs to repair reporting relationships.</li>
</ul>
<h2>Cleanup Orphaned Jobs</h2>
<ul>
<li><strong>Preview orphaned jobs:</strong> shows jobs without a valid customer link.</li>
<li><strong>Delete orphaned:</strong> removes orphaned jobs and related run/email data.</li>
</ul>
<h2>Generate Test Emails</h2>
<ul>
<li>Generates one Veeam test email per selected status: Success, Warning, or Error.</li>
<li>Useful for parser testing and maintenance validation.</li>
</ul>
<h2>Jobs Maintenance</h2>
<ul>
<li><strong>Delete all jobs:</strong> removes all jobs and job runs.</li>
<li>Related mails are moved back to Inbox (job link removed).</li>
</ul>
<h2>Danger Zone</h2>
<ul>
<li><strong>Reset application:</strong> permanently clears application data and users.</li>
<li>Requires explicit confirmation value <code>RESET</code>.</li>
<li>After reset, the application returns to the initial setup flow.</li>
</ul>
<h2>Best Practices</h2>
<ul>
<li>Always export before destructive operations.</li>
<li>Use the orphaned-job preview before deletion.</li>
<li>Run destructive actions in maintenance windows.</li>
</ul>
{% endblock %}

<h1>Maintenance</h1>
<p class="lead">
System maintenance and data management.
</p>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
{% endblock %}
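The import path matches records on a natural key (Customer + Backup + Type + Job name) and only creates jobs that are missing. A sketch of that merge rule, assuming plain dict records with illustrative field names:

```python
def job_key(rec: dict) -> tuple:
    """Natural key: Customer + Backup software + Backup type + Job name."""
    return tuple(
        (rec.get(f) or "").strip().lower()
        for f in ("customer", "backup_software", "backup_type", "job_name")
    )

def merge_jobs(existing: list, imported: list) -> tuple:
    """Update existing jobs by key; create the ones that are missing."""
    by_key = {job_key(j): j for j in existing}
    updated, created = [], []
    for rec in imported:
        match = by_key.get(job_key(rec))
        if match is not None:
            match.update(rec)        # update-by-key semantics
            updated.append(match)
        else:
            created.append(dict(rec))
    return updated, created
```

Keying on normalized text fields keeps the import idempotent across environments, which is also why Autotask IDs can be excluded from the export.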

View File

@ -4,32 +4,16 @@
<h1>Reporting Settings</h1>
<p class="lead">
There is currently no dedicated Reporting configuration card in Settings.
</p>
<div class="doc-callout doc-callout-warning">
<strong>Current status:</strong><br>
Reporting settings in the Settings page are not implemented as a separate configurable section.
</div>
<h2>What This Means</h2>
<ul>
<li>You will not find a <strong>Reporting</strong> tab/card under Settings.</li>
<li>Global reporting defaults are therefore not managed in Settings at this time.</li>
</ul>
<h2>Related Functionality</h2>
<ul>
<li>Object maintenance/backfill actions that support report data quality are available in <a href="{{ url_for('main.settings', section='maintenance') }}">Settings -> Maintenance</a>.</li>
<li>Reporting feature documentation will be expanded when the reporting module/settings are finalized.</li>
</ul>
<h2>Next Steps</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='reports', page='creating-reports') }}">Reports: Creating Reports</a> (planned/placeholder)</li>
<li><a href="{{ url_for('documentation.page', section='settings', page='maintenance') }}">Settings: Maintenance</a></li>
</ul>
{% endblock %}

<h1>Reporting Settings</h1>
<p class="lead">
Configure reporting defaults.
</p>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
{% endblock %}

View File

@ -4,62 +4,16 @@
<h1>User Management</h1>
<p class="lead">
Admins can create users, change roles, reset passwords, and remove accounts from Settings.
</p>
<div class="doc-callout doc-callout-info">
<strong>Where:</strong><br>
Open <a href="{{ url_for('main.settings', section='users') }}"><strong>Settings -> Users</strong></a>.
</div>
<h2>Available Roles</h2>
<ul>
<li><strong>Admin</strong></li>
<li><strong>Operator</strong></li>
<li><strong>Reporter</strong></li>
<li><strong>Viewer</strong></li>
</ul>
<h2>Create User</h2>
<ol>
<li>Enter <strong>Username</strong>.</li>
<li>Select one or more roles (default fallback is Viewer).</li>
<li>Set an initial password.</li>
<li>Click <strong>Create</strong>.</li>
</ol>
<h2>Update Roles</h2>
<ul>
<li>Role changes are saved per user via the inline <strong>Save</strong> button.</li>
<li>At least one role is always enforced (Viewer fallback).</li>
<li>The last remaining admin cannot lose the Admin role.</li>
</ul>
<h2>Password Reset</h2>
<ul>
<li>Use the per-user reset field in the Actions column.</li>
<li>A new password is required.</li>
</ul>
<h2>Delete User</h2>
<ul>
<li>Use the per-user <strong>Delete</strong> action.</li>
<li>The last remaining admin account cannot be deleted.</li>
</ul>
<h2>Audit Logging</h2>
<p>User creation, role updates, password resets, and deletions are written to the admin audit log.</p>
<h2>Related Pages</h2>
<ul>
<li><a href="{{ url_for('documentation.page', section='users', page='users-and-roles') }}">Users & Roles</a></li>
<li><a href="{{ url_for('documentation.page', section='users', page='profile-settings') }}">Profile Settings</a></li>
</ul>
{% endblock %}

<h1>User Management</h1>
<p class="lead">
Manage users and roles.
</p>
<div class="doc-callout doc-callout-info">
<strong>📝 Coming Soon:</strong>
This page is under construction. Full content will be added in a future update.
</div>
<h2>Content</h2>
<p>Detailed content will be added here in a future update.</p>
{% endblock %}
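Two of the rules above, the Viewer fallback and the last-admin protection, are easy to pin down precisely. A sketch, assuming users are plain dicts with `id` and `roles`; these shapes are illustrative:

```python
VALID_ROLES = {"admin", "operator", "reporter", "viewer"}

def normalize_roles(selected) -> list:
    """At least one role is always enforced; Viewer is the fallback."""
    roles = [r for r in selected if r in VALID_ROLES]
    return roles or ["viewer"]

def may_drop_admin(users: list, user_id) -> bool:
    """The last remaining admin may not lose Admin (or be deleted)."""
    admins = [u for u in users if "admin" in u["roles"]]
    return not (len(admins) == 1 and admins[0]["id"] == user_id)
```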

View File

@ -50,7 +50,7 @@
{% for c in customers %}
<tr>
<td>
<a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="bc-sidebar-link-inline">
<a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="link-primary text-decoration-none">
{{ c.name }}
</a>
</td>

View File

@ -22,59 +22,6 @@
<dt class="col-sm-3">Remarks</dt>
<dd class="col-sm-9">{{ remark_open_count }} open / {{ remark_total_count }} total</dd>
<dt class="col-sm-3">First backup detected</dt>
<dd class="col-sm-9">
{% if first_detected_run_at %}
{{ first_detected_run_at|local_datetime }}
{% else %}
Not detected yet
{% endif %}
</dd>
<dt class="col-sm-3">Schedule (effective)</dt>
<dd class="col-sm-9">
{% if effective_schedule_source == 'manual' %}
<span class="badge bg-primary-subtle text-primary-emphasis border border-primary-subtle">Manual override</span>
{% elif effective_schedule_source == 'inferred_weekly' %}
<span class="badge bg-secondary-subtle text-secondary-emphasis border border-secondary-subtle">Inferred (weekly)</span>
{% elif effective_schedule_source == 'inferred_monthly' %}
<span class="badge bg-secondary-subtle text-secondary-emphasis border border-secondary-subtle">Inferred (monthly)</span>
{% else %}
<span class="badge bg-secondary text-white border border-secondary">No schedule</span>
{% endif %}
<div class="mt-2 small">{{ schedule_desc }}</div>
</dd>
{% if effective_weekly_schedule_map and (effective_weekly_schedule_map[0] or effective_weekly_schedule_map[1] or effective_weekly_schedule_map[2] or effective_weekly_schedule_map[3] or effective_weekly_schedule_map[4] or effective_weekly_schedule_map[5] or effective_weekly_schedule_map[6]) %}
<dt class="col-sm-3">Effective schedule</dt>
<dd class="col-sm-9">
<div class="table-responsive">
<table class="table table-sm table-bordered mb-0">
<thead class="table-light">
<tr><th style="width: 120px;">Day</th><th>Times (15 min blocks)</th></tr>
</thead>
<tbody>
<tr><td>Mon</td><td>{{ ', '.join(effective_weekly_schedule_map[0]) if effective_weekly_schedule_map[0] else '—' }}</td></tr>
<tr><td>Tue</td><td>{{ ', '.join(effective_weekly_schedule_map[1]) if effective_weekly_schedule_map[1] else '—' }}</td></tr>
<tr><td>Wed</td><td>{{ ', '.join(effective_weekly_schedule_map[2]) if effective_weekly_schedule_map[2] else '—' }}</td></tr>
<tr><td>Thu</td><td>{{ ', '.join(effective_weekly_schedule_map[3]) if effective_weekly_schedule_map[3] else '—' }}</td></tr>
<tr><td>Fri</td><td>{{ ', '.join(effective_weekly_schedule_map[4]) if effective_weekly_schedule_map[4] else '—' }}</td></tr>
<tr><td>Sat</td><td>{{ ', '.join(effective_weekly_schedule_map[5]) if effective_weekly_schedule_map[5] else '—' }}</td></tr>
<tr><td>Sun</td><td>{{ ', '.join(effective_weekly_schedule_map[6]) if effective_weekly_schedule_map[6] else '—' }}</td></tr>
</tbody>
</table>
</div>
</dd>
{% endif %}
{% if effective_monthly_schedule %}
<dt class="col-sm-3">Effective monthly</dt>
<dd class="col-sm-9">
Day {{ effective_monthly_schedule.day_of_month }} at {{ ', '.join(effective_monthly_schedule.times or []) }}
</dd>
{% endif %}
{% if schedule_map %}
<dt class="col-sm-3">Schedule (inferred)</dt>
<dd class="col-sm-9">
@ -101,86 +48,6 @@
</div>
{% if can_manage_jobs %}
<div class="card mb-3">
<div class="card-header d-flex justify-content-between align-items-center">
<span>Schedule Override</span>
<button
class="btn btn-sm btn-outline-secondary"
type="button"
data-bs-toggle="collapse"
data-bs-target="#scheduleOverridePanel"
aria-expanded="false"
aria-controls="scheduleOverridePanel"
>
Open
</button>
</div>
<div class="collapse" id="scheduleOverridePanel">
<div class="card-body">
<form method="post" action="{{ url_for('main.job_set_schedule', job_id=job.id) }}" class="row g-3">
<div class="col-md-4">
<label for="schedule_type" class="form-label">Type</label>
<select class="form-select" id="schedule_type" name="schedule_type">
<option value="daily" {% if (job.schedule_type or '')|lower == 'daily' %}selected{% endif %}>Daily</option>
<option value="weekly" {% if (job.schedule_type or '')|lower == 'weekly' %}selected{% endif %}>Weekly</option>
<option value="monthly" {% if (job.schedule_type or '')|lower == 'monthly' %}selected{% endif %}>Monthly</option>
</select>
</div>
<div class="col-md-8">
<label for="schedule_times" class="form-label">Times (HH:MM, comma separated)</label>
<input
type="text"
class="form-control"
id="schedule_times"
name="schedule_times"
value="{{ job.schedule_times or '' }}"
placeholder="01:00,13:15"
required
/>
</div>
<div class="col-md-12">
<label class="form-label">Weekdays (for weekly)</label>
<div class="d-flex flex-wrap gap-3">
{% set selected_weekdays = ((job.schedule_days_of_week or '').split(',')) %}
{% for idx, label in [(0, 'Mon'), (1, 'Tue'), (2, 'Wed'), (3, 'Thu'), (4, 'Fri'), (5, 'Sat'), (6, 'Sun')] %}
<div class="form-check form-check-inline">
<input
class="form-check-input"
type="checkbox"
name="schedule_weekdays"
id="schedule_weekday_{{ idx }}"
value="{{ idx }}"
{% if label in selected_weekdays %}checked{% endif %}
/>
<label class="form-check-label" for="schedule_weekday_{{ idx }}">{{ label }}</label>
</div>
{% endfor %}
</div>
</div>
<div class="col-md-4">
<label for="schedule_day_of_month" class="form-label">Day of month (for monthly)</label>
<input
type="number"
class="form-control"
id="schedule_day_of_month"
name="schedule_day_of_month"
min="1"
max="31"
value="{{ job.schedule_day_of_month or '' }}"
/>
</div>
<div class="col-12 d-flex flex-wrap gap-2">
<button type="submit" class="btn btn-primary">Save override</button>
<button type="submit" name="clear_schedule" value="1" class="btn btn-outline-secondary">Use inferred schedule</button>
</div>
<div class="col-12 small text-muted">
Manual schedule override is leading for Expected/Missed and missed-run generation.
</div>
</form>
</div>
</div>
</div>
<div class="d-flex flex-wrap gap-2 mb-3">
<form method="post" action="{{ url_for('main.archive_job', job_id=job.id) }}" class="mb-0" onsubmit="return confirm('Archive this job? No new runs are expected and it will be removed from Daily Jobs and Run Checks.');">
<button type="submit" class="btn btn-outline-secondary">Archive</button>
@ -316,14 +183,6 @@
@media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } }
#run_msg_body_container_iframe { height: 55vh; }
#run_msg_objects_container { max-height: 25vh; overflow: auto; }
#jobRunMessageModal.is-cove #jdm_mail_iframe_panel {
display: none !important;
}
#jobRunMessageModal.is-cove #run_msg_objects_container {
max-height: 55vh;
}
</style>
<!-- Inline popup modal for run message details -->
@ -427,7 +286,7 @@
</dl>
</div>
<div class="mb-3" id="jdm_mail_iframe_panel">
<div class="mb-3">
<div class="d-flex align-items-center gap-2 mb-1">
<h6 class="mb-0" id="jdm_mail_heading">Mail</h6>
<a href="#" class="small text-muted" id="jdm_mail_toggle" style="display:none;"
@ -499,54 +358,47 @@
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
  var value = (text || "").toString().trim();
  if (!value) {
    alert("No ticket number available to copy.");
    return;
  }
  if (window.isSecureContext && navigator.clipboard && navigator.clipboard.writeText) {
    navigator.clipboard.writeText(value)
      .then(function () {
        showCopyFeedback(button);
      })
      .catch(function () {
        fallbackCopy(value, button);
      });
    return;
  }
  fallbackCopy(value, button);
}

function fallbackCopy(text, button) {
  var textarea = document.createElement("textarea");
  textarea.value = text;
  textarea.setAttribute("readonly", "readonly");
  textarea.style.position = "fixed";
  textarea.style.opacity = "0";
  textarea.style.top = "0";
  textarea.style.left = "0";
  document.body.appendChild(textarea);
  textarea.focus();
  textarea.select();
  textarea.setSelectionRange(0, text.length);
  var successful = false;
  try {
    successful = document.execCommand("copy");
  } catch (err) {
    successful = false;
  }
  document.body.removeChild(textarea);
  if (successful) {
    showCopyFeedback(button);
    return;
  }
  window.prompt("Copy ticket number:", text);
}

function copyToClipboard(text, button) {
  // Method 1: Modern Clipboard API (works in most browsers with HTTPS)
  if (navigator.clipboard && navigator.clipboard.writeText) {
    navigator.clipboard.writeText(text)
      .then(function () {
        showCopyFeedback(button);
      })
      .catch(function () {
        // Fallback to method 2 if clipboard API fails
        fallbackCopy(text, button);
      });
  } else {
    // Method 2: Legacy execCommand method
    fallbackCopy(text, button);
  }
}

function fallbackCopy(text, button) {
  var textarea = document.createElement('textarea');
  textarea.value = text;
  textarea.style.position = 'fixed';
  textarea.style.opacity = '0';
  textarea.style.top = '0';
  textarea.style.left = '0';
  document.body.appendChild(textarea);
  textarea.focus();
  textarea.select();
  try {
    var successful = document.execCommand('copy');
    if (successful) {
      showCopyFeedback(button);
    } else {
      // If execCommand fails, use prompt as last resort
      window.prompt('Copy ticket number:', text);
    }
  } catch (err) {
    // If all else fails, show prompt
    window.prompt('Copy ticket number:', text);
  }
  document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
@ -642,13 +494,13 @@
Array.prototype.forEach.call(box.querySelectorAll('button[data-action]'), function (btn) {
  btn.addEventListener('click', function (ev) {
    ev.preventDefault();
    ev.stopPropagation();
    var action = btn.getAttribute('data-action');
    var id = btn.getAttribute('data-id');
    if (!action) return;
    if (action === 'copy-ticket') {
      var code = (btn.getAttribute('data-code') || '').trim();
      if (!code) return;
      copyToClipboard(code, btn);
      return;
    }

Array.prototype.forEach.call(box.querySelectorAll('button[data-action]'), function (btn) {
  btn.addEventListener('click', function (ev) {
    ev.preventDefault();
    var action = btn.getAttribute('data-action');
    var id = btn.getAttribute('data-id');
    if (!action) return;
    if (action === 'copy-ticket') {
      var code = btn.getAttribute('data-code') || '';
      copyToClipboard(code, btn);
      return;
    }
@ -936,11 +788,8 @@ function renderObjects(objects) {
var mailToggle = document.getElementById("jdm_mail_toggle"); var mailToggle = document.getElementById("jdm_mail_toggle");
var mailBody = document.getElementById("jdm_mail_iframe_body"); var mailBody = document.getElementById("jdm_mail_iframe_body");
var bodyFrame = document.getElementById("run_msg_body_container_iframe"); var bodyFrame = document.getElementById("run_msg_body_container_iframe");
var mailPanel = document.getElementById("jdm_mail_iframe_panel");
var modalEl = document.getElementById("jobRunMessageModal");
if (data.cove_summary) { if (data.cove_summary) {
if (modalEl) modalEl.classList.add("is-cove");
var cs = data.cove_summary; var cs = data.cove_summary;
document.getElementById("jdm_cove_account").textContent = cs.account_name || "—"; document.getElementById("jdm_cove_account").textContent = cs.account_name || "—";
document.getElementById("jdm_cove_computer").textContent = cs.computer_name || "—"; document.getElementById("jdm_cove_computer").textContent = cs.computer_name || "—";
@ -953,9 +802,7 @@ function renderObjects(objects) {
if (mailHeading) mailHeading.style.display = "none";
if (mailToggle) mailToggle.style.display = "none";
if (mailBody) mailBody.style.display = "none";
if (mailPanel) mailPanel.style.display = "none";
} else if (data.cloud_connect_summary) {
  if (modalEl) modalEl.classList.remove("is-cove");
  var s = data.cloud_connect_summary;
  document.getElementById("jdm_cc_user").textContent = s.user || "";
  document.getElementById("jdm_cc_section").textContent = s.section || "";
@ -969,7 +816,6 @@ function renderObjects(objects) {
if (mailHeading) { mailHeading.style.display = ""; mailHeading.textContent = "Source report email"; }
if (mailToggle) { mailToggle.style.display = ""; mailToggle.textContent = "show"; }
if (mailBody) mailBody.style.display = "none";
if (mailPanel) mailPanel.style.display = "";
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
} else {
  if (covePanel) covePanel.style.display = "none";
@ -977,7 +823,6 @@ function renderObjects(objects) {
if (mailHeading) { mailHeading.style.display = ""; mailHeading.textContent = "Mail"; }
if (mailToggle) mailToggle.style.display = "none";
if (mailBody) mailBody.style.display = "";
if (mailPanel) mailPanel.style.display = "";
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
}

View File

@ -174,16 +174,11 @@
@media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } }
#runChecksModal .modal-content {
  height: min(90vh, calc(100dvh - 1rem));
  max-height: calc(100dvh - 1rem);
  display: flex;
  flex-direction: column;
}
#runChecksModal .modal-dialog {
  margin: 0.5rem auto;
}

#runChecksModal .modal-content {
  height: 90vh;
  display: flex;
  flex-direction: column;
}
#runChecksModal .modal-body {
  overflow: hidden;
  flex: 1 1 auto;
@ -233,48 +228,8 @@
overflow: auto;
margin-top: 0.5rem;
}
#runChecksModal.is-cove .rcm-mail-panel {
display: none !important;
}
#runChecksModal.is-cove .rcm-objects-scroll {
max-height: 55vh;
}
@media (max-width: 991.98px) {
#runChecksModal .modal-dialog {
max-width: calc(100vw - 1rem);
margin: 0.5rem;
}
#runChecksModal .modal-content {
height: calc(100dvh - 1rem);
max-height: calc(100dvh - 1rem);
}
#runChecksModal .modal-body {
overflow: auto;
}
#runChecksModal #rcm_content,
#runChecksModal .rcm-main-row,
#runChecksModal .rcm-main-row > .col-md-3,
#runChecksModal .rcm-detail-col {
height: auto;
min-height: initial;
}
#runChecksModal #rcm_runs_list {
max-height: 28vh;
}
#runChecksModal .rcm-objects-scroll,
#runChecksModal.is-cove .rcm-objects-scroll {
max-height: none;
}
}
</style>
<div class="modal fade" id="runChecksModal" tabindex="-1" aria-labelledby="runChecksModalLabel" aria-hidden="true"> <div class="modal fade" id="runChecksModal" tabindex="-1" aria-labelledby="runChecksModalLabel" aria-hidden="true">
<div class="modal-dialog modal-xl modal-dialog-scrollable modal-xxl"> <div class="modal-dialog modal-xl modal-dialog-scrollable modal-xxl">
<div class="modal-content"> <div class="modal-content">
@ -600,54 +555,47 @@ function escapeHtml(s) {
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
  var value = (text || "").toString().trim();
  if (!value) {
    alert("No ticket number available to copy.");
    return;
  }
  if (window.isSecureContext && navigator.clipboard && navigator.clipboard.writeText) {
    navigator.clipboard.writeText(value)
      .then(function () {
        showCopyFeedback(button);
      })
      .catch(function () {
        fallbackCopy(value, button);
      });
    return;
  }
  fallbackCopy(value, button);
}

function fallbackCopy(text, button) {
  var textarea = document.createElement("textarea");
  textarea.value = text;
  textarea.setAttribute("readonly", "readonly");
  textarea.style.position = "fixed";
  textarea.style.opacity = "0";
  textarea.style.top = "0";
  textarea.style.left = "0";
  document.body.appendChild(textarea);
  textarea.focus();
  textarea.select();
  textarea.setSelectionRange(0, text.length);
  var successful = false;
  try {
    successful = document.execCommand("copy");
  } catch (err) {
    successful = false;
  }
  document.body.removeChild(textarea);
  if (successful) {
    showCopyFeedback(button);
    return;
  }
  window.prompt("Copy ticket number:", text);
}

function copyToClipboard(text, button) {
  // Method 1: Modern Clipboard API (works in most browsers with HTTPS)
  if (navigator.clipboard && navigator.clipboard.writeText) {
    navigator.clipboard.writeText(text)
      .then(function () {
        showCopyFeedback(button);
      })
      .catch(function () {
        // Fallback to method 2 if clipboard API fails
        fallbackCopy(text, button);
      });
  } else {
    // Method 2: Legacy execCommand method
    fallbackCopy(text, button);
  }
}

function fallbackCopy(text, button) {
  var textarea = document.createElement('textarea');
  textarea.value = text;
  textarea.style.position = 'fixed';
  textarea.style.opacity = '0';
  textarea.style.top = '0';
  textarea.style.left = '0';
  document.body.appendChild(textarea);
  textarea.focus();
  textarea.select();
  try {
    var successful = document.execCommand('copy');
    if (successful) {
      showCopyFeedback(button);
    } else {
      // If execCommand fails, use prompt as last resort
      window.prompt('Copy ticket number:', text);
    }
  } catch (err) {
    // If all else fails, show prompt
    window.prompt('Copy ticket number:', text);
  }
  document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
@ -892,16 +840,10 @@ table.addEventListener('change', function (e) {
opts = opts || {};
opts.headers = opts.headers || {};
opts.headers['Content-Type'] = 'application/json';
opts.headers['X-Requested-With'] = 'XMLHttpRequest';
if (!opts.credentials) opts.credentials = 'same-origin';
return fetch(url, opts).then(function (r) {
  return r.text().then(function (txt) {
    var j = null;
    try { j = txt ? JSON.parse(txt) : null; } catch (_) { j = null; }
    if (!r.ok || !j || j.status !== 'ok') {
      var msg = (j && j.message)
        ? j.message
        : ((txt && txt.trim()) ? txt.trim() : ('Request failed (' + r.status + ')'));
      throw new Error(msg);
    }
    return j;

opts = opts || {};
opts.headers = opts.headers || {};
opts.headers['Content-Type'] = 'application/json';
return fetch(url, opts).then(function (r) {
  return r.json().then(function (j) {
    if (!r.ok || !j || j.status !== 'ok') {
      var msg = (j && j.message) ? j.message : ('Request failed (' + r.status + ')');
      throw new Error(msg);
    }
    return j;
@ -1027,18 +969,12 @@ table.addEventListener('change', function (e) {
html += '<div class="mb-2"><strong>Remarks</strong><div class="mt-1">'; html += '<div class="mb-2"><strong>Remarks</strong><div class="mt-1">';
remarks.forEach(function (r) { remarks.forEach(function (r) {
var status = r.resolved_at ? 'Resolved' : 'Active'; var status = r.resolved_at ? 'Resolved' : 'Active';
var source = (r && r.source) ? String(r.source) : 'manual';
var sourceBadge = '';
if (source === 'autotask_resolution') {
sourceBadge = '<span class="ms-2 badge bg-info text-dark">Autotask</span>';
}
html += '<div class="mb-2 border rounded p-2" data-alert-type="remark" data-id="' + r.id + '">' + html += '<div class="mb-2 border rounded p-2" data-alert-type="remark" data-id="' + r.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' + '<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' + '<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' + '<div class="text-truncate">' +
'<span class="me-1" title="Remark">💬</span>' + '<span class="me-1" title="Remark">💬</span>' +
'<span class="fw-semibold">Remark</span>' + '<span class="fw-semibold">Remark</span>' +
sourceBadge +
'<span class="ms-2 badge ' + (r.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' + '<span class="ms-2 badge ' + (r.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' + '</div>' +
(r.body ? ('<div class="small text-muted mt-1">' + escapeHtml(r.body) + '</div>') : '') + (r.body ? ('<div class="small text-muted mt-1">' + escapeHtml(r.body) + '</div>') : '') +
@ -1057,13 +993,13 @@ table.addEventListener('change', function (e) {
Array.prototype.forEach.call(box.querySelectorAll('button[data-action]'), function (btn) {
  btn.addEventListener('click', function (ev) {
    ev.preventDefault();
    ev.stopPropagation();
    var action = btn.getAttribute('data-action');
    var id = btn.getAttribute('data-id');
    if (!action) return;
    if (action === 'copy-ticket') {
      var code = (btn.getAttribute('data-code') || '').trim();
      if (!code) return;
      copyToClipboard(code, btn);
      return;
    }

Array.prototype.forEach.call(box.querySelectorAll('button[data-action]'), function (btn) {
  btn.addEventListener('click', function (ev) {
    ev.preventDefault();
    var action = btn.getAttribute('data-action');
    var id = btn.getAttribute('data-id');
    if (!action) return;
    if (action === 'copy-ticket') {
      var code = btn.getAttribute('data-code') || '';
      copyToClipboard(code, btn);
      return;
    }
@ -1533,11 +1469,8 @@ table.addEventListener('change', function (e) {
var mailToggle = document.getElementById('rcm_mail_toggle');
var mailBody = document.getElementById('rcm_mail_iframe_body');
var bodyFrame = document.getElementById('rcm_body_iframe');
var mailPanel = document.getElementById('rcm_mail_iframe_panel');
var modalEl = document.getElementById('runChecksModal');
if (run.cove_summary) {
  if (modalEl) modalEl.classList.add('is-cove');
  var cs = run.cove_summary;
  document.getElementById('rcm_cove_account').textContent = cs.account_name || '—';
  document.getElementById('rcm_cove_computer').textContent = cs.computer_name || '—';
@ -1550,9 +1483,7 @@ table.addEventListener('change', function (e) {
if (mailHeading) mailHeading.style.display = 'none';
if (mailToggle) mailToggle.style.display = 'none';
if (mailBody) mailBody.style.display = 'none';
if (mailPanel) mailPanel.style.display = 'none';
} else if (run.cloud_connect_summary) {
  if (modalEl) modalEl.classList.remove('is-cove');
  var s = run.cloud_connect_summary;
  document.getElementById('rcc_user').textContent = s.user || '';
  document.getElementById('rcc_section').textContent = s.section || '';
@ -1566,16 +1497,13 @@ table.addEventListener('change', function (e) {
if (mailHeading) { mailHeading.style.display = ''; mailHeading.textContent = 'Source report email'; }
if (mailToggle) { mailToggle.style.display = ''; mailToggle.textContent = 'show'; }
if (mailBody) mailBody.style.display = 'none';
if (mailPanel) mailPanel.style.display = '';
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(run.body_html || '');
} else {
  if (modalEl) modalEl.classList.remove('is-cove');
  if (covePanel) covePanel.style.display = 'none';
  if (ccPanel) ccPanel.style.display = 'none';
  if (mailHeading) { mailHeading.style.display = ''; mailHeading.textContent = 'Mail'; }
  if (mailToggle) mailToggle.style.display = 'none';
  if (mailBody) mailBody.style.display = '';
  if (mailPanel) mailPanel.style.display = '';
  if (bodyFrame) {
    bodyFrame.srcdoc = wrapMailHtml(run.body_html || (run.missed ? '<div class="text-muted">No email for missed run.</div>' : ''));
  }

View File

@ -2,99 +2,6 @@
This file documents all changes made to this project via Claude Code.
## [2026-04-13]
### Fixed
- Run Checks now suppresses repeated Cove runs within the same local day once a **complete success run** has occurred:
- A complete success run is defined as `JobRun.status = Success` with at least one persisted run object and all object statuses equal to `Success`.
- For each Cove job/day, the first complete success run is treated as the cutoff; newer runs on that same day are hidden from Run Checks (both overview aggregation and modal details), regardless of whether they are `Success`, `Warning`, or `Failed/Error`.
- Sorting remains unchanged (`newest -> oldest`).
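The cutoff rule above is compact enough to pin down in code. A minimal sketch, assuming runs carry `job_id`, `started_at`, `status`, and a list of `objects` with `status` strings; these attribute names are illustrative rather than the actual model:

```python
def suppress_after_complete_success(runs):
    """Hide runs that follow the first complete success run of a job's local day."""
    cutoff = {}  # (job_id, local_date) -> time of the first complete success run
    for run in sorted(runs, key=lambda r: r.started_at):
        key = (run.job_id, run.started_at.date())
        complete = (
            run.status == "Success"
            and len(run.objects) > 0
            and all(o.status == "Success" for o in run.objects)
        )
        if complete and key not in cutoff:
            cutoff[key] = run.started_at
    kept = []
    for r in runs:
        key = (r.job_id, r.started_at.date())
        if key not in cutoff or r.started_at <= cutoff[key]:
            kept.append(r)
    return kept
```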
### Validation
- Test build executed with `./build-and-push.sh t` on 2026-04-13 and pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` (digest `sha256:520778f4b72643c1cd1815fa424317ee2dce182ccfcbea687f4ac711b3d00fb0`).
## [2026-04-02]
### Added
- Job Details now supports manual schedule overrides (Daily/Weekly/Monthly) via `POST /jobs/<job_id>/schedule`:
- Operators/admins can save a manual schedule or clear it to fall back to inferred schedule.
- Effective schedule source is shown in Job Details (`manual`, `inferred weekly`, `inferred monthly`).
- Job Details now shows `First backup detected` using the earliest non-missed run timestamp for the job, to make the historical reporting horizon visible.
### Fixed
- Increased missed-run grace/tolerance window from `±1 hour` to `±3 hours` to better handle DST and larger execution-time drift:
- Updated `MISSED_GRACE_WINDOW` in `containers/backupchecks/src/backend/app/main/routes_run_checks.py` to `timedelta(hours=3)` for missed-run generation and duplicate/fulfillment checks.
- Updated `MISSED_GRACE_WINDOW` in `containers/backupchecks/src/backend/app/main/routes_daily_jobs.py` to `timedelta(hours=3)` so Daily Jobs Expected/Missed transitions stay aligned with Run Checks logic.
- Effective schedule resolution now prioritizes manual job schedule over inferred schedule for operational views and missed-run generation:
- Daily Jobs, Search and Dashboard expected/missed calculations now use the effective (manual-first) schedule.
- Run Checks missed-run sweep now generates/removes missed slots from the effective schedule instead of inference-only logic.
- Job Details schedule UI polish:
- `No schedule` badge now uses a dark background with white text for readable contrast in dark-themed pages.
- `Schedule Override` panel is now collapsed by default and can be expanded on demand.
- Schedule inference now also includes Cove API runs (`source_type='cove_api'`) instead of only mail-linked runs, so Cove jobs can get inferred weekly/monthly schedules.
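The manual-first resolution order is the core of this change. A minimal sketch, assuming the `schedule_type`/`schedule_times` columns named above and an `inferred` dict produced by the learning step; shapes are illustrative:

```python
def resolve_effective_schedule(job, inferred=None):
    """Manual override is leading; inferred weekly/monthly is the fallback."""
    if job.schedule_type and job.schedule_times:
        return {
            "source": "manual",
            "type": job.schedule_type.lower(),
            "times": [t.strip() for t in job.schedule_times.split(",") if t.strip()],
        }
    if inferred:  # e.g. {"type": "weekly", "times": [...], "weekdays": [...]}
        return {"source": "inferred_" + inferred["type"], **inferred}
    return {"source": "none"}
```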
## [2026-03-30]
### Added
- Autotask resolution sync now persists PSA resolution text as internal active remarks for follow-up visibility:
- Added `remarks.source` and `remarks.ticket_id` in models and migrations.
- New source value `autotask_resolution` is used for remarks created from Autotask ticket resolution content.
- Remarks are linked to job scope and all linked runs (`remark_scopes` + `remark_job_runs`) and remain active (not auto-resolved).
- Deduplication prevents creating duplicate remarks for the same job/ticket/resolution text.
### Fixed
- Autotask-to-remark mirroring now skips Backupchecks-generated resolve updates:
- Marker-based skip for `[Backupchecks] Marked as resolved in Backupchecks`.
- Additional guard to skip when the internal ticket resolution origin is `backupchecks`.
- Run Checks remark cards now show an `Autotask` badge for remarks with `source=autotask_resolution`.
## [2026-03-27]
### Fixed
- Run Checks modal (Cove runs) now remains fully usable on smaller resolutions: the modal height is capped to the visible viewport (`100dvh`), mobile/tablet layout can scroll inside the modal body, and rigid full-height column constraints are relaxed under 992px so content and footer stay reachable.
- Tickets pages/API now compute effective Active/Resolved status from both `tickets.resolved_at` and `ticket_scopes` open/closed state; tickets with all scopes resolved no longer remain incorrectly shown as Active.
- Run Checks and Job Detail ticket copy action (`⧉`) was hardened: click handling now stops propagation and the clipboard fallback path is more robust (secure Clipboard API first, then `execCommand`, then prompt).
## [2026-03-26]
### Fixed
- Documentation Batch 3 (Settings): replaced Settings "Coming Soon" pages with current operational guidance (General, Mail Configuration, Autotask Integration, User Management, Maintenance) and updated Reporting Settings to explicitly state that no dedicated reporting settings card exists yet.
- Documentation Batch 2 (Autotask): replaced the four Autotask "Coming Soon" pages with current operational documentation (setup/configuration, company mapping, creating tickets, ticket management), including Run Checks `Link existing` cross-company behavior.
- Documentation Batch 1: added new documentation section `Integrations` with pages for Cove Data Protection and Veeam Cloud Connect, and linked them into the documentation navigation.
- Documentation hotfixes (Batch 2): fixed broken documentation links pointing to non-existent `autotask/overview` page, corrected Run Checks/Daily Jobs review wording to match job-level review behavior, and updated TODO-documentation batch checklist progress.
- Run Checks modal mail visibility hotfix: normal mail runs now explicitly clear the `is-cove` modal class before rendering mail content, preventing intermittent hidden mail after navigating from a Cove run; Cove behavior itself remains unchanged.
- Run Checks Autotask "Link existing ticket" now allows linking cross-company tickets (for shared/umbrella tickets that apply to multiple customers):
- Removed the hard company mismatch block in `POST /api/run-checks/autotask-link-existing-ticket` (`containers/backupchecks/src/backend/app/main/routes_run_checks.py`).
- Tickets are still validated for existence, ticket number presence, and non-terminal status before linking.
## [2026-03-23]
### Fixed
- Autotask "Link existing ticket" now also posts a ticket note on the selected Autotask ticket when an additional Backupchecks alert/run is linked:
- Added `_compose_autotask_link_existing_note()` in `containers/backupchecks/src/backend/app/main/routes_run_checks.py`.
- Extended `POST /api/run-checks/autotask-link-existing-ticket` to create a note via `client.create_ticket_note(...)` after successful link propagation.
- Link operation remains successful even if note posting fails; response now returns `note_posted` and `note_warning` for visibility.
- Customers page job-filter link styling now matches sidebar colors and hover behavior:
- Replaced `link-primary` link class with `bc-sidebar-link-inline` in `containers/backupchecks/src/templates/main/customers.html`.
- Added `bc-sidebar-link-inline` style in `containers/backupchecks/src/static/css/layout.css` using sidebar text and hover tokens.
- Run Checks Autotask ticket description now includes Cove run objects from `run_object_links` / `customer_objects` (same source as Run Checks UI), instead of only `job_objects`/`mail_objects`:
- Fixes missing object lines for Cove runs where ticket text previously showed only a generic "No detailed object messages available".
- Autotask ticket object listing is now problem-focused:
- Includes only objects with problem signals (`failed`, `error`, `warning`, `missed`).
- Excludes success/completed objects (`success`, `succeeded`, `completed`, `ok`, including `Completed (...)` variants).
- Synology Active Backup for Business parser now correctly handles subjects/bodies that use the wording `has been completed` (e.g. `backup task dc001 on DS220p has been completed`):
- Updated ABB completion regex in `containers/backupchecks/src/backend/app/parsers/synology.py` to accept `has been` completion phrasing.
- Prevents fallback to the generic Synology Active Backup parser that could incorrectly take the bracketed subject prefix (e.g. `[Stout Verlichting - DS220p]`) as `job_name`.
- Correct result for this mail shape is now preserved as `backup_software = Synology`, `backup_type = Active Backup for Business`, `job_name = dc001`.
### Validation
- Test build executed with `./build-and-push.sh t` on 2026-03-23 and pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` (digest `sha256:19014477f2ae14eac0a62f07e11c923c83f9cd5e478873290bdcca37e6ab257c`).
- Latest validation build executed with `./build-and-push.sh t` on 2026-03-23 and pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` (digest `sha256:b9bb6d50f131118ebccaed5834513ca83ec7592bd622ecea42c1ce2dd7bf0cfc`).
- Validation build executed with `./build-and-push.sh t` on 2026-03-23 and pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` (digest `sha256:2ff1675996b27bf409687bf5c52e2a3cb3314728ce1c67bc3ffc14fbd0562427`).
- Validation build executed with `./build-and-push.sh t` on 2026-03-23 and pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` (digest `sha256:f87d871caa3501251c31a66f61eb94b249faf0d3ab85da3f0c02c6036855849b`).
## [2026-03-20] (9)
### Added

View File

@ -1,55 +1,3 @@
## v0.2.5
This release bundles all changes made since `v0.2.4`, including schedule management improvements, Autotask/remark synchronization, Run Checks stability updates, and a full documentation refresh.
### Added
- **Manual schedule overrides per job** — Job Details now supports saving and clearing manual schedules (`Daily`, `Weekly`, `Monthly`) via `POST /jobs/<job_id>/schedule`.
- **First backup detected in Job Details** — shows the earliest non-missed run timestamp for operational context.
- **Autotask resolution remark metadata** — remarks now support `source` and optional `ticket_id`, with migration and indexes.
- **Autotask resolution mirroring** — PSA ticket resolution text can be mirrored into internal active remarks (`source=autotask_resolution`) with deduplication.
- **Documentation Integrations section** — added dedicated pages for Cove Data Protection and Veeam Cloud Connect.
### Changed
- **Effective schedule resolution (manual-first)** — Daily Jobs, Dashboard, Search, Job Details and Run Checks missed-run logic now use effective schedule resolution (`manual` override first, inferred fallback).
- **Missed-run grace window widened** — tolerance changed from `±1 hour` to `±3 hours` in Daily Jobs and Run Checks.
- **Schedule inference coverage** — inference now also considers Cove API runs (`source_type='cove_api'`) next to mail-based runs.
- **Run Checks Cove deduplication in-day** — once a complete Cove success run is detected for a job/day, newer runs that day are suppressed in Run Checks overview/modal.
- **Tickets API active-state semantics** — effective active state now considers both ticket-level and scope-level resolution.
- **Documentation refresh** — Settings and Autotask documentation pages were replaced with current operational guidance; outdated TODO audit/Cove documents were archived.
### Fixed
- **Run Checks modal mail visibility** — navigating from Cove runs no longer leaves regular mail runs hidden.
- **Run Checks responsive behavior on smaller screens** — modal layout/scroll behavior improved so content and footer remain reachable.
- **Autotask link-existing cross-company support** — shared/umbrella tickets can be linked across companies while terminal/incomplete validations remain enforced.
- **Ticket copy action robustness** — click/copy handling improved in Run Checks and Job Details.
- **Autotask propagation to new runs** — fixed a propagation path where an open Autotask ticket could disappear on a next-day run if internal open-ticket rows were temporarily absent; unresolved ticket links are now propagated consistently.
## v0.2.4
### Fixed
- **Autotask: link existing ticket cross-company hotfix** — removed the hard company mismatch block in the Run Checks endpoint that links existing Autotask tickets, so umbrella/shared tickets can be linked even when the ticket company differs from the mapped customer company.
- Existing safeguards remain active: ticket must exist, must have a ticket number, and may not be in a terminal/completed status.
## v0.2.3
### Added
- **Autotask: link existing ticket note update** — when linking a Run Checks alert to an existing Autotask ticket, Backupchecks now posts an additional note on that ticket indicating that another alert/run was linked, including customer/job/run context and a Backupchecks deep-link.
### Changed
- **Autotask link-existing API response** — response now includes `note_posted` and `note_warning`, so operators can directly see whether posting the additional ticket note succeeded.
- **Customers page job-filter links** — customer name links that open Jobs with a customer filter now use sidebar-matching text and hover styling instead of the default blue link style.
### Fixed
- **Autotask object retrieval for ticket composition (improved)** — object details are now fetched from `run_object_links`/`customer_objects` first (same source as Run Checks UI), with fallback to legacy `job_objects`/`mail_objects`, preventing missing-object details for Cove runs.
- **Cove Accounts / Cove runs object visibility** — object details shown in Backupchecks are now consistently sourced from persisted run-object links, improving completeness for Cove Data Protection runs and aligning ticket content with what operators see in Run Checks.
- **Autotask affected objects list** — ticket descriptions now include only problem objects (`failed`/`error`/`warning`/`missed`) and no longer include completed/success objects.
## v0.2.2
### Fixed
- **Synology Active Backup for Business parsing** — notifications using wording `has been completed` are now correctly recognized by the ABB parser. This prevents fallback to the generic Synology Active Backup parser that could incorrectly use the bracketed subject prefix (for example `Stout Verlichting - DS220p`) as job name.
- For ABB notifications like `backup task dc001 on DS220p has been completed`, parsed values are now correctly preserved as: backup software `Synology`, backup type `Active Backup for Business`, job name `dc001`.
## v0.2.1
### Added

View File

@ -60,11 +60,6 @@ Implemented in `backend/app/migrations.py`:
- Adds `report_definitions.report_config` (TEXT) if it does not exist.
- Stores the JSON report definition for the reporting UI (selected columns, chart types, filters) so the same definition can later be reused for PDF export.
- `migrate_remarks_source_and_ticket_id()`
- Adds `remarks.source` (VARCHAR(64), backfilled to `manual`) if it does not exist.
- Adds `remarks.ticket_id` (INTEGER, FK to `tickets.id`) if it does not exist.
- Adds indexes for source and ticket-based filtering (`idx_remarks_source`, `idx_remarks_ticket_id`).
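The remarks migration above follows the same guarded-ALTER pattern as the other helpers. A minimal sketch of that pattern, assuming PostgreSQL and a Flask-SQLAlchemy `db` handle; the helper name is taken from the bullets above, but the body is illustrative rather than the actual implementation:
```python
from sqlalchemy import text

def migrate_remarks_source_and_ticket_id(db):
    """Guarded-ALTER sketch: every statement is a no-op when the
    column or index already exists, so the migration is re-runnable."""
    statements = [
        # DEFAULT 'manual' also backfills existing rows on PostgreSQL 11+.
        "ALTER TABLE remarks ADD COLUMN IF NOT EXISTS source VARCHAR(64) DEFAULT 'manual'",
        "ALTER TABLE remarks ADD COLUMN IF NOT EXISTS ticket_id INTEGER REFERENCES tickets(id)",
        "CREATE INDEX IF NOT EXISTS idx_remarks_source ON remarks (source)",
        "CREATE INDEX IF NOT EXISTS idx_remarks_ticket_id ON remarks (ticket_id)",
    ]
    for stmt in statements:
        db.session.execute(text(stmt))
    db.session.commit()
```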
## Future changes
- Every time you introduce a non-trivial schema change, update:

View File

@@ -1,6 +1,6 @@
# Technical Notes (Internal)
Last updated: 2026-03-23 Last updated: 2026-02-23 (late)
## Purpose
Internal technical snapshot of the `backupchecks` repository for faster onboarding, troubleshooting, and change impact analysis.
@@ -33,8 +33,6 @@ Internal technical snapshot of the `backupchecks` repository for faster onboardi
- Background tasks:
- `start_auto_importer(app)` starts the automatic mail importer thread.
- `start_cove_importer(app)` starts the Cove Data Protection polling thread (started only when `cove_import_enabled` is set).
- Global template context:
- `inject_inbox_count()` context processor injects `inbox_count` into every template for authenticated users (sidebar badge).
- Health endpoint:
- `GET /health` returns `{ "status": "ok" }`.
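Both of these are small pieces of Flask wiring. A minimal sketch, assuming Flask-Login; `count_inbox_messages()` and `register_app_globals()` are placeholder names, not the app's real helpers:
```python
from flask import Flask, jsonify
from flask_login import current_user

def count_inbox_messages() -> int:
    # Placeholder: the real app counts pending inbox MailMessage rows.
    return 0

def register_app_globals(app: Flask) -> None:
    @app.context_processor
    def inject_inbox_count():
        # Injected into every template so the sidebar badge is always available.
        if not current_user.is_authenticated:
            return {}
        return {"inbox_count": count_inbox_messages()}

    @app.get("/health")
    def health():
        return jsonify(status="ok")
```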
@@ -82,7 +80,6 @@ File: `containers/backupchecks/src/backend/app/models.py`
- `Customer`, `Job`, `JobRun`, `Override`
- `MailMessage`, `MailObject`
- `CoveAccount` (Cove staging table — see Cove integration section)
- `CloudConnectAccount` (Cloud Connect staging table — see Cloud Connect integration section)
- `Ticket`, `TicketScope`, `TicketJobRun`
- `Remark`, `RemarkScope`, `RemarkJobRun`
- `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`
@@ -114,7 +111,6 @@ Note: always use direct SQL (`DELETE FROM`) for bulk deletions — ORM-level del
- `backup_software` - e.g., "Veeam", "Synology", "Cove Data Protection"
- `backup_type` - e.g., "Backup Job", "Active Backup"
- `cove_account_id` - (nullable int) links this job to a Cove AccountId
- Cloud Connect accounts link back via `CloudConnectAccount.job_id` (no FK column on `jobs` — the link is on the staging table side)
**JobRun model:**
- `source_type` - NULL = email (backwards compat), `"cove_api"` for Cove-imported runs
@@ -156,31 +152,6 @@ Note: always use direct SQL (`DELETE FROM`) for bulk deletions — ORM-level del
- Hostname extraction from multiple patterns
- Returns: backup_type "Updates", job_name "Synology Automatic Update"
## Schedule Inference and Missed Run Detection
### Overview
File: `containers/backupchecks/src/backend/app/main/routes_shared.py`
Missed runs are detected via `_ensure_missed_runs_for_job()`, which is called from Run Checks on page load (throttled to at most once per 10 minutes per job via an in-memory dict). It infers the expected schedule from recent run history and creates `JobRun` records with `missed=True` for any slots that are overdue.
### Weekly Schedule Inference (`_infer_schedule_map_from_runs`)
- **Window**: last **90 days** only (older runs are excluded to handle schedule changes)
- **MIN_OCCURRENCES**: **5** hits on a weekday+time slot to count as expected (raised from 3 to reduce false positives during transitional periods)
- **Cadence guard**: if the median gap between runs is ≥ 20 days, weekly inference is **skipped** entirely → monthly inference handles the job instead. This prevents monthly jobs from accumulating enough weekly hits over long periods of operation.
- **Key rule**: after a time-of-day or frequency change, old slots stop generating missed runs within 90 days (no more stale-slot false positives)
### Monthly Schedule Inference (`_infer_monthly_schedule_from_runs`)
- **Window**: last **180 days** (enough for ≥ 3 monthly occurrences, while old slots are forgotten within 6 months of a schedule change)
- Infers day-of-month + time-of-day from historical runs
- Used when weekly cadence guard fires (median gap ≥ 20 days)
### Important Rules
- **Never** extend the window without considering stale slot false positives
- Schedule changes (time, frequency) take effect in missed run detection within the window period (90d weekly, 180d monthly)
- Informational parsers (`3CX / Update`, `3CX / SSL Certificate`) are excluded from all schedule inference
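A condensed sketch of the weekly rules above (90-day window, 5-occurrence threshold, cadence guard), assuming naive per-minute slot bucketing; the real `_infer_schedule_map_from_runs` in `routes_shared.py` is more involved:
```python
from collections import Counter
from datetime import datetime, timedelta
from statistics import median

WINDOW_DAYS = 90          # weekly window: older runs are ignored
MIN_OCCURRENCES = 5       # hits needed before a weekday+time slot counts
CADENCE_GUARD_DAYS = 20   # median gap >= 20 days means "not a weekly job"

def infer_weekly_slots(run_times: list[datetime], now: datetime) -> set[tuple[int, str]]:
    """Return expected (weekday, 'HH:MM') slots, or an empty set when the
    cadence guard fires and monthly inference should take over instead."""
    cutoff = now - timedelta(days=WINDOW_DAYS)
    recent = sorted(t for t in run_times if t >= cutoff)
    if len(recent) < 2:
        return set()
    gaps = [(b - a).days for a, b in zip(recent, recent[1:])]
    if median(gaps) >= CADENCE_GUARD_DAYS:
        return set()  # cadence guard: skip weekly inference entirely
    hits = Counter((t.weekday(), t.strftime("%H:%M")) for t in recent)
    return {slot for slot, n in hits.items() if n >= MIN_OCCURRENCES}
```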
---
## Cove Data Protection Integration
### Overview
@@ -331,11 +302,6 @@ Cove run rows in the job detail history table are clickable even without a mail
- `routes_run_checks.py` returns `cove_summary` in the run payload for `source_type="cove_api"` runs
- Includes: account_name, computer_name, customer_name, readable datasource labels, last_run_at, status
- `run_checks.html` shows the Cove summary panel and hides the mail section
- Duplicate-day suppression for Cove runs:
- Runs are grouped per job per local day (Europe/Amsterdam date derived from run timestamp).
- A run is considered a "complete success" when `JobRun.status == Success` and persisted run objects exist with all object statuses equal to `Success`.
- Once the first complete success exists on that day, all newer Cove runs for the same day are hidden in Run Checks (overview aggregation + details modal), regardless of status (`Success`, `Warning`, `Failed/Error`).
- Sort order in the modal remains unchanged (`newest -> oldest`).
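The suppression rule restated as a minimal sketch over simplified run dicts; `local_day` stands in for the app's Amsterdam-date helper, and `object_statuses` stands in for the persisted run-object links:
```python
from datetime import datetime
from zoneinfo import ZoneInfo

AMS = ZoneInfo("Europe/Amsterdam")

def local_day(dt: datetime):
    """Stand-in for the Amsterdam-date helper used elsewhere in the app."""
    return dt.astimezone(AMS).date()

def visible_cove_runs(runs: list[dict]) -> list[dict]:
    """Apply the daily suppression rule: once a complete success exists on a
    local day, every newer Cove run on that day is hidden. Each run dict has
    'started_at' (aware datetime), 'status', and 'object_statuses' (list)."""
    visible, success_days = [], set()
    for run in sorted(runs, key=lambda r: r["started_at"]):  # oldest first
        day = local_day(run["started_at"])
        if day in success_days:
            continue  # hidden: a complete success already exists for this day
        visible.append(run)
        if run["status"] == "Success" and run["object_statuses"] and \
                all(s == "Success" for s in run["object_statuses"]):
            success_days.add(day)
    return sorted(visible, key=lambda r: r["started_at"], reverse=True)  # newest -> oldest
```
Feeding both the overview aggregation and the details-modal query through one helper like this is what keeps the two views consistent.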
### Migrations
- `migrate_cove_integration()` — adds 8 columns to `system_settings`, `cove_account_id` to `jobs`, `source_type` + `external_id` to `job_runs`, dedup index on `job_runs.external_id`
@@ -536,14 +502,7 @@ Visible on Logging page under `event_type = "ticket_link_debug"`. Remove after d
## UI and UX Notes
### Layout v2 (2026-03-20) ### Navbar
- Complete sidebar-first redesign replacing the top navbar layout:
- `layout.css` rewritten with IBM Plex Sans/Mono fonts and CSS custom properties (design tokens)
- Fixed dark sidebar (220 px wide)
- `base.html` updated with Google Fonts preload and sidebar-aware structure
- Sandbox banner: semi-transparent (`rgba(220,53,69,0.45)`) instead of solid red
### Navbar (pre-v0.2.0 reference — replaced by sidebar in v0.2.0)
- Fixed-top positioning
- Collapses on mobile (hamburger menu)
- Dynamic padding adjustment via JavaScript (measures navbar height, adjusts main content padding-top)
@@ -613,58 +572,6 @@ Visible on Logging page under `event_type = "ticket_link_debug"`. Remove after d
- Per-section limit (`SEARCH_LIMIT_PER_SECTION = 10`), with total count per section.
- No schema migration required for V1.
## Jobs Export / Import
### Schema Versions
- **v1** (`approved_jobs_export_v1`): job fields only, no account links
- **v2** (`approved_jobs_export_v2`): same as v1 plus per-job `cove_account` and `cloud_connect_account` objects
### Export (v2)
Each job entry contains:
```json
{
"cove_account": {"account_id": 1234, "account_name": "...", "computer_name": "..."},
"cloud_connect_account": {"user": "...", "section": "Backup", "repo_name": "..."}
}
```
Each account object is `null` when the job has no such link. File: `routes_settings.py` — `export_jobs()`.
### Import (v1 + v2)
- Accepts both schema versions (detected via the `export_type` field)
- For v2 files: after creating/updating the job, the importer:
1. Looks up `CoveAccount` by `account_id` (fallback: `account_name` + `computer_name`)
2. Looks up `CloudConnectAccount` by `user` + `section` + `repo_name`
3. Links the account to the job — only if the account is not yet linked to a different job
- File: `routes_settings.py``import_jobs()`
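A sketch of the v2 linking step, assuming the app's SQLAlchemy models are importable as shown; the lookups and guard conditions follow the three rules above, everything else is condensed:
```python
# Import path assumed; see the models.py reference earlier in these notes.
from app.models import CoveAccount, CloudConnectAccount, db

def link_accounts_for_imported_job(job, entry: dict) -> None:
    """Sketch of the v2 linking rules: resolve the staging account,
    then attach it only when it is not linked to a different job."""
    cove = entry.get("cove_account")
    if cove:
        acct = CoveAccount.query.filter_by(account_id=cove["account_id"]).first()
        if acct is None:
            # Fallback lookup, as described above.
            acct = CoveAccount.query.filter_by(
                account_name=cove["account_name"],
                computer_name=cove["computer_name"],
            ).first()
        if acct is not None and acct.job_id in (None, job.id):
            acct.job_id = job.id
            job.cove_account_id = acct.account_id

    cc = entry.get("cloud_connect_account")
    if cc:
        acct = CloudConnectAccount.query.filter_by(
            user=cc["user"], section=cc["section"], repo_name=cc["repo_name"]
        ).first()
        if acct is not None and acct.job_id in (None, job.id):
            acct.job_id = job.id

    db.session.commit()
```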
## Inbox Batch Re-parse
### Endpoint
`POST /inbox/reparse-batch` — JSON in/out, login required, admin/operator only.
**Request body:**
```json
{"last_id": <int|null>, "total": <int|null>}
```
- `last_id`: keyset cursor from previous batch (process messages with `id < last_id`)
- `total`: total count from first call (avoids re-counting on every batch)
**Response:**
```json
{"processed": 50, "total": 847, "parsed_ok": 42, "auto_approved": 12, "no_match": 8, "errors": 0, "last_id": 1234, "done": false}
```
### Batch Parameters
- `batch_size`: 50 messages per call
- `time_budget_s`: 8 seconds per call (stops processing mid-batch if exceeded)
- Auto-approve logic is identical to `inbox_reparse_all` (including VSPC multi-company handling)
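Put together, the endpoint is a keyset loop under a time budget. A minimal sketch, assuming a `main_bp` blueprint and a `reparse_one()` placeholder for the shared parse/auto-approve logic:
```python
import time
from flask import request, jsonify

BATCH_SIZE = 50
TIME_BUDGET_S = 8.0

@main_bp.post("/inbox/reparse-batch")  # blueprint name assumed
def inbox_reparse_batch():
    body = request.get_json(force=True)
    last_id, total = body.get("last_id"), body.get("total")

    q = MailMessage.query.order_by(MailMessage.id.desc())
    if last_id is not None:
        q = q.filter(MailMessage.id < last_id)  # keyset cursor
    if total is None:
        total = q.count()                       # counted once, on the first call

    started = time.monotonic()
    stats = {"processed": 0, "parsed_ok": 0, "auto_approved": 0,
             "no_match": 0, "errors": 0}
    budget_hit = False
    for msg in q.limit(BATCH_SIZE):
        if time.monotonic() - started > TIME_BUDGET_S:
            budget_hit = True
            break                               # stop mid-batch on budget
        reparse_one(msg, stats)                 # placeholder for shared logic
        stats["processed"] += 1
        last_id = msg.id

    done = not budget_hit and stats["processed"] < BATCH_SIZE
    return jsonify(total=total, last_id=last_id, done=done, **stats)
```
Keyset pagination keeps each batch proportional to the batch size no matter how far the loop has progressed, which OFFSET-based paging would not.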
### Frontend (inbox.html)
- "Re-parse all" button opens a Bootstrap modal (`data-bs-backdrop="static"` — cannot close while running)
- JS loop: `fetch → process response → setTimeout(100ms) → repeat` until `done: true`
- Progress bar + live counters update after each batch
- Close button appears only when `done: true` or on error
## Feedback Module with Screenshots
- Models: `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`.
- Attachments:
@@ -726,68 +633,11 @@ File: `build-and-push.sh`
- Run Checks routes: `containers/backupchecks/src/backend/app/main/routes_run_checks.py`
- Cove importer: `containers/backupchecks/src/backend/app/cove_importer.py`
- Cove routes: `containers/backupchecks/src/backend/app/main/routes_cove.py`
- Cloud Connect importer: `containers/backupchecks/src/backend/app/cloud_connect_importer.py`
- Cloud Connect routes: `containers/backupchecks/src/backend/app/main/routes_cloud_connect.py`
- Inbox routes: `containers/backupchecks/src/backend/app/main/routes_inbox.py`
- Settings routes: `containers/backupchecks/src/backend/app/main/routes_settings.py`
- Compose stack: `deploy/backupchecks-stack.yml`
- Build script: `build-and-push.sh`
## Recent Changes
### 2026-04-13
- **Run Checks Cove daily suppression** (`main/routes_run_checks.py`):
- Added Cove-specific filtering to suppress repeated same-day runs after the first complete success run.
- Complete success criteria: run status `Success`, object set present, all object statuses `Success`.
- Applied consistently to both Run Checks overview aggregation and details modal query.
- Local-day grouping uses the existing Amsterdam date helper for run timestamps.
- **Validation**:
- Test build executed with `./build-and-push.sh t`; pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` with digest `sha256:520778f4b72643c1cd1815fa424317ee2dce182ccfcbea687f4ac711b3d00fb0`.
### 2026-03-23
- **Synology ABB parser fix** (`parsers/synology.py`): ABB completion regex now also matches `has been completed` phrasing.
- **Job name parsing corrected for ABB mails**: messages like `backup task dc001 on DS220p has been completed` no longer fall back to generic Synology Active Backup parsing; `job_name` stays `dc001` instead of bracketed subject prefix values.
- **Validation**: test build ran successfully via `./build-and-push.sh t`; pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` with digest `sha256:19014477f2ae14eac0a62f07e11c923c83f9cd5e478873290bdcca37e6ab257c`.
- **Run Checks Autotask (Cove object source fix)** (`main/routes_run_checks.py`): ticket creation now reads object details from `run_object_links` + `customer_objects` first (same data source as Run Checks modal), then falls back to legacy `job_objects`/`mail_objects`.
- **Autotask ticket object filtering tightened** (`main/routes_run_checks.py`): ticket description now lists only problem objects (`failed`/`error`/`warning`/`missed`) and excludes completed/success objects (including `Completed (...)` text variants).
- **Validation**: latest test build ran successfully via `./build-and-push.sh t`; pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` with digest `sha256:b9bb6d50f131118ebccaed5834513ca83ec7592bd622ecea42c1ce2dd7bf0cfc`.
- **Autotask link-existing ticket note update** (`main/routes_run_checks.py`): linking a run to an existing Autotask ticket now posts an informational ticket note that another alert/run was linked (with customer/job/run context and Backupchecks deep-link).
- **Autotask link-existing API response enriched** (`main/routes_run_checks.py`): response now includes `note_posted` and `note_warning` so UI/operators can see if the extra note call succeeded.
- **Customers job-filter link visual alignment** (`templates/main/customers.html`, `static/css/layout.css`): customer links to filtered Jobs view now use sidebar text colors and hover behavior (`bc-sidebar-link-inline`) instead of Bootstrap `link-primary` blue.
- **Validation**: test build ran successfully via `./build-and-push.sh t`; pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` with digest `sha256:2ff1675996b27bf409687bf5c52e2a3cb3314728ce1c67bc3ffc14fbd0562427`.
- **Validation**: test build ran successfully via `./build-and-push.sh t`; pushed `gitea.oskamp.info/ivooskamp/backupchecks:dev` with digest `sha256:f87d871caa3501251c31a66f61eb94b249faf0d3ab85da3f0c02c6036855849b`.
### 2026-03-20 (v0.2.1)
- **Missed run false positive fix** (`routes_shared.py`):
- Weekly inference window: last 90 days only (was unbounded). Eliminates stale slot false positives after time-of-day or frequency changes.
- Cadence guard: if median gap between runs ≥ 20 days, skip weekly inference and let monthly inference handle the job. Fixes monthly jobs accumulating enough weekly hits after ~21 months.
- Monthly inference window: last 180 days (was unbounded).
- `MIN_OCCURRENCES` raised from 3 → 5 for weekly inference.
- **Objects sort fix** (`run_checks.html`, `job_detail.html`):
- `objectSeverityRank`: the `|| err` fallback in the rank-0 check caused Warning items with an `error_message` to rank as Critical. Fixed: only `error`/`failed`/`failure` status → rank 0; the `|| err` fallback moved to rank 1 (see the sketch after this list).
- **Mail iframe height fix** (`run_checks.html`):
- Flex rules were on `#rcm_body_iframe` but the iframe is not a direct flex child of `.rcm-mail-panel`. Fixed by moving `flex: 1 1 auto; min-height: 0` to the wrapper `#rcm_mail_iframe_body` and setting `height: 100%` on the iframe itself.
- **Inbox sidebar badge on all pages** (`__init__.py`):
- Added `inject_inbox_count()` Flask context processor — injects `inbox_count` into every template for authenticated users. Previously only injected in the dashboard route.
- **Jobs export/import schema v2** (`routes_settings.py`):
- Export: includes `cove_account` and `cloud_connect_account` per job.
- Import: accepts v1 and v2; links Cove/CC accounts on import if not yet linked to a different job.
- **Inbox re-parse progress modal** (`routes_inbox.py`, `inbox.html`):
- New `POST /inbox/reparse-batch` endpoint: 50 messages per call, 8 s time budget, keyset pagination, full auto-approve logic (including VSPC multi-company). Returns JSON progress.
- "Re-parse all" button replaced with modal trigger; JS loop calls batch endpoint until `done: true` and updates live progress bar + stats.
### 2026-03-20 (v0.2.0)
- **Layout v2**: complete sidebar-first redesign (`layout.css`, `base.html`). IBM Plex Sans/Mono fonts, CSS design tokens, fixed 220 px dark sidebar.
- **Veeam Cloud Connect importer**: HTML parser for daily report emails → `cloud_connect_accounts` staging table → `JobRun` records for linked accounts. `CloudConnectAccount` model + migrations. `/cloud-connect/accounts` review page. Sidebar link for admin/operator.
- **Cove historical run backfill**: `_backfill_colorbar_runs()` reconstructs up to 27 days of history from the `D09F08` colorbar on first run creation. Idempotent via `external_id = "cove-colorbar-{account_id}-{date}"` (decoding sketched after this list).
- **Cove run details popup**: Cove runs in job detail are clickable; popup fetches `/cove/run/<run_id>/detail` (structured Cove summary, per-datasource objects, mail section hidden).
- **Run Checks user preferences**: per-user sort mode + filter defaults stored in DB; `POST /run-checks/preferences`; User Settings page section.
- **Login captcha toggle**: Settings → General → Security card; `login_captcha_enabled` column with `DEFAULT TRUE`.
- **Cloud Connect unique key**: changed from `(user, section)` to `(user, section, repo_name)` — supports multiple repos per user.
- **Cloud Connect run detail popup**: shows structured CC summary instead of raw email; raw email accessible via toggle.
- **Entra SSO**: implemented (marked untested). Login/callback/logout flow, optional auto-provisioning, tenant/domain + security-group restrictions.
- **Fixes**: login page layout with flash messages; "Delete all jobs" timeout (replaced ORM with direct SQL); archived job auto-matching; Cove link sync between `cove_accounts.job_id``jobs.cove_account_id`; Cove run creation transaction scope.
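A sketch of the colorbar backfill referenced in this list. The character-to-status map and the right-to-left day orientation are assumptions for illustration only; the authoritative `D09F08` encoding lives in `cove_importer.py`:
```python
from datetime import date, timedelta

# Hypothetical character-to-status mapping; the real encoding must be
# taken from the Cove API documentation / cove_importer.py.
COLORBAR_STATUS = {"1": "Success", "2": "Warning", "3": "Failed", "0": None}

def backfill_colorbar_runs(account_id: int, colorbar: str, today: date) -> list[dict]:
    """Walk the colorbar from the most recent day backwards and emit one
    run stub per historical day, each with a deterministic external_id
    so re-imports deduplicate against the unique index."""
    runs = []
    days = min(len(colorbar), 28) - 1  # up to 27 historical days
    for offset in range(1, days + 1):
        status = COLORBAR_STATUS.get(colorbar[-1 - offset])
        if status is None:
            continue  # no run recorded on that day
        day = today - timedelta(days=offset)
        runs.append({
            "external_id": f"cove-colorbar-{account_id}-{day.isoformat()}",
            "status": status,
            "run_date": day,
        })
    return runs
```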
### 2026-02-23
- **Cove Data Protection full integration**:
- `cove_importer.py` Cove API client (login, paginated enumeration, status mapping, deduplication, per-datasource object persistence)

View File

@@ -1 +1 @@
v0.2.5 v0.2.1