Compare commits
156 Commits
| SHA1 | Author | Date |
|---|---|---|
| 588f788e31 | |||
| a919610d68 | |||
| da9ed8402e | |||
| 8bef63c18a | |||
| 7385ecf94c | |||
| f62c19ddf8 | |||
| 1064bc8d86 | |||
| 5e617cb6a9 | |||
| d467c060dc | |||
| 0d9159ef6f | |||
| 5b940e34f2 | |||
| 43502ae6f3 | |||
| a9cae0f8f5 | |||
| 1b5effc5d2 | |||
| c1aeee2a8c | |||
| aea6a866c9 | |||
| c228d6db19 | |||
| 88b267b8bd | |||
| 4f208aedd0 | |||
| caff435f96 | |||
| f3b1b56b6a | |||
| 596fc94e69 | |||
| 49f24595c3 | |||
| fd3f3765c3 | |||
| 2a03ff0764 | |||
| d7f6de7c23 | |||
| 57196948a7 | |||
| acb9fbb0ea | |||
| 3b48cd401a | |||
| 21f5c01148 | |||
| f539d62daf | |||
| b6a85d1c8e | |||
| 5549323ff2 | |||
| b4aa7ef2f6 | |||
| 9576d1047e | |||
| 1e3a64a78a | |||
| a05dbab574 | |||
| 0827fddaa5 | |||
| 61b8e97e34 | |||
| 9197c311f2 | |||
| 26848998e1 | |||
| 19ef9dc32a | |||
| e187bc3fa5 | |||
| 96092517b4 | |||
| 08437aff7f | |||
| 710aba97e4 | |||
| ff4942272f | |||
| f332e61288 | |||
| 82fff08ebb | |||
| e932fdf30a | |||
| 7f8dffa3ae | |||
| fec28b2bfa | |||
| 91062bdb0d | |||
| ff316d653a | |||
| 60c7e89dc2 | |||
| b0efa7f21d | |||
| b7875dbf55 | |||
| d400534069 | |||
| bb701d5f00 | |||
| 82b96c3448 | |||
| dee99a920d | |||
| 58dd2ce831 | |||
| 28f094f80b | |||
| 7693af9306 | |||
| 5ed4c41b80 | |||
| a910cc4abc | |||
| e12755321a | |||
| 240f8b5c90 | |||
| 02d7bdd5b8 | |||
| 753c14bb4e | |||
| ce245f7d49 | |||
| 34ac317607 | |||
| 3b087540cb | |||
| e5123952b2 | |||
| 4bbde92c8d | |||
| 7b3b89f50c | |||
| 52cd75e420 | |||
| 83d8d85f30 | |||
| 0ddeaf1896 | |||
| 89b9dd0264 | |||
| f91c081456 | |||
| 39bdd49fd0 | |||
| 5ec64e6a13 | |||
| 55c6f7ddd6 | |||
| 2667e44830 | |||
| 04f6041fe6 | |||
| 494f792c0d | |||
| bb804f9a1e | |||
| a4a6a60d45 | |||
| ddc6eaa12a | |||
| f6216b8803 | |||
| fb2651392c | |||
| e3303681e1 | |||
| 3c7f4c7926 | |||
| 3400af58d7 | |||
| 67fb063267 | |||
| ae1865dab3 | |||
| 92c67805e5 | |||
| fc0cf1ef96 | |||
| 899863a0de | |||
| e4e069a6b3 | |||
| dfca88d3bd | |||
| 5c0e1b08aa | |||
| 4b506986a6 | |||
| 5131d24751 | |||
| 63526be592 | |||
| b56cdacf6b | |||
| 4b3b6162a0 | |||
| a7a61fdd64 | |||
| 8407bf45ab | |||
| 0cabd2e0fc | |||
| 0c5dee307f | |||
| 0500491621 | |||
| 890553f23e | |||
| c5ff1e11a3 | |||
| c595c165ed | |||
| d272d12d24 | |||
| 2887a021ba | |||
| d5e3734b35 | |||
| 07e6630a89 | |||
| dabec03f91 | |||
| 36deb77806 | |||
| 82bdebb721 | |||
| f8a57efee0 | |||
| 46cc5b10ab | |||
| 4c18365753 | |||
| 4def0aad46 | |||
| 9025d70b8e | |||
| ef8d12065b | |||
| 25d1962f7b | |||
| 487f923064 | |||
| f780bbc399 | |||
| b46b7fbc21 | |||
| 9399082231 | |||
| 8a16ff010f | |||
| 748769afc0 | |||
| abb6780744 | |||
| 83a29a7a3c | |||
| 66f5a57fe0 | |||
| 473044bd67 | |||
| afd45cc568 | |||
| 3564bcf62f | |||
| 49fd29a6f2 | |||
| 49f6d41715 | |||
| 186807b098 | |||
| c68b401709 | |||
| 5b9b6f4c38 | |||
| 981d65c274 | |||
| 1a2ca59d16 | |||
| 83d487a206 | |||
| 490ab1ae34 | |||
| 1a64627a4e | |||
| d5fdc9a8d9 | |||
| f6310da575 | |||
| 48e7830957 | |||
| 777a9b4b31 |
@@ -1 +1 @@
-v20260207-02-wiki-documentation
+main
@@ -3,6 +3,180 @@ Changelog data structure for Backupchecks
 """
 
 CHANGELOG = [
+    {
+        "version": "v0.1.26",
+        "date": "2026-02-10",
+        "summary": "This critical bug fix release resolves ticket system display issues where resolved tickets were incorrectly appearing on new runs across multiple pages. The ticket system has been completely transitioned from date-based logic to explicit link-based queries, ensuring resolved tickets stop appearing immediately after resolution while preserving the audit trail for historical runs.",
+        "sections": [
+            {
+                "title": "Bug Fixes",
+                "type": "bugfix",
+                "subsections": [
+                    {
+                        "subtitle": "Ticket System - Resolved Ticket Display Issues",
+                        "changes": [
+                            "Root Cause: Multiple pages used legacy date-based logic (active_from_date <= run_date AND resolved_at >= run_date) instead of checking explicit ticket_job_runs links",
+                            "Impact: Resolved tickets kept appearing on ALL runs between active_from_date and resolved_at, even runs created after resolution",
+                            "Fixed: Ticket Linking (ticketing_utils.py) - Autotask tickets now propagate to new runs using an independent strategy that checks for the most recent non-deleted and non-resolved Autotask ticket",
+                            "Fixed: Internal tickets no longer link to new runs after resolution - removed date-based 'open' logic, now only links if COALESCE(ts.resolved_at, t.resolved_at) IS NULL",
+                            "Fixed: Job Details Page - Implemented two-source ticket display: direct links (ticket_job_runs) always shown for audit trail, active window (ticket_scopes) only shown if unresolved",
+                            "Fixed: Run Checks Main Page - Ticket/remark indicators (🎫/💬) now only show for genuinely unresolved tickets, removed date-based logic from existence queries",
+                            "Fixed: Run Checks Popup Modal - Replaced date-based queries in /api/job-runs/<run_id>/alerts with explicit JOIN queries (ticket_job_runs, remark_job_runs)",
+                            "Fixed: Run Checks Popup - Removed unused parameters (run_date, job_id, ui_tz) as they are no longer needed with link-based queries",
+                            "Testing: Temporarily added debug logging to link_open_internal_tickets_to_run (wrote to AuditLog with event_type 'ticket_link_debug'), removed after successful resolution",
+                            "Result: Resolved tickets stop appearing immediately after resolution, consistent behavior across all pages, audit trail preserved for historical runs",
+                            "Result: All queries now use explicit link-based logic with no date comparisons"
+                        ]
+                    },
+                    {
+                        "subtitle": "Test Email Generation",
+                        "changes": [
+                            "Reduced test email generation from 3 emails per status to 1 email per status for simpler testing",
+                            "Each button now creates exactly 1 test mail instead of 3"
+                        ]
+                    },
+                    {
+                        "subtitle": "User Interface",
+                        "changes": [
+                            "Updated Settings → Maintenance page text for test email generation to match actual behavior",
+                            "Changed description from '3 emails simulating Veeam, Synology, and NAKIVO' to '1 Veeam Backup Job email'",
+                            "Updated button labels from '(3)' to '(1)' on all test email generation buttons"
+                        ]
+                    }
+                ]
+            }
+        ]
+    },
+    {
+        "version": "v0.1.25",
+        "date": "2026-02-09",
+        "summary": "This release focuses on parser improvements and maintenance enhancements, adding support for new notification types across Synology and Veeam backup systems while improving system usability with orphaned job cleanup and test email generation features.",
+        "sections": [
+            {
+                "title": "Parser Enhancements",
+                "type": "feature",
+                "subsections": [
+                    {
+                        "subtitle": "Synology Parsers",
+                        "changes": [
+                            "Monthly Drive Health Reports: New parser for Synology NAS drive health notifications with Dutch and English support",
+                            "Supports 'Maandelijks schijfintegriteitsrapport' (Dutch) and 'Monthly Drive Health Report' (English) variants",
+                            "Automatic status detection: Healthy/Gezond/No problem detected → Success, otherwise → Warning",
+                            "Extracts hostname from subject or body pattern (Van/From NAS-HOSTNAME)",
+                            "Backup type: 'Health Report', Job name: 'Monthly Drive Health' (informational only, excluded from schedule learning)",
+                            "DSM Update Notifications - Extended Coverage: Added 4 new detection patterns for automatic installation announcements",
+                            "New patterns: 'belangrijke DSM-update', 'kritieke oplossingen', 'wordt automatisch geïnstalleerd', 'is beschikbaar op'",
+                            "Now recognizes 4 notification types: update cancelled, packages out-of-date, new update available, automatic installation scheduled",
+                            "All patterns added to existing lists, maintaining full backward compatibility",
+                            "Active Backup for Business - Skipped Tasks: Extended parser to recognize skipped/ignored backup tasks",
+                            "Detects Dutch ('genegeerd') and English ('skipped', 'ignored') status indicators as Warning status",
+                            "Common scenario: Backup skipped because the previous backup is still running"
+                        ]
+                    },
+                    {
+                        "subtitle": "Veeam Parsers",
+                        "changes": [
+                            "Job Not Started Errors: New detection for 'Job did not start on schedule' error notifications",
+                            "Recognizes VBO365 and other Veeam backup types that send plain text error notifications",
+                            "Extracts backup type from subject (e.g., 'Veeam Backup for Microsoft 365')",
+                            "Extracts job name from subject after colon (e.g., 'Backup MDS at Work')",
+                            "Reads error message from plain text body (handles base64 UTF-16 encoding)",
+                            "Sets overall_status to 'Error' for failed-to-start jobs",
+                            "Example message: 'Proxy server was offline at the time the job was scheduled to run.'"
+                        ]
+                    }
+                ]
+            },
+            {
+                "title": "Maintenance Improvements",
+                "type": "feature",
+                "subsections": [
+                    {
+                        "subtitle": "Orphaned Jobs Cleanup",
+                        "changes": [
+                            "Added 'Cleanup orphaned jobs' option in Settings → Maintenance",
+                            "Removes jobs without valid customer links (useful when customers are deleted)",
+                            "Permanently deletes job records along with all associated emails and job runs",
+                            "'Preview orphaned jobs' button shows a detailed list before deletion, with email and run counts",
+                            "Safety verification step to prevent accidental deletion"
+                        ]
+                    },
+                    {
+                        "subtitle": "Test Email Generation",
+                        "changes": [
+                            "Added 'Generate test emails' feature in Settings → Maintenance",
+                            "Three separate buttons to create fixed test email sets: Success, Warning, Error",
+                            "Each set contains exactly 3 Veeam Backup Job emails with the same job name 'Test-Backup-Job'",
+                            "Different dates, objects, and statuses for reproducible testing scenarios",
+                            "Proper status flow testing (success → warning → error progression)"
+                        ]
+                    }
+                ]
+            },
+            {
+                "title": "Data Privacy",
+                "type": "improvement",
+                "subsections": [
+                    {
+                        "subtitle": "Parser Registry Cleanup",
+                        "changes": [
+                            "Replaced real customer names in parser registry examples with generic placeholders",
+                            "Affected parsers: NTFS Auditing, QNAP Firmware Update, NAKIVO",
+                            "Example format now uses: NAS-HOSTNAME, SERVER-HOSTNAME, VM-HOSTNAME, example.local",
+                            "Ensures no customer information in the codebase or version control"
+                        ]
+                    },
+                    {
+                        "subtitle": "Autotask Integration",
+                        "changes": [
+                            "Removed customer name from Autotask ticket title for concise display",
+                            "Format changed from '[Backupchecks] Customer - Job Name - Status' to '[Backupchecks] Job Name - Status'",
+                            "Reduces redundancy (customer already visible in the ticket company field)"
+                        ]
+                    }
+                ]
+            },
+            {
+                "title": "Bug Fixes",
+                "type": "bugfix",
+                "subsections": [
+                    {
+                        "subtitle": "User Interface",
+                        "changes": [
+                            "Fixed responsive navbar overlapping page content on smaller screens",
+                            "Implemented dynamic padding adjustment using JavaScript",
+                            "Measures actual navbar height on page load, window resize, and navbar collapse toggle",
+                            "Automatically adjusts main content padding-top to prevent overlap",
+                            "Debounced resize handler for performance"
+                        ]
+                    }
+                ]
+            }
+        ]
+    },
+    {
+        "version": "v0.1.24",
+        "date": "2026-02-09",
+        "summary": "Bug fix release addressing the Autotask ticket description field being cleared when resolving tickets.",
+        "sections": [
+            {
+                "title": "Bug Fixes",
+                "type": "bugfix",
+                "subsections": [
+                    {
+                        "subtitle": "Autotask Integration",
+                        "changes": [
+                            "Fixed Autotask ticket description being cleared when resolving tickets via the update_ticket_resolution_safe function",
+                            "Root cause: The description field was not included in the PUT payload when updating ticket resolution, causing the Autotask API to set it to NULL",
+                            "Solution: Added 'description' to the optional_fields list so it is preserved from the GET response and included in the PUT request",
+                            "Impact: Ticket descriptions now remain intact when marking Autotask tickets as resolved",
+                            "Location: containers/backupchecks/src/backend/app/integrations/autotask/client.py line 672"
+                        ]
+                    }
+                ]
+            }
+        ]
+    },
     {
         "version": "v0.1.23",
         "date": "2026-02-09",
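The CHANGELOG entries above follow a fixed nesting: version → sections → subsections → changes. As a minimal illustration of consuming that structure, the sketch below flattens one entry to plain text; `render_entry` is a hypothetical helper for this page, not a function from the repository:

```python
# Minimal sketch (not from the repo): flatten one CHANGELOG entry to text.
# The nested shape mirrors the structure added in the diff above.
CHANGELOG = [
    {
        "version": "v0.1.24",
        "date": "2026-02-09",
        "summary": "Bug fix release addressing the Autotask ticket description field being cleared when resolving tickets.",
        "sections": [
            {
                "title": "Bug Fixes",
                "type": "bugfix",
                "subsections": [
                    {
                        "subtitle": "Autotask Integration",
                        "changes": [
                            "Fixed Autotask ticket description being cleared when resolving tickets",
                        ],
                    }
                ],
            }
        ],
    }
]


def render_entry(entry):
    """Render one changelog entry as indented plain text."""
    lines = [f"{entry['version']} ({entry['date']})", entry["summary"]]
    for section in entry.get("sections", []):
        lines.append(f"## {section['title']} [{section['type']}]")
        for sub in section.get("subsections", []):
            lines.append(f"### {sub['subtitle']}")
            lines.extend(f"- {change}" for change in sub.get("changes", []))
    return "\n".join(lines)


out = render_entry(CHANGELOG[0])
print(out.splitlines()[0])  # → v0.1.24 (2026-02-09)
```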
@@ -669,6 +669,7 @@ class AutotaskClient:
            "companyID",
            "queueID",
            "title",
+           "description",
            "priority",
            "dueDateTime",
            "ticketCategory",
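The added `"description"` entry matters because the resolution update issues a full-resource PUT: any field omitted from the payload is dropped by the server. A minimal sketch of that read-modify-write pattern with plain dicts (no real API calls; `OPTIONAL_FIELDS` and `build_update_payload` are illustrative names, and the field list is shortened):

```python
# Sketch of the read-modify-write pattern behind the fix above: when a PUT
# replaces the whole resource, every field omitted from the payload is lost,
# so optional fields must be copied from the GET response into the PUT body.
OPTIONAL_FIELDS = ["companyID", "queueID", "title", "description", "priority"]


def build_update_payload(current, changes):
    """Carry known optional fields over from the fetched ticket, then apply changes."""
    payload = {k: current[k] for k in OPTIONAL_FIELDS if k in current}
    payload.update(changes)
    return payload


fetched = {"id": 1, "title": "Backup failed", "description": "Job X failed", "status": 1}
payload = build_update_payload(fetched, {"status": 5})  # e.g. mark resolved
print(payload["description"])  # → Job X failed (the description survives the update)
```

Before the fix, `description` was missing from the preserved-field list, so the equivalent of `payload` here would have omitted it and the server nulled the field.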
@@ -16,33 +16,27 @@ def api_job_run_alerts(run_id: int):
     tickets = []
     remarks = []
 
-    # Tickets active for this job on this run date (including resolved-on-day)
+    # Tickets linked to this specific run
+    # Only show tickets that were explicitly linked via ticket_job_runs
     try:
         rows = (
             db.session.execute(
                 text(
                     """
-                    SELECT t.id,
+                    SELECT DISTINCT t.id,
                            t.ticket_code,
                            t.description,
                            t.start_date,
-                           COALESCE(ts.resolved_at, t.resolved_at) AS resolved_at,
+                           t.resolved_at,
                            t.active_from_date
                     FROM tickets t
-                    JOIN ticket_scopes ts ON ts.ticket_id = t.id
-                    WHERE ts.job_id = :job_id
-                      AND t.active_from_date <= :run_date
-                      AND (
-                        COALESCE(ts.resolved_at, t.resolved_at) IS NULL
-                        OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
-                      )
+                    JOIN ticket_job_runs tjr ON tjr.ticket_id = t.id
+                    WHERE tjr.job_run_id = :run_id
                     ORDER BY t.start_date DESC
                     """
                 ),
                 {
-                    "job_id": job.id if job else None,
-                    "run_date": run_date,
-                    "ui_tz": _get_ui_timezone_name(),
+                    "run_id": run_id,
                 },
             )
             .mappings()
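The effect of the link-based query can be shown with a toy SQLite stand-in for the schema (table and column names follow the diff, heavily simplified): a ticket appears on a run only if a `ticket_job_runs` row exists, so runs created after resolution never show it, while historical runs keep their links as an audit trail.

```python
import sqlite3

# Toy demonstration of the link-based query above, using a simplified
# sqlite stand-in schema (the real schema is Postgres with more columns).
db = sqlite3.connect(":memory:")
db.executescript(
    """
    CREATE TABLE tickets (id INTEGER PRIMARY KEY, ticket_code TEXT, resolved_at TEXT);
    CREATE TABLE ticket_job_runs (ticket_id INTEGER, job_run_id INTEGER);
    """
)
db.execute("INSERT INTO tickets VALUES (1, 'T100', '2026-02-10')")  # resolved ticket
db.execute("INSERT INTO ticket_job_runs VALUES (1, 101)")  # linked while it was open


def tickets_for_run(run_id):
    # Same shape as the new query: explicit JOIN, no date comparisons.
    rows = db.execute(
        """
        SELECT DISTINCT t.ticket_code
        FROM tickets t
        JOIN ticket_job_runs tjr ON tjr.ticket_id = t.id
        WHERE tjr.job_run_id = ?
        """,
        (run_id,),
    ).fetchall()
    return [r[0] for r in rows]


print(tickets_for_run(101))  # → ['T100'] (audit trail on the historical run)
print(tickets_for_run(102))  # → [] (run created after resolution: no link, no ticket)
```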
@@ -71,31 +65,22 @@ def api_job_run_alerts(run_id: int):
     except Exception as exc:
         return jsonify({"status": "error", "message": str(exc) or "Failed to load tickets."}), 500
 
-    # Remarks active for this job on this run date (including resolved-on-day)
+    # Remarks linked to this specific run
+    # Only show remarks that were explicitly linked via remark_job_runs
     try:
         rows = (
             db.session.execute(
                 text(
                     """
-                    SELECT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
+                    SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
                     FROM remarks r
-                    JOIN remark_scopes rs ON rs.remark_id = r.id
-                    WHERE rs.job_id = :job_id
-                      AND COALESCE(
-                        r.active_from_date,
-                        ((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
-                      ) <= :run_date
-                      AND (
-                        r.resolved_at IS NULL
-                        OR ((r.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
-                      )
+                    JOIN remark_job_runs rjr ON rjr.remark_id = r.id
+                    WHERE rjr.job_run_id = :run_id
                     ORDER BY r.start_date DESC
                     """
                 ),
                 {
-                    "job_id": job.id if job else None,
-                    "run_date": run_date,
-                    "ui_tz": _get_ui_timezone_name(),
+                    "run_id": run_id,
                 },
             )
             .mappings()
@@ -8,6 +8,35 @@ from ..database import db
 from ..models import SystemSettings
 
 
+def _get_or_create_settings_local():
+    """Return SystemSettings, creating a default row if missing.
+
+    This module should not depend on star-imported helpers for settings.
+    Mixed deployments (partial container updates) can otherwise raise a
+    NameError on /customers when the shared helper is not present.
+    """
+    settings = SystemSettings.query.first()
+    if settings is None:
+        settings = SystemSettings(
+            auto_import_enabled=False,
+            auto_import_interval_minutes=15,
+            auto_import_max_items=50,
+            manual_import_batch_size=50,
+            auto_import_cutoff_date=datetime.utcnow().date(),
+            ingest_eml_retention_days=7,
+        )
+        db.session.add(settings)
+        db.session.commit()
+    return settings
+
+
+# Explicit imports for robustness across mixed deployments.
+from datetime import datetime
+
+from ..database import db
+from ..models import SystemSettings
+
+
 def _get_or_create_settings_local():
     """Return SystemSettings, creating a default row if missing.
 
@@ -168,23 +168,61 @@ def job_detail(job_id: int):
         .all()
     )
 
-    # Tickets: mark runs that fall within the ticket active window
+    # Tickets: mark runs that fall within the ticket active window OR have direct links
     ticket_rows = []
     ticket_open_count = 0
     ticket_total_count = 0
 
+    # Map of run_id -> list of directly linked ticket codes (for audit trail)
+    direct_ticket_links = {}
+
     remark_rows = []
     remark_open_count = 0
     remark_total_count = 0
 
     run_dates = []
     run_date_map = {}
+    run_ids = []
     for r in runs:
         rd = _to_amsterdam_date(r.run_at) or _to_amsterdam_date(datetime.utcnow())
         run_date_map[r.id] = rd
+        run_ids.append(r.id)
         if rd:
             run_dates.append(rd)
 
+    # Get directly linked tickets for these runs (audit trail - show even if resolved)
+    if run_ids:
+        try:
+            rows = (
+                db.session.execute(
+                    text(
+                        """
+                        SELECT tjr.job_run_id, t.ticket_code, t.resolved_at
+                        FROM ticket_job_runs tjr
+                        JOIN tickets t ON t.id = tjr.ticket_id
+                        WHERE tjr.job_run_id = ANY(:run_ids)
+                        """
+                    ),
+                    {"run_ids": run_ids},
+                )
+                .mappings()
+                .all()
+            )
+            for rr in rows:
+                run_id = rr.get("job_run_id")
+                code = (rr.get("ticket_code") or "").strip()
+                resolved_at = rr.get("resolved_at")
+                if run_id not in direct_ticket_links:
+                    direct_ticket_links[run_id] = []
+                direct_ticket_links[run_id].append({
+                    "ticket_code": code,
+                    "resolved_at": resolved_at,
+                    "is_direct_link": True
+                })
+        except Exception:
+            pass
+
+    # Get active (unresolved) tickets for future runs
     if run_dates:
         min_date = min(run_dates)
         max_date = max(run_dates)
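The two-source display introduced above — direct links always shown for the audit trail, active-window tickets only while their window covers the run date — can be sketched as a pure function. `ticket_codes_for_run` is a hypothetical name; the data shapes follow the dicts built in the hunk:

```python
from datetime import date

# Sketch of the two-source ticket display: direct links are always shown
# (audit trail, even if resolved), active-window tickets only when the window
# covers the run date. Codes are de-duplicated across both sources.
def ticket_codes_for_run(run_id, run_date, direct_ticket_links, ticket_rows):
    codes = []
    # First: directly linked tickets (audit trail - always show)
    for tlink in direct_ticket_links.get(run_id, []):
        code = tlink.get("ticket_code", "")
        if code and code not in codes:
            codes.append(code)
    # Second: active-window tickets (the query already filtered out resolved ones)
    for tr in ticket_rows:
        if tr.get("is_direct_link"):
            continue  # already handled above
        af = tr.get("active_from_date")
        if af and af <= run_date:
            code = (tr.get("ticket_code") or "").strip()
            if code and code not in codes:
                codes.append(code)
    return codes


direct = {7: [{"ticket_code": "T1", "resolved_at": "2026-02-10", "is_direct_link": True}]}
window = [{"ticket_code": "T2", "active_from_date": date(2026, 2, 1), "is_direct_link": False}]
print(ticket_codes_for_run(7, date(2026, 2, 11), direct, window))  # → ['T1', 'T2']
print(ticket_codes_for_run(8, date(2026, 1, 15), direct, window))  # → [] (before window, no link)
```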
@@ -198,14 +236,10 @@ def job_detail(job_id: int):
                         JOIN ticket_scopes ts ON ts.ticket_id = t.id
                         WHERE ts.job_id = :job_id
                           AND t.active_from_date <= :max_date
-                          AND (
-                            COALESCE(ts.resolved_at, t.resolved_at) IS NULL
-                            OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :min_date
-                          )
+                          AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
                         """
                     ),
-                    {"job_id": job.id, "min_date": min_date,
-                     "ui_tz": _get_ui_timezone_name(), "max_date": max_date},
+                    {"job_id": job.id, "max_date": max_date},
                 )
                 .mappings()
                 .all()
@@ -214,7 +248,12 @@ def job_detail(job_id: int):
                 active_from = rr.get("active_from_date")
                 resolved_at = rr.get("resolved_at")
                 resolved_date = _to_amsterdam_date(resolved_at) if resolved_at else None
-                ticket_rows.append({"active_from_date": active_from, "resolved_date": resolved_date, "ticket_code": rr.get("ticket_code")})
+                ticket_rows.append({
+                    "active_from_date": active_from,
+                    "resolved_date": resolved_date,
+                    "ticket_code": rr.get("ticket_code"),
+                    "is_direct_link": False
+                })
         except Exception:
             ticket_rows = []
 
@@ -240,14 +279,10 @@ def job_detail(job_id: int):
                             r.active_from_date,
                             ((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
                           ) <= :max_date
-                          AND (
-                            r.resolved_at IS NULL
-                            OR ((r.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :min_date
-                          )
+                          AND r.resolved_at IS NULL
                         """
                     ),
-                    {"job_id": job.id, "min_date": min_date,
-                     "ui_tz": _get_ui_timezone_name(), "max_date": max_date},
+                    {"job_id": job.id, "max_date": max_date},
                 )
                 .mappings()
                 .all()
@@ -341,11 +376,22 @@ def job_detail(job_id: int):
             ticket_codes = []
             remark_items = []
 
+            # First: add directly linked tickets (audit trail - always show)
+            if r.id in direct_ticket_links:
+                for tlink in direct_ticket_links[r.id]:
+                    code = tlink.get("ticket_code", "")
+                    if code and code not in ticket_codes:
+                        ticket_codes.append(code)
+                        has_ticket = True
+
+            # Second: add active window tickets (only unresolved)
             if rd and ticket_rows:
                 for tr in ticket_rows:
+                    if tr.get("is_direct_link"):
+                        continue  # Skip, already added above
                     af = tr.get("active_from_date")
-                    resd = tr.get("resolved_date")
-                    if af and af <= rd and (resd is None or resd >= rd):
+                    # Only check active_from, resolved tickets already filtered by query
+                    if af and af <= rd:
                         has_ticket = True
                         code = (tr.get("ticket_code") or "").strip()
                         if code and code not in ticket_codes:
@@ -1068,14 +1068,11 @@ def run_checks_page():
                         JOIN ticket_scopes ts ON ts.ticket_id = t.id
                         WHERE ts.job_id = :job_id
                           AND t.active_from_date <= :run_date
-                          AND (
-                            COALESCE(ts.resolved_at, t.resolved_at) IS NULL
-                            OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
-                          )
+                          AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
                         LIMIT 1
                         """
                     ),
-                    {"job_id": job_id, "run_date": today_local, "ui_tz": ui_tz},
+                    {"job_id": job_id, "run_date": today_local},
                 ).first()
                 has_active_ticket = bool(t_exists)
 
@@ -1090,10 +1087,7 @@ def run_checks_page():
                             r.active_from_date,
                             ((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
                           ) <= :run_date
-                          AND (
-                            r.resolved_at IS NULL
-                            OR ((r.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
-                          )
+                          AND r.resolved_at IS NULL
                         LIMIT 1
                         """
                     ),
@@ -1464,7 +1458,7 @@ def api_run_checks_create_autotask_ticket():
         }
     )
 
-    subject = f"[Backupchecks] {customer.name} - {job.job_name or ''} - {status_display}"
+    subject = f"[Backupchecks] {job.job_name or ''} - {status_display}"
     description = _compose_autotask_ticket_description(
         settings=settings,
         job=job,
@ -124,6 +124,309 @@ def settings_jobs_delete_all():
|
|||||||
return redirect(url_for("main.settings"))
|
return redirect(url_for("main.settings"))
|
||||||
|
|
||||||
|
|
||||||
|
@main_bp.route("/settings/jobs/orphaned", methods=["GET"])
|
||||||
|
@login_required
|
||||||
|
@roles_required("admin")
|
||||||
|
def settings_jobs_orphaned():
|
||||||
|
"""Show list of orphaned jobs for verification before deletion."""
|
||||||
|
# Find jobs without valid customer
|
||||||
|
orphaned_jobs = Job.query.outerjoin(Customer, Job.customer_id == Customer.id).filter(
|
||||||
|
db.or_(
|
||||||
|
Job.customer_id.is_(None),
|
||||||
|
Customer.id.is_(None)
|
||||||
|
)
|
||||||
|
).order_by(Job.job_name.asc()).all()
|
||||||
|
|
||||||
|
# Build list with details
|
||||||
|
jobs_list = []
|
||||||
|
for job in orphaned_jobs:
|
||||||
|
run_count = JobRun.query.filter_by(job_id=job.id).count()
|
||||||
|
mail_count = JobRun.query.filter_by(job_id=job.id).filter(JobRun.mail_message_id.isnot(None)).count()
|
||||||
|
|
||||||
|
jobs_list.append({
|
||||||
|
"id": job.id,
|
||||||
|
"job_name": job.job_name or "Unnamed",
|
||||||
|
"backup_software": job.backup_software or "-",
|
||||||
|
"backup_type": job.backup_type or "-",
|
||||||
|
"customer_id": job.customer_id,
|
||||||
|
"run_count": run_count,
|
||||||
|
"mail_count": mail_count,
|
||||||
|
})
|
||||||
|
|
||||||
|
return render_template(
|
||||||
|
"main/settings_orphaned_jobs.html",
|
||||||
|
orphaned_jobs=jobs_list,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@main_bp.route("/settings/jobs/delete-orphaned", methods=["POST"])
|
||||||
|
@login_required
|
||||||
|
@roles_required("admin")
|
||||||
|
def settings_jobs_delete_orphaned():
|
||||||
|
"""Delete jobs that have no customer (customer_id is NULL or customer does not exist).
|
||||||
|
|
||||||
|
Also deletes all related emails from the database since the customer is gone.
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Find jobs without valid customer
|
||||||
|
orphaned_jobs = Job.query.outerjoin(Customer, Job.customer_id == Customer.id).filter(
|
||||||
|
db.or_(
|
||||||
|
Job.customer_id.is_(None),
|
||||||
|
Customer.id.is_(None)
|
||||||
|
)
|
||||||
|
).all()
|
||||||
|
|
||||||
|
if not orphaned_jobs:
|
||||||
|
flash("No orphaned jobs found.", "info")
|
||||||
|
return redirect(url_for("main.settings", section="maintenance"))
|
||||||
|
|
||||||
|
job_count = len(orphaned_jobs)
|
||||||
|
mail_count = 0
|
||||||
|
run_count = 0
|
||||||
|
|
||||||
|
# Collect mail message ids and run ids for cleanup
|
||||||
|
mail_message_ids = []
|
||||||
|
run_ids = []
|
||||||
|
job_ids = [job.id for job in orphaned_jobs]
|
||||||
|
|
||||||
|
for job in orphaned_jobs:
|
||||||
|
for run in job.runs:
|
||||||
|
if run.id is not None:
|
||||||
|
run_ids.append(run.id)
|
||||||
|
run_count += 1
|
||||||
|
if run.mail_message_id:
|
||||||
|
mail_message_ids.append(run.mail_message_id)
|
||||||
|
|
||||||
|
# Helper function for safe SQL execution
|
||||||
|
def _safe_execute(stmt, params):
|
||||||
|
try:
|
||||||
|
db.session.execute(stmt, params)
|
||||||
|
except Exception:
|
||||||
|
pass
|
||||||
|
|
||||||
|
# Clean up auxiliary tables that may not have ON DELETE CASCADE
|
||||||
|
if run_ids:
|
||||||
|
from sqlalchemy import text, bindparam
|
||||||
|
_safe_execute(
            text("DELETE FROM ticket_job_runs WHERE job_run_id IN :run_ids").bindparams(
                bindparam("run_ids", expanding=True)
            ),
            {"run_ids": run_ids},
        )
        _safe_execute(
            text("DELETE FROM remark_job_runs WHERE job_run_id IN :run_ids").bindparams(
                bindparam("run_ids", expanding=True)
            ),
            {"run_ids": run_ids},
        )

        if job_ids:
            from sqlalchemy import text, bindparam
            # Clean up scopes
            _safe_execute(
                text("DELETE FROM ticket_scopes WHERE job_id IN :job_ids").bindparams(
                    bindparam("job_ids", expanding=True)
                ),
                {"job_ids": job_ids},
            )
            _safe_execute(
                text("DELETE FROM remark_scopes WHERE job_id IN :job_ids").bindparams(
                    bindparam("job_ids", expanding=True)
                ),
                {"job_ids": job_ids},
            )
            # Clean up overrides
            _safe_execute(
                text("DELETE FROM overrides WHERE job_id IN :job_ids").bindparams(
                    bindparam("job_ids", expanding=True)
                ),
                {"job_ids": job_ids},
            )

            # Unlink mails from jobs before deleting jobs
            # mail_messages.job_id references jobs.id
            _safe_execute(
                text("UPDATE mail_messages SET job_id = NULL WHERE job_id IN :job_ids").bindparams(
                    bindparam("job_ids", expanding=True)
                ),
                {"job_ids": job_ids},
            )

        # Delete mail_objects before deleting mails
        # mail_objects.mail_message_id references mail_messages.id
        if mail_message_ids:
            from sqlalchemy import text, bindparam
            _safe_execute(
                text("DELETE FROM mail_objects WHERE mail_message_id IN :mail_ids").bindparams(
                    bindparam("mail_ids", expanding=True)
                ),
                {"mail_ids": mail_message_ids},
            )

        # Delete all orphaned jobs (runs/objects are cascaded via ORM relationships)
        for job in orphaned_jobs:
            db.session.delete(job)

        # Now delete related mails permanently (customer is gone)
        # This must happen AFTER deleting jobs/runs to avoid foreign key constraint violations
        if mail_message_ids:
            mail_count = len(mail_message_ids)
            MailMessage.query.filter(MailMessage.id.in_(mail_message_ids)).delete(synchronize_session=False)

        db.session.commit()

        flash(
            f"Deleted {job_count} orphaned job(s), {run_count} run(s), and {mail_count} email(s).",
            "success"
        )

        _log_admin_event(
            event_type="maintenance_delete_orphaned_jobs",
            message=f"Deleted {job_count} orphaned jobs, {run_count} runs, and {mail_count} emails",
            details=json.dumps({
                "jobs_deleted": job_count,
                "runs_deleted": run_count,
                "mails_deleted": mail_count,
            }),
        )

    except Exception as exc:
        db.session.rollback()
        print(f"[settings-jobs] Failed to delete orphaned jobs: {exc}")
        flash("Failed to delete orphaned jobs.", "danger")

    return redirect(url_for("main.settings", section="maintenance"))
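The cleanup above relies on SQLAlchemy's `bindparam(..., expanding=True)` to expand a Python list into an `IN (...)` clause at execution time. A minimal sketch of the same pattern using only the standard-library `sqlite3` module (the table name is borrowed from the snippet; `_safe_execute` and the real schema are not reproduced, and the empty-list guard mirrors why the route checks `if job_ids:` first):

```python
import sqlite3

def delete_job_runs(conn, run_ids):
    """Delete ticket_job_runs rows whose job_run_id is in the given list, in one statement."""
    if not run_ids:
        return 0  # nothing to do; an empty IN () would be a SQL syntax error
    placeholders = ", ".join("?" for _ in run_ids)
    cur = conn.execute(
        f"DELETE FROM ticket_job_runs WHERE job_run_id IN ({placeholders})", run_ids
    )
    return cur.rowcount

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ticket_job_runs (job_run_id INTEGER)")
conn.executemany("INSERT INTO ticket_job_runs VALUES (?)", [(1,), (2,), (3,)])
deleted = delete_job_runs(conn, [1, 3])
print(deleted)  # 2
```

The expanding bindparam in the route does the placeholder generation automatically, so the SQL text can stay a single named parameter regardless of list length.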

@main_bp.route("/settings/test-emails/generate/<status_type>", methods=["POST"])
@login_required
@roles_required("admin")
def settings_generate_test_emails(status_type):
    """Generate test emails in inbox for testing parsers and orphaned jobs cleanup.

    Fixed sets for consistent testing and reproducibility.
    """
    try:
        from datetime import datetime, timedelta

        # Fixed test email sets per status type (Veeam only for consistent testing)
        # Single email per status for simpler testing
        email_sets = {
            "success": [
                {
                    "from_address": "veeam@test.local",
                    "subject": 'Veeam Backup Job "Test-Backup-Job" finished with Success',
                    "body": """Backup job: Test-Backup-Job

Session details:
Start time: 2026-02-09 01:00:00
End time: 2026-02-09 02:15:00
Total size: 150 GB
Duration: 01:15:00

Processing VM-APP01
Success

Processing VM-DB01
Success

Processing VM-WEB01
Success

All backup operations completed without issues.""",
                },
            ],
            "warning": [
                {
                    "from_address": "veeam@test.local",
                    "subject": 'Veeam Backup Job "Test-Backup-Job" finished with WARNING',
                    "body": """Backup job: Test-Backup-Job

Session details:
Start time: 2026-02-09 01:00:00
End time: 2026-02-09 02:30:00
Total size: 148 GB
Duration: 01:30:00

Processing VM-APP01
Success

Processing VM-DB01
Warning
Warning: Low free space on target datastore

Processing VM-WEB01
Success

Backup completed but some files were skipped.""",
                },
            ],
            "error": [
                {
                    "from_address": "veeam@test.local",
                    "subject": 'Veeam Backup Job "Test-Backup-Job" finished with Failed',
                    "body": """Backup job: Test-Backup-Job

Session details:
Start time: 2026-02-09 01:00:00
End time: 2026-02-09 01:15:00
Total size: 0 GB
Duration: 00:15:00

Processing VM-APP01
Failed
Error: Cannot create snapshot: VSS error 0x800423f4

Processing VM-DB01
Success

Processing VM-WEB01
Success

Backup failed. Please check the logs for details.""",
                },
            ],
        }

        if status_type not in email_sets:
            flash("Invalid status type.", "danger")
            return redirect(url_for("main.settings", section="maintenance"))

        emails = email_sets[status_type]
        created_count = 0
        now = datetime.utcnow()

        for email_data in emails:
            mail = MailMessage(
                from_address=email_data["from_address"],
                subject=email_data["subject"],
                text_body=email_data["body"],
                html_body=f"<pre>{email_data['body']}</pre>",
                received_at=now - timedelta(hours=created_count),
                location="inbox",
                job_id=None,
            )
            db.session.add(mail)
            created_count += 1

        db.session.commit()

        flash(f"Generated {created_count} {status_type} test email(s) in inbox.", "success")

        _log_admin_event(
            event_type="maintenance_generate_test_emails",
            message=f"Generated {created_count} {status_type} test emails",
            details=json.dumps({"status_type": status_type, "count": created_count}),
        )

    except Exception as exc:
        db.session.rollback()
        print(f"[settings-test] Failed to generate test emails: {exc}")
        flash("Failed to generate test emails.", "danger")

    return redirect(url_for("main.settings", section="maintenance"))

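The generation loop stamps each successive test mail one hour earlier via `received_at=now - timedelta(hours=created_count)`, so generated mails sort naturally in the inbox. A small sketch of just that staggering logic, with a plain dict standing in for the `MailMessage` model:

```python
from datetime import datetime, timedelta

def stagger_received_at(emails, now):
    """Mirror the route's loop: each successive test mail is stamped one hour earlier."""
    stamped = []
    for created_count, email_data in enumerate(emails):
        stamped.append({
            "subject": email_data["subject"],
            "received_at": now - timedelta(hours=created_count),
        })
    return stamped

now = datetime(2026, 2, 9, 12, 0, 0)
mails = stagger_received_at([{"subject": "A"}, {"subject": "B"}, {"subject": "C"}], now)
print([m["received_at"].hour for m in mails])  # [12, 11, 10]
```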
@main_bp.route("/settings/objects/backfill", methods=["POST"])
@login_required
@roles_required("admin")

@ -50,13 +50,13 @@ PARSER_DEFINITIONS = [
         },
         "description": "Parses NTFS Auditing file audit report mails (attachment-based HTML reports).",
         "example": {
-            "subject": "Bouter btr-dc001.bouter.nl file audits → 6 ↑ 12",
-            "from_address": "auditing@bouter.nl",
+            "subject": "SERVER-HOSTNAME file audits → 6 ↑ 12",
+            "from_address": "auditing@example.local",
             "body_snippet": "(empty body, HTML report in attachment)",
             "parsed_result": {
                 "backup_software": "NTFS Auditing",
                 "backup_type": "Audit",
-                "job_name": "btr-dc001.bouter.nl file audits",
+                "job_name": "SERVER-HOSTNAME file audits",
                 "objects": [],
             },
         },
@ -73,16 +73,68 @@ PARSER_DEFINITIONS = [
         },
         "description": "Parses QNAP Notification Center firmware update notifications (informational; excluded from reporting and missing logic).",
         "example": {
-            "subject": "[Info][Firmware Update] Notification from your device: BETSIES-NAS01",
+            "subject": "[Info][Firmware Update] Notification from your device: NAS-HOSTNAME",
             "from_address": "notifications@customer.tld",
-            "body_snippet": "NAS Name: BETSIES-NAS01\n...\nMessage: ...",
+            "body_snippet": "NAS Name: NAS-HOSTNAME\n...\nMessage: ...",
             "parsed_result": {
                 "backup_software": "QNAP",
                 "backup_type": "Firmware Update",
                 "job_name": "Firmware Update",
                 "overall_status": "Warning",
                 "objects": [
-                    {"name": "BETSIES-NAS01", "status": "Warning", "error_message": None}
+                    {"name": "NAS-HOSTNAME", "status": "Warning", "error_message": None}
                 ],
             },
         },
+        {
+            "name": "synology_dsm_update",
+            "backup_software": "Synology",
+            "backup_types": ["Updates"],
+            "order": 236,
+            "enabled": True,
+            "match": {
+                "subject_contains_any": ["DSM-update", "DSM update"],
+                "body_contains_any": ["automatische DSM-update", "automatic DSM update", "Automatic update of DSM"],
+            },
+            "description": "Parses Synology DSM automatic update cancelled notifications (informational; excluded from reporting and missing logic).",
+            "example": {
+                "subject": "Synology NAS-HOSTNAME - Automatische DSM-update op NAS-HOSTNAME is geannuleerd door het systeem",
+                "from_address": "backup@example.local",
+                "body_snippet": "Het systeem heeft de automatische DSM-update op NAS-HOSTNAME geannuleerd...",
+                "parsed_result": {
+                    "backup_software": "Synology",
+                    "backup_type": "Updates",
+                    "job_name": "Synology Automatic Update",
+                    "overall_status": "Warning",
+                    "objects": [
+                        {"name": "NAS-HOSTNAME", "status": "Warning"}
+                    ],
+                },
+            },
+        },
+        {
+            "name": "synology_drive_health",
+            "backup_software": "Synology",
+            "backup_types": ["Health Report"],
+            "order": 237,
+            "enabled": True,
+            "match": {
+                "subject_contains_any": ["schijfintegriteitsrapport", "Drive Health Report"],
+                "body_contains_any": ["health of the drives", "integriteitsrapport van de schijven"],
+            },
+            "description": "Parses Synology monthly drive health reports (informational; excluded from reporting and missing logic).",
+            "example": {
+                "subject": "[NAS-HOSTNAME] Monthly Drive Health Report on NAS-HOSTNAME - Healthy",
+                "from_address": "nas@example.local",
+                "body_snippet": "The following is your monthly report regarding the health of the drives on NAS-HOSTNAME. No problem detected with the drives in DSM.",
+                "parsed_result": {
+                    "backup_software": "Synology",
+                    "backup_type": "Health Report",
+                    "job_name": "Monthly Drive Health",
+                    "overall_status": "Success",
+                    "objects": [
+                        {"name": "NAS-HOSTNAME", "status": "Success"}
+                    ],
+                },
+            },
+        },

@ -383,16 +435,16 @@ PARSER_DEFINITIONS = [
         },
         "description": "Parses NAKIVO Backup & Replication reports for VMware backup jobs.",
         "example": {
-            "subject": '"exchange01.kuiperbv.nl" job: Successful',
+            "subject": '"VM-HOSTNAME" job: Successful',
             "from_address": "NAKIVO Backup & Replication <administrator@customer.local>",
             "body_snippet": "Job Run Report... Backup job for VMware ... Successful",
             "parsed_result": {
                 "backup_software": "NAKIVO",
                 "backup_type": "Backup job for VMware",
-                "job_name": "exchange01.kuiperbv.nl",
+                "job_name": "VM-HOSTNAME",
                 "objects": [
                     {
-                        "name": "exchange01.kuiperbv.nl",
+                        "name": "VM-HOSTNAME",
                         "status": "Success",
                         "error_message": "",
                     }
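Each `PARSER_DEFINITIONS` entry declares its trigger as `subject_contains_any` / `body_contains_any` lists. A minimal matcher sketch showing how such a definition could be evaluated (the function name and exact semantics are assumptions for illustration; the repository's real dispatch code is not shown in this diff):

```python
def definition_matches(definition, subject, body):
    """Case-insensitive 'any substring' match against a parser definition's match block."""
    match = definition.get("match", {})
    subject_l = (subject or "").lower()
    body_l = (body or "").lower()
    subj_needles = match.get("subject_contains_any", [])
    body_needles = match.get("body_contains_any", [])
    # An absent list places no constraint; a present list requires at least one hit.
    subj_ok = not subj_needles or any(n.lower() in subject_l for n in subj_needles)
    body_ok = not body_needles or any(n.lower() in body_l for n in body_needles)
    return subj_ok and body_ok

definition = {
    "name": "synology_dsm_update",
    "match": {
        "subject_contains_any": ["DSM-update", "DSM update"],
        "body_contains_any": ["automatische DSM-update", "automatic DSM update"],
    },
}
print(definition_matches(
    definition,
    "Automatische DSM-update op NAS01",
    "Het systeem heeft de automatische DSM-update geannuleerd",
))  # True
print(definition_matches(definition, "Unrelated subject", "no keywords here"))  # False
```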
@ -18,10 +18,23 @@ DSM_UPDATE_CANCELLED_PATTERNS = [
     "Automatische update van DSM is geannuleerd",
     "Automatic DSM update was cancelled",
     "Automatic update of DSM was cancelled",
+    "Automatische DSM-update",
+    "DSM-update op",
+    "Packages on",
+    "out-of-date",
+    "Package Center",
+    "new DSM update",
+    "Auto Update has detected",
+    "new version of DSM",
+    "Update & Restore",
+    "belangrijke DSM-update",
+    "kritieke oplossingen",
+    "wordt automatisch geïnstalleerd",
+    "is beschikbaar op",
 ]

 _DSM_UPDATE_CANCELLED_HOST_RE = re.compile(
-    r"\b(?:geannuleerd\s+op|cancelled\s+on)\s+(?P<host>[A-Za-z0-9._-]+)\b",
+    r"\b(?:geannuleerd\s+op|cancelled\s+on|DSM-update\s+op|DSM\s+update\s+on|Packages\s+on|running\s+on|detected\s+on)\s+(?P<host>[A-Za-z0-9._-]+)\b",
     re.I,
 )

@ -60,6 +73,75 @@ def _parse_synology_dsm_update_cancelled(subject: str, text: str) -> Tuple[bool,

     return True, result, objects
+
+
+# --- Synology Drive Health Report (informational, excluded from reporting) ---
+
+DRIVE_HEALTH_PATTERNS = [
+    "schijfintegriteitsrapport",
+    "Drive Health Report",
+    "Monthly Drive Health",
+    "health of the drives",
+    "integriteitsrapport van de schijven",
+]
+
+_DRIVE_HEALTH_SUBJECT_RE = re.compile(
+    r"\b(?:schijfintegriteitsrapport\s+over|Drive\s+Health\s+Report\s+on)\s+(?P<host>[A-Za-z0-9._-]+)",
+    re.I,
+)
+
+_DRIVE_HEALTH_FROM_RE = re.compile(r"\b(?:Van|From)\s+(?P<host>[A-Za-z0-9._-]+)\b", re.I)
+
+_DRIVE_HEALTH_STATUS_HEALTHY_RE = re.compile(
+    r"\b(?:Gezond|Healthy|geen\s+problemen\s+gedetecteerd|No\s+problem\s+detected)\b",
+    re.I,
+)
+
+
+def _is_synology_drive_health(subject: str, text: str) -> bool:
+    haystack = f"{subject}\n{text}".lower()
+    return any(p.lower() in haystack for p in DRIVE_HEALTH_PATTERNS)
+
+
+def _parse_synology_drive_health(subject: str, text: str) -> Tuple[bool, Dict, List[Dict]]:
+    haystack = f"{subject}\n{text}"
+    host = ""
+
+    # Try to extract hostname from subject first
+    m = _DRIVE_HEALTH_SUBJECT_RE.search(subject or "")
+    if m:
+        host = (m.group("host") or "").strip()
+
+    # Fallback: extract from body "Van/From NAS-NAME"
+    if not host:
+        m = _DRIVE_HEALTH_FROM_RE.search(text or "")
+        if m:
+            host = (m.group("host") or "").strip()
+
+    # Determine status based on health indicators
+    overall_status = "Success"
+    overall_message = "Healthy"
+
+    if not _DRIVE_HEALTH_STATUS_HEALTHY_RE.search(haystack):
+        # If we don't find healthy indicators, mark as Warning
+        overall_status = "Warning"
+        overall_message = "Drive health issue detected"
+
+    # Informational job: show in Run Checks, but do not participate in schedules / reporting.
+    result: Dict = {
+        "backup_software": "Synology",
+        "backup_type": "Health Report",
+        "job_name": "Monthly Drive Health",
+        "overall_status": overall_status,
+        "overall_message": overall_message + (f" ({host})" if host else ""),
+    }
+
+    objects: List[Dict] = []
+    if host:
+        objects.append({"name": host, "status": overall_status})
+
+    return True, result, objects
+
+
 _BR_RE = re.compile(r"<\s*br\s*/?\s*>", re.I)
 _TAG_RE = re.compile(r"<[^>]+>")
 _WS_RE = re.compile(r"[\t\r\f\v ]+")

@ -176,12 +258,14 @@ _ABB_SUBJECT_RE = re.compile(r"\bactive\s+backup\s+for\s+business\b", re.I)
 # Examples (NL):
 # "De back-uptaak vSphere-Task-1 op KANTOOR-NEW is voltooid."
 # "Virtuele machine back-uptaak vSphere-Task-1 op KANTOOR-NEW is gedeeltelijk voltooid."
+# "back-uptaak vSphere-Task-1 op KANTOOR-NEW is genegeerd"
 # Examples (EN):
 # "The backup task vSphere-Task-1 on KANTOOR-NEW has completed."
 # "Virtual machine backup task vSphere-Task-1 on KANTOOR-NEW partially completed."
+# "backup task vSphere-Task-1 on KANTOOR-NEW was skipped"
 _ABB_COMPLETED_RE = re.compile(
-    r"\b(?:virtuele\s+machine\s+)?(?:de\s+)?back-?up\s*taak\s+(?P<job>.+?)\s+op\s+(?P<host>[A-Za-z0-9._-]+)\s+is\s+(?P<status>voltooid|gedeeltelijk\s+voltooid)\b"
-    r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+task\s+(?P<job_en>.+?)\s+on\s+(?P<host_en>[A-Za-z0-9._-]+)\s+(?:is\s+)?(?P<status_en>completed|finished|has\s+completed|partially\s+completed)\b",
+    r"\b(?:virtuele\s+machine\s+)?(?:de\s+)?back-?up\s*(?:taak|job)\s+(?:van\s+deze\s+taak\s+)?(?P<job>.+?)\s+op\s+(?P<host>[A-Za-z0-9._-]+)\s+is\s+(?P<status>voltooid|gedeeltelijk\s+voltooid|genegeerd)\b"
+    r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+(?:task|job)\s+(?P<job_en>.+?)\s+on\s+(?P<host_en>[A-Za-z0-9._-]+)\s+(?:is\s+|was\s+)?(?P<status_en>completed|finished|has\s+completed|partially\s+completed|skipped|ignored)\b",
     re.I,
 )

@ -233,6 +317,11 @@ def _parse_active_backup_for_business(subject: str, text: str) -> Tuple[bool, Di
         overall_status = "Warning"
         overall_message = "Partially completed"

+    # "genegeerd" / "skipped" / "ignored" should be treated as Warning
+    if "genegeerd" in status_raw or "skipped" in status_raw or "ignored" in status_raw:
+        overall_status = "Warning"
+        overall_message = "Skipped"
+
     # Explicit failure wording overrides everything
     if _ABB_FAILED_RE.search(haystack):
         overall_status = "Error"

@ -489,6 +578,12 @@ def try_parse_synology(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
         if ok:
             return True, result, objects

+    # Drive Health Report (informational; no schedule; excluded from reporting)
+    if _is_synology_drive_health(subject, text):
+        ok, result, objects = _parse_synology_drive_health(subject, text)
+        if ok:
+            return True, result, objects
+
     # DSM Account Protection (informational; no schedule)
     if _is_synology_account_protection(subject, text):
         ok, result, objects = _parse_account_protection(subject, text)
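The widened `_DSM_UPDATE_CANCELLED_HOST_RE` now recognizes several Dutch and English lead-in phrases before the hostname. A standalone sketch of the same pattern, showing what it extracts (the example sentences are illustrative, not taken from real notifications):

```python
import re

# Same pattern as the widened _DSM_UPDATE_CANCELLED_HOST_RE above.
HOST_RE = re.compile(
    r"\b(?:geannuleerd\s+op|cancelled\s+on|DSM-update\s+op|DSM\s+update\s+on|Packages\s+on|running\s+on|detected\s+on)\s+(?P<host>[A-Za-z0-9._-]+)\b",
    re.I,
)

def extract_host(text):
    """Return the hostname following one of the known lead-in phrases, or None."""
    m = HOST_RE.search(text or "")
    return m.group("host") if m else None

# Dutch phrasing: "DSM-update op <host>"
print(extract_host("Automatische DSM-update op NAS01 is geannuleerd door het systeem"))  # NAS01
# English phrasing: "detected on <host>"; dots and dashes are allowed in the host
print(extract_host("A new DSM update was detected on SYNO-BACKUP.example.local today"))
print(extract_host("no hostname here"))  # None
```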
@ -1177,6 +1177,38 @@ def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
         }
         return True, result, []

+    # Job did not start on schedule: special error notification (no objects, plain text body).
+    # Example subject: "[Veeam Backup for Microsoft 365] Job did not start on schedule: Backup MDS at Work"
+    subject_lower = subject.lower()
+    if 'job did not start on schedule' in subject_lower:
+        # Extract backup type from subject (e.g., "Veeam Backup for Microsoft 365")
+        backup_type = None
+        for candidate in VEEAM_BACKUP_TYPES:
+            if candidate.lower() in subject_lower:
+                backup_type = candidate
+                break
+        if not backup_type:
+            backup_type = "Backup Job"
+
+        # Extract job name after the colon (e.g., "Backup MDS at Work")
+        job_name = None
+        m_job = re.search(r'job did not start on schedule:\s*(.+)$', subject, re.IGNORECASE)
+        if m_job:
+            job_name = (m_job.group(1) or '').strip()
+
+        # Get overall message from text_body (can be base64 encoded)
+        text_body = (getattr(msg, 'text_body', None) or '').strip()
+        overall_message = text_body if text_body else 'Job did not start on schedule'
+
+        result = {
+            'backup_software': 'Veeam',
+            'backup_type': backup_type,
+            'job_name': job_name or 'Unknown Job',
+            'overall_status': 'Error',
+            'overall_message': overall_message,
+        }
+        return True, result, []
+
     # Configuration Job detection (may not have object details)
     subj_lower = subject.lower()
     is_config_job = ('backup configuration job' in subj_lower) or ('configuration backup for' in html_lower)
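The new "did not start on schedule" branch derives everything from the subject line. A self-contained sketch of that subject parsing (the `VEEAM_BACKUP_TYPES` list here is an assumed illustrative subset; the real list lives elsewhere in the parser module):

```python
import re

# Assumed subset of VEEAM_BACKUP_TYPES, for illustration only.
VEEAM_BACKUP_TYPES = ["Veeam Backup for Microsoft 365", "Veeam Backup & Replication"]

def parse_schedule_miss(subject):
    """Mirror the branch above: detect the notification, pull type + job name from the subject."""
    subject_lower = subject.lower()
    if 'job did not start on schedule' not in subject_lower:
        return None
    backup_type = next(
        (c for c in VEEAM_BACKUP_TYPES if c.lower() in subject_lower), "Backup Job"
    )
    m = re.search(r'job did not start on schedule:\s*(.+)$', subject, re.IGNORECASE)
    job_name = (m.group(1).strip() if m else None) or 'Unknown Job'
    return {'backup_type': backup_type, 'job_name': job_name, 'overall_status': 'Error'}

parsed = parse_schedule_miss(
    "[Veeam Backup for Microsoft 365] Job did not start on schedule: Backup MDS at Work"
)
print(parsed)
# {'backup_type': 'Veeam Backup for Microsoft 365', 'job_name': 'Backup MDS at Work', 'overall_status': 'Error'}
```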
@ -170,27 +170,23 @@ def link_open_internal_tickets_to_run(*, run: JobRun, job: Job) -> None:
     ui_tz = _get_ui_timezone_name()
     run_date = _to_ui_date(getattr(run, "run_at", None)) or _to_ui_date(datetime.utcnow())

-    # Find open tickets scoped to this job for the run date window.
-    # This matches the logic used by Job Details and Run Checks indicators.
+    # Find open (unresolved) tickets scoped to this job.
     rows = []
     try:
         rows = (
             db.session.execute(
                 text(
                     """
-                    SELECT t.id, t.ticket_code
+                    SELECT t.id, t.ticket_code, t.resolved_at, ts.resolved_at as scope_resolved_at
                     FROM tickets t
                     JOIN ticket_scopes ts ON ts.ticket_id = t.id
                     WHERE ts.job_id = :job_id
                       AND t.active_from_date <= :run_date
-                      AND (
-                          COALESCE(ts.resolved_at, t.resolved_at) IS NULL
-                          OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
-                      )
+                      AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
                     ORDER BY t.start_date DESC, t.id DESC
                     """
                 ),
-                {"job_id": int(job.id), "run_date": run_date, "ui_tz": ui_tz},
+                {"job_id": int(job.id), "run_date": run_date},
             )
             .fetchall()
         )

@ -201,7 +197,7 @@ def link_open_internal_tickets_to_run(*, run: JobRun, job: Job) -> None:
         return

     # Link all open tickets to this run (idempotent)
-    for tid, _code in rows:
+    for tid, code, t_resolved, ts_resolved in rows:
         if not TicketJobRun.query.filter_by(ticket_id=int(tid), job_run_id=int(run.id)).first():
             db.session.add(TicketJobRun(ticket_id=int(tid), job_run_id=int(run.id), link_source="inherit"))

@ -213,20 +209,49 @@ def link_open_internal_tickets_to_run(*, run: JobRun, job: Job) -> None:
     except Exception:
         pass

+    # Strategy 1: Use internal ticket code to find matching Autotask-linked run
+    # The query above only returns unresolved tickets, so we can safely propagate.
     try:
         # Use the newest ticket code to find a matching prior Autotask-linked run.
-        newest_code = (rows[0][1] or "").strip()
-        if not newest_code:
-            return
+        # rows format: (tid, code, t_resolved, ts_resolved)
+        newest_code = (rows[0][1] or "").strip() if rows else ""
+        if newest_code:
+            prior = (
+                JobRun.query.filter(JobRun.job_id == job.id)
+                .filter(JobRun.autotask_ticket_id.isnot(None))
+                .filter(JobRun.autotask_ticket_number == newest_code)
+                .order_by(JobRun.id.desc())
+                .first()
+            )
+            if prior and getattr(prior, "autotask_ticket_id", None):
+                run.autotask_ticket_id = prior.autotask_ticket_id
+                run.autotask_ticket_number = prior.autotask_ticket_number
+                run.autotask_ticket_created_at = getattr(prior, "autotask_ticket_created_at", None)
+                run.autotask_ticket_created_by_user_id = getattr(prior, "autotask_ticket_created_by_user_id", None)
+                return
+    except Exception:
+        pass
+
+    # Strategy 2: Direct Autotask propagation (independent of internal ticket status)
+    # Find the most recent non-deleted, non-resolved Autotask ticket for this job.
+    try:
         prior = (
             JobRun.query.filter(JobRun.job_id == job.id)
             .filter(JobRun.autotask_ticket_id.isnot(None))
-            .filter(JobRun.autotask_ticket_number == newest_code)
+            .filter(JobRun.autotask_ticket_deleted_at.is_(None))
             .order_by(JobRun.id.desc())
             .first()
         )
         if prior and getattr(prior, "autotask_ticket_id", None):
+            # Check if the internal ticket is resolved (Autotask tickets are resolved via internal Ticket)
+            ticket_number = (getattr(prior, "autotask_ticket_number", None) or "").strip()
+            if ticket_number:
+                internal_ticket = Ticket.query.filter_by(ticket_code=ticket_number).first()
+                if internal_ticket and getattr(internal_ticket, "resolved_at", None):
+                    # Ticket is resolved, don't propagate
+                    return
+
+            # Ticket is not deleted and not resolved, propagate it
             run.autotask_ticket_id = prior.autotask_ticket_id
             run.autotask_ticket_number = prior.autotask_ticket_number
             run.autotask_ticket_created_at = getattr(prior, "autotask_ticket_created_at", None)
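The tightened WHERE clause keeps a ticket only while `COALESCE(ts.resolved_at, t.resolved_at)` is NULL, i.e. a scope-level resolution, when set, takes precedence over the ticket-level one. A sketch of that semantics using the standard-library `sqlite3` module on a throwaway, minimal stand-in schema (table and column names mirror the query; the real tables have more columns):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, ticket_code TEXT, resolved_at TEXT);
CREATE TABLE ticket_scopes (ticket_id INTEGER, job_id INTEGER, resolved_at TEXT);
INSERT INTO tickets VALUES (1, 'T-001', NULL);          -- fully open
INSERT INTO tickets VALUES (2, 'T-002', NULL);          -- open ticket, but its scope is resolved
INSERT INTO tickets VALUES (3, 'T-003', '2026-02-01');  -- resolved at ticket level
INSERT INTO ticket_scopes VALUES (1, 10, NULL);
INSERT INTO ticket_scopes VALUES (2, 10, '2026-02-02');
INSERT INTO ticket_scopes VALUES (3, 10, NULL);
""")

# Only tickets where neither the scope nor the ticket carries a resolved_at survive.
rows = conn.execute("""
    SELECT t.ticket_code
    FROM tickets t
    JOIN ticket_scopes ts ON ts.ticket_id = t.id
    WHERE ts.job_id = :job_id
      AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
""", {"job_id": 10}).fetchall()

print([r[0] for r in rows])  # ['T-001']
```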
@ -197,7 +197,7 @@
     </div>
 </nav>

-<main class="{% block main_class %}container content-container{% endblock %}" style="padding-top: 80px;">
+<main class="{% block main_class %}container content-container{% endblock %}" id="main-content">
     {% with messages = get_flashed_messages(with_categories=true) %}
     {% if messages %}
     <div class="mb-3">

@ -216,6 +216,58 @@
 <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js"></script>

+<script>
+    // Dynamic navbar height adjustment
+    (function () {
+        function adjustContentPadding() {
+            try {
+                var navbar = document.querySelector('.navbar.fixed-top');
+                var mainContent = document.getElementById('main-content');
+                if (!navbar || !mainContent) return;
+
+                // Get actual navbar height
+                var navbarHeight = navbar.offsetHeight;
+
+                // Add small buffer (20px) for visual spacing
+                var paddingTop = navbarHeight + 20;
+
+                // Apply padding to main content
+                mainContent.style.paddingTop = paddingTop + 'px';
+            } catch (e) {
+                // Fallback to 80px if something goes wrong
+                var mainContent = document.getElementById('main-content');
+                if (mainContent) {
+                    mainContent.style.paddingTop = '80px';
+                }
+            }
+        }
+
+        // Run on page load
+        if (document.readyState === 'loading') {
+            document.addEventListener('DOMContentLoaded', adjustContentPadding);
+        } else {
+            adjustContentPadding();
+        }
+
+        // Run after navbar is fully rendered
+        window.addEventListener('load', adjustContentPadding);
+
+        // Run on window resize
+        var resizeTimeout;
+        window.addEventListener('resize', function () {
+            clearTimeout(resizeTimeout);
+            resizeTimeout = setTimeout(adjustContentPadding, 100);
+        });
+
+        // Run when navbar collapse is toggled
+        var navbarCollapse = document.getElementById('navbarNav');
+        if (navbarCollapse) {
+            navbarCollapse.addEventListener('shown.bs.collapse', adjustContentPadding);
+            navbarCollapse.addEventListener('hidden.bs.collapse', adjustContentPadding);
+        }
+    })();
+</script>
+
 <script>
 (function () {
     function isOverflowing(el) {
@ -549,6 +549,36 @@
|
|||||||
</div>
|
</div>
|
||||||
</div>
|
</div>

<div class="col-12 col-lg-6">
    <div class="card h-100 border-warning">
        <div class="card-header bg-warning">Cleanup orphaned jobs</div>
        <div class="card-body">
            <p class="mb-3">Delete jobs that are no longer linked to an existing customer. Related emails and runs will be <strong>permanently deleted</strong> from the database.</p>
            <a href="{{ url_for('main.settings_jobs_orphaned') }}" class="btn btn-warning">Preview orphaned jobs</a>
        </div>
    </div>
</div>

<div class="col-12 col-lg-6">
    <div class="card h-100 border-info">
        <div class="card-header bg-info text-white">Generate test emails</div>
        <div class="card-body">
            <p class="mb-3">Generate Veeam test emails in the inbox for testing parsers and maintenance operations. Each button creates 1 Veeam Backup Job email with the specified status.</p>
            <div class="d-flex flex-column gap-2">
                <form method="post" action="{{ url_for('main.settings_generate_test_emails', status_type='success') }}">
                    <button type="submit" class="btn btn-success w-100">Generate success email (1)</button>
                </form>
                <form method="post" action="{{ url_for('main.settings_generate_test_emails', status_type='warning') }}">
                    <button type="submit" class="btn btn-warning w-100">Generate warning email (1)</button>
                </form>
                <form method="post" action="{{ url_for('main.settings_generate_test_emails', status_type='error') }}">
                    <button type="submit" class="btn btn-danger w-100">Generate error email (1)</button>
                </form>
            </div>
        </div>
    </div>
</div>
<div class="col-12 col-lg-6">
    <div class="card h-100 border-danger">
        <div class="card-header bg-danger text-white">Jobs maintenance</div>
@@ -0,0 +1,87 @@
{% extends "layout/base.html" %}

{% block title %}Orphaned Jobs Preview{% endblock %}

{% block content %}
<div class="container-fluid py-4">
    <div class="d-flex justify-content-between align-items-center mb-4">
        <div>
            <h2>Orphaned Jobs Preview</h2>
            <p class="text-muted mb-0">Jobs without a valid customer link</p>
        </div>
        <a href="{{ url_for('main.settings', section='maintenance') }}" class="btn btn-outline-secondary">Back to Settings</a>
    </div>

    {% if orphaned_jobs %}
    <div class="alert alert-warning">
        <strong>⚠️ Warning:</strong> Found {{ orphaned_jobs|length }} orphaned job(s). Review the list below before deleting.
    </div>

    <div class="card mb-4">
        <div class="card-header d-flex justify-content-between align-items-center">
            <span>Orphaned Jobs List</span>
            <form method="post" action="{{ url_for('main.settings_jobs_delete_orphaned') }}" onsubmit="return confirm('Delete all {{ orphaned_jobs|length }} orphaned jobs and their emails? This cannot be undone.');">
                <button type="submit" class="btn btn-sm btn-danger">Delete All ({{ orphaned_jobs|length }} jobs)</button>
            </form>
        </div>
        <div class="card-body p-0">
            <div class="table-responsive">
                <table class="table table-hover mb-0">
                    <thead>
                        <tr>
                            <th>Job Name</th>
                            <th>Backup Software</th>
                            <th>Backup Type</th>
                            <th>Customer ID</th>
                            <th class="text-end">Runs</th>
                            <th class="text-end">Emails</th>
                        </tr>
                    </thead>
                    <tbody>
                        {% for job in orphaned_jobs %}
                        <tr>
                            <td>{{ job.job_name }}</td>
                            <td>{{ job.backup_software }}</td>
                            <td>{{ job.backup_type }}</td>
                            <td>
                                {% if job.customer_id %}
                                <span class="badge bg-danger">{{ job.customer_id }} (deleted)</span>
                                {% else %}
                                <span class="badge bg-secondary">NULL</span>
                                {% endif %}
                            </td>
                            <td class="text-end">{{ job.run_count }}</td>
                            <td class="text-end">{{ job.mail_count }}</td>
                        </tr>
                        {% endfor %}
                    </tbody>
                    <tfoot>
                        <tr class="table-light">
                            <td colspan="4"><strong>Total</strong></td>
                            <td class="text-end"><strong>{{ orphaned_jobs|sum(attribute='run_count') }}</strong></td>
                            <td class="text-end"><strong>{{ orphaned_jobs|sum(attribute='mail_count') }}</strong></td>
                        </tr>
                    </tfoot>
                </table>
            </div>
        </div>
    </div>

    <div class="alert alert-info">
        <strong>ℹ️ What will be deleted:</strong>
        <ul class="mb-0">
            <li>{{ orphaned_jobs|length }} job(s)</li>
            <li>{{ orphaned_jobs|sum(attribute='run_count') }} job run(s)</li>
            <li>{{ orphaned_jobs|sum(attribute='mail_count') }} email(s)</li>
            <li>All related data (backup objects, ticket/remark links, scopes, overrides)</li>
        </ul>
    </div>

    {% else %}
    <div class="alert alert-success">
        <strong>✅ No orphaned jobs found.</strong>
        <p class="mb-0">All jobs are properly linked to existing customers.</p>
    </div>
    {% endif %}
</div>
{% endblock %}
@@ -2,8 +2,49 @@

This file documents all changes made to this project via Claude Code.

## [2026-02-10]

### Fixed
- Fixed Autotask ticket not being automatically linked to new runs when internal ticket is resolved by implementing independent Autotask propagation strategy (now checks for most recent non-deleted and non-resolved Autotask ticket on job regardless of internal ticket status, ensuring PSA ticket reference persists across runs until explicitly resolved or deleted)
- Fixed internal and Autotask tickets being linked to new runs even after being resolved by removing date-based "open" logic from ticket query (tickets now only link to new runs if they are genuinely unresolved, not based on run date comparisons)
- Fixed Job Details page showing resolved tickets for ALL runs by implementing two-source ticket display: directly linked tickets (via ticket_job_runs) are always shown for audit trail, while active window tickets (via scope query) are only shown if unresolved, preserving historical ticket links while preventing resolved tickets from appearing on new runs
- Fixed Run Checks page showing resolved ticket indicators by removing date-based logic from ticket/remark existence queries (tickets and remarks now only show indicators if genuinely unresolved)
- Fixed Run Checks popup showing resolved tickets for runs where they were never linked by replacing date-based ticket/remark queries in `/api/job-runs/<run_id>/alerts` endpoint with explicit link-based queries (now only shows tickets/remarks that were actually linked to the specific run via ticket_job_runs/remark_job_runs tables, completing the transition from date-based to explicit-link ticket system)
- **HOTFIX**: Fixed Run Checks popup showing duplicate tickets (same ticket repeated multiple times) by removing unnecessary JOIN with ticket_scopes/remark_scopes tables and adding DISTINCT to prevent duplicate rows (root cause: tickets with multiple scopes created multiple result rows for same ticket via Cartesian product)
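The HOTFIX above is a classic Cartesian-product duplication. A minimal sqlite3 sketch with a hypothetical, heavily simplified schema (the application's real tables have more columns; the ticket code and query text here are illustrative only) shows how joining a multi-row scopes table duplicates each ticket once per scope, and how `DISTINCT` collapses the duplicates:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, code TEXT);
CREATE TABLE ticket_job_runs (ticket_id INTEGER, job_run_id INTEGER);
CREATE TABLE ticket_scopes (ticket_id INTEGER, scope TEXT);

INSERT INTO tickets VALUES (1, 'T-42');
INSERT INTO ticket_job_runs VALUES (1, 7);
-- One ticket with two scopes: the JOIN below yields one row per scope.
INSERT INTO ticket_scopes VALUES (1, 'job'), (1, 'object');
""")

dup_query = """
SELECT t.code FROM tickets t
JOIN ticket_job_runs tjr ON tjr.ticket_id = t.id
JOIN ticket_scopes ts ON ts.ticket_id = t.id
WHERE tjr.job_run_id = 7
"""
# The scopes JOIN multiplies rows: the same ticket appears twice.
print(conn.execute(dup_query).fetchall())  # [('T-42',), ('T-42',)]
# SELECT DISTINCT (or dropping the unnecessary JOIN) returns one row per ticket.
print(conn.execute(dup_query.replace("SELECT", "SELECT DISTINCT", 1)).fetchall())  # [('T-42',)]
```

As the changelog notes, the applied fix did both: it removed the unneeded scopes JOIN and added DISTINCT as a safety net.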

### Changed
- Added debug logging to ticket linking function to troubleshoot resolved ticket propagation issues (writes to AuditLog table with event_type "ticket_link_debug", visible on Logging page, logs EVERY run import to show whether tickets were found and their resolved_at status, uses commit instead of flush to ensure persistence) - **LATER REMOVED** after ticket system was fixed
- Reduced test email generation from 3 emails per status to 1 email per status for simpler testing (each button now creates exactly 1 test mail instead of 3)
- Updated Settings Maintenance page text to reflect that test emails are Veeam only and 1 per button (changed from "3 emails simulating Veeam, Synology, and NAKIVO" to "1 Veeam Backup Job email" per status button)

### Removed
- Removed debug logging from ticket linking function after successfully resolving all ticket propagation issues (the logging was temporarily added to troubleshoot why resolved tickets kept appearing on new runs, wrote to AuditLog with event_type "ticket_link_debug" showing ticket_id, code, resolved_at status for every run import, debug code preserved in backupchecks-system.md documentation for future use if similar issues arise)

### Release
- **v0.1.26** - Official release consolidating all ticket system bug fixes from 2026-02-10 (see docs/changelog.md and changelog.py for customer-facing release notes)

## [2026-02-09]

### Added
- Extended Veeam parser to recognize "Job did not start on schedule" error notifications for Veeam Backup for Microsoft 365 (and other Veeam backup types) with job name extraction from subject and error message from plain text body (proxy server offline, scheduled run failed)
- Added parser for Synology monthly drive health reports (backup software: Synology, backup type: Health Report, job name: Monthly Drive Health, informational only, no schedule learning) with support for both Dutch and English notifications ("schijfintegriteitsrapport"/"Drive Health Report") and automatic status detection (Healthy/Gezond → Success, problems → Warning)
- Added "Cleanup orphaned jobs" maintenance option in Settings → Maintenance to delete jobs without valid customer links and their associated emails/runs permanently from database (useful when customers are removed)
- Added "Preview orphaned jobs" button to show detailed list of jobs to be deleted with run/email counts before confirming deletion (verification step for safety)
- Added "Generate test emails" feature in Settings → Maintenance with three separate buttons to create fixed test email sets (success/warning/error) in inbox for testing parsers and maintenance operations (each set contains exactly 3 Veeam Backup Job emails with the same job name "Test-Backup-Job" and different dates/objects/statuses for reproducible testing and proper status flow testing)
- Added parser registry entry for Synology DSM automatic update cancelled notifications (backup software: Synology, backup type: Updates, informational only, no schedule learning)
- Extended Synology DSM update parser with additional detection patterns ("Automatische DSM-update", "DSM-update op", "Packages on", "out-of-date", "Package Center", "new DSM update", "Auto Update has detected", "Update & Restore", "belangrijke DSM-update", "kritieke oplossingen", "wordt automatisch geïnstalleerd", "is beschikbaar op") and hostname extraction regex to recognize DSM update cancelled, out-of-date packages, new update available, and automatic installation announcements under same Updates job type while maintaining backward compatibility with existing patterns
- Extended Synology Active Backup for Business parser to recognize skipped/ignored backup tasks ("genegeerd", "skipped", "ignored") as Warning status when backup was skipped due to previous backup still running

### Changed
- Updated `docs/changelog.md` with comprehensive v0.1.25 release notes consolidating all changes from 2026-02-09 (Parser Enhancements for Synology and Veeam, Maintenance Improvements, Data Privacy, Bug Fixes)
- Updated `containers/backupchecks/src/backend/app/changelog.py` with v0.1.25 entry in Python structure for website display (4 sections with subsections matching changelog.md content)
- Removed customer name from Autotask ticket title to keep titles concise (format changed from "[Backupchecks] Customer - Job Name - Status" to "[Backupchecks] Job Name - Status")
- Replaced real customer names in parser registry examples with generic placeholders (NTFS Auditing, QNAP Firmware Update, NAKIVO) to prevent customer information in codebase

### Fixed
- Fixed Autotask ticket description being set to NULL when resolving tickets via `update_ticket_resolution_safe` by adding "description" to the optional_fields list, ensuring the original description is preserved during PUT operations
- Fixed responsive navbar overlapping page content on smaller screens by implementing dynamic padding adjustment (JavaScript measures actual navbar height and adjusts main content padding-top automatically on page load, window resize, and navbar collapse toggle events)

### Changed
- Updated `docs/changelog.md` with comprehensive v0.1.23 release notes consolidating all changes from 2026-02-06 through 2026-02-08 (Documentation System, Audit Logging, Timezone-Aware Display, Autotask Improvements, Environment Identification, Bug Fixes)
- Updated `containers/backupchecks/src/backend/app/changelog.py` with v0.1.23 entry in Python structure for website display (8 sections with subsections matching changelog.md content)

@@ -1,3 +1,160 @@

## v0.1.26

This critical bug fix release resolves ticket system display issues where resolved tickets were incorrectly appearing on new runs across multiple pages. The ticket system has been completely transitioned from date-based logic to explicit link-based queries, ensuring resolved tickets stop appearing immediately after resolution while preserving the audit trail for historical runs.

### Bug Fixes

**Ticket System - Resolved Ticket Display Issues:**

*Root Cause:*
- Multiple pages used legacy date-based logic to determine if tickets should be displayed
- Queries checked if `active_from_date <= run_date` and `resolved_at >= run_date` instead of checking explicit `ticket_job_runs` links
- Result: Resolved tickets kept appearing on ALL runs between active_from_date and resolved_at, even runs created after resolution
- Impact: Users saw resolved tickets on new runs, creating confusion about which issues were actually active
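The root cause can be reproduced in miniature. This sqlite3 sketch uses a hypothetical, heavily simplified schema (the application's real tables, column names, and query text differ) to contrast the legacy date-window query with the explicit link-based query: run 2 falls inside the ticket's date window but was never linked, so only the legacy query shows the ticket there.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, code TEXT,
                      active_from_date TEXT, resolved_at TEXT);
CREATE TABLE job_runs (id INTEGER PRIMARY KEY, run_date TEXT);
CREATE TABLE ticket_job_runs (ticket_id INTEGER, job_run_id INTEGER);

-- Ticket active from Jan 1, resolved on Jan 10.
INSERT INTO tickets VALUES (1, 'T-100', '2026-01-01', '2026-01-10');
-- Run 1 was explicitly linked; run 2 was imported later and never linked.
INSERT INTO job_runs VALUES (1, '2026-01-05'), (2, '2026-01-08');
INSERT INTO ticket_job_runs VALUES (1, 1);
""")

date_based = """
SELECT t.code FROM tickets t, job_runs r
WHERE r.id = ? AND t.active_from_date <= r.run_date
  AND (t.resolved_at IS NULL OR t.resolved_at >= r.run_date)
"""
link_based = """
SELECT t.code FROM tickets t
JOIN ticket_job_runs tjr ON tjr.ticket_id = t.id
WHERE tjr.job_run_id = ?
"""
# Run 2 was never linked, but the legacy date-window query still returns the ticket:
print(conn.execute(date_based, (2,)).fetchall())  # [('T-100',)]
# The explicit-link query only returns tickets actually linked to that run:
print(conn.execute(link_based, (2,)).fetchall())  # []
```

Run 1 keeps its ticket under both queries, which is why the link-based approach preserves the audit trail on historical runs.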

*Fixed Pages and Queries:*

1. **Ticket Linking (ticketing_utils.py)**
   - Fixed Autotask tickets not propagating to new runs after internal ticket resolution
   - Implemented independent Autotask propagation strategy: checks for most recent non-deleted and non-resolved Autotask ticket on job regardless of internal ticket status
   - Fixed internal tickets being linked to new runs after resolution by removing date-based "open" logic from ticket query
   - Tickets now only link to new runs if `COALESCE(ts.resolved_at, t.resolved_at) IS NULL` (genuinely unresolved)

2. **Job Details Page (routes_job_details.py)**
   - Fixed resolved tickets appearing on ALL runs for a job
   - Implemented two-source ticket display for proper audit trail:
     - Direct links via `ticket_job_runs` → always shown (preserves historical context)
     - Active window via `ticket_scopes` → only shown if unresolved
   - Result: Old runs keep their ticket references, new runs don't get resolved tickets

3. **Run Checks Main Page (routes_run_checks.py)**
   - Fixed ticket/remark indicators (🎫/💬) showing for jobs with resolved tickets
   - Removed date-based logic from indicator existence queries
   - Now only shows indicators if `COALESCE(ts.resolved_at, t.resolved_at) IS NULL` (genuinely unresolved)

4. **Run Checks Popup Modal (routes_api.py)**
   - Fixed popup showing resolved tickets for runs where they were never linked
   - Replaced date-based queries in `/api/job-runs/<run_id>/alerts` endpoint with explicit JOIN queries
   - Tickets query: Now uses `JOIN ticket_job_runs WHERE job_run_id = :run_id`
   - Remarks query: Now uses `JOIN remark_job_runs WHERE job_run_id = :run_id`
   - Removed unused parameters: `run_date`, `job_id`, `ui_tz` (no longer needed)
   - Result: Only shows tickets/remarks that were actually linked to that specific run

*Testing & Troubleshooting:*
- Temporarily added debug logging to `link_open_internal_tickets_to_run` function
- Wrote to AuditLog table with event_type "ticket_link_debug" for troubleshooting
- Logged ticket_id, code, resolved_at status for every run import
- Debug logging removed after successful resolution (code preserved in documentation)

**Test Email Generation:**
- Reduced test email generation from 3 emails per status to 1 email per status
- Each button now creates exactly 1 test mail instead of 3 for simpler testing

**User Interface:**
- Updated Settings → Maintenance page text for test email generation
- Changed description from "3 emails simulating Veeam, Synology, and NAKIVO" to "1 Veeam Backup Job email"
- Updated button labels from "(3)" to "(1)" to match actual behavior

*Result:*
- ✅ Resolved tickets stop appearing immediately after resolution
- ✅ Consistent behavior across all pages (Job Details, Run Checks, Run Checks popup)
- ✅ Audit trail preserved: old runs keep their historical ticket links
- ✅ Clear distinction: new runs only show currently active (unresolved) tickets
- ✅ All queries now use explicit link-based logic (no date comparisons)

## v0.1.25

This release focuses on parser improvements and maintenance enhancements, adding support for new notification types across Synology and Veeam backup systems while improving system usability with orphaned job cleanup and test email generation features.

### Parser Enhancements

**Synology Parsers:**
- **Monthly Drive Health Reports**: New parser for Synology NAS drive health notifications
  - Supports both Dutch ("Maandelijks schijfintegriteitsrapport", "Gezond") and English ("Monthly Drive Health Report", "Healthy") variants
  - Automatic status detection: Healthy/Gezond/No problem detected → Success, otherwise → Warning
  - Extracts hostname from subject or body pattern (Van/From NAS-HOSTNAME)
  - Backup type: "Health Report", Job name: "Monthly Drive Health"
  - Informational only (excluded from schedule learning and reporting logic)
  - Registry entry added (order 237) for /parsers page visibility

- **DSM Update Notifications - Extended Coverage**: Added support for additional DSM update notification variants
  - New patterns: "belangrijke DSM-update", "kritieke oplossingen", "wordt automatisch geïnstalleerd", "is beschikbaar op"
  - Now recognizes 4 different notification types under same job:
    1. Automatic update cancelled
    2. Packages out-of-date warnings
    3. New update available announcements
    4. Automatic installation scheduled notifications
  - All patterns added to existing lists maintaining full backward compatibility

- **Active Backup for Business - Skipped Tasks**: Extended parser to recognize skipped/ignored backup tasks
  - Detects Dutch ("genegeerd") and English ("skipped", "ignored") status indicators
  - Status mapping: Skipped/Ignored → Warning with "Skipped" message
  - Common scenario: Backup skipped because previous backup still running
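The Synology detection rules above boil down to case-insensitive marker matching plus a hostname regex. This sketch is illustrative only: the function names, marker list, and regex are assumptions, not the application's actual parser code.

```python
import re

# Hypothetical markers mirroring the rule "Healthy/Gezond/No problem detected → Success".
HEALTHY_MARKERS = ("healthy", "gezond", "no problem detected")

def drive_health_status(body):
    """Map a Synology drive-health report body to a run status."""
    text = body.lower()
    if any(marker in text for marker in HEALTHY_MARKERS):
        return "Success"
    return "Warning"

def extract_hostname(subject):
    """Pull the NAS hostname from a 'Van/From NAS-HOSTNAME' pattern (Dutch/English)."""
    match = re.search(r"\b(?:Van|From)\s+([\w.-]+)", subject)
    return match.group(1) if match else None

print(drive_health_status("Status: Gezond"))  # Success
print(drive_health_status("Drive 2: reallocated sectors detected"))  # Warning
print(extract_hostname("Monthly Drive Health Report From NAS-HOSTNAME"))  # NAS-HOSTNAME
```

Keeping both language variants in one marker tuple is what lets a single parser handle Dutch and English notifications under the same job.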
**Veeam Parsers:**
- **Job Not Started Errors**: New detection for "Job did not start on schedule" error notifications
  - Recognizes VBO365 and other Veeam backup types that send plain text error notifications
  - Extracts backup type from subject (e.g., "Veeam Backup for Microsoft 365")
  - Extracts job name from subject after colon (e.g., "Backup MDS at Work")
  - Reads error message from plain text body (handles base64 UTF-16 encoding)
  - Sets overall_status to "Error" for failed-to-start jobs
  - Example messages: "Proxy server was offline at the time the job was scheduled to run."
### Maintenance Improvements

**Orphaned Jobs Cleanup:**
- Added "Cleanup orphaned jobs" option in Settings → Maintenance
- Removes jobs without valid customer links (useful when customers are deleted)
- Permanently deletes job records along with all associated emails and job runs
- "Preview orphaned jobs" button shows detailed list before deletion
  - Displays job information with email and run counts
  - Safety verification step to prevent accidental deletion
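"Jobs without valid customer links" covers two cases: a NULL `customer_id` and a `customer_id` pointing at a deleted customer (the preview template above distinguishes the two with badges). A `LEFT JOIN ... IS NULL` query catches both at once; this sqlite3 sketch uses a hypothetical minimal schema, not the application's real tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE jobs (id INTEGER PRIMARY KEY, job_name TEXT, customer_id INTEGER);

INSERT INTO customers VALUES (1, 'Acme');
INSERT INTO jobs VALUES
  (10, 'Nightly-SQL', 1),      -- valid link: kept
  (11, 'Old-NAS-Backup', 99),  -- customer 99 no longer exists: orphan
  (12, 'Legacy-Sync', NULL);   -- never linked: orphan
""")

# LEFT JOIN keeps every job; c.id is NULL both for a dangling
# customer_id and for a NULL customer_id, so one predicate finds both.
orphans = conn.execute("""
SELECT j.id, j.job_name FROM jobs j
LEFT JOIN customers c ON c.id = j.customer_id
WHERE c.id IS NULL
ORDER BY j.id
""").fetchall()
print(orphans)  # [(11, 'Old-NAS-Backup'), (12, 'Legacy-Sync')]
```

The preview step simply runs this query and renders the rows with run/email counts; the delete step reuses it to pick which jobs (and their cascading records) to remove.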
**Test Email Generation:**
- Added "Generate test emails" feature in Settings → Maintenance
- Three separate buttons to create fixed test email sets for parser testing:
  - Success emails (3 emails with success status)
  - Warning emails (3 emails with warning status)
  - Error emails (3 emails with error status)
- Each set contains exactly 3 Veeam Backup Job emails with:
  - Same job name "Test-Backup-Job" for consistency
  - Different dates, objects, and statuses
  - Reproducible testing scenarios
  - Proper status flow testing (success → warning → error progression)
### Data Privacy

**Parser Registry Cleanup:**
- Replaced real customer names in parser registry examples with generic placeholders
- Affected parsers: NTFS Auditing, QNAP Firmware Update, NAKIVO
- Example format now uses: NAS-HOSTNAME, SERVER-HOSTNAME, VM-HOSTNAME, example.local
- Ensures no customer information in codebase or version control

**Autotask Integration:**
- Removed customer name from Autotask ticket title for concise display
- Format changed from "[Backupchecks] Customer - Job Name - Status" to "[Backupchecks] Job Name - Status"
- Reduces redundancy (customer already visible in ticket company field)

### Bug Fixes

**User Interface:**
- Fixed responsive navbar overlapping page content on smaller screens
- Implemented dynamic padding adjustment using JavaScript
- Measures actual navbar height on page load, window resize, and navbar collapse toggle
- Automatically adjusts main content padding-top to prevent overlap
- Debounced resize handler for performance
## v0.1.24

### Bug Fixes

**Autotask Integration:**
- Fixed Autotask ticket description being cleared when resolving tickets
- Root cause: The `update_ticket_resolution_safe` function was performing a GET to retrieve the ticket, then a PUT to update the resolution field, but the `description` field was not included in the PUT payload
- Impact: When clicking "Resolve" on an Autotask ticket, the description would be set to NULL by the Autotask API
- Solution: Added `description` to the `optional_fields` list in `update_ticket_resolution_safe` so the original description is preserved from the GET response and included in the PUT request
- Location: `containers/backupchecks/src/backend/app/integrations/autotask/client.py` line 672
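The bug pattern above is common with full-replace PUT semantics: any field omitted from the payload gets cleared server-side. A sketch of the fix pattern, with entirely hypothetical names and payload shape (this is not the real Autotask client code, and no real API call is made):

```python
# Fields read from the GET response that must be echoed back in the PUT
# payload so the server does not null them out. The fix was adding
# "description" to this list.
OPTIONAL_FIELDS = ["priority", "queueID", "description"]

def build_put_payload(current, ticket_id, resolution):
    """Build a PUT payload that preserves optional fields from the GET response."""
    payload = {"id": ticket_id, "resolution": resolution}
    for field in OPTIONAL_FIELDS:
        if current.get(field) is not None:
            payload[field] = current[field]  # copy existing value verbatim
    return payload

# Simulated GET response for the ticket being resolved:
current = {"id": 123, "description": "Backup failed on host X", "priority": 2}
payload = build_put_payload(current, 123, "Resolved: backup succeeded on retry")
print(payload["description"])  # Backup failed on host X
```

Before the fix, `description` was missing from the preserved-fields list, so the PUT payload omitted it and the API set it to NULL, exactly the behavior described in the root cause bullet.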
## v0.1.23

This comprehensive release introduces a complete built-in documentation system with 33 pages covering all features, enhanced audit logging for compliance and troubleshooting, timezone-aware datetime display throughout the application, and numerous Autotask PSA integration improvements for better usability and workflow efficiency.

@@ -1 +1 @@
-v0.1.23
+v0.1.26