Compare commits
258 Commits

SHA1 Message Date
ae66457415 Merge pull request 'v20260113-08-vspc-object-linking-normalize' (#114) from v20260113-08-vspc-object-linking-normalize into main
Reviewed-on: #114
2026-01-13 16:45:59 +01:00
d54b5c1e5d Merge pull request 'Auto-commit local changes before build (2026-01-13 15:07:59)' (#113) from v20260113-08-vspc-object-linking into main
Reviewed-on: #113
2026-01-13 16:45:38 +01:00
e3d109ed24 Merge pull request 'Auto-commit local changes before build (2026-01-13 14:27:23)' (#112) from v20260113-07-job-delete-fix into main
Reviewed-on: #112
2026-01-13 16:45:17 +01:00
d2f7618772 Merge pull request 'Auto-commit local changes before build (2026-01-13 14:12:58)' (#111) from v20260113-06-overrides-error-match-modes into main
Reviewed-on: #111
2026-01-13 16:44:56 +01:00
c57c58cc3d Release v0.1.20 on branch v20260113-08-vspc-object-linking-normalize (bump type 1) 2026-01-13 16:43:25 +01:00
beda8cc0f9 Auto-commit local changes before build (2026-01-13 16:43:00) 2026-01-13 16:43:00 +01:00
fd0051cb29 Auto-commit local changes before build (2026-01-13 16:30:32) 2026-01-13 16:30:32 +01:00
934a495867 Auto-commit local changes before build (2026-01-13 15:07:59) 2026-01-13 15:07:59 +01:00
a5a1cd2743 Auto-commit local changes before build (2026-01-13 14:27:23) 2026-01-13 14:27:23 +01:00
56415eae59 Auto-commit local changes before build (2026-01-13 14:12:58) 2026-01-13 14:12:59 +01:00
a0d6b1e0d4 Merge pull request 'v20260113-05-reporter-menu-restrict' (#110) from v20260113-05-reporter-menu-restrict into main
Reviewed-on: #110
2026-01-13 13:20:22 +01:00
3a31b6c5d2 Merge pull request 'Auto-commit local changes before build (2026-01-13 12:52:53)' (#109) from v20260113-04-edge-initial-setup-users-exist into main
Reviewed-on: #109
2026-01-13 13:20:07 +01:00
506e1f56cd Merge pull request 'Auto-commit local changes before build (2026-01-13 12:41:58)' (#108) from v20260113-03-runchecks-overall-remark into main
Reviewed-on: #108
2026-01-13 13:19:45 +01:00
f3d4145056 Auto-commit local changes before build (2026-01-13 13:17:22) 2026-01-13 13:17:22 +01:00
6c0dcf5a2d Auto-commit local changes before build (2026-01-13 13:15:30) 2026-01-13 13:15:30 +01:00
0c66ced915 Auto-commit local changes before build (2026-01-13 12:52:53) 2026-01-13 12:52:53 +01:00
a35ae4bf7a Auto-commit local changes before build (2026-01-13 12:41:58) 2026-01-13 12:41:58 +01:00
17809e40a5 Auto-commit local changes before build (2026-01-13 11:51:15) 2026-01-13 11:51:15 +01:00
3b5204a970 Auto-commit local changes before build (2026-01-13 11:49:25) 2026-01-13 11:49:25 +01:00
0fa98a5955 Merge pull request 'Auto-commit local changes before build (2026-01-13 10:43:31)' (#107) from v20260113-02-edge-mail-lightmode into main
Reviewed-on: #107
2026-01-13 11:48:16 +01:00
25ba0f5cff Merge branch 'main' into v20260113-02-edge-mail-lightmode 2026-01-13 11:48:09 +01:00
b17608c3c5 Merge pull request 'Auto-commit local changes before build (2026-01-13 10:21:27)' (#106) from v20260113-01-changelog-markdown-render into main
Reviewed-on: #106
2026-01-13 11:47:54 +01:00
2b57800604 Merge branch 'main' into v20260113-01-changelog-markdown-render 2026-01-13 11:47:45 +01:00
d39398113e Merge pull request 'Auto-commit local changes before build (2026-01-12 16:55:54)' (#105) from v20260112-18-changelog-from-gitea into main
Reviewed-on: #105
2026-01-13 11:47:33 +01:00
3bd53bbaca Merge pull request 'Auto-commit local changes before build (2026-01-12 15:53:35)' (#104) from v20260112-17-synology-abb-warning-recognize-objects into main
Reviewed-on: #104
2026-01-13 11:47:17 +01:00
47058d2b27 Merge pull request 'Auto-commit local changes before build (2026-01-12 15:39:19)' (#103) from v20260112-16-runchecks-popup-objects-no-overlap into main
Reviewed-on: #103
2026-01-13 11:47:00 +01:00
3cd491eaf6 Merge pull request 'Auto-commit local changes before build (2026-01-12 15:04:09)' (#102) from v20260112-15-vspc-scroll-partial-approve-objects into main
Reviewed-on: #102
2026-01-13 11:46:44 +01:00
ba8693d512 Merge pull request 'Auto-commit local changes before build (2026-01-12 14:28:50)' (#101) from v20260112-14-vspc-company-mapping-require-all into main
Reviewed-on: #101
2026-01-13 11:46:14 +01:00
90317c804b Merge pull request 'Auto-commit local changes before build (2026-01-12 14:07:33)' (#100) from v20260112-13-vspc-company-mapping-popup-ui into main
Reviewed-on: #100
2026-01-13 11:45:56 +01:00
066c45ab9b Merge pull request 'Auto-commit local changes before build (2026-01-12 13:58:49)' (#99) from v20260112-12-vspc-company-mapping-popup-visible into main
Reviewed-on: #99
2026-01-13 11:43:59 +01:00
09e19a72d0 Merge pull request 'Auto-commit local changes before build (2026-01-12 13:46:37)' (#98) from v20260112-11-show-vspc-company-mapping-popup into main
Reviewed-on: #98
2026-01-13 11:43:43 +01:00
7c204fb8dd Merge pull request 'Auto-commit local changes before build (2026-01-12 13:38:55)' (#97) from v20260112-10-fix-vspc-approve-endpoint-duplicate into main
Reviewed-on: #97
2026-01-13 11:43:28 +01:00
fd175200db Merge pull request 'Auto-commit local changes before build (2026-01-12 13:32:27)' (#96) from v20260112-09-veeam-vspc-company-mapping-popup into main
Reviewed-on: #96
2026-01-13 11:42:47 +01:00
55e159d1fd Merge pull request 'Auto-commit local changes before build (2026-01-12 12:52:12)' (#95) from v20260112-08-fix-veeam-vspc-parser-syntaxerror into main
Reviewed-on: #95
2026-01-13 11:42:30 +01:00
f63b47cdfa Merge pull request 'Auto-commit local changes before build (2026-01-12 12:41:47)' (#94) from v20260112-07-veeam-vspc-active-alarms-parser into main
Reviewed-on: #94
2026-01-13 11:42:15 +01:00
a00155c1f3 Merge pull request 'Auto-commit local changes before build (2026-01-12 12:32:03)' (#93) from v20260112-06-veeam-spc-alarm-summary-parser into main
Reviewed-on: #93
2026-01-13 11:41:59 +01:00
14d2422a1f Merge pull request 'Auto-commit local changes before build (2026-01-12 11:13:43)' (#92) from v20260112-05-qnap-firmware-update-info-parser into main
Reviewed-on: #92
2026-01-13 11:41:39 +01:00
b610ab511d Merge pull request 'Auto-commit local changes before build (2026-01-12 10:20:41)' (#91) from v20260112-04-remove-runchecks-success-override-button into main
Reviewed-on: #91
2026-01-13 11:41:23 +01:00
e52c48fa45 Merge pull request 'Auto-commit local changes before build (2026-01-12 10:11:38)' (#90) from v20260112-03-ntfs-audit-recognize-bouter-hosts into main
Reviewed-on: #90
2026-01-13 11:41:06 +01:00
7c17c55fbe Merge pull request 'Auto-commit local changes before build (2026-01-12 09:52:29)' (#89) from v20260112-02-synology-abb-subject-partial-warning into main
Reviewed-on: #89
2026-01-13 11:38:19 +01:00
cd31b6f305 Merge pull request 'Auto-commit local changes before build (2026-01-12 09:43:48)' (#88) from v20260112-01-synology-abb-partial-warning into main
Reviewed-on: #88
2026-01-13 11:37:59 +01:00
16b928041a Merge pull request 'Auto-commit local changes before build (2026-01-12 09:27:59)' (#87) from v20260109-13-ntfs-audit-jobname-prefix-flex into main
Reviewed-on: #87
2026-01-13 11:37:21 +01:00
1f1f587add Merge pull request 'Auto-commit local changes before build (2026-01-12 09:21:16)' (#86) from v20260109-12-ntfs-audit-fqdn-jobname into main
Reviewed-on: #86
2026-01-13 11:36:53 +01:00
be670c2ae5 Merge pull request 'Auto-commit local changes before build (2026-01-12 09:09:48)' (#85) from v20260109-11-ntfs-auditing-audit-parser into main
Reviewed-on: #85
2026-01-13 11:36:31 +01:00
ffb81e8e3d Merge pull request 'Auto-commit local changes before build (2026-01-09 15:01:37)' (#84) from v20260109-10-veeam-cloud-connect-report-parser into main
Reviewed-on: #84
2026-01-13 11:36:12 +01:00
79829abd70 Merge pull request 'Auto-commit local changes before build (2026-01-09 13:52:15)' (#83) from v20260109-09-ellipsis-reset-on-popup-close into main
Reviewed-on: #83
2026-01-13 11:35:54 +01:00
c928d1bc55 Merge pull request 'Auto-commit local changes before build (2026-01-09 13:34:03)' (#82) from v20260109-08-ui-ellipsis-and-remove-objects-header into main
Reviewed-on: #82
2026-01-13 11:35:15 +01:00
ca319f0b7c Merge pull request 'Auto-commit local changes before build (2026-01-09 13:01:06)' (#81) from v20260109-07-feedback-open-default-resolved-sort into main
Reviewed-on: #81
2026-01-13 11:34:54 +01:00
80447813c0 Merge pull request 'Auto-commit local changes before build (2026-01-09 12:43:58)' (#80) from v20260109-06-user-management-edit-roles into main
Reviewed-on: #80
2026-01-13 11:33:26 +01:00
7a65b1dcfe Merge pull request 'Auto-commit local changes before build (2026-01-09 12:14:18)' (#79) from v20260109-05-fix-parsers-route-import into main
Reviewed-on: #79
2026-01-13 11:32:01 +01:00
6ddea3ad11 Merge pull request 'v20260109-04-parsers-page-all-parsers' (#78) from v20260109-04-parsers-page-all-parsers into main
Reviewed-on: #78
2026-01-13 11:31:42 +01:00
ea264cb3e4 Merge pull request 'Auto-commit local changes before build (2026-01-09 10:13:07)' (#77) from v20260109-03-preserve-ampersand-errors into main
Reviewed-on: #77
2026-01-13 11:31:22 +01:00
6b4b33ff64 Merge pull request 'Auto-commit local changes before build (2026-01-09 09:55:24)' (#76) from v20260109-02-object-list-sorting into main
Reviewed-on: #76
2026-01-13 11:31:03 +01:00
957e4f97e6 Merge pull request 'Auto-commit local changes before build (2026-01-09 09:35:39)' (#75) from v20260109-01-veeam-m365-overall-message into main
Reviewed-on: #75
2026-01-13 11:30:36 +01:00
341530831a Merge branch 'main' into v20260109-01-veeam-m365-overall-message 2026-01-13 11:30:08 +01:00
dfb0d6cc33 Merge pull request 'v20260108-39-changelog-0.1.19' (#74) from v20260108-39-changelog-0.1.19 into main
Reviewed-on: #74
2026-01-13 11:29:49 +01:00
1e683a9c0d Merge pull request 'Auto-commit local changes before build (2026-01-08 16:54:10)' (#73) from v20260108-38-missed-run-grace-window into main
Reviewed-on: #73
2026-01-13 11:29:21 +01:00
f9fd0ce016 Merge pull request 'v20260108-37-synology-updates-info-parser' (#72) from v20260108-37-synology-updates-info-parser into main
Reviewed-on: #72
2026-01-13 11:29:00 +01:00
4e1c300f0c Merge pull request 'Auto-commit local changes before build (2026-01-08 15:47:01)' (#71) from v20260108-36-inbox-jobdetails-details-above-mail into main
Reviewed-on: #71
2026-01-13 11:28:18 +01:00
42694c08cc Merge pull request 'Auto-commit local changes before build (2026-01-08 15:26:35)' (#70) from v20260108-35-3cx-ssl-certificate-tracking into main
Reviewed-on: #70
2026-01-13 11:27:58 +01:00
d08d31f44b Merge pull request 'Auto-commit local changes before build (2026-01-08 15:10:32)' (#69) from v20260108-34-runchecks-popup-mail-body into main
Reviewed-on: #69
2026-01-13 11:27:29 +01:00
58f0e27dd9 Merge pull request 'Auto-commit local changes before build (2026-01-08 14:46:19)' (#68) from v20260108-33-runchecks-success-override into main
Reviewed-on: #68
2026-01-13 11:27:06 +01:00
97f3a7f9dc Merge pull request 'Auto-commit local changes before build (2026-01-08 14:30:19)' (#67) from v20260108-32-runchecks-ticket-copy-button into main
Reviewed-on: #67
2026-01-13 11:26:37 +01:00
861767950d Merge pull request 'Auto-commit local changes before build (2026-01-08 14:18:20)' (#66) from v20260108-31-inbox-empty-body-attachment-render into main
Reviewed-on: #66
2026-01-13 11:26:18 +01:00
60bfbbe2b8 Merge pull request 'Auto-commit local changes before build (2026-01-08 14:05:43)' (#65) from v20260108-30-customer-delete-ticket-remark-scopes into main
Reviewed-on: #65
2026-01-13 11:26:03 +01:00
52873047d6 Merge pull request 'Auto-commit local changes before build (2026-01-08 13:47:59)' (#64) from v20260108-29-inbox-attachment-body-fallback into main
Reviewed-on: #64
2026-01-13 11:25:08 +01:00
c8a078bc45 Merge pull request 'v20260108-28-admin-all-mail-open-fix' (#63) from v20260108-28-admin-all-mail-open-fix into main
Reviewed-on: #63
2026-01-13 11:24:38 +01:00
5617256820 Merge pull request 'Auto-commit local changes before build (2026-01-08 12:40:27)' (#62) from v20260108-27-admin-all-mail-audit-page into main
Reviewed-on: #62
2026-01-13 11:24:17 +01:00
d5eecd9220 Merge pull request 'Auto-commit local changes before build (2026-01-08 11:15:47)' (#61) from v20260108-26-mail-move-only-after-successful-import into main
Reviewed-on: #61
2026-01-13 11:23:53 +01:00
297c7d1789 Merge pull request 'v20260108-26-changelog-0.1.18-update' (#60) from v20260108-26-changelog-0.1.18-update into main
Reviewed-on: #60
2026-01-13 11:23:34 +01:00
a38fe43613 Merge pull request 'Auto-commit local changes before build (2026-01-08 10:13:48)' (#59) from v20260108-25-job-history-ticket-popup into main
Reviewed-on: #59
2026-01-13 11:23:14 +01:00
0ddcc31e26 Merge pull request 'Auto-commit local changes before build (2026-01-06 20:50:50)' (#58) from v20260106-24-ticket-scope-resolve-popup into main
Reviewed-on: #58
2026-01-13 11:22:56 +01:00
44233203e4 Merge pull request 'Auto-commit local changes before build (2026-01-06 20:04:54)' (#57) from v20260106-22-ticket-link-multiple-jobs into main
Reviewed-on: #57
2026-01-13 11:22:29 +01:00
01816813ee Merge pull request 'backupchecks-v20260106-21-changelog-0.1.17' (#56) from backupchecks-v20260106-21-changelog-0.1.17 into main
Reviewed-on: #56
2026-01-13 11:21:40 +01:00
eb3e25b18f Merge pull request 'Auto-commit local changes before build (2026-01-06 16:23:49)' (#55) from v20260106-20-fix-customer-delete-fk into main
Reviewed-on: #55
2026-01-13 11:21:23 +01:00
b12bac5e34 Merge pull request 'Auto-commit local changes before build (2026-01-06 15:56:54)' (#54) from v20260106-19-missed-run-detection-threshold into main
Reviewed-on: #54
2026-01-13 11:21:05 +01:00
b7ad9cca72 Merge pull request 'Auto-commit local changes before build (2026-01-06 15:21:26)' (#53) from v20260106-18-runchecks-popup-objects-fallback into main
Reviewed-on: #53
2026-01-13 11:20:46 +01:00
9f74e516cc Merge pull request 'v20260106-18-Reset' (#52) from v20260106-18-Reset into main
Reviewed-on: #52
2026-01-13 11:20:25 +01:00
16f96ed0be Merge pull request 'Auto-commit local changes before build (2026-01-06 15:04:51)' (#51) from v20260106-17-jobrun-popup-objects-restore into main
Reviewed-on: #51
2026-01-13 11:20:01 +01:00
2a0ffc355d Merge pull request 'v20260106-16-reset' (#50) from v20260106-16-reset into main
Reviewed-on: #50
2026-01-13 11:19:39 +01:00
ea726dc78e Merge pull request 'v20260106-15-jobrun-popup-objects-sort' (#49) from v20260106-15-jobrun-popup-objects-sort into main
Reviewed-on: #49
2026-01-13 11:19:16 +01:00
1f3d6f1eac Merge pull request 'Auto-commit local changes before build (2026-01-06 13:46:57)' (#48) from v20260106-14-veeam-m365-overall-message into main
Reviewed-on: #48
2026-01-13 11:18:59 +01:00
8023181048 Merge pull request 'Auto-commit local changes before build (2026-01-06 13:35:46)' (#47) from v20260106-13-veeam-config-backup-overall-message into main
Reviewed-on: #47
2026-01-13 11:18:39 +01:00
a41622525a Merge pull request 'Auto-commit local changes before build (2026-01-06 13:25:04)' (#46) from v20260106-12-disable-ticket-and-remark-edit into main
Reviewed-on: #46
2026-01-13 11:18:23 +01:00
219775a16b Merge pull request 'Auto-commit local changes before build (2026-01-06 12:43:38)' (#45) from v20260106-11-new-ticket-remove-description into main
Reviewed-on: #45
2026-01-13 11:18:06 +01:00
bedfbde1b0 Merge pull request 'Auto-commit local changes before build (2026-01-06 12:23:16)' (#44) from v20260106-10-fix-remarks-indent-bad-gateway into main
Reviewed-on: #44
2026-01-13 11:08:57 +01:00
9d2ef99cf9 Merge pull request 'Auto-commit local changes before build (2026-01-06 12:19:54)' (#43) from v20260106-09-fix-ticket-detail-indent into main
Reviewed-on: #43
2026-01-13 11:08:39 +01:00
7a879ce7c4 Merge pull request 'Auto-commit local changes before build (2026-01-06 12:16:27)' (#42) from v20260106-08-ticket-code-input-disable-edit into main
Reviewed-on: #42
2026-01-13 11:08:18 +01:00
a5ebe867bb Merge pull request 'Auto-commit local changes before build (2026-01-06 11:47:15)' (#41) from v20260106-07-feedback-open-reply into main
Reviewed-on: #41
2026-01-13 11:07:54 +01:00
c75e3d250b Merge pull request 'Auto-commit local changes before build (2026-01-06 11:29:00)' (#40) from v20260106-06-customers-delete-fk-cascade-fix into main
Reviewed-on: #40
2026-01-13 11:07:30 +01:00
a600a7ad33 Merge pull request 'Auto-commit local changes before build (2026-01-06 11:15:00)' (#39) from v20260106-05-jobs-row-click-and-archive-button-move into main
Reviewed-on: #39
2026-01-13 11:07:03 +01:00
b5183f23f0 Merge pull request 'Auto-commit local changes before build (2026-01-06 11:04:28)' (#38) from v20260106-04-jobs-archive into main
Reviewed-on: #38
2026-01-13 11:06:43 +01:00
a9039ef336 Merge pull request 'Auto-commit local changes before build (2026-01-06 10:11:59)' (#37) from v20260106-03-veeam-full-job-name-merge into main
Reviewed-on: #37
2026-01-13 11:06:25 +01:00
7d8185384e Merge pull request 'Auto-commit local changes before build (2026-01-06 10:02:17)' (#36) from v20260106-02-inbox-bulk-delete into main
Reviewed-on: #36
2026-01-13 11:05:56 +01:00
23f6b4d3e7 Merge pull request 'Auto-commit local changes before build (2026-01-06 09:45:02)' (#35) from v20260106-01-m365-combined-job-name-merge into main
Reviewed-on: #35
2026-01-13 11:04:05 +01:00
effdc3fe00 Merge pull request 'v20260104-20-changelog-0-1-16' (#34) from v20260104-20-changelog-0-1-16 into main
Reviewed-on: #34
2026-01-13 11:03:46 +01:00
582dc06427 Merge pull request 'Auto-commit local changes before build (2026-01-04 18:22:54)' (#33) from backupchecks-v20260104-19-restoredto15-reports-html-layout-gap-fix into main
Reviewed-on: #33
2026-01-13 11:02:56 +01:00
61d0608164 Merge pull request 'Auto-commit local changes before build (2026-01-04 18:19:57)' (#32) from v20260104-18-reports-html-trend-visual-alignment-fix into main
Reviewed-on: #32
2026-01-13 11:02:34 +01:00
8934038320 Merge pull request 'Auto-commit local changes before build (2026-01-04 18:12:52)' (#31) from v20260104-17-reports-html-trend-match-status-size into main
Reviewed-on: #31
2026-01-13 11:02:08 +01:00
9b707f8fad Merge pull request 'Auto-commit local changes before build (2026-01-04 18:07:10)' (#30) from v20260104-16-reports-html-trend-size-match into main
Reviewed-on: #30
2026-01-13 11:00:52 +01:00
814b35458b Merge pull request 'Auto-commit local changes before build (2026-01-04 18:01:23)' (#29) from v20260104-15-reports-html-layout-gap-fix into main
Reviewed-on: #29
2026-01-13 11:00:03 +01:00
8e511a111d Merge pull request 'v20260104-14-reports-stats-total-runs-success-rate-fix' (#28) from v20260104-14-reports-stats-total-runs-success-rate-fix into main
Reviewed-on: #28
2026-01-13 10:59:39 +01:00
a6ac20c525 Upload files to "docs" 2026-01-13 10:59:05 +01:00
4783c91f98 Delete docs/changelog.md 2026-01-13 10:56:17 +01:00
2a2237bd6e Auto-commit local changes before build (2026-01-13 10:43:31) 2026-01-13 10:43:31 +01:00
6efecc848b Auto-commit local changes before build (2026-01-13 10:21:27) 2026-01-13 10:21:27 +01:00
0cc587805f Auto-commit local changes before build (2026-01-12 16:55:54) 2026-01-12 16:55:54 +01:00
67c6db34ee Update docs/changelog.md 2026-01-12 16:33:36 +01:00
0ced2f8a48 Auto-commit local changes before build (2026-01-12 15:53:35) 2026-01-12 15:53:35 +01:00
a7d6237632 Auto-commit local changes before build (2026-01-12 15:39:19) 2026-01-12 15:39:19 +01:00
efe7bd184e Auto-commit local changes before build (2026-01-12 15:04:09) 2026-01-12 15:04:09 +01:00
f18044f72c Auto-commit local changes before build (2026-01-12 14:28:50) 2026-01-12 14:28:50 +01:00
b791c43299 Auto-commit local changes before build (2026-01-12 14:07:33) 2026-01-12 14:07:33 +01:00
1e652fe311 Auto-commit local changes before build (2026-01-12 13:58:49) 2026-01-12 13:58:49 +01:00
8c7f7f8805 Auto-commit local changes before build (2026-01-12 13:46:37) 2026-01-12 13:46:37 +01:00
2b6a78b99b Auto-commit local changes before build (2026-01-12 13:38:55) 2026-01-12 13:38:55 +01:00
0d8f4e88e6 Auto-commit local changes before build (2026-01-12 13:32:27) 2026-01-12 13:32:27 +01:00
e84e42d856 Auto-commit local changes before build (2026-01-12 12:52:12) 2026-01-12 12:52:12 +01:00
ae61c563b8 Auto-commit local changes before build (2026-01-12 12:41:47) 2026-01-12 12:41:47 +01:00
b1522cef2f Auto-commit local changes before build (2026-01-12 12:32:03) 2026-01-12 12:32:03 +01:00
ccf9af43d5 Auto-commit local changes before build (2026-01-12 11:13:43) 2026-01-12 11:13:43 +01:00
2f67b29a99 Auto-commit local changes before build (2026-01-12 10:20:41) 2026-01-12 10:20:41 +01:00
68632d4958 Auto-commit local changes before build (2026-01-12 10:11:38) 2026-01-12 10:11:38 +01:00
a7021de872 Auto-commit local changes before build (2026-01-12 09:52:29) 2026-01-12 09:52:29 +01:00
7fcdf5702f Auto-commit local changes before build (2026-01-12 09:43:48) 2026-01-12 09:43:48 +01:00
ae5c8829d6 Auto-commit local changes before build (2026-01-12 09:27:59) 2026-01-12 09:27:59 +01:00
c8b85316e9 Auto-commit local changes before build (2026-01-12 09:21:16) 2026-01-12 09:21:16 +01:00
32f0f44601 Auto-commit local changes before build (2026-01-12 09:09:48) 2026-01-12 09:09:48 +01:00
166311da43 Auto-commit local changes before build (2026-01-09 15:01:37) 2026-01-09 15:01:37 +01:00
7283eb8d99 Auto-commit local changes before build (2026-01-09 13:52:15) 2026-01-09 13:52:15 +01:00
7da364638a Auto-commit local changes before build (2026-01-09 13:34:03) 2026-01-09 13:34:03 +01:00
3e9bb0e065 Auto-commit local changes before build (2026-01-09 13:01:06) 2026-01-09 13:01:06 +01:00
77416a8382 Auto-commit local changes before build (2026-01-09 12:43:58) 2026-01-09 12:43:58 +01:00
6ccc88c8d2 Auto-commit local changes before build (2026-01-09 12:14:18) 2026-01-09 12:14:18 +01:00
e928eb0c83 Auto-commit local changes before build (2026-01-09 12:06:13) 2026-01-09 12:06:13 +01:00
8f705475db Auto-commit local changes before build (2026-01-09 12:03:02) 2026-01-09 12:03:02 +01:00
443c7a4c71 Auto-commit local changes before build (2026-01-09 10:13:07) 2026-01-09 10:13:07 +01:00
17e36b8633 Auto-commit local changes before build (2026-01-09 09:55:24) 2026-01-09 09:55:24 +01:00
62d65d20ad Auto-commit local changes before build (2026-01-09 09:35:39) 2026-01-09 09:35:39 +01:00
ea12f1ecce Release v0.1.19 on branch v20260108-39-changelog-0.1.19 (bump type 1) 2026-01-08 17:15:08 +01:00
1e7dd551ab Auto-commit local changes before build (2026-01-08 17:14:09) 2026-01-08 17:14:09 +01:00
57773a7860 Auto-commit local changes before build (2026-01-08 16:54:10) 2026-01-08 16:54:10 +01:00
9ac125d60c Auto-commit local changes before build (2026-01-08 16:31:25) 2026-01-08 16:31:25 +01:00
bbfcfebfc2 Auto-commit local changes before build (2026-01-08 16:20:40) 2026-01-08 16:20:40 +01:00
63d4b0126b Auto-commit local changes before build (2026-01-08 16:12:11) 2026-01-08 16:12:11 +01:00
476d9c7703 Auto-commit local changes before build (2026-01-08 15:47:01) 2026-01-08 15:47:01 +01:00
ec1cbd2a2c Auto-commit local changes before build (2026-01-08 15:26:35) 2026-01-08 15:26:35 +01:00
1cbec82d65 Auto-commit local changes before build (2026-01-08 15:10:32) 2026-01-08 15:10:32 +01:00
87581f825f Auto-commit local changes before build (2026-01-08 14:46:19) 2026-01-08 14:46:19 +01:00
b89d86bf66 Auto-commit local changes before build (2026-01-08 14:30:19) 2026-01-08 14:30:19 +01:00
90c24de1f5 Auto-commit local changes before build (2026-01-08 14:18:20) 2026-01-08 14:18:20 +01:00
19fb328602 Auto-commit local changes before build (2026-01-08 14:05:43) 2026-01-08 14:05:43 +01:00
d7ffb8aa52 Auto-commit local changes before build (2026-01-08 13:47:59) 2026-01-08 13:47:59 +01:00
60f6f8e3d6 Auto-commit local changes before build (2026-01-08 13:31:57) 2026-01-08 13:31:57 +01:00
b8f86c183c Auto-commit local changes before build (2026-01-08 12:55:39) 2026-01-08 12:55:39 +01:00
b3fde8f431 Auto-commit local changes before build (2026-01-08 12:54:15) 2026-01-08 12:54:15 +01:00
b7f057f0b5 Auto-commit local changes before build (2026-01-08 12:40:27) 2026-01-08 12:40:27 +01:00
1131f7f2fe Auto-commit local changes before build (2026-01-08 11:15:47) 2026-01-08 11:15:47 +01:00
30fa747fca Release v0.1.18 on branch v20260108-26-changelog-0.1.18-update (bump type 1) 2026-01-08 10:27:17 +01:00
d642e7806d Auto-commit local changes before build (2026-01-08 10:25:33) 2026-01-08 10:25:33 +01:00
eb4b80e792 Auto-commit local changes before build (2026-01-08 10:13:48) 2026-01-08 10:13:48 +01:00
71752fb926 Auto-commit local changes before build (2026-01-06 20:50:50) 2026-01-06 20:50:50 +01:00
92716dc13c Auto-commit local changes before build (2026-01-06 20:04:54) 2026-01-06 20:04:54 +01:00
af0faa37f8 Release v0.1.17 on branch backupchecks-v20260106-21-changelog-0.1.17 (bump type 1) 2026-01-06 19:17:02 +01:00
fb46f98e6d Auto-commit local changes before build (2026-01-06 19:15:16) 2026-01-06 19:15:16 +01:00
82bd361ef2 Auto-commit local changes before build (2026-01-06 16:55:52) 2026-01-06 16:55:52 +01:00
a295d267a6 Auto-commit local changes before build (2026-01-06 16:53:31) 2026-01-06 16:53:31 +01:00
dc8e1d093b Auto-commit local changes before build (2026-01-06 16:23:49) 2026-01-06 16:23:49 +01:00
d0a7452240 Auto-commit local changes before build (2026-01-06 15:56:54) 2026-01-06 15:56:54 +01:00
cf4b23b26e Auto-commit local changes before build (2026-01-06 15:21:26) 2026-01-06 15:21:26 +01:00
f736a62ed5 Auto-commit local changes before build (2026-01-06 15:12:25) 2026-01-06 15:12:25 +01:00
39b4ec6064 Auto-commit local changes before build (2026-01-06 15:07:44) 2026-01-06 15:07:44 +01:00
7bfde72f4d Auto-commit local changes before build (2026-01-06 15:04:51) 2026-01-06 15:04:51 +01:00
9a42df3dd3 Auto-commit local changes before build (2026-01-06 14:22:34) 2026-01-06 14:22:34 +01:00
86ac67a59e Auto-commit local changes before build (2026-01-06 14:21:05) 2026-01-06 14:21:05 +01:00
e23e194e40 Auto-commit local changes before build (2026-01-06 14:19:41) 2026-01-06 14:19:41 +01:00
9ebfecc4bb Auto-commit local changes before build (2026-01-06 14:15:59) 2026-01-06 14:15:59 +01:00
fb09891dc4 Auto-commit local changes before build (2026-01-06 13:57:25) 2026-01-06 13:57:25 +01:00
369341c370 Auto-commit local changes before build (2026-01-06 13:46:57) 2026-01-06 13:46:57 +01:00
0c7aaa61db Auto-commit local changes before build (2026-01-06 13:35:46) 2026-01-06 13:35:46 +01:00
63b47a59e0 Auto-commit local changes before build (2026-01-06 13:25:04) 2026-01-06 13:25:04 +01:00
6984b9ec22 Auto-commit local changes before build (2026-01-06 12:43:38) 2026-01-06 12:43:38 +01:00
9e0f215910 Auto-commit local changes before build (2026-01-06 12:23:16) 2026-01-06 12:23:16 +01:00
1a91591482 Auto-commit local changes before build (2026-01-06 12:19:54) 2026-01-06 12:19:54 +01:00
1db555d487 Auto-commit local changes before build (2026-01-06 12:16:27) 2026-01-06 12:16:27 +01:00
cc0d969ebf Auto-commit local changes before build (2026-01-06 11:47:15) 2026-01-06 11:47:15 +01:00
551e0dec26 Auto-commit local changes before build (2026-01-06 11:29:00) 2026-01-06 11:29:00 +01:00
661dbc7013 Auto-commit local changes before build (2026-01-06 11:15:00) 2026-01-06 11:15:00 +01:00
b54ba900d0 Auto-commit local changes before build (2026-01-06 11:04:28) 2026-01-06 11:04:28 +01:00
13c4c5950d Auto-commit local changes before build (2026-01-06 10:11:59) 2026-01-06 10:11:59 +01:00
19f4b59e23 Auto-commit local changes before build (2026-01-06 10:02:17) 2026-01-06 10:02:17 +01:00
f14e02992d Auto-commit local changes before build (2026-01-06 09:45:02) 2026-01-06 09:45:02 +01:00
fcd8518598 Merge pull request 'Auto-commit local changes before build (2026-01-03 17:50:38)' (#25) from v20260103-12-reports-columns-selector-init-fix into main
Reviewed-on: #25
2026-01-06 09:28:23 +01:00
6944755dd9 Merge pull request 'Auto-commit local changes before build (2026-01-03 16:04:10)' (#24) from v20260103-11-reports-view-raw-columns-fix into main
Reviewed-on: #24
2026-01-06 09:28:07 +01:00
5100093be4 Merge pull request 'Auto-commit local changes before build (2026-01-03 15:17:01)' (#23) from v20260103-10-reports-summary-columns-metadata into main
Reviewed-on: #23
2026-01-06 09:27:50 +01:00
5f23e4cbae Merge pull request 'Auto-commit local changes before build (2026-01-03 14:57:56)' (#22) from v20260103-09-reports-column-selection-ui into main
Reviewed-on: #22
2026-01-06 09:27:34 +01:00
27280f5039 Merge pull request 'Auto-commit local changes before build (2026-01-03 14:15:04)' (#21) from v20260103-08-reports-stats-endpoint-fix into main
Reviewed-on: #21
2026-01-06 09:27:16 +01:00
cf6dbce3bb Merge pull request 'Auto-commit local changes before build (2026-01-03 13:59:54)' (#20) from v20260103-07-reports-advanced-reporting-foundation into main
Reviewed-on: #20
2026-01-06 09:27:01 +01:00
733b64b8b0 Merge pull request 'Auto-commit local changes before build (2026-01-03 13:40:56)' (#19) from v20260103-06-reports-delete-button-fix into main
Reviewed-on: #19
2026-01-06 09:26:40 +01:00
544ce24fdb Merge pull request 'Auto-commit local changes before build (2026-01-03 13:29:09)' (#18) from v20260103-05-reports-date-import-fix into main
Reviewed-on: #18
2026-01-06 09:26:18 +01:00
c5dd98cda6 Merge pull request 'Auto-commit local changes before build (2026-01-03 13:24:32)' (#17) from v20260103-04-reports-default-period-fix into main
Reviewed-on: #17
2026-01-06 09:26:00 +01:00
a339540f4c Merge pull request 'Auto-commit local changes before build (2026-01-03 13:05:53)' (#16) from v20260103-03-reports-loading-fix into main
Reviewed-on: #16
2026-01-06 09:25:40 +01:00
82c67f6b01 Merge pull request 'Auto-commit local changes before build (2026-01-03 12:54:35)' (#15) from changes-v20260103-02-reports-delete into main
Reviewed-on: #15
2026-01-06 09:25:22 +01:00
2eeb8266c7 Merge pull request 'Auto-commit local changes before build (2026-01-03 12:21:44)' (#14) from v20260103-01-reports-jobs-delete into main
Reviewed-on: #14
2026-01-06 09:24:49 +01:00
ef9fae053a Release v0.1.16 on branch v20260104-20-changelog-0-1-16 (bump type 1) 2026-01-04 18:42:49 +01:00
9716d5353b Auto-commit local changes before build (2026-01-04 18:41:10) 2026-01-04 18:41:10 +01:00
aecd1872c0 Auto-commit local changes before build (2026-01-04 18:22:54) 2026-01-04 18:22:54 +01:00
79b9580a66 Auto-commit local changes before build (2026-01-04 18:19:57) 2026-01-04 18:19:57 +01:00
ab4f5ae696 Auto-commit local changes before build (2026-01-04 18:12:52) 2026-01-04 18:12:52 +01:00
9f61ed6629 Auto-commit local changes before build (2026-01-04 18:07:10) 2026-01-04 18:07:10 +01:00
6a0aa53cd0 Auto-commit local changes before build (2026-01-04 18:01:23) 2026-01-04 18:01:23 +01:00
79812d3cec Auto-commit local changes before build (2026-01-04 17:26:07) 2026-01-04 17:26:07 +01:00
d51741532a Auto-commit local changes before build (2026-01-04 16:59:43) 2026-01-04 16:59:43 +01:00
f680e799b2 Auto-commit local changes before build (2026-01-04 16:50:41) 2026-01-04 16:50:41 +01:00
17d0398fb7 Auto-commit local changes before build (2026-01-04 16:16:24) 2026-01-04 16:16:24 +01:00
fdeeaef0e6 Auto-commit local changes before build (2026-01-04 16:02:09) 2026-01-04 16:02:09 +01:00
fc275a0285 Auto-commit local changes before build (2026-01-04 15:50:22) 2026-01-04 15:50:22 +01:00
9c95168098 Auto-commit local changes before build (2026-01-04 15:12:50) 2026-01-04 15:12:50 +01:00
b3f3ac90fd Auto-commit local changes before build (2026-01-04 14:34:57) 2026-01-04 14:34:57 +01:00
c880121cd3 Auto-commit local changes before build (2026-01-04 13:29:28) 2026-01-04 13:29:28 +01:00
a1b8dfe5cf Auto-commit local changes before build (2026-01-04 12:34:15) 2026-01-04 12:34:15 +01:00
843e01e1e6 Auto-commit local changes before build (2026-01-04 12:17:21) 2026-01-04 12:17:21 +01:00
985397afa1 Auto-commit local changes before build (2026-01-04 12:11:00) 2026-01-04 12:11:00 +01:00
cea1df3e38 Auto-commit local changes before build (2026-01-04 11:57:39) 2026-01-04 11:57:39 +01:00
609364ef2f Auto-commit local changes before build (2026-01-04 11:19:59) 2026-01-04 11:19:59 +01:00
8fbf452018 Auto-commit local changes before build (2026-01-04 01:33:56) 2026-01-04 01:33:56 +01:00
22a2e7146b Auto-commit local changes before build (2026-01-03 22:57:17) 2026-01-03 22:57:17 +01:00
b0de64c9fd Auto-commit local changes before build (2026-01-03 22:39:11) 2026-01-03 22:39:11 +01:00
9571716344 Auto-commit local changes before build (2026-01-03 22:22:35) 2026-01-03 22:22:35 +01:00
fdf8ab224f Auto-commit local changes before build (2026-01-03 21:53:18) 2026-01-03 21:53:18 +01:00
fa676a9e4e Auto-commit local changes before build (2026-01-03 21:42:36) 2026-01-03 21:42:36 +01:00
bef1d7d336 Auto-commit local changes before build (2026-01-03 18:48:08) 2026-01-03 18:48:08 +01:00
2710a140ad Auto-commit local changes before build (2026-01-03 18:22:48) 2026-01-03 18:22:48 +01:00
46e93b3f01 Auto-commit local changes before build (2026-01-03 17:50:38) 2026-01-03 17:50:38 +01:00
c528b938a0 Auto-commit local changes before build (2026-01-03 16:04:10) 2026-01-03 16:04:10 +01:00
0065446ae3 Merge pull request 'v20260101-15-changelog-0-1-15' (#13) from v20260101-15-changelog-0-1-15 into main
Reviewed-on: #13
2026-01-03 15:25:09 +01:00
18acb16a3d Merge pull request 'Auto-commit local changes before build (2026-01-01 17:54:06)' (#12) from v20260101-14-run-checks-select-all-indeterminate-stuck-dash-fix into main
Reviewed-on: #12
2026-01-03 15:24:52 +01:00
15befc0b32 Merge pull request 'Auto-commit local changes before build (2026-01-01 17:49:01)' (#11) from v20260101-13-run-checks-select-all-indeterminate-state-fix into main
Reviewed-on: #11
2026-01-03 15:23:30 +01:00
8c40ad4678 Merge pull request 'Auto-commit local changes before build (2026-01-01 17:42:47)' (#10) from v20260101-12-run-checks-select-all-indeterminate-clear-selection-fix into main
Reviewed-on: #10
2026-01-03 15:23:14 +01:00
3508404937 Merge pull request 'Auto-commit local changes before build (2026-01-01 17:36:33)' (#9) from v20260101-11-run-checks-select-all-indeterminate-clear-selection into main
Reviewed-on: #9
2026-01-03 15:22:58 +01:00
65bfbe812a Merge pull request 'Auto-commit local changes before build (2026-01-01 17:28:51)' (#8) from v20260101-10-run-checks-shift-multiselect-last-row-checkbox-fix into main
Reviewed-on: #8
2026-01-03 15:22:39 +01:00
fe9d7293d0 Merge pull request 'Auto-commit local changes before build (2026-01-01 17:23:31)' (#7) from v20260101-09-run-checks-shift-multiselect-range-highlight-fix into main
Reviewed-on: #7
2026-01-03 15:19:03 +01:00
9777cb2ea7 Merge pull request 'Auto-commit local changes before build (2026-01-01 17:15:18)' (#6) from v20260101-08-run-checks-shift-multiselect-persist into main
Reviewed-on: #6
2026-01-03 15:18:30 +01:00
8e6fb4b66d Merge pull request 'Auto-commit local changes before build (2026-01-01 17:08:01)' (#5) from v20260101-07-run-checks-shift-multiselect-delegation-fix into main
Reviewed-on: #5
2026-01-03 15:18:13 +01:00
94ecc305a2 Auto-commit local changes before build (2026-01-03 15:17:01) 2026-01-03 15:17:01 +01:00
7c0c7d8c3e Merge pull request 'Auto-commit local changes before build (2026-01-01 17:00:37)' (#4) from v20260101-06-run-checks-shift-multiselect-fix into main
Reviewed-on: #4
2026-01-03 15:04:33 +01:00
e5a5b22165 Merge pull request 'Auto-commit local changes before build (2026-01-01 16:54:45)' (#3) from v20260101-05-run-checks-shift-multiselect into main
Reviewed-on: #3
2026-01-03 15:04:00 +01:00
fc907349a0 Merge pull request 'Auto-commit local changes before build (2026-01-01 16:45:48)' (#2) from v20260101-04-run-checks-guide-layout-fix into main
Reviewed-on: #2
2026-01-03 15:03:29 +01:00
7c426471ac Merge pull request 'v20260101-03-run-checks-page-guide-text' (#1) from v20260101-03-run-checks-page-guide-text into main
Reviewed-on: #1
2026-01-03 15:02:52 +01:00
5ed3e50288 Auto-commit local changes before build (2026-01-03 14:57:56) 2026-01-03 14:57:56 +01:00
a5c8f2db3c Auto-commit local changes before build (2026-01-03 14:15:04) 2026-01-03 14:15:04 +01:00
dff746b23d Auto-commit local changes before build (2026-01-03 13:59:54) 2026-01-03 13:59:54 +01:00
56d2d68fa0 Auto-commit local changes before build (2026-01-03 13:40:56) 2026-01-03 13:40:56 +01:00
896bcbe55f Auto-commit local changes before build (2026-01-03 13:29:09) 2026-01-03 13:29:09 +01:00
4654fac477 Auto-commit local changes before build (2026-01-03 13:24:32) 2026-01-03 13:24:32 +01:00
750815dcff Auto-commit local changes before build (2026-01-03 13:05:53) 2026-01-03 13:05:54 +01:00
e2822a130f Auto-commit local changes before build (2026-01-03 12:54:35) 2026-01-03 12:54:35 +01:00
b246b82d11 Auto-commit local changes before build (2026-01-03 12:21:44) 2026-01-03 12:21:44 +01:00
60 changed files with 8108 additions and 2028 deletions

View File

@@ -1 +1 @@
v20260101-15-changelog-0-1-15
v20260113-08-vspc-object-linking-normalize

View File

@@ -6,3 +6,5 @@ psycopg2-binary==2.9.9
python-dateutil==2.9.0.post0
gunicorn==23.0.0
requests==2.32.3
reportlab==4.2.5
Markdown==3.6

View File

@@ -18,8 +18,9 @@ from ..models import User
auth_bp = Blueprint("auth", __name__, url_prefix="/auth")
def admin_exists() -> bool:
return db.session.query(User.id).filter_by(role="admin").first() is not None
def users_exist() -> bool:
# Initial setup should only run on a fresh install where NO users exist yet.
return db.session.query(User.id).first() is not None
def generate_captcha():
@@ -55,7 +56,7 @@ def captcha_required(func):
@captcha_required
def login():
if request.method == "GET":
if not admin_exists():
if not users_exist():
return redirect(url_for("auth.initial_setup"))
question, answer = generate_captcha()
@@ -98,8 +99,8 @@ def logout():
@auth_bp.route("/initial-setup", methods=["GET", "POST"])
def initial_setup():
if admin_exists():
flash("An admin user already exists. Please log in.", "info")
if users_exist():
flash("Users already exist. Please log in.", "info")
return redirect(url_for("auth.login"))
if request.method == "POST":

View File

@@ -7,7 +7,7 @@ from datetime import datetime
from .admin_logging import log_admin_event
from .mail_importer import MailImportError, run_auto_import
from .models import SystemSettings
from .object_persistence import persist_objects_for_approved_run
from .object_persistence import persist_objects_for_auto_run
_AUTO_IMPORTER_THREAD_NAME = "auto_importer"
@@ -80,7 +80,7 @@ def start_auto_importer(app) -> None:
persisted_errors = 0
for (customer_id, job_id, run_id, mail_message_id) in auto_approved_runs:
try:
persisted_objects += persist_objects_for_approved_run(
persisted_objects += persist_objects_for_auto_run(
int(customer_id), int(job_id), int(run_id), int(mail_message_id)
)
except Exception as exc:

View File

@@ -3,6 +3,7 @@ from __future__ import annotations
from email import policy
from email.parser import BytesParser
from email.utils import parseaddr
import re
from typing import List, Optional, Tuple
@@ -125,3 +126,42 @@ def extract_best_html_from_eml(
return None
_fn, html_text = items[0]
return html_text or None
def is_effectively_blank_html(value: str | None) -> bool:
"""Return True when an HTML body is effectively empty.
Some sources produce Graph bodies that are non-empty strings but contain only
an empty HTML skeleton (e.g. <html><body></body></html>) or whitespace.
In those cases we want to treat the body as empty so we can fall back to an
HTML report attachment stored in the EML.
"""
if value is None:
return True
if not isinstance(value, str):
return False
raw = value.strip()
if raw == "":
return True
# Fast path: if we clearly have content-bearing elements, it is not blank.
# (This avoids false positives for report-like HTML.)
if re.search(r"<(table|img|svg|pre|ul|ol|li|iframe|object|embed)\b", raw, re.IGNORECASE):
return False
# Try to isolate the body content; if no body tag is present, evaluate the full string.
m = re.search(r"<body\b[^>]*>(.*?)</body>", raw, re.IGNORECASE | re.DOTALL)
body = m.group(1) if m else raw
# Remove comments, scripts, and styles.
body = re.sub(r"<!--.*?-->", "", body, flags=re.DOTALL)
body = re.sub(r"<script\b[^>]*>.*?</script>", "", body, flags=re.IGNORECASE | re.DOTALL)
body = re.sub(r"<style\b[^>]*>.*?</style>", "", body, flags=re.IGNORECASE | re.DOTALL)
# Strip tags and common non-breaking whitespace entities.
text = re.sub(r"<[^>]+>", "", body)
text = text.replace("&nbsp;", " ").replace("\xa0", " ")
text = re.sub(r"\s+", "", text)
return text == ""
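The blank-detection helper above is self-contained, so its behavior can be demonstrated in isolation. Below is a condensed copy for illustration (same regexes, same fast path); the authoritative version is the one in `email_utils` shown in the diff:

```python
import re

def is_effectively_blank_html(value):
    # Condensed copy of email_utils.is_effectively_blank_html, for illustration.
    if value is None:
        return True
    if not isinstance(value, str):
        return False
    raw = value.strip()
    if raw == "":
        return True
    # Fast path: content-bearing elements mean the body is not blank.
    if re.search(r"<(table|img|svg|pre|ul|ol|li|iframe|object|embed)\b", raw, re.IGNORECASE):
        return False
    m = re.search(r"<body\b[^>]*>(.*?)</body>", raw, re.IGNORECASE | re.DOTALL)
    body = m.group(1) if m else raw
    body = re.sub(r"<!--.*?-->", "", body, flags=re.DOTALL)
    body = re.sub(r"<script\b[^>]*>.*?</script>", "", body, flags=re.IGNORECASE | re.DOTALL)
    body = re.sub(r"<style\b[^>]*>.*?</style>", "", body, flags=re.IGNORECASE | re.DOTALL)
    text = re.sub(r"<[^>]+>", "", body)
    text = text.replace("&nbsp;", " ").replace("\xa0", " ")
    return re.sub(r"\s+", "", text) == ""

print(is_effectively_blank_html("<html><body>\n&nbsp;\n</body></html>"))  # True
print(is_effectively_blank_html("<html><body><table><tr><td>ok</td></tr></table></body></html>"))  # False
```

Note that a non-string value (e.g. an ORM column holding bytes) deliberately returns `False` rather than `True`, so only genuinely empty string bodies trigger the EML-attachment fallback.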

View File

@@ -68,4 +68,53 @@ def find_matching_job(msg: MailMessage) -> Optional[Job]:
if len(matches) == 1:
return matches[0]
# Backwards-compatible matching for Veeam VSPC Active Alarms summary per-company jobs.
# Earlier versions could store company names with slightly different whitespace / HTML entities,
# while parsers store objects using a normalized company prefix. When the exact match fails,
# try a normalized company comparison so existing jobs continue to match.
try:
bsw = (backup or "").strip().lower()
bt = (btype or "").strip().lower()
jn = (job_name or "").strip()
if bsw == "veeam" and bt == "service provider console" and "|" in jn:
left, right = [p.strip() for p in jn.split("|", 1)]
if left.lower() == "active alarms summary" and right:
from .parsers.veeam import normalize_vspc_company_name # lazy import
target_company = normalize_vspc_company_name(right)
if not target_company:
return None
q2 = Job.query
if norm_from is None:
q2 = q2.filter(Job.from_address.is_(None))
else:
q2 = q2.filter(Job.from_address == norm_from)
q2 = q2.filter(Job.backup_software == backup)
q2 = q2.filter(Job.backup_type == btype)
q2 = q2.filter(Job.job_name.ilike("Active alarms summary | %"))
# Load a small set of candidates and compare the company portion.
candidates = q2.order_by(Job.updated_at.desc(), Job.id.desc()).limit(25).all()
normalized_matches: list[Job] = []
for cand in candidates:
cand_name = (cand.job_name or "").strip()
if "|" not in cand_name:
continue
c_left, c_right = [p.strip() for p in cand_name.split("|", 1)]
if c_left.lower() != "active alarms summary" or not c_right:
continue
if normalize_vspc_company_name(c_right) == target_company:
normalized_matches.append(cand)
if len(normalized_matches) > 1:
customer_ids = {m.customer_id for m in normalized_matches}
if len(customer_ids) == 1:
return normalized_matches[0]
return None
if len(normalized_matches) == 1:
return normalized_matches[0]
except Exception:
pass
return None
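The fallback above hinges on `normalize_vspc_company_name`, whose implementation is not part of this diff. Based on the comment ("slightly different whitespace / HTML entities"), a plausible sketch of such a normalizer is shown below — the function body here is an assumption for illustration, not the real helper from `parsers/veeam.py`:

```python
import html
import re

def normalize_vspc_company_name(name):
    # Hypothetical sketch of the normalizer assumed by the matching fallback:
    # unescape HTML entities, collapse all whitespace runs (including &nbsp;
    # turned into \xa0), and compare case-insensitively.
    if not name:
        return ""
    text = html.unescape(name)
    text = re.sub(r"\s+", " ", text).strip()
    return text.lower()

# Two renderings of the same company name normalize to the same key:
print(normalize_vspc_company_name("Acme&nbsp;&amp; Sons "))  # acme & sons
print(normalize_vspc_company_name("ACME   & Sons"))          # acme & sons
```

Under this assumption, a job stored as `Active alarms summary | Acme&nbsp;&amp; Sons` keeps matching mail parsed with the cleaned-up company prefix, which is exactly the backwards compatibility the comment describes.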

View File

@@ -11,9 +11,10 @@ import requests
from sqlalchemy import func
from . import db
from .models import MailMessage, SystemSettings, Job, JobRun
from .models import MailMessage, SystemSettings, Job, JobRun, MailObject
from .parsers import parse_mail_message
from .email_utils import normalize_from_address, extract_best_html_from_eml
from .parsers.veeam import extract_vspc_active_alarms_companies
from .email_utils import normalize_from_address, extract_best_html_from_eml, is_effectively_blank_html
from .job_matching import find_matching_job
@@ -228,9 +229,15 @@ def _store_messages(settings: SystemSettings, messages):
)
# Some systems send empty bodies and put the actual report in an HTML attachment.
# If we have raw EML bytes and no body content, extract the first HTML attachment
# and use it as the HTML body so parsers and the inbox preview can work.
if not (mail.html_body or mail.text_body) and mail.eml_blob:
# Graph may still return a body that only contains whitespace/newlines; treat that
# as empty so we can fall back to the attachment.
def _is_blank_text(s):
return s is None or (isinstance(s, str) and s.strip() == "")
# If we have raw EML bytes and no meaningful body content, extract the first
# HTML attachment and use it as the HTML body so parsers and the inbox preview
# can work.
if is_effectively_blank_html(mail.html_body) and _is_blank_text(mail.text_body) and mail.eml_blob:
attachment_html = extract_best_html_from_eml(mail.eml_blob)
if attachment_html:
mail.html_body = attachment_html
@@ -259,6 +266,94 @@ def _store_messages(settings: SystemSettings, messages):
and getattr(mail, "parse_result", None) == "ok"
and not bool(getattr(mail, "approved", False))
):
# Special case: Veeam VSPC "Active alarms summary" contains multiple companies.
bsw = (getattr(mail, "backup_software", "") or "").strip().lower()
btype = (getattr(mail, "backup_type", "") or "").strip().lower()
jname = (getattr(mail, "job_name", "") or "").strip().lower()
if bsw == "veeam" and btype == "service provider console" and jname == "active alarms summary":
raw = (mail.text_body or "").strip() or (mail.html_body or "")
companies = extract_vspc_active_alarms_companies(raw)
if companies:
def _is_error_status(value: str | None) -> bool:
v = (value or "").strip().lower()
return v in {"error", "failed", "critical"} or v.startswith("fail")
created_any = False
first_job = None
mapped_count = 0
for company in companies:
# Build a temp message using the per-company job name
tmp = MailMessage(
from_address=mail.from_address,
backup_software=mail.backup_software,
backup_type=mail.backup_type,
job_name=f"{(mail.job_name or 'Active alarms summary').strip()} | {company}".strip(),
)
job = find_matching_job(tmp)
if not job:
continue
# Respect per-job flags.
if hasattr(job, "active") and not bool(job.active):
continue
if hasattr(job, "auto_approve") and not bool(job.auto_approve):
continue
mapped_count += 1
objs = (
MailObject.query.filter(MailObject.mail_message_id == mail.id)
.filter(MailObject.object_name.ilike(f"{company} | %"))
.all()
)
saw_error = any(_is_error_status(o.status) for o in objs)
saw_warning = any((o.status or "").strip().lower() == "warning" for o in objs)
status = "Error" if saw_error else ("Warning" if saw_warning else (mail.overall_status or "Success"))
run = JobRun(
job_id=job.id,
mail_message_id=mail.id,
run_at=mail.received_at,
status=status or None,
missed=False,
)
# Optional storage metrics
if hasattr(run, "storage_used_bytes") and hasattr(mail, "storage_used_bytes"):
run.storage_used_bytes = mail.storage_used_bytes
if hasattr(run, "storage_capacity_bytes") and hasattr(mail, "storage_capacity_bytes"):
run.storage_capacity_bytes = mail.storage_capacity_bytes
if hasattr(run, "storage_free_bytes") and hasattr(mail, "storage_free_bytes"):
run.storage_free_bytes = mail.storage_free_bytes
if hasattr(run, "storage_free_percent") and hasattr(mail, "storage_free_percent"):
run.storage_free_percent = mail.storage_free_percent
db.session.add(run)
db.session.flush()
auto_approved_runs.append((job.customer_id, job.id, run.id, mail.id))
created_any = True
if not first_job:
first_job = job
# If all companies are mapped, mark the mail as fully approved and move to history.
if created_any and mapped_count == len(companies):
mail.job_id = first_job.id if first_job else None
if hasattr(mail, "approved"):
mail.approved = True
if hasattr(mail, "approved_at"):
mail.approved_at = datetime.utcnow()
if hasattr(mail, "location"):
mail.location = "history"
auto_approved += 1
# Do not fall back to single-job matching for VSPC summary.
continue
job = find_matching_job(mail)
if job:
# Respect per-job flags.
@@ -436,6 +531,21 @@ def run_auto_import(settings: SystemSettings):
new_messages = 0
auto_approved = 0
auto_approved_runs = []
# Never move messages when the import failed (prevents "moved but not stored" situations).
processed_folder_id = None
# Ensure imported messages are committed before moving them to another folder.
# If commit fails, do not move anything.
if processed_folder_id and new_messages >= 0:
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
errors.append(f"Failed to commit imported messages: {exc}")
new_messages = 0
auto_approved = 0
auto_approved_runs = []
processed_folder_id = None
# Move messages to the processed folder if configured
if processed_folder_id:
@@ -613,6 +723,21 @@ def run_manual_import(settings: SystemSettings, batch_size: int):
errors.append(str(exc))
new_messages = 0
auto_approved_runs = []
# Never move messages when the import failed (prevents "moved but not stored" situations).
processed_folder_id = None
# Ensure imported messages are committed before moving them to another folder.
# If commit fails, do not move anything.
if processed_folder_id and new_messages >= 0:
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
errors.append(f"Failed to commit imported messages: {exc}")
new_messages = 0
auto_approved = 0
auto_approved_runs = []
processed_folder_id = None
# Move messages to the processed folder if configured
if processed_folder_id:
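The guard added to both importers follows one pattern: commit the imported messages first, and on commit failure clear the target folder so nothing gets moved while unstored ("moved but not stored"). A minimal sketch of that pattern, with hypothetical names (`session`, `move_messages` stand in for the real SQLAlchemy session and Graph move call):

```python
def commit_then_move(session, move_messages, processed_folder_id, errors):
    # Sketch of the importer guard: commit before moving; if the commit
    # fails, roll back and clear the folder id so no message is moved.
    if processed_folder_id:
        try:
            session.commit()
        except Exception as exc:
            session.rollback()
            errors.append(f"Failed to commit imported messages: {exc}")
            processed_folder_id = None
    if processed_folder_id:
        move_messages(processed_folder_id)
    return processed_folder_id
```

The key design choice is ordering: the mailbox move is the irreversible side effect, so it only happens after the database state it depends on is durable.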

View File

@@ -10,6 +10,7 @@ from .routes_shared import main_bp, roles_required # noqa: F401
from . import routes_core # noqa: F401
from . import routes_news # noqa: F401
from . import routes_inbox # noqa: F401
from . import routes_mail_audit # noqa: F401
from . import routes_customers # noqa: F401
from . import routes_jobs # noqa: F401
from . import routes_settings # noqa: F401

View File

@@ -0,0 +1,135 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime
@main_bp.route("/admin/all-mail")
@login_required
@roles_required("admin")
def admin_all_mail_page():
# Pagination
try:
page = int(request.args.get("page", "1"))
except ValueError:
page = 1
if page < 1:
page = 1
per_page = 50
# Filters (AND combined)
from_q = (request.args.get("from_q") or "").strip()
subject_q = (request.args.get("subject_q") or "").strip()
backup_q = (request.args.get("backup_q") or "").strip()
type_q = (request.args.get("type_q") or "").strip()
job_name_q = (request.args.get("job_name_q") or "").strip()
received_from = (request.args.get("received_from") or "").strip()
received_to = (request.args.get("received_to") or "").strip()
only_unlinked = (request.args.get("only_unlinked") or "").strip().lower() in (
"1",
"true",
"yes",
"on",
)
query = db.session.query(MailMessage).outerjoin(Job, MailMessage.job_id == Job.id)
if from_q:
query = query.filter(MailMessage.from_address.ilike(f"%{from_q}%"))
if subject_q:
query = query.filter(MailMessage.subject.ilike(f"%{subject_q}%"))
if backup_q:
query = query.filter(MailMessage.backup_software.ilike(f"%{backup_q}%"))
if type_q:
query = query.filter(MailMessage.backup_type.ilike(f"%{type_q}%"))
if job_name_q:
# Prefer stored job_name, but also allow matching the linked Job name.
query = query.filter(
or_(
MailMessage.job_name.ilike(f"%{job_name_q}%"),
Job.name.ilike(f"%{job_name_q}%"),
)
)
if only_unlinked:
query = query.filter(MailMessage.job_id.is_(None))
# Datetime window (received_at)
# Use dateutil.parser when available, otherwise a simple ISO parse fallback.
def _parse_dt(value: str):
if not value:
return None
try:
from dateutil import parser as dtparser # type: ignore
return dtparser.parse(value)
except Exception:
try:
# Accept "YYYY-MM-DDTHH:MM" from datetime-local.
return datetime.fromisoformat(value)
except Exception:
return None
dt_from = _parse_dt(received_from)
dt_to = _parse_dt(received_to)
if dt_from is not None:
query = query.filter(MailMessage.received_at >= dt_from)
if dt_to is not None:
query = query.filter(MailMessage.received_at <= dt_to)
total_items = query.count()
total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
if page > total_pages:
page = total_pages
messages = (
query.order_by(
MailMessage.received_at.desc().nullslast(),
MailMessage.id.desc(),
)
.offset((page - 1) * per_page)
.limit(per_page)
.all()
)
rows = []
for msg in messages:
rows.append(
{
"id": msg.id,
"from_address": msg.from_address or "",
"subject": msg.subject or "",
"received_at": _format_datetime(msg.received_at),
"backup_software": msg.backup_software or "",
"backup_type": msg.backup_type or "",
"job_name": (msg.job_name or "") or (msg.job.name if msg.job else ""),
"linked": bool(msg.job_id),
"has_eml": bool(getattr(msg, "eml_stored_at", None)),
}
)
has_prev = page > 1
has_next = page < total_pages
return render_template(
"main/admin_all_mail.html",
rows=rows,
page=page,
total_pages=total_pages,
has_prev=has_prev,
has_next=has_next,
filters={
"from_q": from_q,
"subject_q": subject_q,
"backup_q": backup_q,
"type_q": type_q,
"job_name_q": job_name_q,
"received_from": received_from,
"received_to": received_to,
"only_unlinked": only_unlinked,
},
)

View File

@@ -1,5 +1,6 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime, _get_ui_timezone_name, _next_ticket_code, _to_amsterdam_date
from .routes_shared import _format_datetime, _get_ui_timezone_name, _to_amsterdam_date
import re
@main_bp.route("/api/job-runs/<int:run_id>/alerts")
@login_required
@@ -21,14 +22,19 @@ def api_job_run_alerts(run_id: int):
db.session.execute(
text(
"""
SELECT t.id, t.ticket_code, t.description, t.start_date, t.resolved_at, t.active_from_date
SELECT t.id,
t.ticket_code,
t.description,
t.start_date,
COALESCE(ts.resolved_at, t.resolved_at) AS resolved_at,
t.active_from_date
FROM tickets t
JOIN ticket_scopes ts ON ts.ticket_id = t.id
WHERE ts.job_id = :job_id
AND t.active_from_date <= :run_date
AND (
t.resolved_at IS NULL
OR ((t.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
COALESCE(ts.resolved_at, t.resolved_at) IS NULL
OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
)
ORDER BY t.start_date DESC
"""
@@ -168,7 +174,7 @@ def api_tickets():
"active_from_date": str(getattr(t, "active_from_date", "") or ""),
"start_date": _format_datetime(t.start_date),
"resolved_at": _format_datetime(t.resolved_at) if t.resolved_at else "",
"active": t.resolved_at is None,
"active": (t.resolved_at is None and TicketScope.query.filter_by(ticket_id=t.id, resolved_at=None).first() is not None),
}
)
return jsonify({"status": "ok", "tickets": items})
@@ -178,7 +184,7 @@ def api_tickets():
return jsonify({"status": "error", "message": "Forbidden."}), 403
payload = request.get_json(silent=True) or {}
description = (payload.get("description") or "").strip() or None
description = None # Description removed from New ticket UI; use remarks for additional context
try:
run_id = int(payload.get("job_run_id") or 0)
except Exception:
@@ -194,10 +200,25 @@ def api_tickets():
job = Job.query.get(run.job_id) if run else None
now = datetime.utcnow()
code = _next_ticket_code(now)
ticket_code = (payload.get("ticket_code") or "").strip().upper()
if not ticket_code:
return jsonify({"status": "error", "message": "ticket_code is required."}), 400
# Validate format: TYYYYMMDD.####
if not re.match(r"^T\d{8}\.\d{4}$", ticket_code):
return jsonify({"status": "error", "message": "Invalid ticket_code format. Expected TYYYYMMDD.####."}), 400
existing = Ticket.query.filter_by(ticket_code=ticket_code).first()
is_new = existing is None
# Create new ticket if it doesn't exist; otherwise reuse the existing one so the same
# ticket number can be linked to multiple jobs and job runs.
if existing:
ticket = existing
else:
ticket = Ticket(
ticket_code=code,
ticket_code=ticket_code,
title=None,
description=description,
active_from_date=_to_amsterdam_date(run.run_at) or _to_amsterdam_date(now) or now.date(),
@@ -206,10 +227,16 @@ def api_tickets():
)
try:
if is_new:
db.session.add(ticket)
db.session.flush()
# Minimal scope from job
# Ensure a scope exists for this job so alerts/popups can show the ticket code.
scope = None
if job and job.id:
scope = TicketScope.query.filter_by(ticket_id=ticket.id, scope_type="job", job_id=job.id).first()
if not scope:
scope = TicketScope(
ticket_id=ticket.id,
scope_type="job",
@@ -219,9 +246,20 @@ def api_tickets():
job_id=job.id if job else None,
job_name_match=job.job_name if job else None,
job_name_match_mode="exact",
resolved_at=None,
)
db.session.add(scope)
else:
# Re-open this ticket for this job if it was previously resolved for this scope.
scope.resolved_at = None
scope.customer_id = job.customer_id if job else scope.customer_id
scope.backup_software = job.backup_software if job else scope.backup_software
scope.backup_type = job.backup_type if job else scope.backup_type
scope.job_name_match = job.job_name if job else scope.job_name_match
scope.job_name_match_mode = "exact"
# Link ticket to this job run (idempotent)
if not TicketJobRun.query.filter_by(ticket_id=ticket.id, job_run_id=run.id).first():
link = TicketJobRun(ticket_id=ticket.id, job_run_id=run.id, link_source="manual")
db.session.add(link)
@@ -250,21 +288,8 @@ def api_tickets():
@login_required
@roles_required("admin", "operator", "viewer")
def api_ticket_update(ticket_id: int):
if get_active_role() not in ("admin", "operator"):
return jsonify({"status": "error", "message": "Forbidden."}), 403
ticket = Ticket.query.get_or_404(ticket_id)
payload = request.get_json(silent=True) or {}
if "description" in payload:
ticket.description = (payload.get("description") or "").strip() or None
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
return jsonify({"status": "error", "message": str(exc) or "Failed to update ticket."}), 500
return jsonify({"status": "ok"})
# Editing tickets is not allowed. Resolve the old ticket and create a new one instead.
return jsonify({"status": "error", "message": "Ticket editing is disabled. Resolve the old ticket and create a new one."}), 405
@main_bp.route("/api/tickets/<int:ticket_id>/resolve", methods=["POST"])
@@ -275,9 +300,68 @@ def api_ticket_resolve(ticket_id: int):
return jsonify({"status": "error", "message": "Forbidden."}), 403
ticket = Ticket.query.get_or_404(ticket_id)
if ticket.resolved_at is None:
ticket.resolved_at = datetime.utcnow()
now = datetime.utcnow()
payload = request.get_json(silent=True) if request.is_json else {}
try:
run_id = int((payload or {}).get("job_run_id") or 0)
except Exception:
run_id = 0
# Job-scoped resolve (from popups / job details): resolve only for the job of the provided job_run_id
if run_id > 0:
run = JobRun.query.get(run_id)
if not run:
return jsonify({"status": "error", "message": "Job run not found."}), 404
job = Job.query.get(run.job_id) if run else None
job_id = job.id if job else None
try:
scope = None
if job_id:
scope = TicketScope.query.filter_by(ticket_id=ticket.id, scope_type="job", job_id=job_id).first()
if not scope:
scope = TicketScope(
ticket_id=ticket.id,
scope_type="job",
customer_id=job.customer_id if job else None,
backup_software=job.backup_software if job else None,
backup_type=job.backup_type if job else None,
job_id=job_id,
job_name_match=job.job_name if job else None,
job_name_match_mode="exact",
resolved_at=now,
)
db.session.add(scope)
else:
if scope.resolved_at is None:
scope.resolved_at = now
# Keep the audit link to the run (idempotent)
if not TicketJobRun.query.filter_by(ticket_id=ticket.id, job_run_id=run.id).first():
db.session.add(TicketJobRun(ticket_id=ticket.id, job_run_id=run.id, link_source="manual"))
# If all scopes are resolved, also resolve the ticket globally (so the central list shows it as resolved)
open_scope = TicketScope.query.filter_by(ticket_id=ticket.id, resolved_at=None).first()
if open_scope is None and ticket.resolved_at is None:
ticket.resolved_at = now
db.session.commit()
except Exception as exc:
db.session.rollback()
return jsonify({"status": "error", "message": str(exc) or "Failed to resolve ticket."}), 500
return jsonify({"status": "ok", "resolved_at": _format_datetime(now)})
# Global resolve (from central ticket list): resolve ticket and all scopes
if ticket.resolved_at is None:
ticket.resolved_at = now
try:
# Resolve any still-open scopes
TicketScope.query.filter_by(ticket_id=ticket.id, resolved_at=None).update({"resolved_at": now})
db.session.commit()
except Exception as exc:
db.session.rollback()
@@ -286,11 +370,9 @@ def api_ticket_resolve(ticket_id: int):
# If this endpoint is called from a regular HTML form submit (e.g. Tickets/Remarks page),
# redirect back instead of showing raw JSON in the browser.
if not request.is_json and "application/json" not in (request.headers.get("Accept") or ""):
return redirect(request.referrer or url_for("main.tickets_page"))
return jsonify({"status": "ok", "resolved_at": _format_datetime(ticket.resolved_at)})
return redirect(request.referrer or url_for("main.tickets"))
return jsonify({"status": "ok", "resolved_at": _format_datetime(now)})
@main_bp.route("/api/tickets/<int:ticket_id>/link-run", methods=["POST"])
@login_required
@roles_required("admin", "operator", "viewer")
@@ -420,21 +502,8 @@ def api_remarks():
@login_required
@roles_required("admin", "operator", "viewer")
def api_remark_update(remark_id: int):
if get_active_role() not in ("admin", "operator"):
return jsonify({"status": "error", "message": "Forbidden."}), 403
remark = Remark.query.get_or_404(remark_id)
payload = request.get_json(silent=True) or {}
if "body" in payload:
remark.body = (payload.get("body") or "").strip() or ""
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
return jsonify({"status": "error", "message": str(exc) or "Failed to update remark."}), 500
return jsonify({"status": "ok"})
# Editing remarks is not allowed. Resolve the old remark and create a new one instead.
return jsonify({"status": "error", "message": "Remark editing is disabled. Resolve the old remark and create a new one."}), 405
@main_bp.route("/api/remarks/<int:remark_id>/resolve", methods=["POST"])

View File

@@ -100,6 +100,15 @@ def customers_delete(customer_id: int):
customer = Customer.query.get_or_404(customer_id)
try:
# Prevent FK violations and keep historical reporting intact.
# Jobs are not deleted when removing a customer; they are simply unlinked.
Job.query.filter_by(customer_id=customer.id).update({"customer_id": None})
# Ticket/remark scoping rows can reference customers directly (FK),
# so remove those links first to allow the customer to be deleted.
TicketScope.query.filter_by(customer_id=customer.id).delete(synchronize_session=False)
RemarkScope.query.filter_by(customer_id=customer.id).delete(synchronize_session=False)
db.session.delete(customer)
db.session.commit()
flash("Customer deleted.", "success")

View File

@@ -1,5 +1,5 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime, _get_or_create_settings, _apply_overrides_to_run, _infer_schedule_map_from_runs
from .routes_shared import _format_datetime, _get_or_create_settings, _apply_overrides_to_run, _infer_schedule_map_from_runs, _infer_monthly_schedule_from_runs
# Grace window for today's Expected/Missed transition.
# A job is only marked Missed after the latest expected time plus this grace.
@@ -76,6 +76,7 @@ def daily_jobs():
jobs = (
Job.query.join(Customer, isouter=True)
.filter(Job.archived.is_(False))
.order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
.all()
)
@@ -84,6 +85,23 @@ def daily_jobs():
for job in jobs:
schedule_map = _infer_schedule_map_from_runs(job.id)
expected_times = schedule_map.get(weekday_idx) or []
# If no weekly schedule is inferred (e.g. monthly jobs), try monthly inference.
if not expected_times:
monthly = _infer_monthly_schedule_from_runs(job.id)
if monthly:
dom = int(monthly.get("day_of_month") or 0)
mtimes = monthly.get("times") or []
# For months shorter than dom, treat the last day of month as the scheduled day.
try:
import calendar as _calendar
last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
except Exception:
last_dom = target_date.day
scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
if target_date.day == scheduled_dom:
expected_times = list(mtimes)
if not expected_times:
continue
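The monthly-inference fallback above clamps a configured day-of-month to the last day of shorter months via `calendar.monthrange`. That clamp can be sketched in isolation (`scheduled_day` is an illustrative name, not a function from the codebase):

```python
import calendar
from datetime import date

def scheduled_day(dom, target):
    # Mirror the clamp used in daily_jobs: for months shorter than the
    # inferred day-of-month, the last day of the month is the scheduled day.
    last_dom = calendar.monthrange(target.year, target.month)[1]
    return dom if (dom and dom <= last_dom) else last_dom

print(scheduled_day(31, date(2026, 2, 10)))  # 28 (February 2026 has 28 days)
print(scheduled_day(31, date(2026, 1, 10)))  # 31
```

A falsy `dom` (0 or missing) also falls through to the last day of the month, which keeps the comparison `target_date.day == scheduled_dom` well-defined.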

View File

@@ -4,15 +4,14 @@ from .routes_shared import _format_datetime
@main_bp.route("/feedback")
@login_required
@roles_required("admin", "operator", "viewer")
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_page():
item_type = (request.args.get("type") or "").strip().lower()
if item_type not in ("", "bug", "feature"):
item_type = ""
# Default to showing both open and resolved items. Resolved items should remain
# visible for all users until an admin deletes them.
status = (request.args.get("status") or "all").strip().lower()
# Default to showing only open items. Users can still switch to Resolved or All via the filter.
status = (request.args.get("status") or "open").strip().lower()
if status not in ("open", "resolved", "all"):
status = "all"
@@ -46,6 +45,9 @@ def feedback_page():
else:
order_sql = "vote_count DESC, fi.created_at DESC"
# Always keep resolved items at the bottom when mixing statuses.
order_sql = "CASE WHEN fi.status = 'resolved' THEN 1 ELSE 0 END, " + order_sql
sql = text(
f"""
SELECT
@@ -108,7 +110,7 @@ def feedback_page():
@main_bp.route("/feedback/new", methods=["GET", "POST"])
@login_required
@roles_required("admin", "operator", "viewer")
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_new():
if request.method == "POST":
item_type = (request.form.get("item_type") or "").strip().lower()
@@ -143,7 +145,7 @@ def feedback_new():
@main_bp.route("/feedback/<int:item_id>")
@login_required
@roles_required("admin", "operator", "viewer")
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_detail(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
if item.deleted_at is not None:
@@ -172,6 +174,19 @@ def feedback_detail(item_id: int):
resolved_by = User.query.get(item.resolved_by_user_id)
resolved_by_name = resolved_by.username if resolved_by else ""
replies = (
FeedbackReply.query.filter(FeedbackReply.feedback_item_id == item.id)
.order_by(FeedbackReply.created_at.asc())
.all()
)
reply_user_ids = sorted({int(r.user_id) for r in replies})
reply_users = (
User.query.filter(User.id.in_(reply_user_ids)).all() if reply_user_ids else []
)
reply_user_map = {int(u.id): (u.username or "") for u in reply_users}
return render_template(
"main/feedback_detail.html",
item=item,
@@ -179,12 +194,46 @@ def feedback_detail(item_id: int):
resolved_by_name=resolved_by_name,
vote_count=int(vote_count),
user_voted=bool(user_voted),
replies=replies,
reply_user_map=reply_user_map,
)
@main_bp.route("/feedback/<int:item_id>/reply", methods=["POST"])
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_reply(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
if item.deleted_at is not None:
abort(404)
if (item.status or "").strip().lower() != "open":
flash("Only open feedback items can be replied to.", "warning")
return redirect(url_for("main.feedback_detail", item_id=item.id))
message = (request.form.get("message") or "").strip()
if not message:
flash("Reply message is required.", "danger")
return redirect(url_for("main.feedback_detail", item_id=item.id))
reply = FeedbackReply(
feedback_item_id=int(item.id),
user_id=int(current_user.id),
message=message,
created_at=datetime.utcnow(),
)
db.session.add(reply)
db.session.commit()
flash("Reply added.", "success")
return redirect(url_for("main.feedback_detail", item_id=item.id))
@main_bp.route("/feedback/<int:item_id>/vote", methods=["POST"])
@login_required
@roles_required("admin", "operator", "viewer")
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_vote(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
if item.deleted_at is not None:


@@ -1,7 +1,14 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime, _log_admin_event, _send_mail_message_eml_download
from ..email_utils import extract_best_html_from_eml, is_effectively_blank_html
from ..parsers.veeam import extract_vspc_active_alarms_companies
from ..models import MailObject
import time
import re
import html as _html
@main_bp.route("/inbox")
@login_required
@@ -69,6 +76,8 @@ def inbox():
has_prev=has_prev,
has_next=has_next,
customers=customer_rows,
can_bulk_delete=(get_active_role() in ("admin", "operator")),
is_admin=(get_active_role() == "admin"),
)
@@ -109,11 +118,24 @@ def inbox_message_detail(message_id: int):
),
}
if getattr(msg, "html_body", None):
body_html = msg.html_body
elif getattr(msg, "text_body", None):
def _is_blank_text(s):
return s is None or (isinstance(s, str) and s.strip() == "")
html_body = getattr(msg, "html_body", None)
text_body = getattr(msg, "text_body", None)
# For legacy messages: if the Graph body is empty/whitespace but the real report
# is an HTML attachment in the stored EML, extract and render it.
if is_effectively_blank_html(html_body) and _is_blank_text(text_body) and getattr(msg, "eml_blob", None):
extracted = extract_best_html_from_eml(getattr(msg, "eml_blob", None))
if extracted:
html_body = extracted
if not is_effectively_blank_html(html_body):
body_html = html_body
elif not _is_blank_text(text_body):
escaped = (
msg.text_body.replace("&", "&amp;")
text_body.replace("&", "&amp;")
.replace("<", "&lt;")
.replace(">", "&gt;")
)
@@ -132,7 +154,52 @@ def inbox_message_detail(message_id: int):
for obj in MailObject.query.filter_by(mail_message_id=msg.id).order_by(MailObject.object_name.asc()).all()
]
return jsonify({"status": "ok", "meta": meta, "body_html": body_html, "objects": objects})
# VSPC multi-company emails (e.g. "Active alarms summary") may not store parsed objects yet.
# Extract company names from the stored body so the UI can offer a dedicated mapping workflow.
vspc_companies: list[str] = []
vspc_company_defaults: dict[str, dict] = {}
try:
bsw = (getattr(msg, "backup_software", "") or "").strip().lower()
btype = (getattr(msg, "backup_type", "") or "").strip().lower()
jname = (getattr(msg, "job_name", "") or "").strip().lower()
if bsw == "veeam" and btype == "service provider console" and jname == "active alarms summary":
raw = text_body if not _is_blank_text(text_body) else (html_body or "")
vspc_companies = extract_vspc_active_alarms_companies(raw)
# For each company, prefill the UI with the existing customer mapping if we already have a job for it.
# This avoids re-mapping known companies and keeps the message actionable in the Inbox.
if vspc_companies:
for company in vspc_companies:
norm_from, store_backup, store_type, _store_job = build_job_match_key(msg)
company_job_name = f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip()
tmp_msg = MailMessage(
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
)
job = find_matching_job(tmp_msg)
if job and getattr(job, "customer_id", None):
c = Customer.query.get(int(job.customer_id))
if c:
vspc_company_defaults[company] = {
"customer_id": int(c.id),
"customer_name": c.name,
}
except Exception:
vspc_companies = []
vspc_company_defaults = {}
return jsonify({
"status": "ok",
"meta": meta,
"body_html": body_html,
"objects": objects,
"vspc_companies": vspc_companies,
"vspc_company_defaults": vspc_company_defaults,
})
@main_bp.route("/inbox/message/<int:message_id>/eml")
@@ -265,6 +332,409 @@ def inbox_message_approve(message_id: int):
return redirect(url_for("main.inbox"))
@main_bp.route("/inbox/<int:message_id>/approve_vspc_companies", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def inbox_message_approve_vspc_companies(message_id: int):
msg = MailMessage.query.get_or_404(message_id)
# Only allow approval from inbox
if getattr(msg, "location", "inbox") != "inbox":
flash("This message is no longer in the Inbox and cannot be approved here.", "warning")
return redirect(url_for("main.inbox"))
mappings_json = (request.form.get("company_mappings_json") or "").strip()
try:
mappings = json.loads(mappings_json) if mappings_json else []
except Exception:
flash("Invalid company mappings payload.", "danger")
return redirect(url_for("main.inbox"))
if mappings is None:
mappings = []
if not isinstance(mappings, list):
flash("Invalid company mappings payload.", "danger")
return redirect(url_for("main.inbox"))
# Validate message type (best-effort guard)
if (getattr(msg, "backup_software", None) or "").strip() != "Veeam" or (getattr(msg, "backup_type", None) or "").strip() != "Service Provider Console":
flash("This approval method is only valid for Veeam Service Provider Console summary emails.", "danger")
return redirect(url_for("main.inbox"))
# Determine companies present in this message (only alarms > 0).
html_body = getattr(msg, "html_body", None)
text_body = getattr(msg, "text_body", None)
raw_for_companies = text_body if (text_body and str(text_body).strip()) else (html_body or "")
companies_present = extract_vspc_active_alarms_companies(raw_for_companies)
if not companies_present:
flash("No companies could be detected in this VSPC summary email.", "danger")
return redirect(url_for("main.inbox"))
# Resolve existing mappings from already-created per-company jobs.
existing_map: dict[str, int] = {}
for company in companies_present:
norm_from, store_backup, store_type, _store_job = build_job_match_key(msg)
company_job_name = f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip()
tmp_msg = MailMessage(
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
)
job = find_matching_job(tmp_msg)
if job and getattr(job, "customer_id", None):
try:
existing_map[company] = int(job.customer_id)
except Exception:
pass
# Resolve mappings provided by the user from the popup.
provided_map: dict[str, int] = {}
for item in mappings:
if not isinstance(item, dict):
continue
company = (item.get("company") or "").strip()
customer_id_raw = str(item.get("customer_id") or "").strip()
if not company or not customer_id_raw:
continue
try:
customer_id = int(customer_id_raw)
except ValueError:
continue
customer = Customer.query.get(customer_id)
if not customer:
continue
provided_map[company] = int(customer.id)
# Persist mapping immediately by creating/updating the per-company job.
# This ensures already mapped companies are shown next time, even if approval is blocked.
norm_from, store_backup, store_type, _store_job = build_job_match_key(msg)
company_job_name = f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip()
tmp_msg = MailMessage(
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
)
job = find_matching_job(tmp_msg)
if job:
if job.customer_id != customer.id:
job.customer_id = customer.id
else:
job = Job(
customer_id=customer.id,
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
active=True,
auto_approve=True,
)
db.session.add(job)
db.session.flush()
# Commit any mapping updates so they are visible immediately in the UI.
try:
db.session.commit()
except Exception:
db.session.rollback()
flash("Could not save company mappings due to a database error.", "danger")
return redirect(url_for("main.inbox"))
# Final mapping resolution: existing job mappings + any newly provided ones.
final_map: dict[str, int] = dict(existing_map)
final_map.update(provided_map)
missing_companies = [c for c in companies_present if c not in final_map]
mapped_companies = [c for c in companies_present if c in final_map]
if not mapped_companies:
# Nothing to approve yet; user must map at least one company.
missing_str = ", ".join(missing_companies[:10])
if len(missing_companies) > 10:
missing_str += f" (+{len(missing_companies) - 10} more)"
flash(
(
"Please map at least one company before approving."
+ (f" Missing: {missing_str}" if missing_str else "")
),
"danger",
)
return redirect(url_for("main.inbox"))
def _is_error_status(value: str | None) -> bool:
v = (value or "").strip().lower()
return v in {"error", "failed", "critical"} or v.startswith("fail")
created_runs: list[JobRun] = []
skipped_existing = 0
first_job: Job | None = None
# Create runs for mapped companies only. If some companies remain unmapped,
# the message stays in the Inbox so the user can map the remainder later.
for company in mapped_companies:
customer_id = int(final_map[company])
customer = Customer.query.get(customer_id)
if not customer:
continue
norm_from, store_backup, store_type, _store_job = build_job_match_key(msg)
company_job_name = f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip()
tmp_msg = MailMessage(
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
)
job = find_matching_job(tmp_msg)
if job:
if job.customer_id != customer.id:
job.customer_id = customer.id
else:
job = Job(
customer_id=customer.id,
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
active=True,
auto_approve=True,
)
db.session.add(job)
db.session.flush()
if not first_job:
first_job = job
objs = (
MailObject.query.filter(MailObject.mail_message_id == msg.id)
.filter(MailObject.object_name.ilike(f"{company} | %"))
.all()
)
saw_error = any(_is_error_status(o.status) for o in objs)
saw_warning = any((o.status or "").strip().lower() == "warning" for o in objs)
status = "Error" if saw_error else ("Warning" if saw_warning else (msg.overall_status or "Success"))
# De-duplicate: do not create multiple runs for the same (mail_message_id, job_id).
run = JobRun.query.filter(JobRun.job_id == job.id, JobRun.mail_message_id == msg.id).first()
if run:
skipped_existing += 1
else:
run = JobRun(
job_id=job.id,
mail_message_id=msg.id,
run_at=(msg.received_at or getattr(msg, "parsed_at", None) or datetime.utcnow()),
status=status or None,
missed=False,
)
if hasattr(run, "remark"):
run.remark = getattr(msg, "overall_message", None)
db.session.add(run)
db.session.flush()
created_runs.append(run)
# Persist objects for reporting (idempotent upsert; safe to repeat).
try:
persist_objects_for_approved_run_filtered(
customer.id,
job.id,
run.id,
msg.id,
object_name_prefix=company,
strip_prefix=True,
)
except Exception as exc:
_log_admin_event(
"object_persist_error",
f"Filtered object persistence failed for message {msg.id} (company '{company}', job {job.id}, run {run.id}): {exc}",
)
processed_total = len(created_runs) + skipped_existing
if processed_total <= 0:
flash("No runs could be created for this VSPC summary.", "danger")
return redirect(url_for("main.inbox"))
# Commit created runs and any job mapping updates first.
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
flash("Could not approve this job due to a database error.", "danger")
_log_admin_event("inbox_approve_error", f"Failed to approve VSPC message {msg.id}: {exc}")
return redirect(url_for("main.inbox"))
if missing_companies:
# Keep message in Inbox until all companies are mapped, but keep the already
# created runs for mapped companies.
missing_str = ", ".join(missing_companies[:10])
if len(missing_companies) > 10:
missing_str += f" (+{len(missing_companies) - 10} more)"
_log_admin_event(
"inbox_approve_vspc_partial",
f"Partially approved VSPC message {msg.id}: {processed_total} run(s) processed, missing={missing_str}",
)
flash(
f"Approved {processed_total} mapped compan{'y' if processed_total == 1 else 'ies'}. Message stays in the Inbox until all companies are mapped. Missing: {missing_str}",
"warning",
)
return redirect(url_for("main.inbox"))
# All companies mapped: mark the message as approved and move it to History.
msg.job_id = first_job.id if first_job else None
if hasattr(msg, "approved"):
msg.approved = True
if hasattr(msg, "approved_at"):
msg.approved_at = datetime.utcnow()
if hasattr(msg, "approved_by_id"):
msg.approved_by_id = current_user.id
if hasattr(msg, "location"):
msg.location = "history"
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
flash("Could not finalize approval due to a database error.", "danger")
_log_admin_event("inbox_approve_error", f"Failed to finalize VSPC approval for message {msg.id}: {exc}")
return redirect(url_for("main.inbox"))
# Best-effort: now that company jobs are mapped, auto-approve other inbox
# messages of the same VSPC summary type whose companies are now all mapped.
retro_approved_msgs = 0
try:
q = MailMessage.query
if hasattr(MailMessage, "location"):
q = q.filter(MailMessage.location == "inbox")
q = q.filter(MailMessage.parse_result == "ok")
q = q.filter(MailMessage.job_id.is_(None))
q = q.filter(MailMessage.backup_software == "Veeam")
q = q.filter(MailMessage.backup_type == "Service Provider Console")
q = q.filter(MailMessage.job_name == (msg.job_name or "Active alarms summary"))
q = q.filter(MailMessage.id != msg.id)
candidates = q.order_by(MailMessage.received_at.desc().nullslast(), MailMessage.id.desc()).limit(25).all()
for other in candidates:
nested = db.session.begin_nested()
try:
raw_other = (other.text_body or "").strip() or (other.html_body or "")
companies = extract_vspc_active_alarms_companies(raw_other)
if not companies:
nested.commit()
continue
jobs_by_company: dict[str, Job] = {}
all_mapped = True
for company in companies:
norm_from, store_backup, store_type, _store_job = build_job_match_key(other)
company_job_name = f"{(other.job_name or 'Active alarms summary').strip()} | {company}".strip()
tmp = MailMessage(
from_address=norm_from,
backup_software=store_backup,
backup_type=store_type,
job_name=company_job_name,
)
with db.session.no_autoflush:
j = find_matching_job(tmp)
if not j or not getattr(j, "customer_id", None):
all_mapped = False
break
if hasattr(j, "active") and not bool(j.active):
all_mapped = False
break
if hasattr(j, "auto_approve") and not bool(j.auto_approve):
all_mapped = False
break
jobs_by_company[company] = j
if not all_mapped:
nested.commit()
continue
first_job2: Job | None = None
for company, job2 in jobs_by_company.items():
if not first_job2:
first_job2 = job2
objs2 = (
MailObject.query.filter(MailObject.mail_message_id == other.id)
.filter(MailObject.object_name.ilike(f"{company} | %"))
.all()
)
saw_error2 = any(_is_error_status(o.status) for o in objs2)
saw_warning2 = any((o.status or "").strip().lower() == "warning" for o in objs2)
status2 = "Error" if saw_error2 else ("Warning" if saw_warning2 else (other.overall_status or "Success"))
run2 = JobRun.query.filter(JobRun.job_id == job2.id, JobRun.mail_message_id == other.id).first()
if not run2:
run2 = JobRun(
job_id=job2.id,
mail_message_id=other.id,
run_at=(other.received_at or getattr(other, "parsed_at", None) or datetime.utcnow()),
status=status2 or None,
missed=False,
)
if hasattr(run2, "remark"):
run2.remark = getattr(other, "overall_message", None)
db.session.add(run2)
db.session.flush()
# Persist objects per company
try:
persist_objects_for_approved_run_filtered(
int(job2.customer_id),
int(job2.id),
int(run2.id),
int(other.id),
object_name_prefix=company,
strip_prefix=True,
)
except Exception as exc:
_log_admin_event(
"object_persist_error",
f"Filtered object persistence failed for message {other.id} (company '{company}', job {job2.id}, run {run2.id}): {exc}",
)
other.job_id = first_job2.id if first_job2 else None
if hasattr(other, "approved"):
other.approved = True
if hasattr(other, "approved_at"):
other.approved_at = datetime.utcnow()
if hasattr(other, "approved_by_id"):
other.approved_by_id = current_user.id
if hasattr(other, "location"):
other.location = "history"
nested.commit()
retro_approved_msgs += 1
except Exception:
try:
nested.rollback()
except Exception:
db.session.rollback()
db.session.commit()
except Exception:
try:
db.session.rollback()
except Exception:
pass
_log_admin_event(
"inbox_approve_vspc",
f"Approved VSPC message {msg.id} into {processed_total} run(s) (job_id={msg.job_id}), retro_approved={retro_approved_msgs}",
)
flash(f"Approved VSPC summary into {processed_total} run(s).", "success")
return redirect(url_for("main.inbox"))
@main_bp.route("/inbox/message/<int:message_id>/delete", methods=["POST"])
@login_required
@roles_required("admin", "operator")
@@ -296,6 +766,62 @@ def inbox_message_delete(message_id: int):
return redirect(url_for("main.inbox"))
@main_bp.post("/api/inbox/delete")
@login_required
@roles_required("admin", "operator")
def api_inbox_bulk_delete():
"""Bulk delete inbox messages (soft delete -> move to Deleted)."""
data = request.get_json(silent=True) or {}
message_ids = data.get("message_ids") or []
try:
message_ids = [int(x) for x in message_ids]
except Exception:
return jsonify({"status": "error", "message": "Invalid message_ids."}), 400
if not message_ids:
return jsonify({"status": "ok", "updated": 0, "skipped": 0, "missing": 0})
msgs = MailMessage.query.filter(MailMessage.id.in_(message_ids)).all()
msg_map = {int(m.id): m for m in msgs}
now = datetime.utcnow()
updated = 0
skipped = 0
missing = 0
for mid in message_ids:
msg = msg_map.get(int(mid))
if not msg:
missing += 1
continue
if getattr(msg, "location", "inbox") != "inbox":
skipped += 1
continue
if hasattr(msg, "location"):
msg.location = "deleted"
if hasattr(msg, "deleted_at"):
msg.deleted_at = now
if hasattr(msg, "deleted_by_user_id"):
msg.deleted_by_user_id = current_user.id
updated += 1
try:
db.session.commit()
except Exception as exc:
db.session.rollback()
_log_admin_event("inbox_bulk_delete_error", f"Failed to bulk delete inbox messages {message_ids}: {exc}")
return jsonify({"status": "error", "message": "Database error while deleting messages."}), 500
_log_admin_event("inbox_bulk_delete", f"Deleted inbox messages: {message_ids}")
return jsonify({"status": "ok", "updated": updated, "skipped": skipped, "missing": missing})
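The endpoint's three counters are a straight partition of the requested ids; in isolation (`locations` stands in for the queried messages, names are illustrative):

```python
def classify_bulk_delete(requested: list[int], locations: dict[int, str]) -> dict[str, int]:
    # Mirror the endpoint's counters: unknown ids are "missing",
    # ids outside the inbox are "skipped", the rest get soft-deleted ("updated").
    counts = {"updated": 0, "skipped": 0, "missing": 0}
    for mid in requested:
        loc = locations.get(mid)
        if loc is None:
            counts["missing"] += 1
        elif loc != "inbox":
            counts["skipped"] += 1
        else:
            counts["updated"] += 1
    return counts
```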
@main_bp.route("/inbox/deleted")
@login_required
@roles_required("admin")
@@ -456,6 +982,95 @@ def inbox_reparse_all():
and getattr(msg, "parse_result", None) == "ok"
and getattr(msg, "job_id", None) is None
):
# Special case: VSPC Active Alarms summary can contain multiple companies.
bsw = (getattr(msg, "backup_software", "") or "").strip().lower()
btype = (getattr(msg, "backup_type", "") or "").strip().lower()
jname = (getattr(msg, "job_name", "") or "").strip().lower()
if bsw == "veeam" and btype == "service provider console" and jname == "active alarms summary":
raw = (getattr(msg, "text_body", None) or "").strip() or (getattr(msg, "html_body", None) or "")
companies = extract_vspc_active_alarms_companies(raw)
if companies:
def _is_error_status(value: str | None) -> bool:
v = (value or "").strip().lower()
return v in {"error", "failed", "critical"} or v.startswith("fail")
first_job = None
mapped_count = 0
created_any = False
for company in companies:
tmp_msg = MailMessage(
from_address=msg.from_address,
backup_software=msg.backup_software,
backup_type=msg.backup_type,
job_name=f"{(msg.job_name or 'Active alarms summary').strip()} | {company}".strip(),
)
with db.session.no_autoflush:
job = find_matching_job(tmp_msg)
if not job:
continue
if hasattr(job, "active") and not bool(job.active):
continue
if hasattr(job, "auto_approve") and not bool(job.auto_approve):
continue
mapped_count += 1
objs = (
MailObject.query.filter(MailObject.mail_message_id == msg.id)
.filter(MailObject.object_name.ilike(f"{company} | %"))
.all()
)
saw_error = any(_is_error_status(o.status) for o in objs)
saw_warning = any((o.status or "").strip().lower() == "warning" for o in objs)
status = "Error" if saw_error else ("Warning" if saw_warning else (msg.overall_status or "Success"))
run = JobRun(
job_id=job.id,
mail_message_id=msg.id,
run_at=(msg.received_at or getattr(msg, "parsed_at", None) or datetime.utcnow()),
status=status or None,
missed=False,
)
if hasattr(run, "remark"):
run.remark = getattr(msg, "overall_message", None)
if hasattr(run, "storage_used_bytes") and hasattr(msg, "storage_used_bytes"):
run.storage_used_bytes = msg.storage_used_bytes
if hasattr(run, "storage_capacity_bytes") and hasattr(msg, "storage_capacity_bytes"):
run.storage_capacity_bytes = msg.storage_capacity_bytes
if hasattr(run, "storage_free_bytes") and hasattr(msg, "storage_free_bytes"):
run.storage_free_bytes = msg.storage_free_bytes
if hasattr(run, "storage_free_percent") and hasattr(msg, "storage_free_percent"):
run.storage_free_percent = msg.storage_free_percent
db.session.add(run)
db.session.flush()
auto_approved_runs.append((job.customer_id, job.id, run.id, msg.id))
created_any = True
if not first_job:
first_job = job
if created_any and mapped_count == len(companies):
msg.job_id = first_job.id if first_job else None
if hasattr(msg, "approved"):
msg.approved = True
if hasattr(msg, "approved_at"):
msg.approved_at = datetime.utcnow()
if hasattr(msg, "approved_by_id"):
msg.approved_by_id = None
if hasattr(msg, "location"):
msg.location = "history"
auto_approved += 1
# Do not fall back to single-job matching for VSPC summary.
continue
# Match approved job on: From + Backup + Type + Job name
# Prevent session autoflush for every match lookup while we
# are still updating many messages in a loop.
@@ -636,7 +1251,7 @@ def inbox_reparse_all():
persisted_errors = 0
for (customer_id, job_id, run_id, mail_message_id) in auto_approved_runs:
try:
persisted_objects += persist_objects_for_approved_run(
persisted_objects += persist_objects_for_auto_run(
customer_id, job_id, run_id, mail_message_id
)
except Exception as exc:


@@ -16,6 +16,7 @@ def jobs():
# Join with customers for display
jobs = (
Job.query
.filter(Job.archived.is_(False))
.outerjoin(Customer, Customer.id == Job.customer_id)
.add_columns(
Job.id,
@@ -55,6 +56,89 @@
)
@main_bp.route("/jobs/<int:job_id>/archive", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def archive_job(job_id: int):
job = Job.query.get_or_404(job_id)
if job.archived:
flash("Job is already archived.", "info")
return redirect(url_for("main.jobs"))
job.archived = True
job.archived_at = datetime.utcnow()
job.archived_by_user_id = current_user.id
db.session.commit()
try:
log_admin_event("job_archived", f"Archived job {job.id}", details=f"job_name={job.job_name}")
except Exception:
pass
flash("Job archived.", "success")
return redirect(url_for("main.jobs"))
@main_bp.route("/archived-jobs")
@login_required
@roles_required("admin")
def archived_jobs():
rows = (
Job.query
.filter(Job.archived.is_(True))
.outerjoin(Customer, Customer.id == Job.customer_id)
.add_columns(
Job.id,
Job.backup_software,
Job.backup_type,
Job.job_name,
Job.archived_at,
Customer.name.label("customer_name"),
)
.order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
.all()
)
out = []
for row in rows:
out.append(
{
"id": row.id,
"customer_name": getattr(row, "customer_name", "") or "",
"backup_software": row.backup_software or "",
"backup_type": row.backup_type or "",
"job_name": row.job_name or "",
"archived_at": _format_datetime(row.archived_at),
}
)
return render_template("main/archived_jobs.html", jobs=out)
@main_bp.route("/jobs/<int:job_id>/unarchive", methods=["POST"])
@login_required
@roles_required("admin")
def unarchive_job(job_id: int):
job = Job.query.get_or_404(job_id)
if not job.archived:
flash("Job is not archived.", "info")
return redirect(url_for("main.archived_jobs"))
job.archived = False
job.archived_at = None
job.archived_by_user_id = None
db.session.commit()
try:
log_admin_event("job_unarchived", f"Unarchived job {job.id}", details=f"job_name={job.job_name}")
except Exception:
pass
flash("Job restored.", "success")
return redirect(url_for("main.archived_jobs"))
@main_bp.route("/jobs/<int:job_id>")
@login_required
@roles_required("admin", "operator", "viewer")
@@ -114,8 +198,8 @@ def job_detail(job_id: int):
WHERE ts.job_id = :job_id
AND t.active_from_date <= :max_date
AND (
t.resolved_at IS NULL
OR ((t.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :min_date
COALESCE(ts.resolved_at, t.resolved_at) IS NULL
OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :min_date
)
"""
),
@@ -339,25 +423,17 @@ def job_delete(job_id: int):
job = Job.query.get_or_404(job_id)
try:
# Collect run ids for FK cleanup in auxiliary tables that may not have ON DELETE CASCADE
run_ids = []
mail_message_ids = []
# Collect run IDs up-front for cleanup across dependent tables
run_ids = [r.id for r in JobRun.query.filter_by(job_id=job.id).all()]
for run in job.runs:
if run.id is not None:
run_ids.append(run.id)
if run.mail_message_id:
mail_message_ids.append(run.mail_message_id)
# Put related mails back into the inbox and unlink from job
if mail_message_ids:
msgs = MailMessage.query.filter(MailMessage.id.in_(mail_message_ids)).all()
# Put any related mails back into the inbox and unlink from job
msgs = MailMessage.query.filter(MailMessage.job_id == job.id).all()
for msg in msgs:
if hasattr(msg, "location"):
msg.location = "inbox"
msg.job_id = None
# Ensure run_object_links doesn't block job_runs deletion (older schemas may miss ON DELETE CASCADE)
# Clean up tables that may not have ON DELETE CASCADE in older schemas.
if run_ids:
db.session.execute(
text("DELETE FROM run_object_links WHERE run_id IN :run_ids").bindparams(
@@ -365,14 +441,48 @@ def job_delete(job_id: int):
),
{"run_ids": run_ids},
)
db.session.execute(
text("DELETE FROM job_run_review_events WHERE run_id IN :run_ids").bindparams(
bindparam("run_ids", expanding=True)
),
{"run_ids": run_ids},
)
db.session.execute(
text("DELETE FROM ticket_job_runs WHERE job_run_id IN :run_ids").bindparams(
bindparam("run_ids", expanding=True)
),
{"run_ids": run_ids},
)
db.session.execute(
text("DELETE FROM remark_job_runs WHERE job_run_id IN :run_ids").bindparams(
bindparam("run_ids", expanding=True)
),
{"run_ids": run_ids},
)
db.session.execute(
text("DELETE FROM job_objects WHERE job_run_id IN :run_ids").bindparams(
bindparam("run_ids", expanding=True)
),
{"run_ids": run_ids},
)
# Overrides scoped to this job (object overrides)
db.session.execute(text("UPDATE overrides SET job_id = NULL WHERE job_id = :job_id"), {"job_id": job.id})
# Ticket/Remark scopes may reference a specific job
db.session.execute(text("UPDATE ticket_scopes SET job_id = NULL WHERE job_id = :job_id"), {"job_id": job.id})
db.session.execute(text("UPDATE remark_scopes SET job_id = NULL WHERE job_id = :job_id"), {"job_id": job.id})
# Ensure job_object_links doesn't block jobs deletion (older schemas may miss ON DELETE CASCADE)
if job.id is not None:
db.session.execute(
text("DELETE FROM job_object_links WHERE job_id = :job_id"),
{"job_id": job.id},
)
# Finally remove runs and the job itself
if run_ids:
db.session.execute(text("DELETE FROM job_runs WHERE job_id = :job_id"), {"job_id": job.id})
db.session.delete(job)
db.session.commit()
flash("Job deleted. Related mails are returned to the inbox.", "success")
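The cleanup statements above lean on SQLAlchemy's `bindparam(..., expanding=True)`, which expands a bound Python list into an `IN (...)` clause at execution time. With the plain DB-API you generate the placeholders yourself; a rough sqlite3 analogue (the table name is from the diff, the helper itself is illustrative):

```python
import sqlite3

def delete_run_links(conn: sqlite3.Connection, run_ids: list[int]) -> int:
    # One "?" per id -- the manual equivalent of what
    # bindparam("run_ids", expanding=True) renders for you.
    if not run_ids:
        return 0
    placeholders = ",".join("?" for _ in run_ids)
    cur = conn.execute(
        f"DELETE FROM run_object_links WHERE run_id IN ({placeholders})", run_ids
    )
    return cur.rowcount
```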


@@ -0,0 +1,139 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime
from datetime import datetime
from sqlalchemy import or_
def _parse_datetime_local(value: str):
if not value:
return None
value = value.strip()
if not value:
return None
try:
# Accept HTML datetime-local values like 2026-01-08T10:30
return datetime.fromisoformat(value)
except Exception:
return None
@main_bp.route("/admin/mails")
@login_required
@roles_required("admin")
def admin_all_mails():
try:
page = int(request.args.get("page", "1"))
except ValueError:
page = 1
if page < 1:
page = 1
per_page = 50
# Filters (AND-combined)
from_q = (request.args.get("from_q", "") or "").strip()
subject_q = (request.args.get("subject_q", "") or "").strip()
backup_q = (request.args.get("backup_q", "") or "").strip()
type_q = (request.args.get("type_q", "") or "").strip()
job_name_q = (request.args.get("job_name_q", "") or "").strip()
received_from_raw = (request.args.get("received_from", "") or "").strip()
received_to_raw = (request.args.get("received_to", "") or "").strip()
only_unlinked = (request.args.get("only_unlinked", "") or "").strip().lower() in (
"1",
"true",
"yes",
"on",
)
received_from = _parse_datetime_local(received_from_raw)
received_to = _parse_datetime_local(received_to_raw)
query = MailMessage.query
# Text filters
if from_q:
query = query.filter(MailMessage.from_address.ilike(f"%{from_q}%"))
if subject_q:
query = query.filter(MailMessage.subject.ilike(f"%{subject_q}%"))
if backup_q:
query = query.filter(MailMessage.backup_software.ilike(f"%{backup_q}%"))
if type_q:
query = query.filter(MailMessage.backup_type.ilike(f"%{type_q}%"))
if job_name_q:
# Prefer stored job_name on message, but also match linked job name
query = query.outerjoin(Job, Job.id == MailMessage.job_id).filter(
or_(
MailMessage.job_name.ilike(f"%{job_name_q}%"),
Job.job_name.ilike(f"%{job_name_q}%"),
)
)
# Time window
if received_from:
query = query.filter(MailMessage.received_at >= received_from)
if received_to:
query = query.filter(MailMessage.received_at <= received_to)
# Linked/unlinked
if only_unlinked:
query = query.filter(MailMessage.job_id.is_(None))
total_items = query.count()
total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
if page > total_pages:
page = total_pages
messages = (
query.order_by(
MailMessage.received_at.desc().nullslast(),
MailMessage.id.desc(),
)
.offset((page - 1) * per_page)
.limit(per_page)
.all()
)
rows = []
for msg in messages:
linked = bool(msg.job_id)
rows.append(
{
"id": msg.id,
"from_address": msg.from_address or "",
"subject": msg.subject or "",
"received_at": _format_datetime(msg.received_at),
"backup_software": msg.backup_software or "",
"backup_type": msg.backup_type or "",
"job_name": (msg.job_name or ""),
"linked": linked,
"parsed_at": _format_datetime(msg.parsed_at),
"overall_status": msg.overall_status or "",
"has_eml": bool(getattr(msg, "eml_stored_at", None)),
}
)
has_prev = page > 1
has_next = page < total_pages
filter_params = {
"from_q": from_q,
"subject_q": subject_q,
"backup_q": backup_q,
"type_q": type_q,
"job_name_q": job_name_q,
"received_from": received_from_raw,
"received_to": received_to_raw,
"only_unlinked": "1" if only_unlinked else "",
}
return render_template(
"main/admin_all_mail.html",
rows=rows,
page=page,
total_pages=total_pages,
has_prev=has_prev,
has_next=has_next,
filter_params=filter_params,
)
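The pagination arithmetic used by the route above can be isolated as a small sketch (function name is illustrative, not part of the app):

```python
import math

def clamp_page(page, total_items, per_page=50):
    # At least one page even when there are no rows; the requested page is
    # clamped into [1, total_pages]; the OFFSET derives from the clamped page.
    total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
    page = min(max(page, 1), total_pages)
    return page, total_pages, (page - 1) * per_page
```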


@@ -75,7 +75,16 @@ def overrides():
if ov.match_status:
crit.append(f"status == {ov.match_status}")
if ov.match_error_contains:
-crit.append(f"error contains '{ov.match_error_contains}'")
mode = (getattr(ov, "match_error_mode", None) or "contains").strip().lower()
if mode == "exact":
label = "error exact"
elif mode == "starts_with":
label = "error starts with"
elif mode == "ends_with":
label = "error ends with"
else:
label = "error contains"
crit.append(f"{label} '{ov.match_error_contains}'")
if crit:
scope = scope + " [" + ", ".join(crit) + "]"
@@ -95,6 +104,13 @@ def overrides():
"comment": ov.comment or "",
"match_status": ov.match_status or "",
"match_error_contains": ov.match_error_contains or "",
"match_error_mode": getattr(ov, "match_error_mode", None) or "",
"backup_software": ov.backup_software or "",
"backup_type": ov.backup_type or "",
"job_id": ov.job_id or "",
"object_name": ov.object_name or "",
"start_at_raw": (ov.start_at.strftime("%Y-%m-%dT%H:%M") if ov.start_at else ""),
"end_at_raw": (ov.end_at.strftime("%Y-%m-%dT%H:%M") if ov.end_at else ""),
}
)
@@ -126,6 +142,12 @@ def overrides_create():
match_status = (request.form.get("match_status") or "").strip() or None
match_error_contains = (request.form.get("match_error_contains") or "").strip() or None
match_error_mode = (request.form.get("match_error_mode") or "").strip().lower() or None
if match_error_contains:
if match_error_mode not in ("contains", "exact", "starts_with", "ends_with"):
match_error_mode = "contains"
else:
match_error_mode = None
start_at_str = request.form.get("start_at") or ""
end_at_str = request.form.get("end_at") or ""
@@ -159,6 +181,7 @@ def overrides_create():
object_name=object_name if level == "object" else None,
match_status=match_status,
match_error_contains=match_error_contains,
match_error_mode=match_error_mode,
treat_as_success=treat_as_success,
active=True,
comment=comment,
@@ -218,6 +241,12 @@ def overrides_update(override_id: int):
match_status = (request.form.get("match_status") or "").strip() or None
match_error_contains = (request.form.get("match_error_contains") or "").strip() or None
match_error_mode = (request.form.get("match_error_mode") or "").strip().lower() or None
if match_error_contains:
if match_error_mode not in ("contains", "exact", "starts_with", "ends_with"):
match_error_mode = "contains"
else:
match_error_mode = None
start_at_str = request.form.get("start_at") or ""
end_at_str = request.form.get("end_at") or ""
@@ -252,6 +281,7 @@ def overrides_update(override_id: int):
ov.object_name = object_name if level == "object" else None
ov.match_status = match_status
ov.match_error_contains = match_error_contains
ov.match_error_mode = match_error_mode
ov.treat_as_success = treat_as_success
ov.comment = comment
ov.start_at = start_at


@@ -1,94 +1,16 @@
from .routes_shared import * # noqa: F401,F403
# Keep the parser overview page in sync with the actual parser registry.
from ..parsers.registry import PARSER_DEFINITIONS
@main_bp.route("/parsers")
@login_required
@roles_required("admin")
def parsers_overview():
# Only show what is currently implemented in code.
# Currently implemented parsers:
# - 3CX (Backup Complete notifications)
# - Veeam (status mails in multiple variants)
parsers = [
{
"name": "3CX backup complete",
"backup_software": "3CX",
"backup_types": [],
"order": 10,
"enabled": True,
"match": {
"subject_regex": r"^3CX Notification:\s*Backup Complete\s*-\s*(.+)$",
},
"description": "Parses 3CX backup notifications (Backup Complete).",
"examples": [
{
"subject": "3CX Notification: Backup Complete - PBX01",
"from_address": "noreply@3cx.local",
"body_snippet": "Backup name: PBX01_2025-12-17.zip",
"parsed_result": {
"backup_software": "3CX",
"backup_type": "",
"job_name": "PBX01",
"objects": [
{
"name": "PBX01_2025-12-17.zip",
"status": "Success",
"error_message": "",
}
],
},
}
],
},
{
"name": "Veeam status mails",
"backup_software": "Veeam",
"backup_types": [
"Backup Job",
"Backup Copy Job",
"Replica Job",
"Replication job",
"Configuration Backup",
"Agent Backup job",
"Veeam Backup for Microsoft 365",
"Scale Out Back-up Repository",
],
"order": 20,
"enabled": True,
"match": {
"subject_regex": r"\[(Success|Warning|Failed)\]\s*(.+)$",
},
"description": "Parses Veeam status mails. Job name/type are preferably extracted from the HTML header to avoid subject suffix noise.",
"examples": [
{
"subject": "[Warning] Daily-VM-Backup (3 objects) 1 warning",
"from_address": "veeam@customer.local",
"body_snippet": "Backup job: Daily-VM-Backup\n...",
"parsed_result": {
"backup_software": "Veeam",
"backup_type": "Backup job",
"job_name": "Daily-VM-Backup",
"objects": [
{"name": "VM-APP01", "status": "Success", "error_message": ""},
{"name": "VM-DB01", "status": "Warning", "error_message": "Low disk space"},
],
},
},
{
"subject": "[Success] Offsite-Repository",
"from_address": "veeam@customer.local",
"body_snippet": "Backup Copy job: Offsite-Repository\n...",
"parsed_result": {
"backup_software": "Veeam",
"backup_type": "Backup Copy job",
"job_name": "Offsite-Repository",
"objects": [
{"name": "Backup Copy Chain", "status": "Success", "error_message": ""}
],
},
},
],
},
]
parsers = sorted(
PARSER_DEFINITIONS,
key=lambda p: (p.get("order", 9999), p.get("backup_software", ""), p.get("name", "")),
)
return render_template(
"main/parsers.html",


@@ -1,23 +1,13 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime
-@main_bp.route("/remarks/<int:remark_id>", methods=["GET", "POST"])
+@main_bp.route("/remarks/<int:remark_id>", methods=["GET"])
@login_required
@roles_required("admin", "operator", "viewer")
def remark_detail(remark_id: int):
remark = Remark.query.get_or_404(remark_id)
if request.method == "POST":
if get_active_role() not in ("admin", "operator"):
abort(403)
remark.body = (request.form.get("body") or "").strip() or ""
try:
db.session.commit()
flash("Remark updated.", "success")
except Exception as exc:
db.session.rollback()
flash(f"Failed to update remark: {exc}", "danger")
return redirect(url_for("main.remark_detail", remark_id=remark.id))
# Remark editing is disabled. Resolve the old remark and create a new one instead.
scopes = RemarkScope.query.filter(RemarkScope.remark_id == remark.id).order_by(RemarkScope.id.asc()).all()


@@ -1,16 +1,73 @@
from .routes_shared import * # noqa: F401,F403
from datetime import date, timedelta
from .routes_reporting_api import build_report_columns_meta, build_report_job_filters_meta
def get_default_report_period():
"""Return default report period (last 7 days)."""
period_end = date.today()
period_start = period_end - timedelta(days=7)
return period_start, period_end
def _safe_json_list(value):
if not value:
return []
try:
if isinstance(value, (list, tuple)):
return [int(v) for v in value]
return json.loads(value)
except Exception:
return []
def _safe_json_dict(value):
if not value:
return {}
if isinstance(value, dict):
return value
try:
return json.loads(value)
except Exception:
return {}
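The two defensive decoders above share one idea: never let a malformed stored value raise. A self-contained sketch of the list variant (the name mirrors the helper; the app is not assumed):

```python
import json

def safe_json_list(value):
    # A real list/tuple is normalized to ints, a JSON string is parsed,
    # and anything malformed collapses to [] instead of raising.
    if not value:
        return []
    try:
        if isinstance(value, (list, tuple)):
            return [int(v) for v in value]
        return json.loads(value)
    except Exception:
        return []
```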
def _build_report_item(r):
return {
"id": int(r.id),
"name": r.name or "",
"description": r.description or "",
"report_type": r.report_type,
"output_format": r.output_format,
"customer_scope": getattr(r, "customer_scope", "all") or "all",
"customer_ids": _safe_json_list(getattr(r, "customer_ids", None)),
"period_start": r.period_start.isoformat() if getattr(r, "period_start", None) else "",
"period_end": r.period_end.isoformat() if getattr(r, "period_end", None) else "",
"schedule": r.schedule or "",
"report_config": _safe_json_dict(getattr(r, "report_config", None)),
"created_at": r.created_at.isoformat() if getattr(r, "created_at", None) else "",
}
@main_bp.route("/reports")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def reports():
# Defaults are used by the Reports UI for quick testing. All values are UTC.
period_end = datetime.utcnow().replace(microsecond=0)
period_start = (period_end - timedelta(days=7)).replace(microsecond=0)
# Pre-render items so the page is usable even if JS fails to load/execute.
rows = (
db.session.query(ReportDefinition)
.order_by(ReportDefinition.created_at.desc())
.limit(200)
.all()
)
items = [_build_report_item(r) for r in rows]
period_start, period_end = get_default_report_period()
return render_template(
"main/reports.html",
initial_reports=items,
columns_meta=build_report_columns_meta(),
job_filters_meta=build_report_job_filters_meta(),
default_period_start=period_start.isoformat(),
default_period_end=period_end.isoformat(),
)
@@ -18,6 +75,48 @@ def reports():
@main_bp.route("/reports/new")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def reports_new():
-return render_template("main/reports_new.html")
# Preload customers so the form remains usable if JS fails to load/execute.
customers = (
db.session.query(Customer)
.filter(Customer.active.is_(True))
.order_by(Customer.name.asc())
.all()
)
customer_items = [{"id": int(c.id), "name": c.name or ""} for c in customers]
return render_template(
"main/reports_new.html",
initial_customers=customer_items,
columns_meta=build_report_columns_meta(),
job_filters_meta=build_report_job_filters_meta(),
is_edit=False,
initial_report=None,
)
@main_bp.route("/reports/<int:report_id>/edit")
@login_required
def reports_edit(report_id: int):
# Editing reports is limited to the same roles that can create them.
if get_active_role() not in ("admin", "operator", "reporter"):
return abort(403)
r = ReportDefinition.query.get_or_404(report_id)
customers = (
db.session.query(Customer)
.filter(Customer.active.is_(True))
.order_by(Customer.name.asc())
.all()
)
customer_items = [{"id": int(c.id), "name": c.name or ""} for c in customers]
return render_template(
"main/reports_new.html",
initial_customers=customer_items,
columns_meta=build_report_columns_meta(),
job_filters_meta=build_report_job_filters_meta(),
is_edit=True,
initial_report=_build_report_item(r),
)


@@ -1,5 +1,7 @@
from __future__ import annotations
import calendar
from datetime import date, datetime, time, timedelta, timezone
from flask import jsonify, render_template, request
@@ -13,19 +15,40 @@ from .routes_shared import (
_get_ui_timezone_name,
_get_or_create_settings,
_infer_schedule_map_from_runs,
_infer_monthly_schedule_from_runs,
_to_amsterdam_date,
main_bp,
roles_required,
get_active_role,
)
from ..database import db
-from ..models import Customer, Job, JobRun, JobRunReviewEvent, MailMessage, User
from ..email_utils import extract_best_html_from_eml, is_effectively_blank_html
from ..models import (
Customer,
Job,
JobObject,
JobRun,
JobRunReviewEvent,
MailMessage,
MailObject,
Override,
User,
)
# Grace window for matching real runs to an expected schedule slot.
# A run within +/- 1 hour of the inferred schedule time counts as fulfilling the slot.
MISSED_GRACE_WINDOW = timedelta(hours=1)
def _status_is_success(status: str | None) -> bool:
s = (status or "").strip().lower()
if not s:
return False
if "override" in s:
return True
return "success" in s
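The status check above treats any overridden status as a success. A standalone sketch of the same rule:

```python
def status_is_success(status):
    # Empty/None is never a success; "Success", "Success (override)" and
    # similar variants all count, case-insensitively.
    s = (status or "").strip().lower()
    if not s:
        return False
    return "override" in s or "success" in s
```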
def _utc_naive_from_local(dt_local: datetime) -> datetime:
"""Convert a timezone-aware local datetime to UTC naive, matching DB convention."""
if dt_local.tzinfo is None:
@@ -75,7 +98,13 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
"""
tz = _get_ui_timezone()
schedule_map = _infer_schedule_map_from_runs(job.id) or {}
-if not schedule_map:
has_weekly_times = any((schedule_map.get(i) or []) for i in range(7))
monthly = None
if not has_weekly_times:
monthly = _infer_monthly_schedule_from_runs(job.id)
if (not has_weekly_times) and (not monthly):
return 0
today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()
@@ -84,6 +113,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
now_local_dt = datetime.now(tz) if tz else datetime.utcnow()
now_utc_naive = _utc_naive_from_local(now_local_dt)
# Remove any previously generated missed runs in this date window.
# Missed runs must be based on learned schedule from real mail-reported runs.
try:
@@ -111,6 +142,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
inserted = 0
d = start_from
while d <= end_inclusive:
if not has_weekly_times:
break
weekday = d.weekday()
times = schedule_map.get(weekday) or []
if not times:
@@ -133,6 +166,15 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
slot_utc_naive = _utc_naive_from_local(local_dt)
# Do not mark as missed until the full grace window has passed.
if now_utc_naive <= (slot_utc_naive + MISSED_GRACE_WINDOW):
continue
# Consider any real run near the slot as fulfilling the schedule.
# Also avoid duplicates if a missed run already exists.
window_start = slot_utc_naive - MISSED_GRACE_WINDOW
@@ -168,6 +210,91 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
d = d + timedelta(days=1)
# Monthly expected slots (fallback when no stable weekly schedule is detected)
if (not has_weekly_times) and monthly:
try:
dom = int(monthly.get("day_of_month") or 0)
except Exception:
dom = 0
times = monthly.get("times") or []
if dom > 0 and times:
# Iterate months in the window [start_from, end_inclusive]
cur = date(start_from.year, start_from.month, 1)
end_marker = date(end_inclusive.year, end_inclusive.month, 1)
while cur <= end_marker:
try:
last_dom = calendar.monthrange(cur.year, cur.month)[1]
except Exception:
last_dom = 28
scheduled_dom = dom if dom <= last_dom else last_dom
scheduled_date = date(cur.year, cur.month, scheduled_dom)
if scheduled_date >= start_from and scheduled_date <= end_inclusive:
for hhmm in times:
hm = _parse_hhmm(hhmm)
if not hm:
continue
hh, mm = hm
local_dt = datetime.combine(scheduled_date, time(hour=hh, minute=mm))
if tz:
local_dt = local_dt.replace(tzinfo=tz)
# Only generate missed runs for past slots.
if local_dt > now_local_dt:
continue
slot_utc_naive = _utc_naive_from_local(local_dt)
# Do not mark as missed until the full grace window has passed.
if now_utc_naive <= (slot_utc_naive + MISSED_GRACE_WINDOW):
continue
window_start = slot_utc_naive - MISSED_GRACE_WINDOW
window_end = slot_utc_naive + MISSED_GRACE_WINDOW
exists = (
db.session.query(JobRun.id)
.filter(
JobRun.job_id == job.id,
JobRun.run_at.isnot(None),
or_(
and_(JobRun.missed.is_(False), JobRun.mail_message_id.isnot(None)),
and_(JobRun.missed.is_(True), JobRun.mail_message_id.is_(None)),
),
JobRun.run_at >= window_start,
JobRun.run_at <= window_end,
)
.first()
)
if exists:
continue
miss = JobRun(
job_id=job.id,
run_at=slot_utc_naive,
status="Missed",
missed=True,
remark=None,
mail_message_id=None,
)
db.session.add(miss)
inserted += 1
# Next month
if cur.month == 12:
cur = date(cur.year + 1, 1, 1)
else:
cur = date(cur.year, cur.month + 1, 1)
if inserted:
db.session.commit()
return inserted
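The monthly iteration above relies on two details: clamping the configured day-of-month to the month's actual length, and rolling over December. Both can be isolated as a sketch (function names are illustrative):

```python
import calendar
from datetime import date

def monthly_slot(year, month, day_of_month):
    # Clamp the configured day-of-month to the month's last day,
    # so "day 31" still yields a slot in February.
    last_dom = calendar.monthrange(year, month)[1]
    return date(year, month, min(day_of_month, last_dom))

def next_month_start(d):
    # Advance to the first day of the following month, rolling over December.
    if d.month == 12:
        return date(d.year + 1, 1, 1)
    return date(d.year, d.month + 1, 1)
```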
@@ -195,7 +322,7 @@ def run_checks_page():
)
last_reviewed_map = {int(jid): (dt if dt else None) for jid, dt in last_reviewed_rows}
-jobs = Job.query.all()
+jobs = Job.query.filter(Job.archived.is_(False)).all()
today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()
for job in jobs:
@@ -222,6 +349,7 @@ def run_checks_page():
)
.select_from(Job)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
)
# Runs to show in the overview: unreviewed (or all if admin toggle enabled)
@@ -408,8 +536,8 @@ def run_checks_page():
WHERE ts.job_id = :job_id
AND t.active_from_date <= :run_date
AND (
-t.resolved_at IS NULL
-OR ((t.resolved_at AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
+COALESCE(ts.resolved_at, t.resolved_at) IS NULL
+OR ((COALESCE(ts.resolved_at, t.resolved_at) AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date) >= :run_date
)
LIMIT 1
"""
@@ -511,10 +639,34 @@ def run_checks_details():
"subject": msg.subject or "",
"received_at": _format_datetime(msg.received_at),
}
-body_html = msg.html_body or ""
def _is_blank_text(s):
return s is None or (isinstance(s, str) and s.strip() == "")
html_body = getattr(msg, "html_body", None)
text_body = getattr(msg, "text_body", None)
# Keep Run Checks consistent with Inbox/All Mail: if the Graph body is empty but the
# real report is stored as an HTML attachment inside the EML, extract it.
if is_effectively_blank_html(html_body) and _is_blank_text(text_body) and getattr(msg, "eml_blob", None):
extracted = extract_best_html_from_eml(getattr(msg, "eml_blob", None))
if extracted:
html_body = extracted
if not is_effectively_blank_html(html_body):
body_html = html_body
elif not _is_blank_text(text_body):
escaped = (
text_body.replace("&", "&amp;")
.replace("<", "&lt;")
.replace(">", "&gt;")
)
body_html = f"<pre>{escaped}</pre>"
else:
body_html = "<p>No message content stored.</p>"
has_eml = bool(getattr(msg, "eml_stored_at", None))
objects_payload = []
# Preferred: read persisted objects for this run from run_object_links/customer_objects (Step 2).
try:
rows = (
db.session.execute(
@@ -545,7 +697,40 @@ def run_checks_details():
}
)
except Exception:
objects_payload = []
# Fallback for older data / during upgrades
try:
objects = run.objects.order_by(JobObject.object_name.asc()).all()
except Exception:
objects = list(run.objects or [])
for obj in objects:
objects_payload.append(
{
"name": obj.object_name,
"type": getattr(obj, "object_type", "") or "",
"status": obj.status or "",
"error_message": obj.error_message or "",
}
)
# If no run-linked objects exist yet, fall back to objects parsed/stored on the mail message.
if (not objects_payload) and msg:
try:
for mo in (
MailObject.query.filter_by(mail_message_id=msg.id)
.order_by(MailObject.object_name.asc())
.all()
):
objects_payload.append(
{
"name": mo.object_name or "",
"type": mo.object_type or "",
"status": mo.status or "",
"error_message": mo.error_message or "",
}
)
except Exception:
pass
status_display = run.status or "-"
try:
@@ -559,6 +744,7 @@ def run_checks_details():
"run_at": _format_datetime(run.run_at) if run.run_at else "-",
"status": status_display,
"remark": run.remark or "",
"overall_message": (getattr(msg, "overall_message", None) or "") if msg else "",
"missed": bool(run.missed),
"is_reviewed": bool(run.reviewed_at),
"reviewed_at": _format_datetime(run.reviewed_at) if (get_active_role() == "admin" and run.reviewed_at) else "",
@@ -691,3 +877,148 @@ def api_run_checks_unmark_reviewed():
db.session.commit()
return jsonify({"status": "ok", "updated": updated, "skipped": skipped})
@main_bp.post("/api/run-checks/mark-success-override")
@login_required
@roles_required("admin", "operator")
def api_run_checks_mark_success_override():
"""Create a time-bounded override so the selected run is treated as Success (override)."""
data = request.get_json(silent=True) or {}
try:
run_id = int(data.get("run_id") or 0)
except Exception:
run_id = 0
if run_id <= 0:
return jsonify({"status": "error", "message": "Invalid run_id."}), 400
run = JobRun.query.get_or_404(run_id)
job = Job.query.get_or_404(run.job_id)
# Do not allow overriding a missed placeholder run.
if bool(getattr(run, "missed", False)):
return jsonify({"status": "error", "message": "Missed runs cannot be marked as success."}), 400
# If it is already a success or already overridden, do nothing.
if bool(getattr(run, "override_applied", False)):
return jsonify({"status": "ok", "message": "Already overridden."})
if _status_is_success(getattr(run, "status", None)):
return jsonify({"status": "ok", "message": "Already successful."})
# Build a tight validity window around this run.
run_ts = getattr(run, "run_at", None) or getattr(run, "created_at", None) or datetime.utcnow()
start_at = run_ts - timedelta(minutes=1)
end_at = run_ts + timedelta(minutes=1)
comment = (data.get("comment") or "").strip()
if not comment:
# Keep it short and consistent; Operators will typically include a ticket number separately.
comment = "Marked as success from Run Checks"
comment = comment[:2000]
created_any = False
# Prefer object-level overrides (scoped to this job) to avoid impacting other jobs.
obj_rows = []
try:
obj_rows = (
db.session.execute(
text(
"""
SELECT
co.object_name AS object_name,
rol.status AS status,
rol.error_message AS error_message
FROM run_object_links rol
JOIN customer_objects co ON co.id = rol.customer_object_id
WHERE rol.run_id = :run_id
ORDER BY co.object_name ASC
"""
),
{"run_id": run.id},
)
.mappings()
.all()
)
except Exception:
obj_rows = []
def _obj_is_problem(status: str | None) -> bool:
s = (status or "").strip().lower()
if not s:
return False
if "success" in s:
return False
if "override" in s:
return False
return True
for rr in obj_rows or []:
obj_name = (rr.get("object_name") or "").strip()
obj_status = (rr.get("status") or "").strip()
if (not obj_name) or (not _obj_is_problem(obj_status)):
continue
err = (rr.get("error_message") or "").strip()
ov = Override(
level="object",
job_id=job.id,
object_name=obj_name,
match_status=(obj_status or None),
match_error_contains=(err[:255] if err else None),
match_error_mode=("contains" if err else None),
treat_as_success=True,
active=True,
comment=comment,
created_by=current_user.username,
start_at=start_at,
end_at=end_at,
)
db.session.add(ov)
created_any = True
# If we couldn't build a safe object-scoped override, fall back to a very tight global override.
if not created_any:
match_error_contains = (getattr(run, "remark", None) or "").strip()
if not match_error_contains:
# As a last resort, try to match any error message from legacy objects.
try:
objs = list(run.objects) if hasattr(run, "objects") else []
except Exception:
objs = []
for obj in objs or []:
em = (getattr(obj, "error_message", None) or "").strip()
if em:
match_error_contains = em
break
ov = Override(
level="global",
backup_software=job.backup_software or None,
backup_type=job.backup_type or None,
match_status=(getattr(run, "status", None) or None),
match_error_contains=(match_error_contains[:255] if match_error_contains else None),
match_error_mode=("contains" if match_error_contains else None),
treat_as_success=True,
active=True,
comment=comment,
created_by=current_user.username,
start_at=start_at,
end_at=end_at,
)
db.session.add(ov)
created_any = True
db.session.commit()
# Recompute flags so the overview and modal reflect the override immediately.
try:
from .routes_shared import _recompute_override_flags_for_runs
_recompute_override_flags_for_runs(job_ids=[job.id], start_at=start_at, end_at=end_at, only_unreviewed=False)
except Exception:
pass
return jsonify({"status": "ok", "message": "Override created."})
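The "tight validity window" used by this endpoint is just the run timestamp plus/minus one minute. A minimal sketch (name is illustrative):

```python
from datetime import datetime, timedelta

def override_window(run_ts, slack=timedelta(minutes=1)):
    # A +/- 1 minute window pins the override to this specific run without
    # catching earlier or later runs of the same job.
    return run_ts - slack, run_ts + slack
```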


@@ -169,7 +169,7 @@ def settings_objects_backfill():
for r in rows:
try:
-repaired_objects += persist_objects_for_approved_run(
+repaired_objects += persist_objects_for_auto_run(
int(r[2]), int(r[1]), int(r[0]), int(r[3])
)
repaired_runs += 1
@@ -586,6 +586,15 @@ def settings():
news_admin_stats = {}
users = User.query.order_by(User.username.asc()).all()
# Count users that have 'admin' among their assigned roles (comma-separated storage)
admin_users_count = 0
try:
admin_users_count = sum(1 for u in users if "admin" in (getattr(u, "roles", None) or []))
except Exception:
admin_users_count = 0
return render_template(
"main/settings.html",
settings=settings,
@@ -594,7 +603,8 @@ def settings():
free_disk_warning=free_disk_warning,
has_client_secret=has_client_secret,
tz_options=tz_options,
-users=User.query.order_by(User.username.asc()).all(),
+users=users,
admin_users_count=admin_users_count,
section=section,
news_admin_items=news_admin_items,
news_admin_stats=news_admin_stats,
@@ -915,6 +925,53 @@ def settings_users_reset_password(user_id: int):
return redirect(url_for("main.settings", section="users"))
@main_bp.route("/settings/users/<int:user_id>/roles", methods=["POST"])
@login_required
@roles_required("admin")
def settings_users_update_roles(user_id: int):
user = User.query.get_or_404(user_id)
roles = [r.strip() for r in request.form.getlist("roles") if (r or "").strip()]
roles = list(dict.fromkeys(roles))
if not roles:
roles = ["viewer"]
# Prevent removing the last remaining admin role
removing_admin = ("admin" in user.roles) and ("admin" not in roles)
if removing_admin:
try:
all_users = User.query.all()
admin_count = sum(1 for u in all_users if "admin" in (getattr(u, "roles", None) or []))
except Exception:
admin_count = 0
if admin_count <= 1:
flash("Cannot remove admin role from the last admin account.", "danger")
return redirect(url_for("main.settings", section="users"))
old_roles = ",".join(user.roles)
new_roles = ",".join(roles)
user.role = new_roles
try:
db.session.commit()
flash(f"Roles for '{user.username}' have been updated.", "success")
_log_admin_event("user_update_roles", f"User '{user.username}' roles changed from '{old_roles}' to '{new_roles}'.")
# If the updated user is currently logged in, make sure the active role stays valid.
try:
if getattr(current_user, "id", None) == user.id:
current_user.set_active_role(user.roles[0])
except Exception:
pass
except Exception as exc:
db.session.rollback()
print(f"[settings-users] Failed to update roles: {exc}")
flash("Failed to update roles.", "danger")
return redirect(url_for("main.settings", section="users"))
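The role normalization at the top of this handler (trim, drop empties, order-preserving dedupe via `dict.fromkeys`, viewer fallback) can be sketched standalone:

```python
def normalize_roles(raw_roles):
    # Trim entries, drop empties, dedupe while preserving submission order,
    # and fall back to the least-privileged role when nothing remains.
    roles = [r.strip() for r in raw_roles if (r or "").strip()]
    return list(dict.fromkeys(roles)) or ["viewer"]
```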
@main_bp.route("/settings/users/<int:user_id>/delete", methods=["POST"])
@login_required
@roles_required("admin")
@@ -922,8 +979,13 @@ def settings_users_delete(user_id: int):
user = User.query.get_or_404(user_id)
# Prevent deleting the last admin user
-if user.role == "admin":
-admin_count = User.query.filter_by(role="admin").count()
+if "admin" in user.roles:
try:
all_users = User.query.all()
admin_count = sum(1 for u in all_users if "admin" in (getattr(u, "roles", None) or []))
except Exception:
admin_count = 0
if admin_count <= 1:
flash("Cannot delete the last admin account.", "danger")
return redirect(url_for("main.settings", section="general"))
@@ -995,7 +1057,7 @@ def settings_mail_import():
persisted_errors = 0
for (customer_id, job_id, run_id, mail_message_id) in auto_approved_runs:
try:
-persisted_objects += persist_objects_for_approved_run(
+persisted_objects += persist_objects_for_auto_run(
int(customer_id), int(job_id), int(run_id), int(mail_message_id)
)
except Exception as exc:


@@ -6,6 +6,7 @@ import json
import re
import html as _html
import math
import calendar
import datetime as datetime_module
from functools import wraps
@@ -50,6 +51,7 @@ from ..models import (
RemarkJobRun,
FeedbackItem,
FeedbackVote,
FeedbackReply,
NewsItem,
NewsRead,
ReportDefinition,
@@ -58,7 +60,7 @@ from ..models import (
)
from ..mail_importer import run_manual_import, MailImportError
from ..parsers import parse_mail_message
-from ..object_persistence import persist_objects_for_approved_run
+from ..object_persistence import persist_objects_for_approved_run, persist_objects_for_auto_run
main_bp = Blueprint("main", __name__)
@@ -291,7 +293,8 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
try:
mec = (getattr(ov, "match_error_contains", None) or "").strip()
if mec:
-parts.append(f"contains={mec}")
+mem = (getattr(ov, "match_error_mode", None) or "contains").strip()
+parts.append(f"error_{mem}={mec}")
except Exception:
pass
try:
@@ -340,6 +343,40 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
return False
return needle.lower() in haystack.lower()
def _matches_error_text(haystack: str | None, needle: str | None, mode: str | None) -> bool:
"""Match error text using a configured mode.
Modes:
- contains (default)
- exact
- starts_with
- ends_with
Matching is case-insensitive and trims surrounding whitespace.
"""
if not needle:
return True
if not haystack:
return False
hs = (haystack or "").strip()
nd = (needle or "").strip()
if not hs:
return False
hs_l = hs.lower()
nd_l = nd.lower()
m = (mode or "contains").strip().lower()
if m == "exact":
return hs_l == nd_l
if m in ("starts_with", "startswith", "start"):
return hs_l.startswith(nd_l)
if m in ("ends_with", "endswith", "end"):
return hs_l.endswith(nd_l)
# Default/fallback
return nd_l in hs_l
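The matcher above, reproduced as a self-contained sketch (same semantics; the name is illustrative), makes the mode behavior easy to verify:

```python
def matches_error_text(haystack, needle, mode=None):
    # No needle configured -> matches anything; an empty haystack never matches.
    # Comparison is case-insensitive after trimming surrounding whitespace.
    if not needle:
        return True
    hs = (haystack or "").strip().lower()
    nd = needle.strip().lower()
    if not hs:
        return False
    m = (mode or "contains").strip().lower()
    if m == "exact":
        return hs == nd
    if m in ("starts_with", "startswith", "start"):
        return hs.startswith(nd)
    if m in ("ends_with", "endswith", "end"):
        return hs.endswith(nd)
    return nd in hs  # default/fallback: contains
```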
def _matches_status(candidate: str | None, expected: str | None) -> bool:
if not expected:
return True
@@ -408,12 +445,12 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
# Global overrides should match both the run-level remark and any object-level error messages.
if ov.match_error_contains:
-if _contains(run.remark, ov.match_error_contains):
+if _matches_error_text(run.remark, ov.match_error_contains, getattr(ov, "match_error_mode", None)):
return True
# Check persisted run-object error messages.
for row in run_object_rows or []:
-if _contains(row.get("error_message"), ov.match_error_contains):
+if _matches_error_text(row.get("error_message"), ov.match_error_contains, getattr(ov, "match_error_mode", None)):
return True
objs = []
@@ -422,7 +459,7 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
except Exception:
objs = []
for obj in objs or []:
-if _contains(getattr(obj, "error_message", None), ov.match_error_contains):
+if _matches_error_text(getattr(obj, "error_message", None), ov.match_error_contains, getattr(ov, "match_error_mode", None)):
return True
return False
@@ -436,7 +473,7 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
continue
if not _matches_status(row.get("status"), ov.match_status):
continue
-if not _contains(row.get("error_message"), ov.match_error_contains):
+if not _matches_error_text(row.get("error_message"), ov.match_error_contains, getattr(ov, "match_error_mode", None)):
continue
return True
@@ -451,7 +488,7 @@ def _apply_overrides_to_run(job: Job, run: JobRun):
continue
if not _matches_status(getattr(obj, "status", None), ov.match_status):
continue
-if not _contains(getattr(obj, "error_message", None), ov.match_error_contains):
+if not _matches_error_text(getattr(obj, "error_message", None), ov.match_error_contains, getattr(ov, "match_error_mode", None)):
continue
return True
@@ -611,7 +648,13 @@ def _infer_schedule_map_from_runs(job_id: int):
"""Infer weekly schedule blocks (15-min) from historical runs.
Returns dict weekday->sorted list of 'HH:MM' strings in configured UI local time.
Notes:
- Only considers real runs that came from mail reports (mail_message_id is not NULL).
- Synthetic missed rows never influence schedule inference.
- To reduce noise, a weekday/time bucket must occur at least MIN_OCCURRENCES times.
"""
MIN_OCCURRENCES = 3
schedule = {i: [] for i in range(7)} # 0=Mon .. 6=Sun
# Certain job types are informational and should never participate in schedule
@@ -627,10 +670,15 @@ def _infer_schedule_map_from_runs(job_id: int):
return schedule
if bs == 'synology' and bt == 'account protection':
return schedule
if bs == 'synology' and bt == 'updates':
return schedule
if bs == 'qnap' and bt == 'firmware update':
return schedule
if bs == 'syncovery' and bt == 'syncovery':
return schedule
except Exception:
pass
try:
# Only infer schedules from real runs that came from mail reports.
# Synthetic "Missed" rows must never influence schedule inference.
@ -652,13 +700,13 @@ def _infer_schedule_map_from_runs(job_id: int):
if not runs:
return schedule
# Convert run_at to UI local time and bucket into 15-minute blocks
# Convert run_at to UI local time and bucket into 15-minute blocks.
try:
tz = _get_ui_timezone()
except Exception:
tz = None
seen = {i: set() for i in range(7)}
counts = {i: {} for i in range(7)} # weekday -> { "HH:MM": count }
for r in runs:
if not r.run_at:
continue
@ -677,14 +725,139 @@ def _infer_schedule_map_from_runs(job_id: int):
minute_bucket = (dt.minute // 15) * 15
hh = dt.hour
tstr = f"{hh:02d}:{minute_bucket:02d}"
seen[wd].add(tstr)
counts[wd][tstr] = int(counts[wd].get(tstr, 0)) + 1
for wd in range(7):
schedule[wd] = sorted(seen[wd])
# Keep only buckets that occur frequently enough.
keep = [t for t, c in counts[wd].items() if int(c) >= MIN_OCCURRENCES]
schedule[wd] = sorted(keep)
return schedule
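The counting change above can be exercised in isolation; `bucket_runs` is a hypothetical stand-alone version of the loop (timezone conversion and the mail-based-run filter omitted):

```python
from datetime import datetime

MIN_OCCURRENCES = 3  # same threshold as above

def bucket_runs(run_times):
    # weekday -> { "HH:MM": count }, bucketed into 15-minute blocks
    counts = {i: {} for i in range(7)}
    for dt in run_times:
        tstr = f"{dt.hour:02d}:{(dt.minute // 15) * 15:02d}"
        counts[dt.weekday()][tstr] = counts[dt.weekday()].get(tstr, 0) + 1
    # keep only buckets that occur frequently enough
    return {wd: sorted(t for t, c in counts[wd].items() if c >= MIN_OCCURRENCES)
            for wd in range(7)}
```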
def _infer_monthly_schedule_from_runs(job_id: int):
"""Infer a monthly schedule from historical runs.
Returns:
dict with keys:
- day_of_month (int)
- times (list[str] of 'HH:MM' 15-min buckets)
or None if not enough evidence.
Rules:
- Uses only real mail-based runs (mail_message_id is not NULL) and excludes synthetic missed rows.
- Requires at least MIN_OCCURRENCES occurrences for the inferred day-of-month.
- Uses a simple cadence heuristic: typical gaps between runs must be >= 20 days to qualify as monthly.
"""
MIN_OCCURRENCES = 3
try:
# Same "real run" rule as weekly inference.
runs = (
JobRun.query
.filter(
JobRun.job_id == job_id,
JobRun.run_at.isnot(None),
JobRun.missed.is_(False),
JobRun.mail_message_id.isnot(None),
)
.order_by(JobRun.run_at.asc())
.limit(500)
.all()
)
except Exception:
runs = []
if len(runs) < MIN_OCCURRENCES:
return None
try:
tz = _get_ui_timezone()
except Exception:
tz = None
# Convert and keep local datetimes.
local_dts = []
for r in runs:
if not r.run_at:
continue
dt = r.run_at
if tz is not None:
try:
if dt.tzinfo is None:
dt = dt.replace(tzinfo=datetime_module.timezone.utc).astimezone(tz)
else:
dt = dt.astimezone(tz)
except Exception:
pass
local_dts.append(dt)
if len(local_dts) < MIN_OCCURRENCES:
return None
# Cadence heuristic: monthly jobs shouldn't look weekly.
local_dts_sorted = sorted(local_dts)
gaps = []
for i in range(1, len(local_dts_sorted)):
try:
delta_days = (local_dts_sorted[i] - local_dts_sorted[i - 1]).total_seconds() / 86400.0
if delta_days > 0:
gaps.append(delta_days)
except Exception:
continue
if gaps:
gaps_sorted = sorted(gaps)
median_gap = gaps_sorted[len(gaps_sorted) // 2]
# If it looks like a weekly/daily cadence, do not classify as monthly.
if median_gap < 20.0:
return None
# Count day-of-month occurrences and time buckets on that day.
dom_counts = {}
time_counts_by_dom = {} # dom -> { "HH:MM": count }
for dt in local_dts:
dom = int(dt.day)
dom_counts[dom] = int(dom_counts.get(dom, 0)) + 1
minute_bucket = (dt.minute // 15) * 15
tstr = f"{int(dt.hour):02d}:{int(minute_bucket):02d}"
if dom not in time_counts_by_dom:
time_counts_by_dom[dom] = {}
time_counts_by_dom[dom][tstr] = int(time_counts_by_dom[dom].get(tstr, 0)) + 1
# Pick the most common day-of-month with enough occurrences.
best_dom = None
best_dom_count = 0
for dom, c in dom_counts.items():
if int(c) >= MIN_OCCURRENCES and int(c) > best_dom_count:
best_dom = int(dom)
best_dom_count = int(c)
if best_dom is None:
return None
# Times on that day must also be stable. Keep frequent buckets; otherwise fall back to the top bucket.
time_counts = time_counts_by_dom.get(best_dom) or {}
keep_times = [t for t, c in time_counts.items() if int(c) >= MIN_OCCURRENCES]
if not keep_times:
# Fallback: choose the single most common time bucket for that day.
best_t = None
best_c = 0
for t, c in time_counts.items():
if int(c) > best_c:
best_t = t
best_c = int(c)
if best_t:
keep_times = [best_t]
keep_times = sorted(set(keep_times))
if not keep_times:
return None
return {"day_of_month": int(best_dom), "times": keep_times}
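The cadence heuristic reduces to a median of day gaps; a stand-alone sketch (`median_gap_days` is a hypothetical name):

```python
from datetime import datetime

def median_gap_days(dts):
    # Median gap in days between consecutive runs; the inference above
    # refuses to classify a job as monthly when this is below 20 days.
    dts = sorted(dts)
    gaps = [(b - a).total_seconds() / 86400.0 for a, b in zip(dts, dts[1:]) if b > a]
    if not gaps:
        return None
    gaps.sort()
    return gaps[len(gaps) // 2]
```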
def _schedule_map_to_desc(schedule_map):
weekday_names = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
any_times = any(schedule_map.get(i) for i in range(7))


@ -270,23 +270,13 @@ def tickets_page():
)
@main_bp.route("/tickets/<int:ticket_id>", methods=["GET", "POST"])
@main_bp.route("/tickets/<int:ticket_id>", methods=["GET"])
@login_required
@roles_required("admin", "operator", "viewer")
def ticket_detail(ticket_id: int):
ticket = Ticket.query.get_or_404(ticket_id)
if request.method == "POST":
if get_active_role() not in ("admin", "operator"):
abort(403)
ticket.description = (request.form.get("description") or "").strip() or None
try:
db.session.commit()
flash("Ticket updated.", "success")
except Exception as exc:
db.session.rollback()
flash(f"Failed to update ticket: {exc}", "danger")
return redirect(url_for("main.ticket_detail", ticket_id=ticket.id))
# Ticket editing is disabled. Resolve the old ticket and create a new one instead.
# Scopes
scopes = TicketScope.query.filter(TicketScope.ticket_id == ticket.id).order_by(TicketScope.id.asc()).all()


@ -378,7 +378,7 @@ def migrate_remarks_active_from_date() -> None:
def migrate_overrides_match_columns() -> None:
"""Add match_status and match_error_contains columns to overrides table if missing."""
"""Add match_status / match_error columns to overrides table if missing."""
engine = db.get_engine()
inspector = inspect(engine)
try:
@ -397,6 +397,25 @@ def migrate_overrides_match_columns() -> None:
print("[migrations] Adding overrides.match_error_contains column...")
conn.execute(text('ALTER TABLE "overrides" ADD COLUMN match_error_contains VARCHAR(255)'))
if "match_error_mode" not in existing_columns:
print("[migrations] Adding overrides.match_error_mode column...")
conn.execute(text('ALTER TABLE "overrides" ADD COLUMN match_error_mode VARCHAR(20)'))
# Backfill mode for existing overrides that already have a match string.
try:
conn.execute(
text(
"""
UPDATE "overrides"
SET match_error_mode = 'contains'
WHERE (match_error_mode IS NULL OR match_error_mode = '')
AND (match_error_contains IS NOT NULL AND match_error_contains <> '');
"""
)
)
except Exception:
pass
print("[migrations] migrate_overrides_match_columns completed.")
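The migration follows a check-then-alter pattern: inspect the existing columns first, then ALTER TABLE only for what is missing, so re-running it is a no-op. The app does this with SQLAlchemy against PostgreSQL; a minimal stdlib illustration of the same idea against SQLite:

```python
import sqlite3

def add_column_if_missing(conn, table, column, ddl_type):
    # PRAGMA table_info rows look like (cid, name, type, notnull, dflt_value, pk)
    existing = {row[1] for row in conn.execute(f'PRAGMA table_info("{table}")')}
    if column in existing:
        return False  # idempotent: nothing to do on a second run
    conn.execute(f'ALTER TABLE "{table}" ADD COLUMN {column} {ddl_type}')
    return True
```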
@ -772,16 +791,92 @@ def run_migrations() -> None:
migrate_mail_objects_table()
migrate_object_persistence_tables()
migrate_feedback_tables()
migrate_feedback_replies_table()
migrate_tickets_active_from_date()
migrate_remarks_active_from_date()
migrate_overrides_match_columns()
migrate_job_runs_review_tracking()
migrate_job_runs_override_metadata()
migrate_jobs_archiving()
migrate_news_tables()
migrate_reporting_tables()
migrate_reporting_report_config()
print("[migrations] All migrations completed.")
def migrate_jobs_archiving() -> None:
"""Add archiving columns to jobs if missing.
Columns:
- jobs.archived (BOOLEAN NOT NULL DEFAULT FALSE)
- jobs.archived_at (TIMESTAMP NULL)
- jobs.archived_by_user_id (INTEGER NULL)
"""
table = "jobs"
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for jobs archiving migration: {exc}")
return
inspector = inspect(engine)
try:
existing_columns = {col["name"] for col in inspector.get_columns(table)}
except Exception as exc:
print(f"[migrations] {table} table not found for jobs archiving migration, skipping: {exc}")
return
with engine.begin() as conn:
if "archived" not in existing_columns:
print('[migrations] Adding jobs.archived column...')
conn.execute(text('ALTER TABLE "jobs" ADD COLUMN archived BOOLEAN NOT NULL DEFAULT FALSE'))
if "archived_at" not in existing_columns:
print('[migrations] Adding jobs.archived_at column...')
conn.execute(text('ALTER TABLE "jobs" ADD COLUMN archived_at TIMESTAMP'))
if "archived_by_user_id" not in existing_columns:
print('[migrations] Adding jobs.archived_by_user_id column...')
conn.execute(text('ALTER TABLE "jobs" ADD COLUMN archived_by_user_id INTEGER'))
print("[migrations] migrate_jobs_archiving completed.")
def migrate_reporting_report_config() -> None:
"""Add report_definitions.report_config column if missing.
Stores JSON config for reporting UI (selected columns, charts, filters, templates).
"""
table = "report_definitions"
column = "report_config"
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for reporting report_config migration: {exc}")
return
inspector = inspect(engine)
try:
existing_columns = {col["name"] for col in inspector.get_columns(table)}
except Exception as exc:
print(f"[migrations] {table} table not found for report_config migration, skipping: {exc}")
return
if column in existing_columns:
print("[migrations] report_definitions.report_config already exists, skipping.")
return
print("[migrations] Adding report_definitions.report_config column...")
with engine.begin() as conn:
conn.execute(text('ALTER TABLE "report_definitions" ADD COLUMN report_config TEXT'))
print("[migrations] migrate_reporting_report_config completed.")
def migrate_job_runs_override_metadata() -> None:
"""Add override metadata columns to job_runs for reporting.
@ -903,6 +998,40 @@ def migrate_job_runs_review_tracking() -> None:
print("[migrations] migrate_job_runs_review_tracking completed.")
def migrate_feedback_replies_table() -> None:
"""Ensure feedback reply table exists.
Table:
- feedback_replies (messages on open feedback items)
"""
engine = db.get_engine()
with engine.begin() as conn:
conn.execute(
text(
"""
CREATE TABLE IF NOT EXISTS feedback_replies (
id SERIAL PRIMARY KEY,
feedback_item_id INTEGER NOT NULL REFERENCES feedback_items(id) ON DELETE CASCADE,
user_id INTEGER NOT NULL REFERENCES users(id) ON DELETE CASCADE,
message TEXT NOT NULL,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
"""
)
)
conn.execute(
text(
"""
CREATE INDEX IF NOT EXISTS idx_feedback_replies_item_created_at
ON feedback_replies (feedback_item_id, created_at);
"""
)
)
print("[migrations] Feedback replies table ensured.")
def migrate_tickets_active_from_date() -> None:
"""Ensure tickets.active_from_date exists and is populated.
@ -1062,6 +1191,39 @@ def migrate_object_persistence_tables() -> None:
'''
)
)
# Ensure existing installations also have ON DELETE CASCADE on customer_objects.customer_id.
# Older schemas created the FK without cascade, which blocks deleting customers.
conn.execute(
text(
'''
DO $$
BEGIN
IF EXISTS (
SELECT 1
FROM information_schema.table_constraints tc
WHERE tc.table_name = 'customer_objects'
AND tc.constraint_type = 'FOREIGN KEY'
AND tc.constraint_name = 'customer_objects_customer_id_fkey'
) THEN
ALTER TABLE customer_objects
DROP CONSTRAINT customer_objects_customer_id_fkey;
END IF;
-- Recreate with cascade (idempotent via the drop above)
ALTER TABLE customer_objects
ADD CONSTRAINT customer_objects_customer_id_fkey
FOREIGN KEY (customer_id)
REFERENCES customers(id)
ON DELETE CASCADE;
EXCEPTION
WHEN duplicate_object THEN
-- Constraint already exists with the correct name.
NULL;
END $$;
'''
)
)
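The DO $$ block exists because older schemas created the FK without ON DELETE CASCADE, which made customer deletes fail. What the cascade buys can be shown with a minimal SQLite session (the real migration runs the PostgreSQL DDL above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
conn.execute("""
    CREATE TABLE customer_objects (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL
            REFERENCES customers(id) ON DELETE CASCADE,
        object_name TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO customers (id) VALUES (1)")
conn.execute("INSERT INTO customer_objects (customer_id, object_name) VALUES (1, 'vm-01')")
conn.execute("DELETE FROM customers WHERE id = 1")  # cascades to customer_objects
remaining = conn.execute("SELECT COUNT(*) FROM customer_objects").fetchone()[0]
```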
conn.execute(
text(
'CREATE INDEX IF NOT EXISTS idx_customer_objects_customer_name ON customer_objects (customer_id, object_name)'
@ -1083,6 +1245,37 @@ def migrate_object_persistence_tables() -> None:
'''
)
)
# Ensure existing installations also have ON DELETE CASCADE for customer_object_id.
# Older schemas may have created the FK without cascade, blocking customer deletes.
conn.execute(
text(
'''
DO $$
BEGIN
IF EXISTS (
SELECT 1
FROM information_schema.table_constraints tc
WHERE tc.table_name = 'job_object_links'
AND tc.constraint_type = 'FOREIGN KEY'
AND tc.constraint_name = 'job_object_links_customer_object_id_fkey'
) THEN
ALTER TABLE job_object_links
DROP CONSTRAINT job_object_links_customer_object_id_fkey;
END IF;
ALTER TABLE job_object_links
ADD CONSTRAINT job_object_links_customer_object_id_fkey
FOREIGN KEY (customer_object_id)
REFERENCES customer_objects(id)
ON DELETE CASCADE;
EXCEPTION
WHEN duplicate_object THEN
NULL;
END $$;
'''
)
)
conn.execute(
text(
'CREATE INDEX IF NOT EXISTS idx_job_object_links_job_id ON job_object_links (job_id)'
@ -1110,6 +1303,36 @@ def migrate_object_persistence_tables() -> None:
'''
)
)
# Ensure existing installations also have ON DELETE CASCADE for customer_object_id.
conn.execute(
text(
'''
DO $$
BEGIN
IF EXISTS (
SELECT 1
FROM information_schema.table_constraints tc
WHERE tc.table_name = 'run_object_links'
AND tc.constraint_type = 'FOREIGN KEY'
AND tc.constraint_name = 'run_object_links_customer_object_id_fkey'
) THEN
ALTER TABLE run_object_links
DROP CONSTRAINT run_object_links_customer_object_id_fkey;
END IF;
ALTER TABLE run_object_links
ADD CONSTRAINT run_object_links_customer_object_id_fkey
FOREIGN KEY (customer_object_id)
REFERENCES customer_objects(id)
ON DELETE CASCADE;
EXCEPTION
WHEN duplicate_object THEN
NULL;
END $$;
'''
)
)
conn.execute(
text(
'CREATE INDEX IF NOT EXISTS idx_run_object_links_run_id ON run_object_links (run_id)'
@ -1147,6 +1370,7 @@ def migrate_object_persistence_tables() -> None:
job_id INTEGER REFERENCES jobs(id),
job_name_match VARCHAR(255),
job_name_match_mode VARCHAR(32),
resolved_at TIMESTAMP,
created_at TIMESTAMP NOT NULL
);
"""))
@ -1160,6 +1384,12 @@ def migrate_object_persistence_tables() -> None:
UNIQUE(ticket_id, job_run_id)
);
"""))
# Ensure scope-level resolution exists for per-job ticket resolving
conn.execute(text("ALTER TABLE ticket_scopes ADD COLUMN IF NOT EXISTS resolved_at TIMESTAMP"))
conn.execute(text("CREATE INDEX IF NOT EXISTS idx_ticket_scopes_ticket_id ON ticket_scopes (ticket_id)"))
conn.execute(text("CREATE INDEX IF NOT EXISTS idx_ticket_scopes_job_id ON ticket_scopes (job_id)"))
conn.execute(text("CREATE INDEX IF NOT EXISTS idx_ticket_scopes_resolved_at ON ticket_scopes (resolved_at)"))
conn.execute(text("""
CREATE TABLE IF NOT EXISTS remarks (
id SERIAL PRIMARY KEY,
@ -1337,6 +1567,8 @@ def migrate_reporting_tables() -> None:
id SERIAL PRIMARY KEY,
report_id INTEGER NOT NULL REFERENCES report_definitions(id) ON DELETE CASCADE,
object_name TEXT NOT NULL,
customer_id INTEGER NULL,
customer_name TEXT NULL,
total_runs INTEGER NOT NULL DEFAULT 0,
success_count INTEGER NOT NULL DEFAULT 0,
success_override_count INTEGER NOT NULL DEFAULT 0,
@ -1361,5 +1593,7 @@ def migrate_reporting_tables() -> None:
conn.execute(text("ALTER TABLE report_definitions ADD COLUMN IF NOT EXISTS customer_scope VARCHAR(16) NOT NULL DEFAULT 'all'"))
conn.execute(text("ALTER TABLE report_definitions ADD COLUMN IF NOT EXISTS customer_ids TEXT NULL"))
conn.execute(text("ALTER TABLE report_object_snapshots ADD COLUMN IF NOT EXISTS customer_id INTEGER NULL"))
conn.execute(text("ALTER TABLE report_object_summaries ADD COLUMN IF NOT EXISTS customer_id INTEGER NULL"))
conn.execute(text("ALTER TABLE report_object_summaries ADD COLUMN IF NOT EXISTS customer_name TEXT NULL"))
print("[migrations] reporting tables created/verified.")


@ -156,6 +156,8 @@ class Override(db.Model):
# Matching criteria on object status / error message
match_status = db.Column(db.String(32), nullable=True)
match_error_contains = db.Column(db.String(255), nullable=True)
# Matching mode for error text: contains (default), exact, starts_with, ends_with
match_error_mode = db.Column(db.String(20), nullable=True)
# Behaviour flags
treat_as_success = db.Column(db.Boolean, nullable=False, default=True)
@ -196,6 +198,12 @@ class Job(db.Model):
auto_approve = db.Column(db.Boolean, nullable=False, default=True)
active = db.Column(db.Boolean, nullable=False, default=True)
# Archived jobs are excluded from Daily Jobs and Run Checks.
# JobRuns remain in the database and are still included in reporting.
archived = db.Column(db.Boolean, nullable=False, default=False)
archived_at = db.Column(db.DateTime, nullable=True)
archived_by_user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
updated_at = db.Column(
db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow, nullable=False
@ -391,6 +399,7 @@ class TicketScope(db.Model):
job_name_match = db.Column(db.String(255))
job_name_match_mode = db.Column(db.String(32))
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
resolved_at = db.Column(db.DateTime)
class TicketJobRun(db.Model):
@ -488,6 +497,20 @@ class FeedbackVote(db.Model):
)
class FeedbackReply(db.Model):
__tablename__ = "feedback_replies"
id = db.Column(db.Integer, primary_key=True)
feedback_item_id = db.Column(
db.Integer, db.ForeignKey("feedback_items.id", ondelete="CASCADE"), nullable=False
)
user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=False)
message = db.Column(db.Text, nullable=False)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
class NewsItem(db.Model):
__tablename__ = "news_items"
@ -536,7 +559,7 @@ class ReportDefinition(db.Model):
# one-time | scheduled
report_type = db.Column(db.String(32), nullable=False, default="one-time")
# csv | pdf (pdf is future)
# csv | html | pdf
output_format = db.Column(db.String(16), nullable=False, default="csv")
# customer scope for report generation
@ -551,6 +574,10 @@ class ReportDefinition(db.Model):
# For scheduled reports in later phases (cron / RRULE style string)
schedule = db.Column(db.String(255), nullable=True)
# JSON report definition for UI (columns, charts, filters, templates)
# Stored as TEXT to remain flexible and allow future PDF rendering.
report_config = db.Column(db.Text, nullable=True)
created_by_user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
@ -608,6 +635,9 @@ class ReportObjectSummary(db.Model):
report_id = db.Column(db.Integer, db.ForeignKey("report_definitions.id"), nullable=False)
object_name = db.Column(db.Text, nullable=False)
customer_id = db.Column(db.Integer, nullable=True)
customer_name = db.Column(db.Text, nullable=True)
total_runs = db.Column(db.Integer, nullable=False, default=0)
success_count = db.Column(db.Integer, nullable=False, default=0)
success_override_count = db.Column(db.Integer, nullable=False, default=0)


@ -130,3 +130,172 @@ def persist_objects_for_approved_run(customer_id: int, job_id: int, run_id: int,
return processed
def persist_objects_for_approved_run_filtered(
customer_id: int,
job_id: int,
run_id: int,
mail_message_id: int,
*,
object_name_prefix: str,
strip_prefix: bool = True,
) -> int:
"""Persist a subset of mail_objects for a specific approved run.
This is used for multi-tenant / multi-customer summary emails where a single mail_message
contains objects for multiple companies (e.g. Veeam VSPC Active Alarms summary).
Args:
customer_id: Customer id for the target job.
job_id: Job id for the target job.
run_id: JobRun id.
mail_message_id: MailMessage id that contains the parsed mail_objects.
object_name_prefix: Company prefix (exact) used in mail_objects.object_name ("<company> | <object>").
strip_prefix: If True, store object names without the "<company> | " prefix.
Returns:
Number of processed objects.
"""
engine = db.get_engine()
processed = 0
prefix = (object_name_prefix or "").strip()
if not prefix:
return 0
like_value = f"{prefix} | %"
with engine.begin() as conn:
rows = conn.execute(
text(
"""
SELECT object_name, object_type, status, error_message
FROM mail_objects
WHERE mail_message_id = :mail_message_id
AND object_name LIKE :like_value
ORDER BY id
"""
),
{"mail_message_id": mail_message_id, "like_value": like_value},
).fetchall()
for r in rows:
raw_name = (r[0] or "").strip()
if not raw_name:
continue
object_name = raw_name
if strip_prefix and object_name.startswith(f"{prefix} | "):
object_name = object_name[len(prefix) + 3 :].strip()
if not object_name:
continue
object_type = r[1]
status = r[2]
error_message = r[3]
# 1) Upsert customer_objects and get id (schema uses UNIQUE(customer_id, object_name))
customer_object_id = conn.execute(
text(
"""
INSERT INTO customer_objects (customer_id, object_name, object_type, first_seen_at, last_seen_at)
VALUES (:customer_id, :object_name, :object_type, NOW(), NOW())
ON CONFLICT (customer_id, object_name)
DO UPDATE SET
last_seen_at = NOW(),
object_type = COALESCE(EXCLUDED.object_type, customer_objects.object_type)
RETURNING id
"""
),
{
"customer_id": customer_id,
"object_name": object_name,
"object_type": object_type,
},
).scalar()
# 2) Upsert job_object_links (keep timestamps fresh)
conn.execute(
text(
"""
INSERT INTO job_object_links (job_id, customer_object_id, first_seen_at, last_seen_at)
VALUES (:job_id, :customer_object_id, NOW(), NOW())
ON CONFLICT (job_id, customer_object_id)
DO UPDATE SET last_seen_at = NOW()
"""
),
{
"job_id": job_id,
"customer_object_id": customer_object_id,
},
)
# 3) Upsert run_object_links
conn.execute(
text(
"""
INSERT INTO run_object_links (run_id, customer_object_id, status, error_message, observed_at)
VALUES (:run_id, :customer_object_id, :status, :error_message, NOW())
ON CONFLICT (run_id, customer_object_id)
DO UPDATE SET
status = EXCLUDED.status,
error_message = EXCLUDED.error_message,
observed_at = NOW()
"""
),
{
"run_id": run_id,
"customer_object_id": customer_object_id,
"status": status,
"error_message": error_message,
},
)
processed += 1
_update_override_applied_for_run(job_id, run_id)
return processed
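The name handling above slices off `len(prefix) + 3` characters, i.e. the prefix plus the `" | "` separator; isolated as a hypothetical helper:

```python
def strip_company_prefix(raw_name, prefix):
    # Strip "<company> | " only when the name actually carries that prefix.
    raw_name = (raw_name or "").strip()
    if raw_name.startswith(f"{prefix} | "):
        return raw_name[len(prefix) + 3:].strip()  # +3 skips " | "
    return raw_name
```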
def persist_objects_for_auto_run(customer_id: int, job_id: int, run_id: int, mail_message_id: int) -> int:
"""Persist objects for a run created by auto-approve logic.
For VSPC Active Alarms summary, objects are stored on the mail_message with
    a "<company> | <object>" prefix. Auto-approved runs are created against
    per-company jobs ("Active alarms summary | <company>"). In that case we
    persist only the matching subset and strip the prefix so objects are
    linked correctly.
"""
try:
# Lazy import to avoid circular dependencies.
from .models import Job # noqa
job = Job.query.get(int(job_id))
if not job:
return persist_objects_for_approved_run(customer_id, job_id, run_id, mail_message_id)
bsw = (getattr(job, "backup_software", "") or "").strip().lower()
btype = (getattr(job, "backup_type", "") or "").strip().lower()
jname = (getattr(job, "job_name", "") or "").strip()
if bsw == "veeam" and btype == "service provider console":
# Expected format: "Active alarms summary | <company>"
parts = [p.strip() for p in jname.split("|", 1)]
            if len(parts) == 2 and parts[0].lower() == "active alarms summary" and parts[1]:
company = parts[1]
return persist_objects_for_approved_run_filtered(
customer_id,
job_id,
run_id,
mail_message_id,
object_name_prefix=company,
strip_prefix=True,
)
except Exception:
# Fall back to the generic behavior.
pass
return persist_objects_for_approved_run(customer_id, job_id, run_id, mail_message_id)
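The job-name check reduces to a small, testable rule; `company_from_vspc_job_name` is a hypothetical extraction of it:

```python
def company_from_vspc_job_name(job_name):
    # "Active alarms summary | <company>" -> "<company>", else None.
    parts = [p.strip() for p in (job_name or "").split("|", 1)]
    if len(parts) == 2 and parts[0].lower() == "active alarms summary" and parts[1]:
        return parts[1]
    return None
```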


@ -13,6 +13,8 @@ from .nakivo import try_parse_nakivo
from .veeam import try_parse_veeam
from .rdrive import try_parse_rdrive
from .syncovery import try_parse_syncovery
from .ntfs_auditing import try_parse_ntfs_auditing
from .qnap import try_parse_qnap
def _sanitize_text(value: object) -> object:
@ -43,14 +45,25 @@ def _store_mail_objects(msg: MailMessage, objects: List[Dict]) -> None:
- error_message (optional)
"""
for item in objects or []:
name = (item.get("name") or "").strip()
name = _sanitize_text(item.get("name") or "")
if isinstance(name, str):
name = name.strip()
if not name:
continue
object_type = (item.get("type") or item.get("object_type") or None)
object_type = _sanitize_text(object_type)
if isinstance(object_type, str):
object_type = object_type.strip() or None
        status = item.get("status") or None
status = _sanitize_text(status)
if isinstance(status, str):
status = status.strip() or None
error_message = item.get("error_message") or None
error_message = _sanitize_text(error_message)
if isinstance(error_message, str):
error_message = error_message.strip() or None
db.session.add(
MailObject(
mail_message_id=msg.id,
@ -94,6 +107,8 @@ def parse_mail_message(msg: MailMessage) -> None:
try:
handled, result, objects = try_parse_3cx(msg)
if not handled:
handled, result, objects = try_parse_qnap(msg)
if not handled:
handled, result, objects = try_parse_synology(msg)
if not handled:
@ -106,6 +121,8 @@ def parse_mail_message(msg: MailMessage) -> None:
handled, result, objects = try_parse_veeam(msg)
if not handled:
handled, result, objects = try_parse_syncovery(msg)
if not handled:
handled, result, objects = try_parse_ntfs_auditing(msg)
except Exception as exc:
msg.parse_result = "error"
msg.parse_error = str(exc)[:500]
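The cascade above is a first-match-wins chain over `(handled, result, objects)` triples; the pattern in isolation, with hypothetical names:

```python
def run_parser_chain(msg, parsers):
    # Each parser returns (handled, result, objects); the first handler wins.
    for parser in parsers:
        handled, result, objects = parser(msg)
        if handled:
            return handled, result, objects
    return False, {}, []
```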


@ -0,0 +1,89 @@
from __future__ import annotations
import re
from typing import Dict, Tuple, List
from ..models import MailMessage
_HOSTNAME_RE = re.compile(r"""(?ix)
\b
(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+
(?:[a-z]{2,}|local)
\b
""")
_COUNTS_RE = re.compile(r"""(?x)
[\u2193\u2191] # ↓ or ↑
\s*
(\d+)
""")
def _normalize_subject(subject: str) -> str:
# Some senders use underscores as spaces in the subject.
s = (subject or "").strip()
s = s.replace("_", " ")
s = re.sub(r"\s+", " ", s)
return s.strip()
def _extract_host(subject: str) -> str | None:
subj = _normalize_subject(subject)
lower = subj.lower()
idx = lower.find("file audits")
if idx == -1:
return None
prefix = subj[:idx].strip()
# Some senders add a company prefix in front of the hostname, e.g. "Bouter btr-dc001.bouter.nl ...".
# Extract the last hostname-looking token before "file audits".
hosts = _HOSTNAME_RE.findall(prefix)
if not hosts:
return None
return hosts[-1].lower()
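The company-prefix handling relies on taking the last hostname-looking token before "file audits"; a condensed, self-contained version of the same logic (pattern reproduced from above):

```python
import re

_HOSTNAME_RE = re.compile(
    r"(?ix)\b(?:[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?\.)+(?:[a-z]{2,}|local)\b"
)

def extract_audit_host(subject):
    # Normalize underscores/whitespace, then pick the last hostname
    # before "file audits" (company prefixes come first in the subject).
    s = re.sub(r"\s+", " ", (subject or "").replace("_", " ")).strip()
    idx = s.lower().find("file audits")
    if idx == -1:
        return None
    hosts = _HOSTNAME_RE.findall(s[:idx])
    return hosts[-1].lower() if hosts else None
```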
def _extract_counts(subject: str) -> Tuple[int, int]:
# Subject format: "<host> file audits ↓ 0 ↑ 29"
# Not all senders include both arrows, so we parse what we can.
subj = _normalize_subject(subject)
nums = [int(x) for x in _COUNTS_RE.findall(subj)]
down = nums[0] if len(nums) >= 1 else 0
up = nums[1] if len(nums) >= 2 else 0
return down, up
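Count extraction only depends on the arrow-number pairs; reproduced stand-alone (positional, mirroring the parser above):

```python
import re

_COUNTS_RE = re.compile(r"[\u2193\u2191]\s*(\d+)")  # "↓ <n>" or "↑ <n>"

def extract_counts(subject):
    # "<host> file audits ↓ 0 ↑ 29" -> (0, 29); missing arrows default to 0.
    nums = [int(x) for x in _COUNTS_RE.findall(subject or "")]
    down = nums[0] if len(nums) >= 1 else 0
    up = nums[1] if len(nums) >= 2 else 0
    return down, up
```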
def try_parse_ntfs_auditing(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
subject = getattr(msg, "subject", None) or ""
# Fast checks: this parser is subject-based.
if "file audits" not in _normalize_subject(subject).lower():
return False, {}, []
host = _extract_host(subject)
if not host:
return False, {}, []
down, up = _extract_counts(subject)
# If changes were detected, mark as Warning (auditing reports only changes).
overall_status = "Warning" if (down > 0 or up > 0) else "Success"
overall_message = None
if overall_status == "Warning":
overall_message = f"NTFS auditing detected file changes (deleted: {down}, changed: {up})."
job_name = f"{host} file audits"
result = {
"backup_software": "NTFS Auditing",
"backup_type": "Audit",
"job_name": job_name,
"overall_status": overall_status,
"overall_message": overall_message,
}
# This mail contains an attachment report; objects are not tracked.
return True, result, []


@ -0,0 +1,100 @@
from __future__ import annotations
import html
import re
from typing import Dict, Tuple, List
from ..models import MailMessage
_SUBJECT_RE = re.compile(
r"^\[(?P<severity>info|warning|error)\]\s*\[\s*firmware\s+update\s*\]\s*notification\s+from\s+your\s+device\s*:\s*(?P<host>.+)$",
re.I,
)
_NAS_NAME_RE = re.compile(r"\bNAS\s*Name\s*:\s*(?P<host>[^\n<]+)", re.I)
_APP_NAME_RE = re.compile(r"\bApp\s*Name\s*:\s*(?P<app>[^\n<]+)", re.I)
_CATEGORY_RE = re.compile(r"\bCategory\s*:\s*(?P<cat>[^\n<]+)", re.I)
_MESSAGE_RE = re.compile(r"\bMessage\s*:\s*(?P<msg>.+)$", re.I | re.M)
_BR_RE = re.compile(r"<\s*br\s*/?\s*>", re.I)
_TAG_RE = re.compile(r"<[^>]+>")
_WS_RE = re.compile(r"[\t\r\f\v ]+")
def _html_to_text(value: str) -> str:
if not value:
return ""
s = value
s = _BR_RE.sub("\n", s)
s = _TAG_RE.sub("", s)
s = html.unescape(s)
s = s.replace("\u00a0", " ")
# keep newlines, but normalize whitespace on each line
lines = [(_WS_RE.sub(" ", ln)).strip() for ln in s.split("\n")]
return "\n".join([ln for ln in lines if ln]).strip()
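`_html_to_text` is small enough to verify directly; the same routine, self-contained:

```python
import html
import re

_BR_RE = re.compile(r"<\s*br\s*/?\s*>", re.I)
_TAG_RE = re.compile(r"<[^>]+>")
_WS_RE = re.compile(r"[\t\r\f\v ]+")  # horizontal whitespace only, newlines survive

def html_to_text(value):
    if not value:
        return ""
    s = _BR_RE.sub("\n", value)          # <br> becomes a line break
    s = _TAG_RE.sub("", s)               # drop remaining tags
    s = html.unescape(s).replace("\u00a0", " ")
    lines = [_WS_RE.sub(" ", ln).strip() for ln in s.split("\n")]
    return "\n".join(ln for ln in lines if ln).strip()
```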
def try_parse_qnap(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
"""Parse QNAP Notification Center e-mails.
Supported (informational):
- Firmware Update notifications
Subject: [Info][Firmware Update] Notification from your device: <HOST>
These notifications are informational: they should be visible in Run Checks,
but they must not participate in schedule inference, missed/expected logic,
or reporting.
"""
subject = (getattr(msg, "subject", None) or "").strip()
if not subject:
return False, {}, []
m = _SUBJECT_RE.match(subject)
if not m:
return False, {}, []
host = (m.group("host") or "").strip()
html_body = getattr(msg, "html_body", None) or ""
text_body = getattr(msg, "text_body", None) or getattr(msg, "body", None) or ""
text = _html_to_text(html_body) if html_body else (text_body or "")
if text:
m_host = _NAS_NAME_RE.search(text)
if m_host:
host = (m_host.group("host") or "").strip() or host
# Prefer the detailed 'Message:' line from the body.
overall_message = None
if text:
m_msg = _MESSAGE_RE.search(text)
if m_msg:
overall_message = (m_msg.group("msg") or "").strip() or None
# If the body doesn't contain a dedicated message line, derive one.
if not overall_message and text:
parts: List[str] = []
m_app = _APP_NAME_RE.search(text)
if m_app:
parts.append((m_app.group("app") or "").strip())
m_cat = _CATEGORY_RE.search(text)
if m_cat:
parts.append((m_cat.group("cat") or "").strip())
if parts:
overall_message = " / ".join([p for p in parts if p]) or None
result: Dict = {
"backup_software": "QNAP",
"backup_type": "Firmware Update",
"job_name": "Firmware Update",
"overall_status": "Warning",
"overall_message": overall_message,
}
objects: List[Dict] = []
if host:
objects.append({"name": host, "status": "Warning"})
return True, result, objects


@ -38,6 +38,55 @@ PARSER_DEFINITIONS = [
},
},
},
{
"name": "ntfs_auditing_audit",
"backup_software": "NTFS Auditing",
"backup_types": ["Audit"],
"order": 220,
"enabled": True,
"match": {
"from_contains": "auditing@",
"subject_contains": "file audits",
},
"description": "Parses NTFS Auditing file audit report mails (attachment-based HTML reports).",
"example": {
            "subject": "Bouter btr-dc001.bouter.nl file audits ↓ 6 ↑ 12",
"from_address": "auditing@bouter.nl",
"body_snippet": "(empty body, HTML report in attachment)",
"parsed_result": {
"backup_software": "NTFS Auditing",
"backup_type": "Audit",
"job_name": "btr-dc001.bouter.nl file audits",
"objects": [],
},
},
},
{
"name": "qnap_firmware_update",
"backup_software": "QNAP",
"backup_types": ["Firmware Update"],
"order": 235,
"enabled": True,
"match": {
"from_contains": "notifications@",
"subject_contains": "Firmware Update",
},
"description": "Parses QNAP Notification Center firmware update notifications (informational; excluded from reporting and missing logic).",
"example": {
"subject": "[Info][Firmware Update] Notification from your device: BETSIES-NAS01",
"from_address": "notifications@customer.tld",
"body_snippet": "NAS Name: BETSIES-NAS01\n...\nMessage: ...",
"parsed_result": {
"backup_software": "QNAP",
"backup_type": "Firmware Update",
"job_name": "Firmware Update",
"overall_status": "Warning",
"objects": [
{"name": "BETSIES-NAS01", "status": "Warning", "error_message": None}
],
},
},
},
{
"name": "veeam_replication_job",
"backup_software": "Veeam",


@ -1,5 +1,7 @@
# --- Synology DSM Updates (informational, excluded from reporting) ---
from __future__ import annotations
import re
from typing import Dict, Tuple, List, Optional
@ -12,6 +14,52 @@ from ..models import MailMessage
# - Hyper Backup (Synology): task notifications from Hyper Backup
# - Account Protection (Synology): DSM Account Protection lockout notifications
DSM_UPDATE_CANCELLED_PATTERNS = [
"Automatische update van DSM is geannuleerd",
"Automatic DSM update was cancelled",
"Automatic update of DSM was cancelled",
]
_DSM_UPDATE_CANCELLED_HOST_RE = re.compile(
r"\b(?:geannuleerd\s+op|cancelled\s+on)\s+(?P<host>[A-Za-z0-9._-]+)\b",
re.I,
)
_DSM_UPDATE_FROM_HOST_RE = re.compile(r"\bVan\s+(?P<host>[A-Za-z0-9._-]+)\b", re.I)
def _is_synology_dsm_update_cancelled(subject: str, text: str) -> bool:
haystack = f"{subject}\n{text}".lower()
return any(p.lower() in haystack for p in DSM_UPDATE_CANCELLED_PATTERNS)
def _parse_synology_dsm_update_cancelled(subject: str, text: str) -> Tuple[bool, Dict, List[Dict]]:
haystack = f"{subject}\n{text}"
host = ""
m = _DSM_UPDATE_CANCELLED_HOST_RE.search(haystack)
if m:
host = (m.group("host") or "").strip()
if not host:
m = _DSM_UPDATE_FROM_HOST_RE.search(haystack)
if m:
host = (m.group("host") or "").strip()
# Informational job: show in Run Checks, but do not participate in schedules / reporting.
result: Dict = {
"backup_software": "Synology",
"backup_type": "Updates",
"job_name": "Synology Automatic Update",
"overall_status": "Warning",
"overall_message": "Automatic DSM update cancelled" + (f" ({host})" if host else ""),
}
objects: List[Dict] = []
if host:
objects.append({"name": host, "status": "Warning"})
return True, result, objects
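Taken together, the cancelled-update check and host extraction behave like this minimal, self-contained sketch (the patterns and regex mirror the parser above; the sample subject is hypothetical):

```python
import re

# Patterns and regex mirrored from the parser above; the sample subject is made up.
DSM_UPDATE_CANCELLED_PATTERNS = [
    "Automatische update van DSM is geannuleerd",
    "Automatic DSM update was cancelled",
    "Automatic update of DSM was cancelled",
]
_HOST_RE = re.compile(
    r"\b(?:geannuleerd\s+op|cancelled\s+on)\s+(?P<host>[A-Za-z0-9._-]+)\b", re.I
)

subject = "Automatische update van DSM is geannuleerd op NAS01"
matched = any(p.lower() in subject.lower() for p in DSM_UPDATE_CANCELLED_PATTERNS)
m = _HOST_RE.search(subject)
host = m.group("host") if m else ""
print(matched, host)  # True NAS01
```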
_BR_RE = re.compile(r"<\s*br\s*/?\s*>", re.I)
_TAG_RE = re.compile(r"<[^>]+>")
_WS_RE = re.compile(r"[\t\r\f\v ]+")
@ -125,23 +173,35 @@ def _extract_totals(text: str) -> Tuple[int, int, int]:
_ABB_SUBJECT_RE = re.compile(r"\bactive\s+backup\s+for\s+business\b", re.I)
# Examples (NL):
# "De back-uptaak vSphere-Task-1 op KANTOOR-NEW is voltooid."
# "Virtuele machine back-uptaak vSphere-Task-1 op KANTOOR-NEW is gedeeltelijk voltooid."
# Examples (EN):
# "The backup task vSphere-Task-1 on KANTOOR-NEW has completed."
# "Virtual machine backup task vSphere-Task-1 on KANTOOR-NEW partially completed."
_ABB_COMPLETED_RE = re.compile(
r"\b(?:virtuele\s+machine\s+)?(?:de\s+)?back-?up\s*taak\s+(?P<job>.+?)\s+op\s+(?P<host>[A-Za-z0-9._-]+)\s+is\s+(?P<status>voltooid|gedeeltelijk\s+voltooid)\b"
r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+task\s+(?P<job_en>.+?)\s+on\s+(?P<host_en>[A-Za-z0-9._-]+)\s+(?:is\s+)?(?P<status_en>completed|finished|has\s+completed|partially\s+completed)\b",
re.I,
)
_ABB_FAILED_RE = re.compile(
r"\b(?:virtuele\s+machine\s+)?(?:de\s+)?back-?up\s*taak\s+.+?\s+op\s+.+?\s+is\s+mislukt\b"
r"|\b(?:virtual\s+machine\s+)?(?:the\s+)?back-?up\s+task\s+.+?\s+on\s+.+?\s+(?:has\s+)?failed\b",
re.I,
)
# Device list lines in body, e.g.
# "Apparaatlijst (back-up gelukt): DC01, SQL01"
# "Lijst met apparaten (back-up gelukt): DC01, SQL01"
# "Apparaatlijst (back-up mislukt): FS01"
# "Device list (backup succeeded): DC01, SQL01"
# "List of devices (backup succeeded): DC01, SQL01"
# "Device list (backup failed): FS01"
_ABB_DEVICE_LIST_RE = re.compile(
r"^\s*(?:Apparaatlijst|Lijst\s+met\s+apparaten|Device\s+list|List\s+of\s+devices)\s*(?:\((?P<kind>[^)]+)\))?\s*:\s*(?P<list>.*?)\s*$",
re.I,
)
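As a quick sanity check, the broadened device-list regex can be exercised on a sample Dutch body line (the pattern is the same as `_ABB_DEVICE_LIST_RE` above; the input line is a made-up example):

```python
import re

# Same pattern as _ABB_DEVICE_LIST_RE above; the input line is hypothetical.
pattern = re.compile(
    r"^\s*(?:Apparaatlijst|Lijst\s+met\s+apparaten|Device\s+list|List\s+of\s+devices)"
    r"\s*(?:\((?P<kind>[^)]+)\))?\s*:\s*(?P<list>.*?)\s*$",
    re.I,
)

m = pattern.match("Apparaatlijst (back-up gelukt): DC01, SQL01")
kind = m.group("kind")
devices = [p.strip() for p in m.group("list").split(",")]
print(kind, devices)  # back-up gelukt ['DC01', 'SQL01']
```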
def _is_synology_active_backup_for_business(subject: str, text: str) -> bool:
@ -162,22 +222,60 @@ def _parse_active_backup_for_business(subject: str, text: str) -> Tuple[bool, Di
job_name = (m.group("job") or m.group("job_en") or "").strip()
host = (m.group("host") or m.group("host_en") or "").strip()
# Determine overall status based on completion type and failure markers
status_raw = (m.group("status") or m.group("status_en") or "").lower()
overall_status = "Success"
overall_message = "Success"
# "gedeeltelijk voltooid" / "partially completed" should be treated as Warning
if "gedeeltelijk" in status_raw or "partially" in status_raw:
overall_status = "Warning"
overall_message = "Partially completed"
# Explicit failure wording overrides everything
if _ABB_FAILED_RE.search(haystack):
overall_status = "Error"
overall_message = "Failed"
# Collect device/object statuses while avoiding duplicates.
# Prefer the most severe status when a device appears multiple times.
severity = {"Error": 3, "Failed": 3, "Warning": 2, "Success": 1}
device_status: Dict[str, str] = {}
for line in (text or "").splitlines():
mm = _ABB_DEVICE_LIST_RE.match(line.strip())
if not mm:
continue
raw_list = (mm.group("list") or "").strip()
kind = (mm.group("kind") or "").lower()
line_status = overall_status
kind_is_specific = False
if "gelukt" in kind or "succeeded" in kind or "success" in kind:
line_status = "Success"
kind_is_specific = True
elif "mislukt" in kind or "failed" in kind or "error" in kind:
line_status = "Error"
kind_is_specific = True
# "DC01, SQL01"
for name in [p.strip() for p in raw_list.split(",")]:
if not name:
continue
prev = device_status.get(name)
if prev is None:
device_status[name] = line_status
continue
# Do not override specific succeeded/failed lists with a generic "device list".
if not kind_is_specific:
continue
if severity.get(line_status, 0) > severity.get(prev, 0):
device_status[name] = line_status
objects: List[Dict] = [{"name": n, "status": s} for n, s in device_status.items()]
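The de-duplication rules above (first status wins unless a specific succeeded/failed list outranks it by severity) can be sketched in isolation; the device names are hypothetical:

```python
# Stand-alone sketch of the severity merge above; device names are made up.
severity = {"Error": 3, "Failed": 3, "Warning": 2, "Success": 1}
device_status = {}

def merge(name, line_status, kind_is_specific):
    prev = device_status.get(name)
    if prev is None:
        device_status[name] = line_status
    elif kind_is_specific and severity.get(line_status, 0) > severity.get(prev, 0):
        device_status[name] = line_status

merge("DC01", "Success", True)    # "back-up gelukt" list
merge("DC01", "Warning", False)   # generic list never overrides
merge("FS01", "Success", True)
merge("FS01", "Error", True)      # "back-up mislukt" outranks Success
print(device_status)  # {'DC01': 'Success', 'FS01': 'Error'}
```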
result = {
"backup_software": "Synology",
@ -385,6 +483,12 @@ def try_parse_synology(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
# If html_body is empty, treat text_body as already-normalized text.
text = _html_to_text(html_body) if html_body else (text_body or "")
# DSM Updates (informational; no schedule; excluded from reporting)
if _is_synology_dsm_update_cancelled(subject, text):
ok, result, objects = _parse_synology_dsm_update_cancelled(subject, text)
if ok:
return True, result, objects
# DSM Account Protection (informational; no schedule)
if _is_synology_account_protection(subject, text):
ok, result, objects = _parse_account_protection(subject, text)

View File

@ -6,28 +6,61 @@ from typing import Dict, Tuple, List
from ..models import MailMessage
def _normalize_text(text: str) -> str:
text = (text or "").replace("\r\n", "\n").replace("\r", "\n")
# Collapse excessive blank lines while keeping readability
text = re.sub(r"\n{4,}", "\n\n\n", text)
return text.strip()
def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
"""Parse 3CX notification e-mails.
Supported:
- Backup Complete
Subject: '3CX Notification: Backup Complete - <host>'
Body contains: 'Backup name: <file>'
- SSL Certificate Renewal (informational)
Subject: '3CX Notification: SSL Certificate Renewal - <host>'
Body contains an informational message about the renewal.
"""
subject = (msg.subject or "").strip()
if not subject:
return False, {}, []
# Backup complete
m_backup = re.match(r"^3CX Notification:\s*Backup Complete\s*-\s*(.+)$", subject, flags=re.IGNORECASE)
# SSL certificate renewal (informational)
m_ssl = re.match(
r"^3CX Notification:\s*SSL Certificate Renewal\s*-\s*(.+)$",
subject,
flags=re.IGNORECASE,
)
if not m_backup and not m_ssl:
return False, {}, []
job_name = (m_backup or m_ssl).group(1).strip()
body = (getattr(msg, "text_body", None) or getattr(msg, "body", None) or "")
# Some sources store plain text in html_body; fall back if needed.
if not body:
body = getattr(msg, "html_body", None) or ""
body = _normalize_text(body)
# SSL certificate renewal: store as a tracked informational run
if m_ssl:
result = {
"backup_software": "3CX",
"backup_type": "SSL Certificate",
"job_name": job_name,
"overall_status": "Success",
"overall_message": body or None,
}
return True, result, []
# Backup complete
backup_file = None
m_file = re.search(r"^\s*Backup\s+name\s*:\s*(.+?)\s*$", body, flags=re.IGNORECASE | re.MULTILINE)
if m_file:

View File

@ -1,6 +1,7 @@
from __future__ import annotations
import re
import html as _html
from typing import Dict, Tuple, List, Optional
from ..models import MailMessage
@ -17,9 +18,311 @@ VEEAM_BACKUP_TYPES = [
"Veeam Backup for Microsoft 365",
"Scale-out Backup Repository",
"Health Check",
"Cloud Connect Report",
"Service Provider Console",
]
def normalize_vspc_company_name(name: str) -> str:
"""Normalize a VSPC company name so it matches across HTML/text extraction and parsing."""
n = _strip_html_tags(name or "")
n = _html.unescape(n)
n = n.replace("\xa0", " ")
n = re.sub(r"\s+", " ", n).strip()
return n
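The normalization above mainly guards against HTML entities and non-breaking spaces leaking into company names. A minimal re-implementation for illustration (it assumes the tag stripper replaces tags with spaces; the real `_strip_html_tags` may differ):

```python
import html as _html
import re

# Illustrative re-implementation; assumes tags are replaced with spaces,
# which the real _strip_html_tags helper may or may not do.
def normalize_vspc_company_name(name):
    n = re.sub(r"<[^>]+>", " ", name or "")
    n = _html.unescape(n)
    n = n.replace("\xa0", " ")
    return re.sub(r"\s+", " ", n).strip()

print(normalize_vspc_company_name("<b>AKR&nbsp;Performance</b>"))  # AKR Performance
```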
def extract_vspc_active_alarms_companies(raw: str) -> List[str]:
"""Best-effort extraction of company names from VSPC "Active alarms summary" bodies.
Only returns companies with alarms > 0.
"""
if not raw:
return []
txt = raw
if "<" in txt and ">" in txt:
txt = re.sub(r"<[^>]+>", " ", txt)
txt = _html.unescape(txt)
txt = txt.replace("\xa0", " ")
txt = re.sub(r"\s+", " ", txt).strip()
seen: set[str] = set()
out: List[str] = []
for m in re.finditer(
r"\bCompany:\s*([^\(\r\n]+?)\s*\(\s*alarms?\s*:\s*(\d+)\s*\)",
txt,
flags=re.IGNORECASE,
):
cname = normalize_vspc_company_name((m.group(1) or "").strip())
try:
alarms = int(m.group(2))
except Exception:
alarms = 0
if not cname or alarms <= 0:
continue
if cname in seen:
continue
seen.add(cname)
out.append(cname)
return out
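A condensed version of the company extraction shows the alarm-count filter and the de-duplication in action (the sample body is hypothetical):

```python
import html as _html
import re

# Condensed sketch of the extraction above; the sample body is made up.
def extract_companies(raw):
    txt = re.sub(r"<[^>]+>", " ", raw) if "<" in raw and ">" in raw else raw
    txt = re.sub(r"\s+", " ", _html.unescape(txt).replace("\xa0", " ")).strip()
    out, seen = [], set()
    for m in re.finditer(
        r"\bCompany:\s*([^\(\r\n]+?)\s*\(\s*alarms?\s*:\s*(\d+)\s*\)", txt, re.I
    ):
        name = m.group(1).strip()
        if int(m.group(2)) > 0 and name not in seen:
            seen.add(name)
            out.append(name)
    return out

body = "<p>Company: AKR Performance (alarms: 2)</p><p>Company: Beta BV (alarms: 0)</p>"
print(extract_companies(body))  # ['AKR Performance']
```

Companies with zero alarms are dropped, so quiet customers never show up in the run.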
def _parse_vspc_active_alarms_from_html(html: str) -> Tuple[List[Dict], str, Optional[str]]:
"""Parse Veeam Service Provider Console (VSPC) Active Alarms summary emails.
The VSPC summary email can contain multiple companies. We keep this as a
single Backupchecks run, but we prefix object names with the company name
so alarms remain attributable per customer.
Returns: (objects, overall_status, overall_message)
"""
html = _normalize_html(html)
if not html:
return [], "Success", None
html_lower = html.lower()
if "veeam service provider console" not in html_lower or "company" not in html_lower:
return [], "Success", None
# Extract each company block and its first alarm table.
# Company header example: "Company: AKR Performance (alarms: 2)"
# Be defensive about line breaks (CR/LF) and HTML formatting.
company_header_re = re.compile(
r"(?is)company:\s*([^<\r\n]+?)\s*\(\s*alarms\s*:\s*(\d+)\s*\)"
)
# Build spans using HTML positions.
headers = [(m.start(), m.end(), (m.group(1) or "").strip(), m.group(2)) for m in company_header_re.finditer(html)]
if not headers:
return [], "Success", None
objects: List[Dict] = []
saw_failed = False
saw_warning = False
for idx, (h_start, h_end, company_name, alarms_raw) in enumerate(headers):
company_name = normalize_vspc_company_name(company_name)
seg_start = h_end
seg_end = headers[idx + 1][0] if idx + 1 < len(headers) else len(html)
segment_html = html[seg_start:seg_end]
# Find the first table that looks like the Active Alarms table.
m_table = re.search(r"(?is)<table[^>]*>.*?(Current\s*State).*?</table>", segment_html)
if not m_table:
continue
table_html = m_table.group(0)
# Parse rows and cells.
row_re = re.compile(r"(?is)<tr[^>]*>(.*?)</tr>")
cell_re = re.compile(r"(?is)<t[dh][^>]*>(.*?)</t[dh]>")
rows = row_re.findall(table_html)
if not rows:
continue
# Determine column indexes from header row.
colmap = {}
header_cells = [_strip_html_tags(c).strip().lower() for c in cell_re.findall(rows[0])]
for i, c in enumerate(header_cells):
if c in {"current state", "currentstate"}:
colmap["current_state"] = i
elif c in {"object"}:
colmap["object"] = i
elif c in {"object type", "objecttype"}:
colmap["object_type"] = i
elif c in {"hostname"}:
colmap["hostname"] = i
elif c in {"time"}:
colmap["time"] = i
elif c in {"alarm name", "alarmname"}:
colmap["alarm_name"] = i
elif c in {"n. of repeats", "n.of repeats", "repeats"}:
colmap["repeats"] = i
elif c in {"alarm details", "alarmdetails", "details"}:
colmap["alarm_details"] = i
# Basic validation: needs at least object + current state
if "object" not in colmap or "current_state" not in colmap:
continue
# Convert the entire company segment to text once for details matching.
seg_text = _html_to_text_preserve_lines(segment_html)
seg_lines = [ln.strip() for ln in (seg_text or "").splitlines() if ln.strip()]
for r in rows[1:]:
cells = cell_re.findall(r)
if not cells:
continue
plain = [_strip_html_tags(c).strip() for c in cells]
obj_name = plain[colmap["object"]].strip() if colmap["object"] < len(plain) else ""
if not obj_name:
continue
current_state = plain[colmap["current_state"]].strip() if colmap["current_state"] < len(plain) else ""
obj_type = plain[colmap.get("object_type", -1)].strip() if colmap.get("object_type", -1) >= 0 and colmap.get("object_type", -1) < len(plain) else ""
hostname = plain[colmap.get("hostname", -1)].strip() if colmap.get("hostname", -1) >= 0 and colmap.get("hostname", -1) < len(plain) else ""
at_time = plain[colmap.get("time", -1)].strip() if colmap.get("time", -1) >= 0 and colmap.get("time", -1) < len(plain) else ""
alarm_name = plain[colmap.get("alarm_name", -1)].strip() if colmap.get("alarm_name", -1) >= 0 and colmap.get("alarm_name", -1) < len(plain) else ""
repeats = plain[colmap.get("repeats", -1)].strip() if colmap.get("repeats", -1) >= 0 and colmap.get("repeats", -1) < len(plain) else ""
alarm_details = plain[colmap.get("alarm_details", -1)].strip() if colmap.get("alarm_details", -1) >= 0 and colmap.get("alarm_details", -1) < len(plain) else ""
state_lower = (current_state or "").lower()
status = "Success"
if state_lower in {"failed", "error", "critical"}:
status = "Failed"
saw_failed = True
elif state_lower in {"warning", "warn"}:
status = "Warning"
saw_warning = True
# Prefer the explicit "Alarm Details" column if present.
detail_line = alarm_details or None
# Otherwise try to find a more descriptive detail line in the company text.
# Prefer lines that mention the object or alarm name and are long enough to be a real description.
needles = [n for n in [obj_name, alarm_name] if n]
if not detail_line:
for ln in seg_lines:
if len(ln) < 25:
continue
if any(n.lower() in ln.lower() for n in needles):
detail_line = ln
break
if not detail_line and alarm_name:
# fallback: use alarm name with context
parts = [alarm_name]
ctx = []
if hostname:
ctx.append(f"Host: {hostname}")
if at_time:
ctx.append(f"Time: {at_time}")
if repeats:
ctx.append(f"Repeats: {repeats}")
if ctx:
parts.append("(" + ", ".join(ctx) + ")")
detail_line = " ".join(parts).strip() or None
objects.append(
{
"name": f"{company_name} | {obj_name}" if company_name else obj_name,
"type": obj_type or "Alarm",
"status": status,
"error_message": detail_line,
}
)
overall_status = "Success"
if saw_failed:
overall_status = "Failed"
elif saw_warning:
overall_status = "Warning"
overall_message = None
return objects, overall_status, overall_message
def _parse_cloud_connect_report_from_html(html: str) -> Tuple[List[Dict], str]:
"""Parse Veeam Cloud Connect daily report (provider) HTML.
The report contains a "Backup" table with columns including:
User | Repository Name | ...
Objects in our system are a combination of the "User" and "Repository Name"
columns, separated by " | ".
Row background colour indicates status:
- Red/pink rows: Failed/Error
- Yellow/orange rows: Warning
- White rows: Success
The row where the first cell is "TOTAL" is a summary row and is not an object.
Returns: (objects, overall_status)
"""
html = _normalize_html(html)
if not html:
return [], "Success"
# Find the Backup table block.
m_table = re.search(r"(?is)<p[^>]*>\s*Backup\s*</p>\s*<table.*?</table>", html)
if not m_table:
return [], "Success"
table_html = m_table.group(0)
# Extract rows.
row_pattern = re.compile(r"(?is)<tr([^>]*)>(.*?)</tr>")
cell_pattern = re.compile(r"(?is)<t[dh][^>]*>(.*?)</t[dh]>")
objects: List[Dict] = []
saw_failed = False
saw_warning = False
for row_attr, row_inner in row_pattern.findall(table_html):
cells = cell_pattern.findall(row_inner)
if len(cells) < 3:
continue
# Convert cells to plain text.
plain = [_strip_html_tags(c).strip() for c in cells]
if not plain:
continue
# Skip header row.
if plain[0].strip().lower() == "user":
continue
user = (plain[0] or "").strip()
repo_name = (plain[2] or "").strip()
# Skip summary row.
if user.upper() == "TOTAL":
continue
if not user and not repo_name:
continue
# Determine status based on background colour.
# Veeam uses inline styles like: background-color: #fb9895 (error)
# and background-color: #ffd96c (warning).
row_style = (row_attr or "")
m_bg = re.search(r"(?i)background-color\s*:\s*([^;\"\s]+)", row_style)
bg = (m_bg.group(1).strip().lower() if m_bg else "")
status = "Success"
if bg in {"#fb9895", "#ff9999", "#f4cccc", "#ffb3b3"}:
status = "Failed"
saw_failed = True
elif bg in {"#ffd96c", "#fff2cc", "#ffe599", "#f9cb9c"}:
status = "Warning"
saw_warning = True
name = f"{user} | {repo_name}".strip(" |")
objects.append(
{
"name": name,
"type": "Repository",
"status": status,
"error_message": None,
}
)
overall_status = "Success"
if saw_failed:
overall_status = "Failed"
elif saw_warning:
overall_status = "Warning"
return objects, overall_status
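The colour-to-status mapping at the heart of this parser can be isolated as a small sketch (the palette is copied from the code above; the row attributes are made up):

```python
import re

# Colour palette copied from the parser above; row attributes are hypothetical.
ERROR_BG = {"#fb9895", "#ff9999", "#f4cccc", "#ffb3b3"}
WARNING_BG = {"#ffd96c", "#fff2cc", "#ffe599", "#f9cb9c"}

def row_status(row_attr):
    m = re.search(r"(?i)background-color\s*:\s*([^;\"\s]+)", row_attr or "")
    bg = m.group(1).strip().lower() if m else ""
    if bg in ERROR_BG:
        return "Failed"
    if bg in WARNING_BG:
        return "Warning"
    return "Success"

print(row_status(' style="background-color: #FB9895;"'))  # Failed
print(row_status(' style="background-color: #ffd96c"'))   # Warning
print(row_status(""))                                     # Success
```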
def _strip_html_tags(value: str) -> str:
"""Very small helper to strip HTML tags from a string."""
if not value:
@ -79,7 +382,9 @@ def _extract_configuration_job_overall_message(html: str) -> Optional[str]:
for line in text.split("\n"):
# Example:
# 26-12-2025 10:00:23 Warning Skipping server certificate backup because encryption is disabled
# 6-1-2026 10:00:16 Warning Skipping credentials backup because encryption is disabled
# Veeam can format dates as either zero-padded (06-01-2026) or non-padded (6-1-2026).
if re.match(r"^\d{1,2}-\d{1,2}-\d{4}\s+\d{2}:\d{2}:\d{2}\s+(Warning|Failed|Error)\b", line):
wanted_lines.append(line)
if not wanted_lines:
@ -345,12 +650,34 @@ def _extract_m365_overall_details_message(html: str) -> Optional[str]:
if not html:
return None
html = _normalize_html(html)
# Strategy 1 (preferred): locate the "Details" header cell and then scan a small
# window after it for a rowspan cell that contains the overall message.
#
# We intentionally avoid a single giant regex over the entire HTML body to keep
# parsing fast and prevent worst-case backtracking on large messages.
candidates: List[str] = []
hdr = re.search(r'(?is)<td[^>]*>\s*<b>\s*Details\s*</b>\s*</td>', html)
if hdr:
window = html[hdr.end() : hdr.end() + 6000]
m = re.search(
r'(?is)<td[^>]*rowspan\s*=\s*["\']?\s*(?:2|3|4|5|6|7|8|9|10)\s*["\']?[^>]*>(.*?)</td>',
window,
)
if m:
candidates = [m.group(1)]
# Strategy 2 (fallback): look for rowspan cells with rowspan >= 2.
if not candidates:
all_rowspans = re.findall(
r'(?is)<td[^>]*rowspan\s*=\s*["\']?\s*([2-9]|10)\s*["\']?[^>]*>(.*?)</td>',
html,
)
# re.findall above returns tuples (rowspan, content)
candidates = [c[1] for c in all_rowspans] if all_rowspans else []
if not candidates:
return None
@ -697,6 +1024,34 @@ def _strip_retry_suffix(job_name: Optional[str]) -> Optional[str]:
return cleaned or None
def _strip_m365_combined_suffix(job_name: Optional[str]) -> Optional[str]:
"""Remove the trailing "(Combined)" suffix from a Veeam M365 job name.
Veeam Backup for Microsoft 365 can send separate report emails where the
job name is suffixed with "(Combined)" (e.g. "Tenant OneDrive (Combined)").
Those should be treated as the same logical job as the non-suffixed name.
"""
if not job_name:
return job_name
cleaned = re.sub(r"\s*\(\s*Combined\s*\)\s*$", "", job_name, flags=re.IGNORECASE).strip()
return cleaned or None
def _strip_full_suffix(job_name: Optional[str]) -> Optional[str]:
"""Remove a trailing "(Full)" suffix from a Veeam job name.
Some Veeam installations create separate emails where the job name is
suffixed with "(Full)" (e.g. "Backup VM DC01 (Full)"). Those should be
treated as the same logical job as the non-suffixed name.
"""
if not job_name:
return job_name
cleaned = re.sub(r"\s*\(\s*Full\s*\)\s*$", "", job_name, flags=re.IGNORECASE).strip()
return cleaned or None
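Both suffix helpers are simple anchored substitutions; a generalized sketch (the `strip_suffix` wrapper is an illustration, not part of the codebase) behaves like this:

```python
import re

# Generalized sketch of _strip_full_suffix / _strip_m365_combined_suffix above;
# the strip_suffix wrapper itself is hypothetical.
def strip_suffix(job_name, word):
    if not job_name:
        return job_name
    cleaned = re.sub(
        rf"\s*\(\s*{word}\s*\)\s*$", "", job_name, flags=re.IGNORECASE
    ).strip()
    return cleaned or None

print(strip_suffix("Backup VM DC01 (Full)", "Full"))           # Backup VM DC01
print(strip_suffix("Tenant OneDrive (Combined)", "Combined"))  # Tenant OneDrive
print(strip_suffix("Backup VM DC01", "Full"))                  # Backup VM DC01
```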
def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
"""Try to parse a Veeam backup report mail.
@ -717,17 +1072,68 @@ def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
html_body = _normalize_html(getattr(msg, "html_body", None) or "")
html_lower = html_body.lower()
# Veeam Cloud Connect provider daily report (no [Success]/[Warning] marker).
is_cloud_connect_report = (
"veeam cloud connect" in subject.lower()
and "daily report" in subject.lower()
and "repository name" in html_lower
and "infrastructure status" in html_lower
)
# Special-case: Veeam Backup for Microsoft 365 mails can come without a
# subject marker. Detect via HTML and extract status from the banner.
is_m365 = "veeam backup for microsoft 365" in html_lower
# VSPC Active Alarms summary (no [Success]/[Warning] marker).
is_vspc_active_alarms = (
("veeam service provider console" in html_lower)
and ("active alarms" in html_lower or "active alarms summary" in subject.lower())
and ("company:" in html_lower and "alarms" in html_lower)
)
# If we cannot detect a status marker and this is not an M365 report,
# we still try to parse when the subject strongly indicates a Veeam report.
if not m_status and not m_finished and not is_m365 and not is_cloud_connect_report and not is_vspc_active_alarms:
lowered = subject.lower()
if not any(k in lowered for k in ["veeam", "cloud connect", "backup job", "backup copy job", "replica job", "configuration backup", "health check"]):
return False, {}, []
# Handle Cloud Connect daily report early: overall status is derived from row colours.
if is_cloud_connect_report:
objects, overall_status = _parse_cloud_connect_report_from_html(html_body)
overall_message = None
# Use the short subject summary when present, e.g. ": 2 Errors, 1 Warnings, 49 Successes".
m_sum = re.search(r"(?i)daily\s+report\s*:\s*(.+)$", subject)
if m_sum:
overall_message = (m_sum.group(1) or "").strip() or None
result = {
"backup_software": "Veeam",
"backup_type": "Cloud Connect Report",
"job_name": "Daily report",
"overall_status": overall_status,
}
if overall_message:
result["overall_message"] = overall_message
return True, result, objects
# Handle VSPC Active Alarms summary early.
if is_vspc_active_alarms:
objects, overall_status, overall_message = _parse_vspc_active_alarms_from_html(html_body)
result = {
"backup_software": "Veeam",
"backup_type": "Service Provider Console",
"job_name": "Active alarms summary",
"overall_status": overall_status,
}
if overall_message:
result["overall_message"] = overall_message
return True, result, objects
if m_status:
status_word = m_status.group(1)
rest = m_status.group(2)
@ -852,6 +1258,15 @@ def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
# Do not let retry counters create distinct job names.
job_name = _strip_retry_suffix(job_name)
# Veeam can append a "(Full)" suffix to the job name in some reports.
# Strip it so full/non-full mails map to the same logical job.
job_name = _strip_full_suffix(job_name)
# Veeam Backup for Microsoft 365 reports can add a "(Combined)" suffix.
# Strip it so combined/non-combined mails map to the same job.
if (backup_type or "") == "Veeam Backup for Microsoft 365":
job_name = _strip_m365_combined_suffix(job_name)
# Health Check reports should always map to a stable job name.
if (backup_type or '').lower() == 'health check':
job_name = 'Health Check'
@ -866,7 +1281,7 @@ def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
result: Dict = {
"backup_software": "Veeam",
"backup_type": backup_type,
"job_name": job_name,
"overall_status": status_word,
}
@ -893,8 +1308,17 @@ def try_parse_veeam(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
# Keep detailed overall message for non-success states, and always keep
# the "Processing <object>" marker when present (used for overrides/rules).
# Veeam Backup for Microsoft 365 can include a meaningful overall warning/info
# even when the run is reported as Success (e.g. missing application
# permissions/roles). Store it so it becomes visible in details and can be
# used for overrides.
is_m365 = (backup_type or "") == "Veeam Backup for Microsoft 365"
if overall_message:
if (
status_word != "Success"
or overall_message.lower().startswith("processing ")
or is_m365
):
result["overall_message"] = overall_message
return True, result, objects

View File

@ -14,3 +14,76 @@ main.dashboard-container {
width: min(90vw, 1728px);
max-width: 1728px;
}
/* Prevent long detail values (e.g., email addresses) from overlapping other fields */
.dl-compact dt {
white-space: nowrap;
}
.dl-compact .ellipsis-field {
min-width: 0;
overflow: hidden;
text-overflow: ellipsis;
white-space: nowrap;
cursor: pointer;
}
.dl-compact .ellipsis-field.is-expanded {
overflow: visible;
text-overflow: clip;
white-space: normal;
cursor: text;
}
/* Markdown rendering (e.g., changelog page) */
.markdown-content {
overflow-wrap: anywhere;
}
.markdown-content h1,
.markdown-content h2,
.markdown-content h3,
.markdown-content h4,
.markdown-content h5,
.markdown-content h6 {
margin-top: 1.25rem;
margin-bottom: 0.75rem;
}
.markdown-content p {
margin-bottom: 0.75rem;
}
.markdown-content ul,
.markdown-content ol {
margin-bottom: 0.75rem;
}
.markdown-content pre {
padding: 0.75rem;
border-radius: 0.5rem;
background: rgba(0, 0, 0, 0.05);
overflow: auto;
}
.markdown-content code {
font-size: 0.95em;
}
.markdown-content table {
width: 100%;
margin-bottom: 1rem;
}
.markdown-content table th,
.markdown-content table td {
padding: 0.5rem;
border-top: 1px solid rgba(0, 0, 0, 0.15);
}
.markdown-content blockquote {
border-left: 0.25rem solid rgba(0, 0, 0, 0.15);
padding-left: 0.75rem;
margin-left: 0;
color: rgba(0, 0, 0, 0.7);
}

View File

@ -68,10 +68,26 @@
<div class="collapse navbar-collapse" id="navbarNav">
{% if current_user.is_authenticated %}
<ul class="navbar-nav me-auto mb-2 mb-lg-0">
{% if active_role == 'reporter' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.reports') }}">Reports</a>
</li>
<li class="nav-item">
<a class="nav-link" href='{{ url_for("main.changelog_page") }}'>Changelog</a>
</li>
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.feedback_page') }}">Feedback</a>
</li>
{% else %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.inbox') }}">Inbox</a>
</li>
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.admin_all_mails') }}">All Mail</a>
</li>
{% endif %}
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.inbox_deleted_mails') }}">Deleted mails</a>
</li>
@ -82,6 +98,11 @@
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.jobs') }}">Jobs</a>
</li>
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.archived_jobs') }}">Archived Jobs</a>
</li>
{% endif %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.daily_jobs') }}">Daily Jobs</a>
</li>
@ -116,6 +137,7 @@
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.feedback_page') }}">Feedback</a>
</li>
{% endif %}
</ul>
<span class="navbar-text me-3">
<a class="text-decoration-none" href="{{ url_for('main.user_settings') }}">
@ -175,5 +197,98 @@
</main>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.bundle.min.js"></script>
<script>
(function () {
function isOverflowing(el) {
try {
return el && el.scrollWidth > el.clientWidth;
} catch (e) {
return false;
}
}
function collapseExpandedEllipsis(root) {
try {
if (!root || !root.querySelectorAll) return;
var expanded = root.querySelectorAll('.ellipsis-field.is-expanded');
if (!expanded || !expanded.length) return;
expanded.forEach(function (el) {
el.classList.remove('is-expanded');
setEllipsisTitle(el);
});
} catch (e) {
// no-op
}
}
function setEllipsisTitle(el) {
if (!el || el.classList.contains('is-expanded')) {
return;
}
var txt = (el.textContent || '').trim();
if (!txt) {
el.removeAttribute('title');
return;
}
if (isOverflowing(el)) {
el.setAttribute('title', txt);
} else {
el.removeAttribute('title');
}
}
document.addEventListener('click', function (e) {
var el = e.target;
if (!el) return;
if (!el.classList || !el.classList.contains('ellipsis-field')) return;
// Ignore clicks on interactive children
if (e.target.closest && e.target.closest('a, button, input, select, textarea, label')) return;
el.classList.toggle('is-expanded');
if (el.classList.contains('is-expanded')) {
el.removeAttribute('title');
} else {
setEllipsisTitle(el);
}
});
document.addEventListener('dblclick', function (e) {
var el = e.target;
if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
// Expand on double click and select all text
el.classList.add('is-expanded');
el.removeAttribute('title');
try {
var range = document.createRange();
range.selectNodeContents(el);
var sel = window.getSelection();
sel.removeAllRanges();
sel.addRange(range);
} catch (err) {
// no-op
}
});
document.addEventListener('mouseover', function (e) {
var el = e.target;
if (!el || !el.classList || !el.classList.contains('ellipsis-field')) return;
setEllipsisTitle(el);
});
// Ensure expanded fields do not persist between popup/modal openings.
document.addEventListener('show.bs.modal', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('hidden.bs.modal', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('show.bs.offcanvas', function (e) {
collapseExpandedEllipsis(e.target);
});
document.addEventListener('hidden.bs.offcanvas', function (e) {
collapseExpandedEllipsis(e.target);
});
})();
</script>
</body>
</html>

View File

@ -0,0 +1,362 @@
{% extends "layout/base.html" %}
<style>
.modal-xxl { max-width: 98vw; }
@media (min-width: 1400px) { .modal-xxl { max-width: 1400px; } }
#msg_body_container_iframe { height: 55vh; }
#msg_objects_container { max-height: 25vh; overflow: auto; }
.filter-card .form-label { font-size: 0.85rem; }
</style>
{# Pager macro must be defined before it is used #}
{% macro pager(position, page, total_pages, has_prev, has_next, filter_params) -%}
<div class="d-flex justify-content-between align-items-center my-2">
<div>
{% if has_prev %}
<a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.admin_all_mails', page=page-1, **filter_params) }}">Previous</a>
{% else %}
<button class="btn btn-outline-secondary btn-sm" disabled>Previous</button>
{% endif %}
{% if has_next %}
<a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.admin_all_mails', page=page+1, **filter_params) }}">Next</a>
{% else %}
<button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button>
{% endif %}
</div>
<div class="d-flex align-items-center">
<span class="me-2">Page {{ page }} of {{ total_pages }}</span>
<form method="get" class="d-flex align-items-center mb-0">
{% for k, v in filter_params.items() %}
{% if v %}
<input type="hidden" name="{{ k }}" value="{{ v }}" />
{% endif %}
{% endfor %}
<label for="page_{{ position }}" class="form-label me-1 mb-0">Go to:</label>
<input
type="number"
min="1"
max="{{ total_pages }}"
class="form-control form-control-sm me-1"
id="page_{{ position }}"
name="page"
value="{{ page }}"
style="width: 5rem;"
/>
<button type="submit" class="btn btn-primary btn-sm">Go</button>
</form>
</div>
</div>
{%- endmacro %}
{% block content %}
<h2 class="mb-3">All Mail</h2>
<div class="card mb-3 filter-card">
<div class="card-header d-flex justify-content-between align-items-center">
<span>Search Filters</span>
<div class="d-flex gap-3">
<a class="small" href="{{ url_for('main.admin_all_mails') }}">Clear Filter Values</a>
<button class="btn btn-primary btn-sm" type="submit" form="mailFilterForm">Search</button>
</div>
</div>
<div class="card-body">
<form id="mailFilterForm" method="get" action="{{ url_for('main.admin_all_mails') }}">
<div class="row g-3">
<div class="col-12 col-lg-3">
<label class="form-label" for="from_q">From contains</label>
<input class="form-control form-control-sm" type="text" id="from_q" name="from_q" value="{{ filter_params.from_q }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="subject_q">Subject contains</label>
<input class="form-control form-control-sm" type="text" id="subject_q" name="subject_q" value="{{ filter_params.subject_q }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="backup_q">Backup contains</label>
<input class="form-control form-control-sm" type="text" id="backup_q" name="backup_q" value="{{ filter_params.backup_q }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="type_q">Type contains</label>
<input class="form-control form-control-sm" type="text" id="type_q" name="type_q" value="{{ filter_params.type_q }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="job_name_q">Job name contains</label>
<input class="form-control form-control-sm" type="text" id="job_name_q" name="job_name_q" value="{{ filter_params.job_name_q }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="received_from">Received &gt;=</label>
<input class="form-control form-control-sm" type="datetime-local" id="received_from" name="received_from" value="{{ filter_params.received_from }}" />
</div>
<div class="col-12 col-lg-3">
<label class="form-label" for="received_to">Received &lt;=</label>
<input class="form-control form-control-sm" type="datetime-local" id="received_to" name="received_to" value="{{ filter_params.received_to }}" />
</div>
<div class="col-12 col-lg-3 d-flex align-items-end">
<div class="form-check">
<input class="form-check-input" type="checkbox" id="only_unlinked" name="only_unlinked" value="1" {% if filter_params.only_unlinked %}checked{% endif %} />
<label class="form-check-label" for="only_unlinked">Only unlinked</label>
</div>
</div>
</div>
</form>
</div>
</div>
{{ pager("top", page, total_pages, has_prev, has_next, filter_params) }}
<div class="table-responsive">
<table class="table table-sm table-hover align-middle" id="mailAuditTable">
<thead class="table-light">
<tr>
<th scope="col">Received</th>
<th scope="col">From</th>
<th scope="col">Subject</th>
<th scope="col">Backup</th>
<th scope="col">Type</th>
<th scope="col">Job name</th>
<th scope="col">Linked</th>
<th scope="col">Parsed</th>
<th scope="col">EML</th>
</tr>
</thead>
<tbody>
{% if rows %}
{% for row in rows %}
<tr class="mail-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
<td>{{ row.received_at }}</td>
<td>{{ row.from_address }}</td>
<td>{{ row.subject }}</td>
<td>{{ row.backup_software }}</td>
<td>{{ row.backup_type }}</td>
<td>{{ row.job_name }}</td>
<td>
{% if row.linked %}
<span class="badge bg-success">Linked</span>
{% else %}
<span class="badge bg-warning text-dark">Unlinked</span>
{% endif %}
</td>
<td>{{ row.parsed_at }}</td>
<td>
{% if row.has_eml %}
<a class="eml-download" href="{{ url_for('main.inbox_message_eml', message_id=row.id) }}" onclick="event.stopPropagation();">EML</a>
{% endif %}
</td>
</tr>
{% endfor %}
{% else %}
<tr>
<td colspan="9" class="text-center text-muted py-3">No messages found.</td>
</tr>
{% endif %}
</tbody>
</table>
</div>
{{ pager("bottom", page, total_pages, has_prev, has_next, filter_params) }}
<div class="modal fade" id="mailMessageModal" tabindex="-1" aria-labelledby="mailMessageModalLabel" aria-hidden="true">
<div class="modal-dialog modal-xl modal-dialog-scrollable modal-xxl">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="mailMessageModalLabel">Message details</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<div class="modal-body">
<div class="row">
<div class="col-md-3">
<dl class="row mb-0 dl-compact">
<dt class="col-4">From</dt>
<dd class="col-8 ellipsis-field" id="msg_from"></dd>
<dt class="col-4">Backup</dt>
<dd class="col-8 ellipsis-field" id="msg_backup"></dd>
<dt class="col-4">Type</dt>
<dd class="col-8 ellipsis-field" id="msg_type"></dd>
<dt class="col-4">Job</dt>
<dd class="col-8 ellipsis-field" id="msg_job"></dd>
<dt class="col-4">Overall</dt>
<dd class="col-8 ellipsis-field" id="msg_overall"></dd>
<dt class="col-4">Customer</dt>
<dd class="col-8 ellipsis-field" id="msg_customer"></dd>
<dt class="col-4">Received</dt>
<dd class="col-8 ellipsis-field" id="msg_received"></dd>
<dt class="col-4">Parsed</dt>
<dd class="col-8 ellipsis-field" id="msg_parsed"></dd>
<dt class="col-4">Details</dt>
<dd class="col-8" id="msg_overall_message" style="white-space: pre-wrap;"></dd>
</dl>
</div>
<div class="col-md-9">
<div class="border rounded p-0" style="overflow:hidden;">
<iframe id="msg_body_container_iframe" class="w-100" style="height:55vh; border:0; background:transparent;" sandbox="allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation"></iframe>
</div>
<div class="mt-3">
<div id="msg_objects_container"></div>
</div>
</div>
</div>
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Close</button>
</div>
</div>
</div>
</div>
<script>
(function () {
function initAdminAllMailPopup() {
var table = document.getElementById('mailAuditTable');
var modalEl = document.getElementById('mailMessageModal');
if (!table || !modalEl) return;
// base.html loads Bootstrap JS after the page content, so this handler runs
// on DOMContentLoaded; bail out defensively if bootstrap.Modal is still missing.
if (typeof bootstrap === 'undefined' || !bootstrap.Modal) return;
var modal = new bootstrap.Modal(modalEl);
function setText(id, value) {
var el = document.getElementById(id);
if (el) el.textContent = value || '';
}
function objectSeverityRank(o) {
var st = String((o && o.status) || '').trim().toLowerCase();
var err = String((o && o.error_message) || '').trim();
if (st === 'error' || st === 'failed' || st === 'failure' || err) return 0;
if (st === 'warning') return 1;
return 2;
}
function sortObjects(objects) {
return (objects || []).slice().sort(function (a, b) {
var ra = objectSeverityRank(a);
var rb = objectSeverityRank(b);
if (ra !== rb) return ra - rb;
var na = String((a && a.name) || '').toLowerCase();
var nb = String((b && b.name) || '').toLowerCase();
if (na < nb) return -1;
if (na > nb) return 1;
var ta = String((a && a.type) || '').toLowerCase();
var tb = String((b && b.type) || '').toLowerCase();
if (ta < tb) return -1;
if (ta > tb) return 1;
return 0;
});
}
function renderObjects(objects) {
var container = document.getElementById('msg_objects_container');
if (!container) return;
container.innerHTML = '';
if (!objects || !objects.length) {
container.innerHTML = '<div class="text-muted">No objects stored.</div>';
return;
}
// Object fields come from parsed mail and are untrusted; escape them
// before concatenating into HTML.
function esc(s) {
return String(s == null ? '' : s)
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;');
}
var tableHtml = '<div class="table-responsive"><table class="table table-sm table-hover align-middle">' +
'<thead class="table-light"><tr><th>Name</th><th>Type</th><th>Status</th><th>Error</th></tr></thead><tbody>';
var sorted = sortObjects(objects);
for (var i = 0; i < sorted.length; i++) {
var o = sorted[i] || {};
tableHtml += '<tr>' +
'<td>' + esc(o.name) + '</td>' +
'<td>' + esc(o.type) + '</td>' +
'<td>' + esc(o.status) + '</td>' +
'<td style="white-space: pre-wrap;">' + esc(o.error_message) + '</td>' +
'</tr>';
}
tableHtml += '</tbody></table></div>';
container.innerHTML = tableHtml;
}
function wrapMailHtml(html) {
html = html || "";
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
function setIframeHtml(html) {
var iframe = document.getElementById('msg_body_container_iframe');
if (!iframe) return;
iframe.srcdoc = wrapMailHtml(html || '<p>No message content stored.</p>');
}
async function openMessage(messageId) {
try {
// Replace only the trailing /0 placeholder, not an earlier '/0' in the path.
var res = await fetch('{{ url_for('main.inbox_message_detail', message_id=0) }}'.replace(/\/0$/, '/' + messageId));
if (!res.ok) throw new Error('Failed to load message');
var data = await res.json();
if (!data || data.status !== 'ok') throw new Error('Invalid response');
var meta = data.meta || {};
setText('msg_from', meta.from_address);
setText('msg_backup', meta.backup_software);
setText('msg_type', meta.backup_type);
setText('msg_job', meta.job_name);
setText('msg_overall', meta.overall_status);
setText('msg_customer', meta.customer_name);
setText('msg_received', meta.received_at);
setText('msg_parsed', meta.parsed_at);
setText('msg_overall_message', meta.overall_message);
setIframeHtml(data.body_html || "");
renderObjects(data.objects);
modal.show();
} catch (e) {
alert('Unable to open message details.');
}
}
table.addEventListener('click', function (e) {
var tr = e.target.closest('tr.mail-row');
if (!tr) return;
var id = tr.getAttribute('data-message-id');
if (!id) return;
openMessage(id);
});
}
document.addEventListener('DOMContentLoaded', initAdminAllMailPopup);
})();
</script>
{% endblock %}
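The `objectSeverityRank`/`sortObjects` pair in the template above orders backup objects errors-first, then warnings, then everything else, with a case-insensitive name/type tiebreak. A standalone sketch of that comparator (plain JavaScript, function bodies copied from the template; the sample objects are made up for illustration):

```javascript
// Rank an object by severity: 0 = error, 1 = warning, 2 = everything else.
// Any non-empty error_message counts as an error even if the status field
// says otherwise, mirroring the template logic.
function objectSeverityRank(o) {
  var st = String((o && o.status) || '').trim().toLowerCase();
  var err = String((o && o.error_message) || '').trim();
  if (st === 'error' || st === 'failed' || st === 'failure' || err) return 0;
  if (st === 'warning') return 1;
  return 2;
}

// Sort a copy of the list: severity first, then name, then type.
function sortObjects(objects) {
  return (objects || []).slice().sort(function (a, b) {
    var ra = objectSeverityRank(a);
    var rb = objectSeverityRank(b);
    if (ra !== rb) return ra - rb;
    var na = String((a && a.name) || '').toLowerCase();
    var nb = String((b && b.name) || '').toLowerCase();
    if (na < nb) return -1;
    if (na > nb) return 1;
    var ta = String((a && a.type) || '').toLowerCase();
    var tb = String((b && b.type) || '').toLowerCase();
    if (ta < tb) return -1;
    if (ta > tb) return 1;
    return 0;
  });
}

var sorted = sortObjects([
  { name: 'vm-b', type: 'VM', status: 'Success' },
  { name: 'vm-a', type: 'VM', status: 'Warning' },
  { name: 'vm-c', type: 'VM', status: 'Success', error_message: 'disk timeout' }
]);
console.log(sorted.map(function (o) { return o.name; })); // → ['vm-c', 'vm-a', 'vm-b']
```

Sorting a copy (`.slice()`) keeps the original server-supplied order intact in case other UI code still depends on it.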

View File

@@ -0,0 +1,45 @@
{% extends "layout/base.html" %}
{% block content %}
<h2 class="mb-3">Archived Jobs</h2>
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th scope="col">Customer</th>
<th scope="col">Backup</th>
<th scope="col">Type</th>
<th scope="col">Job name</th>
<th scope="col">Archived at</th>
<th scope="col" class="text-end">Actions</th>
</tr>
</thead>
<tbody>
{% if jobs %}
{% for j in jobs %}
<tr>
<td>{{ j.customer_name }}</td>
<td>{{ j.backup_software }}</td>
<td>{{ j.backup_type }}</td>
<td>
<a class="text-decoration-none" href="{{ url_for('main.job_detail', job_id=j.id) }}">{{ j.job_name }}</a>
</td>
<td>{{ j.archived_at }}</td>
<td class="text-end">
<form method="post" action="{{ url_for('main.unarchive_job', job_id=j.id) }}" style="display:inline;">
<button type="submit" class="btn btn-sm btn-outline-secondary">Restore</button>
</form>
</td>
</tr>
{% endfor %}
{% else %}
<tr>
<td colspan="6" class="text-center text-muted py-3">
No archived jobs found.
</td>
</tr>
{% endif %}
</tbody>
</table>
</div>
{% endblock %}

View File

@@ -4,74 +4,29 @@
<div class="d-flex align-items-center justify-content-between mb-3">
<div>
<h1 class="h3 mb-1">Changelog</h1>
<div class="text-body-secondary">Product versions and changes.</div>
<div class="text-body-secondary">Loaded live from the repository.</div>
</div>
{% if changelog_source_url %}
<div class="text-end">
<a class="btn btn-sm btn-outline-secondary" href="{{ changelog_source_url }}" target="_blank" rel="noopener">
View source
</a>
</div>
{% endif %}
</div>
{# Completed (summary) #}
<div class="card mb-4">
<div class="card-header d-flex align-items-center justify-content-between">
<div class="fw-semibold">Completed</div>
<span class="badge text-bg-primary">History</span>
{% if changelog_error %}
<div class="alert alert-warning" role="alert">
{{ changelog_error }}
</div>
{% endif %}
<div class="card">
<div class="card-body">
{% if changelog.completed_summary and changelog.completed_summary|length > 0 %}
<div class="accordion" id="changelogCompletedAccordion">
{% for item in changelog.completed_summary %}
<div class="accordion-item">
<h2 class="accordion-header" id="completedHeading{{ loop.index }}">
<button class="accordion-button {% if not loop.first %}collapsed{% endif %}" type="button" data-bs-toggle="collapse" data-bs-target="#completedCollapse{{ loop.index }}" aria-expanded="{% if loop.first %}true{% else %}false{% endif %}" aria-controls="completedCollapse{{ loop.index }}">
<span class="fw-semibold">v{{ item.version }}</span>
</button>
</h2>
<div id="completedCollapse{{ loop.index }}" class="accordion-collapse collapse {% if loop.first %}show{% endif %}" aria-labelledby="completedHeading{{ loop.index }}" data-bs-parent="#changelogCompletedAccordion">
<div class="accordion-body">
{% if item.overview and item.overview|length > 0 %}
{% for p in item.overview %}
<p class="mb-2">{{ p }}</p>
{% endfor %}
{% endif %}
{% if item.categories and item.categories|length > 0 %}
{% for cat in item.categories %}
<div class="fw-semibold mb-2">{{ cat.category }}</div>
{# NOTE: 'items' is a dict key; use bracket notation to avoid calling dict.items() #}
{% if cat['items'] and cat['items']|length > 0 %}
{% for it in cat['items'] %}
<div class="mb-3">
{% if it.title %}
<div class="fw-semibold">{{ it.title }}</div>
{% endif %}
{% if it.details and it.details|length > 0 %}
<ul class="mb-0">
{% for d in it.details %}
<li>{{ d }}</li>
{% endfor %}
</ul>
{% endif %}
</div>
{% endfor %}
{% if changelog_html %}
<div class="markdown-content">{{ changelog_html | safe }}</div>
{% else %}
<div class="text-body-secondary mb-3">No items in this section.</div>
{% endif %}
{% endfor %}
{% elif item.highlights and item.highlights|length > 0 %}
<ul class="mb-0">
{% for h in item.highlights %}
<li>{{ h }}</li>
{% endfor %}
</ul>
{% else %}
<div class="text-body-secondary">No details.</div>
{% endif %}
</div>
</div>
</div>
{% endfor %}
</div>
{% else %}
<div class="text-body-secondary">No completed items.</div>
<div class="text-body-secondary">No changelog content available.</div>
{% endif %}
</div>
</div>

View File

@@ -172,18 +172,18 @@
<div id="dj_runs_list" class="list-group"></div>
</div>
<div class="col-md-9 dj-detail-col">
<dl class="row mb-3">
<dl class="row mb-3 dl-compact">
<dt class="col-3">From</dt>
<dd class="col-9" id="dj_from"></dd>
<dd class="col-9 ellipsis-field" id="dj_from"></dd>
<dt class="col-3">Subject</dt>
<dd class="col-9" id="dj_subject"></dd>
<dd class="col-9 ellipsis-field" id="dj_subject"></dd>
<dt class="col-3">Received</dt>
<dd class="col-9" id="dj_received"></dd>
<dd class="col-9 ellipsis-field" id="dj_received"></dd>
<dt class="col-3">Status</dt>
<dd class="col-9" id="dj_status"></dd>
<dd class="col-9 ellipsis-field" id="dj_status"></dd>
<dt class="col-3">Remark</dt>
<dd class="col-9" id="dj_remark" style="white-space: pre-wrap;"></dd>
@@ -200,7 +200,7 @@
<button type="button" class="btn btn-sm btn-outline-primary" id="dj_ticket_save">Add</button>
</div>
<div class="mt-2">
<textarea class="form-control form-control-sm" id="dj_ticket_description" rows="2" placeholder="Description (optional)"></textarea>
<input class="form-control form-control-sm" id="dj_ticket_code" type="text" placeholder="Ticket number (e.g., T20260106.0001)" />
</div>
<div class="mt-2 small text-muted" id="dj_ticket_status"></div>
</div>
@@ -234,7 +234,6 @@
</div>
<div class="dj-objects-panel">
<h6>Objects</h6>
<div class="table-responsive dj-objects-scroll">
<table class="table table-sm table-bordered" id="dj_objects_table">
<thead class="table-light" style="position: sticky; top: 0; z-index: 1;">
@@ -293,22 +292,57 @@
return "";
}
function objectSeverityRank(o) {
var st = String((o && o.status) || '').trim().toLowerCase();
var err = String((o && o.error_message) || '').trim();
if (st === 'error' || st === 'failed' || st === 'failure' || err) return 0;
if (st === 'warning') return 1;
return 2;
}
function sortObjects(objects) {
return (objects || []).slice().sort(function (a, b) {
var ra = objectSeverityRank(a);
var rb = objectSeverityRank(b);
if (ra !== rb) return ra - rb;
var na = String((a && a.name) || '').toLowerCase();
var nb = String((b && b.name) || '').toLowerCase();
if (na < nb) return -1;
if (na > nb) return 1;
var ta = String((a && a.type) || '').toLowerCase();
var tb = String((b && b.type) || '').toLowerCase();
if (ta < tb) return -1;
if (ta > tb) return 1;
return 0;
});
}
function wrapMailHtml(html) {
html = html || "";
// Ensure we render the mail HTML with its own CSS, isolated from the site styling.
return (
"<!doctype html><html><head><meta charset=\"utf-8\">" +
"<base target=\"_blank\">" +
"</head><body style=\"margin:0; padding:8px;\">" +
html +
"</body></html>"
);
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
var currentJobId = null;
var currentRunId = null;
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
function escapeHtml(s) {
return (s || "").toString()
@@ -358,25 +392,11 @@
'<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
'<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
(t.description ? ('<div class="small text-muted mt-1">' + escapeHtml(t.description) + '</div>') : '') +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="toggle-edit-ticket" data-id="' + t.id + '" ' + (t.resolved_at ? 'disabled' : '') + '>Edit</button>' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-ticket" data-id="' + t.id + '" ' + (t.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'</div>' +
'</div>' +
'<div class="mt-2" data-edit="ticket" style="display:none;">' +
'<div class="row g-2">' +
'<div class="col-12">' +
'<textarea class="form-control form-control-sm" data-field="description" rows="2" placeholder="Description (optional)">' + escapeHtml(t.description || '') + '</textarea>' +
'</div>' +
'<div class="col-12 d-flex gap-2">' +
'<button type="button" class="btn btn-sm btn-primary" data-action="save-ticket" data-id="' + t.id + '">Save</button>' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="cancel-edit" data-id="' + t.id + '">Cancel</button>' +
'<div class="small text-muted align-self-center" data-field="status"></div>' +
'</div>' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
@@ -397,22 +417,9 @@
(r.body ? ('<div class="small text-muted mt-1">' + escapeHtml(r.body) + '</div>') : '') +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="toggle-edit-remark" data-id="' + r.id + '" ' + (r.resolved_at ? 'disabled' : '') + '>Edit</button>' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-remark" data-id="' + r.id + '" ' + (r.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'</div>' +
'</div>' +
'<div class="mt-2" data-edit="remark" style="display:none;">' +
'<div class="row g-2">' +
'<div class="col-12">' +
'<textarea class="form-control form-control-sm" data-field="body" rows="2" placeholder="Body (required)">' + escapeHtml(r.body || '') + '</textarea>' +
'</div>' +
'<div class="col-12 d-flex gap-2">' +
'<button type="button" class="btn btn-sm btn-primary" data-action="save-remark" data-id="' + r.id + '">Save</button>' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="cancel-edit" data-id="' + r.id + '">Cancel</button>' +
'<div class="small text-muted align-self-center" data-field="status"></div>' +
'</div>' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
@@ -427,8 +434,6 @@
var id = btn.getAttribute('data-id');
if (!action || !id) return;
var wrapper = btn.closest('[data-alert-type]');
if (action === 'resolve-ticket') {
if (!confirm('Mark ticket as resolved?')) return;
apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
@@ -439,59 +444,6 @@
apiJson('/api/remarks/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) { alert(e.message || 'Failed.'); });
} else if (action === 'toggle-edit-ticket') {
if (!wrapper) return;
var edit = wrapper.querySelector('[data-edit="ticket"]');
if (!edit) return;
edit.style.display = (edit.style.display === 'none' || !edit.style.display) ? '' : 'none';
} else if (action === 'toggle-edit-remark') {
if (!wrapper) return;
var edit2 = wrapper.querySelector('[data-edit="remark"]');
if (!edit2) return;
edit2.style.display = (edit2.style.display === 'none' || !edit2.style.display) ? '' : 'none';
} else if (action === 'cancel-edit') {
if (!wrapper) return;
var editAny = wrapper.querySelector('[data-edit]');
if (editAny) editAny.style.display = 'none';
} else if (action === 'save-ticket') {
if (!wrapper) return;
var editT = wrapper.querySelector('[data-edit="ticket"]');
if (!editT) return;
var descEl = editT.querySelector('[data-field="description"]');
var statusEl = editT.querySelector('[data-field="status"]');
var descVal = descEl ? descEl.value : '';
if (statusEl) statusEl.textContent = 'Saving...';
apiJson('/api/tickets/' + encodeURIComponent(id), {
method: 'PATCH',
body: JSON.stringify({description: descVal})
})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) {
if (statusEl) statusEl.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
} else if (action === 'save-remark') {
if (!wrapper) return;
var editR = wrapper.querySelector('[data-edit="remark"]');
if (!editR) return;
var bodyEl2 = editR.querySelector('[data-field="body"]');
var statusEl2 = editR.querySelector('[data-field="status"]');
var bodyVal2 = bodyEl2 ? bodyEl2.value : '';
if (!bodyVal2 || !bodyVal2.trim()) {
if (statusEl2) statusEl2.textContent = 'Body is required.';
else alert('Body is required.');
return;
}
if (statusEl2) statusEl2.textContent = 'Saving...';
apiJson('/api/remarks/' + encodeURIComponent(id), {
method: 'PATCH',
body: JSON.stringify({body: bodyVal2})
})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) {
if (statusEl2) statusEl2.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
}
});
});
@@ -528,7 +480,7 @@
function bindInlineCreateForms() {
var btnTicket = document.getElementById('dj_ticket_save');
var btnRemark = document.getElementById('dj_remark_save');
var tDesc = document.getElementById('dj_ticket_description');
var tCode = document.getElementById('dj_ticket_code');
var tStatus = document.getElementById('dj_ticket_status');
var rBody = document.getElementById('dj_remark_body');
var rStatus = document.getElementById('dj_remark_status');
@@ -541,7 +493,7 @@
function setDisabled(disabled) {
if (btnTicket) btnTicket.disabled = disabled;
if (btnRemark) btnRemark.disabled = disabled;
if (tDesc) tDesc.disabled = disabled;
if (tCode) tCode.disabled = disabled;
if (rBody) rBody.disabled = disabled;
}
@@ -552,14 +504,24 @@
btnTicket.addEventListener('click', function () {
if (!currentRunId) { alert('Select a run first.'); return; }
clearStatus();
var description = tDesc ? tDesc.value : '';
var ticket_code = tCode ? (tCode.value || '').trim().toUpperCase() : '';
if (!ticket_code) {
if (tStatus) tStatus.textContent = 'Ticket number is required.';
else alert('Ticket number is required.');
return;
}
if (!/^T\d{8}\.\d{4}$/.test(ticket_code)) {
if (tStatus) tStatus.textContent = 'Invalid ticket number format. Expected TYYYYMMDD.####.';
else alert('Invalid ticket number format. Expected TYYYYMMDD.####.');
return;
}
if (tStatus) tStatus.textContent = 'Saving...';
apiJson('/api/tickets', {
method: 'POST',
body: JSON.stringify({job_run_id: currentRunId, description: description})
body: JSON.stringify({job_run_id: currentRunId, ticket_code: ticket_code})
})
.then(function () {
if (tDesc) tDesc.value = '';
if (tCode) tCode.value = '';
if (tStatus) tStatus.textContent = '';
loadAlerts(currentRunId);
})
@@ -678,7 +640,7 @@
var tbody = document.querySelector("#dj_objects_table tbody");
tbody.innerHTML = "";
(run.objects || []).forEach(function (obj) {
sortObjects(run.objects || []).forEach(function (obj) {
var tr = document.createElement("tr");
var tdName = document.createElement("td");
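The diff above replaces the naive `wrapMailHtml` wrapper with one that injects a forced-light head block into full HTML documents as well as fragments before handing the result to a sandboxed iframe via `srcdoc`. A standalone sketch of that injection strategy (plain JavaScript; the `injection` string is abbreviated here — the template's version also sets color-scheme and viewport metas):

```javascript
// Head block to inject: charset, link targeting, and a minimal body reset so
// the mail's CSS stays isolated from the surrounding site styling.
var injection = '<meta charset="utf-8"><base target="_blank">' +
  '<style>body{margin:0;padding:8px;background:#fff;}</style>';

function wrapMailHtml(html) {
  html = html || '';
  var trimmed = (typeof html === 'string') ? html.trim() : '';
  function injectIntoFullDoc(d) {
    // Prefer an existing <head>; fall back to <html>, then to full wrapping.
    if (/<head[^>]*>/i.test(d)) {
      return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
    }
    if (/<html[^>]*>/i.test(d)) {
      return d.replace(/<html[^>]*>/i, function (m) {
        return m + '<head>' + injection + '</head>';
      });
    }
    return '<!doctype html><html><head>' + injection + '</head><body>' + d + '</body></html>';
  }
  var lower = trimmed.toLowerCase();
  // Full documents get the injection spliced in; bare fragments get wrapped.
  if (lower.indexOf('<!doctype') === 0 || lower.indexOf('<html') === 0) {
    return injectIntoFullDoc(trimmed);
  }
  return '<!doctype html><html><head>' + injection + '</head><body>' + html + '</body></html>';
}
```

Using a replace-callback for the `<head>` match avoids `$&`-style substitution pitfalls when the injected string itself contains `$` sequences.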

View File

@@ -49,6 +49,46 @@
</div>
</div>
<div class="card mb-3">
<div class="card-body">
<h5 class="card-title mb-3">Replies</h5>
{% if replies %}
<div class="list-group list-group-flush">
{% for r in replies %}
<div class="list-group-item px-0">
<div class="d-flex justify-content-between align-items-start">
<strong>{{ reply_user_map.get(r.user_id, '') or ('User #' ~ r.user_id) }}</strong>
<span class="text-muted" style="font-size: 0.85rem;">
{{ r.created_at.strftime('%d-%m-%Y %H:%M:%S') if r.created_at else '' }}
</span>
</div>
<div style="white-space: pre-wrap;">{{ r.message }}</div>
</div>
{% endfor %}
</div>
{% else %}
<div class="text-muted">No replies yet.</div>
{% endif %}
</div>
</div>
<div class="card mb-3">
<div class="card-body">
<h5 class="card-title mb-3">Add reply</h5>
{% if item.status == 'open' %}
<form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}">
<div class="mb-2">
<textarea class="form-control" name="message" rows="4" required></textarea>
</div>
<button type="submit" class="btn btn-primary">Post reply</button>
</form>
{% else %}
<div class="text-muted">Replies can only be added while the item is open.</div>
{% endif %}
</div>
</div>
<div class="col-12 col-lg-4">
<div class="card">
<div class="card-body">

View File

@@ -56,10 +56,26 @@
{{ pager("top", page, total_pages, has_prev, has_next) }}
{% if can_bulk_delete %}
<div class="d-flex justify-content-between align-items-center mb-2">
<div class="btn-group">
<button type="button" class="btn btn-sm btn-outline-danger" id="btn_inbox_delete_selected" disabled>Delete selected</button>
</div>
</div>
<div class="small text-muted mb-2" id="inbox_status"></div>
{% endif %}
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<table class="table table-sm table-hover align-middle" id="inboxTable">
<thead class="table-light">
<tr>
{% if can_bulk_delete %}
<th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="inbox_select_all" />
</th>
{% endif %}
<th scope="col">From</th>
<th scope="col">Subject</th>
<th scope="col">Date / time</th>
@@ -75,6 +91,11 @@
{% if rows %}
{% for row in rows %}
<tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
{% if can_bulk_delete %}
<td onclick="event.stopPropagation();">
<input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" />
</td>
{% endif %}
<td>{{ row.from_address }}</td>
<td>{{ row.subject }}</td>
<td>{{ row.received_at }}</td>
@@ -114,21 +135,21 @@
<div class="modal-body">
<div class="row">
<div class="col-md-3">
<dl class="row mb-0">
<dl class="row mb-0 dl-compact">
<dt class="col-4">From</dt>
<dd class="col-8" id="msg_from"></dd>
<dd class="col-8 ellipsis-field" id="msg_from"></dd>
<dt class="col-4">Backup</dt>
<dd class="col-8" id="msg_backup"></dd>
<dd class="col-8 ellipsis-field" id="msg_backup"></dd>
<dt class="col-4">Type</dt>
<dd class="col-8" id="msg_type"></dd>
<dd class="col-8 ellipsis-field" id="msg_type"></dd>
<dt class="col-4">Job</dt>
<dd class="col-8" id="msg_job"></dd>
<dd class="col-8 ellipsis-field" id="msg_job"></dd>
<dt class="col-4">Overall</dt>
<dd class="col-8" id="msg_overall"></dd>
<dd class="col-8 ellipsis-field" id="msg_overall"></dd>
<dt class="col-4">Customer</dt>
<dd class="col-8">
@@ -140,28 +161,28 @@
{% endfor %}
</datalist>
{% else %}
<span id="msg_customer_display"></span>
<span id="msg_customer_display" class="ellipsis-field"></span>
{% endif %}
</dd>
<dt class="col-4">Received</dt>
<dd class="col-8" id="msg_received"></dd>
<dd class="col-8 ellipsis-field" id="msg_received"></dd>
<dt class="col-4">Parsed</dt>
<dd class="col-8" id="msg_parsed"></dd>
<dt class="col-4">Details</dt>
<dd class="col-8" id="msg_overall_message" style="white-space: pre-wrap;"></dd>
<dd class="col-8 ellipsis-field" id="msg_parsed"></dd>
</dl>
</div>
<div class="col-md-9">
<div class="mb-2">
<h6 class="mb-1">Details</h6>
<div id="msg_overall_message" class="border rounded p-2" style="white-space: pre-wrap; max-height: 20vh; overflow: auto;"></div>
</div>
<div class="border rounded p-0" style="overflow:hidden;">
<iframe id="msg_body_container_iframe" class="w-100" style="height:55vh; border:0; background:transparent;" sandbox="allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation"></iframe>
</div>
<div class="mt-3">
<h6>Objects</h6>
<div id="msg_objects_container">
<!-- Parsed objects will be rendered here -->
</div>
@@ -174,7 +195,8 @@
{% if current_user.is_authenticated and active_role in ["admin", "operator"] %}
<form id="inboxApproveForm" method="POST" action="" class="me-auto mb-0">
<input type="hidden" id="msg_customer_id" name="customer_id" value="" />
<button type="submit" class="btn btn-primary">Approve job</button>
<button type="submit" class="btn btn-primary" id="inboxApproveBtn">Approve job</button>
<button type="button" class="btn btn-outline-primary ms-2 d-none" id="vspcMapCompaniesBtn">Map companies</button>
</form>
<form id="inboxDeleteForm" method="POST" action="" class="mb-0">
<button type="submit" class="btn btn-outline-danger" onclick="return confirm('Delete this message from the Inbox?');">Delete</button>
@@ -186,23 +208,198 @@
</div>
</div>
<!-- VSPC company mapping modal (for multi-company summary emails) -->
<div class="modal fade" id="vspcCompanyMapModal" tabindex="-1" aria-labelledby="vspcCompanyMapModalLabel" aria-hidden="true">
<div class="modal-dialog modal-lg modal-dialog-scrollable">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title" id="vspcCompanyMapModalLabel">Map companies to customers</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal" aria-label="Close"></button>
</div>
<form id="vspcCompanyMapForm" method="POST" action="">
<div class="modal-body">
<p class="mb-2">This message contains multiple companies. Map each company to a customer before approving.</p>
<datalist id="vspcCustomerList">
{% for c in customers %}
<option value="{{ c.name }}"></option>
{% endfor %}
</datalist>
<div class="table-responsive" style="max-height:55vh; overflow-y:auto;">
<table class="table table-sm align-middle">
<thead>
<tr>
<th style="width: 40%;">Company</th>
<th style="width: 60%;">Customer</th>
</tr>
</thead>
<tbody id="vspcCompanyMapTbody">
<!-- rows injected by JS -->
</tbody>
</table>
</div>
<input type="hidden" id="vspc_company_mappings_json" name="company_mappings_json" value="" />
</div>
<div class="modal-footer">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Approve mapped companies</button>
</div>
</form>
</div>
</div>
</div>
<script>
(function () {
var customers = {{ customers|tojson|safe }};
var table = document.getElementById('inboxTable');
var selectAll = document.getElementById('inbox_select_all');
var btnDeleteSelected = document.getElementById('btn_inbox_delete_selected');
var statusEl = document.getElementById('inbox_status');
function getSelectedMessageIds() {
if (!table) return [];
var cbs = table.querySelectorAll('tbody .inbox_row_cb');
var ids = [];
cbs.forEach(function (cb) {
if (cb.checked) ids.push(parseInt(cb.value, 10));
});
return ids.filter(function (x) { return Number.isFinite(x); });
}
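The id-parsing step above can be exercised on its own: checkbox values arrive as strings, and `parseInt` plus `Number.isFinite` drops anything that is not a clean integer. A minimal standalone sketch (function name is illustrative):

```javascript
// Standalone sketch of the id filtering used by getSelectedMessageIds():
// non-numeric and empty values become NaN and are filtered out.
function parseIds(values) {
  return values
    .map(function (v) { return parseInt(v, 10); })
    .filter(function (x) { return Number.isFinite(x); });
}
```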
function refreshRowHighlights() {
if (!table) return;
var cbs = table.querySelectorAll('tbody .inbox_row_cb');
cbs.forEach(function (cb) {
var tr = cb.closest ? cb.closest('tr') : null;
if (!tr) return;
if (cb.checked) tr.classList.add('table-active');
else tr.classList.remove('table-active');
});
}
function refreshSelectAll() {
if (!selectAll || !table) return;
var cbs = table.querySelectorAll('tbody .inbox_row_cb');
var total = cbs.length;
var checked = 0;
cbs.forEach(function (cb) { if (cb.checked) checked++; });
selectAll.indeterminate = checked > 0 && checked < total;
selectAll.checked = total > 0 && checked === total;
}
function updateBulkDeleteUi() {
var ids = getSelectedMessageIds();
refreshRowHighlights();
if (btnDeleteSelected) btnDeleteSelected.disabled = ids.length === 0;
if (statusEl) statusEl.textContent = ids.length ? (ids.length + ' selected') : '';
refreshSelectAll();
}
function postJson(url, payload) {
return fetch(url, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
credentials: 'same-origin',
body: JSON.stringify(payload || {})
}).then(function (r) {
return r.json().then(function (data) { return { ok: r.ok, status: r.status, data: data }; });
});
}
if (selectAll && table) {
function setAllSelection(checked) {
var cbs = table.querySelectorAll('tbody .inbox_row_cb');
cbs.forEach(function (cb) { cb.checked = !!checked; });
selectAll.indeterminate = false;
selectAll.checked = !!checked;
setTimeout(function () {
selectAll.indeterminate = false;
selectAll.checked = !!checked;
}, 0);
updateBulkDeleteUi();
}
selectAll.addEventListener('click', function (e) {
e.stopPropagation();
});
selectAll.addEventListener('change', function () {
setAllSelection(selectAll.checked);
});
}
if (table) {
table.addEventListener('change', function (e) {
var t = e.target;
if (t && t.classList && t.classList.contains('inbox_row_cb')) {
updateBulkDeleteUi();
}
});
}
if (btnDeleteSelected) {
btnDeleteSelected.addEventListener('click', function () {
var ids = getSelectedMessageIds();
if (!ids.length) return;
var msg = 'Delete ' + ids.length + ' selected message' + (ids.length === 1 ? '' : 's') + ' from the Inbox?';
if (!confirm(msg)) return;
if (statusEl) statusEl.textContent = 'Deleting...';
postJson('{{ url_for('main.api_inbox_bulk_delete') }}', { message_ids: ids })
.then(function (res) {
if (!res.ok || !res.data || res.data.status !== 'ok') {
var err = (res.data && (res.data.message || res.data.error)) ? (res.data.message || res.data.error) : 'Request failed.';
if (statusEl) statusEl.textContent = err;
alert(err);
return;
}
window.location.reload();
})
.catch(function () {
var err = 'Request failed.';
if (statusEl) statusEl.textContent = err;
alert(err);
});
});
}
// Initialize UI state
updateBulkDeleteUi();
function wrapMailHtml(html) {
html = html || "";
// Ensure we render the mail HTML with its own CSS, isolated from the site styling.
return (
"<!doctype html><html><head><meta charset=\"utf-8\">" +
"<base target=\"_blank\">" +
"</head><body style=\"margin:0; padding:8px;\">" +
html +
"</body></html>"
);
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
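The replacement `wrapMailHtml` above has two branches: full HTML documents get the meta/style injection spliced into their existing `<head>` (or a `<head>` is created after `<html>`), while bare fragments are wrapped in a fresh document. A simplified standalone version of that branching (injection string shortened, name illustrative):

```javascript
// Simplified sketch of the head-injection logic: full documents keep their
// own markup and only gain the injected head content; fragments get wrapped.
function injectHead(html, injection) {
  var trimmed = String(html || "").trim();
  var lower = trimmed.toLowerCase();
  if (lower.indexOf("<!doctype") === 0 || lower.indexOf("<html") === 0) {
    if (/<head[^>]*>/i.test(trimmed)) {
      return trimmed.replace(/<head[^>]*>/i, function (m) { return m + injection; });
    }
    if (/<html[^>]*>/i.test(trimmed)) {
      return trimmed.replace(/<html[^>]*>/i, function (m) {
        return m + "<head>" + injection + "</head>";
      });
    }
  }
  return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
```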
function findCustomerIdByName(name) {
if (!name) return null;
for (var i = 0; i < customers.length; i++) {
@@ -220,10 +417,39 @@ function findCustomerIdByName(name) {
return;
}
function objectSeverityRank(o) {
var st = String((o && o.status) || "").trim().toLowerCase();
var err = String((o && o.error_message) || "").trim();
if (st === "error" || st === "failed" || st === "failure" || err) return 0;
if (st === "warning") return 1;
return 2;
}
function sortObjects(list) {
return (list || []).slice().sort(function (a, b) {
var ra = objectSeverityRank(a);
var rb = objectSeverityRank(b);
if (ra !== rb) return ra - rb;
var na = String((a && a.name) || "").toLowerCase();
var nb = String((b && b.name) || "").toLowerCase();
if (na < nb) return -1;
if (na > nb) return 1;
var ta = String((a && a.type) || "").toLowerCase();
var tb = String((b && b.type) || "").toLowerCase();
if (ta < tb) return -1;
if (ta > tb) return 1;
return 0;
});
}
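The severity ordering above (errors first, then warnings, then everything else) can be checked with a small standalone sample; the rank function is reproduced inline so the sketch runs on its own:

```javascript
// Standalone check of the error -> warning -> success ordering: an object
// counts as an error when its status says so or it carries an error message.
function rank(o) {
  var st = String((o && o.status) || "").trim().toLowerCase();
  var err = String((o && o.error_message) || "").trim();
  if (st === "error" || st === "failed" || st === "failure" || err) return 0;
  if (st === "warning") return 1;
  return 2;
}
var sample = [
  { name: "vm-a", status: "Success" },
  { name: "vm-b", status: "Warning" },
  { name: "vm-c", status: "Success", error_message: "timeout" }
];
// vm-c sorts first despite its "Success" status, because of error_message.
var ordered = sample.slice().sort(function (a, b) { return rank(a) - rank(b); });
```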
var sorted = sortObjects(objects);
var html = "<div class=\"table-responsive\"><table class=\"table table-sm table-bordered mb-0\">";
html += "<thead><tr><th>Object</th><th>Type</th><th>Status</th><th>Error</th></tr></thead><tbody>";
for (var i = 0; i < objects.length; i++) {
var o = objects[i] || {};
for (var i = 0; i < sorted.length; i++) {
var o = sorted[i] || {};
html += "<tr>";
html += "<td>" + (o.name || "") + "</td>";
html += "<td>" + (o.type || "") + "</td>";
@@ -278,6 +504,132 @@ function findCustomerIdByName(name) {
renderObjects(data.objects || []);
// VSPC multi-company mapping support (Active alarms summary)
(function () {
var mapBtn = document.getElementById("vspcMapCompaniesBtn");
var approveBtn = document.getElementById("inboxApproveBtn");
if (!mapBtn) return;
// reset
mapBtn.classList.add("d-none");
if (approveBtn) approveBtn.classList.remove("d-none");
var ciReset = document.getElementById("msg_customer_input");
if (ciReset) {
ciReset.removeAttribute("disabled");
ciReset.placeholder = "Select customer";
}
var bsw = String(meta.backup_software || "").trim();
var btype = String(meta.backup_type || "").trim();
var jname = String(meta.job_name || "").trim();
if (bsw !== "Veeam" || btype !== "Service Provider Console" || jname !== "Active alarms summary") {
return;
}
var companies = (data.vspc_companies || meta.vspc_companies || []);
var defaults = (data.vspc_company_defaults || {});
if (!Array.isArray(companies)) companies = [];
// Fallback for older stored messages where companies were embedded in object names.
if (!companies.length) {
var objs = data.objects || [];
var seen = {};
objs.forEach(function (o) {
var name = String((o && o.name) || "");
var ix = name.indexOf(" | ");
if (ix > 0) {
var c = name.substring(0, ix).trim();
if (c && !seen[c]) { seen[c] = true; companies.push(c); }
}
});
}
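The fallback above derives the company list from object names shaped like `"Company | Object"`, deduplicating while preserving first-seen order. As a standalone function (name illustrative):

```javascript
// Sketch of the fallback for older stored messages: extract distinct
// company prefixes from "Company | Object" names, in first-seen order.
function companiesFromObjects(objects) {
  var seen = {};
  var out = [];
  (objects || []).forEach(function (o) {
    var name = String((o && o.name) || "");
    var ix = name.indexOf(" | ");
    if (ix > 0) {
      var c = name.substring(0, ix).trim();
      if (c && !seen[c]) { seen[c] = true; out.push(c); }
    }
  });
  return out;
}
```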
if (!companies.length) return;
// Show mapping button; hide regular approve
mapBtn.classList.remove("d-none");
if (approveBtn) approveBtn.classList.add("d-none");
var ci = document.getElementById("msg_customer_input");
if (ci) {
ci.value = "";
ci.setAttribute("disabled", "disabled");
ci.placeholder = "Use \"Map companies\"";
}
mapBtn.onclick = function () {
var tbody = document.getElementById("vspcCompanyMapTbody");
var form = document.getElementById("vspcCompanyMapForm");
if (!tbody || !form) return;
// set form action
form.action = "{{ url_for('main.inbox_message_approve_vspc_companies', message_id=0) }}".replace("0", String(meta.id || id));
// build rows
tbody.innerHTML = "";
companies.forEach(function (company) {
var tr = document.createElement("tr");
var tdC = document.createElement("td");
tdC.textContent = company;
tr.appendChild(tdC);
var tdS = document.createElement("td");
var inp = document.createElement("input");
inp.type = "text";
inp.className = "form-control form-control-sm";
inp.setAttribute("list", "vspcCustomerList");
inp.setAttribute("data-company", company);
inp.placeholder = "Select customer";
// Prefill with existing mapping when available.
try {
var d = defaults && defaults[company];
if (d && d.customer_name) {
inp.value = String(d.customer_name);
}
} catch (e) {}
tdS.appendChild(inp);
tr.appendChild(tdS);
tbody.appendChild(tr);
});
// clear hidden field
var hidden = document.getElementById("vspc_company_mappings_json");
if (hidden) hidden.value = "";
var mapModalEl = document.getElementById("vspcCompanyMapModal");
if (mapModalEl && window.bootstrap) {
var mm = bootstrap.Modal.getOrCreateInstance(mapModalEl);
mm.show();
}
};
// Attach submit handler once
var mapForm = document.getElementById("vspcCompanyMapForm");
if (mapForm && !mapForm.getAttribute("data-bound")) {
mapForm.setAttribute("data-bound", "1");
mapForm.addEventListener("submit", function (ev) {
var rows = document.querySelectorAll("#vspcCompanyMapTbody input[data-company]");
var mappings = [];
rows.forEach(function (inp) {
var company = inp.getAttribute("data-company") || "";
var cname = String(inp.value || "").trim();
if (!company || !cname) return;
var cid = findCustomerIdByName(cname);
if (!cid) return;
mappings.push({ company: company, customer_id: cid });
});
var hidden = document.getElementById("vspc_company_mappings_json");
if (hidden) hidden.value = JSON.stringify(mappings);
});
}
})();
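The submit handler above serializes one `{company, customer_id}` entry per resolved row into the hidden field, silently skipping rows that are unmapped or reference an unknown customer. The payload construction, sketched as a pure function (`lookup` stands in for `findCustomerIdByName`; names are illustrative):

```javascript
// Sketch of the mapping payload built on submit: rows without a company,
// without a customer name, or with an unresolvable name are dropped.
function buildMappings(rows, lookup) {
  var mappings = [];
  rows.forEach(function (r) {
    var company = String(r.company || "");
    var cname = String(r.customer || "").trim();
    if (!company || !cname) return;
    var cid = lookup(cname);
    if (!cid) return;
    mappings.push({ company: company, customer_id: cid });
  });
  return mappings;
}
```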
var customerName = meta.customer_name || "";
var approveForm = document.getElementById("inboxApproveForm");

View File

@@ -99,18 +99,18 @@
<div class="modal-body">
<div class="row">
<div class="col-md-3">
<dl class="row mb-0">
<dl class="row mb-0 dl-compact">
<dt class="col-4">From</dt>
<dd class="col-8" id="dmsg_from"></dd>
<dd class="col-8 ellipsis-field" id="dmsg_from"></dd>
<dt class="col-4">Received</dt>
<dd class="col-8" id="dmsg_received"></dd>
<dd class="col-8 ellipsis-field" id="dmsg_received"></dd>
<dt class="col-4">Deleted by</dt>
<dd class="col-8" id="dmsg_deleted_by"></dd>
<dd class="col-8 ellipsis-field" id="dmsg_deleted_by"></dd>
<dt class="col-4">Deleted at</dt>
<dd class="col-8" id="dmsg_deleted_at"></dd>
<dd class="col-8 ellipsis-field" id="dmsg_deleted_at"></dd>
</dl>
</div>
@@ -132,13 +132,25 @@
(function () {
function wrapMailHtml(html) {
html = html || "";
return (
"<!doctype html><html><head><meta charset=\"utf-8\">" +
"<base target=\"_blank\">" +
"</head><body style=\"margin:0; padding:8px;\">" +
html +
"</body></html>"
);
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
function attachHandlers() {

View File

@@ -4,18 +4,18 @@
<div class="card mb-3">
<div class="card-body">
<dl class="row mb-0">
<dl class="row mb-0 dl-compact">
<dt class="col-sm-3">Customer</dt>
<dd class="col-sm-9">{{ job.customer.name if job.customer else "" }}</dd>
<dd class="col-sm-9 ellipsis-field">{{ job.customer.name if job.customer else "" }}</dd>
<dt class="col-sm-3">Backup</dt>
<dd class="col-sm-9">{{ job.backup_software }}</dd>
<dd class="col-sm-9 ellipsis-field">{{ job.backup_software }}</dd>
<dt class="col-sm-3">Type</dt>
<dd class="col-sm-9">{{ job.backup_type }}</dd>
<dd class="col-sm-9 ellipsis-field">{{ job.backup_type }}</dd>
<dt class="col-sm-3">Job name</dt>
<dd class="col-sm-9">{{ job.job_name }}</dd>
<dd class="col-sm-9 ellipsis-field">{{ job.job_name }}</dd>
<dt class="col-sm-3">Tickets</dt>
<dd class="col-sm-9">{{ ticket_open_count }} open / {{ ticket_total_count }} total</dd>
@@ -48,9 +48,15 @@
</div>
{% if can_manage_jobs %}
<form method="post" action="{{ url_for('main.job_delete', job_id=job.id) }}" class="mb-3" onsubmit="return confirm('Are you sure you want to delete this job? Related mails will be returned to the Inbox.');">
<div class="d-flex flex-wrap gap-2 mb-3">
<form method="post" action="{{ url_for('main.archive_job', job_id=job.id) }}" class="mb-0" onsubmit="return confirm('Archive this job? No new runs are expected and it will be removed from Daily Jobs and Run Checks.');">
<button type="submit" class="btn btn-outline-secondary">Archive</button>
</form>
<form method="post" action="{{ url_for('main.job_delete', job_id=job.id) }}" class="mb-0" onsubmit="return confirm('Are you sure you want to delete this job? Related mails will be returned to the Inbox.');">
<button type="submit" class="btn btn-outline-danger">Delete job</button>
</form>
</div>
{% endif %}
<h3 class="mt-4 mb-3">Job history</h3>
@@ -74,7 +80,7 @@
<tbody>
{% if history_rows %}
{% for r in history_rows %}
<tr{% if r.mail_message_id %} class="jobrun-row" data-message-id="{{ r.mail_message_id }}" data-ticket-codes="{{ (r.ticket_codes or [])|tojson|forceescape }}" data-remark-items="{{ (r.remark_items or [])|tojson|forceescape }}" style="cursor: pointer;"{% endif %}>
<tr{% if r.mail_message_id %} class="jobrun-row" data-message-id="{{ r.mail_message_id }}" data-run-id="{{ r.id }}" data-ticket-codes="{{ (r.ticket_codes or [])|tojson|forceescape }}" data-remark-items="{{ (r.remark_items or [])|tojson|forceescape }}" style="cursor: pointer;"{% endif %}>
<td>{{ r.run_day }}</td>
<td>{{ r.run_at }}</td>
{% set _s = (r.status or "")|lower %}
@@ -162,49 +168,73 @@
<div class="modal-body">
<div class="row">
<div class="col-md-3">
<dl class="row mb-0">
<dl class="row mb-0 dl-compact">
<dt class="col-4">From</dt>
<dd class="col-8" id="run_msg_from"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_from"></dd>
<dt class="col-4">Backup</dt>
<dd class="col-8" id="run_msg_backup"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_backup"></dd>
<dt class="col-4">Type</dt>
<dd class="col-8" id="run_msg_type"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_type"></dd>
<dt class="col-4">Ticket</dt>
<dd class="col-8" id="run_msg_ticket"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_ticket"></dd>
<dt class="col-4">Remark</dt>
<dd class="col-8" id="run_msg_remark"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_remark"></dd>
<dt class="col-12 mt-2">Tickets &amp; remarks</dt>
<dd class="col-12">
<div id="jhm_alerts" class="small"></div>
{% if can_manage_jobs %}
<div class="border rounded p-2 mt-2">
<div class="fw-semibold">New ticket</div>
<div class="d-flex gap-2 mt-1">
<input class="form-control form-control-sm" id="jhm_ticket_code" type="text" placeholder="Ticket number (e.g., T20260106.0001)" />
<button type="button" class="btn btn-sm btn-outline-primary" id="jhm_ticket_save">Add</button>
</div>
<div class="mt-1 small text-muted" id="jhm_ticket_status"></div>
<div class="fw-semibold mt-2">New remark</div>
<textarea class="form-control form-control-sm" id="jhm_remark_body" rows="2" placeholder="Remark"></textarea>
<div class="d-flex justify-content-end mt-1">
<button type="button" class="btn btn-sm btn-outline-primary" id="jhm_remark_save">Add</button>
</div>
<div class="mt-1 small text-muted" id="jhm_remark_status"></div>
</div>
{% endif %}
</dd>
<dt class="col-4">Job</dt>
<dd class="col-8" id="run_msg_job"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_job"></dd>
<dt class="col-4">Overall</dt>
<dd class="col-8" id="run_msg_overall"></dd>
<dt class="col-4">Message</dt>
<dd class="col-8" id="run_msg_overall_message" style="white-space: pre-wrap;"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_overall"></dd>
<dt class="col-4">Customer</dt>
<dd class="col-8" id="run_msg_customer"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_customer"></dd>
<dt class="col-4">Received</dt>
<dd class="col-8" id="run_msg_received"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_received"></dd>
<dt class="col-4">Parsed</dt>
<dd class="col-8" id="run_msg_parsed"></dd>
<dd class="col-8 ellipsis-field" id="run_msg_parsed"></dd>
</dl>
</div>
<div class="col-md-9">
<div class="mb-2">
<h6 class="mb-1">Details</h6>
<div id="run_msg_overall_message" class="border rounded p-2" style="white-space: pre-wrap; max-height: 20vh; overflow: auto;"></div>
</div>
<div class="border rounded p-2 p-0" style="overflow:hidden;">
<iframe id="run_msg_body_container_iframe" class="w-100" style="height:55vh; border:0; background:transparent;" sandbox="allow-popups allow-popups-to-escape-sandbox allow-top-navigation-by-user-activation"></iframe>
</div>
<div class="mt-3">
<h6>Objects</h6>
<div id="run_msg_objects_container">
<!-- Parsed objects will be rendered here -->
</div>
@@ -255,19 +285,220 @@
}
(function () {
var currentRunId = null;
function apiJson(url, opts) {
opts = opts || {};
opts.headers = opts.headers || {};
opts.headers['Content-Type'] = 'application/json';
return fetch(url, opts).then(function (r) {
return r.json().then(function (j) {
if (!r.ok || !j || j.status !== 'ok') {
var msg = (j && j.message) ? j.message : ('Request failed (' + r.status + ')');
throw new Error(msg);
}
return j;
});
});
}
function renderAlerts(payload) {
var box = document.getElementById('jhm_alerts');
if (!box) return;
var tickets = (payload && payload.tickets) || [];
var remarks = (payload && payload.remarks) || [];
if (!tickets.length && !remarks.length) {
box.innerHTML = '<span class="text-muted">No tickets or remarks linked to this run.</span>';
return;
}
var html = '';
if (tickets.length) {
html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
tickets.forEach(function (t) {
var status = t.resolved_at ? 'Resolved' : 'Active';
html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Ticket">🎫</span>' +
'<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
'<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'{% if can_manage_jobs %}' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-ticket" data-id="' + t.id + '" ' + (t.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'{% endif %}' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
}
if (remarks.length) {
html += '<div class="mb-2"><strong>Remarks</strong><div class="mt-1">';
remarks.forEach(function (r) {
var status = r.resolved_at ? 'Resolved' : 'Active';
html += '<div class="mb-2 border rounded p-2" data-alert-type="remark" data-id="' + r.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Remark">💬</span>' +
'<span class="fw-semibold">Remark</span>' +
'<span class="ms-2 badge ' + (r.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
(r.body ? ('<div class="small text-muted mt-1">' + escapeHtml(r.body) + '</div>') : '') +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'{% if can_manage_jobs %}' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-remark" data-id="' + r.id + '" ' + (r.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'{% endif %}' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
}
box.innerHTML = html;
Array.prototype.forEach.call(box.querySelectorAll('button[data-action]'), function (btn) {
btn.addEventListener('click', function (ev) {
ev.preventDefault();
var action = btn.getAttribute('data-action');
var id = btn.getAttribute('data-id');
if (!action || !id) return;
if (action === 'resolve-ticket') {
if (!confirm('Mark ticket as resolved?')) return;
apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) { alert(e.message || 'Failed.'); });
} else if (action === 'resolve-remark') {
if (!confirm('Mark remark as resolved?')) return;
apiJson('/api/remarks/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) { alert(e.message || 'Failed.'); });
}
});
});
}
function loadAlerts(runId) {
if (!runId) {
renderAlerts({tickets: [], remarks: []});
return;
}
fetch('/api/job-runs/' + encodeURIComponent(runId) + '/alerts')
.then(function (r) { return r.json(); })
.then(function (j) {
if (!j || j.status !== 'ok') throw new Error((j && j.message) || 'Failed');
renderAlerts(j);
})
.catch(function () {
renderAlerts({tickets: [], remarks: []});
});
}
function bindInlineCreateForms() {
var btnTicket = document.getElementById('jhm_ticket_save');
var btnRemark = document.getElementById('jhm_remark_save');
var tCode = document.getElementById('jhm_ticket_code');
var tStatus = document.getElementById('jhm_ticket_status');
var rBody = document.getElementById('jhm_remark_body');
var rStatus = document.getElementById('jhm_remark_status');
function clearStatus() {
if (tStatus) tStatus.textContent = '';
if (rStatus) rStatus.textContent = '';
}
if (btnTicket) {
btnTicket.addEventListener('click', function () {
if (!currentRunId) { alert('Select a run first.'); return; }
clearStatus();
var ticket_code = tCode ? (tCode.value || '').trim().toUpperCase() : '';
if (!ticket_code) {
if (tStatus) tStatus.textContent = 'Ticket number is required.';
else alert('Ticket number is required.');
return;
}
if (!/^T\d{8}\.\d{4}$/.test(ticket_code)) {
if (tStatus) tStatus.textContent = 'Invalid ticket number format. Expected TYYYYMMDD.####.';
else alert('Invalid ticket number format. Expected TYYYYMMDD.####.');
return;
}
if (tStatus) tStatus.textContent = 'Saving...';
apiJson('/api/tickets', {
method: 'POST',
body: JSON.stringify({job_run_id: currentRunId, ticket_code: ticket_code})
})
.then(function () {
if (tCode) tCode.value = '';
if (tStatus) tStatus.textContent = '';
loadAlerts(currentRunId);
})
.catch(function (e) {
if (tStatus) tStatus.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
});
}
if (btnRemark) {
btnRemark.addEventListener('click', function () {
if (!currentRunId) { alert('Select a run first.'); return; }
clearStatus();
var body = rBody ? rBody.value : '';
if (!body || !body.trim()) {
if (rStatus) rStatus.textContent = 'Body is required.';
else alert('Body is required.');
return;
}
if (rStatus) rStatus.textContent = 'Saving...';
apiJson('/api/remarks', {
method: 'POST',
body: JSON.stringify({job_run_id: currentRunId, body: body})
})
.then(function () {
if (rBody) rBody.value = '';
if (rStatus) rStatus.textContent = '';
loadAlerts(currentRunId);
})
.catch(function (e) {
if (rStatus) rStatus.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
});
}
}
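The ticket handler above normalizes input (trim, uppercase) before validating it against the `TYYYYMMDD.####` pattern. That normalize-then-validate step as a standalone helper (name illustrative):

```javascript
// Sketch of the ticket-code normalization used above: trim and uppercase,
// then require the TYYYYMMDD.#### shape; returns null when invalid.
function normalizeTicketCode(raw) {
  var code = String(raw || "").trim().toUpperCase();
  return /^T\d{8}\.\d{4}$/.test(code) ? code : null;
}
```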
function wrapMailHtml(html) {
html = html || "";
// Ensure we render the mail HTML with its own CSS, isolated from the site styling.
return (
"<!doctype html><html><head><meta charset=\"utf-8\">" +
"<base target=\"_blank\">" +
"</head><body style=\"margin:0; padding:8px;\">" +
html +
"</body></html>"
);
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
function renderObjects(objects) {
var container = document.getElementById("run_msg_objects_container");
if (!container) return;
@@ -277,16 +508,47 @@ function renderObjects(objects) {
return;
}
function objectSeverityRank(o) {
var st = String((o && o.status) || "").trim().toLowerCase();
var err = String((o && o.error_message) || "").trim();
if (st === "error" || st === "failed" || st === "failure" || err) return 0;
if (st === "warning") return 1;
return 2;
}
// Sort: errors first, then warnings, then the rest; within each group sort alphabetically.
var sorted = (objects || []).slice().sort(function (a, b) {
var ra = objectSeverityRank(a);
var rb = objectSeverityRank(b);
if (ra !== rb) return ra - rb;
var an = String((a && a.name) || "").toLowerCase();
var bn = String((b && b.name) || "").toLowerCase();
if (an < bn) return -1;
if (an > bn) return 1;
var at = String((a && a.type) || "").toLowerCase();
var bt = String((b && b.type) || "").toLowerCase();
if (at < bt) return -1;
if (at > bt) return 1;
return 0;
});
var html = "<div class=\"table-responsive\"><table class=\"table table-sm table-bordered mb-0\">";
html += "<thead><tr><th>Object</th><th>Type</th><th>Status</th><th>Error</th></tr></thead><tbody>";
for (var i = 0; i < objects.length; i++) {
var o = objects[i] || {};
for (var i = 0; i < sorted.length; i++) {
var o = sorted[i] || {};
html += "<tr>";
html += "<td>" + (o.name || "") + "</td>";
html += "<td>" + (o.type || "") + "</td>";
html += "<td>" + escapeHtml(o.name || "") + "</td>";
html += "<td>" + escapeHtml(o.type || "") + "</td>";
var d = statusDotClass(o.status);
html += "<td class=\"status-text " + statusClass(o.status) + "\">" + (d ? ('<span class=\\\"status-dot ' + d + ' me-2\\\" aria-hidden=\\\"true\\\"></span>') : '') + escapeHtml(o.status || "") + "</td>";
html += "<td>" + (o.error_message || "") + "</td>";
html += "<td class=\"status-text " + statusClass(o.status) + "\">" +
(d ? ("<span class=\"status-dot " + d + " me-2\" aria-hidden=\"true\"></span>") : "") +
escapeHtml(o.status || "") +
"</td>";
html += "<td>" + escapeHtml(o.error_message || "") + "</td>";
html += "</tr>";
}
html += "</tbody></table></div>";
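The table rows above pass every field through `escapeHtml`, which is defined elsewhere in the template. A minimal equivalent of what such a helper typically does (an assumption about its behavior, not the template's exact code):

```javascript
// Assumed-minimal escapeHtml: replaces the five HTML-significant characters
// so untrusted object names and error messages cannot inject markup.
function escapeHtml(s) {
  return String(s == null ? "" : s)
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```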
@@ -299,10 +561,16 @@ function renderObjects(objects) {
if (!modalEl) return;
var modal = new bootstrap.Modal(modalEl);
{% if can_manage_jobs %}
bindInlineCreateForms();
{% endif %}
rows.forEach(function (row) {
row.addEventListener("click", function () {
var messageId = row.getAttribute("data-message-id");
var runId = row.getAttribute("data-run-id");
if (!messageId) return;
currentRunId = runId ? parseInt(runId, 10) : null;
fetch("{{ url_for('main.inbox_message_detail', message_id=0) }}".replace("0", messageId))
.then(function (resp) {
@@ -370,6 +638,8 @@ function renderObjects(objects) {
if (bodyFrame) bodyFrame.srcdoc = wrapMailHtml(data.body_html || "");
renderObjects(data.objects || []);
loadAlerts(currentRunId);
modal.show();
})
.catch(function (err) {

View File

@@ -15,7 +15,7 @@
<tbody>
{% if jobs %}
{% for j in jobs %}
<tr style="cursor: pointer;" onclick="window.location='{{ url_for('main.job_detail', job_id=j.id) }}'">
<tr class="job-row" data-href="{{ url_for('main.job_detail', job_id=j.id) }}" style="cursor: pointer;">
<td>{{ j.customer_name }}</td>
<td>{{ j.backup_software }}</td>
<td>{{ j.backup_type }}</td>
@@ -32,4 +32,24 @@
</tbody>
</table>
</div>
<script>
(function () {
function onRowClick(e) {
// Don't navigate when clicking interactive elements inside the row.
if (e.target.closest('a, button, input, select, textarea, label, form')) {
return;
}
var href = this.getAttribute('data-href');
if (href) {
window.location.href = href;
}
}
document.querySelectorAll('tr.job-row[data-href]').forEach(function (row) {
row.addEventListener('click', onRowClick);
});
})();
</script>
{% endblock %}

View File

@@ -62,7 +62,16 @@
</select>
</div>
<div class="col-md-3">
<label for="ov_match_error_contains" class="form-label">Error contains</label>
<label for="ov_match_error_mode" class="form-label">Error match type</label>
<select class="form-select" id="ov_match_error_mode" name="match_error_mode">
<option value="contains">Contains</option>
<option value="exact">Exact</option>
<option value="starts_with">Starts with</option>
<option value="ends_with">Ends with</option>
</select>
</div>
<div class="col-md-3">
<label for="ov_match_error_contains" class="form-label">Error text</label>
<input type="text" class="form-control" id="ov_match_error_contains" name="match_error_contains" placeholder="Text to match in error message">
</div>
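The new `match_error_mode` select offers four match types; the actual matching happens in the backend, but its semantics can be sketched as a small function (case-sensitive here, which is an assumption — the backend may normalize case):

```javascript
// Illustrative evaluation of the four error match modes from the form above.
// An empty needle matches everything, mirroring an unset filter.
function errorMatches(errorText, needle, mode) {
  var hay = String(errorText || "");
  var n = String(needle || "");
  if (!n) return true;
  switch (mode) {
    case "exact": return hay === n;
    case "starts_with": return hay.indexOf(n) === 0;
    case "ends_with": return hay.length >= n.length &&
      hay.lastIndexOf(n) === hay.length - n.length;
    default: return hay.indexOf(n) !== -1; // "contains"
  }
}
```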
<div class="col-md-3">
@@ -142,6 +151,7 @@
data-ov-object-name="{{ ov.object_name or '' }}"
data-ov-match-status="{{ ov.match_status or '' }}"
data-ov-match-error-contains="{{ ov.match_error_contains or '' }}"
data-ov-match-error-mode="{{ ov.match_error_mode or 'contains' }}"
data-ov-treat-as-success="{{ 1 if ov.treat_as_success else 0 }}"
data-ov-comment="{{ ov.comment or '' }}"
data-ov-start-at="{{ ov.start_at_raw or '' }}"
@@ -190,6 +200,7 @@
const jobField = document.getElementById('ov_job_id');
const objectNameField = document.getElementById('ov_object_name');
const matchStatusField = document.getElementById('ov_match_status');
const matchErrorModeField = document.getElementById('ov_match_error_mode');
const matchErrorContainsField = document.getElementById('ov_match_error_contains');
const treatAsSuccessField = document.getElementById('ov_treat_success');
const commentField = document.getElementById('ov_comment');
@@ -228,6 +239,7 @@
setValue(jobField, btn.dataset.ovJobId || '');
setValue(objectNameField, btn.dataset.ovObjectName || '');
setValue(matchStatusField, btn.dataset.ovMatchStatus || '');
setValue(matchErrorModeField, btn.dataset.ovMatchErrorMode || 'contains');
setValue(matchErrorContainsField, btn.dataset.ovMatchErrorContains || '');
if (treatAsSuccessField) treatAsSuccessField.checked = (btn.dataset.ovTreatAsSuccess === '1');
setValue(commentField, btn.dataset.ovComment || '');

View File

@@ -16,19 +16,19 @@
{% endif %}
</div>
<form method="post" class="row g-3"> <div class="col-12">
<div class="row g-3"> <div class="col-12">
<label class="form-label">Body</label>
<textarea class="form-control" name="body" rows="6">{{ remark.body or '' }}</textarea>
<div class="form-control-plaintext border rounded p-2" style="min-height: 7rem; white-space: pre-wrap;">{{ remark.body or '' }}</div>
</div>
{% if active_role in ['admin','operator'] %}
<div class="col-12">
<button class="btn btn-primary" type="submit">Save</button>
{% if not remark.resolved_at %}
<button class="btn btn-outline-success" type="button" onclick="if(confirm('Mark remark as resolved?')){fetch('{{ url_for('main.api_remark_resolve', remark_id=remark.id) }}',{method:'POST'}).then(()=>location.reload());}">Resolve</button>
{% endif %}
</div>
{% endif %}
</form>
</div>
</div>
</div>

View File

@ -36,9 +36,35 @@
</tr>
</thead>
<tbody id="rep_table_body">
{% if initial_reports %}
{% for item in initial_reports %}
<tr>
<td colspan="5" class="text-center text-muted py-4">Loading…</td>
<td><strong>{{ item.name }}</strong><div class="text-muted small">{{ item.description }}</div></td>
<td class="text-muted small">{{ item.report_type }}</td>
<td class="text-muted small">
{% if item.period_start or item.period_end %}
{{ item.period_start.replace('T',' ') if item.period_start else '' }} → {{ item.period_end.replace('T',' ') if item.period_end else '' }}
{% else %}
<span class="text-muted">—</span>
{% endif %}
</td>
<td><span class="badge text-bg-light border">{{ item.output_format }}</span></td>
<td class="text-end">
<a class="btn btn-sm btn-outline-secondary" href="{{ url_for('main.reports_edit', report_id=item.id) }}">Edit</a>
<button type="button" class="btn btn-sm btn-outline-primary rep-generate-btn" data-id="{{ item.id }}">Generate</button>
<button type="button" class="btn btn-sm btn-outline-secondary ms-1 rep-view-btn" data-id="{{ item.id }}">View raw</button>
<a class="btn btn-sm btn-outline-success rep-download-btn ms-1" href="/api/reports/{{ item.id }}/export?format={{ (item.output_format or 'csv')|lower }}" target="_blank" rel="noopener">Download</a>
{% if active_role in ('admin','operator','reporter') %}
<button type="button" class="btn btn-sm btn-outline-danger rep-delete-btn ms-1" data-id="{{ item.id }}">Delete</button>
{% endif %}
</td>
</tr>
{% endfor %}
{% else %}
<tr>
<td colspan="5" class="text-center text-muted py-4">No reports found.</td>
</tr>
{% endif %}
</tbody>
</table>
</div>
@ -138,6 +164,7 @@
</div>
<script>
window.__reportColumnsMeta = {{ columns_meta|tojson }};
window.addEventListener('DOMContentLoaded', function () {
var rawModalEl = document.getElementById('rep_raw_modal');
var rawModal = window.bootstrap ? new bootstrap.Modal(rawModalEl) : null;
@ -147,6 +174,94 @@
var rawLimit = 100;
var rawOffset = 0;
var canDeleteReports = {{ 'true' if active_role in ('admin','operator','reporter') else 'false' }};
var reportsItems = [];
var reportColumnsMeta = window.__reportColumnsMeta || null;
var rawReportConfig = null;
function loadReportColumnsMeta() {
if (reportColumnsMeta) return Promise.resolve();
return fetch('/api/reports/columns', { credentials: 'same-origin' })
.then(function (r) { return r.json(); })
.then(function (j) { reportColumnsMeta = j || null; })
.catch(function () { reportColumnsMeta = null; });
}
function colLabel(key) {
if (!reportColumnsMeta || !reportColumnsMeta.groups) return key;
for (var i = 0; i < reportColumnsMeta.groups.length; i++) {
var items = reportColumnsMeta.groups[i].items || [];
for (var j = 0; j < items.length; j++) {
if (items[j].key === key) return items[j].label || key;
}
}
// Backwards compatibility: object_name was used for job name in older configs.
if (key === 'job_name') {
for (var i2 = 0; i2 < reportColumnsMeta.groups.length; i2++) {
var items2 = reportColumnsMeta.groups[i2].items || [];
for (var j2 = 0; j2 < items2.length; j2++) {
if (items2[j2].key === 'object_name') return items2[j2].label || 'Job name';
}
}
return 'Job name';
}
return key;
}
function uniqPreserveOrder(arr) {
var out = [];
var seen = {};
(arr || []).forEach(function (k) {
var key = String(k);
if (seen[key]) return;
seen[key] = true;
out.push(k);
});
return out;
}
function normalizeCols(arr) {
if (!Array.isArray(arr)) return [];
return uniqPreserveOrder(arr.map(function (k) { return (k === 'object_name') ? 'job_name' : k; }));
}
function defaultColsFor(view) {
if (reportColumnsMeta && reportColumnsMeta.defaults && reportColumnsMeta.defaults[view]) {
return normalizeCols(reportColumnsMeta.defaults[view].slice());
}
// hard fallback
if (view === 'snapshot') return ['job_name','customer_name','job_id','status','run_at'];
return ['job_name','total_runs','success_count','warning_count','failed_count','missed_count','success_rate'];
}
function selectedColsFor(view) {
var cfg = rawReportConfig || {};
var cols = null;
var hasView = false;
if (cfg.columns && typeof cfg.columns === 'object') {
hasView = Object.prototype.hasOwnProperty.call(cfg.columns, view);
cols = cfg.columns[view];
}
if (hasView && Array.isArray(cols)) {
// If an empty list is saved, keep it empty.
return normalizeCols(cols);
}
if (cols && cols.length) return normalizeCols(cols);
return defaultColsFor(view);
}
function fmtCellValue(v) {
if (v === null || v === undefined) return '';
if (typeof v === 'boolean') return v ? 'Yes' : 'No';
var s = String(v);
// basic ISO datetime prettify
if (s.indexOf('T') >= 0 && s.indexOf(':') >= 0 && s.indexOf('-') >= 0) {
return s.replace('T', ' ');
}
return s;
}
function qs(id) { return document.getElementById(id); }
function fmtPeriod(item) {
@ -193,95 +308,70 @@
}
}
function setRawDownloadLink() {
var btn = qs('rep_raw_download_btn');
if (!btn) return;
if (!rawReportId) {
btn.setAttribute('href', '#');
btn.classList.add('disabled');
return;
}
btn.classList.remove('disabled');
btn.setAttribute('href', '/api/reports/' + rawReportId + '/export?view=' + rawView);
}
function updateRawMeta(total) {
var t = parseInt(total || 0, 10) || 0;
var start = t ? (rawOffset + 1) : 0;
var end = t ? Math.min(rawOffset + rawLimit, t) : 0;
var label = (rawView === 'snapshot') ? 'Snapshot' : 'Summary';
qs('rep_raw_meta').textContent = label + ' · Rows ' + start + '-' + end + ' of ' + t;
qs('rep_raw_prev_btn').disabled = (rawOffset <= 0);
qs('rep_raw_next_btn').disabled = ((rawOffset + rawLimit) >= t);
}
function renderRawTable(view, items) {
var thead = qs('rep_raw_thead');
var tbody = qs('rep_raw_tbody');
function thRow(cols) {
return '<tr>' + cols.map(function (c) { return '<th>' + escapeHtml(c) + '</th>'; }).join('') + '</tr>';
var cols = selectedColsFor(view);
if (!cols || !cols.length) {
thead.innerHTML = '<tr><th class="text-muted">No columns selected</th></tr>';
tbody.innerHTML = '<tr><td class="text-muted py-4">No columns selected.</td></tr>';
setRawDownloadLink();
return;
}
if (view === 'snapshot') {
thead.innerHTML = thRow([
'Object', 'Customer', 'Job ID', 'Job Name', 'Backup software', 'Backup type',
'Run ID', 'Run at (UTC)', 'Status', 'Missed', 'Override', 'Reviewed at', 'Remark'
]);
function thRow(keys) {
return '<tr>' + keys.map(function (k) { return '<th>' + escapeHtml(colLabel(k)) + '</th>'; }).join('') + '</tr>';
}
thead.innerHTML = thRow(cols);
if (!items || !items.length) {
tbody.innerHTML = '<tr><td colspan="13" class="text-center text-muted py-4">No snapshot rows found.</td></tr>';
tbody.innerHTML = '<tr><td colspan="' + String(cols.length || 1) + '" class="text-center text-muted py-4">No rows found.</td></tr>';
setRawDownloadLink();
return;
}
function td(val, nowrap) {
var c = nowrap ? ' class="text-nowrap"' : '';
return '<td' + c + '>' + escapeHtml(fmtCellValue(val)) + '</td>';
}
tbody.innerHTML = items.map(function (r) {
return (
'<tr>' +
'<td class="text-nowrap">' + escapeHtml(r.object_name || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(r.customer_name || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.job_id || '')) + '</td>' +
'<td>' + escapeHtml(r.job_name || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(r.backup_software || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(r.backup_type || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.run_id || '')) + '</td>' +
'<td class="text-nowrap">' + escapeHtml((r.run_at || '').replace('T', ' ')) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(r.status || '') + '</td>' +
'<td class="text-nowrap">' + (r.missed ? '1' : '0') + '</td>' +
'<td class="text-nowrap">' + (r.override_applied ? '1' : '0') + '</td>' +
'<td class="text-nowrap">' + escapeHtml((r.reviewed_at || '').replace('T', ' ')) + '</td>' +
'<td>' + escapeHtml(r.remark || '') + '</td>' +
cols.map(function (k) {
var val = (k === 'job_name') ? ((r.job_name !== null && r.job_name !== undefined && String(r.job_name).length) ? r.job_name : r.object_name) : r[k];
return td(val, (k === 'run_at' || k === 'reviewed_at' || k === 'job_id' || k === 'run_id' || k === 'customer_name'));
}).join('') +
'</tr>'
);
}).join('');
return;
}
thead.innerHTML = thRow([
'Object', 'Total', 'Success', 'Success (override)', 'Warning', 'Failed', 'Missed', 'Success rate (%)'
]);
if (!items || !items.length) {
tbody.innerHTML = '<tr><td colspan="8" class="text-center text-muted py-4">No summary rows found.</td></tr>';
return;
}
tbody.innerHTML = items.map(function (r) {
return (
'<tr>' +
'<td class="text-nowrap">' + escapeHtml(r.object_name || '') + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.total_runs || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.success_count || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.success_override_count || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.warning_count || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.failed_count || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.missed_count || 0)) + '</td>' +
'<td class="text-nowrap">' + escapeHtml(String(r.success_rate || 0)) + '</td>' +
'</tr>'
);
}).join('');
}
function updateRawMeta(total) {
var a = rawOffset + 1;
var b = Math.min(rawOffset + rawLimit, total);
if (!total) {
qs('rep_raw_meta').textContent = '0 rows';
} else {
qs('rep_raw_meta').textContent = a + '–' + b + ' of ' + total;
}
qs('rep_raw_prev_btn').disabled = rawOffset <= 0;
qs('rep_raw_next_btn').disabled = (rawOffset + rawLimit) >= total;
}
function setRawDownloadLink() {
if (!rawReportId) {
qs('rep_raw_download_btn').setAttribute('href', '#');
qs('rep_raw_download_btn').classList.add('disabled');
return;
}
qs('rep_raw_download_btn').classList.remove('disabled');
qs('rep_raw_download_btn').setAttribute('href', '/api/reports/' + rawReportId + '/export.csv?view=' + rawView);
setRawDownloadLink();
}
function loadRawData() {
@ -305,6 +395,12 @@
function openRawModal(id) {
rawReportId = id;
rawReportConfig = null;
var rid = String(id || '');
for (var i = 0; i < (reportsItems || []).length; i++) {
if (String(reportsItems[i].id) === rid) { rawReportConfig = (reportsItems[i].report_config || null); break; }
}
rawOffset = 0;
rawView = rawView || 'summary';
qs('rep_raw_title').textContent = 'Raw data (Report #' + id + ')';
@ -313,6 +409,7 @@
}
function renderTable(items) {
reportsItems = items || [];
var body = qs('rep_table_body');
if (!items || !items.length) {
body.innerHTML = '<tr><td colspan="5" class="text-center text-muted py-4">No reports defined yet.</td></tr>';
@ -338,9 +435,11 @@
'<td class="text-muted small">' + period + '</td>' +
'<td><span class="badge text-bg-light border">' + fmt + '</span></td>' +
'<td class="text-end">' +
'<a class="btn btn-sm btn-outline-secondary me-1" href="/reports/' + item.id + '/edit">Edit</a>' +
'<button type="button" class="btn btn-sm btn-outline-primary me-1 rep-generate-btn" data-id="' + item.id + '">Generate</button>' +
'<button type="button" class="btn btn-sm btn-outline-secondary me-1 rep-view-btn" data-id="' + item.id + '">View raw</button>' +
'<a class="btn btn-sm btn-outline-success rep-download-btn" href="/api/reports/' + item.id + '/export.csv" target="_blank" rel="noopener">Download</a>' +
'<a class="btn btn-sm btn-outline-success rep-download-btn" href="/api/reports/' + item.id + '/export?format=' + encodeURIComponent((item.output_format || 'csv').toLowerCase()) + '" target="_blank" rel="noopener">Download</a>' +
(canDeleteReports ? '<button type="button" class="btn btn-sm btn-outline-danger ms-1 rep-delete-btn" data-id="' + item.id + '">Delete</button>' : '') +
'</td>';
body.appendChild(tr);
@ -359,6 +458,14 @@
openRawModal(id);
});
});
body.querySelectorAll('.rep-delete-btn').forEach(function (btn) {
btn.addEventListener('click', function () {
var id = btn.getAttribute('data-id');
deleteReport(id, btn);
});
});
}
function loadReports() {
@ -373,6 +480,33 @@
});
}
function deleteReport(id, btnEl) {
if (!id) return;
if (!confirm('Delete this report definition? This cannot be undone.')) return;
var oldText = btnEl ? btnEl.textContent : '';
if (btnEl) { btnEl.disabled = true; btnEl.textContent = 'Deleting…'; }
fetch('/api/reports/' + id, { method: 'DELETE', credentials: 'same-origin' })
.then(function (r) { return r.json().then(function (j) { return { ok: r.ok, json: j }; }); })
.then(function (res) {
if (btnEl) { btnEl.disabled = false; btnEl.textContent = oldText; }
if (!res.ok) {
alert((res.json && res.json.error) ? res.json.error : 'Delete failed.');
return;
}
// If the raw modal is open for this report, close it.
if (rawReportId && String(rawReportId) === String(id) && rawModal) {
rawModal.hide();
rawReportId = null;
}
loadReportColumnsMeta().then(function () { loadReports(); });
})
.catch(function () {
if (btnEl) { btnEl.disabled = false; btnEl.textContent = oldText; }
alert('Delete failed.');
});
}
function generateReport(id, btnEl) {
if (!id) return;
var oldText = btnEl.textContent;
@ -430,7 +564,7 @@
loadRawData();
});
loadReports();
loadReportColumnsMeta().then(function () { loadReports(); });
});
</script>

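The reports script above repeatedly maps the legacy `object_name` column key onto `job_name` and de-duplicates the result. Extracted as a standalone pair of helpers (same logic as `uniqPreserveOrder` and `normalizeCols` in the diff), the behavior is:

```javascript
// Standalone version of the column-key normalization used in the
// reports view: legacy 'object_name' keys become 'job_name', and
// duplicates are dropped while preserving the original order.
function uniqPreserveOrder(arr) {
  var out = [];
  var seen = {};
  (arr || []).forEach(function (k) {
    var key = String(k);
    if (seen[key]) return;
    seen[key] = true;
    out.push(k);
  });
  return out;
}

function normalizeCols(arr) {
  if (!Array.isArray(arr)) return []; // non-arrays normalize to empty
  return uniqPreserveOrder(arr.map(function (k) {
    return (k === 'object_name') ? 'job_name' : k;
  }));
}
```

This is why a saved config containing both `object_name` and `job_name` renders a single "Job name" column rather than two.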
View File

@ -3,8 +3,8 @@
<div class="d-flex flex-wrap align-items-baseline justify-content-between mb-3">
<div>
<h2 class="mb-1">New report</h2>
<div class="text-muted">Create a one-time report definition. Generate output from the Reports overview.</div>
<h2 class="mb-1">{{ 'Edit report' if is_edit else 'New report' }}</h2>
<div class="text-muted">{{ 'Update this report definition. Generate output from the Reports overview.' if is_edit else 'Create a one-time report definition. Generate output from the Reports overview.' }}</div>
</div>
<div class="mt-2 mt-md-0">
<a class="btn btn-outline-secondary" href="{{ url_for('main.reports') }}">Back</a>
@ -30,11 +30,22 @@
<div class="col-12 col-md-6">
<label class="form-label">Output format</label>
<select class="form-select" id="rep_output_format">
<option value="csv" selected>CSV</option>
<option value="pdf" disabled>PDF (coming soon)</option>
<option value="csv">CSV</option>
<option value="html">HTML</option>
<option value="pdf">PDF</option>
</select>
</div>
<div class="col-12 col-md-6">
<label class="form-label">HTML/PDF content</label>
<select class="form-select" id="rep_html_content">
<option value="customers">Customers</option>
<option value="jobs">Jobs</option>
<option value="both">Customers + Jobs</option>
</select>
<div class="form-text">Controls whether the HTML/PDF output shows a customer list, a job list, or both.</div>
</div>
<div class="col-12">
<label class="form-label">Description</label>
<input type="text" class="form-control" id="rep_description" placeholder="Optional description" />
@ -94,19 +105,85 @@
<div class="col-12 col-md-6" id="rep_single_wrap">
<label class="form-label">Customer <span class="text-danger">*</span></label>
<select class="form-select" id="rep_customer_single"></select>
<select class="form-select" id="rep_customer_single">
<option value="" selected>Select a customer…</option>
{% if initial_customers %}
{% for c in initial_customers %}
<option value="{{ c.id }}">{{ c.name }}</option>
{% endfor %}
{% endif %}
</select>
<div class="form-text">For the MVP this is a simple dropdown; search will be added later.</div>
</div>
<div class="col-12 col-md-6 d-none" id="rep_multiple_wrap">
<label class="form-label">Customers <span class="text-danger">*</span></label>
<select class="form-select" id="rep_customer_multiple" multiple size="10"></select>
<select class="form-select" id="rep_customer_multiple" multiple size="10">
{% if initial_customers %}
{% for c in initial_customers %}
<option value="{{ c.id }}">{{ c.name }}</option>
{% endfor %}
{% endif %}
</select>
<div class="form-text">Hold Ctrl/Cmd to select multiple customers.</div>
</div>
<div class="col-12">
<div class="alert alert-info mb-0">
Jobs selection is set to <span class="fw-semibold">all jobs for each selected customer</span> in this iteration.
<div class="fw-semibold mb-1">Jobs filter</div>
<div class="text-muted small mb-3">
Filter which jobs are included in the report by backup software and backup type. Informational jobs (e.g. License Key) are always excluded.
</div>
</div>
<div class="col-12 col-lg-6">
<label class="form-label fw-semibold" for="rep_job_backup_software">Backup software</label>
<select class="form-select" id="rep_job_backup_software" multiple size="6">
{% for item in job_filters_meta.backup_softwares %}
<option value="{{ item.key }}">{{ item.label }}</option>
{% endfor %}
</select>
<div class="form-text">Leave empty to include all backup software.</div>
</div>
<div class="col-12 col-lg-6">
<label class="form-label fw-semibold" for="rep_job_backup_type">Backup type</label>
<select class="form-select" id="rep_job_backup_type" multiple size="6">
{% for item in job_filters_meta.backup_types %}
<option value="{{ item.key }}">{{ item.label }}</option>
{% endfor %}
</select>
<div class="form-text">Leave empty to include all backup types (except informational types).</div>
</div>
</div>
<div class="fw-semibold mb-1">Report content</div>
<div class="text-muted small mb-3">Choose which columns are included in the report and define their order. (Applies to online table views.)</div>
<div class="border rounded p-3">
<div class="d-flex flex-wrap gap-2 align-items-center justify-content-between mb-3">
<div class="btn-group" role="group" aria-label="Report content view selector">
<button type="button" class="btn btn-outline-secondary active" id="rep_cols_tab_summary">Summary</button>
<button type="button" class="btn btn-outline-secondary" id="rep_cols_tab_snapshot">Snapshot</button>
<button type="button" class="btn btn-outline-secondary" id="rep_cols_tab_jobs">Jobs</button>
</div>
<div class="text-muted small" id="rep_cols_hint">Select columns for the summary view.</div>
</div>
<div class="alert alert-info py-2 px-3 mb-3 d-none" id="rep_cols_loading">Loading available columns…</div>
<div class="alert alert-danger py-2 px-3 mb-3 d-none" id="rep_cols_error"></div>
<div class="row g-3">
<div class="col-12 col-lg-6">
<div class="fw-semibold mb-2">Available columns</div>
<div id="rep_cols_available"></div>
</div>
<div class="col-12 col-lg-6">
<div class="fw-semibold mb-2">Selected columns (drag to reorder)</div>
<div class="text-muted small mb-2">Tip: disabled items in the available list are not selectable yet (coming soon).</div>
<ul class="list-group" id="rep_cols_selected"></ul>
</div>
</div>
</div>
@ -114,7 +191,7 @@
<hr class="my-4" />
<div class="d-flex flex-wrap gap-2">
<button type="button" class="btn btn-primary" id="rep_create_btn">Create report</button>
<button type="button" class="btn btn-primary" id="rep_create_btn">{{ 'Save changes' if is_edit else 'Create report' }}</button>
<a class="btn btn-outline-secondary" href="{{ url_for('main.reports') }}">Cancel</a>
</div>
</div>
@ -154,6 +231,11 @@
</div>
<script>
window.__reportColumnsMeta = {{ columns_meta|tojson }};
window.__jobFiltersMeta = {{ job_filters_meta|tojson }};
window.__initialCustomers = {{ initial_customers|tojson }};
window.__isEdit = {{ 'true' if is_edit else 'false' }};
window.__initialReport = {{ initial_report|tojson }};
window.addEventListener('DOMContentLoaded', function () {
function qs(id) { return document.getElementById(id); }
@ -170,6 +252,500 @@
el.textContent = '';
}
var isEdit = !!window.__isEdit;
var initialReport = window.__initialReport || null;
var editReportId = (initialReport && initialReport.id) ? initialReport.id : null;
// --- HTML/PDF content selector ---
var repHtmlContent = null;
var repHtmlContentTouched = false;
if (isEdit && initialReport && initialReport.report_config && initialReport.report_config.presentation) {
var p = initialReport.report_config.presentation;
if (p && typeof p === 'object') {
repHtmlContent = (p.html_content || '').trim().toLowerCase() || null;
}
}
function defaultHtmlContentForScope(scope) {
if ((scope || '').toLowerCase() === 'single') return 'jobs';
return 'customers';
}
// --- Report content / column selector ---
var repColsView = 'summary';
var repColsMeta = window.__reportColumnsMeta || null;
// Use null to indicate "no value configured" so defaults can be applied.
// If the user explicitly saves an empty list, keep it empty.
var repColsSelected = { summary: null, snapshot: null, jobs: null };
if (isEdit && initialReport && initialReport.report_config && initialReport.report_config.columns) {
var cols = initialReport.report_config.columns;
repColsSelected = {
summary: (cols && Object.prototype.hasOwnProperty.call(cols, 'summary') && Array.isArray(cols.summary)) ? cols.summary.slice() : null,
snapshot: (cols && Object.prototype.hasOwnProperty.call(cols, 'snapshot') && Array.isArray(cols.snapshot)) ? cols.snapshot.slice() : null,
jobs: (cols && Object.prototype.hasOwnProperty.call(cols, 'jobs') && Array.isArray(cols.jobs)) ? cols.jobs.slice() : null,
};
}
function uniqPreserveOrder(arr) {
var out = [];
var seen = {};
(arr || []).forEach(function (k) {
var key = String(k);
if (seen[key]) return;
seen[key] = true;
out.push(k);
});
return out;
}
function normalizeJobNameColumns() {
// Merge legacy object_name into job_name (single logical column: "Job name").
if (repColsMeta && repColsMeta.groups) {
repColsMeta.groups.forEach(function (g) {
var items = g.items || [];
var jobItem = null;
var objItem = null;
items.forEach(function (it) {
if (!it) return;
if (it.key === 'job_name') jobItem = it;
if (it.key === 'object_name') objItem = it;
});
if (objItem && !jobItem) {
// convert object_name into job_name
objItem.key = 'job_name';
objItem.label = 'Job name';
jobItem = objItem;
objItem = null;
}
if (jobItem) {
jobItem.label = 'Job name';
// Merge views if object_name existed in the same group
if (objItem && Array.isArray(objItem.views)) {
var merged = (Array.isArray(jobItem.views) ? jobItem.views.slice() : []);
objItem.views.forEach(function (v) {
if (merged.indexOf(v) < 0) merged.push(v);
});
jobItem.views = merged;
}
}
// Remove any remaining object_name items
g.items = (g.items || []).filter(function (it) { return it && it.key !== 'object_name'; });
// Remove duplicate job_name items within the same group
var seenJob = false;
g.items = (g.items || []).filter(function (it) {
if (!it) return false;
if (it.key !== 'job_name') return true;
if (!seenJob) {
seenJob = true;
it.label = 'Job name';
return true;
}
return false;
});
});
// Ensure job_name only appears once across all groups.
var seenGlobalJob = false;
repColsMeta.groups.forEach(function (g2) {
g2.items = (g2.items || []).filter(function (it2) {
if (!it2) return false;
if (it2.key !== 'job_name') return true;
if (!seenGlobalJob) {
seenGlobalJob = true;
it2.label = 'Job name';
return true;
}
return false;
});
});
}
// Defaults: replace object_name -> job_name and de-duplicate
if (repColsMeta && repColsMeta.defaults) {
['summary', 'snapshot', 'jobs'].forEach(function (v) {
var d = repColsMeta.defaults[v];
if (!Array.isArray(d)) return;
repColsMeta.defaults[v] = uniqPreserveOrder(d.map(function (k) { return (k === 'object_name') ? 'job_name' : k; }));
});
}
// Selected: replace object_name -> job_name and de-duplicate
['summary', 'snapshot', 'jobs'].forEach(function (v) {
if (repColsSelected[v] === null || typeof repColsSelected[v] === 'undefined') return;
if (!Array.isArray(repColsSelected[v])) return;
repColsSelected[v] = uniqPreserveOrder(repColsSelected[v].map(function (k) { return (k === 'object_name') ? 'job_name' : k; }));
});
}
// --- Job filters (backup software / backup type) ---
var jobFiltersMeta = window.__jobFiltersMeta || null;
var repJobFiltersSelected = { backup_softwares: [], backup_types: [] };
if (isEdit && initialReport && initialReport.report_config && initialReport.report_config.filters) {
var f = initialReport.report_config.filters;
repJobFiltersSelected = {
backup_softwares: Array.isArray(f.backup_softwares) ? f.backup_softwares.slice() : [],
backup_types: Array.isArray(f.backup_types) ? f.backup_types.slice() : [],
};
}
function getSelectedValues(selectEl) {
var out = [];
if (!selectEl) return out;
for (var i = 0; i < selectEl.options.length; i++) {
var opt = selectEl.options[i];
if (opt && opt.selected) out.push(opt.value);
}
return out;
}
function setSelectedValues(selectEl, values) {
if (!selectEl) return;
var set = {};
(values || []).forEach(function (v) { set[String(v)] = true; });
for (var i = 0; i < selectEl.options.length; i++) {
var opt = selectEl.options[i];
if (!opt) continue;
opt.selected = !!set[String(opt.value)];
}
}
function rebuildBackupTypeOptions(allowedTypes, preserveSelected) {
var sel = qs('rep_job_backup_type');
if (!sel || !jobFiltersMeta || !Array.isArray(jobFiltersMeta.backup_types)) return;
var currentSelected = preserveSelected ? getSelectedValues(sel) : [];
var allowedSet = null;
if (Array.isArray(allowedTypes) && allowedTypes.length) {
allowedSet = {};
allowedTypes.forEach(function (v) { allowedSet[String(v)] = true; });
}
// rebuild options
sel.innerHTML = '';
jobFiltersMeta.backup_types.forEach(function (item) {
if (!item || !item.key) return;
if (String(item.key).toLowerCase() === 'license key') return; // always excluded
if (allowedSet && !allowedSet[String(item.key)]) return;
var opt = document.createElement('option');
opt.value = item.key;
opt.textContent = item.label || item.key;
sel.appendChild(opt);
});
// restore selection (only if still exists)
var finalSelected = [];
currentSelected.forEach(function (v) {
for (var i = 0; i < sel.options.length; i++) {
if (sel.options[i].value === v) finalSelected.push(v);
}
});
setSelectedValues(sel, finalSelected);
}
function onBackupSoftwareChange() {
var swSel = qs('rep_job_backup_software');
var selected = getSelectedValues(swSel);
if (!jobFiltersMeta || !jobFiltersMeta.by_backup_software) {
rebuildBackupTypeOptions(null, true);
return;
}
// union types for selected software(s)
if (!selected.length) {
rebuildBackupTypeOptions(null, true);
return;
}
var union = {};
selected.forEach(function (sw) {
var arr = jobFiltersMeta.by_backup_software[sw];
if (Array.isArray(arr)) {
arr.forEach(function (t) { union[String(t)] = true; });
}
});
var allowed = Object.keys(union);
rebuildBackupTypeOptions(allowed, true);
}
// Initialize job filter selects
(function initJobFilters() {
var swSel = qs('rep_job_backup_software');
if (swSel) {
swSel.addEventListener('change', onBackupSoftwareChange);
setSelectedValues(swSel, repJobFiltersSelected.backup_softwares);
}
onBackupSoftwareChange();
var btSel = qs('rep_job_backup_type');
if (btSel) setSelectedValues(btSel, repJobFiltersSelected.backup_types);
})();
function colsHintText(viewKey) {
if (viewKey === 'snapshot') return 'Select columns for the snapshot view.';
if (viewKey === 'jobs') return 'Select columns for the jobs view (HTML/PDF).';
return 'Select columns for the summary view.';
}
function setColsView(viewKey) {
repColsView = (viewKey === 'snapshot' || viewKey === 'jobs') ? viewKey : 'summary';
var a = qs('rep_cols_tab_summary');
var b = qs('rep_cols_tab_snapshot');
var c = qs('rep_cols_tab_jobs');
a.classList.toggle('active', repColsView === 'summary');
b.classList.toggle('active', repColsView === 'snapshot');
c.classList.toggle('active', repColsView === 'jobs');
qs('rep_cols_hint').textContent = colsHintText(repColsView);
renderColsAvailable();
renderColsSelected();
}
function showColsError(msg) {
var el = qs('rep_cols_error');
el.textContent = msg || 'Failed to load columns.';
el.classList.remove('d-none');
}
function clearColsError() {
var el = qs('rep_cols_error');
el.classList.add('d-none');
el.textContent = '';
}
function showColsLoading(on) {
var el = qs('rep_cols_loading');
if (on) el.classList.remove('d-none');
else el.classList.add('d-none');
}
function ensureDefaultsFromMeta() {
if (!repColsMeta || !repColsMeta.defaults) return;
['summary', 'snapshot', 'jobs'].forEach(function (v) {
if (repColsSelected[v] === null || typeof repColsSelected[v] === 'undefined') {
repColsSelected[v] = (repColsMeta.defaults[v] || []).slice();
}
});
}
function renderColsAvailable() {
var host = qs('rep_cols_available');
if (!host) return;
host.innerHTML = '';
if (!repColsMeta || !repColsMeta.groups) {
host.innerHTML = '<div class="text-muted">No column metadata available.</div>';
return;
}
function isSelected(key) {
return (repColsSelected[repColsView] || []).indexOf(key) >= 0;
}
function toggleKey(key, checked) {
var arr = repColsSelected[repColsView] || [];
var idx = arr.indexOf(key);
if (checked && idx < 0) arr.push(key);
if (!checked && idx >= 0) arr.splice(idx, 1);
repColsSelected[repColsView] = arr;
renderColsSelected();
}
repColsMeta.groups.forEach(function (g) {
var items = (g.items || []).filter(function (it) {
var v = it.views || [];
return v.indexOf(repColsView) >= 0;
});
if (!items.length) return;
var groupEl = document.createElement('div');
groupEl.className = 'mb-3';
var title = document.createElement('div');
title.className = 'fw-semibold mb-2';
title.textContent = g.name || 'Columns';
groupEl.appendChild(title);
items.forEach(function (it) {
var enabled = (it.enabled !== false);
var id = 'rep_col_' + repColsView + '_' + (it.key || '').replace(/[^a-zA-Z0-9_]/g, '_');
var wrap = document.createElement('div');
wrap.className = 'form-check';
var cb = document.createElement('input');
cb.className = 'form-check-input';
cb.type = 'checkbox';
cb.id = id;
cb.disabled = !enabled;
cb.checked = isSelected(it.key);
cb.addEventListener('change', function () {
toggleKey(it.key, cb.checked);
});
var lbl = document.createElement('label');
lbl.className = 'form-check-label';
lbl.setAttribute('for', id);
lbl.textContent = it.label || it.key || '';
if (!enabled) {
lbl.classList.add('text-muted');
lbl.textContent = (lbl.textContent || '') + ' (coming soon)';
}
wrap.appendChild(cb);
wrap.appendChild(lbl);
groupEl.appendChild(wrap);
});
host.appendChild(groupEl);
});
}
function renderColsSelected() {
var host = qs('rep_cols_selected');
if (!host) return;
host.innerHTML = '';
var arr = repColsSelected[repColsView] || [];
if (!arr.length) {
host.innerHTML = '<li class="list-group-item text-muted">No columns selected.</li>';
return;
}
function labelForKey(key) {
if (!repColsMeta || !repColsMeta.groups) return key;
for (var i = 0; i < repColsMeta.groups.length; i++) {
var items = repColsMeta.groups[i].items || [];
for (var j = 0; j < items.length; j++) {
if (items[j].key === key) return items[j].label || key;
}
}
return key;
}
function removeKey(key) {
var i = arr.indexOf(key);
if (i >= 0) arr.splice(i, 1);
repColsSelected[repColsView] = arr;
renderColsAvailable();
renderColsSelected();
}
arr.forEach(function (key) {
var li = document.createElement('li');
li.className = 'list-group-item d-flex align-items-center justify-content-between gap-2';
li.setAttribute('draggable', 'true');
li.dataset.key = key;
var left = document.createElement('div');
left.className = 'd-flex align-items-center gap-2';
var handle = document.createElement('span');
handle.className = 'text-muted';
handle.textContent = '↕';
var txt = document.createElement('span');
txt.textContent = labelForKey(key);
left.appendChild(handle);
left.appendChild(txt);
var btn = document.createElement('button');
btn.type = 'button';
btn.className = 'btn btn-sm btn-outline-danger';
btn.textContent = 'Remove';
btn.addEventListener('click', function () { removeKey(key); });
li.appendChild(left);
li.appendChild(btn);
li.addEventListener('dragstart', function (ev) {
ev.dataTransfer.setData('text/plain', key);
ev.dataTransfer.effectAllowed = 'move';
});
li.addEventListener('dragover', function (ev) {
ev.preventDefault();
ev.dataTransfer.dropEffect = 'move';
});
li.addEventListener('drop', function (ev) {
ev.preventDefault();
var draggedKey = ev.dataTransfer.getData('text/plain');
if (!draggedKey || draggedKey === key) return;
var from = arr.indexOf(draggedKey);
var to = arr.indexOf(key);
if (from < 0 || to < 0) return;
arr.splice(from, 1);
arr.splice(to, 0, draggedKey);
repColsSelected[repColsView] = arr;
renderColsSelected();
});
host.appendChild(li);
});
}
function loadReportColumns() {
var area = document.getElementById('rep_cols_available');
if (!area) return;
if (repColsMeta) {
showColsLoading(false);
clearColsError();
normalizeJobNameColumns();
ensureDefaultsFromMeta();
qs('rep_cols_tab_summary').addEventListener('click', function () { setColsView('summary'); });
qs('rep_cols_tab_snapshot').addEventListener('click', function () { setColsView('snapshot'); });
qs('rep_cols_tab_jobs').addEventListener('click', function () { setColsView('jobs'); });
setColsView('summary');
return;
}
showColsLoading(true);
clearColsError();
fetch('/api/reports/columns', { credentials: 'same-origin' })
.then(function (r) { return r.json().then(function (j) { return { ok: r.ok, json: j }; }); })
.then(function (res) {
showColsLoading(false);
if (!res.ok) {
showColsError((res.json && res.json.error) ? res.json.error : 'Failed to load columns.');
return;
}
repColsMeta = res.json || null;
normalizeJobNameColumns();
ensureDefaultsFromMeta();
// bind tabs once metadata is ready
qs('rep_cols_tab_summary').addEventListener('click', function () { setColsView('summary'); });
qs('rep_cols_tab_snapshot').addEventListener('click', function () { setColsView('snapshot'); });
qs('rep_cols_tab_jobs').addEventListener('click', function () { setColsView('jobs'); });
setColsView('summary');
})
.catch(function () {
showColsLoading(false);
showColsError('Failed to load columns.');
});
}
function pad2(n) { return (n < 10 ? '0' : '') + String(n); }
function setDateTime(prefix, d) {
@ -177,6 +753,15 @@
qs(prefix + '_time').value = pad2(d.getUTCHours()) + ':' + pad2(d.getUTCMinutes());
}
function setDateTimeFromIso(prefix, iso) {
var s = (iso || '').trim();
if (!s) return;
var m = s.match(/^(\d{4})-(\d{2})-(\d{2})(?:T|\s)(\d{2}):(\d{2})/);
if (!m) return;
qs(prefix + '_date').value = m[1] + '-' + m[2] + '-' + m[3];
qs(prefix + '_time').value = m[4] + ':' + m[5];
}
function buildIso(dateStr, timeStr, fallbackTime) {
var d = (dateStr || '').trim();
var t = (timeStr || '').trim() || (fallbackTime || '00:00');
@ -222,9 +807,41 @@
var scope = selectedScope();
qs('rep_single_wrap').classList.toggle('d-none', scope !== 'single');
qs('rep_multiple_wrap').classList.toggle('d-none', scope !== 'multiple');
// Set a sensible default for HTML content selection when the user hasn't chosen one yet.
if (!isEdit && !repHtmlContentTouched) {
qs('rep_html_content').value = defaultHtmlContentForScope(scope);
}
}
function applyCustomerSelection() {
if (!isEdit || !initialReport) return;
var scope = (initialReport.customer_scope || 'all');
var ids = initialReport.customer_ids || [];
if (scope === 'single') {
var singleSel = qs('rep_customer_single');
if (singleSel && ids.length === 1) {
singleSel.value = String(ids[0]);
}
} else if (scope === 'multiple') {
var multiSel = qs('rep_customer_multiple');
if (multiSel && multiSel.options) { // options is an HTMLOptionsCollection, not an Array
for (var i = 0; i < multiSel.options.length; i++) {
var opt = multiSel.options[i];
opt.selected = (ids.indexOf(parseInt(opt.value, 10)) >= 0);
}
}
}
}
function loadCustomers() {
var initialCustomers = window.__initialCustomers || null;
if (initialCustomers && Array.isArray(initialCustomers) && initialCustomers.length) {
// Already rendered server-side. Keep the selects usable even if API calls fail.
applyCustomerSelection();
return;
}
qs('rep_customer_single').innerHTML = '<option value="" selected>Loading…</option>';
qs('rep_customer_multiple').innerHTML = '';
@ -249,12 +866,42 @@
opt2.textContent = c.name || ('Customer ' + c.id);
qs('rep_customer_multiple').appendChild(opt2);
});
applyCustomerSelection();
})
.catch(function () {
qs('rep_customer_single').innerHTML = '<option value="" selected>Failed to load customers</option>';
});
}
function applyInitialReport() {
if (!isEdit || !initialReport) return;
qs('rep_name').value = initialReport.name || '';
qs('rep_description').value = initialReport.description || '';
qs('rep_output_format').value = (initialReport.output_format || 'csv');
// HTML/PDF content selection (defaults handled below)
if (repHtmlContent) {
qs('rep_html_content').value = repHtmlContent;
}
setDateTimeFromIso('rep_start', initialReport.period_start || '');
setDateTimeFromIso('rep_end', initialReport.period_end || '');
// scope + customer selections
var scope = (initialReport.customer_scope || 'all');
qs('rep_scope_single').checked = (scope === 'single');
qs('rep_scope_multiple').checked = (scope === 'multiple');
qs('rep_scope_all').checked = (scope === 'all');
updateScopeUi();
applyCustomerSelection();
if (!repHtmlContent) {
qs('rep_html_content').value = defaultHtmlContentForScope(scope);
}
}
function validate(payload) {
if (!payload.name) return 'Report name is required.';
if (!payload.period_start || !payload.period_end) return 'Start and end period are required.';
@ -272,6 +919,11 @@
function createReport() {
clearError();
if (isEdit && !editReportId) {
showError('Missing report id.');
return;
}
var scope = selectedScope();
var customerIds = [];
if (scope === 'single') {
@ -294,7 +946,20 @@
customer_scope: scope,
customer_ids: customerIds,
period_start: buildIso(qs('rep_start_date').value, qs('rep_start_time').value, '00:00'),
period_end: buildIso(qs('rep_end_date').value, qs('rep_end_time').value, '23:59')
period_end: buildIso(qs('rep_end_date').value, qs('rep_end_time').value, '23:59'),
report_config: {
columns: repColsSelected,
columns_version: 1,
filters: {
backup_softwares: getSelectedValues(qs('rep_job_backup_software')),
backup_types: getSelectedValues(qs('rep_job_backup_type')),
filters_version: 1
},
presentation: {
html_content: (qs('rep_html_content').value || '').trim().toLowerCase(),
presentation_version: 1
}
}
};
if (!payload.description) delete payload.description;
@ -308,10 +973,13 @@
var btn = qs('rep_create_btn');
var oldText = btn.textContent;
btn.disabled = true;
btn.textContent = 'Creating…';
btn.textContent = (isEdit ? 'Saving…' : 'Creating…');
fetch('/api/reports', {
method: 'POST',
var url = isEdit ? ('/api/reports/' + editReportId) : '/api/reports';
var method = isEdit ? 'PUT' : 'POST';
fetch(url, {
method: method,
credentials: 'same-origin',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(payload)
@ -321,7 +989,7 @@
btn.disabled = false;
btn.textContent = oldText;
if (!res.ok) {
showError((res.json && res.json.error) ? res.json.error : 'Create failed.');
showError((res.json && res.json.error) ? res.json.error : (isEdit ? 'Save failed.' : 'Create failed.'));
return;
}
window.location.href = '{{ url_for('main.reports') }}';
@ -329,21 +997,27 @@
.catch(function () {
btn.disabled = false;
btn.textContent = oldText;
showError('Create failed.');
showError(isEdit ? 'Save failed.' : 'Create failed.');
});
}
// Defaults
// Defaults / initial values
if (isEdit) {
applyInitialReport();
} else {
var now = todayUtc();
var end = new Date(Date.UTC(now.getUTCFullYear(), now.getUTCMonth(), now.getUTCDate(), now.getUTCHours(), now.getUTCMinutes(), 0));
var start = new Date(end.getTime() - (7 * 24 * 60 * 60 * 1000));
setDateTime('rep_start', start);
setDateTime('rep_end', end);
}
document.querySelectorAll('input[name="rep_scope"]').forEach(function (r) {
r.addEventListener('change', updateScopeUi);
});
qs('rep_html_content').addEventListener('change', function () { repHtmlContentTouched = true; });
qs('rep_preset_cur_month').addEventListener('click', presetCurrentMonth);
qs('rep_preset_last_month').addEventListener('click', presetLastMonth);
qs('rep_preset_last_month_full').addEventListener('click', presetLastMonthFull);
@ -351,6 +1025,9 @@
updateScopeUi();
loadCustomers();
// init column selector
loadReportColumns();
});
</script>

View File

@ -144,9 +144,22 @@
overflow: auto;
}
#runChecksModal #rcm_body_iframe { height: 100%; }
#runChecksModal .rcm-mail-panel { flex: 1 1 auto; min-height: 0; }
#runChecksModal .rcm-objects-scroll { max-height: 25vh; overflow: auto; }
#runChecksModal #rcm_body_iframe {
flex: 1 1 auto;
min-height: 0;
height: auto;
}
#runChecksModal .rcm-mail-panel {
display: flex;
flex-direction: column;
flex: 1 1 auto;
min-height: 0;
}
#runChecksModal .rcm-objects-scroll {
max-height: 25vh;
overflow: auto;
margin-top: 0.5rem;
}
</style>
<div class="modal fade" id="runChecksModal" tabindex="-1" aria-labelledby="runChecksModalLabel" aria-hidden="true">
@ -177,18 +190,21 @@
<div id="rcm_runs_list" class="list-group"></div>
</div>
<div class="col-md-9 rcm-detail-col">
<dl class="row mb-3">
<dl class="row mb-3 dl-compact">
<dt class="col-3">From</dt>
<dd class="col-9" id="rcm_from"></dd>
<dd class="col-9 ellipsis-field" id="rcm_from"></dd>
<dt class="col-3">Subject</dt>
<dd class="col-9" id="rcm_subject"></dd>
<dd class="col-9 ellipsis-field" id="rcm_subject"></dd>
<dt class="col-3">Received</dt>
<dd class="col-9" id="rcm_received"></dd>
<dd class="col-9 ellipsis-field" id="rcm_received"></dd>
<dt class="col-3">Indicator</dt>
<dd class="col-9" id="rcm_status"></dd>
<dd class="col-9 ellipsis-field" id="rcm_status"></dd>
<dt class="col-3">Overall remark</dt>
<dd class="col-9" id="rcm_overall_message" style="white-space: pre-wrap;"></dd>
<dt class="col-3">Remark</dt>
<dd class="col-9" id="rcm_remark" style="white-space: pre-wrap;"></dd>
@ -205,7 +221,7 @@
<button type="button" class="btn btn-sm btn-outline-primary" id="rcm_ticket_save">Add</button>
</div>
<div class="mt-2">
<textarea class="form-control form-control-sm" id="rcm_ticket_description" rows="2" placeholder="Description (optional)"></textarea>
<input class="form-control form-control-sm" id="rcm_ticket_code" type="text" placeholder="Ticket number (e.g., T20260106.0001)" />
</div>
<div class="mt-2 small text-muted" id="rcm_ticket_status"></div>
</div>
@ -239,7 +255,6 @@
</div>
<div>
<h6>Objects</h6>
<div class="table-responsive rcm-objects-scroll">
<table class="table table-sm table-bordered" id="rcm_objects_table">
<thead class="table-light" style="position: sticky; top: 0; z-index: 1;">
@ -285,6 +300,7 @@
var currentPayload = null;
var btnMarkAllReviewed = document.getElementById('rcm_mark_all_reviewed');
var btnMarkSuccessOverride = document.getElementById('rcm_mark_success_override');
// Shift-click range selection for checkbox rows
var lastCheckedCb = null;
@ -322,15 +338,54 @@ function statusClass(status) {
return "";
}
function objectSeverityRank(o) {
var st = String((o && o.status) || '').trim().toLowerCase();
var err = String((o && o.error_message) || '').trim();
if (st === 'error' || st === 'failed' || st === 'failure' || err) return 0;
if (st === 'warning') return 1;
return 2;
}
function sortObjects(objects) {
return (objects || []).slice().sort(function (a, b) {
var ra = objectSeverityRank(a);
var rb = objectSeverityRank(b);
if (ra !== rb) return ra - rb;
var na = String((a && a.name) || '').toLowerCase();
var nb = String((b && b.name) || '').toLowerCase();
if (na < nb) return -1;
if (na > nb) return 1;
var ta = String((a && a.type) || '').toLowerCase();
var tb = String((b && b.type) || '').toLowerCase();
if (ta < tb) return -1;
if (ta > tb) return 1;
return 0;
});
}
function wrapMailHtml(html) {
html = html || "";
return (
"<!doctype html><html><head><meta charset=\"utf-8\">" +
"<base target=\"_blank\">" +
"</head><body style=\"margin:0; padding:8px;\">" +
html +
"</body></html>"
);
var trimmed = (typeof html === "string") ? html.trim() : "";
var injection = '<meta charset="utf-8"><meta name="color-scheme" content="light"><meta name="supported-color-schemes" content="light"><meta name="viewport" content="width=device-width, initial-scale=1"><base target="_blank"><style>:root{color-scheme:light;}html{color-scheme:light;}body{margin:0;padding:8px;background:#fff;forced-color-adjust:none;-ms-high-contrast-adjust:none;}</style>';
function injectIntoFullDoc(doc) {
var d = doc || "";
if (/<head[^>]*>/i.test(d)) {
return d.replace(/<head[^>]*>/i, function (m) { return m + injection; });
}
if (/<html[^>]*>/i.test(d)) {
return d.replace(/<html[^>]*>/i, function (m) { return m + "<head>" + injection + "</head>"; });
}
return "<!doctype html><html><head>" + injection + "</head><body>" + d + "</body></html>";
}
if (trimmed.toLowerCase().indexOf("<!doctype") === 0 || trimmed.toLowerCase().indexOf("<html") === 0) {
return injectIntoFullDoc(trimmed);
}
return "<!doctype html><html><head>" + injection + "</head><body>" + html + "</body></html>";
}
function escapeHtml(s) {
@ -645,6 +700,25 @@ table.addEventListener('change', function (e) {
});
}
if (btnMarkSuccessOverride) {
btnMarkSuccessOverride.addEventListener('click', function () {
if (!currentRunId) return;
btnMarkSuccessOverride.disabled = true;
apiJson('/api/run-checks/mark-success-override', {
method: 'POST',
body: JSON.stringify({ run_id: currentRunId })
})
.then(function (j) {
if (!j || j.status !== 'ok') throw new Error((j && j.message) || 'Failed');
window.location.reload();
})
.catch(function (e) {
alert((e && e.message) ? e.message : 'Failed to mark as success (override).');
btnMarkSuccessOverride.disabled = false;
});
});
}
function renderAlerts(payload) {
var box = document.getElementById('rcm_alerts');
if (!box) return;
@ -661,33 +735,21 @@ table.addEventListener('change', function (e) {
html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
tickets.forEach(function (t) {
var status = t.resolved_at ? 'Resolved' : 'Active';
var ticketCode = (t.ticket_code || '').toString();
html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Ticket">🎫</span>' +
'<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
'<span class="fw-semibold">' + escapeHtml(ticketCode) + '</span>' +
'<button type="button" class="btn btn-sm btn-outline-secondary ms-2 py-0 px-1" title="Copy ticket number" data-action="copy-ticket" data-code="' + escapeHtml(ticketCode) + '"></button>' +
'<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
(t.description ? ('<div class="small text-muted mt-1">' + escapeHtml(t.description) + '</div>') : '') +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="toggle-edit-ticket" data-id="' + t.id + '" ' + (t.resolved_at ? 'disabled' : '') + '>Edit</button>' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-ticket" data-id="' + t.id + '" ' + (t.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'</div>' +
'</div>' +
'<div class="mt-2" data-edit="ticket" style="display:none;">' +
'<div class="row g-2">' +
'<div class="col-12">' +
'<textarea class="form-control form-control-sm" data-field="description" rows="2" placeholder="Description (optional)">' + escapeHtml(t.description || '') + '</textarea>' +
'</div>' +
'<div class="col-12 d-flex gap-2">' +
'<button type="button" class="btn btn-sm btn-primary" data-action="save-ticket" data-id="' + t.id + '">Save</button>' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="cancel-edit" data-id="' + t.id + '">Cancel</button>' +
'<div class="small text-muted align-self-center" data-field="status"></div>' +
'</div>' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
@ -696,34 +758,21 @@ table.addEventListener('change', function (e) {
if (remarks.length) {
html += '<div class="mb-2"><strong>Remarks</strong><div class="mt-1">';
remarks.forEach(function (r) {
var status2 = r.resolved_at ? 'Resolved' : 'Active';
var status = r.resolved_at ? 'Resolved' : 'Active';
html += '<div class="mb-2 border rounded p-2" data-alert-type="remark" data-id="' + r.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Remark">💬</span>' +
'<span class="fw-semibold">Remark</span>' +
'<span class="ms-2 badge ' + (r.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status2 + '</span>' +
'<span class="ms-2 badge ' + (r.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
(r.body ? ('<div class="small text-muted mt-1">' + escapeHtml(r.body) + '</div>') : '') +
'</div>' +
'<div class="d-flex gap-1 flex-shrink-0">' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="toggle-edit-remark" data-id="' + r.id + '" ' + (r.resolved_at ? 'disabled' : '') + '>Edit</button>' +
'<button type="button" class="btn btn-sm btn-outline-success" data-action="resolve-remark" data-id="' + r.id + '" ' + (r.resolved_at ? 'disabled' : '') + '>Resolve</button>' +
'</div>' +
'</div>' +
'<div class="mt-2" data-edit="remark" style="display:none;">' +
'<div class="row g-2">' +
'<div class="col-12">' +
'<textarea class="form-control form-control-sm" data-field="body" rows="2" placeholder="Body (required)">' + escapeHtml(r.body || '') + '</textarea>' +
'</div>' +
'<div class="col-12 d-flex gap-2">' +
'<button type="button" class="btn btn-sm btn-primary" data-action="save-remark" data-id="' + r.id + '">Save</button>' +
'<button type="button" class="btn btn-sm btn-outline-secondary" data-action="cancel-edit" data-id="' + r.id + '">Cancel</button>' +
'<div class="small text-muted align-self-center" data-field="status"></div>' +
'</div>' +
'</div>' +
'</div>' +
'</div>';
});
html += '</div></div>';
@ -736,8 +785,29 @@ table.addEventListener('change', function (e) {
ev.preventDefault();
var action = btn.getAttribute('data-action');
var id = btn.getAttribute('data-id');
if (!action || !id) return;
var wrapper = btn.closest('[data-alert-type]');
if (!action) return;
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(code)
.then(function () {
var original = btn.textContent;
btn.textContent = '✓';
setTimeout(function () { btn.textContent = original; }, 800);
})
.catch(function () {
// Fallback: select/copy via prompt
window.prompt('Copy ticket number:', code);
});
} else {
window.prompt('Copy ticket number:', code);
}
return;
}
if (!id) return;
if (action === 'resolve-ticket') {
if (!confirm('Mark ticket as resolved?')) return;
@ -749,58 +819,6 @@ table.addEventListener('change', function (e) {
apiJson('/api/remarks/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) { alert(e.message || 'Failed.'); });
} else if (action === 'toggle-edit-ticket') {
if (!wrapper) return;
var edit = wrapper.querySelector('[data-edit="ticket"]');
if (!edit) return;
edit.style.display = (edit.style.display === 'none' || !edit.style.display) ? '' : 'none';
} else if (action === 'toggle-edit-remark') {
if (!wrapper) return;
var edit2 = wrapper.querySelector('[data-edit="remark"]');
if (!edit2) return;
edit2.style.display = (edit2.style.display === 'none' || !edit2.style.display) ? '' : 'none';
} else if (action === 'cancel-edit') {
if (!wrapper) return;
var editAny = wrapper.querySelector('[data-edit]');
if (editAny) editAny.style.display = 'none';
} else if (action === 'save-ticket') {
if (!wrapper) return;
var editT = wrapper.querySelector('[data-edit="ticket"]');
if (!editT) return;
var descEl = editT.querySelector('[data-field="description"]');
var statusEl2 = editT.querySelector('[data-field="status"]');
var descVal = descEl ? descEl.value : '';
if (statusEl2) statusEl2.textContent = 'Saving...';
apiJson('/api/tickets/' + encodeURIComponent(id), {
method: 'PATCH',
body: JSON.stringify({description: descVal})
})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) {
if (statusEl2) statusEl2.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
} else if (action === 'save-remark') {
if (!wrapper) return;
var editR = wrapper.querySelector('[data-edit="remark"]');
if (!editR) return;
var bodyEl = editR.querySelector('[data-field="body"]');
var statusEl3 = editR.querySelector('[data-field="status"]');
var bodyVal = bodyEl ? bodyEl.value : '';
if (!bodyVal || !bodyVal.trim()) {
if (statusEl3) statusEl3.textContent = 'Body is required.';
return;
}
if (statusEl3) statusEl3.textContent = 'Saving...';
apiJson('/api/remarks/' + encodeURIComponent(id), {
method: 'PATCH',
body: JSON.stringify({body: bodyVal})
})
.then(function () { loadAlerts(currentRunId); })
.catch(function (e) {
if (statusEl3) statusEl3.textContent = e.message || 'Failed.';
else alert(e.message || 'Failed.');
});
}
});
});
@ -825,7 +843,7 @@ table.addEventListener('change', function (e) {
function bindInlineCreateForms() {
var btnTicket = document.getElementById('rcm_ticket_save');
var btnRemark = document.getElementById('rcm_remark_save');
var tDesc = document.getElementById('rcm_ticket_description');
var tCode = document.getElementById('rcm_ticket_code');
var tStatus = document.getElementById('rcm_ticket_status');
var rBody = document.getElementById('rcm_remark_body');
var rStatus = document.getElementById('rcm_remark_status');
@ -838,7 +856,7 @@ table.addEventListener('change', function (e) {
function setDisabled(disabled) {
if (btnTicket) btnTicket.disabled = disabled;
if (btnRemark) btnRemark.disabled = disabled;
if (tDesc) tDesc.disabled = disabled;
if (tCode) tCode.disabled = disabled;
if (rBody) rBody.disabled = disabled;
}
@ -849,14 +867,24 @@ table.addEventListener('change', function (e) {
btnTicket.addEventListener('click', function () {
if (!currentRunId) { alert('Select a run first.'); return; }
clearStatus();
var description = tDesc ? tDesc.value : '';
var ticket_code = tCode ? (tCode.value || '').trim().toUpperCase() : '';
if (!ticket_code) {
if (tStatus) tStatus.textContent = 'Ticket number is required.';
else alert('Ticket number is required.');
return;
}
if (!/^T\d{8}\.\d{4}$/.test(ticket_code)) {
if (tStatus) tStatus.textContent = 'Invalid ticket number format. Expected TYYYYMMDD.####.';
else alert('Invalid ticket number format. Expected TYYYYMMDD.####.');
return;
}
if (tStatus) tStatus.textContent = 'Saving...';
apiJson('/api/tickets', {
method: 'POST',
body: JSON.stringify({job_run_id: currentRunId, description: description})
body: JSON.stringify({job_run_id: currentRunId, ticket_code: ticket_code})
})
.then(function () {
if (tDesc) tDesc.value = '';
if (tCode) tCode.value = '';
if (tStatus) tStatus.textContent = '';
loadAlerts(currentRunId);
})
@ -924,10 +952,16 @@ table.addEventListener('change', function (e) {
stEl.innerHTML = (d ? ('<span class="status-dot ' + d + '" aria-hidden="true"></span>') : '');
}
document.getElementById('rcm_remark').textContent = run.remark || '';
document.getElementById('rcm_overall_message').textContent = run.overall_message || '';
currentRunId = run.id || null;
if (window.__rcmClearCreateStatus) window.__rcmClearCreateStatus();
if (window.__rcmSetCreateDisabled) window.__rcmSetCreateDisabled(!currentRunId);
if (btnMarkSuccessOverride) {
var _rs = (run.status || '').toString().toLowerCase();
var _canOverride = !!currentRunId && !run.missed && (_rs.indexOf('override') === -1) && (_rs.indexOf('success') === -1);
btnMarkSuccessOverride.disabled = !_canOverride;
}
loadAlerts(currentRunId);
var mail = run.mail || null;
@ -960,7 +994,7 @@ table.addEventListener('change', function (e) {
var tbody = document.querySelector('#rcm_objects_table tbody');
if (tbody) {
tbody.innerHTML = '';
(run.objects || []).forEach(function (obj) {
sortObjects(run.objects || []).forEach(function (obj) {
var tr = document.createElement('tr');
var tdName = document.createElement('td');
@ -991,6 +1025,7 @@ table.addEventListener('change', function (e) {
currentJobId = jobId;
if (btnMarkAllReviewed) btnMarkAllReviewed.disabled = true;
if (btnMarkSuccessOverride) btnMarkSuccessOverride.disabled = true;
var modalEl = document.getElementById('runChecksModal');
var modal = bootstrap.Modal.getOrCreateInstance(modalEl);

View File

@ -160,8 +160,30 @@
{% if users %}
{% for user in users %}
<tr>
{% set is_last_admin = ('admin' in user.roles and (admin_users_count or 0) <= 1) %}
<td>{{ user.username }}</td>
<td>{{ (user.role or '')|replace(',', ', ') }}</td>
<td>
<form method="post" action="{{ url_for('main.settings_users_update_roles', user_id=user.id) }}" class="d-flex flex-wrap gap-2 align-items-center">
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="role_admin_{{ user.id }}" name="roles" value="admin" {% if 'admin' in user.roles %}checked{% endif %} {% if is_last_admin %}disabled title="Cannot remove admin from the last admin account"{% endif %} />
<label class="form-check-label" for="role_admin_{{ user.id }}">Admin</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="role_operator_{{ user.id }}" name="roles" value="operator" {% if 'operator' in user.roles %}checked{% endif %} />
<label class="form-check-label" for="role_operator_{{ user.id }}">Operator</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="role_reporter_{{ user.id }}" name="roles" value="reporter" {% if 'reporter' in user.roles %}checked{% endif %} />
<label class="form-check-label" for="role_reporter_{{ user.id }}">Reporter</label>
</div>
<div class="form-check form-check-inline m-0">
<input class="form-check-input" type="checkbox" id="role_viewer_{{ user.id }}" name="roles" value="viewer" {% if 'viewer' in user.roles %}checked{% endif %} />
<label class="form-check-label" for="role_viewer_{{ user.id }}">Viewer</label>
</div>
<button type="submit" class="btn btn-sm btn-outline-primary">Save</button>
</form>
<div class="text-muted small mt-1">Current: {{ (user.role or '')|replace(',', ', ') }}</div>
</td>
<td>
<div class="d-flex flex-wrap gap-2">
<form method="post" action="{{ url_for('main.settings_users_reset_password', user_id=user.id) }}" class="d-inline">
@ -170,7 +192,6 @@
<button type="submit" class="btn btn-outline-secondary">Reset</button>
</div>
</form>
{% set is_last_admin = (user.role == 'admin' and (users | selectattr('role', 'equalto', 'admin') | list | length) <= 1) %}
<form method="post" action="{{ url_for('main.settings_users_delete', user_id=user.id) }}" class="d-inline">
<button type="submit" class="btn btn-sm btn-outline-danger" {% if is_last_admin %}disabled title="Cannot delete the last admin account"{% endif %}>Delete</button>
</form>

View File

@ -17,19 +17,16 @@
{% endif %}
</div>
<form method="post" class="row g-3"> <div class="col-12">
<label class="form-label">Description</label>
<textarea class="form-control" name="description" rows="5">{{ ticket.description or '' }}</textarea>
</div>
<div class="row g-3">
{% if active_role in ['admin','operator'] %}
<div class="col-12">
<button class="btn btn-primary" type="submit">Save</button>
{% if not ticket.resolved_at %}
<button class="btn btn-outline-success" type="button" onclick="if(confirm('Mark ticket as resolved?')){fetch('{{ url_for('main.api_ticket_resolve', ticket_id=ticket.id) }}',{method:'POST'}).then(()=>location.reload());}">Resolve</button>
{% endif %}
</div>
{% endif %}
</form>
</div>
</div>
</div>

View File

@ -45,7 +45,7 @@
<div class="col-auto" style="min-width: 260px;">
<label class="form-label" for="flt_q">Search</label>
<input class="form-control" id="flt_q" name="q" value="{{ q }}" placeholder="ticket code / description / job" />
<input class="form-control" id="flt_q" name="q" value="{{ q }}" placeholder="ticket code / job" />
</div>
<div class="col-auto">
@ -89,7 +89,7 @@
<td class="text-nowrap">{{ t.start_date }}</td>
<td class="text-nowrap">{{ t.resolved_at }}</td>
<td class="text-nowrap">
<a class="btn btn-sm btn-outline-primary" href="{{ url_for('main.ticket_detail', ticket_id=t.id) }}">View / Edit</a>
<a class="btn btn-sm btn-outline-primary" href="{{ url_for('main.ticket_detail', ticket_id=t.id) }}">View</a>
{% if t.active and t.job_id %}
<a class="btn btn-sm btn-outline-secondary ms-1" href="{{ url_for('main.job_detail', job_id=t.job_id) }}">Job page</a>
{% endif %}
@ -144,7 +144,7 @@
<td class="text-nowrap">{{ r.start_date }}</td>
<td class="text-nowrap">{{ r.resolved_at }}</td>
<td class="text-nowrap">
<a class="btn btn-sm btn-outline-primary" href="{{ url_for('main.remark_detail', remark_id=r.id) }}">View / Edit</a>
<a class="btn btn-sm btn-outline-primary" href="{{ url_for('main.remark_detail', remark_id=r.id) }}">View</a>
{% if r.active and r.job_id %}
<a class="btn btn-sm btn-outline-secondary ms-1" href="{{ url_for('main.job_detail', job_id=r.job_id) }}">Job page</a>
{% endif %}

View File

@ -1,6 +1,319 @@
***
================================================================================================================================================
## v0.1.21
This release focuses on improving correctness, consistency, and access control across core application workflows, with particular attention to changelog rendering, browser-specific mail readability, Run Checks visibility, role-based access restrictions, override flexibility, and VSPC object linking reliability. The goal is to ensure predictable behavior, clearer diagnostics, and safer administration across both day-to-day operations and complex multi-entity reports.
### Changelog Rendering and Documentation Accuracy
- Updated the Changelog route to render remote Markdown content instead of plain text.
- Enabled full Markdown parsing so headings, lists, links, and code blocks are displayed correctly.
- Ensured the changelog always fetches the latest version directly from the source repository at request time.
- Removed legacy plain-text rendering to prevent loss of structure and formatting.
### Mail Rendering and Browser Compatibility
- Forced a light color scheme for embedded mail content to prevent Microsoft Edge from applying automatic dark mode styling.
- Added explicit `color-scheme` and `forced-color-adjust` rules so original mail CSS is respected.
- Ensured consistent mail readability across Edge and Firefox.
- Applied these fixes consistently across Inbox, Deleted Inbox, Job Details, Run Checks, Daily Jobs, and Admin All Mail views.
### Run Checks Visibility and Consistency
- Added support for displaying the overall remark (`overall_message`) directly on the Run Checks page.
- Ensured consistency between Run Checks and Job Details, where the overall remark was already available.
- Improved operator visibility of high-level run context without requiring navigation to job details.
### Initial Setup and User Existence Safeguards
- Fixed an incorrect redirect to the “Initial admin setup” page when users already exist.
- Changed setup detection logic from “admin user exists” to “any user exists”.
- Ensured existing environments always show the login page instead of allowing a new initial admin to be created.
- Prevented direct access to the initial setup route when at least one user is present.
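The change above swaps the condition the setup route checks. A minimal sketch (function names are assumed for illustration):

```python
def needs_initial_setup(user_count):
    """Corrected check: the initial-setup page is only reachable when
    no user exists at all; otherwise the login page is shown."""
    return user_count == 0


def needs_initial_setup_old(admin_count):
    """Previous (buggy) check: gated on admin users only, so a database
    holding only non-admin users would re-open the initial-admin page."""
    return admin_count == 0
```

With one operator and zero admins, the old check still offered initial setup, while the corrected check falls through to login.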
### Role-Based Access Control and Menu Restrictions
- Restricted the Reporter role to only access Dashboard, Reports, Changelog, and Feedback.
- Updated menu rendering to fully hide unauthorized menu items for Reporter users.
- Adjusted route access to ensure Feedback pages remain accessible for the Reporter role.
- Improved overall consistency between visible navigation and backend access rules.
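The consistency point can be sketched with a single allow-list that drives both menu rendering and route checks, so the two cannot drift apart (page keys here are assumptions):

```python
# Roles with an entry are restricted to the listed pages; other roles
# see the full menu.
ROLE_PAGES = {
    "reporter": {"dashboard", "reports", "changelog", "feedback"},
}


def visible_pages(role, all_pages):
    allowed = ROLE_PAGES.get(role)
    if allowed is None:  # no restriction entry: full navigation
        return list(all_pages)
    return [p for p in all_pages if p in allowed]
```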
### Override Matching Flexibility and Maintainability
- Added configurable error text matching modes for overrides: contains, exact, starts with, and ends with.
- Updated override evaluation logic to apply the selected match mode across run remarks and object error messages.
- Extended the overrides UI with a match type selector and improved edit support for existing overrides.
- Added a database migration to create and backfill the `overrides.match_error_mode` field for existing records.
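The four match modes can be sketched as a single evaluation function; mode identifiers are illustrative, the notes above only name the modes themselves (contains, exact, starts with, ends with):

```python
def error_matches(text: str, pattern: str, mode: str = "contains") -> bool:
    """Apply an override's configured error-text match mode."""
    text, pattern = text.strip(), pattern.strip()
    if mode == "exact":
        return text == pattern
    if mode == "starts_with":
        return text.startswith(pattern)
    if mode == "ends_with":
        return text.endswith(pattern)
    return pattern in text  # default: contains
```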
### Job Deletion Stability
- Fixed an error that occurred during job deletion.
- Corrected backend deletion logic to prevent runtime exceptions.
- Ensured related records are handled safely to avoid constraint or reference errors during removal.
### VSPC Object Linking and Normalization
- Fixed VSPC company name normalization so detection and object prefixing behave consistently.
- Ensured filtered object persistence respects the UNIQUE(customer_id, object_name) constraint.
- Ensured `last_seen` timestamps are correctly updated for existing objects.
- Added automatic object persistence routing for VSPC per-company runs, ensuring objects are linked to the correct customer and job.
- Improved auto-approval for VSPC Active Alarms summaries with per-company run creation and case-insensitive object matching.
- Added best-effort retroactive processing to automatically link older inbox messages once company mappings are approved.
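An upsert that respects the `UNIQUE(customer_id, object_name)` constraint while refreshing `last_seen` can be sketched as follows; the table schema here is a minimal stand-in for the application's actual model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE objects (
           customer_id INTEGER NOT NULL,
           object_name TEXT NOT NULL,
           last_seen   TEXT NOT NULL,
           UNIQUE (customer_id, object_name)
       )"""
)

def persist_object(customer_id: int, object_name: str, seen_at: str) -> None:
    # ON CONFLICT targets the UNIQUE(customer_id, object_name) constraint,
    # so re-importing a known object only refreshes its last_seen timestamp
    # instead of raising an IntegrityError.
    conn.execute(
        """INSERT INTO objects (customer_id, object_name, last_seen)
           VALUES (?, ?, ?)
           ON CONFLICT (customer_id, object_name)
           DO UPDATE SET last_seen = excluded.last_seen""",
        (customer_id, object_name, seen_at),
    )

persist_object(1, "HV01", "2026-01-13T10:00:00Z")
persist_object(1, "HV01", "2026-01-13T16:00:00Z")  # same object, later mail
```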
### VSPC Normalization Bug Fixes and Backward Compatibility
- Removed duplicate definitions of VSPC Active Alarms company extraction logic that caused inconsistent normalization.
- Ensured a single, consistent normalization path is used when creating jobs and linking objects.
- Improved object linking so real objects (e.g. HV01, USB Disk) are reliably associated with their jobs.
- Restored automatic re-linking for both new and historical VSPC mails.
- Added backward-compatible matching to prevent existing VSPC jobs from breaking due to earlier inconsistent company naming.
---
## v0.1.20
This release delivers a comprehensive set of improvements focused on parser correctness, data consistency, and clearer operator workflows across Inbox handling, Run Checks, and administrative tooling. The main goal of these changes is to ensure that backup notifications are parsed reliably, presented consistently, and handled through predictable and auditable workflows, even for complex or multi-entity reports.
### Mail Parsing and Data Integrity
- Fixed Veeam Backup for Microsoft 365 parsing where the overall summary message was not consistently stored.
- Improved extraction of overall detail messages so permission and role warnings are reliably captured.
- Ensured the extracted overall message is always available across Job Details, Run Checks, and reporting views.
- Added decoding of HTML entities in parsed object fields (name, type, status, error message) before storage, ensuring characters such as ampersands are displayed correctly.
- Improved robustness of parsing logic to prevent partial or misleading data from being stored when mails contain mixed or malformed content.
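The entity-decoding step maps directly onto the standard library; a minimal sketch of decoding parsed fields before storage:

```python
import html

def clean_field(raw: str) -> str:
    """Decode HTML entities before storage so e.g. '&amp;' is shown as '&'."""
    return html.unescape(raw)
```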
### Object Classification and Sorting
- Updated object list sorting to improve readability and prioritization.
- Objects are now grouped by severity in a fixed order: Errors first, then Warnings, followed by all other statuses.
- Within each severity group, objects are sorted alphabetically (A–Z).
- Applied the same sorting logic consistently across Inbox, Job Details, Run Checks, Daily Jobs, and the Admin All Mail view.
- Improved overall run status determination by reliably deriving the worst detected object state.
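The grouping described above reduces to a single composite sort key; field names in this sketch are illustrative:

```python
# Severity order used for grouping; any status not listed sorts last.
SEVERITY_RANK = {"Error": 0, "Warning": 1}

def sort_objects(objects):
    """Errors first, then Warnings, then everything else;
    alphabetical by name within each severity group."""
    return sorted(
        objects,
        key=lambda o: (SEVERITY_RANK.get(o["status"], 2), o["name"].lower()),
    )
```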
### Parsers Overview and Maintainability
- Refactored the Parsers overview page to use the central parser registry instead of a static, hardcoded list.
- All available parsers are now displayed automatically, ensuring the page stays in sync as parsers are added or removed.
- Removed hardcoded parser definitions from templates to improve long-term maintainability.
- Fixed a startup crash in the parsers route caused by an invalid absolute import by switching to a package-relative import.
- Prevented Gunicorn worker boot failures and Bad Gateway errors during application initialization.
### User Management and Feedback Handling
- Added support for editing user roles directly from the User Management interface.
- Implemented backend logic to update existing role assignments without requiring user deletion.
- Ensured role changes are applied immediately and reflected correctly in permissions and access control.
- Updated feedback listings to show only Open items by default.
- Ensured Resolved items are always sorted to the bottom when viewing all feedback.
- Preserved existing filtering, searching, and user-controlled sorting behavior.
### UI Improvements and Usability Enhancements
- Introduced reusable ellipsis handling for long detail fields to prevent layout overlap.
- Added click-to-expand behavior for truncated fields, with double-click support to expand and select all text.
- Added automatic tooltips showing the full value when a field is truncated.
- Removed the redundant “Objects” heading above objects tables to reduce visual clutter.
- Applied truncation and expansion behavior consistently across Inbox, Deleted Mail, Run Checks, Daily Jobs, Job Detail views, and Admin All Mail.
- Reset expanded ellipsis fields when Bootstrap modals or offcanvas components are opened or closed to prevent state leakage.
- Fixed layout issues where the Objects table could overlap mail content in the Run Checks popup.
### Veeam Cloud Connect and VSPC Parsing
- Improved the Veeam Cloud Connect Report parser by combining User and Repository Name into a single object identifier.
- Excluded “TOTAL” rows from object processing.
- Correctly classified red rows as Errors and yellow/orange rows as Warnings.
- Ensured overall status is set to Error when one or more objects are in error state.
- Added support for Veeam Service Provider Console daily alarm summary emails.
- Implemented per-company object aggregation and derived overall status from the worst detected state.
- Improved detection of VSPC Active Alarms emails to prevent incorrect fallback to other Veeam parsers.
- Fixed a SyntaxError in the VSPC parser that caused application startup failures.
### VSPC Company Mapping Workflow
- Introduced a dedicated company-mapping popup for VSPC Active Alarms summary reports.
- Enabled manual mapping of companies found in mails to existing customers.
- Implemented per-company job and run creation using the format “Active alarms summary | <Company>”.
- Disabled the standard approval flow for this report type and replaced it with a dedicated mapping workflow.
- Required all detected companies to be mapped before full approval, while still allowing partial approvals.
- Prevented duplicate run creation on repeated approvals.
- Improved visibility and usability of the mapping popup with scroll support for large company lists.
- Ensured only alarms belonging to the selected company are attached to the corresponding run.
### NTFS Auditing and Synology ABB Enhancements
- Added full parser support for NTFS Auditing reports.
- Improved hostname and FQDN extraction from subject lines, supporting multiple subject formats and prefixes.
- Ensured consistent job name generation as “<hostname> file audits”.
- Set overall status to Warning when detected change counts are greater than zero.
- Improved Synology Active Backup for Business parsing to detect partially completed jobs as Warning.
- Added support for localized completion messages and subject variants.
- Improved per-device object extraction and ensured specific device statuses take precedence over generic listings.
### Workflow Simplification and Cleanup
- Removed the “Mark success (override)” button from the Run Checks popup.
- Prevented creation of unintended overrides when marking individual runs as successful.
- Simplified override handling so Run Checks actions no longer affect override administration.
- Ensured firmware update notifications (QNAP) are treated as informational warnings and excluded from missing-run detection and reporting.
---
## v0.1.19
This release delivers a broad set of improvements focused on reliability, transparency, and operational control across mail processing, administrative auditing, and Run Checks workflows. The changes aim to make message handling more robust, provide better insight for administrators, and give operators clearer and more flexible control when reviewing backup runs.
### Mail Import Reliability and Data Integrity
- Updated the mail import flow so messages are only moved to the processed folder after a successful database store and commit.
- Prevented Graph emails from being moved when parsing, storing, or committing data fails, ensuring no messages are lost due to partial failures.
- Added explicit commit and rollback handling to guarantee database consistency before any mail state changes occur.
- Improved logging around import, commit, and rollback failures to make skipped or retried mails easier to trace and troubleshoot.
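The reordered flow can be sketched as follows; all function names are illustrative, and the key point is that the mailbox is only touched after a successful commit:

```python
# Sketch: a Graph message is moved to the processed folder only after the
# parsed data has been durably committed, so any failure leaves the mail
# in place for a later retry instead of silently losing it.

def import_message(message, parse, store, session, move_to_processed, log):
    try:
        store(parse(message))
        session.commit()            # commit BEFORE touching mailbox state
    except Exception as exc:
        session.rollback()          # keep the database consistent
        log(f"import failed, mail left untouched: {exc}")
        return False
    move_to_processed(message)      # safe: data is durably stored
    return True
```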
### Administrative Mail Auditing and Visibility
- Introduced an admin-only “All Mail” audit page that provides a complete overview of all received mail messages.
- Implemented pagination with a fixed page size of 50 items to ensure consistent performance and predictable navigation.
- Added always-visible search filters that can be combined using AND logic, including From, Subject, Backup, Type, Job name, and a received date/time range.
- Added an “Only unlinked” filter to quickly identify messages that are not associated with any job.
- Reused the existing Inbox message detail modal to allow consistent inspection of messages from the All Mail page.
- Added a dedicated navigation entry so administrators can easily access the All Mail audit view.
- Fixed modal opening behavior in the All Mail page to fully align click handling and popups with the Inbox implementation.
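Combining the filters with AND logic can be sketched as a predicate over a mail record; field and filter names here are illustrative, not the application's actual schema:

```python
def mail_matches(mail: dict, filters: dict) -> bool:
    """A mail is listed only if every supplied filter matches (AND logic)."""
    for field, needle in filters.items():
        if not needle:
            continue                      # empty filter input: no constraint
        if field == "only_unlinked":
            if mail.get("job_id") is not None:
                return False              # linked to a job: filtered out
        elif needle.lower() not in str(mail.get(field, "")).lower():
            return False
    return True
```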
### Inbox and Mail Body Rendering Improvements
- Treated whitespace-only email bodies as empty during import so HTML report attachments can be extracted and displayed correctly.
- Added legacy fallback logic in the Inbox message detail API to extract the first HTML attachment from stored EML files when bodies are empty or invalid.
- Improved iframe rendering in the Inbox so full HTML documents (commonly used for report attachments) are rendered directly instead of being wrapped.
- Added detection for “effectively empty” HTML bodies, such as empty Graph-generated HTML skeletons.
- Ensured that both newly imported and already-stored messages can dynamically fall back to EML attachment extraction without requiring a reset.
### Run Checks Usability and Control
- Added a copy-to-clipboard icon next to ticket numbers in the Run Checks popup to quickly copy only the ticket code.
- Prevented accidental selection of appended status text when copying ticket numbers.
- Introduced a manual “Success (override)” action that allows Operators and Admins to mark a run as successful even if it originally failed or produced warnings.
- Implemented backend support to persist the override state without modifying the original run data.
- Updated UI indicators so overridden runs are clearly shown with a blue success status.
- Ensured overrides apply only to the selected run and do not affect historical or related runs.
- Improved Run Checks mail rendering by falling back to text bodies when HTML bodies are missing, matching Inbox and All Mail behavior.
- Added support for extracting HTML content from stored EML files when both HTML and text bodies are unavailable.
- Ensured plain-text emails are safely rendered using preformatted HTML to preserve readability.
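The fallback chain (HTML body, then text body, then EML attachment extraction) can be sketched in one function; the extractor callback stands in for the application's actual EML handling:

```python
import html

def render_body(html_body, text_body, eml_html_extractor=None):
    """Pick the best available body: HTML first, then plain text wrapped
    in escaped <pre> for readability, then the first HTML attachment
    extracted from the stored EML file."""
    if html_body and html_body.strip():
        return html_body
    if text_body and text_body.strip():
        return "<pre>" + html.escape(text_body) + "</pre>"
    if eml_html_extractor is not None:
        return eml_html_extractor() or ""
    return ""
```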
### Customer, Ticket, and Scope Cleanup
- Updated customer deletion logic to allow removal of customers even when tickets or remarks are linked.
- Added explicit cleanup of related TicketScope and RemarkScope records prior to customer deletion.
- Ensured jobs linked to a deleted customer are safely unassigned to prevent foreign key constraint errors.
- Eliminated deletion failures caused by lingering ticket or remark relationships.
### Parser Enhancements and Informational Messages
- Added parser support for 3CX SSL Certificate notification emails.
- Classified these messages as Backup: 3CX with Type: SSL Certificate.
- Parsed and displayed certificate information in the Run Checks popup.
- Stored these messages as successful runs so certificate status can be tracked over time.
- Added detection for Synology DSM automatic update cancellation messages in both Dutch and English.
- Classified Synology Updates messages as informational and set their overall status to Warning.
- Excluded Synology Updates informational messages from scheduling logic and reporting output.
### UI Layout and Status Accuracy Improvements
- Moved the Details section above the email body in both Inbox and Job Details popups to improve readability.
- Avoided long detail texts being constrained to narrow side columns.
- Adjusted missed run detection to include a ±1 hour grace window around scheduled run times.
- Prevented runs that arrive shortly after the scheduled time from being temporarily marked as Missed.
- Ensured the Missed status is only applied after the full grace window has elapsed, reducing false alerts in Run Checks and Daily Jobs views.
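The grace-window rule can be sketched with plain `datetime` arithmetic; the function shape is illustrative:

```python
from datetime import datetime, timedelta

GRACE = timedelta(hours=1)

def is_missed(scheduled: datetime, now: datetime, received=None) -> bool:
    """A run is Missed only once the scheduled time plus the 1-hour grace
    window has fully elapsed, and no run arrived within +/-1h of schedule."""
    if received is not None and abs(received - scheduled) <= GRACE:
        return False                    # arrived inside the grace window
    return now > scheduled + GRACE
```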
---
## v0.1.18
This release focuses on improving ticket reuse, scoping, and visibility across jobs, runs, and history views to ensure consistent and flexible ticket handling.
### Ticket Linking and Reuse
- Updated ticket linking logic to allow the same existing ticket number to be associated with multiple jobs and job runs.
- Prevented duplicate ticket creation errors when reusing an existing ticket code.
- Ensured existing tickets are consistently reused and linked instead of being rejected when already present in the system.
### Ticket Scope and Resolution
- Fixed missing ticket number display in job and run popups by always creating or reusing a ticket scope when linking an existing ticket to a job.
- Updated ticket resolution logic to support per-job resolution when resolving tickets from a job or run context.
- Ensured resolving a ticket from the central Tickets view resolves the ticket globally and closes all associated job scopes.
- Updated ticket active status determination to be based on open job scopes, allowing the same ticket number to remain active for other jobs when applicable.
### Job History Tickets and Remarks
- Added a Tickets and Remarks section to the Job History mail popup.
- Aligned ticket handling in Job History with the existing Run Checks popup behavior.
- Enabled viewing of active and resolved tickets and remarks per job run.
- Added support for creating new tickets and remarks directly from the Job History popup.
- Enabled resolving tickets and remarks directly from the Job History popup.
- Ensured tickets and remarks are correctly scoped to the selected run (run_id).
---
## v0.1.17
### Release Summary
This release focuses on improving job normalization, ticket and remark handling, UI usability, and the robustness of run and object detection across the platform.
### Job normalization and aggregation
- Veeam job names are now normalized to prevent duplicates:
  - Jobs with “(Combined)” and “(Full)” suffixes are merged with their base job names.
  - Ensures accurate aggregation, reporting, and statistics for Veeam Backup and Microsoft 365 jobs.
- Added support for archiving inactive jobs while keeping all historical runs fully included in reports.
### Inbox and bulk operations
- Introduced multi-select inbox functionality for Operator and Admin roles.
- Added a bulk “Delete selected” action with validation, counters, and admin audit logging.
### Jobs UI and navigation
- Restored row-click navigation on the Jobs page.
- Moved the Archive action from the Jobs table to the Job Details page for consistency.
- Improved layout and behavior of job run popups, ensuring objects are visible, correctly rendered, and consistently sorted.
### Tickets and remarks
- Ticket creation now always uses a user-provided ticket code with strict format and uniqueness validation.
- Editing of tickets and remarks has been fully disabled; items must be resolved and recreated instead.
- Removed ticket description fields from creation and detail views to prevent inconsistent data.
- Fixed backend indentation errors that previously caused startup and Bad Gateway failures.
### Customer deletion stability
- Fixed foreign key constraint issues when deleting customers.
- Customer deletion now safely unlinks jobs while preserving historical job and run data.
- Enforced cascading deletes where appropriate to prevent integrity errors.
### Feedback handling
- Users can now reply directly to Feedback items while they are in the “Open” state.
- Replies are stored for audit and history tracking.
### Veeam parser improvements
- Configuration Backup parser now correctly captures multi-line overall warning messages.
- Improved date parsing to support formats without leading zeros.
- Microsoft 365 parser now always persists overall warning/info messages, even on successful runs.
### Run checks and missed run detection
- Improved weekly and monthly schedule inference to reduce false positives.
- Monthly jobs are now detected and marked as missed on the correct expected date.
- Added fallback mechanisms for loading run objects in Run Checks to support legacy and transitional data.
Overall, this release significantly improves data consistency, usability, and reliability across jobs, tickets, inbox management, and run analysis.
---
## v0.1.16
This release significantly expands and stabilizes the reporting functionality, focusing on configurability, correctness, and richer output formats.
### Key Highlights
- Reports are now **job-aggregated** instead of object-level, making them suitable for high-level status reviews.
- Full **report lifecycle management** was added, including secure deletion and reliable refresh behavior.
- Introduced **advanced reporting foundations** with configurable report definitions (columns, filters, layout, charts).
- Added support for **multiple export formats**: CSV, HTML, and PDF, including graphical HTML previews and basic PDF charts.
- Implemented extensive **column selection** across Summary, Snapshot, and Jobs views, with drag-and-drop ordering and persistence.
- Added **job-level aggregated metrics**, including per-job success rates and charts over the selected period.
- Improved **filtering** to exclude informational (non-backup) jobs and allow selection by backup software and type.
- Ensured **success rate and total runs calculations** are correct and consistently based only on selected run statuses.
- Added **Customer** as a selectable column and improved multi-customer report clarity.
- Introduced configurable **HTML/PDF content scope** (Customers, Jobs, or both).
- Fixed numerous **stability issues** (loading states, NameErrors, missing imports, endpoint collisions, JS errors).
- Improved HTML report layout, table rendering, column labeling, sorting logic, and visual consistency.
- Cleaned up column metadata, removed deprecated/duplicate options, and added migration logic to fix existing configurations.
---
## v0.1.15
Implemented in `backend/app/migrations.py`:
- Calls the above migrations in order.
- Logs progress to stdout so changes are visible in container / Portainer logs.
- `migrate_reporting_report_config()`
  - Adds `report_definitions.report_config` (TEXT) if it does not exist.
  - Stores the JSON report definition for the reporting UI (selected columns, chart types, filters) so the same definition can later be reused for PDF export.
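A minimal, idempotent sketch of such a migration against SQLite (the actual code in `backend/app/migrations.py` may differ in structure):

```python
import sqlite3

def migrate_reporting_report_config(conn: sqlite3.Connection) -> None:
    """Add report_definitions.report_config (TEXT) only if it is missing,
    so the migration can run safely on every startup."""
    cols = [row[1] for row in conn.execute("PRAGMA table_info(report_definitions)")]
    if "report_config" not in cols:
        conn.execute("ALTER TABLE report_definitions ADD COLUMN report_config TEXT")
        # logged to stdout so the change is visible in container logs
        print("migrations: added report_definitions.report_config")
```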
## Future changes
- Every time you introduce a non-trivial schema change, update:

docs/reporting-proposal.md (new file, 136 lines):
# Proposal: Advanced Reporting & Visualization
## Goal
Make reports more flexible through column selection and more insightful by adding graphical visualizations.
All reports are displayed online first, with a clear path towards future PDF export.
---
## 1. Column Selection When Creating Reports
### 1.1 Report Configuration
Add an extra step **“Report Content”** when creating a report:
- Checkbox list of available columns
- Drag & drop to define column order
- Option to **save as template** for reuse
### 1.2 Column Groups
Columns are grouped for clarity:
**Job Information**
- Job name
- Job type
- Repository / Target
**Status**
- Last run status
- Result (Success / Warning / Failed)
- Exit code
**Time & Performance**
- Start time
- Duration
- Average duration
**Data**
- Data processed
- Data size
- Change rate
**Reliability**
- Consecutive failures
- Last successful run
---
## 2. Graphical Reports (Online Dashboard)
### 2.1 Summary View
Always shown at the top of a report:
**KPI Cards**
- Total jobs
- Successful jobs
- Failed jobs
- Warning jobs
- Success rate (%)
---
### 2.2 Chart Proposals
**Status Distribution**
- Donut / pie chart: Success vs Warning vs Failed
**Trends Over Time**
- Line chart: success rate per day
- Line chart: number of failures per day
**Performance**
- Bar chart: average runtime per job
- Bar chart: largest data volumes per job
**Reliability**
- Heatmap: failures per job per day
- Bar chart: top N jobs with most failures
---
### 2.3 Interaction
- Hover tooltips with detailed values
- Clicking a chart filters the table view
- Time range selector:
- Last 24 hours
- Last 7 days
- Last 30 days
- Custom range
---
## 3. Raw Data vs Graphical View
### 3.1 Tabs per Report
- **Overview**: KPIs and charts only
- **Details**: table with selected columns
- **History**: long-term trends
### 3.2 Filtering & Sorting
- Filter by status
- Filter by job
- Sort by duration, date, or failures
---
## 4. Preparation for PDF Export (Future)
### 4.1 Layout Guidelines
- Fixed A4-friendly layout
- Maximum two charts per row
- Dark text on light background
- Non-interactive chart rendering in PDF mode
### 4.2 PDF-Ready Structure
1. Title and period
2. Executive summary (KPIs)
3. Status overview
4. Trend analysis
5. Optional detail tables
---
## 5. Technical Principles (High-Level)
- Store report definitions as JSON:
- Selected columns
- Chart types
- Filters
- Client-side chart rendering
- Same report definition reusable for PDF rendering later
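A hypothetical report definition as it might be stored in `report_definitions.report_config`; the key names are illustrative of the JSON-storage principle, not the actual schema:

```python
import json

definition = {
    "columns": ["job_name", "last_run_status", "duration", "success_rate"],
    "charts": [
        {"type": "donut", "metric": "status_distribution"},
        {"type": "line", "metric": "success_rate_per_day"},
    ],
    "filters": {"status": ["Failed", "Warning"], "range": "last_7_days"},
}

stored = json.dumps(definition)            # persisted as TEXT in the database
assert json.loads(stored) == definition    # same definition reusable for PDF
```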
---
## Result
- Flexible, audience-focused reports
- Faster insight through visual summaries
- Clean transition to professional PDF reports without redesign

Version file updated: v0.1.15 → v0.1.20