Compare commits


78 Commits

Author SHA1 Message Date
8e847e802b Mark Entra SSO docs as untested in Backupchecks 2026-02-23 14:36:31 +01:00
b992d6382a Add Entra security group access restriction for SSO 2026-02-23 14:30:42 +01:00
6bf81bd730 Add documentation page for Microsoft Entra SSO setup 2026-02-23 14:23:15 +01:00
5274286c04 Add Microsoft Entra SSO authentication flow 2026-02-23 14:20:22 +01:00
47bb4ee4f0 Deduplicate Cove runs per job instead of globally 2026-02-23 12:15:18 +01:00
8fe3f99e40 Improve linked Cove import feedback and timestamp fallback 2026-02-23 12:04:43 +01:00
f68f92e63a Trigger immediate Cove import on link and enrich run details 2026-02-23 11:57:39 +01:00
06abd8c7a3 Fix Cove workstation/server type heuristic 2026-02-23 11:17:38 +01:00
a0abd3d58e Differentiate Cove server/workstation types and show computer name 2026-02-23 11:14:41 +01:00
7803b7647c Improve Cove accounts typing and datasource readability 2026-02-23 11:10:48 +01:00
0c5adf17ab Fix Cove Run import button submission in settings 2026-02-23 11:02:56 +01:00
8deecd4c11 Update technical-notes-codex.md with Cove integration and recent changes
- Add complete Cove Data Protection integration section: API details,
  column codes, status mapping, inbox flow, CoveAccount model, routes,
  migrations, background thread, settings UI
- Update Data Model section: CoveAccount model, Job.cove_account_id,
  JobRun.source_type + external_id, SystemSettings Cove fields
- Update Application Architecture: cove_importer_service background thread
- Add debug logging snippet for ticket linking issues
- Add Recent Changes entry for 2026-02-23
- Add Cove files to Quick References

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 10:49:44 +01:00
9b19283c97 Add Cove Accounts inbox-style flow for linking accounts to jobs
- CoveAccount staging model: all Cove accounts upserted from API;
  unmatched accounts visible on /cove/accounts before job linking
- cove_importer.py: always upserts accounts, creates JobRuns only for
  accounts with a linked job (deduplication via external_id)
- routes_cove.py: GET /cove/accounts, POST link/unlink routes
- cove_accounts.html: inbox-style page with Bootstrap modals for
  creating new jobs or linking to existing ones
- Nav bar: Cove Accounts link for admin/operator when cove_enabled
- DB migration: migrate_cove_accounts_table() for staging table

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 10:40:54 +01:00
c045240001 Add manual Cove import trigger (Run import now button)
- New route POST /settings/cove/run-now calls run_cove_import()
  directly and shows result as flash message
- Settings > Integrations > Cove: "Run import now" button visible
  when partner_id is known (connection confirmed)
- Status bar shows partner ID, last import timestamp or "No import yet"

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 10:28:33 +01:00
3bd8178464 Fix Cove importer: correct API payload format and response parsing
- Login: use lowercase username/password params and id="jsonrpc"
- Login: visa is at top-level of response (not inside result)
- EnumerateAccountStatistics: use lowercase query param, RecordsCount
  instead of RecordCount, remove DisplayColumns (not needed)
- _flatten_settings: Settings items are single-key dicts like
  {"D09F00": "5"}, not {Key: ..., Value: ...} - use dict.update()
- _cove_enumerate: unwrap nested result and handle Accounts key
- _process_account: AccountId is top-level field, not from Settings

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 10:21:01 +01:00
467f350184 Auto-commit local changes before build (2026-02-23 10:14:36) 2026-02-23 10:14:36 +01:00
2f1cc20263 Add Cove Data Protection full integration
- New models: SystemSettings gets 8 cove_* fields, Job gets
  cove_account_id, JobRun gets source_type and external_id
- Migration migrate_cove_integration() adds all new DB columns and
  a deduplication index on job_runs.external_id
- cove_importer.py: Cove API login, paginated EnumerateAccountStatistics,
  deduplication via external_id, JobRun creation, per-datasource
  run_object_links persistence (Files&Folders, VssMsSql, M365, etc.)
- cove_importer_service.py: background thread, same pattern as
  auto_importer_service, respects cove_import_interval_minutes
- __init__.py: starts cove_importer thread on app startup
- routes_settings.py: Cove form handling (POST), has_cove_password
  variable, new AJAX route /settings/cove/test-connection
- routes_jobs.py: new route /jobs/<id>/set-cove-account,
  cove_enabled passed to job_detail template
- settings.html: Cove card in Integrations tab with AJAX test button
- job_detail.html: Cove Integration card with Account ID input

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 10:13:34 +01:00
dde2ccbb5d Fix Cove test script: parse Settings array format from API response
API returns Settings as list of single-key dicts, not a flat dict.
Also fixes AccountId display and status summary parsing.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 09:58:19 +01:00
a30d51bed0 Fix Cove test script: remove partner field from login, use confirmed columns
Login requires only username + password (no partner field).
Updated column set matches confirmed working columns from Postman testing.
Added per-datasource output and 28-day color bar display.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 09:52:18 +01:00
6d086a883f Add Cove API test script and update documentation with N-able support findings
- Add standalone cove_api_test.py to verify new D9Fxx/D10Fxx/D11Fxx column codes
- D02/D03 confirmed as legacy by N-able support; D9/D10/D11 should work
- Document session status codes (F00) and timestamp fields (F09/F15/F18)
- Update TODO and knowledge docs with breakthrough status

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-02-23 09:08:41 +01:00
f35ec25163 Update technical notes for 3CX and remark alerts behavior 2026-02-19 14:22:32 +01:00
6f2f7b593b Auto-commit local changes before build (2026-02-19 14:15:08) 2026-02-19 14:15:08 +01:00
38f0f8954e Fix remark visibility in run alerts 2026-02-19 14:14:44 +01:00
2ee5db8882 Auto-commit local changes before build (2026-02-19 13:45:13) 2026-02-19 13:45:13 +01:00
ea244193e0 Hide non-backup 3CX informational jobs from Run Checks 2026-02-19 13:44:46 +01:00
c6ff104767 Auto-commit local changes before build (2026-02-19 13:28:33) 2026-02-19 13:28:33 +01:00
441f5a8e50 Handle 3CX update mails as informational runs 2026-02-19 13:27:52 +01:00
3c629bb664 Polish changelog wording for 2026-02-16 and 2026-02-19 2026-02-19 13:04:49 +01:00
e0e8ed2b0d Auto-commit local changes before build (2026-02-19 12:57:34) 2026-02-19 12:57:34 +01:00
53b028ef78 Add optional Autotask ID import toggle 2026-02-19 12:56:45 +01:00
1fb99dc6e7 Update technical notes for search remarks and filters 2026-02-16 16:59:33 +01:00
f2c0d0b36a Auto-commit local changes before build (2026-02-16 16:58:07) 2026-02-16 16:58:07 +01:00
652da5e117 Add remarks to global search results 2026-02-16 16:57:51 +01:00
c8e7491c94 Add Daily Jobs note to search results 2026-02-16 16:54:26 +01:00
e5da01cfbb Auto-commit local changes before build (2026-02-16 16:50:14) 2026-02-16 16:50:14 +01:00
b46010dbc2 Forward global search filters to overview pages 2026-02-16 16:49:47 +01:00
f90b2bdcf6 Keep search pagination at current section 2026-02-16 16:32:23 +01:00
fcbf67aeb3 Update technical notes for latest search improvements 2026-02-16 16:28:49 +01:00
2beba3bc9d Auto-commit local changes before build (2026-02-16 16:27:14) 2026-02-16 16:27:14 +01:00
ded71cb50f Improve daily jobs search metadata and modal link 2026-02-16 16:26:56 +01:00
dc3eb2f73c Auto-commit local changes before build (2026-02-16 16:19:53) 2026-02-16 16:19:53 +01:00
8a8f957c9f Add per-section pagination to global search 2026-02-16 16:19:26 +01:00
8c29f527c6 Document search template crash fix 2026-02-16 16:10:31 +01:00
fcce3b8854 Fix search template section items iteration 2026-02-16 16:09:22 +01:00
79933c2ecd Update technical notes for global search 2026-02-16 16:08:29 +01:00
d84d2142ec Auto-commit local changes before build (2026-02-16 16:06:20) 2026-02-16 16:06:20 +01:00
7476ebcbe3 Add role-aware global grouped search 2026-02-16 16:05:47 +01:00
189dc4ed37 Update technical notes for customer jobs filter 2026-02-16 15:26:51 +01:00
f4384086f2 Auto-commit local changes before build (2026-02-16 15:15:10) 2026-02-16 15:15:10 +01:00
dca117ed79 Add customer-to-jobs filtering navigation 2026-02-16 15:12:10 +01:00
ecdb331c9b Update technical documentation with detailed system knowledge
Enhanced technical-notes-codex.md with comprehensive details from Claude's
system knowledge document, including:

Ticketing & Autotask:
- Detailed two-ticket system explanation (internal vs Autotask)
- Complete ticket propagation strategies (Strategy 1 & 2)
- Where ticket linking is called (email-based, missed runs)
- Display logic with two-source approach
- Resolved vs Deleted distinction
- All critical rules and anti-patterns

Database Models:
- Complete model listing
- Foreign key relationships and critical deletion order
- Key model fields documentation

UI & UX:
- Detailed navbar behavior
- Status badge color coding
- Complete ticket copy functionality with three-tier fallback
- Checkbox autocomplete behavior

Parser Architecture:
- Parser types (Informational vs Regular)
- Synology Updates parser example
- Schedule learning behavior

Recent Changes:
- Documented 2026-02-13 fixes (missed run ticket linking, checkbox autoselect)
- Documented 2026-02-12 fixes (Run Checks modal, Edge copy button)
- Documented 2026-02-10 changes (screenshot support, link-based system)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-13 13:36:03 +01:00
084c91945a Convert technical notes to English 2026-02-13 13:20:19 +01:00
d2cdd34541 Add internal technical notes document 2026-02-13 13:18:08 +01:00
b5cf91d5f2 Fix checkboxes auto-selecting after page reload on Inbox and Run Checks
Added autocomplete="off" attribute to all checkboxes to prevent browser from
automatically restoring checkbox states after page reload.

Changes:
- Inbox page: Added autocomplete="off" to select-all and row checkboxes
- Run Checks page: Added autocomplete="off" to select-all and row checkboxes

This fixes the issue where after deleting items, the browser would automatically
re-select the same number of checkboxes that were previously selected, causing
unwanted selections on the reloaded page.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-13 11:00:21 +01:00
385aeb901c Auto-commit local changes before build (2026-02-13 10:53:29) 2026-02-13 10:53:29 +01:00
6468cbbc74 Fix Autotask and internal tickets not linking to missed runs
Added ticket linking to missed runs by calling link_open_internal_tickets_to_run
after creating missed JobRun records in _ensure_missed_runs_for_job function.

Changes:
- Added import for link_open_internal_tickets_to_run in routes_run_checks.py
- Added db.session.flush() and ticket linking call after creating weekly missed runs
- Added db.session.flush() and ticket linking call after creating monthly missed runs
- Ensures missed runs receive same ticket propagation as email-based runs

This fixes the issue where missed runs were not showing linked internal tickets
or Autotask tickets, while error/warning runs from emails were working correctly.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-13 10:52:00 +01:00
0e1e7e053d Document Autotask internal ticket linking fix in changelog
Fixed issue where Autotask internal tickets were not being linked to new runs.
This resolves the problem identified on 2026-02-11.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-13 10:39:03 +01:00
bd72f91598 Auto-commit local changes before build (2026-02-12 13:10:24) 2026-02-12 13:10:24 +01:00
2e0baa4e35 Fix copy ticket button not working in Edge on Job Details page
Moved clipboard functions (copyToClipboard, fallbackCopy, showCopyFeedback)
inside IIFE scope for proper closure access. Edge browser is stricter than
Firefox about scope resolution - functions must be in same scope as event
listeners that call them.

Previously these functions were in global scope while event listeners were
in IIFE scope, which worked in Firefox but failed silently in Edge.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-12 11:52:32 +01:00
9dee9c300a Auto-commit local changes before build (2026-02-12 11:11:59) 2026-02-12 11:11:59 +01:00
c5cf07f4e5 Fix tickets not showing in Run Checks modal detail view
Extended /api/job-runs/<run_id>/alerts endpoint to include both:
- Tickets explicitly linked to run via ticket_job_runs (audit trail)
- Tickets linked to job via ticket_scopes (active on run date)

Previously only ticket_job_runs was queried, causing newly created
tickets to not appear in the Meldingen section of the Run Checks modal.
They would only appear after being resolved (which creates a
ticket_job_runs entry). Now both sources are queried and duplicates
are prevented.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-12 10:53:00 +01:00
91755c6e85 Add N-able support ticket email template to Cove TODO
Added ready-to-send email template for requesting expanded API access:
- Complete email with subject line
- Detailed explanation of current limitations
- Specific requests (MSP-level access, status fields, timestamps, errors)
- Technical details and test results reference
- Professional business justification (MSP use case)
- Alternative contact methods listed

User can copy-paste this email on Thursday to contact N-able support.

Template requests:
1. MSP-level API user creation
2. Access to restricted column codes (status, timestamps, errors)
3. Documentation of column code meanings
4. Alternative integration methods if API expansion not possible

Ready for action on Thursday.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 17:26:19 +01:00
6674d40f4b Major update: Cove API tested - critical limitations discovered
Added comprehensive API test results document (with ChatGPT assistance):
- docs/cove_data_protection_api_calls_known_info.md

Key findings from live API testing:
- API works: JSON-RPC 2.0 at https://api.backup.management/jsonapi
- Authentication: Login method → visa token
- Method tested: EnumerateAccountStatistics (limited success)

CRITICAL LIMITATIONS DISCOVERED:
- Security error 13501 blocks most useful columns
- No backup status fields (success/failed/warning) accessible
- No error messages (D02Fxx/D03Fxx ranges blocked)
- No reliable backup timestamps
- No detailed run history
- API users are customer-scoped (not MSP-level)
- EnumerateAccounts method always fails (security block)

Working columns (allow-list only):
- I1 (account ID), I14 (storage bytes), I18 (hostname)
- D01F00-D01F07, D09F00 (numeric metrics, semantics unclear)

Impact on Backupchecks:
- Current API access INSUFFICIENT for backup monitoring
- Cannot determine if backups succeeded or failed
- No error messages to show users
- Core Backupchecks functionality not achievable with current API

Added decision matrix with 4 options:
A. Implement metrics-only (low value, storage usage only)
B. Request expanded access from N-able (requires vendor cooperation)
C. Explore alternative methods (webhooks, reports, email)
D. Defer integration until better API access available

Recommendation: Option B or C before implementing anything
- Contact N-able support for MSP-level API user + expanded columns
- OR investigate if Cove has webhook/reporting alternatives

This represents a significant blocker for Cove integration.
Full integration requires either vendor cooperation or alternative approach.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 16:55:31 +01:00
32e68d7209 Update Cove TODO: Add complete API documentation links
Major discovery - found comprehensive JSON API documentation on N-able site!

Added documentation sections:
- Core API docs: login, authentication, construct API calls
- Key endpoints: enumerate-customers, enumerate-devices, enumerate-device-statistics
- Reference docs: API column codes, schema documentation
- Architecture and security guides

Key findings:
- API docs located in "unused" folder but still functional
- JSON API structure (likely JSON-RPC or custom format)
- Three critical endpoints identified for backup monitoring:
  1. enumerate-customers (list all customers)
  2. enumerate-devices (list backup devices)
  3. enumerate-device-statistics (backup job results - KEY ENDPOINT!)

Updated status:
- Marked API documentation as found
- Changed next action from "find docs" to "read auth docs and test"
- Updated Phase 1 to start with reading login/auth documentation

Next steps:
1. Read login.htm to understand token authentication
2. Read construct-a-call.htm to understand request format
3. Read enumerate-device-statistics.htm - likely contains backup status data
4. Test in Postman with documented format

Documentation base URL:
https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:48:35 +01:00
23e59ab459 Update Cove TODO: Add comprehensive Postman testing instructions
Replaced curl examples with detailed Postman testing guide:
- Step-by-step Postman setup instructions
- Two authentication methods to test (Bearer Token vs X-API-Key)
- Multiple base URLs to try (api.backup.management, backup.management)
- Expected response codes and what they mean (200, 401, 403, 404)
- Endpoint discovery list (accounts, customers, devices, jobs)
- Tips for finding API documentation

Added Postman best practices:
- Create Cove API collection
- Use environment variables (cove_token, cove_base_url)
- Save response examples
- Check rate limit headers
- Export collection to JSON

Added structured template for documenting test results:
- Working configuration (base URL, auth method)
- Available endpoints table
- Key response fields mapping to Backupchecks
- Pagination and rate limiting details
- Location to save Postman collection export

Ready for immediate API testing with Postman!

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:44:24 +01:00
b2992acc56 Update Cove TODO: API user created, add testing instructions
Major progress update:
- API user successfully created in Cove portal
- Credentials: SuperUser role, top-level customer access, token generated
- Portal URL identified: https://backup.management
- API user management: https://backup.management/#/api-users

Added comprehensive testing section:
- Likely API base URLs to test (api.backup.management, backup.management/api)
- Step-by-step Phase 1 testing instructions
- Multiple curl command examples for authentication testing
- Different auth header formats to try (Bearer, X-API-Key)
- Common endpoints to discover (accounts, customers, devices)
- POC Python script template

Next steps:
1. Test API authentication with curl commands
2. Find working API base URL and auth method
3. Discover available endpoints
4. Document API response format
5. Create POC script for data retrieval

Status: Ready for immediate API testing!

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:42:11 +01:00
200dd23285 Update Cove TODO: API exists but activation method unknown
Added critical information from user:
- Confirmed: Cove Data Protection HAS API access (documented)
- Problem: Location/method to enable API access is unknown

Changes:
- Added Phase 0: API Access Activation (critical first step)
- Marked API availability as confirmed
- Added checklist for finding API activation in admin portal
- Listed possible admin portal locations to check
- Added support channel suggestions if activation unclear
- Updated current status section with latest info

Next action: Investigate Cove admin portal or contact support for
API activation instructions.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:38:08 +01:00
d1023f9e52 Translate Cove Data Protection TODO to English
Changed TODO document language from Dutch to English to align with
project documentation standards (all code and docs in English).

No content changes, only translation.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:33:34 +01:00
1de1b032e7 Add TODO for Cove Data Protection integration
Created comprehensive TODO document for integrating Cove Data Protection
(formerly N-able Backup) into Backupchecks.

Key challenges:
- Cove does not use email notifications like other backup systems
- Need to research API availability and authentication methods
- Must determine optimal integration strategy (polling vs webhooks)

Document includes:
- Research questions (API availability, data structure, multi-tenancy)
- Three architecture options for integration
- Implementation phases (research, database, import, scheduling, UI)
- Success criteria and open questions
- References section for documentation links

Status: Research phase - waiting on API documentation investigation

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:32:12 +01:00
661a5783cf Auto-commit local changes before build (2026-02-10 15:27:46) 2026-02-10 15:27:46 +01:00
dfe86a6ed1 Update changelog with copy ticket button improvements
Added documentation for:
- Copy ticket button on Job Details page
- Cross-browser clipboard copy fix (Edge no longer requires manual popup)
- Three-tier fallback mechanism for clipboard operations

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:04:38 +01:00
35ec337c54 Add copy ticket button to Job Details and improve cross-browser copy functionality
Changes:
- Added copy ticket button (⧉) next to ticket numbers in Job Details modal
- Implemented robust cross-browser clipboard copy mechanism:
  1. Modern navigator.clipboard API (works in HTTPS contexts)
  2. Legacy document.execCommand('copy') fallback (works in older browsers)
  3. Prompt fallback as last resort
- Applied improved copy function to both Run Checks and Job Details pages
- Copy now works directly in all browsers (Firefox, Edge, Chrome) without popup

This eliminates the manual copy step in Edge that previously required a popup.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 15:04:21 +01:00
c777728c91 Update changelog with comprehensive screenshot feature documentation
Added detailed documentation for screenshot attachment support in Feedback
system, including:
- File validation using imghdr (header inspection, not just extensions)
- Admin access control for deleted item attachments
- Automatic CASCADE delete behavior
- Enhanced admin deleted items view with permanent delete
- UI improvements for deleted item display (opacity + background)
- Security considerations for non-admin users

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 13:51:54 +01:00
0510613708 Fix: Allow admins to view screenshots of deleted feedback items
Two fixes:
1. Improved deleted item row styling (opacity + background)
2. Allow feedback_attachment route to serve images from deleted items (admin only)

Before: Screenshots shown as links only (2026-02-10_13_29_39.png)
After: Screenshots shown as images/thumbnails

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 13:46:24 +01:00
fc99f17db3 Add admin view for deleted feedback items + permanent delete
User request: Allow admins to view deleted items and permanently
delete them (hard delete) to clean up database and remove screenshots.

Features:
1. Admin-only "Show deleted" checkbox on feedback list
2. Deleted items shown with gray background + "Deleted" badge
3. Permanent delete button (only for soft-deleted items)
4. Hard delete removes item + all attachments from database
5. Admins can view detail pages of deleted items

Backend (routes_feedback.py):
- Added show_deleted parameter (admin only)
- Modified feedback_page query to optionally include deleted items
- Added deleted_at, deleted_by to query results
- Modified feedback_detail to allow admins to view deleted items
- New route: feedback_permanent_delete (hard delete)
  - Only works on already soft-deleted items (safety check)
  - Uses db.session.delete() - CASCADE removes attachments
  - Shows attachment count in confirmation message

Frontend:
- feedback.html:
  - "Show deleted items" checkbox (auto-submits form)
  - Deleted items: gray background (table-secondary)
  - Shows deleted timestamp
  - "Permanent Delete" button in Actions column
  - Confirmation dialog warns about permanent deletion
- feedback_detail.html:
  - "Deleted" badge in header
  - Actions sidebar shows warning + "Permanent Delete" button
  - Normal actions (resolve/delete) hidden for deleted items

Benefits:
- Audit trail preserved with soft delete
- Database can be cleaned up later by removing old deleted items
- Screenshots (BYTEA) don't accumulate forever
- Two-stage safety: soft delete → permanent delete

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 13:40:53 +01:00
1a506c0713 Fix: Add FeedbackAttachment to routes_shared imports
Missing import caused NameError when creating feedback with screenshots.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 13:30:47 +01:00
85798a07ae Auto-commit local changes before build (2026-02-10 13:29:10) 2026-02-10 13:29:10 +01:00
451ce1ab22 Add screenshot attachment support to Feedback/Bug system
User request: Allow screenshots to be attached to bug reports
and feature requests for better documentation and reproduction.

Database:
- New model: FeedbackAttachment (file_data BYTEA, filename, mime_type, file_size)
- Links to feedback_item_id (required) and feedback_reply_id (optional)
- Migration: auto-creates table with indexes on startup
- Cascading deletes when item or reply is deleted

Backend (routes_feedback.py):
- Helper function: _validate_image_file() for security
  - Validates file type using imghdr (not just extension)
  - Enforces size limit (5MB per file)
  - Secure filename handling with werkzeug
  - Allowed: PNG, JPG, GIF, WEBP
- Updated feedback_new: accepts multiple file uploads
- Updated feedback_reply: accepts multiple file uploads
- Updated feedback_detail: fetches attachments for item + replies
- New route: /feedback/attachment/<id> to serve images

Frontend:
- feedback_new.html: file input with multiple selection
- feedback_detail.html:
  - Shows item screenshots as clickable thumbnails (max 300x200)
  - Shows reply screenshots as clickable thumbnails (max 200x150)
  - File upload in reply form
  - All images open full-size in new tab

Security:
- Access control: only authenticated users with feedback roles
- Image type verification using imghdr (header inspection)
- File size limit enforced (5MB)
- Secure filename sanitization
- Deleted items hide their attachments (404)

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
2026-02-10 13:28:41 +01:00
58 changed files with 5780 additions and 106 deletions

.gitignore (vendored): +7

@@ -1,2 +1,9 @@
 # Claude Code confidential files
 .claude/
+# Codex local workspace files
+.codex/
+# Python cache artifacts
+__pycache__/
+*.pyc

@@ -1 +1 @@
-main
+v20260223-05-cove-integration

@@ -0,0 +1,249 @@
# TODO: Cove Data Protection Integration
**Date:** 2026-02-23
**Status:** Research COMPLETED — Ready for implementation
**Priority:** Medium
---
## 🎯 Goal
Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring via scheduled API polling. The integration runs server-side within the Backupchecks web application.
**Challenge:** Cove does NOT work with email notifications like other backup systems (Veeam, Synology, NAKIVO). We use the JSON-RPC API instead.
---
## ✅ Research Phase — COMPLETED (2026-02-23)
### Confirmed findings
- **API endpoint:** `https://api.backup.management/jsonapi`
- **Protocol:** JSON-RPC 2.0, POST requests, `Content-Type: application/json`
- **Authentication:** Login method returns a `visa` token — include in all subsequent calls
- **PartnerId:** `139124` (MCC Automatisering), required for all queries; the partner name is NOT needed
- **All required data is available**: the earlier blockers (D02/D03 errors) were caused by legacy column codes, now replaced by D10/D11
- **No MSP-level restriction**: every API user has the same access; all sub-customers are reachable via the top-level account
- **No EnumerateAccounts needed**: `EnumerateAccountStatistics` with the right columns returns everything we need
### Official documentation (from N-able support, Andrew Robinson)
- **Getting Started:** https://developer.n-able.com/n-able-cove/docs/getting-started
- **Column Codes:** https://developer.n-able.com/n-able-cove/docs/column-codes
- **Construct a Call:** https://developer.n-able.com/n-able-cove/docs/construct-a-json-rpc-api-call
- **Authorization:** https://developer.n-able.com/n-able-cove/docs/authorization
---
## 📡 API: confirmed behavior
### Step 1: Login
```json
POST https://api.backup.management/jsonapi
Content-Type: application/json

{
  "jsonrpc": "2.0",
  "id": "jsonrpc",
  "method": "Login",
  "params": {
    "username": "{{cove_api_username}}",
    "password": "{{cove_api_password}}"
  }
}
```
**The response contains:**
- `visa`: session token (include in all subsequent calls)
- `result.PartnerId`: the partner ID (139124 for MCC Automatisering)
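As a minimal sketch of this login step (stdlib only; the quirks in the comments match the fixes recorded in commit 3bd8178464, but the helper names are ours, not part of the project):

```python
import json
import urllib.request

API_URL = "https://api.backup.management/jsonapi"

def build_login_payload(username: str, password: str) -> dict:
    # Lowercase "username"/"password" params and the literal id "jsonrpc"
    # are required (see commit 3bd8178464).
    return {
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "method": "Login",
        "params": {"username": username, "password": password},
    }

def cove_login(username: str, password: str) -> tuple[str, int]:
    """Return (visa, partner_id).

    Note: the visa token is at the TOP LEVEL of the response, not inside
    result; PartnerId is inside result.
    """
    body = json.dumps(build_login_payload(username, password)).encode()
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        data = json.load(resp)
    return data["visa"], data["result"]["PartnerId"]
```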
### Step 2: EnumerateAccountStatistics
```json
{
  "jsonrpc": "2.0",
  "visa": "{{visa}}",
  "id": "jsonrpc",
  "method": "EnumerateAccountStatistics",
  "params": {
    "query": {
      "PartnerId": 139124,
      "StartRecordNumber": 0,
      "RecordsCount": 250,
      "Columns": [
        "I1", "I18", "I8", "I78",
        "D09F00", "D09F09", "D09F15", "D09F08",
        "D1F00", "D1F15",
        "D10F00", "D10F15",
        "D11F00", "D11F15",
        "D19F00", "D19F15",
        "D20F00", "D20F15",
        "D5F00", "D5F15",
        "D23F00", "D23F15"
      ]
    }
  }
}
```
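Pagination uses `StartRecordNumber`/`RecordsCount`. A sketch of the paging loop, with the actual HTTP call abstracted behind a hypothetical `fetch_page` callable so the loop itself is testable:

```python
from typing import Callable

def enumerate_all_accounts(
    fetch_page: Callable[[int, int], list[dict]],
    page_size: int = 250,
) -> list[dict]:
    """Collect every account from a paginated EnumerateAccountStatistics.

    fetch_page(start_record_number, records_count) is assumed to issue the
    JSON-RPC call above and return that page's list of account dicts.
    """
    accounts: list[dict] = []
    start = 0
    while True:
        page = fetch_page(start, page_size)
        accounts.extend(page)
        if len(page) < page_size:  # short or empty page: we are done
            break
        start += page_size
    return accounts
```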
---
## 📋 Column codes: what they mean
### Device info
| Column | Meaning | Type |
|--------|---------|------|
| `I1` | Device name (internal, unique) | String |
| `I18` | Computer name (human-readable); empty for M365 | String |
| `I8` | Customer name | String |
| `I78` | Active datasources, e.g. `D01D02D10` | String |
### Datasource status (repeats per datasource)
| Suffix | Meaning | Type |
|--------|---------|------|
| `F00` | Status of the last session | Int (see table) |
| `F09` | Timestamp of the last **successful** session | Unix timestamp |
| `F15` | Timestamp of the last session (regardless of status) | Unix timestamp |
| `F08` | Color bar for the last 28 days (28 digits) | String |
### Status values (F00)
| Value | Meaning |
|--------|-----------|
| `1` | In process |
| `2` | Failed ❌ |
| `3` | Aborted |
| `5` | Completed ✅ |
| `6` | Interrupted |
| `8` | CompletedWithErrors ⚠️ |
| `9` | InProgressWithFaults |
| `10` | OverQuota |
| `11` | NoSelection (configured but nothing selected) |
| `12` | Restarted |
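Backupchecks will need to fold these F00 codes into its own status labels. A sketch, assuming the three-level `success`/`warning`/`failed` scheme mentioned in Phase 3 below; which bucket codes like Aborted or OverQuota belong in is our assumption, not confirmed:

```python
# F00 codes are from the table above; the label buckets are assumptions.
COVE_STATUS_MAP = {
    1: "info",      # In process
    2: "failed",    # Failed
    3: "failed",    # Aborted
    5: "success",   # Completed
    6: "warning",   # Interrupted
    8: "warning",   # CompletedWithErrors
    9: "warning",   # InProgressWithFaults
    10: "warning",  # OverQuota
    11: "warning",  # NoSelection
    12: "info",     # Restarted
}

def map_cove_status(f00_value) -> str:
    """Map a raw F00 session status (int or numeric string) to a label."""
    if f00_value in (None, ""):
        return "unknown"
    return COVE_STATUS_MAP.get(int(f00_value), "unknown")
```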
### Datasources
| Code | Name | Use |
|------|------|-----|
| `D09` | Total (all datasources combined) | Always present, best for overall status |
| `D1` | Files & Folders | Servers/workstations |
| `D2` | System State | Servers/workstations |
| `D10` | VssMsSql (SQL Server) | Servers with SQL |
| `D11` | VssSharePoint | Servers with SharePoint |
| `D19` | Microsoft 365 Exchange | M365 tenants |
| `D20` | Microsoft 365 OneDrive | M365 tenants |
| `D5` | Microsoft 365 SharePoint | M365 tenants |
| `D23` | Microsoft 365 Teams | M365 tenants |
**Note:** D02 and D03 are legacy codes; use D10 and D11.
### Recognizing device types via I78
- `I78` contains values such as `D01D02`, `D01D02D10`, `D19D20D05D23`
- Empty `I18` field = Microsoft 365 tenant
- Non-empty `I18` field = server or workstation
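The bullets above can be sketched as a classifier. The M365 rule (empty `I18`) is from the findings; splitting server from workstation by the presence of System State (`D02`) or SQL (`D10`) is purely our illustrative assumption; the project's actual heuristic (see commit 06abd8c7a3) may differ:

```python
def classify_device(i18_computer_name, i78_datasources: str) -> str:
    """Rough device-type heuristic from the I18/I78 rules above."""
    # Empty I18 = Microsoft 365 tenant (confirmed finding).
    if not (i18_computer_name or "").strip():
        return "m365_tenant"
    # Assumed: System State or SQL datasource suggests a server.
    if "D02" in i78_datasources or "D10" in i78_datasources:
        return "server"
    return "workstation"
```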
### D09F08: decoding the color bar
28 characters, one character per day (oldest first):
- `5` = Completed ✅
- `8` = CompletedWithErrors ⚠️
- `2` = Failed ❌
- `1` = In progress
- `0` = No backup
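A minimal decoder for this string, using the day labels listed above (the label strings themselves are a naming choice made here):

```python
# Per-day status digits as documented for D09F08.
DAY_STATUS = {
    "5": "completed",
    "8": "completed_with_errors",
    "2": "failed",
    "1": "in_progress",
    "0": "no_backup",
}

def decode_colorbar(bar: str) -> list[str]:
    """Decode a D09F08 string into one label per day, oldest day first.

    Unknown digits are passed through unchanged so that any new status
    codes remain visible instead of being silently dropped.
    """
    return [DAY_STATUS.get(ch, ch) for ch in bar.strip()]
```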
---
## 🏗️ Architecture decision
**Chosen: Option 2 (Parallel Import System)**
```
API poller → Cove API parser → JobRun (direct, without MailMessage)
```
Rationale:
- Clean separation between email-based and API-based imports
- No misuse of the MailMessage model for data without an email context
- Future-proof for other API-based backup systems
### Required database changes
- `JobRun.source_type`: new field, `"email"` or `"api"`
- `JobRun.external_id`: Cove `AccountId` as external reference
- `JobRun.mail_message`: must become nullable (or move to a separate table)
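In SQLAlchemy terms the schema change could look roughly like this. This is a simplified stand-in model, not the real Backupchecks `JobRun`; only the three fields listed above are the point:

```python
from sqlalchemy import Column, DateTime, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class JobRun(Base):  # simplified stand-in for the real model
    __tablename__ = "job_runs"

    id = Column(Integer, primary_key=True)
    # "email" for mail-based imports, "api" for Cove API imports
    source_type = Column(String(16), nullable=False, default="email")
    # Cove AccountId (plus session timestamp) as external reference
    external_id = Column(String(64), nullable=True, index=True)
    # must become nullable: API-based runs have no source mail
    mail_message_id = Column(Integer, ForeignKey("mail_messages.id"), nullable=True)
    run_at = Column(DateTime, nullable=True)
```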
---
## 🔧 Implementation phases
### Phase 1: Database migration
- [ ] Add `source_type` field to JobRun (`email` / `api`)
- [ ] Add `external_id` field to JobRun (for the Cove AccountId)
- [ ] Make the `mail_message` FK nullable for API-based runs
- [ ] Write and test the migration
### Phase 2: Cove API client
- [ ] New file: `app/services/cove_client.py`
- [ ] Login method (fetch visa token)
- [ ] `enumerate_account_statistics()` method
- [ ] Handle pagination (RecordsCount / StartRecordNumber)
- [ ] Handle token expiry (re-login)
- [ ] Error handling & retry logic
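The pagination mechanics can be sketched without any network calls. Payload field names follow the request example earlier in these notes; the page size of 250 and the helper names are assumptions made here:

```python
def build_stats_request(visa: str, partner_id: int, start: int, count: int = 250) -> dict:
    """Build one EnumerateAccountStatistics JSON-RPC payload."""
    return {
        "jsonrpc": "2.0",
        "id": "jsonrpc",
        "visa": visa,
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": ["I1", "I18", "I8", "I78", "D09F00", "D09F15"],
            }
        },
    }

def fetch_all(fetch_page, page_size: int = 250) -> list:
    """Drain all pages; fetch_page(start, count) returns one page of accounts.

    A short page (fewer results than page_size) signals the last page.
    """
    out, start = [], 0
    while True:
        page = fetch_page(start, page_size)
        out.extend(page)
        if len(page) < page_size:
            return out
        start += page_size
```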
### Phase 3: Data transformation
- [ ] New file: `app/services/cove_importer.py`
- [ ] Convert the Settings list into a dict for easy lookup
- [ ] Convert Unix timestamps to datetime
- [ ] Map datasource status to Backupchecks status (success/warning/failed)
- [ ] Determine device type (server vs M365) via `I18` and `I78`
- [ ] Create JobRun records per device
### Phase 4: Scheduled polling
- [ ] Cronjob or scheduled task (every 15-60 minutes?)
- [ ] Duplicate detection based on `external_id` + timestamp
- [ ] Logging & audit trail
- [ ] Respect rate limiting
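Duplicate detection hinges on a stable per-session key. This mirrors the `cove-{account_id}-{timestamp}` format used by `cove_importer.py` later in this compare; the seen-set is only an in-memory illustration, the real check is a database lookup:

```python
def make_external_id(account_id: int, last_session_ts: int) -> str:
    """Stable per-session key: the same account and session timestamp
    always yield the same id, so repeated polls can be deduplicated."""
    return f"cove-{account_id}-{last_session_ts}"

def is_new_session(external_id: str, seen: set[str]) -> bool:
    """Record and report whether this session was seen before."""
    if external_id in seen:
        return False
    seen.add(external_id)
    return True
```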
### Phase 5: UI changes
- [ ] Job Details: no "Download EML" button for API-based runs
- [ ] Indicate that a job originates from the Cove API (not email)
- [ ] Optionally show the 28-day color bar
### Phase 6: Configuration
- [ ] Store Cove API credentials in SystemSettings
- [ ] Make PartnerId configurable
- [ ] Make the polling interval configurable
---
## 🔑 API Credentials
- **API User:** `backupchecks-cove-01`
- **User ID:** `1665555`
- **PartnerId:** `139124`
- **Role:** SuperUser + SecurityOfficer
- **Portal:** https://backup.management/#/api-users
**IMPORTANT:** Store the token in a password manager; it cannot be retrieved again!
---
## ❓ Open questions for implementation
1. How do we store the Cove API credentials securely in Backupchecks? (SystemSettings? Environment variable?)
2. What is the desired polling frequency? (15 min / 30 min / 1 hour?)
3. Do we import historical data on the first run, or only new sessions?
4. Do we show the 28-day color bar (`D09F08`) in the UI?
5. Do we support multiple Cove accounts (multiple MSPs)?
---
## 🎯 Success Criteria (MVP)
- [ ] Backup status (success/warning/failed) per device visible in Backupchecks
- [ ] Customer name and device name linked correctly
- [ ] Time of the last backup available
- [ ] Visible in Daily Jobs & Run Checks
- [ ] Both servers and Microsoft 365 tenants are supported
- [ ] No duplicates on repeated polling
### Nice to Have
- [ ] 28-day history graph
- [ ] Per-datasource status (SQL, Exchange, etc.)
- [ ] Polling frequency configurable per customer

@@ -13,6 +13,7 @@ from .main.routes import main_bp
from .main.routes_documentation import doc_bp
from .migrations import run_migrations
from .auto_importer_service import start_auto_importer
from .cove_importer_service import start_cove_importer
def _get_today_ui_date() -> str:
@@ -212,4 +213,7 @@ def create_app():
# Start automatic mail importer background thread
start_auto_importer(app)
# Start Cove Data Protection importer background thread
start_cove_importer(app)
return app
@@ -1,5 +1,11 @@
import base64
import binascii
import hashlib
import os
import random
import secrets
from functools import wraps
from urllib.parse import urlencode
from flask import (
Blueprint,
@@ -11,9 +17,10 @@ from flask import (
session,
)
from flask_login import login_user, logout_user, login_required, current_user
import requests
from ..database import db
from ..models import SystemSettings, User
auth_bp = Blueprint("auth", __name__, url_prefix="/auth")
@@ -31,6 +38,131 @@ def generate_captcha():
return question, answer
def _entra_effective_config() -> dict:
"""Return effective Entra SSO config from DB settings with env fallback."""
settings = SystemSettings.query.first()
enabled = bool(getattr(settings, "entra_sso_enabled", False)) if settings else False
tenant_id = (getattr(settings, "entra_tenant_id", None) or "").strip() if settings else ""
client_id = (getattr(settings, "entra_client_id", None) or "").strip() if settings else ""
client_secret = (getattr(settings, "entra_client_secret", None) or "").strip() if settings else ""
redirect_uri = (getattr(settings, "entra_redirect_uri", None) or "").strip() if settings else ""
allowed_domain = (getattr(settings, "entra_allowed_domain", None) or "").strip().lower() if settings else ""
allowed_group_ids = (getattr(settings, "entra_allowed_group_ids", None) or "").strip() if settings else ""
auto_provision = bool(getattr(settings, "entra_auto_provision_users", False)) if settings else False
if not tenant_id:
tenant_id = (os.environ.get("ENTRA_TENANT_ID", "") or "").strip()
if not client_id:
client_id = (os.environ.get("ENTRA_CLIENT_ID", "") or "").strip()
if not client_secret:
client_secret = (os.environ.get("ENTRA_CLIENT_SECRET", "") or "").strip()
if not redirect_uri:
redirect_uri = (os.environ.get("ENTRA_REDIRECT_URI", "") or "").strip()
if not allowed_domain:
allowed_domain = (os.environ.get("ENTRA_ALLOWED_DOMAIN", "") or "").strip().lower()
if not enabled:
env_enabled = (os.environ.get("ENTRA_SSO_ENABLED", "") or "").strip().lower()
enabled = env_enabled in ("1", "true", "yes", "on")
if not auto_provision:
env_auto = (os.environ.get("ENTRA_AUTO_PROVISION_USERS", "") or "").strip().lower()
auto_provision = env_auto in ("1", "true", "yes", "on")
if not allowed_group_ids:
allowed_group_ids = (os.environ.get("ENTRA_ALLOWED_GROUP_IDS", "") or "").strip()
return {
"enabled": enabled,
"tenant_id": tenant_id,
"client_id": client_id,
"client_secret": client_secret,
"redirect_uri": redirect_uri,
"allowed_domain": allowed_domain,
"allowed_group_ids": allowed_group_ids,
"auto_provision": auto_provision,
}
def _parse_group_ids(raw: str | None) -> set[str]:
if not raw:
return set()
normalized = raw.replace("\n", ",").replace(";", ",")
out = set()
for item in normalized.split(","):
value = (item or "").strip()
if value:
out.add(value.lower())
return out
def _b64url_decode(data: str) -> bytes:
pad = "=" * (-len(data) % 4)
return base64.urlsafe_b64decode((data + pad).encode("ascii"))
def _decode_id_token_payload(id_token: str) -> dict:
"""Decode JWT payload without signature verification (token comes from Entra token endpoint)."""
if not id_token or "." not in id_token:
return {}
parts = id_token.split(".")
if len(parts) < 2:
return {}
try:
payload_raw = _b64url_decode(parts[1])
import json
payload = json.loads(payload_raw.decode("utf-8"))
if isinstance(payload, dict):
return payload
except (binascii.Error, ValueError, UnicodeDecodeError):
return {}
return {}
def _resolve_sso_user(claims: dict, auto_provision: bool) -> User | None:
"""Resolve or optionally create a local user from Entra claims."""
username = (
(claims.get("preferred_username") or "")
or (claims.get("upn") or "")
or (claims.get("email") or "")
).strip()
email = ((claims.get("email") or claims.get("preferred_username") or "") or "").strip() or None
if not username:
return None
user = User.query.filter_by(username=username).first()
if not user and email:
user = User.query.filter_by(email=email).first()
if user:
return user
if not auto_provision:
return None
new_username = username
if User.query.filter_by(username=new_username).first():
base = new_username
idx = 1
while User.query.filter_by(username=f"{base}.{idx}").first():
idx += 1
new_username = f"{base}.{idx}"
# Random local password as fallback; SSO users authenticate via Entra.
random_password = secrets.token_urlsafe(32)
new_user = User(username=new_username, email=email, role="viewer")
new_user.set_password(random_password)
db.session.add(new_user)
db.session.commit()
return new_user
def _groups_from_claims(claims: dict) -> set[str]:
groups = claims.get("groups")
if isinstance(groups, list):
return {str(x).strip().lower() for x in groups if str(x).strip()}
if isinstance(groups, str) and groups.strip():
return {groups.strip().lower()}
return set()
def captcha_required(func):
@wraps(func)
def wrapper(*args, **kwargs):
@@ -42,10 +174,18 @@ def captcha_required(func):
# regenerate captcha for re-render
question, answer = generate_captcha()
session["captcha_answer"] = answer
cfg = _entra_effective_config()
entra_ready = bool(
cfg.get("enabled")
and cfg.get("tenant_id")
and cfg.get("client_id")
and cfg.get("client_secret")
)
return render_template(
"auth/login.html",
captcha_question=question,
username=request.form.get("username", ""),
entra_sso_enabled=entra_ready,
)
return func(*args, **kwargs)
@@ -61,7 +201,18 @@ def login():
question, answer = generate_captcha()
session["captcha_answer"] = answer
cfg = _entra_effective_config()
entra_ready = bool(
cfg.get("enabled")
and cfg.get("tenant_id")
and cfg.get("client_id")
and cfg.get("client_secret")
)
return render_template(
"auth/login.html",
captcha_question=question,
entra_sso_enabled=entra_ready,
)
# POST
username = (request.form.get("username") or "").strip()
@@ -72,8 +223,18 @@ def login():
flash("Invalid username or password.", "danger")
question, answer = generate_captcha()
session["captcha_answer"] = answer
cfg = _entra_effective_config()
entra_ready = bool(
cfg.get("enabled")
and cfg.get("tenant_id")
and cfg.get("client_id")
and cfg.get("client_secret")
)
return render_template(
"auth/login.html",
captcha_question=question,
username=username,
entra_sso_enabled=entra_ready,
)
login_user(user)
@@ -81,18 +242,180 @@ def login():
session["active_role"] = user.roles[0]
except Exception:
session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
session["auth_provider"] = "local"
flash("You are now logged in.", "success")
return redirect(url_for("main.dashboard"))
@auth_bp.route("/entra/login")
def entra_login():
"""Start Microsoft Entra ID authorization code flow."""
cfg = _entra_effective_config()
if not cfg.get("enabled"):
flash("Microsoft Entra SSO is not enabled.", "warning")
return redirect(url_for("auth.login"))
if not cfg.get("tenant_id") or not cfg.get("client_id") or not cfg.get("client_secret"):
flash("Microsoft Entra SSO is not fully configured.", "danger")
return redirect(url_for("auth.login"))
redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
state = secrets.token_urlsafe(24)
nonce = hashlib.sha256(secrets.token_bytes(32)).hexdigest()
session["entra_state"] = state
session["entra_nonce"] = nonce
params = {
"client_id": cfg["client_id"],
"response_type": "code",
"redirect_uri": redirect_uri,
"response_mode": "query",
"scope": "openid profile email",
"state": state,
"nonce": nonce,
"prompt": "select_account",
}
auth_url = (
f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/authorize?"
f"{urlencode(params)}"
)
return redirect(auth_url)
@auth_bp.route("/entra/callback")
def entra_callback():
"""Handle Microsoft Entra ID callback and log in mapped local user."""
cfg = _entra_effective_config()
if not cfg.get("enabled"):
flash("Microsoft Entra SSO is not enabled.", "warning")
return redirect(url_for("auth.login"))
error = (request.args.get("error") or "").strip()
if error:
desc = (request.args.get("error_description") or "").strip()
flash(f"Microsoft Entra login failed: {error} {desc}".strip(), "danger")
return redirect(url_for("auth.login"))
state = (request.args.get("state") or "").strip()
expected_state = (session.get("entra_state") or "").strip()
if not state or not expected_state or state != expected_state:
flash("Invalid SSO state. Please try again.", "danger")
return redirect(url_for("auth.login"))
code = (request.args.get("code") or "").strip()
if not code:
flash("No authorization code returned by Microsoft Entra.", "danger")
return redirect(url_for("auth.login"))
redirect_uri = cfg.get("redirect_uri") or url_for("auth.entra_callback", _external=True)
token_url = f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/token"
token_payload = {
"client_id": cfg["client_id"],
"client_secret": cfg["client_secret"],
"grant_type": "authorization_code",
"code": code,
"redirect_uri": redirect_uri,
"scope": "openid profile email",
}
try:
token_resp = requests.post(token_url, data=token_payload, timeout=30)
token_resp.raise_for_status()
token_data = token_resp.json()
except Exception as exc:
flash(f"Failed to fetch token from Microsoft Entra: {exc}", "danger")
return redirect(url_for("auth.login"))
id_token = token_data.get("id_token")
claims = _decode_id_token_payload(id_token or "")
if not claims:
flash("Could not read Microsoft Entra ID token.", "danger")
return redirect(url_for("auth.login"))
expected_nonce = (session.get("entra_nonce") or "").strip()
token_nonce = (claims.get("nonce") or "").strip()
if expected_nonce and token_nonce and token_nonce != expected_nonce:
flash("Invalid SSO nonce. Please try again.", "danger")
return redirect(url_for("auth.login"))
allowed_domain = (cfg.get("allowed_domain") or "").strip().lower()
if allowed_domain:
token_tid = (claims.get("tid") or "").strip().lower()
token_domain = ""
upn = (claims.get("preferred_username") or claims.get("email") or "").strip().lower()
if "@" in upn:
token_domain = upn.split("@", 1)[1]
if allowed_domain not in {token_tid, token_domain}:
flash("Your Microsoft account is not allowed for this instance.", "danger")
return redirect(url_for("auth.login"))
allowed_groups = _parse_group_ids(cfg.get("allowed_group_ids"))
if allowed_groups:
claim_names = claims.get("_claim_names") or {}
groups_overage = isinstance(claim_names, dict) and "groups" in claim_names
token_groups = _groups_from_claims(claims)
if groups_overage:
flash(
"Group-based access check could not be completed because token group overage is active. "
"Limit group claims to assigned groups or reduce memberships.",
"danger",
)
return redirect(url_for("auth.login"))
if not token_groups:
flash(
"Group-based access is enabled, but no groups claim was received from Microsoft Entra. "
"Configure group claims in the Entra app token settings.",
"danger",
)
return redirect(url_for("auth.login"))
if token_groups.isdisjoint(allowed_groups):
flash("Your Microsoft account is not in an allowed security group.", "danger")
return redirect(url_for("auth.login"))
user = _resolve_sso_user(claims, auto_provision=bool(cfg.get("auto_provision")))
if not user:
flash(
"No local Backupchecks user is mapped to this Microsoft account. "
"Ask an admin to create or map your account.",
"danger",
)
return redirect(url_for("auth.login"))
login_user(user)
try:
session["active_role"] = user.roles[0]
except Exception:
session["active_role"] = (getattr(user, "role", "viewer") or "viewer").split(",")[0].strip() or "viewer"
session["auth_provider"] = "entra"
session.pop("entra_state", None)
session.pop("entra_nonce", None)
flash("You are now logged in with Microsoft Entra.", "success")
return redirect(url_for("main.dashboard"))
@auth_bp.route("/logout")
@login_required
def logout():
cfg = _entra_effective_config()
auth_provider = (session.get("auth_provider") or "").strip()
logout_user()
try:
session.pop("active_role", None)
session.pop("auth_provider", None)
session.pop("entra_state", None)
session.pop("entra_nonce", None)
except Exception:
pass
if auth_provider == "entra" and cfg.get("enabled") and cfg.get("tenant_id"):
post_logout = url_for("auth.login", _external=True)
logout_url = (
f"https://login.microsoftonline.com/{cfg['tenant_id']}/oauth2/v2.0/logout?"
f"{urlencode({'post_logout_redirect_uri': post_logout})}"
)
return redirect(logout_url)
flash("You have been logged out.", "info")
return redirect(url_for("auth.login"))

@@ -0,0 +1,536 @@
"""Cove Data Protection API importer.
Fetches backup job run data from the Cove (N-able) API.
Flow (mirrors the mail Inbox flow):
1. All Cove accounts are upserted into the `cove_accounts` staging table.
2. Accounts without a linked job appear on the Cove Accounts page where
an admin can create or link a job (same as approving a mail from Inbox).
3. For accounts that have a linked job, a JobRun is created per new session
(deduplicated via external_id).
"""
from __future__ import annotations
import logging
from datetime import datetime, timezone
from typing import Any
import requests
from sqlalchemy import text
from .database import db
logger = logging.getLogger(__name__)
COVE_DEFAULT_URL = "https://api.backup.management/jsonapi"
# Columns to request from EnumerateAccountStatistics
COVE_COLUMNS = [
"I1", # Account/device name
"I18", # Computer name
"I8", # Customer / partner name
"I78", # Active datasource label
"D09F00", # Overall last session status
"D09F09", # Last successful session timestamp
"D09F15", # Last session end timestamp
"D09F08", # 28-day colorbar
# Datasource-specific status (F00) and last session time (F15)
"D1F00", "D1F15", # Files & Folders
"D10F00", "D10F15", # VssMsSql
"D11F00", "D11F15", # VssSharePoint
"D19F00", "D19F15", # M365 Exchange
"D20F00", "D20F15", # M365 OneDrive
"D5F00", "D5F15", # M365 SharePoint
"D23F00", "D23F15", # M365 Teams
]
# Mapping from Cove status code to Backupchecks status string
STATUS_MAP: dict[int, str] = {
1: "Warning", # In process
2: "Error", # Failed
3: "Error", # Aborted
5: "Success", # Completed
6: "Error", # Interrupted
7: "Warning", # NotStarted
8: "Warning", # CompletedWithErrors
9: "Warning", # InProgressWithFaults
10: "Error", # OverQuota
11: "Warning", # NoSelection
12: "Warning", # Restarted
}
# Mapping from Cove status code to readable label
STATUS_LABELS: dict[int, str] = {
1: "In process",
2: "Failed",
3: "Aborted",
5: "Completed",
6: "Interrupted",
7: "Not started",
8: "Completed with errors",
9: "In progress with faults",
10: "Over quota",
11: "No selection",
12: "Restarted",
}
# Datasource label mapping (column prefix → human-readable label)
DATASOURCE_LABELS: dict[str, str] = {
"D1": "Files & Folders",
"D10": "VssMsSql",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
class CoveImportError(Exception):
"""Raised when Cove API interaction fails."""
def _cove_login(url: str, username: str, password: str) -> tuple[str, int]:
"""Login to the Cove API and return (visa, partner_id).
Raises CoveImportError on failure.
"""
payload = {
"jsonrpc": "2.0",
"id": "jsonrpc",
"method": "Login",
"params": {
"username": username,
"password": password,
},
}
try:
resp = requests.post(
url,
json=payload,
headers={"Content-Type": "application/json"},
timeout=30,
)
resp.raise_for_status()
data = resp.json()
except requests.RequestException as exc:
raise CoveImportError(f"Cove login request failed: {exc}") from exc
except ValueError as exc:
raise CoveImportError(f"Cove login response is not valid JSON: {exc}") from exc
if "error" in data and data["error"]:
error = data["error"]
msg = error.get("message") or str(error) if isinstance(error, dict) else str(error)
raise CoveImportError(f"Cove login failed: {msg}")
# Visa is returned at the top level of the response (not inside result)
visa = data.get("visa") or ""
if not visa:
raise CoveImportError("Cove login succeeded but no visa token returned")
# PartnerId is inside result
result = data.get("result") or {}
partner_id = (
result.get("PartnerId")
or result.get("PartnerID")
or result.get("result", {}).get("PartnerId")
or 0
)
return visa, int(partner_id)
def _cove_enumerate(
url: str,
visa: str,
partner_id: int,
start: int,
count: int,
) -> list[dict]:
"""Call EnumerateAccountStatistics and return a list of account dicts.
Returns empty list when no more results.
"""
payload = {
"jsonrpc": "2.0",
"visa": visa,
"id": "jsonrpc",
"method": "EnumerateAccountStatistics",
"params": {
"query": {
"PartnerId": partner_id,
"StartRecordNumber": start,
"RecordsCount": count,
"Columns": COVE_COLUMNS,
}
},
}
try:
resp = requests.post(
url,
json=payload,
headers={"Content-Type": "application/json"},
timeout=60,
)
resp.raise_for_status()
data = resp.json()
except requests.RequestException as exc:
raise CoveImportError(f"Cove EnumerateAccountStatistics request failed: {exc}") from exc
except ValueError as exc:
raise CoveImportError(f"Cove EnumerateAccountStatistics response is not valid JSON: {exc}") from exc
if "error" in data and data["error"]:
error = data["error"]
msg = error.get("message") or str(error) if isinstance(error, dict) else str(error)
raise CoveImportError(f"Cove EnumerateAccountStatistics failed: {msg}")
result = data.get("result")
if result is None:
return []
# Unwrap possible nested result
if isinstance(result, dict) and "result" in result:
result = result["result"]
# Accounts can be a list directly or wrapped in an "Accounts" key
if isinstance(result, list):
return result
if isinstance(result, dict):
return result.get("Accounts", []) or []
return []
def _flatten_settings(account: dict) -> dict:
"""Convert the Settings array in an account dict to a flat key→value dict.
Cove returns settings as a list of single-key dicts, e.g.:
[{"D09F00": "5"}, {"I1": "device name"}, ...]
"""
flat: dict[str, Any] = {}
settings_list = account.get("Settings") or []
if isinstance(settings_list, list):
for item in settings_list:
if isinstance(item, dict):
flat.update(item)
return flat
def _map_status(code: Any) -> str:
"""Map a Cove status code (int) to a Backupchecks status string."""
if code is None:
return "Warning"
try:
return STATUS_MAP.get(int(code), "Warning")
except (ValueError, TypeError):
return "Warning"
def _status_label(code: Any) -> str:
"""Map a Cove status code (int) to a human-readable label."""
if code is None:
return "Unknown"
try:
return STATUS_LABELS.get(int(code), f"Code {int(code)}")
except (ValueError, TypeError):
return "Unknown"
def _ts_to_dt(value: Any) -> datetime | None:
"""Convert a Unix timestamp (int or str) to a naive UTC datetime."""
if value is None:
return None
try:
ts = int(value)
if ts <= 0:
return None
return datetime.fromtimestamp(ts, tz=timezone.utc).replace(tzinfo=None)
except (ValueError, TypeError, OSError):
return None
def _fmt_utc(dt: datetime | None) -> str:
"""Format a naive UTC datetime to readable text for run object messages."""
if not dt:
return "unknown"
return dt.strftime("%Y-%m-%d %H:%M UTC")
def run_cove_import(settings) -> tuple[int, int, int, int]:
"""Fetch Cove account statistics and update the staging table + JobRuns.
For every account:
- Upsert into cove_accounts (always)
- If the account has a linked job create a JobRun if not already seen
Args:
settings: SystemSettings ORM object with cove_* fields.
Returns:
Tuple of (total_accounts, created_runs, skipped_runs, error_count).
Raises:
CoveImportError if the API login fails.
"""
url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL
username = (getattr(settings, "cove_api_username", None) or "").strip()
password = (getattr(settings, "cove_api_password", None) or "").strip()
if not username or not password:
raise CoveImportError("Cove API username or password not configured")
visa, partner_id = _cove_login(url, username, password)
# Save partner_id back to settings
if partner_id and partner_id != getattr(settings, "cove_partner_id", None):
settings.cove_partner_id = partner_id
try:
db.session.commit()
except Exception:
db.session.rollback()
total = 0
created = 0
skipped = 0
errors = 0
page_size = 250
start = 0
while True:
try:
accounts = _cove_enumerate(url, visa, partner_id, start, page_size)
except CoveImportError:
raise
except Exception as exc:
raise CoveImportError(f"Unexpected error fetching accounts at offset {start}: {exc}") from exc
if not accounts:
break
for account in accounts:
total += 1
try:
run_created = _process_account(account)
if run_created:
created += 1
else:
skipped += 1
except Exception as exc:
errors += 1
logger.warning("Cove import: error processing account: %s", exc)
try:
db.session.rollback()
except Exception:
pass
if len(accounts) < page_size:
break
start += page_size
# Update last import timestamp
settings.cove_last_import_at = datetime.utcnow()
try:
db.session.commit()
except Exception:
db.session.rollback()
return total, created, skipped, errors
def _process_account(account: dict) -> bool:
"""Upsert a Cove account into the staging table and create a JobRun if linked.
Returns True if a new JobRun was created, False otherwise.
"""
from .models import CoveAccount, JobRun
flat = _flatten_settings(account)
# AccountId is a top-level field
account_id = account.get("AccountId") or account.get("AccountID")
if not account_id:
return False
try:
account_id = int(account_id)
except (ValueError, TypeError):
return False
# Extract metadata from flat settings
account_name = (flat.get("I1") or "").strip() or None
computer_name = (flat.get("I18") or "").strip() or None
customer_name = (flat.get("I8") or "").strip() or None
datasource_types = (flat.get("I78") or "").strip() or None
# Prefer "last session end" (D09F15); fallback to "last successful session" (D09F09)
# so accounts without D09F15 can still produce an initial run.
last_run_ts_raw = flat.get("D09F15")
last_run_at = _ts_to_dt(last_run_ts_raw)
if last_run_at is None:
last_run_ts_raw = flat.get("D09F09")
last_run_at = _ts_to_dt(last_run_ts_raw)
colorbar_28d = (flat.get("D09F08") or "").strip() or None
try:
last_status_code = int(flat["D09F00"]) if flat.get("D09F00") is not None else None
except (ValueError, TypeError):
last_status_code = None
# Upsert into cove_accounts staging table
cove_acc = CoveAccount.query.filter_by(account_id=account_id).first()
if cove_acc is None:
cove_acc = CoveAccount(
account_id=account_id,
first_seen_at=datetime.utcnow(),
)
db.session.add(cove_acc)
cove_acc.account_name = account_name
cove_acc.computer_name = computer_name
cove_acc.customer_name = customer_name
cove_acc.datasource_types = datasource_types
cove_acc.last_status_code = last_status_code
cove_acc.last_run_at = last_run_at
cove_acc.colorbar_28d = colorbar_28d
cove_acc.last_seen_at = datetime.utcnow()
db.session.flush() # ensure cove_acc.id is set
# If not linked to a job yet, nothing more to do (shows up in Cove Accounts page)
if not cove_acc.job_id:
db.session.commit()
return False
# Account is linked: create a JobRun if the last session is new
if not last_run_at:
db.session.commit()
return False
try:
run_ts = int(last_run_ts_raw or 0)
except (TypeError, ValueError):
run_ts = 0
# Fetch the linked job
from .models import Job
job = Job.query.get(cove_acc.job_id)
if not job:
db.session.commit()
return False
external_id = f"cove-{account_id}-{run_ts}"
# Deduplicate per job + session, not globally.
# This avoids blocking a run on a newly linked/relinked job when the same
# Cove session was previously stored under another job.
existing = JobRun.query.filter_by(job_id=job.id, external_id=external_id).first()
if existing:
db.session.commit()
return False
status = _map_status(last_status_code)
run_remark = (
f"Cove account: {account_name or account_id} | "
f"Computer: {computer_name or '-'} | "
f"Customer: {customer_name or '-'} | "
f"Last status: {_status_label(last_status_code)} ({last_status_code if last_status_code is not None else '-'}) | "
f"Last run: {_fmt_utc(last_run_at)}"
)
run = JobRun(
job_id=job.id,
mail_message_id=None,
run_at=last_run_at,
status=status,
remark=run_remark,
missed=False,
override_applied=False,
source_type="cove_api",
external_id=external_id,
)
db.session.add(run)
db.session.flush() # get run.id
# Persist per-datasource objects
if job.customer_id:
_persist_datasource_objects(flat, job.customer_id, job.id, run.id, last_run_at)
db.session.commit()
return True
def _persist_datasource_objects(
flat: dict,
customer_id: int,
job_id: int,
run_id: int,
observed_at: datetime,
) -> None:
"""Create run_object_links for each active datasource found in the account stats."""
engine = db.get_engine()
with engine.begin() as conn:
for ds_prefix, ds_label in DATASOURCE_LABELS.items():
status_key = f"{ds_prefix}F00"
status_code = flat.get(status_key)
if status_code is None:
continue
status = _map_status(status_code)
ds_last_ts = _ts_to_dt(flat.get(f"{ds_prefix}F15"))
status_msg = (
f"Cove datasource status: {_status_label(status_code)} "
f"({status_code}); last session: {_fmt_utc(ds_last_ts)}"
)
# Upsert customer_objects
customer_object_id = conn.execute(
text(
"""
INSERT INTO customer_objects (customer_id, object_name, object_type, first_seen_at, last_seen_at)
VALUES (:customer_id, :object_name, :object_type, NOW(), NOW())
ON CONFLICT (customer_id, object_name)
DO UPDATE SET
last_seen_at = NOW(),
object_type = COALESCE(EXCLUDED.object_type, customer_objects.object_type)
RETURNING id
"""
),
{
"customer_id": customer_id,
"object_name": ds_label,
"object_type": "cove_datasource",
},
).scalar()
# Upsert job_object_links
conn.execute(
text(
"""
INSERT INTO job_object_links (job_id, customer_object_id, first_seen_at, last_seen_at)
VALUES (:job_id, :customer_object_id, NOW(), NOW())
ON CONFLICT (job_id, customer_object_id)
DO UPDATE SET last_seen_at = NOW()
"""
),
{"job_id": job_id, "customer_object_id": customer_object_id},
)
# Upsert run_object_links
conn.execute(
text(
"""
INSERT INTO run_object_links (run_id, customer_object_id, status, error_message, observed_at)
VALUES (:run_id, :customer_object_id, :status, :error_message, :observed_at)
ON CONFLICT (run_id, customer_object_id)
DO UPDATE SET
status = EXCLUDED.status,
error_message = EXCLUDED.error_message,
observed_at = EXCLUDED.observed_at
"""
),
{
"run_id": run_id,
"customer_object_id": customer_object_id,
"status": status,
"error_message": status_msg,
"observed_at": ds_last_ts or observed_at,
},
)

View File

@ -0,0 +1,101 @@
"""Cove Data Protection importer background service.
Runs a background thread that periodically fetches backup job run data
from the Cove API and creates JobRun records in the local database.
"""
from __future__ import annotations
import threading
import time
from datetime import datetime
from .admin_logging import log_admin_event
from .cove_importer import CoveImportError, run_cove_import
from .models import SystemSettings
_COVE_IMPORTER_THREAD_NAME = "cove_importer"
def start_cove_importer(app) -> None:
"""Start the Cove importer background thread.
The thread checks settings on every loop and only runs imports when
enabled and the configured interval has elapsed.
"""
# Avoid starting multiple threads if create_app() is called more than once.
if any(t.name == _COVE_IMPORTER_THREAD_NAME for t in threading.enumerate()):
return
def _worker() -> None:
last_run_at: datetime | None = None
while True:
try:
with app.app_context():
settings = SystemSettings.query.first()
if settings is None:
time.sleep(10)
continue
enabled = bool(getattr(settings, "cove_import_enabled", False))
try:
interval_minutes = int(getattr(settings, "cove_import_interval_minutes", 30) or 30)
except (TypeError, ValueError):
interval_minutes = 30
if interval_minutes < 1:
interval_minutes = 1
now = datetime.utcnow()
due = False
if enabled:
if last_run_at is None:
due = True
else:
due = (now - last_run_at).total_seconds() >= (interval_minutes * 60)
if not due:
time.sleep(5)
continue
try:
total, created, skipped, errors = run_cove_import(settings)
except CoveImportError as exc:
log_admin_event(
"cove_import_error",
f"Cove import failed: {exc}",
)
last_run_at = now
time.sleep(5)
continue
except Exception as exc:
log_admin_event(
"cove_import_error",
f"Unexpected error during Cove import: {exc}",
)
last_run_at = now
time.sleep(5)
continue
log_admin_event(
"cove_import",
f"Cove import finished. accounts={total}, created={created}, skipped={skipped}, errors={errors}",
)
last_run_at = now
except Exception:
# Never let the thread die.
try:
with app.app_context():
log_admin_event(
"cove_import_error",
"Cove importer thread recovered from an unexpected exception.",
)
except Exception:
pass
time.sleep(5)
t = threading.Thread(target=_worker, name=_COVE_IMPORTER_THREAD_NAME, daemon=True)
t.start()

View File

@ -26,5 +26,7 @@ from . import routes_feedback # noqa: F401
from . import routes_api # noqa: F401
from . import routes_reporting_api # noqa: F401
from . import routes_user_settings # noqa: F401
from . import routes_search # noqa: F401
from . import routes_cove # noqa: F401
__all__ = ["main_bp", "roles_required"]

View File

@ -16,9 +16,11 @@ def api_job_run_alerts(run_id: int):
tickets = []
remarks = []
# Tickets linked to this run:
# 1. Explicitly linked via ticket_job_runs (audit trail when resolved)
# 2. Linked to the job via ticket_scopes (active on run date)
try:
# First, get tickets explicitly linked to this run via ticket_job_runs
rows = (
db.session.execute(
text(
@ -43,7 +45,11 @@ def api_job_run_alerts(run_id: int):
.all()
)
ticket_ids_seen = set()
for r in rows:
ticket_id = int(r.get("id"))
ticket_ids_seen.add(ticket_id)
resolved_at = r.get("resolved_at")
resolved_same_day = False
if resolved_at and run_date:
@ -52,7 +58,62 @@ def api_job_run_alerts(run_id: int):
tickets.append(
{
"id": ticket_id,
"ticket_code": r.get("ticket_code") or "",
"description": r.get("description") or "",
"start_date": _format_datetime(r.get("start_date")),
"active_from_date": str(r.get("active_from_date")) if r.get("active_from_date") else "",
"resolved_at": _format_datetime(r.get("resolved_at")) if r.get("resolved_at") else "",
"active": bool(active_now),
"resolved_same_day": bool(resolved_same_day),
}
)
# Second, get tickets linked to the job via ticket_scopes
# These are tickets that apply to the whole job (not just a specific run)
rows = (
db.session.execute(
text(
"""
SELECT DISTINCT t.id,
t.ticket_code,
t.description,
t.start_date,
t.resolved_at,
t.active_from_date
FROM tickets t
JOIN ticket_scopes ts ON ts.ticket_id = t.id
WHERE ts.job_id = :job_id
AND t.active_from_date <= :run_date
AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
ORDER BY t.start_date DESC
"""
),
{
"job_id": job.id if job else 0,
"run_date": run_date,
},
)
.mappings()
.all()
)
for r in rows:
ticket_id = int(r.get("id"))
# Skip if already added via ticket_job_runs
if ticket_id in ticket_ids_seen:
continue
ticket_ids_seen.add(ticket_id)
resolved_at = r.get("resolved_at")
resolved_same_day = False
if resolved_at and run_date:
resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
active_now = r.get("resolved_at") is None
tickets.append(
{
"id": ticket_id,
"ticket_code": r.get("ticket_code") or "",
"description": r.get("description") or "",
"start_date": _format_datetime(r.get("start_date")),
@ -65,9 +126,13 @@ def api_job_run_alerts(run_id: int):
except Exception as exc:
return jsonify({"status": "error", "message": str(exc) or "Failed to load tickets."}), 500
# Remarks linked to this run:
# 1. Explicitly linked via remark_job_runs (audit trail when resolved)
# 2. Linked to the job via remark_scopes (active on run date)
try:
remark_ids_seen = set()
# First, remarks explicitly linked to this run.
rows = (
db.session.execute(
text(
@ -88,6 +153,9 @@ def api_job_run_alerts(run_id: int):
)
for rr in rows:
remark_id = int(rr.get("id"))
remark_ids_seen.add(remark_id)
body = (rr.get("body") or "").strip()
if len(body) > 180:
body = body[:177] + "..."
@ -101,7 +169,64 @@ def api_job_run_alerts(run_id: int):
remarks.append(
{
"id": remark_id,
"body": body,
"start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
"active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
"resolved_at": _format_datetime(rr.get("resolved_at")) if rr.get("resolved_at") else "",
"active": bool(active_now),
"resolved_same_day": bool(resolved_same_day),
}
)
# Second, active job-level remarks from scope (not yet explicitly linked to this run).
ui_tz = _get_ui_timezone_name()
rows = (
db.session.execute(
text(
"""
SELECT DISTINCT r.id, r.body, r.start_date, r.resolved_at, r.active_from_date
FROM remarks r
JOIN remark_scopes rs ON rs.remark_id = r.id
WHERE rs.job_id = :job_id
AND COALESCE(
r.active_from_date,
((r.start_date AT TIME ZONE 'UTC' AT TIME ZONE :ui_tz)::date)
) <= :run_date
AND r.resolved_at IS NULL
ORDER BY r.start_date DESC
"""
),
{
"job_id": job.id if job else 0,
"run_date": run_date,
"ui_tz": ui_tz,
},
)
.mappings()
.all()
)
for rr in rows:
remark_id = int(rr.get("id"))
if remark_id in remark_ids_seen:
continue
remark_ids_seen.add(remark_id)
body = (rr.get("body") or "").strip()
if len(body) > 180:
body = body[:177] + "..."
resolved_at = rr.get("resolved_at")
resolved_same_day = False
if resolved_at and run_date:
resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
active_now = resolved_at is None or (not resolved_same_day)
remarks.append(
{
"id": remark_id,
"body": body,
"start_date": _format_datetime(rr.get("start_date")) if rr.get("start_date") else "-",
"active_from_date": str(rr.get("active_from_date")) if rr.get("active_from_date") else "",
@ -547,4 +672,4 @@ def api_remark_link_run(remark_id: int):
db.session.rollback()
return jsonify({"status": "error", "message": str(exc) or "Failed to link run."}), 500
return jsonify({"status": "ok"})

View File

@ -0,0 +1,313 @@
"""Cove Data Protection account review routes.
Mirrors the Inbox flow for mail messages:
/cove/accounts list all Cove accounts (unmatched first)
/cove/accounts/<id>/link link an account to an existing or new job
/cove/accounts/<id>/unlink remove the job link
"""
import re
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _log_admin_event
from ..cove_importer import CoveImportError, run_cove_import
from ..models import CoveAccount, Customer, Job, JobRun, SystemSettings
_COVE_DATASOURCE_LABELS = {
"D01": "Files & Folders",
"D1": "Files & Folders",
"D02": "System State",
"D2": "System State",
"D10": "VssMsSql",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D05": "M365 SharePoint",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
_COVE_M365_CODES = {"D19", "D20", "D05", "D5", "D23"}
_COVE_SERVER_CODES = {"D10", "D11"}
def _parse_cove_datasource_codes(raw: str | None) -> list[str]:
"""Extract datasource codes from Cove I78 strings like 'D01D02D10'."""
text = (raw or "").strip().upper()
if not text:
return []
return re.findall(r"D\d{1,2}", text)
def _derive_backup_type_for_account(cove_acc: CoveAccount) -> str:
"""Return Backupchecks-style backup type for a Cove account.
Heuristic:
- M365 datasource present -> Microsoft 365
- Server-specific datasource -> Server
- Otherwise -> Workstation
"""
codes = set(_parse_cove_datasource_codes(getattr(cove_acc, "datasource_types", None)))
if codes.intersection(_COVE_M365_CODES):
return "Microsoft 365"
if codes.intersection(_COVE_SERVER_CODES):
return "Server"
return "Workstation"
def _humanize_datasources(raw: str | None) -> str:
"""Return readable datasource labels from Cove I78 code string."""
labels: list[str] = []
for code in _parse_cove_datasource_codes(raw):
label = _COVE_DATASOURCE_LABELS.get(code, code)
if label not in labels:
labels.append(label)
return ", ".join(labels)
@main_bp.route("/cove/accounts")
@login_required
@roles_required("admin", "operator")
def cove_accounts():
settings = SystemSettings.query.first()
if not settings or not getattr(settings, "cove_enabled", False):
flash("Cove integration is not enabled.", "warning")
return redirect(url_for("main.settings", section="integrations"))
# Unmatched accounts (no job linked) shown first, like Inbox items
unmatched = (
CoveAccount.query
.filter(CoveAccount.job_id.is_(None))
.order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
.all()
)
# Matched accounts
matched = (
CoveAccount.query
.filter(CoveAccount.job_id.isnot(None))
.order_by(CoveAccount.customer_name.asc().nullslast(), CoveAccount.account_name.asc())
.all()
)
customers = Customer.query.filter_by(active=True).order_by(Customer.name.asc()).all()
jobs = Job.query.filter_by(archived=False).order_by(Job.job_name.asc()).all()
for acc in unmatched + matched:
acc.derived_backup_software = "Cove Data Protection"
acc.derived_backup_type = _derive_backup_type_for_account(acc)
acc.derived_job_name = (acc.account_name or acc.computer_name or f"Cove account {acc.account_id}").strip()
acc.datasource_display = _humanize_datasources(acc.datasource_types) or ""
return render_template(
"main/cove_accounts.html",
unmatched=unmatched,
matched=matched,
customers=customers,
jobs=jobs,
settings=settings,
STATUS_LABELS={
1: "In process", 2: "Failed", 3: "Aborted", 5: "Completed",
6: "Interrupted", 7: "Not started", 8: "Completed with errors",
9: "In progress with faults", 10: "Over quota",
11: "No selection", 12: "Restarted",
},
STATUS_CLASS={
1: "warning", 2: "danger", 3: "danger", 5: "success",
6: "danger", 7: "secondary", 8: "warning", 9: "warning",
10: "danger", 11: "warning", 12: "warning",
},
)
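The `STATUS_LABELS`/`STATUS_CLASS` dicts passed to the template can be looked up defensively so an unmapped Cove status code never raises inside Jinja; a small sketch (the fallback values are assumptions, not from the source):

```python
STATUS_LABELS = {1: "In process", 2: "Failed", 5: "Completed", 8: "Completed with errors"}
STATUS_CLASS = {1: "warning", 2: "danger", 5: "success", 8: "warning"}

def badge_for(code: int) -> tuple[str, str]:
    """Return (label, bootstrap class); unknown codes get a neutral badge
    instead of raising KeyError in the template."""
    return STATUS_LABELS.get(code, f"Status {code}"), STATUS_CLASS.get(code, "secondary")
```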
@main_bp.route("/cove/accounts/<int:cove_account_db_id>/link", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_link(cove_account_db_id: int):
"""Link a Cove account to a job (create a new one or select existing)."""
cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)
action = (request.form.get("action") or "").strip() # "create" or "link"
linked_job_name = ""
if action == "create":
# Create a new job from the Cove account data
customer_id_raw = (request.form.get("customer_id") or "").strip()
if not customer_id_raw:
flash("Please select a customer.", "danger")
return redirect(url_for("main.cove_accounts"))
try:
customer_id = int(customer_id_raw)
except ValueError:
flash("Invalid customer selection.", "danger")
return redirect(url_for("main.cove_accounts"))
customer = Customer.query.get(customer_id)
if not customer:
flash("Customer not found.", "danger")
return redirect(url_for("main.cove_accounts"))
default_job_name = (cove_acc.account_name or cove_acc.computer_name or f"Cove account {cove_acc.account_id}").strip()
job_name = (request.form.get("job_name") or default_job_name).strip()
backup_type = (request.form.get("backup_type") or _derive_backup_type_for_account(cove_acc)).strip()
job = Job(
customer_id=customer.id,
backup_software="Cove Data Protection",
backup_type=backup_type,
job_name=job_name,
cove_account_id=cove_acc.account_id,
active=True,
auto_approve=True,
)
db.session.add(job)
db.session.flush()
cove_acc.job_id = job.id
db.session.commit()
_log_admin_event(
"cove_account_linked",
f"Created job {job.id} and linked Cove account {cove_acc.account_id} ({cove_acc.account_name})",
details=f"customer={customer.name}, job_name={job_name}",
)
linked_job_name = job_name
flash(f"Job '{job_name}' created for customer '{customer.name}'.", "success")
elif action == "link":
# Link to an existing job
job_id_raw = (request.form.get("job_id") or "").strip()
if not job_id_raw:
flash("Please select a job.", "danger")
return redirect(url_for("main.cove_accounts"))
try:
job_id = int(job_id_raw)
except ValueError:
flash("Invalid job selection.", "danger")
return redirect(url_for("main.cove_accounts"))
job = Job.query.get(job_id)
if not job:
flash("Job not found.", "danger")
return redirect(url_for("main.cove_accounts"))
job.cove_account_id = cove_acc.account_id
cove_acc.job_id = job.id
db.session.commit()
_log_admin_event(
"cove_account_linked",
f"Linked Cove account {cove_acc.account_id} ({cove_acc.account_name}) to existing job {job.id}",
details=f"job_name={job.job_name}",
)
linked_job_name = job.job_name or ""
flash(f"Cove account linked to job '{job.job_name}'.", "success")
else:
flash("Unknown action.", "warning")
return redirect(url_for("main.cove_accounts"))
# Trigger an immediate import so the latest Cove run appears right away
# after linking (instead of waiting for the next scheduled/manual import).
settings = SystemSettings.query.first()
if settings and getattr(settings, "cove_enabled", False):
linked_job_id = cove_acc.job_id
before_count = 0
if linked_job_id:
before_count = (
JobRun.query
.filter_by(job_id=linked_job_id, source_type="cove_api")
.count()
)
try:
total, created, skipped, errors = run_cove_import(settings)
after_count = 0
if linked_job_id:
after_count = (
JobRun.query
.filter_by(job_id=linked_job_id, source_type="cove_api")
.count()
)
linked_created = max(after_count - before_count, 0)
_log_admin_event(
"cove_import_after_link",
(
"Triggered immediate Cove import after account link. "
f"accounts={total}, created={created}, skipped={skipped}, errors={errors}"
),
)
if linked_created > 0:
flash(
(
f"Immediate import complete for '{linked_job_name}'. "
f"New linked runs: {linked_created} (accounts: {total}, skipped: {skipped}, errors: {errors})."
),
"success" if errors == 0 else "warning",
)
else:
latest_cove = CoveAccount.query.get(cove_acc.id)
if latest_cove and latest_cove.last_run_at:
reason = (
"latest run seems unchanged (already imported) "
"or Cove has not published a newer session yet"
)
else:
reason = "Cove returned no usable last-session timestamp yet for this account"
flash(
(
f"Immediate import complete for '{linked_job_name}', but no new run was found yet. "
f"Reason: {reason}. (accounts: {total}, skipped: {skipped}, errors: {errors})"
),
"info" if errors == 0 else "warning",
)
except CoveImportError as exc:
_log_admin_event(
"cove_import_after_link_error",
f"Immediate Cove import after account link failed: {exc}",
)
flash(
"Account linked, but immediate import failed. "
"You can run import again from Cove settings.",
"warning",
)
except Exception as exc:
_log_admin_event(
"cove_import_after_link_error",
f"Unexpected immediate Cove import error after account link: {exc}",
)
flash(
"Account linked, but immediate import encountered an unexpected error. "
"You can run import again from Cove settings.",
"warning",
)
return redirect(url_for("main.cove_accounts"))
@main_bp.route("/cove/accounts/<int:cove_account_db_id>/unlink", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def cove_account_unlink(cove_account_db_id: int):
"""Remove the job link from a Cove account (puts it back in the unmatched list)."""
cove_acc = CoveAccount.query.get_or_404(cove_account_db_id)
old_job_id = cove_acc.job_id
if old_job_id:
job = Job.query.get(old_job_id)
if job and job.cove_account_id == cove_acc.account_id:
job.cove_account_id = None
cove_acc.job_id = None
db.session.commit()
_log_admin_event(
"cove_account_unlinked",
f"Unlinked Cove account {cove_acc.account_id} ({cove_acc.account_name}) from job {old_job_id}",
)
flash("Cove account unlinked.", "success")
return redirect(url_for("main.cove_accounts"))

View File

@ -63,7 +63,27 @@ def _get_or_create_settings_local():
@login_required
@roles_required("admin", "operator", "viewer")
def customers():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
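The nested `_patterns` helper escapes SQL LIKE metacharacters, turns the user wildcard `*` into `%`, and wraps each whitespace-separated token in `%…%` so every token becomes a substring match. A standalone copy for illustration:

```python
def patterns(raw: str) -> list[str]:
    """Standalone copy of the _patterns helper."""
    out = []
    for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
        p = tok.replace("\\", "\\\\")                   # escape backslashes first
        p = p.replace("%", "\\%").replace("_", "\\_")   # escape LIKE wildcards
        p = p.replace("*", "%")                         # user wildcard -> SQL wildcard
        if not p.startswith("%"):
            p = "%" + p
        if not p.endswith("%"):
            p = p + "%"
        out.append(p)
    return out

# patterns("acme prod*") -> ["%acme%", "%prod%"]
```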
query = Customer.query
if q:
for pat in _patterns(q):
query = query.filter(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
items = query.order_by(Customer.name.asc()).all()
settings = _get_or_create_settings_local()
autotask_enabled = bool(getattr(settings, "autotask_enabled", False))
@ -105,6 +125,7 @@ def customers():
can_manage=can_manage,
autotask_enabled=autotask_enabled,
autotask_configured=autotask_configured,
q=q,
)
@ -484,6 +505,7 @@ def customers_export():
@roles_required("admin", "operator")
def customers_import():
file = request.files.get("file")
include_autotask_ids = bool(request.form.get("include_autotask_ids"))
if not file or not getattr(file, "filename", ""):
flash("No file selected.", "warning")
return redirect(url_for("main.customers"))
@ -520,10 +542,11 @@ def customers_import():
# Detect Autotask columns (backwards compatible - these are optional)
autotask_id_idx = None
autotask_name_idx = None
if include_autotask_ids:
if "autotask_company_id" in header:
autotask_id_idx = header.index("autotask_company_id")
if "autotask_company_name" in header:
autotask_name_idx = header.index("autotask_company_name")
for r in rows[start_idx:]:
if not r:
@ -561,7 +584,7 @@ def customers_import():
if active_val is not None:
existing.active = active_val
# Update Autotask mapping if provided in CSV
if include_autotask_ids and autotask_company_id is not None:
existing.autotask_company_id = autotask_company_id
existing.autotask_company_name = autotask_company_name
existing.autotask_mapping_status = None # Will be resynced
@ -579,7 +602,10 @@ def customers_import():
try:
db.session.commit()
flash(
f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
"success",
)
# Audit logging # Audit logging
import json import json
@ -588,6 +614,7 @@ def customers_import():
f"Imported customers from CSV", f"Imported customers from CSV",
details=json.dumps({ details=json.dumps({
"format": "CSV", "format": "CSV",
"include_autotask_ids": include_autotask_ids,
"created": created,
"updated": updated,
"skipped": skipped
@ -599,5 +626,3 @@ def customers_import():
flash("Failed to import customers.", "danger")
return redirect(url_for("main.customers"))

View File

@ -9,6 +9,21 @@ MISSED_GRACE_WINDOW = timedelta(hours=1)
@login_required
@roles_required("admin", "operator", "viewer")
def daily_jobs():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
# Determine target date (default: today) in Europe/Amsterdam
date_str = request.args.get("date")
try:
@ -74,10 +89,21 @@ def daily_jobs():
weekday_idx = target_date.weekday() # 0=Mon..6=Sun
jobs_query = (
Job.query.join(Customer, isouter=True)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
if q:
for pat in _patterns(q):
jobs_query = jobs_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
jobs = (
jobs_query
.order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
.all()
)
@ -306,7 +332,7 @@ def daily_jobs():
)
target_date_str = target_date.strftime("%Y-%m-%d")
return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str, q=q)
@main_bp.route("/daily-jobs/details")

View File

@ -89,6 +89,7 @@ DOCUMENTATION_STRUCTURE = {
{'slug': 'general', 'title': 'General Settings'},
{'slug': 'mail-configuration', 'title': 'Mail Configuration'},
{'slug': 'autotask-integration', 'title': 'Autotask Integration'},
{'slug': 'entra-sso', 'title': 'Microsoft Entra SSO'},
{'slug': 'reporting-settings', 'title': 'Reporting Settings'},
{'slug': 'user-management', 'title': 'User Management'},
{'slug': 'maintenance', 'title': 'Maintenance'},

View File

@ -1,5 +1,53 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import _format_datetime
from werkzeug.utils import secure_filename
import imghdr
# Allowed image extensions and max file size
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif', 'webp'}
MAX_FILE_SIZE = 5 * 1024 * 1024 # 5 MB
def _validate_image_file(file):
"""Validate uploaded image file.
Returns (is_valid, error_message, mime_type)
"""
if not file or not file.filename:
return False, "No file selected", None
# Check file size
file.seek(0, 2) # Seek to end
size = file.tell()
file.seek(0) # Reset to beginning
if size > MAX_FILE_SIZE:
return False, f"File too large (max {MAX_FILE_SIZE // (1024*1024)}MB)", None
if size == 0:
return False, "Empty file", None
# Check extension
filename = secure_filename(file.filename)
if '.' not in filename:
return False, "File must have an extension", None
ext = filename.rsplit('.', 1)[1].lower()
if ext not in ALLOWED_EXTENSIONS:
return False, f"Only images allowed ({', '.join(ALLOWED_EXTENSIONS)})", None
# Verify it's actually an image by reading header
file_data = file.read()
file.seek(0)
image_type = imghdr.what(None, h=file_data)
if image_type is None:
return False, "Invalid image file", None
mime_type = f"image/{image_type}"
return True, None, mime_type
@main_bp.route("/feedback")
@ -21,7 +69,14 @@ def feedback_page():
if sort not in ("votes", "newest", "updated"):
sort = "votes"
# Admin-only: show deleted items
show_deleted = False
if get_active_role() == "admin":
show_deleted = request.args.get("show_deleted", "0") in ("1", "true", "yes", "on")
where = []
if not show_deleted:
where.append("fi.deleted_at IS NULL")
params = {"user_id": int(current_user.id)}
if item_type:
@ -58,6 +113,8 @@ def feedback_page():
fi.status,
fi.created_at,
fi.updated_at,
fi.deleted_at,
fi.deleted_by_user_id,
u.username AS created_by,
COALESCE(v.vote_count, 0) AS vote_count,
EXISTS (
@ -95,6 +152,8 @@ def feedback_page():
"created_by": r["created_by"] or "-",
"vote_count": int(r["vote_count"] or 0),
"user_voted": bool(r["user_voted"]),
"is_deleted": bool(r["deleted_at"]),
"deleted_at": _format_datetime(r["deleted_at"]) if r["deleted_at"] else "",
}
)
@ -105,6 +164,7 @@ def feedback_page():
status=status,
q=q,
sort=sort,
show_deleted=show_deleted,
)
@ -135,6 +195,31 @@ def feedback_new():
created_by_user_id=int(current_user.id),
)
db.session.add(item)
db.session.flush() # Get item.id for attachments
# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_new"))
filename = secure_filename(file.filename)
file_data = file.read()
attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=None,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)
db.session.commit()
flash("Feedback item created.", "success")
@ -148,7 +233,8 @@ def feedback_new():
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_detail(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
# Allow admins to view deleted items
if item.deleted_at is not None and get_active_role() != "admin":
abort(404)
vote_count = (
@ -174,13 +260,41 @@ def feedback_detail(item_id: int):
resolved_by = User.query.get(item.resolved_by_user_id)
resolved_by_name = resolved_by.username if resolved_by else ""
# Get attachments for the main item (not linked to a reply)
item_attachments = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_item_id == item.id,
FeedbackAttachment.feedback_reply_id.is_(None),
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)
replies = (
FeedbackReply.query.filter(FeedbackReply.feedback_item_id == item.id)
.order_by(FeedbackReply.created_at.asc())
.all()
)
# Get attachments for each reply
reply_ids = [r.id for r in replies]
reply_attachments_list = []
if reply_ids:
reply_attachments_list = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_reply_id.in_(reply_ids)
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)
# Map reply_id -> list of attachments
reply_attachments_map = {}
for att in reply_attachments_list:
if att.feedback_reply_id not in reply_attachments_map:
reply_attachments_map[att.feedback_reply_id] = []
reply_attachments_map[att.feedback_reply_id].append(att)
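The manual dict-of-lists build above is the standard grouping pattern; with `collections.defaultdict` it collapses to one `append` per row (plain dicts stand in for `FeedbackAttachment` rows here):

```python
from collections import defaultdict

# Behavioral sketch of the reply_id -> attachments grouping.
attachments = [
    {"id": 1, "feedback_reply_id": 10},
    {"id": 2, "feedback_reply_id": 10},
    {"id": 3, "feedback_reply_id": 11},
]
by_reply: dict[int, list[dict]] = defaultdict(list)
for att in attachments:
    by_reply[att["feedback_reply_id"]].append(att)
```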
reply_user_ids = sorted({int(r.user_id) for r in replies})
reply_users = (
User.query.filter(User.id.in_(reply_user_ids)).all() if reply_user_ids else []
@ -196,6 +310,8 @@ def feedback_detail(item_id: int):
user_voted=bool(user_voted),
replies=replies,
reply_user_map=reply_user_map,
item_attachments=item_attachments,
reply_attachments_map=reply_attachments_map,
)
@main_bp.route("/feedback/<int:item_id>/reply", methods=["POST"])
@ -222,6 +338,31 @@ def feedback_reply(item_id: int):
created_at=datetime.utcnow(),
)
db.session.add(reply)
db.session.flush() # Get reply.id for attachments
# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_detail", item_id=item.id))
filename = secure_filename(file.filename)
file_data = file.read()
attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=reply.id,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)
db.session.commit()
flash("Reply added.", "success")
@@ -308,3 +449,60 @@ def feedback_delete(item_id: int):
flash("Feedback item deleted.", "success")
return redirect(url_for("main.feedback_page"))
@main_bp.route("/feedback/<int:item_id>/permanent-delete", methods=["POST"])
@login_required
@roles_required("admin")
def feedback_permanent_delete(item_id: int):
"""Permanently delete a feedback item and all its attachments from the database.
This is a hard delete - the item and all associated data will be removed permanently.
Only available for items that are already soft-deleted.
"""
item = FeedbackItem.query.get_or_404(item_id)
# Only allow permanent delete on already soft-deleted items
if item.deleted_at is None:
flash("Item must be deleted first before permanent deletion.", "warning")
return redirect(url_for("main.feedback_detail", item_id=item.id))
# Get attachment count for feedback message
attachment_count = FeedbackAttachment.query.filter_by(feedback_item_id=item.id).count()
# Hard delete - CASCADE will automatically delete:
# - feedback_votes
# - feedback_replies
# - feedback_attachments (via replies CASCADE)
# - feedback_attachments (direct, via item CASCADE)
db.session.delete(item)
db.session.commit()
flash(f"Feedback item permanently deleted ({attachment_count} screenshot(s) removed).", "success")
return redirect(url_for("main.feedback_page", show_deleted="1"))
@main_bp.route("/feedback/attachment/<int:attachment_id>")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_attachment(attachment_id: int):
"""Serve a feedback attachment image."""
attachment = FeedbackAttachment.query.get_or_404(attachment_id)
# Check if the feedback item is deleted - allow admins to view
item = FeedbackItem.query.get(attachment.feedback_item_id)
if not item:
abort(404)
if item.deleted_at is not None and get_active_role() != "admin":
abort(404)
# Serve the image
from flask import send_file
import io
return send_file(
io.BytesIO(attachment.file_data),
mimetype=attachment.mime_type,
as_attachment=False,
download_name=attachment.filename,
)
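The reply-attachment grouping in `feedback_detail` (one query for all reply attachments, then a `reply_id -> attachments` map) can be sketched standalone. The dict-shaped rows below are illustrative stand-ins for `FeedbackAttachment` objects, not the real model:

```python
from collections import defaultdict


def group_attachments_by_reply(attachments: list[dict]) -> dict[int, list[dict]]:
    """Group attachment rows by their feedback_reply_id, preserving query order."""
    grouped: dict[int, list[dict]] = defaultdict(list)
    for att in attachments:
        grouped[att["feedback_reply_id"]].append(att)
    return dict(grouped)


# Illustrative rows, already ordered by created_at as in the route's query.
rows = [
    {"id": 1, "feedback_reply_id": 10},
    {"id": 2, "feedback_reply_id": 11},
    {"id": 3, "feedback_reply_id": 10},
]
grouped = group_attachments_by_reply(rows)
```

Using `defaultdict` avoids the explicit "key not in map" check the route does by hand; both produce the same mapping.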


@@ -9,12 +9,28 @@ from ..ticketing_utils import link_open_internal_tickets_to_run
import time
import re
import html as _html
from sqlalchemy import cast, String
@main_bp.route("/inbox")
@login_required
@roles_required("admin", "operator", "viewer")
def inbox():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
try:
page = int(request.args.get("page", "1"))
except ValueError:
@@ -28,6 +44,18 @@ def inbox():
# Use location column if available; otherwise just return all
if hasattr(MailMessage, "location"):
query = query.filter(MailMessage.location == "inbox")
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(MailMessage.from_address, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.subject, "").ilike(pat, escape="\\"))
| (cast(MailMessage.received_at, String).ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.job_name, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.parse_result, "").ilike(pat, escape="\\"))
| (cast(MailMessage.parsed_at, String).ilike(pat, escape="\\"))
)
total_items = query.count()
total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
@@ -79,6 +107,7 @@ def inbox():
customers=customer_rows,
can_bulk_delete=(get_active_role() in ("admin", "operator")),
is_admin=(get_active_role() == "admin"),
q=q,
)
@@ -1320,4 +1349,4 @@ def inbox_reparse_all():
"info",
)
return redirect(url_for("main.inbox"))
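The `_patterns` helper that the inbox search (and several other routes in this changeset) relies on turns whitespace-separated tokens into escaped SQL LIKE patterns: `%`, `_`, and `\` are escaped as literals, `*` is the user-facing wildcard, and every pattern is wrapped in `%` for substring matching. A minimal standalone mirror of that logic (the function name here is illustrative):

```python
def patterns(raw: str) -> list[str]:
    """Mirror of the route-level _patterns helper: one LIKE pattern per token.

    LIKE metacharacters are escaped so user input matches literally,
    '*' becomes the SQL wildcard, and each pattern is wrapped in '%'
    so a bare token behaves as a substring match.
    """
    out: list[str] = []
    for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
        p = tok.replace("\\", "\\\\")
        p = p.replace("%", "\\%").replace("_", "\\_")
        p = p.replace("*", "%")
        if not p.startswith("%"):
            p = "%" + p
        if not p.endswith("%"):
            p = p + "%"
        out.append(p)
    return out
```

Each resulting pattern is applied as a separate `.filter(... .ilike(pat, escape="\\"))`, so multiple tokens are ANDed together.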


@@ -13,12 +13,56 @@ from .routes_shared import (
@login_required
@roles_required("admin", "operator", "viewer")
def jobs():
selected_customer_id = None
selected_customer_name = ""
q = (request.args.get("q") or "").strip()
customer_id_raw = (request.args.get("customer_id") or "").strip()
if customer_id_raw:
try:
selected_customer_id = int(customer_id_raw)
except ValueError:
selected_customer_id = None
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
base_query = (
Job.query
.filter(Job.archived.is_(False))
.outerjoin(Customer, Customer.id == Job.customer_id)
)
if selected_customer_id is not None:
base_query = base_query.filter(Job.customer_id == selected_customer_id)
selected_customer = Customer.query.filter(Customer.id == selected_customer_id).first()
if selected_customer is not None:
selected_customer_name = selected_customer.name or ""
else:
# Default listing hides jobs for inactive customers.
base_query = base_query.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
if q:
for pat in _patterns(q):
base_query = base_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
# Join with customers for display
jobs = (
base_query
.add_columns(
Job.id,
Job.backup_software,
@@ -54,6 +98,9 @@ def jobs():
"main/jobs.html",
jobs=rows,
can_manage_jobs=can_manage_jobs,
selected_customer_id=selected_customer_id,
selected_customer_name=selected_customer_name,
q=q,
)
@@ -140,6 +187,35 @@ def unarchive_job(job_id: int):
return redirect(url_for("main.archived_jobs"))
@main_bp.route("/jobs/<int:job_id>/set-cove-account", methods=["POST"])
@login_required
@roles_required("admin", "operator")
def job_set_cove_account(job_id: int):
"""Save or clear the Cove Account ID for this job."""
job = Job.query.get_or_404(job_id)
account_id_raw = (request.form.get("cove_account_id") or "").strip()
if account_id_raw:
try:
job.cove_account_id = int(account_id_raw)
except (ValueError, TypeError):
flash("Invalid Cove Account ID: must be a number.", "warning")
return redirect(url_for("main.job_detail", job_id=job_id))
else:
job.cove_account_id = None
db.session.commit()
try:
log_admin_event(
"job_cove_account_set",
f"Set Cove Account ID for job {job.id} to {job.cove_account_id!r}",
details=f"job_name={job.job_name}",
)
except Exception:
pass
flash("Cove Account ID saved.", "success")
return redirect(url_for("main.job_detail", job_id=job_id))
@main_bp.route("/jobs/<int:job_id>")
@login_required
@roles_required("admin", "operator", "viewer")
@@ -444,6 +520,11 @@ def job_detail(job_id: int):
if job.customer_id:
customer = Customer.query.get(job.customer_id)
# Load system settings for Cove integration display
from ..models import SystemSettings as _SystemSettings
_settings = _SystemSettings.query.first()
cove_enabled = bool(getattr(_settings, "cove_enabled", False)) if _settings else False
return render_template(
"main/job_detail.html",
job=job,
@@ -460,6 +541,7 @@ def job_detail(job_id: int):
has_prev=has_prev,
has_next=has_next,
can_manage_jobs=can_manage_jobs,
cove_enabled=cove_enabled,
)


@@ -11,6 +11,16 @@ _OVERRIDE_DEFAULT_START_AT = datetime(1970, 1, 1)
def overrides():
can_manage = get_active_role() in ("admin", "operator")
can_delete = get_active_role() == "admin"
q = (request.args.get("q") or "").strip()
def _match_query(text: str, raw_query: str) -> bool:
hay = (text or "").lower()
tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
for tok in tokens:
needle = tok.lower().replace("*", "")
if needle and needle not in hay:
return False
return True
overrides_q = Override.query.order_by(Override.level.asc(), Override.start_at.desc()).all()
@@ -92,16 +102,31 @@ def overrides():
rows = []
for ov in overrides_q:
scope_text = _describe_scope(ov)
start_text = _format_datetime(ov.start_at)
end_text = _format_datetime(ov.end_at) if ov.end_at else ""
comment_text = ov.comment or ""
if q:
full_text = " | ".join([
ov.level or "",
scope_text,
start_text,
end_text,
comment_text,
])
if not _match_query(full_text, q):
continue
rows.append(
{
"id": ov.id,
"level": ov.level or "",
"scope": scope_text,
"start_at": start_text,
"end_at": end_text,
"active": bool(ov.active),
"treat_as_success": bool(ov.treat_as_success),
"comment": comment_text,
"match_status": ov.match_status or "",
"match_error_contains": ov.match_error_contains or "",
"match_error_mode": getattr(ov, "match_error_mode", None) or "",
@@ -122,6 +147,7 @@ def overrides():
jobs_for_select=jobs_for_select,
backup_software_options=backup_software_options,
backup_type_options=backup_type_options,
q=q,
)
@@ -398,4 +424,3 @@ def overrides_toggle(override_id: int):
flash("Override status updated.", "success")
return redirect(url_for("main.overrides"))


@@ -1,6 +1,6 @@
from .routes_shared import * # noqa: F401,F403
from sqlalchemy import text, cast, String
import json
import csv
import io
@@ -101,12 +101,33 @@ def api_reports_list():
if err is not None:
return err
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)
rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
return {
"items": [
{


@@ -1,6 +1,7 @@
from .routes_shared import * # noqa: F401,F403
from datetime import date, timedelta
from .routes_reporting_api import build_report_columns_meta, build_report_job_filters_meta
from sqlalchemy import cast, String
def get_default_report_period():
"""Return default report period (last 7 days)."""
@@ -52,13 +53,33 @@ def _build_report_item(r):
@main_bp.route("/reports")
@login_required
def reports():
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
# Pre-render items so the page is usable even if JS fails to load/execute.
query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)
rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
items = [_build_report_item(r) for r in rows]
period_start, period_end = get_default_report_period()
@@ -70,6 +91,7 @@ def reports():
job_filters_meta=build_report_job_filters_meta(),
default_period_start=period_start.isoformat(),
default_period_end=period_end.isoformat(),
q=q,
)


@@ -38,11 +38,19 @@ from ..models import (
TicketScope,
User,
)
from ..ticketing_utils import link_open_internal_tickets_to_run
AUTOTASK_TERMINAL_STATUS_IDS = {5}
def _is_hidden_3cx_non_backup(backup_software: str | None, backup_type: str | None) -> bool:
"""Hide non-backup 3CX informational jobs from Run Checks."""
bs = (backup_software or "").strip().lower()
bt = (backup_type or "").strip().lower()
return bs == "3cx" and bt in {"update", "ssl certificate"}
def _ensure_internal_ticket_for_autotask(
*,
ticket_number: str,
@@ -725,6 +733,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.add(miss) db.session.add(miss)
db.session.flush() # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1
d = d + timedelta(days=1)
@@ -806,6 +816,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.flush() # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1
# Next month
@@ -825,6 +837,21 @@ def run_checks_page():
def run_checks_page():
"""Run Checks page: list jobs that have runs to review (including generated missed runs)."""
q = (request.args.get("q") or "").strip()
def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out
include_reviewed = False
if get_active_role() == "admin":
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
@@ -850,6 +877,8 @@ def run_checks_page():
today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()
for job in jobs:
if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
continue
last_rev = last_reviewed_map.get(int(job.id))
if last_rev:
start_date = _to_amsterdam_date(last_rev) or settings_start
@@ -884,6 +913,14 @@ def run_checks_page():
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
)
if q:
for pat in _patterns(q):
base = base.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
# Runs to show in the overview: unreviewed (or all if admin toggle enabled)
run_filter = []
@@ -956,7 +993,7 @@ def run_checks_page():
Job.id.asc(),
)
rows = [r for r in q.limit(2000).all() if not _is_hidden_3cx_non_backup(r.backup_software, r.backup_type)]
# Ensure override flags are up-to-date for the runs shown in this overview.
# The Run Checks modal computes override status on-the-fly, but the overview
@@ -1131,6 +1168,7 @@ def run_checks_page():
is_admin=(get_active_role() == "admin"),
include_reviewed=include_reviewed,
autotask_enabled=autotask_enabled,
q=q,
)
@@ -1151,6 +1189,15 @@ def run_checks_details():
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
job = Job.query.get_or_404(job_id)
if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
job_payload = {
"id": job.id,
"customer_name": job.customer.name if job.customer else "",
"backup_software": job.backup_software or "",
"backup_type": job.backup_type or "",
"job_name": job.job_name or "",
}
return jsonify({"status": "ok", "job": job_payload, "runs": [], "message": "This 3CX informational type is hidden from Run Checks."})
q = JobRun.query.filter(JobRun.job_id == job.id)
if not include_reviewed:


@@ -0,0 +1,963 @@
from .routes_shared import * # noqa: F401,F403
from .routes_shared import (
_apply_overrides_to_run,
_format_datetime,
_get_or_create_settings,
_get_ui_timezone,
_infer_monthly_schedule_from_runs,
_infer_schedule_map_from_runs,
)
from sqlalchemy import and_, cast, func, or_, String
import math
SEARCH_LIMIT_PER_SECTION = 10
SEARCH_SECTION_KEYS = [
"inbox",
"customers",
"jobs",
"daily_jobs",
"run_checks",
"tickets",
"remarks",
"overrides",
"reports",
]
def _is_section_allowed(section: str) -> bool:
role = get_active_role()
allowed = {
"inbox": {"admin", "operator", "viewer"},
"customers": {"admin", "operator", "viewer"},
"jobs": {"admin", "operator", "viewer"},
"daily_jobs": {"admin", "operator", "viewer"},
"run_checks": {"admin", "operator"},
"tickets": {"admin", "operator", "viewer"},
"remarks": {"admin", "operator", "viewer"},
"overrides": {"admin", "operator", "viewer"},
"reports": {"admin", "operator", "viewer", "reporter"},
}
return role in allowed.get(section, set())
def _build_patterns(raw_query: str) -> list[str]:
tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
patterns: list[str] = []
for token in tokens:
p = token.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = f"%{p}"
if not p.endswith("%"):
p = f"{p}%"
patterns.append(p)
return patterns
def _contains_all_terms(columns: list, patterns: list[str]):
if not patterns or not columns:
return None
term_filters = []
for pattern in patterns:
per_term = [col.ilike(pattern, escape="\\") for col in columns]
term_filters.append(or_(*per_term))
return and_(*term_filters)
def _parse_page(value: str | None) -> int:
try:
page = int((value or "").strip())
except Exception:
page = 1
return page if page > 0 else 1
def _paginate_query(query, page: int, order_by_cols: list):
total = query.count()
total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
current_page = min(max(page, 1), total_pages)
rows = (
query.order_by(*order_by_cols)
.offset((current_page - 1) * SEARCH_LIMIT_PER_SECTION)
.limit(SEARCH_LIMIT_PER_SECTION)
.all()
)
return total, current_page, total_pages, rows
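The clamping arithmetic in `_paginate_query` (empty results still yield one page, and an out-of-range page snaps to the nearest valid one) can be isolated as pure math. This sketch mirrors that logic with an illustrative helper name:

```python
import math

SEARCH_LIMIT_PER_SECTION = 10  # mirrors the module constant above


def page_window(total: int, page: int) -> tuple[int, int, int]:
    """Clamp a requested page the way _paginate_query does.

    Returns (total_pages, current_page, offset): zero results still produce
    one (empty) page, and page numbers below 1 or above the last page are
    clamped into range before computing the OFFSET for the query.
    """
    total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
    current_page = min(max(page, 1), total_pages)
    offset = (current_page - 1) * SEARCH_LIMIT_PER_SECTION
    return total_pages, current_page, offset
```

Clamping before computing the offset means a stale `page=99` link degrades to the last page instead of an empty section.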
def _enrich_paging(section: dict, total: int, current_page: int, total_pages: int) -> None:
section["total"] = int(total or 0)
section["current_page"] = int(current_page or 1)
section["total_pages"] = int(total_pages or 1)
section["has_prev"] = section["current_page"] > 1
section["has_next"] = section["current_page"] < section["total_pages"]
section["prev_url"] = ""
section["next_url"] = ""
def _build_inbox_results(patterns: list[str], page: int) -> dict:
section = {
"key": "inbox",
"title": "Inbox",
"view_all_url": url_for("main.inbox"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("inbox"):
return section
query = MailMessage.query
if hasattr(MailMessage, "location"):
query = query.filter(MailMessage.location == "inbox")
match_expr = _contains_all_terms(
[
func.coalesce(MailMessage.from_address, ""),
func.coalesce(MailMessage.subject, ""),
cast(MailMessage.received_at, String),
func.coalesce(MailMessage.backup_software, ""),
func.coalesce(MailMessage.backup_type, ""),
func.coalesce(MailMessage.job_name, ""),
func.coalesce(MailMessage.parse_result, ""),
cast(MailMessage.parsed_at, String),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[MailMessage.received_at.desc().nullslast(), MailMessage.id.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
for msg in rows:
parsed_flag = bool(getattr(msg, "parsed_at", None) or (msg.parse_result or ""))
section["items"].append(
{
"title": msg.subject or f"Message #{msg.id}",
"subtitle": f"{msg.from_address or '-'} | {_format_datetime(msg.received_at)}",
"meta": f"{msg.backup_software or '-'} / {msg.backup_type or '-'} / {msg.job_name or '-'} | Parsed: {'Yes' if parsed_flag else 'No'}",
"link": url_for("main.inbox"),
}
)
return section
def _build_customers_results(patterns: list[str], page: int) -> dict:
section = {
"key": "customers",
"title": "Customers",
"view_all_url": url_for("main.customers"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("customers"):
return section
query = Customer.query
match_expr = _contains_all_terms([func.coalesce(Customer.name, "")], patterns)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Customer.name.asc()],
)
_enrich_paging(section, total, current_page, total_pages)
for c in rows:
try:
job_count = c.jobs.count()
except Exception:
job_count = 0
section["items"].append(
{
"title": c.name or f"Customer #{c.id}",
"subtitle": f"Jobs: {job_count}",
"meta": "Active" if c.active else "Inactive",
"link": url_for("main.jobs", customer_id=c.id),
}
)
return section
def _build_jobs_results(patterns: list[str], page: int) -> dict:
section = {
"key": "jobs",
"title": "Jobs",
"view_all_url": url_for("main.jobs"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("jobs"):
return section
query = (
db.session.query(
Job.id.label("job_id"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Job.job_name.label("job_name"),
Customer.name.label("customer_name"),
)
.select_from(Job)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc(),
Job.backup_type.asc(),
Job.job_name.asc(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": "",
"link": url_for("main.job_detail", job_id=row.job_id),
}
)
return section
def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
section = {
"key": "daily_jobs",
"title": "Daily Jobs",
"view_all_url": url_for("main.daily_jobs"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("daily_jobs"):
return section
try:
tz = _get_ui_timezone()
except Exception:
tz = None
try:
target_date = datetime.now(tz).date() if tz else datetime.utcnow().date()
except Exception:
target_date = datetime.utcnow().date()
settings = _get_or_create_settings()
missed_start_date = getattr(settings, "daily_jobs_start_date", None)
if tz:
local_midnight = datetime(
year=target_date.year,
month=target_date.month,
day=target_date.day,
hour=0,
minute=0,
second=0,
tzinfo=tz,
)
start_of_day = local_midnight.astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
end_of_day = (local_midnight + timedelta(days=1)).astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
else:
start_of_day = datetime(
year=target_date.year,
month=target_date.month,
day=target_date.day,
hour=0,
minute=0,
second=0,
)
end_of_day = start_of_day + timedelta(days=1)
def _to_local(dt_utc):
if not dt_utc or not tz:
return dt_utc
try:
if dt_utc.tzinfo is None:
dt_utc = dt_utc.replace(tzinfo=datetime_module.timezone.utc)
return dt_utc.astimezone(tz)
except Exception:
return dt_utc
def _bucket_15min(dt_utc):
d = _to_local(dt_utc)
if not d:
return None
minute_bucket = (d.minute // 15) * 15
return f"{d.hour:02d}:{minute_bucket:02d}"
def _is_success_status(value: str) -> bool:
s = (value or "").strip().lower()
if not s:
return False
return ("success" in s) or ("override" in s)
query = (
db.session.query(
Job.id.label("job_id"),
Job.job_name.label("job_name"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(Job)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc(),
Job.backup_type.asc(),
Job.job_name.asc(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
expected_times = (_infer_schedule_map_from_runs(row.job_id).get(target_date.weekday()) or [])
if not expected_times:
monthly = _infer_monthly_schedule_from_runs(row.job_id)
if monthly:
try:
dom = int(monthly.get("day_of_month") or 0)
except Exception:
dom = 0
mtimes = monthly.get("times") or []
try:
import calendar as _calendar
last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
except Exception:
last_dom = target_date.day
scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
if target_date.day == scheduled_dom:
expected_times = list(mtimes)
runs_for_day = (
JobRun.query.filter(
JobRun.job_id == row.job_id,
JobRun.run_at >= start_of_day,
JobRun.run_at < end_of_day,
)
.order_by(JobRun.run_at.asc())
.all()
)
run_count = len(runs_for_day)
last_status = "-"
expected_display = expected_times[-1] if expected_times else "-"
if run_count > 0:
last_run = runs_for_day[-1]
try:
job_obj = Job.query.get(int(row.job_id))
status_display, _override_applied, _override_level, _ov_id, _ov_reason = _apply_overrides_to_run(job_obj, last_run)
if getattr(last_run, "missed", False):
last_status = status_display or "Missed"
else:
last_status = status_display or (last_run.status or "-")
except Exception:
last_status = last_run.status or "-"
expected_display = _bucket_15min(last_run.run_at) or expected_display
else:
try:
today_local = datetime.now(tz).date() if tz else datetime.utcnow().date()
except Exception:
today_local = datetime.utcnow().date()
if target_date > today_local:
last_status = "Expected"
elif target_date == today_local:
last_status = "Expected"
else:
if missed_start_date and target_date < missed_start_date:
last_status = "-"
else:
last_status = "Missed"
success_text = "Yes" if _is_success_status(last_status) else "No"
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": f"Expected: {expected_display} | Successful: {success_text} | Runs: {run_count}",
"link": url_for("main.daily_jobs", date=target_date.strftime("%Y-%m-%d"), open_job_id=row.job_id),
}
)
return section
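The monthly-schedule branch above clamps the configured day-of-month to the last valid day of the target month, so a job scheduled for the 31st is still expected on 30-day (or February) months. A standalone sketch of that clamp (the helper name is illustrative, not from the codebase):

```python
import calendar
from datetime import date

def clamped_schedule_day(target: date, day_of_month: int) -> int:
    """Clamp a configured day-of-month to the last real day of target's month."""
    last_dom = calendar.monthrange(target.year, target.month)[1]
    return day_of_month if 0 < day_of_month <= last_dom else last_dom

# A job configured for day 31 is expected on April 30 (April has 30 days).
print(clamped_schedule_day(date(2026, 4, 15), 31))  # 30
```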
def _build_run_checks_results(patterns: list[str], page: int) -> dict:
section = {
"key": "run_checks",
"title": "Run Checks",
"view_all_url": url_for("main.run_checks_page"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("run_checks"):
return section
agg = (
db.session.query(
JobRun.job_id.label("job_id"),
func.count(JobRun.id).label("run_count"),
)
.filter(JobRun.reviewed_at.is_(None))
.group_by(JobRun.job_id)
.subquery()
)
query = (
db.session.query(
Job.id.label("job_id"),
Job.job_name.label("job_name"),
Job.backup_software.label("backup_software"),
Job.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
agg.c.run_count.label("run_count"),
)
.select_from(Job)
.join(agg, agg.c.job_id == Job.id)
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(Job.archived.is_(False))
)
match_expr = _contains_all_terms(
[
func.coalesce(Customer.name, ""),
func.coalesce(Job.backup_software, ""),
func.coalesce(Job.backup_type, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[
Customer.name.asc().nullslast(),
Job.backup_software.asc().nullslast(),
Job.backup_type.asc().nullslast(),
Job.job_name.asc().nullslast(),
],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
section["items"].append(
{
"title": row.job_name or f"Job #{row.job_id}",
"subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
"meta": f"Unreviewed runs: {int(row.run_count or 0)}",
"link": url_for("main.run_checks_page"),
}
)
return section
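The Run Checks builder above first aggregates unreviewed runs per job in a subquery, then inner-joins jobs to it, so only jobs with at least one unreviewed run appear. A plain-SQL sketch of that aggregation using an in-memory sqlite database (the table names mirror the models, but this is an illustration, not the app's real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE jobs (id INTEGER PRIMARY KEY, job_name TEXT);
    CREATE TABLE job_runs (id INTEGER PRIMARY KEY, job_id INTEGER, reviewed_at TEXT);
    INSERT INTO jobs VALUES (1, 'Nightly'), (2, 'Weekly');
    INSERT INTO job_runs VALUES (1, 1, NULL), (2, 1, NULL), (3, 1, '2026-02-23'), (4, 2, '2026-02-22');
""")
# Inner join against the aggregate: jobs whose runs are all reviewed drop out.
rows = conn.execute("""
    SELECT j.job_name, agg.run_count
    FROM jobs j
    JOIN (SELECT job_id, COUNT(id) AS run_count
          FROM job_runs WHERE reviewed_at IS NULL
          GROUP BY job_id) agg ON agg.job_id = j.id
""").fetchall()
print(rows)  # [('Nightly', 2)]
```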
def _build_tickets_results(patterns: list[str], page: int) -> dict:
section = {
"key": "tickets",
"title": "Tickets",
"view_all_url": url_for("main.tickets_page"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("tickets"):
return section
query = (
db.session.query(Ticket)
.select_from(Ticket)
.outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
.outerjoin(Customer, Customer.id == TicketScope.customer_id)
.outerjoin(Job, Job.id == TicketScope.job_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Ticket.ticket_code, ""),
func.coalesce(Customer.name, ""),
func.coalesce(TicketScope.scope_type, ""),
func.coalesce(TicketScope.backup_software, ""),
func.coalesce(TicketScope.backup_type, ""),
func.coalesce(TicketScope.job_name_match, ""),
func.coalesce(Job.job_name, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
query = query.distinct()
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Ticket.start_date.desc().nullslast()],
)
_enrich_paging(section, total, current_page, total_pages)
for t in rows:
customer_display = "-"
scope_summary = "-"
try:
scope_rows = (
db.session.query(
TicketScope.scope_type.label("scope_type"),
TicketScope.backup_software.label("backup_software"),
TicketScope.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(TicketScope)
.outerjoin(Customer, Customer.id == TicketScope.customer_id)
.filter(TicketScope.ticket_id == t.id)
.all()
)
customer_names = []
for s in scope_rows:
cname = getattr(s, "customer_name", None)
if cname and cname not in customer_names:
customer_names.append(cname)
if customer_names:
customer_display = customer_names[0]
if len(customer_names) > 1:
customer_display = f"{customer_display} +{len(customer_names)-1}"
if scope_rows:
s = scope_rows[0]
bits = []
if getattr(s, "scope_type", None):
bits.append(str(getattr(s, "scope_type")))
if getattr(s, "backup_software", None):
bits.append(str(getattr(s, "backup_software")))
if getattr(s, "backup_type", None):
bits.append(str(getattr(s, "backup_type")))
scope_summary = " / ".join(bits) if bits else "-"
except Exception:
customer_display = "-"
scope_summary = "-"
section["items"].append(
{
"title": t.ticket_code or f"Ticket #{t.id}",
"subtitle": f"{customer_display} | {scope_summary}",
"meta": _format_datetime(t.start_date),
"link": url_for("main.ticket_detail", ticket_id=t.id),
}
)
return section
def _build_remarks_results(patterns: list[str], page: int) -> dict:
section = {
"key": "remarks",
"title": "Remarks",
"view_all_url": url_for("main.tickets_page", tab="remarks"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("remarks"):
return section
query = (
db.session.query(Remark)
.select_from(Remark)
.outerjoin(RemarkScope, RemarkScope.remark_id == Remark.id)
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
.outerjoin(Job, Job.id == RemarkScope.job_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Remark.title, ""),
func.coalesce(Remark.body, ""),
func.coalesce(Customer.name, ""),
func.coalesce(RemarkScope.scope_type, ""),
func.coalesce(RemarkScope.backup_software, ""),
func.coalesce(RemarkScope.backup_type, ""),
func.coalesce(RemarkScope.job_name_match, ""),
func.coalesce(Job.job_name, ""),
cast(Remark.start_date, String),
cast(Remark.resolved_at, String),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
query = query.distinct()
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Remark.start_date.desc().nullslast()],
)
_enrich_paging(section, total, current_page, total_pages)
for r in rows:
customer_display = "-"
scope_summary = "-"
try:
scope_rows = (
db.session.query(
RemarkScope.scope_type.label("scope_type"),
RemarkScope.backup_software.label("backup_software"),
RemarkScope.backup_type.label("backup_type"),
Customer.name.label("customer_name"),
)
.select_from(RemarkScope)
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
.filter(RemarkScope.remark_id == r.id)
.all()
)
customer_names = []
for s in scope_rows:
cname = getattr(s, "customer_name", None)
if cname and cname not in customer_names:
customer_names.append(cname)
if customer_names:
customer_display = customer_names[0]
if len(customer_names) > 1:
customer_display = f"{customer_display} +{len(customer_names)-1}"
if scope_rows:
s = scope_rows[0]
bits = []
if getattr(s, "scope_type", None):
bits.append(str(getattr(s, "scope_type")))
if getattr(s, "backup_software", None):
bits.append(str(getattr(s, "backup_software")))
if getattr(s, "backup_type", None):
bits.append(str(getattr(s, "backup_type")))
scope_summary = " / ".join(bits) if bits else "-"
except Exception:
customer_display = "-"
scope_summary = "-"
preview = (r.title or r.body or "").strip()
if len(preview) > 80:
preview = preview[:77] + "..."
section["items"].append(
{
"title": preview or f"Remark #{r.id}",
"subtitle": f"{customer_display} | {scope_summary}",
"meta": _format_datetime(r.start_date),
"link": url_for("main.remark_detail", remark_id=r.id),
}
)
return section
def _build_overrides_results(patterns: list[str], page: int) -> dict:
section = {
"key": "overrides",
"title": "Existing overrides",
"view_all_url": url_for("main.overrides"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("overrides"):
return section
query = (
db.session.query(
Override.id.label("id"),
Override.level.label("level"),
Override.backup_software.label("backup_software"),
Override.backup_type.label("backup_type"),
Override.object_name.label("object_name"),
Override.start_at.label("start_at"),
Override.end_at.label("end_at"),
Override.comment.label("comment"),
Customer.name.label("customer_name"),
Job.job_name.label("job_name"),
)
.select_from(Override)
.outerjoin(Job, Job.id == Override.job_id)
.outerjoin(Customer, Customer.id == Job.customer_id)
)
match_expr = _contains_all_terms(
[
func.coalesce(Override.level, ""),
func.coalesce(Customer.name, ""),
func.coalesce(Override.backup_software, ""),
func.coalesce(Override.backup_type, ""),
func.coalesce(Job.job_name, ""),
func.coalesce(Override.object_name, ""),
cast(Override.start_at, String),
cast(Override.end_at, String),
func.coalesce(Override.comment, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[Override.level.asc(), Override.start_at.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
for row in rows:
scope_bits = []
if row.customer_name:
scope_bits.append(row.customer_name)
if row.backup_software:
scope_bits.append(row.backup_software)
if row.backup_type:
scope_bits.append(row.backup_type)
if row.job_name:
scope_bits.append(row.job_name)
if row.object_name:
scope_bits.append(f"object: {row.object_name}")
scope_text = " / ".join(scope_bits) if scope_bits else "All jobs"
section["items"].append(
{
"title": (row.level or "override").capitalize(),
"subtitle": scope_text,
"meta": f"From {_format_datetime(row.start_at)} to {_format_datetime(row.end_at) if row.end_at else '-'} | {row.comment or ''}",
"link": url_for("main.overrides"),
}
)
return section
def _build_reports_results(patterns: list[str], page: int) -> dict:
section = {
"key": "reports",
"title": "Reports",
"view_all_url": url_for("main.reports"),
"total": 0,
"items": [],
"current_page": 1,
"total_pages": 1,
"has_prev": False,
"has_next": False,
"prev_url": "",
"next_url": "",
}
if not _is_section_allowed("reports"):
return section
query = ReportDefinition.query
match_expr = _contains_all_terms(
[
func.coalesce(ReportDefinition.name, ""),
func.coalesce(ReportDefinition.report_type, ""),
cast(ReportDefinition.period_start, String),
cast(ReportDefinition.period_end, String),
func.coalesce(ReportDefinition.output_format, ""),
],
patterns,
)
if match_expr is not None:
query = query.filter(match_expr)
total, current_page, total_pages, rows = _paginate_query(
query,
page,
[ReportDefinition.created_at.desc()],
)
_enrich_paging(section, total, current_page, total_pages)
can_edit = get_active_role() in ("admin", "operator", "reporter")
for r in rows:
section["items"].append(
{
"title": r.name or f"Report #{r.id}",
"subtitle": f"{r.report_type or '-'} | {r.output_format or '-'}",
"meta": f"{_format_datetime(r.period_start)} -> {_format_datetime(r.period_end)}",
"link": (url_for("main.reports_edit", report_id=r.id) if can_edit else url_for("main.reports")),
}
)
return section
@main_bp.route("/search")
@login_required
def search_page():
query = (request.args.get("q") or "").strip()
patterns = _build_patterns(query)
requested_pages = {
key: _parse_page(request.args.get(f"p_{key}"))
for key in SEARCH_SECTION_KEYS
}
sections = []
if patterns:
sections.append(_build_inbox_results(patterns, requested_pages["inbox"]))
sections.append(_build_customers_results(patterns, requested_pages["customers"]))
sections.append(_build_jobs_results(patterns, requested_pages["jobs"]))
sections.append(_build_daily_jobs_results(patterns, requested_pages["daily_jobs"]))
sections.append(_build_run_checks_results(patterns, requested_pages["run_checks"]))
sections.append(_build_tickets_results(patterns, requested_pages["tickets"]))
sections.append(_build_remarks_results(patterns, requested_pages["remarks"]))
sections.append(_build_overrides_results(patterns, requested_pages["overrides"]))
sections.append(_build_reports_results(patterns, requested_pages["reports"]))
else:
sections = [
{"key": "inbox", "title": "Inbox", "view_all_url": url_for("main.inbox"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "customers", "title": "Customers", "view_all_url": url_for("main.customers"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "jobs", "title": "Jobs", "view_all_url": url_for("main.jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "daily_jobs", "title": "Daily Jobs", "view_all_url": url_for("main.daily_jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "run_checks", "title": "Run Checks", "view_all_url": url_for("main.run_checks_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "tickets", "title": "Tickets", "view_all_url": url_for("main.tickets_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "remarks", "title": "Remarks", "view_all_url": url_for("main.tickets_page", tab="remarks"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "overrides", "title": "Existing overrides", "view_all_url": url_for("main.overrides"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
{"key": "reports", "title": "Reports", "view_all_url": url_for("main.reports"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
]
visible_sections = [s for s in sections if _is_section_allowed(s["key"])]
current_pages = {
s["key"]: int(s.get("current_page", 1) or 1)
for s in sections
}
def _build_search_url(page_overrides: dict[str, int]) -> str:
args = {"q": query}
for key in SEARCH_SECTION_KEYS:
args[f"p_{key}"] = int(page_overrides.get(key, current_pages.get(key, 1)))
return url_for("main.search_page", **args)
for s in visible_sections:
key = s["key"]
cur = int(s.get("current_page", 1) or 1)
if query:
if key == "inbox":
s["view_all_url"] = url_for("main.inbox", q=query)
elif key == "customers":
s["view_all_url"] = url_for("main.customers", q=query)
elif key == "jobs":
s["view_all_url"] = url_for("main.jobs", q=query)
elif key == "daily_jobs":
s["view_all_url"] = url_for("main.daily_jobs", q=query)
elif key == "run_checks":
s["view_all_url"] = url_for("main.run_checks_page", q=query)
elif key == "tickets":
s["view_all_url"] = url_for("main.tickets_page", q=query)
elif key == "remarks":
s["view_all_url"] = url_for("main.tickets_page", tab="remarks", q=query)
elif key == "overrides":
s["view_all_url"] = url_for("main.overrides", q=query)
elif key == "reports":
s["view_all_url"] = url_for("main.reports", q=query)
if s.get("has_prev"):
prev_pages = dict(current_pages)
prev_pages[key] = cur - 1
s["prev_url"] = _build_search_url(prev_pages)
if s.get("has_next"):
next_pages = dict(current_pages)
next_pages[key] = cur + 1
s["next_url"] = _build_search_url(next_pages)
total_hits = sum(int(s.get("total", 0) or 0) for s in visible_sections)
return render_template(
"main/search.html",
query=query,
sections=visible_sections,
total_hits=total_hits,
limit_per_section=SEARCH_LIMIT_PER_SECTION,
)
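The route above keeps one `p_<section>` query parameter per section, so paging a single section preserves every other section's current page; `_build_search_url` rebuilds the full query string with exactly one override. A minimal standalone sketch of that scheme (using `urlencode` in place of `url_for`, with an abbreviated key list):

```python
from urllib.parse import urlencode

SECTION_KEYS = ["inbox", "customers", "jobs"]  # abbreviated for the sketch

def build_search_url(query: str, current_pages: dict, overrides: dict) -> str:
    # One p_<key> parameter per section; overrides win, else current page, else 1.
    args = {"q": query}
    for key in SECTION_KEYS:
        args[f"p_{key}"] = int(overrides.get(key, current_pages.get(key, 1)))
    return "/search?" + urlencode(args)

# Paging "jobs" to page 3 leaves the other sections on their current pages.
print(build_search_url("veeam", {"inbox": 2}, {"jobs": 3}))
# /search?q=veeam&p_inbox=2&p_customers=1&p_jobs=3
```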

View File

@@ -585,6 +585,7 @@ def settings_jobs_export():
 @roles_required("admin")
 def settings_jobs_import():
     upload = request.files.get("jobs_file")
+    include_autotask_ids = bool(request.form.get("include_autotask_ids"))
     if not upload or not upload.filename:
         flash("No import file was provided.", "danger")
         return redirect(url_for("main.settings", section="general"))
@@ -621,14 +622,17 @@ def settings_jobs_import():
             if not cust_name:
                 continue
-            # Read Autotask fields (backwards compatible - optional)
-            autotask_company_id = cust_item.get("autotask_company_id")
-            autotask_company_name = cust_item.get("autotask_company_name")
+            autotask_company_id = None
+            autotask_company_name = None
+            if include_autotask_ids:
+                # Read Autotask fields (backwards compatible - optional)
+                autotask_company_id = cust_item.get("autotask_company_id")
+                autotask_company_name = cust_item.get("autotask_company_name")
             existing_customer = Customer.query.filter_by(name=cust_name).first()
             if existing_customer:
-                # Update Autotask mapping if provided
-                if autotask_company_id is not None:
+                # Update Autotask mapping only when explicitly allowed by import option.
+                if include_autotask_ids and autotask_company_id is not None:
                     existing_customer.autotask_company_id = autotask_company_id
                     existing_customer.autotask_company_name = autotask_company_name
                     existing_customer.autotask_mapping_status = None  # Will be resynced
@@ -747,7 +751,7 @@ def settings_jobs_import():
     db.session.commit()
     flash(
-        f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}.",
+        f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
         "success",
     )
@@ -758,6 +762,7 @@ def settings_jobs_import():
         details=json.dumps({
             "format": "JSON",
             "schema": payload.get("schema"),
+            "include_autotask_ids": include_autotask_ids,
             "customers_created": created_customers,
             "customers_updated": updated_customers,
             "jobs_created": created_jobs,
@@ -781,6 +786,8 @@ def settings():
     if request.method == "POST":
         autotask_form_touched = any(str(k).startswith("autotask_") for k in (request.form or {}).keys())
+        cove_form_touched = any(str(k).startswith("cove_") for k in (request.form or {}).keys())
+        entra_form_touched = any(str(k).startswith("entra_") for k in (request.form or {}).keys())
         import_form_touched = any(str(k).startswith("auto_import_") or str(k).startswith("manual_import_") or str(k).startswith("ingest_eml_") for k in (request.form or {}).keys())
         general_form_touched = "ui_timezone" in request.form
         mail_form_touched = any(k in request.form for k in ["graph_tenant_id", "graph_client_id", "graph_mailbox", "incoming_folder", "processed_folder"])
@@ -903,6 +910,51 @@ def settings():
                 except (ValueError, TypeError):
                     pass
+        # Cove Data Protection integration
+        if cove_form_touched:
+            settings.cove_enabled = bool(request.form.get("cove_enabled"))
+            settings.cove_import_enabled = bool(request.form.get("cove_import_enabled"))
+            if "cove_api_url" in request.form:
+                settings.cove_api_url = (request.form.get("cove_api_url") or "").strip() or None
+            if "cove_api_username" in request.form:
+                settings.cove_api_username = (request.form.get("cove_api_username") or "").strip() or None
+            if "cove_api_password" in request.form:
+                pw = (request.form.get("cove_api_password") or "").strip()
+                if pw:
+                    settings.cove_api_password = pw
+            if "cove_import_interval_minutes" in request.form:
+                try:
+                    interval = int(request.form.get("cove_import_interval_minutes") or 30)
+                    if interval < 1:
+                        interval = 1
+                    settings.cove_import_interval_minutes = interval
+                except (ValueError, TypeError):
+                    pass
+        # Microsoft Entra SSO
+        if entra_form_touched:
+            settings.entra_sso_enabled = bool(request.form.get("entra_sso_enabled"))
+            settings.entra_auto_provision_users = bool(request.form.get("entra_auto_provision_users"))
+            if "entra_tenant_id" in request.form:
+                settings.entra_tenant_id = (request.form.get("entra_tenant_id") or "").strip() or None
+            if "entra_client_id" in request.form:
+                settings.entra_client_id = (request.form.get("entra_client_id") or "").strip() or None
+            if "entra_redirect_uri" in request.form:
+                settings.entra_redirect_uri = (request.form.get("entra_redirect_uri") or "").strip() or None
+            if "entra_allowed_domain" in request.form:
+                settings.entra_allowed_domain = (request.form.get("entra_allowed_domain") or "").strip() or None
+            if "entra_allowed_group_ids" in request.form:
+                settings.entra_allowed_group_ids = (request.form.get("entra_allowed_group_ids") or "").strip() or None
+            if "entra_client_secret" in request.form:
+                pw = (request.form.get("entra_client_secret") or "").strip()
+                if pw:
+                    settings.entra_client_secret = pw
         # Daily Jobs
         if "daily_jobs_start_date" in request.form:
             daily_jobs_start_date_str = (request.form.get("daily_jobs_start_date") or "").strip()
@@ -1114,6 +1166,8 @@ def settings():
     has_client_secret = bool(settings.graph_client_secret)
     has_autotask_password = bool(getattr(settings, "autotask_api_password", None))
+    has_cove_password = bool(getattr(settings, "cove_api_password", None))
+    has_entra_secret = bool(getattr(settings, "entra_client_secret", None))
     # Common UI timezones (IANA names)
     tz_options = [
@@ -1239,6 +1293,8 @@ def settings():
         free_disk_warning=free_disk_warning,
         has_client_secret=has_client_secret,
         has_autotask_password=has_autotask_password,
+        has_cove_password=has_cove_password,
+        has_entra_secret=has_entra_secret,
         tz_options=tz_options,
         users=users,
         admin_users_count=admin_users_count,
@@ -1253,6 +1309,83 @@ def settings():
     )
+
+
+@main_bp.route("/settings/cove/test-connection", methods=["POST"])
+@login_required
+@roles_required("admin")
+def settings_cove_test_connection():
+    """Test the Cove Data Protection API connection and return JSON result."""
+    from flask import jsonify
+    from ..cove_importer import CoveImportError, _cove_login, COVE_DEFAULT_URL
+
+    settings = _get_or_create_settings()
+    username = (getattr(settings, "cove_api_username", None) or "").strip()
+    password = (getattr(settings, "cove_api_password", None) or "").strip()
+    url = (getattr(settings, "cove_api_url", None) or "").strip() or COVE_DEFAULT_URL
+    if not username or not password:
+        return jsonify({"ok": False, "message": "Cove API username and password must be saved first."})
+    try:
+        visa, partner_id = _cove_login(url, username, password)
+        # Store the partner_id
+        settings.cove_partner_id = partner_id
+        db.session.commit()
+        _log_admin_event(
+            "cove_test_connection",
+            f"Cove connection test succeeded. Partner ID: {partner_id}",
+        )
+        return jsonify({
+            "ok": True,
+            "partner_id": partner_id,
+            "message": f"Connected Partner ID: {partner_id}",
+        })
+    except CoveImportError as exc:
+        db.session.rollback()
+        return jsonify({"ok": False, "message": str(exc)})
+    except Exception as exc:
+        db.session.rollback()
+        return jsonify({"ok": False, "message": f"Unexpected error: {exc}"})
+
+
+@main_bp.route("/settings/cove/run-now", methods=["POST"])
+@login_required
+@roles_required("admin")
+def settings_cove_run_now():
+    """Manually trigger a Cove import and show the result as a flash message."""
+    from ..cove_importer import CoveImportError, run_cove_import
+
+    settings = _get_or_create_settings()
+    if not getattr(settings, "cove_enabled", False):
+        flash("Cove integration is not enabled.", "warning")
+        return redirect(url_for("main.settings", section="integrations"))
+    username = (getattr(settings, "cove_api_username", None) or "").strip()
+    password = (getattr(settings, "cove_api_password", None) or "").strip()
+    if not username or not password:
+        flash("Cove API credentials not configured.", "warning")
+        return redirect(url_for("main.settings", section="integrations"))
+    try:
+        total, created, skipped, errors = run_cove_import(settings)
+        _log_admin_event(
+            "cove_import_manual",
+            f"Manual Cove import finished. accounts={total}, created={created}, skipped={skipped}, errors={errors}",
+        )
+        flash(
+            f"Cove import finished. Accounts: {total}, new runs: {created}, skipped: {skipped}, errors: {errors}.",
+            "success" if errors == 0 else "warning",
+        )
+    except CoveImportError as exc:
+        _log_admin_event("cove_import_manual_error", f"Manual Cove import failed: {exc}")
+        flash(f"Cove import failed: {exc}", "danger")
+    except Exception as exc:
+        _log_admin_event("cove_import_manual_error", f"Unexpected error during manual Cove import: {exc}")
+        flash(f"Unexpected error: {exc}", "danger")
+    return redirect(url_for("main.settings", section="integrations"))
+
+
 @main_bp.route("/settings/news/create", methods=["POST"])
 @login_required

View File

@@ -52,6 +52,7 @@ from ..models import (
     FeedbackItem,
     FeedbackVote,
     FeedbackReply,
+    FeedbackAttachment,
     NewsItem,
     NewsRead,
     ReportDefinition,
@@ -678,6 +679,10 @@ def _infer_schedule_map_from_runs(job_id: int):
             return schedule
         if bs == 'qnap' and bt == 'firmware update':
             return schedule
+        if bs == '3cx' and bt == 'update':
+            return schedule
+        if bs == '3cx' and bt == 'ssl certificate':
+            return schedule
         if bs == 'syncovery' and bt == 'syncovery':
             return schedule
     except Exception:
@@ -993,4 +998,3 @@ def _next_ticket_code(now_utc: datetime) -> str:
         seq = 1
     return f"{prefix}{seq:04d}"
-

View File

@@ -28,17 +28,33 @@ def tickets_page():
     if tab == "tickets":
         query = Ticket.query
+        joined_scope = False
         if active_only:
             query = query.filter(Ticket.resolved_at.is_(None))
         if q:
             like_q = f"%{q}%"
+            query = (
+                query
+                .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
+                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
+                .outerjoin(Job, Job.id == TicketScope.job_id)
+            )
+            joined_scope = True
             query = query.filter(
                 (Ticket.ticket_code.ilike(like_q))
                 | (Ticket.description.ilike(like_q))
+                | (Customer.name.ilike(like_q))
+                | (TicketScope.scope_type.ilike(like_q))
+                | (TicketScope.backup_software.ilike(like_q))
+                | (TicketScope.backup_type.ilike(like_q))
+                | (TicketScope.job_name_match.ilike(like_q))
+                | (Job.job_name.ilike(like_q))
             )
+            query = query.distinct()
         if customer_id or backup_software or backup_type:
-            query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
+            if not joined_scope:
+                query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
             if customer_id:
                 query = query.filter(TicketScope.customer_id == customer_id)
             if backup_software:
@@ -322,4 +338,3 @@ def ticket_detail(ticket_id: int):
         scopes=scopes,
         runs=runs,
     )
-
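Outer-joining tickets to their scopes multiplies each ticket row once per matching scope, which is why the search branch above follows the join with `query.distinct()`. A plain-SQL illustration of the duplication and the fix, using sqlite (table names mirror the models; this is a sketch, not the app's real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tickets (id INTEGER PRIMARY KEY, ticket_code TEXT);
    CREATE TABLE ticket_scopes (id INTEGER PRIMARY KEY, ticket_id INTEGER, backup_software TEXT);
    INSERT INTO tickets VALUES (1, 'T-0001');
    INSERT INTO ticket_scopes VALUES (1, 1, 'Veeam'), (2, 1, 'Veeam');
""")
base = """
    SELECT {cols} FROM tickets t
    LEFT JOIN ticket_scopes s ON s.ticket_id = t.id
    WHERE s.backup_software LIKE '%Veeam%'
"""
# One ticket, two matching scopes: the join yields two rows, DISTINCT collapses them.
dup = conn.execute(base.format(cols="t.id")).fetchall()
uniq = conn.execute(base.format(cols="DISTINCT t.id")).fetchall()
print(len(dup), len(uniq))  # 2 1
```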

View File

@ -1070,6 +1070,141 @@ def migrate_rename_admin_logs_to_audit_logs() -> None:
print("[migrations] audit_logs table will be created by db.create_all()") print("[migrations] audit_logs table will be created by db.create_all()")
def migrate_cove_accounts_table() -> None:
"""Create the cove_accounts staging table if it does not exist.
This table stores all accounts returned by Cove EnumerateAccountStatistics.
Unlinked accounts (job_id IS NULL) appear in the Cove Accounts review page.
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for cove_accounts migration: {exc}")
return
try:
with engine.begin() as conn:
conn.execute(text("""
CREATE TABLE IF NOT EXISTS cove_accounts (
id SERIAL PRIMARY KEY,
account_id INTEGER NOT NULL UNIQUE,
account_name VARCHAR(512) NULL,
computer_name VARCHAR(512) NULL,
customer_name VARCHAR(255) NULL,
datasource_types VARCHAR(255) NULL,
last_status_code INTEGER NULL,
last_run_at TIMESTAMP NULL,
colorbar_28d VARCHAR(64) NULL,
job_id INTEGER NULL REFERENCES jobs(id) ON DELETE SET NULL,
first_seen_at TIMESTAMP NOT NULL DEFAULT NOW(),
last_seen_at TIMESTAMP NOT NULL DEFAULT NOW()
)
"""))
conn.execute(text(
"CREATE INDEX IF NOT EXISTS idx_cove_accounts_account_id ON cove_accounts (account_id)"
))
conn.execute(text(
"CREATE INDEX IF NOT EXISTS idx_cove_accounts_job_id ON cove_accounts (job_id)"
))
print("[migrations] migrate_cove_accounts_table completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate cove_accounts table: {exc}")
def migrate_cove_integration() -> None:
"""Add Cove Data Protection integration columns if missing.
Adds to system_settings:
- cove_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
- cove_api_url (VARCHAR(255) NULL)
- cove_api_username (VARCHAR(255) NULL)
- cove_api_password (VARCHAR(255) NULL)
- cove_import_enabled (BOOLEAN NOT NULL DEFAULT FALSE)
- cove_import_interval_minutes (INTEGER NOT NULL DEFAULT 30)
- cove_partner_id (INTEGER NULL)
- cove_last_import_at (TIMESTAMP NULL)
Adds to jobs:
- cove_account_id (INTEGER NULL)
Adds to job_runs:
- source_type (VARCHAR(20) NULL)
- external_id (VARCHAR(100) NULL)
"""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for Cove integration migration: {exc}")
return
try:
with engine.begin() as conn:
# system_settings columns
ss_columns = [
("cove_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("cove_api_url", "VARCHAR(255) NULL"),
("cove_api_username", "VARCHAR(255) NULL"),
("cove_api_password", "VARCHAR(255) NULL"),
("cove_import_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("cove_import_interval_minutes", "INTEGER NOT NULL DEFAULT 30"),
("cove_partner_id", "INTEGER NULL"),
("cove_last_import_at", "TIMESTAMP NULL"),
]
for column, ddl in ss_columns:
if _column_exists_on_conn(conn, "system_settings", column):
continue
conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))
# jobs column
if not _column_exists_on_conn(conn, "jobs", "cove_account_id"):
conn.execute(text('ALTER TABLE "jobs" ADD COLUMN cove_account_id INTEGER NULL'))
# job_runs columns
if not _column_exists_on_conn(conn, "job_runs", "source_type"):
conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN source_type VARCHAR(20) NULL'))
if not _column_exists_on_conn(conn, "job_runs", "external_id"):
conn.execute(text('ALTER TABLE "job_runs" ADD COLUMN external_id VARCHAR(100) NULL'))
# Index for deduplication lookups
conn.execute(text(
'CREATE INDEX IF NOT EXISTS idx_job_runs_external_id ON "job_runs" (external_id)'
))
print("[migrations] migrate_cove_integration completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate Cove integration columns: {exc}")
def migrate_entra_sso_settings() -> None:
"""Add Microsoft Entra SSO columns to system_settings if missing."""
try:
engine = db.get_engine()
except Exception as exc:
print(f"[migrations] Could not get engine for Entra SSO migration: {exc}")
return
columns = [
("entra_sso_enabled", "BOOLEAN NOT NULL DEFAULT FALSE"),
("entra_tenant_id", "VARCHAR(128) NULL"),
("entra_client_id", "VARCHAR(128) NULL"),
("entra_client_secret", "VARCHAR(255) NULL"),
("entra_redirect_uri", "VARCHAR(512) NULL"),
("entra_allowed_domain", "VARCHAR(255) NULL"),
("entra_allowed_group_ids", "TEXT NULL"),
("entra_auto_provision_users", "BOOLEAN NOT NULL DEFAULT FALSE"),
]
try:
with engine.begin() as conn:
for column, ddl in columns:
if _column_exists_on_conn(conn, "system_settings", column):
continue
conn.execute(text(f'ALTER TABLE "system_settings" ADD COLUMN {column} {ddl}'))
print("[migrations] migrate_entra_sso_settings completed.")
except Exception as exc:
print(f"[migrations] Failed to migrate Entra SSO columns: {exc}")
def run_migrations() -> None:
print("[migrations] Starting migrations...")
migrate_add_username_to_users()
@@ -1095,6 +1230,7 @@ def run_migrations() -> None:
migrate_object_persistence_tables()
migrate_feedback_tables()
migrate_feedback_replies_table()
migrate_feedback_attachments_table()
migrate_tickets_active_from_date()
migrate_tickets_resolved_origin()
migrate_remarks_active_from_date()
@@ -1111,6 +1247,9 @@ def run_migrations() -> None:
migrate_performance_indexes()
migrate_system_settings_require_daily_dashboard_visit()
migrate_rename_admin_logs_to_audit_logs()
migrate_cove_integration()
migrate_cove_accounts_table()
migrate_entra_sso_settings()
print("[migrations] All migrations completed.")
@@ -1446,6 +1585,49 @@ def migrate_feedback_replies_table() -> None:
print("[migrations] Feedback replies table ensured.")
def migrate_feedback_attachments_table() -> None:
"""Ensure feedback attachments table exists.
Table:
- feedback_attachments (screenshots/images for feedback items and replies)
"""
engine = db.get_engine()
with engine.begin() as conn:
conn.execute(
text(
"""
CREATE TABLE IF NOT EXISTS feedback_attachments (
id SERIAL PRIMARY KEY,
feedback_item_id INTEGER NOT NULL REFERENCES feedback_items(id) ON DELETE CASCADE,
feedback_reply_id INTEGER REFERENCES feedback_replies(id) ON DELETE CASCADE,
filename VARCHAR(255) NOT NULL,
file_data BYTEA NOT NULL,
mime_type VARCHAR(64) NOT NULL,
file_size INTEGER NOT NULL,
created_at TIMESTAMP NOT NULL DEFAULT NOW()
);
"""
)
)
conn.execute(
text(
"""
CREATE INDEX IF NOT EXISTS idx_feedback_attachments_item
ON feedback_attachments (feedback_item_id);
"""
)
)
conn.execute(
text(
"""
CREATE INDEX IF NOT EXISTS idx_feedback_attachments_reply
ON feedback_attachments (feedback_reply_id);
"""
)
)
print("[migrations] Feedback attachments table ensured.")
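The column constraints above imply a validation step before insert. A minimal sketch of that check (the size cap and allowed MIME types are illustrative assumptions, not values from the codebase):

```python
ALLOWED_MIME_TYPES = {"image/png", "image/jpeg", "image/gif", "image/webp"}
MAX_FILE_SIZE = 5 * 1024 * 1024  # illustrative cap; must fit the INTEGER file_size column

def validate_attachment(filename: str, mime_type: str, data: bytes) -> tuple[bool, str]:
    """Check an upload against the feedback_attachments column constraints."""
    if not filename or len(filename) > 255:  # filename VARCHAR(255) NOT NULL
        return False, "invalid filename"
    if mime_type not in ALLOWED_MIME_TYPES or len(mime_type) > 64:  # mime_type VARCHAR(64)
        return False, "unsupported mime type"
    if len(data) > MAX_FILE_SIZE:
        return False, "file too large"
    return True, "ok"
```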
def migrate_tickets_active_from_date() -> None:
"""Ensure tickets.active_from_date exists and is populated.

View File

@@ -117,6 +117,26 @@ class SystemSettings(db.Model):
# this is not a production environment.
is_sandbox_environment = db.Column(db.Boolean, nullable=False, default=False)
# Cove Data Protection integration settings
cove_enabled = db.Column(db.Boolean, nullable=False, default=False)
cove_api_url = db.Column(db.String(255), nullable=True) # default: https://api.backup.management/jsonapi
cove_api_username = db.Column(db.String(255), nullable=True)
cove_api_password = db.Column(db.String(255), nullable=True)
cove_import_enabled = db.Column(db.Boolean, nullable=False, default=False)
cove_import_interval_minutes = db.Column(db.Integer, nullable=False, default=30)
cove_partner_id = db.Column(db.Integer, nullable=True) # stored after successful login
cove_last_import_at = db.Column(db.DateTime, nullable=True)
# Microsoft Entra SSO settings
entra_sso_enabled = db.Column(db.Boolean, nullable=False, default=False)
entra_tenant_id = db.Column(db.String(128), nullable=True)
entra_client_id = db.Column(db.String(128), nullable=True)
entra_client_secret = db.Column(db.String(255), nullable=True)
entra_redirect_uri = db.Column(db.String(512), nullable=True)
entra_allowed_domain = db.Column(db.String(255), nullable=True)
entra_allowed_group_ids = db.Column(db.Text, nullable=True) # comma/newline separated Entra Group Object IDs
entra_auto_provision_users = db.Column(db.Boolean, nullable=False, default=False)
# Autotask integration settings
autotask_enabled = db.Column(db.Boolean, nullable=False, default=False)
autotask_environment = db.Column(db.String(32), nullable=True)  # sandbox | production
@@ -242,6 +262,9 @@ class Job(db.Model):
auto_approve = db.Column(db.Boolean, nullable=False, default=True)
active = db.Column(db.Boolean, nullable=False, default=True)
# Cove Data Protection integration (legacy: account ID stored directly on job)
cove_account_id = db.Column(db.Integer, nullable=True) # kept for backwards compat
# Archived jobs are excluded from Daily Jobs and Run Checks.
# JobRuns remain in the database and are still included in reporting.
archived = db.Column(db.Boolean, nullable=False, default=False)
@@ -290,6 +313,10 @@ class JobRun(db.Model):
reviewed_at = db.Column(db.DateTime, nullable=True)
reviewed_by_user_id = db.Column(db.Integer, db.ForeignKey("users.id"), nullable=True)
# Import source tracking
source_type = db.Column(db.String(20), nullable=True) # NULL = email (backwards compat), "cove_api"
external_id = db.Column(db.String(100), nullable=True) # e.g. "cove-{account_id}-{run_ts}" for deduplication
# Autotask integration (Phase 4: ticket creation from Run Checks)
autotask_ticket_id = db.Column(db.Integer, nullable=True)
autotask_ticket_number = db.Column(db.String(64), nullable=True)
@@ -314,6 +341,41 @@ class JobRun(db.Model):
autotask_ticket_created_by = db.relationship("User", foreign_keys=[autotask_ticket_created_by_user_id])
class CoveAccount(db.Model):
"""Staging table for Cove Data Protection accounts.
All accounts returned by EnumerateAccountStatistics are upserted here.
Unlinked accounts (job_id IS NULL) appear in the Cove Accounts page
where an admin can create or link a job, following the same flow as the mail Inbox.
Once linked, the importer creates JobRuns for each new session.
"""
__tablename__ = "cove_accounts"
id = db.Column(db.Integer, primary_key=True)
# Cove account identifier (unique, from AccountId field)
account_id = db.Column(db.Integer, nullable=False, unique=True)
# Account/device info from Cove columns
account_name = db.Column(db.String(512), nullable=True) # I1 device/backup name
computer_name = db.Column(db.String(512), nullable=True) # I18 computer name
customer_name = db.Column(db.String(255), nullable=True) # I8 Cove customer/partner name
datasource_types = db.Column(db.String(255), nullable=True) # I78 active datasource label
# Last known status
last_status_code = db.Column(db.Integer, nullable=True) # D09F00
last_run_at = db.Column(db.DateTime, nullable=True) # D09F15 (converted from Unix ts)
colorbar_28d = db.Column(db.String(64), nullable=True) # D09F08
# Link to a Backupchecks job (NULL = unmatched, needs review)
job_id = db.Column(db.Integer, db.ForeignKey("jobs.id"), nullable=True)
first_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
last_seen_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
job = db.relationship("Job", backref=db.backref("cove_account", uselist=False))
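The upsert-and-deduplicate flow the docstring describes can be sketched with an in-memory model (field names follow the model above and payload keys like `AccountId`/`I1`/`I8` follow the column comments, but the helper functions themselves are illustrative, not the importer's actual code):

```python
from datetime import datetime, timezone

def build_external_id(account_id: int, run_ts: int) -> str:
    # Format noted on JobRun.external_id: "cove-{account_id}-{run_ts}".
    return f"cove-{account_id}-{run_ts}"

def upsert_account(staging: dict, payload: dict) -> dict:
    """Insert or refresh one account; job_id and first_seen_at survive updates."""
    now = datetime.now(timezone.utc)
    acc = staging.get(payload["AccountId"])
    if acc is None:
        acc = {"job_id": None, "first_seen_at": now}
        staging[payload["AccountId"]] = acc
    acc["account_name"] = payload.get("I1")
    acc["customer_name"] = payload.get("I8")
    acc["last_seen_at"] = now
    return acc

def should_create_run(seen_external_ids: set, acc: dict, account_id: int, run_ts: int) -> bool:
    # JobRuns are created only for linked accounts, and only once per session.
    return acc["job_id"] is not None and build_external_id(account_id, run_ts) not in seen_external_ids
```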
class JobRunReviewEvent(db.Model):
__tablename__ = "job_run_review_events"
@@ -567,6 +629,23 @@ class FeedbackReply(db.Model):
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
class FeedbackAttachment(db.Model):
__tablename__ = "feedback_attachments"
id = db.Column(db.Integer, primary_key=True)
feedback_item_id = db.Column(
db.Integer, db.ForeignKey("feedback_items.id", ondelete="CASCADE"), nullable=False
)
feedback_reply_id = db.Column(
db.Integer, db.ForeignKey("feedback_replies.id", ondelete="CASCADE"), nullable=True
)
filename = db.Column(db.String(255), nullable=False)
file_data = db.Column(db.LargeBinary, nullable=False)
mime_type = db.Column(db.String(64), nullable=False)
file_size = db.Column(db.Integer, nullable=False)
created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)
class NewsItem(db.Model):
__tablename__ = "news_items"
@@ -708,4 +787,4 @@ class ReportObjectSummary(db.Model):
report = db.relationship(
"ReportDefinition",
backref=db.backref("object_summaries", lazy="dynamic", cascade="all, delete-orphan"),
)

View File

@@ -24,6 +24,10 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
- SSL Certificate Renewal (informational)
Subject: '3CX Notification: SSL Certificate Renewal - <host>'
Body contains an informational message about the renewal.
- Update Successful (informational)
Subject: '3CX Notification: Update Successful - <host>'
Body confirms update completion and healthy services.
"""
subject = (msg.subject or "").strip()
if not subject:
@@ -38,11 +42,16 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
subject,
flags=re.IGNORECASE,
)
m_update = re.match(
r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$",
subject,
flags=re.IGNORECASE,
)
if not m_backup and not m_ssl and not m_update:
return False, {}, []
job_name = (m_backup or m_ssl or m_update).group(1).strip()
body = (getattr(msg, "text_body", None) or getattr(msg, "body", None) or "")
if not body:
@@ -60,6 +69,17 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
}
return True, result, []
# Update successful: store as tracked informational run
if m_update:
result = {
"backup_software": "3CX",
"backup_type": "Update",
"job_name": job_name,
"overall_status": "Success",
"overall_message": body or None,
}
return True, result, []
# Backup complete
backup_file = None
m_file = re.search(r"^\s*Backup\s+name\s*:\s*(.+?)\s*$", body, flags=re.IGNORECASE | re.MULTILINE)
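The new subject pattern can be exercised in isolation; the host below is an example value, not a real system:

```python
import re

# Same pattern the parser uses for the new informational notification.
UPDATE_RE = re.compile(
    r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$",
    flags=re.IGNORECASE,
)

m = UPDATE_RE.match("3CX Notification: Update Successful - pbx.example.com")
job_name = m.group(1).strip() if m else None  # -> "pbx.example.com"
```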

View File

@@ -11,7 +11,7 @@
class="form-control"
id="username"
name="username"
value="{{ username or '' }}"
required
/>
</div>
@@ -39,6 +39,12 @@
<a class="btn btn-link" href="{{ url_for('auth.password_reset_request') }}">Forgot password?</a>
</div>
</form>
{% if entra_sso_enabled %}
<div class="my-3"><hr /></div>
<a class="btn btn-outline-secondary w-100" href="{{ url_for('auth.entra_login') }}">
Sign in with Microsoft
</a>
{% endif %}
</div>
</div>
{% endblock %}

View File

@@ -0,0 +1,121 @@
{% extends "documentation/base.html" %}
{% block doc_content %}
<h1>Microsoft Entra SSO</h1>
<p>Use Microsoft Entra ID (Azure AD) to let users sign in with their Microsoft account.</p>
<div class="doc-callout doc-callout-warning">
<strong>Status: Untested in Backupchecks.</strong>
This SSO implementation has not yet been end-to-end validated in Backupchecks itself.
Treat this page as implementation guidance for future rollout, not as a confirmed production setup.
</div>
<div class="doc-callout doc-callout-info">
<strong>Scope:</strong> this page explains the setup for Backupchecks and Microsoft Entra.
It does not replace your internal identity/security policies.
</div>
<h2>Prerequisites</h2>
<ul>
<li>Admin access to your Microsoft Entra tenant.</li>
<li>Admin access to Backupchecks <strong>Settings → Integrations</strong>.</li>
<li>A stable HTTPS URL for Backupchecks (recommended for production).</li>
</ul>
<h2>Step 1: Register an app in Microsoft Entra</h2>
<ol>
<li>Open <strong>Microsoft Entra admin center</strong> → <strong>App registrations</strong>.</li>
<li>Create a new registration (single-tenant is typical for internal use).</li>
<li>Set a name, for example <code>Backupchecks SSO</code>.</li>
<li>After creation, copy:
<ul>
<li><strong>Application (client) ID</strong></li>
<li><strong>Directory (tenant) ID</strong></li>
</ul>
</li>
</ol>
<h2>Step 2: Configure redirect URI</h2>
<ol>
<li>In the app registration, open <strong>Authentication</strong>.</li>
<li>Add a <strong>Web</strong> redirect URI:
<ul>
<li><code>https://your-backupchecks-domain/auth/entra/callback</code></li>
</ul>
</li>
<li>Save the authentication settings.</li>
</ol>
<h2>Step 3: Create client secret</h2>
<ol>
<li>Open <strong>Certificates &amp; secrets</strong> in the app registration.</li>
<li>Create a new client secret.</li>
<li>Copy the secret value immediately (it is shown only once).</li>
</ol>
<h2>Step 4: Configure Backupchecks</h2>
<ol>
<li>Open <strong>Settings → Integrations → Microsoft Entra SSO</strong>.</li>
<li>Enable <strong>Microsoft sign-in</strong>.</li>
<li>Fill in:
<ul>
<li><strong>Tenant ID</strong></li>
<li><strong>Client ID</strong></li>
<li><strong>Client Secret</strong></li>
<li><strong>Redirect URI</strong> (optional override; leave empty to use the callback URL automatically)</li>
<li><strong>Allowed domain/tenant</strong> (optional restriction)</li>
<li><strong>Allowed Entra Group Object ID(s)</strong> (optional but recommended)</li>
</ul>
</li>
<li>Optional: enable <strong>Auto-provision unknown users as Viewer</strong>.</li>
<li>Save settings.</li>
</ol>
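For orientation, the settings above feed a standard OAuth 2.0 authorization-code request against the Microsoft identity platform v2.0 endpoint. A sketch of building the first redirect (the scopes and parameters are the common defaults for this flow, not confirmed Backupchecks internals, and the IDs are example values):

```python
from urllib.parse import urlencode

def build_authorize_url(tenant_id: str, client_id: str, redirect_uri: str, state: str) -> str:
    # Standard Microsoft identity platform v2.0 authorize request.
    base = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/authorize"
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "response_mode": "query",
        "scope": "openid profile email",
        "state": state,  # anti-CSRF value, verified again on the callback
    }
    return f"{base}?{urlencode(params)}"

url = build_authorize_url(
    "11111111-2222-3333-4444-555555555555",  # example tenant ID
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",  # example client ID
    "https://your-backupchecks-domain/auth/entra/callback",
    "random-state",
)
```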
<h2>Security Group Restriction (recommended)</h2>
<p>You can enforce that only members of one or more specific Entra security groups can sign in.</p>
<ol>
<li>Create or choose a security group in Entra (for example <code>Backupchecks-Users</code>).</li>
<li>Add the allowed users to that group.</li>
<li>Copy the group <strong>Object ID</strong> (not display name).</li>
<li>Paste one or more group object IDs in:
<ul>
<li><strong>Settings → Integrations → Microsoft Entra SSO → Allowed Entra Group Object ID(s)</strong></li>
</ul>
</li>
<li>In the Entra app registration, configure <strong>Token configuration</strong> to include the <code>groups</code> claim in ID tokens.</li>
</ol>
<div class="doc-callout doc-callout-warning">
<strong>Important:</strong> if users are members of many groups, Entra may return a "group overage" token without an inline
<code>groups</code> list. In that case Backupchecks cannot verify membership, and login is blocked by design.
</div>
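The fail-closed behaviour described in the callout can be sketched as a pure claim check (claim names follow the Microsoft ID-token format; the function itself is illustrative, not Backupchecks' actual code):

```python
def is_group_allowed(id_token_claims: dict, allowed_group_ids: set) -> bool:
    """Allow login only when the token proves membership in an allowed group.

    A "group overage" token carries _claim_names/_claim_sources instead of an
    inline groups list; membership cannot be verified locally, so deny.
    """
    groups = id_token_claims.get("groups")
    if not isinstance(groups, list):  # missing claim or overage marker: fail closed
        return False
    return any(g in allowed_group_ids for g in groups)
```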
<h2>Step 5: Test sign-in</h2>
<ol>
<li>Open <strong>/auth/login</strong> in a private/incognito browser session.</li>
<li>Click <strong>Sign in with Microsoft</strong>.</li>
<li>Authenticate with an allowed account.</li>
<li>Confirm you are redirected back into Backupchecks.</li>
</ol>
<h2>User mapping behavior</h2>
<ul>
<li>Backupchecks first tries to match the Entra user to a local user by username/email.</li>
<li>If no match exists:
<ul>
<li>With auto-provision disabled: login is rejected.</li>
<li>With auto-provision enabled: a new local user is created with <strong>Viewer</strong> role.</li>
</ul>
</li>
</ul>
<h2>Troubleshooting</h2>
<ul>
<li><strong>Redirect URI mismatch:</strong> ensure the redirect URI in the Entra app exactly matches the Backupchecks callback URI.</li>
<li><strong>SSO button not visible:</strong> check that SSO is enabled and Tenant/Client/Secret are saved.</li>
<li><strong>Account not allowed:</strong> verify tenant/domain restriction in <em>Allowed domain/tenant</em>.</li>
<li><strong>Group restricted login fails:</strong> verify group object IDs and ensure the ID token includes a <code>groups</code> claim.</li>
<li><strong>No local user mapping:</strong> create a matching local user or enable auto-provision.</li>
</ul>
{% endblock %}

View File

@@ -95,6 +95,11 @@
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.inbox') }}">Inbox</a>
</li>
{% if system_settings and system_settings.cove_enabled and active_role in ('admin', 'operator') %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.cove_accounts') }}">Cove Accounts</a>
</li>
{% endif %}
{% if active_role == 'admin' %}
<li class="nav-item">
<a class="nav-link" href="{{ url_for('main.admin_all_mails') }}">All Mail</a>
@@ -157,6 +162,18 @@
</li>
{% endif %}
</ul>
<form method="get" action="{{ url_for('main.search_page') }}" class="d-flex me-3 mb-2 mb-lg-0" role="search" autocomplete="off">
<input
class="form-control form-control-sm me-2"
type="search"
name="q"
placeholder="Search"
aria-label="Search"
value="{{ request.args.get('q','') if request.path == url_for('main.search_page') else '' }}"
style="min-width: 220px;"
/>
<button class="btn btn-outline-secondary btn-sm" type="submit">Search</button>
</form>
<span class="navbar-text me-3">
<a class="text-decoration-none" href="{{ url_for('main.user_settings') }}">
{{ current_user.username }} ({{ active_role }})
@@ -361,4 +378,4 @@
})();
</script>
</body>
</html>

View File

@@ -0,0 +1,241 @@
{% extends "layout/base.html" %}
{% block content %}
<div class="d-flex justify-content-between align-items-center mb-3">
<h2 class="mb-0">Cove Accounts</h2>
<div class="d-flex gap-2">
{% if settings.cove_partner_id %}
<form method="post" action="{{ url_for('main.settings_cove_run_now') }}" class="mb-0">
<button type="submit" class="btn btn-sm btn-outline-primary">Run import now</button>
</form>
{% endif %}
<a href="{{ url_for('main.settings', section='integrations') }}" class="btn btn-sm btn-outline-secondary">Cove settings</a>
</div>
</div>
{% if settings.cove_last_import_at %}
<p class="text-muted small mb-3">Last import: {{ settings.cove_last_import_at|local_datetime }}</p>
{% else %}
<p class="text-muted small mb-3">No import has run yet. Click <strong>Run import now</strong> to fetch Cove accounts.</p>
{% endif %}
{# ── Unmatched accounts (need a job) ─────────────────────────────────────── #}
{% if unmatched %}
<h4 class="mb-2">Unmatched <span class="badge bg-warning text-dark">{{ unmatched|length }}</span></h4>
<p class="text-muted small mb-3">These accounts have no linked job yet. Create a new job or link to an existing one.</p>
<div class="table-responsive mb-4">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>Backup software</th>
<th>Type</th>
<th>Job name</th>
<th>Computer</th>
<th>Customer (Cove)</th>
<th>Datasources</th>
<th>Last status</th>
<th>Last run</th>
<th>First seen</th>
<th></th>
</tr>
</thead>
<tbody>
{% for acc in unmatched %}
<tr>
<td>{{ acc.derived_backup_software }}</td>
<td>{{ acc.derived_backup_type }}</td>
<td>{{ acc.derived_job_name }}</td>
<td class="text-muted small">{{ acc.computer_name or '—' }}</td>
<td>{{ acc.customer_name or '—' }}</td>
<td class="text-muted small">{{ acc.datasource_display }}</td>
<td>
{% if acc.last_status_code is not none %}
<span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
{{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
</span>
{% else %}—{% endif %}
</td>
<td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
<td class="text-muted small">{{ acc.first_seen_at|local_datetime }}</td>
<td>
<button class="btn btn-sm btn-primary"
data-bs-toggle="modal"
data-bs-target="#link-modal-{{ acc.id }}">
Link / Create job
</button>
</td>
</tr>
{# Link modal #}
<div class="modal fade" id="link-modal-{{ acc.id }}" tabindex="-1">
<div class="modal-dialog">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Link: {{ acc.account_name or acc.account_id }}</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<p class="text-muted small mb-3">
Cove account <strong>{{ acc.account_id }}</strong>
customer: <strong>{{ acc.customer_name or '?' }}</strong>
</p>
<ul class="nav nav-tabs mb-3" id="tab-{{ acc.id }}" role="tablist">
<li class="nav-item" role="presentation">
<button class="nav-link active" data-bs-toggle="tab"
data-bs-target="#create-{{ acc.id }}" type="button">
Create new job
</button>
</li>
<li class="nav-item" role="presentation">
<button class="nav-link" data-bs-toggle="tab"
data-bs-target="#existing-{{ acc.id }}" type="button">
Link to existing job
</button>
</li>
</ul>
<div class="tab-content">
{# Tab 1: Create new job #}
<div class="tab-pane fade show active" id="create-{{ acc.id }}">
<form method="post" action="{{ url_for('main.cove_account_link', cove_account_db_id=acc.id) }}">
<input type="hidden" name="action" value="create" />
<div class="mb-3">
<label class="form-label">Customer <span class="text-danger">*</span></label>
<select class="form-select" name="customer_id" required>
<option value="">Select customer…</option>
{% for c in customers %}
<option value="{{ c.id }}"
{% if acc.customer_name and acc.customer_name.lower() == c.name.lower() %}selected{% endif %}>
{{ c.name }}
</option>
{% endfor %}
</select>
</div>
<div class="mb-3">
<label class="form-label">Job name</label>
<input type="text" class="form-control" name="job_name"
value="{{ acc.derived_job_name }}" />
<div class="form-text">Defaults to the Cove account name.</div>
</div>
<div class="mb-3">
<label class="form-label">Backup type</label>
<input type="text" class="form-control" name="backup_type"
value="{{ acc.derived_backup_type }}" />
<div class="form-text">Derived from Cove datasource profile.</div>
</div>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Create job &amp; link</button>
</div>
</form>
</div>
{# Tab 2: Link to existing job #}
<div class="tab-pane fade" id="existing-{{ acc.id }}">
<form method="post" action="{{ url_for('main.cove_account_link', cove_account_db_id=acc.id) }}">
<input type="hidden" name="action" value="link" />
<div class="mb-3">
<label class="form-label">Job <span class="text-danger">*</span></label>
<select class="form-select" name="job_id" required>
<option value="">Select job…</option>
{% for j in jobs %}
<option value="{{ j.id }}">
{{ j.customer.name ~ ' ' if j.customer else '' }}{{ j.backup_software }} / {{ j.job_name }}
</option>
{% endfor %}
</select>
</div>
<div class="d-flex justify-content-end gap-2">
<button type="button" class="btn btn-secondary" data-bs-dismiss="modal">Cancel</button>
<button type="submit" class="btn btn-primary">Link to job</button>
</div>
</form>
</div>
</div>{# /tab-content #}
</div>
</div>
</div>
</div>{# /modal #}
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="alert alert-success mb-4">
<strong>All accounts matched.</strong>
{% if not settings.cove_last_import_at %}
Run an import first to see Cove accounts here.
{% else %}
No unmatched Cove accounts.
{% endif %}
</div>
{% endif %}
{# ── Matched accounts ────────────────────────────────────────────────────── #}
{% if matched %}
<h4 class="mb-2">Linked <span class="badge bg-success">{{ matched|length }}</span></h4>
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">
<tr>
<th>Backup software</th>
<th>Type</th>
<th>Job name</th>
<th>Computer</th>
<th>Customer (Cove)</th>
<th>Datasources</th>
<th>Last status</th>
<th>Last run</th>
<th>Linked job</th>
<th></th>
</tr>
</thead>
<tbody>
{% for acc in matched %}
<tr>
<td>{{ acc.derived_backup_software }}</td>
<td>{{ acc.derived_backup_type }}</td>
<td>{{ acc.derived_job_name }}</td>
<td class="text-muted small">{{ acc.computer_name or '—' }}</td>
<td>{{ acc.customer_name or '—' }}</td>
<td class="text-muted small">{{ acc.datasource_display }}</td>
<td>
{% if acc.last_status_code is not none %}
<span class="badge bg-{{ STATUS_CLASS.get(acc.last_status_code, 'secondary') }}">
{{ STATUS_LABELS.get(acc.last_status_code, acc.last_status_code) }}
</span>
{% else %}—{% endif %}
</td>
<td class="text-muted small">{{ acc.last_run_at|local_datetime if acc.last_run_at else '—' }}</td>
<td>
{% if acc.job %}
<a href="{{ url_for('main.job_detail', job_id=acc.job.id) }}">
{{ acc.job.customer.name ~ ' ' if acc.job.customer else '' }}{{ acc.job.job_name }}
</a>
{% else %}—{% endif %}
</td>
<td>
<form method="post"
action="{{ url_for('main.cove_account_unlink', cove_account_db_id=acc.id) }}"
onsubmit="return confirm('Remove link between this Cove account and the job?');"
class="mb-0">
<button type="submit" class="btn btn-sm btn-outline-secondary">Unlink</button>
</form>
</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% endif %}
{% if not unmatched and not matched %}
<div class="alert alert-info">
No Cove accounts found. Run an import first via the button above or via Settings → Integrations → Cove.
</div>
{% endif %}
{% endblock %}

View File

@@ -15,6 +15,10 @@
<form method="post" action="{{ url_for('main.customers_import') }}" enctype="multipart/form-data" class="d-flex align-items-center gap-2 mb-0">
<input type="file" name="file" accept=".csv,text/csv" class="form-control form-control-sm" required style="max-width: 420px;" />
<div class="form-check mb-0">
<input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_customers" name="include_autotask_ids" />
<label class="form-check-label small" for="include_autotask_ids_customers">Include Autotask IDs</label>
</div>
<button type="submit" class="btn btn-outline-secondary btn-sm" style="white-space: nowrap;">Import CSV</button>
</form>
@@ -45,7 +49,11 @@
{% if customers %}
{% for c in customers %}
<tr>
<td>
<a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="link-primary text-decoration-none">
{{ c.name }}
</a>
</td>
<td>
{% if c.active %}
<span class="badge bg-success">Active</span>

View File

@@ -4,6 +4,9 @@
<h2 class="mb-3">Daily Jobs</h2>
<form method="get" class="row g-3 mb-3">
{% if q %}
<input type="hidden" name="q" value="{{ q }}" />
{% endif %}
<div class="col-auto">
<label for="dj_date" class="form-label">Date</label>
<input
@@ -665,7 +668,7 @@ if (tStatus) tStatus.textContent = '';
});
}
function attachDailyJobsHandlers() {
var rows = document.querySelectorAll(".daily-job-row");
if (!rows.length) {
return;
@@ -771,9 +774,43 @@ if (tStatus) tStatus.textContent = '';
});
}
function autoOpenJobFromQuery() {
try {
var params = new URLSearchParams(window.location.search || "");
var openJobId = (params.get("open_job_id") || "").trim();
if (!openJobId) {
return;
}
var rows = document.querySelectorAll(".daily-job-row");
var targetRow = null;
rows.forEach(function (row) {
if ((row.getAttribute("data-job-id") || "") === openJobId) {
targetRow = row;
}
});
if (!targetRow) {
return;
}
targetRow.click();
params.delete("open_job_id");
var nextQuery = params.toString();
var nextUrl = window.location.pathname + (nextQuery ? ("?" + nextQuery) : "");
if (window.history && window.history.replaceState) {
window.history.replaceState({}, document.title, nextUrl);
}
} catch (e) {
// no-op
}
}
document.addEventListener("DOMContentLoaded", function () {
bindInlineCreateForms();
attachDailyJobsHandlers();
autoOpenJobFromQuery();
});
})();
</script>

View File

@@ -34,6 +34,16 @@
<div class="col-6 col-md-3">
<button class="btn btn-outline-secondary" type="submit">Apply</button>
</div>
{% if active_role == 'admin' %}
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" name="show_deleted" value="1" id="show_deleted" {% if show_deleted %}checked{% endif %} onchange="this.form.submit()">
<label class="form-check-label" for="show_deleted">
Show deleted items
</label>
</div>
</div>
{% endif %}
</form>
<div class="table-responsive">
@@ -46,6 +56,9 @@
<th style="width: 160px;">Component</th>
<th style="width: 120px;">Status</th>
<th style="width: 170px;">Created</th>
{% if active_role == 'admin' and show_deleted %}
<th style="width: 140px;">Actions</th>
{% endif %}
</tr>
</thead>
<tbody>
@ -56,20 +69,30 @@
{% endif %}
{% for i in items %}
<tr {% if i.is_deleted %}style="opacity: 0.6; background-color: var(--bs-secondary-bg);"{% endif %}>
<td>
{% if not i.is_deleted %}
<form method="post" action="{{ url_for('main.feedback_vote', item_id=i.id) }}">
<input type="hidden" name="ref" value="list" />
<button type="submit" class="btn btn-sm {% if i.user_voted %}btn-success{% else %}btn-outline-secondary{% endif %}">
+ {{ i.vote_count }}
</button>
</form>
{% else %}
<span class="text-muted">+ {{ i.vote_count }}</span>
{% endif %}
</td>
<td>
<a href="{{ url_for('main.feedback_detail', item_id=i.id) }}">{{ i.title }}</a>
{% if i.is_deleted %}
<span class="badge text-bg-dark ms-2">Deleted</span>
{% endif %}
{% if i.created_by %}
<div class="text-muted" style="font-size: 0.85rem;">by {{ i.created_by }}</div>
{% endif %}
{% if i.is_deleted and i.deleted_at %}
<div class="text-muted" style="font-size: 0.85rem;">Deleted {{ i.deleted_at|local_datetime }}</div>
{% endif %}
</td>
<td>
{% if i.item_type == 'bug' %}
@ -90,6 +113,15 @@
<div>{{ i.created_at|local_datetime }}</div>
<div class="text-muted" style="font-size: 0.85rem;">Updated {{ i.updated_at|local_datetime }}</div>
</td>
{% if active_role == 'admin' and show_deleted %}
<td>
{% if i.is_deleted %}
<form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=i.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
<button type="submit" class="btn btn-sm btn-danger">Permanent Delete</button>
</form>
{% endif %}
</td>
{% endif %}
</tr>
{% endfor %}
</tbody>
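The admin-only "Show deleted items" toggle in this template implies a visibility rule on the backend. A minimal sketch of that rule, with plain dicts standing in for the real ORM query (an assumption, since the route code is not part of this diff):

```python
def visible_items(items, active_role, show_deleted):
    """Deleted items are listed only for admins who ticked 'Show deleted items'."""
    if active_role == "admin" and show_deleted:
        return items
    return [i for i in items if not i.get("is_deleted")]

items = [
    {"id": 1, "is_deleted": False},
    {"id": 2, "is_deleted": True},
]
# Regular users (or admins without the toggle) never see deleted rows.
assert [i["id"] for i in visible_items(items, "viewer", False)] == [1]
assert [i["id"] for i in visible_items(items, "admin", True)] == [1, 2]
```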


@ -15,6 +15,9 @@
{% else %}
<span class="badge text-bg-warning">Open</span>
{% endif %}
{% if item.deleted_at %}
<span class="badge text-bg-dark">Deleted</span>
{% endif %}
<span class="ms-2">by {{ created_by_name }}</span>
</div>
</div>
@ -29,6 +32,23 @@
<div class="mb-2"><strong>Component:</strong> {{ item.component }}</div>
{% endif %}
<div style="white-space: pre-wrap;">{{ item.description }}</div>
{% if item_attachments %}
<div class="mt-3">
<strong>Screenshots:</strong>
<div class="d-flex flex-wrap gap-2 mt-2">
{% for att in item_attachments %}
<a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
<img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
alt="{{ att.filename }}"
class="img-thumbnail"
style="max-height: 200px; max-width: 300px; cursor: pointer;"
title="Click to view full size" />
</a>
{% endfor %}
</div>
</div>
{% endif %}
</div>
<div class="card-footer d-flex justify-content-between align-items-center">
<div class="text-muted" style="font-size: 0.9rem;">
@ -63,6 +83,22 @@
</span>
</div>
<div style="white-space: pre-wrap;">{{ r.message }}</div>
{% if r.id in reply_attachments_map %}
<div class="mt-2">
<div class="d-flex flex-wrap gap-2">
{% for att in reply_attachments_map[r.id] %}
<a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
<img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
alt="{{ att.filename }}"
class="img-thumbnail"
style="max-height: 150px; max-width: 200px; cursor: pointer;"
title="Click to view full size" />
</a>
{% endfor %}
</div>
</div>
{% endif %}
</div>
{% endfor %}
</div>
@ -76,10 +112,15 @@
<div class="card-body">
<h5 class="card-title mb-3">Add reply</h5>
{% if item.status == 'open' %}
<form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}" enctype="multipart/form-data">
<div class="mb-2">
<textarea class="form-control" name="message" rows="4" required></textarea>
</div>
<div class="mb-2">
<label class="form-label">Screenshots (optional)</label>
<input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
<div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
</div>
<button type="submit" class="btn btn-primary">Post reply</button>
</form>
{% else %}
@ -95,21 +136,32 @@
<h2 class="h6">Actions</h2>
{% if active_role == 'admin' %}
{% if item.deleted_at %}
{# Item is deleted - show permanent delete option #}
<div class="alert alert-warning mb-2" style="font-size: 0.9rem;">
This item is deleted.
</div>
<form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=item.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
<button type="submit" class="btn btn-danger w-100">Permanent Delete</button>
</form>
{% else %}
{# Item is not deleted - show normal actions #}
{% if item.status == 'resolved' %}
<form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
<input type="hidden" name="action" value="reopen" />
<button type="submit" class="btn btn-outline-secondary w-100">Reopen</button>
</form>
{% else %}
<form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
<input type="hidden" name="action" value="resolve" />
<button type="submit" class="btn btn-success w-100">Mark as resolved</button>
</form>
{% endif %}
<form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
<button type="submit" class="btn btn-danger w-100">Delete</button>
</form>
{% endif %}
{% else %}
<div class="text-muted">Only administrators can resolve or delete items.</div>
{% endif %}


@ -6,7 +6,7 @@
<a class="btn btn-outline-secondary" href="{{ url_for('main.feedback_page') }}">Back</a>
</div>
<form method="post" enctype="multipart/form-data" class="card">
<div class="card-body">
<div class="row g-3">
<div class="col-12 col-md-3">
@ -28,6 +28,11 @@
<label class="form-label">Component (optional)</label>
<input type="text" name="component" class="form-control" />
</div>
<div class="col-12">
<label class="form-label">Screenshots (optional)</label>
<input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
<div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
</div>
</div>
</div>
<div class="card-footer d-flex justify-content-end">
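The upload fields in this form advertise PNG/JPG/GIF/WEBP screenshots of at most 5 MB each. A hedged sketch of the matching server-side check (the helper name and error strings are invented; the actual route code is not shown in this diff):

```python
ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg", "gif", "webp"}
MAX_SCREENSHOT_BYTES = 5 * 1024 * 1024  # 5MB, matching the form hint

def screenshot_error(filename, size_bytes):
    """Return an error string, or None when the upload is acceptable."""
    ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if ext not in ALLOWED_EXTENSIONS:
        return "unsupported file type: " + (ext or "(none)")
    if size_bytes > MAX_SCREENSHOT_BYTES:
        return "file too large (max 5MB)"
    return None
```

The same helper would be reused by the reply form, which accepts the identical set of types and size limit.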


@ -14,12 +14,12 @@
<div class="d-flex justify-content-between align-items-center my-2">
<div>
{% if has_prev %}
<a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1, q=q) }}">Previous</a>
{% else %}
<button class="btn btn-outline-secondary btn-sm" disabled>Previous</button>
{% endif %}
{% if has_next %}
<a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1, q=q) }}">Next</a>
{% else %}
<button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button>
{% endif %}
@ -73,7 +73,7 @@
<tr>
{% if can_bulk_delete %}
<th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="inbox_select_all" autocomplete="off" />
</th>
{% endif %}
<th scope="col">From</th>
@ -93,7 +93,7 @@
<tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
{% if can_bulk_delete %}
<td onclick="event.stopPropagation();">
<input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" autocomplete="off" />
</td>
{% endif %}
<td>{{ row.from_address }}</td>


@ -59,6 +59,34 @@
</div>
{% endif %}
{% if cove_enabled and can_manage_jobs %}
<div class="card mb-3">
<div class="card-header">Cove Integration</div>
<div class="card-body">
<form method="post" action="{{ url_for('main.job_set_cove_account', job_id=job.id) }}" class="row g-2 align-items-end mb-0">
<div class="col-auto">
<label for="cove_account_id" class="form-label mb-1">Cove Account ID</label>
<input type="number" class="form-control form-control-sm" id="cove_account_id" name="cove_account_id"
value="{{ job.cove_account_id or '' }}" placeholder="e.g. 4504627" style="width: 180px;" />
</div>
<div class="col-auto">
<button type="submit" class="btn btn-sm btn-primary">Save</button>
{% if job.cove_account_id %}
<button type="submit" name="cove_account_id" value="" class="btn btn-sm btn-outline-secondary ms-1">Clear</button>
{% endif %}
</div>
<div class="col-auto text-muted small">
{% if job.cove_account_id %}
Linked to Cove account <strong>{{ job.cove_account_id }}</strong>
{% else %}
Not linked to a Cove account; runs will not be imported automatically.
{% endif %}
</div>
</form>
</div>
</div>
{% endif %}
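The Clear button above submits the same form with an empty `cove_account_id`, so one handler can cover both linking and unlinking. A sketch of that branch (the helper is invented for illustration; the real Flask route is not part of this diff):

```python
def parse_cove_account_id(form_value):
    """Empty or whitespace-only input unlinks the job; digits link it."""
    value = (form_value or "").strip()
    if not value:
        return None  # clear the link
    if not value.isdigit():
        raise ValueError("Cove account ID must be numeric")
    return int(value)

# The route would then assign job.cove_account_id = parse_cove_account_id(...)
# and commit, treating None as "unlink".
```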
<h3 class="mt-4 mb-3">Job history</h3>
<div class="table-responsive">
@ -287,6 +315,60 @@
(function () {
var currentRunId = null;
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
// Method 1: Modern Clipboard API (works in most browsers with HTTPS)
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(text)
.then(function () {
showCopyFeedback(button);
})
.catch(function () {
// Fallback to method 2 if clipboard API fails
fallbackCopy(text, button);
});
} else {
// Method 2: Legacy execCommand method
fallbackCopy(text, button);
}
}
function fallbackCopy(text, button) {
var textarea = document.createElement('textarea');
textarea.value = text;
textarea.style.position = 'fixed';
textarea.style.opacity = '0';
textarea.style.top = '0';
textarea.style.left = '0';
document.body.appendChild(textarea);
textarea.focus();
textarea.select();
try {
var successful = document.execCommand('copy');
if (successful) {
showCopyFeedback(button);
} else {
// If execCommand fails, use prompt as last resort
window.prompt('Copy ticket number:', text);
}
} catch (err) {
// If all else fails, show prompt
window.prompt('Copy ticket number:', text);
}
document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
if (!button) return;
var original = button.textContent;
button.textContent = '✓';
setTimeout(function () {
button.textContent = original;
}, 800);
}
function apiJson(url, opts) {
opts = opts || {};
opts.headers = opts.headers || {};
@ -319,12 +401,14 @@
html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
tickets.forEach(function (t) {
var status = t.resolved_at ? 'Resolved' : 'Active';
var ticketCode = (t.ticket_code || '').toString();
html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
'<div class="d-flex align-items-start justify-content-between gap-2">' +
'<div class="flex-grow-1 min-w-0">' +
'<div class="text-truncate">' +
'<span class="me-1" title="Ticket">🎫</span>' +
'<span class="fw-semibold">' + escapeHtml(ticketCode) + '</span>' +
'<button type="button" class="btn btn-sm btn-outline-secondary ms-2 py-0 px-1" title="Copy ticket number" data-action="copy-ticket" data-code="' + escapeHtml(ticketCode) + '">📋</button>' +
'<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
'</div>' +
'</div>' +
@ -371,7 +455,16 @@
ev.preventDefault();
var action = btn.getAttribute('data-action');
var id = btn.getAttribute('data-id');
if (!action) return;
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
copyToClipboard(code, btn);
return;
}
if (!id) return;
if (action === 'resolve-ticket') {
if (!confirm('Mark ticket as resolved?')) return;
apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})


@ -2,6 +2,16 @@
{% block content %}
<h2 class="mb-3">Jobs</h2>
{% if selected_customer_id %}
<div class="alert alert-info d-flex justify-content-between align-items-center py-2" role="alert">
<span>
Filtered on customer:
<strong>{{ selected_customer_name or ('#' ~ selected_customer_id) }}</strong>
</span>
<a href="{{ url_for('main.jobs') }}" class="btn btn-sm btn-outline-primary">Clear filter</a>
</div>
{% endif %}
<div class="table-responsive">
<table class="table table-sm table-hover align-middle">
<thead class="table-light">


@ -422,7 +422,10 @@ function loadRawData() {
function loadReports() {
setTableLoading('Loading…');
var params = new URLSearchParams(window.location.search || '');
var q = (params.get('q') || '').trim();
var apiUrl = '/api/reports' + (q ? ('?q=' + encodeURIComponent(q)) : '');
fetch(apiUrl, { credentials: 'same-origin' })
.then(function (r) { return r.json(); })
.then(function (data) {
renderTable((data && data.items) ? data.items : []);
@ -521,4 +524,4 @@ function loadRawData() {
</script>
{% endblock %}


@ -48,7 +48,7 @@
<thead class="table-light">
<tr>
<th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="rc_select_all" autocomplete="off" />
</th>
<th scope="col">Customer</th>
<th scope="col">Backup</th>
@ -63,7 +63,7 @@
{% for r in rows %}
<tr class="rc-job-row" data-job-id="{{ r.job_id }}" style="cursor: pointer;">
<td onclick="event.stopPropagation();">
<input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" autocomplete="off" />
</td>
<td>{{ r.customer_name }}</td>
<td>{{ r.backup_software }}</td>
@ -447,6 +447,60 @@ function escapeHtml(s) {
.replace(/'/g, "&#39;");
}
// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
// Method 1: Modern Clipboard API (works in most browsers with HTTPS)
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(text)
.then(function () {
showCopyFeedback(button);
})
.catch(function () {
// Fallback to method 2 if clipboard API fails
fallbackCopy(text, button);
});
} else {
// Method 2: Legacy execCommand method
fallbackCopy(text, button);
}
}
function fallbackCopy(text, button) {
var textarea = document.createElement('textarea');
textarea.value = text;
textarea.style.position = 'fixed';
textarea.style.opacity = '0';
textarea.style.top = '0';
textarea.style.left = '0';
document.body.appendChild(textarea);
textarea.focus();
textarea.select();
try {
var successful = document.execCommand('copy');
if (successful) {
showCopyFeedback(button);
} else {
// If execCommand fails, use prompt as last resort
window.prompt('Copy ticket number:', text);
}
} catch (err) {
// If all else fails, show prompt
window.prompt('Copy ticket number:', text);
}
document.body.removeChild(textarea);
}
function showCopyFeedback(button) {
if (!button) return;
var original = button.textContent;
button.textContent = '✓';
setTimeout(function () {
button.textContent = original;
}, 800);
}
function getSelectedJobIds() {
var cbs = table.querySelectorAll('tbody .rc_row_cb');
var ids = [];
@ -840,20 +894,7 @@ table.addEventListener('change', function (e) {
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
copyToClipboard(code, btn);
navigator.clipboard.writeText(code)
.then(function () {
var original = btn.textContent;
btn.textContent = '✓';
setTimeout(function () { btn.textContent = original; }, 800);
})
.catch(function () {
// Fallback: select/copy via prompt
window.prompt('Copy ticket number:', code);
});
} else {
window.prompt('Copy ticket number:', code);
}
return;
}


@ -0,0 +1,75 @@
{% extends "layout/base.html" %}
{% block content %}
<h2 class="mb-3">Search</h2>
{% if query %}
<p class="text-muted mb-3">
Query: <strong>{{ query }}</strong> | Total hits: <strong>{{ total_hits }}</strong>
</p>
{% else %}
<div class="alert alert-secondary py-2">
Enter a search term in the top navigation bar.
</div>
{% endif %}
{% for section in sections %}
<div class="card mb-3" id="search-section-{{ section['key'] }}" style="scroll-margin-top: 96px;">
<div class="card-header d-flex justify-content-between align-items-center">
<span>{{ section['title'] }} ({{ section['total'] }})</span>
<a href="{{ section['view_all_url'] }}" class="btn btn-sm btn-outline-secondary">Open {{ section['title'] }}</a>
</div>
{% if section['key'] == 'daily_jobs' %}
<div class="px-3 py-2 small text-muted border-bottom">
Note: The Daily Jobs page itself only shows results for the selected day. Search results can include matches that relate to jobs across other days.
</div>
{% endif %}
<div class="card-body p-0">
{% if section['items'] %}
<div class="table-responsive">
<table class="table table-sm mb-0 align-middle">
<thead class="table-light">
<tr>
<th>Result</th>
<th>Details</th>
<th>Meta</th>
</tr>
</thead>
<tbody>
{% for item in section['items'] %}
<tr>
<td>
{% if item.link %}
<a href="{{ item.link }}">{{ item.title }}</a>
{% else %}
{{ item.title }}
{% endif %}
</td>
<td>{{ item.subtitle }}</td>
<td>{{ item.meta }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="p-3 text-muted">No results in this section.</div>
{% endif %}
</div>
{% if section['total_pages'] > 1 %}
<div class="card-footer d-flex justify-content-between align-items-center small">
<span class="text-muted">
Page {{ section['current_page'] }} of {{ section['total_pages'] }} ({{ section['total'] }} results)
</span>
<div class="d-flex gap-2">
{% if section['has_prev'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['prev_url'] }}#search-section-{{ section['key'] }}">Previous</a>
{% endif %}
{% if section['has_next'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['next_url'] }}#search-section-{{ section['key'] }}">Next</a>
{% endif %}
</div>
</div>
{% endif %}
</div>
{% endfor %}
{% endblock %}
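Each search section above renders `current_page`, `total_pages`, and prev/next flags per section. The arithmetic behind those fields can be sketched as follows (field names are taken from the template; the page size of 10 is an assumption):

```python
import math

def section_paging(total, current_page, per_page=10):
    """Compute the pagination fields one search section needs."""
    total_pages = max(1, math.ceil(total / per_page))
    current_page = min(max(1, current_page), total_pages)  # clamp into range
    return {
        "total": total,
        "current_page": current_page,
        "total_pages": total_pages,
        "has_prev": current_page > 1,
        "has_next": current_page < total_pages,
    }
```

Clamping the page keeps a stale `?page=` in a bookmarked URL from rendering an empty section.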


@ -504,6 +504,178 @@
</div>
{% endif %}
{% if section == 'integrations' %}
<form method="post" class="mb-4" id="cove-settings-form">
<div class="card mb-3">
<div class="card-header">Cove Data Protection (N-able)</div>
<div class="card-body">
<div class="form-check form-switch mb-3">
<input class="form-check-input" type="checkbox" id="cove_enabled" name="cove_enabled" {% if settings.cove_enabled %}checked{% endif %} />
<label class="form-check-label" for="cove_enabled">Enable Cove integration</label>
</div>
<div class="row g-3">
<div class="col-md-12">
<label for="cove_api_url" class="form-label">API URL</label>
<input type="url" class="form-control" id="cove_api_url" name="cove_api_url"
value="{{ settings.cove_api_url or '' }}"
placeholder="https://api.backup.management/jsonapi" />
<div class="form-text">Leave empty to use the default Cove API endpoint.</div>
</div>
<div class="col-md-6">
<label for="cove_api_username" class="form-label">API Username <span class="text-danger">*</span></label>
<input type="text" class="form-control" id="cove_api_username" name="cove_api_username"
value="{{ settings.cove_api_username or '' }}" />
</div>
<div class="col-md-6">
<label for="cove_api_password" class="form-label">API Password {% if not has_cove_password %}<span class="text-danger">*</span>{% endif %}</label>
<input type="password" class="form-control" id="cove_api_password" name="cove_api_password"
placeholder="{% if has_cove_password %}******** (stored){% else %}enter password{% endif %}" />
<div class="form-text">Leave empty to keep the existing password.</div>
</div>
<div class="col-md-6">
<div class="form-check form-switch mt-2">
<input class="form-check-input" type="checkbox" id="cove_import_enabled" name="cove_import_enabled" {% if settings.cove_import_enabled %}checked{% endif %} />
<label class="form-check-label" for="cove_import_enabled">Enable automatic import</label>
</div>
</div>
<div class="col-md-6">
<label for="cove_import_interval_minutes" class="form-label">Import interval (minutes)</label>
<input type="number" class="form-control" id="cove_import_interval_minutes" name="cove_import_interval_minutes"
value="{{ settings.cove_import_interval_minutes or 30 }}" min="1" max="1440" />
<div class="form-text">How often (in minutes) to fetch new data from the Cove API.</div>
</div>
</div>
<div class="d-flex justify-content-between align-items-center mt-3">
<div id="cove-test-result" class="small"></div>
<div class="d-flex gap-2">
<button type="button" class="btn btn-outline-secondary" id="cove-test-btn">Test connection</button>
<button type="submit" class="btn btn-primary">Save Cove Settings</button>
</div>
</div>
{% if settings.cove_partner_id %}
<div class="mt-2 d-flex justify-content-between align-items-center flex-wrap gap-2">
<div class="text-muted small">
Connected Partner ID: <strong>{{ settings.cove_partner_id }}</strong>
{% if settings.cove_last_import_at %}
&nbsp;·&nbsp; Last import: {{ settings.cove_last_import_at|local_datetime }}
{% else %}
&nbsp;·&nbsp; No import yet
{% endif %}
</div>
<button
type="submit"
class="btn btn-sm btn-outline-primary"
formaction="{{ url_for('main.settings_cove_run_now') }}"
formmethod="post"
>
Run import now
</button>
</div>
{% endif %}
</div>
</div>
</form>
<script>
(function () {
var btn = document.getElementById('cove-test-btn');
var resultDiv = document.getElementById('cove-test-result');
if (!btn) return;
btn.addEventListener('click', function () {
btn.disabled = true;
resultDiv.textContent = 'Testing…';
resultDiv.className = 'small text-muted';
fetch('{{ url_for("main.settings_cove_test_connection") }}', {
method: 'POST',
headers: { 'X-CSRFToken': document.querySelector('meta[name="csrf-token"]') ? document.querySelector('meta[name="csrf-token"]').content : '' },
credentials: 'same-origin',
})
.then(function (r) { return r.json(); })
.then(function (data) {
if (data.ok) {
resultDiv.textContent = data.message;
resultDiv.className = 'small text-success';
} else {
resultDiv.textContent = data.message;
resultDiv.className = 'small text-danger';
}
})
.catch(function (err) {
resultDiv.textContent = 'Request failed: ' + err;
resultDiv.className = 'small text-danger';
})
.finally(function () { btn.disabled = false; });
});
})();
</script>
<form method="post" class="mb-4" id="entra-settings-form">
<div class="card mb-3">
<div class="card-header">Microsoft Entra SSO</div>
<div class="card-body">
<div class="form-check form-switch mb-3">
<input class="form-check-input" type="checkbox" id="entra_sso_enabled" name="entra_sso_enabled" {% if settings.entra_sso_enabled %}checked{% endif %} />
<label class="form-check-label" for="entra_sso_enabled">Enable Microsoft sign-in</label>
</div>
<div class="row g-3">
<div class="col-md-6">
<label for="entra_tenant_id" class="form-label">Tenant ID</label>
<input type="text" class="form-control" id="entra_tenant_id" name="entra_tenant_id"
value="{{ settings.entra_tenant_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
</div>
<div class="col-md-6">
<label for="entra_client_id" class="form-label">Client ID</label>
<input type="text" class="form-control" id="entra_client_id" name="entra_client_id"
value="{{ settings.entra_client_id or '' }}" placeholder="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" />
</div>
<div class="col-md-12">
<label for="entra_client_secret" class="form-label">Client Secret {% if not has_entra_secret %}<span class="text-danger">*</span>{% endif %}</label>
<input type="password" class="form-control" id="entra_client_secret" name="entra_client_secret"
placeholder="{% if has_entra_secret %}******** (stored){% else %}enter secret{% endif %}" />
<div class="form-text">Leave empty to keep the existing secret.</div>
</div>
<div class="col-md-12">
<label for="entra_redirect_uri" class="form-label">Redirect URI (optional override)</label>
<input type="url" class="form-control" id="entra_redirect_uri" name="entra_redirect_uri"
value="{{ settings.entra_redirect_uri or '' }}"
placeholder="https://your-domain.example/auth/entra/callback" />
<div class="form-text">If empty, Backupchecks uses its own external callback URL.</div>
</div>
<div class="col-md-6">
<label for="entra_allowed_domain" class="form-label">Allowed domain/tenant (optional)</label>
<input type="text" class="form-control" id="entra_allowed_domain" name="entra_allowed_domain"
value="{{ settings.entra_allowed_domain or '' }}" placeholder="contoso.com or tenant-id" />
<div class="form-text">Restrict sign-ins to one tenant id or one email domain.</div>
</div>
<div class="col-md-12">
<label for="entra_allowed_group_ids" class="form-label">Allowed Entra Group Object ID(s) (optional)</label>
<textarea class="form-control" id="entra_allowed_group_ids" name="entra_allowed_group_ids" rows="3"
placeholder="group-object-id-1&#10;group-object-id-2">{{ settings.entra_allowed_group_ids or '' }}</textarea>
<div class="form-text">Optional hard access gate. Enter one or more Entra security group object IDs (comma or newline separated). The user must be a member of at least one.</div>
</div>
<div class="col-md-6">
<div class="form-check form-switch mt-4">
<input class="form-check-input" type="checkbox" id="entra_auto_provision_users" name="entra_auto_provision_users" {% if settings.entra_auto_provision_users %}checked{% endif %} />
<label class="form-check-label" for="entra_auto_provision_users">Auto-provision unknown users as Viewer</label>
</div>
</div>
</div>
<div class="d-flex justify-content-end mt-3">
<button type="submit" class="btn btn-primary">Save Entra Settings</button>
</div>
</div>
</div>
</form>
{% endif %}
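The help text for the group field says the IDs may be comma or newline separated. A minimal parser matching that description (the function name is invented for illustration; the stored raw value is the textarea contents):

```python
def parse_group_ids(raw):
    """Split on commas and newlines, drop blanks and duplicates, keep order."""
    ids = []
    for chunk in (raw or "").replace(",", "\n").splitlines():
        gid = chunk.strip()
        if gid and gid not in ids:
            ids.append(gid)
    return ids
```

At sign-in, access would be granted when the intersection of the user's group memberships and this list is non-empty.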
{% if section == 'maintenance' %}
<div class="row g-3 mb-4">
@ -528,8 +700,16 @@
<div class="col-md-4 d-flex align-items-end">
<button type="submit" class="btn btn-primary w-100">Import jobs</button>
</div>
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_jobs" name="include_autotask_ids" />
<label class="form-check-label" for="include_autotask_ids_jobs">
Include Autotask IDs from import file
</label>
</div>
</div>
<div class="col-md-8">
<div class="form-text">Use a JSON export created by this application. Leave Autotask IDs unchecked for sandbox/development environments with a different Autotask database.</div>
</div>
</div>
</form>
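The `Include Autotask IDs` checkbox above gates whether Autotask mapping fields are applied during import. A hedged sketch of that conditional; the function is illustrative, only the `autotask_company_id`/`autotask_company_name` field names come from the changelog:

```python
def apply_job_import(job: dict, payload: dict, include_autotask_ids: bool) -> dict:
    """Copy importable fields onto a job; Autotask mapping fields are
    only applied when the import option is checked (hypothetical helper)."""
    job["job_name"] = payload.get("job_name", job.get("job_name"))
    if include_autotask_ids:
        job["autotask_company_id"] = payload.get("autotask_company_id")
        job["autotask_company_name"] = payload.get("autotask_company_name")
    return job
```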

cove_api_test.py
#!/usr/bin/env python3
"""
Cove Data Protection API Test Script
=======================================
Verified working via Postman (2026-02-23). Uses confirmed column codes.
Usage:
python3 cove_api_test.py --username "api-user" --password "secret"
Or via environment variables:
COVE_USERNAME="api-user" COVE_PASSWORD="secret" python3 cove_api_test.py
Optional:
--url API endpoint (default: https://api.backup.management/jsonapi)
--records Max records to fetch (default: 50)
"""
import argparse
import json
import os
import sys
from datetime import datetime, timezone
import requests
API_URL = "https://api.backup.management/jsonapi"
# Session status codes (values carried by the F00 columns)
SESSION_STATUS = {
1: "In process",
2: "Failed",
3: "Aborted",
5: "Completed",
6: "Interrupted",
7: "NotStarted",
8: "CompletedWithErrors",
9: "InProgressWithFaults",
10: "OverQuota",
11: "NoSelection",
12: "Restarted",
}
# Backupchecks status mapping
STATUS_MAP = {
1: "Warning", # In process
2: "Error", # Failed
3: "Error", # Aborted
5: "Success", # Completed
6: "Error", # Interrupted
7: "Warning", # NotStarted
8: "Warning", # CompletedWithErrors
9: "Warning", # InProgressWithFaults
10: "Error", # OverQuota
11: "Warning", # NoSelection
12: "Warning", # Restarted
}
# Confirmed working columns (verified via Postman 2026-02-23)
COLUMNS = [
"I1", "I18", "I8", "I78",
"D09F00", "D09F09", "D09F15", "D09F08",
"D1F00", "D1F15",
"D10F00", "D10F15",
"D11F00", "D11F15",
"D19F00", "D19F15",
"D20F00", "D20F15",
"D5F00", "D5F15",
"D23F00", "D23F15",
]
# Datasource labels
DATASOURCE_LABELS = {
"D09": "Total",
"D1": "Files & Folders",
"D2": "System State",
"D10": "VssMsSql (SQL Server)",
"D11": "VssSharePoint",
"D19": "M365 Exchange",
"D20": "M365 OneDrive",
"D5": "M365 SharePoint",
"D23": "M365 Teams",
}
def _post(url: str, payload: dict, timeout: int = 30) -> dict:
headers = {"Content-Type": "application/json"}
resp = requests.post(url, json=payload, headers=headers, timeout=timeout)
resp.raise_for_status()
return resp.json()
def login(url: str, username: str, password: str) -> tuple[str, int]:
"""Authenticate and return (visa, partner_id)."""
payload = {
"jsonrpc": "2.0",
"id": "jsonrpc",
"method": "Login",
"params": {
"username": username,
"password": password,
},
}
data = _post(url, payload)
if "error" in data:
raise RuntimeError(f"Login failed: {data['error']}")
visa = data.get("visa")
if not visa:
raise RuntimeError(f"No visa token in response: {data}")
result = data.get("result", {})
partner_id = result.get("PartnerId") or result.get("result", {}).get("PartnerId")
if not partner_id:
raise RuntimeError(f"Could not find PartnerId in response: {data}")
return visa, int(partner_id)
def enumerate_statistics(url: str, visa: str, partner_id: int, columns: list[str], records: int = 50) -> dict:
payload = {
"jsonrpc": "2.0",
"visa": visa,
"id": "jsonrpc",
"method": "EnumerateAccountStatistics",
"params": {
"query": {
"PartnerId": partner_id,
"StartRecordNumber": 0,
"RecordsCount": records,
"Columns": columns,
}
},
}
return _post(url, payload)
def fmt_ts(value) -> str:
if not value:
return "(none)"
try:
ts = int(value)
if ts == 0:
return "(none)"
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
return dt.strftime("%Y-%m-%d %H:%M UTC")
except (ValueError, TypeError, OSError):
return str(value)
def fmt_status(value) -> str:
if value is None:
return "(none)"
try:
code = int(value)
bc = STATUS_MAP.get(code, "?")
label = SESSION_STATUS.get(code, "Unknown")
return f"{code} ({label}) → {bc}"
except (ValueError, TypeError):
return str(value)
def fmt_colorbar(value: str) -> str:
if not value:
return "(none)"
icons = {"5": "✅", "8": "⚠️", "2": "❌", "1": "🔄", "0": "·"}
return "".join(icons.get(c, c) for c in str(value))
def print_header(title: str) -> None:
print()
print("=" * 70)
print(f" {title}")
print("=" * 70)
def run(url: str, username: str, password: str, records: int, debug: bool = False) -> None:
print_header("Cove Data Protection API Test")
print(f" URL: {url}")
print(f" Username: {username}")
# Login
print_header("Step 1: Login")
visa, partner_id = login(url, username, password)
print(" ✅ Login OK")
print(f" PartnerId: {partner_id}")
print(f" Visa: {visa[:40]}...")
# Fetch statistics
print_header("Step 2: EnumerateAccountStatistics")
print(f" Columns: {', '.join(COLUMNS)}")
print(f" Records: max {records}")
data = enumerate_statistics(url, visa, partner_id, COLUMNS, records)
if debug:
print("\n RAW response (first 2000 chars):")
print(json.dumps(data, indent=2)[:2000])
if "error" in data:
err = data["error"]
print(f" ❌ FAILED error {err.get('code')}: {err.get('message')}")
print(f" Data: {err.get('data')}")
sys.exit(1)
result = data.get("result")
if result is None:
print(" ⚠️ result is null; raw response:")
print(json.dumps(data, indent=2)[:1000])
sys.exit(0)
if debug:
print(f"\n result type: {type(result).__name__}")
if isinstance(result, dict):
print(f" result keys: {list(result.keys())}")
# Unwrap possible nested result
if isinstance(result, dict) and "result" in result:
result = result["result"]
# Result can be a list directly or wrapped in Accounts key
accounts = result if isinstance(result, list) else result.get("Accounts", []) if isinstance(result, dict) else []
total = len(accounts)
print(f" ✅ SUCCESS: {total} account(s) returned")
# Per-account output
print_header(f"Step 3: Account Details ({total} total)")
for i, acc in enumerate(accounts):
# Settings is a list of single-key dicts: [{"D09F00": "5"}, {"I1": "name"}, ...]
# Flatten to a single dict for easy lookup.
s: dict = {}
for item in acc.get("Settings", []):
s.update(item)
account_id = acc.get("AccountId", "?")
device_name = s.get("I1", "(no name)")
computer = s.get("I18") or "(M365 tenant)"
customer = s.get("I8", "")
active_ds = s.get("I78", "")
print(f"\n [{i+1}/{total}] {device_name} (AccountId: {account_id})")
print(f" Computer : {computer}")
print(f" Customer : {customer}")
print(f" Datasrc : {active_ds}")
# Total (D09)
print(f" Total:")
print(f" Status : {fmt_status(s.get('D09F00'))}")
print(f" Last session: {fmt_ts(s.get('D09F15'))}")
print(f" Last success: {fmt_ts(s.get('D09F09'))}")
print(f" 28-day bar : {fmt_colorbar(s.get('D09F08'))}")
# Per-datasource (only if present in response)
ds_pairs = [
("D1", "D1F00", "D1F15"),
("D10", "D10F00", "D10F15"),
("D11", "D11F00", "D11F15"),
("D19", "D19F00", "D19F15"),
("D20", "D20F00", "D20F15"),
("D5", "D5F00", "D5F15"),
("D23", "D23F00", "D23F15"),
]
for ds_code, f00_col, f15_col in ds_pairs:
f00 = s.get(f00_col)
f15 = s.get(f15_col)
if f00 is None and f15 is None:
continue
label = DATASOURCE_LABELS.get(ds_code, ds_code)
print(f" {label}:")
print(f" Status : {fmt_status(f00)}")
print(f" Last session: {fmt_ts(f15)}")
# Summary
print_header("Summary")
status_counts: dict[str, int] = {}
for acc in accounts:
flat: dict = {}
for item in acc.get("Settings", []):
flat.update(item)
raw = flat.get("D09F00")
bc = STATUS_MAP.get(int(raw), "Unknown") if raw is not None else "No data"
status_counts[bc] = status_counts.get(bc, 0) + 1
for status, count in sorted(status_counts.items()):
icon = {"Success": "✅", "Warning": "⚠️", "Error": "❌"}.get(status, " ")
print(f" {icon} {status}: {count}")
print(f"\n Total accounts: {total}")
print()
def main() -> None:
parser = argparse.ArgumentParser(description="Test Cove Data Protection API")
parser.add_argument("--url", default=os.environ.get("COVE_URL", API_URL))
parser.add_argument("--username", default=os.environ.get("COVE_USERNAME", ""))
parser.add_argument("--password", default=os.environ.get("COVE_PASSWORD", ""))
parser.add_argument("--records", type=int, default=50, help="Max accounts to fetch")
parser.add_argument("--debug", action="store_true", help="Print raw API responses")
args = parser.parse_args()
if not args.username or not args.password:
print("Error: --username and --password are required.")
print("Or set COVE_USERNAME and COVE_PASSWORD environment variables.")
sys.exit(1)
run(args.url, args.username, args.password, args.records, args.debug)
if __name__ == "__main__":
main()

This file documents all changes made to this project via Claude Code.
## [2026-02-23]
### Added
- Cove Data Protection full integration into Backupchecks:
- `app/cove_importer.py` Cove API client: login, paginated EnumerateAccountStatistics, status mapping, deduplication, per-datasource object persistence
- `app/cove_importer_service.py` background thread that polls Cove API on configurable interval
- `SystemSettings` model: 8 new Cove fields (`cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`)
- `Job` model: `cove_account_id` column to link a job to a Cove account
- `JobRun` model: `source_type` (NULL = email, "cove_api") and `external_id` (deduplication key) columns
- DB migration `migrate_cove_integration()` for all new columns + deduplication index
- Settings > Integrations tab: new Cove section with enable toggle, API URL/username/password, import interval, and Test Connection button (AJAX → JSON response with partner ID)
- Job Detail page: Cove Integration card showing Account ID input (only when `cove_enabled`)
- Route `POST /settings/cove/test-connection` verifies Cove credentials and stores partner ID
- Route `POST /settings/cove/run-now` manually trigger a Cove import from the Settings page
- Route `POST /jobs/<id>/set-cove-account` saves or clears Cove Account ID on a job
- Cove Accounts inbox-style flow:
- `CoveAccount` model (staging table): stores all Cove accounts from API, with optional `job_id` link
- DB migration `migrate_cove_accounts_table()` creates `cove_accounts` table with indexes
- `cove_importer.py` updated: always upserts all accounts into staging table; JobRuns only created for accounts with a linked job
- `routes_cove.py` new routes: `GET /cove/accounts`, `POST /cove/accounts/<id>/link`, `POST /cove/accounts/<id>/unlink`
- `cove_accounts.html` inbox-style page: unmatched accounts shown first with "Link / Create job" modals (two tabs: create new job or link to existing), matched accounts listed below with Unlink button
- Nav bar: "Cove Accounts" link added for admin/operator roles when `cove_enabled`
- Route `POST /settings/cove/run-now` triggers manual import (button also shown on Cove Accounts page)
- `cove_api_test.py` standalone Python test script to verify Cove Data Protection API column codes
- Tests D9Fxx (Total), D10Fxx (VssMsSql), D11Fxx (VssSharePoint), and D1Fxx (Files&Folders)
- Displays backup status (F00), timestamps (F09/F15/F18), error counts (F06) per account
- Accepts credentials via CLI args or environment variables
- Summary output showing which column sets work
- Updated `docs/cove_data_protection_api_calls_known_info.md` with N-able support feedback:
- D02/D03 are legacy; use D10/D11 or D9 (Total) instead
- All users have the same API access (no MSP-level restriction)
- Session status codes documented (D9F00: 2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- Updated `TODO-cove-data-protection.md` with breakthrough status and next steps
## [2026-02-19]
### Added
- Explicit `Include Autotask IDs` import option in the Approved Jobs JSON import form (Settings -> Maintenance)
- Explicit `Include Autotask IDs` import option in the Customers CSV import form
### Changed
- Approved Jobs import now only applies `autotask_company_id` and `autotask_company_name` when the import option is checked
- Customers CSV import now only applies Autotask mapping fields when the import option is checked
- Import success and audit output now includes whether Autotask IDs were imported
- 3CX parser now recognizes `3CX Notification: Update Successful - <host>` as an informational run with `backup_software: 3CX`, `backup_type: Update`, and `overall_status: Success`, and excludes this type from schedule inference (no Expected/Missed generation)
- Run Checks now hides only non-backup 3CX informational types (`Update`, `SSL Certificate`), while other backup software/types remain visible
- Restored remark visibility in Run Checks and Job Details alerts by loading remarks from both sources: explicit run links (`remark_job_runs`) and active job scopes (`remark_scopes`) with duplicate prevention
## [2026-02-16]
### Added
- Customer-to-jobs navigation by making customer names clickable on the Customers page (`/jobs?customer_id=<id>`)
- Jobs page customer filter context UI with an active filter banner and a "Clear filter" action
- Global search page (`/search`) with grouped results for Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Existing overrides, and Reports
- Navbar search form to trigger global search from all authenticated pages
- Dedicated Remarks section in global search results (with paging and detail links), so remark records are searchable alongside tickets
### Changed
- `/jobs` route now accepts optional `customer_id` and returns only jobs for that customer when provided
- Default Jobs listing keeps inactive-customer filtering only when no `customer_id` filter is applied
- Updated `docs/technical-notes-codex.md` with a new "Last updated" date, Customers->Jobs navigation notes, and test build/push validation snapshot
- Search matching is now case-insensitive with wildcard support (`*`) and automatic contains behavior (`*term*`) per search term
- Global search visibility now only includes sections accessible to the currently active role
- Updated `docs/technical-notes-codex.md` with a dedicated Global Grouped Search section (route/UI/behavior/access rules) and latest test build digest for `v20260216-02-global-search`
- Global search now supports per-section pagination (previous/next), so results beyond the first 10 can be browsed per section while preserving current query/state
- Daily Jobs search result metadata now includes expected run time, success indicator, and run count for the selected day
- Daily Jobs search result links now open the same Daily Jobs modal flow via `open_job_id` (instead of only navigating to the overview page)
- Updated `docs/technical-notes-codex.md` with search pagination query params, Daily Jobs modal-open search behavior, and latest successful test-build digest
- Search pagination buttons now preserve scroll position by linking back to the active section anchor after page navigation
- "Open <section>" behavior now passes `q` into destination pages and applies page-level filtering, so opened overviews reflect the same search term
- Filtering support on Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Overrides, and Reports now accepts wildcard-enabled `q` terms from search
- Reports frontend loading (`/api/reports`) now forwards URL `q` so client-side refresh keeps the same filtered result set
- Daily Jobs search section UI now shows an explicit English note that the Daily Jobs page itself is day-scoped while search matches can reflect jobs across other days
- Updated `docs/technical-notes-codex.md` to include remarks in grouped search sections, `p_remarks` pagination key, q-forwarding to overview pages, and latest test-build digest
### Fixed
- `/search` page crash (`TypeError: 'builtin_function_or_method' object is not iterable`) by replacing Jinja dict access from `section.items` to `section['items']` in `templates/main/search.html`
## [2026-02-13]
### Added
- Added internal technical reference document `docs/technical-notes-codex.md` with repository structure, application architecture, processing flow, parser system rules, ticketing/Autotask constraints, feedback attachment notes, deployment/build workflow, and operational attention points
### Changed
- Changed `docs/technical-notes-codex.md` language from Dutch to English to align with project language rules for documentation
### Fixed
- Fixed Autotask tickets and internal tickets not being linked to missed runs by calling `link_open_internal_tickets_to_run` after creating missed JobRun records in `_ensure_missed_runs_for_job` (both weekly and monthly schedules), ensuring missed runs now receive the same ticket propagation as email-based runs
- Fixed checkboxes being automatically re-selected after delete actions on Inbox and Run Checks pages by adding `autocomplete="off"` attribute to all checkboxes, preventing browser from restoring previous checkbox states after page reload
## [2026-02-12]
### Fixed
- Fixed tickets not being displayed in Run Checks modal detail view (Meldingen section) by extending `/api/job-runs/<run_id>/alerts` endpoint to include both run-specific tickets (via ticket_job_runs) and job-level tickets (via ticket_scopes), ensuring newly created tickets are visible immediately in the modal instead of only after being resolved
- Fixed copy ticket button not working in Edge browser on Job Details page by moving clipboard functions (copyToClipboard, fallbackCopy, showCopyFeedback) inside IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution)
## [2026-02-10]
### Added
- Added screenshot attachment support to Feedback/Bug system (user request: allow screenshots for bugs/features)
- New database model: `FeedbackAttachment` with file_data (BYTEA), filename, mime_type, file_size
- Upload support on feedback creation form (multiple files, PNG/JPG/GIF/WEBP, max 5MB each)
- Upload support on reply forms (attach screenshots when replying)
- Inline image display on feedback detail page (thumbnails with click-to-view-full-size)
- Screenshot display for both main feedback items and replies
- File validation: image type verification using imghdr (not just extension), size limits, secure filename handling
- New route: `/feedback/attachment/<id>` to serve images (access-controlled, admins can view deleted item attachments)
- Database migration: auto-creates `feedback_attachments` table with indexes on startup
- Automatic CASCADE delete: removing feedback item or reply automatically removes associated attachments
- Added admin-only deleted items view and permanent delete functionality to Feedback system
- "Show deleted items" checkbox on feedback list page (admin only)
- Deleted items shown with reduced opacity + background color and "Deleted" badge
- Permanent delete action removes item + all attachments from database (hard delete with CASCADE)
- Attachment count shown in deletion confirmation message
- Admins can view detail pages of deleted items including their screenshots
- Two-stage delete: soft delete (audit trail) → permanent delete (database cleanup)
- Prevents accidental permanent deletion (requires item to be soft-deleted first)
- Security: non-admin users cannot view deleted items or their attachments (404 response)
- Added copy ticket button (⧉) to Job Details page modal for quickly copying ticket numbers to clipboard (previously only available on Run Checks page)
### Fixed
- Fixed cross-browser clipboard copy functionality for ticket numbers (previously required manual copy popup in Edge browser)
- Implemented three-tier fallback mechanism: modern Clipboard API → legacy execCommand('copy') → prompt fallback
- Copy button now works directly in all browsers (Firefox, Edge, Chrome) without requiring user interaction
- Applied improved copy mechanism to both Run Checks and Job Details pages
- Fixed Autotask ticket not being automatically linked to new runs when internal ticket is resolved by implementing independent Autotask propagation strategy (now checks for most recent non-deleted and non-resolved Autotask ticket on job regardless of internal ticket status, ensuring PSA ticket reference persists across runs until explicitly resolved or deleted)
- Fixed internal and Autotask tickets being linked to new runs even after being resolved by removing date-based "open" logic from ticket query (tickets now only link to new runs if they are genuinely unresolved, not based on run date comparisons)
- Fixed Job Details page showing resolved tickets for ALL runs by implementing two-source ticket display: directly linked tickets (via ticket_job_runs) are always shown for audit trail, while active window tickets (via scope query) are only shown if unresolved, preserving historical ticket links while preventing resolved tickets from appearing on new runs

# Cove Data Protection (N-able Backup) Known Information on API Calls
Date: 2026-02-10 (updated 2026-02-23)
Status: Pending re-test with corrected column codes
## ⚠️ Important Update (2026-02-23)
**N-able support (Andrew Robinson, Applications Engineer) confirmed:**
1. **D02 and D03 are legacy column codes**; use **D10 and D11** instead.
2. **There is no MSP-level restriction**; all API users have the same access level.
3. New documentation: https://developer.n-able.com/n-able-cove/docs/getting-started
4. Column code reference: https://developer.n-able.com/n-able-cove/docs/column-codes
**Impact:** The security error 13501 was caused by using legacy D02Fxx/D03Fxx codes.
Using D9Fxx (Total aggregate), D10Fxx (VssMsSql), D11Fxx (VssSharePoint) should work.
**Key newly available columns (pending re-test):**
- `D9F00` = Last Session Status (2=Failed, 5=Completed, 8=CompletedWithErrors, etc.)
- `D9F06` = Last Session Errors Count
- `D9F09` = Last Successful Session Timestamp (Unix)
- `D9F12` = Session Duration
- `D9F15` = Last Session Timestamp (Unix)
- `D9F17` = Last Completed Session Status
- `D9F18` = Last Completed Session Timestamp (Unix)
**Session status codes (F00):**
1=In process, 2=Failed, 3=Aborted, 5=Completed, 6=Interrupted,
7=NotStarted, 8=CompletedWithErrors, 9=InProgressWithFaults,
10=OverQuota, 11=NoSelection, 12=Restarted
**Test script:** `cove_api_test.py` in the project root; run it to verify the new column codes.
---
## Summary of original findings (2026-02-10)
API access to Cove Data Protection via JSON-RPC **works**, but was **heavily restricted**
because legacy column codes (D02Fxx, D03Fxx) were being used. Now resolved.
Previous error:
```
Operation failed because of security reasons (error 13501)
```
---
## Authentication model (confirmed)
- Endpoint: https://api.backup.management/jsonapi
- Protocol: JSON-RPC 2.0
- Method: POST only
- Authentication flow:
1. Login method is called
2. Response returns a **visa** token (top-level field)
3. The visa **must be included in every subsequent call**
4. Cove may return a new visa in later responses (token chaining)
### Login request (working)
```json
{
"jsonrpc": "2.0",
"method": "Login",
"params": {
"partner": "<EXACT customer/partner name>",
"username": "<api login name>",
"password": "<password>"
},
"id": "1"
}
```
### Login response structure (important)
```json
{
"result": {
"result": {
"PartnerId": <number>,
"Name": "<login name>",
"Flags": ["SecurityOfficer","NonInteractive"]
}
},
"visa": "<visa token>"
}
```
Notes:
- `visa` is **not** inside `result`, but at top level
- `PartnerId` is found at `result.result.PartnerId`
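Given the response shape above (visa at top level, PartnerId nested at `result.result.PartnerId`), extraction can be sketched as follows; the helper name is illustrative, and the flat `result.PartnerId` fallback is an assumption for robustness:

```python
def extract_login_fields(data: dict) -> tuple:
    """Pull the visa (top-level) and PartnerId (result.result.PartnerId,
    with a fallback to result.PartnerId) out of a Login response."""
    visa = data.get("visa")
    if not visa:
        raise RuntimeError("no visa in Login response")
    result = data.get("result") or {}
    partner_id = result.get("PartnerId") or (result.get("result") or {}).get("PartnerId")
    if partner_id is None:
        raise RuntimeError("no PartnerId in Login response")
    return visa, int(partner_id)
```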
---
## API user scope (critical finding)
- API users are **always bound to a single Partner (customer)** unless created at MSP/root level
- In this environment, it is **not possible to create an MSP-level API user**
- All testing was therefore done with **customer-scoped API users**
Impact:
- Crosscustomer enumeration is impossible
- Only data belonging to the linked customer can be queried
- Some enumerate/reporting calls are blocked regardless of role
---
## EnumerateAccountStatistics: what works and what does not
### Method
```json
{
"jsonrpc": "2.0",
"method": "EnumerateAccountStatistics",
"visa": "<visa>",
"params": {
"query": {
"PartnerId": <partner_id>,
"SelectionMode": "Merged",
"StartRecordNumber": 0,
"RecordsCount": 50,
"Columns": [ ... ]
}
}
}
```
### Mandatory behavior
- **Columns are required**; omitting them returns `result: null`
- The API behaves as an **allowlist**:
- If *any* requested column is restricted, the **entire call fails** with error 13501
### Confirmed working (safe) column set
The following column set works reliably:
- I1 → account / device / tenant identifier
- I14 → used storage (bytes)
- I18 → computer name (if applicable)
- D01F00-D01F07 → numeric metrics (exact semantics TBD)
- D09F00 → numeric status/category code
Example (validated working):
```json
"Columns": [
"I1","I14","I18",
"D01F00","D01F01","D01F02","D01F03",
"D01F04","D01F05","D01F06","D01F07",
"D09F00"
]
```
### Confirmed restricted (cause security error 13501)
- Entire D02Fxx range
- Entire D03Fxx range
- Broad I-ranges (e.g. I1-I10 batches)
- Many individually tested I-codes not in the safe set
Even adding **one restricted code** causes the entire call to fail.
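Because a single restricted code fails the whole request, restricted columns can only be isolated by probing them individually on top of a known-good base set. A sketch; the `call` parameter stands in for any EnumerateAccountStatistics wrapper that raises on error 13501 and is an assumption here:

```python
def find_restricted_columns(call, base_columns, candidates):
    """Probe candidate columns one at a time against a known-good base set.
    Any column whose addition makes the call fail is flagged as restricted."""
    restricted = []
    for col in candidates:
        try:
            call(base_columns + [col])
        except Exception:
            restricted.append(col)
    return restricted
```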
---
## EnumerateAccounts
- Method consistently fails with `Operation failed because of security reasons`
- This applies even with:
- SuperUser role
- SecurityOfficer flag enabled
Conclusion:
- EnumerateAccounts is **not usable** in this tenant for customer-scoped API users
---
## Other tested methods
- EnumerateStatistics → Method not found
- GetPartnerInfo → works only for basic partner metadata (not statistics)
---
## Practical implications for BackupChecks
What **is possible**:
- Enumerate accounts implicitly via EnumerateAccountStatistics
- Identify devices/accounts via AccountId + I1/I18
- Collect storage usage (I14)
- Collect numeric status/metrics via D01Fxx and D09F00
What is **not possible (via this API scope)**:
- Reliable last backup timestamp
- Explicit success / failure / warning text
- Error messages
- Enumerating devices via EnumerateAccounts
- Cross-customer aggregation
### Suggested internal model mapping
- Customer
- external_id = PartnerId
- Job
- external_id = AccountId
- display_name = I1
- hostname = I18 (if present)
- Run (limited)
- metrics only (bytes, counters)
- status must be **derived heuristically** from numeric fields (if possible)
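One possible heuristic, using the F00 session status codes from the 2026-02-23 update above; the Success/Warning/Error bucketing choices are illustrative, not a confirmed product semantic:

```python
# F00 session status codes per the 2026-02-23 update; bucketing is a sketch
CODE_TO_STATUS = {
    2: "Error",    # Failed
    3: "Error",    # Aborted
    5: "Success",  # Completed
    6: "Error",    # Interrupted
    7: "Warning",  # NotStarted
    8: "Warning",  # CompletedWithErrors
    10: "Error",   # OverQuota
}

def derive_run_status(d09f00_value) -> str:
    """Map the numeric D09F00 session status onto an internal
    Success/Warning/Error model; unknown or missing codes become Warning."""
    try:
        return CODE_TO_STATUS.get(int(d09f00_value), "Warning")
    except (TypeError, ValueError):
        return "Warning"
```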
---
## Open questions / next steps
1. Confirm official meaning of:
- D01F00-D01F07
- D09F00
2. Investigate whether:
- A token-based (non-JSON-RPC) reporting endpoint exists
- N-able support can enable additional reporting columns
- An MSP-level API user can be provisioned by N-able
3. Decide whether Cove integration in BackupChecks will be:
- Metrics-only (no run result semantics)
- Or require vendor cooperation for expanded API access

# Technical Notes (Internal)
Last updated: 2026-02-23
## Purpose
Internal technical snapshot of the `backupchecks` repository for faster onboarding, troubleshooting, and change impact analysis.
## Repository Overview
- Application: Flask web app with SQLAlchemy and Flask-Migrate.
- Runtime: Containerized (Docker), deployed via Docker Compose stack.
- Primary source code location: `containers/backupchecks/src`.
- The project also contains extensive functional documentation in `docs/` and multiple roadmap TODO files at repository root.
## Main Structure
- `containers/backupchecks/Dockerfile`: Python 3.12-slim image, starts `gunicorn` with `backend.app:create_app()`.
- `containers/backupchecks/requirements.txt`: Flask stack + PostgreSQL driver + reporting libraries (`reportlab`, `Markdown`).
- `containers/backupchecks/src/backend/app`: backend domain logic, routes, parsers, models, migrations.
- `containers/backupchecks/src/templates`: Jinja templates for auth/main/documentation pages.
- `containers/backupchecks/src/static`: CSS, images, favicon.
- `deploy/backupchecks-stack.yml`: compose stack with `backupchecks`, `postgres`, `adminer`.
- `build-and-push.sh`: release/test build script with version bumping, tags, and image push.
- `docs/`: functional design, changelogs, migration notes, API notes.
## Application Architecture (Current Observation)
- Factory pattern: `create_app()` in `containers/backupchecks/src/backend/app/__init__.py`.
- Blueprints:
- `auth_bp` for authentication.
- `main_bp` for core functionality.
- `doc_bp` for internal documentation pages.
- Database initialization at startup:
- `db.create_all()`
- `run_migrations()`
- Background tasks:
- `start_auto_importer(app)` starts the automatic mail importer thread.
- `start_cove_importer(app)` starts the Cove Data Protection polling thread (started only when `cove_import_enabled` is set).
- Health endpoint:
- `GET /health` returns `{ "status": "ok" }`.
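The background importer threads above can be sketched as plain daemon loops. This is a sketch only, not the actual `cove_importer_service` implementation, which reads its interval from `SystemSettings.cove_import_interval_minutes`:

```python
import threading

def start_poller(run_import, interval_minutes: float, stop_event: threading.Event) -> threading.Thread:
    """Daemon thread that calls run_import() on a fixed interval until
    stop_event is set (hypothetical helper)."""
    def loop():
        # Event.wait doubles as an interruptible sleep: returns True once set.
        while not stop_event.wait(interval_minutes * 60):
            try:
                run_import()
            except Exception:
                pass  # the real service would log the failure and continue
    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```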
## Functional Processing Flow
- Import:
- Email is fetched via Microsoft Graph API.
- Parse:
- Parser selection through registry + software-specific parser implementations.
- Approve:
- New jobs first appear in Inbox for initial customer assignment.
- Auto-process:
- Subsequent emails for known jobs automatically create `JobRun` records.
- Monitor:
- Runs appear in Daily Jobs and Run Checks.
- Review:
- Manual review removes items from the unreviewed operational queue.
## Configuration and Runtime
- Config is built from environment variables in `containers/backupchecks/src/backend/app/config.py`.
- Important variables:
- `APP_SECRET_KEY`
- `APP_ENV`
- `APP_PORT`
- `POSTGRES_DB`
- `POSTGRES_USER`
- `POSTGRES_PASSWORD`
- `DB_HOST`
- `DB_PORT`
- Database URI pattern:
- `postgresql+psycopg2://<user>:<pass>@<host>:<port>/<db>`
- Default timezone in config: `Europe/Amsterdam`.
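Assembling the URI from those variables might look like this; the defaults shown are illustrative, not the app's actual defaults:

```python
import os

def database_uri() -> str:
    """Build the SQLAlchemy URI from the documented environment variables."""
    user = os.environ.get("POSTGRES_USER", "backupchecks")
    password = os.environ.get("POSTGRES_PASSWORD", "")
    host = os.environ.get("DB_HOST", "postgres")
    port = os.environ.get("DB_PORT", "5432")
    db = os.environ.get("POSTGRES_DB", "backupchecks")
    return f"postgresql+psycopg2://{user}:{password}@{host}:{port}/{db}"
```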
## Data Model (High-level)
File: `containers/backupchecks/src/backend/app/models.py`
- Auth/users:
- `User` with role(s), active role in session.
- System settings:
- `SystemSettings` with Graph/mail settings, import settings, UI timezone, dashboard policy, sandbox flag.
- Autotask configuration and cache fields are present.
- Cove Data Protection fields: `cove_enabled`, `cove_api_url`, `cove_api_username`, `cove_api_password`, `cove_import_enabled`, `cove_import_interval_minutes`, `cove_partner_id`, `cove_last_import_at`.
- Logging:
- `AuditLog` (legacy alias `AdminLog`).
- Domain:
- `Customer`, `Job`, `JobRun`, `Override`
- `MailMessage`, `MailObject`
- `CoveAccount` (Cove staging table — see Cove integration section)
- `Ticket`, `TicketScope`, `TicketJobRun`
- `Remark`, `RemarkScope`, `RemarkJobRun`
- `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`
### Foreign Key Relationships & Deletion Order
Critical deletion order to avoid constraint violations:
1. Clean auxiliary tables (ticket_job_runs, remark_job_runs, scopes, overrides)
2. Unlink mails from jobs (UPDATE mail_messages SET job_id = NULL)
3. Delete mail_objects
4. Delete jobs (cascades to job_runs)
5. Delete mails
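The five steps above can be sketched against a generic DB-API connection. Table and column names come from the notes here; the helper itself is hypothetical, and the explicit `job_runs` delete stands in for the FK cascade the real schema relies on:

```python
def delete_job_graph(conn, job_id: int) -> None:
    """Delete a job and its dependents in the documented order."""
    run_ids = [r[0] for r in conn.execute(
        "SELECT id FROM job_runs WHERE job_id = ?", (job_id,))]
    mail_ids = [r[0] for r in conn.execute(
        "SELECT id FROM mail_messages WHERE job_id = ?", (job_id,))]
    # 1. clean auxiliary link tables
    for rid in run_ids:
        conn.execute("DELETE FROM ticket_job_runs WHERE job_run_id = ?", (rid,))
        conn.execute("DELETE FROM remark_job_runs WHERE job_run_id = ?", (rid,))
    # 2. unlink mails from the job
    conn.execute("UPDATE mail_messages SET job_id = NULL WHERE job_id = ?", (job_id,))
    # 3. delete mail objects for those mails (ids captured before the unlink)
    for mid in mail_ids:
        conn.execute("DELETE FROM mail_objects WHERE mail_id = ?", (mid,))
    # 4. delete the job; in the real schema this cascades to job_runs
    conn.execute("DELETE FROM job_runs WHERE job_id = ?", (job_id,))
    conn.execute("DELETE FROM jobs WHERE id = ?", (job_id,))
    # 5. finally delete the mails themselves
    for mid in mail_ids:
        conn.execute("DELETE FROM mail_messages WHERE id = ?", (mid,))
```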
### Key Model Fields
**MailMessage model:**
- `from_address` (NOT `sender`!) - sender email
- `subject` - email subject
- `text_body` - plain text content
- `html_body` - HTML content
- `received_at` - timestamp
- `location` - inbox/processed/deleted
- `job_id` - link to Job (nullable)
**Job model:**
- `customer_id` - FK to Customer
- `job_name` - parsed from email
- `backup_software` - e.g., "Veeam", "Synology", "Cove Data Protection"
- `backup_type` - e.g., "Backup Job", "Active Backup"
- `cove_account_id` - (nullable int) links this job to a Cove AccountId
**JobRun model:**
- `source_type` - NULL = email (backwards compat), `"cove_api"` for Cove-imported runs
- `external_id` - deduplication key for Cove runs: `"cove-{account_id}-{run_ts}"`
## Parser Architecture
- Folder: `containers/backupchecks/src/backend/app/parsers/`
- Two layers:
- `registry.py`:
- matching/documentation/visibility on `/parsers`.
- examples must stay generic (no customer names).
- parser files (`veeam.py`, `synology.py`, etc.):
- actual detection and parsing logic.
- return structured output: software, type, job name, status, objects.
- Practical rule:
- extend patterns by adding, not replacing (backward compatibility).
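A minimal parser file following this contract might look like the sketch below. Function names, patterns, and output keys are assumptions based on the description above, not the actual registry interface:

```python
# Hypothetical Veeam-style parser sketch; patterns and field names are illustrative.
def detect(subject: str, body: str) -> bool:
    # Detection should only ever widen: add patterns, never replace existing ones.
    return subject.startswith("Veeam Backup:")

def parse(subject: str, body: str) -> dict:
    # Structured output: software, type, job name, status, objects.
    status = "Success" if "Success" in body else "Warning"
    return {
        "backup_software": "Veeam",
        "backup_type": "Backup Job",
        "job_name": subject.split(":", 1)[1].strip(),
        "overall_status": status,
        "objects": [],
    }
```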
### Parser Types
**Informational Parsers:**
- DSM Updates, Account Protection, Firmware Updates
- Set appropriate backup_type (e.g., "Updates", "Firmware Update")
- Do NOT participate in schedule learning
- Usually still visible in Run Checks for awareness
- Exception: non-backup 3CX informational types (`Update`, `SSL Certificate`) are hidden from Run Checks
**Regular Parsers:**
- Backup jobs (Veeam, Synology Active Backup, NAKIVO, etc.)
- Participate in schedule learning (daily/weekly/monthly detection)
- Generate missed runs when expected runs don't occur
**Example: Synology Updates Parser (synology.py)**
- Handles multiple update notification types under same job:
- DSM automatic update cancelled
- Packages out-of-date
- Combined notifications (DSM + packages)
- Detection patterns:
- DSM: "Automatische DSM-update", "DSM-update op", "automatic DSM update"
- Packages: "Packages on", "out-of-date", "Package Center"
- Hostname extraction from multiple patterns
- Returns: backup_type "Updates", job_name "Synology Automatic Update"
## Cove Data Protection Integration
### Overview
Cove (N-able) Data Protection is a cloud backup platform. Backupchecks integrates with it via the Cove JSON-RPC API, following the same inbox-style staging flow as email imports.
### Files
- `containers/backupchecks/src/backend/app/cove_importer.py`: API client, account processing, JobRun creation
- `containers/backupchecks/src/backend/app/cove_importer_service.py`: background polling thread
- `containers/backupchecks/src/backend/app/main/routes_cove.py`: `/cove/accounts` routes
- `containers/backupchecks/src/templates/main/cove_accounts.html`: inbox-style accounts page
### API Details
- Endpoint: `https://api.backup.management/jsonapi` (JSON-RPC 2.0)
- **Login**: `POST` with `{"jsonrpc":"2.0","id":"jsonrpc","method":"Login","params":{"username":"...","password":"..."}}`
- Returns `visa` at top level (`data["visa"]`), **not** inside `result`
- Returns `PartnerId` inside `result`
- **EnumerateAccountStatistics**: `POST` with visa in payload, `query` (lowercase) with `PartnerId`, `StartRecordNumber`, `RecordsCount`, `Columns`
- Settings format per account: `[{"D09F00": "5"}, {"I1": "device name"}, ...]` — list of single-key dicts, flatten with `dict.update(item)`
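These call shapes can be captured as small helpers (a sketch: the column selection and visa placement follow the notes above and are not verified against the live API):

```python
def build_login_payload(username: str, password: str) -> dict:
    # Param keys must be lowercase "username"/"password".
    return {"jsonrpc": "2.0", "id": "jsonrpc", "method": "Login",
            "params": {"username": username, "password": password}}

def build_enumerate_payload(visa: str, partner_id: int,
                            start: int = 1, count: int = 250) -> dict:
    # "query" is lowercase, and the field is "RecordsCount" (not "RecordCount").
    return {"jsonrpc": "2.0", "id": "jsonrpc", "visa": visa,
            "method": "EnumerateAccountStatistics",
            "params": {"query": {"PartnerId": partner_id,
                                 "StartRecordNumber": start,
                                 "RecordsCount": count,
                                 "Columns": ["I1", "I18", "I8", "I78",
                                             "D09F00", "D09F09", "D09F15", "D09F08"]}}}

def flatten_settings(settings_list: list) -> dict:
    # Per-account Settings arrive as a list of single-key dicts; merge into one dict.
    flat = {}
    for item in settings_list:
        flat.update(item)
    return flat
```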
### Column Codes
| Code | Meaning |
|------|---------|
| `I1` | Account/device name |
| `I18` | Computer name |
| `I8` | Customer/partner name |
| `I78` | Active datasource label |
| `D09F00` | Overall last session status code |
| `D09F09` | Last successful session timestamp (Unix) |
| `D09F15` | Last session end timestamp (Unix) |
| `D09F08` | 28-day colorbar string |
| `D1F00/F15` | Files & Folders status/timestamp |
| `D10F00/F15` | VssMsSql |
| `D11F00/F15` | VssSharePoint |
| `D19F00/F15` | M365 Exchange |
| `D20F00/F15` | M365 OneDrive |
| `D5F00/F15` | M365 SharePoint |
| `D23F00/F15` | M365 Teams |
### Status Code Mapping
| Cove code | Meaning | Backupchecks status |
|-----------|---------|---------------------|
| 1 | In process | Warning |
| 2 | Failed | Error |
| 3 | Aborted | Error |
| 5 | Completed | Success |
| 6 | Interrupted | Error |
| 7 | Not started | Warning |
| 8 | Completed with errors | Warning |
| 9 | In progress with faults | Warning |
| 10 | Over quota | Error |
| 11 | No selection | Warning |
| 12 | Restarted | Warning |
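The table above as a lookup (the fallback for unknown codes is an assumption here, not documented importer behavior):

```python
COVE_STATUS_MAP = {
    1: "Warning",   # In process
    2: "Error",     # Failed
    3: "Error",     # Aborted
    5: "Success",   # Completed
    6: "Error",     # Interrupted
    7: "Warning",   # Not started
    8: "Warning",   # Completed with errors
    9: "Warning",   # In progress with faults
    10: "Error",    # Over quota
    11: "Warning",  # No selection
    12: "Warning",  # Restarted
}

def map_cove_status(code: int) -> str:
    # Unknown codes default to "Warning" (assumption: surface rather than hide).
    return COVE_STATUS_MAP.get(code, "Warning")
```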
### Inbox-Style Flow (mirrors email import)
1. Cove importer fetches all accounts via paginated `EnumerateAccountStatistics` (250/page).
2. Every account is upserted into the `cove_accounts` staging table (always, regardless of job link).
3. Accounts without a `job_id` appear on `/cove/accounts` ("Cove Accounts" page) for admin action.
4. Admin can:
- **Create new job** creates a `Job` with `backup_software="Cove Data Protection"` and links it.
- **Link to existing job** sets `job.cove_account_id` and `cove_acc.job_id`.
5. On the next import, linked accounts generate `JobRun` records (deduplicated via `external_id`).
6. Per-datasource objects are persisted to `customer_objects`, `job_object_links`, `run_object_links`.
### CoveAccount Model
```python
class CoveAccount(db.Model):
    __tablename__ = "cove_accounts"

    # Column types inferred from the field descriptions; models.py is authoritative.
    id = db.Column(db.Integer, primary_key=True)
    account_id = db.Column(db.Integer, unique=True, nullable=False)  # Cove AccountId
    account_name = db.Column(db.String)       # I1
    computer_name = db.Column(db.String)      # I18
    customer_name = db.Column(db.String)      # I8
    datasource_types = db.Column(db.String)   # I78
    last_status_code = db.Column(db.Integer)  # D09F00
    last_run_at = db.Column(db.DateTime)      # D09F15
    colorbar_28d = db.Column(db.String)       # D09F08
    job_id = db.Column(db.Integer, db.ForeignKey("jobs.id"), nullable=True)  # None = unmatched
    first_seen_at = db.Column(db.DateTime)
    last_seen_at = db.Column(db.DateTime)

    job = db.relationship("Job")
```
### Deduplication
`external_id = f"cove-{account_id}-{run_ts}"` where `run_ts` is the Unix timestamp from `D09F15`.
Before creating a `JobRun`, check `JobRun.query.filter_by(external_id=external_id).first()`.
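The key construction is easy to isolate (a sketch of the check described above):

```python
def make_external_id(account_id: int, run_ts: int) -> str:
    # run_ts is the Unix end-of-session timestamp from column D09F15.
    return f"cove-{account_id}-{run_ts}"

# Usage in the importer (sketch):
#   external_id = make_external_id(acc.account_id, run_ts)
#   if JobRun.query.filter_by(external_id=external_id).first() is None:
#       ... create the JobRun ...
```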
### Background Thread
`cove_importer_service.py` — same pattern as `auto_importer_service.py`:
- Thread name: `"cove_importer"`
- Checks `settings.cove_import_enabled`
- Interval: `settings.cove_import_interval_minutes` (default 30)
- Calls `run_cove_import(settings)` which returns `(total, created, skipped, errors)`
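The thread pattern can be sketched generically (a simplified stand-in for `cove_importer_service.py`; the helper name and stop-event mechanism are illustrative):

```python
import threading

def start_poller(run_import, is_enabled, interval_seconds: float,
                 name: str = "cove_importer") -> threading.Event:
    # Returns a stop event; setting it ends the loop at the next wakeup.
    stop = threading.Event()

    def loop():
        while not stop.is_set():
            if is_enabled():
                run_import()  # in the real service: run_cove_import(settings)
            stop.wait(interval_seconds)

    threading.Thread(target=loop, name=name, daemon=True).start()
    return stop
```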
### Settings UI
Settings → Integrations → Cove section:
- Enable toggle, API URL, username, password (masked, only overwritten if non-empty)
- Import enabled + interval
- "Test Connection" button (AJAX → `POST /settings/cove/test-connection`) returns `{ok, partner_id, message}`
- "Run import now" button (→ `POST /settings/cove/run-now`) triggers manual import
### Routes
| Route | Method | Description |
|-------|--------|-------------|
| `/cove/accounts` | GET | Inbox-style page: unmatched + matched accounts |
| `/cove/accounts/<id>/link` | POST | `action=create` or `action=link` |
| `/cove/accounts/<id>/unlink` | POST | Removes job link, puts account back in unmatched |
| `/settings/cove/test-connection` | POST | AJAX: verify credentials, save partner_id |
| `/settings/cove/run-now` | POST | Manual import trigger |
### Migrations
- `migrate_cove_integration()` — adds 8 columns to `system_settings`, `cove_account_id` to `jobs`, `source_type` + `external_id` to `job_runs`, dedup index on `job_runs.external_id`
- `migrate_cove_accounts_table()` — creates `cove_accounts` table with indexes
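The `system_settings` part of `migrate_cove_integration()` can be sketched as follows. Column types and the index name are assumptions inferred from the field descriptions; the real migration code is authoritative:

```python
COVE_SETTINGS_COLUMNS = [
    ("cove_enabled", "BOOLEAN DEFAULT FALSE"),
    ("cove_api_url", "TEXT"),
    ("cove_api_username", "TEXT"),
    ("cove_api_password", "TEXT"),
    ("cove_import_enabled", "BOOLEAN DEFAULT FALSE"),
    ("cove_import_interval_minutes", "INTEGER DEFAULT 30"),
    ("cove_partner_id", "INTEGER"),
    ("cove_last_import_at", "TIMESTAMP"),
]

def cove_migration_statements() -> list:
    # Idempotent DDL sketch for the 8 settings columns plus job/run additions.
    stmts = [f"ALTER TABLE system_settings ADD COLUMN IF NOT EXISTS {name} {ddl}"
             for name, ddl in COVE_SETTINGS_COLUMNS]
    stmts += [
        "ALTER TABLE jobs ADD COLUMN IF NOT EXISTS cove_account_id INTEGER",
        "ALTER TABLE job_runs ADD COLUMN IF NOT EXISTS source_type TEXT",
        "ALTER TABLE job_runs ADD COLUMN IF NOT EXISTS external_id TEXT",
        "CREATE INDEX IF NOT EXISTS ix_job_runs_external_id ON job_runs (external_id)",
    ]
    return stmts
```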
---
## Ticketing and Autotask (Critical Rules)
### Two Ticket Types
1. **Internal Tickets** (tickets table)
- Created manually or via Autotask integration
- Stored in `tickets` table with `ticket_code` (e.g., "T20250123.0001")
- Linked to runs via `ticket_job_runs` many-to-many table
- Scoped to jobs via `ticket_scopes` table
- Have `resolved_at` field for resolution tracking
- **Auto-propagation**: Automatically linked to new runs via `link_open_internal_tickets_to_run`
2. **Autotask Tickets** (job_runs columns)
- Created via Run Checks modal → "Create Autotask Ticket"
- Stored directly in JobRun columns: `autotask_ticket_id`, `autotask_ticket_number`, etc.
- When created, also creates matching internal ticket for legacy UI compatibility
- Have `autotask_ticket_deleted_at` field for deletion tracking
- Resolution tracked via matching internal ticket's `resolved_at` field
- **Auto-propagation**: Linked to new runs via two-strategy approach
### Ticket Propagation to New Runs
When a new JobRun is created (via email import OR missed run generation), `link_open_internal_tickets_to_run` ensures:
**Strategy 1: Internal ticket linking**
- Query finds tickets where: `COALESCE(ts.resolved_at, t.resolved_at) IS NULL`
- Creates `ticket_job_runs` links automatically
- Tickets remain visible until explicitly resolved
- **NO date-based logic** - resolved = immediately hidden from new runs
**Strategy 2: Autotask ticket propagation (independent)**
1. Check if internal ticket code exists → find matching Autotask run → copy ticket info
2. If no match, directly search for most recent Autotask ticket on job where:
- `autotask_ticket_deleted_at IS NULL` (not deleted in PSA)
- Internal ticket `resolved_at IS NULL` (not resolved in PSA)
3. Copy `autotask_ticket_id`, `autotask_ticket_number`, `created_at`, `created_by_user_id` to new run
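Strategy 1's open-ticket test reduces to the COALESCE rule (a pure-function restatement, not the actual query code):

```python
from datetime import datetime
from typing import Optional

def ticket_is_open(t_resolved: Optional[datetime],
                   ts_resolved: Optional[datetime]) -> bool:
    # Mirrors COALESCE(ts.resolved_at, t.resolved_at) IS NULL:
    # the scope-level resolution wins when present, else the ticket's own.
    effective = ts_resolved if ts_resolved is not None else t_resolved
    return effective is None
```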
### Where Ticket Linking is Called
`link_open_internal_tickets_to_run` is invoked from three call sites, across two contexts:
1. **Email-based runs**: `routes_inbox.py` and `mail_importer.py` - after creating JobRun from parsed email
2. **Missed runs**: `routes_run_checks.py` in `_ensure_missed_runs_for_job` - after creating missed JobRun records
- Weekly schedule: After creating weekly missed run (with flush to get run.id)
- Monthly schedule: After creating monthly missed run (with flush to get run.id)
- **Critical**: Without this call, missed runs don't get ticket propagation!
### Display Logic - Link-Based System
All pages use **explicit link-based queries** (no date-based logic):
**Job Details Page:**
- **Two sources** for ticket display:
1. Direct links (`ticket_job_runs WHERE job_run_id = X`) → always show (audit trail)
2. Active window (`ticket_scopes WHERE job_id = Y AND resolved_at IS NULL`) → only unresolved
- Result: Old runs keep their ticket references, new runs don't get resolved tickets
**Run Checks Main Page (Indicators 🎫):**
- Query: `ticket_scopes JOIN tickets WHERE job_id = X AND resolved_at IS NULL`
- Only shows indicator if unresolved tickets exist for the job
**Run Checks Popup Modal:**
- API: `/api/job-runs/<run_id>/alerts`
- **Two-source ticket display**:
1. Direct links: `tickets JOIN ticket_job_runs WHERE job_run_id = X`
2. Job-level scope: `tickets JOIN ticket_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date`
- Prevents duplicates by tracking seen ticket IDs
- Shows newly created tickets immediately (via scope) without waiting for resolve action
- **Two-source remark display**:
1. Direct links: `remarks JOIN remark_job_runs WHERE job_run_id = X`
2. Job-level scope: `remarks JOIN remark_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date` (with timezone-safe fallback from `start_date`)
- Prevents duplicates by tracking seen remark IDs
### Debug Logging for Ticket Linking (Reference)
If you need to debug ticket linking issues, add this to `link_open_internal_tickets_to_run` in `ticketing_utils.py` after the rows query:
```python
try:
    from .models import AuditLog
    details = []
    if rows:
        for tid, code, t_resolved, ts_resolved in rows:
            details.append(f"ticket_id={tid}, code={code}, t.resolved_at={t_resolved}, ts.resolved_at={ts_resolved}")
    else:
        details.append("No open tickets found for this job")
    audit = AuditLog(
        user="system", event_type="ticket_link_debug",
        message=f"link_open_internal_tickets_to_run called: run_id={run.id}, job_id={job.id}, found={len(rows)} ticket(s)",
        details="\n".join(details)
    )
    db.session.add(audit)
    db.session.commit()
except Exception:
    pass
```
Visible on Logging page under `event_type = "ticket_link_debug"`. Remove after debugging.
### Resolved vs Deleted
- **Resolved**: Ticket completed in Autotask (tracked in internal `tickets.resolved_at`)
- Stops propagating to new runs
- Ticket still exists in PSA
- Synced via PSA polling
- **Deleted**: Ticket removed from Autotask (tracked in `job_runs.autotask_ticket_deleted_at`)
- Also stops propagating
- Ticket no longer exists in PSA
- Rare operation
### Critical Rules
- ❌ **NEVER** use date-based resolved logic: `resolved_at >= run_date` OR `active_from_date <= run_date`
- ✅ Only show tickets that are ACTUALLY LINKED via `ticket_job_runs` table
- ✅ Resolved tickets stop linking immediately when resolved
- ✅ Old links preserved for audit trail (visible on old runs)
- ✅ All queries must use explicit JOIN to link tables
- ✅ Consistency: All pages use same "resolved = NULL" logic
- ✅ **CRITICAL**: Preserve description field during Autotask updates - must include "description" in optional_fields list
## UI and UX Notes
### Navbar
- Fixed-top positioning
- Collapses on mobile (hamburger menu)
- Dynamic padding adjustment via JavaScript (measures navbar height, adjusts main content padding-top)
- Role-based menu items (Admin sees more than Operator/Viewer)
### Status Badges
- Success: Green
- Warning: Yellow/Orange
- Failed/Error: Red
- Override applied: Blue badge
- Reviewed: Checkmark indicator
### Ticket Copy Functionality
- Copy button (⧉) available on both Run Checks and Job Details pages
- Allows quick copying of ticket numbers to clipboard
- Cross-browser compatible with three-tier fallback mechanism:
1. **Modern Clipboard API**: `navigator.clipboard.writeText()` - works in modern browsers with HTTPS
2. **Legacy execCommand**: `document.execCommand('copy')` - fallback for older browsers and Edge
3. **Prompt fallback**: `window.prompt()` - last resort if clipboard access fails
- Visual feedback: button changes to ✓ checkmark for 800ms after successful copy
- Implementation uses hidden textarea for execCommand method to ensure compatibility
- No user interaction required in modern browsers (direct copy)
### Checkbox Behavior
- All checkboxes on Inbox and Run Checks pages use `autocomplete="off"`
- Prevents browser from auto-selecting checkboxes after page reload
- Fixes issue where deleting items would cause same number of new items to be selected
### Customers to Jobs Navigation (2026-02-16)
- Customers page links each customer name to filtered Jobs view:
- `GET /jobs?customer_id=<customer_id>`
- Jobs route behavior:
- Accepts optional `customer_id` query parameter in `routes_jobs.py`.
- If set: returns jobs for that customer only.
- If not set: keeps default filter that hides jobs linked to inactive customers.
- Jobs UI behavior:
- Shows active filter banner with selected customer name.
- Provides "Clear filter" action back to unfiltered `/jobs`.
- Templates touched:
- `templates/main/customers.html`
- `templates/main/jobs.html`
### Global Grouped Search (2026-02-16)
- New route:
- `GET /search` in `main/routes_search.py`
- New UI:
- Navbar search form in `templates/layout/base.html`
- Grouped result page in `templates/main/search.html`
- Search behavior:
- Case-insensitive matching (`ILIKE`).
- `*` wildcard is supported and translated to SQL `%`.
- Automatic contains behavior is applied per term (`*term*`) when wildcard not explicitly set.
- Multi-term queries use AND across terms and OR across configured columns within each section.
- Per-section pagination is supported via query params: `p_inbox`, `p_customers`, `p_jobs`, `p_daily_jobs`, `p_run_checks`, `p_tickets`, `p_remarks`, `p_overrides`, `p_reports`.
- Pagination keeps search state for all sections while browsing one section.
- "Open <section>" links pass `q` to destination overview pages so page-level filtering matches the search term.
- Grouped sections:
- Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Remarks, Existing overrides, Reports.
- Daily Jobs search result details:
- Meta now includes expected run time, success indicator, and run count for the selected day.
- Link now opens Daily Jobs with modal auto-open using `open_job_id` query parameter (same modal flow as clicking a row in Daily Jobs).
- Access control:
- Search results are role-aware and only show sections/data the active role can access.
- `run_checks` results are restricted to `admin`/`operator`.
- `reports` supports `admin`/`operator`/`viewer`/`reporter`.
- Current performance strategy:
- Per-section limit (`SEARCH_LIMIT_PER_SECTION = 10`), with total count per section.
- No schema migration required for V1.
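The wildcard-to-ILIKE translation and the AND/OR combination can be sketched as below (helper names are hypothetical; `routes_search.py` is authoritative):

```python
def term_to_ilike(term: str) -> str:
    # Explicit "*" maps to SQL "%"; otherwise the term gets contains semantics.
    if "*" in term:
        return term.replace("*", "%")
    return f"%{term}%"

def build_section_filters(query: str, columns: list) -> list:
    # AND across terms; each inner list is OR'd across the section's columns.
    return [[f"{col} ILIKE '{term_to_ilike(t)}'" for col in columns]
            for t in query.split()]
```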
## Feedback Module with Screenshots
- Models: `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`.
- Attachments:
- multiple uploads, type validation, per-file size limits, storage in database (BYTEA).
- Delete strategy:
- soft delete by default,
- permanent delete only for admins and only after soft delete.
## Validation Snapshot
- 2026-02-16: Test build + push succeeded via `update-and-build.sh t`.
- Pushed image: `gitea.oskamp.info/ivooskamp/backupchecks:dev`.
- 2026-02-16: Test build + push succeeded on branch `v20260216-02-global-search`.
- Pushed image digest: `sha256:6996675b9529426fe2ad58b5f353479623f3ebe24b34552c17ad0421d8a7ee0f`.
- 2026-02-16: Additional test build + push cycles succeeded on `v20260216-02-global-search`.
- Latest pushed image digest: `sha256:8ec8bfcbb928e282182fa223ce8bf7f92112d20e79f4a8602d015991700df5d7`.
- 2026-02-16: Additional test build + push cycles succeeded after search enhancements.
- Latest pushed image digest: `sha256:b36b5cdd4bc7c4dadedca0534f1904a6e12b5b97abc4f12bc51e42921976f061`.
## Deployment and Operations
- Stack exposes:
- app on `8080`
- adminer on `8081`
- PostgreSQL persistent volume:
- `/docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data`
- `deploy/backupchecks-stack.yml` also contains example `.env` variables at the bottom.
## Build/Release Flow
File: `build-and-push.sh`
- Bump options:
- `1` patch, `2` minor, `3` major, `t` test.
- Release build:
- update `version.txt`
- commit + tag + push
- docker push of `:<version>`, `:dev`, `:latest`
- Test build:
- only `:dev`
- no commit/tag.
- Services are discovered under `containers/*` with Dockerfile-per-service.
## Technical Observations / Attention Points
- `README.md` is currently empty; quick-start entry context is missing.
- `LICENSE` is currently empty.
- `docs/architecture.md` is currently empty.
- `deploy/backupchecks-stack.yml` contains hardcoded example values (`Changeme`), with risk if used without proper secrets management.
- The app performs DB initialization + migrations at startup; for larger schema changes this can impact startup time/robustness.
- There is significant parser and ticketing complexity; route changes carry regression risk without targeted testing.
- For Autotask update calls, the `description` field must be explicitly preserved to prevent unintended NULL overwrite.
- Security hygiene remains important:
- no customer names in parser examples/source,
- no hardcoded credentials.
## Quick References
- App entrypoint: `containers/backupchecks/src/backend/app/main.py`
- App factory: `containers/backupchecks/src/backend/app/__init__.py`
- Config: `containers/backupchecks/src/backend/app/config.py`
- Models: `containers/backupchecks/src/backend/app/models.py`
- Parsers: `containers/backupchecks/src/backend/app/parsers/registry.py`
- Ticketing utilities: `containers/backupchecks/src/backend/app/ticketing_utils.py`
- Run Checks routes: `containers/backupchecks/src/backend/app/main/routes_run_checks.py`
- Cove importer: `containers/backupchecks/src/backend/app/cove_importer.py`
- Cove routes: `containers/backupchecks/src/backend/app/main/routes_cove.py`
- Compose stack: `deploy/backupchecks-stack.yml`
- Build script: `build-and-push.sh`
## Recent Changes
### 2026-02-23
- **Cove Data Protection full integration**:
- `cove_importer.py` Cove API client (login, paginated enumeration, status mapping, deduplication, per-datasource object persistence)
- `cove_importer_service.py` background polling thread (same pattern as `auto_importer_service.py`)
- `CoveAccount` staging model + `migrate_cove_accounts_table()` migration
- `SystemSettings` 8 new Cove fields, `Job` `cove_account_id`, `JobRun` `source_type` + `external_id`
- `routes_cove.py` inbox-style `/cove/accounts` with link/unlink routes
- `cove_accounts.html` unmatched accounts shown first with Bootstrap modals (create job / link to existing), matched accounts with Unlink
- Settings > Integrations: Cove section with test connection (AJAX) and manual import trigger
- Navbar: "Cove Accounts" link for admin/operator when `cove_enabled`
- **Cove API key findings** (from test script + N-able support):
- Visa is returned at top level of login response, not inside `result`
- Settings per account are a list of single-key dicts `[{"D09F00":"5"}, ...]` — flatten with `flat.update(item)`
- EnumerateAccountStatistics params must use lowercase `query` key and `RecordsCount` (not `RecordCount`)
- Login params must use lowercase `username`/`password`
- D02/D03 are legacy; use D10/D11 or D09 (Total) instead
### 2026-02-19
- **Added 3CX Update parser support**: `threecx.py` now recognizes subject `3CX Notification: Update Successful - <host>` and stores it as informational with:
- `backup_software = 3CX`
- `backup_type = Update`
- `overall_status = Success`
- **3CX informational schedule behavior**:
- `3CX / Update` and `3CX / SSL Certificate` are excluded from schedule inference in `routes_shared.py` (no Expected/Missed generation).
- **Run Checks visibility scope (3CX-only)**:
- Run Checks now hides only non-backup 3CX informational jobs (`Update`, `SSL Certificate`).
- Other backup software/types remain visible and unchanged.
- **Fixed remark visibility mismatch**:
- `/api/job-runs/<run_id>/alerts` now loads remarks from both:
1. `remark_job_runs` (explicit run links),
2. `remark_scopes` (active job-scoped remarks),
- with duplicate prevention by remark ID.
- This resolves cases where the remark indicator appeared but remarks were not shown in Run Checks modal or Job Details modal.
### 2026-02-13
- **Fixed missed runs ticket propagation**: Added `link_open_internal_tickets_to_run` calls in `_ensure_missed_runs_for_job` (routes_run_checks.py) after creating both weekly and monthly missed JobRun records. Previously only email-based runs got ticket linking, causing missed runs to not show internal tickets or Autotask tickets. Required `db.session.flush()` before linking to ensure run.id is available.
- **Fixed checkbox auto-selection**: Added `autocomplete="off"` to all checkboxes on Inbox and Run Checks pages. Prevents browser from automatically re-selecting checkboxes after page reload following delete actions.
### 2026-02-12
- **Fixed Run Checks modal ticket display**: Implemented two-source display logic (ticket_job_runs + ticket_scopes). Previously only showed tickets after they were resolved (when ticket_job_runs entry was created). Now shows tickets immediately upon creation via scope query.
- **Fixed copy button in Edge**: Moved clipboard functions inside IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution).
### 2026-02-10
- **Added screenshot support to Feedback system**: Multiple file upload, inline display, two-stage delete (soft delete for audit trail, permanent delete for cleanup).
- **Completed transition to link-based ticket system**: All pages now use JOIN queries, no date-based logic. Added cross-browser copy ticket functionality with three-tier fallback mechanism to both Run Checks and Job Details pages.