Compare commits: main ... v20260219- (55 commits)
.gitignore (vendored, 7 changed lines): @@ -1,2 +1,9 @@

# Claude Code confidential files
.claude/

# Codex local workspace files
.codex/

# Python cache artifacts
__pycache__/
*.pyc

@@ -1 +1 @@
- main
+ v20260219-02-3cx-update-informational

TODO-cove-data-protection.md (new file, 765 lines): @@ -0,0 +1,765 @@

# TODO: Cove Data Protection Integration

**Date:** 2026-02-10
**Status:** Research phase
**Priority:** Medium

---

## 🎯 Goal

Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring.

**Challenge:** Cove does NOT send email notifications like other backup systems (Veeam, Synology, NAKIVO), so we need an alternative method for importing backup status information.

---

## 🔍 Research Questions

### 1. API Availability
- [x] Does Cove Data Protection have a public API? **YES - Confirmed in documentation**
- [ ] **CRITICAL:** How to enable/activate API access? (settings location, admin portal?)
- [ ] What authentication method does the API use? (API key, OAuth, basic auth?)
- [ ] Which endpoints are available for backup status?
- [ ] Is there rate limiting on the API?
- [ ] Documentation URL: ?
- [ ] Is API access available in all Cove subscription tiers or only specific plans?

### 2. Data Structure
- [ ] What information can we retrieve per backup job?
  - Job name
  - Status (success/warning/failed)
  - Start/end time
  - Backup type
  - Client/device name
  - Error messages
  - Objects/files backed up
- [ ] Is there a webhook system available?
- [ ] How often should the API be polled?

### 3. Multi-Tenancy
- [ ] Does Cove support multi-tenant setups? (MSP use case)
- [ ] Can we monitor multiple customers/partners from one account?
- [ ] How are permissions/access managed?

### 4. Integration Strategy
- [ ] **Option A: Scheduled Polling**
  - Cron job that polls the API periodically
  - Parse results into JobRun records
  - Pro: Simple, consistent with current flow
  - Con: Delay between backup completion and registration in the system

- [ ] **Option B: Webhook/Push**
  - Cove sends notifications to our endpoint
  - Pro: Real-time updates
  - Con: Requires external endpoint, security considerations

- [ ] **Option C: Email Forwarding**
  - If Cove has email support after all (hidden setting?)
  - Pro: Reuses existing email import flow
  - Con: Possibly not available

---

## 📋 Technical Considerations

### Database Model
Current JobRun model expects:
- `mail_message_id` (FK) - how do we adapt this for API-sourced runs?
- Possible new field: `source_type` ("email" vs "api")
- Possible new field: `external_id` (Cove job ID)

### Parser System
The current parser system works with email content. For the API:
- New "parser" concept for API responses?
- Or direct JobRun creation without a parser layer?

### Architecture Options

**Option 1: Extend Email Import System**
```
API Poller → Pseudo-MailMessage → Existing Parser → JobRun
```
- Pro: Reuse existing flow
- Con: Hacky, email fields have no meaning

**Option 2: Parallel Import System**
```
API Poller → API Parser → JobRun (direct)
```
- Pro: Clean separation, no email dependency
- Con: Logic duplication

**Option 3: Unified Import Layer**
```
Email Import ─┐
              ├→ Unified Import → Common Processor → JobRun
API Import ───┘
```
- Pro: Future-proof, scalable
- Con: Larger refactor
---

## 🔧 Implementation Steps (After Research)

### Phase 0: API Access Activation (FIRST!)
**Critical step before any development can begin:**

1. [ ] **Find API activation location**
   - Check Cove admin portal/dashboard
   - Look in: Settings → API / Integrations / Developer section
   - Check: Account settings, Company settings, Partner settings
   - Search documentation for: "API activation", "API access", "enable API"

2. [ ] **Generate API credentials**
   - API key generation
   - Client ID / Client Secret (if OAuth)
   - Note: which user/role can generate API keys?

3. [ ] **Document API base URL**
   - Production API endpoint
   - Sandbox/test environment (if available)
   - Regional endpoints (EU vs US?)

4. [ ] **Document API authentication flow**
   - Header format (Bearer token, API key in header, query param?)
   - Token expiration and refresh
   - Rate limit headers to watch

5. [ ] **Find API documentation portal**
   - Developer documentation URL
   - Interactive API explorer (Swagger/OpenAPI?)
   - Code examples/SDKs
   - Support channels for API questions

**Resources to check:**
- Cove admin portal: https://backup.management (or similar)
- N-able partner portal
- Cove knowledge base / support docs
- Contact Cove support for API access instructions

### Phase 1: API Research & POC

**Step 1: Read Authentication Documentation** ✅ DOCUMENTATION FOUND!
- [x] API documentation located
- [ ] **Read:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/login.htm
- [ ] **Read:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/construct-a-call.htm
- [ ] Document the API base URL from the docs
- [ ] Document the authentication flow (likely JSON-RPC style, based on "construct-a-call")
- [ ] Note any required request format (headers, body structure)

**Step 2: Test Authentication**
- [ ] Determine token format (Bearer token? API key header? Query param?)
- [ ] Common authentication patterns to test:
  ```bash
  # Option 1: Bearer token
  curl -H "Authorization: Bearer YOUR_TOKEN" https://api.example.com/endpoint

  # Option 2: API key header
  curl -H "X-API-Key: YOUR_TOKEN" https://api.example.com/endpoint

  # Option 3: Custom header
  curl -H "X-Auth-Token: YOUR_TOKEN" https://api.example.com/endpoint
  ```
- [ ] Test with a simple endpoint (e.g., `/api/v1/status`, `/api/accounts`, `/api/devices`)

**Step 3: Discover Available Endpoints**
- [ ] Find the API documentation/reference
- [ ] Look for an OpenAPI/Swagger spec
- [ ] Key endpoints we need:
  - List customers/accounts
  - List backup devices/jobs
  - Get backup job history
  - Get backup job status/details
  - Get backup run results (success/failed/warnings)

**Step 4: Test Data Retrieval**
- [ ] Test listing customers (verify top-level access works)
- [ ] Test listing backup jobs for one customer
- [ ] Test retrieving details for one backup job
- [ ] Document the response format (JSON structure)
- [ ] Save example API responses for reference

**Step 5: Proof of Concept Script**
1. [ ] Create a standalone Python script (outside Backupchecks)
2. [ ] Test authentication and data retrieval
3. [ ] Parse the API response to extract key fields
4. [ ] Map Cove data → Backupchecks JobRun model
5. [ ] Document findings in this TODO

### Phase 2: Database Changes
1. [ ] Decide: extend the MailMessage model or add a new source type?
2. [ ] Migration: add `source_type` field to JobRun
3. [ ] Migration: add `external_id` field to JobRun
4. [ ] Update constraints/validations
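
The two migrations above amount to adding two columns next to the now-optional email FK. A throwaway sqlite sketch of the end state (the real change would go through the project's migration tooling, and `job_run` is a hypothetical table name):

```python
import sqlite3

# Illustrative only: demonstrates the proposed schema change, not the real migration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_run (id INTEGER PRIMARY KEY, mail_message_id INTEGER)")

# Migration: track where a run came from, and its external (Cove) identifier
conn.execute("ALTER TABLE job_run ADD COLUMN source_type TEXT NOT NULL DEFAULT 'email'")
conn.execute("ALTER TABLE job_run ADD COLUMN external_id TEXT")

# An API-sourced run has no mail message, but does have an external ID
conn.execute(
    "INSERT INTO job_run (mail_message_id, source_type, external_id) "
    "VALUES (NULL, 'api', 'cove-42')"
)
row = conn.execute("SELECT source_type, external_id FROM job_run").fetchone()
print(row)  # ('api', 'cove-42')
```

Defaulting `source_type` to `'email'` keeps every existing row valid without a backfill.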

### Phase 3: Import Mechanism
1. [ ] New file: `containers/backupchecks/src/backend/app/cove_importer.py`
2. [ ] API client for Cove
3. [ ] Data transformation to JobRun format
4. [ ] Error handling & retry logic
5. [ ] Logging & audit trail
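
A skeleton for the transformation step (item 3) might look like this. The response shape (a `Settings` dict keyed by column codes) is an assumption based on our limited testing, and every field name would need verifying against real responses:

```python
from dataclasses import dataclass


@dataclass
class CoveRun:
    """Fields JobRun creation would need; names are illustrative."""
    external_id: str
    device_name: str
    used_bytes: int
    source_type: str = "api"


def to_cove_run(row: dict) -> CoveRun:
    # Assumed shape: one row per device, column codes as keys under "Settings".
    settings = row.get("Settings", {})
    return CoveRun(
        external_id=str(settings.get("I1", "")),   # account/device identifier
        device_name=str(settings.get("I18", "")),  # computer/hostname
        used_bytes=int(settings.get("I14", 0)),    # storage bytes used
    )


run = to_cove_run({"Settings": {"I1": 7, "I18": "SRV01", "I14": "1048576"}})
print(run.device_name, run.used_bytes)  # SRV01 1048576
```

Keeping the transform a pure function makes it trivially unit-testable against saved example responses, independent of the API client.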

### Phase 4: Scheduling
1. [ ] Cron job/scheduled task for polling (every 15 min?)
2. [ ] Or: webhook endpoint if Cove supports it
3. [ ] Rate limiting & throttling
4. [ ] Duplicate detection (avoid double imports)
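
Duplicate detection (item 4) can key on the Cove identifier stored in the proposed `external_id` field. A minimal sketch with hypothetical dict-shaped runs:

```python
def new_runs(fetched: list[dict], known_external_ids: set[str]) -> list[dict]:
    """Keep only runs whose Cove ID we have not imported yet.

    Also deduplicates within the fetched batch itself, since overlapping
    polling windows can return the same run twice.
    """
    seen = set(known_external_ids)
    fresh = []
    for run in fetched:
        if run["external_id"] not in seen:
            seen.add(run["external_id"])
            fresh.append(run)
    return fresh


fetched = [{"external_id": "c-1"}, {"external_id": "c-2"}, {"external_id": "c-1"}]
result = new_runs(fetched, known_external_ids={"c-2"})
print(result)  # [{'external_id': 'c-1'}]
```

In practice `known_external_ids` would come from a `SELECT external_id FROM job_run WHERE source_type = 'api'` over a recent window, so the set stays small.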

### Phase 5: UI Updates
1. [ ] Job Details: indication that a job came from the API (not email)
2. [ ] No "Download EML" button for API-sourced runs
3. [ ] Possibly different metadata display

---

## 📚 References

### Cove Data Protection
- **Product name:** Cove Data Protection (formerly N-able Backup, SolarWinds Backup)
- **Website:** https://www.n-able.com/products/cove-data-protection
- **API Documentation Base:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/

### JSON API Documentation (Found!)

**Core Documentation:**
- 📘 **API Home:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/home.htm
- 🔑 **Login/Authentication:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/login.htm
- 🔧 **Construct API Calls:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/construct-a-call.htm

**Key Endpoints for Backupchecks:**
- 👥 **Enumerate Customers:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/enumerate-customers.htm
- 💻 **Enumerate Devices:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/enumerate-devices.htm
- 📊 **Enumerate Device Statistics:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/enumerate-device-statistics.htm

**Reference:**
- 📋 **API Column Codes:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/API-column-codes.htm
- 📋 **Legacy Column Codes:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/API-column-codes-legacy.htm
- 📐 **Schema Documentation:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/how-to-schema.htm

**Other Resources:**
- 🏗️ **Architecture Guide:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/Architecture-and-Security/Cove-Architecture-Guide.htm
- 🔒 **Security Guide:** https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/Architecture-and-Security/Cove-Security-Guide.htm

**Note:** The API docs live in an "unused" folder - likely legacy, but still functional!

### Similar Integrations
Other backup systems that use APIs:
- Veeam: has both email and a REST API
- Acronis: REST API available
- MSP360: API for management

### Resources
- [x] API documentation (found - see the JSON API links above)
- [ ] SDK/client libraries available?
- [ ] Community/forum for integration questions?
- [ ] Example code/integrations?

---

## ❓ Open Questions

1. **Performance:** How many Cove jobs do we need to monitor? (impact on polling frequency)
2. **Historical Data:** Can we retrieve old backup runs, or only new ones?
3. **Filtering:** Can we apply filters (only failed jobs, specific clients)?
4. **Authentication:** Where do we store Cove API credentials? (SystemSettings?)
5. **Multi-Account:** Do we support multiple Cove accounts? (MSP scenario)

---

## 🎯 Success Criteria

### Minimum Viable Product (MVP)
- [ ] Backup runs from Cove are automatically imported
- [ ] Status (success/warning/failed) displayed correctly
- [ ] Job name and timestamp available
- [ ] Visible in Daily Jobs & Run Checks
- [ ] Errors and warnings are shown

### Nice to Have
- [ ] Real-time import (webhook instead of polling)
- [ ] Backup object details (individual files/folders)
- [ ] Retry history
- [ ] Storage usage metrics
- [ ] Multi-tenant support

---

## ⚠️ Critical Limitations Discovered (2026-02-10)

### What the API CAN provide:
- ✅ Account/device identifiers (I1)
- ✅ Storage usage metrics (I14 - bytes used)
- ✅ Computer/hostname (I18)
- ✅ Numeric metrics (D01F00-D01F07, D09F00)
- ✅ Basic partner metadata

### What the API CANNOT provide (security restrictions):
- ❌ **Last backup timestamp** - no reliable date/time fields accessible
- ❌ **Backup status** (success/failed/warning) - no explicit status fields
- ❌ **Error messages** - the entire D02Fxx/D03Fxx ranges are blocked
- ❌ **Backup run history** - no detailed run information
- ❌ **Cross-customer aggregation** - API users are customer-scoped
- ❌ **Device enumeration** - EnumerateAccounts method blocked (error 13501)

### Root Cause
**Security error 13501** ("Operation failed because of security reasons") occurs when:
- Any restricted column code is requested in EnumerateAccountStatistics
- The EnumerateAccounts method is called (always fails)
- This applies even with SuperUser + SecurityOfficer roles

**Column restrictions are per-tenant and not documented.** The allow-list is extremely limited.

### Impact on Backupchecks Integration
**Current API access is insufficient for backup monitoring** because:
1. There is no way to determine whether a backup succeeded or failed
2. There are no error messages to display to users
3. There are no timestamps to track backup frequency
4. Backup "runs" cannot be imported in any meaningful way

**Possible with the current API:**
- Storage usage dashboard only
- Device inventory list
- But NOT backup status monitoring (the core Backupchecks function)

---

## 🔀 Decision Point: Integration Feasibility

### Option A: Implement Metrics-Only Integration
**Pros:**
- Can display storage usage per device
- Simple implementation
- Works with current API access

**Cons:**
- Does NOT meet the core Backupchecks requirement (backup status monitoring)
- No success/failure tracking
- No alerting on backup issues
- Limited value compared to email-based systems

**Effort:** Low (2-3 days)
**Value:** Low (storage metrics only, no backup monitoring)

### Option B: Request Expanded API Access from N-able ⭐ RECOMMENDED

**Contact N-able support and request:**
1. MSP-level API user capability (cross-customer access)
2. Access to restricted column codes:
   - Backup timestamps (last successful backup)
   - Status fields (success/warning/failed)
   - Error message fields (D02Fxx/D03Fxx)
   - Session/run history fields

**Pros:**
- Could enable full backup monitoring if granted
- Proper integration matching the other backup systems

**Cons:**
- Requires vendor cooperation
- No guarantee N-able will grant access
- Possible additional licensing costs?
- Timeline uncertain (support ticket process)

**Effort:** Unknown (depends on N-able's response)
**Value:** High (if successful)

---
### 📧 Support Ticket Template (Ready to Send)

**To:** N-able Cove Data Protection Support
**Subject:** API Access Request - Backup Monitoring Integration

**Email Body:**

```
Hello N-able Support Team,

We are developing a backup monitoring solution for MSPs and are integrating
with Cove Data Protection via the JSON-RPC API for our customers.

Current Situation:
- We have successfully authenticated with the API
- API endpoint: https://api.backup.management/jsonapi
- API user management: https://backup.management/#/api-users
- Method tested: EnumerateAccountStatistics
- Role: SuperUser + SecurityOfficer

Current Limitations (Blocking Integration):
We are encountering "Operation failed because of security reasons (error 13501)"
when attempting to access essential backup monitoring data:

1. Backup Status Fields
   - Cannot determine if backups succeeded, failed, or completed with warnings
   - Need access to status/result columns

2. Timestamp Information
   - Cannot access last backup date/time
   - Need reliable timestamp fields to track backup frequency

3. Error Messages
   - D02Fxx and D03Fxx column ranges are blocked
   - Cannot retrieve error details to show users what went wrong

4. API User Scope
   - API users are customer-scoped only
   - Need MSP-level API user capability for cross-customer monitoring

Impact:
Without access to these fields, we can only retrieve storage usage metrics,
which is insufficient for backup status monitoring - the core requirement
for our MSP customers.

Request:
Can you please:
1. Enable MSP-level API user creation for cross-customer access
2. Grant access to restricted column codes containing:
   - Backup status (success/failed/warning)
   - Last backup timestamps
   - Error messages and details
   - Session/run history
3. Provide documentation on the semantic meaning of column codes (especially
   D01F00-D01F07 and D09F00, which currently work)
4. OR suggest an alternative integration method if expanded API access is
   not available (webhooks, reporting API, email notifications, etc.)

Technical Details:
- Our test results are documented at:
  docs/cove_data_protection_api_calls_known_info.md (can provide upon request)
- Safe columns identified: I1, I14, I18, D01F00-D01F07, D09F00
- Restricted columns: entire D02Fxx and D03Fxx ranges

Use Case:
We need this integration to provide our MSP customers with centralized backup
monitoring across multiple backup vendors (Veeam, Synology, NAKIVO, and Cove).
Without proper API access, Cove customers cannot benefit from our monitoring
solution.

Please advise on the best path forward for enabling comprehensive backup
monitoring via the Cove API.

Thank you for your assistance.

Best regards,
[Your Name]
[Company Name]
[Contact Information]
```

**Alternative Contact Methods:**
- N-able Partner Portal support ticket
- Cove support email (if available)
- N-able account manager (if assigned)

---

### Option C: Alternative Integration Methods
Explore whether Cove has:
1. **Reporting API** (separate from JSON-RPC)
2. **Webhook system** (push notifications for backup events)
3. **Email notifications** (if available, use the existing email parser)
4. **Export/CSV reports** (scheduled export that can be imported)

**Effort:** Medium (research required)
**Value:** Unknown

### Option D: Defer Integration
**Wait until:**
- A customer requests Cove support specifically
- N-able improves the API capabilities
- An alternative integration method is discovered

**Pros:**
- No wasted effort on a limited implementation
- Focus on systems with better API support

**Cons:**
- Cove customers cannot use Backupchecks
- Competitive disadvantage if other MSPs support Cove

---
## 🎯 Recommended Next Steps

### Immediate (This Week)
1. **Decision:** Choose Option A, B, C, or D above
2. **If Option B (contact N-able):**
   - Open a support ticket with N-able
   - Reference API user creation at https://backup.management/#/api-users
   - Explain the need for expanded column access for a monitoring solution
   - Attach findings from `/docker/develop/cove_data_protection_api_calls_known_info.md`
   - Ask specifically for:
     - MSP-level API user creation
     - Access to backup status/timestamp columns
     - Documentation of column code semantics

3. **If Option C (alternative methods):**
   - Check the Cove portal for webhook/reporting settings
   - Search the N-able docs for "reporting API", "webhooks", "notifications"
   - Test whether email notifications can be enabled per customer

### Long Term (Future)
- Monitor the N-able API changelog for improvements
- Check whether other MSPs have found workarounds
- Consider partnering with N-able for the integration

---

## 🚀 Next Steps

### Immediate Actions (Ready to Start!)

**1. Read API Documentation** ✅ FOUND!
Priority reading order:
1. **Start here:** [Login/Auth](https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/login.htm) - how to authenticate with your token
2. **Then read:** [Construct a Call](https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/construct-a-call.htm) - request format
3. **Key endpoint:** [Enumerate Device Statistics](https://documentation.n-able.com/covedataprotection/USERGUIDE/documentation/Content/unused/service-management/json-api/enumerate-device-statistics.htm) - this likely has backup job data!

**What to extract from the docs:**
- API base URL/endpoint
- Request format (JSON-RPC? REST? POST body structure?)
- How to use the token in requests
- Response format examples
- Which fields contain backup status/results

**2. Quick API Test with Postman** (can be done now with the token!)

### Postman Setup Instructions

**Step 1: Create New Request**
1. Open Postman
2. Click "New" → "HTTP Request"
3. Name it "Cove API - Test Authentication"

**Step 2: Configure Request**
- **Method:** GET
- **URL:** Try these in order:
  1. `https://api.backup.management/api/accounts`
  2. `https://backup.management/api/accounts`
  3. `https://api.backup.management/api/customers`

**Step 3: Add Authentication (try both methods)**

**Option A: Bearer Token**
- Go to the "Authorization" tab
- Type: "Bearer Token"
- Token: `YOUR_TOKEN` (paste the token from backup.management)

**Option B: API Key in Header**
- Go to the "Headers" tab
- Add header:
  - Key: `X-API-Key`
  - Value: `YOUR_TOKEN`

**Step 4: Send Request and Analyze Response**

**Expected Results:**
- ✅ **200 OK** → Success! The API works; save this configuration
  - Copy the JSON response → we'll analyze the structure
  - Note which URL and auth method worked
  - Check for pagination info in the response

- ❌ **401 Unauthorized** → Wrong auth method
  - Try the other authentication option (Bearer vs X-API-Key)
  - Check that the token was copied correctly

- ❌ **404 Not Found** → Wrong endpoint URL
  - Try the alternative base URL (api.backup.management vs backup.management)
  - Try a different endpoint (/api/customers, /api/devices)

- ❌ **403 Forbidden** → Token works but permissions are insufficient
  - Verify the API user has the SuperUser role
  - Check the customer scope selection

**Step 5: Discover Available Endpoints**

Once authentication works, try these endpoints:
```
GET /api/accounts
GET /api/customers
GET /api/devices
GET /api/jobs
GET /api/statistics
GET /api/sessions
```

For each successful endpoint, save:
- The request in a Postman collection
- An example response in this TODO or a separate file
- Any query parameters (page, limit, filter, etc.)

**Step 6: Look for API Documentation**

Try these URLs in a browser or Postman:
- `https://api.backup.management/swagger`
- `https://api.backup.management/docs`
- `https://api.backup.management/api-docs`
- `https://backup.management/api/documentation`

**Step 7: Document Findings**

After successful testing, document in this TODO:
- ✅ Working API base URL
- ✅ Correct authentication method (Bearer vs header)
- ✅ List of available endpoints discovered
- ✅ JSON response structure examples
- ✅ Any pagination/filtering patterns
- ✅ Rate limits (check response headers: X-RateLimit-*)

### Postman Tips for This Project

**Save Everything:**
- Create a "Cove API" collection in Postman
- Save all working requests
- Export the collection to JSON for documentation

**Use Variables:**
- Create a Postman environment "Cove Production"
- Add variable: `cove_token` = your token
- Add variable: `cove_base_url` = the working base URL
- Use `{{cove_token}}` and `{{cove_base_url}}` in requests

**Check Response Headers:**
- Look for `X-RateLimit-Limit` (API call limits)
- Look for `X-RateLimit-Remaining` (calls left)
- Look for the `Link` header (pagination)
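
Those headers are conventions, not guarantees - Cove may send none of them. A small helper for pulling them out of a response-header dict once real testing starts:

```python
def rate_limit_info(headers: dict) -> dict:
    """Collect the conventional rate-limit/pagination headers, if present."""
    keys = ("X-RateLimit-Limit", "X-RateLimit-Remaining", "Link")
    return {k: headers[k] for k in keys if k in headers}


# Works on e.g. dict(response.headers) from the requests library;
# note that a plain dict is case-sensitive, unlike real HTTP headers.
print(rate_limit_info({"X-RateLimit-Limit": "100", "Content-Type": "application/json"}))
# {'X-RateLimit-Limit': '100'}
```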

**Save Response Examples:**
- For each endpoint, save an example response
- Use Postman's "Save Response" feature
- Or copy the JSON to a separate file for reference

**3. Document Findings**

**After successful Postman testing, update this TODO with:**

```markdown
## ✅ API Testing Results (Add after testing)

### Working Configuration
- **Base URL:** [fill in]
- **Authentication:** Bearer Token / X-API-Key header (circle one)
- **Token Location:** Authorization header / X-API-Key header (circle one)

### Available Endpoints Discovered
| Endpoint | Method | Purpose | Response Fields |
|----------|--------|---------|-----------------|
| /api/accounts | GET | List accounts | [list key fields] |
| /api/customers | GET | List customers | [list key fields] |
| /api/devices | GET | List backup devices | [list key fields] |
| /api/jobs | GET | List backup jobs | [list key fields] |

### Key Response Fields for Backupchecks Integration
From the backup job/session endpoint:
- Job ID: `[field name]`
- Job Name: `[field name]`
- Status: `[field name]` (values: success/warning/failed)
- Start Time: `[field name]`
- End Time: `[field name]`
- Customer/Device: `[field name]`
- Error Messages: `[field name]`
- Backup Objects: `[field name or nested path]`

### Pagination
- Method: [Link headers / page parameter / cursor / none]
- Page size: [default and max]
- Total count: [available in response?]

### Rate Limiting
- Limit: [X requests per Y time]
- Headers: [X-RateLimit-* header names]

### API Documentation URL
- [URL if found, or "Not found" if unavailable]
```

**Save Postman Collection:**
- Export the collection as JSON
- Save to: `/docker/develop/backupchecks/docs/cove-api-postman-collection.json`
- Or share a Postman workspace link in this TODO

**4. Create POC Script**
Once the API works, create a standalone Python test script:
```python
import requests

# Test script to retrieve Cove backup data
token = "YOUR_TOKEN"
base_url = "https://api.example.com"

headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}

# Get the list of customers
response = requests.get(f"{base_url}/api/customers", headers=headers, timeout=30)
response.raise_for_status()
print(response.json())
```

**5. Plan Integration**
Based on the POC results, decide on an architecture approach and start implementation.

**Status:** Ready for API testing - token available!

---

## 📝 Notes

- This TODO document should be updated after each research step
- Add API examples as soon as they are available
- Document edge cases and limitations
- Consider security implications (API key storage, rate limits, etc.)

### Current Status (2026-02-10)
- ✅ **Confirmed:** Cove Data Protection HAS API access (mentioned in documentation)
- ✅ **Found:** API user creation location in the Cove portal
- ✅ **Created:** API user with SuperUser role and token
- ✅ **Found:** Complete JSON API documentation (N-able docs site)
- ✅ **Tested:** API authentication and multiple methods (with ChatGPT assistance)
- ⚠️ **CRITICAL LIMITATION DISCOVERED:** API heavily restricted by a column allow-list
- ⚠️ **BLOCKER:** No reliable backup status (success/failed/warning) available via the API
- ⚠️ **BLOCKER:** No error messages, timestamps, or detailed run information accessible
- 🎯 **Next decision:** Determine whether a metrics-only integration is valuable OR contact N-able for expanded access

### Test Results Summary (see docs/cove_data_protection_api_calls_known_info.md)
- **Endpoint:** https://api.backup.management/jsonapi (JSON-RPC 2.0)
- **Authentication:** Login method → visa token → include in all subsequent calls
- **Working method:** EnumerateAccountStatistics (with limited columns)
- **Blocked method:** EnumerateAccounts (security error 13501)
- **Safe columns:** I1, I14, I18, D01F00-D01F07, D09F00
- **Restricted columns:** D02Fxx, D03Fxx ranges (cause the entire request to fail)
- **Scope limitation:** API users are customer-scoped, not MSP-level
|
||||
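The login → visa → EnumerateAccountStatistics flow described above can be sketched in Python. Beyond the endpoint URL and method names taken from the test results, everything here is an assumption (payload field names such as `params`, the `visa` placement, `PartnerId`, and the column query shape) and must be verified against the N-able JSON API documentation before use:

```python
import requests

# Endpoint confirmed during testing (see summary above).
JSONAPI_URL = "https://api.backup.management/jsonapi"


def build_login_payload(username: str, password: str) -> dict:
    # JSON-RPC 2.0 envelope for the Login method.
    # Field names inside "params" are assumptions, not verified.
    return {
        "jsonrpc": "2.0",
        "id": "1",
        "method": "Login",
        "params": {"username": username, "password": password},
    }


def build_stats_payload(visa: str, partner_id: int) -> dict:
    # EnumerateAccountStatistics restricted to columns that worked in
    # testing (I1, I14, I18). "PartnerId" is a hypothetical parameter name.
    return {
        "jsonrpc": "2.0",
        "id": "2",
        "visa": visa,
        "method": "EnumerateAccountStatistics",
        "params": {
            "query": {
                "PartnerId": partner_id,
                "Columns": ["I1", "I14", "I18"],
                "StartRecordNumber": 0,
                "RecordsCount": 100,
            }
        },
    }


def call(payload: dict) -> dict:
    # POST the JSON-RPC payload and return the decoded response body.
    resp = requests.post(JSONAPI_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()
```

The payload builders are kept separate from the network call so they can be inspected and unit-tested without hitting the live API.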
### API Credentials (Created)
- **Authentication:** Token-based
- **Role:** SuperUser (full access)
- **Scope:** Top-level customer (access to all sub-customers)
- **Token:** Generated (store securely!)
- **Portal URL:** https://backup.management
- **API User Management:** https://backup.management/#/api-users

**IMPORTANT:** Store the token in a secure location (password manager) - it cannot be retrieved again if lost!

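One way to keep the token out of the repository entirely is to load it from an environment variable at runtime; `COVE_API_TOKEN` is a hypothetical variable name chosen for this sketch:

```python
import os


def load_cove_token() -> str:
    # COVE_API_TOKEN is a hypothetical name; the point is to keep the
    # token out of source code and out of files committed to the repo.
    token = os.environ.get("COVE_API_TOKEN", "").strip()
    if not token:
        raise RuntimeError(
            "COVE_API_TOKEN is not set; fetch it from the password manager"
        )
    return token
```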
### Likely API Base URLs to Test
Based on portal URL `backup.management`:
1. `https://api.backup.management` (most common pattern)
2. `https://backup.management/api`
3. `https://api.backup.management/jsonapi` (some backup systems use this)
4. Check the API user page for hints or documentation links

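The candidate URLs above can be checked with a short probe script. This is a rough sketch; the HTTP getter is injectable so the function can be exercised offline:

```python
import requests

# Candidate base URLs from the list above.
CANDIDATE_URLS = [
    "https://api.backup.management",
    "https://backup.management/api",
    "https://api.backup.management/jsonapi",
]


def probe(urls, get=requests.get):
    # Map each URL to its HTTP status code, or an error string if the
    # request fails entirely (DNS failure, timeout, TLS error, ...).
    results = {}
    for url in urls:
        try:
            results[url] = get(url, timeout=10).status_code
        except requests.RequestException as exc:
            results[url] = f"error: {exc}"
    return results
```

A status code (even 401/405) suggests a live endpoint worth investigating; a connection error usually rules the URL out.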
### Possible Admin Portal Locations
Check these sections in the Cove dashboard:
- Settings → API Keys / Developer
- Settings → Integrations
- Account → API Access
- Partner Portal → API Management
- Company Settings → Advanced → API

### Support Channels
If API activation is not obvious:
- Cove support ticket: ask "How do I enable API access for backup monitoring?"
- N-able partner support (if MSP)
- Check the Cove community forums
- Review onboarding documentation for API mentions

Binary file not shown.
@@ -26,5 +26,6 @@ from . import routes_feedback  # noqa: F401
from . import routes_api  # noqa: F401
from . import routes_reporting_api  # noqa: F401
from . import routes_user_settings  # noqa: F401
from . import routes_search  # noqa: F401

__all__ = ["main_bp", "roles_required"]

@@ -16,9 +16,11 @@ def api_job_run_alerts(run_id: int):
tickets = []
remarks = []

# Tickets linked to this specific run
# Only show tickets that were explicitly linked via ticket_job_runs
# Tickets linked to this run:
# 1. Explicitly linked via ticket_job_runs (audit trail when resolved)
# 2. Linked to the job via ticket_scopes (active on run date)
try:
# First, get tickets explicitly linked to this run via ticket_job_runs
rows = (
db.session.execute(
text(
@@ -43,7 +45,11 @@ def api_job_run_alerts(run_id: int):
.all()
)

ticket_ids_seen = set()
for r in rows:
ticket_id = int(r.get("id"))
ticket_ids_seen.add(ticket_id)

resolved_at = r.get("resolved_at")
resolved_same_day = False
if resolved_at and run_date:
@@ -52,7 +58,62 @@ def api_job_run_alerts(run_id: int):

tickets.append(
{
"id": int(r.get("id")),
"id": ticket_id,
"ticket_code": r.get("ticket_code") or "",
"description": r.get("description") or "",
"start_date": _format_datetime(r.get("start_date")),
"active_from_date": str(r.get("active_from_date")) if r.get("active_from_date") else "",
"resolved_at": _format_datetime(r.get("resolved_at")) if r.get("resolved_at") else "",
"active": bool(active_now),
"resolved_same_day": bool(resolved_same_day),
}
)

# Second, get tickets linked to the job via ticket_scopes
# These are tickets that apply to the whole job (not just a specific run)
rows = (
db.session.execute(
text(
"""
SELECT DISTINCT t.id,
t.ticket_code,
t.description,
t.start_date,
t.resolved_at,
t.active_from_date
FROM tickets t
JOIN ticket_scopes ts ON ts.ticket_id = t.id
WHERE ts.job_id = :job_id
AND t.active_from_date <= :run_date
AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
ORDER BY t.start_date DESC
"""
),
{
"job_id": job.id if job else 0,
"run_date": run_date,
},
)
.mappings()
.all()
)

for r in rows:
ticket_id = int(r.get("id"))
# Skip if already added via ticket_job_runs
if ticket_id in ticket_ids_seen:
continue
ticket_ids_seen.add(ticket_id)

resolved_at = r.get("resolved_at")
resolved_same_day = False
if resolved_at and run_date:
resolved_same_day = _to_amsterdam_date(resolved_at) == run_date
active_now = r.get("resolved_at") is None

tickets.append(
{
"id": ticket_id,
"ticket_code": r.get("ticket_code") or "",
"description": r.get("description") or "",
"start_date": _format_datetime(r.get("start_date")),

@@ -63,7 +63,27 @@ def _get_or_create_settings_local():
@login_required
@roles_required("admin", "operator", "viewer")
def customers():
items = Customer.query.order_by(Customer.name.asc()).all()
q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

query = Customer.query
if q:
for pat in _patterns(q):
query = query.filter(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))

items = query.order_by(Customer.name.asc()).all()

settings = _get_or_create_settings_local()
autotask_enabled = bool(getattr(settings, "autotask_enabled", False))
@@ -105,6 +125,7 @@ def customers():
can_manage=can_manage,
autotask_enabled=autotask_enabled,
autotask_configured=autotask_configured,
q=q,
)

@@ -484,6 +505,7 @@ def customers_export():
@roles_required("admin", "operator")
def customers_import():
file = request.files.get("file")
include_autotask_ids = bool(request.form.get("include_autotask_ids"))
if not file or not getattr(file, "filename", ""):
flash("No file selected.", "warning")
return redirect(url_for("main.customers"))
@@ -520,6 +542,7 @@ def customers_import():
# Detect Autotask columns (backwards compatible - these are optional)
autotask_id_idx = None
autotask_name_idx = None
if include_autotask_ids:
if "autotask_company_id" in header:
autotask_id_idx = header.index("autotask_company_id")
if "autotask_company_name" in header:
@@ -561,7 +584,7 @@ def customers_import():
if active_val is not None:
existing.active = active_val
# Update Autotask mapping if provided in CSV
if autotask_company_id is not None:
if include_autotask_ids and autotask_company_id is not None:
existing.autotask_company_id = autotask_company_id
existing.autotask_company_name = autotask_company_name
existing.autotask_mapping_status = None  # Will be resynced
@@ -579,7 +602,10 @@ def customers_import():

try:
db.session.commit()
flash(f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}.", "success")
flash(
f"Import finished. Created: {created}, Updated: {updated}, Skipped: {skipped}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
"success",
)

# Audit logging
import json
@@ -588,6 +614,7 @@ def customers_import():
f"Imported customers from CSV",
details=json.dumps({
"format": "CSV",
"include_autotask_ids": include_autotask_ids,
"created": created,
"updated": updated,
"skipped": skipped
@@ -599,5 +626,3 @@ def customers_import():
flash("Failed to import customers.", "danger")

return redirect(url_for("main.customers"))

@@ -9,6 +9,21 @@ MISSED_GRACE_WINDOW = timedelta(hours=1)
@login_required
@roles_required("admin", "operator", "viewer")
def daily_jobs():
q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

# Determine target date (default: today) in Europe/Amsterdam
date_str = request.args.get("date")
try:
@@ -74,10 +89,21 @@ def daily_jobs():

weekday_idx = target_date.weekday()  # 0=Mon..6=Sun

jobs = (
jobs_query = (
Job.query.join(Customer, isouter=True)
.filter(Job.archived.is_(False))
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)
if q:
for pat in _patterns(q):
jobs_query = jobs_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)
jobs = (
jobs_query
.order_by(Customer.name.asc().nullslast(), Job.backup_software.asc(), Job.backup_type.asc(), Job.job_name.asc())
.all()
)
@@ -306,7 +332,7 @@ def daily_jobs():
)

target_date_str = target_date.strftime("%Y-%m-%d")
return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str)
return render_template("main/daily_jobs.html", rows=rows, target_date_str=target_date_str, q=q)


@main_bp.route("/daily-jobs/details")

@@ -1,5 +1,53 @@
from .routes_shared import *  # noqa: F401,F403
from .routes_shared import _format_datetime
from werkzeug.utils import secure_filename
import imghdr


# Allowed image extensions and max file size
ALLOWED_EXTENSIONS = {'png', 'jpg', 'jpeg', 'gif', 'webp'}
MAX_FILE_SIZE = 5 * 1024 * 1024  # 5 MB


def _validate_image_file(file):
"""Validate uploaded image file.

Returns (is_valid, error_message, mime_type)
"""
if not file or not file.filename:
return False, "No file selected", None

# Check file size
file.seek(0, 2)  # Seek to end
size = file.tell()
file.seek(0)  # Reset to beginning

if size > MAX_FILE_SIZE:
return False, f"File too large (max {MAX_FILE_SIZE // (1024*1024)}MB)", None

if size == 0:
return False, "Empty file", None

# Check extension
filename = secure_filename(file.filename)
if '.' not in filename:
return False, "File must have an extension", None

ext = filename.rsplit('.', 1)[1].lower()
if ext not in ALLOWED_EXTENSIONS:
return False, f"Only images allowed ({', '.join(ALLOWED_EXTENSIONS)})", None

# Verify it's actually an image by reading header
file_data = file.read()
file.seek(0)

image_type = imghdr.what(None, h=file_data)
if image_type is None:
return False, "Invalid image file", None

mime_type = f"image/{image_type}"

return True, None, mime_type


@main_bp.route("/feedback")
@@ -21,7 +69,14 @@ def feedback_page():
if sort not in ("votes", "newest", "updated"):
sort = "votes"

where = ["fi.deleted_at IS NULL"]
# Admin-only: show deleted items
show_deleted = False
if get_active_role() == "admin":
show_deleted = request.args.get("show_deleted", "0") in ("1", "true", "yes", "on")

where = []
if not show_deleted:
where.append("fi.deleted_at IS NULL")
params = {"user_id": int(current_user.id)}

if item_type:
@@ -58,6 +113,8 @@ def feedback_page():
fi.status,
fi.created_at,
fi.updated_at,
fi.deleted_at,
fi.deleted_by_user_id,
u.username AS created_by,
COALESCE(v.vote_count, 0) AS vote_count,
EXISTS (
@@ -95,6 +152,8 @@ def feedback_page():
"created_by": r["created_by"] or "-",
"vote_count": int(r["vote_count"] or 0),
"user_voted": bool(r["user_voted"]),
"is_deleted": bool(r["deleted_at"]),
"deleted_at": _format_datetime(r["deleted_at"]) if r["deleted_at"] else "",
}
)

@@ -105,6 +164,7 @@ def feedback_page():
status=status,
q=q,
sort=sort,
show_deleted=show_deleted,
)


@@ -135,6 +195,31 @@ def feedback_new():
created_by_user_id=int(current_user.id),
)
db.session.add(item)
db.session.flush()  # Get item.id for attachments

# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_new"))

filename = secure_filename(file.filename)
file_data = file.read()

attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=None,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)

db.session.commit()

flash("Feedback item created.", "success")
@@ -148,7 +233,8 @@ def feedback_new():
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_detail(item_id: int):
item = FeedbackItem.query.get_or_404(item_id)
if item.deleted_at is not None:
# Allow admins to view deleted items
if item.deleted_at is not None and get_active_role() != "admin":
abort(404)

vote_count = (
@@ -174,6 +260,15 @@ def feedback_detail(item_id: int):
resolved_by = User.query.get(item.resolved_by_user_id)
resolved_by_name = resolved_by.username if resolved_by else ""

# Get attachments for the main item (not linked to a reply)
item_attachments = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_item_id == item.id,
FeedbackAttachment.feedback_reply_id.is_(None),
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)

replies = (
FeedbackReply.query.filter(FeedbackReply.feedback_item_id == item.id)
@@ -181,6 +276,25 @@ def feedback_detail(item_id: int):
.all()
)

# Get attachments for each reply
reply_ids = [r.id for r in replies]
reply_attachments_list = []
if reply_ids:
reply_attachments_list = (
FeedbackAttachment.query.filter(
FeedbackAttachment.feedback_reply_id.in_(reply_ids)
)
.order_by(FeedbackAttachment.created_at.asc())
.all()
)

# Map reply_id -> list of attachments
reply_attachments_map = {}
for att in reply_attachments_list:
if att.feedback_reply_id not in reply_attachments_map:
reply_attachments_map[att.feedback_reply_id] = []
reply_attachments_map[att.feedback_reply_id].append(att)

reply_user_ids = sorted({int(r.user_id) for r in replies})
reply_users = (
User.query.filter(User.id.in_(reply_user_ids)).all() if reply_user_ids else []
@@ -196,6 +310,8 @@ def feedback_detail(item_id: int):
user_voted=bool(user_voted),
replies=replies,
reply_user_map=reply_user_map,
item_attachments=item_attachments,
reply_attachments_map=reply_attachments_map,
)

@main_bp.route("/feedback/<int:item_id>/reply", methods=["POST"])
@@ -222,6 +338,31 @@ def feedback_reply(item_id: int):
created_at=datetime.utcnow(),
)
db.session.add(reply)
db.session.flush()  # Get reply.id for attachments

# Handle file uploads (multiple files allowed)
files = request.files.getlist('screenshots')
for file in files:
if file and file.filename:
is_valid, error_msg, mime_type = _validate_image_file(file)
if not is_valid:
db.session.rollback()
flash(f"Screenshot error: {error_msg}", "danger")
return redirect(url_for("main.feedback_detail", item_id=item.id))

filename = secure_filename(file.filename)
file_data = file.read()

attachment = FeedbackAttachment(
feedback_item_id=item.id,
feedback_reply_id=reply.id,
filename=filename,
file_data=file_data,
mime_type=mime_type,
file_size=len(file_data),
)
db.session.add(attachment)

db.session.commit()

flash("Reply added.", "success")
@@ -308,3 +449,60 @@ def feedback_delete(item_id: int):

flash("Feedback item deleted.", "success")
return redirect(url_for("main.feedback_page"))


@main_bp.route("/feedback/<int:item_id>/permanent-delete", methods=["POST"])
@login_required
@roles_required("admin")
def feedback_permanent_delete(item_id: int):
"""Permanently delete a feedback item and all its attachments from the database.

This is a hard delete - the item and all associated data will be removed permanently.
Only available for items that are already soft-deleted.
"""
item = FeedbackItem.query.get_or_404(item_id)

# Only allow permanent delete on already soft-deleted items
if item.deleted_at is None:
flash("Item must be deleted first before permanent deletion.", "warning")
return redirect(url_for("main.feedback_detail", item_id=item.id))

# Get attachment count for feedback message
attachment_count = FeedbackAttachment.query.filter_by(feedback_item_id=item.id).count()

# Hard delete - CASCADE will automatically delete:
# - feedback_votes
# - feedback_replies
# - feedback_attachments (via replies CASCADE)
# - feedback_attachments (direct, via item CASCADE)
db.session.delete(item)
db.session.commit()

flash(f"Feedback item permanently deleted ({attachment_count} screenshot(s) removed).", "success")
return redirect(url_for("main.feedback_page", show_deleted="1"))


@main_bp.route("/feedback/attachment/<int:attachment_id>")
@login_required
@roles_required("admin", "operator", "reporter", "viewer")
def feedback_attachment(attachment_id: int):
"""Serve a feedback attachment image."""
attachment = FeedbackAttachment.query.get_or_404(attachment_id)

# Check if the feedback item is deleted - allow admins to view
item = FeedbackItem.query.get(attachment.feedback_item_id)
if not item:
abort(404)
if item.deleted_at is not None and get_active_role() != "admin":
abort(404)

# Serve the image
from flask import send_file
import io

return send_file(
io.BytesIO(attachment.file_data),
mimetype=attachment.mime_type,
as_attachment=False,
download_name=attachment.filename,
)

@@ -9,12 +9,28 @@ from ..ticketing_utils import link_open_internal_tickets_to_run
import time
import re
import html as _html
from sqlalchemy import cast, String


@main_bp.route("/inbox")
@login_required
@roles_required("admin", "operator", "viewer")
def inbox():
q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

try:
page = int(request.args.get("page", "1"))
except ValueError:
@@ -28,6 +44,18 @@ def inbox():
# Use location column if available; otherwise just return all
if hasattr(MailMessage, "location"):
query = query.filter(MailMessage.location == "inbox")
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(MailMessage.from_address, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.subject, "").ilike(pat, escape="\\"))
| (cast(MailMessage.received_at, String).ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.job_name, "").ilike(pat, escape="\\"))
| (func.coalesce(MailMessage.parse_result, "").ilike(pat, escape="\\"))
| (cast(MailMessage.parsed_at, String).ilike(pat, escape="\\"))
)

total_items = query.count()
total_pages = max(1, math.ceil(total_items / per_page)) if total_items else 1
@@ -79,6 +107,7 @@ def inbox():
customers=customer_rows,
can_bulk_delete=(get_active_role() in ("admin", "operator")),
is_admin=(get_active_role() == "admin"),
q=q,
)

@@ -13,12 +13,56 @@ from .routes_shared import (
@login_required
@roles_required("admin", "operator", "viewer")
def jobs():
# Join with customers for display
jobs = (
selected_customer_id = None
selected_customer_name = ""
q = (request.args.get("q") or "").strip()
customer_id_raw = (request.args.get("customer_id") or "").strip()
if customer_id_raw:
try:
selected_customer_id = int(customer_id_raw)
except ValueError:
selected_customer_id = None

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

base_query = (
Job.query
.filter(Job.archived.is_(False))
.outerjoin(Customer, Customer.id == Job.customer_id)
.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
)

if selected_customer_id is not None:
base_query = base_query.filter(Job.customer_id == selected_customer_id)
selected_customer = Customer.query.filter(Customer.id == selected_customer_id).first()
if selected_customer is not None:
selected_customer_name = selected_customer.name or ""
else:
# Default listing hides jobs for inactive customers.
base_query = base_query.filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))

if q:
for pat in _patterns(q):
base_query = base_query.filter(
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
)

# Join with customers for display
jobs = (
base_query
.add_columns(
Job.id,
Job.backup_software,
@@ -54,6 +98,9 @@ def jobs():
"main/jobs.html",
jobs=rows,
can_manage_jobs=can_manage_jobs,
selected_customer_id=selected_customer_id,
selected_customer_name=selected_customer_name,
q=q,
)

@@ -11,6 +11,16 @@ _OVERRIDE_DEFAULT_START_AT = datetime(1970, 1, 1)
def overrides():
can_manage = get_active_role() in ("admin", "operator")
can_delete = get_active_role() == "admin"
q = (request.args.get("q") or "").strip()

def _match_query(text: str, raw_query: str) -> bool:
hay = (text or "").lower()
tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
for tok in tokens:
needle = tok.lower().replace("*", "")
if needle and needle not in hay:
return False
return True

overrides_q = Override.query.order_by(Override.level.asc(), Override.start_at.desc()).all()

@@ -92,16 +102,31 @@ def overrides():

rows = []
for ov in overrides_q:
scope_text = _describe_scope(ov)
start_text = _format_datetime(ov.start_at)
end_text = _format_datetime(ov.end_at) if ov.end_at else ""
comment_text = ov.comment or ""
if q:
full_text = " | ".join([
ov.level or "",
scope_text,
start_text,
end_text,
comment_text,
])
if not _match_query(full_text, q):
continue

rows.append(
{
"id": ov.id,
"level": ov.level or "",
"scope": _describe_scope(ov),
"start_at": _format_datetime(ov.start_at),
"end_at": _format_datetime(ov.end_at) if ov.end_at else "",
"scope": scope_text,
"start_at": start_text,
"end_at": end_text,
"active": bool(ov.active),
"treat_as_success": bool(ov.treat_as_success),
"comment": ov.comment or "",
"comment": comment_text,
"match_status": ov.match_status or "",
"match_error_contains": ov.match_error_contains or "",
"match_error_mode": getattr(ov, "match_error_mode", None) or "",
@@ -122,6 +147,7 @@ def overrides():
jobs_for_select=jobs_for_select,
backup_software_options=backup_software_options,
backup_type_options=backup_type_options,
q=q,
)

@@ -398,4 +424,3 @@ def overrides_toggle(override_id: int):

flash("Override status updated.", "success")
return redirect(url_for("main.overrides"))

@@ -1,6 +1,6 @@
from .routes_shared import *  # noqa: F401,F403

from sqlalchemy import text
from sqlalchemy import text, cast, String
import json
import csv
import io
@@ -101,12 +101,33 @@ def api_reports_list():
if err is not None:
return err

rows = (
db.session.query(ReportDefinition)
.order_by(ReportDefinition.created_at.desc())
.limit(200)
.all()
q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)

rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
return {
"items": [
{

@@ -1,6 +1,7 @@
from .routes_shared import *  # noqa: F401,F403
from datetime import date, timedelta
from .routes_reporting_api import build_report_columns_meta, build_report_job_filters_meta
from sqlalchemy import cast, String

def get_default_report_period():
"""Return default report period (last 7 days)."""
@@ -52,13 +53,33 @@ def _build_report_item(r):
@main_bp.route("/reports")
@login_required
def reports():
q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

# Pre-render items so the page is usable even if JS fails to load/execute.
rows = (
db.session.query(ReportDefinition)
.order_by(ReportDefinition.created_at.desc())
.limit(200)
.all()
query = db.session.query(ReportDefinition)
if q:
for pat in _patterns(q):
query = query.filter(
(func.coalesce(ReportDefinition.name, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.report_type, "").ilike(pat, escape="\\"))
| (func.coalesce(ReportDefinition.output_format, "").ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_start, String).ilike(pat, escape="\\"))
| (cast(ReportDefinition.period_end, String).ilike(pat, escape="\\"))
)
rows = query.order_by(ReportDefinition.created_at.desc()).limit(200).all()
items = [_build_report_item(r) for r in rows]

period_start, period_end = get_default_report_period()
@@ -70,6 +91,7 @@ def reports():
job_filters_meta=build_report_job_filters_meta(),
default_period_start=period_start.isoformat(),
default_period_end=period_end.isoformat(),
q=q,
)

@@ -38,11 +38,19 @@ from ..models import (
TicketScope,
User,
)
from ..ticketing_utils import link_open_internal_tickets_to_run


AUTOTASK_TERMINAL_STATUS_IDS = {5}


def _is_hidden_3cx_non_backup(backup_software: str | None, backup_type: str | None) -> bool:
"""Hide non-backup 3CX informational jobs from Run Checks."""
bs = (backup_software or "").strip().lower()
bt = (backup_type or "").strip().lower()
return bs == "3cx" and bt in {"update", "ssl certificate"}


def _ensure_internal_ticket_for_autotask(
*,
ticket_number: str,
@@ -725,6 +733,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.flush()  # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1

d = d + timedelta(days=1)
@@ -806,6 +816,8 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
mail_message_id=None,
)
db.session.add(miss)
db.session.flush()  # Ensure miss.id is available for ticket linking
link_open_internal_tickets_to_run(run=miss, job=job)
inserted += 1

# Next month
@@ -825,6 +837,21 @@ def _ensure_missed_runs_for_job(job: Job, start_from: date, end_inclusive: date)
def run_checks_page():
"""Run Checks page: list jobs that have runs to review (including generated missed runs)."""

q = (request.args.get("q") or "").strip()

def _patterns(raw: str) -> list[str]:
out = []
for tok in [t.strip() for t in (raw or "").split() if t.strip()]:
p = tok.replace("\\", "\\\\")
p = p.replace("%", "\\%").replace("_", "\\_")
p = p.replace("*", "%")
if not p.startswith("%"):
p = "%" + p
if not p.endswith("%"):
p = p + "%"
out.append(p)
return out

include_reviewed = False
if get_active_role() == "admin":
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
@@ -850,6 +877,8 @@ def run_checks_page():
today_local = _to_amsterdam_date(datetime.utcnow()) or datetime.utcnow().date()

for job in jobs:
if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
continue
last_rev = last_reviewed_map.get(int(job.id))
if last_rev:
start_date = _to_amsterdam_date(last_rev) or settings_start
@@ -884,6 +913,14 @@ def run_checks_page():
.outerjoin(Customer, Customer.id == Job.customer_id)
|
||||
.filter(Job.archived.is_(False))
|
||||
)
|
||||
if q:
|
||||
for pat in _patterns(q):
|
||||
base = base.filter(
|
||||
(func.coalesce(Customer.name, "").ilike(pat, escape="\\"))
|
||||
| (func.coalesce(Job.backup_software, "").ilike(pat, escape="\\"))
|
||||
| (func.coalesce(Job.backup_type, "").ilike(pat, escape="\\"))
|
||||
| (func.coalesce(Job.job_name, "").ilike(pat, escape="\\"))
|
||||
)
|
||||
|
||||
# Runs to show in the overview: unreviewed (or all if admin toggle enabled)
|
||||
run_filter = []
|
||||
@ -956,7 +993,7 @@ def run_checks_page():
|
||||
Job.id.asc(),
|
||||
)
|
||||
|
||||
rows = q.limit(2000).all()
|
||||
rows = [r for r in q.limit(2000).all() if not _is_hidden_3cx_non_backup(r.backup_software, r.backup_type)]
|
||||
|
||||
# Ensure override flags are up-to-date for the runs shown in this overview.
|
||||
# The Run Checks modal computes override status on-the-fly, but the overview
|
||||
@ -1131,6 +1168,7 @@ def run_checks_page():
|
||||
is_admin=(get_active_role() == "admin"),
|
||||
include_reviewed=include_reviewed,
|
||||
autotask_enabled=autotask_enabled,
|
||||
q=q,
|
||||
)
|
||||
|
||||
|
||||
@ -1151,6 +1189,15 @@ def run_checks_details():
|
||||
include_reviewed = request.args.get("include_reviewed", "0") in ("1", "true", "yes", "on")
|
||||
|
||||
job = Job.query.get_or_404(job_id)
|
||||
if _is_hidden_3cx_non_backup(getattr(job, "backup_software", None), getattr(job, "backup_type", None)):
|
||||
job_payload = {
|
||||
"id": job.id,
|
||||
"customer_name": job.customer.name if job.customer else "",
|
||||
"backup_software": job.backup_software or "",
|
||||
"backup_type": job.backup_type or "",
|
||||
"job_name": job.job_name or "",
|
||||
}
|
||||
return jsonify({"status": "ok", "job": job_payload, "runs": [], "message": "This 3CX informational type is hidden from Run Checks."})
|
||||
|
||||
q = JobRun.query.filter(JobRun.job_id == job.id)
|
||||
if not include_reviewed:
|
||||
|
||||
containers/backupchecks/src/backend/app/main/routes_search.py  (new file, 963 lines)
@ -0,0 +1,963 @@
from .routes_shared import *  # noqa: F401,F403
from .routes_shared import (
    _apply_overrides_to_run,
    _format_datetime,
    _get_or_create_settings,
    _get_ui_timezone,
    _infer_monthly_schedule_from_runs,
    _infer_schedule_map_from_runs,
)

from sqlalchemy import and_, cast, func, or_, String
import math


SEARCH_LIMIT_PER_SECTION = 10
SEARCH_SECTION_KEYS = [
    "inbox",
    "customers",
    "jobs",
    "daily_jobs",
    "run_checks",
    "tickets",
    "remarks",
    "overrides",
    "reports",
]


def _is_section_allowed(section: str) -> bool:
    role = get_active_role()
    allowed = {
        "inbox": {"admin", "operator", "viewer"},
        "customers": {"admin", "operator", "viewer"},
        "jobs": {"admin", "operator", "viewer"},
        "daily_jobs": {"admin", "operator", "viewer"},
        "run_checks": {"admin", "operator"},
        "tickets": {"admin", "operator", "viewer"},
        "remarks": {"admin", "operator", "viewer"},
        "overrides": {"admin", "operator", "viewer"},
        "reports": {"admin", "operator", "viewer", "reporter"},
    }
    return role in allowed.get(section, set())


def _build_patterns(raw_query: str) -> list[str]:
    tokens = [t.strip() for t in (raw_query or "").split() if t.strip()]
    patterns: list[str] = []
    for token in tokens:
        p = token.replace("\\", "\\\\")
        p = p.replace("%", "\\%").replace("_", "\\_")
        p = p.replace("*", "%")
        if not p.startswith("%"):
            p = f"%{p}"
        if not p.endswith("%"):
            p = f"{p}%"
        patterns.append(p)
    return patterns


def _contains_all_terms(columns: list, patterns: list[str]):
    if not patterns or not columns:
        return None
    term_filters = []
    for pattern in patterns:
        per_term = [col.ilike(pattern, escape="\\") for col in columns]
        term_filters.append(or_(*per_term))
    return and_(*term_filters)
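`_contains_all_terms` combines the patterns as an AND of ORs: every search term must match at least one column. A plain-Python analogue of that semantics — the name `contains_all_terms` and the substring matching are illustrative stand-ins for the SQLAlchemy `and_`/`or_`/`ilike` expression, not code from the diff:

```python
def contains_all_terms(row_values, terms):
    """A row matches when every term appears in at least one column,
    mirroring the and_(or_(col ILIKE term, ...), ...) filter above."""
    return all(
        any(term.lower() in (v or "").lower() for v in row_values)
        for term in terms
    )

row = ("Acme BV", "Veeam", "VM Backup", "Daily VMs")
print(contains_all_terms(row, ["veeam", "daily"]))   # True: each term hits a column
print(contains_all_terms(row, ["veeam", "nakivo"]))  # False: "nakivo" matches nothing
print(contains_all_terms(row, []))                   # True: no terms means no filter
```

The empty-terms case corresponds to `_contains_all_terms` returning `None`, i.e. no filter is applied to the query.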


def _parse_page(value: str | None) -> int:
    try:
        page = int((value or "").strip())
    except Exception:
        page = 1
    return page if page > 0 else 1


def _paginate_query(query, page: int, order_by_cols: list):
    total = query.count()
    total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
    current_page = min(max(page, 1), total_pages)
    rows = (
        query.order_by(*order_by_cols)
        .offset((current_page - 1) * SEARCH_LIMIT_PER_SECTION)
        .limit(SEARCH_LIMIT_PER_SECTION)
        .all()
    )
    return total, current_page, total_pages, rows


def _enrich_paging(section: dict, total: int, current_page: int, total_pages: int) -> None:
    section["total"] = int(total or 0)
    section["current_page"] = int(current_page or 1)
    section["total_pages"] = int(total_pages or 1)
    section["has_prev"] = section["current_page"] > 1
    section["has_next"] = section["current_page"] < section["total_pages"]
    section["prev_url"] = ""
    section["next_url"] = ""
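`_paginate_query` ceil-divides the row count by `SEARCH_LIMIT_PER_SECTION` (10) and clamps the requested page into range before computing the query offset. The arithmetic, extracted into a hypothetical `page_window` helper so it can be checked without a database:

```python
import math

SEARCH_LIMIT_PER_SECTION = 10  # as defined at the top of routes_search.py

def page_window(total, page):
    """Mirror the clamping in _paginate_query: page count, clamped
    current page, and the resulting OFFSET."""
    total_pages = max(1, math.ceil(total / SEARCH_LIMIT_PER_SECTION)) if total else 1
    current_page = min(max(page, 1), total_pages)
    offset = (current_page - 1) * SEARCH_LIMIT_PER_SECTION
    return current_page, total_pages, offset

print(page_window(0, 5))    # (1, 1, 0)   empty result set still reports one page
print(page_window(25, 2))   # (2, 3, 10)
print(page_window(25, 99))  # (3, 3, 20)  out-of-range page clamps to the last one
```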


def _build_inbox_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "inbox",
        "title": "Inbox",
        "view_all_url": url_for("main.inbox"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("inbox"):
        return section

    query = MailMessage.query
    if hasattr(MailMessage, "location"):
        query = query.filter(MailMessage.location == "inbox")

    match_expr = _contains_all_terms(
        [
            func.coalesce(MailMessage.from_address, ""),
            func.coalesce(MailMessage.subject, ""),
            cast(MailMessage.received_at, String),
            func.coalesce(MailMessage.backup_software, ""),
            func.coalesce(MailMessage.backup_type, ""),
            func.coalesce(MailMessage.job_name, ""),
            func.coalesce(MailMessage.parse_result, ""),
            cast(MailMessage.parsed_at, String),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [MailMessage.received_at.desc().nullslast(), MailMessage.id.desc()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    for msg in rows:
        parsed_flag = bool(getattr(msg, "parsed_at", None) or (msg.parse_result or ""))
        section["items"].append(
            {
                "title": msg.subject or f"Message #{msg.id}",
                "subtitle": f"{msg.from_address or '-'} | {_format_datetime(msg.received_at)}",
                "meta": f"{msg.backup_software or '-'} / {msg.backup_type or '-'} / {msg.job_name or '-'} | Parsed: {'Yes' if parsed_flag else 'No'}",
                "link": url_for("main.inbox"),
            }
        )

    return section


def _build_customers_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "customers",
        "title": "Customers",
        "view_all_url": url_for("main.customers"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("customers"):
        return section

    query = Customer.query
    match_expr = _contains_all_terms([func.coalesce(Customer.name, "")], patterns)
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Customer.name.asc()],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for c in rows:
        try:
            job_count = c.jobs.count()
        except Exception:
            job_count = 0
        section["items"].append(
            {
                "title": c.name or f"Customer #{c.id}",
                "subtitle": f"Jobs: {job_count}",
                "meta": "Active" if c.active else "Inactive",
                "link": url_for("main.jobs", customer_id=c.id),
            }
        )

    return section


def _build_jobs_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "jobs",
        "title": "Jobs",
        "view_all_url": url_for("main.jobs"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("jobs"):
        return section

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Job.job_name.label("job_name"),
            Customer.name.label("customer_name"),
        )
        .select_from(Job)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc(),
            Job.backup_type.asc(),
            Job.job_name.asc(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": "",
                "link": url_for("main.job_detail", job_id=row.job_id),
            }
        )

    return section


def _build_daily_jobs_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "daily_jobs",
        "title": "Daily Jobs",
        "view_all_url": url_for("main.daily_jobs"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("daily_jobs"):
        return section

    try:
        tz = _get_ui_timezone()
    except Exception:
        tz = None

    try:
        target_date = datetime.now(tz).date() if tz else datetime.utcnow().date()
    except Exception:
        target_date = datetime.utcnow().date()

    settings = _get_or_create_settings()
    missed_start_date = getattr(settings, "daily_jobs_start_date", None)

    if tz:
        local_midnight = datetime(
            year=target_date.year,
            month=target_date.month,
            day=target_date.day,
            hour=0,
            minute=0,
            second=0,
            tzinfo=tz,
        )
        start_of_day = local_midnight.astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
        end_of_day = (local_midnight + timedelta(days=1)).astimezone(datetime_module.timezone.utc).replace(tzinfo=None)
    else:
        start_of_day = datetime(
            year=target_date.year,
            month=target_date.month,
            day=target_date.day,
            hour=0,
            minute=0,
            second=0,
        )
        end_of_day = start_of_day + timedelta(days=1)

    def _to_local(dt_utc):
        if not dt_utc or not tz:
            return dt_utc
        try:
            if dt_utc.tzinfo is None:
                dt_utc = dt_utc.replace(tzinfo=datetime_module.timezone.utc)
            return dt_utc.astimezone(tz)
        except Exception:
            return dt_utc

    def _bucket_15min(dt_utc):
        d = _to_local(dt_utc)
        if not d:
            return None
        minute_bucket = (d.minute // 15) * 15
        return f"{d.hour:02d}:{minute_bucket:02d}"

    def _is_success_status(value: str) -> bool:
        s = (value or "").strip().lower()
        if not s:
            return False
        return ("success" in s) or ("override" in s)

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.job_name.label("job_name"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Customer.name.label("customer_name"),
        )
        .select_from(Job)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
        .filter(db.or_(Customer.id.is_(None), Customer.active.is_(True)))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc(),
            Job.backup_type.asc(),
            Job.job_name.asc(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        expected_times = (_infer_schedule_map_from_runs(row.job_id).get(target_date.weekday()) or [])
        if not expected_times:
            monthly = _infer_monthly_schedule_from_runs(row.job_id)
            if monthly:
                try:
                    dom = int(monthly.get("day_of_month") or 0)
                except Exception:
                    dom = 0
                mtimes = monthly.get("times") or []
                try:
                    import calendar as _calendar
                    last_dom = _calendar.monthrange(target_date.year, target_date.month)[1]
                except Exception:
                    last_dom = target_date.day
                scheduled_dom = dom if (dom and dom <= last_dom) else last_dom
                if target_date.day == scheduled_dom:
                    expected_times = list(mtimes)

        runs_for_day = (
            JobRun.query.filter(
                JobRun.job_id == row.job_id,
                JobRun.run_at >= start_of_day,
                JobRun.run_at < end_of_day,
            )
            .order_by(JobRun.run_at.asc())
            .all()
        )
        run_count = len(runs_for_day)

        last_status = "-"
        expected_display = expected_times[-1] if expected_times else "-"
        if run_count > 0:
            last_run = runs_for_day[-1]
            try:
                job_obj = Job.query.get(int(row.job_id))
                status_display, _override_applied, _override_level, _ov_id, _ov_reason = _apply_overrides_to_run(job_obj, last_run)
                if getattr(last_run, "missed", False):
                    last_status = status_display or "Missed"
                else:
                    last_status = status_display or (last_run.status or "-")
            except Exception:
                last_status = last_run.status or "-"
            expected_display = _bucket_15min(last_run.run_at) or expected_display
        else:
            try:
                today_local = datetime.now(tz).date() if tz else datetime.utcnow().date()
            except Exception:
                today_local = datetime.utcnow().date()
            if target_date > today_local:
                last_status = "Expected"
            elif target_date == today_local:
                last_status = "Expected"
            else:
                if missed_start_date and target_date < missed_start_date:
                    last_status = "-"
                else:
                    last_status = "Missed"

        success_text = "Yes" if _is_success_status(last_status) else "No"
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": f"Expected: {expected_display} | Successful: {success_text} | Runs: {run_count}",
                "link": url_for("main.daily_jobs", date=target_date.strftime("%Y-%m-%d"), open_job_id=row.job_id),
            }
        )

    return section
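The `_bucket_15min` helper nested in `_build_daily_jobs_results` rounds a run's local time down to its quarter-hour slot for the "Expected" display. The bucketing arithmetic on its own, with the timezone conversion left out for brevity:

```python
from datetime import datetime

def bucket_15min(d: datetime) -> str:
    """Round a datetime down to the nearest 15-minute slot, as HH:MM."""
    minute_bucket = (d.minute // 15) * 15
    return f"{d.hour:02d}:{minute_bucket:02d}"

print(bucket_15min(datetime(2026, 2, 10, 9, 7)))    # 09:00
print(bucket_15min(datetime(2026, 2, 10, 8, 30)))   # 08:30
print(bucket_15min(datetime(2026, 2, 10, 23, 59)))  # 23:45
```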


def _build_run_checks_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "run_checks",
        "title": "Run Checks",
        "view_all_url": url_for("main.run_checks_page"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("run_checks"):
        return section

    agg = (
        db.session.query(
            JobRun.job_id.label("job_id"),
            func.count(JobRun.id).label("run_count"),
        )
        .filter(JobRun.reviewed_at.is_(None))
        .group_by(JobRun.job_id)
        .subquery()
    )

    query = (
        db.session.query(
            Job.id.label("job_id"),
            Job.job_name.label("job_name"),
            Job.backup_software.label("backup_software"),
            Job.backup_type.label("backup_type"),
            Customer.name.label("customer_name"),
            agg.c.run_count.label("run_count"),
        )
        .select_from(Job)
        .join(agg, agg.c.job_id == Job.id)
        .outerjoin(Customer, Customer.id == Job.customer_id)
        .filter(Job.archived.is_(False))
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Customer.name, ""),
            func.coalesce(Job.backup_software, ""),
            func.coalesce(Job.backup_type, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [
            Customer.name.asc().nullslast(),
            Job.backup_software.asc().nullslast(),
            Job.backup_type.asc().nullslast(),
            Job.job_name.asc().nullslast(),
        ],
    )
    _enrich_paging(section, total, current_page, total_pages)
    for row in rows:
        section["items"].append(
            {
                "title": row.job_name or f"Job #{row.job_id}",
                "subtitle": f"{row.customer_name or '-'} | {row.backup_software or '-'} / {row.backup_type or '-'}",
                "meta": f"Unreviewed runs: {int(row.run_count or 0)}",
                "link": url_for("main.run_checks_page"),
            }
        )

    return section


def _build_tickets_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "tickets",
        "title": "Tickets",
        "view_all_url": url_for("main.tickets_page"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("tickets"):
        return section

    query = (
        db.session.query(Ticket)
        .select_from(Ticket)
        .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
        .outerjoin(Customer, Customer.id == TicketScope.customer_id)
        .outerjoin(Job, Job.id == TicketScope.job_id)
    )

    match_expr = _contains_all_terms(
        [
            func.coalesce(Ticket.ticket_code, ""),
            func.coalesce(Customer.name, ""),
            func.coalesce(TicketScope.scope_type, ""),
            func.coalesce(TicketScope.backup_software, ""),
            func.coalesce(TicketScope.backup_type, ""),
            func.coalesce(TicketScope.job_name_match, ""),
            func.coalesce(Job.job_name, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    query = query.distinct()
    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [Ticket.start_date.desc().nullslast()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    for t in rows:
        customer_display = "-"
        scope_summary = "-"
        try:
            scope_rows = (
                db.session.query(
                    TicketScope.scope_type.label("scope_type"),
                    TicketScope.backup_software.label("backup_software"),
                    TicketScope.backup_type.label("backup_type"),
                    Customer.name.label("customer_name"),
                )
                .select_from(TicketScope)
                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
                .filter(TicketScope.ticket_id == t.id)
                .all()
            )
            customer_names = []
            for s in scope_rows:
                cname = getattr(s, "customer_name", None)
                if cname and cname not in customer_names:
                    customer_names.append(cname)
            if customer_names:
                customer_display = customer_names[0]
                if len(customer_names) > 1:
                    customer_display = f"{customer_display} +{len(customer_names)-1}"

            if scope_rows:
                s = scope_rows[0]
                bits = []
                if getattr(s, "scope_type", None):
                    bits.append(str(getattr(s, "scope_type")))
                if getattr(s, "backup_software", None):
                    bits.append(str(getattr(s, "backup_software")))
                if getattr(s, "backup_type", None):
                    bits.append(str(getattr(s, "backup_type")))
                scope_summary = " / ".join(bits) if bits else "-"
        except Exception:
            customer_display = "-"
            scope_summary = "-"

        section["items"].append(
            {
                "title": t.ticket_code or f"Ticket #{t.id}",
                "subtitle": f"{customer_display} | {scope_summary}",
                "meta": _format_datetime(t.start_date),
                "link": url_for("main.ticket_detail", ticket_id=t.id),
            }
        )

    return section
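Both the Tickets and the Remarks builders dedupe customer names across scope rows and render the first one with a `+N` suffix for the rest. That display rule, pulled out into an illustrative `customer_display` helper (not a name used in the diff):

```python
def customer_display(names):
    """First distinct customer name, plus '+N' for any further distinct
    ones; '-' when no scope row carries a customer."""
    seen = []
    for n in names:
        if n and n not in seen:
            seen.append(n)
    if not seen:
        return "-"
    if len(seen) > 1:
        return f"{seen[0]} +{len(seen) - 1}"
    return seen[0]

print(customer_display([]))                        # -
print(customer_display(["Acme"]))                  # Acme
print(customer_display(["Acme", "Acme", "Beta"]))  # Acme +1
```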
|
||||
|
||||
def _build_remarks_results(patterns: list[str], page: int) -> dict:
|
||||
section = {
|
||||
"key": "remarks",
|
||||
"title": "Remarks",
|
||||
"view_all_url": url_for("main.tickets_page", tab="remarks"),
|
||||
"total": 0,
|
||||
"items": [],
|
||||
"current_page": 1,
|
||||
"total_pages": 1,
|
||||
"has_prev": False,
|
||||
"has_next": False,
|
||||
"prev_url": "",
|
||||
"next_url": "",
|
||||
}
|
||||
if not _is_section_allowed("remarks"):
|
||||
return section
|
||||
|
||||
query = (
|
||||
db.session.query(Remark)
|
||||
.select_from(Remark)
|
||||
.outerjoin(RemarkScope, RemarkScope.remark_id == Remark.id)
|
||||
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
|
||||
.outerjoin(Job, Job.id == RemarkScope.job_id)
|
||||
)
|
||||
|
||||
match_expr = _contains_all_terms(
|
||||
[
|
||||
func.coalesce(Remark.title, ""),
|
||||
func.coalesce(Remark.body, ""),
|
||||
func.coalesce(Customer.name, ""),
|
||||
func.coalesce(RemarkScope.scope_type, ""),
|
||||
func.coalesce(RemarkScope.backup_software, ""),
|
||||
func.coalesce(RemarkScope.backup_type, ""),
|
||||
func.coalesce(RemarkScope.job_name_match, ""),
|
||||
func.coalesce(Job.job_name, ""),
|
||||
cast(Remark.start_date, String),
|
||||
cast(Remark.resolved_at, String),
|
||||
],
|
||||
patterns,
|
||||
)
|
||||
if match_expr is not None:
|
||||
query = query.filter(match_expr)
|
||||
|
||||
query = query.distinct()
|
||||
total, current_page, total_pages, rows = _paginate_query(
|
||||
query,
|
||||
page,
|
||||
[Remark.start_date.desc().nullslast()],
|
||||
)
|
||||
_enrich_paging(section, total, current_page, total_pages)
|
||||
|
||||
for r in rows:
|
||||
customer_display = "-"
|
||||
scope_summary = "-"
|
||||
try:
|
||||
scope_rows = (
|
||||
db.session.query(
|
||||
RemarkScope.scope_type.label("scope_type"),
|
||||
RemarkScope.backup_software.label("backup_software"),
|
||||
RemarkScope.backup_type.label("backup_type"),
|
||||
Customer.name.label("customer_name"),
|
||||
)
|
||||
.select_from(RemarkScope)
|
||||
.outerjoin(Customer, Customer.id == RemarkScope.customer_id)
|
||||
.filter(RemarkScope.remark_id == r.id)
|
||||
.all()
|
||||
)
|
||||
customer_names = []
|
||||
for s in scope_rows:
|
||||
cname = getattr(s, "customer_name", None)
|
||||
if cname and cname not in customer_names:
|
||||
customer_names.append(cname)
|
||||
if customer_names:
|
||||
customer_display = customer_names[0]
|
||||
if len(customer_names) > 1:
|
||||
customer_display = f"{customer_display} +{len(customer_names)-1}"
|
||||
|
||||
if scope_rows:
|
||||
s = scope_rows[0]
|
||||
bits = []
|
||||
if getattr(s, "scope_type", None):
|
||||
bits.append(str(getattr(s, "scope_type")))
|
||||
if getattr(s, "backup_software", None):
|
||||
bits.append(str(getattr(s, "backup_software")))
|
||||
if getattr(s, "backup_type", None):
|
||||
bits.append(str(getattr(s, "backup_type")))
|
||||
scope_summary = " / ".join(bits) if bits else "-"
|
||||
except Exception:
|
||||
customer_display = "-"
|
||||
scope_summary = "-"
|
||||
|
||||
preview = (r.title or r.body or "").strip()
|
||||
if len(preview) > 80:
|
||||
preview = preview[:77] + "..."
|
||||
|
||||
section["items"].append(
|
||||
{
|
||||
"title": preview or f"Remark #{r.id}",
|
||||
"subtitle": f"{customer_display} | {scope_summary}",
|
||||
"meta": _format_datetime(r.start_date),
|
||||
"link": url_for("main.remark_detail", remark_id=r.id),
|
||||
}
|
||||
)
|
||||
|
||||
return section
|
||||
|
||||
|
||||
def _build_overrides_results(patterns: list[str], page: int) -> dict:
|
||||
section = {
|
||||
"key": "overrides",
|
||||
"title": "Existing overrides",
|
||||
"view_all_url": url_for("main.overrides"),
|
||||
"total": 0,
|
||||
"items": [],
|
||||
"current_page": 1,
|
||||
"total_pages": 1,
|
||||
"has_prev": False,
|
||||
"has_next": False,
|
||||
"prev_url": "",
|
||||
"next_url": "",
|
||||
}
|
||||
if not _is_section_allowed("overrides"):
|
||||
return section
|
||||
|
||||
query = (
|
||||
db.session.query(
|
||||
Override.id.label("id"),
|
||||
Override.level.label("level"),
|
||||
Override.backup_software.label("backup_software"),
|
||||
Override.backup_type.label("backup_type"),
|
||||
Override.object_name.label("object_name"),
|
||||
Override.start_at.label("start_at"),
|
||||
Override.end_at.label("end_at"),
|
||||
Override.comment.label("comment"),
|
||||
Customer.name.label("customer_name"),
|
||||
Job.job_name.label("job_name"),
|
||||
)
|
||||
.select_from(Override)
|
||||
.outerjoin(Job, Job.id == Override.job_id)
|
||||
.outerjoin(Customer, Customer.id == Job.customer_id)
|
||||
)
|
||||
|
||||
match_expr = _contains_all_terms(
|
||||
[
|
||||
func.coalesce(Override.level, ""),
|
||||
func.coalesce(Customer.name, ""),
|
||||
func.coalesce(Override.backup_software, ""),
|
||||
func.coalesce(Override.backup_type, ""),
|
||||
func.coalesce(Job.job_name, ""),
|
||||
func.coalesce(Override.object_name, ""),
|
||||
cast(Override.start_at, String),
|
||||
cast(Override.end_at, String),
|
||||
func.coalesce(Override.comment, ""),
|
||||
],
|
||||
patterns,
|
||||
)
|
||||
if match_expr is not None:
|
||||
query = query.filter(match_expr)
|
||||
|
||||
total, current_page, total_pages, rows = _paginate_query(
|
||||
query,
|
||||
page,
|
||||
[Override.level.asc(), Override.start_at.desc()],
|
||||
)
|
||||
_enrich_paging(section, total, current_page, total_pages)
|
||||
for row in rows:
|
||||
scope_bits = []
|
||||
if row.customer_name:
|
||||
scope_bits.append(row.customer_name)
|
||||
if row.backup_software:
|
||||
scope_bits.append(row.backup_software)
|
||||
if row.backup_type:
|
||||
scope_bits.append(row.backup_type)
|
||||
if row.job_name:
|
||||
scope_bits.append(row.job_name)
|
||||
if row.object_name:
|
||||
scope_bits.append(f"object: {row.object_name}")
|
||||
        scope_text = " / ".join(scope_bits) if scope_bits else "All jobs"

        section["items"].append(
            {
                "title": (row.level or "override").capitalize(),
                "subtitle": scope_text,
                "meta": f"From {_format_datetime(row.start_at)} to {_format_datetime(row.end_at) if row.end_at else '-'} | {row.comment or ''}",
                "link": url_for("main.overrides"),
            }
        )

    return section


def _build_reports_results(patterns: list[str], page: int) -> dict:
    section = {
        "key": "reports",
        "title": "Reports",
        "view_all_url": url_for("main.reports"),
        "total": 0,
        "items": [],
        "current_page": 1,
        "total_pages": 1,
        "has_prev": False,
        "has_next": False,
        "prev_url": "",
        "next_url": "",
    }
    if not _is_section_allowed("reports"):
        return section

    query = ReportDefinition.query
    match_expr = _contains_all_terms(
        [
            func.coalesce(ReportDefinition.name, ""),
            func.coalesce(ReportDefinition.report_type, ""),
            cast(ReportDefinition.period_start, String),
            cast(ReportDefinition.period_end, String),
            func.coalesce(ReportDefinition.output_format, ""),
        ],
        patterns,
    )
    if match_expr is not None:
        query = query.filter(match_expr)

    total, current_page, total_pages, rows = _paginate_query(
        query,
        page,
        [ReportDefinition.created_at.desc()],
    )
    _enrich_paging(section, total, current_page, total_pages)

    can_edit = get_active_role() in ("admin", "operator", "reporter")
    for r in rows:
        section["items"].append(
            {
                "title": r.name or f"Report #{r.id}",
                "subtitle": f"{r.report_type or '-'} | {r.output_format or '-'}",
                "meta": f"{_format_datetime(r.period_start)} -> {_format_datetime(r.period_end)}",
                "link": (url_for("main.reports_edit", report_id=r.id) if can_edit else url_for("main.reports")),
            }
        )

    return section

@main_bp.route("/search")
@login_required
def search_page():
    query = (request.args.get("q") or "").strip()
    patterns = _build_patterns(query)

    requested_pages = {
        key: _parse_page(request.args.get(f"p_{key}"))
        for key in SEARCH_SECTION_KEYS
    }

    sections = []
    if patterns:
        sections.append(_build_inbox_results(patterns, requested_pages["inbox"]))
        sections.append(_build_customers_results(patterns, requested_pages["customers"]))
        sections.append(_build_jobs_results(patterns, requested_pages["jobs"]))
        sections.append(_build_daily_jobs_results(patterns, requested_pages["daily_jobs"]))
        sections.append(_build_run_checks_results(patterns, requested_pages["run_checks"]))
        sections.append(_build_tickets_results(patterns, requested_pages["tickets"]))
        sections.append(_build_remarks_results(patterns, requested_pages["remarks"]))
        sections.append(_build_overrides_results(patterns, requested_pages["overrides"]))
        sections.append(_build_reports_results(patterns, requested_pages["reports"]))
    else:
        sections = [
            {"key": "inbox", "title": "Inbox", "view_all_url": url_for("main.inbox"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "customers", "title": "Customers", "view_all_url": url_for("main.customers"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "jobs", "title": "Jobs", "view_all_url": url_for("main.jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "daily_jobs", "title": "Daily Jobs", "view_all_url": url_for("main.daily_jobs"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "run_checks", "title": "Run Checks", "view_all_url": url_for("main.run_checks_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "tickets", "title": "Tickets", "view_all_url": url_for("main.tickets_page"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "remarks", "title": "Remarks", "view_all_url": url_for("main.tickets_page", tab="remarks"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "overrides", "title": "Existing overrides", "view_all_url": url_for("main.overrides"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
            {"key": "reports", "title": "Reports", "view_all_url": url_for("main.reports"), "total": 0, "items": [], "current_page": 1, "total_pages": 1, "has_prev": False, "has_next": False, "prev_url": "", "next_url": ""},
        ]

    visible_sections = [s for s in sections if _is_section_allowed(s["key"])]
    current_pages = {
        s["key"]: int(s.get("current_page", 1) or 1)
        for s in sections
    }

    def _build_search_url(page_overrides: dict[str, int]) -> str:
        args = {"q": query}
        for key in SEARCH_SECTION_KEYS:
            args[f"p_{key}"] = int(page_overrides.get(key, current_pages.get(key, 1)))
        return url_for("main.search_page", **args)

    for s in visible_sections:
        key = s["key"]
        cur = int(s.get("current_page", 1) or 1)
        if query:
            if key == "inbox":
                s["view_all_url"] = url_for("main.inbox", q=query)
            elif key == "customers":
                s["view_all_url"] = url_for("main.customers", q=query)
            elif key == "jobs":
                s["view_all_url"] = url_for("main.jobs", q=query)
            elif key == "daily_jobs":
                s["view_all_url"] = url_for("main.daily_jobs", q=query)
            elif key == "run_checks":
                s["view_all_url"] = url_for("main.run_checks_page", q=query)
            elif key == "tickets":
                s["view_all_url"] = url_for("main.tickets_page", q=query)
            elif key == "remarks":
                s["view_all_url"] = url_for("main.tickets_page", tab="remarks", q=query)
            elif key == "overrides":
                s["view_all_url"] = url_for("main.overrides", q=query)
            elif key == "reports":
                s["view_all_url"] = url_for("main.reports", q=query)
        if s.get("has_prev"):
            prev_pages = dict(current_pages)
            prev_pages[key] = cur - 1
            s["prev_url"] = _build_search_url(prev_pages)
        if s.get("has_next"):
            next_pages = dict(current_pages)
            next_pages[key] = cur + 1
            s["next_url"] = _build_search_url(next_pages)

    total_hits = sum(int(s.get("total", 0) or 0) for s in visible_sections)

    return render_template(
        "main/search.html",
        query=query,
        sections=visible_sections,
        total_hits=total_hits,
        limit_per_section=SEARCH_LIMIT_PER_SECTION,
    )
@@ -585,6 +585,7 @@ def settings_jobs_export():
@roles_required("admin")
def settings_jobs_import():
    upload = request.files.get("jobs_file")
    include_autotask_ids = bool(request.form.get("include_autotask_ids"))
    if not upload or not upload.filename:
        flash("No import file was provided.", "danger")
        return redirect(url_for("main.settings", section="general"))
@@ -621,14 +622,17 @@ def settings_jobs_import():
            if not cust_name:
                continue

            autotask_company_id = None
            autotask_company_name = None
            if include_autotask_ids:
                # Read Autotask fields (backwards compatible - optional)
                autotask_company_id = cust_item.get("autotask_company_id")
                autotask_company_name = cust_item.get("autotask_company_name")

            existing_customer = Customer.query.filter_by(name=cust_name).first()
            if existing_customer:
                # Update Autotask mapping if provided
                if autotask_company_id is not None:
                # Update Autotask mapping only when explicitly allowed by import option.
                if include_autotask_ids and autotask_company_id is not None:
                    existing_customer.autotask_company_id = autotask_company_id
                    existing_customer.autotask_company_name = autotask_company_name
                    existing_customer.autotask_mapping_status = None  # Will be resynced
@@ -747,7 +751,7 @@ def settings_jobs_import():

    db.session.commit()
    flash(
        f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}.",
        f"Import completed. Customers created: {created_customers}, updated: {updated_customers}. Jobs created: {created_jobs}, updated: {updated_jobs}. Autotask IDs imported: {'yes' if include_autotask_ids else 'no'}.",
        "success",
    )

@@ -758,6 +762,7 @@ def settings_jobs_import():
        details=json.dumps({
            "format": "JSON",
            "schema": payload.get("schema"),
            "include_autotask_ids": include_autotask_ids,
            "customers_created": created_customers,
            "customers_updated": updated_customers,
            "jobs_created": created_jobs,

@@ -52,6 +52,7 @@ from ..models import (
    FeedbackItem,
    FeedbackVote,
    FeedbackReply,
    FeedbackAttachment,
    NewsItem,
    NewsRead,
    ReportDefinition,
@@ -678,6 +679,10 @@ def _infer_schedule_map_from_runs(job_id: int):
            return schedule
        if bs == 'qnap' and bt == 'firmware update':
            return schedule
        if bs == '3cx' and bt == 'update':
            return schedule
        if bs == '3cx' and bt == 'ssl certificate':
            return schedule
        if bs == 'syncovery' and bt == 'syncovery':
            return schedule
    except Exception:
@@ -993,4 +998,3 @@ def _next_ticket_code(now_utc: datetime) -> str:
        seq = 1

    return f"{prefix}{seq:04d}"


@@ -28,16 +28,32 @@ def tickets_page():

    if tab == "tickets":
        query = Ticket.query
        joined_scope = False
        if active_only:
            query = query.filter(Ticket.resolved_at.is_(None))
        if q:
            like_q = f"%{q}%"
            query = (
                query
                .outerjoin(TicketScope, TicketScope.ticket_id == Ticket.id)
                .outerjoin(Customer, Customer.id == TicketScope.customer_id)
                .outerjoin(Job, Job.id == TicketScope.job_id)
            )
            joined_scope = True
            query = query.filter(
                (Ticket.ticket_code.ilike(like_q))
                | (Ticket.description.ilike(like_q))
                | (Customer.name.ilike(like_q))
                | (TicketScope.scope_type.ilike(like_q))
                | (TicketScope.backup_software.ilike(like_q))
                | (TicketScope.backup_type.ilike(like_q))
                | (TicketScope.job_name_match.ilike(like_q))
                | (Job.job_name.ilike(like_q))
            )
            query = query.distinct()

        if customer_id or backup_software or backup_type:
            if not joined_scope:
                query = query.join(TicketScope, TicketScope.ticket_id == Ticket.id)
            if customer_id:
                query = query.filter(TicketScope.customer_id == customer_id)
@@ -322,4 +338,3 @@ def ticket_detail(ticket_id: int):
        scopes=scopes,
        runs=runs,
    )


@@ -1095,6 +1095,7 @@ def run_migrations() -> None:
    migrate_object_persistence_tables()
    migrate_feedback_tables()
    migrate_feedback_replies_table()
    migrate_feedback_attachments_table()
    migrate_tickets_active_from_date()
    migrate_tickets_resolved_origin()
    migrate_remarks_active_from_date()
@@ -1446,6 +1447,49 @@ def migrate_feedback_replies_table() -> None:
    print("[migrations] Feedback replies table ensured.")


def migrate_feedback_attachments_table() -> None:
    """Ensure feedback attachments table exists.

    Table:
    - feedback_attachments (screenshots/images for feedback items and replies)
    """
    engine = db.get_engine()
    with engine.begin() as conn:
        conn.execute(
            text(
                """
                CREATE TABLE IF NOT EXISTS feedback_attachments (
                    id SERIAL PRIMARY KEY,
                    feedback_item_id INTEGER NOT NULL REFERENCES feedback_items(id) ON DELETE CASCADE,
                    feedback_reply_id INTEGER REFERENCES feedback_replies(id) ON DELETE CASCADE,
                    filename VARCHAR(255) NOT NULL,
                    file_data BYTEA NOT NULL,
                    mime_type VARCHAR(64) NOT NULL,
                    file_size INTEGER NOT NULL,
                    created_at TIMESTAMP NOT NULL DEFAULT NOW()
                );
                """
            )
        )
        conn.execute(
            text(
                """
                CREATE INDEX IF NOT EXISTS idx_feedback_attachments_item
                    ON feedback_attachments (feedback_item_id);
                """
            )
        )
        conn.execute(
            text(
                """
                CREATE INDEX IF NOT EXISTS idx_feedback_attachments_reply
                    ON feedback_attachments (feedback_reply_id);
                """
            )
        )
    print("[migrations] Feedback attachments table ensured.")


def migrate_tickets_active_from_date() -> None:
    """Ensure tickets.active_from_date exists and is populated.


@@ -567,6 +567,23 @@ class FeedbackReply(db.Model):
    created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)


class FeedbackAttachment(db.Model):
    __tablename__ = "feedback_attachments"

    id = db.Column(db.Integer, primary_key=True)
    feedback_item_id = db.Column(
        db.Integer, db.ForeignKey("feedback_items.id", ondelete="CASCADE"), nullable=False
    )
    feedback_reply_id = db.Column(
        db.Integer, db.ForeignKey("feedback_replies.id", ondelete="CASCADE"), nullable=True
    )
    filename = db.Column(db.String(255), nullable=False)
    file_data = db.Column(db.LargeBinary, nullable=False)
    mime_type = db.Column(db.String(64), nullable=False)
    file_size = db.Column(db.Integer, nullable=False)
    created_at = db.Column(db.DateTime, default=datetime.utcnow, nullable=False)


class NewsItem(db.Model):
    __tablename__ = "news_items"


@@ -24,6 +24,10 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
    - SSL Certificate Renewal (informational)
      Subject: '3CX Notification: SSL Certificate Renewal - <host>'
      Body contains an informational message about the renewal.

    - Update Successful (informational)
      Subject: '3CX Notification: Update Successful - <host>'
      Body confirms update completion and healthy services.
    """
    subject = (msg.subject or "").strip()
    if not subject:
@@ -38,11 +42,16 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
        subject,
        flags=re.IGNORECASE,
    )
    m_update = re.match(
        r"^3CX Notification:\s*Update Successful\s*-\s*(.+)$",
        subject,
        flags=re.IGNORECASE,
    )

    if not m_backup and not m_ssl:
    if not m_backup and not m_ssl and not m_update:
        return False, {}, []

    job_name = (m_backup or m_ssl).group(1).strip()
    job_name = (m_backup or m_ssl or m_update).group(1).strip()

    body = (getattr(msg, "text_body", None) or getattr(msg, "body", None) or "")
    if not body:
@@ -60,6 +69,17 @@ def try_parse_3cx(msg: MailMessage) -> Tuple[bool, Dict, List[Dict]]:
        }
        return True, result, []

    # Update successful: store as tracked informational run
    if m_update:
        result = {
            "backup_software": "3CX",
            "backup_type": "Update",
            "job_name": job_name,
            "overall_status": "Success",
            "overall_message": body or None,
        }
        return True, result, []

    # Backup complete
    backup_file = None
    m_file = re.search(r"^\s*Backup\s+name\s*:\s*(.+?)\s*$", body, flags=re.IGNORECASE | re.MULTILINE)

@@ -157,6 +157,18 @@
      </li>
    {% endif %}
  </ul>
  <form method="get" action="{{ url_for('main.search_page') }}" class="d-flex me-3 mb-2 mb-lg-0" role="search" autocomplete="off">
    <input
      class="form-control form-control-sm me-2"
      type="search"
      name="q"
      placeholder="Search"
      aria-label="Search"
      value="{{ request.args.get('q','') if request.path == url_for('main.search_page') else '' }}"
      style="min-width: 220px;"
    />
    <button class="btn btn-outline-secondary btn-sm" type="submit">Search</button>
  </form>
  <span class="navbar-text me-3">
    <a class="text-decoration-none" href="{{ url_for('main.user_settings') }}">
      {{ current_user.username }} ({{ active_role }})

@@ -15,6 +15,10 @@

  <form method="post" action="{{ url_for('main.customers_import') }}" enctype="multipart/form-data" class="d-flex align-items-center gap-2 mb-0">
    <input type="file" name="file" accept=".csv,text/csv" class="form-control form-control-sm" required style="max-width: 420px;" />
    <div class="form-check mb-0">
      <input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_customers" name="include_autotask_ids" />
      <label class="form-check-label small" for="include_autotask_ids_customers">Include Autotask IDs</label>
    </div>
    <button type="submit" class="btn btn-outline-secondary btn-sm" style="white-space: nowrap;">Import CSV</button>
  </form>

@@ -45,7 +49,11 @@
  {% if customers %}
    {% for c in customers %}
      <tr>
        <td>{{ c.name }}</td>
        <td>
          <a href="{{ url_for('main.jobs', customer_id=c.id) }}" class="link-primary text-decoration-none">
            {{ c.name }}
          </a>
        </td>
        <td>
          {% if c.active %}
            <span class="badge bg-success">Active</span>

@@ -4,6 +4,9 @@
  <h2 class="mb-3">Daily Jobs</h2>

  <form method="get" class="row g-3 mb-3">
    {% if q %}
      <input type="hidden" name="q" value="{{ q }}" />
    {% endif %}
    <div class="col-auto">
      <label for="dj_date" class="form-label">Date</label>
      <input
@@ -771,9 +774,43 @@ if (tStatus) tStatus.textContent = '';
    });
  }

  function autoOpenJobFromQuery() {
    try {
      var params = new URLSearchParams(window.location.search || "");
      var openJobId = (params.get("open_job_id") || "").trim();
      if (!openJobId) {
        return;
      }

      var rows = document.querySelectorAll(".daily-job-row");
      var targetRow = null;
      rows.forEach(function (row) {
        if ((row.getAttribute("data-job-id") || "") === openJobId) {
          targetRow = row;
        }
      });

      if (!targetRow) {
        return;
      }

      targetRow.click();

      params.delete("open_job_id");
      var nextQuery = params.toString();
      var nextUrl = window.location.pathname + (nextQuery ? ("?" + nextQuery) : "");
      if (window.history && window.history.replaceState) {
        window.history.replaceState({}, document.title, nextUrl);
      }
    } catch (e) {
      // no-op
    }
  }

  document.addEventListener("DOMContentLoaded", function () {
    bindInlineCreateForms();
    attachDailyJobsHandlers();
    autoOpenJobFromQuery();
  });
})();
</script>

@@ -34,6 +34,16 @@
    <div class="col-6 col-md-3">
      <button class="btn btn-outline-secondary" type="submit">Apply</button>
    </div>
    {% if active_role == 'admin' %}
      <div class="col-12">
        <div class="form-check">
          <input class="form-check-input" type="checkbox" name="show_deleted" value="1" id="show_deleted" {% if show_deleted %}checked{% endif %} onchange="this.form.submit()">
          <label class="form-check-label" for="show_deleted">
            Show deleted items
          </label>
        </div>
      </div>
    {% endif %}
  </form>

  <div class="table-responsive">
@@ -46,6 +56,9 @@
        <th style="width: 160px;">Component</th>
        <th style="width: 120px;">Status</th>
        <th style="width: 170px;">Created</th>
        {% if active_role == 'admin' and show_deleted %}
          <th style="width: 140px;">Actions</th>
        {% endif %}
      </tr>
    </thead>
    <tbody>
@@ -56,20 +69,30 @@
      {% endif %}

      {% for i in items %}
        <tr>
        <tr {% if i.is_deleted %}style="opacity: 0.6; background-color: var(--bs-secondary-bg);"{% endif %}>
          <td>
            {% if not i.is_deleted %}
              <form method="post" action="{{ url_for('main.feedback_vote', item_id=i.id) }}">
                <input type="hidden" name="ref" value="list" />
                <button type="submit" class="btn btn-sm {% if i.user_voted %}btn-success{% else %}btn-outline-secondary{% endif %}">
                  + {{ i.vote_count }}
                </button>
              </form>
            {% else %}
              <span class="text-muted">+ {{ i.vote_count }}</span>
            {% endif %}
          </td>
          <td>
            <a href="{{ url_for('main.feedback_detail', item_id=i.id) }}">{{ i.title }}</a>
            {% if i.is_deleted %}
              <span class="badge text-bg-dark ms-2">Deleted</span>
            {% endif %}
            {% if i.created_by %}
              <div class="text-muted" style="font-size: 0.85rem;">by {{ i.created_by }}</div>
            {% endif %}
            {% if i.is_deleted and i.deleted_at %}
              <div class="text-muted" style="font-size: 0.85rem;">Deleted {{ i.deleted_at|local_datetime }}</div>
            {% endif %}
          </td>
          <td>
            {% if i.item_type == 'bug' %}
@@ -90,6 +113,15 @@
            <div>{{ i.created_at|local_datetime }}</div>
            <div class="text-muted" style="font-size: 0.85rem;">Updated {{ i.updated_at|local_datetime }}</div>
          </td>
          {% if active_role == 'admin' and show_deleted %}
            <td>
              {% if i.is_deleted %}
                <form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=i.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
                  <button type="submit" class="btn btn-sm btn-danger">Permanent Delete</button>
                </form>
              {% endif %}
            </td>
          {% endif %}
        </tr>
      {% endfor %}
    </tbody>

@@ -15,6 +15,9 @@
  {% else %}
    <span class="badge text-bg-warning">Open</span>
  {% endif %}
  {% if item.deleted_at %}
    <span class="badge text-bg-dark">Deleted</span>
  {% endif %}
  <span class="ms-2">by {{ created_by_name }}</span>
  </div>
</div>
@@ -29,6 +32,23 @@
    <div class="mb-2"><strong>Component:</strong> {{ item.component }}</div>
  {% endif %}
  <div style="white-space: pre-wrap;">{{ item.description }}</div>

  {% if item_attachments %}
    <div class="mt-3">
      <strong>Screenshots:</strong>
      <div class="d-flex flex-wrap gap-2 mt-2">
        {% for att in item_attachments %}
          <a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
            <img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
                 alt="{{ att.filename }}"
                 class="img-thumbnail"
                 style="max-height: 200px; max-width: 300px; cursor: pointer;"
                 title="Click to view full size" />
          </a>
        {% endfor %}
      </div>
    </div>
  {% endif %}
</div>
<div class="card-footer d-flex justify-content-between align-items-center">
  <div class="text-muted" style="font-size: 0.9rem;">
@@ -63,6 +83,22 @@
      </span>
    </div>
    <div style="white-space: pre-wrap;">{{ r.message }}</div>

    {% if r.id in reply_attachments_map %}
      <div class="mt-2">
        <div class="d-flex flex-wrap gap-2">
          {% for att in reply_attachments_map[r.id] %}
            <a href="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}" target="_blank">
              <img src="{{ url_for('main.feedback_attachment', attachment_id=att.id) }}"
                   alt="{{ att.filename }}"
                   class="img-thumbnail"
                   style="max-height: 150px; max-width: 200px; cursor: pointer;"
                   title="Click to view full size" />
            </a>
          {% endfor %}
        </div>
      </div>
    {% endif %}
  </div>
{% endfor %}
</div>
@@ -76,10 +112,15 @@
  <div class="card-body">
    <h5 class="card-title mb-3">Add reply</h5>
    {% if item.status == 'open' %}
      <form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}">
      <form method="post" action="{{ url_for('main.feedback_reply', item_id=item.id) }}" enctype="multipart/form-data">
        <div class="mb-2">
          <textarea class="form-control" name="message" rows="4" required></textarea>
        </div>
        <div class="mb-2">
          <label class="form-label">Screenshots (optional)</label>
          <input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
          <div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
        </div>
        <button type="submit" class="btn btn-primary">Post reply</button>
      </form>
    {% else %}
@@ -95,6 +136,16 @@
  <h2 class="h6">Actions</h2>

  {% if active_role == 'admin' %}
    {% if item.deleted_at %}
      {# Item is deleted - show permanent delete option #}
      <div class="alert alert-warning mb-2" style="font-size: 0.9rem;">
        This item is deleted.
      </div>
      <form method="post" action="{{ url_for('main.feedback_permanent_delete', item_id=item.id) }}" onsubmit="return confirm('Permanently delete this item and all screenshots? This cannot be undone!');">
        <button type="submit" class="btn btn-danger w-100">Permanent Delete</button>
      </form>
    {% else %}
      {# Item is not deleted - show normal actions #}
      {% if item.status == 'resolved' %}
        <form method="post" action="{{ url_for('main.feedback_resolve', item_id=item.id) }}" class="mb-2">
          <input type="hidden" name="action" value="reopen" />
@@ -110,6 +161,7 @@
        <form method="post" action="{{ url_for('main.feedback_delete', item_id=item.id) }}" onsubmit="return confirm('Delete this item?');">
          <button type="submit" class="btn btn-danger w-100">Delete</button>
        </form>
      {% endif %}
    {% else %}
      <div class="text-muted">Only administrators can resolve or delete items.</div>
    {% endif %}

@@ -6,7 +6,7 @@
  <a class="btn btn-outline-secondary" href="{{ url_for('main.feedback_page') }}">Back</a>
</div>

<form method="post" class="card">
<form method="post" enctype="multipart/form-data" class="card">
  <div class="card-body">
    <div class="row g-3">
      <div class="col-12 col-md-3">
@@ -28,6 +28,11 @@
        <label class="form-label">Component (optional)</label>
        <input type="text" name="component" class="form-control" />
      </div>
      <div class="col-12">
        <label class="form-label">Screenshots (optional)</label>
        <input type="file" name="screenshots" class="form-control" multiple accept="image/png,image/jpeg,image/jpg,image/gif,image/webp" />
        <div class="form-text">You can attach multiple screenshots (PNG, JPG, GIF, WEBP, max 5MB each)</div>
      </div>
    </div>
  </div>
  <div class="card-footer d-flex justify-content-end">

@@ -14,12 +14,12 @@
<div class="d-flex justify-content-between align-items-center my-2">
  <div>
    {% if has_prev %}
      <a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1) }}">Previous</a>
      <a class="btn btn-outline-secondary btn-sm" href="{{ url_for('main.inbox', page=page-1, q=q) }}">Previous</a>
    {% else %}
      <button class="btn btn-outline-secondary btn-sm" disabled>Previous</button>
    {% endif %}
    {% if has_next %}
      <a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1) }}">Next</a>
      <a class="btn btn-outline-secondary btn-sm ms-2" href="{{ url_for('main.inbox', page=page+1, q=q) }}">Next</a>
    {% else %}
      <button class="btn btn-outline-secondary btn-sm ms-2" disabled>Next</button>
    {% endif %}
@@ -73,7 +73,7 @@
  <tr>
    {% if can_bulk_delete %}
      <th scope="col" style="width: 34px;">
        <input class="form-check-input" type="checkbox" id="inbox_select_all" />
        <input class="form-check-input" type="checkbox" id="inbox_select_all" autocomplete="off" />
      </th>
    {% endif %}
    <th scope="col">From</th>
@@ -93,7 +93,7 @@
  <tr class="inbox-row" data-message-id="{{ row.id }}" style="cursor: pointer;">
    {% if can_bulk_delete %}
      <td onclick="event.stopPropagation();">
        <input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" />
        <input class="form-check-input inbox_row_cb" type="checkbox" value="{{ row.id }}" autocomplete="off" />
      </td>
    {% endif %}
    <td>{{ row.from_address }}</td>

@@ -287,6 +287,60 @@
(function () {
  var currentRunId = null;

  // Cross-browser copy to clipboard function
  function copyToClipboard(text, button) {
    // Method 1: Modern Clipboard API (works in most browsers with HTTPS)
    if (navigator.clipboard && navigator.clipboard.writeText) {
      navigator.clipboard.writeText(text)
        .then(function () {
          showCopyFeedback(button);
        })
        .catch(function () {
          // Fallback to method 2 if clipboard API fails
          fallbackCopy(text, button);
        });
    } else {
      // Method 2: Legacy execCommand method
      fallbackCopy(text, button);
    }
  }

  function fallbackCopy(text, button) {
    var textarea = document.createElement('textarea');
    textarea.value = text;
    textarea.style.position = 'fixed';
    textarea.style.opacity = '0';
    textarea.style.top = '0';
    textarea.style.left = '0';
    document.body.appendChild(textarea);
    textarea.focus();
    textarea.select();

    try {
      var successful = document.execCommand('copy');
      if (successful) {
        showCopyFeedback(button);
      } else {
        // If execCommand fails, use prompt as last resort
        window.prompt('Copy ticket number:', text);
      }
    } catch (err) {
      // If all else fails, show prompt
      window.prompt('Copy ticket number:', text);
    }

    document.body.removeChild(textarea);
  }

  function showCopyFeedback(button) {
    if (!button) return;
    var original = button.textContent;
    button.textContent = '✓';
    setTimeout(function () {
      button.textContent = original;
    }, 800);
  }

  function apiJson(url, opts) {
    opts = opts || {};
    opts.headers = opts.headers || {};
@@ -319,12 +373,14 @@
      html += '<div class="mb-2"><strong>Tickets</strong><div class="mt-1">';
      tickets.forEach(function (t) {
        var status = t.resolved_at ? 'Resolved' : 'Active';
        var ticketCode = (t.ticket_code || '').toString();
        html += '<div class="mb-2 border rounded p-2" data-alert-type="ticket" data-id="' + t.id + '">' +
          '<div class="d-flex align-items-start justify-content-between gap-2">' +
            '<div class="flex-grow-1 min-w-0">' +
              '<div class="text-truncate">' +
                '<span class="me-1" title="Ticket">🎫</span>' +
                '<span class="fw-semibold">' + escapeHtml(t.ticket_code || '') + '</span>' +
                '<span class="fw-semibold">' + escapeHtml(ticketCode) + '</span>' +
                '<button type="button" class="btn btn-sm btn-outline-secondary ms-2 py-0 px-1" title="Copy ticket number" data-action="copy-ticket" data-code="' + escapeHtml(ticketCode) + '">⧉</button>' +
                '<span class="ms-2 badge ' + (t.resolved_at ? 'bg-secondary' : 'bg-warning text-dark') + '">' + status + '</span>' +
              '</div>' +
            '</div>' +
@@ -371,7 +427,16 @@
        ev.preventDefault();
        var action = btn.getAttribute('data-action');
        var id = btn.getAttribute('data-id');
        if (!action || !id) return;
        if (!action) return;

        if (action === 'copy-ticket') {
          var code = btn.getAttribute('data-code') || '';
          if (!code) return;
          copyToClipboard(code, btn);
          return;
        }

        if (!id) return;
        if (action === 'resolve-ticket') {
          if (!confirm('Mark ticket as resolved?')) return;
          apiJson('/api/tickets/' + encodeURIComponent(id) + '/resolve', {method: 'POST', body: '{}'})

@@ -2,6 +2,16 @@
{% block content %}
<h2 class="mb-3">Jobs</h2>

{% if selected_customer_id %}
  <div class="alert alert-info d-flex justify-content-between align-items-center py-2" role="alert">
    <span>
      Filtered on customer:
      <strong>{{ selected_customer_name or ('#' ~ selected_customer_id) }}</strong>
    </span>
    <a href="{{ url_for('main.jobs') }}" class="btn btn-sm btn-outline-primary">Clear filter</a>
  </div>
{% endif %}

<div class="table-responsive">
  <table class="table table-sm table-hover align-middle">
    <thead class="table-light">

@ -422,7 +422,10 @@ function loadRawData() {

function loadReports() {
setTableLoading('Loading…');
fetch('/api/reports', { credentials: 'same-origin' })
var params = new URLSearchParams(window.location.search || '');
var q = (params.get('q') || '').trim();
var apiUrl = '/api/reports' + (q ? ('?q=' + encodeURIComponent(q)) : '');
fetch(apiUrl, { credentials: 'same-origin' })
.then(function (r) { return r.json(); })
.then(function (data) {
renderTable((data && data.items) ? data.items : []);

@ -48,7 +48,7 @@
<thead class="table-light">
<tr>
<th scope="col" style="width: 34px;">
<input class="form-check-input" type="checkbox" id="rc_select_all" />
<input class="form-check-input" type="checkbox" id="rc_select_all" autocomplete="off" />
</th>
<th scope="col">Customer</th>
<th scope="col">Backup</th>
@ -63,7 +63,7 @@
{% for r in rows %}
<tr class="rc-job-row" data-job-id="{{ r.job_id }}" style="cursor: pointer;">
<td onclick="event.stopPropagation();">
<input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" />
<input class="form-check-input rc_row_cb" type="checkbox" value="{{ r.job_id }}" autocomplete="off" />
</td>
<td>{{ r.customer_name }}</td>
<td>{{ r.backup_software }}</td>
@ -447,6 +447,60 @@ function escapeHtml(s) {
.replace(/'/g, "&#39;");
}

// Cross-browser copy to clipboard function
function copyToClipboard(text, button) {
// Method 1: Modern Clipboard API (works in most browsers with HTTPS)
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(text)
.then(function () {
showCopyFeedback(button);
})
.catch(function () {
// Fallback to method 2 if clipboard API fails
fallbackCopy(text, button);
});
} else {
// Method 2: Legacy execCommand method
fallbackCopy(text, button);
}
}

function fallbackCopy(text, button) {
var textarea = document.createElement('textarea');
textarea.value = text;
textarea.style.position = 'fixed';
textarea.style.opacity = '0';
textarea.style.top = '0';
textarea.style.left = '0';
document.body.appendChild(textarea);
textarea.focus();
textarea.select();

try {
var successful = document.execCommand('copy');
if (successful) {
showCopyFeedback(button);
} else {
// If execCommand fails, use prompt as last resort
window.prompt('Copy ticket number:', text);
}
} catch (err) {
// If all else fails, show prompt
window.prompt('Copy ticket number:', text);
}

document.body.removeChild(textarea);
}

function showCopyFeedback(button) {
if (!button) return;
var original = button.textContent;
button.textContent = '✓';
setTimeout(function () {
button.textContent = original;
}, 800);
}

function getSelectedJobIds() {
var cbs = table.querySelectorAll('tbody .rc_row_cb');
var ids = [];
@ -840,20 +894,7 @@ table.addEventListener('change', function (e) {
if (action === 'copy-ticket') {
var code = btn.getAttribute('data-code') || '';
if (!code) return;
if (navigator.clipboard && navigator.clipboard.writeText) {
navigator.clipboard.writeText(code)
.then(function () {
var original = btn.textContent;
btn.textContent = '✓';
setTimeout(function () { btn.textContent = original; }, 800);
})
.catch(function () {
// Fallback: select/copy via prompt
window.prompt('Copy ticket number:', code);
});
} else {
window.prompt('Copy ticket number:', code);
}
copyToClipboard(code, btn);
return;
}

75
containers/backupchecks/src/templates/main/search.html
Normal file
@ -0,0 +1,75 @@
{% extends "layout/base.html" %}
{% block content %}
<h2 class="mb-3">Search</h2>

{% if query %}
<p class="text-muted mb-3">
Query: <strong>{{ query }}</strong> | Total hits: <strong>{{ total_hits }}</strong>
</p>
{% else %}
<div class="alert alert-secondary py-2">
Enter a search term in the top navigation bar.
</div>
{% endif %}

{% for section in sections %}
<div class="card mb-3" id="search-section-{{ section['key'] }}" style="scroll-margin-top: 96px;">
<div class="card-header d-flex justify-content-between align-items-center">
<span>{{ section['title'] }} ({{ section['total'] }})</span>
<a href="{{ section['view_all_url'] }}" class="btn btn-sm btn-outline-secondary">Open {{ section['title'] }}</a>
</div>
{% if section['key'] == 'daily_jobs' %}
<div class="px-3 py-2 small text-muted border-bottom">
Note: The Daily Jobs page itself only shows results for the selected day. Search results can include matches that relate to jobs across other days.
</div>
{% endif %}
<div class="card-body p-0">
{% if section['items'] %}
<div class="table-responsive">
<table class="table table-sm mb-0 align-middle">
<thead class="table-light">
<tr>
<th>Result</th>
<th>Details</th>
<th>Meta</th>
</tr>
</thead>
<tbody>
{% for item in section['items'] %}
<tr>
<td>
{% if item.link %}
<a href="{{ item.link }}">{{ item.title }}</a>
{% else %}
{{ item.title }}
{% endif %}
</td>
<td>{{ item.subtitle }}</td>
<td>{{ item.meta }}</td>
</tr>
{% endfor %}
</tbody>
</table>
</div>
{% else %}
<div class="p-3 text-muted">No results in this section.</div>
{% endif %}
</div>
{% if section['total_pages'] > 1 %}
<div class="card-footer d-flex justify-content-between align-items-center small">
<span class="text-muted">
Page {{ section['current_page'] }} of {{ section['total_pages'] }} ({{ section['total'] }} results)
</span>
<div class="d-flex gap-2">
{% if section['has_prev'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['prev_url'] }}#search-section-{{ section['key'] }}">Previous</a>
{% endif %}
{% if section['has_next'] %}
<a class="btn btn-sm btn-outline-secondary" href="{{ section['next_url'] }}#search-section-{{ section['key'] }}">Next</a>
{% endif %}
</div>
</div>
{% endif %}
</div>
{% endfor %}
{% endblock %}
@ -528,8 +528,16 @@
<div class="col-md-4 d-flex align-items-end">
<button type="submit" class="btn btn-primary w-100">Import jobs</button>
</div>
<div class="col-12">
<div class="form-check">
<input class="form-check-input" type="checkbox" value="1" id="include_autotask_ids_jobs" name="include_autotask_ids" />
<label class="form-check-label" for="include_autotask_ids_jobs">
Include Autotask IDs from import file
</label>
</div>
</div>
<div class="col-md-8">
<div class="form-text">Use a JSON export created by this application.</div>
<div class="form-text">Use a JSON export created by this application. Leave Autotask IDs unchecked for sandbox/development environments with a different Autotask database.</div>
</div>
</div>
</form>

@ -2,9 +2,96 @@

This file documents all changes made to this project via Claude Code.

## [2026-02-10]
## [2026-02-19]

### Added
- Explicit `Include Autotask IDs` import option in the Approved Jobs JSON import form (Settings -> Maintenance)
- Explicit `Include Autotask IDs` import option in the Customers CSV import form

### Changed
- Approved Jobs import now only applies `autotask_company_id` and `autotask_company_name` when the import option is checked
- Customers CSV import now only applies Autotask mapping fields when the import option is checked
- Import success and audit output now includes whether Autotask IDs were imported
- 3CX parser now recognizes `3CX Notification: Update Successful - <host>` as an informational run with `backup_software: 3CX`, `backup_type: Update`, and `overall_status: Success`, and excludes this type from schedule inference (no Expected/Missed generation)
- Run Checks now hides only non-backup 3CX informational types (`Update`, `SSL Certificate`), while other backup software/types remain visible

## [2026-02-16]

### Added
- Customer-to-jobs navigation by making customer names clickable on the Customers page (`/jobs?customer_id=<id>`)
- Jobs page customer filter context UI with an active filter banner and a "Clear filter" action
- Global search page (`/search`) with grouped results for Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Existing overrides, and Reports
- Navbar search form to trigger global search from all authenticated pages
- Dedicated Remarks section in global search results (with paging and detail links), so remark records are searchable alongside tickets

### Changed
- `/jobs` route now accepts optional `customer_id` and returns only jobs for that customer when provided
- Default Jobs listing keeps inactive-customer filtering only when no `customer_id` filter is applied
- Updated `docs/technical-notes-codex.md` with a new "Last updated" date, Customers->Jobs navigation notes, and test build/push validation snapshot
- Search matching is now case-insensitive with wildcard support (`*`) and automatic contains behavior (`*term*`) per search term
- Global search visibility now only includes sections accessible to the currently active role
- Updated `docs/technical-notes-codex.md` with a dedicated Global Grouped Search section (route/UI/behavior/access rules) and latest test build digest for `v20260216-02-global-search`
- Global search now supports per-section pagination (previous/next), so results beyond the first 10 can be browsed per section while preserving current query/state
- Daily Jobs search result metadata now includes expected run time, success indicator, and run count for the selected day
- Daily Jobs search result links now open the same Daily Jobs modal flow via `open_job_id` (instead of only navigating to the overview page)
- Updated `docs/technical-notes-codex.md` with search pagination query params, Daily Jobs modal-open search behavior, and latest successful test-build digest
- Search pagination buttons now preserve scroll position by linking back to the active section anchor after page navigation
- "Open <section>" behavior now passes `q` into destination pages and applies page-level filtering, so opened overviews reflect the same search term
- Filtering support on Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Overrides, and Reports now accepts wildcard-enabled `q` terms from search
- Reports frontend loading (`/api/reports`) now forwards URL `q` so client-side refresh keeps the same filtered result set
- Daily Jobs search section UI now shows an explicit English note that the Daily Jobs page itself is day-scoped while search matches can reflect jobs across other days
- Updated `docs/technical-notes-codex.md` to include remarks in grouped search sections, `p_remarks` pagination key, q-forwarding to overview pages, and latest test-build digest

### Fixed
- `/search` page crash (`TypeError: 'builtin_function_or_method' object is not iterable`) by replacing Jinja dict access from `section.items` to `section['items']` in `templates/main/search.html`

## [2026-02-13]

### Added
- Added internal technical reference document `docs/technical-notes-codex.md` with repository structure, application architecture, processing flow, parser system rules, ticketing/Autotask constraints, feedback attachment notes, deployment/build workflow, and operational attention points

### Changed
- Changed `docs/technical-notes-codex.md` language from Dutch to English to align with project language rules for documentation

### Fixed
- Fixed Autotask tickets and internal tickets not being linked to missed runs by calling `link_open_internal_tickets_to_run` after creating missed JobRun records in `_ensure_missed_runs_for_job` (both weekly and monthly schedules), ensuring missed runs now receive the same ticket propagation as email-based runs
- Fixed checkboxes being automatically re-selected after delete actions on Inbox and Run Checks pages by adding `autocomplete="off"` attribute to all checkboxes, preventing browser from restoring previous checkbox states after page reload

## [2026-02-12]

### Fixed
- Fixed tickets not being displayed in Run Checks modal detail view (Meldingen section) by extending `/api/job-runs/<run_id>/alerts` endpoint to include both run-specific tickets (via ticket_job_runs) and job-level tickets (via ticket_scopes), ensuring newly created tickets are visible immediately in the modal instead of only after being resolved
- Fixed copy ticket button not working in Edge browser on Job Details page by moving clipboard functions (copyToClipboard, fallbackCopy, showCopyFeedback) inside IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution)

## [2026-02-10]

### Added
- Added screenshot attachment support to Feedback/Bug system (user request: allow screenshots for bugs/features)
- New database model: `FeedbackAttachment` with file_data (BYTEA), filename, mime_type, file_size
- Upload support on feedback creation form (multiple files, PNG/JPG/GIF/WEBP, max 5MB each)
- Upload support on reply forms (attach screenshots when replying)
- Inline image display on feedback detail page (thumbnails with click-to-view-full-size)
- Screenshot display for both main feedback items and replies
- File validation: image type verification using imghdr (not just extension), size limits, secure filename handling
- New route: `/feedback/attachment/<id>` to serve images (access-controlled, admins can view deleted item attachments)
- Database migration: auto-creates `feedback_attachments` table with indexes on startup
- Automatic CASCADE delete: removing feedback item or reply automatically removes associated attachments
- Added admin-only deleted items view and permanent delete functionality to Feedback system
- "Show deleted items" checkbox on feedback list page (admin only)
- Deleted items shown with reduced opacity + background color and "Deleted" badge
- Permanent delete action removes item + all attachments from database (hard delete with CASCADE)
- Attachment count shown in deletion confirmation message
- Admins can view detail pages of deleted items including their screenshots
- Two-stage delete: soft delete (audit trail) → permanent delete (database cleanup)
- Prevents accidental permanent deletion (requires item to be soft-deleted first)
- Security: non-admin users cannot view deleted items or their attachments (404 response)
- Added copy ticket button (⧉) to Job Details page modal for quickly copying ticket numbers to clipboard (previously only available on Run Checks page)

### Fixed
- Fixed cross-browser clipboard copy functionality for ticket numbers (previously required manual copy popup in Edge browser)
- Implemented three-tier fallback mechanism: modern Clipboard API → legacy execCommand('copy') → prompt fallback
- Copy button now works directly in all browsers (Firefox, Edge, Chrome) without requiring user interaction
- Applied improved copy mechanism to both Run Checks and Job Details pages
- Fixed Autotask ticket not being automatically linked to new runs when internal ticket is resolved by implementing independent Autotask propagation strategy (now checks for most recent non-deleted and non-resolved Autotask ticket on job regardless of internal ticket status, ensuring PSA ticket reference persists across runs until explicitly resolved or deleted)
- Fixed internal and Autotask tickets being linked to new runs even after being resolved by removing date-based "open" logic from ticket query (tickets now only link to new runs if they are genuinely unresolved, not based on run date comparisons)
- Fixed Job Details page showing resolved tickets for ALL runs by implementing two-source ticket display: directly linked tickets (via ticket_job_runs) are always shown for audit trail, while active window tickets (via scope query) are only shown if unresolved, preserving historical ticket links while preventing resolved tickets from appearing on new runs

200
docs/cove_data_protection_api_calls_known_info.md
Normal file
@ -0,0 +1,200 @@
# Cove Data Protection (N-able Backup) – Known Information on API Calls

Date: 2026-02-10
Status: Research phase (validated with live testing)

## Summary of current findings

API access to Cove Data Protection via JSON-RPC **works**, but is **heavily restricted per tenant and per API user scope**. The API is usable for monitoring, but only with a **very limited, allow‑listed set of column codes**. Any request that includes a restricted column immediately fails with:

```
Operation failed because of security reasons (error 13501)
```

This behavior is consistent even when the API user has **SuperUser** and **SecurityOfficer** roles.

---

## Authentication model (confirmed)

- Endpoint: https://api.backup.management/jsonapi
- Protocol: JSON‑RPC 2.0
- Method: POST only
- Authentication flow:
  1. Login method is called
  2. Response returns a **visa** token (top‑level field)
  3. The visa **must be included in every subsequent call**
  4. Cove may return a new visa in later responses (token chaining)

### Login request (working)

```json
{
  "jsonrpc": "2.0",
  "method": "Login",
  "params": {
    "partner": "<EXACT customer/partner name>",
    "username": "<api login name>",
    "password": "<password>"
  },
  "id": "1"
}
```

### Login response structure (important)

```json
{
  "result": {
    "result": {
      "PartnerId": <number>,
      "Name": "<login name>",
      "Flags": ["SecurityOfficer","NonInteractive"]
    }
  },
  "visa": "<visa token>"
}
```

Notes:
- `visa` is **not** inside `result`, but at top level
- `PartnerId` is found at `result.result.PartnerId`
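The request/response shapes above can be captured in a small sketch. The helper names below are ours, not Cove API names, and only payload construction and response parsing are shown; transport is out of scope.

```python
# Hypothetical helpers illustrating the Login flow documented above.
# JSON field names ("jsonrpc", "method", "params", "visa", "PartnerId")
# follow the observed requests/responses; the function names are our own.

def build_login_payload(partner, username, password, request_id="1"):
    """Build the JSON-RPC 2.0 Login request body."""
    return {
        "jsonrpc": "2.0",
        "method": "Login",
        "params": {
            "partner": partner,   # must be the EXACT partner/customer name
            "username": username,
            "password": password,
        },
        "id": request_id,
    }

def extract_session(response_body):
    """Pull the visa (top-level!) and PartnerId (nested) out of a Login response."""
    visa = response_body.get("visa")  # NOT inside "result"
    partner_id = (
        response_body.get("result", {})
        .get("result", {})
        .get("PartnerId")
    )
    return visa, partner_id
```

Because Cove may rotate the visa (token chaining), callers should replace their stored visa whenever a later response carries a new one.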

---

## API user scope (critical finding)

- API users are **always bound to a single Partner (customer)** unless created at MSP/root level
- In this environment, it is **not possible to create an MSP‑level API user**
- All testing was therefore done with **customer‑scoped API users**

Impact:
- Cross‑customer enumeration is impossible
- Only data belonging to the linked customer can be queried
- Some enumerate/reporting calls are blocked regardless of role

---

## EnumerateAccountStatistics – what works and what does not

### Method

```json
{
  "jsonrpc": "2.0",
  "method": "EnumerateAccountStatistics",
  "visa": "<visa>",
  "params": {
    "query": {
      "PartnerId": <partner_id>,
      "SelectionMode": "Merged",
      "StartRecordNumber": 0,
      "RecordsCount": 50,
      "Columns": [ ... ]
    }
  }
}
```

### Mandatory behavior

- **Columns are required**; omitting them returns `result: null`
- The API behaves as an **allow‑list**:
  - If *any* requested column is restricted, the **entire call fails** with error 13501

### Confirmed working (safe) column set

The following column set works reliably:

- I1 → account / device / tenant identifier
- I14 → used storage (bytes)
- I18 → computer name (if applicable)
- D01F00 – D01F07 → numeric metrics (exact semantics TBD)
- D09F00 → numeric status/category code

Example (validated working):

```json
"Columns": [
  "I1","I14","I18",
  "D01F00","D01F01","D01F02","D01F03",
  "D01F04","D01F05","D01F06","D01F07",
  "D09F00"
]
```
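Since a single restricted code fails the whole call, it may help to pin requests to the validated allow-list and fail fast locally. A sketch (the helper names are ours):

```python
# Only the column codes validated above; anything outside this set has been
# observed to trigger error 13501 and fail the entire call.
SAFE_COLUMNS = [
    "I1", "I14", "I18",
    "D01F00", "D01F01", "D01F02", "D01F03",
    "D01F04", "D01F05", "D01F06", "D01F07",
    "D09F00",
]

def build_statistics_query(partner_id, visa, start=0, count=50, columns=None):
    """Build an EnumerateAccountStatistics request restricted to safe columns."""
    columns = columns or SAFE_COLUMNS
    unsafe = [c for c in columns if c not in SAFE_COLUMNS]
    if unsafe:
        # Fail fast locally instead of burning a call on error 13501.
        raise ValueError("restricted column codes requested: " + ", ".join(unsafe))
    return {
        "jsonrpc": "2.0",
        "method": "EnumerateAccountStatistics",
        "visa": visa,
        "params": {
            "query": {
                "PartnerId": partner_id,
                "SelectionMode": "Merged",
                "StartRecordNumber": start,
                "RecordsCount": count,
                "Columns": list(columns),
            }
        },
    }
```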

### Confirmed restricted (cause security error 13501)

- Entire D02Fxx range
- Entire D03Fxx range
- Broad I‑ranges (e.g. I1–I10 batches)
- Many individually tested I‑codes not in the safe set

Even adding **one restricted code** causes the entire call to fail.

---

## EnumerateAccounts

- Method consistently fails with `Operation failed because of security reasons`
- This applies even with:
  - SuperUser role
  - SecurityOfficer flag enabled

Conclusion:
- EnumerateAccounts is **not usable** in this tenant for customer‑scoped API users

---

## Other tested methods

- EnumerateStatistics → Method not found
- GetPartnerInfo → works only for basic partner metadata (not statistics)

---

## Practical implications for BackupChecks

What **is possible**:
- Enumerate accounts implicitly via EnumerateAccountStatistics
- Identify devices/accounts via AccountId + I1/I18
- Collect storage usage (I14)
- Collect numeric status/metrics via D01Fxx and D09F00

What is **not possible (via this API scope)**:
- Reliable last backup timestamp
- Explicit success / failure / warning text
- Error messages
- Enumerating devices via EnumerateAccounts
- Cross‑customer aggregation

### Suggested internal model mapping

- Customer
  - external_id = PartnerId

- Job
  - external_id = AccountId
  - display_name = I1
  - hostname = I18 (if present)

- Run (limited)
  - metrics only (bytes, counters)
  - status must be **derived heuristically** from numeric fields (if possible)
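Such a heuristic could take the shape below. Note the heavy caveat: the semantics of D09F00 are still unconfirmed (see the open questions), so the code-to-status table here is an invented placeholder that would have to be calibrated against devices with known states before any real use.

```python
# Placeholder heuristic: the real meaning of D09F00 is unconfirmed, so the
# code->status table is an ASSUMPTION, not vendor documentation. It must be
# calibrated empirically against devices whose backup state is known.
ASSUMED_D09F00_STATUS = {
    1: "Success",   # example value only
    2: "Warning",   # example value only
    3: "Failed",    # example value only
}

def derive_run_status(row):
    """Map one EnumerateAccountStatistics row to a coarse internal status."""
    code = row.get("D09F00")
    if code is None:
        return "Unknown"
    return ASSUMED_D09F00_STATUS.get(code, "Unknown")
```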

---

## Open questions / next steps

1. Confirm official meaning of:
   - D01F00 – D01F07
   - D09F00

2. Investigate whether:
   - A token‑based (non‑JSON‑RPC) reporting endpoint exists
   - N‑able support can enable additional reporting columns
   - An MSP‑level API user can be provisioned by N‑able

3. Decide whether Cove integration in BackupChecks will be:
   - Metrics‑only (no run result semantics)
   - Or require vendor cooperation for expanded API access
372
docs/technical-notes-codex.md
Normal file
@ -0,0 +1,372 @@
# Technical Notes (Internal)

Last updated: 2026-02-16

## Purpose
Internal technical snapshot of the `backupchecks` repository for faster onboarding, troubleshooting, and change impact analysis.

## Repository Overview
- Application: Flask web app with SQLAlchemy and Flask-Migrate.
- Runtime: Containerized (Docker), deployed via Docker Compose stack.
- Primary source code location: `containers/backupchecks/src`.
- The project also contains extensive functional documentation in `docs/` and multiple roadmap TODO files at repository root.

## Main Structure
- `containers/backupchecks/Dockerfile`: Python 3.12-slim image, starts `gunicorn` with `backend.app:create_app()`.
- `containers/backupchecks/requirements.txt`: Flask stack + PostgreSQL driver + reporting libraries (`reportlab`, `Markdown`).
- `containers/backupchecks/src/backend/app`: backend domain logic, routes, parsers, models, migrations.
- `containers/backupchecks/src/templates`: Jinja templates for auth/main/documentation pages.
- `containers/backupchecks/src/static`: CSS, images, favicon.
- `deploy/backupchecks-stack.yml`: compose stack with `backupchecks`, `postgres`, `adminer`.
- `build-and-push.sh`: release/test build script with version bumping, tags, and image push.
- `docs/`: functional design, changelogs, migration notes, API notes.

## Application Architecture (Current Observation)
- Factory pattern: `create_app()` in `containers/backupchecks/src/backend/app/__init__.py`.
- Blueprints:
  - `auth_bp` for authentication.
  - `main_bp` for core functionality.
  - `doc_bp` for internal documentation pages.
- Database initialization at startup:
  - `db.create_all()`
  - `run_migrations()`
- Background task:
  - `start_auto_importer(app)` starts the automatic mail importer thread.
- Health endpoint:
  - `GET /health` returns `{ "status": "ok" }`.

## Functional Processing Flow
- Import:
  - Email is fetched via Microsoft Graph API.
- Parse:
  - Parser selection through registry + software-specific parser implementations.
- Approve:
  - New jobs first appear in Inbox for initial customer assignment.
- Auto-process:
  - Subsequent emails for known jobs automatically create `JobRun` records.
- Monitor:
  - Runs appear in Daily Jobs and Run Checks.
- Review:
  - Manual review removes items from the unreviewed operational queue.

## Configuration and Runtime
- Config is built from environment variables in `containers/backupchecks/src/backend/app/config.py`.
- Important variables:
  - `APP_SECRET_KEY`
  - `APP_ENV`
  - `APP_PORT`
  - `POSTGRES_DB`
  - `POSTGRES_USER`
  - `POSTGRES_PASSWORD`
  - `DB_HOST`
  - `DB_PORT`
- Database URI pattern:
  - `postgresql+psycopg2://<user>:<pass>@<host>:<port>/<db>`
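The URI pattern can be assembled from the variables listed above roughly as follows; a sketch only, and the fallback defaults here are illustrative assumptions, not the values `config.py` actually uses.

```python
import os

def database_uri(env=None):
    """Assemble the SQLAlchemy URI from the environment variables listed above.

    Defaults are illustrative placeholders, not the application's real defaults.
    """
    env = os.environ if env is None else env
    return "postgresql+psycopg2://{user}:{pw}@{host}:{port}/{db}".format(
        user=env.get("POSTGRES_USER", "postgres"),
        pw=env.get("POSTGRES_PASSWORD", ""),
        host=env.get("DB_HOST", "postgres"),
        port=env.get("DB_PORT", "5432"),
        db=env.get("POSTGRES_DB", "backupchecks"),
    )
```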
- Default timezone in config: `Europe/Amsterdam`.

## Data Model (High-level)
File: `containers/backupchecks/src/backend/app/models.py`
- Auth/users:
  - `User` with role(s), active role in session.
- System settings:
  - `SystemSettings` with Graph/mail settings, import settings, UI timezone, dashboard policy, sandbox flag.
  - Autotask configuration and cache fields are present.
- Logging:
  - `AuditLog` (legacy alias `AdminLog`).
- Domain:
  - `Customer`, `Job`, `JobRun`, `Override`
  - `MailMessage`, `MailObject`
  - `Ticket`, `TicketScope`, `TicketJobRun`
  - `Remark`, `RemarkScope`, `RemarkJobRun`
  - `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`

### Foreign Key Relationships & Deletion Order
Critical deletion order to avoid constraint violations:
1. Clean auxiliary tables (ticket_job_runs, remark_job_runs, scopes, overrides)
2. Unlink mails from jobs (UPDATE mail_messages SET job_id = NULL)
3. Delete mail_objects
4. Delete jobs (cascades to job_runs)
5. Delete mails
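The mandatory order above can be encoded so cleanup code can be checked against it; the step keys below are our own shorthand for the five steps, not identifiers from the codebase.

```python
# The FK-safe deletion order from the notes above, as a checkable constant.
DELETION_ORDER = [
    "auxiliary_links",  # 1. ticket_job_runs, remark_job_runs, scopes, overrides
    "unlink_mails",     # 2. UPDATE mail_messages SET job_id = NULL
    "mail_objects",     # 3. delete mail_objects
    "jobs",             # 4. delete jobs (cascades to job_runs)
    "mails",            # 5. delete mails
]

def assert_runs_in_order(executed_steps):
    """Raise if an executed cleanup sequence violates DELETION_ORDER."""
    positions = [DELETION_ORDER.index(s) for s in executed_steps]
    if positions != sorted(positions):
        raise AssertionError("FK-unsafe deletion order: %s" % executed_steps)
```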

### Key Model Fields
**MailMessage model:**
- `from_address` (NOT `sender`!) - sender email
- `subject` - email subject
- `text_body` - plain text content
- `html_body` - HTML content
- `received_at` - timestamp
- `location` - inbox/processed/deleted
- `job_id` - link to Job (nullable)

**Job model:**
- `customer_id` - FK to Customer
- `job_name` - parsed from email
- `backup_software` - e.g., "Veeam", "Synology"
- `backup_type` - e.g., "Backup Job", "Active Backup"
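For quick reference, the field lists above restated as plain dataclass sketches; the real models are SQLAlchemy classes in `models.py`, and the Python types and defaults here are inferred guesses, not copied code.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MailMessage:
    from_address: str = ""           # NOT "sender"
    subject: str = ""
    text_body: str = ""
    html_body: str = ""
    received_at: Optional[datetime] = None
    location: str = "inbox"          # inbox / processed / deleted
    job_id: Optional[int] = None     # nullable link to Job

@dataclass
class Job:
    customer_id: int = 0             # FK to Customer
    job_name: str = ""               # parsed from email
    backup_software: str = ""        # e.g. "Veeam", "Synology"
    backup_type: str = ""            # e.g. "Backup Job", "Active Backup"
```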

## Parser Architecture
- Folder: `containers/backupchecks/src/backend/app/parsers/`
- Two layers:
  - `registry.py`:
    - matching/documentation/visibility on `/parsers`.
    - examples must stay generic (no customer names).
  - parser files (`veeam.py`, `synology.py`, etc.):
    - actual detection and parsing logic.
    - return structured output: software, type, job name, status, objects.
- Practical rule:
  - extend patterns by adding, not replacing (backward compatibility).
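The structured-output contract can be illustrated with the `3CX Notification: Update Successful - <host>` subject documented in the changelog; the function name, `job_name` value, and exact dict keys below are our assumptions for the sketch, not the real parser code.

```python
import re

# Illustrative sketch of a parser's structured output, using the 3CX
# "Update Successful" subject pattern. Keys mirror the contract described
# above (software, type, job name, status, objects) but are assumptions.
_3CX_UPDATE_RE = re.compile(r"^3CX Notification: Update Successful - (?P<host>.+)$")

def parse_3cx_update(subject):
    """Return structured parser output for a 3CX update mail, or None."""
    m = _3CX_UPDATE_RE.match(subject.strip())
    if not m:
        return None
    return {
        "backup_software": "3CX",
        "backup_type": "Update",       # informational: excluded from schedule inference
        "job_name": "3CX Update",      # placeholder name for the sketch
        "hostname": m.group("host"),
        "overall_status": "Success",
        "objects": [],
    }
```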

### Parser Types
**Informational Parsers:**
- DSM Updates, Account Protection, Firmware Updates
- Set appropriate backup_type (e.g., "Updates", "Firmware Update")
- Do NOT participate in schedule learning
- Still visible in Run Checks for awareness

**Regular Parsers:**
- Backup jobs (Veeam, Synology Active Backup, NAKIVO, etc.)
- Participate in schedule learning (daily/weekly/monthly detection)
- Generate missed runs when expected runs don't occur

**Example: Synology Updates Parser (synology.py)**
- Handles multiple update notification types under same job:
  - DSM automatic update cancelled
  - Packages out-of-date
  - Combined notifications (DSM + packages)
- Detection patterns:
  - DSM: "Automatische DSM-update", "DSM-update op", "automatic DSM update"
  - Packages: "Packages on", "out-of-date", "Package Center"
- Hostname extraction from multiple patterns
- Returns: backup_type "Updates", job_name "Synology Automatic Update"

## Ticketing and Autotask (Critical Rules)

### Two Ticket Types
1. **Internal Tickets** (tickets table)
   - Created manually or via Autotask integration
   - Stored in `tickets` table with `ticket_code` (e.g., "T20250123.0001")
   - Linked to runs via `ticket_job_runs` many-to-many table
   - Scoped to jobs via `ticket_scopes` table
   - Have `resolved_at` field for resolution tracking
   - **Auto-propagation**: Automatically linked to new runs via `link_open_internal_tickets_to_run`

2. **Autotask Tickets** (job_runs columns)
   - Created via Run Checks modal → "Create Autotask Ticket"
   - Stored directly in JobRun columns: `autotask_ticket_id`, `autotask_ticket_number`, etc.
   - When created, also creates matching internal ticket for legacy UI compatibility
   - Have `autotask_ticket_deleted_at` field for deletion tracking
   - Resolution tracked via matching internal ticket's `resolved_at` field
   - **Auto-propagation**: Linked to new runs via two-strategy approach

### Ticket Propagation to New Runs
When a new JobRun is created (via email import OR missed run generation), `link_open_internal_tickets_to_run` ensures:

**Strategy 1: Internal ticket linking**
- Query finds tickets where: `COALESCE(ts.resolved_at, t.resolved_at) IS NULL`
- Creates `ticket_job_runs` links automatically
- Tickets remain visible until explicitly resolved
- **NO date-based logic** - resolved = immediately hidden from new runs
|
||||
**Strategy 2: Autotask ticket propagation (independent)**
|
||||
1. Check if internal ticket code exists → find matching Autotask run → copy ticket info
|
||||
2. If no match, directly search for most recent Autotask ticket on job where:
|
||||
- `autotask_ticket_deleted_at IS NULL` (not deleted in PSA)
|
||||
- Internal ticket `resolved_at IS NULL` (not resolved in PSA)
|
||||
3. Copy `autotask_ticket_id`, `autotask_ticket_number`, `created_at`, `created_by_user_id` to new run
|
||||
|
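A minimal sketch of Strategy 1, using an in-memory SQLite stand-in for the real schema. Table names follow the notes; column sets are trimmed to what the query needs, and the real function lives in the app's ticketing utilities, so treat this as a shape illustration only:

```python
import sqlite3

# Minimal schema stand-in; the real tables carry many more columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (id INTEGER PRIMARY KEY, resolved_at TEXT);
CREATE TABLE ticket_scopes (ticket_id INTEGER, job_id INTEGER, resolved_at TEXT);
CREATE TABLE ticket_job_runs (ticket_id INTEGER, job_run_id INTEGER);
""")
conn.execute("INSERT INTO tickets VALUES (1, NULL)")          # open ticket
conn.execute("INSERT INTO tickets VALUES (2, '2026-02-01')")  # resolved ticket
conn.execute("INSERT INTO ticket_scopes VALUES (1, 10, NULL)")
conn.execute("INSERT INTO ticket_scopes VALUES (2, 10, NULL)")

def link_open_internal_tickets_to_run(job_id: int, job_run_id: int) -> int:
    # Open = neither the scope nor the ticket is resolved; no date comparisons.
    rows = conn.execute(
        """
        SELECT t.id
        FROM tickets t
        JOIN ticket_scopes ts ON ts.ticket_id = t.id
        WHERE ts.job_id = ?
          AND COALESCE(ts.resolved_at, t.resolved_at) IS NULL
        """,
        (job_id,),
    ).fetchall()
    for (ticket_id,) in rows:
        conn.execute(
            "INSERT INTO ticket_job_runs VALUES (?, ?)", (ticket_id, job_run_id)
        )
    return len(rows)
```

Only the open ticket gets linked; the resolved one stops propagating immediately, matching the "resolved = NULL" rule.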
### Where Ticket Linking is Called

`link_open_internal_tickets_to_run` is invoked from three call sites:

1. **Email-based runs**: `routes_inbox.py` and `mail_importer.py` - after creating a JobRun from a parsed email
2. **Missed runs**: `routes_run_checks.py` in `_ensure_missed_runs_for_job` - after creating missed JobRun records
   - Weekly schedule: after creating the weekly missed run (with flush to get run.id)
   - Monthly schedule: after creating the monthly missed run (with flush to get run.id)
   - **Critical**: Without this call, missed runs don't get ticket propagation!

### Display Logic - Link-Based System

All pages use **explicit link-based queries** (no date-based logic):

**Job Details Page:**

- **Two sources** for ticket display:
  1. Direct links (`ticket_job_runs WHERE job_run_id = X`) → always show (audit trail)
  2. Active window (`ticket_scopes WHERE job_id = Y AND resolved_at IS NULL`) → only unresolved
- Result: Old runs keep their ticket references, new runs don't get resolved tickets

**Run Checks Main Page (Indicators 🎫):**

- Query: `ticket_scopes JOIN tickets WHERE job_id = X AND resolved_at IS NULL`
- Only shows indicator if unresolved tickets exist for the job

**Run Checks Popup Modal:**

- API: `/api/job-runs/<run_id>/alerts`
- **Two-source ticket display**:
  1. Direct links: `tickets JOIN ticket_job_runs WHERE job_run_id = X`
  2. Job-level scope: `tickets JOIN ticket_scopes WHERE job_id = Y AND resolved_at IS NULL AND active_from_date <= run_date`
- Prevents duplicates by tracking seen ticket IDs
- Shows newly created tickets immediately (via scope) without waiting for resolve action
- Same for remarks: `remarks JOIN remark_job_runs WHERE job_run_id = X`

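The modal's two-source merge with duplicate suppression can be sketched as follows; the dict shape and function name are assumptions, only the "track seen ticket IDs" behavior is from the notes:

```python
def merge_ticket_sources(direct_links: list, scope_tickets: list) -> list:
    # Direct links (audit trail) first, then unresolved job-level scope
    # tickets; duplicates are skipped by tracking seen ticket ids.
    seen, merged = set(), []
    for ticket in direct_links + scope_tickets:
        if ticket["id"] not in seen:
            seen.add(ticket["id"])
            merged.append(ticket)
    return merged
```

This is why a freshly created ticket shows immediately: it arrives via the scope query even before any `ticket_job_runs` link exists.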
### Resolved vs Deleted

- **Resolved**: Ticket completed in Autotask (tracked in internal `tickets.resolved_at`)
  - Stops propagating to new runs
  - Ticket still exists in PSA
  - Synced via PSA polling
- **Deleted**: Ticket removed from Autotask (tracked in `job_runs.autotask_ticket_deleted_at`)
  - Also stops propagating
  - Ticket no longer exists in PSA
  - Rare operation

### Critical Rules

- ❌ **NEVER** use date-based resolved logic: `resolved_at >= run_date` OR `active_from_date <= run_date`
- ✅ Only show tickets that are ACTUALLY LINKED via `ticket_job_runs` table
- ✅ Resolved tickets stop linking immediately when resolved
- ✅ Old links preserved for audit trail (visible on old runs)
- ✅ All queries must use explicit JOIN to link tables
- ✅ Consistency: All pages use same "resolved = NULL" logic
- ✅ **CRITICAL**: Preserve description field during Autotask updates - must include "description" in optional_fields list

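The description-preservation rule can be illustrated with a hypothetical update-payload builder. Field names other than `"description"` are made up; the point is that leaving `"description"` out of the optional-field list lets the PSA overwrite it with NULL:

```python
# Hypothetical payload builder for an Autotask ticket update (not the real
# integration code). "description" must stay in the optional-field list.
OPTIONAL_FIELDS = ["status", "priority", "description"]

def build_update_payload(ticket_id: int, current_values: dict) -> dict:
    payload = {"id": ticket_id}
    for field in OPTIONAL_FIELDS:
        if field in current_values:
            # Carry the existing value forward so it is not nulled out.
            payload[field] = current_values[field]
    return payload
```
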
## UI and UX Notes

### Navbar

- Fixed-top positioning
- Collapses on mobile (hamburger menu)
- Dynamic padding adjustment via JavaScript (measures navbar height, adjusts main content padding-top)
- Role-based menu items (Admin sees more than Operator/Viewer)

### Status Badges

- Success: Green
- Warning: Yellow/Orange
- Failed/Error: Red
- Override applied: Blue badge
- Reviewed: Checkmark indicator

### Ticket Copy Functionality

- Copy button (⧉) available on both Run Checks and Job Details pages
- Allows quick copying of ticket numbers to clipboard
- Cross-browser compatible with three-tier fallback mechanism:
  1. **Modern Clipboard API**: `navigator.clipboard.writeText()` - works in modern browsers with HTTPS
  2. **Legacy execCommand**: `document.execCommand('copy')` - fallback for older browsers and Edge
  3. **Prompt fallback**: `window.prompt()` - last resort if clipboard access fails
- Visual feedback: button changes to ✓ checkmark for 800ms after successful copy
- Implementation uses hidden textarea for execCommand method to ensure compatibility
- No user interaction required in modern browsers (direct copy)

### Checkbox Behavior

- All checkboxes on Inbox and Run Checks pages use `autocomplete="off"`
- Prevents browser from auto-selecting checkboxes after page reload
- Fixes issue where deleting items would cause same number of new items to be selected

### Customers to Jobs Navigation (2026-02-16)

- Customers page links each customer name to filtered Jobs view:
  - `GET /jobs?customer_id=<customer_id>`
- Jobs route behavior:
  - Accepts optional `customer_id` query parameter in `routes_jobs.py`.
  - If set: returns jobs for that customer only.
  - If not set: keeps default filter that hides jobs linked to inactive customers.
- Jobs UI behavior:
  - Shows active filter banner with selected customer name.
  - Provides "Clear filter" action back to unfiltered `/jobs`.
- Templates touched:
  - `templates/main/customers.html`
  - `templates/main/jobs.html`

### Global Grouped Search (2026-02-16)

- New route:
  - `GET /search` in `main/routes_search.py`
- New UI:
  - Navbar search form in `templates/layout/base.html`
  - Grouped result page in `templates/main/search.html`
- Search behavior:
  - Case-insensitive matching (`ILIKE`).
  - `*` wildcard is supported and translated to SQL `%`.
  - Automatic contains behavior is applied per term (`*term*`) when no wildcard is explicitly set.
  - Multi-term queries use AND across terms and OR across configured columns within each section.
  - Per-section pagination is supported via query params: `p_inbox`, `p_customers`, `p_jobs`, `p_daily_jobs`, `p_run_checks`, `p_tickets`, `p_remarks`, `p_overrides`, `p_reports`.
  - Pagination keeps search state for all sections while browsing one section.
  - "Open <section>" links pass `q` to destination overview pages so page-level filtering matches the search term.
- Grouped sections:
  - Inbox, Customers, Jobs, Daily Jobs, Run Checks, Tickets, Remarks, Existing overrides, Reports.
- Daily Jobs search result details:
  - Meta now includes expected run time, success indicator, and run count for the selected day.
  - Link now opens Daily Jobs with modal auto-open using the `open_job_id` query parameter (same modal flow as clicking a row in Daily Jobs).
- Access control:
  - Search results are role-aware and only show sections/data the active role can access.
  - `run_checks` results are restricted to `admin`/`operator`.
  - `reports` supports `admin`/`operator`/`viewer`/`reporter`.
- Current performance strategy:
  - Per-section limit (`SEARCH_LIMIT_PER_SECTION = 10`), with total count per section.
  - No schema migration required for V1.

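The wildcard translation and AND/OR clause shape described above can be sketched as follows (function names and the `%s` placeholder style are assumptions; only the `*`→`%`, contains-by-default, and AND-across-terms / OR-across-columns rules come from the notes):

```python
def to_sql_pattern(term: str) -> str:
    # '*' translates to SQL '%'; without an explicit wildcard the term
    # gets contains semantics, i.e. it behaves like '*term*'.
    return term.replace("*", "%") if "*" in term else f"%{term}%"

def build_where(terms: list, columns: list) -> tuple:
    # AND across terms; OR across a section's configured columns per term.
    clauses, params = [], []
    for term in terms:
        pattern = to_sql_pattern(term)
        clauses.append("(" + " OR ".join(f"{c} ILIKE %s" for c in columns) + ")")
        params.extend([pattern] * len(columns))
    return " AND ".join(clauses), params
```
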
## Feedback Module with Screenshots

- Models: `FeedbackItem`, `FeedbackVote`, `FeedbackReply`, `FeedbackAttachment`.
- Attachments:
  - multiple uploads, type validation, per-file size limits, storage in database (BYTEA).
- Delete strategy:
  - soft delete by default,
  - permanent delete only for admins and only after soft delete.

## Validation Snapshot

- 2026-02-16: Test build + push succeeded via `update-and-build.sh t`.
  - Pushed image: `gitea.oskamp.info/ivooskamp/backupchecks:dev`.
- 2026-02-16: Test build + push succeeded on branch `v20260216-02-global-search`.
  - Pushed image digest: `sha256:6996675b9529426fe2ad58b5f353479623f3ebe24b34552c17ad0421d8a7ee0f`.
- 2026-02-16: Additional test build + push cycles succeeded on `v20260216-02-global-search`.
  - Latest pushed image digest: `sha256:8ec8bfcbb928e282182fa223ce8bf7f92112d20e79f4a8602d015991700df5d7`.
- 2026-02-16: Additional test build + push cycles succeeded after search enhancements.
  - Latest pushed image digest: `sha256:b36b5cdd4bc7c4dadedca0534f1904a6e12b5b97abc4f12bc51e42921976f061`.

## Deployment and Operations

- Stack exposes:
  - app on `8080`
  - adminer on `8081`
- PostgreSQL persistent volume:
  - `/docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data`
- `deploy/backupchecks-stack.yml` also contains example `.env` variables at the bottom.

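The exposed ports and Postgres volume can be sketched as a compose fragment. Service names, images, and internal ports are assumptions; the authoritative file is `deploy/backupchecks-stack.yml`:

```yaml
# Illustrative shape only -- service names and images are assumed.
services:
  backupchecks:
    image: gitea.oskamp.info/ivooskamp/backupchecks:latest
    ports:
      - "8080:8080"   # app
  adminer:
    image: adminer
    ports:
      - "8081:8080"   # adminer UI
  postgres:
    image: postgres
    volumes:
      - /docker/appdata/backupchecks/backupchecks-postgres:/var/lib/postgresql/data
```
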
## Build/Release Flow

File: `build-and-push.sh`

- Bump options:
  - `1` patch, `2` minor, `3` major, `t` test.
- Release build:
  - update `version.txt`
  - commit + tag + push
  - docker push of `:<version>`, `:dev`, `:latest`
- Test build:
  - only `:dev`
  - no commit/tag.
- Services are discovered under `containers/*` with Dockerfile-per-service.

## Technical Observations / Attention Points

- `README.md` is currently empty; quick-start entry context is missing.
- `LICENSE` is currently empty.
- `docs/architecture.md` is currently empty.
- `deploy/backupchecks-stack.yml` contains hardcoded example values (`Changeme`), which are risky if used without proper secrets management.
- The app performs DB initialization + migrations at startup; for larger schema changes this can impact startup time and robustness.
- There is significant parser and ticketing complexity; route changes carry regression risk without targeted testing.
- For Autotask update calls, the `description` field must be explicitly preserved to prevent an unintended NULL overwrite.
- Security hygiene remains important:
  - no customer names in parser examples/source,
  - no hardcoded credentials.

## Quick References

- App entrypoint: `containers/backupchecks/src/backend/app/main.py`
- App factory: `containers/backupchecks/src/backend/app/__init__.py`
- Config: `containers/backupchecks/src/backend/app/config.py`
- Models: `containers/backupchecks/src/backend/app/models.py`
- Parsers: `containers/backupchecks/src/backend/app/parsers/registry.py`
- Ticketing utilities: `containers/backupchecks/src/backend/app/ticketing_utils.py`
- Run Checks routes: `containers/backupchecks/src/backend/app/main/routes_run_checks.py`
- Compose stack: `deploy/backupchecks-stack.yml`
- Build script: `build-and-push.sh`

## Recent Changes

### 2026-02-13

- **Fixed missed runs ticket propagation**: Added `link_open_internal_tickets_to_run` calls in `_ensure_missed_runs_for_job` (routes_run_checks.py) after creating both weekly and monthly missed JobRun records. Previously only email-based runs got ticket linking, so missed runs did not show internal tickets or Autotask tickets. Required `db.session.flush()` before linking to ensure run.id is available.
- **Fixed checkbox auto-selection**: Added `autocomplete="off"` to all checkboxes on Inbox and Run Checks pages. Prevents the browser from automatically re-selecting checkboxes after a page reload following delete actions.

### 2026-02-12

- **Fixed Run Checks modal ticket display**: Implemented two-source display logic (ticket_job_runs + ticket_scopes). Previously tickets only appeared after they were resolved (when the ticket_job_runs entry was created). Now tickets show immediately upon creation via the scope query.
- **Fixed copy button in Edge**: Moved clipboard functions inside the IIFE scope for proper closure access (Edge is stricter than Firefox about scope resolution).

### 2026-02-10

- **Added screenshot support to Feedback system**: Multiple file upload, inline display, two-stage delete (soft delete for audit trail, permanent delete for cleanup).
- **Completed transition to link-based ticket system**: All pages now use JOIN queries, no date-based logic. Added cross-browser copy-ticket functionality with a three-tier fallback mechanism to both Run Checks and Job Details pages.