TODO: Cove Data Protection Integration
Date: 2026-02-10
Status: Research phase
Priority: Medium
🎯 Goal
Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring.
Challenge: Cove does NOT work with email notifications like other backup systems (Veeam, Synology, NAKIVO). We need to find an alternative method to import backup status information.
🔍 Research Questions
1. API Availability
- Does Cove Data Protection have a public API?
- What authentication method does the API use? (API key, OAuth, basic auth?)
- Which endpoints are available for backup status?
- Is there rate limiting on the API?
- Documentation URL: ?
2. Data Structure
- What information can we retrieve per backup job?
- Job name
- Status (success/warning/failed)
- Start/end time
- Backup type
- Client/device name
- Error messages
- Objects/files backed up
- Is there a webhook system available?
- How often should the API be polled?
3. Multi-Tenancy
- Does Cove support multi-tenant setups? (MSP use case)
- Can we monitor multiple customers/partners from one account?
- How are permissions/access managed?
4. Integration Strategy
- Option A: Scheduled Polling
  - Cronjob that periodically calls the API
  - Parse results into JobRun records
  - Pro: simple, consistent with the current flow
  - Con: delay between the backup and its registration in the system
- Option B: Webhook/Push
  - Cove sends notifications to our endpoint
  - Pro: real-time updates
  - Con: requires an external endpoint, plus security considerations
- Option C: Email Forwarding
  - If Cove turns out to have email support after all (hidden setting?)
  - Pro: reuses the existing email import flow
  - Con: possibly not available
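Option A could be sketched as a small polling function. The `fetch_jobs` callable and the payload keys below are placeholders, since the real Cove API has not been researched yet:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

# Hypothetical shape of one imported run; field names mirror the JobRun
# concepts in this document, not Cove's actual schema.
@dataclass
class ImportedRun:
    external_id: str
    job_name: str
    status: str  # "success" | "warning" | "failed"

def poll_once(fetch_jobs: Callable[[], Iterable[dict]]) -> List[ImportedRun]:
    """One polling cycle: fetch raw job dicts and map them to run records.

    `fetch_jobs` stands in for the (yet to be researched) Cove API call,
    so the cronjob wiring stays separate from the mapping logic.
    """
    runs = []
    for raw in fetch_jobs():
        runs.append(ImportedRun(
            external_id=str(raw["id"]),
            job_name=raw.get("name", "unknown"),
            # Defaulting unknown states to "failed" keeps surprises visible.
            status=raw.get("status", "failed"),
        ))
    return runs
```

Keeping the fetch injectable also makes the delay trade-off measurable later: the same mapping works whether it runs every 15 minutes or behind a webhook.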
📋 Technical Considerations
Database Model
Current JobRun model expects:
- `mail_message_id` (FK): how do we adapt this for API-sourced runs?
- Possible new field: `source_type` ("email" vs "api")
- Possible new field: `external_id` (Cove job ID)
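One way the two proposed fields could relate to the existing FK is shown below. The field names and the "email"/"api" values follow the options above; nothing here is a final schema decision:

```python
from dataclasses import dataclass
from typing import Optional

# Sketch only: which source field must be set, depending on source_type.
@dataclass
class JobRunSource:
    source_type: str                       # "email" or "api"
    mail_message_id: Optional[int] = None  # FK, email-sourced runs only
    external_id: Optional[str] = None      # Cove job ID, API-sourced only

    def __post_init__(self):
        if self.source_type == "email" and self.mail_message_id is None:
            raise ValueError("email-sourced run needs mail_message_id")
        if self.source_type == "api" and self.external_id is None:
            raise ValueError("api-sourced run needs external_id")
```

This keeps `mail_message_id` nullable instead of repurposing it, so existing email-sourced rows stay valid unchanged.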
Parser System
Current parser system works with email content. For API:
- New "parser" concept for API responses?
- Or direct JobRun creation without parser layer?
Architecture Options
Option 1: Extend Email Import System
API Poller → Pseudo-MailMessage → Existing Parser → JobRun
- Pro: Reuse existing flow
- Con: Hacky, email fields have no meaning
Option 2: Parallel Import System
API Poller → API Parser → JobRun (direct)
- Pro: Clean separation, no email dependency
- Con: Logic duplication
Option 3: Unified Import Layer
Email Import ─┐
              ├→ Unified Layer → Common Processor → JobRun
API Import  ──┘
- Pro: Future-proof, scalable
- Con: Larger refactor
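Option 3 could be prototyped with a small source-agnostic record that both paths produce. The parsing in `from_email` and the payload keys in `from_api` are illustrative placeholders, not the real parser system:

```python
from dataclasses import dataclass

@dataclass
class RawRun:
    """Source-agnostic intermediate record (the 'Unified' layer)."""
    source_type: str  # "email" or "api"
    job_name: str
    status: str

def from_email(subject: str, body: str) -> RawRun:
    # Placeholder: the real email parsers live in the existing import system.
    status = "failed" if "fail" in subject.lower() else "success"
    return RawRun("email", subject.split(":")[0], status)

def from_api(payload: dict) -> RawRun:
    # Placeholder mapping from a hypothetical Cove payload.
    return RawRun("api", payload["name"], payload["status"])

def common_processor(run: RawRun) -> dict:
    """The shared step both paths go through before JobRun creation."""
    return {"job_name": run.job_name, "status": run.status,
            "source_type": run.source_type}
```

The refactor cost of Option 3 is mostly in migrating the existing email flow onto `RawRun`; the API side would be new code either way.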
🔧 Implementation Steps (After Research)
Phase 1: API Research & POC
- Research Cove API documentation
- Test API authentication
- Test data retrieval (1 backup job)
- Mapping of Cove data → Backupchecks model
- Proof of concept script (standalone)
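A standalone POC could separate fetching from mapping, so the mapping is testable without network access. The base URL, endpoint path, payload keys, and state names below are pure assumptions until the API documentation is found:

```python
import json
import urllib.request

# ASSUMPTION: invented placeholder; replace with the real Cove API host
# and path once the documentation is located.
COVE_BASE_URL = "https://api.example-cove.invalid"

def fetch_job_statuses(token: str) -> list:
    """Fetch raw job status payloads (untested against any real API)."""
    req = urllib.request.Request(
        f"{COVE_BASE_URL}/v1/backup-jobs",
        headers={"Authorization": f"Bearer {token}"},  # auth method unknown
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def map_to_jobrun_fields(raw: dict) -> dict:
    """Map one hypothetical Cove payload to Backupchecks JobRun fields."""
    return {
        "external_id": str(raw["jobId"]),
        "job_name": raw["jobName"],
        # Unknown states fall through to "failed" so they are never missed.
        "status": {"Completed": "success",
                   "CompletedWithErrors": "warning"}.get(raw["state"], "failed"),
        "started_at": raw.get("startTimeUtc"),
    }
```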
Phase 2: Database Changes
- Decide: extend MailMessage model or new source type?
- Migration: add `source_type` field to JobRun
- Migration: add `external_id` field to JobRun
- Update constraints/validations
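The two migrations could amount to something like the DDL below, shown here with SQLite purely for illustration; the real change would go through the project's migration tooling, and the table name `job_run` is assumed:

```python
import sqlite3

MIGRATION_SQL = [
    # Backfill-friendly default: every existing row is email-sourced.
    "ALTER TABLE job_run ADD COLUMN source_type TEXT NOT NULL DEFAULT 'email'",
    "ALTER TABLE job_run ADD COLUMN external_id TEXT",  # NULL for email runs
]

def migrate(conn: sqlite3.Connection) -> None:
    for stmt in MIGRATION_SQL:
        conn.execute(stmt)
```

A unique constraint on (`source_type`, `external_id`) would also back the duplicate detection planned in Phase 4.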
Phase 3: Import Mechanism
- New file: `containers/backupchecks/src/backend/app/cove_importer.py`
- API client for Cove
- Data transformation to JobRun format
- Error handling & retry logic
- Logging & audit trail
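The retry logic could be a generic wrapper around whatever API client Phase 1 produces. This is one common pattern (exponential backoff), not a requirement from Cove's side:

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(call: Callable[[], T], attempts: int = 3,
                 base_delay: float = 1.0,
                 sleep: Callable[[float], None] = time.sleep) -> T:
    """Run `call`, retrying with exponential backoff on any exception.

    `sleep` is injectable so unit tests can skip real waiting; the last
    failure is re-raised so callers can log it for the audit trail.
    """
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise AssertionError("unreachable")
```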
Phase 4: Scheduling
- Cronjob/scheduled task for polling (every 15 min?)
- Or: webhook endpoint if Cove supports it
- Rate limiting & throttling
- Duplicate detection (avoid double imports)
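Duplicate detection with overlapping polls could key on Cove's own job-run ID (stored as `external_id`). A sketch, assuming the fetched payloads carry an `id` field:

```python
def dedupe_new_runs(fetched: list, known_external_ids: set) -> list:
    """Keep only runs that were not imported before.

    `known_external_ids` would come from a DB query on JobRun.external_id;
    a DB unique constraint on (source_type, external_id) remains the
    safety net against races between overlapping polls.
    """
    fresh = []
    for raw in fetched:
        ext_id = str(raw["id"])
        if ext_id not in known_external_ids:
            known_external_ids.add(ext_id)
            fresh.append(raw)
    return fresh
```

Polling windows can then overlap deliberately (e.g. fetch the last 30 minutes every 15 minutes) without producing double imports.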
Phase 5: UI Updates
- Job Details: indication that job is from API (not email)
- No "Download EML" button for API-sourced runs
- Possibly different metadata display
📚 References
Cove Data Protection
- Product name: Cove Data Protection (formerly N-able Backup, SolarWinds Backup)
- Website: https://www.n-able.com/products/cove-data-protection
- API Docs: [TODO: add link after research]
- Support: [TODO: contact info]
Similar Integrations
Other backup systems that use APIs:
- Veeam: Has both email and REST API
- Acronis: REST API available
- MSP360: API for management
Resources
- API documentation (yet to find)
- SDK/Client libraries available?
- Community/forum for integration questions?
- Example code/integrations?
❓ Open Questions
- Performance: How many Cove jobs do we need to monitor? (impact on polling frequency)
- Historical Data: Can we retrieve old backup runs, or only new ones?
- Filtering: Can we apply filters (only failed jobs, specific clients)?
- Authentication: Where do we store Cove API credentials? (SystemSettings?)
- Multi-Account: Do we support multiple Cove accounts? (MSP scenario)
🎯 Success Criteria
Minimum Viable Product (MVP)
- Backup runs from Cove are automatically imported
- Status (success/warning/failed) displayed correctly
- Job name and timestamp available
- Visible in Daily Jobs & Run Checks
- Errors and warnings are shown
Nice to Have
- Real-time import (webhook instead of polling)
- Backup object details (individual files/folders)
- Retry history
- Storage usage metrics
- Multi-tenant support
🚀 Next Steps
- Research first! Start by investigating the API documentation
- Create POC script (standalone, outside Backupchecks)
- Document findings in this file
- Decide which architecture option (1, 2, or 3)
- Only then start implementation
Status: Waiting on API research completion.
📝 Notes
- This TODO document should be updated after each research step
- Add API examples as soon as available
- Document edge cases and limitations
- Consider security implications (API key storage, rate limits, etc.)