
TODO: Cove Data Protection Integration

Date: 2026-02-10 | Status: Research phase | Priority: Medium


🎯 Goal

Integrate Cove Data Protection (formerly N-able Backup / SolarWinds Backup) into Backupchecks for backup status monitoring.

Challenge: Cove does NOT work with email notifications like other backup systems (Veeam, Synology, NAKIVO). We need to find an alternative method to import backup status information.


🔍 Research Questions

1. API Availability

  • Does Cove Data Protection have a public API? YES - Confirmed in documentation
  • CRITICAL: How to enable/activate API access? (settings location, admin portal?)
  • What authentication method does the API use? (API key, OAuth, basic auth?)
  • Which endpoints are available for backup status?
  • Is there rate limiting on the API?
  • Documentation URL: ?
  • Is API access available in all Cove subscription tiers or only specific plans?

2. Data Structure

  • What information can we retrieve per backup job?
    • Job name
    • Status (success/warning/failed)
    • Start/end time
    • Backup type
    • Client/device name
    • Error messages
    • Objects/files backed up
  • Is there a webhook system available?
  • How often should the API be polled?

3. Multi-Tenancy

  • Does Cove support multi-tenant setups? (MSP use case)
  • Can we monitor multiple customers/partners from 1 account?
  • How are permissions/access managed?

4. Integration Strategy

  • Option A: Scheduled Polling

    • Cronjob that periodically calls API
    • Parse results to JobRun records
    • Pro: Simple, consistent with current flow
    • Con: Delay between backup and registration in system
  • Option B: Webhook/Push

    • Cove sends notifications to our endpoint
    • Pro: Real-time updates
    • Con: Requires external endpoint, security considerations
  • Option C: Email Forwarding

    • If Cove has email support after all (hidden setting?)
    • Pro: Reuses existing email import flow
    • Con: Possibly not available

📋 Technical Considerations

Database Model

Current JobRun model expects:

  • mail_message_id (FK) - how do we adapt this for API-sourced runs?
  • Possible new field: source_type ("email" vs "api")
  • Possible new field: external_id (Cove job ID)
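The two proposed fields can be sketched as a schema migration. This is an illustrative sketch only: the table and column names below are assumptions, not the actual Backupchecks schema.

```python
import sqlite3

# Illustrative migration for the proposed JobRun fields (names are assumptions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE job_run (id INTEGER PRIMARY KEY, mail_message_id INTEGER)")

# New: where the run came from ("email" or "api"); existing rows default to "email".
conn.execute("ALTER TABLE job_run ADD COLUMN source_type TEXT NOT NULL DEFAULT 'email'")
# New: Cove-side ID, unique so re-polls cannot import the same run twice.
conn.execute("ALTER TABLE job_run ADD COLUMN external_id TEXT")
conn.execute("CREATE UNIQUE INDEX ix_job_run_external_id ON job_run (external_id)")

conn.execute("INSERT INTO job_run (source_type, external_id) VALUES ('api', 'cove-123')")
row = conn.execute("SELECT source_type, external_id FROM job_run").fetchone()
print(row)  # ('api', 'cove-123')
```

Note that mail_message_id stays nullable, so API-sourced runs can simply leave it empty instead of needing a pseudo-MailMessage.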

Parser System

Current parser system works with email content. For API:

  • New "parser" concept for API responses?
  • Or direct JobRun creation without parser layer?

Architecture Options

Option 1: Extend Email Import System

API Poller → Pseudo-MailMessage → Existing Parser → JobRun
  • Pro: Reuse existing flow
  • Con: Hacky, email fields have no meaning

Option 2: Parallel Import System

API Poller → API Parser → JobRun (direct)
  • Pro: Clean separation, no email dependency
  • Con: Logic duplication

Option 3: Unified Import Layer

         → Email Import →
Unified  →                → Common Processor → JobRun
         → API Import   →
  • Pro: Future-proof, scalable
  • Con: Larger refactor

🔧 Implementation Steps (After Research)

Phase 0: API Access Activation (FIRST!)

Critical step before any development can begin:

  1. Find API activation location

    • Check Cove admin portal/dashboard
    • Look in: Settings → API / Integrations / Developer section
    • Check: Account settings, Company settings, Partner settings
    • Search documentation for: "API activation", "API access", "enable API"
  2. Generate API credentials

    • API key generation
    • Client ID / Client Secret (if OAuth)
    • Note: which user/role can generate API keys?
  3. Document API base URL

    • Production API endpoint
    • Sandbox/test environment (if available)
    • Regional endpoints (EU vs US?)
  4. Document API authentication flow

    • Header format (Bearer token, API key in header, query param?)
    • Token expiration and refresh
    • Rate limit headers to watch
  5. Find API documentation portal

    • Developer documentation URL
    • Interactive API explorer (Swagger/OpenAPI?)
    • Code examples/SDKs
    • Support channels for API questions

Resources to check:

  • Cove admin portal: https://backup.management (or similar)
  • N-able partner portal
  • Cove knowledge base / support docs
  • Contact Cove support for API access instructions

Phase 1: API Research & POC

Step 1: Read Authentication Documentation (DOCUMENTATION FOUND!)

Step 2: Test Authentication

  • Determine token format (Bearer token? API key header? Query param?)
  • Common authentication patterns to test:
    # Option 1: Bearer token
    curl -H "Authorization: Bearer YOUR_TOKEN" https://api.example.com/endpoint
    
    # Option 2: API Key header
    curl -H "X-API-Key: YOUR_TOKEN" https://api.example.com/endpoint
    
    # Option 3: Custom header
    curl -H "X-Auth-Token: YOUR_TOKEN" https://api.example.com/endpoint
    
  • Test with simple endpoint (e.g., /api/v1/status, /api/accounts, /api/devices)

Step 3: Discover Available Endpoints

  • Find API documentation/reference
  • Look for OpenAPI/Swagger spec
  • Key endpoints we need:
    • List customers/accounts
    • List backup devices/jobs
    • Get backup job history
    • Get backup job status/details
    • Get backup run results (success/failed/warnings)

Step 4: Test Data Retrieval

  • Test listing customers (verify top-level access works)
  • Test listing backup jobs for one customer
  • Test retrieving details for one backup job
  • Document response format (JSON structure)
  • Save example API responses for reference

Step 5: Proof of Concept Script

  1. Create standalone Python script (outside Backupchecks)
  2. Test authentication and data retrieval
  3. Parse API response to extract key fields
  4. Map Cove data → the Backupchecks JobRun model
  5. Document findings in this TODO

Phase 2: Database Changes

  1. Decide: extend MailMessage model or new source type?
  2. Migration: add source_type field to JobRun
  3. Migration: add external_id field to JobRun
  4. Update constraints/validations

Phase 3: Import Mechanism

  1. New file: containers/backupchecks/src/backend/app/cove_importer.py
  2. API client for Cove
  3. Data transformation to JobRun format
  4. Error handling & retry logic
  5. Logging & audit trail
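Step 3 (data transformation) could look like the sketch below. The Cove column codes are the ones confirmed working in testing; mapping them onto JobRun fields is an assumption, since their exact semantics are still unclear, and the JobRun field names are illustrative.

```python
# Sketch: transform one Cove statistics record into a JobRun-shaped dict.
# Only the confirmed-working columns (I1, I14, I18) are used.
def cove_record_to_jobrun(record: dict) -> dict:
    return {
        "source_type": "api",
        "external_id": f"cove-{record['I1']}",  # account/device ID
        "device_name": record.get("I18"),       # hostname
        "storage_bytes": record.get("I14"),     # storage used, in bytes
        # Status, timestamps and error messages are NOT available with
        # current API access (see the Critical Limitations section).
    }

sample = {"I1": 42, "I14": 10_737_418_240, "I18": "SRV-01"}
print(cove_record_to_jobrun(sample)["external_id"])  # cove-42
```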

Phase 4: Scheduling

  1. Cronjob/scheduled task for polling (every 15 min?)
  2. Or: webhook endpoint if Cove supports it
  3. Rate limiting & throttling
  4. Duplicate detection (avoid double imports)
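Duplicate detection (step 4) can be as simple as filtering fetched runs against already-imported IDs before inserting. The `external_id` field name below is the one proposed earlier in this TODO; the record shape is an assumption.

```python
# Sketch: skip runs whose external_id was already imported, and also
# de-duplicate within a single polling batch.
def select_new_runs(fetched_runs, known_external_ids):
    """Return only runs not yet present, preserving fetch order."""
    seen = set(known_external_ids)
    fresh = []
    for run in fetched_runs:
        ext_id = run["external_id"]
        if ext_id not in seen:
            seen.add(ext_id)
            fresh.append(run)
    return fresh

batch = [{"external_id": "a"}, {"external_id": "b"}, {"external_id": "a"}]
print(select_new_runs(batch, {"b"}))  # [{'external_id': 'a'}]
```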

Phase 5: UI Updates

  1. Job Details: indication that job is from API (not email)
  2. No "Download EML" button for API-sourced runs
  3. Possibly different metadata display

📚 References

Cove Data Protection

JSON API Documentation (Found!)

Core Documentation:

Key Endpoints for Backupchecks:

Reference:

Other Resources:

Note: API docs are in "unused" folder - likely legacy but still functional!

Similar Integrations

Other backup systems that use APIs:

  • Veeam: Has both email and REST API
  • Acronis: REST API available
  • MSP360: API for management

Resources

  • API documentation (since found; see JSON API Documentation above)
  • SDK/Client libraries available?
  • Community/forum for integration questions?
  • Example code/integrations?

Open Questions

  1. Performance: How many Cove jobs do we need to monitor? (impact on polling frequency)
  2. Historical Data: Can we retrieve old backup runs, or only new ones?
  3. Filtering: Can we apply filters (only failed jobs, specific clients)?
  4. Authentication: Where do we store Cove API credentials? (SystemSettings?)
  5. Multi-Account: Do we support multiple Cove accounts? (MSP scenario)

🎯 Success Criteria

Minimum Viable Product (MVP)

  • Backup runs from Cove are automatically imported
  • Status (success/warning/failed) displayed correctly
  • Job name and timestamp available
  • Visible in Daily Jobs & Run Checks
  • Errors and warnings are shown

Nice to Have

  • Real-time import (webhook instead of polling)
  • Backup object details (individual files/folders)
  • Retry history
  • Storage usage metrics
  • Multi-tenant support

⚠️ Critical Limitations Discovered (2026-02-10)

What the API CAN provide:

  • Account/device identifiers (I1)
  • Storage usage metrics (I14 - bytes used)
  • Computer/hostname (I18)
  • Numeric metrics (D01F00-D01F07, D09F00)
  • Basic partner metadata

What the API CANNOT provide (security restrictions):

  • Last backup timestamp - No reliable date/time fields accessible
  • Backup status (success/failed/warning) - No explicit status fields
  • Error messages - All D02Fxx/D03Fxx ranges blocked
  • Backup run history - No detailed run information
  • Cross-customer aggregation - API users are customer-scoped
  • Device enumeration - EnumerateAccounts method blocked (error 13501)

Root Cause

Security error 13501 ("Operation failed because of security reasons") occurs when:

  • any restricted column code is requested in EnumerateAccountStatistics
  • the EnumerateAccounts method is called (it always fails)

This applies even with SuperUser + SecurityOfficer roles.

Column restrictions are per-tenant and not documented. The allow-list is extremely limited.
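Because a single restricted column fails the entire request, any client should validate requested columns against the observed allow-list before sending. A minimal guard sketch (the allow-list reflects only what our testing confirmed for this tenant):

```python
# Columns confirmed to work during testing; anything else triggered error 13501.
ALLOWED_COLUMNS = {"I1", "I14", "I18", "D09F00"} | {f"D01F0{i}" for i in range(8)}

def check_columns(requested):
    """Raise before sending a request that would fail as a whole."""
    blocked = [c for c in requested if c not in ALLOWED_COLUMNS]
    if blocked:
        raise ValueError(f"columns likely to trigger security error 13501: {blocked}")
    return requested

print(check_columns(["I1", "I14", "I18"]))  # ['I1', 'I14', 'I18']
# check_columns(["I1", "D02F00"])  # would raise ValueError
```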

Impact on Backupchecks Integration

Current API access is insufficient for backup monitoring because:

  1. No way to determine if a backup succeeded or failed
  2. No error messages to display to users
  3. No timestamps to track backup frequency
  4. Cannot import backup "runs" in a meaningful way

Possible with current API:

  • Storage usage dashboard only
  • Device inventory list
  • But NOT backup status monitoring (core Backupchecks function)

🔀 Decision Point: Integration Feasibility

Option A: Implement Metrics-Only Integration

Pros:

  • Can display storage usage per device
  • Simple implementation
  • Works with current API access

Cons:

  • Does NOT meet core Backupchecks requirement (backup status monitoring)
  • No success/failure tracking
  • No alerting on backup issues
  • Limited value compared to email-based systems

Effort: Low (2-3 days). Value: Low (storage metrics only, no backup monitoring).

Option B: Request Expanded API Access from N-able

Contact N-able support and request:

  1. MSP-level API user capability (cross-customer access)
  2. Access to restricted column codes:
    • Backup timestamps (last successful backup)
    • Status fields (success/warning/failed)
    • Error message fields (D02Fxx/D03Fxx)
    • Session/run history fields

Pros:

  • Could enable full backup monitoring if granted
  • Proper integration matching other backup systems

Cons:

  • May require vendor cooperation
  • No guarantee N-able will grant access
  • Possible additional licensing costs?
  • Timeline uncertain (support ticket process)

Effort: Unknown (depends on N-able response). Value: High (if successful).

Option C: Alternative Integration Methods

Explore if Cove has:

  1. Reporting API (separate from JSON-RPC)
  2. Webhook system (push notifications for backup events)
  3. Email notifications (if available, use existing email parser)
  4. Export/CSV reports (scheduled export that can be imported)

Effort: Medium (research required). Value: Unknown.

Option D: Defer Integration

Wait until:

  • Customer requests Cove support specifically
  • N-able improves API capabilities
  • Alternative integration method discovered

Pros:

  • No wasted effort on limited implementation
  • Focus on systems with better API support

Cons:

  • Cove customers cannot use Backupchecks
  • Competitive disadvantage if other MSPs support Cove

Immediate (This Week)

  1. Decision: Choose Option A, B, C, or D above

  2. If Option B (contact N-able):

    • Open support ticket with N-able
    • Reference API user creation at https://backup.management/#/api-users
    • Explain need for expanded column access for monitoring solution
    • Attach findings from /docker/develop/cove_data_protection_api_calls_known_info.md
    • Ask specifically for:
      • MSP-level API user creation
      • Access to backup status/timestamp columns
      • Documentation of column codes semantics
  3. If Option C (alternative methods):

    • Check Cove portal for webhook/reporting settings
    • Search N-able docs for "reporting API", "webhooks", "notifications"
    • Test if email notifications can be enabled per customer

Long Term (Future)

  • Monitor N-able API changelog for improvements
  • Check if other MSPs have found workarounds
  • Consider partnering with N-able for integration

🚀 Next Steps

Immediate Actions (Ready to Start!)

1. Read API Documentation (FOUND!). Priority reading order:

  1. Start here: Login/Auth - How to authenticate with your token
  2. Then read: Construct a Call - Request format
  3. Key endpoint: Enumerate Device Statistics - This likely has backup job data!

What to extract from docs:

  • API base URL/endpoint
  • Request format (JSON-RPC? REST? POST body structure?)
  • How to use the token in requests
  • Response format examples
  • Which fields contain backup status/results

2. Quick API Test with Postman (can be done now with token!)

Postman Setup Instructions

Step 1: Create New Request

  1. Open Postman
  2. Click "New" → "HTTP Request"
  3. Name it "Cove API - Test Authentication"

Step 2: Configure Request

  • Method: GET
  • URL: Try these in order:
    1. https://api.backup.management/api/accounts
    2. https://backup.management/api/accounts
    3. https://api.backup.management/api/customers

Step 3: Add Authentication (try both methods)

Option A: Bearer Token

  • Go to "Authorization" tab
  • Type: "Bearer Token"
  • Token: YOUR_TOKEN (paste token from backup.management)

Option B: API Key in Header

  • Go to "Headers" tab
  • Add header:
    • Key: X-API-Key
    • Value: YOUR_TOKEN

Step 4: Send Request and Analyze Response

Expected Results:

  • 200 OK → Success! API works, save this configuration

    • Copy the JSON response → we'll analyze structure
    • Note which URL and auth method worked
    • Check for pagination info in response
  • 401 Unauthorized → Wrong auth method

    • Try other authentication option (Bearer vs X-API-Key)
    • Check if token was copied correctly
  • 404 Not Found → Wrong endpoint URL

    • Try alternative base URL (api.backup.management vs backup.management)
    • Try different endpoint (/api/customers, /api/devices)
  • 403 Forbidden → Token works but insufficient permissions

    • Verify API user has SuperUser role
    • Check customer scope selection

Step 5: Discover Available Endpoints

Once authentication works, try these endpoints:

GET /api/accounts
GET /api/customers
GET /api/devices
GET /api/jobs
GET /api/statistics
GET /api/sessions

For each successful endpoint, save:

  • The request in Postman collection
  • Example response in TODO or separate file
  • Note any query parameters (page, limit, filter, etc.)
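The endpoint discovery step above can also be scripted instead of clicked through in Postman. This is a hedged probe sketch: the candidate paths are the guesses listed above, and `session` is injectable so the logic can be exercised without network access.

```python
import requests

# Candidate REST paths from the discovery list above (all guesses).
CANDIDATE_PATHS = ["/api/accounts", "/api/customers", "/api/devices",
                   "/api/jobs", "/api/statistics", "/api/sessions"]

def probe_endpoints(base_url, token, session=None):
    """Record the HTTP status code returned by each candidate path."""
    session = session or requests.Session()
    headers = {"Authorization": f"Bearer {token}"}
    results = {}
    for path in CANDIDATE_PATHS:
        resp = session.get(base_url + path, headers=headers, timeout=15)
        results[path] = resp.status_code
    return results

# Real run (requires a valid token):
# probe_endpoints("https://api.backup.management", "YOUR_TOKEN")
```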

Step 6: Look for API Documentation

Try these URLs in browser or Postman:

  • https://api.backup.management/swagger
  • https://api.backup.management/docs
  • https://api.backup.management/api-docs
  • https://backup.management/api/documentation

Step 7: Document Findings

After successful testing, document in this TODO:

  • Working API base URL
  • Correct authentication method (Bearer vs header)
  • List of available endpoints discovered
  • JSON response structure examples
  • Any pagination/filtering patterns
  • Rate limits (check response headers: X-RateLimit-*)

Postman Tips for This Project

Save Everything:

  • Create a "Cove API" collection in Postman
  • Save all working requests
  • Export collection to JSON for documentation

Use Variables:

  • Create Postman environment "Cove Production"
  • Add variable: cove_token = your token
  • Add variable: cove_base_url = working base URL
  • Use {{cove_token}} and {{cove_base_url}} in requests

Check Response Headers:

  • Look for X-RateLimit-Limit (API call limits)
  • Look for X-RateLimit-Remaining (calls left)
  • Look for Link header (pagination)

Save Response Examples:

  • For each endpoint, save example response
  • Use Postman's "Save Response" feature
  • Or copy JSON to separate file for reference

3. Document Findings

After successful Postman testing, update this TODO with:

## ✅ API Testing Results (Add after testing)

### Working Configuration
- **Base URL:** [fill in]
- **Authentication:** Bearer Token / X-API-Key header (circle one)
- **Token Location:** Authorization header / X-API-Key header (circle one)

### Available Endpoints Discovered
| Endpoint | Method | Purpose | Response Fields |
|----------|--------|---------|-----------------|
| /api/accounts | GET | List accounts | [list key fields] |
| /api/customers | GET | List customers | [list key fields] |
| /api/devices | GET | List backup devices | [list key fields] |
| /api/jobs | GET | List backup jobs | [list key fields] |

### Key Response Fields for Backupchecks Integration
From backup job/session endpoint:
- Job ID: `[field name]`
- Job Name: `[field name]`
- Status: `[field name]` (values: success/warning/failed)
- Start Time: `[field name]`
- End Time: `[field name]`
- Customer/Device: `[field name]`
- Error Messages: `[field name]`
- Backup Objects: `[field name or nested path]`

### Pagination
- Method: [Link headers / page parameter / cursor / none]
- Page size: [default and max]
- Total count: [available in response?]

### Rate Limiting
- Limit: [X requests per Y time]
- Headers: [X-RateLimit-* header names]

### API Documentation URL
- [URL if found, or "Not found" if unavailable]

Save Postman Collection:

  • Export collection as JSON
  • Save to: /docker/develop/backupchecks/docs/cove-api-postman-collection.json
  • Or share Postman workspace link in this TODO

4. Create POC Script. Once the API works, create a standalone Python test script:

import requests

# Standalone test script to retrieve Cove backup data (placeholder URL and token)
token = "YOUR_TOKEN"
base_url = "https://api.example.com"

headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}

# Get the list of customers; fail loudly on auth or endpoint errors
response = requests.get(f"{base_url}/api/customers", headers=headers, timeout=30)
response.raise_for_status()
print(response.json())

5. Plan Integration. Based on the POC results, decide on an architecture approach and start implementation.

Status: Superseded by testing results; see "Critical Limitations Discovered (2026-02-10)" above.


📝 Notes

  • This TODO document should be updated after each research step
  • Add API examples as soon as available
  • Document edge cases and limitations
  • Consider security implications (API key storage, rate limits, etc.)

Current Status (2026-02-10)

  • Confirmed: Cove Data Protection HAS API access (mentioned in documentation)
  • Found: API user creation location in Cove portal
  • Created: API user with SuperUser role and token
  • Found: Complete JSON API documentation (N-able docs site)
  • Tested: API authentication and multiple methods (with ChatGPT assistance)
  • ⚠️ CRITICAL LIMITATION DISCOVERED: API heavily restricted by column allow-list
  • ⚠️ BLOCKER: No reliable backup status (success/failed/warning) available via API
  • ⚠️ BLOCKER: No error messages, timestamps, or detailed run information accessible
  • 🎯 Next decision: Determine if metrics-only integration is valuable OR contact N-able for expanded access

Test Results Summary (see docs/cove_data_protection_api_calls_known_info.md)

  • Endpoint: https://api.backup.management/jsonapi (JSON-RPC 2.0)
  • Authentication: Login method → visa token → include in all subsequent calls
  • Working method: EnumerateAccountStatistics (with limited columns)
  • Blocked method: EnumerateAccounts (security error 13501)
  • Safe columns: I1, I14, I18, D01F00-D01F07, D09F00
  • Restricted columns: D02Fxx, D03Fxx ranges (cause entire request to fail)
  • Scope limitation: API users are customer-scoped, not MSP-level
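The tested call flow above can be sketched as follows. The JSON-RPC 2.0 envelope shape is standard and the endpoint and safe columns are confirmed, but the exact Login parameter names and where the visa token goes in subsequent calls are assumptions based on the test notes, not verified documentation.

```python
import requests

API_URL = "https://api.backup.management/jsonapi"  # confirmed endpoint
SAFE_COLUMNS = ["I1", "I14", "I18", "D01F00", "D01F01", "D01F02", "D01F03",
                "D01F04", "D01F05", "D01F06", "D01F07", "D09F00"]

def build_call(method, params, visa=None, rpc_id=1):
    """Assemble a JSON-RPC 2.0 envelope. The visa from Login is echoed back in
    every subsequent call; placing it at the top level is an assumption."""
    call = {"jsonrpc": "2.0", "id": rpc_id, "method": method, "params": params}
    if visa is not None:
        call["visa"] = visa
    return call

def post(call):
    resp = requests.post(API_URL, json=call, timeout=30)
    resp.raise_for_status()
    body = resp.json()
    if "error" in body:
        # e.g. code 13501 "Operation failed because of security reasons"
        raise RuntimeError(f"JSON-RPC error: {body['error']}")
    return body.get("result")

# Intended usage (requires real credentials, not run here; param names assumed):
# visa = post(build_call("Login", {"username": "...", "password": "..."}))["visa"]
# stats = post(build_call("EnumerateAccountStatistics",
#                         {"query": {"Columns": SAFE_COLUMNS}}, visa=visa))
```

Requesting anything outside SAFE_COLUMNS fails the entire call with error 13501, so the column list should stay pinned to the allow-list until N-able grants broader access.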

API Credentials (Created)

IMPORTANT: Store token in secure location (password manager) - cannot be retrieved again if lost!

Likely API Base URLs to Test

Based on portal URL backup.management:

  1. https://api.backup.management (most common pattern)
  2. https://backup.management/api
  3. https://api.backup.management/jsonapi (some backup systems use this)
  4. Check API user page for hints or documentation links

Possible Admin Portal Locations

Check these sections in Cove dashboard:

  • Settings → API Keys / Developer
  • Settings → Integrations
  • Account → API Access
  • Partner Portal → API Management
  • Company Settings → Advanced → API

Support Channels

If API activation is not obvious:

  • Cove support ticket: Ask "How do I enable API access for backup monitoring?"
  • N-able partner support (if MSP)
  • Check Cove community forums
  • Review onboarding documentation for API mentions