ipxcore takes nightly server-level backups, retained for 90 days. That covers most disaster scenarios — but for true belt-and-braces protection, you also want copies of your data outside our infrastructure. This article covers configuring off-site backups to three popular storage options: Amazon S3, Google Drive, and Backblaze B2.
Why off-site matters
Server-level backups protect against software failures, accidental deletions, and individual file corruption — the common cases. They don't protect against:
- A catastrophic data center incident
- An attacker who compromises your account and deletes both files and backups
- A billing dispute that suspends your service
- You being locked out of your account during a crisis
An off-site backup at a third-party storage provider gives you a copy that you can restore from regardless of what happens to your hosting account.
Storage cost comparison
| Provider | Cost / GB / month | Egress cost (download) |
|---|---|---|
| Backblaze B2 | $0.006 | $0.01/GB |
| Amazon S3 (Standard) | $0.023 | $0.09/GB |
| Google Drive (Workspace) | ~$0.03 (bundled) | Free |
| Wasabi | $0.0069 | Free up to limits |
For 10 GB of backups: Backblaze B2 = ~$0.06/month, Amazon S3 = ~$0.23/month, Google Drive = bundled with Workspace if you have it. The differences are pennies.
Method 1: WordPress UpdraftPlus (per-site backups)
For WordPress sites specifically, UpdraftPlus is the easiest option. It backs up files and database, schedules automatically, and uploads to your chosen storage:
- Install UpdraftPlus from WordPress Plugins.
- Settings → UpdraftPlus Backups → Settings tab.
- Pick a Remote Storage destination (Amazon S3, Google Drive, Backblaze, Dropbox, OneDrive, etc.).
- Authenticate (each provider has its own OAuth flow).
- Set schedule:
- Files: weekly
- Database: daily
- Retain: 4 backup sets
- Save Changes.
Free for one storage destination; paid version ($70/year) adds incremental backups, multi-destination, and migration features.
Method 2: cPanel "Remote FTP" backups (whole-account)
cPanel's Backup feature can upload full account backups to a remote FTP/SFTP server you control:
- cPanel → Backup.
- Under Full Backup, click Download a Full Account Backup.
- Choose Remote FTP Server or Secure Copy (SCP) as the destination.
- Enter the remote server details, port, username, password.
- Click Generate Backup.
This requires a remote server you can SSH or FTP to. A $5 cloud VPS (DigitalOcean, Vultr, Hetzner) works as the destination. Set it up once, then schedule via cron.
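On the destination VPS, it's worth pairing the uploads with a small cleanup job so old archives don't fill the disk. A minimal sketch; `prune_old_backups` is a name made up here, and the `/backups` path and 90-day retention are placeholders for wherever your cPanel backups actually land:

```shell
#!/bin/bash
# Delete uploaded *.tar.gz backup archives older than a retention
# window, so the destination disk doesn't fill up over time.
# Usage: prune_old_backups <directory> <days>
prune_old_backups() {
  local dir=$1 days=$2
  find "$dir" -name '*.tar.gz' -type f -mtime "+$days" -delete
}
```

Call it from a nightly cron entry on the destination server with the real upload directory and retention, e.g. `prune_old_backups /backups 90`.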
Method 3: rclone via cron (most flexible, command-line)
rclone is the Swiss Army knife of cloud storage. It speaks to S3, Google Drive, Backblaze B2, Dropbox, OneDrive, Box, and 50+ other backends. If you have SSH access on a dedicated server or VPS, this is the cleanest solution.
Install rclone
$ curl https://rclone.org/install.sh | sudo bash
Configure a remote
$ rclone config
Walk through the wizard for your provider. For Backblaze B2 specifically, it asks for your Application Key ID and Application Key from the B2 dashboard.
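If you prefer scripting the setup over the interactive wizard (for example when provisioning several servers), rclone can also create the remote in one command. A sketch; the remote name and the credential values are placeholders:

```shell
# Create a Backblaze B2 remote named "b2" non-interactively.
# Replace the account/key values with the Application Key ID and
# Application Key from your B2 dashboard.
rclone config create b2 b2 account 001234567890abc key K001exampleApplicationKey
```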
Test the connection
$ rclone ls b2:my-backup-bucket
Set up a nightly cron
Create a script /usr/local/bin/nightly-backup.sh:
#!/bin/bash
DATE=$(date +%Y-%m-%d)
BACKUP_DIR=/tmp/backup
mkdir -p $BACKUP_DIR

# Database
mysqldump -u USER -pPASS --all-databases | gzip > $BACKUP_DIR/db-$DATE.sql.gz

# Files (full archive of the web root)
tar czf $BACKUP_DIR/files-$DATE.tar.gz /home/USER/public_html/

# Sync to off-site storage
rclone copy $BACKUP_DIR/ b2:my-backup-bucket/$(hostname)/

# Retain only last 30 days locally
find $BACKUP_DIR -mtime +30 -delete

# Retain only last 90 days remotely
rclone delete --min-age 90d b2:my-backup-bucket/$(hostname)/
Make it executable: chmod +x /usr/local/bin/nightly-backup.sh
Add to root's crontab via crontab -e:
0 3 * * * /usr/local/bin/nightly-backup.sh
Runs nightly at 3 AM. Easy to extend with notifications (email on failure) or extra paths.
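The email-on-failure extension can be a thin wrapper around the script. A minimal sketch; `run_with_alert` is a name made up here, and the `mail` call assumes a working local mail client (swap in mailx, msmtp, or a webhook as appropriate):

```shell
#!/bin/bash
# Run a backup command, keep its log, and alert someone on failure.
# The mail invocation is an assumption: replace it with whatever
# notifier your server actually has.
run_with_alert() {
  local log=/tmp/backup-$(date +%Y-%m-%d).log
  if "$@" > "$log" 2>&1; then
    echo "backup OK"
  else
    echo "backup FAILED, see $log"
    command -v mail > /dev/null \
      && mail -s "Backup FAILED on $(hostname)" admin@example.com < "$log"
  fi
}

# In cron you would call: run_with_alert /usr/local/bin/nightly-backup.sh
run_with_alert true   # stand-in command; prints "backup OK"
```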
Method 4: WHMCS reseller-side backups
If you're a reseller running WHMCS, JetBackup is a paid add-on we can enable for you that gives:
- Per-account scheduled backups
- Off-site destinations (S3, Backblaze, Google Drive)
- Per-file restore (no need to restore the whole account to recover one file)
- Self-service restore for your customers
Pricing scales by number of accounts. Open a ticket for a quote.
Restore drills
A backup you've never restored is a hope, not a backup. Test your restore process at least once per provider, ideally yearly. The drill:
- Pick a recent backup file from your off-site storage.
- Download it to a separate test environment (a $5 VPS, a local Docker container, anywhere).
- Extract files, import database, reconstruct what your hosting account contained.
- Verify a sample of pages and databases work as expected.
- Document every step you took. The next time, that documentation is the difference between calm and panic.
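Steps 2 and 3 can start with a cheap integrity check before you attempt a full restore. A sketch that fabricates a tiny stand-in archive just to show the commands; substitute the backup you actually downloaded:

```shell
#!/bin/bash
# Stand-in for a real downloaded backup: a tiny throwaway archive.
mkdir -p /tmp/restore-drill/site
echo '<h1>hello</h1>' > /tmp/restore-drill/site/index.html
tar czf /tmp/restore-drill/files.tar.gz -C /tmp/restore-drill site

# The actual check: verify gzip integrity, then list the tar's
# contents without extracting anything.
gzip -t /tmp/restore-drill/files.tar.gz \
  && tar tzf /tmp/restore-drill/files.tar.gz > /dev/null \
  && echo "archive OK"
```

If either command fails, the archive is damaged and you want to know now, not during a real outage.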
Common pitfalls
- Backups stored in public_html. They're web-accessible: anyone who finds the URL can download your entire database. Always store backups outside the web root.
- API keys committed to source control. Don't store rclone configs or backup scripts in a public Git repository, where the keys would be exposed.
- Overlapping backup runs. If the nightly cron is still running when the next one fires, you get two simultaneous tar processes. Use flock in the script to prevent overlap.
- Provider down at the wrong time. Use two destinations (e.g., Backblaze as primary, Google Drive for redundancy) for important data.
- Encryption forgotten. Backups contain everything: the database with hashed passwords, customer info, email contents. Use rclone's crypt backend or pre-encrypt with gpg before upload.
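The flock guard from the overlap pitfall takes only a few lines at the top of the backup script. A sketch; the lock file path is arbitrary:

```shell
#!/bin/bash
# Guard against overlapping runs: take an exclusive lock before doing
# anything, and bail out if a previous run still holds it.
exec 9> /tmp/nightly-backup.lock
if ! flock -n 9; then
  echo "previous backup still running, skipping" >&2
  exit 0
fi
echo "lock acquired, starting backup"
# ... rest of nightly-backup.sh goes here ...
```

The lock is tied to file descriptor 9, so it releases automatically when the script exits, even on a crash; there's no stale lock file to clean up.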