
Issue with pleskbackup on large backup files

nickmm

New Pleskian
Hi ;-)

Well, I hope someone can help me understand what's happening with large backups (I have customers with over 80 GB of data).

The issue only happens with large backups using split option:

/usr/local/psa/bin/pleskbackup clients-name CUSTOMER --split=2G --output-file=/CUSTOMER-BACKUP

If the backup is larger than 80 GB (for example), pleskbackup splits it into 40 files of 2 GB each. When I try to restore it, it fails:

cat CUSTOMER-BACKUP* > FULL-BACKUP

And then:

tar xf FULL-BACKUP

I'm getting these errors:

tar: Skipping to next header
tar: Exiting with failure status due to previous errors


At first I thought the problem was only with this one backup, but I've tested all my servers (Onyx 17.5.3) with the same results. Large backups fail to restore :( Splitting the backup results in a corrupted backup file.

If I make the backup without splitting, I can open and restore the backup file without any problem. It works fine.

So, what is happening? Different customers, different servers... but when the backup is big and I use split, the backup can't be restored after joining the pieces with cat. The backup is corrupted.

Could this be a bug? In Plesk? In Linux? In split or cat? Is it not safe to split large backups? Is it better to work with single large files (80/100 GB)?

I appreciate any help, and I'd like to know if anyone can reproduce it. For the moment I'm removing the split option from our backup scripts until there is a "safe" solution.
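
In case anyone wants to reproduce it, a minimal way to check whether the join itself is the problem (a sketch only, assuming GNU coreutils and that the wildcard expands the pieces in the right order) would be:

ls -l CUSTOMER-BACKUP*            # check the piece sizes and the order they expand in
du -cb CUSTOMER-BACKUP* | tail -1 # total bytes across all pieces
stat -c %s FULL-BACKUP            # size of the joined file; should match the total above
cat CUSTOMER-BACKUP* | sha256sum  # checksum of the pieces streamed together
sha256sum FULL-BACKUP             # should match the checksum above

If the sizes or checksums disagree, the problem is in the join (for example, wildcard ordering); if they agree, the pieces themselves were already written corrupted.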

Best regards ;-)
 
I don't understand why you are joining the pieces. You should start the restore from the first file only, and the other files will be joined and restored automatically.
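
For example, something along these lines (a sketch only; the exact name of the first split piece and the pleskrestore options depend on the Plesk version, so check pleskrestore --help):

# Point pleskrestore at the first piece; it picks up the remaining pieces itself.
# "/CUSTOMER-BACKUP" here stands in for whatever first file pleskbackup actually wrote.
/usr/local/psa/bin/pleskrestore --restore /CUSTOMER-BACKUP -level clients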
 

With large backups it's usual. It's easier to join the pieces, untar, get the data needed and move only that to the production server. Otherwise you have to move the whole big backup to the production server, causing heavy load, etc.
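
For what it's worth, the join step itself can be skipped by streaming the pieces straight into tar and extracting only the paths needed, so the full-size FULL-BACKUP file is never written (again a sketch, assuming the piece names expand in the right order):

# List the archive members first, then extract only what is needed;
# "path/you/need" is just a placeholder for the real path inside the archive.
cat CUSTOMER-BACKUP* | tar tf -
cat CUSTOMER-BACKUP* | tar xf - path/you/need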
 