Question: Best Practices for Backing Up (Large) Plesk Servers

Fede Marsell

Basic Pleskian
Server operating system version: AlmaLinux 8
Plesk version and microupdate number: 18.0.62
Hi!

I have been using Plesk for many years, and this is a question I have wanted to ask the community many times :)

For small servers, Pleskbackup is a really good tool. I have used it on many servers without errors, and the backups and restore processes work well.

But for large servers, for example, a server with 1 TB of data (web, mail, databases, etc.), is Pleskbackup still a good tool? :rolleyes:

In this case, the backup is split into 2 GB volumes, so a 1 TB server produces roughly 500 files of 2 GB each. I worry that if even one of these files is corrupted, I may not be able to restore the data.
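One general mitigation for that corruption worry (not Plesk-specific, and the paths and sizes below are made up for illustration) is to checksum the backup volumes right after they are written, and verify the manifest before you ever need a restore:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Simulate a backup archive and split it into 1 MiB volumes
# (pleskbackup splits real archives into much larger volumes).
head -c 3145728 /dev/urandom > backup.tar
split -b 1048576 -d backup.tar backup.tar.part.

# Record a checksum manifest right after the backup completes...
sha256sum backup.tar.part.* > MANIFEST.sha256

# ...and verify all volumes before attempting a restore.
sha256sum -c MANIFEST.sha256 && echo "all volumes intact"
```

Running the verification regularly (e.g. from cron) turns "I hope the 500 files are fine" into a routine check that flags a broken volume long before you depend on it.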

So, for large Plesk servers, what is the best backup practice?

Could Pleskbackup be a valid solution in this case (daily incremental backups plus a weekly full backup)? Or is additional software such as Borg Backup necessary?

Right now I have a server with 980 GB of data, and I am using Pleskbackup, but I want to know whether this is risky for large servers.

Thanks!!!
 
The built-in backup solution is not very useful for large Plesk servers with several terabytes of user data. Let's say you have a full server backup every night to remote storage. Have you ever tried a restore? If your free disk space is less than 50%, you cannot copy the backup back to your server's storage for the restore. Additionally, copying the data over the network can take hours, because the full backup always needs to be transferred and merged together, even for a single file.

On top of that, typical backup recovery tasks from our clients, such as...

a) "Can I have a database copy from 3 days ago? Please do not overwrite the live database."

b) "Can you give me a mailbox copy from 5 days ago? Please do not overwrite the current mailbox; I need a comparison in a temporary mailbox."

c) "Can I get a copy of my webroot from yesterday without overwriting the live system?"

...are simply not practical. Fast recovery of small, specific objects from external backups is not possible.

This gave us a lot of headaches in the past. We switched to other well-known tools like rdiff-backup, rsync, or rsnapshot.
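For anyone unfamiliar with why rsnapshot-style tools handle those recovery requests well: they keep rotating snapshots that hardlink unchanged files, so each extra snapshot is nearly free on disk and any single file is directly readable. A minimal sketch of that mechanism using plain GNU coreutils (paths and snapshot names are hypothetical):

```shell
set -e
work=$(mktemp -d)
cd "$work"

# "Live" data to protect
mkdir -p live
echo "v1" > live/site.conf
echo "big unchanged asset" > live/asset.bin

# Snapshot 0: a full copy
mkdir -p snapshots
cp -a live snapshots/daily.0

# A day later, one file changes...
echo "v2" > live/site.conf

# Rotation: rename the old snapshot, hardlink it into a new one
# (nearly free), then replace only the files that changed.
mv snapshots/daily.0 snapshots/daily.1
cp -al snapshots/daily.1 snapshots/daily.0
# --remove-destination breaks the hardlink first, so the old
# snapshot is not overwritten through the shared inode.
cp -a --remove-destination live/site.conf snapshots/daily.0/site.conf

# The unchanged file shares one inode across both snapshots.
[ snapshots/daily.0/asset.bin -ef snapshots/daily.1/asset.bin ] && echo "asset.bin deduplicated"
```

With this layout, "give me yesterday's webroot without touching the live system" is just a `cp` out of `snapshots/daily.1/`, no full-archive transfer or merge required.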
 
Additionally, copying the data over the network can take hours, because the full backup always needs to be transferred and merged together, even for a single file.

Are you saying that to restore a client, Plesk must retrieve all full backups to the local server? :rolleyes:
 
@Fede Marsell Yes, the last full backup plus, if they exist, all incremental backups up to the restore date need to be transferred first. If this is not possible, or the restore fails because of broken files (in a multivolume backup) or for other reasons, it gets tricky. You can find several threads in the forum dealing with possible ways to recover your data in such a situation.
 
We still use the Plesk backup with great success, even for servers with multiple TB of data - but of course only with local storage, as everything "remote" gets quirky with that much data.
This allows us and our customers to easily restore any type of object from the backups in a fast and convenient way.

The "downside" of this method is that you need a secondary backup solution to store your data offsite (we use block-based, imaging-type backup software for that).
 