
PSA Backup Script

sieb@ (Guest)
Ok, after months of testing, I have finally finished a friendly automated backup script for PSA. I have most of the rough edges polished, so it should be OK for general consumption. It's kept simple so that people other than me can figure it out. I know there are other fancy Perl scripts and such floating around, but that's too much work. :D I like keeping things simple on my servers; it means there's less I have to remember to configure if I ever have to restore. Just copy the script below and put it in a psadump.sh file.

What it does:
- Creates a directory based on the current date to house the dump files
- Creates a psadump for disaster recovery
- Creates manual archives of the vhosts and mailnames directories for off-hand restores
- Capable of migrating dumps to an NFS server
- Clears the Qmail queues
- Writes most output to a temp log file
- Emails the temp log file to the admin

I have this script run twice a week via a cron entry in PSA under root (sh /home/backup/psadump.sh); an example crontab line is below. I also have some rudimentary NFS commands in here, which I am still testing, to offload dumps to an NFS server. You will have to configure your server for NFS yourself, so I commented the NFS stuff out for you.
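For reference, a crontab line like the following gives a twice-weekly run; the 2am Wednesday/Saturday schedule is just an example, adjust to taste:

0 2 * * 3,6 sh /home/backup/psadump.sh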

You will have to remember to go in every other month or so and clean out past dumps. I like leaving them there just so I have a second copy somewhere, though this will cause issues with NFS (hence the "still testing").
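If you would rather automate the cleanup, a rough sketch like this (the 60-day cutoff is arbitrary) can go in cron or at the end of the script; skip it entirely if you want to keep the extra copies around:

# delete dump directories and log files older than roughly 60 days
find /home/backup -maxdepth 1 -name '20*' -mtime +60 -exec rm -rf {} \;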

You will also have to create the /home/backup folder if that's where you want the dumps (it's my biggest partition, so I leave them there), and remember to change the email address at the bottom of the script. I set the script to 777, but 700 should work.
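The one-time setup boils down to something like this (adjust the path if you keep your dumps elsewhere):

mkdir -p /home/backup
cp psadump.sh /home/backup/psadump.sh
chmod 700 /home/backup/psadump.sh
# then edit the mail line at the bottom of the script and put in your own address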

Also, the end of this script purges the Qmail queue using qmHandle, so you will need that installed too (everyone should! http://sourceforge.net/projects/qmhandle ). This is to clear spam and **** out of my remote queues. I figure that if it's still in there at 2am, even after I force Qmail to send everything, then it's not going to be leaving on its own.
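qmHandle is just a Perl script, so installing it is basically a matter of unpacking the tarball from the SourceForge page above and dropping the script where the backup script expects it. Roughly (the archive and script names depend on the version you grab):

tar -xzf qmhandle-*.tar.gz
# install it as /var/qmail/bin/qmhandle, since that is the path the backup script calls
cp qmhandle-*/qmHandle /var/qmail/bin/qmhandle
chmod 755 /var/qmail/bin/qmhandle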

The log file it creates only has the psadump output in it, with the dumped mailnames appended. I didn't do a verbose tar of the vhosts directory because that creates too much output, but if that's what you want, it's there; just add a v to the tar options (example below). I suppose if you have a lot of users you should remove the verbose flag from the mailnames tar as well, but again, that's up to you.
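For example, the verbose version of the vhosts line would be:

tar -czvf vhosts.tar.gz /home/httpd/vhosts >> "$BACKUPDIR/$DATE.log"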

**I take no responsibility if this borks your boxen.**

-----------------

#!/bin/bash
# Full PSA backup: psadump for disaster recovery, manual tars of vhosts and
# mailnames for quick restores, optional NFS offload, then Qmail queue cleanup.

DATE=$(date +%F)
BACKUPDIR=/home/backup

echo "***************************************"
echo "Initializing FULL PSA System Dump"
echo "-------------------------------------------------"
mkdir -p "$BACKUPDIR/$DATE"
/usr/local/psa/bin/psadump -F -f "$BACKUPDIR/$DATE" -z &> "$BACKUPDIR/$DATE.log"
sleep 5
echo "----------------------------------"
echo "Initializing Manual Dump"
echo "----------------------------------"
echo "Dumping Website Directory"
cd "$BACKUPDIR/$DATE"
tar -czf vhosts.tar.gz /home/httpd/vhosts >> "$BACKUPDIR/$DATE.log"
echo "----------------------------------"
echo "Dumping Qmail Mailboxes"
tar -czvf mailnames.tar.gz /var/qmail/mailnames >> "$BACKUPDIR/$DATE.log"
echo "----------------------------------"
echo "Dump Complete!"
# Optional NFS offload -- uncomment once the server is configured for NFS.
#echo "Updating Remote NFS Backup"
#echo "This May Take A While"
#echo "--------------------------------"
#mount 172.168.1.162:/nfs/00 /mnt/nfs
#cp -ur /home/backup /mnt/nfs
#cd /
#umount /mnt/nfs
#echo "--------------------------------"
#echo "Remote Backup Complete"
echo "----------------------------------"
echo "Clear Qmail Queues"
echo "----------------------------------"
cd /var/qmail/bin
./qmhandle -a                                  # try to deliver everything still queued
sleep 10
./qmhandle -D >> "$BACKUPDIR/$DATE.log"        # delete whatever refuses to leave
echo "DONE!"
echo "Mailing Logfile"
mail -s "Server Backup Log $DATE" [email protected] < "$BACKUPDIR/$DATE.log"
echo "---------------------------------"
echo "Returning to Normal Operations"
echo "---------------------------------"
 
Hi Sieb,

Thanks for your script. Can you post an example of the log file you receive by mail?

Thanks
 
The log file is just all the screen output piped to a file (which is then piped into an email). If you remove the >> "$BACKUPDIR/$DATE.log" redirects, all of that output will go to the console screen instead. Or you can check the log file itself.

As for what the actual output is, it's all the same stuff you would see from a normal psadump run.

As a note, I have come across a good MIME decoder that will decode psadump's archive files. This removes the immediate need for separate tars of the home and mailnames data, unless you need to do quick restores when users delete some website files. http://www.miken.com/uud/
 
Hi Sieb,

I downloaded the MIME tool, but I'm having some trouble using it. Can you explain how to set it up?

Thanks
 
You will need to load the dump file into UUDeview (this is the raw backup file after it has been ungzipped; the extension is .archive). Then do a "Preview". This will list a bunch of "unknown.xx" files; hit "Go" after selecting where you want them extracted to. These unknown files are gzipped tar archives of each domain and mail alias (and some other stuff), so just rename them to .tgz and you can go in and explore each archive for the files you need. Unfortunately, there is no way to know what's in which file until you check it out.
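If you would rather do the renaming and poking around from the shell, something like this works once UUDeview has dropped the unknown.xx files into a directory:

for f in unknown.*; do mv "$f" "$f.tgz"; done
for f in unknown.*.tgz; do echo "== $f"; tar -tzf "$f" | head; done   # peek inside each part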

This is why I still kind of prefer to have independent tars of everything and just reserve the psadump for server restores.
 
I notice you back up the home directories manually. Does psadump not do that? Is psadump only backing up configuration and MySQL? If not, does anyone know exactly what psadump will back up?
 
I make separate tars as a personal preference for "off-hand restores", like if I just need to restore a couple of files here and there. It's quicker than tearing into the psadump archive.
 
Beware. I just ran the simple vhosts directory backup and it locked me out of my VPS. Plus, I can't even ping the neighboring IPs. Hopefully their web server didn't go down. I wouldn't have thought this would use that many resources?
 
If you have a lot of sites, it will take a while, and a fair bit of resources, since it is tarring EVERYTHING in vhosts. As I've said before, you don't HAVE to do it. I only do it for quick one-off restores instead of parsing through PSA's dump file.

As for locking you out of a VPS, I can't comment on that since I don't use VPSes; all my PSA servers run on VMware GSX servers. But when the backup is running, my servers are still accessible. The actual tarring runs after PSA has already started back up.
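One thing you could try (it is not in the script as posted) is running the tars under nice, so the archiving gives way to everything else on the box, e.g.:

nice -n 19 tar -czf vhosts.tar.gz /home/httpd/vhosts >> "$BACKUPDIR/$DATE.log"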
 
It actually looks like it might be my web host's fault. They've been having faulty hardware issues. I will try this again on a new server and get back to you.

Thanks for your instructions.
 
I'm using this script from the crontab each night and it works well. However, when the script finishes, it leaves an instance of tar running in memory. I looked at my server today and saw a load of 3.5. I checked what was running and there were four instances of tar (one from each of the last four days).

Is there a way we could add a kill tar command at the end of the script? Any help would be appreciated! :)
 
I only run my backups twice a week and have not seen this problem. One thing to keep in mind is that if you have a lot of websites, the tars can take a long time, since each one makes one big file. If I had time, I would figure out how to get it to tar up individual sites.

A kill command at the end of the script would work, but I'm not proficient enough to get it to grep for the PID and kill it on its own. Anyone feel free to contribute!
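A rough, untested starting point might be something like this at the very end of the script; note it will kill ANY compressing tar on the box, not just the backup's:

# kill any leftover tar -cz processes (the [t] keeps grep from matching itself)
for pid in $(ps -eo pid,args | grep '[t]ar -cz' | awk '{print $1}'); do
    kill "$pid"
done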
 
/usr/local/psa/bin/psadump --nostop --nostop-domain --do-not-dump-logs -z -f - | split -b 1500m - psadump-${DATE}-part-

That will work to split the dump into 1.5 GB pieces. Disregard the ${DATE} variable, as it comes out of another script.
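When it comes time to restore, the pieces just get joined back together first (the date shown is only an example):

cat psadump-2006-01-15-part-* > psadump-2006-01-15.dump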
 
Originally posted by Mr.Yes
Hi Sieb,

did you find the time to do it?

Thanks

Nope, I haven't needed it. Tarring each individual site makes for very large, and long, backups.
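If anyone does want to try it, a rough sketch (assuming the stock /home/httpd/vhosts layout) would be one compressed tar per domain, so restoring a single site does not mean unpacking everything:

cd "$BACKUPDIR/$DATE"
for site in /home/httpd/vhosts/*; do
    [ -d "$site" ] || continue
    tar -czf "$(basename "$site").tar.gz" "$site" >> "$BACKUPDIR/$DATE.log"
done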
 