Posts Tagged ‘crontab’

CPanel full backup (all files+databases+emails) PHP script


 

I was looking for a working script to take a full backup (all files + databases + emails) of each cPanel account, one by one, either manually or through cron jobs on my hosting server. But most of the scripts out there are either old, totally unusable, or commercial.
So I wrote one for my own use and am sharing it here so others don't need to reinvent the wheel. Fill in your own cPanel and FTP details at the top of the script:

<?php
// Full cPanel backup (all files + databases + emails) pushed to a remote FTP server.
// Requires xmlapi.php (cPanel's XML-API PHP client) in the same folder.
include 'xmlapi.php';

// Replace these placeholder values with your own details
$cpanel_host        = 'your-server.example.com'; // cPanel server hostname
$cpanel_account     = 'cpaneluser';              // cPanel username
$cpanel_password    = 'cpanelpassword';          // cPanel password
$ftphost            = 'ftp.example.com';         // remote FTP server that stores the backups
$ftpacct            = 'ftpuser';                 // FTP username
$ftppass            = 'ftppassword';             // FTP password
$email_notify       = 'you@example.com';         // address notified when the backup finishes
$logs_dir           = '/';                       // FTP directory where the backups are kept
$backupexpireindays = 7;                         // delete backups older than this many days

$xmlapi = new xmlapi($cpanel_host);
$xmlapi->password_auth($cpanel_account, $cpanel_password);
$xmlapi->set_port('2083');

// Delete any older backup that is past the expiry time, before creating the new one
$conn_id      = ftp_connect($ftphost);
$login_result = ftp_login($conn_id, $ftpacct, $ftppass);
ftp_chdir($conn_id, $logs_dir);

$files = ftp_nlist($conn_id, ".");
foreach ($files as $filename) {
    $fileCreationTime = ftp_mdtm($conn_id, $filename);
    $fileAge = time() - $fileCreationTime;               // age of the file in seconds
    if ($fileAge > $backupexpireindays * 24 * 60 * 60) { // is the file older than the expiry time?
        ftp_delete($conn_id, $filename);
    }
}
ftp_close($conn_id);

// Ask cPanel to generate a full backup and push it to the FTP server (passive FTP, port 21)
$api_args = array('passiveftp', $ftphost, $ftpacct, $ftppass, $email_notify, 21, '/');
$xmlapi->set_output('json');
print $xmlapi->api1_query($cpanel_account, 'Fileman', 'fullbackup', $api_args);
?>

Save it with a .php extension and upload it to your server, then download the include file from xmlapi.zip (right click -> save as) and extract it to the same folder on your web server. Create a cron job from your cPanel, or trigger the script manually, to get a full backup on your FTP server. That's it.
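If you prefer to add the cron job from the shell instead of the cPanel UI, a crontab entry along these lines should do the job. The schedule and the paths to the PHP binary and to the script are only examples; adjust them to your own account:

    # Run the backup script every Sunday at 2 AM (paths are examples, adjust to your setup)
    0 2 * * 0 /usr/bin/php /home/cpaneluser/public_html/backup/fullbackup.php >/dev/null 2>&1

    # or, if you would rather trigger it over HTTP:
    0 2 * * 0 wget -q -O /dev/null http://example.com/backup/fullbackup.php

The redirection to /dev/null simply keeps cron from mailing you the script's output on every run; drop it if you want the JSON response from the API call in your inbox.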
OR
You can fork it from my GitHub repository: cpanel-Fullbackup
Enjoy


FTP automation on Linux


 

Ever wanted to automate FTP backups so that your important files are copied off-site during off-peak hours? I always love automation: let the machines do things automatically and help us humans 🙂 Read the rest of this entry »