
Posted

Hey,

Just wondering if this is possible and, if so, how to do it:

1. Automatically backing up the MySQL database

2. Automatically backing up the public_html folder

3. Zipping both into one file

4. Mailing it as a link or an attachment to the webmaster or a chosen email address, or at least sending a simple reminder email about the backup

 

Any ideas on how to do that and run it weekly or monthly as a cron job, as required?

Open to all suggestions and discussions.

Posted

That's an excellent question!

  • Create /home/rutaj6/backup.sh
  • Give it 755 permissions
  • Put the following contents in it:
#!/bin/bash

# create mysql dump(s); duplicate this line for other databases
/bin/mysqldump --user=rutaj6 --password=password rutaj6_database > /home/rutaj6/rutaj6_database.sql

# remove old backup
rm -f /home/rutaj6/backup.tar.gz

# archive and compress .sql files and public_html
cd /home/rutaj6/
/bin/tar zcvf /home/rutaj6/backup.tar.gz *.sql public_html/*

# attach backup to email with timestamp of when it was created
date | /bin/mutt -a "/home/rutaj6/backup.tar.gz" -s "Rutaj6 cPanel Automated Backup" -- rutaj6@email.com
  • Then go to https://tommy.heliohost.org:2083/frontend/paper_lantern/cron/index.html
  • Enter an email address for error messages to go to. The mysqldump command will always produce a warning, since you're passing a password on the command line (see the note after this list).
  • Select "Once a month" from the dropdown box.
  • Change the day/minute/hour to random numbers, e.g. always take the backup on the 5th of the month at 1:36am. If everyone selects the first of the month at midnight, it can cause load spikes.
  • In the command box, type "/home/rutaj6/backup.sh". If you want to prevent error messages from being emailed, you can append "2>/dev/null" to the end of the command.
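For reference, assuming the example schedule above (the 5th of the month at 1:36am), the resulting crontab entry would look like this:

36 1 5 * * /home/rutaj6/backup.sh 2>/dev/null

As for the password warning: instead of passing --password on the command line, you can put the credentials in a ~/.my.cnf file with 600 permissions, and mysqldump will read them automatically, which silences the warning. A minimal sketch:

[client]
user=rutaj6
password=password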
Posted

Tested the above commands after updating:

1) the cPanel user to rutaj6

2) the cPanel password to that of the user rutaj6

3) the database name to rutaj6_db, which is the name of my database

4) the email address to rutaj@outlook.com, which is my email address.

 

But unfortunately, it did not work.

I got no email of any error, no email for the backup, and there was no backup when I checked the directory.

Did I go wrong somewhere?

I'm not too good at Linux shell commands, but I made a few minor name changes and have scheduled it for the day after, just to test. I will set it to monthly once I know it works!

Thanks anyway.

Posted

I checked your script and it looked good, so I ran it manually (without a cron), and it created the .sql file and the backup.tar.gz file as expected. Did you receive it via email? If so, it looks like the problem might be in how you created the cron job.

 

30 6 * * 0 /home/rutaj6/backup.sh
That means it's set to run every Sunday at 6:30am. Keep in mind that cron jobs unfortunately run on local time, which is PST. What did you have it set to before?
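For anyone unfamiliar with the crontab format, the five fields are minute, hour, day of month, month, and day of week (0 = Sunday):

# minute hour day-of-month month day-of-week (0 = Sunday)
30 6 * * 0 /home/rutaj6/backup.sh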

 

I checked the cron log file and I don't see any cron job runs for your account. The reason it didn't work is that it never ran.

Posted

Yes, I got the mail.

I deleted all the files and the email and then reset the cron. It didn't work, so it's quite certain the problem is with the cron setting.

Just a few questions:

1) Does the cron run as per GMT or some other timezone?

2) Is there a minimum lead time you have to give for the crontab to propagate, or can it run 10 minutes after you set it?

 

I had set the cron to run 10 minutes later as per GMT and had given the command as per your directives, but it didn't run. I guess I will have to schedule the cron further out.

I have now set it to 0 8 * * 0, so I'm guessing it will run tomorrow at 8am, since tomorrow is Sunday (0).

 

Please tell me if there are any mistakes.

Posted

The server time zone is PST.

Your cron job will run at 16:00 Sunday UTC.
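If you ever need to double-check, the date command shows both clocks (assuming you run it on the server, e.g. from a test cron job):

date      # server local time (PST)
date -u   # the same instant in UTC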

Posted


Thanks!

Yep, that was the mistake. I was setting the cron as per GMT. Now that I've set it to PST, it ran as expected.

Thanks for all your help!

I have now set it to run monthly!

(3 weeks later...)
Posted

Hey rutaj6,

I would like to know the procedure you followed for doing this (backing up and mailing it).

However, I am new to this and am unfamiliar with many of the terms involved.

If you're willing to guide me (and all the learners reading this post), can you please post the detailed procedure, from scratch, to the HelioHost wiki?

Please. It's a request.

 

Once you are done, please post the link here.

 

 

If you are unable to do so because of your tight schedule, please post at least an outline of what exactly needs to be done (from scratch). I will email you for further assistance if you permit!

 

Thanks for your kind help!

Posted

Hey, I would love to help, but at present I don't have wiki access. I will request it and try to post as soon as possible.

Posted

Looks good :) Things you might want to add:

 

Create the MySQL dump:

/bin/mysqldump --user=cpUsername --password=cpPassword cpUsername_database > /home/cpUsername/cpUsername_database.sql

You should probably mention that this line needs to be repeated for each database if you have more than one. mysqldump also doesn't accept wildcards (you need extra code to do a SELECT from information_schema instead). I'm sure we'll get someone who tries a database name of cpUsername_% at some point...
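For anyone who wants to script that, here is a minimal sketch of the information_schema approach, assuming the same cpUsername/cpPassword placeholders as above (untested; the backslash escapes the underscore in the LIKE pattern so it matches literally):

# enumerate this account's databases, then dump each one to its own .sql file
for DB in $(/bin/mysql --user=cpUsername --password=cpPassword -N -B -e \
  "SELECT schema_name FROM information_schema.schemata WHERE schema_name LIKE 'cpUsername\_%';")
do
  /bin/mysqldump --user=cpUsername --password=cpPassword "$DB" > /home/cpUsername/"$DB".sql
done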

 

 

/bin/tar zcvf /home/cpUsername/backup.tar.gz *.sql public_html/*

 

If users have web roots for addon domains that are outside of public_html (I know at least 2 users who do this on Tommy), they need to add them to the end of this line if they wish to include them.

Posted

 

You should probably mention that this line needs to be repeated for each database if you have more than one. ...

I have already added that note, though it's not in bold, so it's easy to miss.

 

 

 

If users have web roots for addon domains that are outside of public_html, they need to add them to the end of this line. ...

I will add that now, but I'm not quite sure what the code for that will be!

 

 

Sounds interesting!

Thanks rutaj6.

Hitting a like, not for the post but for the effort you took to help the newbies.

 

 

Thanks a lot.

Thanks. The idea of creating the wiki article was yours; I would never have thought of it otherwise. And anyway, as a member of HelioHost, I take it as a duty to help as much as I can.

Posted

 

 

I will add that now, but I'm not quite sure what the code for that will be!

Give it a try for sure. Failure can even drive another good output! :)

 

 

... as a member of HelioHost, I take it as a duty to help as much as I can.

 

We are not just users of HelioHost. We are a community. Makes sense! :)

Posted

 

I will add that now, but I'm not quite sure what the code for that will be!

 

Any additional directories would just get appended to the end of the command, like:

/bin/tar zcvf /home/cpUsername/backup.tar.gz *.sql public_html/* directory2/* directory3/*
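Note that these paths are relative to /home/cpUsername/ because the script cd's there before running tar. For example, with a hypothetical addon web root at /home/cpUsername/addon.example.com, the line would become:

cd /home/cpUsername/
/bin/tar zcvf /home/cpUsername/backup.tar.gz *.sql public_html/* addon.example.com/*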
