rutaj6 Posted December 30, 2016

Hey, just wondering if this is possible, and if so, how to do it:

1. Automatically back up the MySQL database.
2. Automatically back up the public_html folder.
3. Zip everything into one file.
4. Mail it as a link or an attachment to the webmaster or a chosen email address, or at least send a simple reminder email about the backup.

Any ideas on how to do this and run it weekly or monthly as a cron job, as required? Open to all suggestions and discussion.
Krydos Posted December 30, 2016

That's an excellent question!

Create /home/rutaj6/backup.sh, give it 755 permissions, and put in the following contents:

#!/bin/bash
# create mysql dump(s) - duplicate this line for other databases
/bin/mysqldump --user=rutaj6 --password=password rutaj6_database > /home/rutaj6/rutaj6_database.sql
# remove old backup
rm -f /home/rutaj6/backup.tar.gz
# archive and compress .sql files and public_html
cd /home/rutaj6/
/bin/tar zcvf /home/rutaj6/backup.tar.gz *.sql public_html/*
# attach backup to email with timestamp of when it was created
date | /bin/mutt -a "/home/rutaj6/backup.tar.gz" -s "Rutaj6 cPanel Automated Backup" -- rutaj6@email.com

Then go to https://tommy.heliohost.org:2083/frontend/paper_lantern/cron/index.html

Enter an email address for error messages to be sent to when you set up the cron job. The mysqldump command will always produce a warning since you're passing a password on the command line.

Select "once a month" from the dropdown box. Change the day/hour/minute to random numbers, for example always take the backup on the 5th of the month at 1:36am. If everyone selects the first of the month at midnight it can cause load spikes.

In the command box type "/home/rutaj6/backup.sh". If you want to prevent error messages from being emailed you can add "2>/dev/null" to the end of the command.
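Put together, the schedule above is equivalent to a raw crontab entry like the sketch below (cPanel builds this line for you from the dropdowns; the 5th at 1:36am is just the example time mentioned above, and the 2>/dev/null part is optional):

# run backup.sh at 1:36am on the 5th of every month, discarding error output
36 1 5 * * /home/rutaj6/backup.sh 2>/dev/null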
rutaj6 Posted December 31, 2016

Tested the above commands after updating:

1) the cPanel user to rutaj6
2) the cPanel password of the rutaj6 user
3) the database name to rutaj6_db, which is the name of my database
4) the email address to rutaj@outlook.com, which is my email address

But unfortunately, it did not work. I got no error email, no backup email, and there was no backup when I checked the directory. Did I go wrong somewhere?

I'm not too good at Linux shell commands, but I have made a few minor name changes and have scheduled it for the day after, just to test. I will set it to monthly once I know that it works! Thanks anyway.
Krydos Posted December 31, 2016

I checked your script and it looked good, so I ran it manually (without a cron) and it created the .sql file and the backup.tar.gz file as expected. Did you receive it via email? If so, it looks like the problem might be in how you created the cron job.

30 6 * * 0 /home/rutaj6/backup.sh

That means it's set to run every Sunday at 6:30am. Keep in mind that cron jobs unfortunately run on local server time, which is PST. What did you have it set to before? I checked the cron log file and I don't see any cron job runs for your account. The reason it didn't work is that it never ran.
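For reference, the five fields before the command in a standard crontab entry are read minute, hour, day of month, month, and day of week, so that line breaks down like this:

# minute  hour  day-of-month  month  day-of-week
#   30     6         *          *         0        -> 6:30am every Sunday (0 = Sunday)
30 6 * * 0 /home/rutaj6/backup.sh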
rutaj6 Posted December 31, 2016

Yes, I had got the email. I deleted all the files and the email and then reset the cron. It didn't work, so it's fairly certain the problem is with the cron setting.

Just a few questions:

1) Does the cron run as per GMT or some other timezone?
2) Is there some minimum lead time you have to give for the crontab to propagate, or can it run 10 minutes after you have set it? I had set the cron to run 10 minutes later as per GMT and had given the command as per your directions, but it didn't run. I guess I will have to schedule the cron for a later time.

I have now set it to 0 8 * * 0, so I'm guessing it will run tomorrow at 8am since tomorrow is Sunday (0). Please tell me if there are any mistakes.
bdistler Posted December 31, 2016

The server time zone is PST, so your cron job will run at 16:00 Sunday UTC.
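If you are ever unsure what the server considers local time, one sketch of a quick check (not required, and you should delete the entry once it has run) is a throwaway cron job that just emails the output of date:

# one-off check: emails the server's local time and the UTC equivalent side by side
* * * * * /bin/date; /bin/date -u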
rutaj6 Posted December 31, 2016

(Quoting bdistler) "The server time zone is PST, so your cron job will run at 16:00 Sunday UTC." Thanks!

(Quoting Krydos) "Keep in mind that cron jobs unfortunately run on local server time, which is PST. ... The reason it didn't work is that it never ran." Yep, that was the mistake. I was setting the cron as per GMT. I have now set it to PST and it ran as expected. Thanks for all your help! I have now set it to run monthly.
giteshss2 Posted January 18, 2017

Hey rutaj6, I would like to know the procedure you followed for doing this (backing up and mailing it). However, I am new to this and am unfamiliar with many of the terms involved. If you are willing to guide me (or all the learners reading this post), could you please post the detailed procedure, from scratch, to the HelioHost wiki? Please, it's a request. Once you are done, please post the link here. If you are unable to do so because of a tight schedule, please at least post an outline of what exactly needs to be done from scratch. I will email you for further assistance if you permit. Thanks for your kind help!
rutaj6 Posted January 19, 2017

Hey, I would love to help, but at present I don't have wiki access. I will request it and try to post as soon as possible.
rutaj6 Posted January 20, 2017

As requested by giteshss2, I have created the wiki page. This is the link: http://wiki.helionet.org/Running_Auto_Backups

It has a detailed procedure for creating and setting up the file.
wolstech Posted January 20, 2017

Looks good. Things you might want to add:

Create the MySQL dump:
/bin/mysqldump --user=cpUsername --password=cpPassword cpUsername_database > /home/cpUsername/cpUsername_database.sql

You should probably mention that this line needs to be repeated for each database if you have more than one. It also doesn't accept wildcards (you need extra code that does a SELECT from the information_schema instead). I'm sure we'll get someone who will try a database name of cpUsername_% at some point...

/bin/tar zcvf /home/cpUsername/backup.tar.gz *.sql public_html/*

If users have web roots for addon domains that are outside of public_html (I know at least 2 users who do this on Tommy), they need to add them to the end of this line if they wish to include them.
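To illustrate the information_schema approach mentioned above, a rough sketch of a loop that dumps every database starting with the cpUsername_ prefix could look like this (cpUsername and cpPassword are placeholders, and the paths should match the rest of backup.sh):

# look up all databases owned by this account via information_schema,
# since mysqldump itself does not accept wildcards, then dump each one
for DB in $(/bin/mysql --user=cpUsername --password=cpPassword -N -B \
  -e "SELECT schema_name FROM information_schema.schemata WHERE schema_name LIKE 'cpUsername\_%'")
do
  /bin/mysqldump --user=cpUsername --password=cpPassword "$DB" > "/home/cpUsername/$DB.sql"
done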
giteshss2 Posted January 20, 2017

Sounds interesting! Thanks, rutaj6. Hitting a like, not for the post but for the effort you took to help the newbies. Thanks a lot.
rutaj6 Posted January 21, 2017

(Quoting wolstech) "You should probably mention that this line needs to be repeated for each database if you have more than one. It also doesn't accept wildcards..." I have already added that line in the note, though it's not in bold, so it's easy to miss.

(Quoting wolstech) "If users have web roots for addon domains that are outside of public_html ... they need to add them to the end of this line if they wish to include them." I will add that now, but I'm not quite sure what the code for that will be!

(Quoting giteshss2) Thanks. The idea of creating the wiki article was yours; I would never have thought of it otherwise. And anyway, as a member of HelioHost, I take it as a duty to help as much as I can.
giteshss2 Posted January 22, 2017

(Quoting rutaj6) "I will add that now, but I'm not quite sure what the code for that will be!" Give it a try for sure. Even a failure can lead to another good outcome!

(Quoting rutaj6) "As a member of HelioHost, I take it as a duty to help as much as I can." We are not just users of HelioHost. We are a community. Makes sense!
Krydos Posted January 22, 2017

(Quoting rutaj6) "I will add that now, but I'm not quite sure what the code for that will be!"

Any additional directory would just get appended to the end of the command, like:

/bin/tar zcvf /home/cpUsername/backup.tar.gz *.sql public_html/* directory2/* directory3/*
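As an optional sanity check (just a common tar idiom, not something specific to this setup), you can list what actually went into the archive without extracting it:

# list the archive contents to confirm the extra directories were included
/bin/tar ztvf /home/cpUsername/backup.tar.gz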