Great video. I used this for many months but recently wrote a new script. I just have one script rather than three. It runs the daily backup every day and then uses an if-then check: if it's Friday, it COPIES the daily file to the weekly folder, and if it's the first day of the month, it copies the daily file to the monthly folder. That way you don't run the whole archive three separate times, and there's only one cron job. The rest is totally based on your tutorial. Thanks so much.
Can you share your version via github or something? It sounds like exactly what I'm looking for
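For anyone looking for the same thing, here is a minimal sketch of that single-script idea, assuming the folder layout from the video (/home/tony/backup with daily, weekly, and monthly subfolders) and archiving /var/www/html; the paths are placeholders, so adjust them to your own setup:

#!/bin/bash
# build today's daily archive (this is the only script cron needs to run)
TODAY=$(date +%Y%m%d)
DAILY=/home/tony/backup/daily/backup-$TODAY.tar.gz
tar -zcf "$DAILY" -C /var/www/ html

# on Fridays (date +%u prints 5), copy the same archive into the weekly folder
if [ "$(date +%u)" = "5" ]; then
    cp "$DAILY" /home/tony/backup/weekly/
fi

# on the first day of the month, copy it into the monthly folder
if [ "$(date +%d)" = "01" ]; then
    cp "$DAILY" /home/tony/backup/monthly/
fi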
I've spent so long looking for some of the contents of this tutorial online and found it here in one place. 2nd of your videos that I've found that have absolutely NAILED what I needed to know. Awesome tutorial, thank you.
You're very welcome!
This is gold. Amazing presentation of information ! You would make a great teacher
Glad it was helpful! Subscribe if you haven't already done so 🙏
Very nice tutorial, especially for those of us who know little about Linux. This method assumes you have enough space on the source machine to create the compressed backups before transferring them to the destination.
Thanks!
I've been using linux for years and it's still a really helpful video!
Thank you so much, the best 22 minutes invested.
Nice video on tar, rsync and crontab 👍
I shall try the script, but I'm also experimenting with piping each of the commands into grep for errors and further piping into either touch error or cat > error-$(date +%Y%m%d), all in order to gate the steps and log errors (rough sketch below).
I shall also experiment with creating the tar.gz files on a volatile RAM disk. Might be better for speed and wear. Dunno.
Kindest regards, friends and neighbours.
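A rough sketch of that gating idea, with placeholder paths and a placeholder rsync destination. One note: tar and rsync report problems on stderr and through their exit codes rather than on stdout, so the sketch checks those instead of grepping the output:

#!/bin/bash
# hypothetical log file, one per day
LOG=/home/tony/backup/error-$(date +%Y%m%d).log

# step 1: archive; append any stderr to the log and stop here if tar fails
if ! tar -zcf /home/tony/backup/daily/backup-$(date +%Y%m%d).tar.gz -C /var/www/ html 2>>"$LOG"; then
    echo "tar failed at $(date)" >> "$LOG"
    exit 1
fi

# step 2: only reached if the archive step succeeded
if ! rsync -av /home/tony/backup/ user@backup-host:/path/to/backups/ 2>>"$LOG"; then
    echo "rsync failed at $(date)" >> "$LOG"
    exit 1
fi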
Really nice video covering useful simple scripting! Sweet!
Thanks!
Excellent tutorial
Exactly matched to my scenario
Thank you so much
Great to hear!
thanks for offering this video. Would have been perfect if you also included the restore process.
Awesome video. Thank you for your explanations and details.
Awesome man❤... Really helpful
Very well explained video!! Thanks for sharing!
Amazing thank you for this. Very good explanations.
great video! perfect for what i need - also subscribed
Great video. I did need it. Thanks for doing the research (and for the videos)
No problem!
thanks for the date command, I used to do that manually before
You're welcome Vivek
Great job and best explanation
Thanks!
Thanks for the guide.
Great video, thanks for your help
No problem 👍
Thank you for a great tutorial.
You are welcome!
great video. excellent explanation
Thank you for your video. Very useful.
What if you only want to back up changes to your directory? Presumably tar is not what you need for this? E.g. if I want to back up my entire Plex library for redundancy purposes.
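For change-only backups, rsync by itself is usually the simpler tool, since it only transfers files that are new or have changed since the last run. A rough sketch, with the library path and destination as placeholders:

# copy only new/changed files; --delete also removes files on the destination
# that no longer exist in the source, so leave it off if you want to keep them
rsync -av --delete /path/to/plex-library/ user@backup-host:/backups/plex/

If you want compressed, versioned archives of only the changes, GNU tar's --listed-incremental option is another route, but for a large media library plain rsync is generally easier.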
Very nice! You're a great teacher ~~ easy to follow :-) It helped me. Thank you
Thanks!
I never remember about the -C tar flag... what a handy option.
Never knew that you could so easily cull old files with a simple find option
Not a fan of blanket passwordless root access via SSH keys... will you address this? 9 more minutes, maybe that'll be in part 2
I don't understand the bit about deleting and then "oops no we actually need that because [unintelligible]"
I love the stream-of-consciousness motif where you show the mistakes like "backup" vs "backups" and checking in realtime whether the cron job is calling the correct path.
Looks like a workable backup solution with minimal fuss, thanks for sharing your expertise. Subscribed.
A good video. I was wondering why you used three scripts when one would have done the job; that way you could have avoided the problem of the rsync command running before tar had completed its task. I was also wondering, since you created the three scripts, why you didn't move them into the crontab directory structure where jobs are run daily, etc. I agree with Anders A that you really need some logging, so perhaps make a follow-on video showing how to add logging to the script.
Awesome! You really helped me. Thank you so much.
Sure thing!
Shouldn't you execute the rsync command at 01:00 AM, for example, to make sure that all the backups are made before moving them?
Really what I was looking for......👍
Great
Clearly explained! Thanks! Any ideas on backing up to Google Drive?
well done - thanks for sharing.
Great, great videos! Actually, how do I include the password option in the script without using a public key? Thanks in advance!!
Excellent!! Greetings from Perú!! Teacher
Thank you :) you're awesome
You’re welcome!
Would an 8GB USB drive be enough for a system backup without /home, etc.?
How do I take an image of the entire Linux server and restore it later?
Many THANKS
Is there any reason to run separate tar commands for the monthly and weekly backups?
Couldn't they just copy or hardlink the tar file that the daily script creates at that point? That would decrease hard drive usage and CPU usage, since you wouldn't have to compress your files so many times.
That sounds like a valid point. In no way was I considering efficiency in this tutorial. Thanks!
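As a sketch of that copy/hardlink idea, assuming the daily script has already produced today's archive (paths are placeholders):

TODAY=$(date +%Y%m%d)
# a hard link costs no extra disk space; use cp instead if you want an independent copy
ln /home/tony/backup/daily/backup-$TODAY.tar.gz /home/tony/backup/weekly/backup-$TODAY.tar.gz

Note that a hard link only works within the same filesystem, and both names point at the same data, so it saves space but doesn't protect against corruption of that one file; the data is only freed once every link to it has been deleted.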
With huge amounts of data, it might well happen that you run out of the allocated time for backing up. I modified your (otherwise great!) tutorial to just copy once a week / month / year. A link wouldn't do the trick, as the intention is to maintain older backup versions.
How about making the filename the username of the account + the date?
Can you explain the options?
Awesome video....loved every bit of it😍👍
Is there any reason why there is a space between the 'var/www' and the 'html' directories?
Oh for this command? tar -zcf /home/tony/backup/monthly/backup-$(date +%Y%m%d).tar.gz -C /var/www/ html
The -C flag means to change to the /var/www directory and the html argument is the directory that we want to archive
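To spell out why there is a space there: -C takes the directory to change into as its own argument, and html is then a separate argument naming what to archive. So the two commands below are roughly equivalent (the output path is just an example), and either way the archive stores paths relative to /var/www rather than absolute ones:

tar -zcf /home/tony/backup/backup.tar.gz -C /var/www/ html
# behaves much like changing directory first and archiving from there
cd /var/www && tar -zcf /home/tony/backup/backup.tar.gz html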
@TonyTeachesTech Awesome...thanks a million 🤗
"Execute the daily backup script everyday at 12:15 AM" - What if I want to backup at 8:30 PM for example. Thanks
Then modify the cron schedule accordingly.
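For example, for 8:30 PM the crontab entry would look like this (the first field is the minute, the second is the hour in 24-hour time; the script path is a placeholder):

30 20 * * * /home/tony/scripts/daily-backup.sh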
Hello.
When I use the find delete command with +3, it does not want to delete any files, but when I use -3 it deletes new files... I have files in the folder that are older than 3 days, but they don't get deleted...
find /mnt/exe/backups/nextcloud-db/* -mtime +3 -delete
I think you understand this, but just to make sure: the +/- depends on how long ago the files were last modified, i.e. their timestamp
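One gotcha that may explain what you're seeing: find truncates a file's age to whole 24-hour periods before comparing, so -mtime +3 only matches files at least 4 full days old; a file that is 3.5 days old counts as 3 days and is skipped. Something like the following (same path as above, with -mtime +2 meaning 3 or more full days old, and -type f so only files are considered) is closer to "older than 3 days":

find /mnt/exe/backups/nextcloud-db/ -type f -mtime +2 -delete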
Put a caveat on the --delete command
Yeah
Hi there, I've been trying for a whole week now to get rsnapshot running. Please, can you give advice or recommend a tutorial for Linux beginners? We are running Ubuntu Server, with no problem creating Samba servers or a webserver that serves us well, but until now - and after a dozen very different tutorials online (and every one of them so different) - we are just a few days from ditching the whole Linux story and running everything with Windows until ransomware takes us out. Before you recommend we count the days until that happens and say we deserve it - pleeeease consider that out there there are many small NGOs besides us (like AIDS help, abortion help, transsexual help, and other NGOs with just a few people who are really helpless and will shut down one after another in those days :( )
very nice thanks
Thanks James
Hello,
What about the database backup?
You can look here ua-cam.com/video/_zu9ss2RX-0/v-deo.html
Any way to have the destination be Google Drive?
No I don't think so
Tony, the contact form on your site is broken. "There was an error trying to send your message. Please try again later."
I'm not seeing this issue. What browser are you using?
@TonyTeachesTech I've tried in Brave and Firefox and Opera :) (Opera is default, no plugins, no changes)
@laci272 Thanks for letting me know. I'm not sure what's going on here. It's working for me on Chrome, Firefox, and Opera.
Unfortunately there's no log file that I can check to debug.
I changed the error message though to include my email address as an alternative method of contact. Please try again to see if that shows up for you and then feel free to contact me that way.
@TonyTeachesTech I still get the old message :) There was an error trying to send your message. Please try again later.
Wait... I'm on tonyteaches.tech... are we talking about the same site?
@laci272 I really apologize. I'm not sure what's going on here. Another way to get in contact with me directly is to get my email on my UA-cam about page ua-cam.com/channels/WPJwoVXJhv0-ucr3pUs1dA.htmlabout
Cool!
:)
Thanks bro
Sure thing
Should work on a mac also, correct?
Yeah. Only thing is a cron would require your mac to be on at the time it executes.
First of all, thanks for the video, very useful. It does, however, need editing, as in several places you either make a mistake or get a bit lost, so it would definitely benefit from an hour or two of editing. Still a good video though. Thank you!
Sorry for that. I too make mistakes ;)
Borg is a better solution. Look for Borg backup.
What do you do when tarring files that are owned by root? I got many "permission denied" errors.
Use root permissions when tarring them.
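For example (same tar command as in the video, just run with elevated privileges):

# sudo lets tar read files owned by root; the resulting archive will be owned by root too
sudo tar -zcf /home/tony/backup/daily/backup-$(date +%Y%m%d).tar.gz -C /var/www/ html

If the cron job itself needs root access, putting it in root's crontab (sudo crontab -e) avoids needing sudo inside the script.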