Author Topic: Linux backup strategy  (Read 444 times)

benali72

  • CCF Winner's Circle - Supporter
  • Wise Sage
  • *
  • Posts: 2731
Linux backup strategy
« on: November 05, 2017, 01:01:48 am »
What's a good backup strategy for a Linux Mint PC?

Here are my ideas. I'm posting them here to solicit any improvements you might have.

--  AUTOMATION -- with just one PC, heavy scripting isn't really needed
--  DISK HEALTH -- occasionally open the Disks tool (gnome-disks) and check the SMART output to make sure there are no bad sectors
--  FILESYSTEM HEALTH -- occasionally run fsck on all filesystems
--  BACKUP -- use any of the standard GUI backup tools that support incrementals. Check the return code to make sure each backup actually succeeded. (A minimal command-line sketch follows below.)
--  DATA SEGREGATION -- separate data into manageable units
     *  Keep OS and user data separate
     *  For user data, I have separate partitions or folders for:
          *  working set
          *  semi-archival data
          *  archival data
          *  videos and photos
          *  VM guests (*.vdi files)
     *  I back up the working set daily, the others less frequently
     *  Separating data sets in this or a similar fashion makes backup/restore easier (even when using incrementals)
--  ENCRYPTION -- don't use it unless you need it. If you do, use Mint's default encryption for home directories and partitions
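
For illustration, here's a minimal sketch of the kind of daily working-set job and occasional health checks I mean. The paths, device names, and schedule are placeholders, not recommendations - adapt them to your own layout:

    #!/bin/bash
    # daily-backup.sh - mirror the working set to an external drive and check the exit code
    SRC=/home/me/working-set/
    DEST=/media/backup-drive/working-set/
    rsync -a --delete "$SRC" "$DEST" || { echo "backup FAILED" >&2; exit 1; }

    # occasional health checks, run by hand (smartctl needs the smartmontools package,
    # and fsck should only be run against unmounted filesystems)
    sudo smartctl -H /dev/sda
    sudo fsck -n /dev/sda2

Cron can run the script daily; the health checks are better done interactively so you actually see the output.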

Any suggestions for improvement?   Thanks for your feedback.

G0ddard B0lt

  • I absolutely DESPISE improvised sulfur-charcoal-salt peter cannons made out of hollow tree branches filled with diamonds as projectiles.
  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 22672
  • Gorn Classic, user of Gornix
Re: Linux backup strategy
« Reply #1 on: November 05, 2017, 06:26:33 am »
I tried "Back in time", a program that showed up in my Mint program menu. I read up on it - it is a wrapper around rsync and diff. (I've been using rsync as a remote FTP running in a cron job for months to back up websites so I am a bit familiar with it.)

For convenience, I had a link in my home directory to the same external drive I used for the backup. What I found was that after about 8 hours the backup got stuck, and the almost indecipherable status line on the screen showed that the backup was trying to back up all of the files on the external drive (through the link) back to itself!!!  >:(

I looked around, and rsync by default is claimed NOT to do this.

I mean, GOD D***, common sense!! Aspie dumbasses who wrote the program!! Do a reality check of each folder you descend into, it's not exactly goddamned rocket science.

Anyway, I killed all of the processes and restarted it, after removing the link and adding exclusions for the /mnt and /media directories.

It completed in a couple more hours. It did what was claimed: it mirrored my / root filesystem to a backup path on the external drive.
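
For anyone scripting this directly with rsync instead of going through a front end, the safeguards amount to roughly the following (the destination path is a placeholder):

    sudo rsync -aAXH --one-file-system \
        --exclude=/mnt --exclude=/media --exclude=/proc --exclude=/sys --exclude=/dev \
        / /media/backup-drive/root-mirror/

--one-file-system stops rsync from descending into other mounted filesystems, and rsync's default is to copy symlinks as symlinks rather than follow them, which is presumably the behavior Back In Time was supposed to inherit.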

Now, "Back in time"'s front end claims that the archive can't be restored. I guess I could manually copy it.

After all of that time spent.

F&^# Linux utility programs and the aspies who write them.

Linux: halfassed, never, ever good enough for day to day office use, yet another such experience of many I've had.

Anyway, YES, I am down with copying clear data, no encryption, and just using the space necessary to copy the data without compression.
Gornix is protected by the GPL. *

* Gorn Public License. Duplication by inferior sentient species prohibited.

unix

  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 4296
Re: Linux backup strategy
« Reply #2 on: November 05, 2017, 08:32:27 am »
LOL

That's how I feel.

Sadly, our entire department is moving from 'real' Unix to Red Hat Enterprise. I will install Red Hat Enterprise on my machine to become familiar with it.
Brawndo. It's got what plants crave.

G0ddard B0lt

  • I absolutely DESPISE improvised sulfur-charcoal-salt peter cannons made out of hollow tree branches filled with diamonds as projectiles.
  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 22672
  • Gorn Classic, user of Gornix
Now evaluating backup-manager
« Reply #3 on: February 17, 2018, 11:22:25 pm »
backup-manager is in the standard packages for Mint (apparently) and I've installed it.

I tried fwbackups (which is not in the standard packages and has to be installed manually). The issue is that if you select incremental backups, it just copies the files, along with their ownership and permission flags, to the backup store. So if your backup media is vfat, which doesn't support Unix-style ownership and permissions, those attributes can't be preserved.  >:( It threw errors at the end of the backup indicating that its attempts to set permissions had failed.

I wasted a bit of time creating a full home directory backup only to find that it's not really a full backup per Unix standards. I mean, I have my data, but I'd have to fiddle with permissions manually if I ever restored the stuff to a live system.
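
One way around that vfat limitation, if you're doing a plain file copy anyway, is to let tar record the metadata inside the archive so the destination filesystem no longer matters. A rough sketch, with the user name and mount point as placeholders:

    # create: ownership and permission bits are stored inside the archive,
    # regardless of what the destination filesystem supports
    sudo tar -cpzf /media/usb-stick/home-me-full.tar.gz -C /home me

    # restore later; run as root so ownership gets re-applied
    sudo tar -xpzf /media/usb-stick/home-me-full.tar.gz -C /home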

Now I am evaluating the GNU package backup-manager. No GUI to speak of, just a command line and a config file. Apparently it hasn't been updated since 2014.

Perfect.  8)

It creates gzipped tar archives, so it's independent of the destination filesystem's restrictions - and it also supports incremental backups. And it supports a plethora of storage options: burning DVDs, sftp, rsync, scp, and even AWS storage.
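
For anyone curious, the whole thing is driven by /etc/backup-manager.conf. Going from the documentation, the core of a setup like mine looks roughly like this - the paths are placeholders, and the exact variable names are worth double-checking against the comments in the shipped config:

    export BM_REPOSITORY_ROOT="/media/backup-drive/archives"   # where the archives land
    export BM_ARCHIVE_METHOD="tarball-incremental"             # full masters plus incrementals
    export BM_TARBALL_FILETYPE="tar.gz"                        # archive format (dar is also accepted)
    export BM_TARBALL_DIRECTORIES="/home/me /etc"              # what to back up
    export BM_ARCHIVE_TTL="60"                                 # days to keep old archives

    # the BM_TARBALLINC_* settings control how often a fresh master archive is cut

Then you run backup-manager as root, by hand or from cron.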

I'm currently backing up my 1+ TB home directory. I'll report back on how it goes, and also how the subsequent incremental updates go.

So far this seems like the best option for anyone who has the technical chops to use something that is not GUI driven.
Gornix is protected by the GPL. *

* Gorn Public License. Duplication by inferior sentient species prohibited.

G0ddard B0lt

  • I absolutely DESPISE improvised sulfur-charcoal-salt peter cannons made out of hollow tree branches filled with diamonds as projectiles.
  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 22672
  • Gorn Classic, user of Gornix
Re: Linux backup strategy
« Reply #4 on: February 18, 2018, 07:58:48 am »
^ G**d***ed  fucking software.

When I came out to take a look 8 hours later there was an error message and the command was frozen without completing:

"Tar reported a file changed during archive creation."

No log file, of course.

Also I don't see a way to make it split up the G** D**** archives it generates, so I wound up with a single 281 GB .tar.gz file.
Gornix is protected by the GPL. *

* Gorn Public License. Duplication by inferior sentient species prohibited.

benali72

  • CCF Winner's Circle - Supporter
  • Wise Sage
  • *
  • Posts: 2731
Re: Linux backup strategy
« Reply #5 on: February 18, 2018, 09:01:57 am »
With a 1+ terabyte home directory, would mirrored disk writes make sense (RAID)?

That way you're always in sync and not running a big batch backup job.

You need enough drive connectors (SATA ports) on your motherboard for this to work. Or you can buy a RAID card (usually designed specifically for this purpose, and often including appropriate software). I'd check out the software very carefully, as you'd be dependent on it.
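
On Linux the no-card route is software RAID with mdadm; roughly, a two-disk mirror looks like this (device names are placeholders, and mdadm --create wipes whatever is on the named partitions):

    sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb1 /dev/sdc1
    sudo mkfs.ext4 /dev/md0       # put a filesystem on the new mirror
    sudo mount /dev/md0 /mnt/data
    cat /proc/mdstat              # watch the initial sync and later health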

I've never done this on a personal computer myself, but I have a friend who's operated this way for years and says it works for him.

Another idea that might be useful is to break up the monolithic backup job into several smaller jobs, each handling certain directories or types of files. For example, split out the backup of photos (which never change) from highly volatile files. Not everybody's data can be segregated like this, but where it applies it helps.




G0ddard B0lt

  • I absolutely DESPISE improvised sulfur-charcoal-salt peter cannons made out of hollow tree branches filled with diamonds as projectiles.
  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 22672
  • Gorn Classic, user of Gornix
Re: Linux backup strategy
« Reply #6 on: February 18, 2018, 10:18:09 am »
With a 1+ terabyte home directory, would mirrored disk writes make sense (RAID)?

It would in some ways. But one of the risks I want to be insulated from is losing everything in the box to static, fire, or a power surge. For that reason I don't want the backup drive connected at all times. I really want my main backup to be on a portable HD.

When I first set up this PC as a Windows 7 box nine years ago (holy crap, time flies) I set it up with dual RAID1 drives (mirrors of each other). Then, after a crash I recovered from in 2014, I broke the RAID and just kept one of the drives. Configuring the drivers and getting everything to work from the BIOS level up through the OS was too much hassle to repeat the exercise.

That way you're always in sync and not running a big batch backup job.
Another idea that might be useful is to break up the monolithic backup job into several smaller jobs, each handling certain directories or types of files. For example, split out the backup of photos (which never change) from highly volatile files. Not everybody's data can be segregated like this, but where it applies it helps.

Good thoughts, thanks. I'm already partitioning my data into types along those lines. Even so, the tar created overnight was about 300 GB. I just don't want everything in a single corruptible monster archive; even a tar file over 5 GB is a hassle to work with.

backup-manager supports incremental backups. My intention is to occasionally (every 6 months or so) create a comprehensive master backup, and then run incremental backups against that master roughly weekly, which should take less than an hour each.

I read further into the backup-manager documentation and it does support archive splitting - setting a size limit on individual archives. But you must use one particular archive format: dar, which is intended as a replacement for "tar". I tried it and it does work, splitting the archives as promised. I may go with that format.
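
For reference, dar can also do the splitting on its own, outside backup-manager. A rough sketch of a split, compressed home backup, with the basename, path, and slice size as arbitrary placeholders:

    # writes home-full.1.dar, home-full.2.dar, ... each capped at 4 GB
    dar -c /media/backup-drive/home-full -R /home/me -s 4G -z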

I'll report back. We had some "domestic emergencies" this afternoon that soured my mood when I posted and cursed the utility I was using....
Gornix is protected by the GPL. *

* Gorn Public License. Duplication by inferior sentient species prohibited.

unix

  • Trusted Member
  • Wise Sage
  • ******
  • Posts: 4296
Re: Linux backup strategy
« Reply #7 on: February 18, 2018, 02:05:48 pm »
Mirroring is expensive, but the answer is yes - it's totally worth it. I want to get a 1TB Samsung SSD, like a 960 Pro or maybe an SM961. They are pricey for sure, but it's oh so worth it. In that context, the data is worth more than the drives.

I am assuming you are rolling with SSDs. Either way, if you are married to the olde HDD, you can get far more disk for that coin - like 2-4 TB - or the same capacity for much less.
Brawndo. It's got what plants crave.

benali72

  • CCF Winner's Circle - Supporter
  • Wise Sage
  • *
  • Posts: 2731
Re: Linux backup strategy
« Reply #8 on: February 19, 2018, 10:07:48 pm »
I use mirroring (RAID 0+1) at work all the time. It's great for servers.

I have no idea how it scales down to an individual PC.  I'd love to hear your feedback if you do it.

Another big thing in server-world -- OS page mirroring. Solaris does this, for example. The OS mirrors memory pages (very different from disk mirroring or RAID) in transparent fashion. Excellent for fast d-to-d recovery.