The shallow end

One of the side effects of distrohoppititus is that there is little time to swim in the deep waters. I have mastered the art of my own setup in the Arch and 'buntu worlds, and have strategies for restoring my favored DEs - usually XFCE or KDE. But I have only compiled one or two programs, and didn’t really retain anything I learned.

The upside is that, given, say, a mid-2011 MacBook Air, I was able to quickly run through my favorite distros to find the “sweet spot” for this machine. The answer (in this instance) was KDE Neon, which I wasn’t necessarily expecting, but it worked OOTB, touchpad and all.

This machine had been slowing down to the point of being unusable for the last year or so. When I finally got my daughter a new laptop, I took this one and said goodbye to Cupertino. Now it works better than new!

Sometimes the answer is actually found in the shallow end.


To be honest, this sentence oversimplifies the process of reaching these goals. Behind all this is a couple (few) years of file transfers, organization, and backup protocols to manage ownership of my data. The popular notion of a hoarder involves some crazy person with a house full of flea market rejects that the paramedics have to tunnel through to provide treatment. The emotional roller coaster of organizing 30+ years of files helped me understand the psychopathy of such a person.

Data curation and backup protocols are the soft underbelly of the digital world. One size does NOT fit all, and no matter how loud we scream, “BACKUP, BACKUP, BACKUP,” we leave the question of sensible and timely recovery unanswered.

I’ve lived long enough to see the evolution of lazy backups. In the 90s, a lazy backup meant megabytes of backup files you had to wade through without ever finding what you were looking for. Then it was gigabytes; today it’s terabytes; tomorrow it will be petabytes.

Legitimate backups require discipline in file management that enables recovery. I know. This simple and unassailable truth is a spit-take to any of us who have dealt with real people and IT support (“I saved it in Word, but it’s not there anymore.”), but while we evangelize Linux, we have to keep in mind that the real problem is deeper than any OS or some weird hardware fluke.

The real problem of data curation is ancient and has been solved over and over throughout the course of human civilization. The answers available to us are legion - enough to make Ptolemy envious. But the core answer, the one nobody (including me) wants to hear, is self-discipline.

I agree about the backups, but I sometimes lack the discipline to back up my system. What do you use for a backup solution?

I use a multilevel system of file management. Basically: Archive (off-machine; NTFS, just in case), Dynamic (off-machine on an SSD via USB), and Active (current files synced with the cloud). Work in the Active category is moved to Dynamic as soon as practical.
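In practice the promotion step is just an rsync copy plus a sanity check before anything gets deleted. A minimal sketch - the ~/Active and /media/dynamic paths and the project-x name are placeholders for however your tiers are laid out:

```bash
# Promote a finished project from the Active tier to the Dynamic tier (USB SSD).
# -a preserves permissions and timestamps, -v lists files, -h prints readable sizes.
rsync -avh ~/Active/project-x/ /media/dynamic/project-x/

# Only drop the Active copy once a recursive compare comes back clean;
# rm -rI (GNU) prompts once before deleting recursively.
diff -r ~/Active/project-x /media/dynamic/project-x \
  && rm -rI ~/Active/project-x
```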

I got stung by ransomware in its early days. Since offline and off-machine saved my bacon then (except for a couple of weeks of work), I stand by it.

To be honest with you, I haven’t found the ideal solution. I use rsync (Grsync) to sync locally and move large directories around, plus Dropbox, the Google, OneDrive, and (I’ll deny this) iCloud as appropriate, depending on the project.
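For anyone wondering what Grsync does under the hood: it mostly just assembles a plain rsync command line. A hedged example with made-up paths:

```bash
# Mirror a large directory onto another disk. --delete makes the destination
# match the source exactly, so preview with --dry-run before committing.
rsync -avh --delete --dry-run /data/photos/ /mnt/backup/photos/

# If the preview looks right, run it again without --dry-run.
rsync -avh --delete /data/photos/ /mnt/backup/photos/
```

The trailing slash on the source means “the contents of photos”, not the directory itself; leave it off and you get a second photos/ nested inside the destination.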

What I love about Linux is the fact that I know if all my machines die tomorrow, I can be up and running within an hour of acquiring just about any machine, new or used.

Borg is an excellent backup tool, especially when combined with Borgmatic. You can have a basic configuration up and automated in a few short minutes.
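To give a feel for it, here is roughly what a minimal config looks like - a sketch, not a copy of anyone’s production setup; the repository URL and source directories are placeholders, and the exact layout shifts a little between borgmatic versions:

```yaml
# /etc/borgmatic/config.yaml -- generate a fully commented starter file with
# the bundled generate-borgmatic-config command, then trim it to taste.
location:
    source_directories:
        - /home
        - /etc

    repositories:
        - ssh://user@backup.example.com/./main.borg   # placeholder repo

retention:
    # prune old archives automatically on each run
    keep_daily: 7
    keep_weekly: 4
    keep_monthly: 6
```

After a one-time init of the repository (the exact init flags vary a bit by borgmatic version), a single daily cron or systemd timer entry running `borgmatic` handles create and prune in one shot.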

EDIT: I just found out today that BorgBase, a hosting provider for Borg backups, has developed a GUI frontend for Borg called Vorta. It is available on Flathub. It’s still fairly new and doesn’t have all of the features that Borgmatic does, but I may start recommending it to newbies instead of Déjà Dup, since they could leverage Borg’s more advanced features in the future without having to switch backup formats. Besides, they’re both named after Star Trek! :laughing:

I have a server at home. I installed rsnapshot on it and configured it to use ssh to pull backups from all my machines. As long as a machine is on, it’ll get backed up at some point during the day. It uses hard links, so you don’t end up with duplicate copies of everything. I exclude some folders like VMs and my Steam library, but other than that it backs up everything. I’ve restored my entire home, and individual files, from it many times over the years. It’s not perfect, but once set up, I rarely worry about it.
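For reference, the relevant bits of a setup like that look something like the excerpt below (hostnames, paths, and retention counts are stand-ins, not my actual config). The classic rsnapshot gotcha: fields must be separated by tabs, not spaces.

```
# /etc/rsnapshot.conf (excerpt) -- fields are TAB-separated
snapshot_root	/srv/rsnapshot/

# how many snapshots of each rotation to keep ("interval" in older releases)
retain	daily	7
retain	weekly	4

# skip the bulky stuff
exclude	VMs/
exclude	Steam/

# pull each machine's home over ssh; unchanged files become hard links
backup	user@laptop:/home/user/	laptop/
backup	user@desktop:/home/user/	desktop/
```

Two cron entries (`rsnapshot daily`, `rsnapshot weekly`) drive the rotation; only the smallest interval actually transfers data, while the higher ones just promote existing hard-linked snapshots.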