
Ubuntu, data encryption and backup, the right way

This article describes how to back up an Ubuntu-based workstation; if that is what you are looking for, keep reading.
The backup procedure works for any up-to-date Ubuntu-based distribution (Ubuntu, Linux Mint, Pop!_OS, etc.).
Personal projects, documents, images, videos: they are all here, along with the installed software and its settings.

If you care about those files, you must protect them, and there are two layers of protection that can be applied. The first layer is encryption at rest: if your HDD/SSD falls into the wrong hands (while powered off), the data must be inaccessible without proper authorization. The second layer is an encrypted, off-site backup, covered later in this article.

The encryption (at rest)

This protection is easy to achieve if your storage device is a SED SSD and your motherboard can send the relevant ATA commands (an "HDD password" option in the firmware setup).
The principle is simple: a SED (Self-Encrypting Drive) automatically encrypts all data using an internal encryption key (the Media Encryption Key, MEK); we only need to set a password (the Key Encryption Key, KEK) that encrypts/decrypts the MEK and locks/unlocks the disk. That's it. Just remember that once unlocked, the drive stays unlocked, even in sleep mode, so powering the computer off when it is not in use is a must to ensure protection at rest.
If the motherboard has no HDD password support, the hdparm Linux utility can be used to check or set this level of security; there are a few good tutorials on the internet that may help. In case you have a ThinkPad, I suggest keeping a copy of the drive's ATA identity data, as suggested (and successfully used by me) here, with the tool available on GitHub.
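As a quick check, the drive's ATA security state can be read from the output of hdparm. The helper below is a minimal sketch that only parses that output, so it can also be tried on a saved copy of it; the /dev/sda device name in the usage comment is an assumption, adjust it to your drive.

```shell
# parse_ata_security: reads `hdparm -I /dev/X` output on stdin and prints
# the lines of the "Security:" section describing the state
# (enabled / locked / frozen), with leading whitespace trimmed.
parse_ata_security() {
  awk '/^Security:/ {insec=1; next}
       insec && /^[A-Z]/ {insec=0}          # a new top-level section ends it
       insec && /enabled|locked|frozen/ {sub(/^[ \t]+/, ""); print}'
}

# Typical use (requires root; /dev/sda is an assumption):
#   sudo hdparm -I /dev/sda | parse_ata_security
```

A drive reporting "not	frozen" and "not	enabled" has no ATA password set; "frozen" means the firmware blocked security commands until the next power cycle.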

Data backup, the right way

Any backup must be encrypted, so that only we, or a person we authorize, can use the data inside.
Backups should be incremental, to keep the space they occupy to a minimum.
Backups should be stored in multiple, geographically separated locations. For home/SOHO use, though, I think one reliable cloud storage account should suffice, as it is hard to imagine a disaster that affects your personal computer and a tier 2+ data center at the same time.
And finally, backups should run automatically.

Ubuntu, incremental encrypted backup plan
OK then, let's prepare our Ubuntu workstation for backups done the right way, in five simple steps.

-- The software used:
deja-dup and duplicity, for the automated, recurrent, encrypted, incremental file backups
aptik and aptik-gtk, for backing up the list of installed packages and the cron files
anacron, for scheduling the recurrent execution of the aptik utility
megasync, the cloud storage provider's file sync utility

-- The cloud storage provider, a free 50GB account or a premium one

Step one:
Install the required packages:
sudo apt install deja-dup duplicity anacron
sudo apt-add-repository -y ppa:teejee2008/ppa
sudo apt update && sudo apt install aptik aptik-gtk
Download and install the megasync utility from the Mega website

Step two
Create two folders (BackupSync, BackupAptik) in the home directory; this is where we will store a local copy of the backups.
mkdir ~/BackupAptik ~/BackupSync

Step three
Open "Backups" (Deja Dup) and configure it as follows:
Folders to save: Home (username)
Folders to ignore: run this in a terminal:
gsettings set org.gnome.DejaDup exclude-list "['**cache/**', '**Cache**/**', '**trash/**', '**Trash/**', '**temp/**', '**Temp/**', '.thumbnails', '.dbus', '.gvfs', '.bash*', '.*authority', '.*environment', '.profile', '.*success*', '.*error*', '.*lock*', 'Downloads']"
Storage location: Local Folder -> BackupSync
Scheduling: Automatic backup, Every Day, Keep Forever
Then execute a first manual backup (Overview -> Back Up Now); this will offer the option to encrypt the backup and ask for a password (tick 'Remember password', but be aware that from time to time we will be asked for the password again).
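The same "Backups" settings can also be applied from the terminal via gsettings, which is handy for scripting the whole setup. A sketch, assuming the org.gnome.DejaDup schema keys of recent Deja Dup releases (key names have varied between versions, and YOUR_USERNAME is a placeholder):

```shell
# Back up the whole home directory (the exclude list was set above).
gsettings set org.gnome.DejaDup include-list "['/home/YOUR_USERNAME']"
# Store backups in the local BackupSync folder.
gsettings set org.gnome.DejaDup backend 'local'
gsettings set org.gnome.DejaDup.Local folder '/home/YOUR_USERNAME/BackupSync'
# Automatic daily backups, kept forever (0 = never delete).
gsettings set org.gnome.DejaDup periodic true
gsettings set org.gnome.DejaDup periodic-period 1
gsettings set org.gnome.DejaDup delete-after 0
```

The current value of any key can be checked with gsettings get, e.g. gsettings get org.gnome.DejaDup exclude-list.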

Step four
Create a daily recurrence for the software packages backup: a script at /etc/cron.daily/backup with this content (replace YOUR_USERNAME with your login name):
aptik --basepath /home/YOUR_USERNAME/BackupAptik --backup-all --include-pkg-foreign --skip-cache --skip-fonts --skip-themes --skip-icons --skip-users --skip-groups --skip-home --skip-mounts --skip-dconf --skip-files --skip-scripts --scripted
Make it executable (sudo chmod +x /etc/cron.daily/backup) and run it once using run-parts (sudo run-parts /etc/cron.daily/)
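Put together, the cron job can be created like this. A sketch: it writes the script to a temporary path first so it can be reviewed before being installed with sudo; note the #!/bin/sh shebang line, which run-parts needs to execute the file.

```shell
# Write the daily backup script; review it, then install it with:
#   sudo install -m 755 /tmp/backup /etc/cron.daily/backup
cat > /tmp/backup <<'EOF'
#!/bin/sh
# Daily aptik backup of the installed package list and cron files.
aptik --basepath /home/YOUR_USERNAME/BackupAptik --backup-all \
  --include-pkg-foreign --skip-cache --skip-fonts --skip-themes \
  --skip-icons --skip-users --skip-groups --skip-home --skip-mounts \
  --skip-dconf --skip-files --skip-scripts --scripted
EOF
chmod +x /tmp/backup
```

Files in /etc/cron.daily/ must be executable and must not contain a dot in their name, or run-parts will silently skip them.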

Step five
Open megasync and configure it to keep the local backup folders in sync with a location of your choice in the cloud storage

That's it; we can now sleep well, knowing the backup is in place and ready to be used if needed.

How to use the backup

For deleted or modified files
using "GNOME Files" (Nautilus), just right-click on a file, or inside a folder, and you will find the "Restore Missing Files" and "Revert to Previous Version" options; what follows is self-explanatory.
For fresh OS installs (including replacing the storage drive after a complete crash)
follow steps 1, 2, 4 and 5 (skip step 3), allow megasync to download the backup files, then do step 3
use either "GNOME Files" to restore data or, better, the "Backups" utility with the 'Restore files to their original locations' option
for the software packages and the cron files, use aptik-gtk
reboot the computer afterwards
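The same restores can also be driven from the terminal with duplicity, which "Backups" uses under the hood. A sketch, using this article's default paths; YOUR_USERNAME is a placeholder and the file name Documents/report.odt is hypothetical:

```shell
# The local duplicity archive created by "Backups" in this setup.
SRC="file:///home/YOUR_USERNAME/BackupSync"

# Full restore of the home directory (e.g. from a fresh install):
#   duplicity restore "$SRC" /home/YOUR_USERNAME
# Recover one file as it existed 3 days ago (hypothetical path):
#   duplicity restore --file-to-restore Documents/report.odt -t 3D "$SRC" /tmp/report.odt
# Check the archive against the live files:
#   duplicity verify "$SRC" /home/YOUR_USERNAME
echo "$SRC"
```

Duplicity will prompt for the same encryption password that was set in step three.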

Pros and cons


Pros:
- a one-time procedure
- the possibility to keep the backup folder BackupSync in sync with multiple cloud storage accounts at once (Google Drive, Dropbox, Mega), using their corresponding sync clients
- the benefit of Mega's extra support for file versioning and a recycle bin
- the fastest data restoration time

Cons:
- local storage used
- multiple steps needed to complete the one-time procedure

Storage alternatives

The Deja Dup "Backups" utility also supports Google Drive (via its API) and other storage locations such as FTP and SSH, but I haven't tried these yet, so I can't say whether there are real benefits, especially regarding the local space used and the restoration speed. I can guess, though: without a local copy of the data (which saves space), restoration speed will suffer, as the backup data must be downloaded first (at least in part). Anyway, it's great that we have options to choose from. For me at least, the setup described in this article has been a life saver: I have used it a few times to restore individual files, and a few times for complete data restoration (once when my Intel SSD completely died electrically).


© Copyright 2017 | Just another information technology blog - All Rights Reserved