Crypto.BI Encrypted Cloud Backup (ECB) is a tool that encrypts files, directories and MySQL databases and mangles their names so they are not understandable by the cloud storage system.
I originally wrote this tool to back up wallet.dat files without the cloud providers knowing they’re backing up possibly valuable data.
To an outsider, ECB backups are simply raw data dumps, indistinguishable from noise.
Mangling the filenames makes it impossible for your data to be searched by name in case of a cloud leak or other privacy invasion: even in a major breach, your wallet.dat and other sensitive files won't be searchable.
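ECB's actual mangling scheme lives in the tool itself, but a salted hash illustrates the idea: a meaningful path becomes an opaque, non-searchable name. The helper name, salt and prefix below are assumptions for illustration, not ECB's internals:

```python
import hashlib

def mangle_name(real_path: str, salt: str, prefix: str) -> str:
    """Derive an opaque object name from a real file path.

    The salt keeps an attacker from precomputing hashes of
    well-known filenames such as 'wallet.dat'.
    """
    digest = hashlib.sha256((salt + real_path).encode("utf-8")).hexdigest()
    return prefix + digest[:32]

# The same inputs always map to the same opaque name, so a
# manifest only needs to record the mapping once.
print(mangle_name("/my/secrets/wallet.dat", "longrandomsalt", "skdkcms93"))
```

Because the hash is one-way, nothing in the stored name reveals the original path; only the manifest can reverse the mapping.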
ECB could have been part of the Crypto.BI Toolbox, but we decided to make it a standalone module so it can be used independently of the larger Toolbox package.
Right now ECB works with Amazon AWS, but it should be relatively easy to write interfaces for other cloud providers.
Clone the ECB Github repository and follow the configuration instructions below.
The Python3 scripts can be run directly on the cloned repository path.
Create an AWS S3 bucket and a path for backup storage.
Create an Amazon AWS IAM role and give it read/write authorization on the bucket. Get the API keys associated with the new IAM role and use them to configure the boto3 AWS client library.
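boto3 picks up credentials from environment variables or from a local credentials file. A minimal `~/.aws/credentials` entry looks like this (the key values are placeholders; substitute the keys from your new IAM role):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

Restrict this file's permissions (e.g. readable only by your user), since it grants write access to your backup bucket.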
If you’re backing up databases too, then create a MySQL user with proper access level. Take note of the username/password pair.
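For database backups via mysqldump, a dedicated read-only user is enough; it does not need write access to your data. A sketch of the grants, assuming MySQL 5.7+ (the user name, password and database names below are placeholders):

```sql
CREATE USER 'ecb_backup'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
GRANT SELECT, LOCK TABLES, SHOW VIEW, TRIGGER ON local_db1.* TO 'ecb_backup'@'localhost';
GRANT SELECT, LOCK TABLES, SHOW VIEW, TRIGGER ON local_db2.* TO 'ecb_backup'@'localhost';
FLUSH PRIVILEGES;
```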
Edit dt.py and enter the required configuration: the directories and files you wish to back up, along with any MySQL databases.
```python
# list your files and directories here
filenames = [
    "/some/directory/",
    "/my/secrets/wallet.dat",
    "/etc/shadow",
]

# list your databases here
databases = [
    "local_db1",
    "local_db2",
]

dbuser = "dbuser"
dbpass = "dbpass"
dbhost = "localhost"

secsalt = "randomsecuresaltstringmakeitlong234923823804020393323"

odname = "/path/to/backups"   # directory where to store the encrypted backups for upload
mfname = odname + "/" + "mf"  # manifest filename

openssl = "/usr/bin/openssl"   # encryption command
tar = "/bin/tar"               # archive command path
mysqld = "/usr/bin/mysqldump"  # mysql dump program path

bkt = "mybucket"             # cloud bucket name
kpth = "my/backups"          # cloud backup path within bucket
backup_prefix = "skdkcms93"  # any short random alphanumeric string
max_upload_size = 50         # maximum upload file size in MB
```
After configuring ECB, run:
$ python3 bk.py [max_size]
The optional max_size parameter overrides the corresponding setting in dt.py (max_upload_size, the maximum upload file size in MB).
You will then be prompted for the archive password twice. I recommend not reusing passwords. This password will be used by openssl's strong symmetric encryption algorithm: every file sent to the cloud will be encrypted with it, so make it strong and make sure you do not lose it.
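ECB drives the openssl binary configured in dt.py. The round trip below shows an equivalent password-based symmetric encryption; the cipher and flags (AES-256-CBC with PBKDF2 key derivation) are my assumption for illustration, not necessarily ECB's exact invocation:

```python
import os
import subprocess
import tempfile

def openssl_encrypt(src: str, dst: str, password: str) -> None:
    # AES-256-CBC with PBKDF2 key derivation (assumed flags)
    subprocess.run(
        ["openssl", "enc", "-aes-256-cbc", "-salt", "-pbkdf2",
         "-pass", "pass:" + password, "-in", src, "-out", dst],
        check=True,
    )

def openssl_decrypt(src: str, dst: str, password: str) -> None:
    subprocess.run(
        ["openssl", "enc", "-d", "-aes-256-cbc", "-pbkdf2",
         "-pass", "pass:" + password, "-in", src, "-out", dst],
        check=True,
    )

# Round-trip demo with a throwaway file
workdir = tempfile.mkdtemp()
plain = os.path.join(workdir, "wallet.dat")
enc = plain + ".enc"
dec = plain + ".dec"
with open(plain, "wb") as f:
    f.write(b"not a real wallet")
openssl_encrypt(plain, enc, "correct horse battery staple")
openssl_decrypt(enc, dec, "correct horse battery staple")
```

Note that `pass:` puts the password on the command line, where other local users can see it in the process list; openssl also accepts `-pass stdin` or `-pass env:VAR` for that reason.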
Monitor the screen output for possible errors. The bk.py script will process the list of files, directories and MySQL databases, then archive, encrypt and upload them to your cloud storage provider.
The manifest file (whose name was set in the mfname variable in dt.py) maps real path names to the mangled names.
To recover files, look for the random filename you set in mfname and download that file from your cloud provider's user interface. Decrypt mfname using the archive password and open it in a text editor. Select the file(s) or directories you wish to recover from the manifest, then find and download the corresponding mangled-name files using the manifest's name map.
Rename and decrypt each mangled file locally. The resulting file will be either the plain recovered file, a tar archive of a directory or a MySQL database dump.
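Putting the pieces together for a directory backup, the cycle is a tar + encrypt step mirrored by a decrypt + untar step. All names, paths and openssl flags below are illustrative assumptions, not ECB's exact commands:

```python
import os
import subprocess
import tempfile

password = "correct horse battery staple"  # illustrative only
work = tempfile.mkdtemp()

# A stand-in for the directory being backed up
srcdir = os.path.join(work, "secrets")
os.makedirs(srcdir)
with open(os.path.join(srcdir, "wallet.dat"), "wb") as f:
    f.write(b"demo data")

tarball = os.path.join(work, "backup.tar")
mangled = os.path.join(work, "skdkcms93deadbeef")  # placeholder mangled name

# Backup side: archive the directory, then encrypt the archive
subprocess.run(["tar", "-cf", tarball, "-C", work, "secrets"], check=True)
subprocess.run(["openssl", "enc", "-aes-256-cbc", "-salt", "-pbkdf2",
                "-pass", "pass:" + password, "-in", tarball, "-out", mangled],
               check=True)

# Recovery side: decrypt the downloaded mangled file, then unpack the archive
recovered_tar = os.path.join(work, "recovered.tar")
restore = os.path.join(work, "restore")
os.makedirs(restore)
subprocess.run(["openssl", "enc", "-d", "-aes-256-cbc", "-pbkdf2",
                "-pass", "pass:" + password, "-in", mangled, "-out", recovered_tar],
               check=True)
subprocess.run(["tar", "-xf", recovered_tar, "-C", restore], check=True)
```

For a MySQL backup the recovered file would instead be a mysqldump SQL file, which you restore by feeding it to the mysql client.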