Can I use Amazon Glacier as the backup location?

Solution 1:

Not yet. However, there has been progress in this area; follow this bug report:

  • https://bugs.launchpad.net/duplicity/+bug/1039511

Solution 2:

At the moment, I have a workaround that may be of use to someone out there.

I use backupninja with a duplicity back-end to back up all of my system files (everything except $HOME) to a local drive on my network as well as to S3 (to comply with having an off-premises backup).

For my $HOME, I use Déjà Dup. I have it set up to back up directly to S3, as I still haven't figured out a way to get Déjà Dup to back up to multiple places.

Here is how I have Amazon move my data from S3 to Glacier automatically for me.

  • Open your browser to: https://console.aws.amazon.com/s3

  • After you log on, either pick an existing bucket where you'll be storing your data or create a new one.

  • After you have created (or selected) your bucket, click the Properties button on the top row of buttons (the row should read Refresh, Properties, Transfers, Help).
  • After you click the Properties button, a bottom pane should show up with the properties for the bucket you selected. The tabs should read: Permissions, Website, Logging, Notifications, Lifecycle, Tags.
  • We are interested in the Lifecycle tab; click it.
  • You should now see a button labeled Add rule (with a green "+" sign next to it); click it to add a new rule.
  • This allows you to define rules for archiving and removal of data in the selected bucket.
  • A new pop-up window shows up with some options that need to be completed before the rule(s) are saved:
    • Name (Optional): Pick a name that makes sense to you, or one will be created for you.
    • Prefix: This is an important one: any object that matches this prefix will be subject to the rule we are creating. In my case, I typed Backup/. This means that any object in my bucket whose key starts with Backup/ will be subject to this rule. If you think of S3 like a file structure, this would mean that any "file" or "folder" inside the "Backup" directory in this bucket will be subject to this rule.
    • Time Period Format: Days from creation date or Date. I picked the first one (Days from creation date) so I can use relative terms when referring to my data. It will make sense in a minute, I promise.
    • Add a Transition rule by clicking on "Add Transition". This will add a "Transition to Glacier" option. This rule specifies when to move data from S3 to Glacier. In my case, I want data to be moved 25 days after it has been created.
    • Add an Expiration rule by clicking on "Add Expiration". This will add an "Expiration" option. This rule specifies when to remove data from S3. I have mine set to 90 days after creation.

Once you are happy with your Transition and Expiration rules, click Save and you're done.
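
If you'd rather set this up from a script than click through the console, the same rule can be created programmatically. Below is a minimal sketch using Python and the boto3 library; the bucket name and rule ID are placeholders I made up, so adjust them to your own setup:

    import boto3

    # Assumes AWS credentials are already configured (e.g. ~/.aws/credentials).
    # "my-backup-bucket" and "backup-to-glacier" are placeholder names.
    s3 = boto3.client("s3")

    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "backup-to-glacier",
                    # Same as the Prefix field in the console dialog: only
                    # objects whose keys start with Backup/ are affected.
                    "Filter": {"Prefix": "Backup/"},
                    "Status": "Enabled",
                    # Move matching objects to Glacier 25 days after creation.
                    "Transitions": [{"Days": 25, "StorageClass": "GLACIER"}],
                    # Delete matching objects from S3 90 days after creation.
                    "Expiration": {"Days": 90},
                }
            ]
        },
    )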

If you followed my setup, your data will be moved to Glacier 25 days after it has been created and removed after it has been in S3 for 90 days.

A few things to note:

  • I move the data to Glacier 25 days after creation ... yes, that's an odd number, I know. Déjà Dup rotates its data and creates a fresh backup every 28 days, six months, a year, or never. I picked 28 days in Déjà Dup, hence the 25-day rule in S3. I have backupninja create new backup files every 30 days. Both Déjà Dup and backupninja run every day on my systems.
  • I still have my backup tools clean up their data after 60 days or so but, if anything happens to stay in S3 for longer than 90 days (a software crash, changes in the rules, etc.), it will be wiped out by the Expiration rule in S3. This saves me money.
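
To make the timing concrete, here is a tiny sketch (plain Python, no AWS calls; the numbers mirror my setup above) that prints where a backup created today will be in its lifecycle:

    import datetime

    # These numbers mirror my setup above; adjust them to match your own tools.
    ROTATION_DAYS = 28    # Déjà Dup starts a fresh backup chain every 28 days
    TRANSITION_DAYS = 25  # the S3 rule moves objects to Glacier after 25 days
    EXPIRATION_DAYS = 90  # the S3 rule deletes objects after 90 days

    created = datetime.date.today()
    delta = datetime.timedelta
    print("Backup created:    ", created)
    print("Moves to Glacier:  ", created + delta(days=TRANSITION_DAYS))
    print("Fresh chain starts:", created + delta(days=ROTATION_DAYS))
    print("Expires from S3:   ", created + delta(days=EXPIRATION_DAYS))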

The above is how I back up my data. How do you back up yours? You do back up your data, right?

-Juan