I've had an
Amazon AWS account for a long time, mainly for use with Route53. There are many other uses for an AWS account, though, and the best part is that accounts are free to open; you only pay for what you use. One use that is relevant to all of us here is backing up our files at extremely low cost. Amazon Glacier is now down to
$0.007 per gigabyte per month, and the price drops as they add more users and servers.
Today I started backing up my photo archive of roughly 70K photos, about 400 GB in size. That will cost all of $2.80/mo, which I view as cheap insurance against loss. Glacier differs from a lot of online backup services mainly in that it is cheaper, but also in that all your combined data can be uploaded from multiple sources and you only pay for what you use. Further, it's less of a backup solution and more of an archival tool. I want my kids to be able to get to our archived photos for decades to come, even if they don't have access to the physical disks I back them up on. This might be a great place to keep weather data archived as well, but your imagination is the only limit.
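For anyone curious how the monthly cost works out, the arithmetic is simple (the rate and archive size are the figures quoted above):

```python
# Rough monthly cost of storing a photo archive in Amazon Glacier.
# Rate and archive size are the figures from the post above.
rate_per_gb = 0.007   # dollars per GB per month
archive_gb = 400      # approximate size of the photo archive

monthly_cost = rate_per_gb * archive_gb
print(f"${monthly_cost:.2f}/mo")  # → $2.80/mo
```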
The quick and dirty approach is to create a bucket in AWS S3 (a bucket is your storage container). Then upload your files or folders into your S3 bucket via an FTP-style tool. Finally, set up a lifecycle rule so that after x days, say five or ten, files are transferred automatically into Glacier. S3 is live data storage, so it is roughly 10x more expensive to keep files there, whereas Glacier is much cheaper but requires 3-5 hours before you can restore and download a file. If you have programming skills, you can write code that uploads directly to Glacier, but command line interfaces make me nervous about making a mistake, so the S3 transfer route is usually the easiest and safest path. I thought I'd share in the hopes this might help someone else.
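If you'd rather script the S3-to-Glacier lifecycle rule than click through the console, here's a rough sketch using Python's boto3 library. The bucket name "my-photo-archive" and the 7-day window are made-up placeholders, and the actual AWS call is left commented out so nothing runs against your account by accident:

```python
# Sketch of an S3 lifecycle rule that moves objects to Glacier after 7 days.
# Bucket name and day count are placeholders -- adjust to taste.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-to-glacier",
            "Filter": {"Prefix": ""},   # empty prefix = every object in the bucket
            "Status": "Enabled",
            "Transitions": [
                {"Days": 7, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

# With boto3 installed and AWS credentials configured, this would apply it:
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-photo-archive",
#     LifecycleConfiguration=lifecycle_config,
# )
```

The same rule can be set in the S3 console under the bucket's "Lifecycle" settings, which is the no-code route described above.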
dfw