
Thursday, September 1, 2016

Issues with my Aperture Backup Plan

In a recent post, I explained how easy it is to transfer data from Amazon's S3 storage to Google Cloud Storage (GCS). I mean, this is cloud computing, so it should be simple, right? Well, in case any of my readers run into problems (like I did), I didn't want to skip over the fact that issues can arise. And big issues they were...

First off, my transfer did not work as I described. In fact, I started writing that blog entry while the transfer was running, and it was still running when I finished writing - it only appeared as if it were going to be successful. When all was said and done, I ended up with errors. To keep a long story short - and to spare you the research I had to do - I'll just link to my Stack Overflow post and let you read it if you're interested.

At the end of the day, I got up to speed with gsutil, a very handy command line utility for talking to Google Cloud Storage from the local computer (remember, I'm running Xubuntu, but it should work fine for you Windows folks too).

Some background, though: when I started using S3, my intention was to archive to Glacier to save money, and only restore to S3 if a disaster ever happened. I would just sync my Macbook to the cloud, and it would automagically archive to Amazon's cheap, long-term storage. Something went awry in the mix, though, and my files were neither in Glacier nor classified as Standard storage in S3. Their storage class, as viewed from S3, was Glacier - but I could not see them anywhere in the Glacier service itself. I started down the path of restoring the files by meticulously right-clicking and restoring from Glacier within the S3 web console, but then I found out that the restored copies would only be available for 3-5 days before reverting to Glacier status. On top of that, I found out that the pricing would escalate quickly. So my dream of restoring from Glacier to S3 in bulk and having my Aperture library back up and running within three hours of a catastrophe was immediately squashed. I guess that's why they say you should test your backup plan before you actually need it, right?
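For anyone stuck in the same spot, the AWS command line can at least take the right-clicking out of the equation. This is only a sketch - the bucket name is a placeholder and the seven-day restore window is arbitrary - but something along these lines asks Amazon to temporarily restore every archived object in a bucket (it does nothing to make the retrieval pricing friendlier, of course):

# list every object key, then request a temporary 7-day restore for each one
aws s3api list-objects --bucket my-archive-bucket --query 'Contents[].Key' --output text \
  | tr '\t' '\n' \
  | while read -r key; do aws s3api restore-object --bucket my-archive-bucket --key "$key" --restore-request Days=7; done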

At any rate, I got to learn some new command line interface (CLI) options for Linux, which always gets me going. Again, I'll spare you the boredom of all my research; suffice it to say that the following command is what I needed to get my local files (on a USB thumb drive) up to GCS:

gsutil -m rsync -r -d /media/benmctee/27F4-D3DE/ApertureLibrary.aplibrary gs://photo-archive-benmctee/ApertureLibrary.aplibrary

Let me explain what is going on here:
gsutil - that's the Google Storage Utility, which is part of the Google Cloud SDK. It's very useful, and more intuitive than one would think.
-m - Enable parallel (multi-threaded) operations. This allows multiple transfers to run at once when there are a lot of files to process. At over 220,000 files in my library, this really sped things up.
rsync - this shouldn't be new to any CLI users out there. But if it is, it's a very useful file mirroring tool on Linux (not sure about Windows?), and gsutil ships its own implementation of it. It syncs the destination to match the source, giving you a 100% mirror.
-r - Recursive. This option tells rsync to dive into all of the subfolders.
-d - Delete files in the destination that no longer exist in the source (use with caution!)
/media/benmctee.... - This is the local directory on my thumb drive. Remember, the source always comes first, then the destination. Get them backwards and serious deletions/file damage can occur!
gs://photo-archive-benmctee... - This is my GCS bucket, the "remote" location

If you want more details on gsutil rsync, check it out on Google's website.
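One extra tip: since the -d flag will happily delete things in the bucket, gsutil rsync also accepts a -n flag that performs a dry run, printing what would be copied or deleted without actually doing anything. I'd suggest running the same command with -n added first, for example:

gsutil -m rsync -n -r -d /media/benmctee/27F4-D3DE/ApertureLibrary.aplibrary gs://photo-archive-benmctee/ApertureLibrary.aplibrary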

This time, I did wait for a successful transfer before making this blog post. If you've never used the Glacier option before, then my first post will hopefully work for you, because that route is a lot easier and more straightforward. But if not, this should get you going. To install the Google Cloud SDK, which puts gsutil on your computer, head on over to the Google Cloud Platform website.
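For what it's worth, the quickest way I know of to get the SDK onto a Linux box is Google's interactive installer, followed by an init to log in and pick a default project. Treat this as a rough sketch - the exact steps can differ by distribution, and Windows users get a standalone installer instead:

curl https://sdk.cloud.google.com | bash
exec -l $SHELL
gcloud init

Happy clouding!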

Wednesday, August 31, 2016

Transfer Data from Amazon's AWS S3 Servers to Google Cloud

A while back I decided it would be smart to archive my Aperture photos (now Apple Photos), since my Macbook, a 2008 model, is nearing the end of its life in terms of hardware longevity. A bit of quick research landed me at Amazon's cloud storage, S3 (Simple Storage Service), at just pennies per GB. I have around 100 GB of data within Aperture, so I needed something less expensive than Google Drive.

As of this writing, Google Drive is $1.99/mo for 100 GB, and the next tier is 1 TB for $9.99/mo, with nothing in between. S3, on the other hand, is $0.03/GB per month for standard storage ($3 per 100 GB, and it scales with what you actually store), and $0.007/GB per month for their long-term, infrequent-access "Glacier" storage. My plan was to upload all of my data from my Aperture library to S3, and then have it automatically archive to Glacier, costing me about 70 cents per month to safely stow away several years' worth of photos. Sounds like a pretty good deal, right?

The problem with Glacier is that it is meant to be a long-term storage solution with very infrequent access. Once the files are in S3, I'd have to archive them to Glacier and then remove them from S3 to save money. At the next backup, I'd have to restore everything from Glacier back to S3 so that the upload could compare source and destination (computer to S3) for changes, and then repeat the whole archiving sequence. I just don't have that much time to dedicate to optimizing my backup plan. Rather, I'd like a cloud storage solution that I can easily access whenever I want, without having to worry about where all the data is spread across the AWS platform.
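As an aside, the archive step itself doesn't actually require a script - an S3 lifecycle rule will transition objects to Glacier automatically after a set number of days. Here's a rough sketch with a placeholder bucket name and an arbitrary 30-day window (it does nothing to fix the restore-and-compare headache, which is the real problem):

aws s3api put-bucket-lifecycle-configuration --bucket my-archive-bucket --lifecycle-configuration '{"Rules":[{"ID":"archive-to-glacier","Status":"Enabled","Prefix":"","Transitions":[{"Days":30,"StorageClass":"GLACIER"}]}]}'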

I recently discovered Google Cloud, which I surprisingly had not heard of before, considering how Google-centric all my stuff is. I mean, I have a Nexus phone and a Chromebook, this website is hosted on Google (including using Google's DNS services), and I use Blogger as well as Drive. I'm pretty much a Google fanboy at this point. But it never crossed my mind to see if Google had a solution. So I started comparing AWS to Google Cloud, and, for my purposes, they are surprisingly similar, yet Google's services seem more intuitive.

There are three tiers of Google Cloud Storage: Standard ($0.026/GB), Durable Reduced Availability ($0.02/GB), and Nearline ($0.01/GB). Although Nearline is more expensive than Glacier, its ease of use far beats AWS's option. What's more, Google has a transfer service that talks to Amazon's S3, so I can easily transfer over my bucket - which is what a storage node is called in both services. Google has key term explanations if this is all new to you. But basically, you create your Project (your Google Cloud account), create a bucket (Standard, DRA, or Nearline), and then put objects (files) into that bucket.
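If you'd rather use a terminal than the web console, the same Project/bucket/object model maps straight onto gsutil. The bucket name below is just an example (bucket names are globally unique, so pick your own), and so is the file being copied. The three lines make a Nearline bucket in the US multi-region, copy a file into it, and list its contents:

gsutil mb -c nearline -l us gs://my-example-photo-archive
gsutil cp ~/Pictures/example.jpg gs://my-example-photo-archive/
gsutil ls gs://my-example-photo-archive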

Once you have your Project (your Google Cloud account) set up, follow these steps to initiate a transfer from S3 to Google Cloud:

  1. Create a user access policy within AWS IAM (Identity and Access Management) - a command line equivalent is sketched after these steps
    1. Create a user in IAM (e.g. GoogleTransfer)
      1. Download the credentials 
      2. Copy the Access Key ID and Secret Access Key somewhere you won't lose them (I created a Google Sheets file to keep track of users and their access keys). You will need both of these when creating the transfer later over on Google, and you will never have access to the Secret Key again once you leave this page.
    2. Give that user an Inline Policy
      1. Policy Generator
        1. Effect: Allow
        2. AWS Service: Amazon S3
        3. Actions: All Actions, minus the 5 Delete* options at the top of the list
          1. This is probably overkill, but I ran into access permission problems while trying to use Groups instead of an inline policy, so I just gave blanket permission.
        4. Amazon Resource Name: arn:aws:s3:::*
          1. This gives the user access to everything on your S3, including all your buckets. If you want to restrict it further, have a read here.
      2. Add Statement
      3. Next Step
      4. Apply Policy
  2. Create a Bucket in Google Cloud Storage
    1. Give it a unique name (e.g. aperture-backup-benmctee)
    2. Select your storage class (pricing and explanations)
    3. Select your storage location. I would stick to multi-regional unless you have a good reason not to.
  3. In your Google Cloud Console, create a new Transfer
    1. Amazon S3 Bucket: s3://bucket-name (this is your unique S3 bucket name, e.g. benmctee-aperture-archive)
    2. Access Key ID: This is the public key generated in IAM.
    3. Secret Access Key: The secret key generated in IAM - you saved it, right??
    4. Continue
    5. Select the bucket you created
    6. If this is the first time you are transferring, you should not need to select further options. If you are trying it again because a transfer failed, you may want to select Overwrite destination with source, even when identical
    7. Continue
    8. Give it a unique name, if desired
    9. Choose Run Now, and Create
The beauty of cloud computing is that this all happens without you having to stay on that page to monitor it. If you want to come back later and check the progress, just log back into your Google Cloud Console, go to Transfers, and click the job to see where it's at. A transfer from Amazon to Google should be relatively quick, depending on the volume of files ("objects") you are transferring.
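If you prefer a terminal to the IAM web console, step 1 above can also be done with the AWS CLI. This is only a sketch: the user name and policy name match the examples above, and I've narrowed the policy to read/list actions (which should be all the transfer service needs) rather than the everything-but-delete policy described earlier:

aws iam create-user --user-name GoogleTransfer
aws iam create-access-key --user-name GoogleTransfer
aws iam put-user-policy --user-name GoogleTransfer --policy-name GoogleTransferS3Read --policy-document '{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":["s3:Get*","s3:List*"],"Resource":"arn:aws:s3:::*"}]}'

The create-access-key command prints the Access Key ID and Secret Access Key that the Google transfer setup asks for - the same warning applies about stashing the secret somewhere safe.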

Thursday, July 14, 2016

2016 Toyota Entune Messaging Issues

Aloha everyone. It has been a while since I've posted a "how to" on something technology related; I guess that means I haven't run into any issues that weren't easily solved by a Google search. That changed over the last couple of days, however.

I recently purchased a 2016 Toyota Tacoma TRD Off Road that came with the Entune Premium Audio navigation and multimedia system. This is the standard head unit that comes with Toyota, not the Entune Audio Plus version. From what I can tell, you have the standard version if you do not have the JBL logo at the bottom. Also, I believe the Entune Audio Plus version comes with apps like Pandora, Slacker, etc., so if you have those, you have the upgraded version. At any rate, this problem probably exists on both systems.

Entune Premium Audio - Courtesy of PC Mag
Entune Audio Plus - Courtesy of Truck Trend

I connected my Android Nexus 6P, which I use on the Project Fi network, and it seemed to work without issue for phone calls and Bluetooth audio, as well as downloading all of my contacts. I can easily make a phone call using voice commands, play Pandora from my phone over the audio system, and browse all of my contacts to place a call. However, when I went to the messaging feature, it only showed some very old messages (the newest was from December 2015 - almost seven months ago).

I was using the Google Hangouts app since I'm on Project Fi. It syncs all of my text messages, MMS, Hangouts conversations, and voicemail across all of my Android devices, and it lets me make voice calls from any device over WiFi. To start the troubleshooting process, I opened the stock Messaging app on the phone, and it had all of my SMS and MMS messages in there, so I was curious why they weren't showing up on the head unit.

After some Googling, I found that some users had success with deleting all of the messages on the phone, which made new ones show up on the Entune head unit. So I did that within the stock Messaging app, sent a test message from my wife's phone, and still no luck. I then changed my default SMS app on the phone from Hangouts to Messaging - no love. The next thing I did got it to work:

1) Hangouts app > Settings

2) Untick Enable Merged Conversations

3) Account Settings (tap your e-mail account on the same settings screen).
4) Disable the Project Fi calls and SMS "Messages" option, about halfway down.


If you are not a Project Fi user, the first two steps might be all you need. If it still doesn't work, try deleting your old messages in the stock messaging app and making it the default messaging app.

I am probably a niche user within the Toyota realm by being on the Project Fi network, which is why I could not find the fix on the forums. Hopefully this helps someone else out. If you have any advice, or ways to still use Hangouts on your Project Fi device, please post in the comments below.