Not very closely. The "inception" in the title of this piece refers to the so-called "inception" layers in Google's convolutional neural network architecture. These layers are unique to Google - no one else really uses them.
In that blog post it was more a reference to the recursive process that produces the "deepdream"-style artwork. The code for that is available here: https://github.com/google/deepdream/
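Not from that repo, but a minimal sketch of the idea in plain NumPy: do gradient ascent on the image to amplify whatever a chosen layer responds to, then zoom into the result and feed it back in as the next starting point. Here grad_fn is a hypothetical stand-in for the gradient the real code pulls out of Caffe.

    import numpy as np

    def dream_step(image, grad_fn, step_size=1.5):
        # One gradient-ascent step: nudge the image to amplify whatever
        # the chosen layer already "sees" in it.
        g = grad_fn(image)
        return image + step_size * g / (np.abs(g).mean() + 1e-8)

    def recursive_dream(image, grad_fn, n_frames=10, zoom=1.05, n_steps=20):
        # The "recursive" part: dream on an image, zoom into the centre of
        # the result, and use that as the next frame's starting point.
        frames = []
        for _ in range(n_frames):
            for _ in range(n_steps):
                image = dream_step(image, grad_fn)
            frames.append(image)
            h, w = image.shape[:2]
            ch, cw = int(h / zoom), int(w / zoom)
            y0, x0 = (h - ch) // 2, (w - cw) // 2
            crop = image[y0:y0 + ch, x0:x0 + cw]
            # crude nearest-neighbour resize back to the original size
            ys = (np.arange(h) * ch / h).astype(int)
            xs = (np.arange(w) * cw / w).astype(int)
            image = crop[np.ix_(ys, xs)]
        return frames

The actual code is more elaborate (octaves, jitter, proper resizing), but this is the loop that gives the videos their endless zooming feel.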
Is the Google Brain group primarily using Caffe2, Torch7, or something else? It's great to see deep learning innovation happening in Python. I'd previously heard they were focused on Torch7, like Facebook.
I have noticed that they seem to ship a lot of their NN experiments using Caffe (and now Caffe2?). However, to some extent this might be because IPython gives you such a nice environment for demo systems.