Google Deep Dream Experiments

As I understand it, Google's DeepDream software takes a neural network trained on huge libraries of images and then looks for the patterns it has learned in whatever image file you give it, at various levels of detail depending on how you set the octaves up. It iteratively applies the effect again and again to bring out whatever it thinks it finds.
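For anyone who wants to poke at the mechanics, here is a minimal sketch of that loop, assuming TensorFlow 2.x and the stock Keras InceptionV3 weights. The layer names ('mixed3', 'mixed5'), step counts and octave settings are illustrative picks of mine, not Google's original configuration, and the input is assumed to already be scaled to the [-1, 1] range InceptionV3 expects.

```python
# A minimal DeepDream-style sketch, assuming TensorFlow 2.x and Keras InceptionV3.
# Layer names and all numeric settings are illustrative, not the original values.
import tensorflow as tf
from tensorflow.keras.applications import InceptionV3

base = InceptionV3(include_top=False, weights='imagenet')
# Layers to "dream" on: earlier layers give textures, later ones give objects.
dream_layers = [base.get_layer(name).output for name in ('mixed3', 'mixed5')]
model = tf.keras.Model(inputs=base.input, outputs=dream_layers)

def dream_step(img, step_size=0.01):
    """One gradient-ascent step: nudge the image to excite the chosen layers."""
    with tf.GradientTape() as tape:
        tape.watch(img)
        activations = model(tf.expand_dims(img, 0))
        loss = tf.add_n([tf.reduce_mean(a) for a in activations])
    grads = tape.gradient(loss, img)
    grads /= tf.math.reduce_std(grads) + 1e-8   # normalise so steps stay stable
    return tf.clip_by_value(img + step_size * grads, -1.0, 1.0)

def deep_dream(img, octaves=3, octave_scale=1.3, steps=50):
    """Run the dream at several scales ('octaves'), from small to large."""
    base_shape = tf.cast(tf.shape(img)[:2], tf.float32)
    img = tf.convert_to_tensor(img)  # expected in [-1, 1], shape (H, W, 3)
    for octave in range(-octaves + 1, 1):
        new_shape = tf.cast(base_shape * (octave_scale ** octave), tf.int32)
        img = tf.image.resize(img, new_shape)
        for _ in range(steps):
            img = dream_step(img)
    return img.numpy()
```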

The first few layers bring out abstract patterns. In the middle set of layers, eyes start to appear. The last set of layer settings brings out dog-slugs, snakes, lizards, then architecture. But so many cute doggy faces...
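If you want to map those stages onto actual layer names, here is a purely illustrative guess at how they might line up with the Keras InceptionV3 layers ('mixed0' through 'mixed10'); the correspondence between specific layers and "patterns / eyes / dog faces" is my assumption, not anything DeepDream documents.

```python
# Hypothetical mapping of the stages described above onto Keras InceptionV3
# layer names; which layer produces which effect is an assumption here.
STAGE_LAYERS = {
    'patterns':          ['mixed0', 'mixed1'],  # low layers: edges, swirls, textures
    'eyes':              ['mixed3', 'mixed4'],  # middle layers: eyes and fur creep in
    'dogs_architecture': ['mixed7', 'mixed9'],  # high layers: dog faces, animals, buildings
}
```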

So it has a very specific look and style, not completely random. Why so many dog faces? Presumably because the network behind it was trained on ImageNet, which contains an awful lot of dog breeds.

Sometimes it looks like a bad Photoshop filter, but sometimes it is wonderfully surreal.

How can I use it effectively?

'American flag' test. (I could have used any national flag, but this one is inspired by Hunter S. Thompson's "Fear and Loathing in Las Vegas". Deep Dream and the American Dream just fit so well together.)

'Uncle Sam wants you' test. Eyes. Again, it seems to fit well with the revelations about how we are all now being watched 24/7.

Ok, what if I give it nothing... just white pixels to dream on?

Wow! I was surprised it worked, as the image is pure white with zero noise variance.

Ok what about black?

Again totally unexpected. The cuteness has become quite evil and disturbing.

And finally, 50% grey, for all the concrete everywhere.
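For anyone who wants to repeat these flat-colour tests, here is a quick sketch of how the inputs can be generated, assuming NumPy and Pillow; the image size and filenames are arbitrary choices of mine.

```python
# Generate the solid white, black and 50% grey test images used above.
import numpy as np
from PIL import Image

size = (1024, 768)  # width, height; arbitrary
for name, value in [('white', 255), ('black', 0), ('grey50', 128)]:
    pixels = np.full((size[1], size[0], 3), value, dtype=np.uint8)
    Image.fromarray(pixels).save(f'{name}.png')
```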