About This 'Visually Similar' Calendar
After learning what Google's new search-by-image technology could do, I became interested in what it cannot do. I made a colorful scribble in five seconds with handfuls of markers. The search engine matched it (repeatedly) with stained glass from a Prague cathedral that took 600 years to build. The central images in this calendar were used as source images for visual searches that produced the smaller images surrounding them. In cases where no matching image can be found, the associative algorithm produces rather mixed results, ranging from the absurd to the provocative to the offensive. What you see is edited: I avoided a great deal of erotica, breast implant photos, Nazi memorabilia, etc. Yet these wrong associations may tell us something about our visual culture, our collective unconscious, and the imaginative realms (databases) that we explore.
For the search corporation that dubs these results 'visually similar,' the terminology is meant to evade responsibility for actual similarity. To be fair, the results are often useful. But there are value judgments implicit in the idea of 'similarity' that fall flat. Likewise, the same corporation's claims of providing a more 'democratic' search result disappeared from its marketing rhetoric once it became the search-engine hegemon in the late 1990s. Like the New York Times' masthead slogan, "All the News That's Fit to Print," these claims of democracy and similarity are best viewed with skepticism, as it is our perceptions of the world that are at stake.
We are both at the beginning of 'artificially intelligent' software and many decades into the project. New interface modalities involving voice commands and image searching are but the latest in a long line of attempts to augment and replace human intelligence. We would do well to remain the masters of our 'smart' appliances, recalling with wonder that even the least informed child can judge visual similarity and contextual relevance better than state-of-the-art computing tools. As this gap narrows, the limits and programmed biases of our software agents are bound to become less apparent, and more insidious.
 
— Andy Deck