Using the image search function of the Google Goggles iPhone app, I take portraits of myself with the iPhone camera, feed them into the search, and see what the closest matches are.

I am interested in how an image search that operates on visual elements can return matches that appear random or mismatched, yet begin to act as a kind of "mood board portrait."