Dangers of Algorithmic Sourcing

The increasing pervasiveness of algorithms in everyday life disturbs me.

At the behest of many friends, I finally joined the 500 Pixels community and have begun uploading some of my better photos there for licensing. It’s an awesome place filled with pro photographers competing for the highest scores on their photos.

Yet scores are determined by the number of likes, faves, and comments a photo receives over a short period of time.

For all intents and purposes, you have a homogeneous community of primarily male photographers, either very good enthusiasts or professionals, voting on photos. What gets top ranked? The general popular stream is dominated by surreal landscapes and pics of almost-nude models, with the occasional wildlife shot thrown in for flavor.

500 Pixels Popular

If you want a top rank of 99 on 500 Pixels, bring epic photoshopped scenes and beautiful, scantily clad women. These are amazing photos, and they deserve their popular ranking. But dig deeper into the categories. Some of the lower-ranking photos strike me as a better representation of the many things you can do with a lens (and Photoshop).

Here’s the thing: I’ve stopped posting anything I don’t think can reach a peak rating of 80 or higher on 500 Pixels. I just won’t do it. Because I don’t shoot almost-naked women, for a variety of reasons starting with respecting my peers and wanting to stay married, I post landscapes. Since I shoot more than just landscapes, for that reason alone the site limits my artistic and creative scope.

Algorithmic Determination

Image by aloalo.

Algorithms impact our news choices, too. And our clothing choices. And what we read. And the movies we like.

It seems like algorithms are everywhere. Here are just a few examples:

  • Huffington Post and Mashable sourcing their news based on rising social media memes.
  • Colors and types of shirt you are most likely to buy (based on past purchasing history).
  • Books you should buy on Amazon.
  • Movies and TV programs you are most likely to enjoy on Netflix.

Is this healthy?

It depends. If you like the same type of thing over and over again, then perhaps algorithmic determination is OK.

After all, if you participate on the same sites and buy from the same vendors, then your general behavior will match your peers’. As such, the algorithms are likely to be correct most of the time.

Consider that 60 percent of people eat the same seven meals every week.

Yum, pizza.

Crazy People Like Orange

Image by balotto.

Because I am crazy, five percent of the time I’d like to buy an orange shirt. Yup, it makes my skin look like shit, but I like orange.

Orange was my favorite color as a child. I had orange-and-green dinosaur wallpaper, and one whole wall was painted exclusively orange. I still remember it fondly.

The algorithms don’t know that, but based on what they see online they have predetermined that I will buy black and red and maybe blue. I do like my black T-shirts, but I also like splashes of bright color. And 5% of the time that means I like orange.
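To make the problem concrete, here is a minimal sketch (hypothetical data and function name, not any vendor's actual system) of a frequency-based recommender. Because it only surfaces the most common items in a purchase history, a color bought 5% of the time never makes the cut:

```python
from collections import Counter

def recommend_colors(purchase_history, k=3):
    """Recommend the k most frequently purchased colors.

    A naive frequency-based recommender: it can only ever
    surface what the shopper has bought most often, so rare
    preferences are invisible to it.
    """
    counts = Counter(purchase_history)
    return [color for color, _ in counts.most_common(k)]

# Hypothetical history: mostly black and red, orange only ~5%.
history = ["black"] * 12 + ["red"] * 5 + ["blue"] * 2 + ["orange"] * 1
print(recommend_colors(history))  # ['black', 'red', 'blue'] — orange never shows up
```

The orange shirt loses not because the shopper dislikes it, but because past frequency is the only signal the model has.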

What to do?

No Growth


Image via View from the Blue Ridge.

How do things become popular? Someone has to try them first, and then they tell friends. Soon early adopters flock to the product.

Perhaps it becomes popular within a niche community (more surreal interior architecture shots, please). Enough people in the community participate in other social networks, and not just online; work, family, and neighborhoods count, too. People tell their friends and show them the new thing they like.

Suddenly, it is safe to try something new. But maybe it won’t be new, because an algorithm already saw that seven percent of your friends tried something, and it knows you buy items as an early adopter. The site serves you an ad telling you your friends Manny, Moe, and Jack bought it already.

Boom! You react and plunk down your credit card.

What’s so daring about that? Where’s the growth?
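The social-proof trigger described above can be sketched in a few lines (the threshold, function name, and friend counts are hypothetical, chosen to match the seven-percent figure in the example):

```python
def should_show_social_ad(friend_buyers, total_friends, is_early_adopter,
                          threshold=0.07):
    """Fire the 'your friends bought it' ad once enough of a user's
    friends have purchased (7% here) and the user is flagged as an
    early adopter. A toy model of a social-proof targeting rule."""
    if total_friends == 0:
        return False
    return is_early_adopter and friend_buyers / total_friends >= threshold

# Manny, Moe, and Jack out of 40 friends = 7.5%: the ad fires.
print(should_show_social_ad(3, 40, True))  # True
```

The point is how little it takes: cross a small fraction of peers plus one behavioral flag, and the "daring" purchase has already been pre-approved for you.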

Cool to be Weird


In a world where anything can be customized to a unique taste, niche stores are popping up all over the Internet to serve the terminally weird. Now it’s cool to be weird.

As database technology becomes cheaper and cheaper, niche stores will be able to serve a customer with algorithmic offerings. Even the daring will find themselves served with the predetermined.

And the algorithms will only get smarter.

How smarter, more accessible algorithms will impact the inevitable break from the norm remains to be seen. Perhaps that same percentage of the population will be able to resist this form of precision marketing. Or maybe we will all simply accept the endless stream of data-driven sales pitches, some subtle, some obvious.

It’s a change that will happen whether we like it or not. The train has left the station.

What do you think?

7 Replies to “Dangers of Algorithmic Sourcing”

  1. I’ve done some soul searching on this issue. It always comes back to Bukowski, or Gauguin for me. We’re not artists if we’re catering to the crowd. We’re hacks.

    People like us (artists who work in marketing or marketers who do art) are conflicted because it’s difficult to split the thinking. We teach ourselves that meeting objectives is important (because it is important in business). But real art isn’t like that.

    Real art doesn’t care if anyone notices or not. It takes risks. It pushes the envelope. It defies measurement. And algorithms will never be able to rank it beyond a populist impression, which has always favored the banal and weird.

    1. I definitely agree with you, and I think that’s the real source of agitation: this sense of forced predetermination that my creative side detests. Just detests. It makes you want to throw the baby out with the bathwater.

      Plus, as marketers, we do have to deliver results, and that is a bear. It’s one of the primary reasons I am backing away from pro photography. I don’t want to be told what to shoot.

  2. Great post. The challenge I see is that algorithms follow behavior, but not necessarily what is good for us. For example, a colleague of mine just wrote a column in Digiday today about discovering that a digital advertising algorithm was “racist”; his nonprofit client had two versions of digital ad creative, one featuring a black child and another a white child, and the image of the white child got more clicks and donations. The media placement algorithm, doing what it was supposed to do, began serving the white ad more often. But was this right? Or should my friend intervene, override the computer algorithm, and serve balanced images? So, Geoff, your images of a moon over Washington D.C. may be far more beautiful and aesthetically unique than those of a half-nude model in lingerie. If we allow only algorithms to select photos, we may end up with quasi porn, and the photo enthusiast would miss the beauty of a moonlit moment. Our predictive models need to do more than follow pure behavior to make the best selections for us, because we don’t always know what is best for us ourselves.

    1. This is a great point about sociological behavior. I’d augment it by saying that when something is good, algorithms have a bad habit of taking it to a level of overexposure, much like Top 40 radio. I used to like that song until they played it all the time!

  3. We had far more restrictions in the 20th century by virtue of large organizations making decisions about what we read and saw. We have more freedom of information now, but with limitations that you so well articulate.

    At least some part of our souls needs to stand apart, observe, and make decisions, as @benkunz writes. We shouldn’t expect any technology, be it a TV network or a photo-serving algorithm, to make a human-centered world for us.

    1. I wish we were as discerning as you note. I think teaching people, and children in particular, to question the information they see is becoming critical.

Comments are closed.