Visual Literacy Means Better Thinking

A few weeks ago, I wrote a blog post about the world moving from text-based to visual communication, and it touched on an underlying angst: a fear that visual literacy may mean more ignorance and a general dumbing down of society as a whole.

First, let’s be clear: visual literacy is not a new concept. It dates back to the 1960s.

The recent media trend toward communicating with video, pictures, and graphics has inspired people to build methods of encouraging visual literacy. The Internet world has started wrestling with this as an entire culture, but some thought leaders were earlier in driving visual media. Others have even published strong how-to books for marketers looking to master visual media in a social context.

Yet the tension remains. What will a society where people learn and communicate visually — and struggle with reading and writing — look like?

Will we become a society of ignorant fools? Will superstition and bogus news stories dominate our thinking? Will violence and polarizing behavior continue to trend upwards? Will there be so much visual bait demanding our attention that image pollution and desensitization are the next battle after content shock?

This is the End


Context means everything in this conversation. Ignorance, or a lack of education, has long been typified by the inability to read or write effectively, the common definition of illiteracy.

People who were bright but didn’t know how to read or write effectively, or didn’t have a college education, were dubbed “street smart.” This is how we were raised to think when I was a kid (back in the Neanderthal era). I know I’m not alone.

When someone from this kind of upbringing encounters an inability to speak and write well, we think illiterate. That assumption also implies ignorance. After all, the written word was the foundation of civilization, preventing us from sinking back into the Dark Ages.

This well-rooted historical view creates a prejudice steeped in an increasingly archaic definition of information literacy.

One can come to understand concepts and communicate extremely well through other means. And if the devolving state of writing coming from most college graduates is any bellwether, let us hope some improvement in communication arises soon.

And the Beginning


We as a species process visual information faster than the written word. We come to understand objects as infants and toddlers, well before we can read or write. I’ve heard claims that we understand visual information anywhere from 500 to as much as 60,000 times faster than text, though those figures are hard to verify.

Perhaps visual is the way we are meant to digest information. It’s just that historically we needed a Gutenberg press or its derivative to exchange ideas. Now we just need a Galaxy S5 or an iPhone.

As we move forward into an era of visual learning and media, it could be argued those of us who only use and understand text to communicate will become the illiterate ones.

Now that’s a scary thought.

Learning from objects rather than words may lead to more and faster growth of knowledge. Those who master visual learning may be able to create and evolve ideas, concepts, and technologies faster than their counterparts in prior eras. They will still need to build from a foundation of knowledge, though. Innovation requires understanding the current state of things, and the historical predecessors that got us to the present.

Traditionally, ideas and concepts have been retained for our reference through books, papers, and articles. This was the classic role of the library. In the modern era, right or wrong, we find this information through Wikipedia, Google, and other, perhaps more qualified, sources online.

But some search YouTube for answers now. One of my favorite sites to search for photography information is KelbyOne. There are tons of answers to all sorts of questions, but the answers are in video format. I prefer this kind of reference information to reading my Nikon D7100 manual or the Adobe Photoshop help guides.

Libraries recognized visual literacy well before it became a hot trend. Microfiche, video libraries, and the like have existed for decades. Now the visual may become the primary media form within the libraries of the future. And perhaps those libraries will exist only online, with a Siri-like interface much like Neal Stephenson envisioned so long ago in Snow Crash.

Part of literacy, in my mind, is being able to distinguish quality information from bad data. In the visual world, that includes producing and consuming quality media amid the noise.

People struggle with seeing things and understanding whether they are real or fake. They think the unfiltered is filtered and vice versa. They believe the video clip rather than question if it is a screenplay. The infographic is trusted even if it doesn’t cite sources.

Separating good visual communication from the bad, the signal from the noise, will mark the literate mind of tomorrow.

What do you think?

Rebelling Against Scripts

We know so little about ourselves as human beings. Yet the technology and media industries are full of definitive statements about how people will act in this new age of technology and sensors. Such are the opinions of futurists.

These visions attempt to deliver a script of how things will evolve. Programs are developed with algorithms that suggest the “best” behavioral path. They create a means to help people fit within the vision of how things should be.

Then reality strikes. Both the data and human instinct defy the vision.

People rebel against scripts. They defy them and act differently. Computational models respond to the new behavior and suggest improvements. New and better means are found.

Defiance Today

Image by Phantom Rebel.

As more of our lives become scripted by tech, designed around what others determine to be the best way to act, we will find outliers. These people look at a recommendation and defy it.

Rebellion is an inevitable consequence of the human condition. When people are forced into defined situations, a small group breaks free from the norm. They see that everyone is behaving one way at that time, identify alternatives, and choose to find a different way.

The fastest way to work? Google Maps be damned. Everyone participating in birthday congratulations on Facebook? Pass on wall writing, and send a text.

By bucking the trend, some people make things worse, but others succeed.

One of the biggest scripts we are fed today is the need to share. Share everything. Share it on Facebook (a Like will do, too). Retweet, pin, and/or plus-one it, too! Send us your photos wherever you are, and check in.

Sorry, corporate America and your data-compromising social network partners. More and more people are rebelling and saying no.

Conversations and photos are going private. More than half of social updates are occurring on dark channels. We know about Snapchat, but how about WhatsApp? Beyond the billions of messages, six hundred million photos are shared privately on WhatsApp every day.

Check in frequently and find your friends coming by and interrupting you? Use anti-social app Cloak to avoid them.

The social script keeps getting disrupted. People say no, and migrate to a technological alternative. A new norm is established.

A Young Lady’s Illustrated Primer

The algorithm-driven “smart” world we are developing brings to mind Neal Stephenson’s The Diamond Age: Or, A Young Lady’s Illustrated Primer. From the description: “He’s made an illicit copy of a state-of-the-art interactive device called A Young Lady’s Illustrated Primer. Commissioned by an eccentric duke for his grandchild, and stolen for Hackworth’s own daughter, the Primer’s purpose is to educate and raise a girl capable of thinking for herself. It performs its function superbly.”

As the novel unfolds, three different girls are raised with the algorithmic primer as a nanny of sorts. We see that socioeconomic background, situations, and personality dramatically affect actual outcomes, even though all three have access to the exact same primer program.

None of the girls end up the same. Hunger, desire, emotional disposition and drive all make a big difference as their lives evolve.

Algorithms cannot predict the evolving human mind, and how it will react to each random situation. Many people will follow the prescribed script. But every person is new. Some will break from the path.

A mentality of following change exists. Innovators trailblaze, then early adopters follow suit. At some point it becomes safe for the general population to leave the tried and true and adopt the new method of doing things.

Then a new generation of tools and technologies becomes scripted. Innovators and outliers rebel. The cycle continues faster and faster.

This seems to be our world now.

But whatever the truth about technology adoption may be, I am not too worried about algorithms driving human existence. We just won’t stand still long enough to let it happen.

What do you think?

Long Novels Are Painful

I have a confession: I hate long books, particularly long novels.

When I was in college, if you couldn’t stomach a long novel then you weren’t a true Literature student. Long live Dostoyevsky and Tolstoy! And some of those masterpieces (particularly Dostoyevsky’s) were compelling enough to keep my attention.

Most of them put me to sleep, though.

A quality novel can be defined as a good and complete story, rather than by some 19th-century concept of word count. When a long novel is a good story, it captivates you with a compelling plot. You don’t really care how long it is; you just want to devour it! But before you reach that ideal state of reading pleasure, a tome is prohibitive because of its assumed time demands.

Sometimes a long book is necessary. I’m a big fan of breaking up lengthy works. Tolkien’s Lord of the Rings was actually one novel that the publisher divided into a trilogy. That didn’t turn out too badly!


Long Novels in a Digital World

Things haven’t changed in the publishing business even though media has evolved. Publishers frequently push long novels.

I cannot help but turn my nose up at these wares. Unless a long book has fantastic word of mouth, I am not reading it. A good story is a brisk one, at least to my tastes. I find most of today’s writers embellish their novels with back story and details that leave me bored and disenchanted.

I remember how good Neal Stephenson used to be before 1,000 pages became his average. How I long for The Diamond Age.

Currently, I am reading Stephen King’s Doctor Sleep. Great prose, fantastic start, fluid and easy to read, but pages 50-120 were slow. In my mind, they could have been 20 pages instead of 70. I began wondering if the remaining 400+ pages would be worth it, and Doctor Sleep doesn’t even strike me as a long book. You get my point. Fortunately, things seem to be moving along again in Doctor Sleep.

In this digital age, with so many other entertainment options besides reading, will people keep tolerating books greater than 100,000 words in length? For me, the Kindle and other eReaders make the experience of a long book even more difficult.

Frankly, I think 50,000 to 75,000 words is the ideal length. Some call the shorter side of that a novella; I call it a reasonable risk.

One more thing about shorter lengths: great writers deliver impact with each sentence. They focus on quality, and reveal their story in a meaningful, captivating fashion. When I read Philip Roth, who often (but not always) clocks in under 300 pages, I am certain that every chapter will be great. He respects the reader with a tight, well-written novel (or novella) every time.

What do you think?

Featured image via Devon Fredericksen. Lord of the Rings image by Abdulla Al Muhairi.

17 Favorite Science Fiction Works

Image source: Aumanack Diversão sem limite

After reading last week’s post on science fiction, Erin Feldman asked me to suggest a few books in the genre. Of course, I was delighted. So here are my favorite science fiction books (and trilogies) of all time. You’ll see they span subgenres and eras.

1) The Diamond Age (Or, A Young Lady’s Illustrated Primer) by Neal Stephenson. If you gave three young girls with different backgrounds a primer based on the ultimate algorithm-based artificial intelligence, their lives would still end up completely different. And those with the most advantages may have the largest handicaps. Simply brilliant analysis of semantic technologies, and quite a dystopian look at nanotechnology, too. Check it out.

2) Altered Carbon by Richard Morgan. Imagine if your soul could be backed up and stored in the cloud. You’d need a chip in your cortical stack to access motor functions, and to identify your soul if the physical body should fail. Assassins could forever wipe you from the face of the earth by destroying your cortical stack chip. This premise drives one of the most bloody and violent books in the cyberpunk genre. I loved it!

3) The Lord of the Rings by J.R.R. Tolkien: Do I need to say anything about this cornerstone of fantasy and science fiction? (The trilogy was originally one long book.) I’ve read the stories of Middle Earth well over ten times in my life, and loved the movies, too. The compelling battle of good versus evil painted in a dire light still grips me every time.

Continue reading “17 Favorite Science Fiction Works”

The Murky Nature of Internet Vigilantes

Image by Frank Tellez

Freedom allows many things, good and bad. The rationalization of the self-justified Internet vigilante arguably falls in both camps, depending on your perspective.

We love the archetype of the vigilante, the person who goes out and metes out justice when authorities fail to do so. In a romantic sense, it makes sense. Consider our pop culture heroes: Batman, Iron Man, Jack Reacher (in spite of Tom Cruise), Clint Eastwood’s many tough-guy characters, and on and on. We worship their ability to right wrongs in spite of flawed protection mechanisms.

Thanks to the Internet, practicing vigilantism has never been easier. Social media empowers anyone to speak out for justice, and successful acts are met with attention and notoriety.

That’s unfortunate. Vigilantism (or “digilantism” online) is dangerous because the actor may not be well grounded in their ideas of right and wrong.

Continue reading “The Murky Nature of Internet Vigilantes”

SciFi Nerd Dream Come True


Since this weekend marks the unofficial beginning of summer, and I already know how my summer will end, let’s talk science fiction.

I registered for the 71st World Science Fiction Convention this coming Labor Day weekend. It will be a nerd dream come true on a few levels.

First, the prestigious Hugo Awards are given out at the event. I plan on reading the five nominees and submitting a ballot by July 31. It will be awesome to see the program, meet science fiction authors, and talk about the craft.

Then there is the costume contest. I went to the Emerald City ComicCon last year during a layover in Seattle. What a crazy event! The costumes were wild and fun, and, well, nerdy! The science fiction convention costumes may be even crazier!

Continue reading “SciFi Nerd Dream Come True”