The More We Stay, the Less We Say

Forrester recently updated its Technographics profiles (made famous in the book Groundswell) for global social media consumption, surveying 95,000 consumers across 18 countries in North America, Europe, Asia and Latin America. One primary finding was the lack of commenting occurring in mature western markets, including the United States.


Adoption is essentially complete in the U.S. (86%) and globally: almost everyone who is online is using or has used social media. Comscore recently corroborated this finding, reporting that 83% of the world’s online population participates in social media.

But most of us in the United States are not social and do not care to converse. The Forrester report finds that two-thirds of the U.S. adult social media population doesn’t comment. This is notable.

Commenting seems to have decreased over the past six years. Perhaps it’s because of the widespread proliferation of mobile devices with smaller screens and touch input; it’s certainly harder to type a blog comment or critique a product on a smartphone.


All Polls and Surveys Are Not Equal

Image: LinkedIn polls (credit: renaissancechambara)

In Washington, polls and surveys drive policy decisions, particularly around campaign season. For presidential elections, Gallup polls are considered accurate within four points, a track record that has yet to be proven wrong. However, several online polls and surveys last week produced highly questionable results, and in one case a poll was outed as a hoax that tarnished the Microsoft Internet Explorer brand.

This degradation in quality is indicative of a larger trend on the social web: the erosion of expertise (and professionalism) caused by social media content. Launching a poll or a test on a web site is now so easy that anyone can claim to execute research. Indeed, many do. The quality and value of their data is another story. Mind you, this erosion has impacted not only new media content producers but also the traditional journalism field, as both the Microsoft and Google+ examples will show.

Interactive firm AptiQuant ran a test on its site purportedly measuring the IQ of visitors and correlating that data with browser choice. Internet Explorer users were deemed the least intelligent.

Unfortunately for Microsoft, the browser IQ test was widely reported by media outlets that did not verify the data. The BBC eventually determined the research was a hoax, but not before the press had popularized Internet Explorer as a low-IQ tool. AptiQuant is defending its study and says it will fight any lawsuits.

But does it matter? The damage has been done to an already lagging brand. Publications that print retractions won’t push them to the top of their sites with the same zeal they showed in their original reporting. A successful lawsuit would only provide a consolation prize for being called a stupid Internet Explorer user.

Google+ Polls

Several polls came out surveying Google+ users about abandoning Facebook for the new circle-based social network. Of all the polls, only the Christian Post labeled its effort as an unofficial poll, and its numbers were the lowest, with 7% moving solely to Google+.

The Brian Solis, Mashable and PC Magazine polls ranged from 23% to 50%. However, all of their readers are extremely tech- or social-media-centric; in essence, these outlets polled the early adopters. Their readers do not represent the general population, and as such these polls can be pretty much dismissed as industry- and demographic-specific.

The average reader of these stories would not be able to discern that the three tech/social media polls are, in essence, “inside baseball.” Mashable did add a little conjecture: “Users may be reacting to the novelty of a new social network. Facebook.”

What is most notable about these four polls is the 40-plus-point spread in responses between them. Even among the three social media and tech polls alone, there was a 27-point spread. Such wild variation should be a clear indicator that the data is inaccurate or compromised in some way.

Keep in mind that online polls, particularly those on social media, often suffer from fan-based flash mobbing toward a favored outcome. Also, given the subject matter, a survey of the non-indoctrinated general public’s opinion about Google+ would have offered interesting context for the data.

Conclusion

Without stronger, open methodologies and broader population samples, polls cannot be considered representative of likely trends. Polls that deserve respect, such as Gallup and Pew Internet research, are painstaking about their methodology.

In the information age, are readers and the media savvy enough to discern quality information? The Internet Explorer hoax suggests not. It is yet another example of why we need to teach children and adults alike to accept information mindfully and question sources.

What do you think of the polling trend?