SEED published an article the other day that discussed the coming impact of near-total authorship.  The gist of the article is that at some point, nearly everyone will be able to publish content, and that this will have profound implications for society in much the same way that near-universal literacy has.

[Image: Authorship over time]

So what are the implications for universal authorship?  SEED mentions one and I have another that comes to mind.

The first implication comes from the article itself: as more and more people become creators of content, it will become increasingly difficult for organizations of all kinds to control their messaging and their brand.  Provide a bad customer experience and it will be on Facebook, immediately broadcast to hundreds if not thousands of people.  Discriminate against someone because of their race or sexual orientation and they will tweet about it, likely prompting dozens of retweets that let the issue reach exponentially more people, who then might blog about it, and so on.  Organizations already find it hard to control their brand; imagine when the number of authors increases tenfold (which, according to SEED, will now happen yearly).  Theoretically, firms and technologies that can monitor content related to an organization will be big winners.  Additionally, organizations will need to invest more heavily in their own networks and crowds to help combat negative content (whether true or false).

Another implication that immediately came to my mind is the increased difficulty of separating signal from noise.  As the cost of entry into the market for content heads toward zero and the tools of creativity are fully democratized, there will be an even greater explosion of content from which individuals and organizations will have to extract the relevant, accurate pieces of information.  There is already a flood of information to wade through, and doing so is becoming harder.  Exponentially increase the amount of content and the variety of sources and the problem grows by another order of magnitude.  All things being equal, more content and a more fractured supply of sources will only increase the noise and make identifying the signal (the accurate pieces of information) more difficult.  What we will need, and what will become valuable, are services that don't simply aggregate content but also assess its accuracy and credibility.

[Image: Nate Silver]

I am thinking here of services that mimic the approach of Nate Silver at FiveThirtyEight.  The world didn't lack political polls, but it did lack a methodology for cutting through the noise created by dozens of polls, many offering contradictory predictions.  Silver came up with a way not just to aggregate polls but to increase the ratio of signal to noise, allowing for a more accurate portrayal of public opinion and, ultimately, a prediction of presidential elections.  Silver made polls better by developing what I would call a sophisticated aggregation methodology (note: I will be writing more on this soon).  The key will be to develop a scalable, replicable approach that can be applied to a variety of domains.
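To make the idea of "sophisticated aggregation" concrete: Silver's actual model is far more elaborate, but the core intuition of weighting sources by their reliability and recency rather than averaging them blindly can be sketched in a few lines.  Everything below, from the poll numbers to the weighting scheme, is illustrative, not drawn from FiveThirtyEight's methodology.

```python
# A minimal sketch of weighted poll aggregation.
# Each tuple is (candidate share in %, sample size, age in days).
# All values are hypothetical, invented for illustration.
polls = [
    (52.0, 1200, 5),
    (48.5, 600, 12),
    (51.0, 900, 3),
    (55.0, 300, 20),
]

def weighted_average(polls, half_life=7.0):
    """Combine polls, trusting larger samples more and older polls less.

    Each poll's weight is its sample size, discounted by half for every
    `half_life` days of age -- one simple way to boost signal over noise.
    """
    numerator = denominator = 0.0
    for share, sample_size, age_days in polls:
        weight = sample_size * 0.5 ** (age_days / half_life)
        numerator += weight * share
        denominator += weight
    return numerator / denominator

simple_mean = sum(p[0] for p in polls) / len(polls)
print(f"simple mean:      {simple_mean:.2f}")
print(f"weighted average: {weighted_average(polls):.2f}")
```

The point of the comparison line is that the weighted estimate pulls away from the simple mean because the small, stale outlier poll gets discounted; a real system would go further, adjusting for each pollster's historical accuracy and house effects.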

Soon enough, we may all be creatives.  And that means there will be a heck of a lot more chaff to wade through.
