Wednesday, October 11, 2006

Peer Review Meets Helium: Floating an Old Model in a New Space

The announcement of Helium, a new portal focused on developing user-generated content, marks a late entry into the user-generated content game, but one with a number of interesting twists in its business model and content acquisition scheme. Helium intends to build high-quality content not through a straight user voting system but by having authors review one another's work for quality. Reviewers are shown articles side-by-side and asked to indicate which one they think is better and by how much. The concept is that peers willing to write on a topic will have more expertise in it and will therefore judge the quality of a given piece better than general users - much as scholars review one another's work for academic journals. Over time the experts in a topic build up a reputation and accumulate enough clicks on their content to earn a reasonable share of the associated ad revenues - a kind of ad-driven docent model.

It's an interesting concept, but the execution is fairly rough out of the blocks. Helium provides a broad category map as its main navigation in anticipation of authors filling these vertical "buckets" with content. But in many instances the buckets are sorely lacking. Try the "movies" category: the top seed is a 75-word "essay" (?) on a "new" movie - "Meet the Fockers." Oops, so much for quality content in that category. Unfortunately there's no way for me as a user to give this substandard content the razz until I choose to write something myself; in the meantime it sits in the top ten movie articles (number one is no better). Perhaps over time good content will be attracted to this framework, but a navigation system that ensures initial users encounter inferior content is not likely to attract real experts as authors. User-driven rating systems are sometimes criticized for the cliquishness they can engender, but forcing audience members to contribute content before they can offer an opinion is not likely to accelerate the generation of quality content either.

The peer-review rating concept in Helium is an interesting new feature, and there's reason to think it can be made to work in ways that cross grass-roots input with true expert input. But the wisdom of crowds only works when you have...a crowd. Social media requires a very strong community of support for both authoring and audience to deliver the level of authority that people are willing to accept. The best and fastest-growing authoring communities give authors opportunities both to create their own user-voted content and to point to other quality content on the Web. Quality can be found anywhere on the Web - a concept that search engines accepted long ago. I encourage publishers to think about how side-by-side review of articles, such as the approach being tried out in Helium, can help to improve content, but I also encourage them to think carefully about how the most effective content-rating peer groups are formed in online communities. You can't have one work without the other...