From Madhu Rajaraman, American Journalism Review
It was a victim of its own success.
At least, that's how Andy Carvin sees it.
Carvin, senior strategist for NPR, says the comment threads on the news organization's Web site are intended to allow reporters to engage with the public and foster intelligent dialogue and debate. However, NPR was forced to take defensive action after barrages of inflammatory posts by trolls and spammers polluted its discussion boards and threatened to become a persistent problem.
As a result, NPR announced in October that it would outsource its Web site regulation duties to ICUC Moderation Services, a social media monitoring company based in Canada — a move that, according to Carvin, has yielded impressive results in the short time since it was implemented.
"Earlier, it wasn't unusual to find 500 comments in the abuse queue [per day]. Now it's only about three, four, five a day," he says. "It's only been about a month and a half now, but so far, so good." The term "abuse queue" refers to the list of user comments that have been marked as offensive.
Before ICUC entered the picture, Carvin says, interns and other staff members removed abusive comments on their own. That worked until the volume of online commentary grew so large that NPR staff could no longer keep up.
The discussion boards on NPR's Web site have a posted set of guidelines, which generally guide moderation decisions. Rule #1: "If you can't be polite, don't say it."
Flagging offensive content is the one duty that remains up to readers and NPR staff, and is done simply by clicking the "Report Abuse" button to the side of the comment in question. ICUC is alerted every time this happens, and the moderation company takes it from there, relieving NPR employees of endless hours spent screening controversial content.