YouTube has long been criticized for hosting videos that seemingly cater to terrorism. Back in ye olde 2008, for example, Senator Joe Lieberman made his concerns known by sending YouTube a letter urging the site to pull down videos that he said were made to entice extremists to kill Americans.
At the time, YouTube offered a bit of a "meh" response, saying that the site "defends everyone's right to express unpopular points of view."
That kind of response wasn't good enough for some lawmakers, who felt the video hosting site should do more to screen these types of videos before they go live. YouTube was also reprimanded this past October by Rep. Anthony Weiner, who wanted it to attend to the 700 recruitment videos featuring Anwar al-Awlaki, known in some quarters as the "Bin Laden of the Internet."
While YouTube has typically pushed back when asked to remove or screen videos, the company is now taking a quasi-step toward doing something about terrorist-related content on its site.
According to The Los Angeles Times, in addition to giving users the ability to flag videos for nudity, sexual activity, animal abuse, etc., YouTube will now be letting users flag videos if they "promote terrorism."
In this sense, rather than have to scour the hours and hours of video that are uploaded to the site every minute, or possibly make a decision that infringes upon someone's free speech, YouTube is leaving it up to the users to decide what does and does not promote terrorism.
According to Lieberman, this is a "good first step toward scrubbing mainstream Internet sites of terrorist propaganda."
So, that's good. Keeping Lieberman moderately pleased is definitely high on my list of priorities, personally. However, I'm not confident that this is a solution that will please anyone very much.
For starters, by putting the "crowd" in charge of this, YouTube is effectively throwing its hands up and saying it's no longer the site's job to determine what content belongs and what doesn't. And, as we all know, the main thing we learn when we trust the "wisdom of the crowd" is that the crowd doesn't have much wisdom.
Secondly, the phrase "promotes terrorism" is easily left open to interpretation. What strikes one person as religious expression may come across to another as pro-murder.
The situation YouTube is grappling with is not a unique one. Facebook, for example, is regularly defending its decisions to not remove certain content and to be a place that welcomes controversial ideas and viewpoints. The issue is complicated, and giving users the power to "flag" content is certainly not going to be the thing to solve it.
But, really. Why are we worried about this when there are clearly bigger problems to contend with on the Tube o' You? I mean, have you seen the list of the most-watched music videos for 2010? Justin Bieber tops the list and dominates it with four slots.
Until YouTube offers a screening process for dangerously bad pop-culture clips, I'm afraid the terrorists still win.
— Nicole Ferraro, Site Editor, Internet Evolution