YouTube and Twitter face public outcry and judicial scrutiny for their online radical Islamic content.
By Ambassador Marc Ginsberg and Robert Gemmill, Esq.
Under the provisions of the Communications Decency Act (CDA), internet service providers have all too conveniently shielded themselves from content liability claims – particularly those brought by victims of domestic terror whose perpetrators relied on YouTube, Facebook, Twitter, and other social media platforms to become radicalized, communicate plans, and execute extremist threats against our homeland.
Virtually every domestic “lone wolf” act or intercepted act of terrorism investigated by the FBI in the past five years (287 in total) involved the use of encrypted social media platforms, inspirational YouTube videos urging the killing of Americans, and hundreds of thousands of Twitter and Facebook accounts serving as the social media “underground railroad” ferrying terrorists to their intended targets.
YouTube’s management, in particular, has avoided taking any action to remove the most egregious online “why and how to kill Americans” content despite urgent pleas from the FBI and Congress to do so. In what surely was a momentary lapse of corporate irresponsibility, YouTube’s management publicly assured Congress in 2012 that it would voluntarily police and remove extremist content. Don’t rely on YouTube to abide by its own assurances – the same extremist content that was on YouTube in 2012 is there today. I know. I painstakingly checked.
No one will ever know how many young American lives would have been saved had YouTube voluntarily acted to remove the hateful sermons of the radical (and now dead) U.S.-born Al Qaeda cleric Anwar al-Awlaki – sermons the FBI directly implicated as the principal source of inspiration for 65 of those domestic acts of terror.
YouTube, which is, of course, owned by Google – arguably the most influential internet address in the world – has also recently been named in a lawsuit brought by the family of a California student, Nohemi Gonzalez, who was killed in November’s terror attacks in Paris. Twitter and Facebook have also been named as defendants in the suit, which was filed in federal court in San Francisco.
The family alleges that the internet giants violated the U.S. Anti-Terrorism Act by providing “material support” to ISIS and “knowingly permitt[ing] the terrorist group … to use their social networks as a tool for spreading extremist propaganda, raising funds and attracting new recruits.”
Google’s official comment when asked about the lawsuit was dubious:
“We have clear policies prohibiting terrorist recruitment and content intending to incite violence and quickly remove videos violating these policies when flagged by our users. We also terminate accounts run by terrorist organizations or those that repeatedly violate our policies.”
Why, then, do they allow YouTube to keep such hateful, terrorist-sponsored content up on the site? Deliberately contradicting its own policies is one thing, but contributing to the direct harm of American lives is quite another.
So long as YouTube hides behind the CDA, and in the absence of Congressional action to remove its legal alibis, it is aiding and abetting terrorism – pure and simple.
Why is YouTube not following the lead of much more responsible social media platforms such as Twitter and Facebook?
Twitter announced this week that it has voluntarily suspended 235,000 accounts that promoted terrorism over the last six months. That decision brings the total number of accounts Twitter has terminated for facilitating terrorism and radical Islamic extremism to 360,000. Unlike YouTube, Twitter did not hide behind the CDA; it acted responsibly and in the best interests of our national security and free speech protections.
The families of terrorism victims deserve more than the back of the hand from YouTube and other internet service providers who duck behind the CDA as if any content that is not child pornography were protected under the CDA and the First Amendment. A statutory remedy to the CDA is urgently needed, and it is time for the courts to remove the shield of current law that absolves these companies of liability for failing to be good corporate citizens.
The longer YouTube disregards its civic duty to all Americans, the more urgent the need to take action against it. That action must include more public outcry from presidential candidates, officials, and average Americans demanding greater accountability from YouTube’s management; more pressure on Congress to amend the CDA to impose affirmative duties on internet service providers to remove any content specifically calling for the killing of Americans; and better education of Americans about new tools that can be used to block the content that companies such as YouTube refuse to block.