This month, the families of five American citizens injured or killed in attacks in Israel filed a $1 billion lawsuit against Facebook. They allege that the site bears responsibility for allowing Hamas to use its platform to plan and coordinate the attacks on their family members.
According to the complaint, Hamas has used and relied on Facebook’s online social network platform and communications services as among its most important tools for facilitating and carrying out its terrorist activity, including the attacks that killed and injured the plaintiffs’ family members. For years, the complaint alleges, Hamas, its leaders, spokesmen, and members have openly maintained and used official Facebook accounts with little or no interference.
The incidents in question are the June 2014 kidnapping and murder of a 16-year-old U.S. citizen and the March 2016 stabbing death of a 29-year-old U.S. Army veteran.
That isn’t the only suit of its kind. In June, the family of a victim of the November 2015 ISIS attacks in Paris filed a separate suit accusing Facebook, Google, and Twitter of knowingly permitting the terrorist group to use their social networks as a tool for spreading extremist propaganda, raising funds, and attracting new recruits.
For its part, Facebook has stated that these claims are without merit and that it aggressively removes content promoting terrorism as soon as it becomes aware of it.
But the issues for Facebook don’t stop there. It was recently revealed that Micah Johnson, who murdered five police officers in Dallas last week, bought an AK-47 semi-automatic rifle in a transaction arranged via Facebook.
And what of the recent class action lawsuit filed against Snapchat for allegedly allowing minors to see sexually explicit content through its Discover feature? The suit argues that Snapchat’s failure to differentiate between the content offered to its minor users and its adult users is problematic, and ultimately a violation of federal and state consumer law.
While obviously a separate issue from terrorist activity and its consequences, the Snapchat matter raises the same general questions that no one seems able to answer: What can be posted on these social networks? Should the networks monitor potentially harmful or violent postings more closely? What should be done when such a posting is found? How is it determined whether a post is harmful, offensive, or violent in the first place? And finally, what responsibility, if any, do these social media companies bear for content posted on their platforms?
Common sense might dictate that a Hamas- or ISIS-organized Facebook page, for example, is easily identifiable. To borrow the famous line Supreme Court Justice Potter Stewart used to characterize pornography: “I know it when I see it” (Jacobellis v. Ohio, 1964).
But the answer might not be so simple in a civil court of law, especially looking ahead as these issues morph and social networks evolve. Legalities aside, it would be wise for Facebook, Snapchat, Twitter, Google, and the other social network giants to ensure that, in the court of public opinion, they are seen as defenders of liberty and opponents of terrorism and other harmful content.
The social networks appear to be taking a reactive approach to these issues, dismissing the lawsuits with short, simple statements that convey a sense of arrogant annoyance, as if they barely acknowledge the suits exist. While that strategy can sometimes have merit, it is important that the companies retain their users’ trust above all else. If they are seen as not taking terrorism, gun violence, and the exposure of minors to explicit content seriously, the tide can turn quickly even if the lawsuits pose little threat in the legal realm.
A more proactive and public approach may be in order: one that reassures users that these issues are being taken seriously, and that the companies stand with the vast majority of their users around the world who are patriotic and peaceful. If not, this wave of lawsuits may be just the beginning, spreading across a whole spectrum of issues as momentum builds in the media.