This past Friday, February 5, Twitter announced, in a tweet of course, that it had shut down more than 125,000 terrorism-related accounts since mid-2015, most of them linked to the Islamic State. The social media site removes accounts that are reported to it, and it also uses spam-fighting tools to identify and take down other violent accounts. It works with public interest groups to encourage counter-speech and reports violent accounts to the government.
This is the way the system should work. Companies work with government and civil society groups to accomplish a common public purpose, without facing legal liability or mandates in an area where sensitive line-drawing and judgment calls are inevitable. Twitter is not alone in carrying out its social responsibilities; Google and Facebook have been equally active.
In a striking coincidence, today is the twentieth anniversary of the enormously effective Section 230 of the 1996 Telecommunications Act, which made clear that Internet platforms like Twitter are not liable for material placed on their systems by third parties. This limitation on liability has fueled the growth of Internet platforms as engines of free speech and association, as well as drivers of innovation, economic growth, and job creation. Indeed, without Section 230, Twitter and other social media probably would not exist today.
It is thanks to this far-sighted provision that social media companies have been able to succeed and to act effectively to combat social problems like hate speech, terrorist activity and revenge porn.
Some still read this liability rule as requiring or encouraging Internet platforms to remain hands-off and not to interfere in any way with the content on their systems.
But the absence of legal liability does not mean the absence of social responsibility. In fact, the heading of Section 230 is "Protection for 'Good Samaritan' blocking and screening of offensive material." Its purpose is to allow Internet entities to respond to problems on their systems without becoming legally liable for damages when harmful information is present.
Twitter and other Internet actors are doing just what the Good Samaritan provision calls for: acting forcefully to address a social problem without the need for a legal mandate and with protection from legal liability. Congratulations to them for their leadership. And happy birthday, Section 230!