Oct. 27, Jared Schroeder, SMU Journalism professor with free press expertise, for a piece critical of social media platforms that fail to alert law enforcement agencies in a timely manner about threats of violence discovered on their platforms. Published in the Dallas Morning News: http://bit.ly/32UUvI9
By Jared Schroeder
We should expect more from social media companies when it comes to identifying and reporting dangerous threats.
Mass shooters have made posting threats and manifestos on social media a part of the playbook for their horrendous crimes.
The El Paso shooting suspect, who was in federal court earlier this month, posted a hate-spewing manifesto on 8chan before the attack. Soon after posting his missive, 22 people were killed and 24 more were injured.
While he was in court in El Paso, a Texas House committee on mass violence, created after the El Paso and Permian Basin attacks in August, met for the second time and focused on the connection between social media and violent threats. During the hearing at Brookhaven College near Dallas, I testified about the First Amendment and federal laws that limit the power of lawmakers to halt the type of hateful expression that social media forums often host. Before my testimony, however, lawmakers heard from law enforcement and a representative from Facebook.
Maj. David Cabrera, a leader in the Texas Department of Public Safety’s counterterrorism and intelligence efforts, told the committee about a threat his office identified on Twitter on Oct. 3. Someone posted plans to shoot up a 10 p.m. showing of the movie Joker. As DPS scrambled to learn the identity of the poster from Twitter, the clock ran out. Twitter provided DPS with the information after the threatened attack would have taken place. If this had been a true threat, people might have been killed.
Twitter was not at the hearing. Neither were representatives of Google or Microsoft, though all were invited. Someone from Facebook provided testimony, but offered little substantive guidance for a committee that is tasked with making policy suggestions for protecting Texans from future shootings.
Rep. César Blanco, who represents El Paso, said the tech firms’ absence from the meeting “demonstrates that social media companies don’t believe they should be active participants in keeping people safe.”
At a time when tech firms and state lawmakers need each other more than ever to prevent mass violence, this disconnect is disturbing.
Social media and tech firms are increasingly being labeled as bad actors. Their names have become synonymous with election malfeasance, privacy violations and threats from mass shooters. A little cooperation with lawmakers who seek to save lives would improve their tarnished images.
For lawmakers, the problem remains: Social media and tech firms often have the information law enforcement needs to keep us safe, but have been less than enthusiastic about providing it or creating systems that provide it.
The testimony from Facebook’s representative highlighted systems and initiatives the company has created, but I was struck by how everything was done on the corporation’s terms. Law enforcement must navigate their systems. As it stands, social media and tech firms evaluate what to provide and when to provide it.
Whereas social media and tech firms have found countless ways to monetize our personal data — they are capable of posting a running shoe ad on my browser seconds after I search for a new pair — they seem awfully flustered when it comes to creating systems for alerting law enforcement about imminent threats.
These corporations do a tremendous amount of business in Texas — by business, I mean collecting Texans’ data and monetizing it. Asking them to create a seamless system for identifying threats is not unreasonable and passing information on to law enforcement does not violate privacy rights. But if the companies do not wish to do so, lawmakers are limited in their power to compel such good behavior.
Federal law does not allow social media and tech firms to be held liable for how people use their services, which means officials likely cannot fine the firms for not sharing information. At the same time, the First Amendment generally does not allow lawmakers to limit the expression that occurs online. In most cases, it also does not allow lawmakers to compel the corporations to take information down or to include certain language in their terms of use, which was one of the lines of questioning the committee pursued.
We also do not want the government to have access to all of the information these firms have about us, so privacy law limits many options.
This is why, aside from creating media literacy programs that help inoculate Texans from online radicalization, the best solution to this otherwise intractable problem is for the government and firms to work together to save lives. Some excellent ideas were suggested during the hearing:
· Social media and tech firms could create a hotline for law enforcement to call to process requests quickly. Online submissions are slow and unreliable.
· Lawmakers could mandate social media and tech firms each designate a person or office located in the state that works with law enforcement in exploring threats — part of the price of doing business in Texas.
· If firms refuse to cooperate, lawmakers could create a tax on social media and tech firms’ profits in the state. The proceeds would support law enforcement’s efforts to keep us safe or media literacy programs, which could help Texans develop strategies for discerning truth from falsity online.
Social media and tech firms have created business models that buy, sell and use our data. Their privileges here can be tempered with a responsibility to identify the foreshadowing of immediate threats and take steps to work with those seeking to keep us safe.
Jared Schroeder is an assistant professor of journalism at Southern Methodist University and the author of “The Press Clause and Digital Technology’s Fourth Wave: Media Law and the Symbiotic Web.” He wrote this column for The Dallas Morning News.