Payton Gendron, center, is led into the courtroom for a hearing at Erie County Court, in Buffalo, N.Y., Thursday, May 19, 2022. Gendron faces charges in the May 14 fatal shooting at a supermarket. Credit: AP/Matt Rourke

WASHINGTON — The mass shootings at a Texas elementary school and a Buffalo supermarket this month have intensified calls from U.S. lawmakers for social media platforms to crack down on violent imagery and hate speech that proliferate online. But industry experts say any proposed legislation must contend with fast-changing technology and the emergence of smaller platforms created by extremist groups.

The 18-year-old shooter who killed 19 fourth-graders and two teachers at Robb Elementary School in Uvalde, Texas, on Tuesday posted multiple images on his Instagram account showcasing the AR-15 rifles he would later use in his rampage. Authorities also said Salvador Ramos used Facebook to message another user about his plans.

In Buffalo, the 18-year-old accused of shooting 10 Black New Yorkers and wounding three others at the Tops supermarket livestreamed the massacre on his Twitch account and used the site to preview his plans to a select group of members. Authorities have said Payton Gendron was radicalized online, subscribing to racist ideology promoted by white supremacists in online forums.

Immediately after the Buffalo shooting, Democratic lawmakers called on social media companies to do more to regulate racist and incendiary content.

President Joe Biden, in a speech in Buffalo days after the shooting, said “the internet has radicalized angry, alienated, lost and isolated individuals into falsely believing that they will be replaced” by nonwhites. He said addressing online radicalization was critical to tackling the number of mass shootings in the United States.

“You can't prevent people from being radicalized to violence, but we can address the relentless exploitation of the internet to recruit and mobilize terrorism,” Biden said. “We just need to have the courage to do that, to stand up.”

Gov. Kathy Hochul, a day after the Buffalo shooting, told ABC's “This Week” that social media companies “need to be held accountable and assure all of us that they’re taking every step humanly possible to be able to monitor this information.”

“I know it's a huge, vast undertaking, but these companies have a lot of money. They have resources. They have technology,” Hochul said. “Keywords show up, they need to be identified, someone needs to watch this, and to shut it down the second it appears, and short of that, we will protect the right to free speech, but there is a limit. There is a limit to what you can do and … hate speech is not protected.”

But efforts to increase oversight of social media companies have languished in Congress, in part over arguments raised by Republican lawmakers who contend social media platforms would target and censor conservative voices. Democrats counter that those concerns are unfounded, noting there are safeguards in place to protect free speech and expression.

Last June, the White House released a 30-page “National Strategy for Countering Domestic Terrorism,” which calls for “reducing both supply and demand of recruitment materials by limiting widespread availability online and bolstering resilience to it by those who nonetheless encounter it.”

The document calls for “increased information sharing” among federal agencies and the tech industry to help social media companies identify harmful content that could be used to radicalize users.

In the wake of the Buffalo shooting, congressional Democrats were pushing for legislation that would bolster the Biden administration’s strategy by authorizing the Department of Homeland Security, Department of Justice and FBI to establish offices focused on investigating and tracking domestic terrorist activity, such as the calls for violence often promoted on far-right and white nationalist web forums.

House Democrats passed the Domestic Terrorism Prevention Act days after the shooting, but on Thursday the measure failed to garner the necessary 10 GOP votes in the Senate to move the legislation forward.

Sen. Chuck Schumer (D-N.Y.), the Senate majority leader, said in a floor speech that the legislation would have been “a chance to act on a pernicious issue that has recently become an increasingly prevalent component in America's gun violence epidemic — the evil spread of white supremacy and domestic terrorism.”

Sen. Kirsten Gillibrand (D-N.Y.), in a news conference a day after the Uvalde shooting, said there is a “need to hold Big Tech and social media platforms accountable for the harms against society and our democracy.”

Gillibrand called on lawmakers to support her proposed Data Protection Act, which would create a regulatory body that she contends will “not only help Americans against modern privacy harms, it would also hold social media companies accountable for monetizing hate and pushing dangerous extremist content into people's feeds.”

Gillibrand later told Newsday that she is working to garner Republican support for her proposal “so that we can have a bipartisan bill” and that some tech platforms have openly said “they think there needs to be a place for regulation of these platforms.”

Meta, the parent company of Facebook and Instagram, and Twitter, among the most widely used platforms in the United States, have implemented user standards that they say ban the dissemination of violent imagery, violent threats and hate speech. But watchdog groups say the social media giants are sometimes slow to identify and take down such content.

Ben Weich, a spokesman for the Center for Countering Digital Hate, told Newsday in an interview that social media platforms have not adequately funded efforts within their own companies to monitor and eliminate hate speech.

Weich said “any solution must start with better investment in this area.”

The Washington, D.C.-based nonprofit has worked with lawmakers in the United States and abroad on legislation to address online hate speech. Weich said the key to crafting meaningful laws to tackle online radicalization “is to increase transparency from tech companies about what they're doing to counter harmful content, in order to empower enforcers, researchers and users.”

Tech Against Terrorism, a London-based organization supported in part by the United Nations, told Newsday it has been working with “hundreds of platforms” to have the Buffalo shooting video removed. It said that while most are cooperating, a “small number of platforms are refusing, probably because they are owned or operated by violent extremists who sympathize with the ideology behind attacks such as in Buffalo, but under U.S. law there is nothing criminal about this.”

Adam Hadley, the group’s executive director, said in an interview with Newsday that part of the challenge of attempting to crack down on the spread of extremist content online in the United States is the lack of a federal designation for domestic terrorist groups, such as neo-Nazi organizations. Such designations would clarify for tech companies which groups should be closely monitored.

Another challenge is addressing the growth of smaller platforms created by extremist groups, which are harder to trace and regulate, Hadley said.

“Increasingly we're finding that terrorists and violent extremists of all types are building their own technologies,” Hadley said. “They're building their own apps. They're building their own websites. They're building their own social media platforms.”

Hadley, noting the outsized number of mass shootings in the United States compared with other democratic nations, said that while “tech companies should be doing a lot more, so should society overall.”
