News of the incident - which drew comparisons to the beheading videos posted online by Islamic State militants at the height of their prominence nearly a decade ago - came as the CEOs of Meta, TikTok and other social media companies were testifying in front of federal lawmakers frustrated by what they see as a lack of progress on child safety online.

YouTube, which is owned by Google, did not attend the hearing despite its status as one of the most popular platforms among teens.

The disturbing video from Pennsylvania follows other horrific clips that have been broadcast on social media in recent years, including domestic mass shootings livestreamed from Louisville, Kentucky; Memphis, Tennessee; and Buffalo, New York, as well as attacks filmed abroad in Christchurch, New Zealand, and the German city of Halle.

Pete Feeney said the video in Pennsylvania was posted at about 10 p.m. Tuesday and stayed online for about five hours, a time lag that raises questions about whether social media platforms are delivering on moderation practices that may be needed more than ever amid wars in Gaza and Ukraine and an extremely contentious presidential election in the U.S.

Major social media companies moderate content with the help of powerful automated systems, which can often catch prohibited content before a human can. But that technology can sometimes fall short when a video is violent and graphic in a way that is new or unusual, as it was in this case, said Brian Fishman, co-founder of the trust and safety technology startup Cinder. That's when human moderators are "really, really critical," he said. "AI is improving, but it's not there yet."

The Global Internet Forum to Counter Terrorism, a group set up by tech companies to prevent these types of videos from spreading online, was in communication with all of its members about the incident on Tuesday evening, said Adelina Petit-Vouriot, a spokesperson for the organization.

Roughly 40 minutes after midnight Eastern time on Wednesday, GIFCT issued a "Content Incident Protocol," which it activates to formally alert its members - and other stakeholders - about a violent event that has been livestreamed or recorded.

GIFCT allows the platform with the original footage to submit a "hash" - a digital fingerprint corresponding to a video - and notifies nearly two dozen other member companies so they can restrict it from their platforms.

But by Wednesday morning, the video had already spread to X, where a graphic clip of Mohn holding his father's head remained on the platform for at least seven hours and received 20,000 views. The company, formerly known as Twitter, did not respond to a request for comment.

In the video posted after the killing, Mohn described his father as a 20-year federal employee, espoused a variety of conspiracy theories and ranted against the government.

Experts in radicalization say that social media and the internet have lowered the barrier to entry for people to explore extremist groups and ideologies, allowing anyone who may be predisposed to violence to find a community that reinforces those ideas.

Most social platforms have policies to remove violent and extremist content. But they can't catch everything, and the emergence of many newer, less closely moderated sites has allowed more hateful ideas to fester unchecked, said Michael Jensen, senior researcher at the University of Maryland-based Consortium for the Study of Terrorism and Responses to Terrorism, or START.

Despite the obstacles, social media companies need to be more vigilant about regulating violent content, said Jacob Ware, a research fellow at the Council on Foreign Relations. "The reality is that social media has become a front line in extremism and terrorism," Ware said.