
Algorithm fail: YouTube videos of Notre Dame fire feature panels of 9/11 attacks

As thousands of Parisians lined the streets watching the historic Notre Dame cathedral burn on Monday evening, others around the world turned to YouTube for updates and were provided with false context about 9/11.

YouTube users in the United States and South Korea watching live streams of the burning building were greeted with “knowledge panels”, banners with a synopsis of related information, pushing Encyclopedia Britannica articles about the September 11 attacks. The platform introduced the knowledge panel feature in 2018 to cut down on misinformation, but in this case the tool created a false association between a fire reportedly caused by accident and the 2001 terrorist attacks in the US.

The platform’s automated tools may have mistaken the visuals of the burning building for 9/11 footage, according to Vagelis Papalexakis, an assistant professor of computer science and engineering at the University of California, Riverside, who studies the kind of machine learning used in such systems.

“As long as we are using automated methods to throttle content there is always a margin for mistake,” he said. “This is a multifaceted problem; not only is it working to detect false news but something being falsely associated with 9/11.”

YouTube did not immediately respond to a request for comment, but said in a statement that it had removed the panels from live streams of the fire following criticism.

The algorithm’s failure in this instance lends momentum to calls from tech watchdogs for transparency about how algorithms are written and used on the platform, said Caroline Sinders, a design and machine-learning researcher at Harvard.

“In this case specifically, with the recommendation being something so unrelated, we really need better audits to see why it is recommending what it’s recommending,” she said. “Hiding it is not helping.”

The controversy comes after YouTube, which is owned by Google, vowed to serve users fewer conspiracy theory videos following criticism that it amplified “harmful” misinformation, including content “claiming the Earth is flat, or making blatantly false claims about historic events like 9/11”. Last week, the platform was forced to disable comments on its live stream of a congressional hearing on hate speech after the comment section was itself flooded with hate speech.

The 9/11 content is the latest example of the company’s algorithms falling short as they attempt to manage the massive volume of content uploaded to the site each day, said Danaë Metaxa, a PhD candidate and researcher at Stanford who focuses on issues of internet and democracy.

“As tech companies play an increasingly key role in informing the public, they need to find ways to use automation to augment human intelligence rather than replace it, as well as to integrate journalistic standards and expertise into these pipelines,” Metaxa said.
