By Sheila Dang
Further investigation into Facebook’s lack of controls to prevent misinformation and abuse in languages other than English is likely to leave people “even more shocked” by the potential damage done by the social media company, whistleblower Frances Haugen told Reuters.
Haugen, a former product manager at Facebook, which is owned by Meta Platforms Inc, spoke at the Reuters Next conference on Friday.
She left the company in May with thousands of internal documents, which she leaked to the Wall Street Journal. That led to a series of articles in September detailing how the company knew its apps were helping spread divisive content and harming the mental health of some young users.
Facebook also knew there were too few workers with the language skills to identify objectionable user posts in a number of developing countries, according to internal documents and Reuters interviews with former employees.
People who use the platform in languages other than English are using a “raw and dangerous version of Facebook,” Haugen said.
Facebook has said it disagrees with Haugen’s characterization of its internal research and is proud of the work it has done to curb abuse on the platform.
Haugen said the company should be required to disclose which languages are supported by its technical safety systems; otherwise, “Facebook will do … the bare minimum to minimize the public relations risk,” she said.
Internal Facebook documents released by Haugen have also raised new concerns about how the company may have failed to take action to prevent the spread of misleading information.
Haugen said the social media company knew it could introduce “strategic frictions” to slow users down before they share posts, such as requiring users to click on a link before they could share the content. But she said the company avoided taking such steps in order to preserve its profits.
Such measures to make users reconsider sharing certain content could be useful, given that allowing technology platforms or governments to determine which information is true poses many risks, according to internet and legal experts who spoke on a separate panel at the Reuters Next conference on Friday.
“By regulating speech, you give states the power to manipulate speech for their own purposes,” said David Greene, director of civil liberties at the Electronic Frontier Foundation.
The documents released by Haugen have led to a series of US congressional hearings. Adam Mosseri, the head of Instagram at Meta Platforms, will testify next week about the app’s effect on young people.
When asked what she would tell Mosseri if the opportunity presented itself, Haugen said she would question why the company hasn’t released more of its internal research.
“We now have proof that Facebook has known for years that it harms children,” she said. “How are we supposed to trust you in the future?”