Meta ignored its own research and continued business as usual. (Image by Taylor Callery, WSJ)
As an employee and a consultant, Bejar had a long history with Facebook, now Meta. He started with the company in 2009 and left in 2015 for personal reasons. By then, Instagram was a powerful audience-engagement arm for the company. "The trouble began," Horwitz writes, when Bejar's 14-year-old daughter shared a vulgar comment she had received on Instagram. Bejar's daughter reported the comment to Instagram. "A few days later, the platform got back to her: The insult didn't violate its community guidelines."
"Bejar was floored — all the more so when he learned that virtually all of his daughter's friends had been subjected to similar harassment," Horwitz reports, "Instagram acted so rarely on reports of such behavior that the girls no longer bothered reporting them. Bejar began peppering his former colleagues at Facebook with questions about what they were doing to address such misbehavior. The company responded by offering him a two-year consulting gig."
Back on the company's campus, Bejar described Meta's approach to harassment complaints as almost entirely automated: "while users could still flag things that upset them, Meta shifted resources away from reviewing them," Horwitz explains. "To discourage users from filing reports, internal documents from 2019 show, Meta added steps to the reporting process. . . . Rules were written narrowly enough to ban only unambiguously vile material. Meta's rules didn't clearly prohibit adults from flooding the comments section on a teenager's posts with kiss emojis or posting pictures of kids in their underwear, inviting their followers to 'see more' in a private Facebook Messenger group."
Bejar was tasked with addressing some of these issues. His team "built a new questionnaire called BEEF, short for 'Bad Emotional Experience Feedback.' A recurring survey of issues 238,000 users had experienced over the past seven days, the effort identified problems with prevalence from the start: Users were 100 times more likely to tell Instagram they'd witnessed bullying in the last week than Meta's bullying-prevalence statistics indicated they should," Horwitz reports. "Among users under the age of 16, 26% recalled having a bad experience in the last week due to witnessing hostility against someone based on their race, religion or identity. More than a fifth felt worse about themselves after viewing others' posts, and 13% had experienced unwanted sexual advances in the past seven days."
Bejar was under a time crunch. "With just weeks left at the company, Bejar emailed Zuckerberg, Chief Operating Officer Sheryl Sandberg . . . blending the findings from BEEF with highly personal examples of how the company was letting down users like his own daughter," Horwitz notes. "In response to Bejar's email, Sandberg sent a note to Bejar only, not the other executives. . . . He says he never heard back from Zuckerberg."
Bejar left Meta in 2021. He told the Journal that any effort to change Meta "would have to come from the outside. He began consulting with a coalition of state attorneys general who filed suit against the company, alleging that the company had built its products to maximize engagement at the expense of young users' physical and mental health," Horwitz adds. "Bejar also got in touch with members of Congress about where he believes the company's user-safety efforts fell short." He is scheduled to testify before a Senate subcommittee on Tuesday.