Facebook Will ‘Nudge’ Children and Teens Away From Harmful Content
IS THAT IT?
Somewhere, there’s a team of Facebook employees popping open a bottle of champagne and slapping each other on the back. Just days after whistleblower Frances Haugen testified before Congress that the company’s platforms actively harm children and propagate dangerous misinformation, the social media giant has swooped in with a plan to mitigate the issues it has long known about internally. The company will offer several fresh features on its platforms, including politely prompting teens to take a break if the algorithm judges they’ve been scrolling Instagram for too long, and “nudging” young users if they’re repeatedly looking at content deemed harmful to their mental health.
The trumpeting of these new features fell to Nick Clegg, Facebook’s vice president for global affairs, who earned his paycheck this weekend by stopping by a number of Sunday news shows. “We are constantly iterating in order to improve our products,” Clegg said on State of the Union. “We cannot, with a wave of our wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use.” Clegg added that the company needs “greater transparency.” Josh Golin, the executive director of a media-marketing watchdog group, was less charitable, telling the AP of the new features: “There is tremendous reason to be skeptical.”