Facebook Parent Meta Shares Details About Newsworthy Posts It Leaves Up


What’s happening

Meta, Facebook’s parent company, shared data for the first time about the number of times it’s applied its newsworthiness allowance. The company will sometimes leave up posts that could violate its rules, if it determines the posts are newsworthy.


Why it matters

How Facebook balances newsworthiness with public safety has been an important question, especially ahead of the 2022 US midterm elections.

Facebook parent company Meta said that from June 2021 to June 2022 it made 68 "newsworthiness allowances" for pieces of content that might otherwise violate its rules.

It’s the first time Meta has revealed how many times it’s applied an exemption under which it leaves up newsworthy content that could break its rules. Facebook introduced the exemption in 2016 after the social network faced public backlash for removing an iconic photo of a girl fleeing a napalm attack during the Vietnam War. The company initially said the image violated its rules against child nudity, but it reinstated the photo after considering its historical significance.

How Facebook balances newsworthiness against the risk of public harm has been an important question, especially ahead of the 2022 US midterm elections. The company doesn’t presume that any person’s speech, including that of politicians, is inherently newsworthy. Meta said about 20%, or 13, of its newsworthiness allowances were issued for posts by politicians.

A semi-independent board that reviews the company’s toughest content moderation decisions recommended that Facebook release data about its newsworthiness allowance. Known as the Oversight Board, the group operates separately from Facebook but receives funding from Meta through a trust. The board made this recommendation in its decision to uphold the company’s call to suspend Donald Trump, who was US president at the time, from the platform following the deadly Jan. 6 Capitol riot.

Trump will be suspended from Facebook until at least January 2023, and the social network said it’ll look to experts to assess whether the risk to public safety has declined. Trump is reportedly considering a 2024 presidential run.

Monika Bickert, Meta’s vice president of content policy, said the company will look at instances of violence, restrictions on peaceful assembly and other markers of civil unrest.

"If at that time we determine there’s still a serious risk to public safety, then we will extend the restriction for a period of time and then we’ll continue to evaluate," she said during a press conference Thursday.

In an update posted online, Meta shared examples of when it applied its newsworthiness allowance. In one, the Ukrainian Defense Ministry shared a video that briefly showed an unidentified charred body. The company determined the video was newsworthy because it documented an ongoing armed conflict, even though Facebook typically removes such content under its policy against violent and graphic posts. Instead, Facebook placed a warning screen over the content and made it available only to users 18 and older.

Meta said it’s also expanding the scope of the Oversight Board so the group can review cases about whether the social network should apply warning screens to content. Bloomberg reported that, because of an informal recommendation by the board, Meta is also working on a customer service group to respond to users who had their accounts or posts removed unexpectedly.



