Meta Platforms has announced a plan to control “potentially unwelcome or unwanted comments” on Facebook posts about the conflict between Israel and Hamas.
In an updated blog post, the tech giant said it is offering users a “temporary measure” to “protect them” from “unwanted or unwelcome” comments about the ongoing conflict in the Middle East.
The company, which owns Facebook and Instagram, also said it will change the default setting for who can comment on new, public Facebook posts created by users “in the region” to only their friends and followers. Users can opt out and change the setting at any time, Meta added.
A Meta spokesperson declined to specify how the company defined the region.
The social media giant said that the new policies are designed to “keep people safe” while “giving everyone a voice.”
“After the terrorist attack by Hamas against Israel [Oct. 7], and Israel’s response in Gaza, our teams introduced a series of measures to address the spike in harmful and potentially harmful content spreading on our platforms,” Meta wrote in the blog post. “Our policies are designed to keep people safe on our apps while giving everyone a voice.”
“We apply these policies equally around the world and there is no truth to the suggestion that we are deliberately suppressing voice,” Meta said.
Earlier this week, some users who posted in support of Palestinians or residents of Gaza accused Meta of suppressing their content.
Mondoweiss, a news website that covers Palestinian human rights, said that Instagram had twice suspended the profile of its video correspondent. Other Instagram users reported their posts and stories about Palestine were not receiving views.
Meta said it fixed a bug on Instagram that prevented re-shared content from appearing properly in users’ stories, which disappear after 24 hours.
The company previously said it had created a “special operations center” staffed with experts, including fluent Hebrew and Arabic speakers, to monitor its platforms and more quickly remove content that violates Meta’s policies.
Meta said Hamas is banned from Facebook and Instagram under its dangerous organizations and individuals policy.
“We want to reiterate that our policies are designed to give everyone a voice while keeping people safe on our apps,” the company said in a statement. “We apply these policies regardless of who is posting or their personal beliefs, and it is never our intention to suppress a particular community or point of view.”
“Given the higher volumes of content being reported to us, we know content that doesn’t violate our policies may be removed in error,” the statement continued. “To mitigate this, for some violations we are temporarily removing content without strikes, meaning these content removals won’t cause accounts to be disabled. We also continue to provide tools for users to appeal our decisions if they think we made a mistake.”
The tech company also said it is working with AFP, Reuters and Fatabyyano to fact-check posts and move content with false claims lower in users’ feeds.
Fox News Digital’s Landon Mion and Reuters contributed to this report.