Donald Trump's Video Removed by Facebook and Twitter for Spreading Misinformation About Coronavirus
A post from the Donald Trump campaign was removed from Facebook and Twitter on Wednesday for violating the platforms' rules.
"If you look at children, children are almost — and I would almost say definitely — but almost immune from this disease," Trump said in the removed video, according to a report from The Washington Post.
Children are not immune to COVID-19. More than 240,000 children have contracted the contagious respiratory virus in the United States, the Post reported.
The Team Trump tweet "is in violation of the Twitter Rules on COVID-19 misinformation," a Twitter spokesperson said in a statement to PEOPLE. "The account owner will be required to remove the Tweet before they can Tweet again."
"This video includes false claims that a group of people is immune from COVID-19 which is a violation of our policies around harmful COVID misinformation," said a spokesperson from Facebook in a statement to PEOPLE.
The post had been removed from both Twitter and Facebook by Wednesday evening.
A representative for the Trump campaign claimed the decision was "another display of Silicon Valley’s flagrant bias against this president, where the rules are only enforced in one direction."
"Social media companies are not the arbiters of truth. The president was stating a fact that children are less susceptible to the coronavirus," the representative told PEOPLE.
Last week, Donald Trump Jr. was temporarily blocked from tweeting after he also shared misinformation about COVID-19, PEOPLE previously reported. The video shared by Donald Jr. appeared to have originated with the website Breitbart and showed doctors making unsupported statements about the virus from outside the Supreme Court.
The video showed the doctors "telling Americans they do not need to wear masks to prevent coronavirus while also pitching hydroxychloroquine, an anti-malaria drug the president has previously touted himself," according to the Associated Press.
In May, Twitter implemented a new fact-checking feature, adding a link at the bottom of two tweets from Trump in which he claimed mail-in ballots would increase voter fraud.
"Get the facts about mail-in ballots," says the message, which, when clicked, takes users to a page with further information on the subject that debunks the president's incorrect statements.
"In serving the public conversation, our goal is to make it easy to find credible information on Twitter and to limit the spread of potentially harmful and misleading content," Twitter said in a press release earlier that month. "Starting today, we’re introducing new labels and warning messages that will provide additional context and information on some Tweets containing disputed or misleading information related to COVID-19."
In June, Mark Zuckerberg announced that Facebook would be adding similar policies, which he said "are designed to address the reality of the challenges our country is facing and how they’re showing up across our community."
The steps Zuckerberg announced included "Providing Authoritative Information on Voting During the Pandemic," "Additional Steps to Fight Voter Suppression," "Creating a Higher Standard for Hateful Content in Ads," and "Labeling Newsworthy Content."
As information about the coronavirus pandemic rapidly changes, PEOPLE is committed to providing the most recent data in our coverage. Some of the information in this story may have changed after publication. For the latest on COVID-19, readers are encouraged to use online resources from the CDC, WHO, and local public health departments. PEOPLE has partnered with GoFundMe to raise money for the COVID-19 Relief Fund, a GoFundMe.org fundraiser to support everything from frontline responders to families in need, as well as organizations helping communities.