A disturbing Facebook video shows a mother trespassing at a mosque with her three kids while teaching them racist and inaccurate information about Islam.
According to HuffPost, the incident took place in Tempe, Arizona, a city just east of Phoenix.
Facebook’s community standards prohibit hate speech and violent threats against people based on their religious practices.
Facebook's roughly 7,500 content reviewers decide whether to allow or remove posts flagged by its 2 billion users.
According to a ProPublica report, Facebook's content reviewers often make different calls on items with similar content and don't always abide by the company's complex guidelines. Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook's definition of hate speech.
ProPublica asked Facebook to explain its decisions on a sample of 49 items, sent in by people who maintained that content reviewers had erred, mostly by leaving hate speech up, or in a few instances by deleting legitimate expression. In 22 cases, Facebook said its reviewers had made a mistake. In 19, it defended the rulings. In six cases, Facebook said the content did violate its rules but its reviewers had not actually judged it one way or the other because users had not flagged it correctly, or the author had deleted it. In the other two cases, it said it didn’t have enough information to respond.
ProPublica is an American nonprofit organization based in New York City. Founded in 2007, it describes itself as a nonprofit newsroom that produces investigative journalism in the public interest.
“We’re sorry for the mistakes we have made — they do not reflect the community we want to help build,” Facebook Vice President Justin Osofsky said in a statement. “We must do better.” He said Facebook will double the size of its safety and security team, which includes content reviewers and other employees, to 20,000 people in 2018, in an effort to enforce its rules better.
He added that Facebook deletes about 66,000 posts reported as hate speech each week, but that not everything offensive qualifies as hate speech. “Our policies allow content that may be controversial and at times even distasteful, but it does not cross the line into hate speech,” he said. “This may include criticism of public figures, religions, professions, and political ideologies.”
Facebook's guidelines define a hateful attack very literally, which means that posts expressing bias against a specific group but lacking explicitly hostile or demeaning language often stay up, even when they use sarcasm, mockery or ridicule to convey the same message. Because Facebook tries to write policies that can be applied consistently across regions and cultures, its guidelines are sometimes blunter than it would like, a company spokesperson told ProPublica.