The test couldn't have been much easier — and Facebook still failed

Facebook and its parent company Meta flopped once again in a test of how well they could detect obviously violent hate speech in ads submitted to the platform by the nonprofit group Global Witness.

The hateful messages focused on Ethiopia, where internal documents obtained by whistleblower Frances Haugen showed that the platform's ineffective moderation was helping fuel ethnic violence. In March, Global Witness ran a similar test with hate speech in Myanmar, which Facebook also failed to detect.

The group created 12 text-based ads that used dehumanizing hate speech to call for the murder of people belonging to each of Ethiopia's three main ethnic groups: the Amhara, the Oromo and the Tigrayans.

Facebook's systems approved the ads for publication, just as they did with the Myanmar ads. The ads were not actually published on Facebook.

This time, though, Global Witness informed Meta about the undetected violations. The company said the ads shouldn't have been approved and pointed to the work it has done to catch hateful content on its platforms.

A week after hearing from Meta, Global Witness submitted two more ads for approval, again with blatant hate speech.

The two ads, written in Amharic, the most widely used language in Ethiopia, were approved.