Facebook blames those pesky algorithms for approving pro-genocide ads in Kenya

Meta’s mascot should be the “NOT ME” imp that appears in Family Circus comics when none of the kids are willing to take responsibility for spilled grape juice, broken furniture, and other mishaps around the house.

Who approved ads advocating for ethnic cleansing in Kenya? NOT ME, says Facebook.

Who “allowed themselves to be a vector of hate speech and incitement”? NOT ME.

Who “approved ads on Facebook in both Swahili and English that included calls to rape and behead Kenyan citizens along ethnic lines”? NOT ME.

As you might imagine, Facebook is very upset with NOT ME.

Facebook’s NOT ME isn’t the smirking sprite seen in Family Circus; it’s “proactive detection technology” that serves as a convenient scapegoat every time Facebook makes a massive fuckup.

Fortunately, the ads were submitted by a watchdog group testing Facebook’s hate-speech detection and never actually ran on the platform.

From Courthouse News:

“We have dedicated teams of Swahili speakers and proactive detection technology to help us remove harmful content quickly and at scale,” Meta said in a statement. “Despite these efforts, we know that there will be examples of things we miss or we take down in error, as both machines and people make mistakes. That’s why we have teams closely monitoring the situation and addressing these errors as quickly as possible.”