I just learned that an article I shared yesterday was taken down by Facebook because, in their words, "it looks like spam."

The article? My opinion piece urging people not to politicize the Mollie Tibbetts case but to pray for the family instead.

Here’s a screenshot of the message I received:

I want to be clear: I know that human beings who work at Facebook did not do this personally. An algorithm does this work for them, so I'm not saying this was discriminatory.

I contacted them to confirm that I did post the article and that it is not spam, so I anticipate that, after a review, a reasonable human being at Facebook will see that and restore the post.

I hope that happens anyway.

This, however, is a consequence of the social media giant's anti-fake-news crusade: algorithms that are too sensitive.

Since documented viewpoint discrimination has taken place at Facebook, the optics of this sort of thing look horrible.

Some remedies:

1. Stop trying to be the arbiter of what is fake news.

U.S. Senator Ben Sasse (R-Nebraska) cautioned Facebook to err on the side of free speech when Facebook founder Mark Zuckerberg testified at a Senate hearing on the subject.

“I wouldn’t want you to leave here today and think that there is a unified view in Congress that you should be moving toward policing more and more and more speech. I think violence has no place on your platform. Sex traffickers and human traffickers have no place on your platform, but vigorous debates? Adults need to engage in vigorous debates,” Sasse said.

Facebook has made adjustments to its process for purchasing political ads which, I believe, will nip much of the "fake news" problem we saw from the Russians on the platform. If a post is egregious and violates the community standards, let humans report it so humans can review it. There is genuinely fake news posted on Facebook, but much of what is being called "fake," on both the left and the right, isn't necessarily fake; it is an opinion or a news piece the critic disagrees with.

Also, we can police ourselves. When friends share a story from an unreliable site, I've commented on their posts noting that the website is notorious for promoting false information. As a result, some have taken the post down; some haven't. People can decide for themselves which news sites they can trust and which they can't.

2. Be transparent about what the algorithm looks for.

Algorithms and AI don't have a bias or a particular viewpoint, but the people who program them do. If Facebook is going to proactively review content, it would behoove the company to disclose what it is looking for.

I don't think the keywords or phrases the algorithm is programmed to look for amount to proprietary information, and I'm not the only one who has had a post taken down for "spam" or an alleged violation of community standards.
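To make the transparency point concrete, here is a purely hypothetical sketch of how a keyword-based spam filter might flag posts. The keyword list and threshold are invented for illustration; Facebook has never disclosed its actual criteria, which is precisely the problem.

```python
# Illustrative sketch of a naive keyword-based spam filter.
# NOT Facebook's actual system -- the keywords and threshold are made up.

SPAM_KEYWORDS = {"free", "click here", "winner"}  # hypothetical list

def looks_like_spam(post_text: str, threshold: int = 1) -> bool:
    """Flag a post if it contains at least `threshold` listed keywords."""
    text = post_text.lower()
    hits = sum(1 for kw in SPAM_KEYWORDS if kw in text)
    return hits >= threshold

# A low threshold makes a filter like this over-sensitive: an ordinary
# post that merely mentions one keyword gets flagged as spam.
print(looks_like_spam("Click here for a free prize!"))  # True
print(looks_like_spam("Please pray for the family."))   # False
```

If users could see the list and the threshold, they would at least know why a post was flagged and could appeal intelligently.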

Why did Facebook’s algorithm determine my post above to be spam? I would love to know. 

Ultimately, if Facebook would like to build trust with its conservative users, it should either stop the proactive removal of content or disclose what it is looking for.

I would also suggest the algorithm is far, far too sensitive. In the long run, this does nothing to foster a culture of free speech, which, I would think, Facebook would want.

Update: Facebook finally restored my post but gave no explanation of why it was considered spam. Frankly, the damage has been done: the takedown shut off the comments and likes that help a post gain traction on the platform.
