In one Facebook post, he stands tall in yellow camouflage, decorated with badges. He promises he'll increase the salaries of teachers in Sudan. In another post, he hunches over a fire, cooking food with locals. And in another, published days after he oversaw a bloodbath, he's standing on top of his jeep, brimming with joy as throngs of men, women and children dance around him.
Lt. Gen. Mohamed Hamdan Dagalo, better known as Hemeti, is a social media personality. He is also the leader of the Rapid Support Forces — the paramilitary group that attacked thousands of pro-democracy protesters this month, leaving more than 100 dead. This is a bit of a second act for Hemeti, who previously served with the Janjaweed, the militia group considered responsible for the genocide in Darfur about 15 years ago, according to Foreign Policy magazine.
On Facebook, multiple pages promote Hemeti as a formidable yet kind authority figure.
Sudanese activists have petitioned Facebook to remove Hemeti and his extremist group from the platform. But the tech giant says it cannot take action because Hemeti is now second in command in Sudan's transitional government. Even if he is a warlord, Facebook leaders reason, he may be a state actor. The company is reluctant to make decisions that either anoint or knock down government officials.
The conflict in Sudan is just the latest example of Facebook's chronic uncertainty over how to wield its vast power in volatile regions where lives are at stake — while also shielding itself from the charge that the private company is simply too powerful.
The massacre that has put Facebook in the hot seat this time happened on June 3, the last day of Ramadan, among the holiest of days for Muslims. RSF soldiers led the attack, along with police and some special forces, using live ammunition, tear gas, whips and sticks to raid a months-long sit-in in Sudan's largest city and capital, Khartoum.
Health workers on the ground say more than 100 people were killed. Those responsible tried to conceal the carnage by dumping the bodies into the Nile River. Humanitarian groups charge that the junta's foot soldiers also raped women, and point to pictures of RSF troops flaunting women's underwear on Facebook.
Facebook is "giving a pulpit to what is essentially a terrorist organization," says Ahmed el-Gaili, a Sudanese attorney who practices international law and is based in Dubai. "You cannot give a forum to an organization that has committed such crimes, even if all they are posting are pictures of cats and dogs."
Facebook has a track record of banning extremists who have come under fire in the U.S. The company recently expanded its definition of hate speech to include white nationalism. Multiple Sudanese activists say they find it baffling that the company would ban far-right figures like Alex Jones, yet allow a paramilitary leader like Hemeti to use the platform as his propaganda machine.
"Unfortunately, reactive changes in policy are the norm at Facebook, and they probably haven't felt enough pressure yet on Sudan," says Susan Benesch, director of the Dangerous Speech Project, which tracks extremist content online.
Facebook has a different explanation. Company leaders do not dispute Hemeti's track record as a warlord. But his position has changed over the years. He's moved from the fringes of Sudanese society into Sudan's main political circles. He was appointed second-in-command of a transitional government that ousted dictator Omar al-Bashir earlier this year.
According to Brian Fishman, a Facebook spokesman who leads efforts to track dangerous organizations, different rules apply to state actors and non-state actors. The company has artificial intelligence designed to identify and deplatform (or boot) groups that may be affiliated with ISIS, for example. Even if certain posts might seem innocuous, they're banned because the groups' activities in the real world are harmful.
But if the questionable Facebook user also represents the state, and is not explicitly breaking speech rules set by the company, Facebook is hesitant to intervene. The international community is already worried about the company's inordinate power to publish and censor the speech of more than 2 billion people. If Facebook bans a government official, Fishman says, that could make other governments even more wary of the Silicon Valley giant.
CEO Mark Zuckerberg has talked openly about the expectation that social media companies should protect society "from broader harms" by censoring or banning content. He is calling for a new independent body to be created to take on these decisions. "I've come to believe that we shouldn't make so many important decisions about speech on our own," Zuckerberg explained in a Washington Post op-ed.
His remarks came after his company acknowledged its slow response to the genocide in Myanmar. Civil society and human rights organizations in that country reached out to Facebook as early as 2014, asking repeatedly for the platform to intervene as extremist leaders built their social media personalities, and later moved beyond propaganda to incite violence against the Rohingya Muslim population.
"Facebook ignored the warnings," says Michael Lwin, a technologist and lawyer based in Myanmar. The problem in that country was not that Facebook didn't know that its platform was being used for propaganda, Lwin says. It's that the company didn't act despite knowing. The company kept a handful of Burmese-language human reviewers based in Singapore, he says, who didn't understand the local context and reached out to local civil society groups infrequently and reactively. "There are commonalities in how this story unfolds," Lwin adds.
Facebook is one of the world's largest companies; its revenue topped $15 billion in the first quarter of this year. The company does not have an office in Sudan, but spokesman Fishman says a team is tracking the situation on the ground and Facebook has made substantial investments in hiring Arabic-speaking content reviewers.
Shortly before the Khartoum massacre, one paramilitary page on Facebook featured a video in which critics of the pro-democracy sit-in claimed it was failing. In the days after, the RSF said on Facebook that it was acting in the interest of the country and that activist groups lacked patriotism. In another post, it took credit for bringing stability to Darfur and securing Sudan's borders against illegal immigrants.
While these military leaders use Facebook to promote their message, they've cut off Internet access for the rest of the country, imposing a digital blackout and citing security reasons. Sudanese citizens who had relied on Messenger, WhatsApp and Instagram (all owned by Facebook) to publicize meeting dates, post footage of human rights abuses or message one another can no longer do so.
"It bothers me a lot," says Mohamed Suliman, an expat based in the U.S. He helped launch the online petition after Facebook failed to respond to his and other users' requests to take down Hemeti's pages. "It's like every day, you see the one who killed your sister, who raped innocent women, being promoted as the leader. And the people he's attacking cannot speak," Salih said.
Editor's note: Facebook is among NPR's recent sponsors.
A previous version of this story incorrectly referred to Mohamed Suliman as Mohamed Salih. Additionally, the quote about story commonalities was from Lwin, not Lewis.
ARI SHAPIRO, HOST:
A Sudanese warlord who led an attack last month on protesters that left more than a hundred people dead is, it turns out, a Facebook personality. He is using the platform to promote himself as a strong-yet-kind leader. Pro-democracy activists want him booted off the site. And so far, Facebook says no. Here to talk with us about it is NPR's Aarti Shahani. Welcome back to the studio.
AARTI SHAHANI, BYLINE: Hi.
SHAPIRO: Tell us about who this warlord is and what the relationship is with Facebook.
SHAHANI: Yes. So his name is Lieutenant General Mohamed Hamdan Dagalo, better known as Hemeti. He's got a long history of violence. He was a member of the Janjaweed, the militia considered responsible for the genocide in Darfur. He was a senior aide to the former dictator, and most recently, as you mentioned, he oversaw the massacre. This is the man who's using Facebook to whitewash his image. He's got a bunch of pages dedicated to making him look good. In one post, he promises to raise teacher salaries in Sudan. In another, which went up soon after the recent killings, there's a lovely video of him standing on his Jeep with throngs of men, women and children dancing around him.
The message there and in many others is Hemeti is the protector of the nation, and the protesters are the unpatriotic ones. Sudanese activists find this sickening. They've petitioned Facebook saying, hey, you booted off American right-wingers like Alex Jones; please boot off this warlord who is way, way worse. Ahmed el-Gaili, a Sudanese international law expert who's based in Dubai - he says Facebook is letting itself be a propaganda machine for Hemeti and his paramilitary group.
AHMED EL-GAILI: This is giving a pulpit to what is essentially a terrorist organization. Even if all they are posting are pictures of cats and dogs, they do not belong in Facebook or in any other forum.
SHAPIRO: Is Facebook responding to this?
SHAHANI: Yeah. The company has a response, and it's fascinating. I had a long talk on the phone with a man at the company who leads their work on dangerous organizations. Facebook was not comfortable with recording, so basically to recap what he said, he said, look; there is a big difference between an Alex Jones and this warlord. Jones is not a governmental official; neither is ISIS, which Facebook has built artificial intelligence to root out. But the warlord could be considered a representative of the state.
Hemeti started at the fringes of Sudanese society. He's since climbed into the mainstream. He was appointed interim vice president, second in command in Sudan. So Facebook is hesitant to intervene. Many governments are worried that the company is too powerful already. The Facebook official says if we go ahead and ban someone who it could be argued is the representative of a sovereign state, that could make many governments even more wary.
SHAPIRO: So Facebook is choosing not to get involved in this particular issue. Do they have an ethical responsibility, though?
SHAHANI: So plenty of people who've tracked extremism online say yes and that Facebook should know better. And they give the example of Myanmar, OK? As early as 2014, human rights groups there were telling Facebook extremist anti-Muslim leaders are exploiting your platform. They're building followings. They are dangerous people. Facebook ignored the warnings until, fast-forward, these extremists activated their followers online and called for attacks against Rohingya Muslims. Last year, Facebook acknowledged it was slow to respond to what became a genocide in Myanmar. So people watching Sudan say, Facebook, you should know better. Violent leaders in volatile regions start soft, flip the switch. That is a familiar playbook.
SHAPIRO: That's NPR's Aarti Shahani. Thanks a lot.
SHAHANI: Thank you.
AUDIE CORNISH, HOST:
And we should note Facebook is a sponsor of NPR.
(SOUNDBITE OF TRENTEMOLLER'S "MISS YOU")