A brutal murder-suicide was live streamed on Instagram in Bosnia. A war-weary nation wonders how that could happen.

This article was originally published by Radio Free Europe/Radio Liberty and is reprinted with permission.

The sadistic nightmare was beamed across the Balkans and the world. Thousands of viewers on social media were reduced to horrified spectators as a Bosnian man live streamed the murders of his ex-wife and two other innocents before eventually turning the gun on himself.

The episode shocked a society already scarred by conflict and lingering trauma, prompting desperate calls for greater protections for women like Nizama Hecimovic, the young mother killed at point-blank range in the northeastern city of Gradacac, who had previously alleged abuse by the father of her baby daughter.

But it also rekindled frustrations in Bosnia-Herzegovina and elsewhere over the failure of social networks like Instagram, which along with WhatsApp is owned by Facebook parent company Meta, to prevent such live content from appearing or at least to take it down quickly.

Experts warn that algorithms like Instagram’s are “notoriously ineffective” at sniffing out danger and say the Gradacac episode underscores how Meta and other leading social network operators are failing miserably at moderation, particularly when it comes to videos and non-English content.

Maida Muminovic, executive director of Mediacenter Sarajevo, which is part of an EU- and UNESCO-backed alliance to curb harmful online content, said Meta’s response should have been faster.

“The circulation of such a video for hours on Instagram showed all the weakness of the mechanisms for reporting and removing disturbing content on social networks, the irresponsibility of Meta,” Muminovic said.

She said the Coalition for Freedom of Expression and Content Moderation in Bosnia and Herzegovina, which was officially launched in June, would request details of the case from Meta to aid in a detailed analysis of what went wrong.

“I have no words to describe what happened today in Gradacac,” said Nermin Niksic, the prime minister of the Bosniak-Croat Federation, one of the two entities that make up Bosnia. Exactly a month earlier, Niksic had shared a photo from the 1995 Srebrenica genocide that took place during the 1992-1995 Bosnian War and assured the country that the tragedy would “never be forgotten and never repeated against anyone.”

Algorithms Are ‘Notoriously Ineffective’

The video of 35-year-old bodybuilder Nermin Sulejmanovic announcing his intentions and then shooting his ex-wife at close range began at 10:20 a.m. As the grisly live stream continued into its third hour on Instagram, its viewership rose along with the death toll. It was only taken down at around 2:30 p.m., by which time Sulejmanovic had also killed a father and son and wounded three others as he was being pursued by police.

Nearly 12,000 people watched the killing live, and viewership peaked at around 15,000 at one point. The alleged killer’s Instagram profile, which has since been taken down, gained 300 or so new followers.

Sasa Petrovic, a cybercrime inspector for the federal police, told RFE/RL’s Balkan Service that he contacted Meta after being alerted to the live stream by Tuzla-area prosecutors at around 12:20 p.m. “I received the approval of the police from Tuzla…and I contacted the Meta administrator,” Petrovic said. “In about 20 minutes, the video was removed, as was the account.”

Petrovic said he and Bosnian police agencies have contact details for Meta and some other social networks that allow for freezing data and banning users but said he was personally unaware of any connection to, say, TikTok, where he estimated that dozens of accounts share graphic murder videos.

When contacted by RFE/RL on the day of the killings, Meta did not say how much time had passed between the removal request and the video’s removal. But in a statement obtained by RFE/RL’s Balkan Service, the social media titan said that it was “deeply saddened” by the “horrific” Gradacac attack and was “in contact with Bosnian authorities to help support their investigations.”

Asked to clarify its policies and practices by RFE/RL, the company said through a spokesperson that “when it comes to Live content, we use [artificial intelligence] classifiers to proactively identify potentially violating content in Live — as we do elsewhere across Facebook and Instagram — and we respond to reports from our community.”

Meta said it had “made several updates to help limit our services from being used in this way” after the notorious live streamed killings in Christchurch, New Zealand, in 2019.

Caitlin Chin, a fellow at the Washington, D.C.-based Center for Strategic and International Studies (CSIS) who researches technology regulation, said the Gradacac broadcast shows the pitfalls of algorithmic solutions.

“Social media algorithms are notoriously ineffective, especially for images or non-written content,” she told RFE/RL’s Balkan Service. She described a complicated process in which photos and video are converted into text that is then scanned for telltale signs, and concluded, simply, that “the software itself is just not adequate.”
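
Chin’s description corresponds to a broadly common pattern in automated moderation: sample frames from a stream, score each frame with a trained classifier, and escalate anything above a risk threshold to human review. The Python sketch below is purely illustrative of that pattern; the Frame type, the violence_score stub, and the threshold value are hypothetical placeholders and do not describe Meta’s actual pipeline.

```python
# Illustrative sketch of frame-sampling moderation; NOT a description of Meta's system.
# violence_score() is a stand-in for a trained image classifier.
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Frame:
    timestamp_s: float  # seconds since the stream started
    pixels: bytes       # raw image data sampled from the stream


def violence_score(frame: Frame) -> float:
    """Placeholder for a trained classifier returning a 0-1 risk score."""
    return 0.0  # a real model would run inference on frame.pixels


def review_queue(frames: Iterable[Frame], threshold: float = 0.8) -> List[Frame]:
    """Collect frames whose risk score crosses the threshold for human review."""
    return [frame for frame in frames if violence_score(frame) >= threshold]
```

In Chin’s framing, the scoring step is the weak link: if the classifier itself is “just not adequate,” especially for video and non-English context, the pipeline built around it cannot compensate.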

She also pointed to Meta’s recent staff reductions and its shifting focus, as well as to its internal content moderation policy, whose effectiveness she suggested is “fuzzy.”

“Unfortunately, one of our biggest lessons [from shared graphic videos and images] is just how challenging and consequential online content moderation can be,” Chin said.

And the chances of a miscue are likely higher for people in countries like Bosnia.

‘Completely Opaque’ Processes

Meta dedicates a massive amount of its attention to English-language content moderation, Chin said, “but this terrible tragedy reveals how problematic it is to neglect Bosnian and other non-English languages.”

James Waldo, a professor of the practice of computer science at Harvard University, said moderating content is “not a simple thing,” particularly at the scale of Meta, which was founded as Facebook in 2004.

He thinks Meta “is still coming to grips with the fact that they are a worldwide company…and have to deal with having representatives in all sorts of places that they may think of as niche but they are vital to those areas.”

“I don’t know what their staffing levels are like in Bosnia or Romania or any of the Eastern European countries,” Waldo said. “I know that they’re fairly lightly staffed in South Asia in various spots…yet they are present in all of these places where there are lots and lots of different languages and cultures that they need to deal with.”

But he called Facebook/Meta’s internal process on video removal “completely opaque” in a way that “means it’s very hard to trust Facebook to be doing the right thing.”

He said multiple jurisdictions further complicate the issue around, say, a video viewed in the United States through a server in France but originally posted in Bosnia.

Milos Jovanovic, from the Belgrade-based OpenLink Group, an IT agency that provides development and education services, expressed sympathy for Meta’s dilemma. He said real-time events like the Gradacac live stream are particularly challenging and that it will probably take years to adequately police what happens after someone starts a live video transmission.

“I’m convinced that it wasn’t possible to react and remove that content in a quick period of time,” Jovanovic said.

Claudio Agosti, founder of the Tracking Exposed platform and software for analyzing social network algorithms, said that the moderation process is resource-intensive and “optimizations mean that certain content is scanned more superficially or later.”

Sand dunes, he noted, are frequently mistaken for naked bodies, and food preparation can appear as violent patterns if a detection mechanism is “loose.”

“The content that will be uploaded tomorrow is by definition new, and artificial intelligence is trained on what has been shown before,” Agosti said, although he said that, as opposed to disinformation or hate speech, graphic violence or behavior “should be language-independent.”
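
Agosti’s point about “loose” detection is, at bottom, a threshold trade-off: lower the flagging threshold and a detector catches more genuine violence but also more sand dunes and cooking videos; raise it and the reverse happens. The scores below are invented purely to illustrate that trade-off and are not measurements of any real model.

```python
# Hypothetical detector scores for benign vs. violating clips, invented for illustration.
benign_scores = [0.20, 0.40, 0.55, 0.60, 0.75]   # e.g. sand dunes, food preparation
violent_scores = [0.50, 0.70, 0.85, 0.90, 0.95]  # genuinely violating clips


def flag_rates(threshold: float) -> tuple[float, float]:
    """Return (share of benign clips wrongly flagged, share of violating clips caught)."""
    false_positives = sum(s >= threshold for s in benign_scores) / len(benign_scores)
    true_positives = sum(s >= threshold for s in violent_scores) / len(violent_scores)
    return false_positives, true_positives


for t in (0.5, 0.7, 0.9):
    fp, tp = flag_rates(t)
    print(f"threshold={t}: flags {fp:.0%} of benign clips, catches {tp:.0%} of violating clips")
```

A loose (low) threshold floods reviewers with false positives; a strict (high) one lets novel violating content through, which is why Agosti stresses that models trained on yesterday’s content struggle with tomorrow’s.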

Darko Obradovic, from the Center for Strategic Analysis in Belgrade, suggested that after the brutal internecine wars of the 1990s, many people in the region have a particularly high tolerance for violence.

He said algorithms aimed at ensuring compliance with social media companies’ terms of use cannot exclude a human dimension.

“It tells us that we have all become numb and lost the basic values of a culture of security, in which it is far more valuable for everyone to share the video, to watch the video, instead of, for instance, calling the police,” Obradovic said. “And when these two dimensions intersect, then we get something called the execution of a criminal plan to which no one reacted.”

The grisly crime’s viewers could also come under the scrutiny of Bosnian authorities.

The Federal Interior Ministry said Minister Ramo Isak would ask federal police and the Interior Ministry of the Tuzla Canton, where Gradacac is located, to investigate anyone on social media who might have encouraged the killer or glorified the crime.

On August 14, thousands of Bosnians demonstrated in multiple cities, including the capital, Sarajevo, to demand greater protections for women like Hecimovic, who just days before her murder had sought police protection from her ex-husband.

The local UN mission expressed its shock, urged authorities to commit to stopping femicide, and said it was particularly “horrified by the fact that the murder of a female victim was live streamed via a social network.”
