Please be aware that this article mentions suicides and mental illness.
"My job is to prevent sinful images," says one of the subjects of the documentary "The Cleaners", the closing film of the Travelling Docudays UA 2019 programme in Lviv region. "It's like a virus to me; it slowly penetrates my brain."
“The Cleaners” puts a spotlight on what happens "behind" the Community Standards of Facebook and other big platforms. The film lets forbidden voices be heard — those of the employees who every day review thousands of photos and videos flagged as inappropriate on one platform or another. Their job is to decide, within seconds, whether to remove or keep each piece of content.
Thousands of uploads per day. Among them are horrifying images: violence, murder, child pornography, cruelty to animals, suicide, and more. Sometimes more complex judgments are required: is this extremist propaganda? Hate speech? Sexually suggestive cartoons of Trump and Erdogan are also subject to removal. Video evidence of war crimes in Syria. An image of a boy who died in a flood.
Moderators are not allowed to talk about their work because they have signed non-disclosure agreements. Their biggest fear is missing content that should have been removed. Moderators are allowed only three errors per month, out of tens of thousands of posts. Big platforms also try to keep human, manual work as invisible and secret as possible — from content moderators to the employees who transcribe human speech for voice assistants like Alexa or Siri when the algorithm can't handle it.
This work does not fit the image of effortlessness the tech giants try to project: everything should be accessible in a single click, without effort or hesitation. While Mark Zuckerberg is proud that Facebook has allowed everyone to express themselves (an opportunity that existed on the Internet long before Facebook appeared), algorithms and people are constantly working to create a unified, filtered reality on these commercial platforms and to set a universal communication format and universal taboos.
In his novel "Brave New World", Aldous Huxley depicts a "perfect" civilization where all the inhabitants' basic needs are met, everyone is in their right place, pleasure is always at hand, and any deviation from a carefree mood can be corrected with the euphoric drug "soma" that most of the characters take.
This world is sterile: it contains nothing unpleasant to the eye, nothing that could distract from daily carelessness. "Images that may shock or scare viewers" is one of the points in Facebook's moderator guidelines describing content to be removed.
Facebook and other big platforms hire content moderators through contractors. As of February 2019, they numbered about 15,000 people. The working conditions involve viewing hundreds to thousands of images daily, content that leads moderators to panic attacks, depression, post-traumatic stress disorder, and even suicide. To relieve stress, employees at one of the offices had sex at work and used drugs and alcohol. The price of the "sterile" content of the large platforms is the outsourced work of people who look at the dirt instead of us.
How acceptable to us is outsourced work that consists in filtering what we will see? Are there other ways to build quality communication in the digital environment?
These and other questions were discussed after the screening.

Communication Rules for the Whole Internet?

Who should set the rules of conduct and communication for the billions of Internet users, and how?
Before looking for an answer to this question, we have to ask another one — what is the Internet? Simply put, it is a network of networks: it integrates various computer networks into one "supernetwork" based on open standards of data exchange. Thanks to this, you do not have to buy separate access to watch videos, create an email account separately, or pay separately for video calls with loved ones. All of it belongs to one network, where any public IP address can exchange data with any other.
The principle of a "network of networks" is also reflected in basic Internet technologies such as email and the web: you are free to send emails to people who use another email provider, and the ability to link independent web pages to one another is a fundamental principle of the web. The Internet does not have a single owner who can set binding rules of expression for everyone.
That is why each online community develops its own rules of communication and interaction and enforces them itself. Often, experienced members of the community are chosen as moderators. Centralized commercial platforms like Facebook, however, break this principle: they try to concentrate information sharing within a single website or service, gather as many people as possible there, and monetize their attention by selling it to advertisers.
This is no longer a network of networks; it is just a website owned by specific people. But one of these private websites has two billion registered users, and much of their communication and interaction takes place there. The platform becomes the window through which users look at the open Internet. For all of them, Facebook seeks to set the same standards of conduct and rules of expression, shaped not by a specific thematic community but imposed from above by the platform according to its own priorities. In these circumstances, the mechanical, routine labour of moderators becomes an inevitable tool for enforcing uniform standards on the platform's two billion users.
Back to the Future: Decentralization
Although only a few websites are at stake, the managers of these websites are increasingly seen as the "bosses of the Internet" who shape what we can and cannot write or read online. Search engines and platforms often make concessions to authoritarian (and not only authoritarian) states by filtering search results, blocking content, and even changing the picture of reality for the citizens of those states.
All of this happens due to the unique monopoly position of these platforms, which enables them to influence how millions of people interact with information.
However, people's ability to independently create horizontal, decentralized digital networks is the very basis of the Internet. Even before the Internet was available to the general public, people on different continents were building their own independent networks over telephone lines. The Fidonet network, for example, had millions of users and about 40,000 nodes in the mid-1990s; people in Ukraine joined it as early as the beginning of that decade.
In Cuba, where access to the Internet is still severely restricted by the authorities and imports of network equipment are banned, grassroots independent networks have emerged in Havana and other cities, connecting tens of thousands of ordinary people's computers. These networks even have their own websites. And all of this is disconnected from the global Internet for political reasons.
The ability to create one's own decentralized networks also manifests itself in the world of so-called social networks. While Facebook or Twitter allow interaction only within their own platforms, decentralized social networks embody the idea of a "network of networks": independent servers whose users are free to communicate with each other even if they are registered on different servers. For instance, an account on Mastodon (a microblogging platform) lets you make friends with and comment on the posts of people on Pixelfed (an Instagram counterpart), PeerTube (a decentralized video platform), Funkwhale (a music community), or any other compatible network. Collectively, these networks are called the Fediverse.
Under these conditions, moderation is more horizontal and distributed. For example, when Gab, an online community of people with far-right views, started using Mastodon software on its server, administrators of other Mastodon servers gradually added Gab to "blacklists" of nodes their own servers would not federate with.
In this way, the network collectively displaces things that are not acceptable.
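The server-level blocking described above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration (all names are invented); real Fediverse software such as Mastodon offers administrators more nuanced options, including limiting a domain rather than fully suspending it:

```python
# Hypothetical sketch of federation-level blocking on a Fediverse server.
# A server keeps its own blocklist of domains; an incoming activity is
# accepted only if the sender's domain is not on that list.

BLOCKED_DOMAINS = {"gab.com"}  # domains this server refuses to federate with


def should_accept(activity: dict) -> bool:
    """Return True if the activity's sender comes from a non-blocked domain."""
    sender = activity.get("actor", "")  # e.g. "https://gab.com/users/alice"
    domain = sender.split("//")[-1].split("/")[0]  # extract the host part
    return domain not in BLOCKED_DOMAINS


print(should_accept({"actor": "https://mastodon.social/users/bob"}))  # True
print(should_accept({"actor": "https://gab.com/users/alice"}))        # False
```

Because every server maintains its own list, no central authority decides what the whole network sees; each community draws its own boundaries.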
On the other hand, the search for new mechanisms for protection against unwanted online interaction is continuing.
Virtual vs Real
One of the tensest scenes in “The Cleaners” is one moderator's recollection of a case in which he and his team watched a live broadcast of a suicide. He says that according to the rules they could not block the broadcast, as doing so would be an "abuse of authority" by moderators: suicide videos are banned on the platform, but something that has not yet become a suicide does not fall into a prohibited content category.
Many people commented on the broadcast: some asked the man not to do it; others joked and encouraged him to carry out his plan. In the end, he did.
The question remains open: if the moderators had blocked the broadcast earlier, would it have prevented the suicide? Or would it just have hidden it from public view?
Does the centralized removal of online content depicting violence contribute to reducing physical violence?
To what extent can attempts to create corporate, state, or interstate systems for tracking and restricting expression on the Internet distract our attention from the real problems of the physical world we live in, rather than encouraging us to look for the causes and sources of this negative content? There are no simple answers to these questions. But that does not mean we should not look for them.
P.S. Neither the authors of the film, nor the author of this article, nor the organizers of the screening believe that moderation of digital environments should be stopped. But the author of this text is convinced that we should not delegate this function wholly to corporations, and that as citizens we should feel responsible for the rights we have on the Internet.
Yurii Bulka for Travelling DOCUDAYS UA.
This text is licensed under a Creative Commons Attribution-ShareAlike 4.0 International.
In December 2019, Twitter declared an interest in creating a new decentralized social media protocol; however, it is too early to say that Twitter has abandoned its centralized architecture.