Content Moderation and Digital Manipulation/ Addiction
After studying content moderation and digital manipulation/addiction, it has become quite clear that the two are connected. Nearly everyone in the country uses social media. If someone doesn't use Instagram, or at least Facebook, it is often seen as a red flag that the individual is out of touch.
In my illustration, you can see that I have drawn the internet hierarchy: the moderators at the bottom, making social media possible for us to use; users in the middle, becoming addicted to social media; and at the top, big tech companies plotting ways to make social media more addictive.
I used Procreate to illustrate my image.
Digital Manipulation/Addiction
If you use social media, then you most likely have an addiction to it. Nearly 210 million individuals are, in some form, addicted to social media (MediaKix, 2020). This is because social media is engineered to be addictive. Features like the infinite scroll and double-tapping to like images have rewired our brains to become dependent on the instant gratification they provide. Features like notifications when someone comments on or tags you in a post, or when someone is "typing," give our brains small doses of dopamine, which make us crave social media more and more (Girish, 2020).
Because social media is engineered to make its users addicted, users' attention is easy to monetize. Not only can social media manipulate us into becoming addicted to it, it can also manipulate us into watching advertisements, clicking on certain items, and even buying products from online ads. Big tech giants such as Google track our every move, collecting data. They build online profiles of their users, and use those profiles to craft advertisements they know we will be vulnerable to. The tech giants are so good at building profiles and algorithms from the data they collect that they know exactly when and where to strike (The Social Dilemma).
Content Moderation
So not only are users of social media manipulated into addiction, they are also manipulated into taking actions such as interacting with advertisements. Underneath all of this is content moderation. Without content moderation, none of the activity above would be possible. Content moderators truly carry social media on their backs.
More than one hundred thousand people work in content moderation (Chotiner, 2019). They are responsible for filtering through all of the content that is posted to social media and removing obscene and explicit material before viewers even know it existed. Some work in Silicon Valley, but many work in places such as India and the Philippines, where moderators are paid extremely little. Moderators maintain a speed of "2,000 photos per hour" (The Moderators) and witness horrible things: nudity, murder, child pornography, bestiality, and more. Obviously, one cannot view these kinds of things for hours on end and just be "fine." Many moderators suffer from trauma or PTSD and develop substance addictions. One moderator even brings a gun to work and sleeps with a gun beside him because he no longer trusts humanity (Albert, 2019). Some moderators can't even shake hands or tolerate any form of human contact because they know the darkness people are capable of (All Things Considered).
You may be wondering, "Well then, why don't we stop using moderators?" But as tech giants continue to manipulate users into addiction, social media and the internet have become a necessity, and moderators are the price we must pay. In order to feed our addictions to Twitter and YouTube, moderators must be in place to protect us from viewing obscene images.