Cyber Harassment, Misogyny and South Korean Online Sphere
Article by Katarzyna Szpargala.
Abstract: Early media scholars believed that the Internet could be a tool to combat racism, xenophobia, sexism, and other discriminatory attitudes and practices (Vickery & Everbach, 2018). The Internet opened doors to education, jobs, and information, and gave numerous groups and members of society a way to share their opinions and experiences. However, reality quickly proved that the Internet is just another space where the discriminatory attitudes and practices visible in society are mirrored. Stereotypes and discrimination based on race, sexuality, gender, and other characteristics are created and spread in both online and offline spheres.
Nowadays, the Internet and social media are essential parts of people’s lives. Our online lives are entwined with our “offline” lives, and the online world shapes our everyday experiences and state of mind. Thus, this article examines online misogyny and its harmfulness. A case study of online misogyny in South Korea is presented to explain this discriminatory practice in depth.
In the early days of the Internet, some scholars believed that it could be a tool to combat racism, xenophobia, sexism, and other discriminatory attitudes and practices (Vickery & Everbach, 2018). The possibilities and omnipresence of the Internet provided a great opportunity for numerous groups and members of society to share their opinions and experiences. Underrepresented and discriminated-against groups, such as racial minorities or LGBT communities, gained a space to express their voices and increase their presence in the online sphere. In other words, the Internet was supposed to be a place where various groups could meet and discuss, which would strengthen mutual understanding and respect. The idea itself was admirable, but reality quickly proved that the Internet is just another space where the hierarchies and discriminatory attitudes and practices visible in society are mirrored. Stereotypes and discrimination based on race, sexuality, gender, and other characteristics are created and spread in both online and offline spheres.
The Internet and digital platforms did not create these discriminatory attitudes and practices; however, they provide more ways for people to discriminate against and offend others (Vickery & Everbach, 2018). Without Twitter or Facebook, hateful comments would still be around, but these and similar platforms make such comments visible to a broader community and help them spread faster. Moreover, scholars originally believed that, due to its anonymous nature, the Internet would create an egalitarian space (Orgad, 2005, as cited in Kim, 2018). However, as Daniels (2008) noticed, the anonymity and mobility of the Internet instead mirrored the discriminatory attitudes visible in society and even helped extremists find like-minded people. Not only is it easier for people with the same opinions and beliefs to find and mobilize one another, but it is also easier for tormentors to find victims.
Additionally, the growing importance and popularity of online platforms give others access to our personal lives and blur the lines between the personal and public spheres. Offenders often exploit this easy access to personal information. It can be used to attack victims offline, for instance, by posting death or rape threats online and then publishing a victim’s home address or phone number. Online misogyny, transphobia, and racism have real-life consequences. Studies show that online harassment, such as death and rape threats, cyberstalking, unsolicited pornography, or hate speech, impacts victims’ lives and leads to offline harassment (Citron, 2009; Daniels, 2008). Online harassment is not unique to one place or community; it is a global problem. The Internet allows people to share their posts and comments across the globe, and once something is posted online, it stays there permanently.
There are many names for these discriminatory behaviors online – “online harassment,” “cyber harassment,” “cyberbullying,” “e-bile,” “generic trolling,” or in the case of online misogyny – “gendertrolling.” Regardless of the labels, this kind of behavior causes emotional distress for the victims, intimidates them, and ultimately excludes them from online spaces (Citron, 2009).
Women in Cyberspace – Gendertrolling
Danielle Citron (2009), a law professor at Boston University School of Law, argued that cyber harassment is a civil rights violation and that it is not treated as seriously as it should be. Moreover, Citron (2009) stated that women are most often the targets of cyber harassment, yet public opinion and officials do not take them seriously. For instance, women who talk about online gender discrimination are often called “too sensitive” and told that these posts and comments are “just jokes.” Women and other victims of cyber harassment usually feel isolated and embarrassed, which is one of the goals of the abusers. These attacks are designed to intimidate the victim (Citron, 2009). However, online harassment and violent behaviors are not isolated incidents, and they are not just jokes; they are attempts to control, embarrass, and terrorize the victims.
To help victims name and contextualize cyber harassment, Soraya Chemaly and Debjani Roy designed the Online Abuse Wheel (Domestic Abuse Intervention Project, 2015, as cited in Vickery & Everbach, 2018). The Online Abuse Wheel is based on the Power and Control Wheel created by the National Center on Domestic and Sexual Violence in the United States, which illustrates the relationship between physical abuse and other forms of abuse. The Online Abuse Wheel frames abusive online behaviors as part of a broader pattern of abuse and as attempts to control and terrorize victims; these attacks are thus not isolated incidents but part of a larger set of abusive behaviors (Vickery & Everbach, 2018). The wheel describes many types of cyber harassment, including gender-based slurs, doxxing (publishing personal information about an individual or organization), defamation, death or rape threats, hate speech, unsolicited pornography, stalking, and other abusive behaviors. Online harassment and violent behaviors thus take various forms; however, the goal in all cases is to intimidate, humiliate, terrorize, and silence women and other victims of online harassment (Vickery & Everbach, 2018).
As this paper focuses on online misogyny and gender-based discrimination, it is important to discuss gendertrolling. As Citron (2009) observed, women are often the targets of online harassment, and gender-based discriminatory posts and comments are particularly visible in the online sphere. However, as Karla Mantilla (2016) noticed, there is a difference in the way women and men are harassed online. She coined the term gendertrolling to describe how women are targeted online and identified seven characteristics and patterns that distinguish it from generic trolling. These seven characteristics are as follows (Mantilla, 2016):
1. Women are attacked for expressing their opinions online.
2. The cyber harassment features graphic sexualized and gender-based insults.
3. Gender-based online harassment often includes death or rape threats – many of these threats are credible.
4. Offensive posts and comments are not confined to one website or platform but are posted across multiple online platforms.
5. Gendertrolling is very intense and frequent – numerous threats and insulting posts per day or even per hour.
6. These threats and disrespectful messages can last for weeks, months, or even years.
7. Online misogynistic attacks are usually organized as concerted and coordinated campaigns.
Thus, according to Mantilla (2016), women are often attacked for exercising their right to express their opinions in the online sphere. Of course, men are also victims of cyber harassment, but the attacks men experience differ from those experienced by women. Men are usually insulted for their ideas or actions, but their right to express their opinion is not challenged (Vickery & Everbach, 2018). Moreover, men display more violent online behavior and target women with whose opinions or actions they disagree, including with death or rape threats and gender-based slurs. While men are also victims, their experience differs from that of women. According to research, gender-based cyber harassment is more severe and sexualized simply because women are women and are voicing their rights and opinions in a male-dominated sphere, the Internet (Duggan, 2014, as cited in Vickery & Everbach, 2018).
Gendertrolling in the South Korean Online Sphere
South Korea is one of the most technologically advanced countries in the world. The Internet, social media, content-sharing websites, and forums are an essential part of South Korean culture, especially for young people. However, according to Kim (2017), Korean online spheres have been constructed as gendered and male-dominated areas. Furthermore, as Kim (2018) noticed, online misogyny in Korean online spheres is not limited to extreme conservative platforms but is also visible on liberal, male-dominated platforms. Kim (2018) argued that “online misogyny must therefore be viewed not as exceptional extremist speech, but as a socially-constructed collective discourse that resonates with broader contexts in Korea” (p. 152).
Misogynistic posts and comments are easily noticeable on YouTube and on popular Korean online platforms such as DC Inside or Ilbe. In 2018, the Korean Institute for Gender Equality and Education monitored 1,600 posts and 16,000 comments shared on eight popular content-sharing websites and forums over seven days (Lee, 2018). The Institute’s report identified 90 cases of sexism and misogyny and 71 posts expressing hostility toward women. Examples included discrimination against plus-size women and descriptions of the “ideal” wife as “serving her husband whenever he wants to have sex” or “being ready for physical punishment if her husband’s shirt is not ironed every morning” (Lee, 2018).
One of the newest reports on online misogyny in South Korea was released in March 2020 by Moonshot Solutions, an online platform working on solutions to disrupt violence. Moonshot Solutions gathered data from YouTube, Google, and Naver, the most popular search engine in South Korea. According to the report, YouTube is a popular platform for sharing misogynistic attitudes and comments, and channels expressing these attitudes are widely popular and still growing. Videos on channels presenting misogynistic content received 344,861 views in 2016; 2,849,742 views in 2017; 11,547,367 views in 2018; and 96,516,870 views in 2019 (Moonshot Solutions, 2020). Among the viewers, 71% are men, and 64% are under the age of 35 (Moonshot Solutions, 2020). Moonshot Solutions (2020) also gathered anonymized search traffic data from both Google and Naver. On Naver, the most popular search terms were ‘Yoon Ji-oh,’ a friend of Jang Ja-yeon, an actress who took her own life after continuous sexual harassment; terms related to Sung Jae-gi, a known misogynist and founder of the Man of Korea group; and the ‘Burning Sun scandal,’ in which celebrities drugged and raped women and then shared videos of the assaults without the women’s consent. On Google, Koreans searched for more explicit content, for instance, the pornography site Soranet, which was shut down in 2016 for hosting thousands of nonconsensual spy-cam videos, known as molka (Moonshot Solutions, 2020).
Moreover, similar to Mantilla (2016), Kwon In-Soo, the president of the Korean Women’s Development Institute, noted the increasing number of misogynistic comments and posts attacking women simply for being women, without any pretext (Lee, 2018). Furthermore, the conservative website Ilbe, known for promoting misogyny, xenophobia, and other discriminatory attitudes, bans users from identifying themselves as female (Kim, 2018). The website introduced the so-called bomingban (a ban on boming-out). Boming-out refers to users who identify as women or are accused by other users of being women based on their comments or posts. The term combines the English term coming-out with the Korean prefix bo (from boji, “pussy”).
As Kim (2018) noticed, the discourse of increasing online hatred toward women is a response to changing gender relations and the economic crisis in Korean society. Men feel pressure and anxiety because of changing socio-cultural behaviors and gender relations, and they blame women and feminists for the turbulence in their lives. Moreover, derogatory terms for women and feminists, such as kimchi-nyeo/nyeon (kimchi woman/bitch), femi-nyeon (feminist bitch), or ggolfemi (feminazi), visible on websites and cyber platforms, normalize misogyny and misogynistic behaviors through their casual use (Kim, 2018).
This paper examined cyber harassment with a special focus on gendertrolling in South Korea. However, it is crucial to remember that online harassment is not a problem unique to South Korea but a global one. Women all over the world use digital platforms to amplify their voices, express their opinions, and share their experiences. Changing gender roles, the growing visibility of discriminated-against groups, and the anxiety of dominant groups have triggered and intensified online hate discourse. Moreover, online harassment and hate discourse have real-life consequences, as they often result in offline actions. For instance, Ilbe users often share female users’ URLs to attack them online and offline (Kim, 2018).
Besides providing platforms for communities to support each other, the Internet also creates communities whose main goal is to insult and attack others, often groups that are already discriminated against. However, the World Wide Web also helps create collective responses to this kind of discrimination, such as #myfirstharassment in Brazil, #WhyIStayed in the U.S., or #iamafeminist in South Korea. All of these are examples of online feminist activism that give a voice to victims and demonstrate that the problem lies not with the victim but with the offender, as well as with political and socio-cultural norms.
References
Citron, D. K. (2009). Law’s expressive value in combating cyber gender harassment. Michigan Law Review, 108(3), 373–415. Retrieved from https://repository.law.umich.edu/mlr/vol108/iss3/3.
Daniels, J. (2008). Race, civil rights, and hate speech in the digital era. In A. Everett (Ed.), Learning race and ethnicity: Youth and digital media (pp. 129–154). Cambridge, MA: The MIT Press. Retrieved from https://tandis.odihr.pl/handle/20.500.12389/20719.
Kim, J. (2017). #iamafeminist as the “mother tag”: Feminist identification and activism against misogyny on Twitter in South Korea. Feminist Media Studies, 17(5), 804–820. Retrieved from https://doi.org/10.1080/14680777.2017.1283343.
Kim, J. (2018). Misogyny for male solidarity: Online hate discourse against women in South Korea. In J. R. Vickery & T. Everbach (Eds.), Mediating misogyny: Gender, technology, and harassment (pp. 151–169). Cham, Switzerland: Palgrave Macmillan. Retrieved from https://link.springer.com/book/10.1007%2F978-3-319-72917-6#about.
Lee, C. (2018). Misogyny in Korean online communities a serious concern: Report. The Korea Herald. Retrieved from http://www.koreaherald.com/view.php?ud=20180731000789.
Mantilla, K. (2016, April 15). Understanding the difference between generic harassment and gendertrolling. Women’s Media Center. Retrieved from https://www.womensmediacenter.com/speech-project/understanding-the-difference-between-generic-harassment-and-gendertrolling.
Moonshot Solutions. (2020). Misogyny, molka, and victims of domestic violence: A Moonshot Solutions report into online misogyny in South Korea. Retrieved from http://moonshotcve.com/misogyny-online-south-korea/.
Vickery, J. R., & Everbach, T. (2018). The persistence of misogyny: From the streets, to our screens, to the White House. In J. R. Vickery & T. Everbach (Eds.), Mediating misogyny: Gender, technology, and harassment (pp. 1–27). Cham, Switzerland: Palgrave Macmillan. Retrieved from https://link.springer.com/book/10.1007%2F978-3-319-72917-6#about.