Juri Han

Protecting Children in the Attention Economy

According to 2019 research conducted by Thorn, an international anti-human-trafficking organization, "40% of kids aged 13-17" agreed that it was normal to share nudes among peers, and 1 in 5 girls in that age group reported having shared nudes. These figures most likely understate the true scale, given the sensitivity of the subject.

While this data might frighten you into scrutinizing your child about whether or not they participate in such activities, we must take a step back and ask: what is the actual problem? Is it wrong that children are deciding to explore their sexual identities online, or have adults been moving irresponsibly toward "innovation"? Are we questioning the ethics behind these tech companies' economic models? And how should we go about perceiving sexuality and nudity among the new generation? While individual caution is always required, in a world where self-generated child sexual abuse material (SG-CSAM) is growing rapidly, there needs to be a re-evaluation of the attention-based economic incentives within major social media platforms if we are to effectively protect our children.

To understand what causes SG-CSAM to spread, we must first address the connection between the development of Web 2.0 technologies and the circulation of CSAM. By the late 1980s, child pornography had been nearly eliminated, thanks to improved law enforcement that cracked down on CSAM traded through the mail. With the development of the internet and social media, however, the trade of CSAM exploded. According to a 2019 report by The New York Times, "tech companies reported over 45 million online photos and videos of children being sexually abused" in 2018, more than double what was found the previous year (Keller & Dance, 2019). In 2021, the CyberTipline "received 29.3 million reports of suspected child sexual exploitation, an increase of 35% from 2020" (NCMEC).

Beyond this, the current economic models of major social media platforms may actually incentivize users to create and exploit SG-CSAM content. Big tech platforms are attention merchants, providing direct economic incentives for users who create viral content. When businesses are built on attention, their major revenue streams come from advertisers who want maximum exposure and from user data that leads to sales. In fact, "a platform whose main source of revenue is from advertising may not benefit from better technology, because less accurate technology creates a porous community with more eyeballs" (Yildirim & Zhang). Viral content evokes "greater intensity of feeling"; incentivizing content based on its potential for virality can therefore encourage provocative material such as SG-CSAM to proliferate (Ignatius, 2015).

Another issue is the illusion of safety these platforms create through features such as expiring messages, which encourage children to feel protected when creating SG-CSAM content. Major platforms such as Snapchat, Instagram, and Facebook Messenger offer timed content, creating the impression that shared images will not be traceable or retrievable after a certain period of time (Thorn). These features appear to resolve concerns about privacy and content management, but they do not actually prevent the spread of sensitive content. WhatsApp, a messaging app owned by Meta, clarifies in its help center that if disappearing messages are forwarded to a chat without that feature, or if a user backs up the chat before a message has disappeared, the messages remain accessible (WhatsApp). Instagram allows users to take screenshots of temporary images. On Snapchat, users are notified when photos they have sent are screenshotted; however, turning on Airplane Mode before screenshotting bypasses this check and has become a well-known trick. While the spread of CSAM and the inadequacies of these features cannot be directly linked, we should question whether companies are only partially addressing privacy concerns with these porous approaches.

Lastly, the easy accessibility of the internet and the lack of monitoring of predatory behavior online are also central to this problem. In 2020, police and journalists in South Korea uncovered the Nth Room case, an abuse scandal in which nonconsensual sexual content, some of it depicting minors, was shared via the messaging app Telegram among "anywhere between 60,000 and over 100,000 users." Of the minors involved, 71.3% were found to have been groomed by adults they met through social media platforms (Hankook Ilbo, 2022). Many of these children find it hard to seek a caregiver's help because of the social stigma placed on victims of abuse.

Companies like Meta have responded to these issues by "allocating 5% of the firm's revenue" to content moderation, a fraction of the 20% spent on R&D and the 12% spent on sales and marketing. It is no surprise that over 97% of their revenue comes from advertising (Cuofano, 2022).

On a positive note, immense efforts are being made, privately and at scale, to curb the spread of CSAM. Thorn has built software that reduces the time needed to identify an at-risk child by 65 percent and has deployed it in 38 countries around the world. The software implements hash matching, a popular method provided by the National Center for Missing & Exploited Children (NCMEC), which uses machine learning to analyze a CSAM image and convert it into a specific number set; only images that appear nearly identical to the original produce the same number, regardless of file size. Julia Cordua, Thorn's CEO, describes in her TED Talk how the image-based tech company Imgur has used this software not only to reduce CSAM content but also to trace it back to the perpetrator and report them to the necessary institutions. Moreover, while the increasing number of CSAM reports online may indicate a larger number of perpetrators, it could also be a sign that electronic service providers (ESPs) are participating more actively in reporting, detecting, and removing CSAM. As Google's safety and security page explains, because the hash system is based on aggregated data, the more companies get involved, the faster the system can learn (Google). Furthermore, in September 2022, Meta reported that it was working on ways to protect users from receiving unwanted nudes and messages on Instagram, and shared an early image of the tool.
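To make the mechanism concrete, the sketch below shows, in very simplified Python, the general idea behind hash-based matching: an image is reduced to a compact numeric fingerprint, and fingerprints that differ by only a few bits are treated as near-identical. This is an illustrative "average hash" written for this essay, assuming only the Pillow imaging library; it is not the actual technology deployed by Thorn, NCMEC, or Google, and the "known hash" value shown is a placeholder, not real data.

from PIL import Image

# Illustrative sketch only: a simple "average hash," not the proprietary,
# far more robust hashing systems that platforms actually use.

def average_hash(path, size=8):
    # Shrink to size x size, grayscale, then set one bit per pixel
    # depending on whether it is brighter than the average pixel.
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(a, b):
    # Number of bits on which the two fingerprints differ.
    return bin(a ^ b).count("1")

KNOWN_HASHES = {0x9F1B3C5A7E2D4801}  # placeholder value, not real data

def is_known_match(path, threshold=5):
    # Flag an upload if its fingerprint is nearly identical to a known one,
    # regardless of file size or minor edits.
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)

In practice, the matching step is the simple part; the real value lies in the shared, aggregated database of known hashes described above, which is why broader participation by companies makes the system more effective.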

It is hard to say that attention economies are completely evil in nature. However, when platforms fail to take responsibility for adequate content moderation, children's safety is left behind as society rushes toward new technologies without precaution. While a catch-all solution for removing SG-CSAM does not currently exist, and sexual behavior online will inevitably continue, perhaps we should also consider other ways to educate children on how to protect their sexual privacy online.

Cuofano, G. (2022, October 27). How does Facebook [Meta] make money? Facebook business model analysis 2022. FourWeekMBA. Retrieved December 15, 2022, from https://fourweekmba.com/how-does-facebook-make-money/

CyberTipline data. National Center for Missing & Exploited Children. (n.d.). Retrieved December 15, 2022, from https://www.missingkids.org/gethelpnow/cybertipline/cybertiplinedata

Hankook Ilbo (Ed.). (2022, March 24). 디지털 성착취물 징역형 비율 2% → 53.9% 급증 ... 'n번방 사태' 효과 [Prison-sentence rate for digital sexual exploitation material surges from 2% to 53.9% ... the effect of the 'Nth Room case']. Hankook Ilbo. Retrieved December 15, 2022, from https://m.hankookilbo.com/News/Read/A2022032410240005431

How hash matching technology helps NCMEC. Google Safety Centre. (n.d.). Retrieved December 15, 2022, from https://safety.google/intl/en_uk/stories/hash-matching-to-help-ncmec/

Ignatius, A. (2015, September). Why some videos go viral. Harvard Business Review. Retrieved December 15, 2022, from https://hbr.org/2015/09/why-some-videos-go-viral

Keller, M. H., & Dance, G. J. X. (2019, September 29). The internet is overrun with images of child sexual abuse. What went wrong? The New York Times. Retrieved December 15, 2022, from https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html

Yildirim, P., & Zhang, Z. J. (n.d.). How social media firms moderate their content. Knowledge at Wharton. Retrieved December 15, 2022, from https://knowledge.wharton.upenn.edu/article/social-media-firms-moderate-content/