In the midst of the coronavirus crisis, child sexual abuse material (CSAM) is now recognised as another global epidemic. CSAM includes images, video, text and drawings of children being abused and exploited (previously called “child pornography”). CSAM has been a significant social problem for at least fifty years; however, CSAM victims and survivors have received an uncertain and insufficient response from state authorities. In the 1980s and 1990s, child victims and adult survivors disclosing that images or video had been made of their abuse were folded into the “false memory” debates and subjected to pervasive disbelief. The argument that victims of sexual exploitation were instead suffering from “false memories” stalled the development of effective state responses to sexual exploitation.
However, the advent of the internet has made the extent of demand for CSAM undeniable. Over the last decade, reports of CSAM to US authorities have been increasing by 50% per year, driven by new technologies that increase the ease and anonymity of CSAM consumption (Bursztein et al., 2019). Large online surveys of men find that between 2.2% and 4.4% have intentionally viewed CSAM of prepubescent children (Dombert et al., 2016; Seto et al., 2015). With children and adults spending more time online than ever due to coronavirus lockdowns, CSAM and online sexual exploitation have spiked. Finally, governments and the technology industry are being forced to grapple with the global black market in child sexual exploitation.
In its recent ground-breaking coverage, The New York Times revealed that US authorities received 70 million reports of suspected CSAM in 2018–2019. New York Times reporters Michael H. Keller and Gabriel J.X. Dance were the recipients of the 2020 ISSTD Media Award in the written category for their investigative series “Exploited”, which documented, for the first time, the failure of state authorities and technology companies to tackle the CSAM epidemic. Their articles focused on victim voices and impact, acknowledging the anxiety and hyper-vigilance of survivors who know that images and video of their abuse continue to circulate widely.
In the aftermath of this reporting, there have been major shifts in the willingness of policy-makers to hold technology companies to account for escalating reports of CSAM. In March 2020, US Attorney General William P. Barr and other senior government figures met at the White House with the Phoenix 11, an advocacy group of CSAM survivors. The Phoenix 11 described how CSAM victimisation has affected their mental health and physical safety. Alongside ministers from Australia, Canada, New Zealand and the United Kingdom, Attorney General Barr launched the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse. The Voluntary Principles set out a baseline for internet companies to deter the use of online services for sexual offending, and act as a precursor to more significant government regulation.
In June 2020, the US Senate Judiciary Committee voted unanimously to remove liability protections from online companies that take inadequate action to remove CSAM from their services. This bill has strong bipartisan support and seeks to walk a fine line between protecting the privacy rights of internet users and incentivising the technology industry to act decisively against CSAM. The United Kingdom is developing a legislative approach to “online harms” in which internet companies will have a statutory duty of care to internet users, with children recognised as a particularly vulnerable group.
The technology industry has been vocal in denouncing online child sexual exploitation; however, it has often been unwilling to prioritise child protection over its business prerogatives. Nonetheless, there are positive signs that Silicon Valley has sensed that governments are no longer turning a blind eye to the extent of online child abuse. This year, the Technology Coalition, a consortium of leading technology companies, announced a “Plan to Combat Online Child Abuse” that includes research and development funds for “technological tools” to prevent online sexual exploitation and the publication of an annual progress report charting industry efforts.
Increased awareness of CSAM is bringing to light many of the troubling dynamics disclosed by clients in the complex trauma field, including organised sexual abuse, parental perpetration and sadistic abuse. In a recent survey of 150 adult survivors of CSAM, half of participants reported having a dissociative disorder. Half of participants described organised child sexual abuse, most often identifying one or both parents as the primary perpetrators. Analysis of CSAM photo series finds that images of fathers abusing their prepubescent daughters are the most highly traded and in-demand illegal material. Images of child torture and the abuse of infants are widely available online.
Government and industry action to curb the trade in CSAM is welcome and long overdue. However, as clinicians in the fields of complex trauma and dissociation know all too well, the problem of child sexual exploitation is not primarily a technological one. The children in abuse images and videos have been surfacing in mental health settings for decades. Trauma therapists have much to offer the expanding response to CSAM. As exploited children and adult survivors come forward in increasing numbers, the mental health and psychosocial impacts of CSAM are becoming a global policy priority. State investment in victim-focused measures, including mental health care, is central to a just and comprehensive CSAM response.
Bursztein, E., Clarke, E., DeLaune, M., Elifff, D. M., Hsu, N., Olson, L., Shehan, J., Thakur, M., Thomas, K., & Bright, T. (2019). Rethinking the detection of child sexual abuse imagery on the Internet. Paper presented at The World Wide Web Conference.
Dombert, B., Schmidt, A. F., Banse, R., Briken, P., Hoyer, J., Neutze, J., & Osterheider, M. (2016). How common is men’s self-reported sexual interest in prepubescent children? The Journal of Sex Research, 53(2), 214-223.
Seto, M., Hermann, C. A., Kjellgren, C., Priebe, G., Svedin, C. G., & Långström, N. (2015). Viewing child pornography: Prevalence and correlates in a representative community sample of young Swedish men. Archives of Sexual Behavior.