Content warning: discussion of the sexual abuse of children


Related: Is preventing child abuse a plausible Cause X?

Recently, the New York Times ran a long piece (a) about the rise of child pornography.

The trend is grim:

  • In 1998 – 3,000 reports of child sexual abuse imagery online
  • In 2008 – 100,000 reports of child sexual abuse imagery online
  • In 2014 – 1 million reports of child sexual abuse imagery online
  • In 2018 – 18.4 million (!) reports of child sexual abuse imagery online
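
For a rough sense of scale, here's a back-of-the-envelope sketch of the compound annual growth those milestones imply (my own arithmetic on the figures above, not numbers from the Times or NCMEC):

```python
# Rough compound-annual-growth check on the report counts listed above.
# The counts are the milestones from the post; the arithmetic is illustrative.
reports = {1998: 3_000, 2008: 100_000, 2014: 1_000_000, 2018: 18_400_000}

years = sorted(reports)
for start, end in zip(years, years[1:]):
    factor = reports[end] / reports[start]
    cagr = factor ** (1 / (end - start)) - 1
    print(f"{start}-{end}: ~{factor:,.0f}x total, ~{cagr:.0%}/year")

# Prints roughly: 33x (~42%/yr), 10x (~47%/yr), 18x (~107%/yr) -
# i.e. the growth rate itself accelerated sharply in the most recent window.
```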

This is obviously horrendous. And if the relationship between childhood trauma and negative later-life outcomes found in the ACE study holds, this trend could have very large downstream consequences.

As a refresher, here's a passage about the ACE study from The Body Keeps the Score:

The first time I heard Robert Anda present the results of the ACE study, he could not hold back his tears. In his career at the CDC he had previously worked in several major risk areas, including tobacco research and cardiovascular health.
But when the ACE study data started to appear on his computer screen, he realized that they had stumbled upon the gravest and most costly public health issue in the United States: child abuse.
[Anda] had calculated that its overall costs exceeded those of cancer or heart disease and that eradicating child abuse in America would reduce the overall rate of depression by more than half, alcoholism by two-thirds, and suicide, IV drug use, and domestic violence by three-quarters. It would also have a dramatic effect on workplace performance and vastly decrease the need for incarceration.

It's not clear whether the increase in child abuse imagery circulating online implies an increase in child abuse, though that seems very plausible. (Some anecdotal evidence for increased incidence of abuse, from the Times piece: "In some online forums, children are forced to hold up signs with the name of the group or other identifying information to prove the images are fresh.")

Why is this happening?

Comments

I doubt porn-related child abuse is growing.

NCMEC says that reports of child porn are growing, but that could easily reflect growth in reports per posting, postings per image, or images per activity. NCMEC just *counts* reports, which are either a member of the public clicking a "report" button or an algorithm finding suspicious content. They acknowledge that a significant part of the rise is from broader deployment of such algorithms.

Similarly, the fraction of porn-producing activities which involve traumatic abuse is unclear, and is likely declining, judging by common anecdotes of sexualized teenage selfies. I realize anecdotes are weak evidence at best, but producing such images is becoming easier, and puberty ages are dropping, so I'll stand by my weak claim.

NCMEC cites IWF as saying that "28% of CSAI images involve rape and sexual torture", but I cannot find a matching statement in IWF's report. The closest I find is "28% of these reports [from members of the public] correctly identified child sexual abuse images," but IWF seems to regard any sexualized imagery of an under-18-year-old as "abuse", even if no other person is involved.

In any case, the IWF report is from 2016 and clearly states that "self-produced content" is increasing, and the share of content which involves children under 10 is decreasing (10 is an awkward age to draw a line at, but it's the one they reported on). Likely these trends continued into 2018.

On the meta level, I note that NCMEC and IWF are both organizations whose existence depends on the perceived severity of internet child porn problems, and NYT's business model depends on general dislike of the internet. I don't suspect any of these organizations of outright fraud, but I doubt they've been entirely honest either.

> NCMEC says that reports of child porn are growing, but that could easily reflect growth in reports per posting, postings per image, or images per activity. NCMEC just *counts* reports, which are either a member of the public clicking a "report" button or an algorithm finding suspicious content. They acknowledge that a significant part of the rise is from broader deployment of such algorithms.

Good point. I wonder:

  • Did algorithm deployment expand a lot from 2014 to 2018? (I'm particularly boggled by the 18x increase in reports between 2014 and 2018)
  • What amount of the increase seems reasonable to explain away by changes in reporting methods?
    • About half? (i.e. remaining 2014-to-2018 increase to be explained is 9x?)
    • 75%? (i.e. remaining 2014-to-2018 increase to be explained is 4x?)
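
Spelling out the arithmetic behind those parentheticals (a minimal sketch; the "explained away" fractions are the hypothetical ones from the bullets, not estimates from any report):

```python
# Illustrative only: how much of the ~18.4x growth in reports (2014 -> 2018)
# would remain to be explained if some fraction of it is an artifact of
# broader automated detection and other reporting changes.
observed_factor = 18.4  # ~1M reports in 2014 -> 18.4M in 2018

for explained_away in (0.5, 0.75, 0.9):
    residual = observed_factor * (1 - explained_away)
    print(f"{explained_away:.0%} artifactual -> ~{residual:.1f}x left to explain")

# 50% -> ~9x, 75% -> ~4.6x, 90% -> ~1.8x
```
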
From the NCMEC report:
A major contributor to the observed exponential growth is the rise of proactive, automated detection efforts by ESPs [electronic service providers], as shown in Figure 3. Since then, reporting by ESPs increased an average of 101% year-over-year, likely due to increasing user bases and an influx of user-generated content. While automated detection solutions help ESPs scale their protections, law enforcement and NCMEC analysts currently contend with the deluge of reports in a non-automated fashion as they are required to manually review the reports.
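
As a rough consistency check (my own arithmetic, not a figure from the NCMEC report), growth of 101% year-over-year compounds to roughly the observed 18x over four years:

```python
# Does "an average of 101% year-over-year" ESP reporting growth roughly
# account for the ~18.4x rise between 2014 and 2018? (Illustrative check.)
yoy_growth = 1.01            # +101% per year, i.e. a bit more than doubling
years = 2018 - 2014          # four year-over-year steps
implied_factor = (1 + yoy_growth) ** years
print(f"~{implied_factor:.1f}x over {years} years")  # ~16.3x vs ~18.4x observed
```

So the quoted year-over-year growth in ESP reporting is roughly consistent with the headline 18x figure, though it doesn't by itself say how much of that growth is automation versus underlying activity.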

I'd only be surprised if this trend were different from the trend in the total amount of pornography available online. The internet allows people to coordinate better and increases demand for lots of products and industries - including illegal and immoral ones - especially where the product is images.

I don't think the amount of porn overall increased 18x from 2014 to 2018.

Hard to find a perfect statistic for this... PornHub reported 18.4 billion visits (a, SFW) in 2014 and 33.5 billion visits (a, SFW) in 2018.

So a ~2x increase in visits from 2014 to 2018.

My suspicion is that we are seeing a "one-time" increase due to a better ability to create and share child abuse content. That is, my guess is the incidence of child abuse is not changing much, but its visibility is, because it's become easier to produce and share content featuring actions that were already happening privately. I could imagine some small (let's say 10%) marginal increase in abuse incentivized by the ability to share, but on the whole I expect the majority of child abusers are continuing to abuse at the same rate.

Most of this argument rests on a prior I have that unexpected large increases like this are usually not signs of change in the thing we care about, but of changes in secondary things that make the primary thing more visible. I'm sure I could be convinced this was evidence of an increase in child abuse proportionate to the reported numbers, but lacking such evidence I think it far more likely that it's mostly explained by the increased ease of producing and sharing content.

I don't think this explains the 18x increase between 2014 and 2018. Communication technology didn't change much in that timeframe, and it'd be surprising if child porn communities substantially lagged behind the mainstream in terms of their tech (there are heavy incentives for them to stay up-to-date).

> Communication technology didn't change much in that timeframe

I find it plausible that the de facto availability of secure communication channels lowered the technical bar enough that thresholds were passed in that time frame.

Yeah, maybe. Messenger's user base doubled over that timeframe, though it was already at 600 million users in early 2015.

Facebook did roll out opt-in end-to-end encryption for Messenger in late 2016, which is a possible inflection point for this.

Also, most (!) of this stuff circulates through FB Messenger, so plans to encrypt Messenger end-to-end have a dark implication. From the Times piece:


And when tech companies cooperate fully, encryption and anonymization can create digital hiding places for perpetrators. Facebook announced in March plans to encrypt Messenger, which last year was responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material, according to people familiar with the reports.