Big tech pools its efforts to protect children

With hundreds of millions of people largely confined to their homes during the coronavirus lockdown, the global economy is facing its biggest recession in a century.

The good news is that the same goes for the underworld economy: crimes like drug trafficking, theft and murder all fell significantly as social distancing measures disrupted criminal networks.

Sadly, the same cannot be said for virtual crimes like internet fraud and, most disturbingly, the online sexual abuse of minors, which according to Europol and other agencies has increased alarmingly during the period. Under lockdown, “sex offenders have increased their criminal activities in social media, via peer-to-peer networks and on the dark web,” Europol said in a statement. A rise in online sexual abuse offences such as solicitation and sextortion has been reported in some countries, and webcam videos featuring forced or coerced abuse of children have proliferated, Europol said.

In an average month, the National Center for Missing & Exploited Children (NCMEC) in the United States, an important Europol partner in the area of child protection, would report around 100,000 cases of online child exploitation to its European counterpart; in March, that number rose tenfold to 1 million. In total, the NCMEC received 2,027,520 reports of child sexual abuse on its CyberTipline in March this year, compared with 983,784 in the same month last year, an increase of 106 percent.

While the coronavirus lockdown has clearly caused a spike in online sexual abuse, an investigation by the New York Times reveals that this type of crime was already on a steep upward trend.

According to the NYT report, published at the end of 2019, there exists “an insatiable criminal underworld that [has] exploited the flawed and insufficient efforts to contain it.” Among those efforts the Times highlights the landmark Protect Our Children Act, passed in 2008, when there were around 100,000 reported images and videos of child sexual abuse online. Despite that law being in place, the number has exploded to 45 million today, a failure the Times attributes to a lack of cooperation between tech companies and to drastically underfunded, under-resourced authorities.

The 2008 law committed sixty million dollars a year to fighting this scourge; however, in a good year, the report says, less than half of that money was actually released. Today, one in ten Homeland Security agents works on child sexual abuse cases, but “we could double our numbers and still be getting crushed,” one agent told the Times. Another explained that they have been reduced to prioritising crimes committed against the youngest victims.

Although many of the forums that host child sexual abuse material are found on the dark web, the seedy underbelly of the internet that is inaccessible to conventional browsers, almost all of the 45 million photos and videos reported in 2018 were shared on the internet’s most popular platforms, such as Facebook Messenger and Dropbox.

With the system at breaking point, authorities believe the only way they can possibly keep up with the caseload is through advances in machine learning. Thankfully, these are on the way. For example, the Technology Coalition, formed by some twenty tech companies, including giants such as Google, Facebook, Microsoft, Amazon, Apple and Twitter as well as start-ups such as Yubo, recently announced an initiative to eradicate child abuse images from their platforms by, among other things, setting up a multi-million-dollar Research and Innovation Fund “to build crucial technological tools needed to more effectively prevent and work to eradicate Child Sexual Exploitation and Abuse (CSEA).”

Over the past decade, members of the Technology Coalition have developed innovative algorithms to assist in the fight against CSEA. PhotoDNA, for example, a collaboration between Microsoft and Dartmouth College, computes a digital fingerprint (“hash”) of an image and compares it against databases of known illegal images; it is used by organisations around the world to detect and report millions of instances of child sexual abuse online. Similarly, Google’s Content Safety API and Facebook’s photo- and video-matching technology dramatically improve the ability of NGOs and other tech companies to review CSEA content at scale.
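PhotoDNA’s own algorithm is proprietary, so the sketch below substitutes a much simpler “average hash” purely to illustrate the general hash-and-match idea: fingerprint each upload, then compare the fingerprint against a database of hashes of known abuse imagery. The function names and the contents of KNOWN_HASHES are illustrative placeholders, not part of any real system.

```python
# Illustrative sketch of hash-and-match detection; NOT the PhotoDNA algorithm.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 grayscale grid and encode each pixel as one
    bit: 1 if brighter than the mean, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# A real deployment would hold millions of hashes supplied by NCMEC and others.
KNOWN_HASHES = {0x8F3C_0000_0000_0000}  # placeholder value


def is_known_image(path: str, threshold: int = 5) -> bool:
    """Flag an upload whose hash sits within a small Hamming distance of any
    known hash, so minor edits (resizing, recompression) still match."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

Matching on distance rather than exact equality is what makes such systems robust to the small alterations offenders typically make before re-sharing images.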

Meanwhile, Yubo, a French social media platform aimed at users aged 13 to 25, has partnered with the digital identity verification specialist Yoti to identify people outside the app’s age range. Yoti’s tool estimates the ages of Yubo users from their photos, analysed by neural networks. With the app’s user base having tripled during the coronavirus pandemic, tools such as Yoti’s and the partnership with NCMEC are of critical importance for keeping its community safe.
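An age estimate only becomes useful once it drives a policy decision. The sketch below is purely hypothetical (it assumes some model or vendor service has already produced an estimate and an uncertainty margin) and shows how a platform restricted to 13- to 25-year-olds might act on it.

```python
# Hypothetical sketch of acting on an age estimate; the estimate itself would
# come from a real model or vendor API, which is not shown here.
from dataclasses import dataclass

MIN_AGE, MAX_AGE = 13, 25  # the app's permitted age range


@dataclass
class AgeEstimate:
    years: float   # point estimate produced by the model
    margin: float  # uncertainty band, e.g. +/- 2 years


def signup_decision(estimate: AgeEstimate) -> str:
    """Allow clear cases, block clear violations, and send borderline
    estimates to a human review or ID-verification flow."""
    low = estimate.years - estimate.margin
    high = estimate.years + estimate.margin
    if low > MAX_AGE or high < MIN_AGE:
        return "block"
    if low >= MIN_AGE and high <= MAX_AGE:
        return "allow"
    return "manual_review"


print(signup_decision(AgeEstimate(years=9.0, margin=1.5)))   # -> block
print(signup_decision(AgeEstimate(years=19.0, margin=2.0)))  # -> allow
print(signup_decision(AgeEstimate(years=13.5, margin=2.0)))  # -> manual_review
```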

Deployed in chat rooms, such tools can also be used to catch paedophiles in the act of grooming or soliciting their victims. The artificial intelligence platform Childsafe.ai, for instance, can scan millions of online conversations for child abuse content and intervene in cases where a minor is being solicited for sex.
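Childsafe.ai’s models are not public, but the general pattern (score each message for grooming signals, then escalate a conversation once any message crosses a threshold) can be sketched in a few lines. Everything below, including the tiny training set, is an illustrative placeholder rather than a description of any real system.

```python
# Minimal sketch of conversation risk scoring; the training data and threshold
# are placeholders, not real grooming-detection features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A real system would learn from a large, expert-labelled corpus.
train_messages = [
    "what school do you go to",            # potential grooming signal (placeholder)
    "don't tell your parents about this",  # potential grooming signal
    "send me a photo of yourself",         # potential grooming signal
    "did you finish the homework",         # benign (placeholder)
    "see you at practice tomorrow",        # benign
    "that movie was great",                # benign
]
train_labels = [1, 1, 1, 0, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_messages, train_labels)


def conversation_risk(messages):
    """Score a conversation as the highest per-message probability of risk."""
    return max(model.predict_proba(messages)[:, 1])


score = conversation_risk(["hey", "don't tell your parents we talk"])
print(f"risk score: {score:.2f}")  # above a tuned threshold -> escalate to human review
```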

What is becoming increasingly clear, then, is that online child sexual abuse has far outstripped the capacity of traditional methods of detection and prevention; a crime exacerbated by technology can only be defeated by the same means.