Tech companies forced to address online child abuse

Online child sexual abuse is at an all-time high, and new legislation will hold tech firms across the UK accountable for their role in tackling it. Similar moves have been held back in the past over privacy concerns, but the need for protection has never been greater.

Credit: Sky News.

By: Derry Salter.

On Wednesday 6 July, Home Secretary Priti Patel published an amendment to the draft Online Safety Bill which will force tech companies to roll out new technology to identify child sexual abuse material on their platforms.

The amendment promises both privacy and security in tackling the crime by requiring tech companies to identify child sexual abuse material posted or sent privately on their platforms. Such legislation is urgently needed, with offences of 'Sexual Communication with a Child' up 80% over the past four years.

Patel laid the blame at the door of large tech firms: 'We must all work to ensure criminals are not allowed to run rampant online and technology companies must play their part and take responsibility for keeping our children safe.'

To ensure all companies abide by the new laws, communications regulator Ofcom has the power to fine those that fail to comply up to £18 million or 10% of company turnover.

Encrypted messaging services such as Facebook Messenger and WhatsApp have previously come under scrutiny from government ministers over their plans to introduce end-to-end encrypted (E2EE) messaging. E2EE makes it difficult to monitor what is shared on these services, though tech companies have hit back, arguing there are better ways to police child sexual abuse.

Digital Minister Nadine Dorries responded quickly, insisting that technology companies should not 'blind themselves to these awful crimes happening on their sites.'

Director General of the National Crime Agency (NCA), Rob Jones, also lent support to the Digital Minister, arguing that monitoring online platforms can help tackle offline child sexual abuse, with abusers often using such services to share their material.

This isn’t the first time such technology has been proposed: Apple’s attempt to scan iPhone images for child sexual abuse imagery was delayed last year after privacy concerns.

The system, NeuralHash, aimed to identify images by analysing them locally on the phone rather than in a data centre, but concerns quickly arose over possible misuse and abuse of the software. Although government ministers such as former Secretary of State for Health and Social Care Sajid Javid welcomed the new technology, WhatsApp head Will Cathcart called the scheme 'very concerning', and the system was quickly put on hold.

Such fears remain prominent, especially with the Cambridge Analytica scandal of the past decade still fresh in many people’s minds. The draft legislation is set to bring further worries of another breach of privacy by tech giants like Meta.

However, with an estimated 850,000 people across the UK posing a serious risk to children, according to the NCA, this stringent legislation is clearly necessary. In 2021 alone, the NCA recorded over 33,000 online child sexual abuse cases, and throughout the year the Internet Watch Foundation blocked nearly nine million attempts across the UK to view online child abuse material.

With child abuse and exploitation on such a scale, the new legislation should go some way towards reducing the crime.