The $300 System In The Fight Against Illegal Images


A security researcher has built a system for detecting illegal images that costs less than $300 (£227) and uses less power than a lightbulb.

Christian Haschek, who lives in Austria, came up with the solution after he discovered an image showing child sex abuse had been uploaded to his image hosting platform PictShare.

He called the police, who told him to print it out and bring it to them.

However, it is illegal to possess images of child abuse, whether digitally or in print.

"Erm... not what I planned to do," Mr Haschek said.

Instead he put together a homegrown solution for identifying and removing explicit images.

Mr Haschek used three Raspberry Pis powering two Intel Movidius neural compute sticks, which can be trained to classify images. He also used an open source model for identifying explicit material called Open NSFW (Not Safe For Work), available free of charge from Yahoo.

Image caption: Christian Haschek said the system took a couple of hours to assemble.
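Mr Haschek has not published the exact code running on the sticks, but the general approach can be sketched. The snippet below is a minimal, illustrative example, assuming the deploy.prototxt and resnet_50_1by2_nsfw.caffemodel files from Yahoo's open_nsfw repository and using OpenCV's DNN module on an ordinary CPU; on a Movidius stick the same network would typically be run through Intel's inference tooling instead.

```python
# Illustrative sketch: score one image with Yahoo's open_nsfw Caffe model.
# File names are assumptions based on the public open_nsfw repository.
import cv2

net = cv2.dnn.readNetFromCaffe("deploy.prototxt",
                               "resnet_50_1by2_nsfw.caffemodel")

# On a Movidius stick one would typically switch backend/target, e.g.:
# net.setPreferableBackend(cv2.dnn.DNN_BACKEND_INFERENCE_ENGINE)
# net.setPreferableTarget(cv2.dnn.DNN_TARGET_MYRIAD)

def nsfw_score(path):
    """Return the model's probability (0..1) that an image is explicit."""
    image = cv2.imread(path)
    if image is None:
        raise ValueError(f"could not read image: {path}")
    # The model expects a 224x224 BGR input with mean subtraction.
    blob = cv2.dnn.blobFromImage(image, scalefactor=1.0, size=(224, 224),
                                 mean=(104, 117, 123))
    net.setInput(blob)
    sfw_prob, nsfw_prob = net.forward()[0]
    return float(nsfw_prob)

print(nsfw_score("upload.jpg"))
```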

He set it to flag any image the system judged, with 30% or more certainty, to be likely to contain pornography - he said he deliberately set the threshold low so as to be sure not to miss anything.
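The filtering step itself amounts to little more than a threshold check. The sketch below reuses the hypothetical nsfw_score helper from the previous example; the directory layout and file-type filter are assumptions, not Mr Haschek's actual code.

```python
# Hedged sketch of the flagging step: anything scoring at or above 30%
# is set aside for manual review.
from pathlib import Path

THRESHOLD = 0.30  # deliberately low so borderline images are not missed

def scan_uploads(upload_dir):
    """Return paths of uploaded images that score at or above the threshold."""
    flagged = []
    for path in Path(upload_dir).glob("*"):
        if path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue
        if nsfw_score(str(path)) >= THRESHOLD:
            flagged.append(path)
    return flagged

for suspect in scan_uploads("uploads"):
    print("Needs manual review:", suspect)
```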

He has since discovered 16 further illegal images featuring children on his platform, all of which he reported to Interpol and deleted.

He then contacted a larger image hosting service, which he declined to name, and found thousands more illegal images by running that platform's uploads through his system as well.

"When I first started working on my open source image hosting service PictShare I didn't think anyone but myself would use it," Mr Haschek said on his blog.

"Over the years the usage has increased and with increased usage of a site where you can upload images anonymously, there will be those who upload illegal things.

"There are thousands of images on PictShare - I can't look them through even in a year so I had to think of something else."

Prof Alan Woodward from Surrey University said Mr Haschek's project was encouraging.

"Law enforcement agencies around the world are struggling to find this horrible material and have it taken down. Sadly, the police have to work with tech firms and that takes time," he said.

"I like the idea that this particular site has taken responsibility and found a solution that mitigates the problem.

"The scale of the problem faced by the large tech firms is admittedly enormous and although these solutions could be scaled up, it takes money and effort. However, where there's a will there's a way."
