In the fight against the spread of child sexual abuse images on the web, the Internet Watch Foundation has announced that it is to share its database of digital signatures of images, known as the hash list, with internet giants Facebook, Google, Microsoft, Twitter and Yahoo.
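In practice, a hash list is simply a collection of digital fingerprints: each known abuse image is reduced to a short string that can be compared against newly uploaded files without the images themselves ever being passed around. The sketch below is a minimal, hypothetical illustration of that matching step using ordinary cryptographic hashes; real systems of this kind typically rely on perceptual hashing (such as Microsoft's PhotoDNA), which can still match an image after it has been resized or re-encoded. All names and hash values here are placeholders, not actual IWF data.

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_hash_list(data: bytes, hash_list: set[str]) -> bool:
    """Compare an uploaded image's fingerprint against the set of known hashes."""
    return image_fingerprint(data) in hash_list

# Hypothetical usage: the "known" hash below is a placeholder, not real IWF data.
known_hashes = {image_fingerprint(b"example known image bytes")}

upload = b"example known image bytes"   # stands in for a newly uploaded file
if matches_hash_list(upload, known_hashes):
    print("Known image detected: block the upload and report it")
```

A simple set lookup like this scales to millions of hashes, which is why the approach appeals to platforms handling vast volumes of uploads; the hard part is producing fingerprints robust to image manipulation, not the matching itself.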
This action follows David Cameron’s announcement of tougher measures to combat online child sexual abuse material at the #WePROTECT conference in November 2014.
The big question is whether this will make any great difference and, if so, why it hasn’t happened sooner.
The IWF and its critics
The Internet Watch Foundation (IWF) is a charity founded in 1996 to receive and act upon reports of images depicting the sexual abuse of children (often mislabelled as child pornography), as well as images of adult content deemed illegal in the UK. Funded by telecoms operators, software and hardware manufacturers and other organisations, the IWF’s role is officially outlined by a Memorandum of Understanding between the Crown Prosecution Service and the Association of Chief Police Officers, which protects IWF staff from prosecution. The IWF operates a reporting hotline and directs law enforcement to illegal images it has assessed so that take-down notices can be issued and investigators can follow up.
Over the past 20 years the IWF has been at the forefront of policing child sexual abuse imagery and other extreme content, overseeing the development of content rating systems and encouraging similar practices in other countries. During this time, the IWF’s workload and the need for its services have increased: reports of child sexual abuse imagery have risen from 1,291 in 1996-7 to 74,199 in 2013-14.
The IWF is not without its critics, however. Its work has been labelled government censorship by the back door, while others suggest the images it deals with are harmless in themselves, and may even have a beneficial or preventative use for those with paedophilic sexual interests. These are minority views, however, and a more widely held fear is that viewing images of child sexual abuse may precipitate thoughts into action and lead to real harm to real children. While such an argument seems logical, there is as yet no conclusive evidence that this is the case, as research findings often conflict.
What is a fact, though, is that the demand for these images leads to more images being created, which perpetuates the abuse of children. The law is also quite clear that possession of such imagery, including computer-generated images depicting similar scenes, is illegal.
Where should the focus be?
The main criticism of the IWF’s move to share its hash list of abusive and extreme images is that it is not tackling the real problem. While Google, Facebook and other major internet firms could, equipped with the IWF’s hash list of known images, provide automatic scanning and blocking of some kind, this wouldn’t tackle the paedophile groups involved in the trade of such imagery. They do not use social media or the open web: they hide in the darker recesses of the internet where the IWF does not go. Perhaps even more disturbing is the fact that collections of imagery are used as tokens for entry to closed paedophile networks involved in the organisation of harmful activity towards children and underage young people, such as providing children for sex and then photographing or filming them.
But many of these more secretive networks are already the target of police operations and require a different type of policing. Significantly, this argument also distracts from the main issue being highlighted by the IWF, which is the importance of keeping these illegal images out of the public domain and preventing them from becoming normalised. It also encourages partnerships between a range of organisations and businesses that have hitherto been rather reluctant to accept full responsibility for their role in facilitating the sharing of such material. Clearly, undesirable internet activity can only be prevented effectively by collective action.
The IWF’s recent announcement sends the message that possession of child abuse imagery is wrong; it also keeps these undesirable images away from the more public side of the internet. As long as valuable police time is not tied up investigating minor infringements at the expense of actually shutting down paedophile networks, the IWF’s decision is surely a good thing.