Children face pressure to take part in a sexualised life before they are ready, and pressure to consume the vast range of goods and services that are marketed at children and young people.
In Britain, there has been a sustained and informed public debate for the last three years around the commercialisation and sexualisation of childhood.
David Cameron, the prime minister, has announced a long-anticipated government crackdown on internet pornography. He said that all internet users will be contacted by their service providers and given an “unavoidable choice” on whether to use pornography filters. The changes will be introduced by the end of next year. As a first step, customers who set up new broadband accounts or switch providers will have to actively disable the filters by the end of this year.
Where there are technical problems, there are technical solutions. Filtering software has been widely touted as the way to solve the ‘problem’ of children and the internet. As well as preventing children from encountering pornographic, violent, or generally inappropriate content, some tools also control the amount of time children spend online, monitor what they do, or stop them from getting involved in activities like chatting or using webcams.
In theory, when it comes to pornography, filtering is an elegant solution. A piece of software that can ensure children access only content that is appropriate for them is attractive. We can’t always be there when our children go online, so something like that can be reassuring. However, there is much debate about how effective it really is as a technical solution.
To date in Ireland, device-level filtering solutions are the ones most widely available to parents. This means software must be set up and configured on every device in the house. In my house, there are 11 gadgets regularly used by everyone in the family to go online.
As there is a big age gap between my children, I should probably have customised filters set up for every user on every device. An explosion in the number of internet-enabled devices makes the process of individual device protection increasingly complicated.
There has also been criticism of these individual device filters for being too complicated to use, slowing down the performance of the machines they run on, and generally not being sophisticated enough to deal with the new world of social media where content is easily shared, reposted, and embedded in multiple locations.
The most recent audit of filtering tools commissioned by the European Commission found that even the best product missed one in five of the 6,000 harmful pages tested. This is called under-blocking. All tools tested were better at blocking pornographic content than other categories of harmful content such as gambling, drugs, self-harm, racist content, violence, and so on.
When it comes to social media, the tools were even less effective. This is significant as blogging platforms such as Tumblr and Blogger contain a lot of adult content. More worrying are the findings in relation to mobile phones and tablets: “All tools can be easily uninstalled or bypassed, making them useless.”
The ineffective nature of this type of filtering is evidenced by its low uptake. The most widely used software is concerned not so much with child safety as with security. Spam filters and virus-control software are used by 79% of households in Ireland, according to the EU Kids Online research. Other parental controls are used much less frequently, with device-level filters used by just 41% of parents.
The use of filtering technology in Ireland (41%) is significantly above the European average (28%) and is second only to the UK (46%). In short, many feel that device-level filters are no longer offering sufficient protection for children online. Only a minority of parents use these filters, and this number is falling.
‘Whole home filtering’, ‘family filters’, or ‘network-level filters’, as being introduced in Britain, are a more robust and user-friendly solution. A network-level filter works for every device brought into the home and doesn’t require any installation or configuration.
You turn it on in the same way that you might order a movie package from your TV provider: either by ticking a box when you subscribe to the service or by contacting your provider’s customer support. From a user’s point of view, the service is easy to set up and completely transparent until you try to connect to a prohibited site.
No technical solution is 100% effective, and young people have numerous, often ingenious, ways of circumventing the restrictions. Filtering is most effective at preventing children from accidentally coming across sexual content. If a determined, tech-savvy teenager really wants to get access to online adult content, they will usually find a way.
A quick search on Google or Twitter will tell you the new pornography filter is not universally popular. It seems that the people most concerned about the idea of whole home filtering are the ‘technorati’: bloggers, tweeters, techies, ISPs, and digital rights activists. Like most of us who have operated in the online space for a while, they are very protective of the openness and freedom of the internet, and suspicious of any intrusion by governments.
There is also the not insignificant matter of the costs associated with setting up these services on existing broadband networks. It was estimated to me that the cost for a provider in the UK to set up such a service would be in excess of £1m (€1.2m). While British providers have a massive user base over which to defray this cost, this isn’t the case in Ireland. Introducing network-level filters could result in higher broadband prices for all.
The ISPs are also concerned with over-blocking, or what’s known in the business as the Scunthorpe problem. This problem occurs when internet filters block legitimate sites in error. They are concerned they will unintentionally prevent users from accessing legitimate online businesses and will have to compensate these businesses for lost earnings in this event. They also have other technical concerns about degrading the speed and reliability of broadband networks.
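The mechanism behind the Scunthorpe problem is easy to demonstrate. The short sketch below is illustrative only: it uses a hypothetical one-word blocklist and made-up addresses to show how naive substring matching flags innocent place names (Essex, Sussex, or indeed Scunthorpe) in exactly the same way it flags genuinely adult sites.

```python
# Illustrative sketch of the Scunthorpe problem: a naive filter that
# blocks any URL containing a word from its blocklist. The blocklist
# and addresses below are hypothetical, not any real filter's rules.

BLOCKED_WORDS = ["sex"]  # hypothetical blocklist entry

def is_blocked(url: str) -> bool:
    """Return True if any blocklisted word appears as a substring of the URL."""
    lowered = url.lower()
    return any(word in lowered for word in BLOCKED_WORDS)

# A legitimate local-news site is blocked because "Essex" contains "sex":
print(is_blocked("http://essex-tourism.example"))  # True: over-blocking
print(is_blocked("http://weather.example"))        # False
```

Real commercial filters use URL category databases and page analysis rather than bare substring matching, but misclassification of legitimate sites remains the same underlying failure mode.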
There are also some more esoteric concerns about what I call the ‘thin end of the wedge’ argument. It goes that if we block pornography today, tomorrow we will be blocking pages that infringe copyright, then we will have to block pages that contain information on abortion and so on.
And who will decide on what we can access?
Since 2005, as part of the Government’s Broadband for Schools Programme, a centralised content filter has been implemented on the broadband network that connects all Irish primary and post-primary schools to the internet.
All websites accessed on the schools broadband network go through a content filter that ensures only appropriate sites can be accessed. Schools have a range of six content filtering levels to choose from. The most restrictive level permits access only to approved educational resources. The most open level permits access to social networking sites. Known pornography sites are blocked at all levels.
At home, just as in schools, there is a role for technology solutions but we need to be careful not to lean too heavily on the crutch of technical solutions. Parents also have responsibilities to foster personal responsibility and make sure that they speak to their children regularly about what to do if they come across something that bothers them. There is also a role for educators to ensure that opportunities are given in schools for children to reflect on the ethical and moral issues raised by pornography.
We need good filters that are accurate, transparent, and easy to use, and that can easily be customised to suit different sets of values and users of different ages. It is important that we know what content is being blocked, and the rates of over-blocking and under-blocking. Filters should also have mechanisms for getting content recategorised in a timely, fair manner.