I’ve just received a response to my post about Internet filtering and child pornography from Child Wise CEO Bernadette McMenamin. She raises some good questions — particularly why people in the Internet industry seem to react so angrily when there doesn’t seem to be any argument about child pornography and other exploitation being A Bad Thing.
Ms McMenamin has given permission for her response to be published. I’ve highlighted what I think are her most interesting questions. Answers appreciated. I’ll be drafting my own reply overnight.
I appreciate your comments and feedback. I also appreciate the respectful words you say about me; yes, I am trying to stamp out child sexual abuse, and I have dedicated my life to this cause for the last 25 years. I will do so until I die, which is hopefully not soon.
Clearly my ISP filtering article has stirred up much debate in the ISP world, and I think that is good. Some emails to me from the ISP industry have been very educational for me, but some have been downright rude. I just want to make my position clear: I only support ISP filtering of child pornography and other illegal content, not pornography generally, and I definitely do not support using it as a form of censorship of free speech.
I am a child protection specialist dealing with the aftermath of child sexual abuse on a daily basis. My motivation needs to be made clear. I am sure that the majority of people in the ISP industry are decent people and in retrospect my intention wasn’t to denigrate and insult people who care. There is no doubt that my opinion piece was emotive and if you could see what I see on a daily basis with people’s lives wrecked by the trauma of child sexual abuse I think that you and other members of the ISP industry may feel emotional about it as well. However in some ways I have stirred up healthy debate which will hopefully lead to a better outcome for children.
I am looking to you the ISP industry for the answers and to find and/or create the best use of filters to block child pornography. At least this will reduce some of the demand and some is better than none!
I am also a little perplexed about some of the feedback that I have been receiving about the uselessness of child pornography filtering. Now I am simply asking a question. If filtering of child pornography cannot work then why is there so much anger, fear and resentment to any attempt to block child pornography and other illegal sites? This is what I don’t understand.
Why not work on a filter that does only block out child pornography and if it doesn’t work then let’s stop any waste of money. All I am trying to do is to suggest an approach that may work.
It seems to be working (to an extent) in other countries, e.g. Sweden, Denmark and the UK. I am clearly not an IT expert, so I will hand this over to you as the experts to find a way that is possible, because I am sure you all agree that we should find ways to stop the sexual abuse of children in pornography, which seems to be a growing and lucrative industry.
Bernadette McMenamin AO
[Photo of Bernadette McMenamin from the National Australia Day Council. Ms McMenamin was a Victorian Australian of the Year finalist in 2004.]
34 Replies to “Child Wise’s Bernadette McMenamin on Internet filtering”
I will highlight some areas that I think are of particular importance:
Unfortunately, I suspect that she is in for a minority of the former, and a continuing swell of the latter. Many pundits, commentators, bloggers and micro-bloggers have been taking the current proposed “protections”, and then Ms McMenamin’s comments, as the “thin edge of the wedge” with some real fear (and substance in that fear), as the proposal is poorly defined.
Her comments have been seen in the context of the Govt proposal, and not on their own, stand-alone, merits. (And for good reason — nothing exists outside of context).
The educational aspect of this is something that we, as a concerned industry group, need to address — as a matter of priority.
A small number of effective, accurate, cohesive and understandable messages need to be created and used in response to some of these non-technical problems, and the policy responses that require technological controls. (Another soap box entirely — not for this venue.)
This is very encouraging. But, unfortunately, technically somewhat naive. This refers back to the previous point regarding education and technical controls.
Starts so well, and then falls flat. Sorry Bernadette, filtering content does not affect demand; it affects supply. This does not, in any way, affect my response, or lower the value of filtering…
But we DO need to be aware of the differences in limiting supply, and lowering demand. (Remember however that over time, a contraction in supply will lead to a reduction in demand as consumers find substitutes or modify expectations to reduce desire for the product.)
The “ISP industry” DOES need to help find some answers to this problem. It is incumbent on us — the people that understand the problem and the limitations of the technology — to provide options & solutions to the policy makers and the related interest groups if we want a useful & workable result.
This is CRUCIAL to the whole discussion. We need to show that either it can work or demonstrate why it won’t work.
Then, and only then can we get this discussion onto realistic, workable, and hopefully (cost) effective means of achieving the stated goals. (NOTE: The stated goals are “to stamp out child sexual abuse”, and by inference, the use of the internet to distribute material that contributes to this abuse.)
I also note with interest that “It seems to be working (to an extent) in other countries”. I was not aware of this, and would like to know how/what those countries are doing, how they are reporting the efficacy of their controls, how those reports are presented/critiqued, and how much money is being spent (publicly, privately, and silently) to implement and maintain those controls.
Computerworld has an article up in which Bernadette McMenamin explicitly distances herself from the kind of filter system that Senators Conroy and Fielding are proposing:
Child Wise CEO calls for government re-think on ISP filtering
Her comments there match the reading above. Looks like her message got hijacked in the Federal Debate, and then she got run over by the internet train.
Urrrrrrrrrrr, because we would prefer to spend the money somewhere that would actually help prevent kids becoming the victims of child abuse. You know, as opposed to wasting it on a filter that will NOT reduce the amount of child pornography available or reduce its production.
You see we don’t want to waste money and pretend we are helping kids when we actually COULD BE HELPING KIDS. You know by funding health services and the police. A bloody filter won’t reduce how many kids are being exploited.
I think that’s why your article copped so much heat Bernadette and your continued lack of knowledge only means you’ll cop a lot more.
I responded to some of Bernadette’s other comments from CW today at http://www.somebodythinkofthechildren.com/child-wise-ceo-wants-gov-to-re-think-isp-filtering/
Thanks for the comments, guys. Keep them coming. Because I’m at my most creative early in the morning, I’ll respond in detail some time before 9am Sydney time.
In response to Crispin’s remarks concerning what other countries are doing, some information about the BT ‘Cleanfeed’ system (which is a system made by BT for their own infrastructure) is here:
BT’s modest plan to clean up the Net
Back door to the black list, The Guardian UK
“BT’s CleanFeed system, which blocks access to a register of websites containing sexual images of children, can also be used to discover the contents of the secret blacklist, according to new research.
Technically skilled users of BT’s internet service can use the system to find out which sites are blocked, says Richard Clayton, formerly internet expert at service provider Demon and currently a doctoral student at Cambridge University’s Computer Laboratory. This means they are able to gain access to a secret blacklist provided by the watchdog Internet Watch Foundation (IWF).
“We’ve built a system that won’t stop the hardened paedophile,” admits [BT’s] Galvin, who says that CleanFeed’s main aim is to stop accidental access from users following links such as those in spam email. …”
Richard Clayton’s research report titled “Failures in a Hybrid Content Blocking System” is available at:
We are not angry because of the possibility that child porn might be blocked, no matter how unlikely.
We are angry that the Government is building an all-pervasive and necessarily opaque censorship mechanism. We are angry because the Government clearly has intentions to block more than just child porn (if not, why the opt-out option?). We are angry because this decision paints every responsible Australian adult as a potential sex offender with the mental age of a child.
We are angry because, as responsible adults, we expect our Government to respect us to use our freedoms responsibly. That’s why they call us responsible, um, adults. This is why we call ourselves a democracy. Only Governments with totalitarian paranoia distrust their populations.
As long as people such as McMenamin and Stephen Conroy insist on equating free speech with advocacy of child pornography, they will continue to be thoroughly deluded about what we are upset about. Until they recognise that freedom of speech and the entitlement to simple, dignified respect have intrinsic value, they will continue to misunderstand our anger.
In a recent post I asked, somewhat tongue in cheek, if proponents of censorship would be willing to trade their right to privacy if this helped in the battle against child pornography. If they are not prepared to do this, what right do they have to ask that we live in a state that has the mechanisms to censor content at will?
It doesn’t matter if they censor content that I am interested in. The only thing that matters is that they have assumed the right to do it if they choose. That’s what ticks us off, and until they understand that, they will continue to be deluded.
Another comment that I have made several times is this: if tackling child pornography is the game, then monitoring rather than censorship is more likely to be productive.
Censorship inevitably drives offenders towards covert channels making them harder to track. Monitoring, on the other hand, at least stands some chance of delivering useful intelligence to investigating authorities. Of course, if monitoring becomes too successful then it tends to drive offenders towards covert channels anyway, so it’s not a silver bullet.
@Crispin: I think there are two key take-home messages from your post about why übergeeks get angry about this, but first comes the reason why everyone should get angry.
You’re quite right when you say that we, geeks in general, need to explain ourselves more clearly. The old-fashioned BOFH approach of attacking the lusers is well past its use-by date!
@Neil H: Thanks for the pointer, wouldn’t have seen that article otherwise. It helped me find another one.
@Michael: Exactly. The anger comes from point 3 above.
@rene: Thanks for the BT pointers, and welcome! Yes, the message from real-world experience always seems to be that filters only block the casual user from accidentally stumbling across something they don’t want to see. Those who are specifically trying to find the material will always find it. The filters work just well enough that politicians can be shown a “convincing” demonstration.
@Jon Seymour: That is a very fine summary of the sources of anger.
On this point…
… I suspect one problem is that many of the proponents of the filters would say that, yes, they are happy to lose their privacy. Many seem to believe that old adage that “if you’ve got nothing to hide you shouldn’t be afraid”, in the rather naive assumption that those doing the looking will only ever have the same moral and social values as themselves, and wouldn’t ever be people who’d try to do them harm.
[P.S. I’ve slightly re-formatted some comments and corrected spelling and punctuation errors. I just hate my web pages looking ugly. Keep those comments coming, though! I’m still putting together a formal response to Ms McMenamin’s letter.]
Honestly, I think Bernadette’s intentions are commendable. She wants to wipe out child pornography. We all do. It’s just unfortunate that someone in her position, with influence in the media and with politicians, is wasting time on something like filtering. Filtering will not protect children or reduce the amount of child pornography available or produced.
I hope you understand soon, Bernadette, why the majority of people with technical know-how, especially those with internet knowledge, disagree with mandatory filtering. It’s simply a bad idea that benefits no one except moral crusaders and politicians. Children certainly aren’t helped.
@Michael Meloni: Agreed, Ms McMenamin is, as I said in my original post, fighting the good fight. But as Crispin said, it’s up to us to help her understand. And indeed, having exchanged a few emails privately with her yesterday, she’s open-minded and wants to understand.
Crikey will be running her letter today. I’m writing a reply as we speak, essentially a refined version of the discussion we’re having. Both should go online around 2pm Sydney time. You’ll get a mention.
[Update 12.45pm: My response has been held back until tomorrow’s Crikey, but I’m told it’ll definitely run then.]
Hopefully it’s in the freebie edition. 🙂
“If filtering of child pornography cannot work then why is there so much anger, fear and resentment to any attempt to block child pornography and other illegal sites?”
Because there is a general rule of thumb that can be applied to doing things that don’t work… it’s stupid.
Hitting my car with a hammer to make it run faster doesn’t work, therefore hitting my car with a hammer in an attempt to make it run faster is a stupid thing to do. Attempting to block child pornography by running automated filterbots over everyone’s internet connection doesn’t work, therefore running automated filterbots over everyone’s internet connection in an attempt to block child pornography is a stupid thing to do.
The difference between those who support this plan and those who don’t is simply a matter of who understands the technology and who doesn’t – in the same way that various “flat-tax” plans seem to garner support among those who don’t have much of an idea about economics and hostility among those who do. Fortunately for the economy, the number of people who understand how a flat tax is a bad idea comfortably outweighs the number who don’t. Unfortunately for the country’s IT&T sector, that ratio is rather more precariously balanced when it comes to understanding filtering.
Filtering traffic against a known black-list of illegal sites is one thing. What the government is proposing is some sort of automated technology it believes will automatically be able to differentiate between legal and illegal content ‘on-the-fly’ (as it is being requested by and delivered to a consumer). Even limiting such technology to filtering words (leaving aside pictures and videos for the moment) is not overly effective. It has almost become a clichéd argument in anti-filtering circles that if you try to screen out the word “breast” then you run a very real risk of denying people access to information regarding mammograms or how best to grill chicken.
The English language has a finite (large, but finite) list of words, only a few of which are ‘naughty’ enough to warrant ending up on any reasonable person’s filter list. Running a query across a page of text to check for naughty words or phrases is a fairly simple process. However, even aside from the ‘false positive’ example of the innocuous use of words like ‘breast’ there is an almost certain chance of acquiring ‘false negative’ results. A false negative is where the filter gives the page a clean bill of health even though it ought – by the principles of the filter – to have been blocked. Blocking words like “cock” and “cunt” won’t pick up euphemistic phrases such as “skin flute” or “bearded clam”.
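A toy sketch can make the false-positive/false-negative point concrete. The block list, sample pages and helper function below are invented purely for illustration (real filter products use far larger lists and phrase matching), but they exhibit exactly the two failure modes described above:

```python
# A toy keyword filter illustrating the false-positive / false-negative
# problem. The block list and sample pages are made up for illustration.
BLOCKLIST = {"breast", "cock"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocklisted word appears in the page text."""
    words = page_text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# False positive: an innocuous medical page gets blocked.
medical_page = "Regular breast screening helps detect cancer early."
print(is_blocked(medical_page))    # True -- wrongly blocked

# False negative: a euphemistic phrase sails straight through.
euphemism_page = "He played the skin flute at the party."
print(is_blocked(euphemism_page))  # False -- wrongly allowed
```

The mammogram page is blocked while the euphemism passes untouched; no amount of tuning the word list removes both errors at once.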
Extrapolating such difficulties out to filtering photographs presents challenges that are an order of magnitude greater than when dealing with text. Digitised photographs are represented as a two dimensional array of coloured dots. Unlike words, coloured dots impart absolutely no meaning whatsoever as to the content of the photograph. Depending on the lighting and angle of the shot, a gynaecological closeup could be represented by much the same array of coloured dots as a photograph of a person’s hand, or even a pig on a farm. Filtering software attempts to define ‘patterns of dots’ that – as a rule – depict things like “large expanse of flesh tones” – or even with cutting edge technology – “human face”. Expecting an automatic filter to examine a photograph and determine whether it depicted an innocent shot of dad on the beach, or a ‘model’ putting her ankles behind her ears is simply not within the realms of technological reality.
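To show how little the dots alone tell you, here is a deliberately crude sketch of the “large expanse of flesh tones” heuristic. The RGB thresholds and the two sample “images” are invented for illustration only; real image classifiers are far more sophisticated, yet still cannot infer what a photograph actually depicts:

```python
# A crude "flesh tone fraction" heuristic. Thresholds and sample
# pixel data are invented for illustration only.
def is_flesh_tone(r: int, g: int, b: int) -> bool:
    # A very rough rule of thumb for light skin tones in RGB space.
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def flesh_fraction(pixels) -> float:
    """Fraction of (r, g, b) pixels classified as flesh-toned."""
    flagged = sum(1 for (r, g, b) in pixels if is_flesh_tone(r, g, b))
    return flagged / len(pixels)

# Two "photographs" as flat pixel lists: a hand against a dark
# background, and a pink pig in a paddock. The dots can't tell
# the classifier which is which.
hand = [(200, 150, 120)] * 90 + [(30, 30, 30)] * 10
pig = [(210, 160, 140)] * 85 + [(90, 120, 60)] * 15

print(flesh_fraction(hand))  # 0.9  -- flagged
print(flesh_fraction(pig))   # 0.85 -- also flagged
```

Both “images” trip the heuristic at almost the same score, which is exactly the hand-versus-pig ambiguity described above.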
Bring the difference between ‘legal’ and ‘illegal’ down even further and the task being asked of a machine is beyond that of which even the human brain is capable. As a society we use the magic number 18 as a yardstick for all sorts of things, including the age at which it is legal to appear in a pornographic photograph. At 216 months, a woman is deemed legally able to be photographed naked in exchange for money; at 215 months she is not. Picking people’s ages is not an easy task. I have a hard time picking young people’s ages at anywhere between 16 and 24 these days; I certainly wouldn’t be able to ascertain someone’s age to the month simply by looking at a photograph. It is easy to see how unrealistic people’s expectations of filtering software are.
Even if we could engineer a software routine that could determine the age of the subject of a photograph to the level of granularity required to “just block child porn and not block anything that isn’t child porn”, the amount of processing time and effort required for such a routine to run would be immense. Consider that a digitised video is a series of digital images displayed one after the other at a rate of 30 images per second and we would have to run the same process (once for every frame of the video) nearly three thousand times just to cover a minute and a half of digitised video.
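The per-frame arithmetic in that paragraph, spelled out:

```python
# Video is just a stack of still images, so every frame needs its
# own classifier pass. Figures are the ones used in the text above.
FPS = 30                 # frames per second
clip_seconds = 90        # "a minute and a half"
frames = FPS * clip_seconds
print(frames)            # 2700 -- nearly three thousand passes
```

And that is for a single 90-second clip, before multiplying by every video requested by every user on the network.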
I would be very surprised if there was any true ‘hostility’ to the notion of eradicating child pornography from either the internet or the community at large. The perceived hostility is not directed toward the aim of making the world a safer place for children, it is directed at the wholesale ‘magic thinking’ that seems to have gripped people regarding the capabilities of filtering technology. There is simply no way, using contemporary hardware and software, that the internet can be automatically filtered of child pornography. And there is certainly no way that it can even be attempted without impacting both the availability and speed of access of perfectly legal material.
@Gabriel J. Buckley: That is a superbly-written summary of the technical challenges involved in automatic filtering. I thank you for such a detailed post! For me, the key sentence is this one:
@Michael Meloni: I have no control over where in Crikey my pieces appear. However I’ll re-publish it here the following day.
It doesn’t matter if they censor content that I am interested in. The only thing that matters is that they have assumed the right to do it if they choose. That’s what ticks us off, and until they understand that, they will continue to be deluded.
This would probably be better phrased as:
Whether or not they actually censor material we are likely to read is not the issue. The issue is that they are attempting to build a censorship mechanism at all. Viewing child pornography is already a crime, those wishing to view it can and will subvert any filter. The correct response to child pornography is and continues to be law enforcement. In the age of the Internet, any attempt at censorship is futile at best and most likely counter-productive since it simply encourages widespread adoption of covert channels.
As Gabriel notes, censorship is futile and dogged attempts to implement a futile idea are stupid. And as Michael notes, stupid attempts to implement futile ideas in the face of well-intentioned and well-informed criticism are enough to enrage even the mildest of (idealistic) geeks.
Again, as Michael says, I think we do owe it to the pro-censorship lobby to explain the inner workings of Tor hidden services so they understand how intensely futile censorship is. Sites that advertise their perverted wares via a Tor hidden service (or something equivalent) are completely invisible as far as the rest of the Internet is concerned. A consumer of child pornography can download stuff from a Tor hidden service and not have a clue what the physical IP address of the server is, and likewise the provider will not have a clue what the physical IP address of the client is. With technology like that available, you could block every single pornography site on the web and still not impede access to a child porn site one single iota. The only way to defeat Tor (and Tor-like services) is to physically unplug everyone’s DSL routers and smash their wireless routers. It really is that futile.
DSL routers -> DSL modems (to be strictly correct 🙂
I’ve expanded on some of the futility arguments touched on above, here.
In particular, I offer a more extended explanation of why overlay networks such as those established by Tor make ISP-level filtering irrelevant and hence futile.
@Jon Seymour: I really appreciate that you’re refining your arguments as this issue develops. That helps me refine what I write, and it all helps shape the debate. I intend writing a formal submission to whichever parliamentary committee evaluates the filtering trial at the appropriate time.
While the human rights aspects fuel my passion on this issue, I think that since I’m a literate geek it’s more my role to help explain the “technical difficulties” (!) of what’s being proposed.
I’ll read your comments on your own site as I head north on the train later this morning.
My comments are obviously a little late here, but who knows, the information may filter back.
These ongoing discussions are often amusing, as people’s emotions and passion for their respective viewpoints sometimes get in the way of factual and logical discussion.
I have recently been getting involved in the debate on internet filtering and one of my first steps was to research where the discussions came from. After some searching I came across an article quoting Bernadette McMenamin from January 2008.
More searching eventually led me here.
That article was obviously designed not only to scare parents, but also to brand anyone opposed to her ideas as essentially “indecent”.
I want to add some additional information to the argument for the purpose of educating anyone else who stumbles on to this.
Accuracy of Statistics
When Bernadette McMenamin quoted her statistics in her article:
The statistics she quoted have been proven to be wrong. Details on the real statistics are located here.
Unfortunately when people on a crusade start quoting made up or unverified statistics, all that happens is people stop listening. This in turn detracts from what is in essence a very good cause.
There is no need to make up or exaggerate the stats. The problem is bad enough. Even 1 child is 1 too many.
What has made this worse is that the stats she used have been picked up and quoted by the media as fact, which they are not. This in turn makes things worse, because when the truth comes out everyone becomes wary of any future numbers quoted.
Technical Constraints to Filtering
Next is the technical aspect, and something that is not discussed very often. I would be so bold as to claim that most people who deal in illegal material (child porn included) DO NOT do so using normal web sites. And those few that do have random, fast-moving sites that an internet filter would simply have too much trouble trying to keep up with.
There are so many other methods of moving files around the internet that there is simply no way to stop it. Believe me, the RIAA has been trying to stop illegal music downloads for 10+ years with no success. And most music sites aren’t really trying to hide!
We are better off spending the money from any planned internet filter on our police force, giving them the facilities they need to catch the small but nasty group of people that Bernadette (and many others of us) want to catch.
Imagine giving the Federal Police an additional $45 million to set up and an additional $33 million per year (the expected setup and running costs of an internet filter). Just think how much more effective they would be.
Thank you for your time