Content Filtering — the good, the bad, and the ugly!

Matt Tett

Matt Tett is the Managing Director of Enex TestLab, an independent testing laboratory with over 22 years of history and a heritage stemming from RMIT University. Matt holds the following security certifications in good standing: CISSP, CISM, CSEPS and CISA. He is a long-standing committee member of the Australian Information Security Association (AISA), Melbourne branch, and is also a member of the Information Systems Audit and Control Association (ISACA). Enex TestLab can be found on Twitter as @enextestlab.

I was recently engaged to present a half-day workshop to a Government agency on the topic of content filtering technologies. Naturally this is a technology topic that Enex TestLab has had significant involvement with over the years, and something that I have personally had to deal with on a number of levels (and for a number of reasons).

It wasn’t until I sat myself down and started compiling the workshop material that I realised we had actually been involved in this for over ten years! Scary thought really. We’ve worked on everything, from the NetAlert initiative of the early noughties through to the most recent developments with ISP voluntary filtering and the Internet Industry Association (IIA) Family Friendly Filter (FFF) program.

And what a decade it has been: two Governments, two agendas. The Liberals pushed Protecting Australian Families Online (PAFO) through Government-subsidised PC/Mac-based content filters, followed by a lab trial of server-based technologies; then came Labor and its live trials of similar technologies.

There have been a few key take-away messages for Enex TestLab, and for me personally, through all this. The first is to maintain independence and rigor. Critics are a good source of criteria for any methodology, and when evaluating filtering technologies, critics come out of the woodwork in droves. Focus is key, particularly when the debate is as polarising as this one. You must divorce yourself from personal opinion and apply scientific rigor to the topic, which essentially means evaluating the technologies against the methodology developed. Another lesson is to keep naysayers grounded by focusing on the technology rather than on emotion or ethics.

There have been a number of “stories” that have evolved over the years, some good, some bad, some just sad. Those are all for another time, and another place.

The summary, and the outcome, is that after eight solid years the one program to have endured through all of this is the IIA FFF program, still going strong and rarely requiring change. It provides independent information about independently tested and approved vendor products to parents who wish to take steps to protect their families online and make their own choice.

The bottom line is to remember that while a filter installed on the family computer goes some way towards ensuring that children will not be exposed to the worst of the worst, it is not an excuse for failing to monitor and educate the family about the reasons it was installed in the first place. Treat it as a line drawn in the sand, a family rule. I am sure that for a child or juvenile found to have breached that trust, a cyber-grounding - the removal of their Internet access privileges - will have a similar effect to a physical grounding and the restriction of privileges to go out with friends. No matter what the circumstances, parents will still need to be aware of and monitor their children's online activities and apply appropriate discipline, just as they would in the real world.

