Filters Don't Censor, They Protect Our Kids

Libraries don't stock videos of "Deep Throat," and schools don't provide students with copies of Hustler. Yet any child with unrestricted Internet access, in school or at the public library, is only a couple of mouse clicks away from exposure to far harder-core pornography.

SHOULD TAXPAYERS pay for our schools' and libraries' Internet access to be a pornography outlet? Congress doesn't think so. The Children's Internet Protection Act (CIPA) passed with overwhelming bipartisan support in both the House and Senate and was signed into law last December by President Clinton. CIPA requires schools and libraries using federal "e-rate" subsidies to dedicate some of those funds to installing software that filters out porn. Specifically, child pornography, obscenity (the legal term for hard-core porn) and soft-core content legally defined as "harmful to minors" must be filtered for those aged 16 or under. For adult library users, both child porn and online obscenity would be filtered, since neither is constitutionally protected. Finally, the law allows a library supervisor or school administrator to disable a filter for research or any other bona fide purpose.

Even though, according to a Digital Media Forum survey, an overwhelming 92 percent of Americans support blocking obscenity on school computers, the American Library Association, the American Civil Liberties Union and other groups are launching a legal challenge, questioning the constitutionality of the Children's Internet Protection Act. They contend that CIPA is censorship, that Internet safety decisions should be made locally by libraries and schools, that filters are guilty of overblocking constitutionally protected speech, and that acceptable use policies alone are capable of protecting children online.

I believe they represent a radical view not shared by the majority of librarians or the public. CIPA isn't censorship, because nothing is removed from public access. Libraries and schools that choose not to implement filters simply do not receive the subsidy. Additionally, specifically appropriated federal funds can be used to cover the cost of filtering.

Critics of the law claim that filters block legitimate content such as breast cancer sites, but this is an obsolete argument that no longer applies to the sophisticated filter technology available today.

Many filtering companies segregate objectionable content into various categories, allowing end-users to choose which categories are filtered. Critics misleadingly claim that sex education sites are blocked when, in fact, these types of sites fall in the sex education category and not in the pornography category covered by CIPA.

Additionally, filters can be customized so that teachers or librarians can unblock an inadvertently blocked site in a matter of minutes, far less time than the customary two weeks it takes to obtain a book that is not available in the library or classroom.

While filter critics focus on technical flaws in the software, they fail to acknowledge the real dangers to children of the readily available pornography and the actions of sexual predators on the Internet.

With unrestricted Internet access, a child who types an innocent word such as "toys" into a search engine can be deceptively routed to a porn site displaying free images. Once routed to such a site, "mousetrapping" technology will prevent that child from exiting the site, short of shutting down the computer. It is not a matter of if children will be exposed to these tactics but of when: 91 percent of children who access objectionable sites do so accidentally, according to a 1999 Yankelovich poll. The National Center for Missing and Exploited Children reports that one in five children aged 10 to 17 has received a sexual solicitation in the past year.

More than 90 percent of the nation's public libraries have in place the "acceptable use policies" touted by the ALA and ACLU. But while such policies satisfy civil libertarians, they fail to take into account accidental access, or intentional access by kids and adults. Acceptable use policies don't teach children that innocently named, disguised sites are pornography sites. Nor do such policies prevent a student searching the Internet for information on wolves for a school report from being exposed to a picture of a woman having sex with a wolf. Such policies certainly don't take into account the fact that porn sites often deceptively embed popular brand names such as Disney, Barbie and Nintendo in their metatags and links in order to trick Web surfers and search engines.

The best way to shield children from such dangers is the implementation of software tools in tandem with such safety policies. One without the other is insufficient. We do not rely on education and policies alone to protect children from other material we consider harmful to them, such as alcohol, tobacco, and gambling. We rely on a combination of mandated safety and enforcement mechanisms as well as education and policies.

The wisdom of combining education and filtering is borne out in two studies by the National Commission on Library and Information Science. In 1998, only 1,679 public libraries offering public Internet access filtered some or all of that access. By 2000, that number had more than doubled, to 3,711. One in four public libraries offering public Internet access now uses filters.

But neither the ACLU nor the ALA can accept any form of filtering because they apparently believe it's unconstitutional to keep kids and porn apart.

Let's face it, these are not mainstream views. ACLU vs. Ashcroft? ALA vs. Ashcroft? Let's just call this legal argument what it is, ACLU and the ALA vs. the Innocence of America's children.