Use Policy in Lieu of Filtering

Jon Lebkowsky jonl at onr.com
Mon Jun 30 08:32:33 EDT 1997


At 09:25 PM 6/29/97 -0700, John R. Gause (407) 425-4694 ext. 301 wrote:

>Some have advocated that filtering software is unnecessary -- that it is 
>better to handle the situation with an Acceptable Use Policy.  Some (not 
>all) who rely on a policy seem to imply that you agree that hardcore 
>pornography is inappropriate for display on computer screens in public 
>areas at the library; in other words, it is not so much the end result as 
>the means to which you object.  Because some filters currently 
>have faults, you are not yet ready to use them.  But you would expect staff 
>to enforce your policy by asking someone to stop viewing a particular 
>image?  Are you censoring their viewing rights?  How does this differ from 
>the use of filtering software as one tool for enforcing your policy?  What 
>burden does this place on the individual staff member and how do you ensure 
>consistent application?  

I advocate an Acceptable Use Policy, but you may lack the context that led
me to that conclusion.  In Austin, the library's stated concern was
that librarians could be held liable (under the harmful to minors statute)
where minors access content that is found "harmful." They were also
concerned about adult patrons viewing child porn or obscene material.  We
suggested that an acceptable use policy would place the responsibility on
patrons, or in the case of minors, on parents if parents are required to
sign something...

The 'acceptable use' statement that I suggested didn't specify a particular
kind of content; rather, it specified the patron's agreement not to do
anything illegal.  Whether 'illegal' was to be further defined was still up
in the air last time we discussed it.  The reason I felt the statement
should adhere pretty strictly to the question of legality was to ensure
that it would not unduly 'censor' the viewing rights of the patron...though
any action you take is problematic in this regard.

I talked to one attorney who felt that the library would still be liable if
minors accessed 'harmful' material, but another (Mike Godwin of EFF) feels
that the library clearly would NOT be liable, because of the lack of intent
to distribute the objectionable material. This hasn't been tested in court, though.

>If the lists of blocked sites in filtering software can be viewed, 
>evaluated, and modified by the individual library system, then your library 
>can provide consistent application of the policy.  Challenged sites (to be 
>blocked or to be unblocked) could be handled in the same manner in which 
>you handle challenged books.  Using filtering software in this manner 
>assumes that you are willing and able to devote the time to at least 
>spot-check the filter.  If you care about the quality of service and 
>responsible use of your funding, shouldn't you monitor any service you've 
>outsourced, whether it's filtering software, lease book plans, or 
>janitorial services?  

These filters block thousands of sites, and most of the filters actually
implemented in libraries (primarily CyberPatrol, which many libraries seem
to be selecting) don't disclose their databases of blocked sites.


=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Jon Lebkowsky     *     jonl at onr.com     *     www.well.com/~jonl
President, EFF-Austin

"Bring a child into the world, write a book,  assemble a machine, 
build a chair. Those who create have the sensation of playing 
a role  larger than themselves. Those who make no effort to 
create ... will be nothing."  -- Jacques Cousteau
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

