Update on filter testing
Karen G. Schneider
kgs at bluehighways.com
Thu Apr 10 20:25:14 EDT 1997
We now have over 30 volunteers for the filter-evaluation project. I have
two vendors interested in having librarians evaluate their filtering
products. Would anyone like to help develop the assessment guidelines?
Would anyone else like to help contact other vendors to see whether they're
interested in an evaluation? I think we can get
started in another week or a bit more. Right now I can't access home email
at work, due to my ISP turning off telnet access to the server (!@3$%@#!),
so progress is a little slow.
Meanwhile, a fellow traveller on this issue had some interesting ideas
involving forming a team (or committee, or working group... whatever term
works best) to help libraries AVOID using filtering tools that have
proprietary, nonviewable, noneditable stoplists. One idea: have online
communities--such as PUBLIB--create and maintain publicly accessible
"notlists" (lists of sites or keywords to filter) to be loaded in proxy
servers or software tools such as Net Nanny. Another idea: engage library
consortia in this enterprise.
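Just to make the notlist idea concrete, here's a toy sketch in Python.
Everything in it--the "site:" prefix, the keyword lines, the file the list
would normally be read from--is my own made-up convention for illustration,
not any vendor's format or any actual PUBLIB list:

    from urllib.parse import urlparse


    def load_notlist(lines):
        """Parse notlist entries: blank lines and '#' comments are skipped.
        'site:' entries block a whole host; anything else is a keyword
        matched against the full URL."""
        sites, keywords = set(), []
        for line in lines:
            entry = line.strip()
            if not entry or entry.startswith("#"):
                continue
            if entry.startswith("site:"):
                sites.add(entry[len("site:"):].strip().lower())
            else:
                keywords.append(entry.lower())
        return sites, keywords


    def is_blocked(url, sites, keywords):
        """True if the URL's host is on the site list or the URL contains a
        keyword. Deliberately simple; a real proxy hook would also need to
        handle redirects, ports, and partial host matches."""
        host = (urlparse(url).hostname or "").lower()
        return host in sites or any(kw in url.lower() for kw in keywords)


    if __name__ == "__main__":
        # In practice the notlist would be a shared file the community
        # maintains; a few inline entries stand in for it here.
        sites, keywords = load_notlist([
            "# sample community notlist (entirely made up)",
            "site:blocked.example.org",
            "some-keyword",
        ])
        for test_url in ["http://example.com/ok-page",
                         "http://blocked.example.org/anything"]:
            verdict = "BLOCK" if is_blocked(test_url, sites, keywords) else "allow"
            print(test_url, "->", verdict)

The point is simply that the list itself stays plain text that anyone can
read, edit, and pass along--no proprietary, nonviewable stoplist involved.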
This solution addresses several problems. First, many libraries are "under
the gun" to address the issue of bad stuff on the 'net. But, Mrs.
McGillicuddy in the Okefenokee Public Library doesn't have the time or
training to create her own notlist. With the good ol' boys that serve as
her trustees breathing down her neck, she needs to do something. As it
stands, she can purchase a product and rely on the vendor's assessment of
what "bad" entails, or she can do nothing--unless we make an alternative
available.
We could teach Mrs. McGillicuddy to be a Unix systems administrator, but
she's a busy woman, what with the sneaker-painting workshop and those
problems with spine labels (this is a little projection... I have spent
more time than I wanted to worrying about spine labels this year). Mrs.
McGillicuddy needs a canned approach. She probably doesn't have a
webserver in her library, but she can install a Windows software program
and add a special tool created by her library colleagues.
Then there is Suzy Suave of the systems section at BigCity Public Library.
After she boots into her NT server and launches EXodus, she's only too
happy to see the latest additions to the library notlist. She cruises
through them (maybe bookmarks a couple... I'm not judging), deletes a
couple, expands a term or two, then updates the file. Her proxy server is
working, and everything's OK.
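Suzy's update step could be an equally small script. Again a hypothetical
sketch, using the same made-up plain-text format as above, plus a local
exceptions file so the entries she deletes stay deleted the next time the
community list is pulled down:

    def read_entries(path):
        """Return the set of non-comment, non-blank lines from a
        notlist-style file, or an empty set if it doesn't exist yet."""
        try:
            with open(path, encoding="utf-8") as f:
                return {line.strip() for line in f
                        if line.strip() and not line.strip().startswith("#")}
        except FileNotFoundError:
            return set()


    def merge_notlist(upstream_path, local_path, exceptions_path):
        """Fold new upstream entries into the local notlist, minus anything
        the library has explicitly excepted, and return the additions."""
        upstream = read_entries(upstream_path)
        local = read_entries(local_path)
        exceptions = read_entries(exceptions_path)

        additions = upstream - local - exceptions
        merged = sorted((local | additions) - exceptions)

        with open(local_path, "w", encoding="utf-8") as f:
            f.write("# local notlist -- reload the proxy after editing\n")
            f.write("\n".join(merged) + "\n")

        return additions


    if __name__ == "__main__":
        new_entries = merge_notlist(
            "publib-notlist-latest.txt",  # hypothetical fresh copy of the shared list
            "publib-notlist.txt",         # this library's working copy
            "local-exceptions.txt",       # entries this library has decided to allow
        )
        print(len(new_entries),
              "new entries staged; look them over before the proxy reloads.")

Keeping local deletions in their own exceptions file means the library's
judgment calls survive every update from the shared list.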
There are other possibilities here. Maybe enough of us could look at the
Dublin Core and a few other projects and think about "cataloging the Web."
Too big a project? Well, maybe. Filter vendors don't think so.
Meanwhile, the filter committee... hmmm, need a number here... assesses an
unknown number of sites per person per week. It's a big team, maybe broken
down into teams assigned by domain or other criteria. It's a strict,
spartan standard--what would you not collect in a library (yes, I know
that's a can of worms, but that's librarianship for you)--and libraries will
still face challenges over which sites people can access. But the right
interplay among the right working groups can minimize the problem.
Maybe this could emanate from a professional association; maybe we have
what we need, right here among us.
Anyway, your thoughts/feedback are encouraged.
------------------------------------------------------------------
Karen G. Schneider * kgs at bluehighways.com * schneider.karen at epamail.epa.gov
Author, The Internet Access Cookbook (e-mail Neal-Schuman at icm.com)
Director, US EPA Region 2 Library Contractor, Garcia Consulting
Cybrarian * Columnist, American Libraries
Visit our library at http://www.epa.gov/Region2/library/
These opinions strictly mine!