Proxy Server vs. URL Rewriter for Authentication
JQ Johnson
jqj at darkwing.uoregon.edu
Sun Jan 23 10:05:26 EST 2000
Sherry Crowther asks about the advantages of using a true proxy server over
EZproxy. I'm very interested in the same question, though I'd like to
broaden it from EZproxy to similar rewriting proxy servers (e.g. the III
server).
My theoretical understanding is that the key advantage of such rewriting
proxy servers is that they require no client configuration. That means you
can deploy them and count on them doing their job even when your patron is
unprepared to set a proxy server configuration in their browser (for example,
because they aren't technically competent, or because they are already behind a
mandated corporate firewall proxy server). In contrast, a true proxy
server solution requires dealing remotely with a range of patron browsers,
some of which don't support proxy servers at all, some that don't support
autoconfiguration, and others that vary in how one configures them. Do
academic web sites that have deployed true proxy servers have any hard data
on whether their remote patrons have been successful in using them? My
intuition (based on a career in networking support) is that this is likely
to be a huge problem for a large percentage (more than 25%) of users.
On the other hand, rewriting proxy servers work by tricking the protocol,
and so may not be successful in proxying complex web sites. They almost
certainly won't work if the site provides web pages that contain
javascript-generated links, or links that appear in anything except HTML
documents (PDF files can contain hyperlinks too). Depending on the
particular rewriting server, they may not even handle such simple
constructs as URLs in (or to) cascading style sheets. Handling
domain-based cookies seems to require that the rewriting proxy server
maintain a notion of a client session. Any rewriting of web pages
invalidates byte-range headers in a client request; do the rewriting proxy
servers handle this gracefully? Etc. The rewriting proxy servers that we
might look at have generally been tested against the sites we care about (a
few database providers); as a practical matter, do they handle all the
special cases that are relevant to those sites, and can we count on those
sites not using newfangled HTML constructs that the rewriting proxy doesn't
understand?
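
To make those failure modes concrete, here is a naive Python sketch of the
sort of rewriting such a server performs; the proxy prefix and the helper
function are hypothetical, not any vendor's actual code. Anything that is
not a literal URL in an HTML attribute slips straight past it, and because
rewriting changes the length of the page, the byte offsets a client asked
for no longer correspond to the original document:

    import re

    PROXY_PREFIX = "http://rewriter.example.edu/proxy?url="  # hypothetical prefix

    def rewrite_html(html, proxy_prefix=PROXY_PREFIX):
        # Naively point every literal http(s) URL in an href/src attribute
        # back at the rewriting proxy.  Anything else -- a URL assembled by
        # JavaScript at run time, a link inside a PDF, a url(...) reference
        # in a cascading style sheet -- never passes through this function.
        return re.sub(
            r'(href|src)="(https?://[^"]+)"',
            lambda m: '%s="%s%s"' % (m.group(1), proxy_prefix, m.group(2)),
            html)

    # The absolute link gets rewritten; the script-built one is untouched.
    sample = ('<a href="http://vendor.example.com/a.pdf">full text</a>'
              '<script>top.location = "http://vendor.example.com/" + page;'
              '</script>')
    print(rewrite_html(sample))

A real product presumably handles many more cases than this, but each one
is a special case, which is why coverage against the particular vendor
sites we care about is the practical question.
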
By the way, a question on terminology: is there any consensus here?
"True" proxy servers have evolved a lot in the past few years, mostly as
protocol support for them was added to HTTP 1.1. Is there a consensus on
the functionality that one expects in a "proxy server"? Conversely, what
term should we use to refer to servers like EZproxy (they use "rewriting
proxy server", so that's the term I've used here) or the Brown U. server
(they use "pass-through proxy server", I believe)?
JQ Johnson                      Office: 115F Knight Library
Academic Education Coordinator  mailto:jqj at darkwing.uoregon.edu
1299 University of Oregon       phone: 1-541-346-1746; -3485 fax
Eugene, OR 97403-1299           http://darkwing.uoregon.edu/~jqj/