[WEB4LIB] RE: web cache and remote access
Bryan H. Davidson
bdavidso at comp.uark.edu
Wed May 31 14:46:15 EDT 2000
All -
I would first like to thank all of you who replied to the original posting
concerning web caching and remote access.
This has really brought to light the larger issue of web caching servers in
general. After speaking to some folks in computing services on our end, it
seems that, even if the caching server were not in place in Little Rock,
chances are that a given request would be answered by a caching server
somewhere else on the net anyway, rather than by the actual live "source". At
least, that is my understanding.
In our situation, the only reason this came to light in the first place was
that this particular caching server in Little Rock also has some proxy /
firewall functionality that alters our IP address.
Most computing services people do not seem to think this is a tragedy; most
accept it as a common practice that will be an overall benefit to users.
Mr. Lansky brings up an interesting point, though. I'll quote his previous
posting:
"...this will mean something for us as legal pages change frequently--such
things as rules of courts, etc. we don't want the attorney to look at an
old page on their machine when actually there is a more current version on
the web."
What are the implications of web caching for the information profession?
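On Mr. Lansky's stale-page worry: a client can ask the caches in its path for
a fresh copy on every request. Here is a hypothetical sketch in Python (the
helper name is mine and nothing here comes from the thread, though the
Pragma / Cache-Control: no-cache headers and the If-Modified-Since
conditional GET are standard HTTP):

```python
from email.utils import formatdate

# Request headers that ask intermediate caches for a fresh copy:
# HTTP/1.0 caches honor Pragma, HTTP/1.1 caches honor Cache-Control.
NO_CACHE_HEADERS = {
    "Pragma": "no-cache",
    "Cache-Control": "no-cache",
}

def if_modified_since(epoch_seconds):
    """Format an If-Modified-Since value (RFC 1123 date, GMT) for a
    conditional GET: the server replies 304 if the page is unchanged."""
    return formatdate(epoch_seconds, usegmt=True)

print(if_modified_since(959796000))  # Wed, 31 May 2000 18:00:00 GMT
```

Sending these headers only helps if the intermediate caches honor them, which
is exactly what is in question with the Little Rock server.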
As an update to our current situation, I just forwarded a list of about 100
publishers that provide us with access to e-journal titles on IP
authentication, all of which somebody in Little Rock will have to key into
the exception list by hand. I think that this will be one means of getting
the IP issue resolved, especially when every library in the state begins to
make the same requests.
We cannot, unfortunately, as some of you suggested, simply add the IP range
of the caching server to the list we provide to our vendors. Since an entire
network of libraries must go through this one pipe / server, all of them
would be granted access to those databases.
I also passed along Dan's solution of simply asking DIS not to cache anything
with a referrer of library.uark.edu/*. For some reason, they say this will
not work. I'll keep everybody posted, as I am still pressing for a better
answer than "just because". Again, I think they will be receiving an
education as to exactly what it is libraries "do" once they realize how
frequently we and other libraries add and modify links to IP-authenticated
resources, changes DIS now has to track as well in order for our IP
authentication to work.
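For what it is worth, Dan's referrer rule is easy to state precisely. A
hypothetical sketch in Python (the names and pattern handling are mine; this
is not how the DIS cache is actually configured):

```python
from fnmatch import fnmatch

# Hypothetical cache-bypass rule: skip caching for any request whose
# Referer header matches the library's site.
NO_CACHE_REFERRERS = ["library.uark.edu/*"]

def bypass_cache(referer):
    """Return True if a request with this Referer should skip the cache."""
    if referer is None:
        return False
    # Strip the scheme so the pattern matches against host/path.
    host_path = referer.split("://", 1)[-1]
    return any(fnmatch(host_path, pattern) for pattern in NO_CACHE_REFERRERS)

print(bypass_cache("http://library.uark.edu/ejournals/a.html"))  # True
print(bypass_cache("http://www.uark.edu/index.html"))            # False
```

One wildcard rule like this would cover every link we add or modify, which is
why a flat "this will not work" deserves a better explanation.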
Bryan
----- Original Message -----
From: "Lansky, Yale M. (NYC)" <LANSKYY at JacksonLewis.com>
To: "Multiple recipients of list" <web4lib at webjunction.org>
Sent: Wednesday, May 31, 2000 1:07 PM
Subject: [WEB4LIB] RE: web cache and remote access
> I forwarded this original message to my MIS guru, who sent me the reply
> below. I have also included my original message to him when I forwarded the
> message, which begins to explain my question/problem.
>
> As you can see, despite trying to set up Internet Explorer to get the
> newest page from the server, I am still concerned that my users are
> actually seeing the page that is stored on their hard drive and not the
> latest page from the server. I am concerned about this because if one of my
> users relies upon a document from the web, which is actually being viewed
> from the cache, they may not be getting the most current information
> available, which is essential in most cases.
>
> As you can see from my MIS person, the problem actually seems to lie with
> the servers. Does anyone have any further information about this? How do we
> ensure that our users are getting the most up-to-date information without
> cleaning out their temporary Internet files on a regular basis? I have told
> people to hit the refresh or reload buttons, but based upon some testing
> with pages I am developing, this is not always a foolproof solution either.
> I have had the same page display even though refresh was pushed. (I had to
> close the browser and relaunch for the newest page to be displayed.)
>
> I am also aware that in Netscape you can view the expiration and created
> date by choosing Page Info from the View menu. However, I have never found
> a comparable feature in Internet Explorer. Have you?
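One browser-independent way to inspect those dates is to request just the
headers and read Date, Expires, and Last-Modified directly. A minimal
sketch, assuming Python is available (the helper names are mine):

```python
import urllib.request

def head_request(url):
    """Build a HEAD request, which fetches only the response headers."""
    return urllib.request.Request(url, method="HEAD")

def cache_dates(headers):
    """Pull out the headers a browser or proxy uses to judge freshness."""
    return {name: headers.get(name)
            for name in ("Date", "Expires", "Last-Modified")}

# Actually querying a server requires network access, e.g.:
# with urllib.request.urlopen(head_request("http://www.example.com/")) as resp:
#     print(cache_dates(resp.headers))
```

A page with no Expires header at all leaves the freshness decision entirely
to the cache, which is where the trouble described in this thread starts.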
>
> I look forward to your thoughts.
>
>
> Yale M. Lansky--Electronic Services Librarian
> Jackson Lewis Schnitzler & Krupman
> New York, New York
> 212-545-4058 Fax: 212-687-0228
> LanskyY at jacksonlewis.com
> http://www.jacksonlewis.com
>
> Jackson Lewis is dedicated exclusively to representing employers in
> the practice of employment, labor, benefits and immigration law and
> related litigation.
> Confidentiality Note: This e-mail, and any attachment to it,
> contains privileged and confidential information intended only for the use
> of the individual(s) or entity named on the e-mail. If the reader of this
> e-mail is not the intended recipient, or the employee or agent responsible
> for delivering it to the intended recipient, you are hereby notified that
> reading it is strictly prohibited. If you have received this e-mail in
> error, please immediately return it to the sender and delete it from your
> system. Thank you.
>
>
>
> -----Original Message-----
> From: Hrncir, Garrett (DAL)
> Sent: Wednesday, May 31, 2000 1:24 PM
> To: Lansky, Yale M. (NYC)
> Subject: RE: [WEB4LIB] web cache and remote access
>
>
> Sounds like their proxy server is not set up correctly. It is interesting.
>
> The easiest way to get the pages to refresh correctly is to set an
> expiration date in the meta tags; this is what the browser looks for when
> it decides whether to fetch the page or to use the cached version. If the
> cached version has not expired yet, it will serve it. If there is no
> expiration date (I am a little sketchy on this but I believe this is
> correct), the browser goes to the web site and checks the modified date. If
> it is newer than the one in the cache it fetches the page; otherwise it
> uses the one in the cache. The reason you sometimes have problems is that
> not all web servers report the time the page was last modified, or, if the
> page was modified on the same day, the time could be off, causing the
> browser to think that the page in the cache is the same or newer, etc.
>
> That setting in Explorer just means "check for a newer version," not "get
> a newer version"; if the server is not set up correctly, you will never get
> the newer pages.
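The decision procedure described above can be rendered as a short sketch.
This is a hypothetical Python paraphrase of that description, not actual
browser code (the function and argument names are mine):

```python
from datetime import datetime, timedelta

def use_cached_copy(now, expires=None, cached_modified=None,
                    server_modified=None):
    """Decide whether a cached page may be served, per the logic above."""
    # 1. An unexpired Expires date means the cache serves its copy outright.
    if expires is not None:
        return now < expires
    # 2. No Expires date: compare the server's Last-Modified date against
    #    the cached copy's; serve the cache only if the server's is no newer.
    if cached_modified is not None and server_modified is not None:
        return server_modified <= cached_modified
    # 3. No dates to compare: fetch a fresh copy to be safe.
    return False

now = datetime(2000, 5, 31, 13, 0)
# Unexpired page: served from the cache.
print(use_cached_copy(now, expires=now + timedelta(hours=1)))    # True
# Page changed on the server since it was cached: refetched.
print(use_cached_copy(now, cached_modified=now - timedelta(days=2),
                      server_modified=now - timedelta(hours=1)))  # False
```

Step 2 is where the imprecise or missing Last-Modified times he mentions can
make a stale copy look current.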
>
> -----Original Message-----
> From: Lansky, Yale M. (NYC)
> Sent: Tuesday, May 30, 2000 4:28 PM
> To: Hrncir, Garrett (DAL)
> Subject: FW: [WEB4LIB] web cache and remote access
>
>
> We don't have to worry about this yet, but it is interesting. We have this
> problem locally when people look at a page in the cache (temp int files)
> and they are not seeing the most up-to-date page, even though IE is set to
> get the page fresh each time. (I never can seem to get it to work.) Once
> people use the net more, this will mean something for us, as legal pages
> change frequently -- such things as rules of courts, etc. We don't want the
> attorney to look at an old page on their machine when there is actually a
> more current version on the web.
>
>
> -----Original Message-----
> From: Bryan H. Davidson [mailto:bdavidso at comp.uark.edu]
> Sent: Tuesday, May 30, 2000 4:43 PM
> To: Multiple recipients of list
> Subject: [WEB4LIB] web cache and remote access
>
>
> Greetings -
>
> Here at the University of Arkansas, Fayetteville, we have recently
> encountered a serious problem with remote access to our online databases
> that authenticate by our campus IP address. I would be very interested in
> hearing whether other institutions or libraries have encountered a similar
> problem.
>
> The Problem:
>
> All of our Internet traffic, as well as most of the libraries' Internet
> traffic in the state, travels through a server in Little Rock before going
> out into the world (and that is another story). This in itself has not been
> a major problem until now.
>
> This weekend, they implemented something to the effect of a "web caching
> server" without any notification. As a result, we immediately lost access
> to all of our subscription online resources that authenticate by IP,
> because the server altered the IP number. Once we figured out that this was
> the cause of the problem, they had to add the URL of each database to an
> "exception list" on the server. As a result, not only did we spend hours
> scanning through all of our major vendors' URLs to get them added to the
> exception list, but we are now in the process of going through all of the
> URLs for the individual e-journal titles that we subscribe to through
> different publishers (about 50 right now).
>
> This implies that every time we acquire a new e-journal from a different
> vendor, purchase access to a new database, or even want to set up a
> database trial, we will have to contact DIS in Little Rock to have the URL
> added to the exception list. As you might expect, the needs of the library
> are not at the top of their list, and as a result they are not very quick
> about making the necessary changes to the exception list.
>
> Their argument:
> They argue that 80% of web page requests are for static pages. The web
> caching system is therefore supposed to increase network speed by serving
> up a cached version of that page the next time it is requested, thus
> reducing network traffic.
>
> It seems to me that, for most libraries, the opposite is true - 80% or more
> of a library's Internet activity goes to online subscription databases
> where users perform dynamic, unique searches every time. Even as I type, I
> am getting calls from our users who are being prompted for passwords where
> they should be authenticated by IP.
>
> Has anyone else encountered this problem? Should it even be a problem?
>
> Much Thanks
>
> ~~~~~~~~~~~~~~~~~~~~~
> Bryan H. Davidson
> Electronic Products Librarian / Webmaster
> University of Arkansas Libraries, Fayetteville
> Ph. 501-575-4665
>