[WEB4LIB] web cache and remote access

Lansky, Yale M. (NYC) LANSKYY at JacksonLewis.com
Wed May 31 14:02:51 EDT 2000


I forwarded the original message below to my MIS guru, who sent me the reply
that follows.  I have also included the message I sent him when I forwarded
it, which explains my question/problem.

As you can see, despite setting up Internet Explorer to get the newest page
from the server, I am still concerned that my users are actually seeing the
page stored on their hard drives and not the latest page from the server.
This worries me because if one of my users relies upon a document from the
web that is actually being viewed from the cache, they may not be getting
the most current information available, which is essential in most cases.

As you can see from my MIS person's reply, the problem actually seems to lie
with the servers.  Does anyone have any further information about this?  How
do we ensure that our users are getting the most up-to-date information
without cleaning out their temporary Internet files on a regular basis?  I
have told people to hit the Refresh or Reload buttons, but based upon some
testing with pages I am developing, this is not always a foolproof solution
either.  I have had the same page display even though Refresh was pressed.
(I had to close the browser and relaunch it for the newest page to be
displayed.)
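For pages one controls, one way to sidestep the problem entirely is to have the server mark the page as uncacheable.  The sketch below (Python, purely illustrative -- the header names and values are standard HTTP, but the script itself is just a demonstration of what a server or CGI program would emit) builds such a response header block:

```python
# Build an HTTP response header block that marks a page as uncacheable,
# so browsers and proxies go back to the server on every visit.
no_cache_headers = [
    ("Content-Type", "text/html"),
    ("Cache-Control", "no-cache"),  # understood by HTTP/1.1 clients and proxies
    ("Pragma", "no-cache"),         # the HTTP/1.0 equivalent
    ("Expires", "Thu, 01 Jan 1970 00:00:00 GMT"),  # a date already in the past
]

# Each header on its own line, then a blank line to end the header block.
header_block = "".join("%s: %s\r\n" % (name, value)
                       for name, value in no_cache_headers) + "\r\n"
print(header_block)
```

The same directives can be approximated inside the page itself with meta http-equiv tags, though caching proxies generally only honor the real HTTP headers.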

I am also aware that in Netscape you can view the expiration and creation
dates by choosing Page Info from the View menu.  However, I have never found
a comparable feature in Internet Explorer. Have you?
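Lacking a built-in equivalent in IE, one workaround is to look at the page's headers yourself.  The sketch below (Python, with made-up sample header values; a real check would issue an HTTP HEAD request for the page in question) summarizes the two dates the browser compares:

```python
# Summarize the caching-related headers of a page, given the response
# headers as a plain dict (e.g. from an HTTP HEAD request).
from email.utils import parsedate_to_datetime

def describe_cache_headers(headers):
    """Report the Last-Modified and Expires dates, and the cache lifetime
    they imply when both are present."""
    last_mod = headers.get("Last-Modified", "(not sent)")
    expires = headers.get("Expires", "(not sent)")
    summary = "Last-Modified: %s; Expires: %s" % (last_mod, expires)
    if "(not sent)" not in (last_mod, expires):
        lifetime = (parsedate_to_datetime(expires)
                    - parsedate_to_datetime(last_mod))
        summary += "; cache lifetime: %s" % lifetime
    return summary

# Sample headers, made up for illustration:
sample = {"Last-Modified": "Wed, 31 May 2000 12:00:00 GMT",
          "Expires": "Wed, 31 May 2000 13:00:00 GMT"}
print(describe_cache_headers(sample))
```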

I look forward to your thoughts.  


Yale M. Lansky--Electronic Services Librarian
Jackson Lewis Schnitzler & Krupman
New York, New York
212-545-4058 Fax: 212-687-0228
LanskyY at jacksonlewis.com
http://www.jacksonlewis.com




-----Original Message-----
From: Hrncir, Garrett (DAL) 
Sent: Wednesday, May 31, 2000 1:24 PM
To: Lansky, Yale M. (NYC)
Subject: RE: [WEB4LIB] web cache and remote access


Sounds like their proxy server is not set up correctly. It is interesting. 

The easiest way to get pages to refresh correctly is to set an expiry date
in the meta tags; this is what the browser looks at when it decides whether
to fetch the page or to use the cached version. If the cached version has
not expired yet, the browser serves it. If there is no expiry date (I am a
little sketchy on this, but I believe it is correct), the browser goes to
the web site and checks the modified date. If that is newer than the one in
the cache, it fetches the page; otherwise it uses the cached copy. The
reason you sometimes have problems is that not all web servers report the
time the page was last modified, or, if the page was modified on the same
day, the time could be off, causing the browser to think that the page in
the cache is the same or newer. 
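The decision described above can be sketched roughly as follows (simplified Python; real browsers also honor Cache-Control and other headers, so treat this as an approximation of the logic, not a specification):

```python
from datetime import datetime, timezone

def should_refetch(cached_expires, cached_last_modified,
                   server_last_modified, now):
    """Simplified sketch of the cache decision described above.

    Arguments are datetimes, or None where the header was absent.
    Returns True if the browser should fetch a fresh copy.
    """
    # An unexpired Expires date means the cached copy is served outright.
    if cached_expires is not None:
        return now >= cached_expires
    # No expiry date: fall back to comparing modification times.
    if server_last_modified is None or cached_last_modified is None:
        # The server did not report a time (one failure mode mentioned
        # above), so we cannot tell; refetch to be safe.
        return True
    return server_last_modified > cached_last_modified

# Example: page cached at noon, server copy modified at 12:30,
# checked again at 1 PM -- the browser should refetch.
noon = datetime(2000, 5, 31, 12, 0, tzinfo=timezone.utc)
half_past = datetime(2000, 5, 31, 12, 30, tzinfo=timezone.utc)
one_pm = datetime(2000, 5, 31, 13, 0, tzinfo=timezone.utc)
print(should_refetch(None, noon, half_past, now=one_pm))  # True
```

The failure mode in the last branch is the one described above: if the server reports no Last-Modified time, or a misleadingly old one, the comparison goes the wrong way and the stale cached copy keeps being served.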

That setting in Explorer just means check for a newer version, not get a
newer version; if the server is not set up correctly, you will never get the
newer pages. 

-----Original Message-----
From: Lansky, Yale M. (NYC) 
Sent: Tuesday, May 30, 2000 4:28 PM
To: Hrncir, Garrett (DAL)
Subject: FW: [WEB4LIB] web cache and remote access


We don't have to worry about this yet, but it is interesting.  We have this
problem locally when people look at a page in the cache (Temporary Internet
Files) and are not seeing the most up-to-date page, even though IE is set to
get the page fresh each time (I never can seem to get that to work).  Once
people use the net more, this will matter for us, as legal pages change
frequently--rules of court and the like.  We don't want an attorney looking
at an old page on their machine when there is actually a more current
version on the web.


-----Original Message-----
From: Bryan H. Davidson [mailto:bdavidso at comp.uark.edu]
Sent: Tuesday, May 30, 2000 4:43 PM
To: Multiple recipients of list
Subject: [WEB4LIB] web cache and remote access


Greetings -

Here at the University of Arkansas, Fayetteville, we have recently
encountered a serious problem with remote access to our online databases
that authenticate by our campus IP address. I would be very interested in
hearing whether other institutions or libraries have encountered a similar
problem.

The Problem:

All of our Internet traffic, as well as most of the libraries' Internet
traffic in the state, travels through a server in Little Rock before going
out into the world (and that is another story).  This in itself has not been
a major problem until now.

This weekend, they implemented something to the effect of a "web caching
server" without any notification. As a result, we immediately lost access to
all of our subscription online resources that authenticate by IP, because
the server altered the IP number. Once we figured out that this was the
cause of the problem, they had to add the URL of each database to an
"exception list" on the server. As a result, not only did we spend hours
scanning through all of our major vendors' URLs to get them added to the
exception list, but we are now in the process of going through all of the
URLs for the individual e-journal titles that we subscribe to through
different publishers (about 50 right now).

This implies that every time we acquire a new e-journal from a different
vendor, purchase access to a new database, or even want to set up a database
trial, we will have to contact DIS in Little Rock to have the URL added to
the exception list. As you might expect, the needs of the library are not at
the top of their list, and as a result they are not very quick about making
the necessary changes to the exception list.

Their argument:
They argue that 80% of web page requests are for static pages. The web
caching system is therefore supposed to increase network speed by serving up
a cached version of a page the next time it is requested, thus reducing
network traffic.

It seems to me that, for most libraries, the opposite is true: 80% or more
of a library's Internet activity goes to online subscription databases,
where users perform dynamic, unique searches every time.  Even as I type, I
am getting calls from our users who are being prompted for passwords where
they should be authenticated by IP.

Has anyone else encountered this problem?  Should it even be a problem?

Much Thanks

~~~~~~~~~~~~~~~~~~~~~
Bryan H. Davidson
Electronic Products Librarian / Webmaster
University of Arkansas Libraries, Fayetteville
Ph. 501-575-4665

