FW: [WEB4LIB] Re: FW: counting use of resources

Gimon, Charles A CAGimon at mpls.lib.mn.us
Thu Oct 12 17:16:13 EDT 2000


Bunch of issues and questions here...

> -----Original Message-----
> From: Julia Schult [mailto:jschult at elmira.edu] 
> Sent: Thursday, October 12, 2000 12:51 PM
> To: Multiple recipients of list
> Subject: [WEB4LIB] Re: FW: counting use of resources
> 
> 
> "Gimon, Charles A" wrote:
> 
> > We run visitors through a counter page before referring 
> them on to the
> > remote database. URLs look like this:
> >
> > http://www.mpls.lib.mn.us/subcount.asp?URL=41
> >
> > We're running a little Access database behind this that 
> stores info about
> > all the remote databases. Here, 41 is "What Do I Read 
> Next?". When the user
> > clicks on the link, the subcount.asp page queries the 
> database for the URL
> > for What Do I Read Next, increments the clickthrough count 
> for that database
> > by 1, and sends a redirect to the user to go to that URL.
> 
> This sounds like what I've been looking for.  Because of 
> caching, I've always
> felt very uncomfortable about my web statistics supposedly 
> counting "hits" on
> our pages, when really both end-user and ISP caching render 
> those counts
> *completely* wrong.
> 
> Being able to use click-thru counts would help me almost more 
> than overall web
> page usage stats because I would be able to tell how my web 
> page design helps
> and hinders my users, as well as what they most want from my pages.
> 

No logs are perfect, of course. Clickthroughs will still capture a few
actions from the curious, bored, and simply lost, just like the page usage
stats do. But yes, looking at a variety of data gives you a more rounded
picture of what's going on.
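The counter-page pattern described above is simple enough to sketch in a few lines. Here's a rough sketch in Python with SQLite standing in for the Access database we actually use (our real code is ASP/PerlScript on IIS, and the table and column names below are made up for illustration):

```python
import sqlite3

def lookup_and_count(conn, resource_id):
    """Look up a remote database's URL by id, bump its clickthrough
    count, and return the URL for the redirect."""
    row = conn.execute(
        "SELECT url FROM resources WHERE id = ?", (resource_id,)
    ).fetchone()
    if row is None:
        return None  # unknown id: let the page show an error instead
    conn.execute(
        "UPDATE resources SET clicks = clicks + 1 WHERE id = ?",
        (resource_id,),
    )
    conn.commit()
    # The caller then sends an HTTP 302 redirect to this URL.
    return row[0]

# Toy table mirroring the one-record-per-remote-database idea.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE resources (id INTEGER PRIMARY KEY, title TEXT,"
    " url TEXT, clicks INTEGER DEFAULT 0)"
)
conn.execute(
    "INSERT INTO resources VALUES"
    " (41, 'What Do I Read Next?', 'http://example.com/wdirn', 0)"
)
```

The same SELECT-then-UPDATE-then-redirect shape ports to any language/database combination.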

> However, I do want my users to be able to see the url when 
> they mouse over the
> link.  I guess I can use the "title" attribute we all just 
> learned about on this
> list to have the url show up in the bottom bar in Netscape.  
> Does it do that in
> Internet Explorer, too, or does it just show up as a "mouse-tip"?
> 

We don't show the URL on our remote database pages, but we also count
clickthroughs on our big list of Internet resources, and we do show URLs
there. We just show them as text right under the title. Here again, the
records are being pulled from a database. 

> > Separate counts are kept for library staff, public 
> terminals in the library,
> > and users outside the library. We have a separate page for 
> staff that
> > queries the same database and reports the clickthroughs on 
> each database
> > month by month.
> 
> This, too, is something all our staff want to know.  How is 
> our usage in the
> dorms compared to here in the library?  But I don't 
> understand how you can keep
> separate counts for these groups?
> 

We do it through a combination of the directory path the files sit in and the remote user's IP address, via the REMOTE_ADDR server variable. Eventually we'd like to do it entirely by IP address. This all depends on how your network is set up; if the IP addresses for public and staff machines are all drawn from one big DHCP pool, you won't be able to tell them apart. 

> > Of course, we host our own server; what you're able to do 
> will be limited by
> > what your web hosting service allows.
> 
> And therein lies my real problem, perhaps.  It would be 
> interesting to survey
> Web4Lib and find out how many of us have complete control 
> over our own servers,
> how many of us rely on a central IT department which serves 
> other masters, and
> how many have outsourced servers with ISPs.  Anyone willing 
> to take that one on?
> 

We have complete control of our main web server. We have limited access to
our Innopac catalog server due to our contract with them.

> Does this solution require Microsoft Access to be loaded on 
> the Web Server?  (Or
> could the Access program and file be on another computer in 
> the network?)

The basic concept should work with just about any combo of tools and
databases; you should be able to do exactly the same thing with PHP on
Apache and MySQL, for example. I'm using ASP/Perlscript on IIS with Access.
I try to keep my SQL statements as "vanilla" as possible in case we want to
move to a more powerful database in the future.

Doing this with remote databases doesn't require a very large database on
the web server at all. Our database holds about 90 records in one table
(tiny!), one record for each remote database. There's also a new table each
month to hold the clickthrough counts with a comparable number of records.
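The new-table-each-month part is mostly a naming convention. A sketch of one way to do it (the `clicks_YYYY_MM` naming is made up for illustration, again with SQLite standing in for Access):

```python
import datetime
import sqlite3

def month_table(when=None):
    """Name of the clickthrough table for a given month, e.g. clicks_2000_10."""
    when = when or datetime.date.today()
    return f"clicks_{when.year}_{when.month:02d}"

def ensure_month_table(conn, when=None):
    """Create this month's count table if it doesn't exist yet."""
    name = month_table(when)
    conn.execute(
        f"CREATE TABLE IF NOT EXISTS {name}"
        " (resource_id INTEGER PRIMARY KEY, clicks INTEGER DEFAULT 0)"
    )
    return name
```

With that in place, the staff report page just loops over the monthly tables and joins each against the main resources table.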

The big reason that I built this database-driven system is that we used to
have to keep track of six flat, static HTML pages of remote databases for
six different situations. Now all the databases are in one central store,
and they're flagged for "staff only", "remote access", and so on. I never
have to change the code in the web pages--each page just queries the central
Access database to get the listings it needs, then serves them up to the web
user. When we get a new database, or drop an old one, I just add it once to
the Access database and it shows up in exactly the places it's supposed to.
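The flag-driven listing boils down to one query per page. A sketch of the idea, with hypothetical flag columns (our real Access table's names differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical flag columns; one record per remote database.
conn.execute(
    "CREATE TABLE resources (id INTEGER PRIMARY KEY, title TEXT,"
    " url TEXT, staff_only INTEGER, remote_access INTEGER)"
)
conn.executemany("INSERT INTO resources VALUES (?,?,?,?,?)", [
    (41, "What Do I Read Next?", "http://example.com/wdirn", 0, 1),
    (42, "Staff Tools", "http://example.com/staff", 1, 0),
])

def listing(conn, staff=False):
    """Titles a given page should show: staff pages see everything,
    public pages skip the staff-only records."""
    sql = "SELECT title FROM resources"
    if not staff:
        sql += " WHERE staff_only = 0"
    return [r[0] for r in conn.execute(sql + " ORDER BY title")]
```

Adding or dropping a database is then a single row change, and every page picks it up automatically.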

After that, the clickthroughs were gravy. As others have pointed out, there are plenty of free scripts out on the web to count clickthroughs. I downloaded one, didn't like how it was coded, and wrote the same thing myself from scratch on a Sunday afternoon. (No ringing phones, no interruptions.)

--Charles Gimon
  Minneapolis Public Library


> 
> ---Julia E. Schult
> Access/Electronic Services Librarian
> Elmira College
> Jschult at elmira.edu
> 
> 

