[WEB4LIB] Sniffers?
Thomas Dowling
tdowling at ohiolink.edu
Sun Mar 26 15:10:52 EST 2000
----- Original Message -----
From: Michael Dargan <darganm at maple.iren.net>
> I'm using the Elron Internet Manager to gather stats about web usage by
> public and staff. It's rather expensive and has many features that I
> don't need or want. For example, I don't need to actually block or keep
> track of individual usage. I really want to show how many unique sites
> have been visited over a period of time, the top 100 most popular sites
> and the busiest workstations.
>
> Does anyone have a cheap, reliable way to grab this traffic and pull
> reports out of it?
grep? Excel?
Reports like these invariably come from the raw log files maintained by
servers, proxies, or firewalls. All the commercial packages do is read
those logs line by line, break each line into separate fields, and sort or
tabulate based on the values in those fields. To do the same yourself, all
you need to know is the format of the log file--what fields come in what
order and what divides them. If you're an Excel jockey, go for it; it
would also be a textbook exercise for Perl 101.
The only thing you'd lose from a commercial package would be niceties like
on-the-fly pie charts.
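As a rough sketch of the grep-and-sort approach: suppose your proxy writes
a Squid-style access log where the client address is the third
whitespace-separated field and the requested URL is the seventh. (Those
field positions are an assumption--check what your own proxy or firewall
actually writes and adjust the field numbers accordingly.) Then a few
standard Unix pipelines cover all three reports:

```shell
# Sketch only: assumes a Squid-style log with the client address in
# field 3 and the requested URL in field 7. Adjust to your log format.
LOG=access.log

# Two stand-in log lines so the pipelines below have something to chew on;
# with a real log, skip this step.
printf '954083452.1 210 10.0.0.5 TCP_MISS/200 4122 GET http://example.com/index.html - DIRECT/- text/html\n' > "$LOG"
printf '954083453.4 180 10.0.0.7 TCP_HIT/200 1024 GET http://example.com/logo.gif - NONE/- image/gif\n' >> "$LOG"

# Reduce each requested URL to its bare hostname.
hosts() { awk '{print $7}' "$LOG" | sed 's|^[a-z]*://||; s|/.*||; s|:.*||'; }

# How many unique sites have been visited:
hosts | sort -u | wc -l

# The 100 most popular sites:
hosts | sort | uniq -c | sort -rn | head -100

# The busiest workstations, by client address:
awk '{print $3}' "$LOG" | sort | uniq -c | sort -rn
```

The same field-splitting would be a one-liner in Perl, or an import into
Excel with space as the delimiter; the pipelines above are just the
cheapest version of the idea.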
Thomas Dowling
Ohio Library and Information Network
tdowling at ohiolink.edu