Monitoring Link Usage

Mark Ellis mark.ellis at rpl.richmond.bc.ca
Mon Jun 15 16:31:29 EDT 1998


Robin,

Here's an expansion of what Thomas is suggesting.  It logs to the common
log format so you can use a standard analyzer on the output.
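
Each link on the page just points at the script with the real URL in the
query string, something like redirect.pl?http://www.example.com/ , so a
hit ends up in the log looking roughly like this (address and URL made up):

192.168.1.10 - - [15/Jun/1998:16:31:29 -0800] "GET http://www.example.com/ HTTP/1.0" 200 0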

If you're going to run it on Unix, you'll need to add
#!/path/to/your/perl/interpreter to the beginning.

You'll probably also want to change the log file path and time zone.

All the usual warnings about not using this in air traffic control
systems, nuclear power stations, etc. apply.

########################################################################
# redirect.pl
# &re_direct( location ): redirects browser to a new URL

chdir ('/website/redir_logs/');

my $curlog = "redirect.log";

my $delim = "\n";
my $field_sep = " ";

# Build the date and time strings for the log entry
my ($sec, $min, $hour, $mday, $mon, $year) = localtime( time );
$year += 1900;
$mday = '0' . $mday if (length( $mday ) < 2);
my $TimeOnly = sprintf("%02d:%02d:%02d", $hour, $min, $sec);
my $month = (qw(Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec))[$mon];
my $DateOnly = $mday."/".$month."/".$year;

my $tzone = "-0800"; # Change this if you're not in the Pacific time zone.

my $logdate ="[".$DateOnly.":".$TimeOnly." ".$tzone."]";
	
# Redirect the browser to URL passed in the query string
#

&re_direct( $ENV{'QUERY_STRING'} );
	

# Output the hit to the log in pseudo-common log format, with fields
# separated by $field_sep and records delimited by $delim
#
open( FILE, ">>$curlog" ) or die "Can't open $curlog: $!\n";
print FILE $ENV{'REMOTE_ADDR'}, $field_sep, "-", $field_sep, "-", $field_sep,
    $logdate, $field_sep, "\"GET ", $ENV{'QUERY_STRING'}, " HTTP/1.0\"",
    $field_sep, "200", $field_sep, "0", $field_sep, $delim;

close( FILE );

# re_direct( $location )
# This sub redirects the browser to the URL given in $location
#
sub re_direct {
my ($location) = @_;
print <<"--end--";
HTTP/1.0 301 Redirect
MIME-version: 1.0
Content-type: text/html
Location: $location

<h1>301 Redirect</h1>
Document is located at <a href="$location">$location</a>
--end--
}
########################################################################
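
If you don't want to fuss with a full log analyzer, a few lines of Perl
will do a quick tally straight from the log.  This is only a rough sketch;
the path and the pattern assume the log format written by the script above,
and the script name is just something I made up:

#!/usr/bin/perl -w
# count_redirects.pl : tally hits per target URL in redirect.log
use strict;

my $log = '/website/redir_logs/redirect.log';   # same log as above
my %hits;

open( LOG, "<$log" ) or die "Can't open $log: $!\n";
while (<LOG>) {
	# The request field looks like "GET http://some.where/ HTTP/1.0"
	next unless /"GET (\S+) HTTP/;
	$hits{$1}++;
}
close( LOG );

# Most-used links first; ignored links end up at the bottom (or missing entirely)
foreach my $url ( sort { $hits{$b} <=> $hits{$a} } keys %hits ) {
	printf "%6d  %s\n", $hits{$url}, $url;
}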

> -----Original Message-----
> From: Robin L. Gelinson-Zalben [mailto:gelinsrl at alverno.edu]
> Sent: Friday, June 12, 1998 2:18 PM
> To: Multiple recipients of list
> Subject: Monitoring Link Usage
> 
> This is a bit different monitoring question...
> 
> My library has decided to make a rather comprehensive web 
> page..over 150
> links which I need to create into a bunch of web pages.  I have great
> fears that people really won't use these sites, and I will be wasting
> lots of time maintaining this site. So....I'm looking for a way to
> monitor link usage.  I want to know which links are being used, and
> which ones are being ignored.  Does anyone know of such a program?
> 
> TIA,
> 
> Robin
> 
----------------------------------------------------------------
Mark Ellis
Network Support Analyst			Phone: (604) 231-6410
Richmond Public Library
Richmond, British Columbia
Email: mark.ellis at rpl.richmond.bc.ca
----------------------------------------------------------------

