[WEB4LIB] Link checkers?

Kristin.Hlynsdottir at glitnir.is
Wed May 18 15:15:24 EDT 2005


Have you tried Xenu? http://home.snafu.de/tilman/xenulink.html
It's small, fast, simple and absolutely free.  I've been using it for
several years on both off-line and on-line websites.

"Xenu's Link Sleuth (TM) is a spidering software that checks Web sites
for broken links. Link verification is done on "normal" links, images,
frames, plug-ins, backgrounds, local image maps, style sheets, scripts
and java applets. It displays a continously updated list of URLs which
you can sort by different criteria. A report can be produced at any
time.

Additional features:

Simple, no-frills user-interface 
Can re-check broken links (useful for temporary routing errors) 
Report format is simple, can be e-mailed easily 
Executable file less than 500K 
Supports SSL websites ("https://") 
Partial testing of FTP and Gopher sites 
Detects and reports redirected URLs 
Site Map "
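
For anyone curious what "spidering software" like this does under the
hood, here is a minimal sketch in Python. It is purely illustrative
(it is not Xenu's code), and the starting URL is a placeholder:

    # Fetch a page, extract link targets, and report each one's status.
    # Illustrative sketch only; example.org is a placeholder URL.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    class LinkExtractor(HTMLParser):
        """Collect href/src attributes from anchors, images, scripts, etc."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                if name in ("href", "src") and value:
                    self.links.append(value)

    def check_links(page_url):
        html = urlopen(page_url).read().decode("utf-8", errors="replace")
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            target = urljoin(page_url, link)   # resolve relative URLs
            try:
                # HEAD keeps traffic down; some servers want GET instead
                status = urlopen(Request(target, method="HEAD")).status
            except HTTPError as err:
                status = err.code              # e.g. 404 for a dead link
            except URLError as err:
                status = err.reason
            print(status, target)

    check_links("http://example.org/")

A real checker like Xenu adds recursion into the site, parallel
requests and retry logic on top of this basic loop.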



Kristin


-----------------------------------
Kristin Osk Hlynsdottir
Web manager
Glitnir Leasing
Kirkjusandi - 155 Reykjavik - Iceland
Tel + 354 560 8809
Fax + 354 560 8810
http://www.glitnir.is
kristin.osk at glitnir.is  
-----------------------------------    



-----Original Message-----
From: Darryl Friesen [mailto:Darryl.Friesen at usask.ca]
Sent: 21 June 2001 14:20
To: Multiple recipients of list
Subject: [WEB4LIB] Link checkers?


All the recent discussion on HTML and CSS validation has reminded me to
ask about a related topic, link checkers.  Anyone have a good one they
like?

Ideally I'd like one that can be installed locally and run "off-line"
(i.e. via cron) to produce reports.  It would also be great if it could
read the Apache httpd.conf file, grab the DocumentRoot and Aliased
directories (web directories that reside somewhere other than with the
main web pages), and check files on the local file system instead of
making HTTP requests for them (for speed, and to keep my already large
log files from getting even larger).
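
The httpd.conf half of that wish list is small enough to script if no
checker supports it directly.  Here is a rough sketch in Python,
assuming a simple config that uses plain DocumentRoot and Alias
directives (the config path is a placeholder, and real configs may also
use AliasMatch, Include files or virtual hosts):

    # Map URL paths to local filesystem paths using the DocumentRoot
    # and Alias directives from httpd.conf.  Sketch only; the config
    # path below is a placeholder.
    import re

    def parse_httpd_conf(path):
        docroot, aliases = None, []
        with open(path) as conf:
            for line in conf:
                line = line.strip()
                m = re.match(r'DocumentRoot\s+"?([^"\s]+)', line)
                if m:
                    docroot = m.group(1)
                m = re.match(r'Alias\s+(\S+)\s+"?([^"\s]+)', line)
                if m:
                    aliases.append((m.group(1), m.group(2)))
        # Longest URL prefix first, so /icons/small wins over /icons
        aliases.sort(key=lambda a: len(a[0]), reverse=True)
        return docroot, aliases

    def url_to_file(url_path, docroot, aliases):
        for prefix, directory in aliases:
            if url_path.startswith(prefix):
                return directory + url_path[len(prefix):]
        return docroot + url_path

    docroot, aliases = parse_httpd_conf("/etc/httpd/httpd.conf")
    print(url_to_file("/icons/folder.gif", docroot, aliases))

Internal links could then be verified with a simple os.path.exists()
test against those paths, with HTTP requests reserved for external
URLs, which avoids both the speed penalty and the log growth.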

Oh yeah.  And it's got to be free.  :)

I've grabbed W3C's checklink.pl (which runs either on the command line
or as a CGI script), but it appears to be HTTP-based only.  I've also
tried webxref, but it seems to have more than its share of bugs, and it
doesn't understand Aliased directories.
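
For the cron side of things, checklink.pl should slot in as-is, since
cron mails whatever a job prints to standard output.  A crontab entry
along these lines would deliver a nightly report (the script path and
URL are placeholders, and checklink.pl's own usage message is the place
to look for recursion options):

    # Run nightly at 03:15; cron mails the report to the job's owner.
    15 3 * * * perl /usr/local/bin/checklink.pl http://www.example.edu/

That still leaves the links being checked over HTTP rather than on the
local file system, though.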

Do I ask too much?


- Darryl

 ----------------------------------------------------------------------
  Darryl Friesen, B.Sc., Programmer/Analyst    Darryl.Friesen at usask.ca
  Education & Research Technology Services,     http://gollum.usask.ca/
  Department of Computing Services,
  University of Saskatchewan
 ----------------------------------------------------------------------
  "Go not to the Elves for counsel, for they will say both no and yes"






