URL minder and website downloading softw
Earl Young
eayoung at bna.com
Fri May 30 09:18:33 EDT 1997
The NetAttache software from Tympani (http://www.tympani.com) is an
off-line reader. We have looked at several such programs, and
NetAttache is the one we prefer.
It lets us download a site easily, and it can go as deeply or
shallowly into the site as we like. We can include or exclude external
links, and we can tell it not to loop higher in the site (if a page
links back to the home page, for example) when we just want to go
"down" from a given starting point.
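(As a rough illustration of that kind of depth-limited, "downward
only" fetch - not NetAttache itself - here is a small Python sketch;
the start URL, depth limit, and output directory are made-up
parameters for the example.)

import os
import urllib.request
from urllib.parse import urljoin, urlparse
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def mirror(start_url, max_depth=2, out_dir="mirror", follow_external=False):
    start = urlparse(start_url)
    start_dir = start.path if start.path.endswith("/") else os.path.dirname(start.path)
    seen, queue = set(), [(start_url, 0)]
    while queue:
        url, depth = queue.pop(0)
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url) as resp:
                body = resp.read()
        except (OSError, ValueError):
            continue
        # Save the page under a path derived from the URL.
        rel = urlparse(url).path.lstrip("/")
        if not rel or rel.endswith("/"):
            rel += "index.html"
        path = os.path.join(out_dir, urlparse(url).netloc, rel)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            f.write(body)
        # Queue links; skip other hosts and anything "above" the start point.
        parser = LinkParser()
        parser.feed(body.decode("utf-8", errors="replace"))
        for href in parser.links:
            target = urlparse(urljoin(url, href))
            if target.scheme not in ("http", "https"):
                continue
            if not follow_external and target.netloc != start.netloc:
                continue
            if not target.path.startswith(start_dir):
                continue   # don't loop higher in the site
            queue.append((target.geturl(), depth + 1))

mirror("http://www.example.com/dept/", max_depth=2)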
The downside is that it saves the pages in a proprietary format. The
pages are still easily viewed with Netscape or IE - there is no
proprietary browser involved. The database it builds to store the
files is very compact and reasonably easy to work with. It also makes
it very difficult for people to muck with the data, which is good for
us because we want a "true copy" of the site as it stood at a
particular time.
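(NetAttache's database format itself is proprietary; the fragment
below is only a generic sketch of the "true copy" idea - writing a
manifest of SHA-256 checksums for a mirrored directory so that any
later tampering with the saved pages can be detected. The directory
and manifest names are invented for the example.)

import hashlib, os

def write_manifest(mirror_dir, manifest="MANIFEST.sha256"):
    # One "digest  relative-path" line per saved file.
    with open(os.path.join(mirror_dir, manifest), "w") as out:
        for root, _dirs, files in os.walk(mirror_dir):
            for name in files:
                if name == manifest:
                    continue
                path = os.path.join(root, name)
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
                out.write("%s  %s\n" % (digest, os.path.relpath(path, mirror_dir)))

write_manifest("mirror")  # later, recompute and compare to spot changes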
Attache will also check for pages that have changed, and offers
unattended operation and a lot of scheduling flexibility. We use it
to hit weather and news sites automatically, for example.
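(Again, not NetAttache's own mechanism, but a sketch of how an
unattended "has this page changed?" check can work: run something like
this from cron or a Windows scheduler against a list of URLs, keeping
a small file of content hashes between runs. The URLs and file names
are placeholders.)

import hashlib, json, os, urllib.request

STATE_FILE = "page_hashes.json"
URLS = ["http://www.example.com/weather.html",
        "http://www.example.com/news.html"]

def check(urls, state_file=STATE_FILE):
    # Load hashes from the previous run, if any.
    state = {}
    if os.path.exists(state_file):
        with open(state_file) as f:
            state = json.load(f)
    for url in urls:
        try:
            body = urllib.request.urlopen(url).read()
        except OSError:
            continue
        digest = hashlib.sha256(body).hexdigest()
        if state.get(url) != digest:
            print("changed:", url)
        state[url] = digest
    with open(state_file, "w") as f:
        json.dump(state, f)

if __name__ == "__main__":
    check(URLS)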
We make copies of our sites using NetAttache so that we can keep a
copy of what we've done.
Earl Young
______________________________ Reply Separator _________________________________
Subject: Re: URL minder and website downloading softw
Author: pem at po.cwru.edu at INTERNET
Date: 5/29/97 1:14 AM
On Tue, 27 May 1997 15:44:13 -0700 toth-waddell at ontla.ola.org wrote:
> URL minders:
> I've heard that software exists which tells you not only when a web page
> has changed, but also what the change is. Does anyone know where I could
> find this software? If not, are there any regular types of URL minders
> anyone can recommend?
I'm sure there is software out there somewhere to do it, but I just use a
service on the net that will e-mail me when pages change. The URL is:
http://www.netmind.com/URL-minder/URL-minder.html
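(The URL-minder service only tells you *that* a page changed; if you
also want to see *what* changed, one do-it-yourself sketch is to keep
a cached copy of the page and diff it against the current version,
e.g. with Python's difflib. The URL and cache file name below are
placeholders.)

import difflib, os, urllib.request

def diff_page(url, cache_file):
    new = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    old = ""
    if os.path.exists(cache_file):
        with open(cache_file, encoding="utf-8") as f:
            old = f.read()
    # Print a unified diff of cached vs. current page text.
    for line in difflib.unified_diff(old.splitlines(), new.splitlines(),
                                     "cached", "current", lineterm=""):
        print(line)
    with open(cache_file, "w", encoding="utf-8") as f:
        f.write(new)

diff_page("http://www.example.com/index.html", "index.cache")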
> Website downloading:
> We would like to be able to download a complete website occasionally so
> that we can catalogue it and preserve it in some manner (diskette, server
> or print- we haven't fully explored these options yet). Does anyone know
> of any such software they can recommend? Has anyone had experience with
> trying to preserve complete sites?
I've been using Nicolai Langfeldt's w3mir program lately. It runs
under Perl plus some modules on a UNIX or Win32 system. More
information is at:
http://www.math.uio.no/~janl/w3mir/
Peter
--
Peter Murray, Library Systems Manager pem at po.cwru.edu
Digital Media Services http://www.cwru.edu/home/pem.html
Case Western Reserve University, Cleveland, Ohio W:216-368-5888