[Web4lib] http batch downloader that will submit cookies and post data
Mark Sandford
mark.autocat at gmail.com
Tue Aug 25 10:19:46 EDT 2009
Outwit Hub, a Firefox extension (http://www.outwit.com/), may be able
to do this. I'm not sure, but it seems pretty powerful.
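
If a dedicated downloader doesn't work out, a short script is another
option. The sketch below is only a rough illustration, using nothing but
Python's standard library; the somesite.com URLs and the page_NNN.html
output names are placeholders taken from the example in the question
quoted below. The cookie jar keeps whatever cookies the site sets and
sends them back on every later request.

    import urllib.request
    import http.cookiejar

    # A cookie jar remembers cookies set by the server and resends them
    # automatically on later requests made through this opener.
    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

    for age in range(1, 101):
        url = "http://www.somesite.com/?age=%d" % age  # placeholder URL
        with opener.open(url) as response:
            html = response.read()
        # Save each page to its own file: page_001.html ... page_100.html
        with open("page_%03d.html" % age, "wb") as out:
            out.write(html)

A variant that submits the data by POST instead of a query string is
sketched after the quoted message below.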
Mark Sandford
Special Formats Cataloger
William Paterson University
(973)270-2437
sandfordm1 at wpunj.edu
On Tue, Aug 25, 2009 at 6:50 AM, John Fitzgibbon <jfitzgibbon at galwaylibrary.ie> wrote:
> Hi,
>
> In the past, I have used a batch downloader like Download Accelerator Plus to download a number of web pages. Each web page has a URL with a different query string. For example, if I wish to download the files
>
> http://www.somesite.com/?age=1
> http://www.somesite.com/?age=2
> ...
> http://www.somesite.com/?age=100
>
> I can easily create a text file of such URLs and point the downloader at this file. The downloader then downloads each page, in turn, into a folder. I combine the resulting HTML files into one file and convert it to XML to extract the information I need.
>
> This will not work if the site requires a cookie to be submitted each time. None of the downloaders I have tried will submit a cookie. Is there a downloader that will do this?
>
> Secondly, if the page uses the POST method rather than the GET method to submit data to the server, specifying a file of URLs will not suffice. Are there downloaders out there that can POST data to a web server in batch download mode?
>
> I would appreciate any suggestions.
>
>
> Regards,
>
> John
>
>
>
> John Fitzgibbon
>
>
>
> w: www.galwaylibrary.ie
>
> e: info at galwaylibrary.ie
>
> p: 00 353 91 562471
>
> f: 00 353 91 565039
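
For the second question, the same cookie-aware opener can also POST form
data: when urllib is given a data argument, it sends the request as a
POST. Again, this is only a rough sketch; the field name "age" and the
target URL are assumptions based on the example URLs above.

    import urllib.parse
    import urllib.request
    import http.cookiejar

    jar = http.cookiejar.CookieJar()
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

    for age in range(1, 101):
        # urlencode the form fields; passing data makes urllib issue a POST
        data = urllib.parse.urlencode({"age": age}).encode("ascii")
        with opener.open("http://www.somesite.com/", data) as response:
            html = response.read()
        with open("post_page_%03d.html" % age, "wb") as out:
            out.write(html)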