[WEB4LIB] RE: Huge file delivery
Mike Beccaria
mike.beccaria at pictometry.com
Thu Apr 28 18:40:59 EDT 2005
Whoops. Someone pointed out my silly mistake... I have never sent 20-30 GB file sets over FTP; I read the message too fast and assumed they were MBs. I do think some FTP clients and servers have trouble with files larger than 2 GB. Would it be possible to use an archiving utility (WinRAR comes to mind) to break each file into smaller chunks and then transfer those over FTP?
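Something along these lines is the rough idea, as a sketch in Python (the .part naming and the 1 GB chunk size are just illustrative choices; WinRAR's own split archives would do the same job with checksums built in):

# Minimal sketch of the "split, transfer, reassemble" approach.
# The plain byte-level split below stands in for WinRAR's split-archive
# feature; chunk size and file naming are illustrative, not prescriptive.
import os

BUFFER = 64 * 1024  # copy in 64 KB blocks to keep memory use low

def split_file(path, chunk_size=1024 ** 3):
    """Split `path` into sequential .partNNN files of at most chunk_size bytes."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            remaining = chunk_size
            part_path = "%s.part%03d" % (path, index)
            with open(part_path, "wb") as dst:
                while remaining > 0:
                    block = src.read(min(BUFFER, remaining))
                    if not block:
                        break
                    dst.write(block)
                    remaining -= len(block)
            if remaining == chunk_size:  # nothing written: end of input reached
                os.remove(part_path)
                break
            parts.append(part_path)
            index += 1
    return parts

def join_parts(parts, out_path):
    """Reassemble the chunks on the receiving side, in order."""
    with open(out_path, "wb") as dst:
        for part_path in parts:
            with open(part_path, "rb") as src:
                while True:
                    block = src.read(BUFFER)
                    if not block:
                        break
                    dst.write(block)

The chunks can then be sent over FTP individually and joined back together at the other end.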
Mike
-----Original Message-----
From: Mike Beccaria
Sent: Thu 4/28/2005 5:06 PM
To: Multiple recipients of list
Cc:
Subject: [WEB4LIB] RE: Huge file delivery
Andrew,
I have used FTP to transfer files of this size pretty often. The NY State GIS web page often has files that exceed this size and they use FTP for their transfers as well. See here for an example: http://www.nysgis.state.ny.us/gateway/mg/livingston_download.htm
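In case it helps, here is a minimal sketch of that kind of transfer using Python's ftplib; the host, login, and file path below are placeholders, not the actual NY State GIS server:

# Sketch of a resumable FTP download with ftplib. The REST offset lets an
# interrupted multi-gigabyte transfer pick up where it left off.
# Host, credentials, and paths are placeholders.
import os
from ftplib import FTP

def download_with_resume(host, remote_path, local_path,
                         user="anonymous", passwd="guest"):
    """Download remote_path, resuming from any partial local copy."""
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    ftp = FTP(host)
    ftp.login(user, passwd)
    with open(local_path, "ab") as out:
        # rest=offset asks the server to start sending from byte `offset`.
        ftp.retrbinary("RETR " + remote_path, out.write,
                       blocksize=64 * 1024, rest=offset)
    ftp.quit()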
Best of luck!
Mike
-----Original Message-----
From: A. Bullen [mailto:abullen at ameritech.net]
Sent: Thu 4/28/2005 4:39 PM
To: Multiple recipients of list
Cc:
Subject: [WEB4LIB] Huge file delivery
All--
Forgive a naive question, but I have never had to deal with the
following situation and I don't know how to pull it off. We will be
receiving a very large GPS data set consisting of files that total
about 1 terabyte altogether; I think the individual data sets are
20-30 GB apiece.
Does anyone have a suggestion for how I can successfully distribute
files this large on an on-demand basis? I can put them on servers that
share a T-3 line, but I am not sure FTP can handle transfers of this
size and scope.
Like I said, probably a naive question, but any insight would be helpful.
Andrew Bullen
Illinois State Library