Re: wget POST 'multipart/form-data' problem (win xp sp2)

2006-12-21 Thread Denis Golovan
Gerhard Blum [EMAIL PROTECTED] news:[EMAIL PROTECTED]
 hi,

 i'm wondering if my mail from 2 weeks ago was received...
 or am i too silly to ask correctly, or is this problem off topic?
 could someone please be so kind as to send a short reply - thanks a lot

  Hi, Gerhard Blum. I am also a new reader of this mailing list; it
seems to me that everyone here is dead.
Sorry, I can't help you with your problem.




Wget timestamping is flawed across timezones

2006-12-21 Thread Remko Scharroo

Dear wget developers.

I'm sure this has been reported before, and I've seen references to  
it going back all the way to 2003 but the problem I'm facing is still  
there in wget version 1.10.2.


When I turn on --timestamping I expect, as the manual says, that the
time stamps are preserved. But that is not the case. The time stamp is
only preserved when the FTP server and the machine you are running
wget on are in THE SAME TIMEZONE. There are many cases where that does
not hold:
- some FTP servers choose to run in the UTC (GMT) time zone, no
matter where they are located;
- when I am in the US and I download from Europe, the files I
download are off by 6 hours.


Wget clearly tries to match the time stamp that it gets from
the .listing file. But that file gives the times of the files in
the server's time zone, not in the time zone wget is running in.


Can this be fixed?

Regards,
Remko



Re: Wget timestamping is flawed across timezones

2006-12-21 Thread Steven M. Schweda
From: Remko Scharroo:

 Can this be fixed?

   Of course it can be fixed, but someone will need to fix it, which
would involve defining the user interface and adding the code to do the
actual time offset.  I assume that the user will need to specify the
offset.

   For an indication of what could be done, you might look for
WGET_TIMEZONE_DIFFERENTIAL in my VMS-adapted src/ftp-ls.c:
ftp_parse_vms_ls().

  http://antinode.org/dec/sw/wget.html

   This is a common problem on VMS systems, which normally (sadly) use
local time instead of, say, UTC.  One result of this is that FTP servers
on VMS tend to provide file date-times in the server's local time.

   I chose to add an environment variable (a VMS logical name on a VMS
system) as the user interface for code simplicity (less work for me),
and partly because VMS uses a similar logical name
(SYS$TIMEZONE_DIFFERENTIAL) to specify the offset from UTC to local
time, so the concept would already be familiar to a VMS user.

   I use WGET_TIMEZONE_DIFFERENTIAL in the code only for a VMS FTP
server, but I assume that it could easily be adapted to the other
ftp_parse*_ls() functions.  (Or a new command-line option could be used
to specify the offset.)  When I did the work, I probably didn't consider
the possibility that any non-VMS FTP servers would provide file
date-times in non-UTC.  Otherwise I might have made it more general.

   Trying to get my VMS-related changes into the main Wget development
stream has been sufficiently unsuccessful that I don't spend much time
working on adding features and fixes which are not trivially easy and
which I don't actually need myself.  But I wouldn't try to discourage
anyone else.



   Steven M. Schweda   [EMAIL PROTECTED]
   382 South Warwick Street(+1) 651-699-9818
   Saint Paul  MN  55105-2547


re: problem at 4 gigabyte mark downloading wikipedia database file.

2006-12-21 Thread Jonathan Bazemore
Hello,

I am a former computer tech, and I've followed all
instructions closely regarding wget.  I am using wget
1.9 in conjunction with the wgetgui program.  

I have confirmed resumability with smaller binary
files, up to 2.3 gigabytes in size.  

What happens is that when downloading the Wikipedia
database, which is about 8 gigabytes, using wget, the
download proceeds and is resumable up to about the
4-gigabyte mark; then, when I attempt resumption, the
internet connection appears to be working, but the
file just sits there and doesn't increase in size.

I theorize that the datastream is being corrupted, and
my next step will be to shave pieces off the end of
the file, in several-megabyte increments, until I
reach the uncorrupted part.
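[The "shave the tail, then resume" plan above can be done from the command line. This is a sketch under assumptions: the file name and URL are placeholders, and truncate/stat as used here are GNU coreutils tools:

```shell
# Demo stand-in for the partial download: a 20 MiB file.
dd if=/dev/zero of=partial.bin bs=1M count=20 2>/dev/null

# Drop the last 10 MiB, i.e. the possibly-corrupt tail (GNU truncate).
truncate -s -10M partial.bin

stat -c %s partial.bin        # size is now 10 MiB (GNU stat)

# Then resume from the shortened offset (placeholder URL):
# wget -c -O partial.bin http://example.org/bigfile.bin
```

Note that whether resuming past 4 GB works at all also depends on the wget build itself handling large files. -- ed.]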

Please let me know what's going on and why this is
happening, at this email address, as I am not a
developer and not currently subscribed to the mailing
list, but I do need to have wget working properly to
get the database.

Thanks,

Jonathan.
