Nick <[email protected]> writes:

> Quoth Troels Henriksen:
>> Nick <[email protected]> writes:
>> 
>> > Howdy,
>> >
>> > I recently got around to updating the patch posted to the list in 
>> > July last year by Carlos, which builds downloading into surf rather 
>> > than relying on e.g. wget. While this sounds like a bad idea, it 
>> > isn't, as it's needed for horrible javascript sorts of downloads, 
>> > such as from rapidshare and megaupload.
>> 
>> Huh?  I have not had trouble using wget for this.
>
> Really? That surprises me. I suppose it may be the fault of my 
> slightly baroque shell script...
>
> Can anyone else confirm / deny this?

The only problem I've had is having to add --no-check-certificate to
wget.  Apart from that, any download, even one initiated by the browser,
is authenticated solely through cookies (what would a "javascript
download" be?).  In theory, a download might be authenticated via
session cookies or form parameters, which would be harder to pass to
wget (but not impossible), although I don't recall seeing this in the
wild.
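
For reference, the handoff itself can be as simple as something along
these lines (a rough sketch only; the cookie-file path and URI are
placeholders, not what surf actually uses):

/* Sketch: hand a URI straight to wget with the browser's cookie jar.
 * Error and zombie handling are omitted. */
#include <unistd.h>

static void
download(const char *uri)
{
    if (fork() == 0) {
        /* --load-cookies gives wget the cookies the browser holds;
         * --no-check-certificate is the workaround mentioned above. */
        execlp("wget", "wget", "--no-check-certificate",
               "--load-cookies", "/home/user/.surf/cookies.txt",
               uri, (char *)NULL);
        _exit(1);
    }
}

int
main(void)
{
    download("https://example.com/file.tar.gz");
    return 0;
}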

Actually, now that I think about it, there is another potential problem:
wget may guess the filename from the URI, and for some URIs this
results in a name that is too long for the file system to handle.  There
are two solutions to this:

1) Use webkit_download_get_suggested_filename and pass the string on to
   wget.  This almost always works.

2) Let the user input a file name (via dmenu or whatever) for all
   downloads.

I have done this in my own Surf fork and it works well.
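
Roughly, (1) and (2) combine like this in a WebKit1 "download-requested"
handler (again only a sketch, not the exact code from my fork; the
handler name is illustrative, and shell quoting and error handling are
glossed over):

/* Prefill dmenu with WebKit's suggested name, let the user edit it,
 * then hand the URI and chosen name off to wget (see above). */
#include <stdio.h>
#include <string.h>
#include <webkit/webkit.h>

static gboolean
downloadrequested(WebKitWebView *view, WebKitDownload *dl, gpointer data)
{
    char name[256] = "", cmd[512];
    FILE *p;

    snprintf(cmd, sizeof(cmd), "printf '%%s' '%s' | dmenu -p 'Save as:'",
             webkit_download_get_suggested_filename(dl));
    if ((p = popen(cmd, "r"))) {
        if (fgets(name, sizeof(name), p))
            name[strcspn(name, "\n")] = '\0';
        pclose(p);
    }

    fprintf(stderr, "download %s as %s\n",
            webkit_download_get_uri(dl), name);

    /* Return FALSE so WebKit doesn't start its own download;
     * the external wget takes over instead. */
    return FALSE;
}

The handler gets connected to the view's "download-requested" signal
with g_signal_connect(); from there the URI and the chosen name go to
wget the same way as in the earlier sketch.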

-- 
\  Troels
/\ Henriksen
