Hi,

I have a question regarding wget; could you please help me work through the
problem?

Issue: when I pass a 300 MB file to wget in one shot, it is not able to
download the file on the client side.

Problem background:

In the normal scenario, when we click on a file to download it, the file
arrives in the form of chunks. That is, we receive small chunks at the client
side and then reassemble the PDUs to compose the whole file.
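
Just to be clear about what I mean by chunk handling, it is roughly this
minimal Python sketch (the URL, filename, and chunk size are placeholders I
made up for illustration):

    import urllib.request

    URL = "http://example.com/big.pdf"   # hypothetical URL, for illustration only
    CHUNK = 64 * 1024                    # read the stream 64 KB at a time

    with urllib.request.urlopen(URL) as response, open("big.pdf", "wb") as out:
        while True:
            chunk = response.read(CHUNK)   # one small chunk from the network
            if not chunk:
                break                      # end of stream: whole file reassembled
            out.write(chunk)               # append the chunk to compose the file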

Now, in my scenario, I block the file at the gateway, store it in a temp
directory, and once the download completes on the gateway machine, I push it
to the internal client.
Can wget handle such a huge amount of data in one shot?

In chunk format, wget is able to handle the data; but what will happen when I
push the whole file in one shot?
Does wget have a buffer feature where it holds the stream? If so, then by
increasing or specifying the buffer limit, I think we can overcome the
issue.
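
To make concrete what the gateway side is doing when it pushes the stored
file, here is a rough sketch; the host name, port, temp-dir path, and buffer
size are all made-up placeholders:

    import shutil
    import socket

    BUF = 64 * 1024  # push in fixed-size buffers, not all 300 MB in memory at once

    # "internal-client", port 9000, and the temp-dir path are hypothetical values.
    with socket.create_connection(("internal-client", 9000)) as conn:
        with conn.makefile("wb") as client_stream, \
                open("/tmp/gateway/big.pdf", "rb") as stored:
            shutil.copyfileobj(stored, client_stream, BUF)  # stream file to client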

Can you please help in identifying the answers to the questions below and a
solution to the problem?

Questions:
1. Does wget have any buffer feature?
2. How are the data streams (chunks) handled by wget?
3. Can wget handle a large amount of data (in the MB or GB range)?
4. What is wget's architecture for handling data, and how does wget download
the file on the client side?

Looking forward to hearing from you.

Thanks
-Manish
