Update of bug #60246 (project wget):
Severity: 3 - Normal => 4 - Important
Status: Confirmed => In Progress
Assigned to: None => darnir
_______________________________________________________
Follow-up Comment #2:
Okay, this is a crazy issue, and I'm surprised it has gone unreported for so
long! It has likely existed for 15 years, if not more.
The problem is simple:
To extract links from a downloaded document, we pass the parser the file
descriptor of the file the document was saved to and re-read it from there.
So when you combine -p with -O-, that descriptor is stdout, and Wget ends up
waiting indefinitely on a read() call that will _NEVER_ return anything.
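
A minimal sketch of the failure mode (not Wget's actual code, just an
illustration; the function name is made up): link extraction tries to read
the document back through the same descriptor it was written to, and with
-O- that descriptor is stdout.

#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Hypothetical stand-in for the parsing step: re-read the document
   through the descriptor it was just written to.  */
static void
parse_links_from_fd (int fd)
{
  char buf[4096];
  ssize_t n;

  /* A regular file can be rewound; a terminal or a pipe cannot.  */
  if (lseek (fd, 0, SEEK_SET) == (off_t) -1)
    perror ("lseek");

  /* This is the indefinite wait: the bytes we wrote never come back.  */
  n = read (fd, buf, sizeof buf);
  printf ("read() returned %zd\n", n);
}

int
main (void)
{
  const char *doc = "<html><a href=\"style.css\">css</a></html>\n";

  /* "Download": write the document to stdout, as -O- does.  */
  if (write (STDOUT_FILENO, doc, strlen (doc)) < 0)
    perror ("write");

  /* Then try to parse it back through the same descriptor.  */
  parse_links_from_fd (STDOUT_FILENO);
  return 0;
}

Run with stdout on a terminal and the read() blocks until something is
typed; with stdout redirected into a pipe it fails outright. Either way the
document we just wrote never comes back, which is exactly the hang seen when
combining -p with -O-.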
In order to handle -p and -O properly, I think we should download to a
temporary file, parse that, and then append its contents to the output file.
This is especially important when using -O-. When writing to a real file, we
could probably optimize this by writing directly to the file and then calling
fseek() so that only the newly downloaded part of the file is parsed.
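
Here is a rough sketch of the temporary-file approach (again, not a patch,
just the shape I have in mind; retrieve_to() and extract_links() are
hypothetical stand-ins for the real retrieval and HTML parsing code):

#include <stdio.h>

/* Stand-in for the HTTP retrieval: write the document to DEST.  */
static int
retrieve_to (FILE *dest, const char *url)
{
  (void) url;
  return fputs ("<html><a href=\"style.css\">css</a></html>\n", dest) >= 0 ? 0 : -1;
}

/* Stand-in for link extraction / queuing of page requisites.  */
static void
extract_links (FILE *doc)
{
  (void) doc;
}

/* Download URL into an unlinked temp file, parse the links from it,
   then append the document to the user-visible output stream.  */
static int
download_and_parse (const char *url, FILE *output)
{
  char buf[BUFSIZ];
  size_t n;
  FILE *tmp = tmpfile ();   /* seekable scratch file, deleted on close */

  if (!tmp)
    return -1;

  if (retrieve_to (tmp, url) != 0)
    {
      fclose (tmp);
      return -1;
    }

  /* Parsing happens on the temp file, where rewinding is always safe.  */
  rewind (tmp);
  extract_links (tmp);

  /* Only now does the document reach the -O target (a file or stdout).  */
  rewind (tmp);
  while ((n = fread (buf, 1, sizeof buf, tmp)) > 0)
    fwrite (buf, 1, n, output);

  fclose (tmp);
  return 0;
}

int
main (void)
{
  return download_and_parse ("http://example.com/", stdout) == 0 ? 0 : 1;
}

For a real -O file, the fseek() optimization mentioned above would avoid the
extra copy: remember the file offset before the download starts, then seek
back to it afterwards and parse only the newly written portion.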
I would wager this is also an issue with -r -O-.
_______________________________________________________
Reply to this item at:
<https://savannah.gnu.org/bugs/?60246>
_______________________________________________
Message sent via Savannah
https://savannah.gnu.org/