how to parse a webpage to download links of certain type?
Hi all, I'm sure wget has a way in which one can parse a webpage and download certain filetypes. Say, something like looking for .odf filetypes in this webpage? If something like this has been asked before, then a link would be good enough. -- Regards, Shirish Agarwal This email is licensed under http://creativecommons.org/licenses/by-nc/3.0/ 065C 6D79 A68C E7EA 52B3 8D70 950D 53FB 729A 8B17
Re: how to parse a webpage to download links of certain type?
On Sun, Mar 9, 2008 at 3:49 PM, shirish [EMAIL PROTECTED] wrote: Hi all, I'm sure wget has a way in which one can parse a webpage and download certain filetypes. Say, something like looking for .odf filetypes in this webpage? CMIIW, but I guess the command for this would be wget -r -l 1 -A .odf http://site-url --- Charles
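Charles's one-liner can be read flag by flag (a sketch assuming GNU wget; http://site-url stands in for the real page URL):

```shell
# -r       recurse into links found on the fetched page
# -l 1     limit recursion to depth 1, i.e. only links on the page itself
# -A .odf  accept only files whose names end in .odf; other pages wget
#          fetches while crawling are deleted after their links are followed
wget -r -l 1 -A .odf http://site-url
```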
Re: how to parse a webpage to download links of certain type?
Charles, Pretty cool, this works :) wget -r -l 1 -A .odf http://site-url -- Regards, Shirish Agarwal
Re: how to parse a webpage to download links of certain type?
Hi all, Charles, there is one thing though: what it does is create directories and sub-directories. I want to have the files in the same directory where I'm running wget, not in sub-directories. Is that possible? -- Regards, Shirish Agarwal
Re: how to parse a webpage to download links of certain type?
From: shirish [...] not directories [...] alp $ wget -h [...] Directories: -nd, --no-directories don't create directories. [...] Sounds as if it may be worth a try. Steven M. Schweda [EMAIL PROTECTED] 382 South Warwick Street (+1) 651-699-9818 Saint Paul MN 55105-2547
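Putting the two suggestions together, the whole thread boils down to a single command (again a sketch assuming GNU wget, with http://site-url as a placeholder):

```shell
# -nd (--no-directories) flattens the output: every .odf file lands in
# the current working directory instead of mirroring the site's tree
wget -r -l 1 -nd -A .odf http://site-url
```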