got it!

C:\>wget.exe -O file.txt "http://en.wikipedia.org/wiki/Wget"
--2008-12-23 21:04:59--  http://en.wikipedia.org/wiki/Wget
Resolving en.wikipedia.org... 208.80.152.2
Connecting to en.wikipedia.org|208.80.152.2|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 61182 (60K) [text/html]
Saving to: `file.txt'

100%[==============================================================================>] 61,182       107K/s   in 0.6s

2008-12-23 21:05:00 (107 KB/s) - `file.txt' saved [61182/61182]

C:\>

Thanks!
- Rich



----- Original Message ----
From: "[email protected]" <[email protected]>
To: [email protected]
Sent: Tuesday, December 23, 2008 12:57:37 PM
Subject: Re: [Mediawiki-l] maintenance script to append a new section to a page

Never mind libwww-perl's GET; let's use the better-known wget:
$ wget -O file.txt 'http://en.wikipedia.org/w/index.php?title=Not_Dead_Yet&action=raw'
--2008-12-24 01:52:21--  http://en.wikipedia.org/w/index.php?title=Not_Dead_Yet&action=raw
Resolving en.wikipedia.org... 208.80.152.2
Connecting to en.wikipedia.org|208.80.152.2|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1732 (1.7K) [text/x-wiki]
Saving to: `file.txt'

100%[======================================>] 1,732       --.-K/s   in 0.04s  

2008-12-24 01:52:22 (45.4 KB/s) - `file.txt' saved [1732/1732]

Anyway, your mission is to save the content of URLs like the above into a file.
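A minimal sketch of that, assuming you have a list of page titles (the raw_url helper is hypothetical, just to show how the action=raw URL is built; titles must already be in MediaWiki form, i.e. underscores instead of spaces):

```shell
#!/bin/sh
# Hypothetical helper: build the action=raw URL for a given page title.
raw_url() {
  echo "http://en.wikipedia.org/w/index.php?title=$1&action=raw"
}

# Fetch each title's raw wikitext into its own file (network step,
# shown for illustration only):
# for t in Wget Not_Dead_Yet; do
#   wget -O "$t.txt" "$(raw_url "$t")"
# done
```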

_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l



      

