Never mind libwww-perl's GET, let's use the better-known wget:

$ wget -O file.txt 'http://en.wikipedia.org/w/index.php?title=Not_Dead_Yet&action=raw'
--2008-12-24 01:52:21--  http://en.wikipedia.org/w/index.php?title=Not_Dead_Yet&action=raw
Resolving en.wikipedia.org... 208.80.152.2
Connecting to en.wikipedia.org|208.80.152.2|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1732 (1.7K) [text/x-wiki]
Saving to: `file.txt'
100%[======================================>] 1,732       --.-K/s   in 0.04s

2008-12-24 01:52:22 (45.4 KB/s) - `file.txt' saved [1732/1732]

Anyway, your mission is to save URLs like the above into a file.

_______________________________________________
MediaWiki-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/mediawiki-l
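If you want to build such action=raw URLs for arbitrary page titles (which may contain spaces or special characters), a small Python sketch like the following can percent-encode the title for you; the `raw_url` helper name is my own, not something from MediaWiki:

```python
from urllib.parse import urlencode

def raw_url(title, base="http://en.wikipedia.org/w/index.php"):
    # urlencode takes care of spaces and other characters unsafe in URLs
    return base + "?" + urlencode({"title": title, "action": "raw"})

print(raw_url("Not Dead Yet"))
# http://en.wikipedia.org/w/index.php?title=Not+Dead+Yet&action=raw
```

The resulting URL can then be handed to wget -O as above, or fetched directly from Python with urllib.request.urlopen.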
