On Mon, 2004-11-08 at 21:28, Alasdair Tennant wrote:
> On Mon, 08 Nov 2004 00:08:43 +1300
> Ross Drummond <[EMAIL PROTECTED]> wrote:
> > It will be a cold day in hell before I knuckle under to the monopoly
> > abuse by telecom of the ADSL sector.
> >
> > Big ISO to download? Just leave your modem running all night with
> > wget and go off line when you get up. It may take several nights to
> > get the ISO, but who rings in the middle of the night?
>
> I like your attitude!
>
> Seems to me that the underlying assumption that if you're serious about
> computers you must have broadband is simply propaganda spread by vendors
> who wish to ease their workload.
> You lucky sods in Christchurch with cheap cable etc. quickly forget that
> it's not available to all.
>
Not everyone has cheap cable; it doesn't reach all suburbs, especially
mine. In point of fact I went to ADSL at work because it is cheaper than
dial-up for a business (Telecom charge 3c per minute to dial your ISP;
I did the math at the time). And face it, it's fun to have broadband!

> Linux was born, raised and thrived in an atmosphere where MODEMs were
> king. I'm sure that there are many tricks out there that I've never
> heard of that can help us in the current world of big downloads.
>
> Those of us who are still on dial-up - for whatever reason - need some
> pointers!
>
> A very few come to my mind, though I don't yet know how to maximise
> benefit from them.
> wget
> rsync
>
> More, anyone? Especially including case studies and examples. I would
> like to know how to do the above job unattended. Say, set up a cron job
> to connect (say at 1 am), chug through a list of downloads, and
> disconnect at, say, 6:00 am, finished or not, to be continued the next
> night unless manually stopped or run out of downloads.

wget will take a list of files to download from a file, e.g. wget -i
filelist, where filelist contains the URLs to download. Look at the at
command if you want to start a program at a specific time on a one-off
basis, or cron if you want to do it periodically. It's not particularly
difficult to write a script that dials in, starts wget on a list of
files, and triggers something to kill wvdial and pppd at 6:00 am. If it
were refined, it could delete each URL from the list as it finished
downloading, and not bother dialling if the list was empty. There's a
rough sketch of such a script at the end of this message.

>
> Another task is to wait till the wee hours to download all email, but
> that's a whole other topic . . .
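
For what it's worth, here is a rough sketch of the sort of script I
mean. The wvdial section name ("myisp"), the file paths and the 60
second wait for the link to come up are all assumptions you would need
to adjust for your own setup. It fetches one URL at a time with wget -c
so it can prune the list as it goes, and a second cron entry makes sure
everything is killed off by 6 am whether it has finished or not.

#!/bin/sh
# nightdl.sh -- dial in, work through a list of URLs, hang up by 6 am.
# Paths and the wvdial section name below are only examples.

LIST=/home/me/downloads.list        # one URL per line
DEST=/home/me/incoming              # where the files end up
LOG=/home/me/nightdl.log

# Nothing to fetch?  Don't bother dialling.
[ -s "$LIST" ] || exit 0

# Bring the link up in the background, then give ppp a crude
# 60 seconds to negotiate before we start downloading.
wvdial myisp >> "$LOG" 2>&1 &
sleep 60

# Work through the list one URL at a time, continuing partial files
# (-c), and drop each URL from the list once it has come down whole.
while [ -s "$LIST" ]; do
    URL=`head -n 1 "$LIST"`
    if wget -c -P "$DEST" "$URL" >> "$LOG" 2>&1; then
        sed -e '1d' "$LIST" > "$LIST.tmp" && mv "$LIST.tmp" "$LIST"
    else
        break    # link probably dropped; pick it up again tomorrow night
    fi
done

# Hang up.
killall wvdial pppd 2>/dev/null

And the crontab entries to drive it:

# start at 1 am, and make sure the line is down by 6 am regardless
0 1 * * * /home/me/nightdl.sh
0 6 * * * killall wget wvdial pppd 2>/dev/null

If you would rather use at for a one-off run, something like
echo "killall wget wvdial pppd" | at 6am does the same job as the
second cron line for that night only.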
