hi all,
I'll get straight to the point.
I was doing some file manipulation in Linux and need utilities that
help me with the following two tasks...
1. to split a big file into parts of unequal sizes, e.g. a 1000 kB file
split into 1-100 kB, 100-123 kB, 123-500 kB, etc. ('split' only makes
equal-sized pieces, and 'cut' extracts fields from each line, not byte
ranges of the whole file...!)
2. secondly, and more importantly, to download only part of a big file
from an FTP (or HTTP) server, i.e. fetch only bytes 100-200 kB of a 1 MB
file on the net. ('wget -c' can only resume a partial download; it
doesn't seem to have an option for stopping a download at a particular
point..)
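For the first task, 'dd' can carve out an arbitrary byte range on its
own; here is a minimal sketch (the file and part names are placeholders,
and the 10-byte demo file just stands in for a real 1000 kB one):

```shell
# carve a byte range out of a file with dd:
#   skip  = start offset in bytes (since bs=1)
#   count = length of the slice in bytes
# bs=1 keeps the arithmetic exact, at some speed cost on big files.
printf 'ABCDEFGHIJ' > bigfile       # 10-byte stand-in for the real file
dd if=bigfile of=part2 bs=1 skip=3 count=4 2>/dev/null
cat part2                           # -> DEFG

# the same slice with tail/head (tail -c +N is 1-based):
tail -c +4 bigfile | head -c 4      # -> DEFG
```

For the 100-123 kB slice that would be skip=102400 count=23552; when the
offsets are multiples of a block size, a larger bs is much faster, e.g.
bs=1k skip=100 count=23 for the same range.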
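For the second task, curl's -r/--range option requests only a byte slice
of a remote file; on HTTP it sends a Range request (the server must
support it), and on FTP it uses the REST command. A sketch, with a
placeholder URL and a local file:// demo of the same option:

```shell
# remote form (placeholder URL; server must honour range requests):
#   curl -r 102400-204799 -o part.bin http://example.com/bigfile
# the same option works for ftp:// URLs.

# demo of -r against a local file:// URL:
printf 'ABCDEFGHIJ' > bf.txt
curl -s -r 3-6 -o slice.bin "file://$PWD/bf.txt"
cat slice.bin                       # range 3-6 is inclusive -> DEFG
```

102400-204799 is the 100-200 kB slice you describe; both endpoints of
the range are inclusive.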
any ideas??
(a pointer to a shell/perl script, or a perl module would be fantastic
too..!)
tia.
affly
robins
_______________________________________________
ilugd mailing list
[EMAIL PROTECTED]
http://frodo.hserus.net/mailman/listinfo/ilugd