Hello Linda,

what you need in order to give your changes back to GNU wget is to sign a copyright assignment to the FSF; this is the only "overhead". And of course the changes must be accepted.
Cheers,
Giuseppe

Linda Walsh <[email protected]> writes:

> If we wanted to modify wget and check changes back in, what type of
> overhead is there in dealing with that process if the package is in
> the gnu system?
>
> I can see the necessity for a gate-keeper to keep junk out of wget, but
> that operation -- including code review, build testing, etc. -- is
> something more knowable than the processes of dealing with a gnu
> project. They may be the same, but I don't know, as there is a resume
> process for doing work on a gnu project that looks off-putting for
> casual changes...
>
> Not that it'd be a casual change necessarily, but it'd be a neat
> feature, as I sit here with a per-session bandwidth-capped download:
> the ability to break up a large download into "N" chunks and have wget
> automatically determine the sizes and download the chunks in parallel
> (which in my case would multiply bandwidth by # chunks :-).
>
> I've always wondered why the megaupload manager had that feature built
> into it -- I can't imagine they would cap it on their end, but maybe
> some customers are capped per stream? Dunno... Not something I've
> ever used.
>
> But it could be an interesting side project ... if I made the time... :-)
>
> Linda Walsh
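For anyone curious, the chunked-download idea in the quoted message can be sketched quickly. This is a hypothetical illustration, not an existing wget feature: the byte-range arithmetic follows the standard HTTP Range header semantics (inclusive start-end offsets), and `fetch_chunk` shows one plausible way to request a single range by spawning wget with a `--header` option. The function names and the overall wrapper are assumptions for the sketch.

```python
# Sketch of a parallel chunked download, assuming the server honors HTTP
# Range requests (it replies 206 Partial Content per range). Hypothetical
# helper names; wget's real "--header" option is used only to pass the
# Range header for one chunk.
import subprocess

def byte_ranges(total_size, n_chunks):
    """Split [0, total_size) into n_chunks contiguous (start, end) pairs,
    with end inclusive, as the HTTP Range header expects."""
    base, extra = divmod(total_size, n_chunks)
    ranges = []
    start = 0
    for i in range(n_chunks):
        size = base + (1 if i < extra else 0)  # spread the remainder
        ranges.append((start, start + size - 1))
        start += size
    return ranges

def fetch_chunk(url, start, end, out_file):
    """Start one wget process that fetches only bytes start..end."""
    return subprocess.Popen(
        ["wget", "-q", "-O", out_file,
         "--header", "Range: bytes=%d-%d" % (start, end), url])

# e.g. byte_ranges(10, 3) -> [(0, 3), (4, 6), (7, 9)]
```

The downloader would first learn the total size (e.g. from a HEAD request's Content-Length), launch one `fetch_chunk` process per range, wait for all of them, and concatenate the chunk files in order.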
