On Thursday 06 March 2003 22:00, Paul Dorman wrote:
That's interesting. There seem to be a bunch of projects applying p2p in interesting and imaginative ways, so perhaps any problems wouldn't last for long... The Linux community is getting bigger all the time; there has to be some threshold past which p2p could be effective.
Or what about some kind of p2p solution? Where xxxx-light machines are
networked to and updated from other xxxx-light machines across the net?
Checksumming and other tools could be used to address security concerns.
You know, I almost took a job working for a company that thought the time for this had come a year and a half ago... Maybe it is more doable now, at least for open source software (you don't have to worry about how to bill people, how to force users to stay online whenever possible, etc.), but it is still a major project, and there are problems that nobody has yet solved.
On the one hand, an open source project can just use an existing protocol (say, gnutella) rather than building something new from scratch, and doesn't need to worry about billing, etc. And just distributing SHA URIs on official mirrors would be enough to search for the file online and verify that you've downloaded the right one (and of course RPM signatures provide security on top of that).

Good, good. I was thinking something based on Gnutella. Many of the clients have built-in discussion and chat facilities, as well as administrative tools. Lots to build off there.
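The client-side check for this is simple: hash the downloaded package and compare against the digest published on the official mirror. A minimal sketch, assuming SHA-1 digests as mentioned above (the helper name and calling convention are illustrative, not an existing Mandrake tool):

```python
import hashlib

def verify_download(path, expected_sha1):
    """Compare a downloaded file's SHA-1 digest against the published one.

    Hashes the file in chunks so even large ISOs don't need to fit in memory.
    Returns True only if the digests match exactly (case-insensitively).
    """
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest() == expected_sha1.lower()
```

RPM signatures would still be checked on top of this; the digest only proves you fetched the same bytes the mirror advertised.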
I think MandrakeSoft would be the ones to do it. The installer *is* looking pretty slick -- perhaps they have some spare developers looking for something to do ;oP The network, the tools needed to make it work, and the active community would be a great asset for the company. There's a lot of people using this distro, and the number of potential participants is growing all the time. Your CPU cycles, storage, and bandwidth could be a way of giving back to the community...
But on the other hand, where does the network come from? If you build a new p2p network from scratch, you need to get people online. Most users won't be connected to the network except when they're in the middle of their own upgrade. If you use, say, the existing gnutella network, you have the advantage that every Mandrake user who's using gtkg, qtella, limewire, etc. (assuming they've added their package repository to their p2p upload directory list) is available--but the disadvantage that most of the people on the network don't have the files you want.
I think a separate network would be required - as then specialist functions particular to the purpose (such as developers flagging bugs they are working on, checking package integrity, etc.) can be done without the restrictions imposed by the capabilities of current Gnutella clients. Perhaps as the generic clients get more modular MandrakeNetwork plugins would be the thing...
Clearly the more machines the better....
Either way, you'll probably still need mirror sites--and I'm guessing it's much easier to find someone who will run ftp, rsync, and/or http mirrors than finding someone who will attach their mirror server to either a brand-new p2p network or the existing gnutella network....
True about the force-upgrade, but this doesn't restore the machine to its former state. When you upgrade, the packages you are replacing should be archived somewhere on your network if possible, so you have everything right there if something doesn't work right. Remember that storage is getting cheaper all the time; a big install on my systems now uses less than 5% of my disk space. My personal feeling is that reverting should be done through a separate micro-install somewhere on the system, accessed through the bootloader, so that even fatal upgrades can be easily undone (and oh, haven't we all been there!). And if you are going from one green-light system to a yellow or green one, there isn't a lot you'd have to store... All people will need is reasonable assurance that the changes were successful and not detrimental to the functionality of their machines.

Oh, and I think that packages should be revertable on installed systems as well. Users should be protected against unstable software wherever possible, but at the same time they will demand the very latest releases.
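The "archive before you upgrade" step is cheap to sketch. This is a toy illustration, not a real urpmi feature: the function name is made up, and a real implementation would pull the old package files from the rpm database or a download cache rather than taking a list of paths:

```python
import os
import shutil

def archive_before_upgrade(rpm_paths, archive_dir):
    """Copy the package files about to be replaced into a local archive
    directory, so a failed upgrade can be rolled back from local disk
    without re-downloading anything.
    """
    os.makedirs(archive_dir, exist_ok=True)
    for path in rpm_paths:
        shutil.copy2(path, archive_dir)  # copy2 preserves timestamps
    return sorted(os.listdir(archive_dir))
```

At under 5% of a modern disk per install, keeping one generation of replaced packages around costs very little.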
It would be nice to be able to downgrade through urpmi and the GUI tools (of course you can already downgrade today--just download and force-upgrade--but it's not as easy as installing or upgrading). If I try to downgrade kdebase, it would tell me "you also need to downgrade kdelibs and kdegames and uninstall kdevelop," and (if I approve) it would go get the relevant versions of kdebase, kdelibs, and kdegames and so on.
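The "you also need to downgrade kdelibs and kdegames" part is essentially a walk over reverse dependencies: everything that depends on the package being moved has to move with it (or be removed). A rough sketch of that closure, with a hand-built reverse-dependency map standing in for the real rpm database:

```python
from collections import deque

def downgrade_set(pkg, reverse_deps):
    """Collect pkg plus everything that transitively depends on it.

    reverse_deps maps a package name to the packages that require it.
    All packages in the returned set must be downgraded (or removed)
    together to keep installed versions consistent.
    """
    seen = {pkg}
    queue = deque([pkg])
    while queue:
        current = queue.popleft()
        for dependent in reverse_deps.get(current, ()):
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen
```

A real tool would then match each package in the set against the older repository to decide between "downgrade to version X" and "no old version exists, must uninstall" (the kdevelop case).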
The categorisation thing is a hard problem I think. What relevance is there *really* in choosing a KDE workstation or a GNOME workstation? Surely it's the individual programs that matter. There's a convergence happening as themes and look-n-feel elements are being shared, which blurs the basis for choosing one framework over another...
I think that being able to deal with the same package groups as the installer when upgrading, installing, or downgrading would also be helpful. A beginning user knows that he installed "KDE Workstation," and wants to upgrade that, or that he skipped "LAN Filesharing" (or whatever that option is called) but now he wants it, but probably doesn't know what packages that involves.
This is what I'd be imagining, but something tuned and automated to per-package resolution. There's no reason to have users arbitrarily setting points; they should be able to rewind the system to whatever point they feel things were right for them... But I disagree that it would be harder to do under Linux than under XP. We have openness, community, and package management systems!
Maybe something like Microsoft's "restore points" in XP, but done right, would be useful as well. I mark a system restore point, then upgrade to the new version of Mandrake, install a bunch of new packages through rpmdrake, whatever; then, if it doesn't work, I just restore to the last point. Unfortunately, I think it would be even harder to get this right under linux than under XP.
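At its simplest, a restore point is a snapshot of the installed name-to-version map, and a restore is the diff between that snapshot and the current state. A toy sketch of the bookkeeping only (plain dicts stand in for the rpm database; actually fetching and installing the old packages is the hard part):

```python
def restore_plan(snapshot, current):
    """Given a saved name->version map and the current one, work out what
    a restore must do:
      - downgrade: packages whose version changed since the snapshot
      - remove:    packages installed since the snapshot
      - reinstall: packages removed since the snapshot
    """
    downgrade = {n: v for n, v in snapshot.items()
                 if n in current and current[n] != v}
    remove = sorted(n for n in current if n not in snapshot)
    reinstall = {n: v for n, v in snapshot.items() if n not in current}
    return downgrade, remove, reinstall
```

The per-package automation argued for above falls out naturally: every install or upgrade transaction can record its own implicit snapshot, so any past state is a valid rewind target.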
Thank you for the feedback. I always feel like I'm putting myself on the line when I have these public babbles... :o) I hope that I have been clear enough so as not to come across as critical of the current happenings. I think the problems discussed in this thread result from systemic failures, not from any overt failing of the developers and cooker crew. They have a really hard job to do, which I really, really appreciate. I do think, though, that if we improve things then more people will be able to concentrate on more fun stuff -- like getting their own apps out the door! I'm really looking forward to 9.1 and feel confident that it'll be great. I'm definitely talking about the future here. The future's got to start sometime though, right?
Anyway, I think that all of these ideas deserve looking into. Of course these kinds of suggestions always come up at the worst possible time, because that's when people think about them. Certainly you don't want anyone at Mandrake, or anyone who could be contributing to the 9.1 effort, putting much time into anything like this for the next few days.
So, remember the ideas that are most important to you, wait until 9.1's out the door and everyone's had a little breathing time, then start a discussion when it's still months to go before the next freeze.
I think we can learn a lot from how things are right now, and I also think we can keep things constructive. It shouldn't matter, as long as people are free to contribute and respect the positive intentions of the discussion. There's nothing wrong with being timely and immediate.
Cheers, Paul.
