== Jason Gunthorpe [EMAIL PROTECTED] writes:
On 8 Jan 2001, Goswin Brederlow wrote:
I don't need to get a file listing, apt-get tells me the
name. :)
You have missed the point; the presence of the ability to do
file listings is what prevents the adoption of rsync servers
On 8 Jan 2001, Goswin Brederlow wrote:
Then that feature should be limited to non-recursive listings or
turned off. Or .listing files should be created that are just served.
*couf* rproxy *couf*
So when you have more blocks, the hash table fills up, and you get
more hits on the first level.
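For context on those first-level hits: rsync matches blocks in two stages, a cheap weak checksum looked up in a hash table, then a strong checksum to confirm the match. A toy sketch of that layering (block size, checksum choices, and the sample strings are simplifications invented here, not rsync's real parameters):

```python
import hashlib

BLOCK = 8  # toy block size; real rsync uses much larger blocks

def weak(data):
    # Adler-style weak checksum: cheap, and rollable in real rsync
    a = sum(data) & 0xFFFF
    b = sum((len(data) - i) * x for i, x in enumerate(data)) & 0xFFFF
    return (b << 16) | a

def strong(data):
    return hashlib.md5(data).digest()

def signature(old):
    # receiver side: table mapping weak sum -> [(strong sum, block index)]
    table = {}
    for off in range(0, len(old), BLOCK):
        blk = old[off:off + BLOCK]
        table.setdefault(weak(blk), []).append((strong(blk), off // BLOCK))
    return table

def find_matches(new, table):
    # sender side: slide a window; a weak ("first level") hit is only
    # accepted once the strong checksum confirms it
    matches, i = [], 0
    while i + BLOCK <= len(new):
        blk = new[i:i + BLOCK]
        hit = None
        if weak(blk) in table:                      # first-level hit
            s = strong(blk)
            for sig, idx in table[weak(blk)]:
                if sig == s:                        # second-level confirm
                    hit = idx
                    break
        if hit is not None:
            matches.append((i, hit))
            i += BLOCK
        else:
            i += 1  # real rsync rolls the weak sum in O(1) here
    return matches

old = b"the quick brown fox jumps over the lazy dog!"
new = b"XXX quick brown fox jumps over the lazy dog!"
print(find_matches(new, signature(old)))  # -> [(8, 1), (16, 2), (24, 3), (32, 4)]
```

The point in the mail is exactly this structure: with more blocks, the table sees more first-level hits that still have to be confirmed (or rejected) at the second level.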
On Fri, 5 Jan 2001 09:33:05 -0700 (MST)
Jason Gunthorpe [EMAIL PROTECTED] wrote:
If that suits your needs, feel free to write a bugreport on apt about
this.
Yes, I enjoy closing such bug reports with a terse response.
Hint: Read the bug page for APT to discover why!
From bug report #76118:
On Fri, 5 Jan 2001 19:08:38 +0200
[EMAIL PROTECTED] (Sami Haahtinen) wrote:
Or, can rsync sync binary files?
hmm.. this sounds like something worth implementing..
rsync can, but the problem is that with a compressed stream, if you insert or
alter data early on in the stream, all the data after that point changes.
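The effect is easy to reproduce with zlib (the Packages-style payload below is invented): change one byte at the start and almost none of the compressed stream survives unchanged.

```python
import zlib

# repetitive, well-compressing payload (contents are made up for the demo)
data = b"Package: foo\nVersion: 1.0\nArchitecture: i386\n\n" * 5000
edited = b"X" + data[1:]        # alter a single byte at the very start

a = zlib.compress(data, 6)
b = zlib.compress(edited, 6)

# length of the common prefix of the two compressed streams
n = min(len(a), len(b))
common = next((i for i in range(n) if a[i] != b[i]), n)
print(f"{len(a)} compressed bytes, common prefix only {common} bytes")
```

A plain rsync of the two compressed files therefore finds almost nothing to reuse, which is why the client-side uncompression idea keeps coming up in this thread.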
On Sun, Jan 07, 2001 at 03:49:43PM +0100, Goswin Brederlow wrote:
Actually the load should drop, providing the following feature
add-ons:
[...]
The load should drop from that induced by the current rsync setup (for the
mirrors), but if many, many more clients start using rsync (instead of plain
HTTP) the aggregate load could well rise. [...]
On 7 Jan 2001, Goswin Brederlow wrote:
Actually the load should drop, providing the following feature
add-ons:
1. cached checksums and pulling instead of pushing
2. client side unpacking of compressed streams
Apparently reversing the direction of rsync infringes on a patent.
Plus there
Goswin == Goswin Brederlow [EMAIL PROTECTED] writes:
Goswin Actually the load should drop, providing the following
Goswin feature add-ons:
How does rproxy cope? Does it require a high load on the server? I
suspect not, but need to check on this.
I think of rsync as just being a quick [...]
Goswin == Goswin Brederlow [EMAIL PROTECTED] writes:
Goswin URL?
URL:http://linuxcare.com.au/projects/rproxy/
The documentation seems very comprehensive, but I am not sure when it
was last updated.
Goswin Sounds more like encapsulation of an rsync-similar
Goswin protocol in HTTP [...]
== Jason Gunthorpe [EMAIL PROTECTED] writes:
On 7 Jan 2001, Goswin Brederlow wrote:
Actually the load should drop, providing the following feature
add-ons:
1. cached checksums and pulling instead of pushing
2. client side unpacking of compressed streams
On 8 Jan 2001, Goswin Brederlow wrote:
Apparently reversing the direction of rsync infringes on a
patent.
When I rsync a file, rsync starts ssh to connect to the remote host
and starts rsync there in the reverse mode.
Not really, you have to use quite a different set of [...]
Quoting Goswin Brederlow [EMAIL PROTECTED]:
== Sami Haahtinen [EMAIL PROTECTED] writes:
Or, can rsync sync binary files?
Of course, but forget it with compressed data.
Doesn't gzip have a --rsync option, or somesuch? Apparently Andrew
Tridgell (Samba, Rsync) has a patch to do this, but I don't know whether he
passed it on to the gzip maintainers.
Andrew Stribblehill [EMAIL PROTECTED] wrote:
Doesn't gzip have a --rsync option, or somesuch? Apparently Andrew
Tridgell (Samba, Rsync) has a patch to do this, but I don't know
whether he passed it on to the gzip maintainers.
I like the idea of having plugins for rsync to handle different file formats.
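Tridgell's patch is the `--rsyncable` idea: restart the compressor at boundaries derived from the data itself, so an insertion only perturbs output until the next boundary. A rough emulation with zlib (the window/modulus values and the test corpus are invented; raw deflate is used so the checksum trailer doesn't mask the effect):

```python
import random
import zlib

def rsyncable_compress(data, window=64, modulus=256):
    """Restart the deflate stream whenever a rolling sum of the last
    `window` bytes hits a magic value; boundaries move with the content,
    not with absolute file offsets."""
    co = zlib.compressobj(6, zlib.DEFLATED, -15)   # raw deflate, no trailer
    out, start, rolling = [], 0, 0
    for i, byte in enumerate(data):
        rolling += byte
        if i >= window:
            rolling -= data[i - window]
        if rolling % modulus == 0 and i - start >= window:
            out.append(co.compress(data[start:i]))
            out.append(co.flush(zlib.Z_FULL_FLUSH))  # byte-align, drop history
            start = i
    out.append(co.compress(data[start:]))
    out.append(co.flush())
    return b"".join(out)

def common_suffix(a, b):
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

random.seed(7)
data = bytes(random.choice(b"package version depends \n") for _ in range(200_000))
edited = data[:500] + b"SOME INSERTED TEXT" + data[500:]

plain = common_suffix(zlib.compress(data), zlib.compress(edited))
synced = common_suffix(rsyncable_compress(data), rsyncable_compress(edited))
print(f"shared tail after an insertion: plain {plain} bytes, rsyncable {synced} bytes")
```

After the insertion the rolling sum realigns within a window or two, both streams hit the same content-defined reset points, and everything from there on compresses to identical bytes, which is exactly what rsync needs in order to find long matches.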
On Sun, Jan 07, 2001 at 11:43:39AM +1100, Sam Couter wrote:
A deb plugin would be better. :)
One problem with a deb plugin is that .debs are signed in compressed
form. gzip isn't guaranteed to produce the same compressed file from
identical uncompressed files on different architectures and gzip versions.
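The non-determinism is visible even on a single machine: the gzip file format embeds metadata such as a timestamp, on top of whatever the compressor itself does differently across versions and builds. A minimal illustration of the header part using Python's gzip module (the payload is invented; the cross-architecture compressor differences the mail refers to are the bigger issue and are not shown here):

```python
import gzip

payload = b"identical uncompressed input"

# same input, different header timestamp -> different bytes on disk
a = gzip.compress(payload, mtime=0)
b = gzip.compress(payload, mtime=1)

print(a == b)                                      # False
print(gzip.decompress(a) == gzip.decompress(b))    # True
```

So a signature made over the compressed .deb cannot be re-verified after any recompression, even when the unpacked contents are byte-for-byte identical.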
On Sun, Jan 07, 2001 at 12:53:14PM +1100, Drake Diedrich wrote:
On Sun, Jan 07, 2001 at 11:43:39AM +1100, Sam Couter wrote:
A deb plugin would be better. :)
One problem with a deb plugin is that .debs are signed in compressed
form. gzip isn't guaranteed to produce the same [...]
If you don't like large Packages files, implement a rsync transfer
method for them.
--
see shy jo
On 5 Jan 2001, Goswin Brederlow wrote:
If that suits your needs, feel free to write a bugreport on apt about
this.
Yes, I enjoy closing such bug reports with a terse response.
Hint: Read the bug page for APT to discover why!
Jason
On Fri, Jan 05, 2001 at 03:05:03AM +0100, Goswin Brederlow wrote:
What's the problem with a big Packages file?
If you don't want to download it again and again just because of small
changes I have a better solution for you:
rsync
apt-get update could rsync all Packages files (yes, not [...]
On Fri, Jan 05, 2001 at 05:46:35AM +0800, zhaoway wrote:
how about diffs between dinstall runs?..
sorry, but i don't understand here. dinstall is a server side thing here?
yes, when dinstall runs it would copy the old Packages file to, let's say,
packages.old and create its changes to the [...]
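The scheme sketched above, publish the changes between dinstall runs and let clients patch their old copy, amounts to a server-side edit script plus a client-side apply step. A minimal illustration (the stanzas and the delta format are invented; a real server would ship something like a unified diff):

```python
import difflib

# hypothetical Packages stanzas -- field values invented for the demo
old = ["Package: foo", "Version: 1.0", "", "Package: bar", "Version: 2.0"]
new = ["Package: foo", "Version: 1.1", "", "Package: bar", "Version: 2.0"]

def make_delta(old, new):
    """Server side: compact edit script, only changed lines are shipped."""
    ops = difflib.SequenceMatcher(a=old, b=new).get_opcodes()
    return [(tag, i1, i2, new[j1:j2]) for tag, i1, i2, j1, j2 in ops
            if tag != "equal"]

def apply_delta(old, delta):
    """Client side: rebuild the new file from the old copy plus the delta."""
    out, pos = [], 0
    for tag, i1, i2, lines in delta:
        out.extend(old[pos:i1])   # unchanged run
        out.extend(lines)         # replacement / insertion
        pos = i2                  # skip deleted or replaced lines
    out.extend(old[pos:])
    return out

delta = make_delta(old, new)
print(apply_delta(old, delta) == new)   # -> True
```

Only the changed stanza travels over the wire; the client pays for the bandwidth saving with a little bookkeeping (it must keep packages.old around to apply the delta against).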
Previously Sami Haahtinen wrote:
this would bring us to apt renaming the old deb (if there is one) to the
name of the new package and rsyncing those. and we would save some time once
again...
There is a --fuzzy-names patch for rsync that makes rsync do that itself.
Or, can rsync sync binary files?
On 05 Jan 2001 19:51:08 +0100 Goswin Brederlow [EMAIL PROTECTED]
wrote:
Hello,
I'm currently discussing some changes to the rsync client with some
people from the rsync ML which would uncompress compressed data on the
client side (no changes to the server) and rsync those.
Jason Gunthorpe wrote:
Hint: Read the bug page for APT to discover why!
Looking through the apt bugs, I saw this one, rejected:
Bug#77054: wish: show current-upgraded versions on upgrade -u
My private solution to this is the following patch to `apt-get':
[...]
hi,
[i'm not sure if this has been resolved, lart me if you like.]
my proposal to resolve big Packages.gz is through package
pool system.
add 36 or so new debian packages, namely,
[a-zA-Z0-9]-packages-gz_date_all.deb
contents of each are quite obvious. ;-)
and a virtual unstable-packages-gz
[read my previous semi-proposal]
this has some more benefits,
1) package maintainers could upload (to pool) at whatever
frequency they like.
2) release is separated from package pool, which is a storage
system, and release is a qa system.
3) release could be managed through BTS on specific [...]
On Fri, Jan 05, 2001 at 03:17:30AM +0800, zhaoway wrote:
[read my previous semi-proposal]
this has some more benefits,
1) package maintainers could upload (to pool) at whatever
frequency they like.
in an ideal world, developers should upload to ''xxx-auto-builder'' ;-)
i'm turning out to [...]
On Fri, Jan 05, 2001 at 03:02:15AM +0800, zhaoway wrote:
my proposal to resolve big Packages.gz is through package
pool system.
add 36 or so new debian packages, namely,
[a-zA-Z0-9]-packages-gz_date_all.deb
contents of each are quite obvious. ;-)
and a virtual unstable-packages-gz
Subject: Re: package pool and big Packages.gz file, 01/04/2001 03:01 PM
On Thu, Jan 04, 2001 at 03:07:00PM -0600, Vince Mulhollon wrote:
The only other possibility not yet proposed (?) would be to split the
packages file by section.
base-packages
games-packages
x11-packages
net-packages
Then a server that just doesn't do x11 or doesn't do games has no [...]
On Thu, Jan 04, 2001 at 11:19:59PM +0200, Sami Haahtinen wrote:
how would the package manager (namely apt) know which ones you need? even if
you don't have X11 installed (and apt assumes you don't need the X11 Packages
file), that doesn't mean that you wouldn't want to install the X11 Packages
file.
On Fri, Jan 05, 2001 at 06:07:20AM +0800, zhaoway wrote:
another solution is to let every single deb provide its .pkg-gz
then, apt-get update will do nothing,
apt-get install some.deb will first download some.pkg-gz, then check its
dependency,
then grab them.pkg-gz all, then install.
that is a minimum.
On Fri, Jan 05, 2001 at 06:07:20AM +0800 , zhaoway wrote:
On Thu, Jan 04, 2001 at 11:19:59PM +0200, Sami Haahtinen wrote:
how would the package manager (namely apt) know which ones you need? even if
you don't have X11 installed (and apt assumes you don't need the X11 Packages
file), that doesn't mean that you wouldn't want to install the X11 Packages
file.
[quote myself, ;-) this is semi-final now ;-)]
another solution is to let every single deb provide its .pkg-gz
then, apt-get update will do nothing,
apt-get install some.deb will first download some.pkg-gz, then check its
dependency,
then grab them.pkg-gz all, then install.
that is a minimum.
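The per-package walk described above is a recursive closure: fetch one small metadata file, read its Depends, recurse. A sketch with a dict standing in for the downloadable .pkg-gz files (all package names and fields are invented):

```python
# hypothetical per-package metadata, standing in for downloaded .pkg-gz files
REPO = {
    "webapp": {"Depends": ["libfoo", "python"]},
    "libfoo": {"Depends": ["libc"]},
    "python": {"Depends": ["libc"]},
    "libc":   {"Depends": []},
}

def fetch_closure(name, have=None):
    """Download metadata for one package, then recurse into its Depends:
    the 'apt-get install some.deb' walk from the mail, without Packages.gz."""
    if have is None:
        have = {}
    if name in have:
        return have
    have[name] = REPO[name]          # one small fetch per package
    for dep in have[name]["Depends"]:
        fetch_closure(dep, have)
    return have

print(sorted(fetch_closure("webapp")))   # -> ['libc', 'libfoo', 'python', 'webapp']
```

The trade-off the thread circles around is visible here: many tiny fetches (one network round-trip per package) replace a single bulk download of the whole Packages.gz.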
On Thu, Jan 04, 2001 at 11:19:25PM +0100, Petr Cech wrote:
On Fri, Jan 05, 2001 at 06:07:20AM +0800 , zhaoway wrote:
then, apt-get update will do nothing,
apt-get install some.deb will first download some.pkg-gz, then check its
dependency,
then grab them.pkg-gz all, then install.
but [...]
final thoughts ;-)
On bigger and bigger Packages.gz file, a try
The directory structure looks roughly like this:
debian/dists/woody/main/binary-all/Packages.deb
debian/pool/main/a/abba/abba_1989.orig.tar.gz
debian/pool/main/a/abba/abba_1989-12.diff.gz
debian/pool/main/a/abba/abba_1989-12.dsc