* Goswin von Brederlow:
However, patching rred to apply patches in a single run would be a
good start because all further optimizations will need it.
Why should the number of chunks matter?
If you use the naïve algorithm, it does. But rred implements
something more involved, leading to a
* Goswin von Brederlow:
What code do you need there? If the rred method keeps the full Index
file in memory during patching it can just be fed all the patches one
after another and only write out the final result at the
end. Combining the patches is a simple cat.
#383881 suggests that I/O
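The one-pass idea above can be sketched like this (a hypothetical illustration, not APT's actual rred source): keep the file's lines in memory, run the ed-style pdiff commands through one stateful interpreter, and write the result once at the end. Because ed executes commands in order against the current buffer, successive pdiffs can literally be concatenated and applied as a single script.

```python
import re

CMD = re.compile(r'^(\d+)(?:,(\d+))?([acd])$')  # e.g. "12,15c", "7a", "3d"

def apply_pdiffs(lines, script):
    """lines: file content as a list of strings; script: the ed commands
    of all pdiffs simply concatenated, in download order."""
    it = iter(script)
    for raw in it:
        m = CMD.match(raw)
        if not m:
            continue  # this sketch ignores commands it doesn't know
        start = int(m.group(1))
        end = int(m.group(2) or m.group(1))
        op = m.group(3)
        text = []
        if op in 'ac':                     # text block terminated by "."
            for t in it:
                if t == '.':
                    break
                text.append(t)
        if op == 'a':
            lines[start:start] = text      # append after line `start`
        elif op == 'c':
            lines[start - 1:end] = text    # change lines start..end
        else:
            del lines[start - 1:end]       # delete lines start..end
    return lines
```

Since the interpreter is stateful, feeding it `patch1 + patch2` gives the same result as applying the patches one by one, which is why combining them is "a simple cat".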
On Fri 30 Jun 2006, Martin Schulze wrote:
You know that you can easily turn off this feature by adjusting apt.conf:
Acquire::Pdiffs { false; };
Ah, great :)
After not having done aptitude update for a month or so, after
downloading all the hundreds (!) of diffs, I got the following
On Fri, 30 Jun 2006 11:10:37 +0200, Eduard Bloch [EMAIL PROTECTED] wrote:
I have doubts, have you measured the real difference?
Yes, I have. Test was done in a sid chroot that hasn't been updated
for like two weeks.
|$ time sudo aptitude update
|Reading package lists... Done
|Building dependency
* Marc Haber [EMAIL PROTECTED]:
Yes, I have. Test was done in a sid chroot that hasn't been updated
for like two weeks.
...
Updating with pdiffs took one minute nine seconds while downloading a
completely new set of list files took eight seconds.
Test environment was quite unfair though
On Fri, Jul 07, 2006 at 02:28:49PM +0200, Marc Haber wrote:
Updating with pdiffs took one minute nine seconds while downloading a
completely new set of list files took eight seconds.
Test environment was quite unfair though (an old machine with a 1200
MHz CPU and a single, slow disk on an
On Thu, 29 Jun 2006 11:43:45 -0700, Tyler MacDonald [EMAIL PROTECTED]
wrote:
Steinar H. Gunderson [EMAIL PROTECTED] wrote:
I usually notice the difference -- the other way. aptitude update on a
machine that hasn't been updated in a while suddenly takes minutes instead of
seconds...
Yes,
Miles Bader wrote:
Yeah I noticed this too -- some .pdiff files appeared to be downloaded
dozens of times!
It prints the same pdiff filenames when downloading files with the same
basename from different paths.
Just to confuse things it does print out each separate pdiff file 3 times,
although
* Joey Hess [EMAIL PROTECTED] [2006-06-30 02:05]:
Just to confuse things it does print out each separate pdiff file 3
times, although my squid logs show it downloads each exactly once.
My guess w/o reading the code is that one represents the download,
one the extraction, and one the application
On Fri, 30 Jun 2006 08:44:30 +0200, Martin Michlmayr [EMAIL PROTECTED]
wrote:
Your guess is correct, see #372504. This is currently a UI problem.
It displays the line three times, but it only downloads it in the
first. The other two lines are unpack and rred (patch).
So the rred is not a badly
On Fri, 30 Jun 2006 11:10:37 +0200, Eduard Bloch [EMAIL PROTECTED] wrote:
* Marc Haber [Fri, Jun 30 2006, 08:00:57AM]:
file:// URLs are not the only issue here - aptitude update is also
much slower than before on a hosted box which has 100 Mbit/s
connectivity and could load the Packages.gz in,
Steinar H. Gunderson wrote:
On Thu, Jun 29, 2006 at 08:35:41PM +0200, martin f krafft wrote:
Not really. pdiff's mainly reduce download size for low bandwidth
connections. file:// is pretty high bandwidth, you won't notice the
difference.
I usually notice the difference -- the other way.
On Fri, Jun 30, 2006 at 04:55:58PM +0200, Martin Schulze wrote:
You know that you can easily turn off this feature by adjusting apt.conf:
Sure, and I've done so for several of my machines now. Actually, for many
enough machines that it's becoming bothersome...
/* Steinar */
* Marc Haber:
The machine in question is a P3 with 1200 MHz. What's making the
process slow is the turnaround time for the http requests, as observed
multiple times in this thread alone.
Then your setup is very broken. APT performs HTTP pipelining.
On my machines, I see the behavior Miles
* Mike Hommey:
The fix is to combine the diffs before applying them, so that you only
need to process the large Packages file once. I happen to have ML
code which does this (including the conversion to a patch
representation which is more amenable to this kind of optimization)
and would be
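A plausible first step toward such a representation (a sketch under assumptions, not the ML code mentioned above) is to normalise each ed script into sorted `(start, end, replacement)` triples over 0-based half-open line ranges, a form in which patches are much easier to merge than raw ed commands:

```python
import re

CMD = re.compile(r'^(\d+)(?:,(\d+))?([acd])$')

def to_ranges(script):
    """Convert one ed-style pdiff into sorted (start, end, replacement)
    triples; start/end are 0-based, half-open, in input coordinates."""
    ops, it = [], iter(script)
    for raw in it:
        m = CMD.match(raw)
        if not m:
            continue  # sketch: skip anything unrecognised
        s, e, op = int(m.group(1)), int(m.group(2) or m.group(1)), m.group(3)
        text = []
        if op in 'ac':                      # text block terminated by "."
            for t in it:
                if t == '.':
                    break
                text.append(t)
        if op == 'a':
            ops.append((s, s, text))        # insert after line s
        elif op == 'c':
            ops.append((s - 1, e, text))    # replace lines s..e
        else:
            ops.append((s - 1, e, []))      # delete lines s..e
    return sorted(ops)
```

With non-overlapping ranges in input coordinates, composing two patches reduces to interval bookkeeping rather than replaying ed commands; the composition step itself is more involved and omitted here.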
Is it at all useful/better for apt-get to use the .pdiff files when dealing
with a local (file://) debian repo?
Thanks,
Tyler
also sprach Tyler MacDonald [EMAIL PROTECTED] [2006.06.29.2005 +0200]:
Is it at all useful/better for apt-get to use the .pdiff files
when dealing with a local (file://) debian repo?
Not really. pdiff's mainly reduce download size for low bandwidth
connections. file:// is pretty high bandwidth,
On Thu, Jun 29, 2006 at 09:15:13PM +0200, Bastian Venthur wrote:
Same here. Very annoying on a box where you only update every few weeks
or something. Wouldn't it be possible to make snapshots every week and
only pdiff from this snapshot?
You can turn off pdiffs if you'd like to; the old
also sprach Bastian Venthur [EMAIL PROTECTED] [2006.06.29.2135 +0200]:
Someone could make stats to calculate the average day-count x when the
sum of the pdiffs becomes larger than the Packages files. Then aptitude
(or apt-get) could decide whether the last update is more than x days
away and
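The break-even estimate suggested above is simple arithmetic; a sketch with entirely hypothetical numbers (real sizes would have to be measured from the archive):

```python
def breakeven_days(packages_size, avg_daily_pdiff_size):
    """Days until cumulative pdiff downloads exceed one full
    Packages file download."""
    return packages_size / avg_daily_pdiff_size

# Hypothetical figures: a 4 MB Packages file versus ~30 kB of
# pdiffs per day gives a break-even of roughly 133 days.
x = breakeven_days(4_000_000, 30_000)
```

With numbers of that order the client would only prefer a full download after well over a hundred days, consistent with the 100-200 day estimate given later in this thread.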
On Thu, Jun 29, 2006 at 10:53:14PM +0200, Robert Lemmen wrote:
it might be easier to just generate fewer diffs on the server side, if
there is no matching diff available apt will fall back to using the
standard method. you will however find out that the size of all diffs
together is already
Robert Lemmen wrote:
standard method. you will however find out that the size of all diffs
together is already less than the size of the regular packages file.
Yeah, looking at the average filesize of a diff compared to a packages
file, I guess you'll need to wait like 100-200 days until the
Kurt Roeckx [EMAIL PROTECTED] writes:
But what I don't get is that it seems to be downloading every
file more than once. At least it looks to be downloading twice as
many files as it should, though it looks more like it's downloading the
same file 3 times if I look at the sizes.
Yeah I noticed this