Thanks, Dennis, for the follow-up.

/dev/urandom is the default (if it exists), so if you don't pass ./configure
--with-devrandom explicitly it should work too.
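For reference, a sketch of what forcing the device explicitly would look like when building APR (the `--with-devrandom` flag accepts an optional device path; adjust for your own build tree):

```shell
# Build APR with the random device pinned to /dev/urandom
# rather than relying on the configure-time default.
./configure --with-devrandom=/dev/urandom
make
make install
```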

Well, clearly we found the problem, and here is some damning performance
data to reveal the horrible truth:


node000 $ /usr/bin/time -p dd if=/dev/random bs=8192 \
> count=16384 of=dev_random2.dat
0+16384 records in
0+16384 records out

real 2393.45
user 0.21
sys 2391.88


node000 $ /usr/bin/time -p dd if=/dev/urandom bs=8192 \
> count=16384 of=dev_urandom2.dat
0+16384 records in
0+16384 records out

real 5.67
user 0.09
sys 5.57


So there is the proof right there.  That was the whole problem.
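Spelled out, that transfer is 8192 * 16384 bytes = 128 MiB, which works out to roughly 55 KiB/s from /dev/random versus about 23 MiB/s from /dev/urandom, a factor of over 400. A quick sketch of the arithmetic using the timings above:

```python
# Throughput implied by the dd runs: 16384 blocks of 8192 bytes each.
total_bytes = 8192 * 16384  # 134217728 bytes = 128 MiB

for device, seconds in [("/dev/random", 2393.45), ("/dev/urandom", 5.67)]:
    mib_per_sec = total_bytes / seconds / 2**20
    print(f"{device}: {mib_per_sec:.3f} MiB/s")

print(f"speedup: {2393.45 / 5.67:.0f}x")  # roughly 422x
```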

I may have to recompile httpd-2.4.25 as well, depending on how apr and
apr-util really interface with httpd, and I am curious where I would see
any performance difference.

I am also somewhat curious about the difference in entropy quality
between the /dev/random data and the /dev/urandom data, so I fetched the
sources for "ent" from http://www.fourmilab.ch/random/ and compiled them
with c99 here to get this:

node000 $ ./ent -b -c dev_random.dat
Value Char Occurrences Fraction
  0         34079262   0.500008
  1         34078178   0.499992

Total:      68157440   1.000000

Entropy = 1.000000 bits per bit.

Optimum compression would reduce the size
of this 68157440 bit file by 0 percent.

Chi square distribution for 68157440 samples is 0.02, and randomly
would exceed this value 89.55 percent of the times.

Arithmetic mean value of data bits is 0.5000 (0.5 = random).
Monte Carlo value for Pi is 3.140871554 (error 0.02 percent).
Serial correlation coefficient is 0.000185 (totally uncorrelated = 0.0).
node000 $


node000 $ ./ent -b -c dev_urandom.dat
Value Char Occurrences Fraction
  0         34071954   0.499901
  1         34085486   0.500099

Total:      68157440   1.000000

Entropy = 1.000000 bits per bit.

Optimum compression would reduce the size
of this 68157440 bit file by 0 percent.

Chi square distribution for 68157440 samples is 2.69, and randomly
would exceed this value 10.12 percent of the times.

Arithmetic mean value of data bits is 0.5001 (0.5 = random).
Monte Carlo value for Pi is 3.142601198 (error 0.03 percent).
Serial correlation coefficient is -0.000063 (totally uncorrelated = 0.0).
node000 $


Essentially no measurable difference for a large number of bits.
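The chi-square figures ent reports can be reproduced directly from the bit counts in its output. With two bins (0 and 1) the statistic is the sum of (observed - expected)^2 / expected, where the expected count is half the total; a minimal sketch using the counts above:

```python
def chi_square_bits(zeros: int, ones: int) -> float:
    """Pearson chi-square statistic for a two-bin (0/1) bit count,
    against the uniform expectation of total/2 per bin."""
    expected = (zeros + ones) / 2
    return sum((obs - expected) ** 2 / expected for obs in (zeros, ones))

# Counts reported by ent for the two 68157440-bit samples above:
print(round(chi_square_bits(34079262, 34078178), 2))  # /dev/random -> 0.02
print(round(chi_square_bits(34071954, 34085486), 2))  # /dev/urandom -> 2.69
```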

Dennis
