Hi,
Talking in the sci.crypt newsgroup, I had an
idea about how to make the Web more secure against traffic analysis. The
idea comes from a paper I have been reading ("Analysis of the SSL 3.0
protocol" by B. Schneier and D. Wagner). They describe how an attacker
can guess the pages you have been visiting.
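A toy sketch of that attack idea (all page names, sizes, and the overhead constant below are invented for illustration): an eavesdropper who knows the sizes of the pages on a server can match observed ciphertext lengths against them, since the SSL record layer adds only a small, roughly constant overhead per response.

```python
# Toy illustration of page fingerprinting by ciphertext length.
# Page names and sizes are made up; a real attack would also have
# to account for padding, headers, and inlined resources.

PAGE_SIZES = {            # plaintext sizes known to the attacker
    "/index.html": 1804,
    "/private/salaries.html": 7321,
    "/contact.html": 922,
}

SSL_OVERHEAD = 25         # assumed near-constant per-response overhead (MAC etc.)

def guess_page(ciphertext_len, tolerance=8):
    """Return pages whose expected ciphertext length is close to the observed one."""
    return [
        page for page, size in PAGE_SIZES.items()
        if abs((size + SSL_OVERHEAD) - ciphertext_len) <= tolerance
    ]

print(guess_page(7346))   # observed length singles out /private/salaries.html
```

The countermeasure the paper points toward is exactly to break this mapping, e.g. by padding responses so distinct pages produce indistinguishable lengths.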
2) If you were to attempt acceleration where do you get the most bang for
your proverbial buck; just doing the encryption/decryption or doing the
entire SSL on a card ?
The encryption is where you'd get the "bang for the buck." There are some RSA
accelerator chips out there that go up to
??? One wants to accelerate a server, right? And what does
Unfortunately, the way the original SSLeay (and now OpenSSL) ASN1 code works
is "memory based".
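To illustrate what "memory based" means here (a simplified sketch, not the actual SSLeay code): the parser expects the whole DER encoding to already sit in one contiguous buffer, and walks tag/length/value triples inside it, rather than consuming input incrementally from a stream.

```python
# Minimal sketch of memory-based ASN.1 (DER) parsing: the entire encoding
# must already be in one in-memory buffer. Handles only short-form and
# one-byte long-form lengths, which is enough for the illustration.

def parse_tlv(buf, offset=0):
    """Parse one DER tag-length-value triple out of a byte buffer."""
    tag = buf[offset]
    length = buf[offset + 1]
    header = 2
    if length == 0x81:                      # long form with one length byte
        length = buf[offset + 2]
        header = 3
    value = buf[offset + header : offset + header + length]
    return tag, value, offset + header + length

# DER for INTEGER 5 followed by OCTET STRING b"hi", both in one buffer
buf = bytes([0x02, 0x01, 0x05, 0x04, 0x02]) + b"hi"
tag, value, next_off = parse_tlv(buf)
tag2, value2, _ = parse_tlv(buf, next_off)
print(hex(tag), value)          # 0x2 b'\x05'
print(hex(tag2), value2)        # 0x4 b'hi'
```

Because every nested structure is decoded out of (and re-encoded into) such buffers, parsing large objects means lots of transient allocations, which is where the fragmentation complaint below comes from.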
As almost everyone finds out sooner or later, memory fragmentation can soon
become an issue in the performance of long-running servers. The ASN1
functions are particularly prone to this. I would love
I wonder if anybody remembers what the magic trick was?
Nothing in particular, unfortunately. When the new release was
planned, I asked Mark (who had reported the bug) to test the then
current snapshot, and it passed all the tests then.
One of our QA guys here came up with this one, so don't blame me. :)
If you are using a CA root file with a duplicate entry in it (actually, a
cert file with just a duplicated subject DN; it doesn't have to be an exact
duplicate cert), parsing of the file stops at the duplicate cert.
Is this