Alexander Klimov wrote:
[snip]
(Of course, with 60K passwords there is almost surely at
least one "password1" or "Steven123", and thus the salts are
irrelevant.)
I'm not sure I understand this statement, as I just calculated the
HMAC-MD5 for "password1" using a salt of 7D00 (32,000 decimal) [...]
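The computation being described can be sketched as follows. The post doesn't say which role the salt and the password play in the HMAC, so treating the 2-byte salt as the HMAC key is my assumption:

```python
# Sketch of the HMAC-MD5 computation mentioned above, assuming the
# 2-byte salt 7D00 is the HMAC key and the password is the message.
import hmac
import hashlib

salt = bytes.fromhex("7d00")   # 32,000 decimal, big-endian
password = b"password1"

digest = hmac.new(salt, password, hashlib.md5).hexdigest()
print(digest)                  # a 32-hex-digit (128-bit) MD5-sized digest
```

Whichever way the roles are assigned, the point stands: the digest depends on both inputs, so a precomputed table keyed on the password alone won't match.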
On Sun, 28 Jan 2007, Steven M. Bellovin wrote:
Beyond that, 60K doesn't make that much of a difference even with a
traditional /etc/passwd file -- it's only an average factor of 15
reduction in the attacker's workload. While that's not trivial, it's
also less than, say, a one-character [...]
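Bellovin's "factor of 15" follows directly from the numbers elsewhere in this thread: 60,000 accounts spread over the traditional 12-bit crypt(3) salt space of 4,096 values.

```python
# Reproducing the "average factor of 15" estimate: with 12-bit salts,
# 60,000 accounts share 4,096 salt values, so each candidate password
# hashed once can be checked against ~15 accounts on average.
users = 60_000
salts = 2 ** 12          # traditional crypt(3) salt space

avg_accounts_per_salt = users / salts
print(round(avg_accounts_per_salt, 1))   # 14.6
```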
Bill Stewart wrote:
Salt is designed to address a couple of threats
- Pre-computing password dictionaries for attacking wimpy passwords
...
Yes indeed. The rainbow-tables style attacks are important to protect
against, and a salt does the trick. This is why you can find rainbow tables
for [...]
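The mechanism being described is simple to illustrate. A minimal sketch, using SHA-256 as a stand-in for whatever password hash is in play: the same password yields a different digest under every salt, so a single precomputed password-to-hash table (a rainbow table) no longer applies; the attacker would need one table per salt value.

```python
# The same password produces a different digest under each salt, so a
# precomputed password->hash table built without the salt is useless.
# SHA-256 is a stand-in here, not the hash from the thread.
import hashlib

def salted_hash(password: bytes, salt: bytes) -> str:
    return hashlib.sha256(salt + password).hexdigest()

pw = b"password1"
h1 = salted_hash(pw, bytes.fromhex("0001"))
h2 = salted_hash(pw, bytes.fromhex("7d00"))
print(h1 != h2)   # True -- different salt, different digest
```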
On Mon, 22 Jan 2007 16:57:34 -0800
Abe Singer [EMAIL PROTECTED] wrote:
On Sun, Jan 21, 2007 at 12:13:09AM -0500, Steven M. Bellovin wrote:
One sometimes sees claims that increasing the salt size is
important. That's very far from clear to me. A collision in the
salt between two entries
On Sun, Jan 28, 2007 at 11:52:16AM -0500, Steven M. Bellovin wrote:
Is that all in one /etc/passwd file (or the NIS equivalent)? Or is it a
Kerberos KDC? I note that a salt buys the defense much less in a
For SDSC, one file. For UCSD, not sure, but I suspect it's (now) a KDC.
(Brian, are
With 4K possible salts, you'd need a
very large password file to have more than a very few collisions,
Definition of "very large" can vary (alliteration intended). [...]
UCSD has maybe 60,000 active users. I think very large is very common
in the University environment.
Different decade,
Hi gang,
As an outsider, sort of, looking in I had an interesting thought
about this. Since insider threats are the biggest problem, what
vector could an insider use against password hashes to gain root
password access?
The problem with rainbow tables is that they would be too massive
when [...]
Joseph Ashwood wrote:
I'm going to try to make this one a bit less egregious in tone. I'm also
Thank you.
- Original Message - From: Matthias Bruestle
Joseph Ashwood wrote:
- Original Message - From: Matthias Bruestle
You also ended up removing a large portion of my point.
| ...One sometimes sees claims that increasing the salt size is important.
| That's very far from clear to me. A collision in the salt between
| two entries in the password file lets you try each guess against two
| users' entries. Since calculating the guess is the hard part,
| that's a savings
On Sat, 20 Jan 2007 18:41:34 -0600
Travis H. [EMAIL PROTECTED] wrote:
BTW, dictionary attacks can probably be effectively resisted by
making the hashes of passwords twice as big, and using a random value
concatenated with the password before hashing, and storing it
alongside the hash (it's [...]
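What Travis describes is essentially per-user random salting. A minimal sketch under my own assumptions (SHA-256 as the hash, a 16-byte random value; the thread doesn't fix either parameter):

```python
# Sketch of the scheme described above: pick a random value, concatenate
# it with the password before hashing, and store it alongside the hash.
import hashlib
import hmac
import os

def hash_password(password: bytes) -> tuple[bytes, bytes]:
    salt = os.urandom(16)                     # the stored random value
    return salt, hashlib.sha256(salt + password).digest()

def verify(password: bytes, salt: bytes, digest: bytes) -> bool:
    # constant-time compare, to avoid leaking match position
    candidate = hashlib.sha256(salt + password).digest()
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password(b"hunter2")
print(verify(b"hunter2", salt, digest))   # True
print(verify(b"wrong", salt, digest))     # False
```

Since the random value is fresh per entry, a dictionary of precomputed hashes built in advance cannot match any stored digest.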
On Sun, Jan 21, 2007 at 12:13:09AM -0500, Steven M. Bellovin wrote:
Could you explain this? It's late, but this makes no sense at all to
me.
I probably wasn't clear; you bring out my realization that there
are a number of unwritten assumptions going on here.
Similarly, the size of the output
On Fri, Jan 19, 2007 at 12:11:40AM -0800, Bill Stewart wrote:
One of the roots of the problem is that for many applications,
i is a well-defined event and P(i) is a fixed value (for i),
but for many other applications,
i might not be a well-defined event, and/or
P(i) is really a conditional
At 01:55 PM 1/18/2007, John Denker wrote:
We would be better off maintaining just the one technical
definition of entropy, namely S = sum_i P_i log(1/P_i).
If you want to talk about something else, call it something
else ... or at least make it clear that you are using the
term in a nontechnical [...]
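Denker's formula S = sum_i P_i log(1/P_i) is straightforward to compute; in base 2 it gives entropy in bits:

```python
# Shannon entropy S = sum_i P_i * log(1/P_i), in bits (log base 2).
import math

def shannon_entropy(probs):
    # zero-probability outcomes contribute nothing (lim p->0 of p*log(1/p) = 0)
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 -- a fair coin carries one bit
print(shannon_entropy([0.25] * 4))   # 2.0 -- four equiprobable outcomes
```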
- From: Matthias Bruestle
[EMAIL PROTECTED]
Subject: Private Key Generation from Passwords/phrases
What do you think about this?
I think you need some serious help in learning the difference between
2^112 and 112, and that you really don't seem to have much grasp of the
entire concept. 112
John Denker [EMAIL PROTECTED] writes:
There is only one technical definition of entropy,
Oh?
So you're saying Chaitin-Kolmogorov information and other ways of
studying entropy are wrong? I think that's a bit unreasonable, don't
you?
There are different definitions that are useful at different
Joseph Ashwood wrote:
- Original Message - From: Matthias Bruestle
[EMAIL PROTECTED]
What do you think about this?
I think you need some serious help in learning the difference between
2^112 and 112, and that you really don't seem to have much grasp of the
entire concept.
Please
On 1/11/07, Joseph Ashwood [EMAIL PROTECTED] wrote:
112 bits of entropy is 112 bits of entropy... anything else and you're
into the world of trying to prove equivalence between entropy and
work, which works in physics but doesn't work in computation,
because next year the work level will be [...]
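Ashwood's point — that "work" estimates decay while entropy does not — can be made concrete with rough numbers. The guess rate and the doubling period below are illustrative assumptions of mine, not figures from the thread:

```python
# Why "2^112 operations of work" is not a stable claim: the wall-clock
# time to do 2^112 operations shrinks as hardware improves, while the
# size of a 112-bit-entropy guess space does not change.
rate = 1e12                          # guesses/second today (assumed)
seconds_per_year = 365.25 * 24 * 3600

years_now = 2**112 / rate / seconds_per_year
# assume throughput doubles every 2 years: after 20 years, 2^10 times faster
years_later = years_now / 2**10

print(f"{years_now:.2e}")    # ~1.6e+14 years at the assumed rate
print(f"{years_later:.2e}")  # ~1.6e+11 years two decades on
```

Either way the attacker still faces 2^112 guesses; only the time-per-guess changed, which is exactly why equating entropy with a fixed work figure is shaky.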
- Original Message -
From: Matthias Bruestle [EMAIL PROTECTED]
Subject: Private Key Generation from Passwords/phrases
What do you think about this?
I think you need some serious help in learning the difference between 2^112
and 112, and that you really don't seem to have much grasp
Matthias Bruestle wrote:
Hi,
I am thinking about this since last night. On the web I haven't found
much, and I want to go in a different direction than what I have found.
Say I want to have 112-bit security, i.e. as secure as 3DES. For this I
would choose (as everybody writes) 224-bit ECC (or Tipsy
Hi,
I am thinking about this since last night. On the web I haven't found
much, and I want to go in a different direction than what I have found.
Say I want to have 112-bit security, i.e. as secure as 3DES. For this I
would choose (as everybody writes) 224-bit ECC (or Tipsy Curve
Cryptosystem/TCC as I [...]
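A hedged sketch of the kind of derivation Bruestle is asking about: stretch a passphrase through a slow KDF into 224 bits of key material and reduce it into a valid scalar. The specific choices here — PBKDF2-HMAC-SHA256, 600,000 iterations, and the NIST P-224 group order — are my illustrative assumptions, not anything proposed in the thread:

```python
# Illustrative passphrase -> 224-bit ECC private scalar derivation.
# KDF choice, iteration count, and curve (NIST P-224) are assumptions.
import hashlib

# Order of the NIST P-224 group (a standard 224-bit curve).
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFF16A2E0B8F03E13DD29455C5C2A3D

def private_key_from_passphrase(passphrase: str, salt: bytes) -> int:
    # Stretch the passphrase into 28 bytes (224 bits) with a slow KDF,
    # then reduce into [1, N-1] to get a valid private scalar.
    material = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                                   iterations=600_000, dklen=28)
    return int.from_bytes(material, "big") % (N - 1) + 1

d = private_key_from_passphrase("correct horse battery staple", b"example-salt")
print(1 <= d < N)   # True: a usable 224-bit scalar
```

Note that the curve size only sets the ceiling: if the passphrase itself carries fewer than 112 bits of entropy, the KDF's slowness is the only thing standing between the attacker and an offline guessing attack — which is the crux of the whole thread.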