On Wed, Dec 08, 2021 at 03:01:18PM -0800, Benjamin Kaduk wrote:
> Hi Rene,
> 
> On Wed, Dec 01, 2021 at 03:32:24PM -0500, Rene Struik wrote:
> > Hi Ben:
> > 
> > With all due respect, one would expect an Area Director to stay away 
> > from hypothetical and conditional claims (which in this case seem to be 
> > single-source base, based on Ilari Liusvaara's assessments {where, in 
> 
> To reiterate: my comments here were intended to mediate in the discussion
> between you and Ilari.  That is, if you want Ilari to stop raising this
> topic, you should (in my opinion) address the questions and concerns as I
> phrased them, since I believe that my restatement matches Ilari's intent.
> Your answers thus far have addressed only some parts of what I have
> understood Ilari to be saying, and thus I expect Ilari to consider his
> concerns unaddressed.

I think this is one of the thorniest standards-interpretation issues I
have ever come across.

Basically, if one takes the definitions of ECDSA and SHA-3/SHAKE and
cooks up a bug-free implementation based on those definitions:

- It will not interoperate with multiple existing implementations.
- It will not agree with ECDSA with SHA-3 examples from NIST.
- Two such implementations would interoperate.


This also impacts RFC 8692. I wonder how many implementations of ECDSA
with SHAKE there are, and how many get the bit order right.


> > the past, these were consistently inaccurate, but nevertheless quoted}). 
> > Moreover, one would expect a Security Area Director to be able to look 
> > up some specs and read this, if this helps, and not simply declare 
> > oneself incompetent in the matter.
> 
> To be clear, I did spend quite some time with the relevant standards, even
> going so far as to spend my employer's money to purchase the relevant ANSI
> standard.  I concluded (as you do downthread) that the standards are clear
> and consistent and treat inputs/outputs as bit strings.  However, when I
> went to go compare what common implementations (openssl, in particular) do
> against what the standard says, I ran into some behaviors that I could not
> fully explain. 

The way software implementations of SHA-3/SHAKE work is that lanes
(64 bits each) are mapped into 64-bit words using the bijection:

A'[x,y] = sum(A[x,y,z]*2^z,z,0,63)

Then every operation the Keccak permutation performs on the state can
be expressed as a combination of four simple operations:

- Permuting the word order.
- Rotating the words.
- Performing bitwise AND on words.
- Performing bitwise XOR on words.
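As a concrete illustration of that claim (my sketch, not part of the original mail): the Keccak chi step can be written purely with word-wide XOR and AND, since NOT is just XOR with an all-ones word.

```python
# Sketch: the Keccak chi step on one row of five 64-bit lanes,
# using only word-wide XOR and AND (NOT = XOR with all-ones).
MASK = (1 << 64) - 1  # all-ones 64-bit word

def chi_row(a):
    """chi on a row: a[x] ^= (~a[x+1]) & a[x+2], indices mod 5."""
    return [a[x] ^ ((a[(x + 1) % 5] ^ MASK) & a[(x + 2) % 5])
            for x in range(5)]

print(chi_row([MASK, 0, MASK, 0, 0]))  # [0, 0, MASK, MASK, 0]
```

The other steps (theta, rho, pi, iota) decompose the same way into word permutations, rotations, and XORs.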

Translating words to octets can be described as storing the words
little-endian; translating octets to words can be described as loading
the words little-endian. There are no explicit bit reversals or other
annoying operations.
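To make the mapping concrete, here is a small sketch (my illustration, not from the mail) of the lane-to-word bijection and the little-endian store:

```python
import struct

def lane_to_word(bits):
    # A'[x,y] = sum(A[x,y,z] * 2**z for z in 0..63)
    return sum(b << z for z, b in enumerate(bits))

def word_to_octets(w):
    # Little-endian store: least-significant octet first.
    return struct.pack('<Q', w)

# A lane with only bit z = 0 set maps to the word 1, whose
# little-endian encoding puts that bit in the first octet's LSB.
bits = [1] + [0] * 63
print(word_to_octets(lane_to_word(bits)).hex())  # 0100000000000000
```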

This comes from Keccak, which was designed for 64-bit little-endian
CPUs; the bit order was then chosen to be convenient for that.

From that description, one can already tell there is trouble: the
first bit, A[0,0,0], ends up as the LSB of A'[0,0]. A little-endian
store places it as the LSB of the first octet, which a generic ECDSA
signhash will incorrectly interpret as the 8th bit.
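A two-line sketch of that clash (my illustration, not from the mail): given a hash octet, SHA-3's convention takes bit 0 from the LSB, while a generic MSB-first reader (as an X9.62-style signhash uses) takes it from the MSB.

```python
octet = 0x01  # hash octet whose only set bit is the SHA-3 first bit

first_bit_sha3 = octet & 1          # LSB-first: SHA-3's bit 0
first_bit_msb = (octet >> 7) & 1    # MSB-first: what a generic reader sees
print(first_bit_sha3, first_bit_msb)  # 1 0
```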


> NIST provides some examples that include both bitwise and hex
> representations of SHA-3 inputs, that helped me understand that part
> of implementation behavior, but I don't remember seeing anything that
> really provided clarity on the bitstring interpretation of the
> outputs.  

If one takes the definition of SHA-3/SHAKE from FIPS 202, that
definition already gives a bit string. The bad thing about the SHA-3
examples is that they proceed to interpret that bit string as an
octet string, without giving the original bit-string output.


If I take the 30-bit example file for SHA3-256, which is the hash of:

1 1 0 0 1 0 1 0 0 0 0 1 1 0 1 0 1 1 0 1 1 1 1 0 1 0 0 1 1 0

And compute the hash of that input using my translation of FIPS 202
into Rust, I get:

0001001100100100111101001111011100000010011110010101101010010111
1000101110001111000100111110101001110101101100100110001100100100
1001110111010100100110000000000111111001010001100101010100110001
1110000010000010001110000010101000000101000111101000110100001011

The example file itself says:

C8 24 2F EF 40 9E 5A E9 D1 F1 C8 57 AE 4D C6 24
B9 2B 19 80 9F 62 AA 8C 07 41 1C 54 A0 78 B1 D0


It turns out these two outputs are related by the b2h/h2b algorithms
from the infamous Appendix B of FIPS 202.
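To check this, here is a sketch of the b2h direction (my Python transcription of the Appendix B idea: within each octet, bits are taken LSB first). Feeding it the bit string above reproduces the example file's hex:

```python
def b2h(bits):
    """b2h per FIPS 202 Appendix B: each group of 8 bits becomes one
    octet, with the first bit of the group as the octet's LSB."""
    out = []
    for i in range(0, len(bits), 8):
        group = bits[i:i + 8]
        out.append(int(group[::-1], 2))  # reverse: LSB-first -> MSB-first
    return bytes(out)

bitstr = (
    "0001001100100100111101001111011100000010011110010101101010010111"
    "1000101110001111000100111110101001110101101100100110001100100100"
    "1001110111010100100110000000000111111001010001100101010100110001"
    "1110000010000010001110000010101000000101000111101000110100001011"
)
print(b2h(bitstr).hex())
# c8242fef409e5ae9d1f1c857ae4dc624b92b19809f62aa8c07411c54a078b1d0
```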


And if one wants to go through the NIST SHAKE examples looking for
trouble, there are the SHAKE128 truncation examples. Those show pretty
clearly that the order of bits within an octet is LSB to MSB, not the
usual MSB to LSB.
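A sketch of what that convention means for truncation (my illustration; `truncate_bits` is not a real tool): keeping the first n bits of an octet string keeps the low-order bits of the last octet, not the high-order ones.

```python
def truncate_bits(octets, n):
    """Keep the first n bits under the LSB-to-MSB bit-order convention."""
    full, rem = divmod(n, 8)
    out = bytearray(octets[:full])
    if rem:
        out.append(octets[full] & ((1 << rem) - 1))  # keep the low rem bits
    return bytes(out)

# Truncating 0xFFFF to 11 bits keeps the low 3 bits of the second
# octet (0x07); an MSB-first reading would have kept 0xE0 instead.
print(truncate_bits(b'\xff\xff', 11).hex())  # ff07
```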



Oh, and while investigating this, I found out that sha3sum (version
1.04) bit-reverses the input (octetwise) when reading in bits mode:

$ echo "1 1 0 0 1 0 1 0 0 0 0 1 1 0 1 0 1 1 0 1 1 1 1 0 1 0 0 1 1 0" | sha3sum --01 -a 256
04a488169a7fe0f21bfe38ecf30098198bbdb87ba39394b949f5c1c4e691a375 ^-
$ echo "01010011  01011000  01111011  011001 " | sha3sum --01 -a 256
c8242fef409e5ae9d1f1c857ae4dc624b92b19809f62aa8c07411c54a078b1d0 ^-



-Ilari

_______________________________________________
COSE mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/cose
