I am building a DLL that includes the OpenSSL FIPS object module. This is on 
Windows using Visual Studio 10.0. I have the 64-bit version working fine but 
when I build a 32-bit version, the "incore fingerprint" fails to match when I 
load the DLL and call FIPS_mode_set(1). I had the same problem with the 64-bit 
version at one point but then it seemed to just "fix itself" and I never saw 
the problem again.
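
For context, here is roughly how we load the DLL and turn on FIPS mode (a
minimal sketch; the DLL name "myfips.dll" and the assumption that it
re-exports FIPS_mode_set are placeholders for our actual setup):

#include <stdio.h>
#include <windows.h>

typedef int (__cdecl *FIPS_mode_set_t)(int onoff);

int main(void)
{
    HMODULE h = LoadLibraryA("myfips.dll");   /* placeholder DLL name */
    if (h == NULL) {
        fprintf(stderr, "LoadLibrary failed: %lu\n", GetLastError());
        return 1;
    }

    FIPS_mode_set_t fips_mode_set =
        (FIPS_mode_set_t)GetProcAddress(h, "FIPS_mode_set");
    if (fips_mode_set == NULL) {
        fprintf(stderr, "FIPS_mode_set not exported from the DLL\n");
        return 1;
    }

    /* Returns 0 when the power-up self-tests fail, which is where the
       incore fingerprint mismatch shows up on the 32-bit build. */
    if (!fips_mode_set(1)) {
        fprintf(stderr, "FIPS_mode_set(1) failed\n");
        return 1;
    }

    printf("FIPS mode enabled\n");
    return 0;
}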

While debugging the code, I found that when linking the DLL, the .text and 
.rodata values (start address + size = end address, with the addresses in 
hex and the sizes in decimal) are:

.text:   5CC1B000 + 302160 = 5CC64C50
.rodata: 5CCDA134 + 46364  = 5CCE5650

But when we load the DLL, the values are:

.text:   5C85B000 + 302160 = 5C8A4C50
.rodata: 5C91A134 + 46364  = 5C925650

(The load-time addresses are not always exactly these, but they never match 
the link-time values above. The sizes are always correct.)

We're calculating the fingerprint over a different chunk of memory, so 
obviously the values will not match. The question is: why are the pointers 
different?
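
For what it's worth, here is the arithmetic on those addresses (a small
sketch; addresses are hex, sizes decimal). Both sections shift by the same
constant, 0x3C0000 in the run above, so it looks as though the whole image
is simply sitting at a different base address than the one it was linked at:

#include <stdio.h>

int main(void)
{
    unsigned long text_link = 0x5CC1B000UL, text_load = 0x5C85B000UL;
    unsigned long roda_link = 0x5CCDA134UL, roda_load = 0x5C91A134UL;

    printf(".text   delta: 0x%lX\n", text_link - text_load);  /* 0x3C0000 */
    printf(".rodata delta: 0x%lX\n", roda_link - roda_load);  /* 0x3C0000 */

    /* Sanity check on the sizes (decimal): 0x5CC1B000 + 302160 == 0x5CC64C50 */
    printf(".text   link end: 0x%lX\n", text_link + 302160UL);
    printf(".rodata link end: 0x%lX\n", roda_link + 46364UL);
    return 0;
}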

Graeme Perrow

