I noticed that the `H` member of `gcm128_context` seems to be unnecessary for builds that aren't using the 1-bit GCM math. Since this member is fairly large (128 bits), and some applications may have many GCM contexts relative to the amount of memory available, I think it would be great to include the `H` member in the structure only when the 1-bit math is used.
I tried to write a patch to do this myself, but I got stuck: the assembly language code does pointer math assuming that the `H` member (which it doesn't use, AFAICT) is present, and I can't understand the assembly well enough to adjust all the offsets to make the code work with `H` removed. Could somebody who understands the assembly code (probably Andy) modify it to use symbolic names for the offsets used to access `Xi`, `H`, and `Htable`? If so, then I can write the patch to conditionally exclude `H` on platforms that don't need it after `CRYPTO_gcm128_init` finishes executing.

Also, I wonder how important it is to keep the 1-bit math code? It seems pretty much every platform has optimized code that uses 4-bit or 8-bit math.

Thanks,
Brian
-- 
https://briansmith.org/