I'm testing a very simple SSL web server. Everything seems to work OK
with 1024-bit RSA and DSA keys.

I tried using a 2048-bit DSA key and now I'm getting errors:


# Generate DSA parameters
openssl dsaparam -out dsa_param.pem -outform PEM 2048

# Generate a certificate request
openssl req -newkey dsa:dsa_param.pem \
-keyout netcorp_privkey_dsa.pem -keyform PEM \
-out netcorp_req.pem -outform PEM

# Issue a certificate from a certificate request
openssl ca -in netcorp_req.pem
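
For context, the issued certificate and the DSA private key are then
loaded into the server's SSL_CTX. A minimal sketch of that step
(load_credentials() and the "netcorp_cert.pem" file name are
illustrative; the real code also has to supply the key's passphrase if
one was set):

#include <openssl/ssl.h>

/* Load the issued certificate and the DSA private key into the
 * server context. "netcorp_cert.pem" stands for wherever the
 * certificate produced by "openssl ca" was saved. */
int load_credentials(SSL_CTX *ctx)
{
        if (SSL_CTX_use_certificate_file(ctx, "netcorp_cert.pem",
                                         SSL_FILETYPE_PEM) != 1)
                return -1;

        if (SSL_CTX_use_PrivateKey_file(ctx, "netcorp_privkey_dsa.pem",
                                        SSL_FILETYPE_PEM) != 1)
                return -1;

        /* Make sure the key matches the certificate. */
        if (SSL_CTX_check_private_key(ctx) != 1)
                return -1;

        return 0;
}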


On the server side I set up a callback function for DH parameters:

#include <stdio.h>
#include <openssl/dh.h>
#include <openssl/ssl.h>

extern DH *dh1024, *dh2048;   /* loaded once by init_dhparams() */
void init_dhparams(void);

DH *tmp_dh_callback(SSL *ssl, int is_export, int keylength)
{
        printf("keylength = %d\n", keylength);

        if (dh1024 == NULL || dh2048 == NULL)
                init_dhparams();

        switch (keylength)
        {
                case 1024:
                        return dh1024;

                case 2048:
                        return dh2048;

                default:
                        /* Fall back to 1024-bit parameters. */
                        return dh1024;
        }
}
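
For reference, init_dhparams() just loads pre-generated DH parameters
and the callback is registered on the server's SSL_CTX. A minimal
sketch of both (the "dh1024.pem"/"dh2048.pem" file names and the
setup_dh() helper are placeholders for my actual setup code):

#include <stdio.h>
#include <openssl/dh.h>
#include <openssl/pem.h>
#include <openssl/ssl.h>

DH *dh1024 = NULL;
DH *dh2048 = NULL;

DH *tmp_dh_callback(SSL *ssl, int is_export, int keylength);

/* Load pre-generated DH parameters from PEM files, e.g. created
 * with "openssl dhparam -out dh1024.pem 1024" and
 * "openssl dhparam -out dh2048.pem 2048". */
void init_dhparams(void)
{
        FILE *fp;

        fp = fopen("dh1024.pem", "r");
        if (fp != NULL) {
                dh1024 = PEM_read_DHparams(fp, NULL, NULL, NULL);
                fclose(fp);
        }

        fp = fopen("dh2048.pem", "r");
        if (fp != NULL) {
                dh2048 = PEM_read_DHparams(fp, NULL, NULL, NULL);
                fclose(fp);
        }
}

/* Register the callback on the server context during setup. */
void setup_dh(SSL_CTX *ctx)
{
        SSL_CTX_set_tmp_dh_callback(ctx, tmp_dh_callback);
}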

Then when I use Firefox to connect to the server I get:

Thread starting
keylength = 1024
SSL_accept() error
error:1409441B:SSL routines:SSL3_READ_BYTES:tlsv1 alert decrypt error
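
The "error:1409441B" line is the OpenSSL error queue dumped after
SSL_accept() fails; roughly, the accept path looks like this
(do_accept() is just an illustrative wrapper name):

#include <stdio.h>
#include <openssl/err.h>
#include <openssl/ssl.h>

/* Accept a connection and dump OpenSSL's error queue on failure;
 * this is roughly how the log lines above are produced. */
int do_accept(SSL *ssl)
{
        int ret = SSL_accept(ssl);

        if (ret <= 0) {
                printf("SSL_accept() error\n");
                /* Prints the "error:1409441B:..." line. */
                ERR_print_errors_fp(stderr);
        }

        return ret;
}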

Any ideas why I'm getting a decrypt error with OpenSSL? Is this related
to the fact that tmp_dh_callback() is passed a 1024-bit key length even
though the certificate was set up with a 2048-bit key? Why does this
happen?