OK, I found that I should not use rsapk.Load for a StringSource, but
should instead use rsapk.BERDecodePrivateKey
(privateKey, false, privateKey.MaxRetrievable());
Is that correct? Is anything wrong with that?
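For context, a minimal sketch of what that call looks like in isolation, assuming the key blob is a raw PKCS#1 RSAPrivateKey rather than a PKCS#8 PrivateKeyInfo (the helper name LoadRawRsaKey and the der/derLen parameters are illustrative, not from the original code):

```cpp
// Sketch, not a tested implementation. In Crypto++, RSA::PrivateKey::Load()
// (which calls BERDecode()) expects a PKCS#8 PrivateKeyInfo structure, while
// PKCS8PrivateKey::BERDecodePrivateKey() decodes the inner, algorithm-specific
// key (a PKCS#1 RSAPrivateKey). If the DER blob is the raw RSAPrivateKey,
// Load() throws BERDecodeErr and BERDecodePrivateKey() is the right call.
#include <cryptopp/rsa.h>
#include <cryptopp/filters.h>

using namespace CryptoPP;

RSA::PrivateKey LoadRawRsaKey(const byte* der, size_t derLen)
{
    // A StringSource IS-A BufferedTransformation, so passing it directly
    // to the decode routine is fine; true = pump all input immediately.
    StringSource src(der, derLen, true);
    RSA::PrivateKey key;
    // parametersPresent = false: no AlgorithmIdentifier parameters precede
    // the key material in a raw RSAPrivateKey.
    key.BERDecodePrivateKey(src, false, src.MaxRetrievable());
    return key;
}
```

Note this sketch requires the Crypto++ library to compile; it only illustrates the Load-vs-BERDecodePrivateKey distinction asked about above.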


On Apr 28, 2:10 pm, yourfriend <[email protected]> wrote:
> My code is like this:
>
>         StringSource privateKey( defaultPrivateKey, defaultPrivateKeyLength, true );
>
>         CryptoPP::RSA::PrivateKey rsapk;
>         rsapk.Load(privateKey);
>         LC_RNG rng( 0 ); // ignored for RSA -- but not for others if you ever use another signer
>         CryptoPP::RSASSA_PKCS1v15_SHA_Signer signer( rsapk );
>         std::string signature;
>         StringSource m(buf, fsize, true,
>                 new SignerFilter(rng, signer,
>                 new StringSink( signature)
>                 )//SignerFilter
>                 );//StringSource
>
> but it throws a BERDecodeErr when it reaches the line rsapk.Load
> (privateKey). At that point privateKey.m_store has a first byte of
> 0, just as BERDecode expects, but inside Load the
> BufferedTransformation& bt parameter has an invalid m_buf. So the
> real problem here seems to be how to get the StringSource into the
> BufferedTransformation correctly, right?
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the "Crypto++ Users" 
Google Group.
To unsubscribe, send an email to [email protected].
More information about Crypto++ and this group is available at 
http://www.cryptopp.com.
-~----------~----~----~----~------~----~------~--~---
