On Monday, 17 October 2016 at 08:46:36 UTC, Era Scarecrow wrote:
encrypting multiple times won't get you the original value
My impression is different. This is what decryption looks like for ChaCha:

void ECRYPT_decrypt_bytes(ECRYPT_ctx *x,const u8 *c,u8 *m,u32 bytes)
{
  ECRYPT_encrypt_bytes(x,c,m,bytes);
}
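That is, the reference code just calls the encryption routine again: the cipher XORs the data with a keystream, so running the same operation a second time gives back the original bytes. A minimal sketch of that property, using a toy keystream purely as a stand-in (this is not ChaCha's actual keystream):

#include <stdio.h>
#include <stdint.h>

/* Toy keystream byte generator -- a stand-in for the ChaCha block
   function, purely for illustration and NOT secure. */
static uint8_t toy_keystream(uint32_t i)
{
    return (uint8_t)(i * 251u + 17u);
}

/* XOR the buffer against the keystream.  Because XOR is its own
   inverse, the identical call both encrypts and decrypts, which is
   why ECRYPT_decrypt_bytes can simply forward to
   ECRYPT_encrypt_bytes. */
static void xor_stream(uint8_t *buf, uint32_t len)
{
    for (uint32_t i = 0; i < len; i++)
        buf[i] ^= toy_keystream(i);
}

int main(void)
{
    uint8_t msg[] = "extended block test";
    xor_stream(msg, sizeof msg - 1);   /* "encrypt" */
    xor_stream(msg, sizeof msg - 1);   /* "encrypt" again -> plaintext back */
    printf("%s\n", (char *)msg);       /* prints: extended block test */
    return 0;
}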
On Monday, 17 October 2016 at 08:20:23 UTC, Kagamin wrote:
On Wednesday, 12 October 2016 at 10:34:52 UTC, Era Scarecrow
wrote:
Maybe it would be better for random number generation rather
than secure encryption? Not sure.
It's used in the Windows CRNG to compute a big hash of a big amount of entropy. BTW, if you encrypt something twice, isn't it
On Wednesday, 12 October 2016 at 10:34:52 UTC, Era Scarecrow
wrote:
Anyways, a 16-bit replacement, extending to 64-bit via reordering, and 8 unique XORs between stages. Once I get the 576 block finished (1 for salt) I'll probably publish my ugly code for consideration and to be torn apart for
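The actual code hasn't been posted, so purely as an illustration of the kind of stage being described (a 16-bit substitution, reordering across 64-bit groups, and XOR mixing between stages), something along these lines; the replacement function, word count, and mixing pattern here are made-up placeholders, not the design from the thread:

#include <stdint.h>

#define NWORDS 8   /* assumed: 8 x 16-bit words = two 64-bit groups */

/* Placeholder 16-bit substitution; an odd multiplier keeps it a
   bijection on 16-bit values.  A real design would use a stronger,
   likely keyed, replacement table. */
static uint16_t sub16(uint16_t v)
{
    return (uint16_t)(v * 40503u + 12345u);
}

/* One stage: replace every 16-bit word, reorder the words with a
   stride-3 permutation (which moves some of them into the other
   64-bit group), then XOR each word with its neighbour so changes
   bleed into other words before the next stage.  A real stage would
   also mix in key material. */
static void stage(uint16_t block[NWORDS])
{
    uint16_t tmp[NWORDS];

    for (int i = 0; i < NWORDS; i++)          /* 16-bit replacement   */
        tmp[i] = sub16(block[i]);

    for (int i = 0; i < NWORDS; i++)          /* reorder across words */
        block[(i * 3) % NWORDS] = tmp[i];

    for (int i = 0; i < NWORDS; i++)          /* XOR mixing           */
        block[i] ^= block[(i + 1) % NWORDS];
}

Each step in this sketch is invertible (the multiplier is odd, the reorder is a permutation, and the XOR chain can be unwound), which any real decryption path would require.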
On Sunday, 9 October 2016 at 20:33:29 UTC, Era Scarecrow wrote:
Something coming to mind is the idea of making a small
algorithm to be used with other already existing encryption
functions to extend the blocksize of encryption with minimal
complexity growth.
For fun I'm experimenting with
On Monday, 10 October 2016 at 09:54:32 UTC, Era Scarecrow wrote:
The largest portion would be that, much like a hash, one small change will change the entire thing rather than a smaller portion (as it would with the original blocksize). The multiple re-arranging and encryption steps are to ensure small
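That hash-like behaviour is full diffusion: after enough re-arrange-and-encrypt passes, every output byte of the wide block should depend on every input byte. A rough, non-secure sketch of that round structure, with a toy function standing in for the existing cipher (AES, ChaCha, etc.) and with the block sizes, pass count, and permutation all assumed for illustration:

#include <stdint.h>
#include <string.h>

#define SUBBLOCK 16                   /* block size of the underlying cipher */
#define NSUB      4                   /* 4 sub-blocks -> 64-byte wide block  */
#define WIDE     (SUBBLOCK * NSUB)
#define PASSES    3                   /* multiple re-arrange+encrypt passes  */

/* Toy stand-in for an existing block cipher (e.g. AES-128).  The
   forward-then-backward chaining makes every output byte of the
   sub-block depend on every input byte; it is NOT secure. */
static void encrypt_subblock(uint8_t b[SUBBLOCK], uint8_t key)
{
    uint8_t c = key;
    for (int i = 0; i < SUBBLOCK; i++)      { c = (uint8_t)((b[i] ^ c) * 5 + 1); b[i] = c; }
    for (int i = SUBBLOCK - 1; i >= 0; i--) { c = (uint8_t)((b[i] ^ c) * 3 + 7); b[i] = c; }
}

/* Re-arrangement: a transpose-style byte permutation that scatters
   each sub-block's bytes across all sub-blocks before the next pass. */
static void rearrange(uint8_t w[WIDE])
{
    uint8_t tmp[WIDE];
    for (int i = 0; i < WIDE; i++)
        tmp[(i % SUBBLOCK) * NSUB + (i / SUBBLOCK)] = w[i];
    memcpy(w, tmp, WIDE);
}

/* Wide-block encryption: encrypt each sub-block with the existing
   cipher, re-arrange, repeat.  After the second pass every byte of
   the wide block already depends on every input byte, giving the
   "one small change alters everything" behaviour. */
static void wide_encrypt(uint8_t w[WIDE], uint8_t key)
{
    for (int p = 0; p < PASSES; p++) {
        for (int s = 0; s < NSUB; s++)
            encrypt_subblock(w + s * SUBBLOCK, (uint8_t)(key + p));
        rearrange(w);
    }
}

The permutation and pass count here are arbitrary; the point is only that each pass lets the underlying cipher mix together bytes that the re-arrangement brought in from different sub-blocks.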
On Monday, 10 October 2016 at 03:15:07 UTC, sarn wrote:
End users won't want to permute+encrypt multiple times unless
you can show there's a benefit in the speed/security tradeoff,
so you'll need to think about that in the design.
The largest portion would be that, much like a hash, one small
On Sunday, 9 October 2016 at 20:33:29 UTC, Era Scarecrow wrote:
Something coming to mind is the idea of making a small
algorithm to be used with other already existing encryption
functions to extend the blocksize of encryption with minimal
complexity growth. In theory this would extend a
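To put sarn's speed/security point in numbers for the structure sketched above (with all sizes assumed for illustration): a wide block of n sub-blocks processed in r re-arrange-and-encrypt passes costs r·n calls to the underlying cipher plus some cheap byte shuffling, so a 64-byte block built from a 16-byte cipher with 3 passes is 12 block-cipher calls against 4 for encrypting the same bytes independently, roughly a 3x slowdown that the larger effective block size would have to justify.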