Cryptography-Digest Digest #112

2000-02-13 Thread Digestifier

Cryptography-Digest Digest #112, Volume #11  Sun, 13 Feb 00 09:13:01 EST

Contents:
  Re: CHEATING AT PARADISE POKER (Linda Sherman)
  Re: CHEATING AT PARADISE POKER (Tony L. Svanstrom)
  Crypto-related algorithms in Perl. (Tony L. Svanstrom)
  Re: Has some already created a DATA DIODE? (Matthias Bruestle)
  Re: Basic Crypto Question 2 (John Savard)
  Re: How to Annoy the NSA (MCKAY john)
  Apple file security question (RC2 and ASC) ([EMAIL PROTECTED])
  Re: RFC: Reconstruction of XORd data ("Rick Braddam")
  Re: Crypto-related algorithms in Perl. (Jeff Zucker)
  New encryption algorithm ABC. ("Bogdan Tomaszewski")
  Re: SHA-1 sources needed! (Tom St Denis)
  Re: Has some already created a DATA DIODE? (Tom St Denis)
  Re: SHA1 and longer keys? (Tom St Denis)
  Re: Has some already created a DATA DIODE? (Tom St Denis)



From: Linda Sherman [EMAIL PROTECTED]
Crossposted-To: rec.gambling.poker
Subject: Re: CHEATING AT PARADISE POKER
Date: Sun, 13 Feb 2000 09:01:47 GMT

Joseph Ashwood wrote:
 
 Their statements are almost all fluff.
 
 The facts they do give:
 Their choice of using modular division in their so-called
 random number generation leads to a bias towards the
 low-order cards; this results in a tendency for the cards
 that started at the top of the deck to end at the top.
 
 Their choice of using the C/C++ rand() function allows for a
 good cryptographer to begin computing the future values
 without too much difficulty.

I must have read different pages than you did.

The way I read it, they don't use rand(). They are showing why they
DON'T use rand(). They admittedly confuse the issue by saying they use a
"32-bit rand() function" but on the random number generator page they
say they use a modified version of prng that takes a 2016-bit seed. prng
is most definitely not rand().
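
The modulo bias under discussion is easy to reproduce. A minimal sketch (Python for brevity; illustrative only, not Paradise Poker's actual code) enumerating every output of a 15-bit generator reduced mod 52:

```python
# Illustrative only: why `rand() % 52` biases toward low card indices
# when rand() has a small range (here 15 bits, 0..32767, a common
# RAND_MAX in older C libraries).

RAND_RANGE = 2 ** 15          # 32768 possible outputs
counts = [0] * 52

# Enumerate every possible rand() output instead of sampling,
# so the bias is exact rather than statistical noise.
for r in range(RAND_RANGE):
    counts[r % 52] += 1

# 32768 = 52 * 630 + 8, so card indices 0..7 occur 631 times
# while indices 8..51 occur only 630 times.
print(counts[0], counts[7], counts[8])   # 631 631 630
```

The remainder of 8 is exactly why the quoted pages speak of a bias towards "the first 8 cards in the deck".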

 They state where they get their random seeds from; it's
 almost a joke it's so poor. The low-order bits of the
 counter on a computer are only random when sampled at
 suitably random, fairly distant intervals; if they're doing
 it to get the stated 17 bits per second, it's no longer
 random.

Again, I think we must have read different pages. The page I read says
take seed bits from a variety of sources and sample them in different
parts of the code. This includes unpredictable external sources, such as
the clocks on 

I'm satisfied that, if they are doing what they say they are doing,
there is no way anyone can predict the next card on any deal without
actually having access to the server.

Lin
-- 
Linda K Sherman [EMAIL PROTECTED]
[EMAIL PROTECTED]  www.cti-pro.com   www.dalati.com



--

From: [EMAIL PROTECTED] (Tony L. Svanstrom)
Crossposted-To: rec.gambling.poker
Subject: Re: CHEATING AT PARADISE POKER
Date: Sun, 13 Feb 2000 10:13:20 +0100

Linda Sherman [EMAIL PROTECTED] wrote:

 Joseph Ashwood wrote:
  
  Their statements are almost all fluff.
  
  The facts they do give:
  Their choice of using modular division in their so-called
  random number generation leads to a bias towards the
  low-order cards; this results in a tendency for the cards
  that started at the top of the deck to end at the top.
  
  Their choice of using the C/C++ rand() function allows for a
  good cryptographer to begin computing the future values
  without too much difficulty.
 
 I must have read different pages than you did.
 
 The way I read it, they don't use rand(). They are showing why they
 DON'T use rand(). They admittedly confuse the issue by saying they use a
 "32-bit rand() function" but on the random number generator page they
 say they use a modified version of prng that takes a 2016-bit seed. prng
 is most definitely not rand().


From URL:http://www.paradisepoker.com/shuffling.html:

  Fortunately most rand() functions have more than 6 bits of
  result; however, even at 15 bits of result (the most common
  with C compilers), there is a definite bias towards the
  first 8 cards in the deck.

  At Paradise Poker, we chose to use a 31 bit rand()
  function (coming from a 2016 bit seed that has entropy
  constantly added to it - see random number generator for
  more details), thereby decreasing the bias even further
  (by a factor of 65536), as well as shuffling the deck more
  than once. We found that shuffling twice removes 99% of
  the bias, and shuffling 3 times makes it pretty much
  immeasurable, then we decided to triple it for good
  measure and bump it up to 10 times to keep the marketing
  people happy. These two changes result in a shuffled deck
  with no detectable bias.
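
The claimed factor of 65536 can be checked with a little arithmetic (my own back-of-envelope calculation, not taken from the site):

```python
def modulo_bias_ratio(bits, n=52):
    """Ratio between the most- and least-likely of n outcomes
    when reducing a uniform `bits`-bit value modulo n."""
    size = 2 ** bits
    lo = size // n                      # count for under-represented residues
    hi = lo + (1 if size % n else 0)    # count for over-represented residues
    return hi / lo

b15 = modulo_bias_ratio(15) - 1   # excess probability with a 15-bit rand()
b31 = modulo_bias_ratio(31) - 1   # excess probability with a 31-bit rand()
print(round(b15 / b31))           # 65552, which the page rounds to 2**16 = 65536
```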

  They state where they get their random seeds from; it's
  almost a joke it's so poor. The low-order bits of the
  counter on a computer are only random when sampled at
  

Cryptography-Digest Digest #113

2000-02-13 Thread Digestifier

Cryptography-Digest Digest #113, Volume #11  Sun, 13 Feb 00 11:13:01 EST

Contents:
  Re: Which compression is best? (Helger Lipmaa)
  Re: Message to SCOTT19U.ZIP_GUY (Tim Tyler)
  Re: Period of cycles in OFB mode (Helger Lipmaa)
  Re: Which compression is best? (SCOTT19U.ZIP_GUY)
  Re: RFC: Reconstruction of XORd data (Mok-Kong Shen)
  Re: New encryption algorithm ABC. (Tim Tyler)
  Re: Which compression is best? (Tim Tyler)
  Free C Crypto API -- Update (Tom St Denis)
  Re: Which compression is best? (Tim Tyler)
  Re: Which compression is best? (SCOTT19U.ZIP_GUY)
  Re: Guaranteed Public Key Exchanges (No Brainer)
  Re: Guaranteed Public Key Exchanges (No Brainer)



From: Helger Lipmaa [EMAIL PROTECTED]
Subject: Re: Which compression is best?
Date: Sun, 13 Feb 2000 16:27:08 +0200

Tim Tyler wrote:

 Douglas A. Gwyn [EMAIL PROTECTED] wrote:
 : [EMAIL PROTECTED] wrote:

 : 1) From a security perspective, how important is compression?  Is
 : prior compression just a kind of "weak enhancement" or is it
 : considered an integral part of the encryption process as a whole?

 : If you don't know for sure that the enemy cannot crack your
 : encryption, then precompression at least interferes with attacks
 : based on statistical characteristics of the plaintext source
 : language, which *might* reduce the chances of the enemy reading
 : your message.

 This appears to cover 100% of all cases.

 : On the other hand, if you have justifiable confidence in your
 : cryptosystem, precompression would be a waste of resources.

 While this seems to cover pretty much 0% of them.  How can you
 /know/ that any confidence you have in your cryptosystem is justifiable?
 You can't.  You have to juggle probabilities.

Use semantically secure public-key cryptosystems (ElGamal, for example). Only
with very weak cryptosystems (like some published in this newsgroup ;) does
compression really help: thwarting statistical attacks is one of the most
basic requirements of today's cryptosystems. If you don't believe a cipher
meets that requirement satisfactorily, don't use it for anything at all!


 You're compressing in order to add strength, and frustrate the analyst -
 *not* simply to remove redundancy present in the plaintext.  Size simply
 is not the only criterion which is relevant.

That is the job of encryption; frustrating the analyst has never been an
objective of compression. Leave security to cryptography and compression
to compression!

Helger Lipmaa
http://home.cyber.ee/helger



--

From: Tim Tyler [EMAIL PROTECTED]
Subject: Re: Message to SCOTT19U.ZIP_GUY
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 14:23:38 GMT

Douglas A. Gwyn [EMAIL PROTECTED] wrote:
: Tim Tyler wrote:

: Encrypting something twice does not double the time to break.
: Speaking *very* roughly, if anything, it squares it.

: If the encryption uses the same key, then it doubles the time
: for a brute-force key search.

Yes, but the passage apparently at the root of this seems to begin:

``Encrypt with Block Cipher A  ( Key kA) [...]
  Encrypt with Cipher B (Key kB) [...]
  Encrypt with Cipher C (kC) [...]''
-- 
__
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Where you go depends on how you look.

--

From: Helger Lipmaa [EMAIL PROTECTED]
Subject: Re: Period of cycles in OFB mode
Date: Sun, 13 Feb 2000 16:32:59 +0200

David Wagner wrote:

 In article [EMAIL PROTECTED], Terry Ritter [EMAIL PROTECTED] wrote:
  Clearly, an arithmetic count sequence is very regular:  The lsb
  changes every time; the next higher bit half as often, and then only
  when the lsb changes to zero.  That is a lot of statistical
  bit-position information which may be amenable to frequency and phase
  detection techniques.
 
  Now, if the block cipher is perfect in practice, all this should not
  be a problem.  But practical block ciphers may not wrap one edge to
  the other, and bit-diffusion from an edge of the block may not be as
  good as we would expect elsewhere.  The extremely strong counter
  signal may thus expose cipher problems which would otherwise be
  hidden.

 Good point.
 For instance, if there are good differential attacks, counter
 mode is skating on thin ice.

But a good cipher is not supposed to have such attacks, right? :-) If a cipher
is differentially weak, you can mount a lot of different attacks on it
(including ciphertext-only attacks). I wouldn't buy such a cipher anyway; a
good block cipher should look like a pseudorandom permutation. For ideal
ciphers (pseudorandom functions), counter mode is actually better than CBC
mode, since one cannot apply birthday attacks to it.

 Do you mean using a LFSR to drive the block cipher?
 That sounds like a good idea, if it was what you meant.

Or maybe Gray codes :-)

Helger Lipmaa
http://home.cyber.ee/helger


--


Cryptography-Digest Digest #115

2000-02-13 Thread Digestifier

Cryptography-Digest Digest #115, Volume #11  Sun, 13 Feb 00 16:13:01 EST

Contents:
  Re: Period of cycles in OFB mode (Tim Tyler)
  Re: Has some already created a DATA DIODE? (Terry Ritter)
  Re: Block chaining (Tim Tyler)
  Q: Division in GF(2^n) (Mok-Kong Shen)
  Re: Which compression is best? (Tim Tyler)
  Re: Is there a list server for this newsgroup? (Mok-Kong Shen)
  Re: SHA1 and longer keys? (Tom St Denis)
  Re: Which compression is best? (wtshaw)
  Re: Which compression is best? (wtshaw)
  Re: Guaranteed Public Key Exchanges (Mok-Kong Shen)
  Re: Message to SCOTT19U.ZIP_GUY (wtshaw)
  Re: Basic Crypto Question 2 (wtshaw)
  Re: Is there a list server for this newsgroup? (Tony L. Svanstrom)
  Basic Crypto Question 3 ([EMAIL PROTECTED])
  Re: Guaranteed Public Key Exchanges (Ralph Hilton)
  Re: New encryption algorithm ABC. ([EMAIL PROTECTED])



From: Tim Tyler [EMAIL PROTECTED]
Subject: Re: Period of cycles in OFB mode
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 17:03:06 GMT

Helger Lipmaa [EMAIL PROTECTED] wrote:
: David Wagner wrote:

[problems using counter mode?]

: Do you mean using a LFSR to drive the block cipher?
: That sounds like a good idea, if it was what you meant.

: Or may be Gray codes :-)

Gray codes /could/ be even worse.  At least with a counter a few bits
change on an increment once in a while.  With Gray codes you only ever
have one bit change at a time.
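
The difference is easy to verify with the standard binary-reflected Gray code g(n) = n XOR (n >> 1) (a sketch, not anyone's proposed implementation):

```python
def gray(n):
    """Binary-reflected Gray code of n."""
    return n ^ (n >> 1)

def hamming(a, b):
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")

# Successive Gray codes always differ in exactly one bit...
assert all(hamming(gray(n), gray(n + 1)) == 1 for n in range(2000))

# ...while a plain counter flips every low bit on a carry, e.g. 7 -> 8:
print(hamming(7, 8))  # 4 bits change (0111 -> 1000)
```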
-- 
__
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Legalise IT.

--

From: [EMAIL PROTECTED] (Terry Ritter)
Subject: Re: Has some already created a DATA DIODE?
Date: Sun, 13 Feb 2000 17:15:17 GMT


On Sun, 13 Feb 2000 13:35:54 GMT, in 886bvq$ods$[EMAIL PROTECTED], in
sci.crypt Tom St Denis [EMAIL PROTECTED] wrote:

[...]
AlgM is a good place to start, just don't use LCG's as the underlying
PRNG.  

I doubt that is nearly enough.


AlgM with two 'long' Fibonacci generators is secure.

Really?  Care to back that up?  

We have been through this before, and I have quoted at length from the
attack references.  As far as I know, MacLaren-Marsaglia techniques
have appeared exactly twice in cryptographic literature -- and they
have *failed* twice.  Yet even with this information you continue to
claim strength where there is a damn good reason to suspect that there
is no strength.  Why do you do it, and why do you do it in response to
newbies?
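
For readers who haven't seen the construction under dispute: MacLaren-Marsaglia ("AlgM") shuffles the output of one generator by using a second generator to pick table entries. A minimal sketch with arbitrary LCG parameters (illustrative structure only; per the failures cited above, it should not be presumed cryptographically strong):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """A plain linear congruential generator (parameters are illustrative)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def maclaren_marsaglia(gen_fill, gen_select, table_size=64):
    """AlgM: gen_fill feeds a table; gen_select picks which entry to emit."""
    table = [next(gen_fill) for _ in range(table_size)]
    while True:
        j = next(gen_select) % table_size   # the selector chooses a slot
        out = table[j]
        table[j] = next(gen_fill)           # refill the slot just emitted
        yield out

mm = maclaren_marsaglia(lcg(seed=1), lcg(seed=2))
stream = [next(mm) for _ in range(5)]
```

The table delays, rather than hides, the structure of the underlying generators, which is what the published attacks exploit.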

---
Terry Ritter   [EMAIL PROTECTED]   http://www.io.com/~ritter/
Crypto Glossary   http://www.io.com/~ritter/GLOSSARY.HTM


--

From: Tim Tyler [EMAIL PROTECTED]
Subject: Re: Block chaining
Reply-To: [EMAIL PROTECTED]
Date: Sun, 13 Feb 2000 17:24:48 GMT

David Wagner [EMAIL PROTECTED] wrote:
: In article [EMAIL PROTECTED], Tim Tyler  [EMAIL PROTECTED] wrote:

: However... Plaintext Block Chaining (PBC), and Plaintext FeedBack (PFB)
: modes allow parallel processing in the encryption direction.

: Are these ones secure, when used with non-random plaintexts?
: (e.g., English text as the plaintext)

: I'm not worried so much about chosen-text attacks as whether there is
: the possibility that they might share some of the weaknesses of ECB mode,
: but to a lesser extent.  Any thoughts?

A cursory examination suggests that birthday-type attacks on ECB mode,
which depend on repeated plaintext blocks, would need repeats of two
consecutive plaintext blocks (instead of one) to work against PBC mode.

Even then there'd be the possibility that the repeated cyphertext had
other causes besides repeated plaintext.  However this may be a component
of the reason why these modes do not see much use.
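
The comparison can be sketched in miniature: a keyed byte permutation stands in for a real block cipher, and `pbc` below is my one-byte rendering of plaintext block chaining (XOR the previous plaintext block into the current one before encrypting).

```python
import random

def toy_block_cipher(key):
    """A random byte permutation standing in for a real 1-byte block cipher."""
    perm = list(range(256))
    random.Random(key).shuffle(perm)
    return perm

E = toy_block_cipher(key=42)

def ecb(blocks):
    return [E[b] for b in blocks]

def pbc(blocks, iv=0):
    """Plaintext block chaining: XOR the previous plaintext block in."""
    out, prev = [], iv
    for b in blocks:
        out.append(E[b ^ prev])
        prev = b
    return out

# ECB leaks any repeated plaintext block:
assert ecb([7, 7, 9, 12])[0] == ecb([7, 7, 9, 12])[1]

# PBC hides that single repeat...
assert pbc([7, 7, 9, 12])[0] != pbc([7, 7, 9, 12])[1]

# ...but still leaks when a *pair* of consecutive blocks repeats:
c2 = pbc([5, 9, 5, 9])
assert c2[1] == c2[3]
```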

Chosen plaintexts are also a concern ;-/

I find the lack of any feedback from the output of the block cypher to
its inputs rather aesthetically displeasing.

Also, while encryption can be done in parallel, decryption cannot.

This may waste the time of the brute-force attacker - but otherwise
generally seems undesirable.
-- 
__
 |im |yler  The Mandala Centre  http://www.mandala.co.uk/  [EMAIL PROTECTED]

Dates in calendars are often closer than they appear.

--

From: Mok-Kong Shen [EMAIL PROTECTED]
Subject: Q: Division in GF(2^n)
Date: Sun, 13 Feb 2000 19:19:23 +0100

I learned from a mailing list that there is a US patent US5890800
'Method and device for the division of elements of a Galois field'
(http://www.patents.ibm.com/details?pn=US05890800) for computing
A/B in GF(2^n). The algorithm amounts to computing, via squaring
and multiplication of the numbers A and B represented in n bits,
the expression

A^(2^n) * B^(2^n-2)

The writer of a post in the mailing list then reasoned that this 
reduces to A/B, since x^(2^n-1) = 1 for all x.
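
The reduction can be sanity-checked in a small field. In GF(2^n), x^(2^n) = x for every x and x^(2^n - 1) = 1 for nonzero x, so A^(2^n) * B^(2^n - 2) = A * B^(-1); crucially, all arithmetic must be carry-less modulo the field polynomial, not integer arithmetic mod 2^n. A check in GF(2^8) with the AES polynomial (my choice of field, not the patent's):

```python
POLY = 0x11B  # x^8 + x^4 + x^3 + x + 1, the AES reduction polynomial

def gf_mul(a, b):
    """Carry-less multiplication in GF(2^8), reduced modulo POLY."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= POLY
        b >>= 1
    return r

def gf_pow(a, e):
    """Square-and-multiply exponentiation in GF(2^8)."""
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, a)
        a = gf_mul(a, a)
        e >>= 1
    return r

def gf_div(a, b):
    """A / B computed as A^(2^8) * B^(2^8 - 2), per the quoted expression."""
    return gf_mul(gf_pow(a, 2**8), gf_pow(b, 2**8 - 2))

# Sanity check: (A / B) * B == A for nonzero B.
for a in (0x01, 0x53, 0xFE):
    for b in (0x01, 0x02, 0xCA):
        assert gf_mul(gf_div(a, b), b) == a
```

As a spot check, 0x53 and 0xCA are multiplicative inverses in this field, so `gf_div(1, 0x53)` returns 0xCA.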

It seems, however, that the algorithm is doing computations all 
the way in Z_(2^n) instead of GF(2^n). So I wonder how the binary 
number thus obtained can 

Cryptography-Digest Digest #116

2000-02-13 Thread Digestifier

Cryptography-Digest Digest #116, Volume #11  Sun, 13 Feb 00 18:13:01 EST

Contents:
  Re: Guaranteed Public Key Exchanges (Mok-Kong Shen)
  Re: Basic Crypto Question 3 (Mok-Kong Shen)
  Re: Does the NSA have ALL Possible PGP keys? ("tiwolf")
  Re: Guaranteed Public Key Exchanges (Ralph Hilton)
  Re: Message to SCOTT19U.ZIP_GUY (Tim Tyler)
  Re: Which compression is best? (Jerry Coffin)
  Re: Guaranteed Public Key Exchanges (Mok-Kong Shen)
  Re: Does the NSA have ALL Possible PGP keys? ("tiwolf")
  Re: Does the NSA have ALL Possible PGP keys? ("tiwolf")
  ECES Question ! ("Manik Taneja")
  Re: Guaranteed Public Key Exchanges (Ralph Hilton)
  Re: SHA-1 sources needed! (Gilles BAUDRILLARD)
  Re: Guaranteed Public Key Exchanges (Mok-Kong Shen)



From: Mok-Kong Shen [EMAIL PROTECTED]
Subject: Re: Guaranteed Public Key Exchanges
Date: Sun, 13 Feb 2000 22:23:48 +0100

Ralph Hilton wrote:
 

 Alice and Bob wish to establish a key. All communications are monitored by
 Charlie. The communications would have to take place in public, though, to
 avoid unobserved modification. But the fact that the key exchange is public
 is irrelevant.

I guess that all the trouble practically centers on how to
guarantee (absolutely) that there is no 'unobserved modification'.

M. K. Shen

--

From: Mok-Kong Shen [EMAIL PROTECTED]
Subject: Re: Basic Crypto Question 3
Date: Sun, 13 Feb 2000 22:24:49 +0100

[EMAIL PROTECTED] wrote:
 
 Cascading Multiple Block Ciphers:
 
 1.  If a plaintext is encrypted with three ciphers, is it as strong as
 the strongest cipher in the chain or as weak as the weakest?
 
 2. Could there be some subtle interaction between the algorithms that may
 reduce security or the effective keyspace?
 
 3. Is there an optimum way of combining ciphers together, or
 rules...assuming that the cascade is made up of block ciphers of the
 same size and key length?  i.e. should one choose IDEA, 3DES, CAST128 or
 Blowfish, IDEA, CAST128, or Blowfish, RC5 and 3DES..what are the
 criteria???
 
 4. What if the ciphers have different block sizes and key lengths, is it
 still ok to cascade them?  Blowfish, Twofish, IDEA?
 
 5. Is it too complex to alternately encrypt the plaintext blocks with
 the different ciphers in one pass?  Does that make sense?

To my humble knowledge, one can only make such 'general' claims as:
if two ciphers are sufficiently different in nature, then it is
unlikely that cascading would result in weakening due to unfavourable
'interactions', and it is likely that the combination would lead to
strength greater than that of the strongest component. I don't see how
different block sizes could be an unfavourable factor, except that the
processing is technically a little more complicated. But experts may
correct my opinions.
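
The mechanics of a cascade can be sketched with two toy one-byte "ciphers" (hypothetical and purely structural; nothing here bears on real strength):

```python
def enc_a(p, key):
    """Toy cipher A: XOR with a key byte."""
    return p ^ key

def enc_b(p, key):
    """Toy cipher B: add a key byte mod 256."""
    return (p + key) % 256

def cascade_encrypt(p, ka, kb):
    """Apply cipher A, then cipher B, with independent keys."""
    return enc_b(enc_a(p, ka), kb)

def cascade_decrypt(c, ka, kb):
    """Decryption applies the inverses in the reverse order."""
    return ((c - kb) % 256) ^ ka

# Round-trip check over the full byte range:
for p in range(256):
    assert cascade_decrypt(cascade_encrypt(p, 0x5A, 0x33), 0x5A, 0x33) == p
```

With independent keys a brute-force attacker must search the joint keyspace; the subtle-interaction worry in question 2 is why the cascaded ciphers should be structurally different.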

M. K. Shen

--

From: "tiwolf" [EMAIL PROTECTED]
Crossposted-To: comp.security.pgp,misc.survivalism
Subject: Re: Does the NSA have ALL Possible PGP keys?
Date: Sun, 13 Feb 2000 13:21:56 -0800

Does anyone here really think that any crypto program, self-made or
commercial, is not broken already, or can't be broken given a little effort
by the NSA geeks? I know that someone might use some type of crypto that
might give them trouble for a while, but if they really want to, I think
the NSA geeks can break it.



cfm wrote in message ...
What's the big deal? Any one of us who wishes to spend the time can
generate all possible PGP keys. So what? Now, if they could search them,
discover which one is in use in a particular message, and then decrypt
it, that would be news; but it's also pretty far-fetched that the NSA is
performing a search across the key space for all PGP-encrypted messages on
the Internet. (This ignores the question of how all Internet traffic gets
funneled to the NSA!)

carl.

In article 8764db$vqo$[EMAIL PROTECTED], "Scott Fluhrer"
[EMAIL PROTECTED] wrote:

Anonymous [EMAIL PROTECTED] wrote in message
news:[EMAIL PROTECTED]...
 There are a couple of interesting threads on talk.politics.crypto
 originating from a cryptographer with www.filesafety.com.  They
 purport that the NSA has ALL POSSIBLE keys for PGP and that all PGP
 encrypted netmail has been "transparent" for at least two years to
 the NSA and certain elements of the military and FBI.  The
 cryptographic basis for this alleged total compromise of PGP is
 discussed.

 This is a low-traffic NG and I should like to see serious analysis of
 these claims by those who are more technically qualified to discuss
 them.
Summary: either he's nuts, he's trolling or he's deliberately lying about
his competition (I rather suspect the latter, myself).

Facts:

- The source code for older versions of PGP is publicly available.  In spite
of repeated requests from other posters, he refuses to point out where in
the source code the number of keys is limited, or where the random number
generator is crippled

- The number of 

Cryptography-Digest Digest #117

2000-02-13 Thread Digestifier

Cryptography-Digest Digest #117, Volume #11  Sun, 13 Feb 00 23:13:01 EST

Contents:
  Re: I'm returning the Dr Dobbs CDROM ("Trevor Jackson, III")
  Odp: New encryption algorithm ABC. ("Bogdan Tomaszewski")
  Re: Basic Crypto Question 3 (Bruce Schneier)
  Re: Period of cycles in OFB mode ([EMAIL PROTECTED])
  Re: Using Gray Codes to help crack DES ([EMAIL PROTECTED])
  Re: Basic Crypto Question 3 ("Joseph Ashwood")
  Re: RFC: Reconstruction of XORd data (Tim Tyler)
  Re: Block chaining (David Hopwood)
  Re: Basic Crypto Question 3 (stanislav shalunov)
  Re: Period of cycles in OFB mode (David Wagner)
  Re: Which compression is best? ("Douglas A. Gwyn")
  Re: Using Gray Codes to help crack DES ("Trevor Jackson, III")
  Re: RFC: Reconstruction of XORd data ("Douglas A. Gwyn")



Date: Sun, 13 Feb 2000 18:26:08 -0500
From: "Trevor Jackson, III" [EMAIL PROTECTED]
Subject: Re: I'm returning the Dr Dobbs CDROM

Victor Zandy wrote:

 A couple weeks ago I asked for opinions of the Dr Dobbs CDROM
 collection of cryptography books.  Overwhelmingly the response was
 positive, so I bought it.  (Thanks again to those of you who replied.)

 I am returning the CDROM because it is not suitable for printing.
 For example, to print chapter 1 of the Stinson book (44 pages) Adobe
 acroread (x86/Solaris 2.6) creates a 500MB postscript file.  I cannot
 print this file directly, probably because it is too big.  Although I
 might be able to find a way to print the file, at 500MB it would take
 too much time.

 I don't know how the PDF files on the CDROM were prepared, but
 they look like they were scanned from physical book pages.  For recent
 titles, like Stinson, they should have been generated from the
 computer originals to make a smaller file, with better image quality.

 Several people who responded to me said they appreciate being able
 to search and cut-and-paste the text on the CDROM.  For those
 features, and for anyone who doesn't mind reading books from computer
 displays, the CDROM is a great deal.  But it is useless for printing
 paper copies of its contents, even a tiny amount.

Have you considered running the PDF images through an OCR filter?  You
might squeeze the files down to a fraction of their size at the cost of a
few recognition errors.



--

From: "Bogdan Tomaszewski" [EMAIL PROTECTED]
Subject: Odp: New encryption algorithm ABC.
Date: Sun, 13 Feb 2000 23:15:29 GMT

 Then take the best compression tool
 you know and compress the cyphertext file. If there's a significant
 amount of compression as a result, your algorithm might be weak.

I tried this: 100% in, about 100% out.
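
The compression check quoted above can be scripted directly (zlib chosen arbitrarily as the compressor):

```python
import os
import zlib

def compression_ratio(data):
    """Compressed size / original size; near 1.0 means 'incompressible'."""
    return len(zlib.compress(data, 9)) / len(data)

# English-like, highly redundant input compresses well...
text = b"the quick brown fox jumps over the lazy dog " * 1000
assert compression_ratio(text) < 0.1

# ...while output indistinguishable from random noise barely compresses.
noise = os.urandom(44000)   # stands in for the output of a good cipher
assert compression_ratio(noise) > 0.99
```

A ratio near 1.0 is necessary but not sufficient: passing this test only means the cipher's weaknesses, if any, are invisible to this particular statistic.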

--
Bogdan Tomaszewski
  [EMAIL PROTECTED]

User [EMAIL PROTECTED] wrote in the newsgroup message
8872ri$7ie$[EMAIL PROTECTED]:
 In article T6yp4.8832$[EMAIL PROTECTED],
   "Bogdan Tomaszewski" [EMAIL PROTECTED] wrote:
  I discovered a new encryption algorithm. I call it ABC - Alternative
  Binary Code.
  It permits the use of keys of unrestricted length (the level of safety
  depends only on the length of the key).
  Do you know of any programs for testing entropy or breaking keys?
  Can I check somewhere whether a message after encoding meets the
  conditions of a good algorithm?

 To find out if the algorithm is good, you should post it here in
 public. I think there are many people here who can at least tell you
 when it is not so good. If you test the cyphertext output with some
 programs, you will probably learn that it is very insecure if it is
 very insecure. But if your algorithm passes the tests, this might just
 mean that it is still insecure, just not insecure enough to be detected
 by simple statistical analysis.

 I'm not an expert, just a newbie, but here's one test that might work:
 Take a very large chunk of plain text (e.g. a book in electronic form)
 and encrypt it to raw binary data.  ...
 Greetings,

 John Stone


 Sent via Deja.com http://www.deja.com/
 Before you buy.





--

From: [EMAIL PROTECTED] (Bruce Schneier)
Subject: Re: Basic Crypto Question 3
Date: Sun, 13 Feb 2000 23:10:30 GMT

First, understand that you can't mathematically prove anything more
than: a cascade of block ciphers is as strong as the weakest block
cipher in the cascade.  Realistically, though, there is every reason
to believe that a cascade is stronger than the individual ciphers.

On Sun, 13 Feb 2000 20:02:48 GMT, [EMAIL PROTECTED] wrote:

Cascading Multiple Block Ciphers:

1.  If a plaintext is encrypted with three ciphers, is it as strong as
the strongest cipher in the chain or as weak as the weakest?

I believe it is much stronger than any of the three ciphers.

2. Could there be some subtle interaction between the algorithms that may
reduce security or the effective keyspace?

Yes, that's the reason you can't prove anything very useful here.  But
such interactions are