I apologise if this e-mail is sent to an inappropriate mailing list, but as I am a little desperate I am willing to give it a try.  As an electronic engineering student, statistics and probability are not my strong point.

I am trying to develop a high-level characterisation of the power consumption of arithmetic units.  This basically involves designing a circuit and simulating its power consumption for different input distributions.
 
I am trying to use the entropy of the input vectors as a variable to control the power consumption of the circuits.  Initial tests work well: with normally distributed data of low entropy, power consumption is low, and it increases as the entropy is increased.

The problem I have is: with two input vectors, how do I calculate the combined entropy?  Is it just the sum of the individual input entropies, or is it more complex?
 
For an individual vector, this is what I do:
The numbers can be in the range -128 to 127.  I randomly generate 200 numbers which are normally distributed (mean 0).  Then I estimate the probability of each number in the range by dividing the range into 256 slots (one per value) and taking the relative frequency, so the probability of each number is (number in slot)/200.
 
Thus the entropy is:

H(X) = -Σ P(x_i) log P(x_i)
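To make the procedure concrete, here is a small sketch of this estimate in Python (using NumPy; the particular mean, spread, and random seed are my own illustrative choices, not anything fixed by the method):

```python
import numpy as np

rng = np.random.default_rng(0)

# 200 samples, roughly normal with mean 0, clipped into the signed 8-bit range
samples = np.clip(rng.normal(0, 20, 200).round(), -128, 127).astype(int)

# One slot per possible value: -128..127 gives 256 slots
counts = np.bincount(samples + 128, minlength=256)

# Estimate P(x_i) as the relative frequency: count in slot / number of samples
p = counts / counts.sum()

# H(X) = -sum P(x_i) log2 P(x_i), skipping empty slots (0 log 0 = 0)
nz = p > 0
H = -np.sum(p[nz] * np.log2(p[nz]))
print(H)
```

With log base 2 the result is in bits; since there are only 256 possible values, H can never exceed 8 bits (and with 200 samples the empirical estimate is capped lower still, at log2(200)).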

(So I suppose my first question is: is this right?)

Now, if I have two sets of 200 normally distributed input vectors (X, Y) in the range -128 to 127, then presumably

H(X,Y) = -ΣΣ P(x_i, y_j) log P(x_i, y_j)

But how do you calculate P(x_i, y_j)?
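For concreteness, here is the kind of joint estimate I have in mind: relative frequencies over a 2-D table of the (x, y) pairs (again Python with NumPy, and again the distribution parameters are just illustrative).  A known fact that may be relevant to the question above: H(X,Y) ≤ H(X) + H(Y), with equality exactly when X and Y are independent, so simply adding the individual entropies is only correct for independent inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent normally distributed input vectors, 200 samples each
x = np.clip(rng.normal(0, 20, 200).round(), -128, 127).astype(int)
y = np.clip(rng.normal(0, 20, 200).round(), -128, 127).astype(int)

# Joint frequencies: count how often each (x_i, y_j) pair occurs
joint = np.zeros((256, 256))
for xi, yi in zip(x, y):
    joint[xi + 128, yi + 128] += 1
p_xy = joint / joint.sum()            # estimate of P(x_i, y_j)

def entropy(p):
    """Shannon entropy in bits of a probability table, ignoring empty cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy)                  # joint entropy H(X,Y)
H_x = entropy(p_xy.sum(axis=1))       # marginal entropy H(X)
H_y = entropy(p_xy.sum(axis=0))       # marginal entropy H(Y)

# H(X,Y) <= H(X) + H(Y); equal only when X and Y are independent
print(H_xy, H_x + H_y)
```

One caveat with this empirical approach: 200 samples spread over a 256 x 256 table is very sparse, so the joint estimate is much noisier than the individual ones.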

Can anyone out there help me?  PLEASE.

Thanks

Nigel, 
