Here's an example of measuring performance in J.

First, I define some inefficient operations whose
performance I will measure.

   F=:3 :'6!:2 ''+/i.'',":y'"0
   H=:3 :'r=.0 for_n.i.y do.r=.r+n end.'
   G=:3 :'6!:2 ''H '',":y'"0

In this case, I did not factor them very meaningfully;
the benchmarking is wired right into the functions
themselves.  (But the functions are useless on their own,
so I am not inclined to reuse them.)
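
One could factor this more reusably by keeping the
operation under test separate from the timing harness.
Here is a minimal sketch along those lines (sum, timeit
and t are names I am making up for illustration):

   sum=: 3 : '+/ i. y'                      NB. operation under test, on its own
   timeit=: 4 : '6!:2 x , '' '' , ": y'"1 0 NB. time one run of the sentence "x y"
   t=: 'sum' timeit 1e3#1e6                 NB. 1000 individual timings, like F 1e3#1e6

Likewise, 'H' timeit 1e3#2e3 would take the place of G.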

Next, I collect some data (a thousand individual timings
of each operation):

   T1=: F 1e3#1e6
   T2=: G 1e3#2e3

Finally, I examine that data:

   plot /:~T1
   plot /:~T2
   plot /:~@".&>;:'T1 T2'
   plot T1
   plot T2
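
The plot verb here comes from the plot addon, so it may
need to be loaded first:

   require 'plot'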

What's interesting, to me, is the shape of these curves.
In particular, most, but not all, of the values are
relatively close to the mean:

   +/1=+/T1 >/~ 0.9 1.1 * avg T1
866
   +/1=+/T2 >/~ 0.9 1.1 * avg T2
305
   +/1=+/T2 >/~ 0.8 1.2 * avg T2
865
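
In case that idiom is opaque: 0.9 1.1 * avg T1 gives the
two bounds, >/~ builds a 2-by-1000 comparison table, and
counting the columns that hold exactly one 1 counts the
timings that fall between the bounds.  A more literal
version (within10 is a made-up name, and avg is defined
here in case the usual mean is not already in your
session) might look like:

   avg=: +/ % #                             NB. arithmetic mean
   NB. count of values within 10% of their mean
   within10=: 3 : '+/ (y >: 0.9*avg y) *. y < 1.1*avg y'

within10 T1 and within10 T2 then count the same thing as
the first two lines above.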

But some of the values vary widely:

   (>./%<./) T1
4.17399
   (>./%<./) T2
1.73824

I expect that this last variation comes from occasional
garbage collection overhead, a consequence of throwing
around large regions of memory.  I believe this comes
from my OS rather than from J itself.

But, also, a factor of 2 is usually, though not always, a
sufficient threshold for ignoring measurement noise.
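
To apply that threshold mechanically, one possibility
(trim is a made-up name, and "twice the fastest run" is
just one reasonable reading of the threshold):

   trim=: 3 : 'y #~ y < 2 * <./ y'   NB. keep runs under twice the fastest

Then # trim T1 says how many of the 1000 runs survive, and
avg trim T1 gives a mean with the slow outliers discarded.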

FYI,

-- 
Raul