On Fri, 2007-03-16 at 09:55 -0400, Chris King wrote:
> On 3/16/07, skaller <[EMAIL PROTECTED]> wrote:
> > The idea is that you can write:
> >
> >         fun f(x:float): float => ... cost x * 2.0;
> >         fun g(x:float): float => ... cost x * x;
> >
> > so now, g is chosen for small x, whilst f is chosen for big x.
> 
> I think this sounds like a good idea, but who chooses the constants?

Good question. One method is to use the config script to
time operations. Another is not to use constants at all, but
variables, which means making the choice at run time.

Note that ATLAS (BLAS on steroids) already does exactly that:
it generates source code tuned by config-time tests.
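
Something like this would do for the timing step -- a rough Python
sketch, where the operation set and the output file are just made up
for illustration:

        # Time a few primitive operations at config time and emit
        # constants that cost formulas can refer to.
        import time

        def time_op(op, reps=1000000):
            t0 = time.perf_counter()
            for _ in range(reps):
                op()
            return (time.perf_counter() - t0) / reps

        ops = {
            "FLOAT_MUL": lambda: 1.0000001 * 1.0000002,
            "FLOAT_ADD": lambda: 1.0000001 + 1.0000002,
        }

        with open("cost_constants.py", "w") as out:
            for name, op in ops.items():
                out.write("%s = %r\n" % (name, time_op(op)))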

> e.g. Say you implement f and give it cost x * 2.0 because it does two
> multiplications for every x, and I implement it (without knowledge of
> yours) and give it cost x * 3.0 because it does three additions for
> every x.  On a PowerPC this is all good, but on a 386 multiplications
> take much longer than additions and the cost measurement would be
> incorrect.  One idea would be for the user to specify a cost function
> of the form x * _ (where _ represents some constant), and (as you
> suggested) let Felix profile the functions to fill in the constants.
> The user would of course have to supply example data, e.g.
> 
> fun quicksort[n] (a: int^n): int^n => ... cost _ * x * lg x;
> 
> run_profile(quicksort, (3,5,2,1,4));
> 
> so Felix wouldn't pick inappropriate test data (as would be disastrous
> in the case of algorithms like quicksort).
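
FWIW the fit itself is mechanical once you have (size, time) pairs
from the profile runs -- it's a least-squares fit through the origin.
A rough Python sketch, with invented sample data and the n * lg n
shape taken from the example above:

        # Fit c in  time ~= c * n * lg n  from profiled (n, time) pairs.
        import math

        def fit_constant(samples, shape=lambda n: n * math.log2(n)):
            # least squares through the origin: c = sum(t*f) / sum(f*f)
            num = sum(t * shape(n) for n, t in samples)
            den = sum(shape(n) ** 2 for n, _ in samples)
            return num / den

        # (input size, measured seconds) from profile runs -- made up here
        samples = [(1000, 0.0011), (10000, 0.014), (100000, 0.19)]
        print(fit_constant(samples))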

The constants are hard to get right, but the O() performance is
important. I'm even thinking of something like this:

        average_cost log n * n    worst_cost n * n

so if you specify the command line switch 'hard-real-time' it uses
the worst-case cost, otherwise the average cost.
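
i.e. the switch just decides which cost expression drives the choice
between implementations. A toy Python sketch -- the flag name and the
constants in the cost models are invented:

        import math

        # Each implementation carries an average-case and a worst-case
        # cost model; the constants are made up for illustration.
        IMPLS = {
            "mergesort": {"average": lambda n: 1.0 * n * math.log2(n),
                          "worst":   lambda n: 1.0 * n * math.log2(n)},
            "quicksort": {"average": lambda n: 0.8 * n * math.log2(n),
                          "worst":   lambda n: 0.01 * n * n},
        }

        def choose(n, hard_real_time=False):
            model = "worst" if hard_real_time else "average"
            return min(IMPLS, key=lambda name: IMPLS[name][model](n))

        print(choose(100000))                       # quicksort (average case)
        print(choose(100000, hard_real_time=True))  # mergesort (worst case)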

The idea here is to actually use theorem proving at compile
time to show, for example, that

        n * n < n * n * n

for sufficiently large n.
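
Even that much can be checked mechanically for polynomial costs --
a sketch using sympy, assuming the cost expressions are available as
symbolic formulas in n:

        # Check that one cost expression is asymptotically below another.
        import sympy

        n = sympy.symbols('n', positive=True)

        def dominates(cheap, expensive):
            # cheap grows strictly slower iff cheap/expensive -> 0
            return sympy.limit(cheap / expensive, n, sympy.oo) == 0

        print(dominates(n * n, n * n * n))   # True
        print(dominates(n * n * n, n * n))   # False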

For a data structure it's even harder: you'd need to weight
the operations, e.g. 3 reads for each write.
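
i.e. pick the structure that minimises the cost of the expected mix
of operations. A toy Python sketch -- the structures and per-operation
costs are invented numbers:

        # Weight per-operation costs by the expected operation mix,
        # e.g. 3 reads for each write.
        COSTS = {                         # invented relative costs
            "array":       {"read": 1, "write": 1},
            "sorted_list": {"read": 1, "write": 10},
            "hash_map":    {"read": 2, "write": 2},
        }

        def weighted_cost(name, reads_per_write=3):
            c = COSTS[name]
            return reads_per_write * c["read"] + c["write"]

        print(min(COSTS, key=weighted_cost))   # cheapest under a 3:1 mix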

To do all this properly is like proving correctness -- it's way
beyond the scope of this project at this time.

But getting *something* is better than nothing. Eiffel has
pre- and post-conditions, and whilst they're not that useful,
the IDEA of programming by contract is very useful.

It always helps to have something lame for researchers to
criticise .. it's much easier for them than criticising a
vacuum.


BTW: guessing at the constants isn't bad. Consider the decision
'should we parallelise this operation or not?'

Well, for a matrix of side n, if you pick the formula

        parallel = n > 50

it would be better than nothing: small matrices use serial ops,
large ones are done in parallel. Optimal? No. But probably better
than using one method or the other for all values of n.
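
The dispatch itself is trivial once the threshold exists. A Python
sketch -- the threshold of 50 and the multiply routines are only
placeholders:

        from multiprocessing import Pool

        def serial_matmul(a, b):
            n = len(b)
            return [[sum(x * b[k][j] for k, x in enumerate(row))
                     for j in range(n)] for row in a]

        def _one_row(args):
            row, b = args
            n = len(b)
            return [sum(x * b[k][j] for k, x in enumerate(row))
                    for j in range(n)]

        def parallel_matmul(a, b):
            with Pool() as pool:
                return pool.map(_one_row, [(row, b) for row in a])

        def matmul(a, b, threshold=50):
            # the threshold is a guess, not a measurement
            f = parallel_matmul if len(a) > threshold else serial_matmul
            return f(a, b)

        if __name__ == "__main__":
            small = [[1.0] * 10 for _ in range(10)]
            print(matmul(small, small)[0][0])   # serial path: 10.0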



-- 
John Skaller <skaller at users dot sf dot net>
Felix, successor to C++: http://felix.sf.net
