On Fri, 16 Oct 1998, Matthew Donadio wrote:
> Dave Tweed wrote:
> > But there are a lot of problems, probably more in the hazy region
> > between science & engineering, where `numerically intensive' algorithms
> > are developed which don't look anything like existing classical
> > techniques. Here the issue is to generate CORRECT results REASONABLY
> > QUICKLY, i.e., the
...
> One of my previous jobs was developing new receiver techniques for
> digital communication systems. Given a description of the communication
> channel we were working with (point-to-point microwave, cellular, etc.)
> we could generate our theoretical error rate curves based on system
> parameters. Our job was to see how close we could get to the curve; i.e.
> there isn't a concept of correct results, just degrees of success. We
As clarification, when I wrote `correct' above I meant: this program
implements the scheme defined by my mathematical equations, except in
areas such as floating point vs. real arithmetic where I understand the
degree of approximation. Unfortunately my algorithms often produce
results which reflect neither reality nor my hopes for their output even
when implemented correctly :-( .
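(For instance, the sort of floating point vs. real gap I have in mind,
as a made-up C fragment that has nothing to do with my actual programs:

  #include <stdio.h>

  int main(void)
  {
      double sum = 0.0;
      int i;

      /* in real arithmetic this is exactly 1; with IEEE doubles the
         rounding in each addition leaves it slightly short */
      for (i = 0; i < 10; i++)
          sum += 0.1;
      printf("%.17g\n", sum);   /* prints 0.99999999999999989, not 1 */
      return 0;
  }

That kind of discrepancy I understand and can budget for; the results
that worry me are wrong for other reasons.)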
> while. Even a 1.5 times slowdown would mean that we could run fewer
> simulations per day. For numerically intensive applications like this,
> where running time is measured in hours, speed really does matter. A
I guess it depends on how good you are at debugging & how complex the
logic in your code is. A current run of one of my programs takes ~25
minutes; I'd be happy to wait 1 1/2 - 2 hours with a language that was
much simpler, because I can easily spend three or four times as long
trying to find the typo or incorrect conversion that's completely
preventing convergence.
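To give a flavour of the kind of bug I mean -- a made-up C fragment, not
taken from my code -- one stray integer division where a real division
was intended leaves the iterate stuck at its starting value, so the
convergence test never passes:

  #include <math.h>
  #include <stdio.h>

  int main(void)
  {
      int n = 4;            /* damping parameter */
      double x = 1.0;       /* initial guess for the fixed point of cos */
      double weight;

      /* damped fixed-point iteration: x <- (1-w)*x + w*cos(x) */
      while (fabs(x - cos(x)) > 1e-12) {
          weight = 1 / n;   /* BUG: integer division gives 0.0, so x   */
                            /* never changes and the loop never exits; */
                            /* the intended expression is 1.0 / n      */
          x = (1.0 - weight) * x + weight * cos(x);
      }
      printf("fixed point of cos: %g\n", x);
      return 0;
  }

The compiler accepts that happily; you only find out by staring at a run
that never terminates.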
___cheers,_dave__________________________________________________________
email: [EMAIL PROTECTED] I _never_ buy items advertised by those
www.cs.bris.ac.uk/~tweed/pi.htm annoying cards that drop from magazines
work tel: (0117) 954-5253 or by bulk e-mail on principle.