Radim,

Thanks for the pointer to the accuracy studies. Frankly, I haven't read them.

It would seem to me that computing 500 values very accurately is
currently either prohibitively costly or not very plausible in
large-scale solvers, regardless of approach. It would be interesting
to compare if it were feasible. I am also dubious that, at scale, it
really matters that much: even double-precision arithmetic would
perhaps accumulate enough error over that many flops. In practical
terms, people don't care much if they skip a couple of documents, or
even hundreds, when processing millions; that may be a disturbance far
greater than any accumulated precision error. But like you said, it's
just my opinion.
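To illustrate the point about error accumulating over many flops (this is just a sketch I'm adding here, not something from the studies): even in plain double precision, naively summing ten million copies of 0.1 drifts visibly from the correctly rounded result that Python's math.fsum computes.

```python
import math

# Hypothetical illustration: naive left-to-right accumulation of
# 10 million doubles vs. math.fsum, which tracks rounding error exactly.
n = 10_000_000
naive = sum(0.1 for _ in range(n))        # plain accumulation
exact = math.fsum(0.1 for _ in range(n))  # correctly rounded sum

error = abs(naive - exact)
print(f"naive={naive!r} exact={exact!r} error={error:.3e}")
```

On a typical build the naive sum is off by on the order of 1e-4 here, which is tiny in absolute terms but many orders of magnitude above one ulp; whether that matters next to skipping whole documents is exactly the trade-off in question.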
