Dear Tim:
This would be a very ambitious project. In brief, I think
it would be more manageable to collect survey papers and
books rather than a list of papers leading up to the most
recent or efficient algorithms. The reason is that it is
difficult to "order" the latter (chronological order by
publication date is only an approximation, owing to the lag
between the research work and final publication, and it may
lead to claims that are difficult to adjudicate). Survey
papers or books have already done the "distillation" and are
usually easier to follow. At least, I think that might be a
first step. The next stage could be a refinement to include
the critical papers themselves.
William
On Sat, 9 Aug 2014 22:54:14 -0400 (EDT)
Tim Daly <[email protected]> wrote:
The time has come, it seems to me, to organize an effort to
collect and standardize symbolic algorithms, similar in spirit
to the NIST Handbook of Mathematical Functions.

It should be possible to order algorithm development for things
like integration, starting with Liouville's work, then Risch, etc.
The idea is to provide the algorithm and a series of improvements
in some reasonably accessible pseudocode, perhaps with some
agreed-upon benchmark of time and space complexity. There should
also be an associated website with a cache of the papers for each
algorithm. The book would be updated yearly with new developments.

I have been collecting bibliographic references as part of the
Axiom project and have recently started organizing them by topic.

http://axiom-developer.org/axiom-website/bookvolbib.pdf

Is a NIST-like algorithm collection reasonable? Opinions welcome.
Tim Daly
[email protected]
William Sit, Professor Emeritus
Mathematics, City College of New York
Office: R6/291D Tel: 212-650-5179
Home Page: http://scisun.sci.ccny.cuny.edu/~wyscc/
_______________________________________________
Axiom-developer mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/axiom-developer