On Friday, April 29, 2016 at 1:07:31 PM UTC-5, saad khalid wrote:
>
> However, what are other things? And do you think it would actually be 
> worthwhile? It's an algorithm for calculating Groebner bases. I don't 
> know if that matters. I just didn't know if the language would give any 
> sizable increase in speed. Thanks!
>

For the sake of argument, let's assume that Sage does give you an increase 
in speed. Whether it's worthwhile depends on the novelty of the algorithm 
and whether you want that increase in speed in order to study it or in 
order to set records:

If you want speed to study it, implementing it yourself in Sage is (IMHO) a 
good idea because Sage has fairly good debugging facilities like a trace 
function, and you can compile parts of it via Cython. I did this for an 
algorithm I published a paper on a couple of years ago; it wasn't the speed 
that got the paper published so much as the novelty of the algorithm and my 
ability to document certain structural issues. I'm 99% certain it would 
have been much harder to do that in Mathematica or Maple (which I once 
used) or other CAS's that use a non-standard programming language as their 
interface.
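
To make that concrete, here's the kind of workflow I have in mind, as a 
toy sketch only -- the ring, the ideal, and the little Cython function 
are placeholders I made up, not pieces of any real Groebner code:

sage: R.<x,y,z> = PolynomialRing(QQ, order='degrevlex')
sage: I = R.ideal(x*y - z^2, y^2 - x*z)
sage: trace("I.groebner_basis()")   # step through Sage's own code in the debugger
sage: %%cython
....: # compile a hot inner loop to C (works at the Sage prompt and in Jupyter)
....: def pair_count(long n):
....:     # stand-in for whatever inner loop you'd want compiled
....:     cdef long i, total = 0
....:     for i in range(n):
....:         total += i
....:     return total

The trace() call drops you into the interactive debugger inside Sage's 
own Groebner code, which is exactly the kind of thing I'm claiming is 
harder to do in the other systems.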

On the other hand, setting records is unlikely, unless you're very 
dedicated and plan to spend a lot of time and energy on it. Caboara has 
said that implementing an algorithm to compute Groebner bases doesn't take 
a lot of effort or time, but neither will it perform well. One then has to 
spend the next several years (!!!) optimizing the implementation. 
Faugere has said that he spent at least five years working on the F5 
algorithm for similar reasons. I know Christian Eder spent years 
implementing an F5-like algorithm in the Singular kernel; to be honest I 
don't remember exactly how long, but he documented parts of the 
theoretical process in papers published over 2-3 years. (The algorithm 
is called dstd().)

I know at least two people who worked on Groebner basis implementations and 
ended up leaving the CAS community to make more money in private industry. 
I know one of them was at least partly discouraged by how difficult it was 
to build an efficient implementation.

One reason for the difficulty is that *a lot* of small optimizations in 
most implementations aren't documented anywhere, neither in publications 
nor in the code itself, and they contribute an enormous amount of the 
speedup. Some of the standard algorithms published in textbooks actually 
give the wrong advice; the Becker-Weispfenning text, for instance, tells 
you to discard redundant polynomials from the ideal. (That may not be 100% 
true, but it's certainly how I understood it. They certainly imply it, and 
don't tell you otherwise.) But that's actually a bad idea, because the 
redundant polynomials are typically sparser than the polynomials you'd use 
to replace them, and dense polynomials seem to make reduction much more 
time-consuming (for hopefully obvious reasons).
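
Here's a toy illustration of that sparsity point (the polynomials are 
ones I made up on the spot, not from any real benchmark): a "redundant" 
binomial whose leading term is divisible by another generator's leading 
term becomes much denser once you replace it by its normal form.

sage: R.<x,y,z> = PolynomialRing(QQ, order='degrevlex')
sage: g = x - y - z
sage: f = x^5 - z^5        # redundant: LT(f) = x^5 is divisible by LT(g) = x
sage: h = f.reduce([g])    # the fully reduced "replacement" for f
sage: h
y^5 + 5*y^4*z + 10*y^3*z^2 + 10*y^2*z^3 + 5*y*z^4
sage: len(f.monomials()), len(h.monomials())
(2, 5)

Two terms versus five, and every extra term means extra work each time 
the polynomial gets used as a reducer.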

I would suggest that you use Sage, but rather than doing it for speed, do 
it to get a foot in the door with Sage development & the open-source 
mindset. As you work more and more in Sage, you can dig into the details 
of how Sage implements things, and that can give you ideas about how to 
modify Singular (the engine Sage uses for Groebner bases) or how to build 
your own implementation later (typically unwise).
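
For instance, you can already see the Sage/Singular division of labour 
from the top level; the particular ring and ideal below are just 
something small I picked for illustration:

sage: R.<x,y,z,t> = PolynomialRing(GF(32003), order='degrevlex')
sage: I = sage.rings.ideal.Katsura(R)              # a small standard example
sage: gb = I.groebner_basis()                      # by default this goes to Singular
sage: gb = I.groebner_basis('libsingular:slimgb')  # or pick a specific Singular algorithm
sage: I.groebner_basis?                            # the docstring lists the available backends

Stepping from that call down into the Singular interface is a good way 
to see how the pieces fit together before you try to change any of them.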
