On Sat, Sep 29, 2007 at 01:24:30PM +0200, Ondrej Certik wrote:
> 
> > I myself *really* need SymPy to be faster. For example, it takes me a
> > lot of time just waiting for the tests to pass.
> 
> The tests are annoying. What I am definitely going to do is to fix
> bugs in SymPy so that py.test -d works again (your py.test patch will
> have to be improved though).

Sure, I'll take care of this.
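
Just for reference, once -d works again, a distributed run could be driven
by something roughly like this (the host names are made up and the exact
option spelling depends on the py.test version, so take it only as a sketch):

    # run the suite load-balanced over several machines via ssh,
    # syncing the source tree to each of them first
    py.test -d --tx ssh=user@quad --tx ssh=user@duo1 --tx ssh=user@duo2 \
            --rsyncdir sympy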

> Then it is possible to run all the tests in
> about 30s on an Intel Core Quad -- I can provide you with an account. Then I
> tried 4 computers (one amd64, one Core Quad, two Core Duos) and it ran
> for about 20s. By improving the test suite and py.test, we could maybe
> even speed it up further (I don't see a reason why it couldn't scale
> linearly).
> 
> However, this is a "hammer method". If we could speed up SymPy itself,
> that would be fine, but I suspect the slowness comes simply from having
> so many tests.
> 
> The next step is rewriting parts in C++, but I would wait a little
> with that one, until the internals of SymPy stop changing as often as
> they do now. The huge disadvantage of a C++ rewrite is that we will
> need two equivalent implementations -- C++ and Python -- and keep both
> updated, because I certainly want SymPy to keep running on pure Python
> alone; the C++ should only be an optional add-on. But I believe it
> will be possible -- that's been my secret plan from the beginning --
> to write it in Python first, so that we have something now (even if
> slow), and then, once we know how to do it and it works, to rewrite
> parts in C++ or even C.

I think we should stick to Python as long as we can. Only when the
design settles should we consider rewriting some parts of the core in
C/C++.
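
Just to make the "optional add-on" idea above concrete, here is a minimal
sketch of how the fallback could look (module and file names are made up
for the example): the package tries the compiled extension first and
silently falls back to the pure-Python core, so users without a compiler
lose nothing but speed.

    # hypothetical sympy/core/__init__.py
    try:
        # optional compiled implementation of the hot spots
        from _sympycore import Basic, Add, Mul
    except ImportError:
        # pure-Python fallback, always available
        from basic import Basic
        from addmul import Add, Mul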

Also, it seems to me that it is possible to be reasonably fast even when
doing everything in Python. Let's take the test suite as our benchmark.
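
If the whole test run is too coarse a benchmark, a tiny timing script over
some representative operation would also do -- a sketch, with an arbitrarily
chosen workload:

    import time
    from sympy import Symbol

    x = Symbol('x')
    t = time.time()
    e = ((x + 1)**50).expand()      # any representative workload would do
    print "expand took %.2f s" % (time.time() - t)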

> 
> > > It
> > > would be awesome if we could cleanly separate it (even into a separate
> > > project, and just copy it into sympy, like we do with pyglet, so that
> > > end users are not affected at all),
> >
> > I do not think separating sympy.core into its own project is a good idea.
> >
> > The reason is that not only does the core affect everything else, but
> > everything else also affects the core and its design. So having both the
> > core and the 'high-level' stuff in one place will help to improve *both*.
> >
> > I agree that structuring things is good, but we are already on that
> > road, right?
> 
> I think so. I think we are doing well.
> 
> > I think what we really need, to be able to experiment easily, is a more
> > distributed development model. This may be slightly off-topic, but an
> > interested reader could go to:
> >
> > http://svk.bestpractical.com/view/HomePage
> >
> > and also:
> >
> > http://darcs.net/
> > http://www.selenic.com/mercurial/wiki/
> > http://git.or.cz/
> 
> And you forgot Bazaar and Arch. Well, I am not against it, but then we
> could use the Google Code hosting for everything except the source
> repository, and use some other hosting for that, for example:
> 
> http://repo.or.cz/
> 
> Nevertheless, before deciding not to use svn, what are the problems
> with it? For example, how would using something distributed, say git,
> help us with this Apply things transition? I myself have only used svn
> and cvs so far, but let's switch to something else if it is so much
> better.

I originally meant this to be used when, say, reworking the core.

That is not a one-minute process, and people have to branch.

If they were using something distributed and change-oriented, they could:

- periodically rebase their work to the latest trunk
- periodically merge some of the parts that are ready into the trunk
- when they finish, provide patches that apply cleanly to the main
  branch.

The main thing here is that distributed systems are designed to make
forking and merging easy. Since people do it a lot, these systems are
tailored for exactly this task.
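
With git, for instance, the cycle above looks roughly like this (branch
names made up; mercurial has equivalents):

    git checkout -b core-rework       # long-lived topic branch
    # ... hack, commit, hack, commit ...
    git fetch origin
    git rebase origin/master          # replay the work on the latest trunk
    # ... once a piece is ready ...
    git format-patch origin/master    # clean patches against the main branch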

Not to mention some other goodies they have. For example, we often have
regressions where things that used to work now fail. How do we find out
when and what broke them?

For example, git & mercurial have 'bisect' to find this quickly:
http://www.selenic.com/mercurial/wiki/index.cgi/BisectExtension
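
With git the session is roughly (the tag name is made up):

    git bisect start
    git bisect bad                  # the current revision is broken
    git bisect good sympy-0.5.1     # a revision known to work
    # git checks out a revision in between; run the failing test, then say
    git bisect good                 # or: git bisect bad
    # ... repeat until git names the first bad commit, then
    git bisect reset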


---

Of course, an SCM is not a silver bullet and won't automatically solve
every problem. What it can do, though, is assist.


Please don't get me wrong -- I'm not an SCM expert and do *not* propose
that SymPy migrate from svn to something else right now. All I am saying
is that distributed tools can be helpful, and if we end up having 'some
pain' merging cores, we could think about it.

Also, separating the core into another package won't help. It will only
delay the moment when the "merge pain" comes, and by then it will have
"accumulated".


---

In case anyone is interested:

I think we could even ask the SAGE people why they use mercurial and
what the pros & cons are for them.

A recent thread where GNOME discusses migrating from svn:
http://mail.gnome.org/archives/desktop-devel-list/2007-September/msg00064.html



-- 
    All the best, Kirill.
    http://landau.phys.spbu.ru/~kirr/aiv/


P.S. A long-time, almost-happy darcs user. I've used it since early
     2005, but darcs is now somewhat losing its momentum.
