From: [email protected] [[email protected]] On Behalf Of jtg [[email protected]]
> o The profiler plots are especially cool. Were these produced by the
> products Dr. Guyer mentioned?

Yes, RunSnakeRun was used in conjunction with cProfile. It does a nice job.

> Looks like a lot more fun than piecing together cProfile results! And how
> could you go wrong with something called "runsnakerun"?

Exactly.

> o A quick clarification: were the plots, especially those showing exec time
> vs grid cells, done *after* the mods were in place?

Yes, the preconditioners and residual norms were matched before any profiling
or comparison was done. We needed to make sure, as Wheeler said, that we were
comparing apples to apples.

> On Wed, Jun 2, 2010 at 12:38 PM, O'Beirne, James Frederick
> <[email protected]<mailto:[email protected]>> wrote:
> >
> > For benchmarking purposes, we disabled the use of
> > any preconditioners and matched Trilinos' residual
> > norm to PySparse's choice, the ``b'' norm. We did
> > so by making the following changes:
> >
> > ...
> >
> > Once we made these changes, iteration numbers and
> > residuals were in agreement for the two solvers.

> o Good point about setting the 2 solvers on an equal footing for benchmarking
> purposes. You note that the moded versions compared well with each other --
> how do they compare with their pre-moded selves? I will make your suggested
> changes and compare the pre- and post-modification solutions to see what I
> get, but I'm wondering about your opinion as to these changes in general:
> are they appropriate for the production code?

We haven't done any comparison between identical runs with and without
preconditioning. My inclination is to believe that since preconditioning is
relatively inexpensive and lowers the number of iterations needed to reach a
given tolerance, the runs with preconditioning would be faster than those
without. Again, we don't have any results that confirm this; it's just a
guess on my part.
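For anyone wanting to reproduce the profiling workflow above: a minimal sketch
follows. The `work` function is just a stand-in for an expensive call (e.g. a
FiPy solve), not actual FiPy code. cProfile dumps its stats to a file that
RunSnakeRun can open graphically (`runsnake profile.out`); the same file can
also be inspected textually with the standard-library pstats module.

```python
import cProfile
import pstats

def work():
    # Stand-in for an expensive computation such as a solver call.
    return sum(i * i for i in range(100000))

# Collect profile data and dump it to a file that tools like
# RunSnakeRun can open (e.g. `runsnake profile.out`).
profiler = cProfile.Profile()
profiler.enable()
work()
profiler.disable()
profiler.dump_stats("profile.out")

# The same stats file can be browsed textually with pstats:
stats = pstats.Stats("profile.out")
stats.sort_stats("cumulative").print_stats(5)
```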
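The preconditioning trade-off discussed above (cheap per-iteration work in
exchange for fewer iterations) is easy to see on a toy problem. Below is a
minimal pure-NumPy sketch, not FiPy or Trilinos code: a textbook
preconditioned conjugate gradient with an optional Jacobi (diagonal)
preconditioner, using the relative ``b'' norm stopping criterion mentioned in
the quoted text. The test matrix is made up for illustration.

```python
import numpy as np

def pcg(A, b, M_inv=None, tol=1e-8, max_iter=2000):
    """Conjugate gradient with an optional preconditioner.

    M_inv, if given, applies the inverse preconditioner to a vector.
    Returns (solution, iteration_count).
    """
    x = np.zeros(len(b))
    r = b - A @ x
    z = M_inv(r) if M_inv else r
    p = z.copy()
    rz = r @ z
    b_norm = np.linalg.norm(b)
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        # Convergence test on the "b" norm: ||r|| / ||b|| < tol.
        if np.linalg.norm(r) / b_norm < tol:
            return x, k
        z = M_inv(r) if M_inv else r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Ill-conditioned SPD test matrix: diagonal entries spread over three
# orders of magnitude plus a small symmetric off-diagonal coupling.
n = 200
rng = np.random.default_rng(0)
A = np.diag(np.logspace(0, 3, n)) + 0.1 * (np.eye(n, k=1) + np.eye(n, k=-1))
b = rng.standard_normal(n)

diag = np.diag(A)
x_plain, iters_plain = pcg(A, b)
x_jacobi, iters_jacobi = pcg(A, b, M_inv=lambda r: r / diag)  # Jacobi
print("iterations without preconditioning:", iters_plain)
print("iterations with Jacobi preconditioning:", iters_jacobi)
```

On matrices like this the Jacobi-preconditioned run needs far fewer
iterations, which is the effect the reply above speculates would make
preconditioned production runs faster overall.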
> o Will make your suggested mods and try them on the demos and then our model.
> Thanks very much, James, for the insight into FiPy's innards.

My pleasure; thanks for the feedback.

Regards,
James
