Hi Jørgen,

great work and thanks for the benchmarks.

On Fri, Nov 14, 2014 at 03:33:30PM +0100, Jørgen Kvalsvik wrote: 
> Running the benchmark with this new implementation using Dune-istl
> with CG/ILU(0) on my Intel [email protected] I get the following
> output:
> 
> Wallclock timing:
> Input- and grid processing: 2.65775 sec
> Upscaling:                  143.47 sec
> Total wallclock time:       146.128 sec  (2 min 26.1279 sec)
> 
> Do the numbers look ok? The original, upstream code gives the following:
> Wallclock timing:
> Input- and grid processing: 2.75897 sec
> Upscaling:                  171.677 sec
> Total wallclock time:       174.436 sec  (2 min 54.4357 sec)
> 
>  [...] 
> 
> Running the same benchmark with CG/ILU on petsc:
> Wallclock timing:
> Input- and grid processing: 5.40389 sec
> Upscaling:                  445.309 sec
> Total wallclock time:       450.713 sec  (7 min 30.7128 sec)
> 
> Which brings me to the questions:
> Petsc obviously performs a LOT worse than Dune. I ran the benchmark
> in callgrind which revealed that it spends ~48% of its time inside
> petsc's PCApply. Another 43% is spent in KSP_MatMult.
> 

While it is a big compliment that dune-istl is so much faster, I can
hardly believe this. What strikes me first is the difference in the
time needed for "Input- and grid processing". Shouldn't that be the
same code for both? Both solvers are simple and should do pretty much
the same things. Therefore I assume that one of the following must be
different:

- Compile-time options. Are you sure that PETSc is compiled with
  optimization?
- Stopping criterion (Arne Morton pointed that out). dune-istl uses
  the relative reduction of the residual here; see the sketch below
  this list for how to match that on the PETSc side.
- Initial guess. (I think OPM always uses zeros for this.)
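
To make the last two points concrete, here is roughly what I have in
mind on the PETSc side. This is only a sketch, not what the OPM code
actually does: it assumes PETSc >= 3.5, A, b, x stand for an already
assembled matrix and vectors, and the rtol of 1e-8 is a placeholder.

  #include <petscksp.h>

  /* Sketch only: set up a CG/ILU(0) solve so that the stopping
     criterion and the initial guess match what the dune-istl run
     does. A, b, x are an already assembled Mat and Vecs. */
  PetscErrorCode solve_like_istl(Mat A, Vec b, Vec x)
  {
    KSP            ksp;
    PC             pc;
    PetscErrorCode ierr;

    ierr = KSPCreate(PETSC_COMM_SELF, &ksp); CHKERRQ(ierr);
    ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr); /* 3.5 signature */
    ierr = KSPSetType(ksp, KSPCG); CHKERRQ(ierr);
    ierr = KSPGetPC(ksp, &pc); CHKERRQ(ierr);
    ierr = PCSetType(pc, PCILU); CHKERRQ(ierr);       /* default fill level is 0 */

    /* dune-istl's CG stops once the unpreconditioned defect has been
       reduced by the requested factor relative to the initial defect.
       PETSc's default test uses the preconditioned residual norm
       relative to ||b||, so change both: */
    ierr = KSPSetNormType(ksp, KSP_NORM_UNPRECONDITIONED); CHKERRQ(ierr);
    ierr = KSPConvergedDefaultSetUIRNorm(ksp); CHKERRQ(ierr); /* rtol w.r.t. initial residual */
    ierr = KSPSetTolerances(ksp, 1e-8 /* placeholder rtol */,
                            PETSC_DEFAULT, PETSC_DEFAULT, PETSC_DEFAULT); CHKERRQ(ierr);

    /* Start from zero, as (I think) the istl code path does. */
    ierr = VecZeroEntries(x); CHKERRQ(ierr);
    ierr = KSPSetInitialGuessNonzero(ksp, PETSC_FALSE); CHKERRQ(ierr);

    ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr);
    ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
    ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
    return 0;
  }

For the first point I would check that configure was run with
--with-debugging=0 (and some optimization in COPTFLAGS/CXXOPTFLAGS);
a debugging build of PETSc is considerably slower. Running both codes
with a comparable relative reduction and comparing the iteration
counts (e.g. with -ksp_monitor_true_residual) should quickly show
whether the stopping criterion is the culprit.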

Cheers,

Markus

-- 
Dr. Markus Blatt - HPC-Simulation-Software & Services http://www.dr-blatt.de
Hans-Bunte-Str. 8-10, 69123 Heidelberg, Germany,  USt-Id: DE279960836
Tel.: +49 (0) 160 97590858
