All,
I am trying to match some EM problems with analytical solutions.
How do I deal with infinity when using a uniform grid?
How far out do I need to process, and do I need to use a particular technique?
John
On Mon, Sep 22, 2014 at 7:36 AM, John Alletto 4bikerboyj...@gmail.com
wrote:
All,
I am trying to match some EM problems with analytical solutions.
How do I deal with infinity when using a uniform grid?
It depends on your problem. Keep making it bigger and see if you get
convergence.
How
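Matt's suggestion can be checked cheaply outside PETSc: pick a model problem whose analytical solution decays, truncate the domain at ±L, and watch the error at a fixed point fall as L grows. A minimal sketch in plain Python (the screened-Poisson model problem and all parameters are my own illustration, not from the thread):

```python
# Sketch: effect of truncating an infinite domain on a uniform grid.
# Model problem (illustration only): -u'' + u = delta(x) on the whole
# real line has the decaying solution u(x) = exp(-|x|)/2, so u(0) = 0.5.
# We impose u(+-L) = 0 on a truncated domain [-L, L] and watch the error
# at x = 0 shrink as L grows.

def solve_truncated(L, h):
    """Solve -u'' + u = delta on [-L, L], u(+-L) = 0, uniform spacing h."""
    n = int(round(2 * L / h)) - 1        # number of interior nodes (odd here)
    inv_h2 = 1.0 / (h * h)
    a = [-inv_h2] * n                    # sub-diagonal
    b = [2.0 * inv_h2 + 1.0] * n         # diagonal
    c = [-inv_h2] * n                    # super-diagonal
    d = [0.0] * n                        # right-hand side
    mid = n // 2                         # node sitting exactly at x = 0
    d[mid] = 1.0 / h                     # discrete delta function
    # Thomas algorithm (tridiagonal solve): forward elimination ...
    for i in range(1, n):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # ... then back substitution.
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return u[mid]

for L in (2, 4, 8):
    print(L, abs(solve_truncated(L, 0.05) - 0.5))
```

With the grid spacing held fixed, the truncation error shrinks roughly like e^{-2L} until it falls below the discretization error of the grid itself, at which point enlarging the domain stops helping.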
Barry Smith bsm...@mcs.anl.gov writes:
On Sep 21, 2014, at 12:35 PM, De Groof, Vincent Frans Maria
vincent.de-gr...@uibk.ac.at wrote:
the natural norm for positive definite systems in PETSc uses the
preconditioner B, and is defined by r' * B * r. Am I right assuming
that this way we want
Matthew Knepley knep...@gmail.com writes:
On Mon, Sep 22, 2014 at 7:36 AM, John Alletto 4bikerboyj...@gmail.com
wrote:
All,
I am trying to match some EM problems with analytical solutions.
How do I deal with infinity when using a uniform grid?
It depends on your problem. Keep making it
On Mon, Sep 22, 2014 at 9:08 AM, Jed Brown j...@jedbrown.org wrote:
Matthew Knepley knep...@gmail.com writes:
On Mon, Sep 22, 2014 at 7:36 AM, John Alletto 4bikerboyj...@gmail.com
wrote:
All,
I am trying to match some EM problems with analytical solutions.
How do I deal with infinity
On Mon, 22 Sep 2014, Matthew Hills wrote:
Hi PETSc Team,
I'm still experiencing difficulties with configuring PETSc with TAU. I'm
currently:
building OpenMPI
1. ./configure --prefix=${SESKADIR}/packages/openmpi
2. make all install
set library path
1. export
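The truncated step above presumably points the environment at the new OpenMPI install; a sketch of what is typically meant (the directory layout is an assumption based on the --prefix in step 1):

```shell
# Assumes OpenMPI was installed with --prefix=${SESKADIR}/packages/openmpi
export PATH=${SESKADIR}/packages/openmpi/bin:$PATH
export LD_LIBRARY_PATH=${SESKADIR}/packages/openmpi/lib:$LD_LIBRARY_PATH
```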
Sorry for late reply. I just pushed a fix for the crash. It is in master.
Stefano
On Fri, 5 Sep 2014, Jed Brown wrote:
Satish Balay ba...@mcs.anl.gov writes:
Perhaps the following is the fix [with proper comments, more error
checks?]. But someone more familiar with this code should check
On Sep 22, 2014, at 8:54 AM, Jed Brown j...@jedbrown.org wrote:
Barry Smith bsm...@mcs.anl.gov writes:
On Sep 21, 2014, at 12:35 PM, De Groof, Vincent Frans Maria
vincent.de-gr...@uibk.ac.at wrote:
the natural norm for positive definite systems in PETSc uses the
preconditioner B, and
Barry Smith bsm...@mcs.anl.gov writes:
All true. Switching the CG “norm” does not affect the algorithm (in
exact arithmetic); it only affects the norm that is printed and when
the algorithm stops. The same minimization principles hold
independent of the “norm” used.
Yup, and the
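For anyone following along, the norm KSP uses for monitoring and stopping can be switched at run time; a sketch (the executable name is a placeholder for your own application):

```shell
# -ksp_norm_type accepts: none, preconditioned, unpreconditioned, natural
./myapp -ksp_type cg -ksp_norm_type natural -ksp_monitor
```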
Dear all,
Sorry for the delay on this topic.
Thank you Gaetan for your suggestion. I had thought about doing that
originally, but I had left it out since I thought that a rank owned the entire
row of the matrix (and not only the sub-diagonal part). I will certainly give
it a try.
I still
I'll add it. It would not take too long, just a matter of priority.
I'll try to get it done in a day or two, then let you know when it works.
Hong
On Mon, Sep 22, 2014 at 12:11 PM, Antoine De Blois
antoine.debl...@aero.bombardier.com wrote:
Dear all,
Sorry for the delay on this topic.
Thank
Dear all,
I am using the ASM preconditioner to solve a transposed system through
MatSolveTranspose. Strangely, the results I obtain differ in each call. Are
there any non-deterministic operations within ASM? If I use it in a
non-transposed way, I get correct results... If I use GASM, then the
Please email the data file to petsc-ma...@mcs.anl.gov or if you fear it is
too large tell us from where we may download it.
Barry
On Sep 22, 2014, at 1:28 PM, Antoine De Blois
antoine.debl...@aero.bombardier.com wrote:
Dear all,
I am using the ASM preconditioner to solve a
Hi,
I am new to PETSc and trying to determine if GPU speedup is possible
with the 3D Poisson solvers. I configured 2 copies of 'petsc-master' on
a standalone machine, one with CUDA toolkit 5.0 and one without (both
without MPI):
Machine: HP Z820 Workstation, Red Hat Enterprise Linux 5.0
CPU:
On 09/22/2014 12:57 PM, Chung Shen wrote:
Dear PETSc Users,
I am new to PETSc and trying to determine if GPU speedup is possible with the
3D Poisson solvers. I configured 2 copies of 'petsc-master' on a standalone
machine, one with CUDA toolkit 5.0 and one without (both without MPI):
Machine:
Dominic, I second a request for such a branch.
Thanks,
Ashwin
On Mon, Sep 22, 2014 at 3:38 PM, Dominic Meiser dmei...@txcorp.com wrote:
On 09/22/2014 12:57 PM, Chung Shen wrote:
Dear PETSc Users,
I am new to PETSc and trying to determine if GPU speedup is possible with
the 3D Poisson
All,
I have two code baselines: one uses a standard 7-point star stencil, the other a
13-point star stencil.
The first baseline works; the second comes back with errors: MatSetValuesStencil
Argument out of range
In the second baseline I have a fourth-order 13-point stencil which spans ±2
in all
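A common cause of this particular error is creating the DMDA with stencil width 1 while inserting values at offsets of ±2, which then fall outside the allowed stencil. A sketch of the relevant call, assuming a DMDA is used (grid sizes are placeholders, error checking omitted):

```c
#include <petscdmda.h>

/* For a 13-point star stencil spanning +-2 along each axis, the DMDA
 * must be created with stencil width 2, not 1. */
DM da;
DMDACreate3d(PETSC_COMM_WORLD,
             DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
             DMDA_STENCIL_STAR,
             -65, -65, -65,                             /* global grid size (placeholders) */
             PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,  /* process layout */
             1,                                         /* degrees of freedom per node */
             2,                                         /* stencil width: 2, not 1 */
             NULL, NULL, NULL, &da);
```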
You should not need to provide the same information elsewhere. Please send
the entire error message.
On Sep 22, 2014, at 4:02 PM, Alletto, John M john.m.alle...@lmco.com wrote:
All,
I have two code baselines: one uses a standard 7-point star stencil, the other a
13-point star stencil.
I am solving one of the PETSc 3D Laplacian examples with a 7 point stencil of
width 1 and in a separate baseline with a 13 point stencil of width 2 (a more
accurate mesh).
What worked fast in terms of solvers and preconditioner for the less accurate
baseline was non-optimal (very slow) for
John,
For any non-trivial size problem for the Laplacian you definitely want to
use multigrid. You can start by trying algebraic multigrid on both cases with
-pc_type gamg
Barry
On Sep 22, 2014, at 6:41 PM, Alletto, John M john.m.alle...@lmco.com wrote:
I am solving one of the
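Barry's suggestion as a concrete command line (the executable name is a placeholder; -log_summary is optional but useful for comparing the two baselines):

```shell
./mylaplacian -pc_type gamg -ksp_type cg -ksp_monitor -log_summary
```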
Are there any PML example problems using PETSc?
John
On Sep 22, 2014, at 7:52 AM, Matthew Knepley knep...@gmail.com wrote:
On Mon, Sep 22, 2014 at 9:08 AM, Jed Brown j...@jedbrown.org wrote:
Matthew Knepley knep...@gmail.com writes:
On Mon, Sep 22, 2014 at 7:36 AM, John Alletto
On Mon, Sep 22, 2014 at 8:51 PM, John Alletto 4bikerboyj...@gmail.com
wrote:
Are there any PML example problems using PETSc?
I do not believe we have any.
Thanks,
Matt
John
On Sep 22, 2014, at 7:52 AM, Matthew Knepley knep...@gmail.com wrote:
On Mon, Sep 22, 2014 at 9:08 AM,
Hi all,
I am using PETSc (dev version) to solve the Stokes + temperature equations. My
DM has fields (vx, vy, p, T).
I would like to use nested fieldsplits to separate the T part from the Stokes
part, and apply a Schur complement approach to the Stokes block.
Unfortunately, I keep getting this
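For a 4-field DMDA ordered (vx, vy, p, T), the nested split is usually attempted with options along these lines (a sketch; the field indices assume the ordering above, and exact option names can vary between PETSc versions):

```shell
# Outer split: {vx, vy, p} (Stokes) vs {T}
-pc_type fieldsplit
-pc_fieldsplit_0_fields 0,1,2
-pc_fieldsplit_1_fields 3
# Inner split on the Stokes block: Schur complement on the pressure
-fieldsplit_0_pc_type fieldsplit
-fieldsplit_0_pc_fieldsplit_type schur
-fieldsplit_0_pc_fieldsplit_schur_fact_type upper
```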
Hi all,
below is the complete error message and list of options.
Best,
Arthur
STARTING SOLVE FOR TIMESTEP: 1
0 KSP unpreconditioned resid norm 7.5605e+10 true resid norm 7.5605e+10 ||r(i)||/||b|| 1.e+00
[0]PETSC ERROR: DMCreateFieldDecomposition() line 1274 in
Antoine,
That is one nasty matrix! You are actually getting essentially garbage
during the solution process with and without the transpose. There is no reason
to think that the additive Schwarz method, or any standard iterative method
will work much at all on this matrix.
You
Hi Dominic,
I've got some time available at the end of this week for a merge to
next. Is there anything other than PR #178 needed? It currently shows
some conflicts, so is there any chance to rebase it on ~Thursday?
Best regards,
Karli
On 09/22/2014 09:38 PM, Dominic Meiser wrote:
On
Dominic Meiser dmei...@txcorp.com writes:
- To get reliable timing you should configure PETSc without debugging
(i.e. --with-debugging=no)
- The ILU preconditioning in your GPU benchmark is done on the CPU. The
host-device data transfers are killing performance. Can you try to run
with the
Alletto, John M john.m.alle...@lmco.com writes:
I am solving one of the PETSc 3D Laplacian examples with a 7 point stencil of
width 1 and in a separate baseline with a 13 point stencil of width 2 (a more
accurate mesh).
What worked fast in terms of solvers and preconditioner for the less