On 25 March 2015 at 18:34, Matthew Knepley knep...@gmail.com wrote:
That file is obviously old and was removed.
After installing petsc4py, you can run 'make docs' in the top-level
source tree, and the docs/ directory will be populated. For this to
work, you need Sphinx and Epydoc installed.
On Thursday, 26 March 2015 at 13:59:33, Lisandro Dalcin wrote:
On 25 March 2015 at 18:34, Matthew Knepley knep...@gmail.com wrote:
That file is obviously old and was removed.
After installing petsc4py, you can run 'make docs' in the top-level
source tree, and the docs/ directory will be
Hello,
I'm using PETSc with petsc4py.
A matrix is created like this:
MPIrank = MPI.COMM_WORLD.Get_rank()
MPIsize = MPI.COMM_WORLD.Get_size()
print("MPI Rank = ", MPIrank)
print("MPI Size = ", MPIsize)
parts = partitions()
print("Dimension = ", nSupport + dimension, bsize =
On Thu, Mar 26, 2015 at 2:15 AM, Sanjay Kharche
sanjay.khar...@manchester.ac.uk wrote:
Dear All
I have a Fedora PC which sometimes cannot have an internet connection. I
found that I can run non-PETSc MPI programs when it is not connected to the
internet. However, when I try to run my
Thanks.
Quick question (out of ignorance): does it matter that the HEX8 elements may still be
arranged in an unstructured fashion? Meaning that although I use brick
elements, my grid does not have the appearance of a structured grid.
-Manav
On Mar 26, 2015, at 11:14 AM, Barry Smith bsm...@mcs.anl.gov
This is fine. PCGAMG does algebraic multigrid, so the mesh doesn't matter for
its use.
Barry
On Mar 26, 2015, at 11:20 AM, Manav Bhatia bhatiama...@gmail.com wrote:
Thanks.
Quick question (out of ignorance): does it matter that the HEX8 may still be
arranged in an unstructured
Florian Lindner mailingli...@xgm.de writes:
Hello,
I'm using PETSc with petsc4py.
A matrix is created like this:
MPIrank = MPI.COMM_WORLD.Get_rank()
MPIsize = MPI.COMM_WORLD.Get_size()
print("MPI Rank = ", MPIrank)
print("MPI Size = ", MPIsize)
parts = partitions()
Ok, so I ran my fluids problem with -pc_type jacobi. This time it did not
return with “inf”, but there was no convergence.
Here are the first 10 iterations:
0 KSP preconditioned resid norm 7.840061446913e+07 true resid norm 2.709083260443e+06 ||r(i)||/||b|| 1.e+00
1 KSP
On Thu, Mar 26, 2015 at 11:31 AM, Manav Bhatia bhatiama...@gmail.com
wrote:
Ok, so I ran my fluids problem with -pc_type jacobi. This time it did not
return with “inf”, but there was no convergence.
Good. This seems to confirm that ILU(0) was the problem.
It is not a surprise that Jacobi is
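For intuition, here is a minimal pure-Python sketch (no PETSc; the 3x3 system is invented for illustration) of the point-Jacobi iteration that -pc_type jacobi applies: the preconditioner only ever divides the residual by the matrix diagonal, which helps explain why it produced finite residuals where the ILU(0) factorization overflowed.

```python
# Minimal sketch of Jacobi-preconditioned Richardson iteration:
#   x <- x + D^{-1} (b - A x),  with D = diag(A)

def jacobi_richardson(A, b, iters=50):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        # residual r = b - A x, computed with the current iterate
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        # preconditioner application: divide each residual entry by A[i][i]
        for i in range(n):
            x[i] += r[i] / A[i][i]
    return x

# A diagonally dominant test system whose exact solution is x = (1, 1, 1)
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = jacobi_richardson(A, b)
```

Diagonal dominance guarantees convergence here; for a transonic SUPG system like the one above, Jacobi may keep the iterates finite yet still stagnate, as the reported output shows.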
On Thu, Mar 26, 2015 at 1:48 PM, David Knezevic david.kneze...@akselos.com
wrote:
Hi all,
I'm trying to configure PETSc using Intel's MKL and with --download-ml.
Here is my configure line:
./configure
--with-blas-lapack-dir=/opt/intel/composer_xe_2015/mkl/lib/intel64
--download-ml
I get
On Mar 26, 2015, at 10:51 AM, Manav Bhatia bhatiama...@gmail.com wrote:
Barry,
On a related note, I have another elasticity problem that I am trying to
solve with HEX8 elements. It is an isotropic solid structure. Do you have a
recommended preconditioner for this problem?
Yes,
On Thu, Mar 26, 2015 at 4:06 PM, Luc Berger-Vergiat lb2...@columbia.edu
wrote:
Ok,
this work is still part of my Schur complement approach, using the full
Schur complement but with a block-diagonal A00^-1.
I implemented the computation of A00^-1 by extracting each diagonal block
and inverting them
On Thu, Mar 26, 2015 at 3:07 PM, Luc Berger-Vergiat lb2...@columbia.edu
wrote:
Hi all,
I want to multiply two matrices together; one is MATAIJ and the second is
MATBAIJ. Is there a way to leverage the properties of the blocked matrix in
the BAIJ format, or should I just assemble the BAIJ
On Thu, Mar 26, 2015 at 4:21 PM, Matthew Knepley knep...@gmail.com wrote:
On Thu, Mar 26, 2015 at 1:48 PM, David Knezevic
david.kneze...@akselos.com wrote:
Hi all,
I'm trying to configure PETSc using Intel's MKL and with --download-ml.
Here is my configure line:
./configure
Ok,
this work is still part of my Schur complement approach, using the full
Schur complement but with a block-diagonal A00^-1.
I implemented the computation of A00^-1 by extracting each diagonal
block and inverting them individually.
This works quite well and does not cost too much, especially since I
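The idea described above can be sketched in plain Python (no PETSc; the 2x2 block size and the values are invented for illustration): A00^-1 is approximated by inverting each small diagonal block independently, never the whole matrix.

```python
# Sketch of a block-diagonal approximation to A00^{-1}: each diagonal
# block is inverted on its own, which is cheap when the blocks are small.

def invert_2x2(block):
    # direct inverse of a 2x2 matrix via the adjugate formula
    (a, b), (c, d) = block
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

def block_diag_inverse(blocks):
    # invert a block-diagonal matrix stored as a list of its diagonal blocks
    return [invert_2x2(B) for B in blocks]

blocks = [[[2.0, 1.0], [0.0, 2.0]],
          [[3.0, 0.0], [1.0, 3.0]]]
inv_blocks = block_diag_inverse(blocks)
```

In a real code the blocks would be extracted from the assembled PETSc matrix; the point of the sketch is only that the cost scales with the number of blocks times the (small, fixed) cost of each local inverse.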
The difference between IS(1), IS(2) and IS(1)', IS(2)' is that they
operate on two different matrices. The former operate on A whereas the
latter operate on S, and these matrices have different sizes, so it's not
obvious to me that the ISs would be identical. I guess it depends on the
initial ordering
On Thu, Mar 26, 2015 at 5:10 PM, David Knezevic david.kneze...@akselos.com
wrote:
On Thu, Mar 26, 2015 at 4:21 PM, Matthew Knepley knep...@gmail.com
wrote:
On Thu, Mar 26, 2015 at 1:48 PM, David Knezevic
david.kneze...@akselos.com wrote:
Hi all,
I'm trying to configure PETSc using
Eric,
I have now updated all the standard MPI matrix types AIJ, BAIJ, and SBAIJ to
print the correct global indices in the error messages when a new nonzero
location is generated, thus making this issue easier to debug. In the branches
barry/fix-inserting-new-nonzero-column-location, next
Dear All
I have a Fedora PC which sometimes cannot have an internet connection. I found
that I can run non-PETSc MPI programs when it is not connected to the internet.
However, when I try to run my PETSc-based program without a connection, I get
the following error. By googling a little bit,
On Thu, Mar 26, 2015 at 8:16 AM, Florian Lindner mailingli...@xgm.de
wrote:
Hello,
I'm using PETSc with petsc4py.
A matrix is created like this:
MPIrank = MPI.COMM_WORLD.Get_rank()
MPIsize = MPI.COMM_WORLD.Get_size()
print("MPI Rank = ", MPIrank)
print("MPI Size = ", MPIsize)
On Thu, Mar 26, 2015 at 10:36 AM, Ataollah Mesgarnejad
ames...@tigers.lsu.edu wrote:
Dear all,
I was wondering if someone can tell me how to get a field's Vec (subVec
of the global Vec) from a distributed DMPlex. It seems that DMCreateFieldIS
works for a sequential DMPlex but gives incorrect
Hi,
I am using the KSP linear solver for my system of equations, without any
command line options at this point. I have checked that the L1 norms of my
system matrix and the force vector are finite values, but the KSP solver is
returning with an “inf” residual in the very first iteration.
Thanks, Matt.
Following is the output with: -ksp_monitor_lg_residualnorm -ksp_log -ksp_view
-ksp_monitor_true_residual -ksp_converged_reason
0 KSP preconditioned resid norm inf true resid norm 2.709083260443e+06 ||r(i)||/||b|| 1.e+00
Linear solve did not converge due
Dear all,
I was wondering if someone can tell me how to get a field's Vec (subVec of
the global Vec) from a distributed DMPlex. It seems that DMCreateFieldIS
works for a sequential DMPlex but gives incorrect values for a distributed
DMPlex.
Many thanks,
Ata
The default preconditioner with ILU(0) on each process is not appropriate for
your problem and is producing overflow. Try -sub_pc_type lu and see if that
produces a different result.
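A hedged sketch of how that suggestion might look on the command line; only the -sub_pc_type lu flag comes from the advice above, while the executable name and process count are hypothetical:

```shell
# Replace the default ILU(0) subdomain solver with a full LU factorization
# on each process; ./my_app and -n 4 are placeholders for illustration.
mpiexec -n 4 ./my_app -sub_pc_type lu -ksp_monitor_true_residual
```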
Is this a Stokes-like problem?
Barry
On Mar 26, 2015, at 10:10 AM, Manav Bhatia bhatiama...@gmail.com
On Thu, Mar 26, 2015 at 9:21 AM, Manav Bhatia bhatiama...@gmail.com wrote:
Hi,
I am using the KSP linear solver for my system of equations, without any
command line options at this point. I have checked that the L1 norms of my
system matrix and the force vector are finite values, but the
On Mar 26, 2015, at 10:19 AM, Manav Bhatia bhatiama...@gmail.com wrote:
Thanks, Barry. I will try that.
These are the Euler flow equations discretized with SUPG. The mesh is made of
4-noded tetrahedra. The flow parameters correspond to transonic flow.
Yes, ILU could easily fail on this
Thanks, Barry. I will try that.
These are the Euler flow equations discretized with SUPG. The mesh is made of
tetrahedra. The flow parameters correspond to transonic flow.
-Manav
On Mar 26, 2015, at 10:17 AM, Barry Smith bsm...@mcs.anl.gov wrote:
The default preconditioner with