Re: [petsc-users] PETSC_VIEWER_DRAW_(Comm) Fortran interface missing?
Thanks Satish!

> On Mar 8, 2018, at 9:50 AM, Satish Balay wrote:
>
> Great! I've merged balay/ftn-VIEWER_DRAW/maint to 'maint'
>
> Satish
>
> On Thu, 8 Mar 2018, Tim Steinhoff wrote:
>
>> Yeah, thanks Satish.
>> PETSC_VIEWER_DRAW_() works perfectly and I could not reproduce my
>> above mentioned issue with PETSC_VIEWER_STDOUT_(), so probably I
>> messed something up.
>> Thanks again for your great support,
>> Volker
>>
>> 2018-03-06 18:14 GMT+01:00 Satish Balay :
>>> Ok - I've added ftn interface for PETSC_VIEWER_DRAW_() [similar to
>>> PETSC_VIEWER_STDOUT_()] in branch balay/ftn-VIEWER_DRAW/maint.
>>>
>>> You can give it a try and see if it works.
>>>
>>> Satish
>>>
>>> On Tue, 6 Mar 2018, Satish Balay wrote:
>>>
>>> Hm - I do see the fortran interface code for PETSC_VIEWER_STDOUT_()
>>> https://bitbucket.org/petsc/petsc/src/049452fa31a1384b9850ba16ebcec79c99e4198c/src/sys/classes/viewer/impls/ascii/ftn-custom/zvcreatef.c?at=master=file-view-default
>>> and it does appear to work for me.

balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
$ git diff
diff --git a/src/mat/examples/tests/ex16f90.F90 b/src/mat/examples/tests/ex16f90.F90
index 6bb83b4644..fee44aea95 100644
--- a/src/mat/examples/tests/ex16f90.F90
+++ b/src/mat/examples/tests/ex16f90.F90
@@ -15,6 +15,7 @@
       PetscInt one
       PetscScalar v(1)
       PetscScalar, pointer :: array(:,:)
+      PetscViewer vw

       call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
@@ -49,7 +50,8 @@
!
!     Print the matrix to the screen
!
-      call MatView(A,PETSC_VIEWER_STDOUT_WORLD,ierr);CHKERRA(ierr)
+      vw = PETSC_VIEWER_STDOUT_(PETSC_COMM_WORLD)
+      call MatView(A,vw,ierr);CHKERRA(ierr)
!

balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
$ mpiexec -n 2 ./ex16f90
Mat Object: 2 MPI processes
  type: mpidense
9.e+00 4.5000e+00
4.5000e+00 3.e+00
3.e+00 2.2500e+00
balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
$

Satish

On Tue, 6 Mar 2018, Tim Steinhoff wrote:

> Hi Barry,
>
> thanks for your fast response. Yes, I'm fine with that.
> Indeed, my code with the routine PETSC_VIEWER_STDOUT_(comm) was
> compilable, but turned out it did not work properly at runtime
> (hangs).
> When using PETSC_VIEWER_DRAW_(comm), the compiler immediately gives an
> error and I was confused.
>
> Thanks,
>
> Volker
>
> By the way, the output parameter description of
> PetscViewerASCIIGetStdout is missing in the documentation:
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerASCIIGetStdout.html
>
> 2018-03-06 16:26 GMT+01:00 Smith, Barry F. :
>>
>>    Tim,
>>
>>    The PETSC_VIEWER_STDOUT_(comm) construct doesn't work in Fortran. Only
>> PETSC_VIEWER_STDOUT_WORLD or PETSC_VIEWER_STDOUT_SELF work. This is also
>> true for draw; see src/ksp/ksp/examples/tutorials/ex100f.F90
>>
>>    Note that you can use PetscViewerDrawOpen() to get a draw to open on
>> any communicator you like.
>>
>>    Does this answer your question or did I misunderstand?
>>
>>    Barry
>>
>>> On Mar 6, 2018, at 9:02 AM, Tim Steinhoff wrote:
>>>
>>> Hi all,
>>>
>>> it seems like the routine PETSC_VIEWER_DRAW_(comm) is not available
>>> in Fortran, while PETSC_VIEWER_STDOUT_(comm) is.
>>> Is there anything I can do about it? (I am using the current maint
>>> branch of PETSc)
>>>
>>> Thank you very much and kind regards,
>>>
>>> Volker
[petsc-users] make: *** No rule to make target `chk_makej', needed by `all'. Stop.
Hi,

I've just been trying to compile SLEPc on a cluster, and am getting the following error message after I type "make":

> ./configure
Checking environment... done
Checking PETSc installation... done
Checking LAPACK library... done
Writing various configuration files... done
Generating Fortran stubs... done

=== SLEPc Configuration ===

SLEPc directory: /marconi/home/userexternal/slanthal/slepc-git
  It is a git repository on branch: master
PETSc directory: /marconi/home/userexternal/slanthal/petsc-git
  It is a git repository on branch: master
Architecture "arch-complex" with double precision complex numbers

xxx=xxx
Configure stage complete. Now build the SLEPc library with (gnumake build):
  make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
xxx=xxx

> make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
make: *** No rule to make target `chk_makej', needed by `all'. Stop.

Does someone maybe know what the cause of this could be? Make is correct in that there is no rule for chk_makej, but I'm assuming it should really be some kind of function? However, since I don't understand what the dependency on chk_makej exactly does, I'm a bit lost...

Thanks for your help in advance!
Samuel
Re: [petsc-users] No VecGetLocalSubVector?
On Wed, 2018-03-07 at 16:13 -0700, Jed Brown wrote:
> No reason, just didn't need it. I don't think any PETSc developers use
> VecSetValues* because VecScatter is a more natural interface with lower
> overhead.

Any demos of its recommended use?

Garth

> Garth Wells writes:
>
>> Is there a reason why there is no 'VecGetLocalSubVector' to match
>> MatGetLocalSubMatrix?
>>
>> I've used VecGetSubVector, but can't use local indices with it, e.g.
>> can't use VecSetValuesLocal.
>>
>> Garth
Re: [petsc-users] No VecGetLocalSubVector?
Garth Wells writes:

> On Wed, 2018-03-07 at 16:13 -0700, Jed Brown wrote:
>> No reason, just didn't need it. I don't think any PETSc developers use
>> VecSetValues* because VecScatter is a more natural interface with lower
>> overhead.
>
> Any demos of its recommended use?

snes/examples/tutorials/ex28.c is my recommended demo for both MatGetLocalSubMatrix() and composite vectors. You can follow the example (using DMComposite) or call VecScatter directly.
Re: [petsc-users] No VecGetLocalSubVector?
Garth Wells writes:

> On Thu, 2018-03-08 at 11:25 -0700, Jed Brown wrote:
>> Garth Wells writes:
>>> On Wed, 2018-03-07 at 16:13 -0700, Jed Brown wrote:
>>>> No reason, just didn't need it. I don't think any PETSc developers use
>>>> VecSetValues* because VecScatter is a more natural interface with lower
>>>> overhead.
>>>
>>> Any demos of its recommended use?
>>
>> snes/examples/tutorials/ex28.c is my recommended demo for both
>> MatGetLocalSubMatrix() and composite vectors. You can follow the
>> example (using DMComposite) or call VecScatter directly.
>
> Thanks, Jed. Could you expand on VecSetValues* vs VecScatter? I'm a bit
> perplexed because I can't find anything in the docs discussing use of
> VecSetValues* vs VecScatter.

Most PETSc examples create a local vector and use VecScatterBegin/End (or DMLocalToGlobalBegin/End, which wraps it) to communicate from the local vector to the global vector. This is typical when using finite element methods with a non-overlapping element partition. (Some low-order FEM practitioners do a bit of redundant computation to assemble using a vertex partition.) The advantage of VecScatter is that it knows in advance exactly what values are moving where. In contrast, VecSetValues + VecAssemblyBegin/End needs to determine where all the entries are going (sort of like a reduction operation; this can be more efficient when MPI-3 is available) and how many to expect. If you need to use VecAssemblyBegin/End and it is a bottleneck, there is an optimization to cache the communication pattern, provided you promise that later assemblies only communicate a subset of the values sent in the first assembly. But when VecScatter does the job, it tends to be more convenient and a bit faster.
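The trade-off Jed describes is independent of PETSc, and a toy sketch may make it concrete. The NumPy code below is an illustration only, not the PETSc API: `scatter_forward` models the VecScatter style (the index map is built once, so each communication is a plain gather of known size), while `assemble` models the VecSetValues style (the destination of every entry is discovered anew on each call). All names here are invented for the sketch.

```python
import numpy as np

nglobal = 8
# VecScatter style: the "ghost" index map is computed once, up front.
ghost_idx = np.array([2, 3, 6, 7])

def scatter_forward(xglobal, idx):
    # With a precomputed pattern, each "communication" is a single gather;
    # sizes and destinations are known before any data moves.
    return xglobal[idx]

def assemble(n, pairs):
    # VecSetValues style: the receiver learns where entries go, and how
    # many to expect, entry by entry on every assembly.
    y = np.zeros(n)
    for i, v in pairs:
        y[i] += v          # ADD_VALUES-like accumulation
    return y

x = np.arange(nglobal, dtype=float)
print(scatter_forward(x, ghost_idx))                  # fixed-pattern gather
print(assemble(nglobal, [(2, 1.0), (2, 0.5), (6, 2.0)]))
```

In PETSc terms, building `ghost_idx` once corresponds to creating the VecScatter (or the map behind DMLocalToGlobalBegin/End); the per-call bookkeeping inside `assemble` is roughly what VecAssemblyBegin/End must redo each time unless the communication pattern is cached.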
Re: [petsc-users] No VecGetLocalSubVector?
No reason, just didn't need it. I don't think any PETSc developers use VecSetValues* because VecScatter is a more natural interface with lower overhead.

Garth Wells writes:

> Is there a reason why there is no 'VecGetLocalSubVector' to match
> MatGetLocalSubMatrix?
>
> I've used VecGetSubVector, but can't use local indices with it, e.g.
> can't use VecSetValuesLocal.
>
> Garth
[petsc-users] Tweaking my code for CUDA
Hello all,

I am working on porting a linear solver to GPUs for timing purposes. So far I've been able to compile and run the CUSP libraries, and to compile PETSc for use with CUSP and ViennaCL. After the initial runs I noticed some errors; they are different for different flags, and I would appreciate any help interpreting them.

The only elements in this program that use PETSc are the Laplacian matrix (sparse), the RHS and X vectors, and a scatter PETSc object, so I would say it's safe to pass the command line arguments for the Mat/VecSetType()s instead of changing the source code.

If I use *-vec_type cuda -mat_type aijcusparse* or *-vec_type viennacl -mat_type aijviennacl* I get the following:

[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: - Stack Frames
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function is given.
[0]PETSC ERROR: [0] VecSetValues line 847 /home/valera/petsc/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: [0] VecSetType line 36 /home/valera/petsc/src/vec/vec/interface/vecreg.c
[0]PETSC ERROR: [0] VecSetTypeFromOptions_Private line 1230 /home/valera/petsc/src/vec/vec/interface/vector.c
[0]PETSC ERROR: [0] VecSetFromOptions line 1271 /home/valera/petsc/src/vec/vec/interface/vector.c
[0]PETSC ERROR: - Error Message --
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a  GIT Date: 2018-02-28 10:19:08 -0600
[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 09:50:51 2018
[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/cusp --with-vienacl=1 --download-fblaslapack=1 --download-hypre
[0]PETSC ERROR: #5 User provided function() line 0 in unknown file
--

This seems to be a memory-out-of-range error. Maybe my vector is too big for my CUDA system? How do I assess that?

Next, if I use *-vec_type cusp -mat_type aijcusparse* I get something different and more interesting:

[0]PETSC ERROR: - Error Message --
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Vec is locked read only, argument # 3
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a  GIT Date: 2018-02-28 10:19:08 -0600
[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 10:02:19 2018
[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/cusp --with-vienacl=1 --download-fblaslapack=1 --download-hypre
[0]PETSC ERROR: #48 KSPSolve() line 615 in /home/valera/petsc/src/ksp/ksp/interface/itfunc.c
 PETSC_SOLVER_ONLY   6.8672990892082453E-005 s
[0]PETSC ERROR: - Error Message --
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: Object (seq) is not seqcusp or mpicusp
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.8.3-1817-g96b6f8a  GIT Date: 2018-02-28 10:19:08 -0600
[0]PETSC ERROR: ./gcmSeamount on a cuda named node50 by valera Thu Mar 8 10:02:19 2018
[0]PETSC ERROR: Configure options PETSC_ARCH=cuda --with-cc=mpicc --with-cxx=mpic++ --with-fc=mpifort --COPTFLAGS=-O3 --CXXOPTFLAGS=-O3 --FOPTFLAGS=-O3 --with-shared-libraries=1 --with-debugging=1 --with-cuda=1 --with-cuda-arch=sm_60 --with-cusp=1 --with-cusp-dir=/home/valera/cusp --with-vienacl=1 --download-fblaslapack=1 --download-hypre
[0]PETSC ERROR: #49 VecCUSPGetArrayReadWrite() line 1718 in /home/valera/petsc/src/vec/vec/impls/seq/seqcusp/veccusp2.cu
[0]PETSC ERROR: #50
Re: [petsc-users] No VecGetLocalSubVector?
On Thu, 2018-03-08 at 11:25 -0700, Jed Brown wrote:
> Garth Wells writes:
>> On Wed, 2018-03-07 at 16:13 -0700, Jed Brown wrote:
>>> No reason, just didn't need it. I don't think any PETSc developers use
>>> VecSetValues* because VecScatter is a more natural interface with lower
>>> overhead.
>>
>> Any demos of its recommended use?
>
> snes/examples/tutorials/ex28.c is my recommended demo for both
> MatGetLocalSubMatrix() and composite vectors. You can follow the
> example (using DMComposite) or call VecScatter directly.

Thanks, Jed. Could you expand on VecSetValues* vs VecScatter? I'm a bit perplexed because I can't find anything in the docs discussing use of VecSetValues* vs VecScatter.

Garth
Re: [petsc-users] make: *** No rule to make target `chk_makej', needed by `all'. Stop.
Thanks, Jose.

"Jose E. Roman" writes:

> Update the SLEPc repository and try again.
> When a change in petsc-master affects SLEPc, you have to allow one or two
> days for us to re-synchronize.
> Jose
>
>> On Mar 9, 2018, at 3:59, Samuel Lanthaler wrote:
>>
>> Hi,
>>
>> I've just been trying to compile SLEPc on a cluster, and am getting the
>> following error message after I type "make":
>>
>> > ./configure
>> Checking environment... done
>> Checking PETSc installation... done
>> Checking LAPACK library... done
>> Writing various configuration files... done
>> Generating Fortran stubs... done
>>
>> === SLEPc Configuration ===
>>
>> SLEPc directory: /marconi/home/userexternal/slanthal/slepc-git
>>   It is a git repository on branch: master
>> PETSc directory: /marconi/home/userexternal/slanthal/petsc-git
>>   It is a git repository on branch: master
>> Architecture "arch-complex" with double precision complex numbers
>>
>> xxx=xxx
>> Configure stage complete. Now build the SLEPc library with (gnumake build):
>>   make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>> xxx=xxx
>>
>> > make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>> make: *** No rule to make target `chk_makej', needed by `all'. Stop.
>>
>> Does someone maybe know what the cause of this could be? Make is correct in
>> that there is no rule for chk_makej, but I'm assuming it should really be
>> some kind of function? However, since I don't understand what the dependency
>> on chk_makej exactly does, I'm a bit lost...
>>
>> Thanks for your help in advance!
>> Samuel
Re: [petsc-users] make: *** No rule to make target `chk_makej', needed by `all'. Stop.
Ah, now I understood what you meant... sorry. Thank you, Jose!

On 08.03.2018 22:26, Jed Brown wrote:
> Thanks, Jose.
>
> "Jose E. Roman" writes:
>> Update the SLEPc repository and try again.
>> When a change in petsc-master affects SLEPc, you have to allow one or two
>> days for us to re-synchronize.
>> Jose
>>
>>> On Mar 9, 2018, at 3:59, Samuel Lanthaler wrote:
>>>
>>> Hi,
>>>
>>> I've just been trying to compile SLEPc on a cluster, and am getting the
>>> following error message after I type "make":
>>>
>>> > ./configure
>>> Checking environment... done
>>> Checking PETSc installation... done
>>> Checking LAPACK library... done
>>> Writing various configuration files... done
>>> Generating Fortran stubs... done
>>>
>>> === SLEPc Configuration ===
>>>
>>> SLEPc directory: /marconi/home/userexternal/slanthal/slepc-git
>>>   It is a git repository on branch: master
>>> PETSc directory: /marconi/home/userexternal/slanthal/petsc-git
>>>   It is a git repository on branch: master
>>> Architecture "arch-complex" with double precision complex numbers
>>>
>>> xxx=xxx
>>> Configure stage complete. Now build the SLEPc library with (gnumake build):
>>>   make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>>> xxx=xxx
>>>
>>> > make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>>> make: *** No rule to make target `chk_makej', needed by `all'. Stop.
>>>
>>> Does someone maybe know what the cause of this could be? Make is correct in
>>> that there is no rule for chk_makej, but I'm assuming it should really be
>>> some kind of function? However, since I don't understand what the dependency
>>> on chk_makej exactly does, I'm a bit lost...
>>>
>>> Thanks for your help in advance!
>>> Samuel
Re: [petsc-users] make: *** No rule to make target `chk_makej', needed by `all'. Stop.
Update the SLEPc repository and try again.
When a change in petsc-master affects SLEPc, you have to allow one or two days for us to re-synchronize.

Jose

> On Mar 9, 2018, at 3:59, Samuel Lanthaler wrote:
>
> Hi,
>
> I've just been trying to compile SLEPc on a cluster, and am getting the
> following error message after I type "make":
>
> > ./configure
> Checking environment... done
> Checking PETSc installation... done
> Checking LAPACK library... done
> Writing various configuration files... done
> Generating Fortran stubs... done
>
> === SLEPc Configuration ===
>
> SLEPc directory: /marconi/home/userexternal/slanthal/slepc-git
>   It is a git repository on branch: master
> PETSc directory: /marconi/home/userexternal/slanthal/petsc-git
>   It is a git repository on branch: master
> Architecture "arch-complex" with double precision complex numbers
>
> xxx=xxx
> Configure stage complete. Now build the SLEPc library with (gnumake build):
>   make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
> xxx=xxx
>
> > make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
> make: *** No rule to make target `chk_makej', needed by `all'. Stop.
>
> Does someone maybe know what the cause of this could be? Make is correct in
> that there is no rule for chk_makej, but I'm assuming it should really be
> some kind of function? However, since I don't understand what the dependency
> on chk_makej exactly does, I'm a bit lost...
>
> Thanks for your help in advance!
> Samuel
Re: [petsc-users] make: *** No rule to make target `chk_makej', needed by `all'. Stop.
Thanks, Jose. I actually tried that before, but it turns out there probably was some kind of an issue already with the compilation of PETSc, even though it looked like it had worked. Trying again now.

Thanks,
Samuel

On 08.03.2018 22:14, Jose E. Roman wrote:
> Update the SLEPc repository and try again.
> When a change in petsc-master affects SLEPc, you have to allow one or two
> days for us to re-synchronize.
> Jose
>
>> On Mar 9, 2018, at 3:59, Samuel Lanthaler wrote:
>>
>> Hi,
>>
>> I've just been trying to compile SLEPc on a cluster, and am getting the
>> following error message after I type "make":
>>
>> > ./configure
>> Checking environment... done
>> Checking PETSc installation... done
>> Checking LAPACK library... done
>> Writing various configuration files... done
>> Generating Fortran stubs... done
>>
>> === SLEPc Configuration ===
>>
>> SLEPc directory: /marconi/home/userexternal/slanthal/slepc-git
>>   It is a git repository on branch: master
>> PETSc directory: /marconi/home/userexternal/slanthal/petsc-git
>>   It is a git repository on branch: master
>> Architecture "arch-complex" with double precision complex numbers
>>
>> xxx=xxx
>> Configure stage complete. Now build the SLEPc library with (gnumake build):
>>   make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>> xxx=xxx
>>
>> > make SLEPC_DIR=$PWD PETSC_DIR=/marconi/home/userexternal/slanthal/petsc-git PETSC_ARCH=arch-complex
>> make: *** No rule to make target `chk_makej', needed by `all'. Stop.
>>
>> Does someone maybe know what the cause of this could be? Make is correct in
>> that there is no rule for chk_makej, but I'm assuming it should really be
>> some kind of function? However, since I don't understand what the dependency
>> on chk_makej exactly does, I'm a bit lost...
>>
>> Thanks for your help in advance!
>> Samuel
Re: [petsc-users] [petsc-maint] how to check if cell is local owned in DMPlex
On 18-03-07 03:54 PM, Jed Brown wrote:
> Danyang Su writes:
>> Based on my test, this function works fine using the current PETSc-dev
>> version, but I cannot get it compiled correctly using other versions for
>> Fortran code, as mentioned in the previous emails. I asked this question
>> because some of the clusters we use do not have the PETSc-dev version and
>> it takes time to get staff to reinstall another version.
>
> You can install PETSc yourself as a normal user.

Thanks, Jed. This is the way I will follow.

Danyang
Re: [petsc-users] Inertia of Hermitian Matrix
You can verify there is a generalization of the inertia theorem, thanks to Ikramov. Furthermore, recall the inertia theorem is about eigenvalues; for a Hermitian matrix, you can diagonalize it in a form that has real entries, and that preserves the eigenvalues. Of course, being able to apply the theorem and actually applying the theorem/method are very different. Should be something that could be added to MUMPS.

On Feb 22, 2018 8:39 PM, "Anthony Ruth" wrote:

> Hello,
>
> I am trying to diagonalize a Hermitian matrix using the Eigen Problem
> Solver in SLEPc. I run into errors on calls to MatGetInertia() with complex
> Hermitian matrices that I did not see with real matrices. The complex and
> real versions were done with separate PETSC_ARCH. I do not know if the
> problem is with the set up of the matrix or, more generally, a problem
> calculating the inertia for a complex matrix.
>
> The matrix is created by:
>
> ierr = MatSetType(A,MATAIJ);CHKERRQ(ierr);
> ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);
> ierr = MatSetFromOptions(A);CHKERRQ(ierr);
> ierr = MatSetUp(A);CHKERRQ(ierr);
> ierr = MatSetOption(A,MAT_HERMITIAN,PETSC_TRUE);CHKERRQ(ierr);
> ierr = MatGetOwnershipRange(A,_row,_row);CHKERRQ(ierr);
> ierr = MatSetValues(A,m,idxm,n,idxn,data,INSERT_VALUES);CHKERRQ(ierr);
> ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
> ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
>
> For a Hermitian matrix, all the eigenvalues are real, so I believe it is
> possible to calculate an inertia by looking at the signs of the diagonal
> entries. I believe if it was complex but not Hermitian, the complex
> eigenvalues would make calculating inertia difficult. Is there some problem
> with doing this through sparse iterative methods? Is there a certain place
> the matrix needs to be specified as Hermitian besides upon assembly?
>
> Here is the error stack I see when running:
>
> Mat Object: 1 MPI processes
>   type: seqaij
> row 0: (0, 0.) (1, 1. + 1. i) (2, 0.) (3, 0.) (4, 0.)
> row 1: (0, 1. - 1. i) (1, 0.) (2, 1. + 1. i) (3, 0.) (4, 0.)
> row 2: (0, 0.) (1, 1. - 1. i) (2, 0.) (3, 1. + 1. i) (4, 0.)
> row 3: (0, 0.) (1, 0.) (2, 1. - 1. i) (3, 0.) (4, 1. + 1. i)
> row 4: (0, 0.) (1, 0.) (2, 0.) (3, 1. - 1. i) (4, 0.)
> [0]PETSC ERROR: - Error Message --
> [0]PETSC ERROR: No support for this operation for this object type
> [0]PETSC ERROR: Mat type mumps
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.8.2, Nov, 09, 2017
> [0]PETSC ERROR: Configure options --download-metis --download-mumps --download-parmetis --download-scalapack --with-scalar-type=complex
> [0]PETSC ERROR: #1 MatGetInertia() line 8416 in /home/anthony/DFTB+SIPs/petsc-3.8.2/src/mat/interface/matrix.c
> [0]PETSC ERROR: #2 EPSSliceGetInertia() line 333 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/ks-slice.c
> [0]PETSC ERROR: #3 EPSSetUp_KrylovSchur_Slice() line 459 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/ks-slice.c
> [0]PETSC ERROR: #4 EPSSetUp_KrylovSchur() line 146 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/krylovschur.c
> [0]PETSC ERROR: #5 EPSSetUp() line 165 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/interface/epssetup.c
> [0]PETSC ERROR: #6 EPSSliceGetEPS() line 298 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/ks-slice.c
> [0]PETSC ERROR: #7 EPSSetUp_KrylovSchur_Slice() line 408 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/ks-slice.c
> [0]PETSC ERROR: #8 EPSSetUp_KrylovSchur() line 146 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/impls/krylov/krylovschur/krylovschur.c
> [0]PETSC ERROR: #9 EPSSetUp() line 165 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/interface/epssetup.c
> [0]PETSC ERROR: #10 SIPSolve() line 195 in /home/anthony/DFTB+SIPs/dftb-eig15/sips.c
> [0]PETSC ERROR: - Error Message --
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: Wrong type of object: Parameter # 1
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.8.2, Nov, 09, 2017
> [0]PETSC ERROR: Configure options --download-metis --download-mumps --download-parmetis --download-scalapack --with-scalar-type=complex
> [0]PETSC ERROR: #11 EPSGetConverged() line 257 in /home/anthony/DFTB+SIPs/slepc-3.8.1/src/eps/interface/epssolve.c
> [0]PETSC ERROR: #12 squareFromEPS() line 131 in /home/anthony/DFTB+SIPs/dftb-eig15/sips_square.c
>
> regards,
> Anthony Ruth
> Condensed Matter Theory
> University of Notre Dame
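As a side note on the math under discussion: for a Hermitian matrix the inertia is the triple (number of negative, zero, positive eigenvalues), and Sylvester's law of inertia says it is preserved under any congruence S^H A S with nonsingular S (which is what an LDL^T-style factorization exploits). A small NumPy sketch, not the SLEPc/MUMPS code path; the 5x5 matrix mirrors the Hermitian tridiagonal one printed in the error output above, and `inertia` is an invented helper name:

```python
import numpy as np

# Hermitian tridiagonal matrix like the one printed above: zero diagonal,
# 1+1j on the superdiagonal, 1-1j on the subdiagonal.
n = 5
A = np.zeros((n, n), dtype=complex)
for i in range(n - 1):
    A[i, i + 1] = 1 + 1j
    A[i + 1, i] = 1 - 1j

def inertia(M):
    """Count (negative, zero, positive) eigenvalues of a Hermitian matrix."""
    w = np.linalg.eigvalsh(M)            # eigenvalues are real for Hermitian M
    tol = 1e-8 * max(1.0, np.abs(w).max())
    return (int((w < -tol).sum()),
            int((np.abs(w) <= tol).sum()),
            int((w > tol).sum()))

# For this matrix: two negative, one zero, two positive eigenvalues.
print(inertia(A))

# Sylvester's law of inertia: congruence with any nonsingular S preserves it,
# even though the eigenvalues themselves change.
rng = np.random.default_rng(0)
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = S.conj().T @ A @ S
print(inertia(B))
```

This is also why inertia, not the raw matrix diagonal, is what slicing methods need: the signs of the diagonal of D in a factorization P A P^T = L D L^H give the eigenvalue sign counts, whereas the diagonal entries of A itself (all zero here) do not.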
Re: [petsc-users] PETSC_VIEWER_DRAW_(Comm) Fortran interface missing?
Great! I've merged balay/ftn-VIEWER_DRAW/maint to 'maint'

Satish

On Thu, 8 Mar 2018, Tim Steinhoff wrote:

> Yeah, thanks Satish.
> PETSC_VIEWER_DRAW_() works perfectly and I could not reproduce my
> above mentioned issue with PETSC_VIEWER_STDOUT_(), so probably I
> messed something up.
> Thanks again for your great support,
> Volker
>
> 2018-03-06 18:14 GMT+01:00 Satish Balay:
> > Ok - I've added ftn interface for PETSC_VIEWER_DRAW_() [similar to
> > PETSC_VIEWER_STDOUT_()] in branch balay/ftn-VIEWER_DRAW/maint.
> >
> > You can give it a try and see if it works.
> >
> > Satish
> >
> > On Tue, 6 Mar 2018, Satish Balay wrote:
> >
> >> Hm - I do see the fortran interface code for PETSC_VIEWER_STDOUT_()
> >>
> >> https://bitbucket.org/petsc/petsc/src/049452fa31a1384b9850ba16ebcec79c99e4198c/src/sys/classes/viewer/impls/ascii/ftn-custom/zvcreatef.c?at=master=file-view-default
> >>
> >> and it does appear to work for me.
> >>
> >> balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
> >> $ git diff
> >> diff --git a/src/mat/examples/tests/ex16f90.F90 b/src/mat/examples/tests/ex16f90.F90
> >> index 6bb83b4644..fee44aea95 100644
> >> --- a/src/mat/examples/tests/ex16f90.F90
> >> +++ b/src/mat/examples/tests/ex16f90.F90
> >> @@ -15,6 +15,7 @@
> >>        PetscInt one
> >>        PetscScalar v(1)
> >>        PetscScalar, pointer :: array(:,:)
> >> +      PetscViewer vw
> >>
> >>        call PetscInitialize(PETSC_NULL_CHARACTER,ierr)
> >> @@ -49,7 +50,8 @@
> >> !
> >> !     Print the matrix to the screen
> >> !
> >> -      call MatView(A,PETSC_VIEWER_STDOUT_WORLD,ierr);CHKERRA(ierr)
> >> +      vw = PETSC_VIEWER_STDOUT_(PETSC_COMM_WORLD)
> >> +      call MatView(A,vw,ierr);CHKERRA(ierr)
> >> !
> >>
> >> balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
> >> $ mpiexec -n 2 ./ex16f90
> >> Mat Object: 2 MPI processes
> >>   type: mpidense
> >> 9.e+00 4.5000e+00
> >> 4.5000e+00 3.e+00
> >> 3.e+00 2.2500e+00
> >> balay@asterix /home/balay/petsc/src/mat/examples/tests (master *=)
> >> $
> >>
> >> Satish
> >>
> >> On Tue, 6 Mar 2018, Tim Steinhoff wrote:
> >>
> >> > Hi Barry,
> >> >
> >> > thanks for your fast response. Yes, I'm fine with that.
> >> > Indeed, my code with the routine PETSC_VIEWER_STDOUT_(comm) was
> >> > compilable, but turned out it did not work properly at runtime
> >> > (hangs).
> >> > When using PETSC_VIEWER_DRAW_(comm), the compiler immediately gives an
> >> > error and I was confused.
> >> >
> >> > Thanks,
> >> >
> >> > Volker
> >> >
> >> > By the way, the output parameter description of
> >> > PetscViewerASCIIGetStdout is missing in the documentation:
> >> > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Viewer/PetscViewerASCIIGetStdout.html
> >> >
> >> > 2018-03-06 16:26 GMT+01:00 Smith, Barry F. :
> >> > >
> >> > >    Tim,
> >> > >
> >> > >    The PETSC_VIEWER_STDOUT_(comm) construct doesn't work in
> >> > > Fortran. Only the
> >> > > PETSC_VIEWER_STDOUT_WORLD or PETSC_VIEWER_STDOUT_SELF work. This is
> >> > > also true for
> >> > > draw see src/ksp/ksp/examples/tutorials/ex100f.F90
> >> > >
> >> > >    Note that you can use PetscViewerDrawOpen() to get a draw to open
> >> > > on any communicator you like.
> >> > >
> >> > >    Does this answer your question or did I misunderstand?
> >> > >
> >> > >    Barry
> >> > >
> >> > >> On Mar 6, 2018, at 9:02 AM, Tim Steinhoff wrote:
> >> > >>
> >> > >> Hi all,
> >> > >>
> >> > >> it seems like the routine PETSC_VIEWER_DRAW_(comm) is not available
> >> > >> in Fortran, while PETSC_VIEWER_STDOUT_(comm) is.
> >> > >> Is there anything I can do about it? (I am using the current maint
> >> > >> branch of PETSc)
> >> > >>
> >> > >> Thank you very much and kind regards,
> >> > >>
> >> > >> Volker