I have tried with slepc-master and it works:

$ mpiexec -n 2 ./ex1 -eps_ciss_partitions 2
 matrix size 774
 (-78.7875,8.8022)
 (-73.9569,-42.2401)
 (-66.9942,-7.50907)
 (-62.262,-2.71603)
 (-58.9716,0.601111)
 (-57.9883,0.298729)
 (-57.8323,1.06041)
 (-56.5317,1.10758)
 (-56.0234,45.2405)
 (-54.4058,2.88373)
 (-25.946,26.0317)
 (-23.5383,-16.9096)
 (-19.0999,0.194467)
 (-18.795,1.15113)
 (-15.3051,0.915914)
 (-14.803,-0.00475538)
 (-8.52467,10.6032)
 (-4.36051,2.29996)
 (-0.525758,0.796658)
 (1.41227,0.112858)
 (1.53801,0.446984)
 (9.43357,0.505277)
slepc-master will become version 3.12 in a few days. I have not tried with 3.11, but I think it should work. It is always recommended to use the latest version; version 3.8 is two years old.

Jose


> On 19 Sep 2019, at 20:33, Povolotskyi, Mykhailo <mpovo...@purdue.edu> wrote:
>
> Hong,
>
> do you have in mind a reason why the newer version should work, or is it a general recommendation?
>
> Which stable version would you recommend upgrading to?
>
> Thank you,
>
> Michael.
>
>
> On 09/19/2019 02:22 PM, Zhang, Hong wrote:
>> Michael,
>>
>> --------------------------------------------------------------
>> [0]PETSC ERROR: No support for this operation for this object type
>> [0]PETSC ERROR: Mat type seqdense
>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>> [0]PETSC ERROR: Petsc Release Version 3.8.4, Mar, 24, 2018
>>
>> This is an old version of PETSc. Can you update to the latest PETSc release?
>>
>> Hong
>>
>>
>> On 09/19/2019 04:55 AM, Jose E. Roman wrote:
>> > Michael,
>> >
>> > In my previous email I should have checked it better. The CISS solver does indeed work with dense matrices:
>> >
>> > $ mpiexec -n 2 ./ex2 -n 30 -eps_type ciss -terse -rg_type ellipse -rg_ellipse_center 1.175 -rg_ellipse_radius 0.075 -eps_ciss_partitions 2 -mat_type dense
>> >
>> > 2-D Laplacian Eigenproblem, N=900 (30x30 grid)
>> >
>> > Solution method: ciss
>> >
>> > Number of requested eigenvalues: 1
>> > Found 15 eigenvalues, all of them computed up to the required tolerance:
>> >     1.10416, 1.10416, 1.10455, 1.10455, 1.12947, 1.12947, 1.13426, 1.13426,
>> >     1.16015, 1.16015, 1.19338, 1.19338, 1.21093, 1.21093, 1.24413
>> >
>> > There might be something different in the way matrices are initialized in your code. Send me a simple example that reproduces the problem and I will track it down.
>> >
>> > Sorry for the confusion.
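[Editor's note: for readers following the thread, the command-line options used in the runs above (-eps_type ciss, -eps_ciss_partitions 2, -rg_type ellipse, -rg_ellipse_center, -rg_ellipse_radius) can also be set programmatically through the SLEPc API. A minimal sketch, assuming a matrix A has already been assembled; the function name solve_with_ciss and the abbreviated error handling are illustrative, not part of the examples discussed in the thread:]

```c
#include <slepceps.h>

/* Sketch: configure an EPS to use the CISS contour-integral solver with
   2 partitions and an elliptic region, mirroring the command-line options
   in the runs above. Assumes A is an assembled Mat. */
PetscErrorCode solve_with_ciss(Mat A)
{
  EPS            eps;
  RG             rg;
  PetscErrorCode ierr;

  ierr = EPSCreate(PETSC_COMM_WORLD,&eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps,A,NULL);CHKERRQ(ierr);
  ierr = EPSSetType(eps,EPSCISS);CHKERRQ(ierr);
  /* -eps_ciss_partitions 2: split quadrature points over 2 subcommunicators */
  ierr = EPSCISSSetSizes(eps,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT,
                         2,PETSC_DEFAULT,PETSC_FALSE);CHKERRQ(ierr);
  /* -rg_type ellipse -rg_ellipse_center 1.175 -rg_ellipse_radius 0.075 */
  ierr = EPSGetRG(eps,&rg);CHKERRQ(ierr);
  ierr = RGSetType(rg,RGELLIPSE);CHKERRQ(ierr);
  ierr = RGEllipseSetParameters(rg,1.175,0.075,1.0);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  return 0;
}
```

[Calling EPSSetFromOptions last keeps the command-line options shown in the thread available as overrides.]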
>> > Jose
>> >
>> >
>> >
>> >> On 19 Sep 2019, at 6:20, hong--- via petsc-users <petsc-users@mcs.anl.gov> wrote:
>> >>
>> >> Michael,
>> >> MatCreateRedundantMatrix does support dense matrices. For example, see petsc/src/mat/examples/tests/ex9.c:
>> >> mpiexec -n 4 ./ex9 -mat_type dense -view_mat -nsubcomms 2
>> >>
>> >> Hong
>> >>
>> >> On Wed, Sep 18, 2019 at 5:40 PM Povolotskyi, Mykhailo via petsc-users <petsc-users@mcs.anl.gov> wrote:
>> >> Dear PETSc developers,
>> >>
>> >> I found that MatCreateRedundantMatrix does not support dense matrices.
>> >>
>> >> This causes the following problem: I cannot use the CISS eigensolver from SLEPc with dense matrices with parallelization over quadrature points.
>> >>
>> >> Is it possible for you to add this support?
>> >>
>> >> Thank you,
>> >>
>> >> Michael.
>> >>
>> >>
>> >> P.S. I apologize if you received this e-mail twice; I sent it first from a different address.
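[Editor's note: the ex9 test Hong mentions exercises MatCreateRedundantMatrix, which replicates a parallel matrix onto subcommunicators; this is the operation that failed with "Mat type seqdense" in PETSc 3.8. A minimal sketch of the call, assuming A is an assembled parallel Mat; the function name make_redundant is illustrative:]

```c
#include <petscmat.h>

/* Sketch: replicate a parallel matrix A onto 2 subcommunicators, as
   exercised by src/mat/examples/tests/ex9.c with -nsubcomms 2.
   Assumes A is an assembled Mat on PETSC_COMM_WORLD. */
PetscErrorCode make_redundant(Mat A)
{
  Mat            Ared;
  PetscErrorCode ierr;

  /* Passing MPI_COMM_NULL lets PETSc create the subcommunicators itself */
  ierr = MatCreateRedundantMatrix(A,2,MPI_COMM_NULL,MAT_INITIAL_MATRIX,&Ared);CHKERRQ(ierr);
  /* ... use Ared on its subcommunicator, e.g. as a CISS quadrature copy ... */
  ierr = MatDestroy(&Ared);CHKERRQ(ierr);
  return 0;
}
```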