Re: [petsc-users] configuration option for PETSc on cluster

2018-04-25 Thread Satish Balay
Samuel, thanks for the update. Glad the workaround gave a usable build. BTW - looking at configure.log - I think the option LDFLAGS="-dynamic" is interfering with some of these tests - in this case the default -fPIC test - hence the need for the workaround. I think the problem won't appear

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
On 2018-04-25 09:47 AM, Matthew Knepley wrote: On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su wrote: Hi Matthew, In the worst case, every node/cell may have a different label. Do not use Label for this. It's not an appropriate thing. If

Re: [petsc-users] Streams scaling.log

2018-04-25 Thread Manuel Valera
I get it, thanks, that's a strong argument I will tell my advisor about. Have a great day, On Wed, Apr 25, 2018 at 12:30 PM, Smith, Barry F. wrote: > > > > On Apr 25, 2018, at 2:12 PM, Manuel Valera wrote: > > > > Hi and thanks for the quick answer, > >

Re: [petsc-users] Streams scaling.log

2018-04-25 Thread Smith, Barry F.
> On Apr 25, 2018, at 2:12 PM, Manuel Valera wrote: > > Hi and thanks for the quick answer, > > Yes it looks like I am using MPICH for my configure instead of using the system > installation of OpenMPI, in the past I had better experience using MPICH but > maybe this will be

Re: [petsc-users] Streams scaling.log

2018-04-25 Thread Manuel Valera
Hi and thanks for the quick answer, Yes it looks like I am using MPICH for my configure instead of using the system installation of OpenMPI, in the past I had better experience using MPICH but maybe this will be a conflict, should I reconfigure using the system MPI installation? I solved the problem

Re: [petsc-users] Streams scaling.log

2018-04-25 Thread Karl Rupp
Hi Manuel, this looks like the wrong MPI gets used. You should see an increasing number of processes, e.g. Number of MPI processes 1 Processor names node37 Triad: 6052.3571 Rate (MB/s) Number of MPI processes 2 Processor names node37 node37 Triad: 9138.9376 Rate (MB/s)

Re: [petsc-users] Streams scaling.log

2018-04-25 Thread Smith, Barry F.
It should not have "Number of MPI processes 1"; it should have an increasing number of MPI processes. Do you have multiple MPIs installed on your machine and have the "wrong" mpiexec in your path for the MPI you built PETSc with? What command line options did you use for ./configure?
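A minimal diagnostic sketch, not from the thread, assuming a standard MPI installation: compile it with the mpicc that PETSc was configured with and launch it with the matching mpiexec; the reported communicator size should then grow with the number of ranks instead of staying at 1 as in the scaling.log report in this thread.

  /* mpicheck.c - illustrative only: prints rank, size and host for each process */
  #include <mpi.h>
  #include <stdio.h>

  int main(int argc, char **argv)
  {
    int  rank, size, namelen;
    char name[MPI_MAX_PROCESSOR_NAME];

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Get_processor_name(name, &namelen);
    printf("rank %d of %d on %s\n", rank, size, name);
    MPI_Finalize();
    return 0;
  }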

[petsc-users] Streams scaling.log

2018-04-25 Thread Manuel Valera
Hi, I'm running scaling tests on my system to check why my scaling is so poor, and after following the MPIVersion guidelines my scaling.log output looks like this: Number of MPI processes 1 Processor names node37 Triad: 12856.9252 Rate (MB/s) Number of MPI processes 1 Processor names

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Matthew Knepley
On Wed, Apr 25, 2018 at 12:40 PM, Danyang Su wrote: > Hi Matthew, > > In the worst case, every node/cell may have a different label. > > Do not use Label for this. It's not an appropriate thing. If every cell is different, just use the cell number. Labels are for mapping a

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Danyang Su
Hi Matthew, In the worst case, every node/cell may have a different label. Below is one of the worst scenarios, with 102299 nodes and 102299 different labels, for testing. I found the time cost increases during the loop. The first 9300 loops take the least time (<0.5) while the last 9300 loops take much

Re: [petsc-users] configuration option for PETSc on cluster

2018-04-25 Thread Samuel Lanthaler
Dear Satish, Your workaround worked! Great! Thanks a lot for all your help, everyone. =) Cheers, Samuel On 04/24/2018 06:33 PM, Satish Balay wrote: Hm - I'm not sure why configure didn't try to set -fPIC compiler options. Workaround is: CFLAGS=-fPIC FFLAGS=-fPIC CXXFLAGS=-fPIC or

Re: [petsc-users] SLEPC matrix Query

2018-04-25 Thread Matthew Knepley
On Wed, Apr 25, 2018 at 8:10 AM, Savneet Kaur wrote: > Hello Again, > > Yes, I would like to find the initial 10% and 20% of the eigenpairs. > But in addition to this, I also want to check the full spectrum. > > So, how shall I proceed with it? > First just set up the
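A minimal sketch of a basic SLEPc eigensolver setup, not from the thread; it assumes a standard Hermitian problem with an already assembled Mat A, and the helper name solve_partial_spectrum plus the choice of EPS_SMALLEST_REAL are illustrative. EPSSetDimensions() requests nev eigenpairs (e.g. 10-20% of the matrix size); asking an iterative solver for the full spectrum of a large sparse matrix this way is usually impractical.

  /* Illustrative sketch (not the poster's code): compute nev eigenpairs of A. */
  #include <slepceps.h>

  PetscErrorCode solve_partial_spectrum(Mat A, PetscInt nev)
  {
    EPS            eps;
    PetscInt       nconv, i;
    PetscScalar    kr, ki;
    PetscErrorCode ierr;

    PetscFunctionBeginUser;
    ierr = EPSCreate(PETSC_COMM_WORLD, &eps);CHKERRQ(ierr);
    ierr = EPSSetOperators(eps, A, NULL);CHKERRQ(ierr);           /* standard problem A x = k x */
    ierr = EPSSetProblemType(eps, EPS_HEP);CHKERRQ(ierr);         /* assumes A is Hermitian */
    ierr = EPSSetWhichEigenpairs(eps, EPS_SMALLEST_REAL);CHKERRQ(ierr);
    ierr = EPSSetDimensions(eps, nev, PETSC_DEFAULT, PETSC_DEFAULT);CHKERRQ(ierr);
    ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);                  /* allow -eps_* runtime options */
    ierr = EPSSolve(eps);CHKERRQ(ierr);
    ierr = EPSGetConverged(eps, &nconv);CHKERRQ(ierr);
    for (i = 0; i < nconv; i++) {
      ierr = EPSGetEigenvalue(eps, i, &kr, &ki);CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_WORLD, "eigenvalue %D: %g\n", i, (double)PetscRealPart(kr));CHKERRQ(ierr);
    }
    ierr = EPSDestroy(&eps);CHKERRQ(ierr);
    PetscFunctionReturn(0);
  }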

Re: [petsc-users] SLEPC matrix Query

2018-04-25 Thread Savneet Kaur
Hello Again, Yes, I would like to find the initial 10% and 20% of the eigenpairs. But in addition to this, I also want to check the full spectrum. So, how shall I proceed with it? Thank you. Regards, Savneet On 25/04/2018 at 11:33, Matthew Knepley wrote: On Wed, Apr 25, 2018 at

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Matthew Knepley
On Tue, Apr 24, 2018 at 11:57 PM, Danyang Su wrote: > Hi All, > > I use DMPlex in an unstructured grid code and recently found DMSetLabelValue > takes a lot of time for large problems, e.g., num. of cells > 1 million. In > my code, I use > I read your code wrong. For large

Re: [petsc-users] SLEPC matrix Query

2018-04-25 Thread Matthew Knepley
On Wed, Apr 25, 2018 at 5:10 AM, Savneet Kaur wrote: > Hello, > > Warm Regards > > I am Savneet Kaur, a master's student at University Paris Saclay and > currently pursuing an internship at CEA Saclay (France). > > I have recently started to understand the SLEPc and PETSc

Re: [petsc-users] DMSetLabelValue takes a lot of time for large domain

2018-04-25 Thread Matthew Knepley
On Tue, Apr 24, 2018 at 11:57 PM, Danyang Su wrote: > Hi All, > > I use DMPlex in an unstructured grid code and recently found DMSetLabelValue > takes a lot of time for large problems, e.g., num. of cells > 1 million. In > my code, I use > > DMPlexCreateFromCellList() > > Loop
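A hypothetical sketch of the labelling pattern described above, not the original code; the label name "Cell Sets" and the one-value-per-cell loop are placeholders, and the DM is assumed to have been built beforehand, e.g. with DMPlexCreateFromCellList() as in the post. It shows why the replies in this thread steer away from labels here: one DMSetLabelValue() call per cell is fine when the values come from a small set (a few material ids), but costly when nearly every cell gets its own value, in which case the cell number itself is the better handle.

  /* Illustrative only: give every cell of a DMPlex its own label value. */
  #include <petscdmplex.h>

  PetscErrorCode label_every_cell(DM dm)
  {
    PetscErrorCode ierr;
    PetscInt       cStart, cEnd, c;

    PetscFunctionBeginUser;
    ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* height 0 = cells */
    ierr = DMCreateLabel(dm, "Cell Sets");CHKERRQ(ierr);
    for (c = cStart; c < cEnd; c++) {
      /* With ~10^5 distinct values this loop becomes the bottleneck,
         as the timings reported in this thread show. */
      ierr = DMSetLabelValue(dm, "Cell Sets", c, c - cStart);CHKERRQ(ierr);
    }
    PetscFunctionReturn(0);
  }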

[petsc-users] SLEPC matrix Query

2018-04-25 Thread Savneet Kaur
Hello, Warm Regards I am Savneet Kaur, a master's student at University Paris Saclay and currently pursuing an internship at CEA Saclay (France). I have recently started to understand the SLEPc and PETSc solvers, by taking up the tutorials for eigenvalue problems. In my internship work I