[petsc-users] Interpreting Redistribution SF

2023-01-19 Thread Nicholas Arnold-Medabalimi
Hi Petsc Users I'm working with a distribution star forest generated by DMPlexDistribute, and PetscSFBcast and Reduce to move data between the initial distribution and the distribution generated by DMPlexDistribute. I'm trying to debug some values that aren't being copied properly and wanted to

Re: [petsc-users] locally deploy PETSc

2023-01-19 Thread Jed Brown
You're probably looking for ./configure --prefix=/opt/petsc. It's documented in ./configure --help. Tim Meehan writes: > Hi - I am trying to set up a local workstation for a few other developers who > need PETSc installed from the latest release. I figured that it would be > easiest for me

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Barry Smith
Remove --download-cmake=/home/danyangs/soft/petsc/petsc-3.18.3/packages/cmake-3.25.1.tar.gz and install CMake yourself. Then configure PETSc with --with-cmake=directory you installed it in. Barry > On Jan 19, 2023, at 1:46 PM, Danyang Su wrote: > > Hi All, > > I am trying to

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Satish Balay via petsc-users
BTW: cmake is required by superlu-dist, not petsc. And it's possible that petsc might not build with this old version of openmpi - [and/or the external packages that you are installing - might not build with this old version of intel compilers]. Satish On Thu, 19 Jan 2023, Barry Smith wrote: >

[petsc-users] locally deploy PETSc

2023-01-19 Thread Tim Meehan
Hi - I am trying to set up a local workstation for a few other developers who need PETSc installed from the latest release. I figured that it would be easiest for me to just clone the repository, as mentioned in the Quick Start. So, in /home/me/opt, I issued: git clone -b release

Re: [petsc-users] locally deploy PETSc

2023-01-19 Thread Tim Meehan
Thanks Jed! I ran: make clean ./configure --prefix=/opt/petsc make all check sudo make install It then worked like you said, so thanks! -Original Message- From: Jed Brown Sent: Thursday, January 19, 2023 12:56 PM To: Tim Meehan ; petsc-users@mcs.anl.gov Subject: Re: [petsc-users]

Re: [petsc-users] multi GPU partitions have very different memory usage

2023-01-19 Thread Mark Adams
On Wed, Jan 18, 2023 at 6:03 PM Mark Lohry wrote: > Thanks Mark, I'll try the kokkos bit. Any other suggestions for minimizing > memory besides the obvious use less levels? > > Unfortunately Jacobi does poorly compared to ILU on these systems. > > I'm seeing grid complexity 1.48 and operator

Re: [petsc-users] Interpreting Redistribution SF

2023-01-19 Thread Matthew Knepley
On Thu, Jan 19, 2023 at 11:58 AM Nicholas Arnold-Medabalimi < narno...@umich.edu> wrote: > Hi Petsc Users > > I'm working with a distribution star forest generated by > DMPlexDistribute, and PetscSFBcast and Reduce to move data between the > initial distribution and the distribution generated by

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
Hi Satish, That's a bit strange since I have already used export PETSC_DIR=/home/danyangs/soft/petsc/petsc-3.18.3. Yes, I have petsc 3.13.6 installed and have PETSC_DIR set in the bashrc file. After changing PETSC_DIR in the bashrc file, PETSc can be compiled now. Thanks, Danyang On

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Satish Balay via petsc-users
Looks like .bashrc is getting sourced again during the build process [as make creates a new bash shell during the build] - thus overriding the env variable that's set. Glad you have a working build now. Thanks for the update! BTW: superlu-dist requires cmake 3.18.1 or higher. You could check if

Re: [petsc-users] Interpreting Redistribution SF

2023-01-19 Thread Nicholas Arnold-Medabalimi
Hi Matt Yep, that makes sense and is consistent. My question is a little more specific. So let's say I take an initial mesh and distribute it and get the distribution SF with an overlap of one. Consider a cell that is a root on process 0 and a leaf on process 1 after the distribution. Will the
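The root/leaf question above can be checked directly: a minimal sketch (assuming PETSc >= 3.18; the mesh options and sizes are illustrative, not from the thread) of inspecting the migration SF returned by DMPlexDistribute, whose roots index points in the original distribution and whose leaves index points in the new one, so overlap cells appear as extra leaves:

```c
/* Sketch, assuming PETSc >= 3.18. Inspect the migration SF from
 * DMPlexDistribute: roots index points in the ORIGINAL distribution,
 * leaves index points in the NEW distribution, so with overlap > 0 one
 * root can feed leaves on several ranks. Mesh options are illustrative. */
#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM                 dm, dmDist = NULL;
  PetscSF            migrationSF = NULL;
  PetscInt           nroots, nleaves;
  const PetscSFNode *iremote;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCall(DMCreate(PETSC_COMM_WORLD, &dm));
  PetscCall(DMSetType(dm, DMPLEX));
  PetscCall(DMSetFromOptions(dm)); /* e.g. run with -dm_plex_box_faces 8,8 */
  PetscCall(DMPlexDistribute(dm, 1, &migrationSF, &dmDist)); /* overlap = 1 */
  if (dmDist) { /* NULL on a single rank: nothing was migrated */
    PetscCall(PetscSFGetGraph(migrationSF, &nroots, &nleaves, NULL, &iremote));
    PetscCall(PetscSynchronizedPrintf(PETSC_COMM_WORLD,
              "[%d] roots (old dist): %d  leaves (new dist): %d\n",
              PetscGlobalRank, (int)nroots, (int)nleaves));
    PetscCall(PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT));
    PetscCall(PetscSFDestroy(&migrationSF));
    PetscCall(DMDestroy(&dmDist));
  }
  PetscCall(DMDestroy(&dm));
  PetscCall(PetscFinalize());
  return 0;
}
```

Run with two or more MPI ranks; the per-rank leaf count exceeding the interior cell count is the overlap showing up.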

Re: [petsc-users] Interpreting Redistribution SF

2023-01-19 Thread Matthew Knepley
On Thu, Jan 19, 2023 at 9:13 PM Nicholas Arnold-Medabalimi < narno...@umich.edu> wrote: > Hi Matt > > Yep, that makes sense and is consistent. > > My question is a little more specific. So let's say I take an initial mesh > and distribute it and get the distribution SF with an overlap of one. >

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Satish Balay via petsc-users
> /home/danyangs/soft/petsc/petsc-3.13.6/src/sys/makefile contains a directory > not on the filesystem: ['\\'] It's strange that it's complaining about petsc-3.13.6. Do you have this location set in your .bashrc or similar file - that's getting sourced during the build? Perhaps you could start

Re: [petsc-users] Interpreting Redistribution SF

2023-01-19 Thread Nicholas Arnold-Medabalimi
Ok thanks for the clarification. In theory, if before the Reduction back to the original distribution I call DMGlobalToLocal, then even with MPI_REPLACE all the leaves corresponding to the original root should have the same value, so I won't have an ambiguity, correct? On Thu, Jan 19, 2023 at
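The round trip being discussed can be sketched as follows. This is a hedged fragment, not a full program, and the names migrationSF, oldArray, newArray, dmDist, gvec, and lvec are assumptions for illustration: Bcast pushes root values from the original distribution to the leaves of the new one, and Reduce with MPI_REPLACE pushes them back. Because overlap > 0 lets several leaves map to one root, the Reduce result is deterministic only if those leaves already agree, which is what the DMGlobalToLocal call is meant to ensure:

```c
/* Fragment (migrationSF, oldArray, newArray, dmDist, gvec, lvec are
 * assumed to exist): move one PetscScalar per point between
 * distributions using the migration SF. */

/* original -> new distribution: roots feed leaves */
PetscCall(PetscSFBcastBegin(migrationSF, MPIU_SCALAR, oldArray, newArray, MPI_REPLACE));
PetscCall(PetscSFBcastEnd(migrationSF, MPIU_SCALAR, oldArray, newArray, MPI_REPLACE));

/* ... compute on the new distribution ... */

/* synchronize ghost/overlap points so all leaves of a root agree */
PetscCall(DMGlobalToLocalBegin(dmDist, gvec, INSERT_VALUES, lvec));
PetscCall(DMGlobalToLocalEnd(dmDist, gvec, INSERT_VALUES, lvec));

/* new -> original distribution: each leaf overwrites its root */
PetscCall(PetscSFReduceBegin(migrationSF, MPIU_SCALAR, newArray, oldArray, MPI_REPLACE));
PetscCall(PetscSFReduceEnd(migrationSF, MPIU_SCALAR, newArray, oldArray, MPI_REPLACE));
```

If the leaves can legitimately disagree, a combining op such as MPIU_SUM or MPIU_MAX in the Reduce avoids the rank-order dependence that MPI_REPLACE would have.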

Re: [petsc-users] Cmake problem on an old cluster

2023-01-19 Thread Danyang Su
Hi Satish, For some unknown reason, during CMake 3.18.5 installation I get the error "Cannot find a C++ compiler that supports both C++11 and the specified C++ flags.". The system-installed CMake 3.2.3 is way too old. I will just leave it as is since superlu_dist is optional in my model. Thanks