On 1/17/22 9:50 PM, David Montiel Taboada wrote:
Do you mean the line in the script candi.sh?
The PETSc people will want to know the exact command line with which candi
called the PETSc ./configure.py script. They will want to be able to reproduce
the error without having to download candi.
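One way to recover that exact command line without rerunning candi: PETSc's configure records the options it was invoked with near the top of its configure.log, in a "Configure Options:" line. A minimal sketch below, using a mock log file since the real path (typically $PETSC_DIR/configure.log, somewhere under candi's build tree) and the real options will differ:

```shell
# Sketch: extract the invocation line from PETSc's configure.log.
# The log created here is a MOCK standing in for the real file;
# the actual options in your build will differ.
tmpdir=$(mktemp -d)
cat > "$tmpdir/configure.log" <<'EOF'
Starting configure run
Configure Options: --configModules=PETSc.Configure --with-debugging=0 --download-hypre
Working directory: /home/user/petsc
EOF

# The line the PETSc developers will ask for:
grep -m1 '^Configure Options:' "$tmpdir/configure.log"
```

Attaching the whole configure.log to a PETSc bug report is usually even more useful than the command line alone.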
Thank you, Wolfgang. I will send them a message.
Regarding the following:
*Can you extract what the command line was with which the PETSc
configuration script was called?*
Do you mean the line in the script candi.sh?
David
On Mon, Jan 17, 2022 at 10:41 PM Wolfgang Bangerth wrote:
David,
I think this is a problem for the PETSc folks to give you feedback with. Can
you extract what the command line was with which the PETSc configuration
script was called?
Best
W.
On 1/17/22 10:36 AM, David Montiel Taboada wrote:
Hello,
I am trying to install deal.II (v. 9.2.0) using candi on Fedora v. 35 (a virtual machine). I have edited the candi.cfg file to only install the following packages: hdf5, p4est, petsc and dealii.
I installed all the packages suggested for the OS, loaded the openmpi module, and set the
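For context on the candi.cfg edit discussed in this thread: candi selects what to build via a PACKAGES list in candi.cfg. A sketch of what such an edit could look like (the "once:"/"load:" prefixes follow recent candi versions, but treat the exact list and syntax as an assumption and check your own candi.cfg):

```shell
# Excerpt-style sketch of a trimmed-down candi.cfg package list
# (assumed syntax; the stock file differs between candi versions):
PACKAGES="load:dealii-prepare"
PACKAGES="${PACKAGES} once:hdf5"
PACKAGES="${PACKAGES} once:p4est"
PACKAGES="${PACKAGES} once:petsc"
PACKAGES="${PACKAGES} dealii"
```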
On 1/17/22 8:06 AM, krishan...@gmail.com wrote:
Thanks for the response! The grids I am planning to use are not simple and may have protrusions which might be too complex to achieve with the grid generator functions of deal.II. So, I think I would have to follow the first method you mentioned. Could you please refer me to the tutorial programs that
Mariia,
Dear colleagues,
I was trying to make adjustments to my script to enable parallel computing (as it takes unbearably long to run even for small problems, due to the large sparse matrix, I guess).
In my script I am using block matrices and vectors with complex elements (i.e. BlockSparseMatrix<std::complex<double>>).