Sorry for the bother, but any advice would be greatly appreciated.
Cheers, Francis
forget about what kind of matrix I needed to
define. I hope I've learned my lesson for next time.
Francis
--
Francis Poulin
Associate Professor
Associate Chair, Undergraduate Studies
Department of Applied Mathematics
University of Waterloo
email: fpou...@uwaterloo.ca
Web:https://uwaterloo.ca/poulin-research-group/
Telephone: +1 519 888 4567 x32637
From: Francis Poulin
Sent: Saturday, July 19, 2014 2:09 PM
To: petsc-users
Subject: RE: [petsc-users] Question about PETSc installs and MPI
Hello Satish,
Thanks for the help on both counts.
1) I installed csh and petscmpiexec works, but I think I will stick to mpirun.
2) Your modifications
Hello,
Sorry for not thinking of this myself. Thanks for pointing out that it was
Xcode and also telling me about the command line tools. That saved me from
sending another email.
It seems to be installing very nicely right now.
Cheers, Francis
On 2012-03-13, at 6:00 PM, Sean Farley wrote:
Hello,
I am trying to call PETSc from a C++ program and having difficulties. I'm
using v3.2.6 and I can run all the examples so I assumed that everything was
installed ok. The body of the function is very simple, see below. I get a
segmentation fault. In my installation I used mpich
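The function body mentioned above is not included in this excerpt. As a minimal sketch, and not the original code, of calling PETSc from a C++ main with the 3.2-era API (the vector and its size are placeholders), compiled with the same mpicxx that was used to build PETSc:

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec            x;
  PetscErrorCode ierr;

  /* PetscInitialize must come before any other PETSc call; skipping it,
     or linking against a different MPI than the one PETSc was built with,
     is a common cause of segmentation faults at startup. */
  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL); CHKERRQ(ierr);

  ierr = VecCreate(PETSC_COMM_WORLD, &x); CHKERRQ(ierr);   /* distributed vector */
  ierr = VecSetSizes(x, PETSC_DECIDE, 100); CHKERRQ(ierr); /* 100 entries total  */
  ierr = VecSetFromOptions(x); CHKERRQ(ierr);
  ierr = VecSet(x, 1.0); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);

  ierr = PetscFinalize();
  return 0;
}

If a program along these lines still segfaults, the usual suspect is a mismatch between the MPI used to compile the program and the mpich that PETSc's configure used.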
, 2012, at 8:24 PM, Jed Brown wrote:
On Thu, Feb 23, 2012 at 18:58, Francis Poulin fpoulin at uwaterloo.ca
wrote:
I can do it for each of them if that helps but I suspect the method is the
same so I'm sending the information for the first 3, n = 2, 4, 8. In the
meantime I will figure out
I don't seem to have the hydra in my bin folder but I do have petscmpiexec that
I've been using.
Use these; it will run the same method and sizes as the options I gave for
ex22 before.
mpiexec.hydra -n 2 ./ex45 -da_grid_x 5 -da_grid_y 5 -da_grid_z 5 -da_refine 5
-ksp_monitor -pc_type
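The option list above is cut off after -pc_type in this excerpt. Since the discussion is about multigrid, a plausible completion, offered here only as a guess, is -pc_type mg:
mpiexec.hydra -n 2 ./ex45 -da_grid_x 5 -da_grid_y 5 -da_grid_z 5 -da_refine 5 -ksp_monitor -pc_type mg
ex45 is the DMDA-based 3D Laplacian tutorial, so the -da_* options set the base grid and how many times it is refined.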
Francis
On 2012-02-24, at 12:59 PM, Jed Brown wrote:
On Fri, Feb 24, 2012 at 11:49, Francis Poulin fpoulin at uwaterloo.ca wrote:
I don't seem to have the hydra in my bin folder but I do have petscmpiexec
that I've been using.
That's just the name of my mpiexec. Use whichever one is used
Thanks for the link. I did the tests on an SGI SMP machine that, if I
understand correctly, shares the memory much better than most other systems. I
will read about these different components and see what I can figure out.
Thanks,
Francis
You need to understand the issues of memory
On Fri, Feb 24, 2012 at 14:43, Francis Poulin fpoulin at uwaterloo.ca wrote:
It seems to scale beautifully starting at 2 but there is a big drop from 1 to
2. I suspect there's a very good reason for this, I just don't know what.
Can you send the full output? The smoother is slightly different, so
Hello,
I am learning to use PETSc but am just a novice. I have a rather basic
question to ask and couldn't find it in the archives.
I want to test the scalability of a multigrid solver for the 3D Poisson
equation. I found ksp/ex22.c, which seems to solve the problem that I'm
Always send output with -log_summary for each run that you do.
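For illustration only, with the binary and file names being assumptions that mirror the ex22 runs discussed in this thread, one way to collect -log_summary output for each process count is:
for n in 2 4 8; do
    mpiexec -n $n ./ex22 -log_summary > log_np${n}.txt
done
-log_summary prints its tables to stdout when PetscFinalize is called, so redirecting stdout keeps one log per run.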
are using two levels of multigrid) and since it is solved redundantly that
takes all the time. Run with, say, 5 levels.
Barry
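As a hedged illustration of running with more levels (an editor's sketch, not part of Barry's message): with the DMDA-based ex45 quoted elsewhere in this thread, the depth of the hierarchy can be requested with -da_refine and -pc_mg_levels, for example:
mpiexec -n 8 ./ex45 -da_refine 5 -pc_type mg -pc_mg_levels 5 -ksp_monitor -log_summary
The idea is that with enough levels the coarsest problem stays small, so the redundant coarse-grid solve no longer dominates the run time.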
On Feb 23, 2012, at 3:03 PM, Francis Poulin wrote:
Hello again,
I am using v3.1 of PETSc.
I changed the grid sizes slightly and I'm including 4 log_summary