Hi Matt,
I have attached the page which explains what to modify to use PETSc with
the K computer.
Thanks
On 24/4/2016 10:52 PM, Matthew Knepley wrote:
On Sun, Apr 24, 2016 at 4:14 AM, Wee-beng TAY <zon...@gmail.com> wrote:
Hi,
I was re
On Sun, Nov 08, 2015 at 7:41 PM, Matthew Knepley <knep...@gmail.com> wrote:
On Sat, Nov 7, 2015 at 11:27 PM, TAY wee-beng <zon...@gmail.com> wrote:
Hi,
Beng Tay <zon...@gmail.com> wrote:
On Mon, Aug 24, 2015 at 6:21 PM, Matthew Knepley <knep...@gmail.com> wrote:
On Mon, Aug 24, 2015 at 4:09 AM, Wee-Beng Tay <zon...@gmail.com> wrote:
Hi,
I'm modifying my 3d Fortran code from MPI decomposition along 1 direction (z)
to MPI decomposition along 2 directions (y,z).
Previously I was using MatSetValues with global indices. However, now I'm
using a DM, and working with global indices is much more difficult.
I came across MatSetValuesStencil and MatSetValuesLocal.
So what's
it to work.
Best
Timothee
Hi Timothee,
Thanks for the help. So for boundary points, should I just leave the entries
for non-existent locations blank?
Also, can I use PETSc multigrid to solve this problem? It is a Poisson equation.
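For a DMDA-managed grid, MatSetValuesStencil addresses matrix entries by grid
indices (i,j,k) instead of global row numbers, so the global-index bookkeeping
goes away. A minimal Fortran sketch for one interior row of a 3d 7-point
Laplacian follows; it assumes a 3d DMDA with one degree of freedom, and every
name in it (A, i, j, k) is illustrative, not taken from the original code:

```fortran
! Hedged sketch: one interior row of a 3d 7-point Laplacian inserted
! with MatSetValuesStencil. All variable names are illustrative.
      PetscInt, parameter :: st = 7
      MatStencil     row(4), col(4,st)
      PetscScalar    v(st)
      PetscInt       di(st), dj(st), dk(st), c
      PetscErrorCode ierr

      di = (/ 0, -1, 1,  0, 0,  0, 0 /)
      dj = (/ 0,  0, 0, -1, 1,  0, 0 /)
      dk = (/ 0,  0, 0,  0, 0, -1, 1 /)

      row(MatStencil_i) = i
      row(MatStencil_j) = j
      row(MatStencil_k) = k
      do c = 1, st
        if (c == 1) then
          v(c) = -6.0          ! centre point (unit spacing assumed)
        else
          v(c) = 1.0           ! the six neighbours
        end if
        col(MatStencil_i,c) = i + di(c)
        col(MatStencil_j,c) = j + dj(c)
        col(MatStencil_k,c) = k + dk(c)
      end do
      call MatSetValuesStencil(A, 1, row, st, col, v, INSERT_VALUES, ierr)
```

On the two follow-up questions, hedged: as far as I recall, stencil columns
that fall outside a non-periodic DMDA domain are simply ignored, so boundary
rows can pass fewer columns rather than "blank" entries (check the
MatSetValuesStencil man page for your version). And since this is a Poisson
equation on a DMDA, multigrid can typically be selected at run time, e.g. via
KSPSetDM plus -pc_type mg, without changing the assembly code.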
2015-08-24 18:09 GMT+09:00 Wee-Beng Tay <zon...@gmail.com>:
Hi,
I'm
Hi,
So I can use PetscScalarView() directly to view u_array?
Sent using CloudMagic
[https://cloudmagic.com/k/d/mailapp?ct=pacv=7.2.7pv=5.0.2]
On Wed, Aug 19, 2015 at 10:41 PM, Satish Balay <ba...@mcs.anl.gov> wrote:
check PetscScalarView()
Satish
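PetscScalarView() prints an array of PetscScalar through a viewer, which is
handy for debugging Fortran arrays. A minimal sketch, with the array name and
length purely illustrative:

```fortran
! Hedged sketch: printing a PetscScalar array for debugging.
! 'u_array' and the length 100 are illustrative names/values.
      PetscScalar    u_array(100)
      PetscErrorCode ierr

      u_array = 1.0
      call PetscScalarView(100, u_array, PETSC_VIEWER_STDOUT_WORLD, ierr)
```

I believe the call is collective on the viewer's communicator, so with
PETSC_VIEWER_STDOUT_WORLD every rank must call it; use
PETSC_VIEWER_STDOUT_SELF to dump one rank's local array independently.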
On Tue, 21 Jul
the following (or newer?) - as it has the required fixes.
https://software.intel.com/en-us/articles/intel-mpi-library-50-update-2-readme
Satish
On Mon, 9 Feb 2015, Wee Beng Tay wrote:
Hi,
I'm trying to install petsc on Linux with Intel mpi 5 and compiler. What should
Hi,
I'm trying to install PETSc on Linux with Intel MPI 5 and the Intel compiler. What should the configure
command line be? Does anyone have experience with this?
I tried the usual options, but none of them work.
Thanks
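An untested sketch of a configure line for Intel MPI with the Intel compilers.
The wrapper names (mpiicc, mpiicpc, mpiifort) and the MKL location are
assumptions about a typical installation, not something from this thread;
adjust them to your system:

```shell
# Hedged sketch: PETSc configure with Intel MPI wrappers and MKL.
# mpiicc/mpiicpc/mpiifort and $MKLROOT are assumed to exist on PATH/env.
./configure --with-cc=mpiicc --with-cxx=mpiicpc --with-fc=mpiifort \
            --with-blas-lapack-dir=$MKLROOT \
            --with-debugging=0
```

If configure still fails, the usual advice on this list is to send
configure.log, which records exactly which test failed.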
On 4/10/2012 5:11 PM, Matthew Knepley wrote:
On Thu, Oct 4, 2012 at 11:01 AM, TAY wee-beng zonexo at gmail.com wrote:
On 4/10/2012 3:40 AM, Matthew Knepley wrote:
On Wed, Oct 3, 2012 at 4:05 PM, TAY wee-beng zonexo at gmail.com wrote:
Hi Jed,
I believe they are real cores. Anyway, I
Hi,
During compilation, I got this warning message from Compaq Visual Fortran:
C:\Libs\petsc-3.3-dev_win32_cvf/include\finclude/ftn-custom/petscdmcomplex.h90(175)
: Warning: This name has not been given an explicit type. [BCFIELD]
numDof,numBC,bcField,bcPoints,section,ierr)
Can you check?
Thanks!
Hi,
It seems that some of you mentioned adaptive mesh packages. I know of
PARAMESH, PFLOTRAN and libMesh. Are there any other packages?
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
On 30/10/2010 10:46 PM, Mohammad Mirzadeh wrote:
Thanks guys. I will definitely
hypre.F90 can be
solved as well.
Thank you.
--
Wee-Beng TAY
Postdoctoral Fellow
Aerodynamics Group
Aerospace Engineering
Delft University of Technology
Kluyverweg 2
2629 HT Delft
The Netherlands
Temporary E-mail: zonexo at gmail.com
help from the developers.
Only the 2.6b version works with Windows. However, I have not tried
using both hypre and PETSc on Windows.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
On 7/26/2010 5:12 PM, Matthew Knepley wrote:
This is what we mean when we say
Hi Satish,
It finally worked!
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
On 13/5/2010 1:56 AM, Satish Balay wrote:
try the attached makefile. If you have problems copy/paste the
*complete* make session from your terminal.
Satish
On Thu, 13 May 2010, Wee-Beng
*.h depending on what petsc commands are used?
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
On 12/5/2010 9:55 AM, Satish Balay wrote:
Let's step back and deal with this primary issue.
Have you attempted to use makefiles in petsc format? What problems
have you
the same flag for all sources. In that case,
how can I build them?
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
On 13/5/2010 1:27 AM, Satish Balay wrote:
Attaching the modified makefile. It might need a couple of iterations
of fixing.
Hopefully it's clear why it would
:276:
undefined reference to `MPI_Type_get_extent'
May I know what's wrong?
--
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
!
On Fri, Apr 16, 2010 at 11:38 AM, Matthew Knepley knepley at gmail.com wrote:
If you are using petsc-dev, you only need petsc.h
Matt
On Thu, Apr 15, 2010 at 10:27 PM, Wee-Beng Tay zonexo at gmail.com wrote:
Hi,
I have successfully built the PETSc libraries on my Linux system.
make ex1f
,
Wee-Beng Tay
if I do it in Fortran, is it the same?
Thanks again!
On Sat, Dec 12, 2009 at 12:25 AM, Ryan Yan vyan2000 at gmail.com wrote:
On Fri, Dec 11, 2009 at 1:44 AM, Wee-Beng Tay zonexo at gmail.com wrote:
Hi,
I'm also looking at the fortran version of ex22. I have some questions:
1. M,N,P
momentum
eqn with DA still since there's staggering?
Thanks alot!
On Fri, Dec 11, 2009 at 7:00 PM, Jed Brown jed at 59a2.org wrote:
On Fri, 11 Dec 2009 18:24:58 +0800, Wee-Beng Tay zonexo at gmail.com wrote:
But you mentioned latency, so shouldn't minimizing the number of neighbor
processes
Hi,
I'm also looking at the fortran version of ex22. I have some questions:
1. M,N,P are the global dimensions in each direction of the array. I'm a bit
confused. Do you mean the number of grid points at the coarse level? Hence, if my
problem has 27x27 grid points, do I use M=N=9 with 3 multigrid levels or
and is usually detrimental to solver performance.
But you mentioned latency, so shouldn't minimizing the number of neighbor
processes reduce latency and improve performance?
On Thu, Dec 10, 2009 at 5:09 PM, Jed Brown jed at 59a2.org wrote:
On Thu, 10 Dec 2009 16:44:02 +0800, Wee-Beng Tay
Hi,
I'm working on a 2D Cartesian grid and I'm going to decompose the grid for
MPI for my CFD Fortran code. The grid size is in the ratio of 110 x 70. I
wonder how I should decompose the grid - horizontally or vertically?
For example, with 2 processors, should I use two 55x70 grids or two 110x35 grids?
I
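One way to compare the two splits is to count interface points, since each
process must exchange one ghost row or column per interface. The arithmetic
below is my own illustration, not from the thread:

```fortran
! Hedged sketch: interface sizes for the two 2-process splits of a
! 110x70 grid. Fewer interface points means less halo data per exchange.
program split_cost
  implicit none
  integer, parameter :: nx = 110, ny = 70
  ! splitting along x gives two 55x70 blocks sharing ny points
  print *, 'two 55x70 blocks : interface points =', ny
  ! splitting along y gives two 110x35 blocks sharing nx points
  print *, 'two 110x35 blocks: interface points =', nx
end program split_cost
```

So the 55x70 split moves less data per exchange (70 points versus 110). As the
discussion notes, though, latency can matter as much as message size, so the
number of messages per step is worth considering alongside their size.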
Hi,
Has anyone managed to use PETSc successfully with Silverfrost FTN95 for
Windows? I didn't manage to find any information about this. Silverfrost
FTN95 seems like a good free alternative Fortran compiler on Windows.
Are these possible:
1. Using pre-compiled library of PETSc with Silverfrost
day!
Yours sincerely,
Wee-Beng Tay
Hi Barry,
That's just what I'm thinking!
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Barry Smith wrote:
Speed should be pretty much the same.
I could interpret your question as: if one has a 3d PDE solver, is it
worth writing a totally separate 2d code
the problem.
If this is a MPI problem, then you can just ignore it. I'll check it in
some MPI forum.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
Yes - all include statements in both the source files should start
with finclude/... [so that -Id:\cygwin
Hi Satish and Barry,
It worked! Btw, this is only one of the subroutines in my module file.
Hence, my implicit none is found at the top of the module file. Thanks
for the reminder though, Barry.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
And if v_ast() is used with PETSc - then it should be defined 'PetscScalar'
and then you use MPI_ISEND(,MPIU_SCALAR,...)
Satish
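A minimal sketch of the suggestion above, assuming v_ast is declared
PetscScalar; the buffer length, destination rank, and tag are illustrative:

```fortran
! Hedged sketch: non-blocking send of a PetscScalar buffer using the
! matching PETSc MPI datatype MPIU_SCALAR. All names are illustrative.
      PetscScalar    v_ast(50)
      PetscErrorCode ierr
      integer        dest, tag, req

      dest = 1
      tag  = 0
      call MPI_Isend(v_ast, 50, MPIU_SCALAR, dest, tag,
     &               PETSC_COMM_WORLD, req, ierr)
      ! ... overlap work here, then complete the send:
      call MPI_Wait(req, MPI_STATUS_IGNORE, ierr)
```

The point of MPIU_SCALAR is that it tracks how PETSc was configured
(--with-scalar-type, precision), so the send stays correct if the library is
rebuilt with complex or single-precision scalars, unlike hard-coding
MPI_DOUBLE_PRECISION.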
On Mon, 13 Apr 2009, Barry Smith wrote:
Where is your
Hi Satish,
That's strange. This is because I initially declared u_ast as
real(8), allocatable :: u_ast(:,:)
I also have the -r8 option enabled. Any idea what is going on?
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
(--with-scalar-type=[real
having an implicit none on the top should be enough.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
MPIU_SCALAR is defined to be MPI_DOUBLE_PRECISION
I guess MPI_REAL/MPI_REAL8 behavior is affected by -r8.
Satish
On Tue, 14 Apr 2009, Wee-Beng
sincerely,
Wee-Beng Tay
very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Matthew Knepley wrote:
On Mon, Apr 13, 2009 at 10:50 PM, Wee-Beng TAY zonexo at gmail.com
mailto:zonexo at gmail.com wrote:
Hi,
In the past, I did not use ghost cells. Hence, for e.g., on a grid
8x8, I can divide
,ierr)
---^
Error executing df.exe.
global.obj - 3 error(s), 0 warning(s)
I can't test flux_area.f90 since it's dependent on global.F
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay
/petscmatdef.h
#include finclude/petsckspdef.h
#include finclude/petscpcdef.h
for flux_area.f90 and it's working now. Can you explain what's
happening? Is this the correct way then?
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
2 changes you have
error if I compile under cygwin using
the same parameters.
Hope you can help.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Satish Balay wrote:
Do you get these errors with PETSc f90 examples?
what 'USE statement' do you have in your code?
I guess you'll have
\include
Interestingly, when I change my PETSC_DIR to petsc-dev, which corresponds
to an old build of petsc-2.3.3-p13, there is no problem.
May I know what's wrong? Btw, I've converted my mpif.h from using 'C' as
the comment character to '!'.
Thank you very much and have a nice day!
Yours sincerely,
Wee-Beng Tay
Regards,
Wee-Beng TAY