Re: [petsc-users] Creating multi-field section using DMPlex

2020-05-22 Thread Matthew Knepley
On Fri, May 22, 2020 at 12:57 PM Shashwat Tiwari wrote: > Hi, > I am working on a Finite Volume scheme for a hyperbolic system of > equations with 6 variables on unstructured grids. I am trying to > create a section with 6 fields for this purpose, and have written a small > test code

Re: [petsc-users] Question about DMLocalToLocal for DM_BOUNDARY_GHOSTED conditions

2020-05-22 Thread Matthew Knepley
On Fri, May 22, 2020 at 4:34 PM Lucas Banting wrote: > Hello, > > I am converting a serial code to parallel in Fortran with PETSc. I am > using the DMDA to manage communication of the information that used to be > in old two-dimensional Fortran arrays. > > I noticed when using

Re: [petsc-users] Gather and Broadcast Parallel Vectors in k-means algorithm

2020-05-22 Thread Mills, Richard Tran via petsc-users
Hi Eda, If you are using the MATLAB k-means function, calling it like idx = kmeans(X,k) will give you the index set, but if you do [idx,C] = kmeans(X,k) then you will also get a matrix C which contains the cluster centroids. Is this not what you need? --Richard On 5/22/20 10:38 AM, Eda

[petsc-users] Question about DMLocalToLocal when using DM_BOUNDARY_GHOSTED

2020-05-22 Thread Lucas Banting
Hello, I am converting a serial code to parallel in Fortran with PETSc. I am using the DMDA to manage communication of the information that used to be in old two-dimensional Fortran arrays. I noticed that when using DMLocalToLocalBegin/End, not all the ghost values in the array at the

[petsc-users] Question about DMLocalToLocal for DM_BOUNDARY_GHOSTED conditions

2020-05-22 Thread Lucas Banting
Hello, I am converting a serial code to parallel in Fortran with PETSc. I am using the DMDA to manage communication of the information that used to be in old two-dimensional Fortran arrays. I noticed that when using DMLocalToLocalBegin/End, not all the ghost values in the array at the
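
A minimal C sketch of the setup being described (grid sizes and names are illustrative, and the original code is Fortran): DMLocalToLocalBegin/End refreshes the ghost points shared between ranks, while the DM_BOUNDARY_GHOSTED slots on the physical boundary are meant to be filled by the application, which would explain ghost values that never get updated by the communication.

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM             da;
      Vec            local;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
      /* 2D grid with ghost slots on the physical boundary (DM_BOUNDARY_GHOSTED)
         and a one-cell stencil overlap between ranks */
      ierr = DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_GHOSTED, DM_BOUNDARY_GHOSTED,
                          DMDA_STENCIL_STAR, 64, 64, PETSC_DECIDE, PETSC_DECIDE,
                          1, 1, NULL, NULL, &da);CHKERRQ(ierr);
      ierr = DMSetUp(da);CHKERRQ(ierr);
      ierr = DMCreateLocalVector(da, &local);CHKERRQ(ierr);
      /* refresh, in place, the ghost points shared between ranks;
         the boundary ghost slots are the application's to fill */
      ierr = DMLocalToLocalBegin(da, local, INSERT_VALUES, local);CHKERRQ(ierr);
      ierr = DMLocalToLocalEnd(da, local, INSERT_VALUES, local);CHKERRQ(ierr);
      ierr = VecDestroy(&local);CHKERRQ(ierr);
      ierr = DMDestroy(&da);CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

Passing the same Vec as input and output is the usual in-place ghost update for local-to-local communication.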

Re: [petsc-users] Possible bug PETSc+Complex+CUDA

2020-05-22 Thread Junchao Zhang
$ module list Currently Loaded Modules: 1) cuda/10.2 2) gcc/8.3.0-fjpc5ys 3) cmake/3.17.0-n3kslpc 4) openmpi-4.0.2-gcc-8.3.0-e2zcbqz $ nvcc -V Cuda compilation tools, release 10.2, V10.2.89 --Junchao Zhang On Fri, May 22, 2020 at 12:57 PM Mills, Richard Tran via petsc-users <

Re: [petsc-users] Possible bug PETSc+Complex+CUDA

2020-05-22 Thread Mills, Richard Tran via petsc-users
Yes, Junchao said he gets the segfault, but it works for Karl. Sounds like this may be a case of some compilers accepting the definitions for complex that Thrust uses and others not, as Stefano says. Karl and Junchao, can you please share the versions of the compilers (and maybe associated settings)

Re: [petsc-users] Gather and Broadcast Parallel Vectors in k-means algorithm

2020-05-22 Thread Eda Oktay
I am sorry, I used VecDuplicateVecs, not MatDuplicateVecs. Eda Oktay, on Fri, 22 May 2020 at 20:31, wrote: > Dear Richard, > > Thank you for your email. From MATLAB's kmeans() function I believe I got > the final clustering index set, not the centroids. What I am trying to do is to > cluster

Re: [petsc-users] Gather and Broadcast Parallel Vectors in k-means algorithm

2020-05-22 Thread Eda Oktay
Dear Richard, Thank you for your email. From MATLAB's kmeans() function I believe I got the final clustering index set, not the centroids. What I am trying to do is to cluster vectors created by MatDuplicateVecs() according to the index set (whose type is not IS, since I took it from MATLAB) that I
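
For the "gather and broadcast" part of the subject line, a minimal C sketch of one common PETSc idiom, VecScatterCreateToAll, which copies a distributed Vec onto every rank (GatherToAll is a hypothetical helper name, not code from this thread):

    #include <petscvec.h>

    /* Gather a distributed Vec onto every rank so each process sees all
       entries (e.g. cluster centroids); *xall is a new sequential Vec. */
    PetscErrorCode GatherToAll(Vec x, Vec *xall)
    {
      VecScatter     ctx;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = VecScatterCreateToAll(x, &ctx, xall);CHKERRQ(ierr);
      ierr = VecScatterBegin(ctx, x, *xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
      ierr = VecScatterEnd(ctx, x, *xall, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
      ierr = VecScatterDestroy(&ctx);CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }

VecScatterCreateToZero is the analogous call when only rank 0 needs the full vector.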

[petsc-users] Creating multi-field section using DMPlex

2020-05-22 Thread Shashwat Tiwari
Hi, I am working on a Finite Volume scheme for a hyperbolic system of equations with 6 variables on unstructured grids. I am trying to create a section with 6 fields for this purpose, and have written a small test code for creating the section by editing the example given at
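
A minimal C sketch of building such a section (CreateFVSection is a hypothetical helper, and the one-dof-per-field, cell-centered layout is an assumption, not Shashwat's actual test code):

    #include <petscdmplex.h>
    #include <petscsection.h>

    /* A section with 6 fields, one dof per field on every cell,
       as in a cell-centered FV discretization. */
    PetscErrorCode CreateFVSection(DM dm, PetscSection *section)
    {
      PetscSection   s;
      PetscInt       cStart, cEnd, c, f;
      const PetscInt Nf = 6;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = DMPlexGetHeightStratum(dm, 0, &cStart, &cEnd);CHKERRQ(ierr); /* cells */
      ierr = PetscSectionCreate(PetscObjectComm((PetscObject)dm), &s);CHKERRQ(ierr);
      ierr = PetscSectionSetNumFields(s, Nf);CHKERRQ(ierr);
      ierr = PetscSectionSetChart(s, cStart, cEnd);CHKERRQ(ierr);
      for (c = cStart; c < cEnd; ++c) {
        ierr = PetscSectionSetDof(s, c, Nf);CHKERRQ(ierr);  /* total dofs on the cell */
        for (f = 0; f < Nf; ++f) {
          ierr = PetscSectionSetFieldDof(s, c, f, 1);CHKERRQ(ierr);
        }
      }
      ierr = PetscSectionSetUp(s);CHKERRQ(ierr);
      *section = s;
      PetscFunctionReturn(0);
    }

Attaching the section with DMSetLocalSection(dm, s) (in recent PETSc versions) then makes it the layout the DMPlex uses when creating vectors.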

Re: [petsc-users] multiple definition of `main' with intel compilers

2020-05-22 Thread Alfredo Jaramillo
Hello Satish and PETSc developers, I'm sending this email in case somebody else is having the same problem with the Intel compilers. On my personal computer (Fedora 32, gcc 10), when I tried to execute the resulting binary, this error message appeared: