Ok - this is not clang from Apple, so I guess it needs that extra
'xcode-select --install'.
I don't think gcc from brew needed this. [And I don't remember if I checked
clang from brew]
Satish
On Mon, 16 Oct 2017, Kong, Fande wrote:
On Mon, Oct 16, 2017 at 12:07 PM, Satish Balay wrote:
> BTW: Which clang are you using?
>
> mpicc -show

mpicc -show
clang -Wl,-commons,use_dylibs
-I/opt/moose/mpich/mpich-3.2/clang-opt/include
-L/opt/moose/mpich/mpich-3.2/clang-opt/lib -lmpi -lpmpi

> mpicc --version
BTW: Which clang are you using?
mpicc -show
mpicc --version
Satish
On Mon, 16 Oct 2017, Satish Balay wrote:
> That's weird.
>
> From what I can recall - some tools (like pgi compilers) need this -
> but the xcode compilers do not.
>
> Basically xcode clang can pick up includes from the xcode
Now it is working. It turns out I needed to do something like "xcode-select
--install" after upgrading the OS, and of course we need to agree to the license.
Fande,
On Mon, Oct 16, 2017 at 10:58 AM, Richard Tran Mills wrote:
> Fande,
>
> Did you remember to agree to the XCode license
Fande,
Did you remember to agree to the XCode license after your upgrade, if you
did an XCode upgrade? You have to do the license agreement again, otherwise
the compilers don't work at all. Apologies if this seems like a silly thing
to ask, but this has caused me a few minutes of confusion
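For readers hitting the same problem, the fix discussed in this thread boils
down to two commands (a macOS command fragment; exact behavior may vary with
your Xcode version):

```shell
# Reinstall the Xcode command-line tools (needed again after a macOS upgrade)
xcode-select --install

# Accept the Xcode license agreement (needed again after an Xcode upgrade)
sudo xcodebuild -license accept
```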
"Kong, Fande" writes:
> Hi All,
>
> I just upgraded macOS, and also updated all other related packages. Now
> I cannot configure PETSc-master any more.
Your compiler paths are broken.
On Mon, Oct 16, 2017 at 12:07 PM, Kong, Fande wrote:
> Hi All,
>
> I just upgraded macOS, and also updated all other related packages. Now
> I cannot configure PETSc-master any more.
>
> See the attachment for more details.
>
Something is really wrong with your compilers
I am interested to learn more about how this works. How are the vectors
created if the IDs are not contiguous in a partition?
Thanks
praveen
On Mon, Oct 16, 2017 at 2:02 PM, Stefano Zampini wrote:
>
>
> 2017-10-16 10:26 GMT+03:00 Michael Werner
Matt, thanks for your input.
This is very helpful.
Ling
On Fri, Oct 13, 2017 at 8:13 AM, Matthew Knepley wrote:
> On Fri, Sep 29, 2017 at 11:06 AM, Zou, Ling wrote:
>
>> Hi all,
>>
>> I know this is a bit off topic on PETSc email list.
>> I would like to
> On 16-Oct-2017, at 12:56 PM, Michael Werner wrote:
>
> However, since it's an unstructured grid, those subgrids are not necessarily
> made up of points with successive global IDs.
It should be easy to renumber the points so that each partition has
contiguously
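The renumbering suggested above can be sketched in a few lines. This is a
pure-Python illustration (the helper name is hypothetical, not PETSc API):
given a map from old global ID to owning partition, it assigns new global IDs
so that each partition owns one contiguous block, which is what PETSc's vector
ownership ranges assume.

```python
# Sketch (hypothetical helper, not PETSc API): renumber unstructured-grid
# points so each partition owns a contiguous range of new global IDs.

def renumber_contiguous(partition_of):
    """partition_of: list mapping old global ID -> owning partition (rank).
    Returns (new_id, offsets): new_id[old] is the new global ID; partition p
    owns the contiguous range [offsets[p], offsets[p + 1])."""
    nparts = max(partition_of) + 1
    counts = [0] * nparts
    for p in partition_of:
        counts[p] += 1
    # Prefix sum: the starting offset of each partition's contiguous block.
    offsets = [0] * (nparts + 1)
    for p in range(nparts):
        offsets[p + 1] = offsets[p] + counts[p]
    # Hand out IDs within each partition's block in old-ID order.
    next_id = offsets[:-1].copy()
    new_id = [0] * len(partition_of)
    for old, p in enumerate(partition_of):
        new_id[old] = next_id[p]
        next_id[p] += 1
    return new_id, offsets

# Example: five points scattered across two partitions in arbitrary order.
new_id, offsets = renumber_contiguous([1, 0, 1, 0, 0])
```

With this permutation in hand, the application's scatter/gather maps are
rewritten once in terms of the new IDs, and each rank's local portion of a
global vector is simply the slice [offsets[rank], offsets[rank + 1]).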
Hello,
I'm having trouble with parallelizing a matrix-free code with PETSc. In
this code, I use an external CFD code to provide the matrix-vector
product for an iterative solver in PETSc. To increase convergence rate,
I'm using an explicitly stored Jacobian matrix to precondition the
solver.
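The setup described above (an operator available only as a matrix-vector
product callback, plus an explicitly stored approximation used as a
preconditioner) can be illustrated schematically. This pure-Python sketch
uses a simple preconditioned Richardson iteration instead of PETSc's
MATSHELL/KSP machinery, and all names and values are illustrative:

```python
# Sketch: a matrix-free operator (matvec callback only) preconditioned by an
# explicitly stored approximate Jacobian. Conceptually mirrors MATSHELL + PC.

def matvec(x):
    # Stands in for the external CFD code's Jacobian-vector product:
    # here A = [[4, 1], [1, 3]] is applied without the solver storing A.
    return [4.0 * x[0] + 1.0 * x[1],
            1.0 * x[0] + 3.0 * x[1]]

# Explicitly stored (approximate) Jacobian diagonal used for preconditioning.
approx_jacobian_diag = [4.0, 3.0]

def solve_richardson(b, tol=1e-10, max_it=200):
    """Preconditioned Richardson iteration: x <- x + P^{-1} (b - A x)."""
    x = [0.0, 0.0]
    for _ in range(max_it):
        r = [b[i] - ax for i, ax in enumerate(matvec(x))]
        if max(abs(ri) for ri in r) < tol:
            break
        x = [x[i] + r[i] / approx_jacobian_diag[i] for i in range(len(x))]
    return x

x = solve_richardson([1.0, 2.0])
```

The point of the sketch is the division of labor: the iteration only ever
calls `matvec`, while the stored approximation is applied in a separate,
cheap preconditioning step; in PETSc the same split appears as the operator
matrix versus the preconditioning matrix passed to the solver.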