Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
Presumably the error occurs after you type 'exit' from the terminal. I'm not sure what to suggest; having tools that break is not ideal. You could edit the configure script and bypass this check. But what is your requirement wrt PETSc and MATLAB? Perhaps other PETSc developers can guide you

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Satish, if I just run /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay, then MATLAB runs normally in the terminal: Scott-Grad-MBP:~ zhihui$ /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay < M A T L A B (R) >

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
If I run it directly, I get the following. But if I double-click /Applications/MATLAB_R2018a.app/bin/matlab in the folder, MATLAB starts normally. Last login: Thu Oct 25 19:38:28 on ttys002 Scott-Grad-MBP:~ zhihui$ /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
>> Testing Matlab at /Applications/MATLAB_R2018a.app Executing: /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r "display(['Version ' version]); exit" stdout: < M A T L A B (R) > Copyright 1984-2018 The MathWorks, Inc. R2018a

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
1. You need to send us the complete log. 2. Also use the current release - petsc-3.10 [not 3.8]. Satish On Fri, 26 Oct 2018, avatar wrote: > Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure > --with-matlab-dir=/Applications/MATLAB_R2018a.app/ >

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure --with-matlab-dir=/Applications/MATLAB_R2018a.app/ === Configuring PETSc to compile on your system

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote: > Scott-Grad-MBP:bin zhihui$ pwd > /Applications/MATLAB_R2018a.app/bin Sorry - you need --with-matlab-dir=/Applications/MATLAB_R2018a.app/ Satish

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
When I do the following, it shows: Scott-Grad-MBP:~ zhihui$ cd Applications/ Scott-Grad-MBP:Applications zhihui$ ls Chrome Apps.localized Scott-Grad-MBP:Applications zhihui$ From this, we can see there is no MATLAB_R2018a.app in Applications. But when I get into bin, and do the following, it

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
still shows these Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure --with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app === Configuring PETSc to compile on your system

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
You need --with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app Satish On Fri, 26 Oct 2018, avatar wrote: > Thanks, Satish. It worked as you said. But when I rebuild PETSc using > ./configure --with-matlab > It prompts >

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Thanks, Satish. It worked as you said. But when I rebuild PETSc using ./configure --with-matlab it prompts === Configuring PETSc to compile on your system

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote: > Is there another way to overcome this problem? Because if I don't set TMPDIR to > tmp, my other project will break. And actually I don't even know where my > project sets the TMPDIR value. I guess you need to read up on some basic unix and system admin.

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Is there another way to overcome this problem? Because if I don't set TMPDIR to tmp, my other project will break. And actually I don't even know where my project sets the TMPDIR value.

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
To clarify, I meant: remove the code that is changing the value of TMPDIR to 'tmp'. Satish On Thu, 25 Oct 2018, Satish Balay wrote: > I see - you have: > > >> > TMPDIR=tmp > << > > Did you set this in your ~/.bashrc or somewhere? This is wrong and is > breaking tools. > OSX should setup

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
I see - you have: >> TMPDIR=tmp << Did you set this in your ~/.bashrc or somewhere? This is wrong and is breaking tools. OSX should set up something like the following for you. petsc-mini:~ balay$ echo $TMPDIR /var/folders/lw/hyrlb1051p9fj96qkktfvhvmgn/T/ Remove it - and retry building
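The check Satish describes can be scripted; this is a minimal sketch, assuming a POSIX shell. The `mktemp -d` value is an illustrative stand-in for the per-user path macOS would normally provide, not the user's actual fix (which is to remove whatever sets TMPDIR=tmp).

```shell
# Show the current TMPDIR; on macOS it is normally an absolute, per-user
# path under /var/folders, similar to Satish's example.
echo "TMPDIR is: ${TMPDIR:-<unset>}"

# A relative setting such as TMPDIR=tmp breaks tools (ar among them) that
# create scratch files there. For the current shell, point it at a real,
# absolute directory so those tools work again.
export TMPDIR=$(mktemp -d)
echo "TMPDIR is now: $TMPDIR"
```

The permanent fix is still to delete the offending `TMPDIR=tmp` assignment from the shell startup files, so that the system default takes over.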

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
These are what I got: Scott-Grad-MacBook-Pro:benchmarks zhihui$ echo $TMPDIR tmp Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l libconf1.a -rw-r--r-- 1 zhihui staff 1624 Oct 25 17:36 libconf1.a Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar t libconf1.a __.SYMDEF SORTED sizeof.o

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
Looks like it worked. What do you have for: echo $TMPDIR ls -l libconf1.a /usr/bin/ar t libconf1.a TMPDIR=$PWD /usr/bin/ar t libconf1.a Satish On Fri, 26 Oct 2018, avatar wrote: > As follows. What should I do next? > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
As follows. What should I do next? Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o Scott-Grad-MacBook-Pro:benchmarks zhihui$

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
How about: TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o Satish On Fri, 26 Oct 2018, avatar wrote: > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls > Index.c PetscGetCPUTime.c PetscMemcmp.c >

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls Index.c PetscGetCPUTime.c PetscMemcmp.c PetscTime.c benchmarkExample.py sizeof.c Index.c.html PetscGetCPUTime.c.html

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
What about: mpicc -c sizeof.c /usr/bin/ar cr libconf1.a sizeof.o Satish On Fri, 26 Oct 2018, avatar wrote: > I could not do all things you posted below. I get these: > > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/ >

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
I could not do all things you posted below. I get these: Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/ Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o -rw-r--r-- 1 zhihui

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote: > Hi Satish, > > > Thank you very much for your quick response. > > > The log file is as follow: > >> Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a /tmp/petsc-mjVUVK/config.setCompilers/conf1.o Possible ERROR while running

Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
It's best to stick with one list - so I will follow up on petsc-maint. Satish On Fri, 26 Oct 2018, avatar wrote: > Hi, > > When I try to configure PETSc to connect to Matlab with the following command > ./petsc-3.8.3/configure --with-matlab > , it prompts >

[petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Hi, when I try to configure PETSc to connect to Matlab with the following command ./petsc-3.8.3/configure --with-matlab , it prompts === Configuring PETSc to compile on your system
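For reference, the configure invocations discussed in this thread take the following shape; the MATLAB installation root shown is the example path from the thread and must match the actual install location on a given machine.

```shell
# Point configure at the MATLAB installation root (note: two leading dashes):
./configure --with-matlab-dir=/Applications/MATLAB_R2018a.app

# Or let configure locate MATLAB on its own:
./configure --with-matlab

# If no suitable archiver is detected (the error in this thread's subject),
# an archiver can be named explicitly:
./configure --with-ar=/usr/bin/ar
```

Note that in this thread the archiver failure turned out to be an environment problem (a bad TMPDIR), not a missing `ar`, so `--with-ar` would only have masked the real issue.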

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Maximilian Hartig
>> Ahh, thanks. I was missing the option "-dm_plex_gmsh_periodic". But using >> this option I now generate a segmentation fault error when calling VecView() >> on the solution vector with vtk and hdf5 viewers. Any suggestions? >> > Small example? VTK is deprecated. HDF5 should work,

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stephen Wornom
Did you forget to attach "The nice picture attached" Stephen > From: "Stefano Zampini" > To: "petsc-maint" > Cc: "imilian hartig" , "PETSc users list" > > Sent: Thursday, October 25, 2018 4:47:43 PM > Subject: Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake > Opened the PR [ >

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stefano Zampini
Opened the PR https://bitbucket.org/petsc/petsc/pull-requests/1203/fix-dump-vtk-field-with-periodic-meshes/diff Il giorno gio 25 ott 2018 alle ore 17:44 Matthew Knepley ha scritto: > Good catch Stefano. > > Matt > > On Thu, Oct 25, 2018 at 9:36 AM Stefano Zampini > wrote: > >> Maybe this is

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Matthew Knepley
Good catch Stefano. Matt On Thu, Oct 25, 2018 at 9:36 AM Stefano Zampini wrote: > Maybe this is a fix > > diff --git a/src/dm/impls/plex/plexvtu.c b/src/dm/impls/plex/plexvtu.c > index acdea12c2f..1a8bbada6a 100644 > --- a/src/dm/impls/plex/plexvtu.c > +++ b/src/dm/impls/plex/plexvtu.c > @@

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Stefano Zampini
> I actually use hybridization and I was reading the preprint "Algebraic > Hybridization and Static Condensation with Application to Scalable H(div) > Preconditioning" by Dobrev et al. ( https://arxiv.org/abs/1801.08914 ) > and they show that multigrid is optimal for the grad-div problem >

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Abdullah Ali Sivas
Right now, one to four. I am just running some tests with small matrices. Later on, I am planning to do large scale tests hopefully up to 1024 processes. I was worried that iteration numbers may get worse. I actually use hybridization and I was reading the preprint "Algebraic Hybridization and

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Stefano Zampini
How many processes (subdomains) are you using? I would not say the number of iterations is bad, and it also seems to plateau. The grad-div problem is quite hard to be solved (unless you use hybridization), you can really benefit from the "Neumann" assembly. I believe you are using GMRES, as the

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stefano Zampini
Matt, you can reproduce it via $ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh -dm_plex_gmsh_periodic Long time ago I added support for viewing meshes

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-25 Thread Ale Foggia
No, the eigenvalue is around -15. I've tried KS and the number of iterations differs by one when I change the number of MPI processes, which seems fine to me. So, I'll see if this method is fine for my specific goal or not, and I'll try to use it. Thanks for the help. On Wed., Oct 24, 2018 at

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-25 Thread Ale Foggia
On Tue., Oct 23, 2018 at 13:53, Matthew Knepley () wrote: > On Tue, Oct 23, 2018 at 6:24 AM Ale Foggia wrote: > >> Hello, >> >> I'm currently using Lanczos solver (EPSLANCZOS) to get the smallest real >> eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). Those are >> the only