Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
Presumably the error occurs after you type 'exit' from the terminal.

I'm not sure what to suggest. Having tools that break - is not ideal.

You could edit configure script - and bypass this check.

But what is your requirement wrt petsc and matlab?

Perhaps other petsc developers can guide you towards a suitable path.

Satish
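
For reference, a rough sketch of that route (hypothetical, and the exact check differs between PETSc versions) is simply to leave Matlab out of the build until the standalone crash shown later in this thread is sorted out:

./configure    # drop --with-matlab-dir entirely for now
# re-add --with-matlab-dir=/Applications/MATLAB_R2018a.app once
# "matlab -nojvm -nodisplay -r '...; exit'" runs cleanly on this machine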

On Fri, 26 Oct 2018, avatar wrote:

> Satish,
> 
> 
> If I just run, /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay, 
> then Matlab can run normally in the terminal as
> Scott-Grad-MBP:~ zhihui$ /Applications/MATLAB_R2018a.app/bin/matlab -nojvm 
> -nodisplay
> 
> 
>  < M A T L A B (R) >
>Copyright 1984-2018 The MathWorks, Inc.
> R2018a (9.4.0.813654) 64-bit (maci64)
>   February 23, 2018
> 
> 
> 
> 
> For online documentation, see http://www.mathworks.com/support
> For product information, visit www.mathworks.com.
> 
> 
> >>
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> >>
> Testing Matlab at /Applications/MATLAB_R2018a.app
> Executing: /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
> "display(['Version ' version]); exit"
> stdout:
> < M A T L A B (R) >
>   Copyright 1984-2018 The MathWorks, Inc.
>R2018a (9.4.0.813654) 64-bit (maci64)
>  February 23, 2018
> 
> For online documentation, see http://www.mathworks.com/support
> For product information, visit www.mathworks.com.
> 
> Version 9.4.0.813654 (R2018a)
> WARNING: Found Matlab at /Applications/MATLAB_R2018a.app but unable to run
> <
> 
> 
> So configure attempted to run Matlab - and perhaps got an error. What do you 
> get if you run this manually?
> 
> 
> /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
> "display(['Version ' version]); exit"
> echo $?
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Hi Satish, 
> > 
> > 
> > The attached is the whole configure.log file. I could not past it here 
> > because it is too big and crash my webpage. I could not use the latest 
> > version right now, because the project is maintained by other people. If I 
> > update petsc, I will break the whole project. But we will use the latest 
> > version when the maintenance guys update the dependences.
> > 
> > 
> > Thank you.
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > 1. You need to send us the complete log.
> > 
> > 2. Also use current release - petsc-3.10 [not 3.8]
> > 
> > Satish
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> > > --with-matlab-dir=/Applications/MATLAB_R2018a.app/
> > > ===
> > >  Configuring PETSc to compile on your system
> > > ===
> > > TESTING: configureLibrary from 
> > > config.packages.Matlab(config/BuildSystem/config/***
> > >  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> > > details):
> > > ---
> > > You set a value for --with-matlab-dir, but 
> > > /Applications/MATLAB_R2018a.app cannot be used
> > > ***
> > > 
> > > 
> > > 
> > > Part of the log file as follow
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER
> > > #define PETSC_HAVE_MPI_REDUCE_SCATTER 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_COMBINER_DUP
> > > #define PETSC_HAVE_MPI_COMBINER_DUP 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPIIO
> > > #define PETSC_HAVE_MPIIO 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_COMM_SPAWN
> > > #define PETSC_HAVE_MPI_COMM_SPAWN 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_FINT
> > > #define PETSC_HAVE_MPI_FINT 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_IBARRIER
> > > #define PETSC_HAVE_MPI_IBARRIER 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_MPI_ALLTOALLW
> > > #define PETSC_HAVE_MPI_ALLTOALLW 1
> > > #endif
> > > 
> > > 
> > > #ifndef PETSC_HAVE_OMPI_RELEASE_VERSION
> > > #define PETSC_HAVE_OMPI_RELEASE_VERSION 2
> > > 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
Satish,


If I just run /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay,
then Matlab runs normally in the terminal:
Scott-Grad-MBP:~ zhihui$ /Applications/MATLAB_R2018a.app/bin/matlab -nojvm 
-nodisplay


 < M A T L A B (R) >
   Copyright 1984-2018 The MathWorks, Inc.
R2018a (9.4.0.813654) 64-bit (maci64)
  February 23, 2018




For online documentation, see http://www.mathworks.com/support
For product information, visit www.mathworks.com.


>>









-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"



>>
Testing Matlab at /Applications/MATLAB_R2018a.app
Executing: /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
"display(['Version ' version]); exit"
stdout:
< M A T L A B (R) >
  Copyright 1984-2018 The MathWorks, Inc.
   R2018a (9.4.0.813654) 64-bit (maci64)
 February 23, 2018

For online documentation, see http://www.mathworks.com/support
For product information, visit www.mathworks.com.

Version 9.4.0.813654 (R2018a)
WARNING: Found Matlab at /Applications/MATLAB_R2018a.app but unable to run
<


So configure attempted to run Matlab - and perhaps got an error. What do you 
get if you run this manually?


/Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
"display(['Version ' version]); exit"
echo $?

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Hi Satish, 
> 
> 
> The attached is the whole configure.log file. I could not past it here 
> because it is too big and crash my webpage. I could not use the latest 
> version right now, because the project is maintained by other people. If I 
> update petsc, I will break the whole project. But we will use the latest 
> version when the maintenance guys update the dependences.
> 
> 
> Thank you.
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> 1. You need to send us the complete log.
> 
> 2. Also use current release - petsc-3.10 [not 3.8]
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> > --with-matlab-dir=/Applications/MATLAB_R2018a.app/
> > ===
> >  Configuring PETSc to compile on your system
> > ===
> > TESTING: configureLibrary from 
> > config.packages.Matlab(config/BuildSystem/config/***
> >  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> > details):
> > ---
> > You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
> > cannot be used
> > ***
> > 
> > 
> > 
> > Part of the log file as follow
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER
> > #define PETSC_HAVE_MPI_REDUCE_SCATTER 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_COMBINER_DUP
> > #define PETSC_HAVE_MPI_COMBINER_DUP 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPIIO
> > #define PETSC_HAVE_MPIIO 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_COMM_SPAWN
> > #define PETSC_HAVE_MPI_COMM_SPAWN 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_FINT
> > #define PETSC_HAVE_MPI_FINT 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_IBARRIER
> > #define PETSC_HAVE_MPI_IBARRIER 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_ALLTOALLW
> > #define PETSC_HAVE_MPI_ALLTOALLW 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_OMPI_RELEASE_VERSION
> > #define PETSC_HAVE_OMPI_RELEASE_VERSION 2
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REDUCE_LOCAL
> > #define PETSC_HAVE_MPI_REDUCE_LOCAL 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REPLACE
> > #define PETSC_HAVE_MPI_REPLACE 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_EXSCAN
> > #define PETSC_HAVE_MPI_EXSCAN 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX
> > #define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_FINALIZED
> > #define PETSC_HAVE_MPI_FINALIZED 1
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_INFO
> > #define PETSC_USE_INFO 1
> > #endif
> > 
> > 
> > #ifndef PETSC_Alignx
> > #define PETSC_Alignx(a,b)   
> > #endif
> > 
> > 
> > #ifndef 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
If I run it directly, I get the following. But if I double-click
/Applications/MATLAB_R2018a.app/bin/matlab in the folder, Matlab starts
normally.


Last login: Thu Oct 25 19:38:28 on ttys002
Scott-Grad-MBP:~ zhihui$ /Applications/MATLAB_R2018a.app/bin/matlab -nojvm 
-nodisplay -r "display(['Version ' version]); exit"


   < M A T L A B (R) >
 Copyright 1984-2018 The MathWorks, Inc.
  R2018a (9.4.0.813654) 64-bit (maci64)
February 23, 2018




For online documentation, see http://www.mathworks.com/support
For product information, visit www.mathworks.com.


Version 9.4.0.813654 (R2018a)



   Segmentation violation detected at Thu Oct 25 19:39:13 2018 -0600



Configuration:
  Crash Decoding   : Disabled - No sandbox or build area path
  Crash Mode   : continue (default)
  Default Encoding : ISO-8859-1
  Deployed : false
  Graphics Driver  : Unknown software
  MATLAB Architecture  : maci64
  MATLAB Entitlement ID: 789930
  MATLAB Root  : /Applications/MATLAB_R2018a.app
  MATLAB Version   : 9.4.0.813654 (R2018a)
  OpenGL   : software
  Operating System : Mac OS Version 10.12.6 (Build 16G29)
  Process ID   : 77227
  Processor ID : x86 Family 6 Model 158 Stepping 9, GenuineIntel
  Session Key  : 3308a767-ffe8-4f64-954b-ed266f8dea9c
  Window System: None


Fault Count: 1




Abnormal termination


Register State (from fault):
  RAX =   RBX = 7f922bf7de00
  RCX = 7f922bf7dde0  RDX = 
  RSP = 000108315005  RBP = 7b082a10
  RSI = 7b082c60  RDI = 7b082a30


   R8 =    R9 = 
  R10 = 7b082c60  R11 = 7f922be0ffd8
  R12 = 7b082a50  R13 = 0001082d9a7c
  R14 = 7b082a50  R15 = 7fff9f6aa410


  RIP =   RFL = 7b083bc0


   CS =    FS = 7f922bf7d550   GS = 7b082f70


Stack Trace (from fault):
[  0] 0x0001070f6f54   
bin/maci64/libmwfl.dylib+00053076 
_ZN10foundation4core4diag15stacktrace_base7captureERKNS1_14thread_contextEm+0052
[  1] 0x0001070fbe26   
bin/maci64/libmwfl.dylib+00073254 
_ZN10foundation4core4test17terminate_handledEv+3958
[  2] 0x0001070fac49   
bin/maci64/libmwfl.dylib+00068681 
_ZN10foundation4core4diag13terminate_logEPKcPK17__darwin_ucontext+0185
[  3] 0x00010b04a2f0  
bin/maci64/libmwmcr.dylib+00574192 
_Z19mnPrintErrorMessageRKNSt3__112basic_stringIcNS_11char_traitsIcEENS_9allocatorIc+00010208
[  4] 0x00010b047f72  
bin/maci64/libmwmcr.dylib+00565106 
_Z19mnPrintErrorMessageRKNSt3__112basic_stringIcNS_11char_traitsIcEENS_9allocatorIc+1122
[  5] 0x00010b046681  
bin/maci64/libmwmcr.dylib+00558721 mnFatalSignalHandler+0145
[  6] 0x7fffa0d40b3a   
/usr/lib/system/libsystem_platform.dylib+00011066 _sigtramp+0026
[  7] 0x002e   
+
[  8] 0x000114092674
bin/maci64/libmwddux_impl.dylib+00251508 
_ZNK17DDUXConfigService16updateDDUXConfigEv+0148
[  9] 0x00011406d426
bin/maci64/libmwddux_impl.dylib+00099366 
_ZNK4ddux6detail30UsageDataCollectionServiceImpl15logSessionStartERKN26UsageDataCollectionService18SessionStartParamsE+6518
[ 10] 0x00011405d65f
bin/maci64/libmwddux_impl.dylib+00034399 
_ZN4ddux16DDUXServiceProxyclERKNS_23SendSessionStartMessageERKN10foundation7msg_svc8exchange7RoutingE+1711
[ 11] 0x0001140680a7
bin/maci64/libmwddux_impl.dylib+00077991 
_ZN4ddux23SendSessionStartMessageC1Ev+0071
[ 12] 0x000114065da5
bin/maci64/libmwddux_impl.dylib+00069029 
_ZN4ddux16DDUXServiceProxyclERKNS_23LogFunctionUsageMessageERKN10foundation7msg_svc8exchange7RoutingE+00024853
[ 13] 0x000114065ab0
bin/maci64/libmwddux_impl.dylib+00068272 
_ZN4ddux16DDUXServiceProxyclERKNS_23LogFunctionUsageMessageERKN10foundation7msg_svc8exchange7RoutingE+00024096
[ 14] 0x000106d3fd68   
bin/maci64/libmwms.dylib+00785768 
_ZN10foundation7msg_svc8exchange6detail23DefaultMessageQueueImpl15deliverMessagesEPNS1_19DefaultMessageQueueE+0088
[ 15] 0x00011406675c
bin/maci64/libmwddux_impl.dylib+00071516 
_ZN4ddux16DDUXServiceProxyclERKNS_23LogFunctionUsageMessageERKN10foundation7msg_svc8exchange7RoutingE+00027340
[ 16] 0x000114066574
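
Nearly all of the frames above are inside MATLAB's own libraries (libmwfl, libmwmcr, libmwddux_impl, i.e. the usage-data/DDUX component), so the segmentation violation happens inside the MATLAB installation itself, not in anything PETSc runs. A couple of hedged diagnostics, assuming the same install path:

# capture MATLAB's own startup log for MathWorks support
/Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -logfile /tmp/matlab_crash.log -r "disp(version); exit"
# check whether the crash also happens without -nojvm
/Applications/MATLAB_R2018a.app/bin/matlab -nodisplay -r "disp(version); exit"; echo $?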

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
>>
Testing Matlab at /Applications/MATLAB_R2018a.app
Executing: /Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
"display(['Version ' version]); exit"
stdout:
< M A T L A B (R) >
  Copyright 1984-2018 The MathWorks, Inc.
   R2018a (9.4.0.813654) 64-bit (maci64)
 February 23, 2018

For online documentation, see http://www.mathworks.com/support
For product information, visit www.mathworks.com.

Version 9.4.0.813654 (R2018a)
WARNING: Found Matlab at /Applications/MATLAB_R2018a.app but unable to run
<


So configure attempted to run Matlab - and perhaps got an error. What do you 
get if you run this manually?


/Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r 
"display(['Version ' version]); exit"
echo $?

Satish
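
The exit status is the key piece here; configure presumably marks the install as "unable to run" when this command returns nonzero, even though the version banner gets printed. A minimal way to capture it, assuming bash:

/Applications/MATLAB_R2018a.app/bin/matlab -nojvm -nodisplay -r "display(['Version ' version]); exit"
echo "exit status: $?"    # 0 means the test passed; any nonzero value is what makes configure give up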

On Fri, 26 Oct 2018, avatar wrote:

> Hi Satish, 
> 
> 
> The attached is the whole configure.log file. I could not past it here 
> because it is too big and crash my webpage. I could not use the latest 
> version right now, because the project is maintained by other people. If I 
> update petsc, I will break the whole project. But we will use the latest 
> version when the maintenance guys update the dependences.
> 
> 
> Thank you.
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> 1. You need to send us the complete log.
> 
> 2. Also use current release - petsc-3.10 [not 3.8]
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> > --with-matlab-dir=/Applications/MATLAB_R2018a.app/
> > ===
> >  Configuring PETSc to compile on your system
> > ===
> > TESTING: configureLibrary from 
> > config.packages.Matlab(config/BuildSystem/config/***
> >  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> > details):
> > ---
> > You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
> > cannot be used
> > ***
> > 
> > 
> > 
> > Part of the log file as follow
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER
> > #define PETSC_HAVE_MPI_REDUCE_SCATTER 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_COMBINER_DUP
> > #define PETSC_HAVE_MPI_COMBINER_DUP 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPIIO
> > #define PETSC_HAVE_MPIIO 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_COMM_SPAWN
> > #define PETSC_HAVE_MPI_COMM_SPAWN 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_FINT
> > #define PETSC_HAVE_MPI_FINT 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_IBARRIER
> > #define PETSC_HAVE_MPI_IBARRIER 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_ALLTOALLW
> > #define PETSC_HAVE_MPI_ALLTOALLW 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_OMPI_RELEASE_VERSION
> > #define PETSC_HAVE_OMPI_RELEASE_VERSION 2
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REDUCE_LOCAL
> > #define PETSC_HAVE_MPI_REDUCE_LOCAL 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_REPLACE
> > #define PETSC_HAVE_MPI_REPLACE 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_EXSCAN
> > #define PETSC_HAVE_MPI_EXSCAN 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX
> > #define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1
> > #endif
> > 
> > 
> > #ifndef PETSC_HAVE_MPI_FINALIZED
> > #define PETSC_HAVE_MPI_FINALIZED 1
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_INFO
> > #define PETSC_USE_INFO 1
> > #endif
> > 
> > 
> > #ifndef PETSC_Alignx
> > #define PETSC_Alignx(a,b)   
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_BACKWARD_LOOP
> > #define PETSC_USE_BACKWARD_LOOP 1
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_DEBUG
> > #define PETSC_USE_DEBUG 1
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_LOG
> > #define PETSC_USE_LOG 1
> > #endif
> > 
> > 
> > #ifndef PETSC_IS_COLOR_VALUE_TYPE_F
> > #define PETSC_IS_COLOR_VALUE_TYPE_F integer2
> > #endif
> > 
> > 
> > #ifndef PETSC_IS_COLOR_VALUE_TYPE
> > #define PETSC_IS_COLOR_VALUE_TYPE short
> > #endif
> > 
> > 
> > #ifndef PETSC_USE_CTABLE
> > #define PETSC_USE_CTABLE 1
> > #endif
> > 
> > 
> > #ifndef PETSC_MEMALIGN
> > #define PETSC_MEMALIGN 16
> > #endif
> > 
> > 
> > #ifndef PETSC_LEVEL1_DCACHE_LINESIZE
> > #define PETSC_LEVEL1_DCACHE_LINESIZE 32
> > #endif
> > 
> > 
> > #ifndef PETSC_LEVEL1_DCACHE_SIZE
> > #define PETSC_LEVEL1_DCACHE_SIZE 32768
> > #endif
> > 
> > 
> > #ifndef PETSC_LEVEL1_DCACHE_ASSOC
> > #define PETSC_LEVEL1_DCACHE_ASSOC 2
> > #endif
> > 
> > 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
1. You need to send us the complete log.

2. Also use current release - petsc-3.10 [not 3.8]

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> --with-matlab-dir=/Applications/MATLAB_R2018a.app/
> ===
>  Configuring PETSc to compile on your system
> ===
> TESTING: configureLibrary from 
> config.packages.Matlab(config/BuildSystem/config/***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
> cannot be used
> ***
> 
> 
> 
> Part of the log file as follow
> 
> 
> #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER
> #define PETSC_HAVE_MPI_REDUCE_SCATTER 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_COMBINER_DUP
> #define PETSC_HAVE_MPI_COMBINER_DUP 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPIIO
> #define PETSC_HAVE_MPIIO 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_COMM_SPAWN
> #define PETSC_HAVE_MPI_COMM_SPAWN 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_FINT
> #define PETSC_HAVE_MPI_FINT 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_IBARRIER
> #define PETSC_HAVE_MPI_IBARRIER 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_ALLTOALLW
> #define PETSC_HAVE_MPI_ALLTOALLW 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_OMPI_RELEASE_VERSION
> #define PETSC_HAVE_OMPI_RELEASE_VERSION 2
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_REDUCE_LOCAL
> #define PETSC_HAVE_MPI_REDUCE_LOCAL 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_REPLACE
> #define PETSC_HAVE_MPI_REPLACE 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_EXSCAN
> #define PETSC_HAVE_MPI_EXSCAN 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX
> #define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_MPI_FINALIZED
> #define PETSC_HAVE_MPI_FINALIZED 1
> #endif
> 
> 
> #ifndef PETSC_USE_INFO
> #define PETSC_USE_INFO 1
> #endif
> 
> 
> #ifndef PETSC_Alignx
> #define PETSC_Alignx(a,b)   
> #endif
> 
> 
> #ifndef PETSC_USE_BACKWARD_LOOP
> #define PETSC_USE_BACKWARD_LOOP 1
> #endif
> 
> 
> #ifndef PETSC_USE_DEBUG
> #define PETSC_USE_DEBUG 1
> #endif
> 
> 
> #ifndef PETSC_USE_LOG
> #define PETSC_USE_LOG 1
> #endif
> 
> 
> #ifndef PETSC_IS_COLOR_VALUE_TYPE_F
> #define PETSC_IS_COLOR_VALUE_TYPE_F integer2
> #endif
> 
> 
> #ifndef PETSC_IS_COLOR_VALUE_TYPE
> #define PETSC_IS_COLOR_VALUE_TYPE short
> #endif
> 
> 
> #ifndef PETSC_USE_CTABLE
> #define PETSC_USE_CTABLE 1
> #endif
> 
> 
> #ifndef PETSC_MEMALIGN
> #define PETSC_MEMALIGN 16
> #endif
> 
> 
> #ifndef PETSC_LEVEL1_DCACHE_LINESIZE
> #define PETSC_LEVEL1_DCACHE_LINESIZE 32
> #endif
> 
> 
> #ifndef PETSC_LEVEL1_DCACHE_SIZE
> #define PETSC_LEVEL1_DCACHE_SIZE 32768
> #endif
> 
> 
> #ifndef PETSC_LEVEL1_DCACHE_ASSOC
> #define PETSC_LEVEL1_DCACHE_ASSOC 2
> #endif
> 
> 
> #ifndef PETSC_HAVE_CLOSURE
> #define PETSC_HAVE_CLOSURE 1
> #endif
> 
> 
> #ifndef PETSC__BSD_SOURCE
> #define PETSC__BSD_SOURCE 1
> #endif
> 
> 
> #ifndef PETSC__DEFAULT_SOURCE
> #define PETSC__DEFAULT_SOURCE 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT
> #define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_GFORTRAN_IARGC
> #define PETSC_HAVE_GFORTRAN_IARGC 1
> #endif
> 
> 
> #ifndef PETSC_USE_BYTES_FOR_SIZE
> #define PETSC_USE_BYTES_FOR_SIZE 1
> #endif
> 
> 
> #ifndef PETSC_HAVE_SYS_SYSCTL_H
> #define PETSC_HAVE_SYS_SYSCTL_H 1
> #endif
> 
> 
> #endif
>  C specific Configure header 
> /var/folders/z_/2vhmh9zx3kx3h80k5wbkjmtrgp/T/petsc-yrHkm7/conffix.h 
> #if !defined(INCLUDED_UNKNOWN)
> #define INCLUDED_UNKNOWN
> 
> 
> #if defined(__cplusplus)
> extern "C" {
> }
> #else
> #endif
> #endif
> ***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
> cannot be used
> ***
>   File "./config/configure.py", line 393, in petsc_configure
> framework.configure(out = sys.stdout)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 1097, in configure
> self.processChildren()
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 1086, in processChildren
> self.serialEvaluation(self.childGraph)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 1067, in serialEvaluation
> 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
--with-matlab-dir=/Applications/MATLAB_R2018a.app/
===
 Configuring PETSc to compile on your system
===
TESTING: configureLibrary from 
config.packages.Matlab(config/BuildSystem/config/***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
details):
---
You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
cannot be used
***



Part of the log file is as follows:


#ifndef PETSC_HAVE_MPI_REDUCE_SCATTER
#define PETSC_HAVE_MPI_REDUCE_SCATTER 1
#endif


#ifndef PETSC_HAVE_MPI_COMBINER_DUP
#define PETSC_HAVE_MPI_COMBINER_DUP 1
#endif


#ifndef PETSC_HAVE_MPIIO
#define PETSC_HAVE_MPIIO 1
#endif


#ifndef PETSC_HAVE_MPI_COMM_SPAWN
#define PETSC_HAVE_MPI_COMM_SPAWN 1
#endif


#ifndef PETSC_HAVE_MPI_FINT
#define PETSC_HAVE_MPI_FINT 1
#endif


#ifndef PETSC_HAVE_MPI_IBARRIER
#define PETSC_HAVE_MPI_IBARRIER 1
#endif


#ifndef PETSC_HAVE_MPI_ALLTOALLW
#define PETSC_HAVE_MPI_ALLTOALLW 1
#endif


#ifndef PETSC_HAVE_OMPI_RELEASE_VERSION
#define PETSC_HAVE_OMPI_RELEASE_VERSION 2
#endif


#ifndef PETSC_HAVE_MPI_REDUCE_LOCAL
#define PETSC_HAVE_MPI_REDUCE_LOCAL 1
#endif


#ifndef PETSC_HAVE_MPI_REPLACE
#define PETSC_HAVE_MPI_REPLACE 1
#endif


#ifndef PETSC_HAVE_MPI_EXSCAN
#define PETSC_HAVE_MPI_EXSCAN 1
#endif


#ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX
#define PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1
#endif


#ifndef PETSC_HAVE_MPI_FINALIZED
#define PETSC_HAVE_MPI_FINALIZED 1
#endif


#ifndef PETSC_USE_INFO
#define PETSC_USE_INFO 1
#endif


#ifndef PETSC_Alignx
#define PETSC_Alignx(a,b)   
#endif


#ifndef PETSC_USE_BACKWARD_LOOP
#define PETSC_USE_BACKWARD_LOOP 1
#endif


#ifndef PETSC_USE_DEBUG
#define PETSC_USE_DEBUG 1
#endif


#ifndef PETSC_USE_LOG
#define PETSC_USE_LOG 1
#endif


#ifndef PETSC_IS_COLOR_VALUE_TYPE_F
#define PETSC_IS_COLOR_VALUE_TYPE_F integer2
#endif


#ifndef PETSC_IS_COLOR_VALUE_TYPE
#define PETSC_IS_COLOR_VALUE_TYPE short
#endif


#ifndef PETSC_USE_CTABLE
#define PETSC_USE_CTABLE 1
#endif


#ifndef PETSC_MEMALIGN
#define PETSC_MEMALIGN 16
#endif


#ifndef PETSC_LEVEL1_DCACHE_LINESIZE
#define PETSC_LEVEL1_DCACHE_LINESIZE 32
#endif


#ifndef PETSC_LEVEL1_DCACHE_SIZE
#define PETSC_LEVEL1_DCACHE_SIZE 32768
#endif


#ifndef PETSC_LEVEL1_DCACHE_ASSOC
#define PETSC_LEVEL1_DCACHE_ASSOC 2
#endif


#ifndef PETSC_HAVE_CLOSURE
#define PETSC_HAVE_CLOSURE 1
#endif


#ifndef PETSC__BSD_SOURCE
#define PETSC__BSD_SOURCE 1
#endif


#ifndef PETSC__DEFAULT_SOURCE
#define PETSC__DEFAULT_SOURCE 1
#endif


#ifndef PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT
#define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1
#endif


#ifndef PETSC_HAVE_GFORTRAN_IARGC
#define PETSC_HAVE_GFORTRAN_IARGC 1
#endif


#ifndef PETSC_USE_BYTES_FOR_SIZE
#define PETSC_USE_BYTES_FOR_SIZE 1
#endif


#ifndef PETSC_HAVE_SYS_SYSCTL_H
#define PETSC_HAVE_SYS_SYSCTL_H 1
#endif


#endif
 C specific Configure header 
/var/folders/z_/2vhmh9zx3kx3h80k5wbkjmtrgp/T/petsc-yrHkm7/conffix.h 
#if !defined(INCLUDED_UNKNOWN)
#define INCLUDED_UNKNOWN


#if defined(__cplusplus)
extern "C" {
}
#else
#endif
#endif
***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
details):
---
You set a value for --with-matlab-dir, but /Applications/MATLAB_R2018a.app 
cannot be used
***
  File "./config/configure.py", line 393, in petsc_configure
framework.configure(out = sys.stdout)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 1097, in configure
self.processChildren()
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 1086, in processChildren
self.serialEvaluation(self.childGraph)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 1067, in serialEvaluation
child.configure()
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/package.py", 
line 857, in configure
self.executeTest(self.configureLibrary)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/base.py", 
line 126, in executeTest
ret = test(*args,**kargs)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/packages/Matlab.py",
 line 40, in configureLibrary
for matlab in self.generateGuesses():
  File 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote:

> Scott-Grad-MBP:bin zhihui$ pwd
> /Applications/MATLAB_R2018a.app/bin


Sorry - you need

--with-matlab-dir=/Applications/MATLAB_R2018a.app/

Satish
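
That is, the option should point at the application bundle itself (the directory containing bin/), under the system-wide /Applications, not at bin/matlab and not under ~/Applications. A quick sanity check before re-running configure, assuming bash:

ls -d /Applications/MATLAB_R2018a.app            # the root directory configure needs
ls /Applications/MATLAB_R2018a.app/bin/matlab    # the launcher inside it
./configure --with-matlab-dir=/Applications/MATLAB_R2018a.app/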



Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
When I do the following, it shows
Scott-Grad-MBP:~ zhihui$ cd Applications/
Scott-Grad-MBP:Applications zhihui$ ls
Chrome Apps.localized
Scott-Grad-MBP:Applications zhihui$



From these, we can see there is no MATLAB_R2018a.app in Applications. But when
I go into bin and do the following, it does show:
Scott-Grad-MBP:bin zhihui$ pwd
/Applications/MATLAB_R2018a.app/bin
Scott-Grad-MBP:bin zhihui$ ls
activate_matlab.sh  lcdata_utf8.xml matlab-glselector.shmexsh
deactivate_matlab.shldd matopts.sh  
optsetup.sh
engopts.sh  m3iregistry mex registry
lcdata.xml  maci64  mexext  util
lcdata.xsd  matlab  mexopts.sh
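
The first listing above is of ~/Applications (cd Applications from the home directory lands in the per-user folder), while the pwd output shows MATLAB living under the system-wide /Applications. A quick way to check both at once, assuming a standard macOS layout:

ls -d /Applications/MATLAB_R2018a.app ~/Applications/MATLAB_R2018a.app 2>/dev/null
# only the first path should exist here, and that is the one to pass to --with-matlab-dir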





-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"



you need

--with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Thank Satish. It worked as you said. But when I rebuild petsc using 
> ./configure --with-matlab
> It prompts 
> ===
>  Configuring PETSc to compile on your system
> ===
> TESTING: configureLibrary from 
> config.packages.Matlab(config/BuildSystem/config/packages/Matlab.py:35)   
>   
> ***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> Could not find a functional Matlab
> Run with --with-matlab-dir=Matlabrootdir if you know where it is
> ***
> 
> 
> 
> Then, I did the following. But it shows that 
> "/Users/zhihui/Applications/MATLAB_R2018a.app/bin/matlab" nonexistent. 
> However, it does exist as
> Scott-Grad-MBP:bin zhihui$ pwd
> /Applications/MATLAB_R2018a.app/bin
> Scott-Grad-MBP:bin zhihui$ ls
> activate_matlab.shlcdata_utf8.xml matlab-glselector.shmexsh
> deactivate_matlab.sh  ldd matopts.sh  
> optsetup.sh
> engopts.shm3iregistry mex registry
> lcdata.xmlmaci64  mexext  util
> lcdata.xsdmatlab  mexopts.sh
> 
> 
> 
> Do you know what it is going on here?
> 
> 
> 
> 
> Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> --with-matlab-dir=./../../../../Applications/MATLAB_R2018a/bin/matlab
> ===
>  Configuring PETSc to compile on your system
> ===
> ***
> ERROR in COMMAND LINE ARGUMENT to ./configure
> ---
> Nonexistent directory: /Users/zhihui/Applications/MATLAB_R2018a/bin/matlab 
> for key with-matlab-dir
> ***
> 
> 
> 
> 
>   File "./config/configure.py", line 390, in petsc_configure
> framework = 
> config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:],
>  loadArgDB = 0)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 110, in __init__
> self.createChildren()
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 321, in createChildren
> self.getChild(moduleName)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 306, in getChild
> config.setupDependencies(self)
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
> line 107, in setupDependencies
> obj = self.registerPythonFile(package,'config.packages')
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
> line 63, in registerPythonFile
> utilityObj = 
> self.framework.require(directory+utilityName, self)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 326, in require
> config = self.getChild(moduleName, keywordArgs)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 304, in getChild
> config.setup()
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
It still shows this:


Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
--with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app
===
 Configuring PETSc to compile on your system
===
***
ERROR in COMMAND LINE ARGUMENT to ./configure
---
Nonexistent directory: /Users/zhihui/Applications/MATLAB_R2018a.app for key 
with-matlab-dir
***




  File "./config/configure.py", line 390, in petsc_configure
framework = 
config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:],
 loadArgDB = 0)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 110, in __init__
self.createChildren()
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 321, in createChildren
self.getChild(moduleName)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 306, in getChild
config.setupDependencies(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
line 107, in setupDependencies
obj = self.registerPythonFile(package,'config.packages')
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
line 63, in registerPythonFile
utilityObj = 
self.framework.require(directory+utilityName, self)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 326, in require
config = self.getChild(moduleName, keywordArgs)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 304, in getChild
config.setup()
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", 
line 101, in setup
logger.Logger.setup(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/logger.py", 
line 85, in setup
args.ArgumentProcessor.setup(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/args.py", 
line 75, in setup
self.setupArguments(self.argDB)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", 
line 85, in setupArguments
self.setupHelp(self.help)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/packages/Matlab.py",
 line 15, in setupHelp
help.addArgument('MATLAB', '-with-matlab-dir=', 
nargs.ArgDir(None, None, 'Specify the root directory of the Matlab 
installation'))
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/help.py", 
line 107, in addArgument
self.argDB.setType(self.getArgName(name), argType, forceLocal = 1)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/RDict.py", 
line 213, in setType
value.setValue(v.getValue())
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/nargs.py", 
line 324, in setValue
raise ValueError('Nonexistent directory: '+str(value)+' for key 
'+str(self.key))







-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"



you need

--with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Thank Satish. It worked as you said. But when I rebuild petsc using 
> ./configure --with-matlab
> It prompts 
> ===
>  Configuring PETSc to compile on your system
> ===
> TESTING: configureLibrary from 
> config.packages.Matlab(config/BuildSystem/config/packages/Matlab.py:35)   
>   
> ***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> Could not find a functional Matlab
> Run with --with-matlab-dir=Matlabrootdir if you know where it is
> ***
> 
> 
> 
> Then, I did the following. But it shows that 
> "/Users/zhihui/Applications/MATLAB_R2018a.app/bin/matlab" nonexistent. 
> However, it does exist as
> Scott-Grad-MBP:bin zhihui$ pwd
> /Applications/MATLAB_R2018a.app/bin
> Scott-Grad-MBP:bin zhihui$ ls
> activate_matlab.shlcdata_utf8.xml matlab-glselector.shmexsh
> deactivate_matlab.sh  ldd

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
you need

--with-matlab-dir=/Users/zhihui/Applications/MATLAB_R2018a.app

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Thank Satish. It worked as you said. But when I rebuild petsc using 
> ./configure --with-matlab
> It prompts 
> ===
>  Configuring PETSc to compile on your system
> ===
> TESTING: configureLibrary from 
> config.packages.Matlab(config/BuildSystem/config/packages/Matlab.py:35)   
>   
> ***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> Could not find a functional Matlab
> Run with --with-matlab-dir=Matlabrootdir if you know where it is
> ***
> 
> 
> 
> Then, I did the following. But it shows that 
> "/Users/zhihui/Applications/MATLAB_R2018a.app/bin/matlab" nonexistent. 
> However, it does exist as
> Scott-Grad-MBP:bin zhihui$ pwd
> /Applications/MATLAB_R2018a.app/bin
> Scott-Grad-MBP:bin zhihui$ ls
> activate_matlab.shlcdata_utf8.xml matlab-glselector.shmexsh
> deactivate_matlab.sh  ldd matopts.sh  
> optsetup.sh
> engopts.shm3iregistry mex registry
> lcdata.xmlmaci64  mexext  util
> lcdata.xsdmatlab  mexopts.sh
> 
> 
> 
> Do you know what it is going on here?
> 
> 
> 
> 
> Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
> --with-matlab-dir=./../../../../Applications/MATLAB_R2018a/bin/matlab
> ===
>  Configuring PETSc to compile on your system
> ===
> ***
> ERROR in COMMAND LINE ARGUMENT to ./configure
> ---
> Nonexistent directory: /Users/zhihui/Applications/MATLAB_R2018a/bin/matlab 
> for key with-matlab-dir
> ***
> 
> 
> 
> 
>   File "./config/configure.py", line 390, in petsc_configure
> framework = 
> config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:],
>  loadArgDB = 0)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 110, in __init__
> self.createChildren()
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 321, in createChildren
> self.getChild(moduleName)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 306, in getChild
> config.setupDependencies(self)
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
> line 107, in setupDependencies
> obj = self.registerPythonFile(package,'config.packages')
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
> line 63, in registerPythonFile
> utilityObj = 
> self.framework.require(directory+utilityName, self)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 326, in require
> config = self.getChild(moduleName, keywordArgs)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
>  line 304, in getChild
> config.setup()
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", line 
> 101, in setup
> logger.Logger.setup(self)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/logger.py", line 
> 85, in setup
> args.ArgumentProcessor.setup(self)
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/args.py", 
> line 75, in setup
> self.setupArguments(self.argDB)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", line 
> 85, in setupArguments
> self.setupHelp(self.help)
>   File 
> "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/packages/Matlab.py",
>  line 15, in setupHelp
> help.addArgument('MATLAB', '-with-matlab-dir=', 
> nargs.ArgDir(None, None, 'Specify the root directory of the Matlab 
> installation'))
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/help.py", 
> line 107, in addArgument
> self.argDB.setType(self.getArgName(name), argType, forceLocal = 1)
>   File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/RDict.py", 
> line 213, in setType
> 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
Thanks Satish. It worked as you said. But when I rebuild petsc using
./configure --with-matlab
it prompts:
===
 Configuring PETSc to compile on your system
===
TESTING: configureLibrary from 
config.packages.Matlab(config/BuildSystem/config/packages/Matlab.py:35) 
***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
details):
---
Could not find a functional Matlab
Run with --with-matlab-dir=Matlabrootdir if you know where it is
***



Then I did the following. But it shows that
"/Users/zhihui/Applications/MATLAB_R2018a.app/bin/matlab" is nonexistent. However,
it does exist:
Scott-Grad-MBP:bin zhihui$ pwd
/Applications/MATLAB_R2018a.app/bin
Scott-Grad-MBP:bin zhihui$ ls
activate_matlab.sh  lcdata_utf8.xml matlab-glselector.shmexsh
deactivate_matlab.shldd matopts.sh  
optsetup.sh
engopts.sh  m3iregistry mex registry
lcdata.xml  maci64  mexext  util
lcdata.xsd  matlab  mexopts.sh



Do you know what is going on here?




Scott-Grad-MBP:petsc-3.8.3 zhihui$ ./configure 
--with-matlab-dir=./../../../../Applications/MATLAB_R2018a/bin/matlab
===
 Configuring PETSc to compile on your system
===
***
ERROR in COMMAND LINE ARGUMENT to ./configure
---
Nonexistent directory: /Users/zhihui/Applications/MATLAB_R2018a/bin/matlab for 
key with-matlab-dir
***




  File "./config/configure.py", line 390, in petsc_configure
framework = 
config.framework.Framework(['--configModules=PETSc.Configure','--optionsModule=config.compilerOptions']+sys.argv[1:],
 loadArgDB = 0)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 110, in __init__
self.createChildren()
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 321, in createChildren
self.getChild(moduleName)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 306, in getChild
config.setupDependencies(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
line 107, in setupDependencies
obj = self.registerPythonFile(package,'config.packages')
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/PETSc/Configure.py", 
line 63, in registerPythonFile
utilityObj = 
self.framework.require(directory+utilityName, self)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 326, in require
config = self.getChild(moduleName, keywordArgs)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/framework.py",
 line 304, in getChild
config.setup()
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", 
line 101, in setup
logger.Logger.setup(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/logger.py", 
line 85, in setup
args.ArgumentProcessor.setup(self)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/args.py", 
line 75, in setup
self.setupArguments(self.argDB)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/script.py", 
line 85, in setupArguments
self.setupHelp(self.help)
  File 
"/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/config/packages/Matlab.py",
 line 15, in setupHelp
help.addArgument('MATLAB', '-with-matlab-dir=', 
nargs.ArgDir(None, None, 'Specify the root directory of the Matlab 
installation'))
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/help.py", 
line 107, in addArgument
self.argDB.setType(self.getArgName(name), argType, forceLocal = 1)
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/RDict.py", 
line 213, in setType
value.setValue(v.getValue())
  File "/Users/zhihui/igx/deps/srcs/petsc-3.8.3/config/BuildSystem/nargs.py", 
line 324, in setValue
raise ValueError('Nonexistent directory: '+str(value)+' for key 
'+str(self.key))







-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote:

> Is there other way to overcome this problem? Because if I don't set TMPDIR as 
> \tmp. My other project will break. And actually I don't even know where my 
> project set up the TMPDIR value.

I guess you need to read up on some basic unix and system admin.

You probably meant to use:

export TMPDIR=/tmp

However  - you have:

export TMPDIR=tmp

To overcome this problem - you do:

export TMPDIR=/tmp

Satish
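
If removing the setting globally would disturb the other project, one workaround (a sketch, assuming bash) is to override TMPDIR only for the PETSc build; child processes such as make and ar inherit the value:

export TMPDIR=/tmp        # for this shell session only
./configure --with-matlab-dir=/Applications/MATLAB_R2018a.app/
# or override it for a single command:
TMPDIR=/tmp ./configure --with-matlab-dir=/Applications/MATLAB_R2018a.app/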


Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
Is there another way to overcome this problem? Because if I don't set TMPDIR to
\tmp, my other project will break. And actually I don't even know where my
project sets the TMPDIR value.




-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "petsc-users"; 
Cc:  "avatar"<648934...@qq.com>; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"



To clarify, I meant: remove the code that is changing the value of TMPDIR to 'tmp'.

Satish

 On Thu, 25 Oct 2018, Satish Balay wrote:

> I see - you have:
> 
> >>
> TMPDIR=tmp
> <<
> 
> Did you set this in your ~/.bashrc or somewhere? This is wrong and is 
> breaking tools.
> OSX should set up something like the following for you.
> 
> petsc-mini:~ balay$ echo $TMPDIR
> /var/folders/lw/hyrlb1051p9fj96qkktfvhvmgn/T/
> 
> Remove it - and retry building PETSc.
> 
> Satish
> 
> On Thu, 25 Oct 2018, Satish Balay wrote:
> 
> > Looks like it worked.
> > 
> > What do you have for:
> > 
> > echo $TMPDIR
> > ls -l libconf1.a
> > /usr/bin/ar t libconf1.a
> > TMPDIR=$PWD /usr/bin/ar t libconf1.a
> > 
> > Satish
> > 
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > As follow. Then, what I should do next?
> > > 
> > > 
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr 
> > > libconf1.a sizeof.o
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > 
> > > 
> > > 
> > > 
> > > 
> > > -- Original --
> > > From:  "Balay, Satish";;
> > > Date:  Oct 26, 2018
> > > To:  "avatar"<648934...@qq.com>; 
> > > Cc:  "petsc-users"; 
> > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > --with-arto specify an archiver"
> > > 
> > > 
> > > 
> > > How about:
> > > 
> > > TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o
> > > 
> > > Satish
> > > 
> > > On Fri, 26 Oct 2018, avatar wrote:
> > > 
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> > > > Index.c PetscGetCPUTime.c   PetscMemcmp.c   
> > > > PetscTime.c benchmarkExample.py sizeof.c
> > > > Index.c.htmlPetscGetCPUTime.c.html  PetscMemcmp.c.html  
> > > > PetscTime.c.htmldaemon.py   sizeof.o
> > > > MPI_Wtime.c PetscGetTime.c  PetscMemcpy.c   
> > > > PetscVecNorm.c  index.html  streams
> > > > MPI_Wtime.c.htmlPetscGetTime.c.html PetscMemcpy.c.html  
> > > > PetscVecNorm.c.html libconf1.a
> > > > PLogEvent.c PetscMalloc.c   PetscMemzero.c  
> > > > benchmarkAssembly.pymakefile
> > > > PLogEvent.c.htmlPetscMalloc.c.html  PetscMemzero.c.html 
> > > > benchmarkBatch.py   makefile.html
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a 
> > > > sizeof.o
> > > > ar: temporary file: No such file or directory
> > > > 
> > > > 
> > > > 
> > > > 
> > > > 
> > > > -- Original --
> > > > From:  "Balay, Satish";;
> > > > Date:  Oct 26, 2018
> > > > To:  "avatar"<648934...@qq.com>; 
> > > > Cc:  "petsc-users"; 
> > > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > > --with-arto specify an archiver"
> > > > 
> > > > 
> > > > 
> > > > What about:
> > > > 
> > > > 
> > > > mpicc -c sizeof.c
> > > > /usr/bin/ar cr libconf1.a sizeof.o
> > > > 
> > > > Satish
> > > > 
> > > > On Fri, 26 Oct 2018, avatar wrote:
> > > > 
> > > > > I could not do all things you posted below. I get these:
> > > > > 
> > > > > 
> > > > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > > > > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > > > > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > > /tmp/libconf1.a sizeof.o
> > > > > ar: temporary file: No such file or directory
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > > /tmp/libconf1.a  sizeof.o
> > > > > ar: temporary file: No such file or directory
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > > > 
> > > > > 
> > > > > 
> > > > > 
> > > > > 
> > > > > -- Original --
> > > > > From:  "Balay, Satish";;
> > > > > Date:  Oct 26, 2018
> > > > > To:  "avatar"<648934...@qq.com>; 
> > > > > Cc:  "petsc-users"; 
> > > > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > > > --with-arto specify an archiver"
> > > > > 
> > > > > 
> > > > > 
> > > > > On Fri, 26 Oct 2018, avatar wrote:
> > > > > 
> > > > > > Hi Satish,
> > > > > > 
> > > > > > 
> > > > > > Thank you very much for your quick response.
> > > > > > 
> > > > > > 
> > > > > > The log file is as follow:
> > > > > > 
> 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
To clarify, I meant: remove the code that is changing the value of TMPDIR to 'tmp'.

Satish

 On Thu, 25 Oct 2018, Satish Balay wrote:

> I see - you have:
> 
> >>
> TMPDIR=tmp
> <<
> 
> Did you set this in your ~/.bashrc or somewhere? This is wrong and is 
> breaking tools.
> OSX should set up something like the following for you.
> 
> petsc-mini:~ balay$ echo $TMPDIR
> /var/folders/lw/hyrlb1051p9fj96qkktfvhvmgn/T/
> 
> Remove it - and retry building PETSc.
> 
> Satish
> 
> On Thu, 25 Oct 2018, Satish Balay wrote:
> 
> > Looks like it worked.
> > 
> > What do you have for:
> > 
> > echo $TMPDIR
> > ls -l libconf1.a
> > /usr/bin/ar t libconf1.a
> > TMPDIR=$PWD /usr/bin/ar t libconf1.a
> > 
> > Satish
> > 
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > As follow. Then, what I should do next?
> > > 
> > > 
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr 
> > > libconf1.a sizeof.o
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > 
> > > 
> > > 
> > > 
> > > 
> > > -- Original --
> > > From:  "Balay, Satish";;
> > > Date:  Oct 26, 2018
> > > To:  "avatar"<648934...@qq.com>; 
> > > Cc:  "petsc-users"; 
> > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > --with-arto specify an archiver"
> > > 
> > > 
> > > 
> > > How about:
> > > 
> > > TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o
> > > 
> > > Satish
> > > 
> > > On Fri, 26 Oct 2018, avatar wrote:
> > > 
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> > > > Index.c PetscGetCPUTime.c   PetscMemcmp.c   
> > > > PetscTime.c benchmarkExample.py sizeof.c
> > > > Index.c.htmlPetscGetCPUTime.c.html  PetscMemcmp.c.html  
> > > > PetscTime.c.htmldaemon.py   sizeof.o
> > > > MPI_Wtime.c PetscGetTime.c  PetscMemcpy.c   
> > > > PetscVecNorm.c  index.html  streams
> > > > MPI_Wtime.c.htmlPetscGetTime.c.html PetscMemcpy.c.html  
> > > > PetscVecNorm.c.html libconf1.a
> > > > PLogEvent.c PetscMalloc.c   PetscMemzero.c  
> > > > benchmarkAssembly.pymakefile
> > > > PLogEvent.c.htmlPetscMalloc.c.html  PetscMemzero.c.html 
> > > > benchmarkBatch.py   makefile.html
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a 
> > > > sizeof.o
> > > > ar: temporary file: No such file or directory
> > > > 
> > > > 
> > > > 
> > > > 
> > > > 
> > > > -- Original --
> > > > From:  "Balay, Satish";;
> > > > Date:  Oct 26, 2018
> > > > To:  "avatar"<648934...@qq.com>; 
> > > > Cc:  "petsc-users"; 
> > > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > > --with-arto specify an archiver"
> > > > 
> > > > 
> > > > 
> > > > What about:
> > > > 
> > > > 
> > > > mpicc -c sizeof.c
> > > > /usr/bin/ar cr libconf1.a sizeof.o
> > > > 
> > > > Satish
> > > > 
> > > > On Fri, 26 Oct 2018, avatar wrote:
> > > > 
> > > > > I could not do all things you posted below. I get these:
> > > > > 
> > > > > 
> > > > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > > > > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > > > > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > > /tmp/libconf1.a sizeof.o
> > > > > ar: temporary file: No such file or directory
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > > /tmp/libconf1.a  sizeof.o
> > > > > ar: temporary file: No such file or directory
> > > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > > > 
> > > > > 
> > > > > 
> > > > > 
> > > > > 
> > > > > -- Original --
> > > > > From:  "Balay, Satish";;
> > > > > Date:  Oct 26, 2018
> > > > > To:  "avatar"<648934...@qq.com>; 
> > > > > Cc:  "petsc-users"; 
> > > > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > > > --with-arto specify an archiver"
> > > > > 
> > > > > 
> > > > > 
> > > > > On Fri, 26 Oct 2018, avatar wrote:
> > > > > 
> > > > > > Hi Satish,
> > > > > > 
> > > > > > 
> > > > > > Thank you very much for your quick response.
> > > > > > 
> > > > > > 
> > > > > > The log file is as follow:
> > > > > > 
> > > > > 
> > > > > >>
> > > > > Executing: /usr/bin/ar cr 
> > > > > /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > > > > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > > > > Possible ERROR while running archiver: exit code 256
> > > > > stderr:
> > > > > ar: temporary file: No such file or directory
> > > > > Archiver is not functional
> > > > > 
> > > > > <<
> > > > > This is a strange error.
> > > > > 
> > > 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
I see - you have:

>>
TMPDIR=tmp
<<

Did you set this in your ~/.bashrc or somewhere? This is wrong and is breaking 
tools.
OSX should set up something like the following for you.

petsc-mini:~ balay$ echo $TMPDIR
/var/folders/lw/hyrlb1051p9fj96qkktfvhvmgn/T/

Remove it - and retry building PETSc.

Satish
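
A minimal way to track down where the bad value comes from, assuming bash is the login shell (the exact startup file varies):

grep -n TMPDIR ~/.bashrc ~/.bash_profile ~/.profile 2>/dev/null
# after removing the offending line, open a fresh terminal and confirm:
echo $TMPDIR    # should again be a /var/folders/.../T/ path, not "tmp"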

On Thu, 25 Oct 2018, Satish Balay wrote:

> Looks like it worked.
> 
> What do you have for:
> 
> echo $TMPDIR
> ls -l libconf1.a
> /usr/bin/ar t libconf1.a
> TMPDIR=$PWD /usr/bin/ar t libconf1.a
> 
> Satish
> 
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > As follow. Then, what I should do next?
> > 
> > 
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr 
> > libconf1.a sizeof.o
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > 
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > How about:
> > 
> > TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o
> > 
> > Satish
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> > > Index.c   PetscGetCPUTime.c   PetscMemcmp.c   
> > > PetscTime.c benchmarkExample.py sizeof.c
> > > Index.c.html  PetscGetCPUTime.c.html  PetscMemcmp.c.html  
> > > PetscTime.c.htmldaemon.py   sizeof.o
> > > MPI_Wtime.c   PetscGetTime.c  PetscMemcpy.c   
> > > PetscVecNorm.c  index.html  streams
> > > MPI_Wtime.c.html  PetscGetTime.c.html PetscMemcpy.c.html  
> > > PetscVecNorm.c.html libconf1.a
> > > PLogEvent.c   PetscMalloc.c   PetscMemzero.c  
> > > benchmarkAssembly.pymakefile
> > > PLogEvent.c.html  PetscMalloc.c.html  PetscMemzero.c.html 
> > > benchmarkBatch.py   makefile.html
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a 
> > > sizeof.o
> > > ar: temporary file: No such file or directory
> > > 
> > > 
> > > 
> > > 
> > > 
> > > -- Original --
> > > From:  "Balay, Satish";;
> > > Date:  Oct 26, 2018
> > > To:  "avatar"<648934...@qq.com>; 
> > > Cc:  "petsc-users"; 
> > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > --with-arto specify an archiver"
> > > 
> > > 
> > > 
> > > What about:
> > > 
> > > 
> > > mpicc -c sizeof.c
> > > /usr/bin/ar cr libconf1.a sizeof.o
> > > 
> > > Satish
> > > 
> > > On Fri, 26 Oct 2018, avatar wrote:
> > > 
> > > > I could not do all things you posted below. I get these:
> > > > 
> > > > 
> > > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > > > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > > > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > /tmp/libconf1.a sizeof.o
> > > > ar: temporary file: No such file or directory
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr 
> > > > /tmp/libconf1.a  sizeof.o
> > > > ar: temporary file: No such file or directory
> > > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > > 
> > > > 
> > > > 
> > > > 
> > > > 
> > > > -- Original --
> > > > From:  "Balay, Satish";;
> > > > Date:  Oct 26, 2018
> > > > To:  "avatar"<648934...@qq.com>; 
> > > > Cc:  "petsc-users"; 
> > > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > > --with-arto specify an archiver"
> > > > 
> > > > 
> > > > 
> > > > On Fri, 26 Oct 2018, avatar wrote:
> > > > 
> > > > > Hi Satish,
> > > > > 
> > > > > 
> > > > > Thank you very much for your quick response.
> > > > > 
> > > > > 
> > > > > The log file is as follow:
> > > > > 
> > > > 
> > > > >>
> > > > Executing: /usr/bin/ar cr 
> > > > /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > > > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > > > Possible ERROR while running archiver: exit code 256
> > > > stderr:
> > > > ar: temporary file: No such file or directory
> > > > Archiver is not functional
> > > > 
> > > > <<
> > > > This is a strange error.
> > > > 
> > > > What do you get when you do the following:
> > > > 
> > > > balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> > > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> > > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> > > > -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> > > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > > > /tmp/libconf1.a  sizeof.o
> > > > 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
This is what I got:
Scott-Grad-MacBook-Pro:benchmarks zhihui$ echo $TMPDIR
tmp
Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l libconf1.a
-rw-r--r--  1 zhihui  staff  1624 Oct 25 17:36 libconf1.a
Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar t libconf1.a
__.SYMDEF SORTED
sizeof.o
Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar t libconf1.a
__.SYMDEF SORTED
sizeof.o
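
That TMPDIR value looks suspicious: it is the relative path "tmp" rather than an absolute, writable directory, which would explain why ar can only create its temporary file when TMPDIR is overridden on the command line. A minimal sketch of a fix, assuming that is indeed the cause (it is not confirmed here), is to export an absolute TMPDIR in the same shell before re-running the archiver test and configure:

export TMPDIR=/tmp                      # or any absolute, writable directory
echo $TMPDIR                            # should now print an absolute path
/usr/bin/ar cr libconf1.a sizeof.o      # should succeed without the TMPDIR=$PWD workaround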





-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
--with-arto specify an archiver"



Looks like it worked.

What do you have for:

echo $TMPDIR
ls -l libconf1.a
/usr/bin/ar t libconf1.a
TMPDIR=$PWD /usr/bin/ar t libconf1.a

Satish


On Fri, 26 Oct 2018, avatar wrote:

> As follow. Then, what I should do next?
> 
> 
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr 
> libconf1.a sizeof.o
> Scott-Grad-MacBook-Pro:benchmarks zhihui$
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> How about:
> 
> TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> > Index.c PetscGetCPUTime.c   PetscMemcmp.c   
> > PetscTime.c benchmarkExample.py sizeof.c
> > Index.c.htmlPetscGetCPUTime.c.html  PetscMemcmp.c.html  
> > PetscTime.c.htmldaemon.py   sizeof.o
> > MPI_Wtime.c PetscGetTime.c  PetscMemcpy.c   
> > PetscVecNorm.c  index.html  streams
> > MPI_Wtime.c.htmlPetscGetTime.c.html PetscMemcpy.c.html  
> > PetscVecNorm.c.html libconf1.a
> > PLogEvent.c PetscMalloc.c   PetscMemzero.c  
> > benchmarkAssembly.pymakefile
> > PLogEvent.c.htmlPetscMalloc.c.html  PetscMemzero.c.html 
> > benchmarkBatch.py   makefile.html
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a sizeof.o
> > ar: temporary file: No such file or directory
> > 
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > What about:
> > 
> > 
> > mpicc -c sizeof.c
> > /usr/bin/ar cr libconf1.a sizeof.o
> > 
> > Satish
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > I could not do all things you posted below. I get these:
> > > 
> > > 
> > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> > > sizeof.o
> > > ar: temporary file: No such file or directory
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> > > sizeof.o
> > > ar: temporary file: No such file or directory
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > 
> > > 
> > > 
> > > 
> > > 
> > > -- Original --
> > > From:  "Balay, Satish";;
> > > Date:  Oct 26, 2018
> > > To:  "avatar"<648934...@qq.com>; 
> > > Cc:  "petsc-users"; 
> > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > --with-arto specify an archiver"
> > > 
> > > 
> > > 
> > > On Fri, 26 Oct 2018, avatar wrote:
> > > 
> > > > Hi Satish,
> > > > 
> > > > 
> > > > Thank you very much for your quick response.
> > > > 
> > > > 
> > > > The log file is as follow:
> > > > 
> > > 
> > > >>
> > > Executing: /usr/bin/ar cr 
> > > /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > > Possible ERROR while running archiver: exit code 256
> > > stderr:
> > > ar: temporary file: No such file or directory
> > > Archiver is not functional
> > > 
> > > <<
> > > This is a strange error.
> > > 
> > > What do you get when you do the following:
> > > 
> > > balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> > > -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > > /tmp/libconf1.a  sizeof.o
> > > 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
Looks like it worked.

What do you have for:

echo $TMPDIR
ls -l libconf1.a
/usr/bin/ar t libconf1.a
TMPDIR=$PWD /usr/bin/ar t libconf1.a

Satish


On Fri, 26 Oct 2018, avatar wrote:

> As follow. Then, what I should do next?
> 
> 
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr 
> libconf1.a sizeof.o
> Scott-Grad-MacBook-Pro:benchmarks zhihui$
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> How about:
> 
> TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> > Index.c PetscGetCPUTime.c   PetscMemcmp.c   
> > PetscTime.c benchmarkExample.py sizeof.c
> > Index.c.htmlPetscGetCPUTime.c.html  PetscMemcmp.c.html  
> > PetscTime.c.htmldaemon.py   sizeof.o
> > MPI_Wtime.c PetscGetTime.c  PetscMemcpy.c   
> > PetscVecNorm.c  index.html  streams
> > MPI_Wtime.c.htmlPetscGetTime.c.html PetscMemcpy.c.html  
> > PetscVecNorm.c.html libconf1.a
> > PLogEvent.c PetscMalloc.c   PetscMemzero.c  
> > benchmarkAssembly.pymakefile
> > PLogEvent.c.htmlPetscMalloc.c.html  PetscMemzero.c.html 
> > benchmarkBatch.py   makefile.html
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a sizeof.o
> > ar: temporary file: No such file or directory
> > 
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > What about:
> > 
> > 
> > mpicc -c sizeof.c
> > /usr/bin/ar cr libconf1.a sizeof.o
> > 
> > Satish
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > I could not do all things you posted below. I get these:
> > > 
> > > 
> > > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> > > sizeof.o
> > > ar: temporary file: No such file or directory
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> > > sizeof.o
> > > ar: temporary file: No such file or directory
> > > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > > 
> > > 
> > > 
> > > 
> > > 
> > > -- Original --
> > > From:  "Balay, Satish";;
> > > Date:  Oct 26, 2018
> > > To:  "avatar"<648934...@qq.com>; 
> > > Cc:  "petsc-users"; 
> > > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > > --with-arto specify an archiver"
> > > 
> > > 
> > > 
> > > On Fri, 26 Oct 2018, avatar wrote:
> > > 
> > > > Hi Satish,
> > > > 
> > > > 
> > > > Thank you very much for your quick response.
> > > > 
> > > > 
> > > > The log file is as follow:
> > > > 
> > > 
> > > >>
> > > Executing: /usr/bin/ar cr 
> > > /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > > Possible ERROR while running archiver: exit code 256
> > > stderr:
> > > ar: temporary file: No such file or directory
> > > Archiver is not functional
> > > 
> > > <<
> > > This is a strange error.
> > > 
> > > What do you get when you do the following:
> > > 
> > > balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> > > -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > > /tmp/libconf1.a  sizeof.o
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
> > > -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
> > > __.SYMDEF SORTED
> > > sizeof.o
> > > 
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
> > > -rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > > /tmp/libconf2.a /tmp/sizeof.o
> > > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
> > > -rw-r--r--  1 

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
As follows. Then, what should I do next?


Scott-Grad-MacBook-Pro:benchmarks zhihui$ TMPDIR=$PWD /usr/bin/ar cr libconf1.a 
sizeof.o
Scott-Grad-MacBook-Pro:benchmarks zhihui$





-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
--with-arto specify an archiver"



How about:

TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> Index.c   PetscGetCPUTime.c   PetscMemcmp.c   
> PetscTime.c benchmarkExample.py sizeof.c
> Index.c.html  PetscGetCPUTime.c.html  PetscMemcmp.c.html  
> PetscTime.c.htmldaemon.py   sizeof.o
> MPI_Wtime.c   PetscGetTime.c  PetscMemcpy.c   
> PetscVecNorm.c  index.html  streams
> MPI_Wtime.c.html  PetscGetTime.c.html PetscMemcpy.c.html  
> PetscVecNorm.c.html libconf1.a
> PLogEvent.c   PetscMalloc.c   PetscMemzero.c  
> benchmarkAssembly.pymakefile
> PLogEvent.c.html  PetscMalloc.c.html  PetscMemzero.c.html 
> benchmarkBatch.py   makefile.html
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a sizeof.o
> ar: temporary file: No such file or directory
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> What about:
> 
> 
> mpicc -c sizeof.c
> /usr/bin/ar cr libconf1.a sizeof.o
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > I could not do all things you posted below. I get these:
> > 
> > 
> > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> > sizeof.o
> > ar: temporary file: No such file or directory
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> > sizeof.o
> > ar: temporary file: No such file or directory
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > 
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > Hi Satish,
> > > 
> > > 
> > > Thank you very much for your quick response.
> > > 
> > > 
> > > The log file is as follow:
> > > 
> > 
> > >>
> > Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > Possible ERROR while running archiver: exit code 256
> > stderr:
> > ar: temporary file: No such file or directory
> > Archiver is not functional
> > 
> > <<
> > This is a strange error.
> > 
> > What do you get when you do the following:
> > 
> > balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> > -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > /tmp/libconf1.a  sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
> > -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
> > __.SYMDEF SORTED
> > sizeof.o
> > 
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
> > -rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > /tmp/libconf2.a /tmp/sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
> > -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
> > __.SYMDEF SORTED
> > sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 
> > 
> > Satish

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
How about:

TMPDIR=$PWD /usr/bin/ar cr libconf1.a sizeof.o

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
> Index.c   PetscGetCPUTime.c   PetscMemcmp.c   
> PetscTime.c benchmarkExample.py sizeof.c
> Index.c.html  PetscGetCPUTime.c.html  PetscMemcmp.c.html  
> PetscTime.c.htmldaemon.py   sizeof.o
> MPI_Wtime.c   PetscGetTime.c  PetscMemcpy.c   
> PetscVecNorm.c  index.html  streams
> MPI_Wtime.c.html  PetscGetTime.c.html PetscMemcpy.c.html  
> PetscVecNorm.c.html libconf1.a
> PLogEvent.c   PetscMalloc.c   PetscMemzero.c  
> benchmarkAssembly.pymakefile
> PLogEvent.c.html  PetscMalloc.c.html  PetscMemzero.c.html 
> benchmarkBatch.py   makefile.html
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a sizeof.o
> ar: temporary file: No such file or directory
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> What about:
> 
> 
> mpicc -c sizeof.c
> /usr/bin/ar cr libconf1.a sizeof.o
> 
> Satish
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > I could not do all things you posted below. I get these:
> > 
> > 
> > Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> > Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> > -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> > sizeof.o
> > ar: temporary file: No such file or directory
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> > sizeof.o
> > ar: temporary file: No such file or directory
> > Scott-Grad-MacBook-Pro:benchmarks zhihui$
> > 
> > 
> > 
> > 
> > 
> > -- Original --
> > From:  "Balay, Satish";;
> > Date:  Oct 26, 2018
> > To:  "avatar"<648934...@qq.com>; 
> > Cc:  "petsc-users"; 
> > Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> > --with-arto specify an archiver"
> > 
> > 
> > 
> > On Fri, 26 Oct 2018, avatar wrote:
> > 
> > > Hi Satish,
> > > 
> > > 
> > > Thank you very much for your quick response.
> > > 
> > > 
> > > The log file is as follow:
> > > 
> > 
> > >>
> > Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> > /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> > Possible ERROR while running archiver: exit code 256
> > stderr:
> > ar: temporary file: No such file or directory
> > Archiver is not functional
> > 
> > <<
> > This is a strange error.
> > 
> > What do you get when you do the following:
> > 
> > balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> > -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > /tmp/libconf1.a  sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
> > -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
> > __.SYMDEF SORTED
> > sizeof.o
> > 
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
> > -rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr 
> > /tmp/libconf2.a /tmp/sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
> > -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
> > __.SYMDEF SORTED
> > sizeof.o
> > balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 
> > 
> > Satish



Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls
Index.c PetscGetCPUTime.c   PetscMemcmp.c   
PetscTime.c benchmarkExample.py sizeof.c
Index.c.htmlPetscGetCPUTime.c.html  PetscMemcmp.c.html  
PetscTime.c.htmldaemon.py   sizeof.o
MPI_Wtime.c PetscGetTime.c  PetscMemcpy.c   
PetscVecNorm.c  index.html  streams
MPI_Wtime.c.htmlPetscGetTime.c.html PetscMemcpy.c.html  
PetscVecNorm.c.html libconf1.a
PLogEvent.c PetscMalloc.c   PetscMemzero.c  
benchmarkAssembly.pymakefile
PLogEvent.c.htmlPetscMalloc.c.html  PetscMemzero.c.html 
benchmarkBatch.py   makefile.html
Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr libconf1.a sizeof.o
ar: temporary file: No such file or directory





-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
--with-arto specify an archiver"



What about:


mpicc -c sizeof.c
/usr/bin/ar cr libconf1.a sizeof.o

Satish

On Fri, 26 Oct 2018, avatar wrote:

> I could not do all things you posted below. I get these:
> 
> 
> Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> sizeof.o
> ar: temporary file: No such file or directory
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> sizeof.o
> ar: temporary file: No such file or directory
> Scott-Grad-MacBook-Pro:benchmarks zhihui$
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Hi Satish,
> > 
> > 
> > Thank you very much for your quick response.
> > 
> > 
> > The log file is as follow:
> > 
> 
> >>
> Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> Possible ERROR while running archiver: exit code 256
> stderr:
> ar: temporary file: No such file or directory
> Archiver is not functional
> 
> <<
> This is a strange error.
> 
> What do you get when you do the following:
> 
> balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf1.a 
>  sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
> -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
> __.SYMDEF SORTED
> sizeof.o
> 
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
> -rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf2.a 
> /tmp/sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
> -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
> __.SYMDEF SORTED
> sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 
> 
> Satish

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
What about:


mpicc -c sizeof.c
/usr/bin/ar cr libconf1.a sizeof.o

Satish

On Fri, 26 Oct 2018, avatar wrote:

> I could not do all things you posted below. I get these:
> 
> 
> Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
> Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
> -rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
> sizeof.o
> ar: temporary file: No such file or directory
> Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
> sizeof.o
> ar: temporary file: No such file or directory
> Scott-Grad-MacBook-Pro:benchmarks zhihui$
> 
> 
> 
> 
> 
> -- Original --
> From:  "Balay, Satish";;
> Date:  Oct 26, 2018
> To:  "avatar"<648934...@qq.com>; 
> Cc:  "petsc-users"; 
> Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
> --with-arto specify an archiver"
> 
> 
> 
> On Fri, 26 Oct 2018, avatar wrote:
> 
> > Hi Satish,
> > 
> > 
> > Thank you very much for your quick response.
> > 
> > 
> > The log file is as follow:
> > 
> 
> >>
> Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
> /tmp/petsc-mjVUVK/config.setCompilers/conf1.o
> Possible ERROR while running archiver: exit code 256
> stderr:
> ar: temporary file: No such file or directory
> Archiver is not functional
> 
> <<
> This is a strange error.
> 
> What do you get when you do the following:
> 
> balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
> -rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf1.a 
>  sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
> -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
> __.SYMDEF SORTED
> sizeof.o
> 
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
> -rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf2.a 
> /tmp/sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
> -rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
> __.SYMDEF SORTED
> sizeof.o
> balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 
> 
> Satish



Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread avatar
I could not do all the things you posted below. This is what I get:


Scott-Grad-MacBook-Pro:petsc-3.8.3 zhihui$ cd src
Scott-Grad-MacBook-Pro:src zhihui$ cd benchmarks/
Scott-Grad-MacBook-Pro:benchmarks zhihui$ mpicc -c sizeof.c
Scott-Grad-MacBook-Pro:benchmarks zhihui$ ls -l sizeof.o
-rw-r--r--  1 zhihui  staff  1452 Oct 25 17:00 sizeof.o
Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a 
sizeof.o
ar: temporary file: No such file or directory
Scott-Grad-MacBook-Pro:benchmarks zhihui$ /usr/bin/ar cr /tmp/libconf1.a  
sizeof.o
ar: temporary file: No such file or directory
Scott-Grad-MacBook-Pro:benchmarks zhihui$





-- Original --
From:  "Balay, Satish";;
Date:  Oct 26, 2018
To:  "avatar"<648934...@qq.com>; 
Cc:  "petsc-users"; 
Subject:  Re: [petsc-users] "Could not find a suitable archiver. Use 
--with-arto specify an archiver"



On Fri, 26 Oct 2018, avatar wrote:

> Hi Satish,
> 
> 
> Thank you very much for your quick response.
> 
> 
> The log file is as follow:
> 

>>
Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
/tmp/petsc-mjVUVK/config.setCompilers/conf1.o
Possible ERROR while running archiver: exit code 256
stderr:
ar: temporary file: No such file or directory
Archiver is not functional

<<
This is a strange error.

What do you get when you do the following:

balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
-rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf1.a  
sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
-rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
__.SYMDEF SORTED
sizeof.o

balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
-rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf2.a 
/tmp/sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
-rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
__.SYMDEF SORTED
sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 

Satish

Re: [petsc-users] "Could not find a suitable archiver. Use --with-arto specify an archiver"

2018-10-25 Thread Balay, Satish
On Fri, 26 Oct 2018, avatar wrote:

> Hi Satish,
> 
> 
> Thank you very much for your quick response.
> 
> 
> The log file is as follow:
> 

>>
Executing: /usr/bin/ar cr /tmp/petsc-mjVUVK/config.setCompilers/libconf1.a 
/tmp/petsc-mjVUVK/config.setCompilers/conf1.o
Possible ERROR while running archiver: exit code 256
stderr:
ar: temporary file: No such file or directory
Archiver is not functional

<<
This is a strange error.

What do you get when you do the following:

balay@jpro^~/petsc(maint-3.8) $ cd src/benchmarks/
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ mpicc -c sizeof.c 
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l sizeof.o
-rw-r--r--  1 balay  staff  3036 Oct 25 17:54 sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf1.a  
sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf1.a
-rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf1.a
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf1.a
__.SYMDEF SORTED
sizeof.o

balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ cp sizeof.o /tmp/
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/sizeof.o 
-rw-r--r--  1 balay  wheel  3036 Oct 25 17:55 /tmp/sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ /usr/bin/ar cr /tmp/libconf2.a 
/tmp/sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ls -l /tmp/libconf2.a
-rw-r--r--  1 balay  wheel  3224 Oct 25 17:55 /tmp/libconf2.a
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ ar t /tmp/libconf2.a
__.SYMDEF SORTED
sizeof.o
balay@jpro^~/petsc/src/benchmarks(maint-3.8) $ 

Satish


Re: [petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread Balay, Satish
It's best to stick with one list - so I will follow up on petsc-maint.

Satish

On Fri, 26 Oct 2018, avatar wrote:

> Hi,
> 
> When I try to configure PETSc to connect Matlab with the following command
> ./petsc-3.8.3/configure --with-matlab
> , it prompts
> ===
>  Configuring PETSc to compile on your system
> ===
> TESTING: checkArchiver from 
> config.setCompilers(config/BuildSystem/config/setCompilers.py:1184)   
>
> ***
>  UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
> details):
> ---
> Could not find a suitable archiver.  Use --with-ar to specify an archiver.
> ***
> 
> 
> 
> I am a newbie on Petsc. Could you please tell me how I should specify an 
> archiver?  I am on MacOX system. I have tried many ways but it still does not 
> work.
> 
> 
> Thank you very much for your help.
> 
> 
> Best,
> Joe



[petsc-users] "Could not find a suitable archiver. Use --with-ar to specify an archiver"

2018-10-25 Thread avatar
Hi,

When I try to configure PETSc to connect to Matlab with the following command
./petsc-3.8.3/configure --with-matlab
it prompts:
===
 Configuring PETSc to compile on your system
===
TESTING: checkArchiver from 
config.setCompilers(config/BuildSystem/config/setCompilers.py:1184) 
 
***
 UNABLE to CONFIGURE with GIVEN OPTIONS(see configure.log for 
details):
---
Could not find a suitable archiver.  Use --with-ar to specify an archiver.
***



I am a newbie to PETSc. Could you please tell me how I should specify an 
archiver? I am on a macOS system. I have tried many ways, but it still does not 
work.


Thank you very much for your help.


Best,
Joe
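
For reference, a minimal sketch of specifying the archiver explicitly (the paths are illustrative, and whether this alone is sufficient depends on why the default check failed):

./petsc-3.8.3/configure --with-matlab --with-ar=/usr/bin/ar --with-ranlib=/usr/bin/ranlib

--with-ar and --with-ranlib simply tell configure which archiver and ranlib to test instead of searching for them.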

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Maximilian Hartig
>> Ahh, thanks. I was missing the option " -dm_plex_gmsh_periodic “. But using 
>> this option I now generate a segmentation fault error when calling VecView() 
>> on the solution vector with vtk and hdf5 viewers. Any suggestions?
>>  

> Small example? VTK is deprecated. HDF5 should work, although it will require 
> you to have proper coordinates I think. We have to
> think about what you mean. If its for a checkpoint, there is no problem, but 
> for viz, those programs do not understand periodicity. Thus I embed it in a 
> higher dimensional space.
> 
>Matt

Building the small example, I realised that HDF5 wasn’t working at all. I’m 
trying to fix this and then see if I can run VecView with the periodic DMPlex.
I planned to use this for visualisation.
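
A hedged fragment of what that could look like with the HDF5 viewer (names such as u and sol.h5 are illustrative; it assumes PETSc was configured with HDF5 support and that petscviewerhdf5.h is included):

PetscViewer    viewer;
PetscErrorCode ierr;
ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "sol.h5", FILE_MODE_WRITE, &viewer);CHKERRQ(ierr);
ierr = VecView(u, viewer);CHKERRQ(ierr);            /* u is the solution vector */
ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

The same output can also be requested from the command line with -vec_view hdf5:sol.h5.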


> On 25. Oct 2018, at 15:36, Stefano Zampini  wrote:
> 
> Maybe this is a fix
> 
> diff --git a/src/dm/impls/plex/plexvtu.c b/src/dm/impls/plex/plexvtu.c
> index acdea12c2f..1a8bbada6a 100644
> --- a/src/dm/impls/plex/plexvtu.c
> +++ b/src/dm/impls/plex/plexvtu.c
> @@ -465,10 +465,11 @@ PetscErrorCode DMPlexVTKWriteAll_VTU(DM dm,PetscViewer 
> viewer)
>  if ((closure[v] >= vStart) && (closure[v] < vEnd)) {
>PetscScalar *xpoint;
>  
> -  ierr = DMPlexPointLocalRead(dm,v,x,);CHKERRQ(ierr);
> +  ierr = 
> DMPlexPointLocalRead(dm,closure[v],x,);CHKERRQ(ierr);
>y[cnt + off++] = xpoint[i];
>  }
>}
> +  cnt += off;
>ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE, 
> , );CHKERRQ(ierr);
>  }
>}
> 
> Max, does this fix your problem? If you confirm, I'll fix this in the maint 
> branch

It did indeed fix the issue! Thank you. I’ve had problems with hdf5 before and 
switched to vtk. It’s good news I can continue to use it.
> 
> If I run the below command line with the patch and with snes tutorials ex12 I 
> get the nice picture attached
> $ ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet 
> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f 
> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh 
> -dm_plex_gmsh_periodic -x_periodicity periodic -y_periodicity periodic 
> -dm_refine 4
> 
> Il giorno gio 25 ott 2018 alle ore 15:11 Stefano Zampini 
> mailto:stefano.zamp...@gmail.com>> ha scritto:
> Matt,
> 
> you can reproduce it via
> 
> $ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet 
> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f 
> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh 
> -dm_plex_gmsh_periodic
> 
> Long time ago I added support for viewing meshes with periodic vertices in 
> the VTK_VTU viewer, but I did not fix the part that writes fields
> 
> 
> Il giorno mer 24 ott 2018 alle ore 21:04 Matthew Knepley  > ha scritto:
> On Wed, Oct 24, 2018 at 11:36 AM Maximilian Hartig  > wrote:
> 
> 
>> On 24. Oct 2018, at 12:49, Matthew Knepley > > wrote:
>> 
>> On Wed, Oct 24, 2018 at 6:29 AM Lawrence Mitchell > > wrote:
>> Hi Max,
>> 
>> (I'm cc'ing in the petsc-users mailing list which may have more advice, if 
>> you are using PETSc you should definitely subscribe!
>> 
>> > On 24 Oct 2018, at 09:27, Maximilian Hartig > > > wrote:
>> > 
>> > Hello Lawrence,
>> > 
>> > sorry to message you out of the blue. My name is Max and I found your post 
>> > on GitHub (https://github.com/firedrakeproject/firedrake/issues/1246 
>> >  ) on DMPlex 
>> > being able to read periodic gmsh files. I am currently trying to do just 
>> > that (creating a periodic DMPlex mesh with gmsh) in the context of my PhD 
>> > work. So far I haven’ t found any documentation on the periodic BC’s with 
>> > DMPlex and gmsh in the official petsc documentation. 
>> > I was wondering whether you’d be so kind as to point me in a general 
>> > direction concerning how to achieve this. You seem experienced in using 
>> > petsc and I would greatly appreciate your help. 
>> 
>> 
>> I think the answer is "it depends". If you're just using DMPlex directly and 
>> all the of the functionality with PetscDS, then I /think/ that reading 
>> periodic meshes via gmsh (assuming you're using the appropriate gmsh mesh 
>> format [v2]) "just works".
>> 
>> There are two phases here: topological and geometric. DMPlex represents the 
>> periodic topological entity directly. For example,  a circle is just a 
>> segment with one end hooked to the other. Vertices are not duplicated, or 
>> mapped to each other. This makes topology simple and easy to implement. 
>> However, then geometry is more complicated. What Plex does is allow 
>> coordinates to be represented by a discontinuous field taking values on 
>> cells, in addition to vertices. In our circle example, each 

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stephen Wornom
Did you forget to attach "the nice picture attached"?
Stephen 

> From: "Stefano Zampini" 
> To: "petsc-maint" 
> Cc: "imilian hartig" , "PETSc users list"
> 
> Sent: Thursday, October 25, 2018 4:47:43 PM
> Subject: Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

> Opened the PR [
> https://bitbucket.org/petsc/petsc/pull-requests/1203/fix-dump-vtk-field-with-periodic-meshes/diff
> |
> https://bitbucket.org/petsc/petsc/pull-requests/1203/fix-dump-vtk-field-with-periodic-meshes/diff
> ]

> Il giorno gio 25 ott 2018 alle ore 17:44 Matthew Knepley < [
> mailto:knep...@gmail.com | knep...@gmail.com ] > ha scritto:

>> Good catch Stefano.
>> Matt

>> On Thu, Oct 25, 2018 at 9:36 AM Stefano Zampini < [
>> mailto:stefano.zamp...@gmail.com | stefano.zamp...@gmail.com ] > wrote:

>>> Maybe this is a fix
>>> diff --git a/src/dm/impls/plex/plexvtu.c b/src/dm/impls/plex/plexvtu.c
>>> index acdea12c2f..1a8bbada6a 100644
>>> --- a/src/dm/impls/plex/plexvtu.c
>>> +++ b/src/dm/impls/plex/plexvtu.c
>>> @@ -465,10 +465,11 @@ PetscErrorCode DMPlexVTKWriteAll_VTU(DM dm,PetscViewer
>>> viewer)
>>> if ((closure[v] >= vStart) && (closure[v] < vEnd)) {
>>> PetscScalar *xpoint;
>>> - ierr = DMPlexPointLocalRead(dm,v,x,);CHKERRQ(ierr);
>>> + ierr = DMPlexPointLocalRead(dm,closure[v],x,);CHKERRQ(ierr);
>>> y[cnt + off++] = xpoint[i];
>>> }
>>> }
>>> + cnt += off;
>>> ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE, ,
>>> );CHKERRQ(ierr);
>>> }
>>> }

>>> Max, does this fix your problem? If you confirm, I'll fix this in the maint
>>> branch

>>> If I run the below command line with the patch and with snes tutorials ex12 
>>> I
>>> get the nice picture attached
>>> $ ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
>>> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
>>> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
>>> -dm_plex_gmsh_periodic -x_periodicity periodic -y_periodicity periodic
>>> -dm_refine 4

>>> Il giorno gio 25 ott 2018 alle ore 15:11 Stefano Zampini < [
>>> mailto:stefano.zamp...@gmail.com | stefano.zamp...@gmail.com ] > ha scritto:

 Matt,

 you can reproduce it via
 $ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
 -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
 ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
 -dm_plex_gmsh_periodic

 Long time ago I added support for viewing meshes with periodic vertices in 
 the
 VTK_VTU viewer, but I did not fix the part that writes fields

 Il giorno mer 24 ott 2018 alle ore 21:04 Matthew Knepley < [
 mailto:knep...@gmail.com | knep...@gmail.com ] > ha scritto:

> On Wed, Oct 24, 2018 at 11:36 AM Maximilian Hartig < [
> mailto:imilian.har...@gmail.com | imilian.har...@gmail.com ] > wrote:

>>> On 24. Oct 2018, at 12:49, Matthew Knepley < [ mailto:knep...@gmail.com 
>>> |
>>> knep...@gmail.com ] > wrote:

>>> On Wed, Oct 24, 2018 at 6:29 AM Lawrence Mitchell < [ 
>>> mailto:we...@gmx.li |
>>> we...@gmx.li ] > wrote:

 Hi Max,

 (I'm cc'ing in the petsc-users mailing list which may have more 
 advice, if you
 are using PETSc you should definitely subscribe!

> On 24 Oct 2018, at 09:27, Maximilian Hartig < [ 
> mailto:imilian.har...@gmail.com
 > | imilian.har...@gmail.com ] > wrote:

 > Hello Lawrence,

> sorry to message you out of the blue. My name is Max and I found your 
> post on
> GitHub ( [ https://github.com/firedrakeproject/firedrake/issues/1246 |
> https://github.com/firedrakeproject/firedrake/issues/1246 ] ) on 
> DMPlex being
> able to read periodic gmsh files. I am currently trying to do just 
> that
> (creating a periodic DMPlex mesh with gmsh) in the context of my PhD 
> work. So
> far I haven’ t found any documentation on the periodic BC’s with 
> DMPlex and
 > gmsh in the official petsc documentation.
> I was wondering whether you’d be so kind as to point me in a general 
> direction
> concerning how to achieve this. You seem experienced in using petsc 
> and I would
 > greatly appreciate your help.

 I think the answer is "it depends". If you're just using DMPlex 
 directly and all
 the of the functionality with PetscDS, then I /think/ that reading 
 periodic
 meshes via gmsh (assuming you're using the appropriate gmsh mesh 
 format [v2])
 "just works".

>>> There are two phases here: topological and geometric. DMPlex represents 
>>> the
>>> periodic topological entity directly. For example, a circle is just a 
>>> segment
>>> with one end hooked to the other. Vertices are not duplicated, or 
>>> mapped to
>>> each other. This makes topology simple and easy to implement. However, 
>>> then
>>> geometry 

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stefano Zampini
Opened the PR
https://bitbucket.org/petsc/petsc/pull-requests/1203/fix-dump-vtk-field-with-periodic-meshes/diff

Il giorno gio 25 ott 2018 alle ore 17:44 Matthew Knepley 
ha scritto:

> Good catch Stefano.
>
>   Matt
>
> On Thu, Oct 25, 2018 at 9:36 AM Stefano Zampini 
> wrote:
>
>> Maybe this is a fix
>>
>> diff --git a/src/dm/impls/plex/plexvtu.c b/src/dm/impls/plex/plexvtu.c
>> index acdea12c2f..1a8bbada6a 100644
>> --- a/src/dm/impls/plex/plexvtu.c
>> +++ b/src/dm/impls/plex/plexvtu.c
>> @@ -465,10 +465,11 @@ PetscErrorCode DMPlexVTKWriteAll_VTU(DM
>> dm,PetscViewer viewer)
>>  if ((closure[v] >= vStart) && (closure[v] < vEnd)) {
>>PetscScalar *xpoint;
>>
>> -  ierr =
>> DMPlexPointLocalRead(dm,v,x,);CHKERRQ(ierr);
>> +  ierr =
>> DMPlexPointLocalRead(dm,closure[v],x,);CHKERRQ(ierr);
>>y[cnt + off++] = xpoint[i];
>>  }
>>}
>> +  cnt += off;
>>ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE,
>> , );CHKERRQ(ierr);
>>  }
>>}
>>
>> Max, does this fix your problem? If you confirm, I'll fix this in the
>> maint branch
>>
>> If I run the below command line with the patch and with snes tutorials
>> ex12 I get the nice picture attached
>> $ ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
>> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
>> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
>> -dm_plex_gmsh_periodic -x_periodicity periodic -y_periodicity periodic
>> -dm_refine 4
>>
>> Il giorno gio 25 ott 2018 alle ore 15:11 Stefano Zampini <
>> stefano.zamp...@gmail.com> ha scritto:
>>
>>> Matt,
>>>
>>> you can reproduce it via
>>>
>>> $ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type
>>> dirichlet -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
>>> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
>>> -dm_plex_gmsh_periodic
>>>
>>> Long time ago I added support for viewing meshes with periodic vertices
>>> in the VTK_VTU viewer, but I did not fix the part that writes fields
>>>
>>>
>>> Il giorno mer 24 ott 2018 alle ore 21:04 Matthew Knepley <
>>> knep...@gmail.com> ha scritto:
>>>
 On Wed, Oct 24, 2018 at 11:36 AM Maximilian Hartig <
 imilian.har...@gmail.com> wrote:

>
>
> On 24. Oct 2018, at 12:49, Matthew Knepley  wrote:
>
> On Wed, Oct 24, 2018 at 6:29 AM Lawrence Mitchell 
> wrote:
>
>> Hi Max,
>>
>> (I'm cc'ing in the petsc-users mailing list which may have more
>> advice, if you are using PETSc you should definitely subscribe!
>>
>> > On 24 Oct 2018, at 09:27, Maximilian Hartig <
>> imilian.har...@gmail.com> wrote:
>> >
>> > Hello Lawrence,
>> >
>> > sorry to message you out of the blue. My name is Max and I found
>> your post on GitHub (
>> https://github.com/firedrakeproject/firedrake/issues/1246 ) on
>> DMPlex being able to read periodic gmsh files. I am currently trying to 
>> do
>> just that (creating a periodic DMPlex mesh with gmsh) in the context of 
>> my
>> PhD work. So far I haven’ t found any documentation on the periodic BC’s
>> with DMPlex and gmsh in the official petsc documentation.
>> > I was wondering whether you’d be so kind as to point me in a
>> general direction concerning how to achieve this. You seem experienced in
>> using petsc and I would greatly appreciate your help.
>>
>>
>> I think the answer is "it depends". If you're just using DMPlex
>> directly and all the of the functionality with PetscDS, then I /think/ 
>> that
>> reading periodic meshes via gmsh (assuming you're using the appropriate
>> gmsh mesh format [v2]) "just works".
>>
>
> There are two phases here: topological and geometric. DMPlex
> represents the periodic topological entity directly. For example,  a 
> circle
> is just a segment with one end hooked to the other. Vertices are not
> duplicated, or mapped to each other. This makes topology simple and easy 
> to
> implement. However, then geometry is more complicated. What Plex does is
> allow coordinates to be represented by a discontinuous field taking values
> on cells, in addition to vertices. In our circle example, each cells near
> the cut will have 2 coordinates, one for each vertex, but they will not
> agree across the cut. If you define a periodic domain, then Plex can
> construct this coordinate field automatically using DMPlexLocalize(). 
> These
> DG coordinates are then used by the integration routines.
>
>
> Ok, I think I understand the concept. DMPlex reads information about
> both topology and coordinates from the .msh file. Creating a periodic mesh
> in gmsh then should allow DMPlex to identify the periodic boundaries as 
> the
> “cut” and build the mesh 

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Matthew Knepley
Good catch Stefano.

  Matt

On Thu, Oct 25, 2018 at 9:36 AM Stefano Zampini 
wrote:

> Maybe this is a fix
>
> diff --git a/src/dm/impls/plex/plexvtu.c b/src/dm/impls/plex/plexvtu.c
> index acdea12c2f..1a8bbada6a 100644
> --- a/src/dm/impls/plex/plexvtu.c
> +++ b/src/dm/impls/plex/plexvtu.c
> @@ -465,10 +465,11 @@ PetscErrorCode DMPlexVTKWriteAll_VTU(DM
> dm,PetscViewer viewer)
>  if ((closure[v] >= vStart) && (closure[v] < vEnd)) {
>PetscScalar *xpoint;
>
> -  ierr =
> DMPlexPointLocalRead(dm,v,x,);CHKERRQ(ierr);
> +  ierr =
> DMPlexPointLocalRead(dm,closure[v],x,);CHKERRQ(ierr);
>y[cnt + off++] = xpoint[i];
>  }
>}
> +  cnt += off;
>ierr = DMPlexRestoreTransitiveClosure(dm, c, PETSC_TRUE,
> , );CHKERRQ(ierr);
>  }
>}
>
> Max, does this fix your problem? If you confirm, I'll fix this in the
> maint branch
>
> If I run the below command line with the patch and with snes tutorials
> ex12 I get the nice picture attached
> $ ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
> -dm_plex_gmsh_periodic -x_periodicity periodic -y_periodicity periodic
> -dm_refine 4
>
> Il giorno gio 25 ott 2018 alle ore 15:11 Stefano Zampini <
> stefano.zamp...@gmail.com> ha scritto:
>
>> Matt,
>>
>> you can reproduce it via
>>
>> $ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
>> -petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
>> ${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
>> -dm_plex_gmsh_periodic
>>
>> Long time ago I added support for viewing meshes with periodic vertices
>> in the VTK_VTU viewer, but I did not fix the part that writes fields
>>
>>
>> Il giorno mer 24 ott 2018 alle ore 21:04 Matthew Knepley <
>> knep...@gmail.com> ha scritto:
>>
>>> On Wed, Oct 24, 2018 at 11:36 AM Maximilian Hartig <
>>> imilian.har...@gmail.com> wrote:
>>>


 On 24. Oct 2018, at 12:49, Matthew Knepley  wrote:

 On Wed, Oct 24, 2018 at 6:29 AM Lawrence Mitchell  wrote:

> Hi Max,
>
> (I'm cc'ing in the petsc-users mailing list which may have more
> advice, if you are using PETSc you should definitely subscribe!
>
> > On 24 Oct 2018, at 09:27, Maximilian Hartig <
> imilian.har...@gmail.com> wrote:
> >
> > Hello Lawrence,
> >
> > sorry to message you out of the blue. My name is Max and I found
> your post on GitHub (
> https://github.com/firedrakeproject/firedrake/issues/1246 ) on DMPlex
> being able to read periodic gmsh files. I am currently trying to do just
> that (creating a periodic DMPlex mesh with gmsh) in the context of my PhD
> work. So far I haven’ t found any documentation on the periodic BC’s with
> DMPlex and gmsh in the official petsc documentation.
> > I was wondering whether you’d be so kind as to point me in a general
> direction concerning how to achieve this. You seem experienced in using
> petsc and I would greatly appreciate your help.
>
>
> I think the answer is "it depends". If you're just using DMPlex
> directly and all the of the functionality with PetscDS, then I /think/ 
> that
> reading periodic meshes via gmsh (assuming you're using the appropriate
> gmsh mesh format [v2]) "just works".
>

 There are two phases here: topological and geometric. DMPlex represents
 the periodic topological entity directly. For example,  a circle is just a
 segment with one end hooked to the other. Vertices are not duplicated, or
 mapped to each other. This makes topology simple and easy to implement.
 However, then geometry is more complicated. What Plex does is allow
 coordinates to be represented by a discontinuous field taking values on
 cells, in addition to vertices. In our circle example, each cells near the
 cut will have 2 coordinates, one for each vertex, but they will not agree
 across the cut. If you define a periodic domain, then Plex can construct
 this coordinate field automatically using DMPlexLocalize(). These DG
 coordinates are then used by the integration routines.


 Ok, I think I understand the concept. DMPlex reads information about
 both topology and coordinates from the .msh file. Creating a periodic mesh
 in gmsh then should allow DMPlex to identify the periodic boundaries as the
 “cut” and build the mesh topology accordingly. Coordinate information is
 handled separately.
 That means, as Lawrence suggested, building periodic meshes in gmsh and
 reading them in to petsc’s DMPlex should indeed “just work”.  (From the
 user perspective). The only extra step is to call DMLocalizeCoordinates()
 after DMPlexReadFromFile(). Sorry to 
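
A minimal sketch of that workflow (a summary rather than code from this thread; the mesh filename is illustrative, and on newer PETSc versions the explicit localization call may not be needed):

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* Run with -dm_plex_gmsh_periodic so the periodicity stored in the Gmsh v2 file is used */
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "square_periodic.msh", PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = DMLocalizeCoordinates(dm);CHKERRQ(ierr);   /* build the cell-wise (DG) coordinates across the periodic cut */
  ierr = DMView(dm, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  return PetscFinalize();
}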

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Stefano Zampini
> I actually use hybridization and I was reading the preprint "Algebraic
> Hybridization and Static Condensation with Application to Scalable H(div)
> Preconditioning" by Dobrev et al. ( https://arxiv.org/abs/1801.08914 )
> and they show that multigrid is optimal for the grad-div problem
> discretized with H(div) conforming FEMs when hybridized. That is actually
> why I think that BDDC also would be optimal. I will look into ngsolve to
> see if I can have such a domain decomposition. Maybe I can do it manually
> just as proof of concept.
>
>
If you are using hybridization, you can use PCGAMG (i.e. -pc_type gamg)
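
For the hybridized system (which is symmetric positive definite), a typical option set might look like this (illustrative, not taken from this thread):

-ksp_type cg -pc_type gamg -ksp_rtol 1e-8 -ksp_monitor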


> I am using GMRES. I was wondering if the application of BDDC is a linear
> operator, if it is not maybe I should use FGMRES. But I could not find any
> comments about that.
>
>
BDDC is linear. The problem is that when you disassemble an already
assembled matrix, the operator of the preconditioner is not guaranteed to
stay positive definite for positive definite assembled problems.
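
For reference, a hedged fragment of what this automatic route looks like in code (it assumes a fully assembled AIJ Mat A and Vecs b, x already exist; names are illustrative and error checking is abbreviated):

Mat            Ais;
KSP            ksp;
PC             pc;
PetscErrorCode ierr;

/* Automatic disassembling of the assembled operator; its quality can be tuned with
   -mat_is_disassemble_l2g_type nd -mat_partitioning_type parmetis */
ierr = MatConvert(A, MATIS, MAT_INITIAL_MATRIX, &Ais);CHKERRQ(ierr);
ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp, Ais, Ais);CHKERRQ(ierr);
ierr = KSPSetType(ksp, KSPGMRES);CHKERRQ(ierr);
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCBDDC);CHKERRQ(ierr);
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);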



> I will recompile PETSc with ParMETIS and try your suggestions. Thank you!
> I will update you soon.
>
> Best wishes,
> Abdullah Ali Sivas
>
> On Thu, 25 Oct 2018 at 09:53, Stefano Zampini 
> wrote:
>
>> How many processes (subdomains) are you using?
>> I would not say the number of iterations is bad, and it also seems to
>> plateau.
>> The grad-div problem is quite hard to be solved (unless you use
>> hybridization), you can really benefit from the "Neumann" assembly.
>> I believe you are using GMRES, as the preconditioned operator (i.e
>> M_BDDC^-1 A) is not guaranteed to be positive definite when you use the
>> automatic disassembling.
>> You may slightly improve the quality of the disassembling by using
>> -mat_is_disassemble_l2g_type nd -mat_partitioning_type parmetis if you have
>> PETSc compiled with ParMETIS support.
>>
>>
>> Il giorno mer 24 ott 2018 alle ore 20:59 Abdullah Ali Sivas <
>> abdullahasi...@gmail.com> ha scritto:
>>
>>> Hi Stefano,
>>>
>>> I am trying to solve the div-div problem (or grad-div problem in strong
>>> form) with a H(div)-conforming FEM. I am getting the matrices from an
>>> external source (to be clear, from an ngsolve script) and I am not sure if
>>> it is possible to get a MATIS matrix out of that. So I am just treating it
>>> as if I am not able to access the assembly code. The results are 2, 31, 26,
>>> 27, 31 iterations, respectively, for matrix sizes 282, 1095, 4314, 17133,
>>> 67242, 267549. However, norm of the residual also grows significantly;
>>> 7.38369e-09 for 1095 and 5.63828e-07 for 267549. I can try larger sizes, or
>>> maybe this is expected for this case.
>>>
>>> As a side question, if we are dividing the domain into number of MPI
>>> processes subdomains, does it mean that convergence is affected negatively
>>> by the increasing number of processes? I know that alternating Schwarz
>>> method and some other domain decomposition methods sometimes suffer from
>>> the decreasing radius of the subdomains. It sounds like BDDC is pretty
>>> similar to those by your description.
>>>
>>> Best wishes,
>>> Abdullah Ali Sivas
>>>
>>> On Wed, 24 Oct 2018 at 05:28, Stefano Zampini 
>>> wrote:
>>>
 Abdullah,

 The "Neumann" problems Jed is referring to result from assembling your
 problem on each subdomain ( = MPI process) separately.
 Assuming you are using FEM, these problems have been historically
 named "Neumann" as they correspond to a problem with natural boundary
 conditions (Neumann bc for Poisson).
 Note that in PETSc the subdomain decomposition is associated with the
 mesh decomposition.

 When converting from an assembled AIJ matrix to a MATIS format, such
 "Neumann" information is lost.
 You can disassemble an AIJ matrix, in the sense that you can find local
 matrices A_j such that A = \sum_j R^T_j A_j R_j (as it is done in ex72.c),
 but you cannot guarantee (unless if you solve an optimization problem) that
 the disassembling will produce subdomain Neumann problems that are
 consistent with your FEM problem.

 I have added such disassembling code a few months ago, just to have
 another alternative for preconditioning AIJ matrices in PETSc; there are
 few tweaks one can do to improve the quality of the disassembling, but I
 discourage its usage unless you don't have access to the FEM assembly code.

 With that said, what problem are you trying to solve? Are you using
 DMDA or DMPlex? What are the results you obtained with using the automatic
 disassembling?

 Il giorno mer 24 ott 2018 alle ore 08:14 Abdullah Ali Sivas <
 abdullahasi...@gmail.com> ha scritto:

> Hi Jed,
>
> Thanks for your reply. The assembled matrix I have corresponds to the
> full problem on the full mesh. There are no "Neumann" problems (or any 
> sort
> of domain decomposition) defined in the code generates the matrix. 
> However,
> I think 

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Abdullah Ali Sivas
Right now, one to four. I am just running some tests with small matrices.
Later on, I am planning to do large-scale tests, hopefully with up to 1024
processes. I was worried that the iteration counts might get worse.

I actually use hybridization and I was reading the preprint "Algebraic
Hybridization and Static Condensation with Application to Scalable H(div)
Preconditioning" by Dobrev et al. ( https://arxiv.org/abs/1801.08914 ) and
they show that multigrid is optimal for the grad-div problem discretized
with H(div) conforming FEMs when hybridized. That is actually why I think
that BDDC also would be optimal. I will look into ngsolve to see if I can
have such a domain decomposition. Maybe I can do it manually just as proof
of concept.

I am using GMRES. I was wondering whether the application of BDDC is a linear
operator; if it is not, maybe I should use FGMRES. But I could not find any
comments about that.

I will recompile PETSc with ParMETIS and try your suggestions. Thank you! I
will update you soon.

Best wishes,
Abdullah Ali Sivas

On Thu, 25 Oct 2018 at 09:53, Stefano Zampini 
wrote:

> How many processes (subdomains) are you using?
> I would not say the number of iterations is bad, and it also seems to
> plateau.
> The grad-div problem is quite hard to be solved (unless you use
> hybridization), you can really benefit from the "Neumann" assembly.
> I believe you are using GMRES, as the preconditioned operator (i.e
> M_BDDC^-1 A) is not guaranteed to be positive definite when you use the
> automatic disassembling.
> You may slightly improve the quality of the disassembling by using
> -mat_is_disassemble_l2g_type nd -mat_partitioning_type parmetis if you have
> PETSc compiled with ParMETIS support.
>
>
> On Wed, 24 Oct 2018 at 20:59, Abdullah Ali Sivas <
> abdullahasi...@gmail.com> wrote:
>
>> Hi Stefano,
>>
>> I am trying to solve the div-div problem (or grad-div problem in strong
>> form) with a H(div)-conforming FEM. I am getting the matrices from an
>> external source (to be clear, from an ngsolve script) and I am not sure if
>> it is possible to get a MATIS matrix out of that. So I am just treating it
>> as if I am not able to access the assembly code. The results are 2, 31, 26,
>> 27, 31 iterations, respectively, for matrix sizes 282, 1095, 4314, 17133,
>> 67242, 267549. However, the norm of the residual also grows significantly;
>> 7.38369e-09 for 1095 and 5.63828e-07 for 267549. I can try larger sizes, or
>> maybe this is expected for this case.
>>
>> As a side question, if we are dividing the domain into as many subdomains
>> as MPI processes, does it mean that convergence is affected negatively
>> by the increasing number of processes? I know that the alternating Schwarz
>> method and some other domain decomposition methods sometimes suffer from
>> the decreasing radius of the subdomains. It sounds like BDDC is pretty
>> similar to those by your description.
>>
>> Best wishes,
>> Abdullah Ali Sivas
>>
>> On Wed, 24 Oct 2018 at 05:28, Stefano Zampini 
>> wrote:
>>
>>> Abdullah,
>>>
>>> The "Neumann" problems Jed is referring to result from assembling your
>>> problem on each subdomain ( = MPI process) separately.
>>> Assuming you are using FEM, these problems have been historically  named
>>> "Neumann" as they correspond to a problem with natural boundary conditions
>>> (Neumann bc for Poisson).
>>> Note that in PETSc the subdomain decomposition is associated with the
>>> mesh decomposition.
>>>
>>> When converting from an assembled AIJ matrix to a MATIS format, such
>>> "Neumann" information is lost.
>>> You can disassemble an AIJ matrix, in the sense that you can find local
>>> matrices A_j such that A = \sum_j R^T_j A_j R_j (as it is done in ex72.c),
>>> but you cannot guarantee (unless you solve an optimization problem) that
>>> the disassembling will produce subdomain Neumann problems that are
>>> consistent with your FEM problem.
>>>
>>> I have added such disassembling code a few months ago, just to have
>>> another alternative for preconditioning AIJ matrices in PETSc; there are a
>>> few tweaks one can do to improve the quality of the disassembling, but I
>>> discourage its use unless you have no access to the FEM assembly code.
>>>
>>> With that said, what problem are you trying to solve? Are you using DMDA
>>> or DMPlex? What are the results you obtained using the automatic
>>> disassembling?
>>>
>>> On Wed, 24 Oct 2018 at 08:14, Abdullah Ali Sivas <
>>> abdullahasi...@gmail.com> wrote:
>>>
 Hi Jed,

 Thanks for your reply. The assembled matrix I have corresponds to the
 full problem on the full mesh. There are no "Neumann" problems (or any sort
 of domain decomposition) defined in the code that generates the matrix. However,
 I think assembling the full problem is equivalent to implicitly assembling
 the "Neumann" problems, since the system can be partitioned as:

 [A_{LL} | A_{LI}]  [u_L] [F]
 ---| 
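
Regarding the GMRES/FGMRES question above: with exact subdomain solves the
BDDC preconditioner is a fixed linear operator, so plain GMRES is fine;
FGMRES is only needed if the preconditioner application changes from
iteration to iteration (e.g. inner Krylov solves). A minimal sketch of how
that choice could be made (names are illustrative):

#include <petscksp.h>

/* Sketch: select the Krylov method for a KSP that will use PCBDDC.
   'variable_pc' should be PETSC_TRUE only if the preconditioner application
   is not a fixed linear operator. The KSP is assumed to already have its
   (MATIS) operator attached. */
static PetscErrorCode SetupKrylovForBDDC(KSP ksp, PetscBool variable_pc)
{
  PC             pc;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = KSPSetType(ksp, variable_pc ? KSPFGMRES : KSPGMRES);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  ierr = PCSetType(pc, PCBDDC);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr); /* e.g. -ksp_type fgmres overrides */
  PetscFunctionReturn(0);
}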

Re: [petsc-users] Using BDDC preconditioner for assembled matrices

2018-10-25 Thread Stefano Zampini
How many processes (subdomains) are you using?
I would not say the number of iterations is bad, and it also seems to
plateau.
The grad-div problem is quite hard to solve (unless you use
hybridization); you can really benefit from the "Neumann" assembly.
I believe you are using GMRES, as the preconditioned operator (i.e.
M_BDDC^-1 A) is not guaranteed to be positive definite when you use the
automatic disassembling.
You may slightly improve the quality of the disassembling by using
-mat_is_disassemble_l2g_type nd -mat_partitioning_type parmetis if you have
PETSc compiled with ParMETIS support.


On Wed, 24 Oct 2018 at 20:59, Abdullah Ali Sivas <
abdullahasi...@gmail.com> wrote:

> Hi Stefano,
>
> I am trying to solve the div-div problem (or grad-div problem in strong
> form) with a H(div)-conforming FEM. I am getting the matrices from an
> external source (to be clear, from an ngsolve script) and I am not sure if
> it is possible to get a MATIS matrix out of that. So I am just treating it
> as if I am not able to access the assembly code. The results are 2, 31, 26,
> 27, 31 iterations, respectively, for matrix sizes 282, 1095, 4314, 17133,
> 67242, 267549. However, the norm of the residual also grows significantly;
> 7.38369e-09 for 1095 and 5.63828e-07 for 267549. I can try larger sizes, or
> maybe this is expected for this case.
>
> As a side question, if we are dividing the domain into as many subdomains
> as MPI processes, does it mean that convergence is affected negatively
> by the increasing number of processes? I know that the alternating Schwarz
> method and some other domain decomposition methods sometimes suffer from
> the decreasing radius of the subdomains. It sounds like BDDC is pretty
> similar to those by your description.
>
> Best wishes,
> Abdullah Ali Sivas
>
> On Wed, 24 Oct 2018 at 05:28, Stefano Zampini 
> wrote:
>
>> Abdullah,
>>
>> The "Neumann" problems Jed is referring to result from assembling your
>> problem on each subdomain ( = MPI process) separately.
>> Assuming you are using FEM, these problems have been historically  named
>> "Neumann" as they correspond to a problem with natural boundary conditions
>> (Neumann bc for Poisson).
>> Note that in PETSc the subdomain decomposition is associated with the
>> mesh decomposition.
>>
>> When converting from an assembled AIJ matrix to a MATIS format, such
>> "Neumann" information is lost.
>> You can disassemble an AIJ matrix, in the sense that you can find local
>> matrices A_j such that A = \sum_j R^T_j A_j R_j (as it is done in ex72.c),
>> but you cannot guarantee (unless you solve an optimization problem) that
>> the disassembling will produce subdomain Neumann problems that are
>> consistent with your FEM problem.
>>
>> I have added such disassembling code a few months ago, just to have
>> another alternative for preconditioning AIJ matrices in PETSc; there are a
>> few tweaks one can do to improve the quality of the disassembling, but I
>> discourage its use unless you have no access to the FEM assembly code.
>>
>> With that said, what problem are you trying to solve? Are you using DMDA
>> or DMPlex? What are the results you obtained using the automatic
>> disassembling?
>>
>> On Wed, 24 Oct 2018 at 08:14, Abdullah Ali Sivas <
>> abdullahasi...@gmail.com> wrote:
>>
>>> Hi Jed,
>>>
>>> Thanks for your reply. The assembled matrix I have corresponds to the
>>> full problem on the full mesh. There are no "Neumann" problems (or any sort
>>> of domain decomposition) defined in the code that generates the matrix. However,
>>> I think assembling the full problem is equivalent to implicitly assembling
>>> the "Neumann" problems, since the system can be partitioned as:
>>>
>>> [ A_{LL} | A_{LI} ] [ u_L ]   [ F ]
>>> [--------+--------] [-----] = [---]
>>> [ A_{IL} | A_{II} ] [ u_I ]   [ G ]
>>>
>>> and G should correspond to the Neumann problem. I might be thinking
>>> wrong (or maybe I completely misunderstood the idea), if so please correct
>>> me. But I think that the problem is that I am not explicitly telling PCBDDC
>>> which dofs are interface dofs.
>>>
>>> Regards,
>>> Abdullah Ali Sivas
>>>
>>> On Tue, 23 Oct 2018 at 23:16, Jed Brown  wrote:
>>>
 Did you assemble "Neumann" problems that are compatible with your
 definition of interior/interface degrees of freedom?

 Abdullah Ali Sivas  writes:

 > Dear all,
 >
 > I have a series of linear systems coming from a PDE for which BDDC is an
 > optimal preconditioner. These linear systems are assembled and I read them
 > from a file, then convert into MATIS as required (as in
 > https://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex72.c.html
 > ). I expect each of the systems to converge to the solution in almost the
 > same number of iterations but I don't observe it. I think it is because I do
 > not provide enough information to the preconditioner. I can 
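
On the point about interface dofs: with MATIS the interface is implied by the
local-to-global mapping of the subdomains, so one way to pass that information
explicitly is to build the MATIS from the mapping instead of converting an
assembled AIJ matrix. A rough sketch under that assumption (names and the
'localdofs' data are placeholders; exact signatures should be checked against
the PETSc version in use):

#include <petscmat.h>

/* Sketch: create a MATIS from a user-supplied mapping of local dofs to
   global dofs; dofs listed by more than one process become interface dofs. */
static PetscErrorCode CreateMatISFromMapping(MPI_Comm comm, PetscInt nlocal,
                                             const PetscInt *localdofs,
                                             PetscInt M, Mat *A)
{
  ISLocalToGlobalMapping l2g;
  PetscErrorCode         ierr;

  PetscFunctionBeginUser;
  ierr = ISLocalToGlobalMappingCreate(comm, 1, nlocal, localdofs,
                                      PETSC_COPY_VALUES, &l2g);CHKERRQ(ierr);
  ierr = MatCreateIS(comm, 1, PETSC_DECIDE, PETSC_DECIDE, M, M,
                     l2g, l2g, A);CHKERRQ(ierr);
  /* each process then assembles its own "Neumann" matrix with
     MatSetValuesLocal() followed by MatAssemblyBegin/End */
  ierr = ISLocalToGlobalMappingDestroy(&l2g);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}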

Re: [petsc-users] Periodic BC in petsc DMPlex / firedrake

2018-10-25 Thread Stefano Zampini
Matt,

you can reproduce it via

$ valgrind ./ex12 -quiet -run_type test -interpolate 1 -bc_type dirichlet
-petscspace_degree 1 -vec_view vtk:test.vtu:vtk_vtu -f
${PETSC_DIR}/share/petsc/datafiles/meshes/square_periodic.msh
-dm_plex_gmsh_periodic

A long time ago I added support for viewing meshes with periodic vertices in
the VTK_VTU viewer, but I did not fix the part that writes fields.


On Wed, 24 Oct 2018 at 21:04, Matthew Knepley 
wrote:

> On Wed, Oct 24, 2018 at 11:36 AM Maximilian Hartig <
> imilian.har...@gmail.com> wrote:
>
>>
>>
>> On 24. Oct 2018, at 12:49, Matthew Knepley  wrote:
>>
>> On Wed, Oct 24, 2018 at 6:29 AM Lawrence Mitchell  wrote:
>>
>>> Hi Max,
>>>
>>> (I'm cc'ing in the petsc-users mailing list which may have more advice,
>>> if you are using PETSc you should definitely subscribe!
>>>
>>> > On 24 Oct 2018, at 09:27, Maximilian Hartig 
>>> wrote:
>>> >
>>> > Hello Lawrence,
>>> >
>>> > sorry to message you out of the blue. My name is Max and I found your
>>> post on GitHub (
>>> https://github.com/firedrakeproject/firedrake/issues/1246 ) on DMPlex
>>> being able to read periodic gmsh files. I am currently trying to do just
>>> that (creating a periodic DMPlex mesh with gmsh) in the context of my PhD
>>> work. So far I haven’t found any documentation on the periodic BC’s with
>>> DMPlex and gmsh in the official petsc documentation.
>>> > I was wondering whether you’d be so kind as to point me in a general
>>> direction concerning how to achieve this. You seem experienced in using
>>> petsc and I would greatly appreciate your help.
>>>
>>>
>>> I think the answer is "it depends". If you're just using DMPlex directly
>>> and all of the functionality with PetscDS, then I /think/ that reading
>>> periodic meshes via gmsh (assuming you're using the appropriate gmsh mesh
>>> format [v2]) "just works".
>>>
>>
>> There are two phases here: topological and geometric. DMPlex represents
>> the periodic topological entity directly. For example,  a circle is just a
>> segment with one end hooked to the other. Vertices are not duplicated, or
>> mapped to each other. This makes topology simple and easy to implement.
>> However, then geometry is more complicated. What Plex does is allow
>> coordinates to be represented by a discontinuous field taking values on
>> cells, in addition to vertices. In our circle example, each cell near the
>> cut will have 2 coordinates, one for each vertex, but they will not agree
>> across the cut. If you define a periodic domain, then Plex can construct
>> this coordinate field automatically using DMPlexLocalize(). These DG
>> coordinates are then used by the integration routines.
>>
>>
>> Ok, I think I understand the concept. DMPlex reads information about both
>> topology and coordinates from the .msh file. Creating a periodic mesh in
>> gmsh then should allow DMPlex to identify the periodic boundaries as the
>> “cut” and build the mesh topology accordingly. Coordinate information is
>> handled separately.
>> That means, as Lawrence suggested, building periodic meshes in gmsh and
>> reading them into petsc’s DMPlex should indeed “just work”.  (From the
>> user perspective). The only extra step is to call DMLocalizeCoordinates()
>> after DMPlexReadFromFile(). Sorry to reiterate, I am just trying to make
>> sense of this.
>>
>>
>>
>>> From my side, the issue is to do with mapping that coordinate field into
>>> one that I understand (within Firedrake). You may not have this problem.
>>>
>>
>> Firedrake uses its own coordinate mapping and integration routines, so
>> they must manage the second part independently. I hope to change this
>> slightly soon by making the Firedrake representation a DMField, so that it
>> looks the same to Plex.
>>
>>   Thanks,
>>
>> Matt
>>
>>
>>> Thanks,
>>>
>>> Lawrence
>>
>>
>>
>> --
>> What most experimenters take for granted before they begin their
>> experiments is infinitely more interesting than any results to which their
>> experiments lead.
>> -- Norbert Wiener
>>
>> https://www.cse.buffalo.edu/~knepley/
>>
>>
>> To read periodic meshes from GMSH, you need to use the option
>> -dm_plex_gmsh_periodic and DMPlexCreateFromFile
>>
>>
>> Ahh, thanks. I was missing the option " -dm_plex_gmsh_periodic “. But
>> using this option I now generate a segmentation fault error when calling
>> VecView() on the solution vector with vtk and hdf5 viewers. Any suggestions?
>>
>
>  Small example? VTK is deprecated. HDF5 should work, although it will
> require you to have proper coordinates I think. We have to
> think about what you mean. If it's for a checkpoint, there is no problem,
> but for viz, those programs do not understand periodicity. Thus I embed it
> in a higher dimensional space.
>
>Matt
>
>> See  src/dm/impls/plex/examples/tests/ex1.c. An example runs
>>
>> $ ./ex1 -filename
>> ${PETSC_DIR}/share/petsc/datafiles/meshes/cube_periodic_bin.msh
>> -dm_plex_gmsh_periodic -dm_view ::ascii_info_detail 
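
For completeness, roughly the same steps in code (a minimal sketch assuming a
petsc-3.10-era API; the mesh file name is a placeholder):

#include <petscdmplex.h>

/* Sketch: read a periodic Gmsh mesh and localize coordinates, similar to the
   ex1 run above driven by command-line options. */
int main(int argc, char **argv)
{
  DM             dm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;
  ierr = PetscOptionsSetValue(NULL, "-dm_plex_gmsh_periodic", NULL);CHKERRQ(ierr);
  ierr = DMPlexCreateFromFile(PETSC_COMM_WORLD, "square_periodic.msh",
                              PETSC_TRUE, &dm);CHKERRQ(ierr);
  ierr = DMLocalizeCoordinates(dm);CHKERRQ(ierr); /* DG coordinates across the cut */
  ierr = DMViewFromOptions(dm, NULL, "-dm_view");CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}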

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-25 Thread Ale Foggia
No, the eigenvalue is around -15. I've tried KS and the number of
iterations differs by one when I change the number of MPI processes, which
seems fine to me. So, I'll see if this method is fine for my specific goal
or not, and I'll try to use it. Thanks for the help.

On Wed, 24 Oct 2018 at 15:48, Jose E. Roman ()
wrote:

> Everything seems correct. I don't know, maybe your problem is very
> sensitive? Is the eigenvalue tiny?
> I would still try with Krylov-Schur.
> Jose
>
>
> > On 24 Oct 2018, at 14:59, Ale Foggia  wrote:
> >
> > The functions called to set the solver are (in this order): EPSCreate();
> EPSSetOperators(); EPSSetProblemType(EPS_HEP); EPSSetType(EPSLANCZOS);
> EPSSetWhichEigenpairs(EPS_SMALLEST_REAL); EPSSetFromOptions();
> >
> > The output of -eps_view for each run is:
> > =
> > EPS Object: 960 MPI processes
> >   type: lanczos
> > LOCAL reorthogonalization
> >   problem type: symmetric eigenvalue problem
> >   selected portion of the spectrum: smallest real parts
> >   number of eigenvalues (nev): 1
> >   number of column vectors (ncv): 16
> >   maximum dimension of projected problem (mpd): 16
> >   maximum number of iterations: 291700777
> >   tolerance: 1e-08
> >   convergence test: relative to the eigenvalue
> > BV Object: 960 MPI processes
> >   type: svec
> >   17 columns of global length 2333606220
> >   vector orthogonalization method: modified Gram-Schmidt
> >   orthogonalization refinement: if needed (eta: 0.7071)
> >   block orthogonalization method: GS
> >   doing matmult as a single matrix-matrix product
> >   generating random vectors independent of the number of processes
> > DS Object: 960 MPI processes
> >   type: hep
> >   parallel operation mode: REDUNDANT
> >   solving the problem with: Implicit QR method (_steqr)
> > ST Object: 960 MPI processes
> >   type: shift
> >   shift: 0.
> >   number of matrices: 1
> > =
> > EPS Object: 1024 MPI processes
> >   type: lanczos
> > LOCAL reorthogonalization
> >   problem type: symmetric eigenvalue problem
> >   selected portion of the spectrum: smallest real parts
> >   number of eigenvalues (nev): 1
> >   number of column vectors (ncv): 16
> >   maximum dimension of projected problem (mpd): 16
> >   maximum number of iterations: 291700777
> >   tolerance: 1e-08
> >   convergence test: relative to the eigenvalue
> > BV Object: 1024 MPI processes
> >   type: svec
> >   17 columns of global length 2333606220
> >   vector orthogonalization method: modified Gram-Schmidt
> >   orthogonalization refinement: if needed (eta: 0.7071)
> >   block orthogonalization method: GS
> >   doing matmult as a single matrix-matrix product
> >   generating random vectors independent of the number of processes
> > DS Object: 1024 MPI processes
> >   type: hep
> >   parallel operation mode: REDUNDANT
> >   solving the problem with: Implicit QR method (_steqr)
> > ST Object: 1024 MPI processes
> >   type: shift
> >   shift: 0.
> >   number of matrices: 1
> > =
> >
> > I ran the same configurations again and I got the same result in terms of
> the number of iterations.
> >
> > I also tried the full reorthogonalization (always with the
> -bv_reproducible_random option) but I still get a different number of
> iterations: for 960 procs I get 172 iters, and for 1024 I get 362 iters.
> > The -eps_view output for this case (only for 960 procs, the other one has
> the same information -except the number of processes-) is:
> > =
> > EPS Object: 960 MPI processes
> >   type: lanczos
> > FULL reorthogonalization
> >   problem type: symmetric eigenvalue problem
> >   selected portion of the spectrum: smallest real parts
> >   number of eigenvalues (nev): 1
> >   number of column vectors (ncv): 16
> >   maximum dimension of projected problem (mpd): 16
> >   maximum number of iterations: 291700777
> >   tolerance: 1e-08
> >   convergence test: relative to the eigenvalue
> > BV Object: 960 MPI processes
> >   type: svec
> >   17 columns of global length 2333606220
> >   vector orthogonalization method: classical Gram-Schmidt
> >   orthogonalization refinement: if needed (eta: 0.7071)
> >   block orthogonalization method: GS
> >   doing matmult as a single matrix-matrix product
> >   generating random vectors independent of the number of processes
> > DS Object: 960 MPI processes
> >   type: hep
> >   parallel operation mode: REDUNDANT
> >   solving the problem with: Implicit QR method (_steqr)
> > ST Object: 960 MPI processes
> >   type: shift
> >   shift: 0.
> >   number of matrices: 1
> > =
> >
> > On Wed, 24 Oct 2018 at 10:52, Jose E. Roman ()
> wrote:
> > This is very strange. Make sure you call EPSSetFromOptions in the 
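
For reference, a minimal sketch of the same setup with Krylov-Schur
substituted for Lanczos (the function name, 'H' and the lack of convergence
checks are illustrative; it assumes at least one eigenpair converges):

#include <slepceps.h>

/* Sketch: smallest real eigenvalue of a Hermitian matrix with Krylov-Schur,
   mirroring the call sequence listed earlier in this thread. */
static PetscErrorCode SmallestEigenvalue(Mat H, PetscReal *lambda, PetscInt *its)
{
  EPS            eps;
  PetscScalar    kr, ki;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = EPSCreate(PetscObjectComm((PetscObject)H), &eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps, H, NULL);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps, EPS_HEP);CHKERRQ(ierr);
  ierr = EPSSetType(eps, EPSKRYLOVSCHUR);CHKERRQ(ierr); /* instead of EPSLANCZOS */
  ierr = EPSSetWhichEigenpairs(eps, EPS_SMALLEST_REAL);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);
  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSGetIterationNumber(eps, its);CHKERRQ(ierr);
  ierr = EPSGetEigenvalue(eps, 0, &kr, &ki);CHKERRQ(ierr);
  *lambda = PetscRealPart(kr);
  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}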

Re: [petsc-users] [SLEPc] Number of iterations changes with MPI processes in Lanczos

2018-10-25 Thread Ale Foggia
On Tue, 23 Oct 2018 at 13:53, Matthew Knepley ()
wrote:

> On Tue, Oct 23, 2018 at 6:24 AM Ale Foggia  wrote:
>
>> Hello,
>>
>> I'm currently using Lanczos solver (EPSLANCZOS) to get the smallest real
>> eigenvalue (EPS_SMALLEST_REAL) of a Hermitian problem (EPS_HEP). Those are
>> the only options I set for the solver. My aim is to be able to
>> predict/estimate the time-to-solution. To do so, I was doing a scaling of
>> the code for different sizes of matrices and for different numbers of MPI
>> processes. As I was not observing a good scaling, I checked the number of
>> iterations of the solver (given by EPSGetIterationNumber). I've encountered
>> that for the **same size** of matrix (that is, the same problem), when
>> I change the number of MPI processes, the number of iterations changes, and
>> the behaviour is not monotonic. These are the numbers I've got:
>>
>
> I am sure you know this, but this test is strong scaling and will top out
> when the individual problem sizes become too small (we see this at several
> thousand unknowns).
>

Thanks for pointing this out, we are aware of that and I've been "playing"
around to try to see this behaviour for myself. Now, I think I'll go with
the Krylov-Schur method because it is the only solution to the problem of the
number of iterations. With this I think I'll be able to see the effect of the
individual problem size in the scaling.


>   Thanks,
>
> Matt
>
>
>>
>> # procs   # iters
>> 960       157
>> 992       189
>> 1024      338
>> 1056      190
>> 1120      174
>> 2048      136
>>
>> I've checked the mailing list for a similar situation and I've found
>> another person with the same problem but in another solver ("[SLEPc] GD is
>> not deterministic when using different number of cores", Nov 19 2015), but
>> I think the solution this person finds does not apply to my problem
>> (removing "-eps_harmonic" option).
>>
>> Can you give me any hint on what is the reason for this behaviour? Is
>> there a way to prevent this? It's not possible to estimate/predict any time
>> consumption for bigger problems if the number of iterations varies this
>> much.
>>
>> Ale
>>
>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>