I compiled fftw with Intel MPI, and it built successfully without any
error. After running a job in parallel, the SCF cycle stopped with the
following error:

grep: *scf1*: No such file or directory
cp: cannot stat '.in.tmp': No such file or directory
FERMI - Error
grep: *scf1*: No such file or directory
InBi.scf1_1: No such file or directory.
[1]  + Done                          ( ( $remote $machine[$p] "cd
$PWD;$t $taskset0 $exe ${def}_$loop.def ;fixerror_lapw ${def}_$loop";
rm -f .lock_$lockfile[$p] ) >& .stdout1_$loop; if ( -f .stdout1_$loop
) bashtime2csh.pl_lapw .stdout1_$loop > .temp1_$loop; grep \%
.temp1_$loop >> .time1_$loop; grep -v \% .temp1_$loop | perl -e "print
stderr " )
Host key verification failed.
ssh_askpass: exec(/usr/bin/ssh-askpass): No such file or directory
[1]  + Done                          ( ( $remote $machine[$p] "cd
$PWD;$t $taskset0 $exe ${def}_$loop.def ;fixerror_lapw ${def}_$loop";
rm -f .lock_$lockfile[$p] ) >& .stdout1_$loop; if ( -f .stdout1_$loop
) bashtime2csh.pl_lapw .stdout1_$loop > .temp1_$loop; grep \%
.temp1_$loop >> .time1_$loop; grep -v \% .temp1_$loop | perl -e "print
stderr " )
Host key verification failed.
ssh_askpass: exec(/usr/bin/ssh-askpass): No such file or directory
 LAPW0 END
 LAPW0 END
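
The "Host key verification failed" and ssh_askpass lines above suggest
that passwordless ssh to the hosts listed in .machines is not working,
which WIEN2k's parallel scripts require; lapw1 then never runs, which is
why the *scf1* files are missing. A minimal sketch for setting it up
(the host name "node1" is a placeholder for each entry in .machines):

  # generate a key pair once, with an empty passphrase
  ssh-keygen -t rsa
  # copy the public key to every compute node
  ssh-copy-id node1
  # log in once so the host key gets accepted, then verify
  # that the login works without a password prompt
  ssh node1 hostname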


On Thu, May 30, 2019 at 6:29 AM Gavin Abo <gs...@crimson.ua.edu> wrote:

> As mentioned in a previous post [1], the ompi_mpi symbols in the error
> messages indicate that your fftw3 was compiled with Open MPI instead of
> Intel MPI.
>
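> One way to confirm this (a diagnostic sketch, using the library path
> from your error messages) is to count the Open MPI symbols that the
> archive references:
>
>   # a non-zero count means libfftw3_mpi.a was built against Open MPI
>   nm /home/dps/fftw3/lib/libfftw3_mpi.a | grep -c ' ompi_'
>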
> If you have both Open MPI and Intel MPI on your system, you have to take
> care to use the Intel wrappers with the double ii's (mpiicc, mpiifort).
>
> Perhaps you used mpicc for Open MPI [2] instead of mpiicc for Intel MPI
> [3] when you compiled fftw3 [4].
> [1]
> https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg13360.html
> [2] https://www.open-mpi.org/faq/?category=mpi-apps#general-build
> [3]
> https://software.intel.com/en-us/mpi-developer-guide-linux-compilers-support
> [4]
> https://www.mail-archive.com/wien@zeus.theochem.tuwien.ac.at/msg18664.html
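>
> If so, rebuilding fftw3 against Intel MPI should fix it. A minimal
> sketch (install prefix taken from your paths; adjust as needed), using
> FFTW's documented MPICC configure variable:
>
>   # in the unpacked fftw-3.x source directory
>   ./configure CC=icc F77=ifort MPICC=mpiicc --enable-mpi \
>               --prefix=/home/dps/fftw3
>   make && make install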
>
>
> On 5/29/2019 11:25 AM, Indranil mal wrote:
>
> /home/dps/fftw3/lib/libfftw3_mpi.a(api.o): In function `bogosity_hook':
> api.c:(.text+0x20): undefined reference to `ompi_mpi_comm_null'
> api.c:(.text+0x44): undefined reference to `ompi_mpi_comm_null'
> api.c:(.text+0x73): undefined reference to `ompi_mpi_comm_null'
> /home/dps/fftw3/lib/libfftw3_mpi.a(api.o): In function `nowisdom_hook':
> api.c:(.text+0x9d): undefined reference to `ompi_mpi_comm_null'
> api.c:(.text+0xbc): undefined reference to `ompi_mpi_comm_null'
> /home/dps/fftw3/lib/libfftw3_mpi.a(api.o):api.c:(.text+0xd3): more
> undefined references to `ompi_mpi_comm_null' follow
> /home/dps/fftw3/lib/libfftw3_mpi.a(api.o): In function `wisdom_ok_hook':
> api.c:(.text+0x1cc): undefined reference to `ompi_mpi_unsigned'
> api.c:(.text+0x21f): undefined reference to `ompi_mpi_op_land'
> api.c:(.text+0x226): undefined reference to `ompi_mpi_int'
> api.c:(.text+0x24b): undefined reference to `ompi_mpi_comm_null'
> /home/dps/fftw3/lib/libfftw3_mpi.a(api.o): In function `cost_hook':
> api.c:(.text+0x2c7): undefined reference to `ompi_mpi_comm_null'
> api.c:(.text+0x2dc): undefined reference to `ompi_mpi_comm_null'
> api.c:(.text+0x2ee): undefined reference to `ompi_mpi_op_max'
> api.c:(.text+0x2f5): undefined reference to `ompi_mpi_op_sum'
> api.c:(.text+0x308): undefined reference to `ompi_mpi_double'
> api.c:(.text+0x33b): undefined reference to `ompi_mpi_comm_null'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o): In function
> `fftw_mpi_local_size_many_transposed_f03':
> f03-wrap.c:(.text+0x3a): undefined reference to `MPI_Comm_f2c'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o): In function
> `fftw_mpi_local_size_many_f03':
> f03-wrap.c:(.text+0xa5): undefined reference to `MPI_Comm_f2c'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o): In function
> `fftw_mpi_local_size_transposed_f03':
> f03-wrap.c:(.text+0x103): undefined reference to `MPI_Comm_f2c'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o): In function
> `fftw_mpi_local_size_f03':
> f03-wrap.c:(.text+0x148): undefined reference to `MPI_Comm_f2c'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o): In function
> `fftw_mpi_local_size_many_1d_f03':
> f03-wrap.c:(.text+0x1a2): undefined reference to `MPI_Comm_f2c'
> /home/dps/fftw3/lib/libfftw3_mpi.a(f03-wrap.o):f03-wrap.c:(.text+0x20a):
> more undefined references to `MPI_Comm_f2c' follow
> /home/dps/fftw3/lib/libfftw3_mpi.a(transpose-alltoall.o): In function
> `apply':
> transpose-alltoall.c:(.text+0x83): undefined reference to `ompi_mpi_double'
> transpose-alltoall.c:(.text+0x11f): undefined reference to
> `ompi_mpi_double'
> transpose-alltoall.c:(.text+0x16d): undefined reference to
> `ompi_mpi_double'
> transpose-alltoall.c:(.text+0x19f): undefined reference to
> `ompi_mpi_double'
> /home/dps/fftw3/lib/libfftw3_mpi.a(transpose-pairwise.o): In function
> `transpose_chunks':
> transpose-pairwise.c:(.text+0x53a): undefined reference to
> `ompi_mpi_double'
> /home/dps/fftw3/lib/libfftw3_mpi.a(transpose-pairwise.o):transpose-pairwise.c:(.text+0x54c):
> more undefined references to `ompi_mpi_double' follow
> /home/dps/fftw3/lib/libfftw3_mpi.a(any-true.o): In function
> `fftw_mpi_any_true':
> any-true.c:(.text+0xa): undefined reference to `ompi_mpi_op_lor'
> any-true.c:(.text+0x1f): undefined reference to `ompi_mpi_int'
> /home/dps/fftw3/lib/libfftw3_mpi.a(wisdom-api.o): In function
> `fftw_mpi_gather_wisdom':
> wisdom-api.c:(.text+0x92): undefined reference to `ompi_mpi_unsigned_long'
> wisdom-api.c:(.text+0xcd): undefined reference to `ompi_mpi_char'
> wisdom-api.c:(.text+0x12d): undefined reference to `ompi_mpi_unsigned_long'
> wisdom-api.c:(.text+0x158): undefined reference to `ompi_mpi_char'
> /home/dps/fftw3/lib/libfftw3_mpi.a(wisdom-api.o): In function
> `fftw_mpi_broadcast_wisdom':
> wisdom-api.c:(.text+0x222): undefined reference to `ompi_mpi_unsigned_long'
> wisdom-api.c:(.text+0x24d): undefined reference to `ompi_mpi_char'
> wisdom-api.c:(.text+0x2b5): undefined reference to `ompi_mpi_unsigned_long'
> wisdom-api.c:(.text+0x2da): undefined reference to `ompi_mpi_char'
> Makefile:117: recipe for target 'nlvdw_mpi' failed
> make[1]: *** [nlvdw_mpi] Error 1
> make[1]: Leaving directory '/home/dps/WIEN2K/SRC_nlvdw'
> Makefile:108: recipe for target 'para' failed
> make: *** [para] Error 2
>
> On Wed, May 29, 2019 at 5:34 PM Gavin Abo <gs...@crimson.ua.edu> wrote:
>
>> Look inside those compile.msg files as they likely contain messages
>> showing why they failed to compile.
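>> For example, a quick way to pull out the relevant lines (a sketch; the
>> exact messages will differ on your system):
>>
>>   grep -iB2 error SRC_lapw0/compile.msg SRC_nlvdw/compile.msg
>>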
>> On 5/28/2019 11:46 AM, Indranil mal wrote:
>>
>> Thank you for your kind response.
>> After following all the instructions you gave, I installed WIEN2k with
>> the Intel parallel compilers. After compiling I got:
>>
>> Compile time errors (if any) were:
>> SRC_lapw0/compile.msg:make[1]: *** [lapw0_mpi] Error 1
>> SRC_lapw0/compile.msg:make: *** [para] Error 2
>> SRC_nlvdw/compile.msg:make[1]: *** [nlvdw_mpi] Error 1
>>