Re: [OMPI users] Passwordless ssh

2011-12-20 Thread Shaandar Nyamtulga

Can you clarify your answer, please?
I have one master node and several slave nodes. I created an RSA key on my master
node and copied it to all slaves.
The /home/mpiuser directory is shared across all nodes through NFS. The strange thing
is why it asks for a password after I mount a slave and ssh to it.
When I unmount, I can ssh without a password.
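
One thing worth checking (a hedged guess, not from this thread): with the default
StrictModes yes, sshd silently falls back to password authentication when the home
directory, ~/.ssh, or authorized_keys is group- or world-writable, and an NFS mount
can change the ownership and permissions the slave's sshd sees. For example:

$ chmod 755 /home/mpiuser                       # must not be group/world-writable
$ chmod 700 /home/mpiuser/.ssh
$ chmod 600 /home/mpiuser/.ssh/authorized_keys
$ ls -ln /home/mpiuser/.ssh                     # numeric uid should match mpiuser's uid on every node

If the export squashes uids or the uids differ between nodes, the slave's sshd will
not trust the key even though the file contents are correct.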

 



Date: Tue, 20 Dec 2011 10:45:12 +0100
From: mathieu.westp...@obs.ujf-grenoble.fr
To: us...@open-mpi.org
Subject: Re: [OMPI users] Passwordless ssh


Hello

You have to append nodeX's public key to the end of nodeY's authorized_keys.
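
For example, assuming the usual file names from your commands below, either of
these will do it (ssh-copy-id also sets the permissions for you):

$ ssh-copy-id mpiuser@nodeY

or, by hand:

$ cat ~/.ssh/id_dsa.pub | ssh mpiuser@nodeY 'cat >> ~/.ssh/authorized_keys'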


Mathieu
On 20/12/2011 05:03, Shaandar Nyamtulga wrote:



Hi
I built a Beowulf cluster with Open MPI by following this guide:
http://techtinkering.com/2009/12/02/setting-up-a-beowulf-cluster-using-open-mpi-on-linux/
I can ssh to my nodes without a password before mounting my slaves.
But when I mount my slaves and run a program, it asks for passwords again.

$ eval `ssh-agent`

$ ssh-add ~/.ssh/id_dsa

The above does not work. The terminal replies "Could not open a connection
to your authentication agent."
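
That error usually means ssh-add cannot find the agent, i.e. the SSH_AUTH_SOCK and
SSH_AGENT_PID variables were never set because the ssh-agent output was not evaluated
in the same shell. A minimal sketch (bash syntax):

$ eval "$(ssh-agent -s)"
$ ssh-add ~/.ssh/id_dsa
$ ssh-add -l        # should now list the key's fingerprint

Note the agent only matters for passphrase-protected keys; it cannot fix a key that
the slave's sshd rejects, so it may be unrelated to the NFS symptom above.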

Help is needed urgently.

Thank you




 

Re: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 20110811) issues ...

2011-12-20 Thread Jonathan Dursi
For what it's worth, 1.4.4 built with the Intel 12.1.0.233 compilers has been
the default MPI at our centre for over a month, and we haven't had any
problems...

   - jonathan
-- 
Jonathan Dursi; SciNet, Compute/Calcul Canada

-----Original Message-----
From: Richard Walsh 
Sender: users-boun...@open-mpi.org
Date: Tue, 20 Dec 2011 21:14:44 
To: Open MPI Users
Reply-To: Open MPI Users 
Subject: Re: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 20110811) issues ...


All,

I have not heard anything back on the inquiry below, so I take it
that no one has had any issues with Intel's latest compiler release,
or perhaps has not tried it yet.

Thanks,

rbw

Richard Walsh
Parallel Applications and Systems Manager
CUNY HPC Center, Staten Island, NY
W: 718-982-3319
M: 612-382-4620

Right, as the world goes, is only in question between equals in power, while 
the strong do what they can and the weak suffer what they must.  -- Thucydides, 
400 BC


From: users-boun...@open-mpi.org [users-boun...@open-mpi.org] on behalf of 
Richard Walsh [richard.wa...@csi.cuny.edu]
Sent: Friday, December 16, 2011 3:12 PM
To: Open MPI Users
Subject: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 
20110811) issues ...

All,

Working through a stock rebuild of OpenMPI 1.5.4 and 1.4.4 with
the most current compiler suites from both PGI and Intel:

   1.  PGI,  Version 11.10

   2.  Intel,  Version 12.1.0.233 Build 20110811

My 1.5.4 'config.log' header looks like this for Intel:

./configure CC=icc CXX=icpc F77=ifort FC=ifort --with-openib 
--prefix=/share/apps/openmpi-intel/1.5.4 --with-tm=/share/apps/pbs/11.1.0.111761

and this for PGI:

./configure CC=pgcc CXX=pgCC F77=pgf77 FC=pgf90 --with-openib 
--prefix=/share/apps/openmpi-pgi/1.5.4 --with-tm=/share/apps/pbs/11.1.0.111761

This configure line has been used successfully before. Configuration, build,
and install seem to work OK for BOTH compilers; however, my basic test program
compiles ONLY with the PGI-built 'mpicc', whether from the 1.4.4 or the 1.5.4
build.

The Intel 1.4.4 and 1.5.4 'mpicc' wrapper-compilers produce an immediate 
segmentation
fault:

[richard.walsh@bob pbs]$ ./compile_it
./compile_it: line 10: 19163 Segmentation fault  
/share/apps/openmpi-intel/1.5.4/bin/mpicc -o ./hello_mpi.exe hello_mpi.c
[richard.walsh@bob pbs]$
[richard.walsh@bob pbs]$ ./compile_it
./compile_it: line 10: 19515 Segmentation fault  
/share/apps/openmpi-intel/1.4.4/bin/mpicc -o ./hello_mpi.exe hello_mpi.c

This Intel stack is from their most recent ICS release, which came out in
late October, just before SC11:

[richard.walsh@bob pbs]$ icc -V
Intel(R) C Intel(R) 64 Compiler XE for applications running on Intel(R) 64, 
Version 12.1.0.233 Build 20110811
Copyright (C) 1985-2011 Intel Corporation.  All rights reserved.

[richard.walsh@bob pbs]$ ifort -V
Intel(R) Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 
64, Version 12.1.0.233 Build 20110811
Copyright (C) 1985-2011 Intel Corporation.  All rights reserved.

Has anyone else encountered this problem ... ??  Suggestions ... ??

Thanks,

rbw


Richard Walsh
Parallel Applications and Systems Manager
CUNY HPC Center, Staten Island, NY
W: 718-982-3319
M: 612-382-4620

Right, as the world goes, is only in question between equals in power, while 
the strong do what they can and the weak suffer what they must.  -- Thucydides, 
400 BC







Re: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 20110811) issues ...

2011-12-20 Thread Richard Walsh

All,

I have not heard anything back on the inquiry below, so I take it
that no one has had any issues with Intel's latest compiler release,
or perhaps has not tried it yet.

Thanks,

rbw

Richard Walsh
Parallel Applications and Systems Manager
CUNY HPC Center, Staten Island, NY
W: 718-982-3319
M: 612-382-4620

Right, as the world goes, is only in question between equals in power, while 
the strong do what they can and the weak suffer what they must.  -- Thucydides, 
400 BC


From: users-boun...@open-mpi.org [users-boun...@open-mpi.org] on behalf of 
Richard Walsh [richard.wa...@csi.cuny.edu]
Sent: Friday, December 16, 2011 3:12 PM
To: Open MPI Users
Subject: [OMPI users] Latest Intel Compilers (ICS, version 12.1.0.233 Build 
20110811) issues ...

All,

Working through a stock rebuild of OpenMPI 1.5.4 and 1.4.4 with
the most current compiler suites from both PGI and Intel:

   1.  PGI,  Version 11.10

   2.  Intel,  Version 12.1.0.233 Build 20110811

My 1.5.4 'config.log' header looks like this for Intel:

./configure CC=icc CXX=icpc F77=ifort FC=ifort --with-openib 
--prefix=/share/apps/openmpi-intel/1.5.4 --with-tm=/share/apps/pbs/11.1.0.111761

and this for PGI:

./configure CC=pgcc CXX=pgCC F77=pgf77 FC=pgf90 --with-openib 
--prefix=/share/apps/openmpi-pgi/1.5.4 --with-tm=/share/apps/pbs/11.1.0.111761

This configure line has been used successfully before. Configuration, build,
and install seem to work OK for BOTH compilers; however, my basic test program
compiles ONLY with the PGI-built 'mpicc', whether from the 1.4.4 or the 1.5.4
build.

The Intel 1.4.4 and 1.5.4 'mpicc' wrapper-compilers produce an immediate 
segmentation
fault:

[richard.walsh@bob pbs]$ ./compile_it
./compile_it: line 10: 19163 Segmentation fault  
/share/apps/openmpi-intel/1.5.4/bin/mpicc -o ./hello_mpi.exe hello_mpi.c
[richard.walsh@bob pbs]$
[richard.walsh@bob pbs]$ ./compile_it
./compile_it: line 10: 19515 Segmentation fault  
/share/apps/openmpi-intel/1.4.4/bin/mpicc -o ./hello_mpi.exe hello_mpi.c

This Intel stack is from their most recent ICS release, which came out in
late October, just before SC11:

[richard.walsh@bob pbs]$ icc -V
Intel(R) C Intel(R) 64 Compiler XE for applications running on Intel(R) 64, 
Version 12.1.0.233 Build 20110811
Copyright (C) 1985-2011 Intel Corporation.  All rights reserved.

[richard.walsh@bob pbs]$ ifort -V
Intel(R) Fortran Intel(R) 64 Compiler XE for applications running on Intel(R) 
64, Version 12.1.0.233 Build 20110811
Copyright (C) 1985-2011 Intel Corporation.  All rights reserved.

Has anyone else encountered this problem ... ??  Suggestions ... ??
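
A hedged way to narrow this down (standard Open MPI wrapper behaviour, not specific
to this report): the wrapper can print the underlying compiler command instead of
running it, which separates a crash in the wrapper binary itself from a crash in icc:

[richard.walsh@bob pbs]$ /share/apps/openmpi-intel/1.5.4/bin/mpicc --showme
[richard.walsh@bob pbs]$ ldd /share/apps/openmpi-intel/1.5.4/bin/mpicc

If --showme itself segfaults, the icc-built wrapper binary is at fault, and rebuilding
Open MPI at a lower optimization level (e.g. configuring with CFLAGS=-O1) is a common
workaround for a suspected compiler miscompilation. If --showme prints a command line,
run that command directly to see whether icc is what crashes.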

Thanks,

rbw


Richard Walsh
Parallel Applications and Systems Manager
CUNY HPC Center, Staten Island, NY
W: 718-982-3319
M: 612-382-4620

Right, as the world goes, is only in question between equals in power, while 
the strong do what they can and the weak suffer what they must.  -- Thucydides, 
400 BC







[OMPI users] question about OMPI_MPIF90

2011-12-20 Thread Steve Heistand
So from what I've seen all over the interweb, the OMPI_MPIF77/90 environment
variables should be able to override the default compilers when using the
mpif77/mpif90 wrappers. But from what I have seen, the F90 one doesn't seem to.

Using 1.4.3 and 1.4.4 for example:

bridge3:heistand% mpif90 -v
Version 11.1
bridge3:heistand% setenv OMPI_MPIF90 test
bridge3:heistand% mpif90 -v
Version 11.1

bridge3:heistand% setenv OMPI_MPIF90 gfortran
bridge3:heistand% mpif90 -v
Version 11.1

(didn't change what the wrapper used.)
but for F77:

bridge3:heistand% mpif77 -v
Version 11.1

bridge3:heistand% setenv OMPI_MPIF77 gfortran
bridge3:heistand% mpif77 -v
Using built-in specs.
Target: x86_64-suse-linux

this one works just fine.

So I'm confused.

any comments/suggestions?
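
One hedged suggestion, based on the variable names in Open MPI's wrapper
documentation rather than on this thread: the override variable for mpif90 is
OMPI_FC (alongside OMPI_CC, OMPI_CXX, and OMPI_F77), and OMPI_MPIF90 may simply
not be consulted. Something like:

bridge3:heistand% setenv OMPI_FC gfortran
bridge3:heistand% mpif90 -v

should then report gfortran's version instead of 11.1.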

thanks

steve


-- 

 Steve Heistand   NASA Ames Research Center
 SciCon Group Mail Stop 258-6
 steve.heist...@nasa.gov  (650) 604-4369  Moffett Field, CA 94035-1000

 "Any opinions expressed are those of our alien overlords, not my own."





Re: [OMPI users] Passwordless ssh

2011-12-20 Thread Mathieu westphal

Hello

You have to append nodeX's public key to the end of nodeY's authorized_keys.


Mathieu
On 20/12/2011 05:03, Shaandar Nyamtulga wrote:

Hi
I built a Beowulf cluster with Open MPI by following this guide:
http://techtinkering.com/2009/12/02/setting-up-a-beowulf-cluster-using-open-mpi-on-linux/
I can ssh to my nodes without a password before mounting my slaves.
But when I mount my slaves and run a program, it asks for passwords again.
$ eval `ssh-agent`

$ ssh-add ~/.ssh/id_dsa

The above does not work. The terminal replies "Could not open a connection to
your authentication agent."

Help is needed urgently.

Thank you



