Good. Can you share what the issue was so it can help others? Thanks. Regards, Shahab
On Wed, Nov 26, 2014 at 6:41 PM, John Beaulaurier -X (jbeaulau - ADVANCED NETWORK INFORMATION INC at Cisco) <jbeau...@cisco.com> wrote:
> Please disregard. Issue resolved.
>
> -John
>
> *From:* John Beaulaurier -X (jbeaulau - ADVANCED NETWORK INFORMATION INC at Cisco)
> *Sent:* Wednesday, November 26, 2014 9:34 AM
> *To:* user@hadoop.apache.org
> *Subject:* SSH passwordless & Hadoop startup/shutdown scripts
>
> Hello,
>
> I had originally configured our dev cluster with SSH passwordless connectivity to the datanodes, but with a passphrase. I have since regenerated the key with no passphrase, copied the new public key to all datanodes, updated their known_hosts files, and successfully tested passphrase-free SSH from the namenode to all three datanodes. However, when I run stop-dfs.sh or stop-mapred.sh as the user running the current Hadoop processes, the following is returned and the processes are not stopped:
>
> no namenode to stop
> hostname,hostname,hostname,hostname: ssh: hostname,hostname,hostname,hostname: Name or service not known
>
> Can someone suggest where I should be looking for configuration issues?
>
> Thank you
> -John
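
The actual fix was never posted to the thread, but the comma-separated host string in the error suggests the conf/slaves file may have listed all datanodes on a single line, so the stop scripts treated the whole list as one hostname. A minimal sketch of the usual passwordless-SSH setup and slaves-file layout, assuming a Hadoop 1.x-style install; the hostnames datanode1 through datanode3 and the hadoop user are placeholders, not names from the original post:

    # On the namenode: generate an SSH key with no passphrase (OpenSSH)
    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

    # Push the public key to each datanode's authorized_keys
    # (hadoop@datanode1..3 are placeholder user/host names)
    ssh-copy-id hadoop@datanode1
    ssh-copy-id hadoop@datanode2
    ssh-copy-id hadoop@datanode3

    # conf/slaves must list one hostname per line, not a comma-separated list:
    #   datanode1
    #   datanode2
    #   datanode3

With the slaves file in that form, stop-dfs.sh and stop-mapred.sh read one host per line and ssh to each datanode individually instead of trying to resolve the whole comma-separated string as a single host.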