Re: Deprecation Error

2015-04-17 Thread Anand Murali
Many thanks, Chris. Anand Murali, 11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India. Ph: (044)-28474593 / 43526162 (voicemail). On Thursday, April 16, 2015 11:27 PM, Chris Nauroth cnaur...@hortonworks.com wrote: Hello Anand, MapReduce provides 2 similar but slightly
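For context, the two similar facilities referred to here are most likely the old org.apache.hadoop.mapred API and the newer org.apache.hadoop.mapreduce API; deprecation warnings also commonly come from renamed configuration keys in Hadoop 2, for example (illustrative, not necessarily the keys in this thread):

    mapred.job.tracker   ->  mapreduce.jobtracker.address
    mapred.reduce.tasks  ->  mapreduce.job.reduces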

Re: Add keys to column family in HBase using Python

2015-04-17 Thread Ted Yu
bq. add columns to the HBase table (from Hive) Since the desired approach is to add columns through Hive, please consider the Hive mailing list as well. Cheers On Thu, Apr 16, 2015 at 10:45 AM, Chris Nauroth cnaur...@hortonworks.com wrote: Hello Manoj, I recommend restarting this thread over at
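For reference, a minimal sketch of exposing an HBase column family through Hive uses the HBase storage handler; the table, column family, and column names below are hypothetical:

    CREATE EXTERNAL TABLE hive_over_hbase (key STRING, val STRING)
    STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
    WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
    TBLPROPERTIES ("hbase.table.name" = "existing_hbase_table");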

Connection refused Error

2015-04-17 Thread Anand Murali
Dear All: I installed Hadoop 2.6 on an Ubuntu 14.10 desktop yesterday and was able to connect to HDFS and run a MapReduce job on a single-node YARN setup. Today I am unable to connect, with the following: anand_vihar@Latitude-E5540:~/hadoop-2.6.0/sbin$ start-yarn.sh starting yarn daemons starting
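A common first check (a sketch, not necessarily the fix for this case) is to confirm the HDFS daemons are running before starting YARN, since start-yarn.sh does not start the NameNode:

    jps                               # should list NameNode, DataNode, SecondaryNameNode
    ~/hadoop-2.6.0/sbin/start-dfs.sh  # start the HDFS daemons if they are missing
    hdfs dfsadmin -report             # verifies the NameNode is reachable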

ipc.Client: Retrying connect to server

2015-04-17 Thread Mahmood Naderan
Hello, I have done all the steps (as far as I know) to bring up Hadoop. However, I get this error: 15/04/17 12:45:31 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s). There are a lot of threads and posts regarding this error and I have tried them.
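Port 54310 suggests the default filesystem is configured as hdfs://localhost:54310 in core-site.xml (fs.defaultFS, or the older fs.default.name), and the retries usually mean nothing is listening on that port. A minimal sketch of what to verify (commands assume the Hadoop bin/sbin directories are on PATH):

    <!-- core-site.xml -->
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://localhost:54310</value>
    </property>

    hdfs namenode -format        # only on a fresh cluster; this erases HDFS metadata
    start-dfs.sh
    netstat -tlnp | grep 54310   # confirm the NameNode RPC port is open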

Re: Found weird issue with HttpFS and WebHdfsFileSystem

2015-04-17 Thread Bram Biesbrouck
Hi Chris, Thanks for this reply. I thought something funny was happening. The childNum field is actually very useful (e.g., for (not) rendering an expansion marker next to a folder in a GUI when it has children), so it's a pity the info is there but gets eaten up by the general interface, only to
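For context, the children count does appear in the raw WebHDFS/HttpFS JSON (the field is named childrenNum in recent Hadoop releases) even though the generic FileStatus API does not expose it; a quick way to see the raw field, with host, port, and path hypothetical:

    curl -s "http://namenode-host:50070/webhdfs/v1/some/dir?op=LISTSTATUS"
    # each directory entry in the FileStatuses array carries a "childrenNum" value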

Re: Exception while import in sqoop

2015-04-17 Thread Chris Nauroth
Hello Kumar, This is another question that would be better handled through the u...@sqoop.apache.org list. Chris Nauroth Hortonworks http://hortonworks.com/ From: Kumar Jayapal kjayapa...@gmail.com Reply-To:

Exception while import in sqoop

2015-04-17 Thread Kumar Jayapal
Hi, I installed sqoop2 and am trying to execute a simple export command to check the DB connection. Does anyone have the reason for and resolution of this error? sqoop import --connect jdbc:mysql://mysql.mmc.com:3306/hive --username hive --password sdeecer --table ROLES Exception has occurred during
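One way to isolate whether the failure is in the JDBC connection itself, rather than in the import, is to list tables with the same connect string; a sketch that assumes the MySQL JDBC driver jar is on Sqoop's classpath:

    sqoop list-tables --connect jdbc:mysql://mysql.mmc.com:3306/hive \
      --username hive -P    # -P prompts for the password instead of passing it inline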

Re: Found weird issue with HttpFS and WebHdfsFileSystem

2015-04-17 Thread Chris Nauroth
Hello Bram, I'm glad to hear the information was helpful. If you'd like to request access to childNum as part of a guaranteed public API, then I encourage you to create a jira issue in the HDFS project. We could consider it for the future. HdfsFileStatus is a representation of the HDFS wire

several jobs in one MapReduce runtime

2015-04-17 Thread xeonmailinglist-gmail
Hi, I have a MapReduce runtime where I have several jobs running concurrently. How do I manage the job scheduler so that it doesn't run only one job at a time? Thanks,
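Whether submitted jobs actually run concurrently is decided by the YARN scheduler and its queue capacity rather than by MapReduce itself. One common approach (a sketch; the default CapacityScheduler can also be tuned instead) is to switch the ResourceManager to the Fair Scheduler in yarn-site.xml so that several jobs share resources at the same time:

    <property>
      <name>yarn.resourcemanager.scheduler.class</name>
      <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
    </property>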

Re: Connection refused Error

2015-04-17 Thread madhav krish
Did you start your NameNode using start-dfs.sh? On Apr 17, 2015 1:52 AM, Anand Murali anand_vi...@yahoo.com wrote: Dear All: I installed Hadoop 2.6 on an Ubuntu 14.10 desktop yesterday and was able to connect to HDFS and run a MapReduce job on a single-node YARN setup. Today I am unable to connect

Error in YARN localization with Active Directory user -- inconsistent directory name escapement

2015-04-17 Thread John Lilley
We have a Cloudera 5.3 cluster running on CentOS6 that is Kerberos-enabled and uses an external AD domain controller for the KDC. We are able to authenticate, browse HDFS, etc. However, YARN fails during localization because it seems to get confused by the presence of a \ character in the
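One angle that is often suggested for AD-backed clusters (a sketch only, not necessarily the resolution of this thread) is to check how the Kerberos principal is mapped to a local user name, since hadoop.security.auth_to_local rules in core-site.xml can strip the realm/domain portion before YARN creates per-user directories; EXAMPLE.COM below is a placeholder realm:

    hadoop org.apache.hadoop.security.HadoopKerberosName someuser@EXAMPLE.COM
    # prints the local name the configured rules produce

    <property>
      <name>hadoop.security.auth_to_local</name>
      <value>
        RULE:[1:$1@$0](.*@EXAMPLE.COM)s/@.*//
        DEFAULT
      </value>
    </property>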

The correct way to find number of mappers and reducers

2015-04-17 Thread Mahmood Naderan
Hi, There are good guides on the number of mappers and reducers in a Hadoop job. For example: Running Hadoop on Ubuntu Linux (Single-Node Cluster) http://goo.gl/kaA1h5 Partitioning your job into maps and reduces http://goo.gl/tpU23 However, there are some, let's say noob, questions here.
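As a rough rule, the number of map tasks equals the number of input splits (approximately input size divided by the block/split size) and is not set directly, while the number of reduce tasks is whatever the job requests. A minimal sketch with Hadoop 2.x property names; the jar and driver names are hypothetical, and the -D options assume the driver uses ToolRunner:

    # ask for 8 reducers and cap splits at 128 MB
    hadoop jar myjob.jar MyDriver \
      -D mapreduce.job.reduces=8 \
      -D mapreduce.input.fileinputformat.split.maxsize=134217728 \
      /input /output

In driver code the reducer count can equivalently be set with job.setNumReduceTasks(8).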

Re: Connection refused Error

2015-04-17 Thread Anand Murali
Yes. Sent from my iPhone. On 17-Apr-2015, at 9:01 pm, madhav krish mad...@gmail.com wrote: Did you start your NameNode using start-dfs.sh? On Apr 17, 2015 1:52 AM, Anand Murali anand_vi...@yahoo.com wrote: Dear All: I installed Hadoop 2.6 on an Ubuntu 14.10 desktop yesterday and was able