Many thanks Chris
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore, Chennai - 600 004, India
Ph: (044) 28474593 / 43526162 (voicemail)
On Thursday, April 16, 2015 11:27 PM, Chris Nauroth
cnaur...@hortonworks.com wrote:
Hello Anand,
MapReduce provides two similar but slightly
> add columns to the HBase table (from Hive)
Since the desired approach is to add columns through Hive, please consider the
Hive mailing list as well.
Cheers
On Thu, Apr 16, 2015 at 10:45 AM, Chris Nauroth cnaur...@hortonworks.com
wrote:
Hello Manoj,
I recommend restarting this thread over at
Dear All:
I installed Hadoop 2.6 on an Ubuntu 14.10 desktop yesterday and was able to
connect to HDFS and run a MapReduce job on a single-node YARN setup. Today I am
unable to connect, with the following:
anand_vihar@Latitude-E5540:~/hadoop-2.6.0/sbin$ start-yarn.sh
starting yarn daemons
starting
Hello,
I have done all the steps (as far as I know) to bring up Hadoop. However,
I get this error:
15/04/17 12:45:31 INFO ipc.Client: Retrying connect to server:
localhost/127.0.0.1:54310. Already tried 0 time(s).
There are a lot of threads and posts regarding this error and I tried them.
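This retry loop almost always means nothing is listening at the address configured as fs.defaultFS (here localhost:54310), usually because the NameNode never started or died during startup. A minimal diagnostic sketch in Python (the host and port are taken from the log line above; they are placeholders, adjust to your core-site.xml):

```python
import socket

def rpc_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if something is accepting TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # 54310 is the NameNode RPC port from the log line above; yours may differ.
    if rpc_port_open("localhost", 54310):
        print("NameNode RPC port is reachable")
    else:
        print("Nothing listening on localhost:54310 -- check `jps` for a "
              "NameNode process and read the NameNode log for startup errors")
```

If the port is closed, `jps` will show whether a NameNode process exists at all. One common cause of "worked yesterday, fails today" on desktop installs is that the default hadoop.tmp.dir lives under /tmp, which is cleared on reboot, taking the NameNode's metadata directory with it.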
Hi Chris,
Thanks for this reply. I thought something funny was happening.
The childNum field is actually very useful (e.g., for deciding whether to
render an expansion marker next to a folder in a GUI when it has children), so
it's a pity the info is there but gets eaten up by the general interface, only
to
Hello Kumar,
This is another question that would be better handled through the
u...@sqoop.apache.org list.
Chris Nauroth
Hortonworks
http://hortonworks.com/
From: Kumar Jayapal kjayapa...@gmail.com
Reply-To:
Hi,
I installed Sqoop 2 and am trying to execute a simple export command to check
the db connection.
Does anyone have the reason for and a resolution of this error?
sqoop import --connect jdbc:mysql://mysql.mmc.com:3306/hive --username
hive --password sdeecer --table ROLES
Exception has occurred during
Hello Bram,
I'm glad to hear the information was helpful.
If you'd like to request access to childNum as part of a guaranteed public API,
then I encourage you to create a jira issue in the HDFS project. We could
consider it for the future.
HdfsFileStatus is a representation of the HDFS wire
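Until childNum is part of a guaranteed public API, the immediate child count of a directory is recoverable from the stable shell interface: `hdfs dfs -ls <dir>` prints a "Found N items" header line. A small parsing sketch (the header format is assumed from Hadoop 2.x FsShell output):

```python
def child_count_from_ls(ls_output: str) -> int:
    """Extract the immediate child count from `hdfs dfs -ls <dir>` output.

    Assumes the Hadoop 2.x format, whose first line is 'Found N items'
    for a non-empty directory; an empty directory prints nothing.
    """
    stripped = ls_output.strip()
    if not stripped:
        return 0
    first = stripped.splitlines()[0]
    if first.startswith("Found ") and first.endswith(" items"):
        return int(first.split()[1])
    return 0
```

This costs a full directory listing rather than a single status call, which is exactly why exposing childNum through a public API would be worth a JIRA request.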
Hi,
I have a MapReduce runtime where I have several jobs running concurrently. How
do I manage the job scheduler so that it runs only one job at a time?
Thanks,
--
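If the goal is to control how many of the submitted jobs YARN runs at once, one common approach (assuming YARN with the Fair Scheduler; the queue name below is a hypothetical placeholder) is to cap running applications per queue. First point yarn.resourcemanager.scheduler.class at org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler in yarn-site.xml, then in the allocation file:

```xml
<!-- fair-scheduler.xml: cap the 'default' queue at one running app,
     so further submissions queue up instead of running concurrently -->
<allocations>
  <queue name="default">
    <maxRunningApps>1</maxRunningApps>
  </queue>
</allocations>
```

Raising maxRunningApps later gives controlled concurrency rather than strict serialization; the Capacity Scheduler offers analogous per-queue limits if that is what the cluster already runs.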
Did you start your name node using start-dfs.sh?
On Apr 17, 2015 1:52 AM, Anand Murali anand_vi...@yahoo.com wrote:
Dear All:
I installed Hadoop-2.6 on Ubuntu 14.10 desktop yesterday and was able to
connect to hdfs and run mapreduce job on singlenode yarn setup. Today I am
unable to connect
We have a Cloudera 5.3 cluster running on CentOS6 that is Kerberos-enabled and
uses an external AD domain controller for the KDC. We are able to
authenticate, browse HDFS, etc. However, YARN fails during localization
because it seems to get confused by the presence of a \ character in the
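Since the message is truncated, the exact failure is unclear, but a frequently suggested mitigation for AD-style names leaking into local usernames is to map the Kerberos principal to a plain short name before YARN sees it, via hadoop.security.auth_to_local in core-site.xml (the realm below is a hypothetical placeholder for the actual AD realm):

```xml
<!-- core-site.xml: strip the realm so local usernames carry no AD syntax.
     AD.EXAMPLE.COM stands in for the real AD realm. -->
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
    RULE:[1:$1@$0](.*@AD\.EXAMPLE\.COM)s/@.*//
    DEFAULT
  </value>
</property>
```

The rules use the standard Kerberos auth_to_local syntax; `hadoop kerbname <principal>` (available in later releases) or a quick localization test will confirm whether the mapped name is now a clean Unix username.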
Hi,
There are good guides on the number of mappers and reducers in a Hadoop job.
For example:
Running Hadoop on Ubuntu Linux (Single-Node Cluster) http://goo.gl/kaA1h5
Partitioning your job into maps and reduces http://goo.gl/tpU23
However, I have some, say, noob questions here.
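For context on where the map count comes from: with FileInputFormat, each input file is divided into splits of roughly the HDFS block size and one map task runs per split, while the reduce count is simply whatever the job requests. A back-of-the-envelope sketch (this ignores the small "slop" tolerance the real FileInputFormat applies at the last split):

```python
import math

def num_map_tasks(file_sizes, block_size=128 * 1024 * 1024,
                  min_split=1, max_split=2**63 - 1):
    """Approximate FileInputFormat's map-task count.

    Split size is max(min_split, min(max_split, block_size)); each file
    contributes ceil(size / split_size) splits (empty files contribute none).
    """
    split_size = max(min_split, min(max_split, block_size))
    return sum(math.ceil(s / split_size) for s in file_sizes if s > 0)

# e.g. three files of 200 MB, 64 MB and 1 byte with a 128 MB block size:
print(num_map_tasks([200 * 1024**2, 64 * 1024**2, 1]))  # -> 4
```

The practical lever is therefore input layout and split size, not a direct "number of mappers" knob, whereas reducers are set explicitly (Job.setNumReduceTasks or mapreduce.job.reduces).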
Yes
Sent from my iPhone
On 17-Apr-2015, at 9:01 pm, madhav krish mad...@gmail.com wrote:
Did you start your name node using start-dfs.sh?
On Apr 17, 2015 1:52 AM, Anand Murali anand_vi...@yahoo.com wrote:
Dear All:
I installed Hadoop-2.6 on Ubuntu 14.10 desktop yesterday and was able