Hive error while starting up services using Ambari

2015-03-04 Thread Pratik Gadiya
Hi, I am trying to deploy a Hadoop cluster using an Ambari Blueprint. All the services are up and running except one, i.e. HiveServer2. I tried to look into the logs (/var/log/hive/hiveserver2.log) and it looks like Hive is trying to access the MySQL service using username:hive. However, I think it
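
If the failure really is the metastore MySQL account (an assumption, since the log excerpt is cut off above), a first check is whether the hive user can actually log in to MySQL with the credentials Ambari configured. A minimal sketch; the database host, database name, and password are placeholders:

    # Verify the hive user can reach MySQL with the configured password
    mysql -u hive -p -h metastore-db-host -e "SHOW DATABASES;"

    # If access is denied, grant privileges as the MySQL root user
    # (MySQL 5.x syntax; database name 'hive' and the password are placeholders)
    mysql -u root -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%' IDENTIFIED BY 'hivepassword'; FLUSH PRIVILEGES;"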

Re: Permission Denied

2015-03-04 Thread Sean Busbey
On Tue, Mar 3, 2015 at 3:49 PM, David Patterson patt...@gmail.com wrote: Thanks to all that helped. I've now got my configuration running. All of the configuration files that once referenced localhost now reference my hostname (AccumuloTN). The big deal that I omitted seeing in *any* of the
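
A quick way to double-check that kind of change, sketched here under the assumption of a tarball install in /opt/accumulo (adjust the path to your layout):

    # Look for any config file that still points at localhost
    grep -rn "localhost" /opt/accumulo/conf/
    # Confirm the hostname resolves to the machine's real address, not 127.0.0.1
    getent hosts AccumuloTN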

Re: Error while executing command on CDH5

2015-03-04 Thread Suresh Srinivas
Can you please use the CDH mailing list for this question? From: SP sajid...@gmail.com Sent: Wednesday, March 04, 2015 11:00 AM To: user@hadoop.apache.org Subject: Error while executing command on CDH5 Hello All, Why am I getting this error every time I execute

Error while executing command on CDH5

2015-03-04 Thread SP
Hello All, Why am I getting this error every time I execute a command? It was working fine with CDH4; when I upgraded to CDH5, this message started showing up. Does anyone have a resolution for this error? sudo -u hdfs hadoop fs -ls / SLF4J: Failed to load class
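
Assuming this is the usual StaticLoggerBinder warning (the message is truncated above), it normally means no SLF4J binding jar is visible on the client classpath, and it does not stop the command itself. A hedged sketch for checking and, if needed, adding a binding; the parcel path and jar version are assumptions:

    # See whether an SLF4J binding (e.g. slf4j-log4j12) is already on the classpath
    hadoop classpath | tr ':' '\n' | grep -i slf4j

    # If not, locate a binding jar and add it to the client classpath
    find / -name "slf4j-log4j12*.jar" 2>/dev/null
    export HADOOP_CLASSPATH=/opt/cloudera/parcels/CDH/jars/slf4j-log4j12-1.7.5.jar:$HADOOP_CLASSPATH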

Kerberos issue

2015-03-04 Thread REYANE OUKPEDJO
Hi Everyone, I set up a Kerberos-enabled cluster using HDP-2.2 and I am facing the following issue: 2015-03-04 16:24:43,419 WARN  security.DelegationTokenRenewer (DelegationTokenRenewer.java:handleDTRenewerAppSubmitEvent(785)) - Unable to add the application to the delegation token
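
This warning often appears when the ResourceManager cannot renew the submitter's HDFS delegation token. A hedged first step is to confirm the RM's keytab and principal actually work on the RM host; the keytab path and principal name below are HDP-style defaults and may differ on your cluster:

    # On the ResourceManager host, inspect the keytab and try to authenticate with it
    klist -kt /etc/security/keytabs/rm.service.keytab
    kinit -kt /etc/security/keytabs/rm.service.keytab rm/$(hostname -f)
    # Sanity-check that HDFS is reachable with that ticket, then clean up
    hdfs dfs -ls / && kdestroy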

Re: How to Backup and Restore various components of Hadoop ?

2015-03-04 Thread Motty Cruz
Hi Krish, did you get a response in regard to Backup and Restore? I would like to have the ability to back up and restore if necessary. At the moment we're replicating to another cluster; however, I want to be able to restore in case a table is deleted and the deletion is replicated to the backup cluster.
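
One hedged complement to cross-cluster replication, assuming the tables in question are HBase tables: a periodic export to HDFS, which a replicated delete cannot touch. The table name and output path are placeholders:

    # Export a table to an HDFS directory that replication does not overwrite
    hbase org.apache.hadoop.hbase.mapreduce.Export mytable /backups/mytable/2015-03-04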

Re: How to Backup and Restore various components of Hadoop ?

2015-03-04 Thread Krish Donald
How do you recover if a table is deleted? On Wed, Mar 4, 2015 at 2:29 PM, Motty Cruz motty.c...@gmail.com wrote: Hi Krish, did you get a response in regard to Backup and Restore? I would like to have the ability to back up and restore if necessary. At the moment we're replicating to
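
Again assuming HBase, one possible answer is regular snapshots: a snapshot taken before the delete can be used to recreate the table. Table and snapshot names below are placeholders:

    # Take a snapshot as part of a routine backup
    echo "snapshot 'mytable', 'mytable_snap_20150304'" | hbase shell
    # Later, after an accidental delete, recreate the table from the snapshot
    echo "clone_snapshot 'mytable_snap_20150304', 'mytable'" | hbase shell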

Re: Need advice about OLAP on Hadoop

2015-03-04 Thread Azuryy Yu
Hi VK, I have a similar requirement; we need a real-time data analysis platform. Actually, you don't need to pay much attention to Spark or Apache Drill, because the data for OLAP cubes is calculated before the cube is built. You just need to consider two questions: 1) how to calculate the data for the cube quickly?
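
For that pre-calculation step, one hedged option, assuming the detail data already sits in a Hive table, is Hive's GROUP BY ... WITH CUBE, which materializes every dimension combination in one pass. Table and column names are placeholders:

    # Precompute aggregates for all dimension combinations before building the cube
    hive -e "
      CREATE TABLE sales_cube AS
      SELECT region, product, dt, SUM(amount) AS total_amount
      FROM sales
      GROUP BY region, product, dt WITH CUBE;
    "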

Re: Can't list files in a federation of HDFS

2015-03-04 Thread Azuryy Yu
For HDFS federation, the data shares all DataNodes, but the namespaces are separate. So, did you write any data to the hadoop-coc-2 namespace? You don't need to log in to hadoop-coc-2 to write data; just configure a new client that connects to hadoop-coc-2 for writes. On Tue, Mar 3, 2015 at 6:20 PM,
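
In practice that just means addressing the second namespace by full URI from whichever client you already have; 8020 below is the default NameNode RPC port and may differ on your cluster:

    # Write to and list the hadoop-coc-2 namespace from any configured client
    hdfs dfs -put localfile.txt hdfs://hadoop-coc-2:8020/user/me/
    hdfs dfs -ls hdfs://hadoop-coc-2:8020/user/me/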

Re: The Activities of Apache Hadoop Community

2015-03-04 Thread Azuryy Yu
That's good to know. On Tue, Mar 3, 2015 at 8:12 PM, Akira AJISAKA ajisa...@oss.nttdata.co.jp wrote: Hi all, One year after the previous post, we collected and analyzed JIRA tickets again to investigate the activities of the Apache Hadoop community in 2014.

File is not written on HDFS after running libhdfs C API

2015-03-04 Thread Alexandru Calin
I am trying to run the basic libhdfs example; it compiles OK, actually runs, and executes the whole program, but I cannot see the file on HDFS. It is said here (http://hadoop.apache.org/docs/r1.2.1/libhdfs.html) that you have to include *the right configuration directory containing
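
A common explanation (hedged, since the post is cut off) is that without the configuration directory on CLASSPATH the client falls back to the local filesystem, so the file lands on local disk rather than HDFS. libhdfs goes through JNI and does not expand classpath wildcards, so the jar list has to be spelled out; the paths and binary name below are assumptions, and --glob is only available on Hadoop releases that support it:

    # Put the config directory and the fully expanded jar list on CLASSPATH,
    # then run the compiled example
    export CLASSPATH=/etc/hadoop/conf:$(hadoop classpath --glob)
    ./hdfs_write_example   # placeholder name for the compiled libhdfs binary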