Re: Change for submitting to yarn in 1.3.1

2015-05-10 Thread Manku Timma
sc.applicationId gives the YARN app id.

On 11 May 2015 at 08:13, Mridul Muralidharan wrote:
> We had a similar requirement, and as a stopgap, I currently use a suboptimal, implementation-specific workaround: parsing it out of the stdout/stderr (based on log config). A better means to get to this is i…
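The stopgap Mridul describes, scraping the YARN application id out of the driver's stdout/stderr, can be sketched with a simple regex. This is a minimal illustration, not his actual code; the helper name is invented, and it assumes the id appears in YARN's standard `application_<timestamp>_<sequence>` form:

```scala
// Hypothetical helper that scans captured stdout/stderr for the YARN
// application id, which YARN logs in the form application_<ts>_<seq>.
object AppIdScraper {
  private val AppIdPattern = """application_\d+_\d+""".r

  // Returns the first application id found in the log text, if any.
  def extractAppId(logText: String): Option[String] =
    AppIdPattern.findFirstIn(logText)
}
```

With `sc.applicationId` available on SparkContext, as noted above, this kind of scraping becomes unnecessary.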

Hive.get() called without HiveConf being already set on a yarn executor

2015-05-05 Thread Manku Timma
Looks like there is a case in TableReader.scala where Hive.get() is called without first setting it via Hive.get(hiveconf). I am running in yarn-client mode (compiled with -Phive-provided and with hive-0.13.1a). Basically this means the broadcast hiveconf is not being used and the defau…
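For readers unfamiliar with the failure mode: Hive.get() hands back a per-thread Hive session object, so a fresh executor thread that never went through Hive.get(hiveconf) has nothing to return and throws. Below is a simplified model of that thread-local pattern; the class and member names are invented for illustration and are not Hive's actual source:

```scala
// Simplified illustration of a per-thread session pattern like the one
// behind Hive.get(); names here are invented, not Hive's real code.
final class FakeConf(val id: String)

object FakeHive {
  private val session = new ThreadLocal[FakeConf]

  // Analogue of Hive.get(conf): installs the conf for the calling thread.
  def get(conf: FakeConf): FakeConf = { session.set(conf); conf }

  // Analogue of Hive.get(): fails if this thread was never given a conf.
  def get(): FakeConf = Option(session.get()).getOrElse(
    throw new IllegalStateException("Hive.get() called without a hive db setup"))
}
```

This is why the exception surfaces only on executors: the driver thread typically set the conf during context construction, while executor task threads start cold.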

Re: hive initialization on executors

2015-04-29 Thread Manku Timma
The problem was in my hive-13 branch, so ignore this.

On 27 April 2015 at 10:34, Manku Timma wrote:
> I am facing an exception "Hive.get() called without a hive db setup" in the executor. I wanted to understand how the Hive object is initialized in the executor threads.

Re: creating hive packages for spark

2015-04-28 Thread Manku Timma
Yash, this is exactly what I wanted! Thanks a bunch.

On 27 April 2015 at 15:39, yash datta wrote:
> Hi,
>
> You can build spark-project hive from here:
> https://github.com/pwendell/hive/tree/0.13.1-shaded-protobuf
>
> Hope this helps.
>
> On Mon, Apr 27 …

creating hive packages for spark

2015-04-27 Thread Manku Timma
Hello Spark developers, I want to understand the procedure for creating the org.spark-project.hive jars. Is this documented somewhere? I am having issues with -Phive-provided and my private hive13 jars, and want to check whether using Spark's procedure helps.
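A hedged sketch of the build this thread converges on (see Yash's reply above in this archive): clone Patrick Wendell's hive fork and install the shaded artifacts locally. The branch name comes from that reply; the exact maven invocation is an assumption on my part, not a documented procedure:

```shell
# Clone the fork that produces the org.spark-project.hive artifacts.
git clone https://github.com/pwendell/hive.git
cd hive

# Branch named in Yash's reply; protobuf is shaded to avoid conflicts.
git checkout 0.13.1-shaded-protobuf

# Assumed build step: install the jars into the local maven repo.
mvn clean install -DskipTests
```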

hive initialization on executors

2015-04-26 Thread Manku Timma
I am facing an exception "Hive.get() called without a hive db setup" in the executor. I wanted to understand how the Hive object is initialized in the executor threads. I only see Hive.get(hiveconf) in two places in the Spark 1.3 code: in HiveContext.scala (I don't think this is created on the executor) and in…