Re: Best way to use Spark UDFs via Hive (Spark Thrift Server)

2015-10-23 Thread Deenar Toraskar
You can do the following: start the spark-shell, register the UDFs in the shell using sqlContext, then start the Thrift Server from within the shell using startWithContext:
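A minimal sketch of that procedure, assuming Spark 1.x where the spark-shell provides a HiveContext named sqlContext (the UDF name `myUpper` here is just an illustrative placeholder):

```scala
// Run inside the spark-shell, which already provides sqlContext (a HiveContext).
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// Register a UDF on the shell's sqlContext; any UDF registered this way
// becomes visible to clients of the Thrift Server started below.
sqlContext.udf.register("myUpper", (s: String) => s.toUpperCase)

// Start the Thrift Server sharing this shell's context, so Hive/JDBC
// clients connecting to it can call the registered UDFs.
HiveThriftServer2.startWithContext(sqlContext)
```

Because the Thrift Server shares the shell's context, JDBC clients (e.g. beeline) connected to it should then be able to run queries such as `SELECT myUpper(name) FROM mytable`.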

Best way to use Spark UDFs via Hive (Spark Thrift Server)

2015-10-22 Thread Dave Moyers
Hi, We have several UDFs written in Scala that we use within jobs submitted to Spark. They work perfectly with the sqlContext after being registered. We also allow access to saved tables via the Hive Thrift Server bundled with Spark. However, we would like to allow Hive connections to use