Re: Enable hql on the JDBC thrift server

2017-11-09 Thread Arnaud Wolf
…eated Cassandra table. Spark SQL does not provide any feature for safe parameter binding, so I thought about using the JDBC Thrift server and the JDBC interface. Inserting data into an external table from Hive is performed by running CREATE EXTERNAL TABLE ... STORED BY... However, when trying…

Enable hql on the JDBC thrift server

2017-11-09 Thread Arnaud Wolf
…thought about using the JDBC Thrift server and the JDBC interface. Inserting data into an external table from Hive is performed by running CREATE EXTERNAL TABLE ... STORED BY... However, when trying to execute this statement through the Thrift server, I always get the following error…
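The two snippets above describe running HiveQL DDL through the Spark Thrift Server. A minimal sketch of what that looks like from the beeline client — the host, port, table schema, and storage-handler class are all placeholders, not taken from the thread, which elides the real `STORED BY` clause:

```shell
# Hypothetical sketch: issuing the DDL through the Thrift server with beeline.
# Host, port, schema, and the storage-handler class below are placeholders.
beeline -u jdbc:hive2://localhost:10000 -e "
  CREATE EXTERNAL TABLE events (id INT, payload STRING)
  STORED BY 'org.example.SomeCassandraStorageHandler'"
```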

RE: JDBC thrift server

2015-10-08 Thread Younes Naguib
Sorry, we’re running 1.5.1. …

Re: JDBC thrift server

2015-10-08 Thread Sathish Kumaran Vairavelu
Which version of Spark are you using? You might encounter SPARK-6882 <https://issues.apache.org/jira/browse/SPARK-6882> if Kerberos is enabled. -Sathish On Thu, Oct 8, 2015 at 10:46 AM Younes Naguib <younes.nag...@tritondigital.com> wrote: …

JDBC thrift server

2015-10-08 Thread Younes Naguib
Hi, We've been using the JDBC Thrift server for a couple of weeks now, running queries on it like a regular RDBMS. We're about to deploy it in a shared production cluster. Any advice or warnings on such a setup? YARN or Mesos? How about dynamic resource allocation in an already runn…

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-27 Thread Cheng Lian
…Could you please advise. Thanks in advance. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-generate-assembly-jar-which-includes-jdbc-thrift-server-tp19887p19963.html Sent from the Apache Spark User List mailing list archive at Nabble.com

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-27 Thread vdiwakar.malladi
…

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread Cheng Lian
…-> [Help 1] Thanks in advance. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread vdiwakar.malladi
…0\python"): CreateProcess error=2, The system cannot find the file specified -> [Help 1] Thanks in advance. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread Cheng Lian
…vance. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread vdiwakar.malladi
Yes, I'm building it from Spark 1.1.0. Thanks in advance. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread Cheng Lian
…mand. mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package Regards. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread vdiwakar.malladi
Thanks for your response. I'm using the following command: mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package Regards. …

Re: Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread Cheng Lian
What’s the command line you used to build Spark? Notice that you need to add |-Phive-thriftserver| to build the JDBC Thrift server. This profile was once removed in v1.1.0, but added back in v1.2.0 because of a dependency issue introduced by Scala 2.11 support. On 11/27/14 12:53 AM…
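Putting Cheng Lian's note together with the command quoted in the thread, the build line that actually includes the Thrift server would be (a sketch, assuming the same YARN/Hadoop profiles the poster used):

```shell
# Same command as in the thread, with -Phive-thriftserver added so the
# assembly jar contains the JDBC Thrift server (required on v1.1.0):
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 \
    -Phive -Phive-thriftserver -DskipTests clean package
```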

Unable to generate assembly jar which includes jdbc-thrift server

2014-11-26 Thread vdiwakar.malladi
…

Re: Unable to share Sql between HiveContext and JDBC Thrift Server

2014-10-10 Thread Cheng Lian
Which version are you using? Also, |.saveAsTable()| saves the table to the Hive metastore, so you need to make sure your Spark application points to the same Hive metastore instance as the JDBC Thrift server. For example, put |hive-site.xml| under |$SPARK_HOME/conf|, and run |spark-shell| and…
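A sketch of the setup Cheng Lian describes, assuming a default local deployment (paths and port are assumptions, not from the thread):

```shell
# Point both the Spark application and the Thrift server at the same
# Hive metastore by sharing one hive-site.xml:
cp hive-site.xml $SPARK_HOME/conf/

# Start the Thrift server; it picks up $SPARK_HOME/conf/hive-site.xml:
$SPARK_HOME/sbin/start-thriftserver.sh

# Tables persisted via saveAsTable() should now be visible over JDBC:
beeline -u jdbc:hive2://localhost:10000 -e "SHOW TABLES"
```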

Unable to share Sql between HiveContext and JDBC Thrift Server

2014-10-09 Thread Steve Arnold
I am writing a Spark job to persist data using HiveContext so that it can be accessed via the JDBC Thrift server. Although my code doesn't throw an error, I am unable to see my persisted data when I query from the Thrift server. I tried three different ways to get this to work: 1)…