The only way to do this with SQL is through the JDBC driver.

However, the DataFrame API lets you use literal values directly, without
lossy or unsafe conversions to strings.  For example, to filter:

import org.apache.spark.sql.functions._  // provides lit()
import sqlContext.implicits._            // provides the $"columnName" syntax

df.filter($"columnName" === lit(value))
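
If you need to preserve a non-string type end to end, the same pattern
works with typed values.  A minimal sketch, assuming a SQLContext named
sqlContext and a hypothetical table "events" with a timestamp column "ts":

import java.sql.Timestamp
import org.apache.spark.sql.functions._

val df = sqlContext.table("events")
import sqlContext.implicits._  // enables the $"..." column syntax

// Timestamp.valueOf yields a typed value, and lit() keeps it a
// TimestampType literal, so there is no conversion to a SQL string
// and nothing to inject into.
val cutoff = Timestamp.valueOf("2015-12-01 00:00:00")
val recent = df.filter($"ts" >= lit(cutoff))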

On Sun, Dec 27, 2015 at 1:11 PM, Ajaxx <ajack...@pobox.com> wrote:

> Given a SQLContext (or HiveContext), is it possible to pass parameters
> into a query?  There are several reasons why this makes sense, including
> loss of data type during conversion to string, SQL injection, etc.
>
> But currently, it appears that SQLContext.sql() only takes a single
> string parameter.
