Yes, that's what I was looking for.  Thanks!
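For anyone finding this thread later, a minimal sketch of the suggestion below (the DataFrame `df`, the column name `zipCode`, and the expression string are illustrative, not from the original mails):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.expr

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._

val df = Seq(("12345-6789")).toDF("zipCode")

// The column expression arrives as a runtime-configured string...
val colExpr: String = "substring(zipCode, 0, 5)"

// ...and expr() parses it into a Column usable in select(), filter(), etc.
val result = df.select(expr(colExpr).as("zip5"))
```

Note that inside the expression string you use SQL syntax (`zipCode`), not the Scala symbol syntax (`'zipCode`) from the original question.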

On Mon, Nov 21, 2016 at 6:56 PM, Michael Armbrust
<mich...@databricks.com> wrote:
> You are looking for org.apache.spark.sql.functions.expr()
>
> On Sat, Nov 19, 2016 at 6:12 PM, Stuart White <stuart.whi...@gmail.com>
> wrote:
>>
>> I'd like to allow for runtime-configured Column expressions in my
>> Spark SQL application.  For example, if my application needs a 5-digit
>> zip code, but the file I'm processing contains a 9-digit zip code, I'd
>> like to be able to configure my application with the expression
>> "substring('zipCode, 0, 5)" to use for the zip code.
>>
>> So, I think I'm looking for something like this:
>>
>> def parseColumnExpression(colExpr: String) : Column
>>
>> I see that SparkSession's sql() method exists to take a string and
>> parse it into a DataFrame.  But that's not quite what I want.
>>
>> Does a mechanism exist that would allow me to take a string
>> representation of a column expression and parse it into an actual
>> column expression (something that could be used in a .select() call,
>> for example)?
>>
>> Thanks!
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
>
