2016 18:36
To: Mich Talebzadeh
Cc: user
Subject: Re: using udf to convert Oracle number column in Data Frame
Please take a look at sql/core/src/main/scala/org/apache/spark/sql/functions.scala:

  def udf(f: AnyRef, dataType: DataType): UserDefinedFunction = {
    UserDefinedFunction(f, dataType, None)
  }
And sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala:

  test("udf") {
    val foo =
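The point of the snippets above is that functions.udf has an overload taking an explicit return DataType. A minimal sketch of using it for this case (the name numberToDouble is a placeholder of mine, and it assumes a Spark version that still exposes this two-argument overload):

  import org.apache.spark.sql.functions.udf
  import org.apache.spark.sql.types.DoubleType

  // Declare the return type explicitly as DoubleType, converting the
  // BigDecimal that Spark produces for an Oracle NUMBER into a Double.
  val numberToDouble = udf((d: java.math.BigDecimal) => d.toString.toDouble, DoubleType)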
Hi,
Unfortunately, Oracle table columns defined as NUMBER overflow when read into a DataFrame.
An alternative seems to be to create a UDF that maps that column to a Double:
val toDouble = udf((d: java.math.BigDecimal) => d.toString.toDouble)
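A sketch of applying it, assuming a DataFrame named df already loaded over JDBC and a NUMBER column named AMOUNT (both names are placeholders, not from this thread):

  import org.apache.spark.sql.functions.col

  // Replace the overflowing NUMBER column with its Double representation.
  val converted = df.withColumn("AMOUNT", toDouble(col("AMOUNT")))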
This is the DF I have defined to fetch one column, as below: