Thanks for your response.
Actually, I want to return a Row (i.e. a struct). But attempting to register a UDF that returns Row throws the following error:

scala> def test(r:Row):Row = r
test: (r: org.apache.spark.sql.Row)org.apache.spark.sql.Row

scala> sqlContext.udf.register("test",test _)
java.lang.UnsupportedOperationException: Schema for type org.apache.spark.sql.Row is not supported
    at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:153)
    at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29)


The use case I am working on will require creating case classes dynamically.
It would be a great help if you could give me pointers on creating case
classes dynamically.
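
For anyone else hitting the same error, the case-class approach Michael suggests can be sketched as below. The registration calls are commented out and assume a Spark 1.x `sqlContext`; `StringInfo` and `splitName` are illustrative names, not anything from Spark itself. Note that Scala tuples are also Products, so Spark's reflection can infer a struct schema from them too, which may sidestep the need for truly dynamic case classes when you just need an ad-hoc struct shape:

```scala
// A case class whose fields become the fields of the returned struct.
case class StringInfo(numChars: Int, firstLetter: String)

// Plain Scala function; Spark infers a struct schema from the case class.
def stringInfo(s: String): StringInfo =
  StringInfo(s.length, s.take(1))

// Tuples map to structs without defining a named class at all.
def splitName(full: String): (String, String) = {
  val parts = full.split(" ", 2)
  (parts(0), if (parts.length > 1) parts(1) else "")
}

// Registration would then look like this (requires a SQLContext):
// sqlContext.udf.register("stringInfo", stringInfo _)
// sqlContext.udf.register("splitName", splitName _)
```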

TIA.


On Fri, Nov 6, 2015 at 12:20 PM, Michael Armbrust <mich...@databricks.com>
wrote:

> You are returning the type StructType, not an instance of a struct (i.e.
> StringType instead of "string").  If you'd like to return a struct you
> should return a case class.
>
> case class StringInfo(numChars: Int, firstLetter: String)
> udf((s: String) => StringInfo(s.size, s.head.toString))
>
> If you'd like to take a struct as input, use Row as the type.
>
> On Thu, Nov 5, 2015 at 9:53 PM, Rishabh Bhardwaj <rbnex...@gmail.com>
> wrote:
>
>> Hi all,
>>
>> I am unable to register a UDF with return type as StructType:
>>
>> scala> def test(r:StructType):StructType = { r }
>> test: (r: org.apache.spark.sql.types.StructType)org.apache.spark.sql.types.StructType
>>
>> scala> sqlContext.udf.register("test",test _ )
>> scala.MatchError: org.apache.spark.sql.types.StructType (of class scala.reflect.internal.Types$TypeRef$$anon$6)
>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:101)
>>     at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:29)
>>     at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:64)
>>
>> Can someone throw some light on this? Is there any workaround for it?
>>
>> TIA.
>>
>> Regards,
>> Rishabh.
>>
>
>
