Hello 

I am trying to create a SchemaRDD from an RDD of case classes. Depending on
an argument to the program, the case class should be different, but this
throws an exception. I am using Spark 1.1.0 with Scala 2.10.4.
The exception can be reproduced by:

val table = "type 1"

import org.apache.spark.rdd.RDD
import csc.createSchemaRDD // csc is the SQLContext

case class type_1(a: String, b: Int, c: List[Int])
case class type_2(a: String, b: Int, c: List[Int], d: String)

// some data of type type_1
val data = sc.parallelize(Seq(("asd", 1, List(1, 2))))

var supportedTypes: RDD[Product] = null

table match {
  case "type 1" => supportedTypes = data.map(row => type_1(row._1 + 1, row._2, row._3))
  case "type 2" => supportedTypes = data.map(row => type_2(row._1, row._2, row._3, "ghj"))
}

supportedTypes.schema


The stack trace:

scala.ScalaReflectionException: <none> is not a method
        at scala.reflect.api.Symbols$SymbolApi$class.asMethod(Symbols.scala:279)
        at scala.reflect.internal.Symbols$SymbolContextApiImpl.asMethod(Symbols.scala:73)
        at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:60)
        at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:50)
        at org.apache.spark.sql.catalyst.ScalaReflection$.attributesFor(ScalaReflection.scala:44)
        at org.apache.spark.sql.execution.ExistingRdd$.fromProductRdd(basicOperators.scala:229)
        at org.apache.spark.sql.SQLContext.createSchemaRDD(SQLContext.scala:94)
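
For reference, the conversion does not hit this exception when the RDD keeps
a single concrete case-class type. A minimal sketch of that working path
(here csc is assumed to be a plain org.apache.spark.sql.SQLContext; mine may
be set up differently):

import org.apache.spark.sql.SQLContext

val csc = new SQLContext(sc) // assumption: a plain SQLContext
import csc.createSchemaRDD

case class type_1(a: String, b: Int, c: List[Int])

val data = sc.parallelize(Seq(("asd", 1, List(1, 2))))

// With a concrete element type (RDD[type_1]) the implicit conversion
// to SchemaRDD and the schema inference both succeed:
val typed = data.map(row => type_1(row._1, row._2, row._3))
typed.printSchema()

My guess is that widening the element type to RDD[Product] is what breaks
schemaFor, since Product itself has no case-class constructor to reflect on,
but I do not see how else to give the two branches a common type.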

Kind regards


