I have some code to recover a complex structured row from a dataset.
The row contains several ARRAY fields (mostly ArrayType(IntegerType)),
which are populated with Array[java.lang.Integer], as that seems to be
the only way the Spark row serializer will accept them.
If the dataset is written out to a
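As a point of reference, the boxing step described above can be sketched in plain Scala (the helper name `boxInts` is hypothetical, and Spark itself is not needed for this part):

```scala
// Hypothetical helper: box a Scala Array[Int] into Array[java.lang.Integer],
// the element type the Spark row serializer accepts for an
// ArrayType(IntegerType) field.
def boxInts(xs: Array[Int]): Array[java.lang.Integer] =
  xs.map(x => Integer.valueOf(x))

val boxed: Array[java.lang.Integer] = boxInts(Array(1, 2, 3))
```

The explicit `Integer.valueOf` call makes the boxing visible rather than relying on implicit numeric widening.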
Running Spark 1.3.0 from a binary install. When executing the example under the
subject section from within spark-shell, I get the following error:
scala> people.registerTempTable("people")
<console>:35: error: value registerTempTable is not a member of
org.apache.spark.rdd.RDD[Person]
people.registe
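This error is expected under Spark 1.3: an RDD of case classes no longer picks up SchemaRDD/DataFrame methods implicitly, so `registerTempTable` is not available on `people` until it is converted to a DataFrame with `toDF()` after importing the SQLContext implicits. A sketch of the fix, assuming `sc` is the SparkContext provided by spark-shell and `people` is the RDD[Person] from the guide's example:

```scala
// Spark 1.3: convert the RDD[Person] to a DataFrame first.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._   // enables rdd.toDF()

val peopleDF = people.toDF()
peopleDF.registerTempTable("people")
```

After this, `sqlContext.sql("SELECT name FROM people")` queries work as in the guide.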