Hi Niko,
I ran the script on 0.9/CDH5 using spark-shell, and it does not
generate a ClassCastException. Which version are you using, and can you
give a more complete stack trace?
Cheers,
a.
On Tue, Mar 25, 2014 at 7:55 PM, Niko Stahl wrote:
> Ok, so I've been able to narrow down the problem to this
Hi Niko,
I'm having a similar problem running Spark on a standalone cluster.
Any suggestions on how to fix this? The error occurs when calling
PairRDDFunctions.saveAsHadoopDataset:
java.lang.ClassCastException (java.lang.ClassCastException: cannot assign
instance of org.apache.spark.rdd.P
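For reference, a bare-bones saveAsHadoopDataset call looks roughly like the
sketch below. The output path, key/value types, and JobConf settings here are
placeholders of mine, not taken from this thread:

import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.Text
import org.apache.hadoop.mapred.{FileOutputFormat, JobConf, TextOutputFormat}
import org.apache.spark.SparkContext._  // brings PairRDDFunctions in via implicits

// Placeholder JobConf; in a real job the path and formats come from your setup.
val jobConf = new JobConf(sc.hadoopConfiguration)
jobConf.setOutputKeyClass(classOf[Text])
jobConf.setOutputValueClass(classOf[Text])
jobConf.setOutputFormat(classOf[TextOutputFormat[Text, Text]])
FileOutputFormat.setOutputPath(jobConf, new Path("hdfs://namenode/tmp/out"))  // placeholder path

// Toy pair RDD, converted to Hadoop Writable types before saving.
val pairs = sc.parallelize(Seq(("k1", "v1"), ("k2", "v2")))
  .map { case (k, v) => (new Text(k), new Text(v)) }
pairs.saveAsHadoopDataset(jobConf)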
Ok, so I've been able to narrow down the problem to this specific case:
def toCsv(userTuple: String) = { "a,b,c" }    // stub: ignores its input, returns a fixed row
val dataTemp = Array("line1", "line2")        // small in-memory sample
val dataTempDist = sc.parallelize(dataTemp)   // distribute it as an RDD
val usersFormatted = dataTempDist.map(toCsv)  // map each line through toCsv
usersFormatted.saveAsTextFile("hdfs://" + masterDomain +
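One thing worth trying (a guess on my part, not verified against this exact
trace): when a method defined in spark-shell is passed to map, the generated
closure can capture the shell's wrapper object, which sometimes fails to
deserialize on the workers. Defining toCsv as a function value avoids that
capture, e.g.:

// Sketch only: a function value instead of a method, so map(toCsv) serializes
// just the function and not the enclosing shell object. The path is a placeholder.
val toCsv: String => String = _ => "a,b,c"
val usersFormatted = sc.parallelize(Array("line1", "line2")).map(toCsv)
usersFormatted.saveAsTextFile("hdfs://namenode/user/test/out.csv")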