-----Original Message-----
From: Jennifer15 [mailto:bsabe...@purdue.edu]
Sent: Monday, July 27, 2015 1:47 PM
To: user@spark.apache.org
Subject: unserialize error in sparkR
Hi,
I have a newbie question: I get the following error when I increase the number of
samples in my sample script samplescript.R.
I'm not sure whether this error is caused by using Spark 1.2; if it is,
what is the equivalent of lapply/map for working on DataFrames?
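For context, a minimal sketch of the contrast the question is about, assuming the SparkR 1.4-era API: the RDD-style lapply/map functions were made private in SparkR 1.4, and per-row work on a DataFrame is instead expressed as column expressions. The setup names (sc, sqlContext) and the toy data here are illustrative, not from the original script, and running this requires a local Spark installation.

```r
library(SparkR)

# Assumed setup (SparkR 1.4-era API); needs a Spark installation on the machine.
sc <- sparkR.init(master = "local")
sqlContext <- sparkRSQL.init(sc)

# Old RDD-style map/lapply (private, no longer public API as of SparkR 1.4):
# rdd <- SparkR:::parallelize(sc, 1:100)
# out <- SparkR:::lapply(rdd, function(x) x * 2)

# DataFrame-style equivalent: express the per-row computation as a
# column expression instead of an arbitrary R closure.
df  <- createDataFrame(sqlContext, data.frame(x = 1:100))
df2 <- select(df, df$x * 2)
head(df2)
```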
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/unserialize-error-in-sparkR-tp24002.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.