Around 500 KB each time I call the function (~150 times)
From: Felix Cheung
Sent: 26 September 2018 14:57
To: Junior Alvarez; user@spark.apache.org
Subject: Re: spark.lapply
It looks like the native R process is terminated from buffer overflow. Do you
know how much data is involved?
From: Junior Alvarez
Sent: Wednesday, September 26, 2018 7:33 AM
To: user@spark.apache.org
Subject: spark.lapply
Hi!
I’m using spark.lapply() in Spark
Please note that SparkR code is now at https://github.com/apache/spark/tree/master/R
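For readers following along, a minimal spark.lapply call looks like this (an illustrative sketch, not from the original thread; it assumes a working SparkR installation and a running Spark session):

```r
library(SparkR)

# Start (or connect to) a Spark session
sparkR.session()

# A simple function to apply to each element; 'square' is illustrative
square <- function(x) {
  x * x
}

# Run the function over the list, distributed across executors
results <- spark.lapply(1:10, square)

sparkR.session.stop()
```

spark.lapply serializes the function and its environment to each worker, which is why a function that captures large objects can blow up the payload, as discussed below.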
From: Cinquegrana, Piero <piero.cinquegr...@neustar.biz>
Sent: Thursday, August 25, 2016 6:08 AM
Subject: RE: spark.lapply in SparkR: Error in writeBin(batch, con, endian = "big")
It looks like the serialization code tries to convert the formula to binary, exploding the size of the object being passed.
https://github.com/amplab-extras/SparkR-pkg/blob/master/pkg/R/serialize.R
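A likely contributing factor (my assumption, not stated outright in the thread): R formulas carry a reference to their enclosing environment, so serializing a formula can drag along every object defined in that environment. One common workaround is to ship the formula as a character string and rebuild it on the worker, so only the string is serialized (names below are illustrative):

```r
library(SparkR)
sparkR.session()

# Hypothetical example: pass the formula as a string so its (potentially
# large) enclosing environment is NOT serialized with the closure.
fit_one <- function(fml_string) {
  fml <- as.formula(fml_string)      # rebuilt locally on the worker
  model <- lm(fml, data = mtcars)    # mtcars ships with base R
  coef(model)[["wt"]]
}

res <- spark.lapply(list("mpg ~ wt"), fit_one)
sparkR.session.stop()
```

The same caution applies to any closure passed to spark.lapply: anything reachable from the function's environment may be serialized with it.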
From: Felix Cheung [mailto:felixcheun...@hotmail.com]
Sent: Thursday, August 25, 2016 2:35 PM
To: Cinquegrana, Piero; user@spark.apache.org
From: Cinquegrana, Piero <piero.cinquegr...@neustar.biz>
Sent: Wednesday, August 24, 2016 10:37 AM
Subject: RE: spark.lapply in SparkR: Error in writeBin(batch, con, endian =
"big")
To: Cinquegrana, Piero <piero.cinquegr...@neustar.biz>, Felix Cheung <felixcheun...@hotmail.com>
From: Cinquegrana, Piero [mailto:piero.cinquegr...@neustar.biz]
Sent: Tuesday, August 23, 2016 2:39 PM
To: Felix Cheung ; user@spark.apache.org
Subject: RE: spark.lapply in SparkR: Error in writeBin(batch, con, endian =
"big")
The output from score() is very small, just a float. The input, however, could
be as big as several hundred MBs. I would like to broadcast it.
To: Cinquegrana, Piero; user@spark.apache.org
Subject: Re: spark.lapply in SparkR: Error in writeBin(batch, con, endian =
"big")
How big is the output from score()?
Also could you elaborate on what you want to broadcast?
On Mon, Aug 22, 2016 at 11:58 AM -0700, "Cinquegrana, Piero"
<piero.cinquegr...@neustar.biz> wrote:
Hello,
I am using the new R API in SparkR, spark.lapply (Spark 2.0). I am defining a
comple