We use fine-grained mode. Coarse-grained mode keeps JVMs around, which often
leads to OOMs; those in turn kill the entire executor, causing entire
stages to be retried. In fine-grained mode, only the task fails and
subsequently gets retried, without taking out an entire stage or worse.
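For reference, a minimal sketch of how the mode described above is selected (Spark-on-Mesos configuration of that era; the master URL is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Fine-grained mode was the Mesos default; coarse-grained mode is opted
// into via spark.mesos.coarse. Keeping it false means each Spark task
// runs as its own Mesos task, so an OOM kills only that task rather
// than a long-lived executor JVM.
val conf = new SparkConf()
  .setMaster("mesos://host:5050") // placeholder master URL
  .setAppName("fine-grained-example")
  .set("spark.mesos.coarse", "false")
val sc = new SparkContext(conf)
```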
On Tue, Nov
Hello,
I get a lot of these exceptions on my Mesos cluster when running Spark jobs:
14/07/19 16:29:43 WARN spark.network.SendingConnection: Error finishing
connection to prd-atl-mesos-slave-010/10.88.160.200:37586
java.net.ConnectException: Connection timed out
at
I typed "spark parquet" into Google and the top result was this blog post
about reading and writing Parquet files from Spark:
http://zenfractal.com/2013/08/21/a-powerful-big-data-trio/
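For the archives, a sketch of the Spark SQL route to Parquet (Spark 1.0-era SchemaRDD API; the `Person` case class and file path are made up for illustration):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int) // hypothetical schema

val sc = new SparkContext("local", "parquet-example")
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD // implicit RDD[Product] -> SchemaRDD

// Write an RDD of case classes out as a Parquet file...
val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 25)))
people.saveAsParquetFile("people.parquet") // placeholder path

// ...then read it back as a SchemaRDD and query it.
val loaded = sqlContext.parquetFile("people.parquet")
loaded.registerAsTable("people")
val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 18")
```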
On Mon, Jul 7, 2014 at 5:23 PM, Michael Armbrust mich...@databricks.com
wrote:
SchemaRDDs, provided by Spark
Are the Hadoop configuration files on the classpath for your Mesos
executors?
On Thu, Jul 3, 2014 at 6:45 PM, Steven Cox s...@renci.org wrote:
...and a real subject line.
--
*From:* Steven Cox [s...@renci.org]
*Sent:* Thursday, July 03, 2014 9:21 PM
*To:*
Yieldbot is pleased to announce the release of Flambo, our Clojure DSL for
Apache Spark.
Flambo allows one to write Spark applications in pure Clojure, as an
alternative to the Scala, Java, and Python APIs currently available in Spark.
We have already written a substantial amount of internal code in
Does this perhaps have to do with the spark.closure.serializer?
On Sat, May 3, 2014 at 7:50 AM, Soren Macbeth so...@yieldbot.com wrote:
Poking around in the bowels of Scala, it seems like this has something to
do with implicit Scala-to-Java collection munging. Why would it be doing
Is this supposed to be supported? It doesn't work, at least in Mesos
fine-grained mode. First it fails a bunch of times because it can't find my
registrator class, since my assembly jar hasn't been fetched, like so:
java.lang.ClassNotFoundException: pickles.kryo.PicklesRegistrator
at
it seems that it is dying while trying to fetch results from my tasks to
return to the driver.
Am I close?
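For context, a sketch of the kind of Kryo registrator setup being attempted here (the registrator name comes from the stack trace above; the registered class and jar path are placeholders):

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.SparkConf
import org.apache.spark.serializer.KryoRegistrator

// A registrator along the lines of pickles.kryo.PicklesRegistrator;
// the class registered here is just an example.
class PicklesRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo) {
    kryo.register(classOf[Array[Byte]])
  }
}

// The registrator is instantiated by name on each executor, so the
// assembly jar containing it must reach the executors first -- which
// is exactly what fails when the jar hasn't been fetched yet.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .set("spark.kryo.registrator", "pickles.kryo.PicklesRegistrator")
  .setJars(Seq("/path/to/assembly.jar")) // placeholder path
```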
On Fri, May 2, 2014 at 3:35 PM, Soren Macbeth so...@yieldbot.com wrote:
Hallo,
I've been getting this rather crazy Kryo exception trying to run my Spark job:
Exception in thread main
Hello,
Is it possible to use a custom class as my Spark KryoSerializer when running
under Mesos?
I've tried adding my jar containing the class to my Spark context (via
SparkConf.addJars), but I always get:
java.lang.ClassNotFoundException: flambo.kryo.FlamboKryoSerializer
at
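For reference, a sketch of what a custom serializer setup like this might look like (the class name comes from the exception above; the Kryo tweak shown and the jar path are hypothetical):

```scala
import com.esotericsoftware.kryo.Kryo
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.serializer.KryoSerializer

// A custom serializer along the lines of flambo.kryo.FlamboKryoSerializer,
// extending Spark's KryoSerializer; the override shown is just an example.
class FlamboKryoSerializer(conf: SparkConf) extends KryoSerializer(conf) {
  override def newKryo(): Kryo = {
    val kryo = super.newKryo()
    kryo.setRegistrationRequired(false) // example customization
    kryo
  }
}

// spark.serializer is resolved by name on each executor, so the jar
// containing the class must reach the executors, otherwise the
// ClassNotFoundException above is exactly what you see.
val conf = new SparkConf()
  .set("spark.serializer", "flambo.kryo.FlamboKryoSerializer")
  .setJars(Seq("/path/to/flambo-assembly.jar")) // placeholder path
val sc = new SparkContext(conf)
```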