I was able to set up Spark in Eclipse using the Scala IDE plugin.
I also got unit tests running with ScalaTest, which makes development quick
and easy.
I wanted to document the setup steps in this wiki page:
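A minimal ScalaTest suite of the kind that runs inside Eclipse once the Scala IDE plugin and the scalatest jar are on the project's classpath might look like the sketch below; the suite name and the tested expressions are illustrative, not from the original message.

```scala
import org.scalatest.FunSuite

// Hypothetical suite: counts word occurrences in a small collection and
// checks the result. Runs under any ScalaTest runner (Eclipse, sbt, Maven).
class SimpleSuite extends FunSuite {

  test("word count over a small collection") {
    val words = Seq("spark", "scala", "spark")
    val counts = words.groupBy(identity).mapValues(_.size)
    assert(counts("spark") === 2)
    assert(counts("scala") === 1)
  }
}
```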
Hi Patrick,
Thanks for all the explanations; that makes sense. @DeveloperApi
worries me a little bit especially because of the things Colin
mentions - it's sort of hard to make people move off of APIs, or
support different versions of the same API. But maybe if expectations
(or lack thereof) are
On Mon, Jun 2, 2014 at 6:05 PM, Marcelo Vanzin van...@cloudera.com wrote:
You mentioned something in your shading argument that kinda reminded
me of something. Spark currently depends on slf4j implementations and
log4j with compile scope. I'd argue that's the wrong approach if
we're talking
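The scope change being argued for can be sketched as sbt dependency declarations like the ones below; the version numbers are illustrative, and "provided" here stands in for whatever non-compile scope the project settles on.

```scala
// Sketch: keep the slf4j API at compile scope (it is part of Spark's public
// surface) but mark the concrete logging backends "provided", so the
// application — not Spark — supplies the slf4j binding and log4j at runtime.
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api"     % "1.7.5",
  "org.slf4j" % "slf4j-log4j12" % "1.7.5"  % "provided",
  "log4j"     % "log4j"         % "1.2.17" % "provided"
)
```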
Is there a way to specify the target version? -Xiangrui
Madhu, can you send me your Wiki username? (Sending it just to me is fine.) I
can add you to the list to edit it.
Matei
On Jun 2, 2014, at 6:27 PM, Reynold Xin r...@databricks.com wrote:
I tried but didn't find where I could add you. You probably need Matei to
help out with this.
On
Yeah - check out sparkPreviousArtifact in the build:
https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L325
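The pattern behind that link is roughly the sbt/MiMa setting sketched below; the helper signature and the version shown are a reconstruction for illustration, not a copy of the build file.

```scala
// Sketch of the binary-compatibility setup: each module names the previously
// released artifact that MiMa should diff the current build against.
def sparkPreviousArtifact(id: String,
    organization: String = "org.apache.spark",
    version: String = "1.0.0"): Option[ModuleID] =
  Some(organization % (id + "_2.10") % version)

// e.g. in spark-core's settings:
previousArtifact := sparkPreviousArtifact("spark-core")
```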
- Patrick
On Mon, Jun 2, 2014 at 5:30 PM, Xiangrui Meng men...@gmail.com wrote:
Is there a way to specify the target version? -Xiangrui
Quite often I notice that a shuffle file is missing and a FileNotFoundException
is thrown.
Any idea why a shuffle file would be missing? Am I running low on memory?
(I am using latest code from master branch on yarn-hadoop-2.2)
--
java.io.FileNotFoundException:
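If memory pressure is the suspect, one thing to try before resubmitting is raising executor memory and consolidating shuffle output. This is a sketch, not a confirmed fix; the values are examples only.

```scala
import org.apache.spark.SparkConf

// Illustrative settings for ruling out memory pressure and reducing the
// number of shuffle files written per map task on YARN.
val conf = new SparkConf()
  .setAppName("shuffle-debug")
  .set("spark.executor.memory", "4g")            // more headroom per executor
  .set("spark.shuffle.consolidateFiles", "true") // fewer shuffle files on disk
```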