[ https://issues.apache.org/jira/browse/SPARK-1317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169629#comment-14169629 ]

Sean Owen commented on SPARK-1317:
----------------------------------

PS: if you're still interested in this, I am pretty sure #1 is the correct 
answer. I would use my own sbt (or really, the sbt support in my IDE, or 
Maven) to build my own app.
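
For reference, a minimal sketch of what such a standalone build definition 
might look like with a separately installed sbt (or an IDE's sbt support). 
The file name, app name, and Scala version are assumptions based on the 
0.9.0 release mentioned below, not copied from the quick start:

{code}
// mysparkapp/build.sbt -- hypothetical minimal build for a standalone Spark app
name := "MySparkApp"

version := "1.0"

// assumed: the Scala 2.10.x line that Spark 0.9.0 was published for
scalaVersion := "2.10.3"

// pulls the spark-core artifact for the 0.9.0 release from Maven Central
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0"

// assumed extra resolver; the 0.9.x docs list the Akka repository
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
{code}

With a file like that at the project root, running "sbt package" from the 
project directory with your own sbt should build the application jar without 
touching the launcher that ships under SPARK_HOME.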

> sbt doesn't work for building Spark programs
> --------------------------------------------
>
>                 Key: SPARK-1317
>                 URL: https://issues.apache.org/jira/browse/SPARK-1317
>             Project: Spark
>          Issue Type: Bug
>          Components: Build, Documentation
>    Affects Versions: 0.9.0
>            Reporter: Diana Carroll
>
> I don't know if this is a doc bug or a product bug, because I don't know how 
> it is supposed to work.
> The Spark quick start guide page has a section that walks you through 
> creating a "standalone" Spark app in Scala.  I think the instructions worked 
> in 0.8.1 but I can't get them to work in 0.9.0.
> The instructions have you create a directory structure in the "canonical" sbt 
> format, but do not tell you where to locate this directory.  However, after 
> setting up the structure, the tutorial then instructs you to use the command 
> {code}sbt/sbt package{code}
> which implies that the working directory must be SPARK_HOME.
> I tried it both ways: creating a "mysparkapp" directory right in SPARK_HOME 
> and creating it in my home directory.  Neither worked, with different results:
> - if I create a "mysparkapp" directory as instructed in SPARK_HOME, cd to 
> SPARK_HOME and run the command sbt/sbt package as specified, it packages ALL 
> of Spark...but does not build my own app.
> - if I create a "mysparkapp" directory elsewhere, cd to that directory, and 
> run the command there, I get an error:
> {code}
> $SPARK_HOME/sbt/sbt package
> awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for reading (No such file or directory)
> Attempting to fetch sbt
> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or directory
> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or directory
> Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please install sbt manually from http://www.scala-sbt.org/
> {code}
> So, either:
> 1: the Spark distribution of sbt can only be used to build Spark itself, not 
> your own code...in which case the quick start guide is wrong, and should 
> instead say that users should install sbt separately
> OR
> 2: the Spark distribution of sbt CAN be used, with proper configuration, in 
> which case that configuration should be documented (I wasn't able to figure 
> it out, but I didn't try that hard either)
> OR
> 3: the Spark distribution of sbt is *supposed* to be able to build Spark 
> apps, but is configured incorrectly in the product, in which case there's a 
> product bug rather than a doc bug
> Although this is not a show-stopper, because the obvious workaround is simply 
> to install sbt separately, I think at least updating the docs is a pretty 
> high priority: most people learning Spark start with that Quick Start page, 
> which doesn't work.
> (If it's doc issue #1, let me know, and I'll fix the docs myself.  :-) )
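
For context, the "canonical" sbt layout the quoted report refers to looks 
roughly like this; the directory name echoes the "mysparkapp" name from the 
report, and the file names are illustrative rather than taken from the 
quick start:

{code}
mysparkapp/
  build.sbt                # build definition (see the sketch above)
  src/
    main/
      scala/
        MySparkApp.scala   # the standalone application source
{code}

Under interpretation #1, you would cd into mysparkapp and run "sbt package" 
with a separately installed sbt. The sbt/sbt script shipped with Spark 
appears to read ./project/build.properties from the current directory (which 
is what the awk error above is complaining about), so it is effectively only 
usable from SPARK_HOME to build Spark itself.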



