Thanks for your help, everyone.  Several folks have explained that I can
surely solve the problem by installing sbt.

But I'm trying to get the instructions working *as written on the Spark
website*.  The instructions not only don't have you install sbt
separately...they actually specifically have you use the sbt that is
distributed with Spark.

If it is not possible to build your own Spark programs with the
Spark-distributed sbt, then that's a big hole in the Spark docs, and I'll
file a docs bug.  And if the sbt that is included with Spark is MEANT to be
able to compile your own Spark apps, then that's a product bug.

But before I file the bug, I'm still hoping someone will point out a small
step I'm missing that will make the Spark-distributed sbt work!

Diana



On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska <yana.kadiy...@gmail.com> wrote:

> Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
> (since like other folks I had sbt preinstalled on my "usual" machine)
>
> I ran the command exactly as Ognen suggested and see
> "Set current project to Simple Project" (do you see this -- you should
> at least be seeing this) and then a bunch of "Resolving ..." messages.
>
> I did get an error there, saying it can't find javax.servlet.orbit. I
> googled the error and found this thread:
>
>
> http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E
>
> Adding the ivyXML fragment they suggested helped in my case (but
> again, the build pretty clearly complained).
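>
> In case it saves you a lookup, the fragment is roughly of this shape;
> the exact Jetty Orbit coordinates below are my assumption, so take the
> version from the linked thread rather than from here:
>
> // tell Ivy how to resolve the "orbit"-packaged javax.servlet artifact
> ivyXML :=
>   <dependency org="org.eclipse.jetty.orbit" name="javax.servlet"
>       rev="3.0.0.v201112011016">
>     <artifact name="javax.servlet" type="orbit" ext="jar"/>
>   </dependency>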
>
> If you're still having no luck, I suggest installing sbt and setting
> SBT_HOME... http://www.scala-sbt.org/
>
> In either case though, it's not a Spark-specific issue... Hopefully
> some of this helps.
>
> On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll <dcarr...@cloudera.com>
> wrote:
> > Yeah, that's exactly what I did. Unfortunately it doesn't work:
> >
> > $SPARK_HOME/sbt/sbt package
> > awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
> > for reading (No such file or directory)
> > Attempting to fetch sbt
> > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > directory
> > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > directory
> > Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. Please
> > install sbt manually from http://www.scala-sbt.org/
> >
> >
> >
> > On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
> > <og...@plainvanillagames.com> wrote:
> >>
> >> You can use any sbt on your machine, including the one that comes with
> >> Spark. For example, try:
> >>
> >> ~/path_to_spark/sbt/sbt compile
> >> ~/path_to_spark/sbt/sbt run <arguments>
> >>
> >> Or you can just add that to your PATH by:
> >>
> >> export PATH=$PATH:~/path_to_spark/sbt
> >>
> >> To make it permanent, you can add it to your ~/.bashrc or
> >> ~/.bash_profile or ??? depending on the system you are using. If you
> >> are on Windows, sorry, I can't offer any help there ;)
> >>
> >> Ognen
> >>
> >>
> >> On 3/24/14, 3:16 PM, Diana Carroll wrote:
> >>
> >> Thanks Ognen.
> >>
> >> Unfortunately I'm not able to follow your instructions either.  In
> >> particular:
> >>>
> >>>
> >>> sbt compile
> >>> sbt run <arguments if any>
> >>
> >>
> >> This doesn't work for me because there's no program on my path called
> >> "sbt".  The instructions in the Quick Start guide are specific that I
> >> should call "$SPARK_HOME/sbt/sbt".  I don't have any other executable
> >> on my system called "sbt".
> >>
> >> Did you download and install sbt separately?  In following the Quick
> >> Start guide, that was not stated as a requirement, and I'm trying to
> >> run through the guide word for word.
> >>
> >> Diana
> >>
> >>
> >> On Mon, Mar 24, 2014 at 4:12 PM, Ognen Duzlevski
> >> <og...@plainvanillagames.com> wrote:
> >>>
> >>> Diana,
> >>>
> >>> Anywhere on the filesystem where you have read/write access (you need
> >>> not be in your Spark home directory):
> >>>
> >>> mkdir myproject
> >>> cd myproject
> >>> mkdir project
> >>> mkdir target
> >>> mkdir -p src/main/scala
> >>> cp $mypath/$mysource.scala src/main/scala/
> >>> cp $mypath/myproject.sbt .
> >>>
> >>> Make sure that myproject.sbt has the following in it:
> >>>
> >>> name := "I NEED A NAME!"
> >>>
> >>> version := "I NEED A VERSION!"
> >>>
> >>> scalaVersion := "2.10.3"
> >>>
> >>> libraryDependencies += "org.apache.spark" % "spark-core_2.10" %
> >>> "0.9.0-incubating"
> >>>
> >>> If you will be using Hadoop/HDFS functionality, you will also need
> >>> the line below:
> >>>
> >>> libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.2.0"
> >>>
> >>> The above assumes you are using Spark 0.9 and Scala 2.10.3. If you are
> >>> using 0.8.1, adjust appropriately.
> >>>
> >>> That's it. Now you can do
> >>>
> >>> sbt compile
> >>> sbt run <arguments if any>
> >>>
> >>> You can also do
> >>> sbt package
> >>> to produce a jar file of your code, which you can then add to the
> >>> SparkContext at runtime.
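> >>>
> >>> As a rough sketch of what that looks like in code: the jar name below
> >>> is hypothetical (sbt derives it from your name/version settings, so
> >>> look in target/scala-2.10/ for the actual file), and this uses the 0.9
> >>> SparkContext constructor that accepts a list of jars.
> >>>
> >>> import org.apache.spark.SparkContext
> >>>
> >>> object MyApp {
> >>>   def main(args: Array[String]) {
> >>>     // jars listed here are shipped to the executors at runtime
> >>>     val sc = new SparkContext("local", "My App", "/path/to/spark",
> >>>       List("target/scala-2.10/myproject_2.10-1.0.jar"))
> >>>     println("line count: " + sc.textFile("README.md").count())
> >>>     sc.stop()
> >>>   }
> >>> }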
> >>>
> >>> In a more complicated project you may need a more involved package
> >>> hierarchy, like com.github.dianacarroll, which will then translate to
> >>> src/main/scala/com/github/dianacarroll/. There you can put your
> >>> multiple .scala files, which will then have to be part of the package
> >>> com.github.dianacarroll (you can just put that as the first line in
> >>> each of these Scala files). I am new to Java/Scala so this is how I do
> >>> it; more educated Java/Scala programmers may tell you otherwise ;)
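> >>>
> >>> For example, a hypothetical file under that layout would just start
> >>> with the package line:
> >>>
> >>> // src/main/scala/com/github/dianacarroll/Hello.scala
> >>> package com.github.dianacarroll
> >>>
> >>> object Hello {
> >>>   def main(args: Array[String]) {
> >>>     println("hello from a packaged object")
> >>>   }
> >>> }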
> >>>
> >>> You can get more complicated with the sbt project subdirectory, but
> >>> you can read about sbt and what it can do independently; the above is
> >>> the bare minimum.
> >>>
> >>> Let me know if that helped.
> >>> Ognen
> >>>
> >>>
> >>> On 3/24/14, 2:44 PM, Diana Carroll wrote:
> >>>>
> >>>> Has anyone successfully followed the instructions on the Quick Start
> >>>> page of the Spark website to run a "standalone" Scala application?
> >>>> I can't, and I figure I must be missing something obvious!
> >>>>
> >>>> I'm trying to follow the instructions here as close to "word for
> >>>> word" as possible:
> >>>>
> >>>> http://spark.apache.org/docs/latest/quick-start.html#a-standalone-app-in-scala
> >>>>
> >>>> 1.  The instructions don't say what directory to create my test
> >>>> application in, but later I'm instructed to run "sbt/sbt", so I
> >>>> conclude that my working directory must be $SPARK_HOME.
> >>>> (Temporarily ignoring that it is a little weird to be working
> >>>> directly in the Spark distro.)
> >>>>
> >>>> 2.  Create $SPARK_HOME/mysparktest/src/main/scala/SimpleApp.scala.
> >>>> Copy & paste in the code from the instructions exactly, replacing
> >>>> YOUR_SPARK_HOME with my Spark home path (a sketch of roughly what
> >>>> that file contains appears after step 4).
> >>>>
> >>>> 3.  Create $SPARK_HOME/mysparktest/simple.sbt.  Copy & paste in the
> >>>> sbt file from the instructions.
> >>>>
> >>>> 4.  From $SPARK_HOME I run "sbt/sbt package".  It runs through the
> >>>> ENTIRE Spark project!  This takes several minutes, and at the end it
> >>>> says "Done packaging".  Unfortunately, there's nothing in the
> >>>> $SPARK_HOME/mysparktest/ folder other than what I already had there.
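> >>>>
> >>>> For reference, here is roughly what my SimpleApp.scala contains.
> >>>> This is reconstructed from the guide rather than copied verbatim, so
> >>>> treat the exact jar path and string literals as assumptions and defer
> >>>> to the page itself:
> >>>>
> >>>> import org.apache.spark.SparkContext
> >>>> import org.apache.spark.SparkContext._
> >>>>
> >>>> object SimpleApp {
> >>>>   def main(args: Array[String]) {
> >>>>     // YOUR_SPARK_HOME replaced with my actual Spark home path
> >>>>     val logFile = "YOUR_SPARK_HOME/README.md"
> >>>>     val sc = new SparkContext("local", "Simple App", "YOUR_SPARK_HOME",
> >>>>       List("target/scala-2.10/simple-project_2.10-1.0.jar"))
> >>>>     val logData = sc.textFile(logFile, 2).cache()
> >>>>     val numAs = logData.filter(line => line.contains("a")).count()
> >>>>     val numBs = logData.filter(line => line.contains("b")).count()
> >>>>     println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
> >>>>   }
> >>>> }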
> >>>>
> >>>> (Just for fun, I also did what I thought was more logical, which is
> >>>> to set my working directory to $SPARK_HOME/mysparktest and run
> >>>> $SPARK_HOME/sbt/sbt package, but that was even less successful: I got
> >>>> an error:
> >>>> awk: cmd. line:1: fatal: cannot open file `./project/build.properties'
> >>>> for reading (No such file or directory)
> >>>> Attempting to fetch sbt
> >>>> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> >>>> directory
> >>>> /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> >>>> directory
> >>>> Our attempt to download sbt locally to sbt/sbt-launch-.jar failed.
> >>>> Please install sbt manually from http://www.scala-sbt.org/
> >>>>
> >>>>
> >>>> So, help?  I'm sure these instructions work because people are
> >>>> following them every day, but I can't tell what they are supposed to
> >>>> do.
> >>>>
> >>>> Thanks!
> >>>> Diana
> >>>
> >>>
> >>
> >>
> >> --
> >> "A distributed system is one in which the failure of a computer you
> didn't
> >> even know existed can render your own computer unusable"
> >> -- Leslie Lamport
> >
> >
>
