Yes, actually even for Spark I mostly use the sbt I installed myself... so I 
always missed this issue.

If you can reproduce the problem with the Spark-distributed sbt, I suggest 
proposing a PR to fix the documentation before 0.9.1 is officially released.  

Best,  

--  
Nan Zhu



On Monday, March 24, 2014 at 8:34 PM, Diana Carroll wrote:

> It is implicitly suggested by giving you the command "./sbt/sbt". The 
> separately installed sbt isn't in a folder called sbt, whereas Spark's 
> version is.  And more relevantly, just a few paragraphs earlier in the 
> tutorial you execute the command "sbt/sbt assembly", which definitely refers 
> to the Spark install.  
>  
> On Monday, March 24, 2014, Nan Zhu <zhunanmcg...@gmail.com> wrote:
> > I realize that I never read the document carefully, and I never noticed 
> > that the Spark documentation suggests using the Spark-distributed sbt...  
> >  
> > Best,
> >  
> > --  
> > Nan Zhu
> >  
> >  
> >  
> > On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote:
> >  
> > > Thanks for your help, everyone.  Several folks have explained that I can 
> > > surely solve the problem by installing sbt.
> > >  
> > > But I'm trying to get the instructions working as written on the Spark 
> > > website.  The instructions not only don't have you install sbt 
> > > separately...they actually specifically have you use the sbt that is 
> > > distributed with Spark.  
> > >  
> > > If it is not possible to build your own Spark programs with the 
> > > Spark-distributed sbt, then that's a big hole in the Spark docs, and I 
> > > shall file a docs bug.  And if the sbt that is included with Spark is 
> > > MEANT to be able to compile your own Spark apps, then that's a product 
> > > bug.  
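> > >  
> > > For reference, here is roughly what the Quick Start has you put in 
> > > simple.sbt -- a minimal sketch from memory, so the exact Scala and 
> > > Spark versions may differ from what the current docs say:
> > >  
> > > name := "Simple Project"
> > >  
> > > version := "1.0"
> > >  
> > > scalaVersion := "2.10.3"
> > >  
> > > libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"
> > >  
> > > resolvers += "Akka Repository" at "http://repo.akka.io/releases/"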
> > >  
> > > But before I file the bug, I'm still hoping I'm missing something, and 
> > > someone will point out that I'm missing a small step that will make the 
> > > Spark distribution of sbt work!
> > >  
> > > Diana
> > >  
> > >  
> > >  
> > > On Mon, Mar 24, 2014 at 4:52 PM, Yana Kadiyska <yana.kadiy...@gmail.com> 
> > > wrote:
> > > > Diana, I just tried it on a clean Ubuntu machine, with Spark 0.8
> > > > (since like other folks I had sbt preinstalled on my "usual" machine)
> > > >  
> > > > I ran the command exactly as Ognen suggested and saw "Set current 
> > > > project to Simple Project" (do you see this? you should at least be 
> > > > seeing this), and then a bunch of "Resolving ..." messages.
> > > >  
> > > > I did get an error there, saying it can't find javax.servlet.orbit. I 
> > > > googled the error and found this thread:
> > > >  
> > > > http://mail-archives.apache.org/mod_mbox/spark-user/201309.mbox/%3ccajbo4nexyzqe6zgreqjtzzz5zrcoavfen+wmbyced6n1epf...@mail.gmail.com%3E
> > > >  
> > > > adding the IvyXML fragment they suggested helped in my case (but
> > > > again, the build pretty clearly complained).
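> > > >  
> > > > In case that link goes stale: the fragment goes in build.sbt and 
> > > > looked roughly like this (I'm quoting the org/name/rev from memory, 
> > > > so double-check them against the thread):
> > > >  
> > > > ivyXML :=
> > > >   <dependency org="org.eclipse.jetty.orbit" name="javax.servlet" rev="3.0.0.v201112011016">
> > > >     <artifact name="javax.servlet" type="orbit" ext="jar"/>
> > > >   </dependency>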
> > > >  
> > > > If you're still having no luck, I suggest installing sbt and setting
> > > > SBT_HOME... http://www.scala-sbt.org/
> > > >  
> > > > In either case, though, it's not a Spark-specific issue... Hopefully 
> > > > some of this helps.
> > > >  
> > > > On Mon, Mar 24, 2014 at 4:30 PM, Diana Carroll <dcarr...@cloudera.com> 
> > > > wrote:
> > > > > Yeah, that's exactly what I did. Unfortunately it doesn't work:
> > > > >
> > > > > $SPARK_HOME/sbt/sbt package
> > > > > awk: cmd. line:1: fatal: cannot open file 
> > > > > `./project/build.properties' for
> > > > > reading (No such file or directory)
> > > > > Attempting to fetch sbt
> > > > > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > > > > directory
> > > > > /usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
> > > > > directory
> > > > > Our attempt to download sbt locally to sbt/sbt-launch-.jar failed. 
> > > > > Please
> > > > > install sbt manually from http://www.scala-sbt.org/
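> > > > >
> > > > > Reading that output, the script is apparently pulling sbt.version 
> > > > > out of ./project/build.properties in the current directory and 
> > > > > coming up empty -- hence the versionless sbt-launch-.jar it tries 
> > > > > to fetch. If that's right, a one-line project/build.properties in 
> > > > > my own project directory, something like
> > > > >
> > > > > sbt.version=0.12.4
> > > > >
> > > > > (0.12.4 is just a guess at the version Spark expects), might get 
> > > > > past this, though that would be a workaround rather than what the 
> > > > > docs describe.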
> > > > >
> > > > >
> > > > >
> > > > > On Mon, Mar 24, 2014 at 4:25 PM, Ognen Duzlevski
> > > > > <og...@plainvanillagames.com> wrote:
> > > > >>
> > > > >> You can use any sbt on your machine, including the one that comes 
> > > > >> with
> > > > >> Spark. For example, try:
> > > > >>
> > > > >> ~/path_to_spark/sbt/sbt compile
> > > > >> ~/path_to_spark/sbt/sbt run <arguments>
> > > > >>
> > > > >> Or you can just add that to your PATH by:
> > > > >>
> > > > >> export PATH=$PATH:~/path_to_spark/sbt
> > > > >>
> > > > >> To make it permanent, you can add it to your ~/.bashrc or 
> > > > >> ~/.bash_profile
> > > > >> or ??? depending on the system you are using. If you are on Windows, 
> > > > >> sorry,
> > > > >> I can't offer any help there ;)
> > > > >>
> > > > >> Ognen
> > > > >>
> > > > >>
> > > > >> On 3/24/14, 3:16 PM, Diana Carroll wrote:
> > > > >>
> > > > >> Thanks, Ognen.
> > > > >>
> > > > >> Unfortunately I'm not able to follow your instructions either.  In
> > > > >> particular:
> > > > >>>
> > > > >>>
> > > > >>> sbt compile
> > > > >>> sbt run <arguments if any>
> > > > >>
> > > > >>
> > > > >> This doesn't work for me because there's no program on my path called
> > > > >> "sbt".  The instructions in the Quick Start guide are specific that 
> > > > >> I should use the sbt distributed with Spark.
