How does sbt know which specific location to use?
And though it went smoothly, I didn't see that any jar had been created.
Please help.
Thanks,
Christy
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/quick-start-guide-building-a-standalone-scala-program-tp3116p15120
I encountered exactly the same problem. How did you solve this?
Thanks
Sent from the Apache Spark User List mailing list archive
Has anyone successfully followed the instructions on the Quick Start page
of the Spark home page to run a standalone Scala application? I can't,
and I figure I must be missing something obvious!
I'm trying to follow the instructions here as close to word-for-word as
possible:
Hi, Diana,
See my answers inline.
--
Nan Zhu
On Monday, March 24, 2014 at 3:44 PM, Diana Carroll wrote:
Has anyone successfully followed the instructions on the Quick Start page of
the Spark home page to run a standalone Scala application? I can't, and I
figure I must be missing
I am able to run standalone apps. I think you are making one mistake
that throws you off from there onwards. You don't need to put your app
under SPARK_HOME. I would create it in its own folder somewhere, it
follows the rules of any standalone Scala program (including the
layout). In the guide,
Yana: Thanks. Can you give me a transcript of the actual commands you are
running?
Thanks!
Diana
On Mon, Mar 24, 2014 at 3:59 PM, Yana Kadiyska yana.kadiy...@gmail.com wrote:
I am able to run standalone apps. I think you are making one mistake
that throws you off from there onwards. You
Diana,
Anywhere on the filesystem you have read/write access (you need not be
in your spark home directory):
mkdir myproject
cd myproject
mkdir project
mkdir target
mkdir -p src/main/scala
cp $mypath/$mysource.scala src/main/scala/
cp $mypath/myproject.sbt .
Make sure that myproject.sbt
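Yana's note about what myproject.sbt should contain is cut off above. For reference, a minimal build definition of the kind the Quick Start guide describes looks roughly like this — a sketch only; the project name and version numbers are assumptions matching the 0.9.0-incubating release discussed elsewhere in this thread:

```scala
// myproject.sbt -- minimal sbt build definition (sketch; versions are
// assumed to match the Spark 0.9.0-incubating release in this thread)
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

// sbt resolves the Spark dependency from a repository, which is why the
// project does not need to live under SPARK_HOME.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```

With this file in the project root, running `sbt package` from myproject/ should produce a jar under target/.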
Thanks, Nan Zhu.
You say that my problems are because "you are in Spark directory, don't
need to do that actually, the dependency on Spark is resolved by sbt".
I did try it initially in what I thought was a much more typical place,
e.g. ~/mywork/sparktest1. But as I said in my email:
(Just for
Thanks Ongen.
Unfortunately I'm not able to follow your instructions either. In
particular:
sbt compile
sbt run arguments if any
This doesn't work for me because there's no program on my path called
sbt. The instructions in the Quick Start guide specifically say that I
should call
Yeah, that's exactly what I did. Unfortunately it doesn't work:
$SPARK_HOME/sbt/sbt package
awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Attempting to fetch sbt
/usr/lib/spark/sbt/sbt: line 33: sbt/sbt-launch-.jar: No such file or
Ah crud, I guess you are right, I am using the sbt I installed manually
with my Scala installation.
Well, here is what you can do:
mkdir ~/bin
cd ~/bin
wget
http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/0.13.1/sbt-launch.jar
vi sbt
Put the following contents into
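The contents of that sbt script are truncated in the archive. The conventional wrapper from sbt's own manual-setup instructions of that era is a short script along these lines — a sketch, not a reconstruction of the exact message; the memory flags are the commonly suggested defaults:

```shell
#!/bin/bash
# Minimal sbt launcher wrapper (sketch): passes all arguments through to
# the sbt-launch.jar downloaded into ~/bin above.
java -Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled \
  -jar ~/bin/sbt-launch.jar "$@"
```

followed by `chmod +x ~/bin/sbt` and adding ~/bin to PATH, so that plain `sbt` works from any project directory.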
Hi, Diana,
You don’t need to use the Spark-distributed sbt;
just download sbt from its official website and set your PATH to the right place.
Best,
--
Nan Zhu
On Monday, March 24, 2014 at 4:30 PM, Diana Carroll wrote:
Yeah, that's exactly what I did. Unfortunately it doesn't work:
Thanks for your help, everyone. Several folks have explained that I can
surely solve the problem by installing sbt.
But I'm trying to get the instructions working *as written on the Spark
website*. The instructions not only don't have you install sbt
separately...they actually specifically have
Diana, I think you are correct - I just installed
wget
http://mirror.symnds.com/software/Apache/incubator/spark/spark-0.9.0-incubating/spark-0.9.0-incubating-bin-cdh4.tgz
and indeed I see the same error that you see
It looks like in previous versions sbt-launch used to just come down
in the
I realize that I never read the document carefully, and I can't find anywhere that the
Spark documentation suggests using the Spark-distributed sbt……
Best,
--
Nan Zhu
On Monday, March 24, 2014 at 5:47 PM, Diana Carroll wrote:
Thanks for your help, everyone. Several folks have explained that I can
It is suggested implicitly by giving you the command ./sbt/sbt. The
separately installed sbt isn't in a folder called sbt, whereas Spark's
version is. And more relevantly, just a few paragraphs earlier in the
tutorial you execute the command sbt/sbt assembly which definitely refers
to the spark
Yes, actually even for Spark, I mostly use the sbt I installed…..so I always
missed this issue….
If you can reproduce the problem with the Spark-distributed sbt, I suggest
proposing a PR to fix the document before 0.9.1 is officially released.
Best,
--
Nan Zhu
On Monday, March 24, 2014