This problem turned out to be a cockpit error. I had the same class name
defined in a couple different files, and didn't realize SBT was compiling
them all together, and then executing the "wrong" one. Mea culpa.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble
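The failure mode described above can be sketched in plain Scala. This is a minimal, hypothetical example (the package and object names are mine, not from the original project): two objects with the same simple name compile side by side without complaint, and only the fully qualified name determines which one you get — so invoking the "wrong" qualified name runs the wrong code.

```scala
// Hypothetical sketch: the same object name defined twice, as if in two
// source files. SBT compiles both into the jar without complaint; which
// one runs depends on which fully qualified name is invoked.
package jobs.v1 { object Histogram { def describe() = "v1.Histogram" } }
package jobs.v2 { object Histogram { def describe() = "v2.Histogram" } }

object Demo extends App {
  // Only the fully qualified name disambiguates the two definitions.
  assert(jobs.v1.Histogram.describe() == "v1.Histogram")
  assert(jobs.v2.Histogram.describe() == "v2.Histogram")
  println("both same-named objects coexist in one classpath")
}
```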
Yes, good catch. I also realized, after I posted, that I was calling 2
different classes, though they are in the same JAR. I went back and tried
it again with the same class in both cases, and it failed the same way. I
thought perhaps having 2 classes in a JAR was an issue, but commenting out [...]
From: Tobias Pfeiffer <t...@preferred.jp>
Am I right that you are actually executing two different classes here?
Yes, I realized after I posted that I was calling 2 different classes, though
they are in the same JAR. I went back and tried it again with the same class
in both cases, and it failed the same way.
On Tue, Nov 4, 2014 at 6:12 AM, spr wrote:
>
> sbt "runMain
> com.cray.examples.spark.streaming.cyber.StatefulDhcpServerHisto
> -f /Users/spr/Documents/<...>/tmp/ -t 10"
>
> [...]
>
> $S/bin/spark-submit --master local[12] --class StatefulNewDhcpServers
> target/scala-2.10/newd*jar -f /Users/sp [...]
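One detail worth noticing in the two commands above: `runMain` is given a fully qualified name (`com.cray.examples.spark.streaming.cyber...`), while `spark-submit --class` is given the bare `StatefulNewDhcpServers`. JVM class lookup matches the exact fully qualified name only, so a bare simple name resolves only if the class really lives in the default package. A small sketch (standard `Class.forName`, nothing project-specific):

```scala
// Sketch: JVM class lookup is by exact fully qualified name.
object LookupDemo extends App {
  // The fully qualified name resolves.
  assert(Class.forName("java.lang.String").getSimpleName == "String")

  // A bare simple name does not, unless the class is in the default package.
  val bareNameFails =
    try { Class.forName("String"); false }
    catch { case _: ClassNotFoundException => true }
  assert(bareNameFails)
  println("lookup requires the exact fully qualified name")
}
```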
P.S. I believe I am creating output from the Spark Streaming app, and thus
not falling into the "no-output, no-execution" pitfall, as at the end I have
newServers.print()
newServers.saveAsTextFiles("newServers","out")
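The "no-output, no-execution" pitfall mentioned in the P.S. comes from Spark's lazy evaluation: transformations build a pipeline but run nothing until an output operation (such as `print()` or `saveAsTextFiles()`) forces it. The same shape can be seen with a plain Scala lazy view — a loose analogy, not Spark itself:

```scala
// Analogy for Spark's laziness using a plain Scala view (not Spark):
// the map body does not run until something forces the view.
object LazyDemo extends App {
  var executed = false
  val pipeline = List(1, 2, 3).view.map { x => executed = true; x * 2 }

  // Nothing has run yet -- like a streaming job with no output operation.
  assert(!executed)

  // Forcing the view plays the role of print()/saveAsTextFiles().
  val result = pipeline.toList
  assert(executed)
  assert(result == List(2, 4, 6))
  println("transformations ran only when forced")
}
```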
I have a Spark Streaming program that works fine if I execute it via
sbt "runMain com.cray.examples.spark.streaming.cyber.StatefulDhcpServerHisto
-f /Users/spr/Documents/<...>/tmp/ -t 10"
but if I start it via
$S/bin/spark-submit --master local[12] --class StatefulNewDhcpServers
target/scala-2 [...]