Ah, I got it.  Makes sense, thank you.
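For anyone who finds this thread later: the fix is simply to prepend "mesos://" to the ZooKeeper address and pass the combined URI as the Spark master. A rough sketch of what that looks like when constructing the SparkContext (the app name, Spark home, and jar path below are just placeholders):

    import org.apache.spark.SparkContext

    // "mesos://" prepended to the same zk:// address that Chronos uses
    val master = "mesos://zk://zk-01:2181,zk-02:2181,zk-03:2181/masters"

    // Placeholder app name, Spark home, and jar path
    val sc = new SparkContext(master, "MyApp", "/path/to/spark", Seq("/path/to/app.jar"))

For the shell, I believe the same combined URI can be supplied via the MASTER environment variable.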

On Fri, Dec 20, 2013 at 3:58 PM, Jey Kottalam <[email protected]> wrote:

> That URI is to be passed to your Spark application as the URI of the
> Spark Master. It has nothing to do with Mesos itself.
>
> -Jey
>
> On Fri, Dec 20, 2013 at 12:50 PM, Gary Malouf <[email protected]>
> wrote:
> > As of right now in the Mesos docs, there is no mention of putting
> > 'mesos://' in front of that:
> >
> > http://mesos.apache.org/documentation/latest/high-availability/
> >
> > This should hold for versions up to 0.14.1.
> >
> >
> > On Fri, Dec 20, 2013 at 3:30 PM, Jey Kottalam <[email protected]> wrote:
> >>
> >> I haven't checked this, but I believe the Mesos+ZooKeeper URI format
> >> has changed to require "mesos://" to be prepended to the URI, e.g.:
> >> mesos://zk://zk-01:2181,zk-02:2181,zk-03:2181/masters
> >>
> >> On Fri, Dec 20, 2013 at 12:01 PM, Gary Malouf <[email protected]>
> >> wrote:
> >> > For the past few months, when using Spark or the shell, I have been
> >> > explicitly pointing to a single Mesos master.  In reality, we have Mesos
> >> > configured with a primary and backup master, which another application
> >> > (Chronos) respects.  A URL like the following is used in Chronos but
> >> > results in an error when creating a SparkContext:
> >> >
> >> > org.apache.spark.SparkException: Could not parse Master URL:
> >> > 'zk://zk-01:2181,zk-02:2181,zk-03:2181/masters'
> >> >   at org.apache.spark.SparkContext.<init>(SparkContext.scala:242)
> >> >   at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:863)
> >> >   at $line1.$read$$iwC$$iwC.<init>(<console>:10)
> >> >   at $line1.$read$$iwC.<init>(<console>:22)
> >> >   at $line1.$read.<init>(<console>:24)
> >> >   at $line1.$read$.<init>(<console>:28)
> >> >   at $line1.$read$.<clinit>(<console>)
> >> >   at $line1.$eval$.<init>(<console>:7)
> >> >   at $line1.$eval$.<clinit>(<console>)
> >> >   at $line1.$eval.$export(<console>)
> >> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >> >   at java.lang.reflect.Method.invoke(Method.java:601)
> >> >   at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
> >> >   at org.apache.spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:897)
> >> >   at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
> >> >   at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
> >> >   at java.lang.Thread.run(Thread.java:722)
> >> >
> >> > Failed to create SparkContext, exiting...
> >> >
> >> >
> >> > I tried this address based on the documentation in
> >> > http://spark.incubator.apache.org/docs/latest/running-on-mesos.html
> >> >
> >> >
> >> > After looking at the master regex parsing in SparkContext on the master
> >> > branch, however, I do not see how it could support the ZooKeeper master
> >> > lookup.
> >> >
> >> >
> >> >
> >> > https://github.com/apache/incubator-spark/blob/4b895013cc965b37d44fd255656da470a3d2c222/core/src/main/scala/org/apache/spark/SparkContext.scala
> >> >
> >> > Can anyone clarify whether masters stored within ZooKeeper are actually
> >> > supported, and if so, how to connect this way?
> >
> >
>
