Understood.

However, the previous default was a local directory. Now the user has to
specify the file:// scheme.
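
For example, with the directory from Stephen's report:

  # worked in Spark 1.2:
  spark.eventLog.dir=/mnt/spark/work/history

  # what master expects now:
  spark.eventLog.dir=file:///mnt/spark/work/history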

Maybe add a release note to SPARK-2261?

Cheers

On Sat, Jan 31, 2015 at 8:40 AM, Sean Owen <so...@cloudera.com> wrote:

> This might have been on purpose, since the goal is to make this
> HDFS-friendly, and of course still allow local directories. With no
> scheme, a path is ambiguous.
>
> On Sat, Jan 31, 2015 at 4:18 PM, Ted Yu <yuzhih...@gmail.com> wrote:
> > Looking at https://github.com/apache/spark/pull/1222/files , the
> > following change may have caused what Stephen described:
> >
> > + if (!fileSystem.isDirectory(new Path(logBaseDir))) {
> >
> > When there is no scheme associated with logBaseDir, a local path should
> > be assumed.
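> >
> > To illustrate the ambiguity (a rough sketch, assuming standard Hadoop
> > path resolution):
> >
> >   import org.apache.hadoop.conf.Configuration
> >   import org.apache.hadoop.fs.Path
> >
> >   // With no scheme, Hadoop resolves the path against fs.defaultFS,
> >   // which on a cluster is typically HDFS, not the local filesystem.
> >   val fs = new Path("/mnt/spark/work/history").getFileSystem(new Configuration())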
> >
> > On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman
> > <stephen.haber...@gmail.com> wrote:
> >>
> >> Hi Krishna/all,
> >>
> >> I think I found it, and it wasn't related to Scala-2.11...
> >>
> >> I had "spark.eventLog.dir=/mnt/spark/work/history", which worked
> >> in Spark 1.2, but now am running Spark master, and it wants a
> >> Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
> >> commit 45645191).
> >>
> >> This looks like a breaking change to the spark.eventLog.dir
> >> config property.
> >>
> >> Perhaps it should be patched to convert the previously supported
> >> "just a file path" values to HDFS-compatible "file://..." URIs
> >> for backwards compatibility?
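> >>
> >> Something like this, perhaps (a rough sketch, not tested;
> >> resolveEventLogDir is a hypothetical helper name):
> >>
> >>   import java.io.File
> >>   import java.net.URI
> >>
> >>   def resolveEventLogDir(dir: String): URI = {
> >>     val uri = new URI(dir)
> >>     // Keep the old behavior: a scheme-less path means a local directory.
> >>     if (uri.getScheme == null) new File(dir).toURI else uri
> >>   }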
> >>
> >> - Stephen
> >>
> >>
> >> On Wed, 28 Jan 2015 12:27:17 -0800
> >> Krishna Sankar <ksanka...@gmail.com> wrote:
> >>
> >> > Stephen,
> >> >    Scala 2.11 worked fine for me. I made the dev change and then
> >> > compiled. I'm not using it in production, but I go back and forth
> >> > between 2.10 & 2.11. Cheers
> >> > <k/>
> >> >
> >> > On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman <
> >> > stephen.haber...@gmail.com> wrote:
> >> >
> >> > > Hey,
> >> > >
> >> > > I recently compiled Spark master against scala-2.11 (by
> >> > > running the dev/change-versions script), but when I run
> >> > > spark-shell, it looks like the "sc" variable is missing.
> >> > >
> >> > > Is this a known/unknown issue? Are others successfully using
> >> > > Spark with scala-2.11, and specifically spark-shell?
> >> > >
> >> > > It is possible I did something dumb while compiling master,
> >> > > but I'm not sure what it would be.
> >> > >
> >> > > Thanks,
> >> > > Stephen
> >> > >
> >> > >
> >>
> >>
> >
>
