Understood.
However, the previous default was a local directory. Now the user has to
specify the file:// scheme.
Maybe add a release note to SPARK-2261?
Cheers
On Sat, Jan 31, 2015 at 8:40 AM, Sean Owen wrote:
This might have been on purpose, since the goal is to make this
HDFS-friendly, and of course still allow local directories. With no
scheme, a path is ambiguous.
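To illustrate the ambiguity: a scheme-less string parses with a null scheme, so Hadoop cannot tell whether it means the local disk or the default FileSystem (often HDFS). A minimal sketch using java.net.URI, not Spark's actual resolution code:

```scala
import java.net.URI

object SchemeAmbiguity {
  def main(args: Array[String]): Unit = {
    // No scheme: Hadoop must fall back to the default FileSystem,
    // which may be HDFS rather than the local disk.
    val bare = new URI("/mnt/spark/work/history")
    println(bare.getScheme) // null

    // Explicit scheme: unambiguously the local filesystem.
    val local = new URI("file:///mnt/spark/work/history")
    println(local.getScheme) // file
  }
}
```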
On Sat, Jan 31, 2015 at 4:18 PM, Ted Yu wrote:
> Looking at https://github.com/apache/spark/pull/1222/files ,
> the following change may have caused what Stephen described:
>
> + if (!fileSystem.isDirectory(new Path(logBaseDir))) {
>
> When there is no scheme associated with logBaseDir, a local path
> should be assumed.
Yes, that looks right.
Looking at https://github.com/apache/spark/pull/1222/files , the following
change may have caused what Stephen described:
+ if (!fileSystem.isDirectory(new Path(logBaseDir))) {
When there is no scheme associated with logBaseDir, a local path should be
assumed.
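One way to restore the old behavior would be to fall back to file:// when the configured directory carries no scheme. A hypothetical sketch (resolve is an invented helper, not the actual Spark fix):

```scala
import java.net.URI

// Hypothetical helper, not Spark's actual code: assume the local
// filesystem when the configured log directory has no scheme.
object LogDirResolver {
  def resolve(logBaseDir: String): String =
    if (new URI(logBaseDir).getScheme == null) "file://" + logBaseDir
    else logBaseDir
}
```

With this, resolve("/mnt/spark/work/history") yields file:///mnt/spark/work/history, while an hdfs:// URI passes through unchanged.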
On Fri, Jan 30, 2015 at 8:37 AM, Stephen Haberman wrote:
Hi Krishna/all,
I think I found it, and it wasn't related to Scala-2.11...
I had "spark.eventLog.dir=/mnt/spark/work/history", which worked
in Spark 1.2, but now am running Spark master, and it wants a
Hadoop URI, e.g. file:///mnt/spark/work/history (I believe due to
commit 45645191).
This looks
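For anyone hitting the same issue, the workaround Stephen describes is to put an explicit scheme on the configured path; a spark-defaults.conf fragment (using Stephen's path as the example):

```
spark.eventLog.dir  file:///mnt/spark/work/history
```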
Stephen,
Scala 2.11 worked fine for me. I made the dev change and then compiled. Not
using it in production, but I go back and forth between 2.10 & 2.11.
Cheers
On Wed, Jan 28, 2015 at 12:18 PM, Stephen Haberman <
stephen.haber...@gmail.com> wrote:
> Hey,
>
> I recently compiled Spark master against