Custom Cluster Managers / Standalone Recovery Mode in Spark

2015-01-31 Thread Anjana Fernando
Hi everyone,

I've been experimenting with Spark and am somewhat of a newbie. I was
wondering whether there is any way to use a custom cluster manager
implementation with Spark. As I understand it, the built-in modes supported
at the moment are standalone, Mesos, and YARN. My requirement is a simple
clustering solution with high availability of the master. I don't want to
run a separate ZooKeeper cluster, since that would complicate my deployment;
instead, I would like to use something like Hazelcast, which has a
peer-to-peer cluster coordination implementation.

I found that there is already a JIRA [1] requesting a pluggable persistence
engine, I guess for storing state information. So what I would want to do is
use Hazelcast for leader election, to promote an existing node to master, and
to look up the state information from distributed memory. I would appreciate
any help on how to achieve this, and if it is useful for a wider audience,
hopefully I can contribute it back to the project.

[1] https://issues.apache.org/jira/browse/SPARK-1180
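
To make this concrete, here is a minimal sketch of the kind of plug-in I
have in mind, assuming Spark exposed callbacks along these lines. The
LeaderElectable trait and every other name here are hypothetical, and the
distributed-lock pattern is just one common way to do leader election with
Hazelcast 3.x:

import com.hazelcast.core.Hazelcast

// Hypothetical callback interface; not an existing public Spark API.
trait LeaderElectable {
  def electedLeader(): Unit
}

// All candidate masters contend for one distributed lock; whoever holds
// it acts as the leader. Shared recovery state could live in a
// distributed map on the same Hazelcast cluster.
class HazelcastLeaderElection(candidate: LeaderElectable) {
  private val hz = Hazelcast.newHazelcastInstance()
  private val masterLock = hz.getLock("spark-master-lock")
  private val masterState = hz.getMap[String, Array[Byte]]("spark-master-state")

  def start(): Unit = {
    val t = new Thread("hazelcast-leader-election") {
      override def run(): Unit = {
        masterLock.lock() // blocks until this node wins the election
        candidate.electedLeader()
      }
    }
    t.setDaemon(true)
    t.start()
  }
}

Since Hazelcast releases a lock when the member holding it leaves the
cluster, a standby blocked in lock() would take over automatically when the
active master dies.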

Cheers,
Anjana.


Disabling eviction warnings when using sbt

2015-01-31 Thread nl32
I am trying to disable eviction warnings when using sbt, such as these:

[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn]  * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn]  * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
[warn] Run 'evicted' to see detailed eviction warnings
I am using this line to disable them, as found at
https://github.com/sbt/sbt/issues/1636#issuecomment-57498141:

evictionWarningOptions in update := EvictionWarningOptions.default
  .withWarnTransitiveEvictions(false)
  .withWarnDirectEvictions(false)
  .withWarnScalaVersionEviction(false)

But I am not sure where it belongs in project/SparkBuild.scala. I've tried
putting it in SparkBuild.sharedSettings, but it doesn't work. Can anyone
help?
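
For reference, here is a minimal sketch of the placement I tried, assuming
the setting belongs in the sharedSettings sequence of
project/SparkBuild.scala (the object is abbreviated to just this setting):

import sbt._
import sbt.Keys._

object SparkBuild extends Build {
  lazy val sharedSettings = Seq(
    // Silence all three classes of eviction warnings during update.
    evictionWarningOptions in update := EvictionWarningOptions.default
      .withWarnTransitiveEvictions(false)
      .withWarnDirectEvictions(false)
      .withWarnScalaVersionEviction(false)
  )
}

One possible explanation for why this has no effect: the evicted modules in
the warning (sbt-git, sbt-site) are sbt plugins, so the warning is emitted
while sbt resolves the meta-build under project/, not Spark's own
dependencies. If that is the case, the setting would have to live one level
up, for example in project/plugins.sbt, to apply to the right update task.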




Re: [VOTE] Release Apache Spark 1.2.1 (RC2)

2015-01-31 Thread MartinWeindel
FYI: Spark 1.2.1rc2 does not work on Windows!

When creating a Spark context, you get the following log output on my
Windows machine:
INFO  org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in
C:\Users\mweindel\AppData\Local\Temp\. Ignoring this directory.
ERROR org.apache.spark.storage.DiskBlockManager:75 - Failed to create any
local dir.

I have already located the cause: a newly added function chmod700() in
org.apache.spark.util.Utils uses functionality that only works on a Unix
file system.

See also pull request [https://github.com/apache/spark/pull/4299] for my
suggestion on how to resolve the issue.
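
For reference, the gist of a portable fix is to avoid POSIX-only APIs and
fall back to java.io.File's permission setters, which also work on Windows.
A rough sketch, not necessarily what the pull request actually does:

import java.io.File

object PortableChmod {
  // Portable chmod700: restrict a path to its owner with java.io.File
  // setters instead of POSIX-only calls. The second argument (ownerOnly)
  // limits the change to the owning user.
  def chmod700(file: File): Boolean = {
    file.setReadable(false, false) &&   // clear read for everyone
    file.setReadable(true, true) &&     // owner: read
    file.setWritable(false, false) &&   // clear write for everyone
    file.setWritable(true, true) &&     // owner: write
    file.setExecutable(false, false) && // clear execute for everyone
    file.setExecutable(true, true)      // owner: execute
  }
}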

Best regards,

Martin Weindel






Re: [VOTE] Release Apache Spark 1.2.1 (RC2)

2015-01-31 Thread Matei Zaharia
This looks like a pretty serious problem, thanks! Glad people are testing on 
Windows.

Matei

 On Jan 31, 2015, at 11:57 AM, MartinWeindel martin.wein...@gmail.com wrote:
 
 FYI: Spark 1.2.1rc2 does not work on Windows!
 
 When creating a Spark context, you get the following log output on my
 Windows machine:
 INFO  org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
 ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in
 C:\Users\mweindel\AppData\Local\Temp\. Ignoring this directory.
 ERROR org.apache.spark.storage.DiskBlockManager:75 - Failed to create any
 local dir.
 
 I have already located the cause: a newly added function chmod700() in
 org.apache.spark.util.Utils uses functionality that only works on a Unix
 file system.
 
 See also pull request [https://github.com/apache/spark/pull/4299] for my
 suggestion on how to resolve the issue.
 
 Best regards,
 
 Martin Weindel
 
 
 
 


-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org



Intellij IDEA 14 env setup; NoClassDefFoundError when run examples

2015-01-31 Thread Yafeng Guo
Hi,

I'm setting up a dev environment with IntelliJ IDEA 14. I selected the
profiles scala-2.10, maven-3, hadoop-2.4, hive, and hive-0.13.1. The
compilation passed, but when I try to run LogQuery from the examples, I hit
the issue below:

Connected to the target VM, address: '127.0.0.1:37182', transport: 'socket'
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/SparkConf
at org.apache.spark.examples.LogQuery$.main(LogQuery.scala:46)
at org.apache.spark.examples.LogQuery.main(LogQuery.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 2 more
Disconnected from the target VM, address: '127.0.0.1:37182', transport:
'socket'
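
From what I can tell, this NoClassDefFoundError means spark-core is missing
from the runtime classpath; the examples module appears to mark its Spark
dependencies as provided in Maven, so the IDE run configuration has to
supply them itself. A quick way to check (ClasspathCheck is just an
arbitrary name):

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Throws ClassNotFoundException if spark-core is absent at runtime.
    Class.forName("org.apache.spark.SparkConf")
    println("spark-core is on the runtime classpath")
  }
}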

Has anyone met a similar issue before? Thanks a lot!

Regards,
Ya-Feng


Re: Intellij IDEA 14 env setup; NoClassDefFoundError when run examples

2015-01-31 Thread Ted Yu
Have you read / followed this ?

https://cwiki.apache.org/confluence/display/SPARK/Useful+Developer+Tools#UsefulDeveloperTools-BuildingSparkinIntelliJIDEA

Cheers

On Sat, Jan 31, 2015 at 8:01 PM, Yafeng Guo daniel.yafeng@gmail.com
wrote:

 Hi,

 I'm setting up a dev environment with IntelliJ IDEA 14. I selected the
 profiles scala-2.10, maven-3, hadoop-2.4, hive, and hive-0.13.1. The
 compilation passed, but when I try to run LogQuery from the examples, I
 hit the issue below:

 Connected to the target VM, address: '127.0.0.1:37182', transport:
 'socket'
 Exception in thread "main" java.lang.NoClassDefFoundError:
 org/apache/spark/SparkConf
 at org.apache.spark.examples.LogQuery$.main(LogQuery.scala:46)
 at org.apache.spark.examples.LogQuery.main(LogQuery.scala)
 Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
 at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
 ... 2 more
 Disconnected from the target VM, address: '127.0.0.1:37182', transport:
 'socket'

 Has anyone met a similar issue before? Thanks a lot!

 Regards,
 Ya-Feng



Re: [VOTE] Release Apache Spark 1.2.1 (RC2)

2015-01-31 Thread Nicholas Chammas
Do we have any open JIRA issues to add automated testing on Windows to
Jenkins? I assume that's something we want to do.

On Sat Jan 31 2015 at 10:37:42 PM Matei Zaharia matei.zaha...@gmail.com
wrote:

 This looks like a pretty serious problem, thanks! Glad people are testing
 on Windows.

 Matei

  On Jan 31, 2015, at 11:57 AM, MartinWeindel martin.wein...@gmail.com
 wrote:
 
  FYI: Spark 1.2.1rc2 does not work on Windows!
 
  When creating a Spark context, you get the following log output on my
  Windows machine:
  INFO  org.apache.spark.SparkEnv:59 - Registering BlockManagerMaster
  ERROR org.apache.spark.util.Utils:75 - Failed to create local root dir in
  C:\Users\mweindel\AppData\Local\Temp\. Ignoring this directory.
  ERROR org.apache.spark.storage.DiskBlockManager:75 - Failed to create
 any
  local dir.
 
  I have already located the cause: a newly added function chmod700() in
  org.apache.spark.util.Utils uses functionality that only works on a Unix
  file system.
 
  See also pull request [https://github.com/apache/spark/pull/4299] for my
  suggestion on how to resolve the issue.
 
  Best regards,
 
  Martin Weindel
 
 
 
 


 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org