Re: Spark REPL question

2014-04-17 Thread Zhan Zhang
Thanks a lot.

By "spins up", do you mean using the same directory, as specified by the following?

  /** Local directory to save .class files too */
  val outputDir = {
    val tmp = System.getProperty("java.io.tmpdir")
    val rootDir = new SparkConf().get("spark.repl.classdir", tmp)
    Utils.createTempDir(rootDir)
  }

  val virtualDirectory = new PlainFile(outputDir)  // directory for classfiles

  /** Jetty server that will serve our classes to worker nodes */
  val classServer = new HttpServer(outputDir)
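
To expand on why the directory is not literally shared: Utils.createTempDir creates a fresh, uniquely named subdirectory under rootDir on each call, so two REPLs pointed at the same spark.repl.classdir root still get distinct .class output directories. A minimal sketch of that behavior (plain Scala; createTempDir here is a simplified stand-in for illustration, not Spark's actual Utils):

  import java.io.File
  import java.util.UUID

  // Simplified stand-in for Spark's Utils.createTempDir: each call makes
  // a fresh, uniquely named subdirectory under the given root.
  def createTempDir(rootDir: String): File = {
    val dir = new File(rootDir, "spark-" + UUID.randomUUID.toString)
    if (!dir.mkdirs()) sys.error("Could not create " + dir)
    dir
  }

  // Same root, two calls -> two distinct per-REPL output directories.
  val root = sys.props.getOrElse("spark.repl.classdir",
                                 System.getProperty("java.io.tmpdir"))
  println(createTempDir(root))  // e.g. /tmp/spark-2f6a...
  println(createTempDir(root))  // a different directory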





Re: Double hbase dependency in Spark 0.9.1

2014-04-17 Thread Tathagata Das
Aaah, this should have been ported to Spark 0.9.1!

TD


On Thu, Apr 17, 2014 at 12:08 PM, Sean Owen so...@cloudera.com wrote:

 I remember that too, and it has been fixed already in master, but
 maybe it was not included in 0.9.1:

 https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L367
 --
 Sean Owen | Director, Data Science | London


 On Thu, Apr 17, 2014 at 8:03 PM, Dmitriy Lyubimov dlie...@gmail.com
 wrote:
  Not sure if I am seeing double.
 
  SparkBuild.scala for 0.9.1 has a double hbase declaration:
 
    "org.apache.hbase" % "hbase" % "0.94.6"
      excludeAll(excludeNetty, excludeAsm),
    "org.apache.hbase" % "hbase" % HBASE_VERSION
      excludeAll(excludeNetty, excludeAsm),
 
 
  As a result, I am not getting the right version of hbase here. Perhaps the
  old declaration crept in during a merge at some point?
 
  -d
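
For reference, the fix already in master (linked above) keeps a single hbase declaration pinned to HBASE_VERSION. A sketch of the corrected SparkBuild.scala fragment, assuming the HBASE_VERSION, excludeNetty, and excludeAsm definitions from elsewhere in that file:

  // Deduplicated hbase dependency: one declaration, one version.
  "org.apache.hbase" % "hbase" % HBASE_VERSION
    excludeAll(excludeNetty, excludeAsm),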