Re: IntelliJ Runtime error

2015-04-04 Thread Cheng Lian
In general, I've found it painful to build and run Spark inside IntelliJ 
IDEA. I guess most people resort to this approach so that they can 
leverage the integrated debugger to debug and/or learn Spark internals. 
A more convenient way I've been using recently is the remote debugging 
feature: by adding driver/executor Java options, you can build and 
start the Spark applications/tests/daemons in the normal way and attach 
the debugger to the running JVM. I used this to debug the 
HiveThriftServer2, and it worked perfectly.


Steps to enable remote debugging:

1. Open the menu Run / Edit Configurations...
2. Click the + button and choose Remote
3. Choose Attach or Listen as the Debugger mode, according to your 
actual needs
4. Copy, edit, and add the Java options suggested in the dialog to 
`--driver-java-options` (or, for executors, to the 
`spark.executor.extraJavaOptions` configuration)
5. If you're using attach mode, first start your Spark program, then 
start remote debugging in IDEA
6. If you're using listen mode, first start remote debugging in IDEA, 
and then start your Spark program.
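
As a concrete sketch of step 4 in attach mode (the port, class, and jar 
names below are placeholders, not from any real setup):

   spark-submit \
     --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
     --class org.example.MyApp \
     myapp.jar

With suspend=y the driver JVM pauses at startup until the debugger 
attaches; use suspend=n if you only want to attach later. For 
executors, the same agent string can be passed via the executor JVM 
options (e.g. the `spark.executor.extraJavaOptions` configuration).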


Hope this can be helpful.

Cheng

On 4/4/15 12:54 AM, sara mustafa wrote:

Thank you, it works for me now that I changed the dependencies from provided
to compile.
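
(For reference, assuming a Maven build - the artifact and version below 
are placeholders - that change is switching the dependency scope:

   <dependency>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-core_2.10</artifactId>
     <version>1.3.0</version>
     <scope>compile</scope>  <!-- was: <scope>provided</scope> -->
   </dependency>

`provided` keeps the dependency off the runtime classpath, which is why 
running from the IDE fails until the scope is changed to `compile`.)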



--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/IntelliJ-Runtime-error-tp11383p11385.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org








Re: [VOTE] Release Apache Spark 1.3.1

2015-04-04 Thread Krishna Sankar
+1 (non-binding, of course)

1. Compiled on OS X 10.10 (Yosemite) OK. Total time: 15:04 min
 mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
-Dhadoop.version=2.6.0 -Phive -DskipTests -Dscala-2.11
2. Tested pyspark and MLlib - ran them and compared results with 1.3.0
   pyspark works well with the new IPython 3.0.0 release
2.1. Statistics (min, max, mean, Pearson, Spearman) OK
2.2. Linear/Ridge/Lasso Regression OK
2.3. Decision Tree, Naive Bayes OK
2.4. KMeans OK
   Center and Scale OK
2.5. RDD operations OK
   State of the Union texts - map/reduce, filter, sortByKey (word count)
2.6. Recommendation (MovieLens medium dataset, ~1M ratings) OK
   Model evaluation/optimization (rank, numIter, lambda) with itertools
OK

On Sat, Apr 4, 2015 at 5:13 PM, Reynold Xin r...@databricks.com wrote:

 +1

 Tested some DataFrame functions locally on Mac OS X.

 On Sat, Apr 4, 2015 at 5:09 PM, Patrick Wendell pwend...@gmail.com
 wrote:

  Please vote on releasing the following candidate as Apache Spark version
  1.3.1!
 
  The tag to be voted on is v1.3.1-rc1 (commit 0dcb5d9f):
 
 
 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
 
  The list of fixes present in this release can be found at:
  http://bit.ly/1C2nVPY
 
  The release files, including signatures, digests, etc. can be found at:
  http://people.apache.org/~pwendell/spark-1.3.1-rc1/
 
  Release artifacts are signed with the following key:
  https://people.apache.org/keys/committer/pwendell.asc
 
  The staging repository for this release can be found at:
  https://repository.apache.org/content/repositories/orgapachespark-1080
 
  The documentation corresponding to this release can be found at:
  http://people.apache.org/~pwendell/spark-1.3.1-rc1-docs/
 
  Please vote on releasing this package as Apache Spark 1.3.1!
 
  The vote is open until Wednesday, April 08, at 01:10 UTC and passes
  if a majority of at least 3 +1 PMC votes are cast.
 
  [ ] +1 Release this package as Apache Spark 1.3.1
  [ ] -1 Do not release this package because ...
 
  To learn more about Apache Spark, please see
  http://spark.apache.org/
 
  - Patrick
 
 
 



Re: IntelliJ Runtime error

2015-04-04 Thread Stephen Boesch
Thanks Cheng. Yes, the problem is that the setup for running inside
IntelliJ changes very frequently.  It is unfortunately not simply a one-time
investment to get IJ debugging working properly: the required steps are a
moving target, shifting roughly every month or two.

Remote debugging is probably a good choice for reducing that dev
environment volatility/maintenance.



2015-04-04 5:46 GMT-07:00 Cheng Lian lian.cs@gmail.com:
