Debugger does not stop at a breakpoint inside an anonymous function over an RDD

2016-09-19 Thread chen yong
Hello ALL,

I am new to Spark. I have recently been using IntelliJ IDEA 14.0.3 to debug Spark. Strangely, any breakpoint set inside an anonymous function over an RDD, such as breakpoint-1 in the code snippet below, is invalid: a red X appears to the left of the line, and hovering over it shows "no executable code found at line xx in class org.apache.spark.examples.SparkPi$". Breakpoint-1 is skipped and execution stops directly at breakpoint-2. However, the final PI value is correct.

I am running a "local" run/debug configuration.

I have been stuck on this problem for a long time. Please help; any help will be much appreciated!

package org.apache.spark.examples

import scala.math.random
import org.apache.spark._

/** Computes an approximation to pi */
object SparkPi {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Spark Pi").setMaster("local")
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(10L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1 // (breakpoint-1)
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / (n - 1)) // (breakpoint-2)
    spark.stop()
  }
}
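
For reference, the anonymous function only performs a per-sample unit-circle test. A minimal plain-Scala sketch of the same Monte Carlo computation, with no Spark involved (the names `PiEstimate`, `insideUnitCircle`, and the fixed seed are my own, purely for illustration), would look like:

```scala
import scala.util.Random

object PiEstimate {
  // Same per-sample logic as the anonymous function passed to map:
  // draw a point in [-1, 1) x [-1, 1) and count it if it lies in the unit circle.
  def insideUnitCircle(rng: Random): Int = {
    val x = rng.nextDouble() * 2 - 1
    val y = rng.nextDouble() * 2 - 1
    if (x * x + y * y < 1) 1 else 0
  }

  // Mirrors the snippet's formula: n - 1 samples (1 until n), estimate = 4 * hits / samples.
  def estimate(n: Int, seed: Long): Double = {
    val rng = new Random(seed)
    val count = (1 until n).map(_ => insideUnitCircle(rng)).sum
    4.0 * count / (n - 1)
  }

  def main(args: Array[String]): Unit = {
    println("Pi is roughly " + estimate(1000000, 42L))
  }
}
```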

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org


