Is it related to a Spark version upgrade?

Best,
Danny Chan
On Jan 10, 2020 at 2:50 PM +0800, Vladimir Sitnikov <[email protected]> wrote:
> In case you wondered, the exception there is OOM in JVM 1.8:
>
> java.lang.OutOfMemoryError: unable to create new native thread
>     at java.lang.Thread.start0(Native Method)
>     at java.lang.Thread.start(Thread.java:717)
>     at java.util.concurrent.ThreadPoolExecutor.addWorker(ThreadPoolExecutor.java:957)
>     at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1367)
>     at org.apache.spark.MapOutputTrackerMaster$$anonfun$1.apply$mcVI$sp(MapOutputTracker.scala:315)
>     at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>     at org.apache.spark.MapOutputTrackerMaster.<init>(MapOutputTracker.scala:314)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:307)
>     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
>     at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:145)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:159)
>     at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:64)
>
>
> Vladimir
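
[Editor's note] "unable to create new native thread" usually means the OS refused to allocate another native thread (e.g. a `ulimit -u` / cgroup pids limit, or per-thread stack memory exhaustion), not that the Java heap is full. A minimal diagnostic sketch, using the standard `java.lang.management.ThreadMXBean` API (the class name `ThreadDiag` is illustrative), to see how many threads the JVM currently holds before tuning OS limits:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadDiag {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Live threads (daemon + non-daemon) currently owned by this JVM;
        // compare against the OS limit, e.g. `ulimit -u` or cgroup pids.max.
        System.out.println("live threads:  " + threads.getThreadCount());
        // High-water mark since JVM start or the last resetPeakThreadCount().
        System.out.println("peak threads:  " + threads.getPeakThreadCount());
        // Total threads ever started, including ones that have exited.
        System.out.println("total started: " + threads.getTotalStartedThreadCount());
    }
}
```

If the live count is modest but the error still appears, the limit is likely external to the JVM (container or user process cap) rather than a thread leak in Spark itself.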
