LuciferYang opened a new pull request #31939:
URL: https://github.com/apache/spark/pull/31939


   ### What changes were proposed in this pull request?
   SPARK-32160 added a config (`EXECUTOR_ALLOW_SPARK_CONTEXT`) that controls whether creating a `SparkContext` in executors is allowed, and the config defaults to `false`.
   
   `ExternalAppendOnlyUnsafeRowArrayBenchmark` fails when `EXECUTOR_ALLOW_SPARK_CONTEXT` uses the default value because the `ExternalAppendOnlyUnsafeRowArrayBenchmark#withFakeTaskContext` method tries to create a `SparkContext` manually on the executor side.
   
   So the main change of this PR is to set `EXECUTOR_ALLOW_SPARK_CONTEXT` to `true` so that `ExternalAppendOnlyUnsafeRowArrayBenchmark` runs successfully.
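   The change amounts to enabling this flag on the benchmark's `SparkConf` before the fake-executor `SparkContext` is built. A minimal sketch, assuming the string key behind `EXECUTOR_ALLOW_SPARK_CONTEXT` is `spark.executor.allowSparkContext` (this is an illustrative config fragment, not the exact diff):
   
   ```scala
   import org.apache.spark.{SparkConf, SparkContext}
   
   // Enable the SPARK-32160 escape hatch before withFakeTaskContext
   // constructs its SparkContext on the (simulated) executor side;
   // without it, SparkContext's assertOnDriver check throws.
   val conf = new SparkConf(false)
     .setMaster("local")
     .setAppName("ExternalAppendOnlyUnsafeRowArrayBenchmark")
     .set("spark.executor.allowSparkContext", "true")
   val sc = new SparkContext(conf)
   ```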
   
   
   ### Why are the changes needed?
   Bug fix.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No.
   
   
   ### How was this patch tested?
   Manual test:
   ```
   bin/spark-submit --class org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark --jars spark-core_2.12-3.2.0-SNAPSHOT-tests.jar spark-sql_2.12-3.2.0-SNAPSHOT-tests.jar
   ```
   
   **Before**
   ```
   Exception in thread "main" java.lang.IllegalStateException: SparkContext should only be created and accessed on the driver.
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$assertOnDriver(SparkContext.scala:2679)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:137)
        at org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark$.withFakeTaskContext(ExternalAppendOnlyUnsafeRowArrayBenchmark.scala:52)
        at org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark$.testAgainstRawArrayBuffer(ExternalAppendOnlyUnsafeRowArrayBenchmark.scala:119)
        at org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark$.$anonfun$runBenchmarkSuite$1(ExternalAppendOnlyUnsafeRowArrayBenchmark.scala:189)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at org.apache.spark.benchmark.BenchmarkBase.runBenchmark(BenchmarkBase.scala:40)
        at org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark$.runBenchmarkSuite(ExternalAppendOnlyUnsafeRowArrayBenchmark.scala:185)
        at org.apache.spark.benchmark.BenchmarkBase.main(BenchmarkBase.scala:58)
        at org.apache.spark.sql.execution.ExternalAppendOnlyUnsafeRowArrayBenchmark.main(ExternalAppendOnlyUnsafeRowArrayBenchmark.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   ```
   
   **After**
   
   `ExternalAppendOnlyUnsafeRowArrayBenchmark` runs successfully.
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


