HeartSaVioR opened a new pull request #24147: [SPARK-27205][CORE] Avoid initializing "object class" named "<clazz>$" while initializing mainClass in SparkSubmit
URL: https://github.com/apache/spark/pull/24147
 
 
   ## What changes were proposed in this pull request?
   
   [SPARK-26977](https://issues.apache.org/jira/browse/SPARK-26977) introduced a very strange bug where spark-shell is no longer able to load classes provided via `--packages`. To be honest, I don't know the full details of why it breaks, but it looks like eagerly initializing the `object class` is what introduces the weirdness.
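   
   For context (my understanding, with a hypothetical object `Greeter`): a top-level Scala `object` compiles to a JVM class whose name ends with `$`, and the object's body runs as part of that class's static initialization, so force-initializing such a class runs arbitrary side effects. A minimal sketch:
   
   ```scala
   // A top-level Scala object compiles to the JVM class `Greeter$`;
   // its body becomes part of that class's static initialization.
   object Greeter {
     println("side effect!") // runs when Greeter$ is initialized
   }
   
   object Demo extends App {
     // initialize = false loads the class without running the object's body:
     Class.forName("Greeter$", false, getClass.getClassLoader)
     println("class loaded, body not yet run")
     val g = Greeter // first real use initializes Greeter$: prints "side effect!"
   }
   ```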
   
   This patch fixes the bug by not initializing the `object class` named `childMainClass$` while loading the main class in SparkSubmit.
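   
   The gist of the change, as a rough sketch (simplified, not the exact SparkSubmit diff): load the class without triggering static initialization by passing `initialize = false` to `Class.forName`.
   
   ```scala
   // Sketch only: Class.forName(name) would eagerly initialize the class,
   // which for an object class named `<clazz>$` runs the object's body.
   // Passing initialize = false loads the class without static init.
   def loadMainClass(childMainClass: String): Class[_] =
     Class.forName(
       childMainClass,
       /* initialize = */ false,
       Thread.currentThread.getContextClassLoader)
   ```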
   
   ## How was this patch tested?
   
   Manual test: suppose we run spark-shell with the `--packages` option as below:
   
   ```
   ./bin/spark-shell --verbose   --master "local[*]" --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0
   ```
   
   Before this patch, importing a class from a transitive dependency fails:
   
   ```
   Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   Spark context Web UI available at http://localhost:4040
   Spark context available as 'sc' (master = local[*], app id = local-1553005771597).
   Spark session available as 'spark'.
   Welcome to
         ____              __
        / __/__  ___ _____/ /__
       _\ \/ _ \/ _ `/ __/  '_/
      /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
         /_/
   
   Using Scala version 2.12.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
   Type in expressions to have them evaluated.
   Type :help for more information.
   
   scala> import org.apache.kafka
   <console>:23: error: object kafka is not a member of package org.apache
          import org.apache.kafka
   ```
   
   After this patch, importing a class from a transitive dependency succeeds:
   
   ```
   Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   Spark context Web UI available at http://localhost:4040
   Spark context available as 'sc' (master = local[*], app id = local-1553004095542).
   Spark session available as 'spark'.
   Welcome to
         ____              __
        / __/__  ___ _____/ /__
       _\ \/ _ \/ _ `/ __/  '_/
      /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
         /_/
   
   Using Scala version 2.12.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_191)
   Type in expressions to have them evaluated.
   Type :help for more information.
   
   scala> import org.apache.kafka
   import org.apache.kafka
   ```
