[ https://issues.apache.org/jira/browse/SPARK-21857?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-21857.
-------------------------------
    Resolution: Not A Problem

Java 9 is not yet supported by Spark.
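A minimal workaround sketch, since Spark 2.2 only runs on Java 8: point JAVA_HOME at a Java 8 installation before launching pyspark. The install path below is an assumption; adjust it for your system.

```shell
# Assumed Java 8 location; substitute the path from your own system
# (e.g. /usr/libexec/java_home -v 1.8 on macOS).
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

java -version   # should report a 1.8.x version
./bin/pyspark
```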

> Exception in thread "main" java.lang.ExceptionInInitializerError
> ----------------------------------------------------------------
>
>                 Key: SPARK-21857
>                 URL: https://issues.apache.org/jira/browse/SPARK-21857
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Nagamanoj
>
> After installing Spark using a prebuilt version, when we run ./bin/pyspark
> Java version = Java 9
> I'm getting the following exception:
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 17/08/28 20:06:43 INFO SparkContext: Running Spark version 2.2.0
> Exception in thread "main" java.lang.ExceptionInInitializerError
>     at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
>     at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
>     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
>     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
>     at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:791)
>     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:761)
>     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:634)
>     at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2430)
>     at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2430)
>     at scala.Option.getOrElse(Option.scala:121)
>     at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2430)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
>     at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
>     at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
>     at scala.Option.getOrElse(Option.scala:121)
>     at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
>     at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
>     at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.base/java.lang.reflect.Method.invoke(Method.java:564)
>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 1
>     at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3116)
>     at java.base/java.lang.String.substring(String.java:1885)
>     at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
