[ 
https://issues.apache.org/jira/browse/LIVY-1010?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mnr Bsf updated LIVY-1010:
--------------------------
    Description: 
It would be good to keep the Apache Livy project up to date with the latest 
Spark version (3.5.4).

I tried Apache Livy 0.8 with Spark 3.5.4, and it did not work with Java 8, 11, or 17.

With Java 17, I was not able to start an Apache Livy session:
{code:java}
Exception in thread "main" java.util.concurrent.ExecutionException: 
javax.security.sasl.SaslException: Client closed before SASL negotiation 
finished {code}
With Java 8 or 11, I was not able to start a session:
{code:java}
livy-server  | 25/02/05 17:08:27 INFO LineBufferedStream: Type --help for more 
information.
livy-server  | 25/02/05 17:08:27 WARN LivySparkUtils$: Current Spark (3,5) is 
not verified in Livy, please use it carefully
livy-server  | 25/02/05 17:08:27 WARN LivySparkUtils$: Spark version (3,5) is 
greater then the maximum version (3,0) supported by Livy, will choose Scala 
version 2.12 instead, please specify manually if it is the expected Scala 
version you want
livy-server  | Exception in thread "main" java.lang.ExceptionInInitializerError
livy-server  |  at 
org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
livy-server  |  at 
org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:611)
livy-server  |  at 
org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:273)
livy-server  |  at 
org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:261)
livy-server  |  at 
org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:338)
livy-server  |  at 
org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:332)
livy-server  |  at org.apache.livy.server.LivyServer.start(LivyServer.scala:106)
livy-server  |  at org.apache.livy.server.LivyServer$.main(LivyServer.scala:431)
livy-server  |  at org.apache.livy.server.LivyServer.main(LivyServer.scala)
livy-server  | Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, 
end 3, length 2
livy-server  |  at 
java.base/java.lang.String.checkBoundsBeginEnd(String.java:3319)
livy-server  |  at java.base/java.lang.String.substring(String.java:1874)
livy-server  |  at org.apache.hadoop.util.Shell.<clinit>(Shell.java:52)
livy-server  |  ... 9 more {code}
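The {{Caused by}} line points at a version-string parsing failure: {{begin 0, end 3, length 2}} is exactly what Java's {{String.substring(0, 3)}} throws for a two-character input, which suggests the bundled Hadoop {{Shell}} class is slicing a short version string it does not expect. Which property is being parsed is not clear from the trace alone (on Java 11, for example, {{java.specification.version}} is the two-character string "11"); this is my reading of the stack trace, not a check against the exact Hadoop source line. A minimal Python sketch of the strict bounds check that fails:

```python
def substring(s: str, begin: int, end: int) -> str:
    # Java's String.substring(begin, end) rejects out-of-range bounds;
    # Python slicing silently clamps, so replicate the strict check.
    if begin < 0 or end > len(s) or begin > end:
        raise IndexError(f"begin {begin}, end {end}, length {len(s)}")
    return s[begin:end]

print(substring("1.8.0_292", 0, 3))  # -> 1.8 (legacy version scheme parses fine)

try:
    substring("11", 0, 3)  # a bare two-character major version
except IndexError as e:
    print(e)  # begin 0, end 3, length 2 -- matching the log line above
```

If that reading is right, the fix is on the Hadoop-dependency side (a version whose {{Shell}} handles post-8 version strings), not in Livy's own code.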

With Java 15, I was able to start a session, but running basic code, such as 
creating a sample Spark DataFrame, throws this exception:
{code:java}
'JavaPackage' object is not callable
Traceback (most recent call last):
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1443, 
in createDataFrame
    return self._create_dataframe(
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1485, 
in _create_dataframe
    rdd, struct = self._createFromLocal(map(prepare, data), schema)
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1093, 
in _createFromLocal
    struct = self._inferSchemaFromList(data, names=schema)
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 954, in 
_inferSchemaFromList
    prefer_timestamp_ntz = is_timestamp_ntz_preferred()
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 153, in 
is_timestamp_ntz_preferred
    return jvm is not None and jvm.PythonSQLUtils.isTimestampNTZPreferred()
TypeError: 'JavaPackage' object is not callable{code}
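For context on the {{TypeError}} itself: in py4j, which PySpark uses to talk to the JVM, a JVM name that cannot be resolved on the classpath comes back as a {{JavaPackage}} placeholder rather than failing at attribute access, and placeholders are not callable. So the error usually means the driver JVM did not expose {{PythonSQLUtils}} (i.e. a classpath/wiring problem between Livy and Spark 3.5), not a bug in the submitted code. A small sketch of the mechanism, using a stand-in class rather than py4j itself:

```python
class JavaPackage:
    """Stand-in for py4j's JavaPackage: unresolved JVM names resolve to
    nested package placeholders instead of raising at attribute access."""
    def __getattr__(self, name: str) -> "JavaPackage":
        return JavaPackage()

jvm = JavaPackage()
# Attribute lookups "succeed", so the failure only surfaces at the call:
try:
    jvm.PythonSQLUtils.isTimestampNTZPreferred()
except TypeError as e:
    print(e)  # 'JavaPackage' object is not callable
```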

  was:
It will be good to keep the Apache Livy project up to date with the latest 
Spark version (3.5.4).
{code:java}
livy-server  | 25/02/04 22:42:48 WARN LivySparkUtils$: Current Spark (3,5) is 
not verified in Livy, please use it carefully

livy-server  | 25/02/04 22:42:48 WARN LivySparkUtils$: Spark version (3,5) is 
greater then the maximum version (3,0) supported by Livy, will choose Scala 
version 2.12 instead, please specify manually if it is the expected Scala 
version you want {code}
I tried Apache Livy 0.8 with Spark 3.5.4, and it did not work with Java 8, 11, or 17.

With Java 17, I was not able to start an Apache Livy session:
{code:java}
Exception in thread "main" java.util.concurrent.ExecutionException: 
javax.security.sasl.SaslException: Client closed before SASL negotiation 
finished {code}
With Java 8 or 11, I was able to start a session, but running basic code, such 
as creating a sample Spark DataFrame, throws this exception:
{code:java}
'JavaPackage' object is not callable
Traceback (most recent call last):
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1443, 
in createDataFrame
    return self._create_dataframe(
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1485, 
in _create_dataframe
    rdd, struct = self._createFromLocal(map(prepare, data), schema)
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 1093, 
in _createFromLocal
    struct = self._inferSchemaFromList(data, names=schema)
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/session.py", line 954, in 
_inferSchemaFromList
    prefer_timestamp_ntz = is_timestamp_ntz_preferred()
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 153, in 
is_timestamp_ntz_preferred
    return jvm is not None and jvm.PythonSQLUtils.isTimestampNTZPreferred()
TypeError: 'JavaPackage' object is not callable{code}


> Add support for Spark 3.5.4
> ---------------------------
>
>                 Key: LIVY-1010
>                 URL: https://issues.apache.org/jira/browse/LIVY-1010
>             Project: Livy
>          Issue Type: Improvement
>    Affects Versions: 0.9.0
>            Reporter: Mnr Bsf
>            Priority: Major
>             Fix For: 0.9.0
>
>



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
