[ https://issues.apache.org/jira/browse/LIVY-534?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16694249#comment-16694249 ]
Sai Varun Reddy Daram commented on LIVY-534:
--------------------------------------------

Building Livy from source at [https://github.com/apache/incubator-livy/commit/4cfb6bcb8fb9ac6b2d6c8b3d04b20f647b507e1f] was the fix; somehow this resolved the issue.

> Spark 2.4.0, kubernetes client mode. TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py
> ---------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: LIVY-534
>                 URL: https://issues.apache.org/jira/browse/LIVY-534
>             Project: Livy
>          Issue Type: Bug
>          Components: Core
>    Affects Versions: 0.5.0
>        Environment: kubernetes on minikube.
> kubernetes version 1.10.0
> Spark 2.4.0 client mode on kubernetes.
> Apache livy 0.5.0
>           Reporter: Sai Varun Reddy Daram
>           Priority: Major
>
> Step 1) Create a DataFrame:
> {code:java}
> df = spark_session.createDataFrame([{'a': 1}])
> {code}
> Step 2) Do a count or collect:
> {code:java}
> df.count()
> {code}
> This produces the following output:
> {code:java}
> Exception happened during processing of request from ('127.0.0.1', 38690)
> Traceback (most recent call last):
>   File "/usr/lib/python3.6/socketserver.py", line 317, in _handle_request_noblock
>     self.process_request(request, client_address)
>   File "/usr/lib/python3.6/socketserver.py", line 348, in process_request
>     self.finish_request(request, client_address)
>   File "/usr/lib/python3.6/socketserver.py", line 361, in finish_request
>     self.RequestHandlerClass(request, client_address, self)
>   File "/usr/lib/python3.6/socketserver.py", line 721, in __init__
>     self.handle()
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 266, in handle
>     poll(authenticate_and_accum_updates)
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 241, in poll
>     if func():
>   File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 254, in authenticate_and_accum_updates
>     received_token = self.rfile.read(len(auth_token))
> TypeError: object of type 'NoneType' has no len()
> {code}
> Repeating step 2, the error no longer appears.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
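For context on the traceback above: the failing line in pyspark/accumulators.py is `received_token = self.rfile.read(len(auth_token))`, so the TypeError occurs when the accumulator server's `auth_token` is None (i.e. no authentication secret was passed to the Python accumulator handler). The following is only a minimal sketch of that Python-level failure mode, not Spark's actual code path:

```python
# Minimal reproduction of the error in the traceback, assuming the
# accumulator server was handed no auth token (auth_token is None).
auth_token = None

try:
    # Mirrors the failing expression len(auth_token) from accumulators.py;
    # len() on None raises TypeError in Python 3.
    n = len(auth_token)
except TypeError as e:
    print(e)  # object of type 'NoneType' has no len()
```

This is why the fix lands on the Livy side: once the launcher passes the expected secret through to the PySpark process, `auth_token` is a string and `len(auth_token)` succeeds.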