[ 
https://issues.apache.org/jira/browse/SPARK-26113?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sai Varun Reddy Daram updated SPARK-26113:
------------------------------------------
    Description: 
Machine OS: Ubuntu 16.04

Kubernetes: Minikube

Kubernetes version: 1.10.0

Spark Kubernetes image: PySpark (on Docker Hub: saivarunr/spark-py:2.4), built using the standard Spark Docker build.sh file.

The driver runs inside a pod in the Kubernetes cluster.

Steps to replicate:

1) Create a Spark session:
{code:python}
from pyspark.sql import SparkSession

spark_session = (
    SparkSession.builder
    .master('k8s://https://192.168.99.100:8443')
    .config('spark.executor.instances', '1')
    .config('spark.kubernetes.container.image', 'saivarunr/spark-py:2.4')
    .getOrCreate()
)
{code}
2) Create a sample DataFrame:
{code:python}
df = spark_session.createDataFrame([{'a': 1}])
{code}
3) Run an action on the DataFrame:
{code:python}
df.count()
{code}
The first action produces the following output:
{code}
Exception happened during processing of request from ('127.0.0.1', 38690)
Traceback (most recent call last):
  File "/usr/lib/python3.6/socketserver.py", line 317, in _handle_request_noblock
    self.process_request(request, client_address)
  File "/usr/lib/python3.6/socketserver.py", line 348, in process_request
    self.finish_request(request, client_address)
  File "/usr/lib/python3.6/socketserver.py", line 361, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "/usr/lib/python3.6/socketserver.py", line 721, in __init__
    self.handle()
  File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 266, in handle
    poll(authenticate_and_accum_updates)
  File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 241, in poll
    if func():
  File "/spark-2.4.0-bin-hadoop2.7/python/lib/pyspark.zip/pyspark/accumulators.py", line 254, in authenticate_and_accum_updates
    received_token = self.rfile.read(len(auth_token))
TypeError: object of type 'NoneType' has no len()
{code}
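The failing statement in the traceback is {{received_token = self.rfile.read(len(auth_token))}}, so the handler's {{auth_token}} is apparently still None when the first client connects. A minimal standalone sketch of the failure mode with a hypothetical guard (the names below are illustrative, not Spark's actual fix):
{code:python}
import io

def read_token(rfile, auth_token):
    # Hypothetical guard: Spark 2.4.0's authenticate_and_accum_updates calls
    # len(auth_token) directly, which raises TypeError when the token is None.
    if auth_token is None:
        raise RuntimeError("auth token was never set on the accumulator server")
    return rfile.read(len(auth_token))

try:
    read_token(io.BytesIO(b""), None)  # simulate a connection with no token configured
except RuntimeError as e:
    print(e)  # raised instead of: TypeError: object of type 'NoneType' has no len()
{code}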
4) Repeat the action; the error does not appear again.

However, if I close the session, kill the Python terminal or process, and try again from a fresh process, the same error occurs on the first action. The full sequence is consolidated in the snippet below.
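A one-shot repro of the steps above (run in a fresh Python process inside the driver pod; the master URL and image name are specific to this setup):
{code:python}
from pyspark.sql import SparkSession

# Same settings as in step 1.
spark_session = (
    SparkSession.builder
    .master('k8s://https://192.168.99.100:8443')
    .config('spark.executor.instances', '1')
    .config('spark.kubernetes.container.image', 'saivarunr/spark-py:2.4')
    .getOrCreate()
)
df = spark_session.createDataFrame([{'a': 1}])
df.count()  # first action in a fresh process: the TypeError traceback appears
df.count()  # repeated action in the same process: no error
{code}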

 

Possibly related to https://issues.apache.org/jira/browse/SPARK-26019?



> TypeError: object of type 'NoneType' has no len() in authenticate_and_accum_updates of pyspark/accumulators.py
> --------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26113
>                 URL: https://issues.apache.org/jira/browse/SPARK-26113
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes, PySpark
>    Affects Versions: 2.4.0
>            Reporter: Sai Varun Reddy Daram
>            Priority: Major
>


