TitusFong created ZEPPELIN-2692:
-----------------------------------

             Summary: pyspark die on second run error
                 Key: ZEPPELIN-2692
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-2692
             Project: Zeppelin
          Issue Type: Bug
          Components: pySpark, zeppelin-interpreter
    Affects Versions: 0.7.1
         Environment: mac, spark 2.1.1
            Reporter: TitusFong


The first time I ran PySpark code it worked fine; the second time it died and 
showed the error below in every single cell of my Zeppelin notebook, as well as 
in other notebooks I am running with PySpark. I have to restart Zeppelin in 
order to fix this.

Traceback (most recent call last):
  File "/var/folders/zh/dvdnf74d1t9cq78hjjm3xft80000gn/T/zeppelin_pyspark-1462033700144752464.py", line 343, in <module>
    sc.setJobGroup(jobGroup, "Zeppelin")
  File "/Users/titusfong/spark/python/pyspark/context.py", line 902, in setJobGroup
    self._jsc.setJobGroup(groupId, description, interruptOnCancel)
AttributeError: 'NoneType' object has no attribute 'setJobGroup'
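The AttributeError suggests that sc._jsc is None, which is the state a PySpark SparkContext is left in after it has been stopped. As a rough illustration only (not Zeppelin's actual interpreter code), a hypothetical guard like safe_set_job_group below would turn the cryptic AttributeError into a clearer message; the function name and RuntimeError wording are assumptions for this sketch:

```python
def safe_set_job_group(sc, group_id, description):
    """Sketch of a defensive wrapper around SparkContext.setJobGroup.

    After sc.stop(), PySpark sets the JVM-side handle sc._jsc to None,
    so a later setJobGroup call fails with:
    AttributeError: 'NoneType' object has no attribute 'setJobGroup'
    This hypothetical guard raises a clearer error instead.
    """
    if getattr(sc, "_jsc", None) is None:
        # SparkContext was stopped (or never started); fail with a clear message
        raise RuntimeError(
            "SparkContext has been stopped; restart the interpreter "
            "before running more PySpark code")
    # Mirror the call made in pyspark/context.py setJobGroup
    sc._jsc.setJobGroup(group_id, description, False)
```

A check like this would only change the error message; the underlying bug is that the interpreter keeps using a stopped context instead of restarting it.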

This error appears similar to the following issue, which I don't think has been resolved yet:
http://apache-zeppelin-users-incubating-mailing-list.75479.x6.nabble.com/Error-about-PySpark-td4988.html
https://issues.apache.org/jira/browse/ZEPPELIN-2449?jql=project%20%3D%20ZEPPELIN%20AND%20text%20~%20%22sc.setJobGroup%22

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)