Github user DylanGuedes commented on the issue:

    https://github.com/apache/spark/pull/21045
  
    Ok, so it works fine in spark-shell, but in pyspark I get this error:
    ```shell
    File "/home/dguedes/Workspace/spark/python/pyspark/sql/functions.py", line 2155, in pyspark.sql.functions.zip
    Failed example:
        df.select(zip(df.vals1, df.vals2).alias('zipped')).collect()
    Exception raised:
        Traceback (most recent call last):
          File "/usr/lib64/python2.7/doctest.py", line 1315, in __run
            compileflags, 1) in test.globs
          File "<doctest pyspark.sql.functions.zip[2]>", line 1, in <module>
            df.select(zip(df.vals1, df.vals2).alias('zipped')).collect()
          File "/home/dguedes/Workspace/spark/python/pyspark/sql/dataframe.py", line 466, in collect
            port = self._jdf.collectToPython()
          File "/home/dguedes/Workspace/spark/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1160, in __call__
            answer, self.gateway_client, self.target_id, self.name)
          File "/home/dguedes/Workspace/spark/python/pyspark/sql/utils.py", line 63, in deco
            return f(*a, **kw)
          File "/home/dguedes/Workspace/spark/python/lib/py4j-0.10.6-src.zip/py4j/protocol.py", line 320, in get_return_value
            format(target_id, ".", name), value)
        Py4JJavaError: An error occurred while calling o2240.collectToPython.
        : java.util.concurrent.ExecutionException: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 41, Column 2: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 41, Column 2: Unexpected token "[" in primary
    ```
    The problem is in the `doGenCode` function, but I can't see why. Any 
suggestions are welcome :)
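
    For reference, the behavior the failing doctest expects can be sketched in plain Python. This is only an illustration of the intended semantics (pairing up the i-th elements of two array columns, like Python's builtin `zip`), not the actual codegen under discussion; the row data and helper name here are made up:

    ```python
    # Hypothetical sketch of the elementwise semantics the new zip
    # column function is expected to have: combine the i-th elements
    # of two array columns into pairs, per row. Column names mirror
    # the doctest; the data is invented for illustration.
    rows = [
        {"vals1": [1, 2, 3], "vals2": ["a", "b", "c"]},
    ]

    def zip_arrays(row):
        # Pair up the two arrays elementwise, as the generated
        # Java code would need to do for each input row.
        return list(zip(row["vals1"], row["vals2"]))

    zipped = [zip_arrays(r) for r in rows]
    print(zipped)  # [[(1, 'a'), (2, 'b'), (3, 'c')]]
    ```

    If the generated Java diverges from this shape (for example, emitting an array-typed expression where Janino expects a primary expression), that could be the source of the `Unexpected token "["` compile error above.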


---
