Github user holdenk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20678#discussion_r171110674
  
    --- Diff: python/pyspark/sql/dataframe.py ---
    @@ -1986,55 +1986,89 @@ def toPandas(self):
                 timezone = None
     
             if self.sql_ctx.getConf("spark.sql.execution.arrow.enabled", 
"false").lower() == "true":
    +            should_fallback = False
    --- End diff ---
    
    This variable name is a little confusing to me while I'm tracing the code. 
How about renaming it to "use_arrow" and swapping the meaning? Right now, if a 
user doesn't have Arrow enabled, we skip the Arrow conversion because of the 
value of should_fallback, which reads oddly.
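    To illustrate the suggestion, here is a minimal sketch (not Spark's actual code; names and structure are simplified for this example) showing how a positively-named flag makes the happy path read more naturally than a negatively-named fallback flag:

    ```python
    # Hypothetical sketch of the suggested rename: drive the branch with a
    # positive "use_arrow" flag instead of a negative "should_fallback" flag.
    def to_pandas(arrow_enabled, arrow_conversion_ok):
        # use_arrow is True only when Arrow is enabled AND the conversion
        # can actually proceed; otherwise we take the non-Arrow path.
        use_arrow = arrow_enabled and arrow_conversion_ok
        if use_arrow:
            return "converted via Arrow"
        return "converted without Arrow"
    ```

    With this inversion, "Arrow disabled implies no Arrow conversion" follows directly from the flag's name rather than from the value of a fallback variable.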


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
