Github user gatorsmile commented on a diff in the pull request:
    --- Diff: python/pyspark/sql/ ---
    @@ -351,8 +354,68 @@ def show(self, n=20, truncate=True, vertical=False):
                 print(self._jdf.showString(n, int(truncate), vertical))
    +    @property
    +    def _eager_eval(self):
    +        """Returns true if eager evaluation is enabled.
    +        """
    +        return self.sql_ctx.getConf(
    +            "spark.sql.repl.eagerEval.enabled", "false").lower() == "true"
    --- End diff ---
    In the ongoing release, a nice-to-have refactoring is to move all the Core 
confs into a single file, just like what we did for Spark SQL conf: default 
values, boundary checking, types, and descriptions. Thus, in PySpark, it would 
be better to start doing this now. 
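    The suggested refactoring could look roughly like the following minimal sketch: a single module declaring each conf entry with its key, default, type, and description, so call sites no longer hard-code strings and defaults. All names here (`ConfEntry`, `REPL_EAGER_EVAL_ENABLED`) are illustrative, not actual PySpark API.

    ```python
    # Hypothetical sketch of a centralized conf registry for PySpark,
    # modeled loosely on Spark SQL's SQLConf. Names are illustrative.

    class ConfEntry:
        """One configuration key with its default, type, and description."""

        def __init__(self, key, default, value_type, doc):
            self.key = key
            self.default = default
            self.value_type = value_type
            self.doc = doc

        def read_from(self, conf_getter):
            """Fetch and convert the value via a getter with the same
            (key, default) signature as SQLContext.getConf."""
            raw = conf_getter(self.key, self.default)
            if self.value_type is bool:
                # Spark confs are stored as strings, so parse booleans.
                return str(raw).lower() == "true"
            return self.value_type(raw)

    # All entries live in one module, keeping defaults and docs in one place.
    REPL_EAGER_EVAL_ENABLED = ConfEntry(
        key="spark.sql.repl.eagerEval.enabled",
        default="false",
        value_type=bool,
        doc="Enable eager evaluation of DataFrames in the REPL.",
    )
    ```

    The `_eager_eval` property above would then reduce to something like `REPL_EAGER_EVAL_ENABLED.read_from(self.sql_ctx.getConf)`, with the key, default, and type-coercion logic defined exactly once.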

