Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/20400#discussion_r164261678
--- Diff: python/pyspark/sql/window.py ---
@@ -124,16 +124,19 @@ def rangeBetween(start, end):
                    values directly.
         :param start: boundary start, inclusive.
-                      The frame is unbounded if this is ``Window.unboundedPreceding``, or
+                      The frame is unbounded if this is ``Window.unboundedPreceding``,
+                      ``org.apache.spark.sql.catalyst.expressions.UnboundedPreceding``, or
                       any value less than or equal to max(-sys.maxsize, -9223372036854775808).
         :param end: boundary end, inclusive.
-                    The frame is unbounded if this is ``Window.unboundedFollowing``, or
+                    The frame is unbounded if this is ``Window.unboundedFollowing``,
+                    ``org.apache.spark.sql.catalyst.expressions.UnboundedFollowing``, or
                     any value greater than or equal to min(sys.maxsize, 9223372036854775807).
         """
-        if start <= Window._PRECEDING_THRESHOLD:
-            start = Window.unboundedPreceding
-        if end >= Window._FOLLOWING_THRESHOLD:
-            end = Window.unboundedFollowing
+        if isinstance(start, int) and isinstance(end, int):
--- End diff ---
Yup, but on Python 2 I think we can still get a `long`-typed one:
```
>>> long(1)
1L
>>> isinstance(long(1), int)
False
```
You can simply do something like `isinstance(long(1), (int, long))` together with:
```
if sys.version >= '3':
long = int
```
---