itholic commented on a change in pull request #35191:
URL: https://github.com/apache/spark/pull/35191#discussion_r788362544
##########
File path: python/pyspark/pandas/series.py
##########
@@ -5228,22 +5228,62 @@ def asof(self, where: Union[Any, List]) -> Union[Scalar, "Series"]:
             where = [where]
         index_scol = self._internal.index_spark_columns[0]
         index_type = self._internal.spark_type_for(index_scol)
+        from pyspark.sql.functions import struct, lit, explode, col, row_number
Review comment:
`pyspark.sql.functions` is already imported as `F`. I think we can just reuse it.
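
For illustration, a minimal standalone sketch of the suggestion (assuming the usual module-level alias in `series.py`): everything the new import pulls in is already reachable through `F`.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F  # alias already bound at the top of series.py

spark = SparkSession.builder.getOrCreate()
sdf = spark.range(3)

# struct/lit/col are all available via the existing alias, so the extra
# `from pyspark.sql.functions import struct, lit, ...` line is unnecessary.
sdf.select(F.struct(F.lit("col_10").alias("identifier"), F.col("id"))).show()
```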
##########
File path: python/pyspark/pandas/tests/test_series.py
##########
@@ -2071,6 +2071,18 @@ def test_asof(self):
         with ps.option_context("compute.eager_check", False):
             self.assert_eq(psser.asof(20), 4.0)
+        pser = pd.Series([2, 1, np.nan, 4], index=[10, 20, 30, 40], name="Koalas")
+        psser = ps.from_pandas(pser)
+        self.assert_eq(psser.asof([5, 25]), pser.asof([5, 25]))
Review comment:
How about `psser.asof([25, 25])`? It might fail.
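
For illustration, a standalone version of that check (reusing the `pser`/`psser` from the added test) might look like this:

```python
import numpy as np
import pandas as pd
import pyspark.pandas as ps

pser = pd.Series([2, 1, np.nan, 4], index=[10, 20, 30, 40], name="Koalas")
psser = ps.from_pandas(pser)

# Duplicate lookup values exercise the duplicated-column-name path;
# pandas returns one row per requested value, duplicates included.
print(pser.asof([25, 25]))   # both rows: 1.0 (last valid value at index <= 25)
print(psser.asof([25, 25]))  # may fail on the patched code, which is the point
```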
##########
File path: python/pyspark/pandas/series.py
##########
@@ -5228,22 +5228,62 @@ def asof(self, where: Union[Any, List]) -> Union[Scalar, "Series"]:
             where = [where]
         index_scol = self._internal.index_spark_columns[0]
         index_type = self._internal.spark_type_for(index_scol)
+        from pyspark.sql.functions import struct, lit, explode, col, row_number
+
+        column_prefix_constant = "col_"
         cond = [
-            F.max(F.when(index_scol <= SF.lit(index).cast(index_type), self.spark.column))
+            F.when(
+                index_scol <= SF.lit(index).cast(index_type),
+                struct(
+                    lit(column_prefix_constant + str(index)).alias("identifier"),
Review comment:
Since `where` can contain the same value more than once, the column name generated here can be duplicated.
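
One possible fix (an illustrative sketch, not necessarily the PR author's approach) is to key the identifier on the position in `where` via `enumerate`, so repeated values still yield distinct names:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
sdf = spark.createDataFrame(
    [(10, 2.0), (20, 1.0), (30, None), (40, 4.0)], ["idx", "val"]
)

where = [25, 25]  # duplicate lookup values from the scenario above
column_prefix_constant = "col_"

# Using the enumerate() position instead of the value keeps the struct
# identifiers distinct ("col_0", "col_1", ...) even when `where` repeats.
cond = [
    F.when(
        F.col("idx") <= F.lit(index),
        F.struct(
            F.lit(column_prefix_constant + str(i)).alias("identifier"),
            F.col("val").alias("value"),
        ),
    )
    for i, index in enumerate(where)
]
sdf.select(cond).show(truncate=False)
```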
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]