viirya commented on a change in pull request #27466: [SPARK-30722][PYTHON][DOCS] Update documentation for Pandas UDF with Python type hints
URL: https://github.com/apache/spark/pull/27466#discussion_r375663877
 
 

 ##########
 File path: docs/sql-pyspark-pandas-with-arrow.md
 ##########
 @@ -201,16 +273,15 @@ Note that all data for a cogroup will be loaded into memory before the function
 memory exceptions, especially if the group sizes are skewed. The configuration for [maxRecordsPerBatch](#setting-arrow-batch-size)
 is not applied and it is up to the user to ensure that the cogrouped data will fit into the available memory.
 
-The following example shows how to use `groupby().cogroup().apply()` to perform an asof join between two datasets.
+The following example shows how to use `groupby().cogroup().applyInPandas()` to perform an asof join between two datasets.
 
 <div class="codetabs">
 <div data-lang="python" markdown="1">
-{% include_example cogrouped_map_pandas_udf python/sql/arrow.py %}
+{% include_example cogrouped_apply_in_pandas python/sql/arrow.py %}
 </div>
 </div>
 
-For detailed usage, please see [`pyspark.sql.functions.pandas_udf`](api/python/pyspark.sql.html#pyspark.sql.functions.pandas_udf) and
-[`pyspark.sql.CoGroupedData.apply`](api/python/pyspark.sql.html#pyspark.sql.CoGroupedData.apply).
+For detailed usage, please see [`pyspark.sql.CoGroupedData.applyInPandas()`](api/python/pyspark.sql.html#pyspark.sql.CoGroupedData.applyInPandas).
 
 Review comment:
   Does this actually mean `PandasCogroupedOps` or `GroupedData`? I can't find any `CoGroupedData`.
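
   For reference, a minimal sketch of the API in question, modeled on the cogrouped-map example in `python/sql/arrow.py` (the data and output schema below are illustrative): `groupby().cogroup()` returns a `PandasCogroupedOps`, and `applyInPandas()` is called on that object.

   ```python
   import pandas as pd
   from pyspark.sql import SparkSession

   # Requires PyArrow to be installed, as with all Pandas UDF APIs.
   spark = SparkSession.builder.getOrCreate()

   df1 = spark.createDataFrame(
       [(20000101, 1, 1.0), (20000101, 2, 2.0),
        (20000102, 1, 3.0), (20000102, 2, 4.0)],
       ("time", "id", "v1"))
   df2 = spark.createDataFrame(
       [(20000101, 1, "x"), (20000101, 2, "y")],
       ("time", "id", "v2"))

   def asof_join(left, right):
       # Each side of the cogroup arrives as a plain pandas DataFrame.
       return pd.merge_asof(left, right, on="time", by="id")

   # groupby().cogroup() yields a PandasCogroupedOps, and applyInPandas()
   # is invoked on it with the function and the output schema.
   df1.groupby("id").cogroup(df2.groupby("id")).applyInPandas(
       asof_join, schema="time int, id int, v1 double, v2 string").show()
   ```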
