[ https://issues.apache.org/jira/browse/BEAM-12071?focusedWorklogId=574303&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-574303 ]
ASF GitHub Bot logged work on BEAM-12071:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 30/Mar/21 18:10
Start Date: 30/Mar/21 18:10
Worklog Time Spent: 10m
Work Description: TheNeuralBit commented on a change in pull request
#14374:
URL: https://github.com/apache/beam/pull/14374#discussion_r604324692
##########
File path: sdks/python/apache_beam/dataframe/io.py
##########
@@ -521,7 +521,7 @@ def expand(self, pcoll):
return pcoll | fileio.WriteToFiles(
path=dir,
file_naming=fileio.default_file_naming(name),
- sink=_WriteToPandasFileSink(
+ sink=lambda _: _WriteToPandasFileSink(
Review comment:
Yeah, I think so. I'm not sure how to address it, though. Should we detect
and raise when this mode is used with non-global windows? I'll file a JIRA for
this.
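For context, fileio.WriteToFiles can take sink as a callable that maps a
destination to a sink, which is what the lambda in the diff above provides
instead of a single shared sink instance. Below is a minimal sketch of the
guard this comment proposes, assuming it runs where the input PCollection is
available (e.g. in the transform's expand); the helper name
_check_global_windowing and the error message are hypothetical, not part of
the PR.

    from apache_beam.transforms import window

    def _check_global_windowing(pcoll):
      # Hypothetical guard: fail fast if this write path is used with
      # non-global windows, since per-window partitioning is not handled yet.
      windowfn = pcoll.windowing.windowfn
      if not isinstance(windowfn, window.GlobalWindows):
        raise ValueError(
            'This DataFrame sink does not yet support non-global windowing; '
            'got windowfn %r' % windowfn)

Called as _check_global_windowing(pcoll) at the top of expand, this would turn
the silent mis-partitioning into an explicit error until windowed writes are
supported.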
Issue Time Tracking
-------------------
Worklog Id: (was: 574303)
Time Spent: 0.5h (was: 20m)
> DataFrame IO sinks do not correctly partition by window
> -------------------------------------------------------
>
> Key: BEAM-12071
> URL: https://issues.apache.org/jira/browse/BEAM-12071
> Project: Beam
> Issue Type: Improvement
> Components: sdk-py-core
> Affects Versions: 2.26.0, 2.27.0, 2.28.0
> Reporter: Brian Hulette
> Assignee: Brian Hulette
> Priority: P1
> Labels: dataframe-api
> Fix For: 2.29.0
>
> Time Spent: 0.5h
> Remaining Estimate: 0h
>
> I just discovered that when processing windowed data with DataFrameTransform,
> only one partition is written for each window, while a single window receives
> every other partition/window combination.
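A minimal sketch of how the reported symptom could be reproduced, assuming
fixed windows and a schema'd PCollection converted with convert.to_dataframe;
the Record type, timestamps, window size, and output path are illustrative and
not taken from the report.

    import typing

    import apache_beam as beam
    from apache_beam.dataframe import convert
    from apache_beam.transforms import window

    class Record(typing.NamedTuple):
      ts: float
      value: int

    with beam.Pipeline() as p:
      pcoll = (
          p
          | beam.Create([Record(ts=float(i), value=i) for i in range(10)])
          | beam.Map(lambda r: window.TimestampedValue(r, r.ts)).with_output_types(Record)
          | beam.WindowInto(window.FixedWindows(5)))
      df = convert.to_dataframe(pcoll)
      # The expectation is one set of output files per 5-second window; per
      # the report, the per-window partitioning of the written files is wrong
      # in the affected versions.
      df.to_csv('/tmp/beam-12071-repro')  # illustrative output path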