[ 
https://issues.apache.org/jira/browse/GOBBLIN-1947?focusedWorklogId=893471&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-893471
 ]

ASF GitHub Bot logged work on GOBBLIN-1947:
-------------------------------------------

                Author: ASF GitHub Bot
            Created on: 01/Dec/23 19:52
            Start Date: 01/Dec/23 19:52
    Worklog Time Spent: 10m 
      Work Description: hanghangliu commented on code in PR #3832:
URL: https://github.com/apache/gobblin/pull/3832#discussion_r1412515341


##########
gobblin-cluster/src/main/java/org/apache/gobblin/cluster/GobblinHelixJobLauncher.java:
##########
@@ -514,6 +599,7 @@ private TaskConfig getTaskConfig(WorkUnit workUnit, ParallelRunner stateSerDeRun
     rawConfigMap.put(GobblinClusterConfigurationKeys.TASK_SUCCESS_OPTIONAL_KEY, "true");
     TaskConfig taskConfig = TaskConfig.Builder.from(rawConfigMap);
     workUnitToHelixConfig.put(workUnit.getId(), taskConfig);

Review Comment:
   removed workUnitToHelixConfig

Issue Time Tracking
-------------------

    Worklog Id:     (was: 893471)
    Time Spent: 1h 10m  (was: 1h)

> Send WorkUnitChangeEvent when Helix task consistently fails
> -----------------------------------------------------------
>
>                 Key: GOBBLIN-1947
>                 URL: https://issues.apache.org/jira/browse/GOBBLIN-1947
>             Project: Apache Gobblin
>          Issue Type: New Feature
>          Components: gobblin-cluster
>            Reporter: Hanghang Liu
>            Assignee: Hung Tran
>            Priority: Major
>          Time Spent: 1h 10m
>  Remaining Estimate: 0h
>
> When YarnAutoScalingManager detects that a Helix task consistently fails,
> provide an option to send a WorkUnitChangeEvent so that
> GobblinHelixJobLauncher can handle the event and split the work unit at
> runtime. This can help resolve consistently failing containers (e.g., due
> to OOM) during runtime instead of relying on the replanner to restart the
> whole pipeline.
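
For context, a minimal sketch of the flow the description outlines, using
hypothetical names rather than Gobblin's actual API: count consecutive
failures per work unit and, once a threshold is crossed, publish a change
event that the launcher would handle by splitting the work unit instead of
waiting for a full replan.

    import java.util.HashMap;
    import java.util.Map;

    public class WorkUnitFailureEscalation {

      // Hypothetical stand-in for Gobblin's WorkUnitChangeEvent.
      static final class WorkUnitChangeEventStub {
        final String workUnitId;
        WorkUnitChangeEventStub(String workUnitId) { this.workUnitId = workUnitId; }
      }

      private final Map<String, Integer> consecutiveFailures = new HashMap<>();
      private final int failureThreshold;

      WorkUnitFailureEscalation(int failureThreshold) {
        this.failureThreshold = failureThreshold;
      }

      // Called for each failed Helix task attempt of a work unit.
      void onTaskFailure(String workUnitId) {
        int failures = consecutiveFailures.merge(workUnitId, 1, Integer::sum);
        if (failures >= failureThreshold) {
          consecutiveFailures.remove(workUnitId);
          // In Gobblin this event would be handled by GobblinHelixJobLauncher,
          // which would split the work unit at runtime.
          publish(new WorkUnitChangeEventStub(workUnitId));
        }
      }

      // A success resets the failure streak for that work unit.
      void onTaskSuccess(String workUnitId) {
        consecutiveFailures.remove(workUnitId);
      }

      private void publish(WorkUnitChangeEventStub event) {
        System.out.println("would request split of work unit " + event.workUnitId);
      }
    }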



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
