yanghua commented on issue #10112: [FLINK-14640] Change Type of Field currentExecutions from ConcurrentHashMap to HashMap
URL: https://github.com/apache/flink/pull/10112#issuecomment-551077609

I have found the problem. Here is a failing test case:

```
@Test
public void testRegistrationOfExecutionsFailing() {
	try {
		final JobVertexID jid1 = new JobVertexID();
		final JobVertexID jid2 = new JobVertexID();

		JobVertex v1 = new JobVertex("v1", jid1);
		JobVertex v2 = new JobVertex("v2", jid2);

		// place 1
		Map<ExecutionAttemptID, Execution> executions = setupExecution(v1, 7, v2, 6).f1;

		// place 2
		for (Execution e : executions.values()) {
			// place 3
			e.markFailed(null);
		}

		assertEquals(0, executions.size());
	} catch (Exception e) {
		e.printStackTrace();
		fail(e.getMessage());
	}
}
```

I have marked three places. Place 1 calls `ExecutionGraph#getRegisteredExecutions`:

```
public Map<ExecutionAttemptID, Execution> getRegisteredExecutions() {
	return Collections.unmodifiableMap(currentExecutions);
}
```

Place 3 calls the `ExecutionGraph#deregisterExecution` method, which removes the execution from the `currentExecutions` map:

```
Execution contained = currentExecutions.remove(exec.getAttemptId());
```

Because I replaced `ConcurrentHashMap` with `HashMap`, this removal now causes a `java.util.ConcurrentModificationException` while the map is being iterated (place 2). Although our production code does not use the map in the way this test does, it seems we had better revert this change?
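The difference can be shown in isolation. The sketch below (class and helper names are mine, not from Flink) relies only on documented JDK behavior: `HashMap` iterators are fail-fast and throw `ConcurrentModificationException` on structural modification mid-iteration, while `ConcurrentHashMap` iterators are weakly consistent and tolerate it.

```java
import java.util.ConcurrentModificationException;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class FailFastDemo {

	// Returns true if removing an entry while iterating this map's values
	// triggers a ConcurrentModificationException.
	static boolean throwsCmeOnRemoveDuringIteration(Map<String, Integer> map) {
		map.put("a", 1);
		map.put("b", 2);
		try {
			for (Integer ignored : map.values()) {
				// Structural modification while the iterator is live.
				map.remove("a");
			}
			return false;
		} catch (ConcurrentModificationException e) {
			return true;
		}
	}

	public static void main(String[] args) {
		// HashMap: fail-fast iterator, so the removal throws.
		System.out.println(throwsCmeOnRemoveDuringIteration(new HashMap<>()));          // true
		// ConcurrentHashMap: weakly consistent iterator, no exception.
		System.out.println(throwsCmeOnRemoveDuringIteration(new ConcurrentHashMap<>())); // false
	}
}
```

This mirrors the test above: `deregisterExecution` removes from `currentExecutions` while the test's for-each loop still holds an iterator over the same map.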
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]

With regards,
Apache Git Services
