Rui Li created HIVE-19439:
-----------------------------

             Summary: MapWork shouldn't be reused when Spark task fails during initialization
                 Key: HIVE-19439
                 URL: https://issues.apache.org/jira/browse/HIVE-19439
             Project: Hive
          Issue Type: Bug
          Components: Spark
            Reporter: Rui Li
Issue identified in HIVE-19388. When a Spark task fails while initializing the map operator, the retried task retrieves the same MapWork from the cache. This is problematic because the MapWork may be partially initialized, e.g. some of its operators are already in INIT state.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
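A minimal self-contained sketch of the failure mode described above, using hypothetical names (WorkCache-style map, OpState, Work) rather than Hive's actual classes: a task attempt fails partway through initialization, the retry fetches the same cached object and sees operators already in INIT state, and evicting the cached entry on failure lets the retry start from a clean copy.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative only: OpState, Work, and the cache here are stand-ins,
// not Hive's real MapWork/Operator types.
public class MapWorkCacheSketch {
    enum OpState { UNINIT, INIT }

    static class Work {
        OpState state = OpState.UNINIT;

        void initialize(boolean failMidway) {
            state = OpState.INIT;  // operator transitions to INIT
            if (failMidway) {
                // Simulate the Spark task dying mid-initialization:
                // the Work object is left in a partially initialized state.
                throw new RuntimeException("task failed during initialization");
            }
        }
    }

    static final Map<String, Work> cache = new HashMap<>();

    static Work getWork(String key) {
        // Retry path: returns the cached (possibly stale) Work if present.
        return cache.computeIfAbsent(key, k -> new Work());
    }

    // One possible remedy, as the report suggests the cached MapWork should
    // not be reused: drop the cache entry when an attempt fails, so the
    // retry gets a freshly constructed Work object.
    static void evictOnFailure(String key) {
        cache.remove(key);
    }

    public static void main(String[] args) {
        String key = "map-work-1";
        Work first = getWork(key);
        try {
            first.initialize(true);  // first attempt fails midway
        } catch (RuntimeException e) {
            // Without eviction, the retry observes the stale INIT state.
            System.out.println("retry sees state: " + getWork(key).state);
            evictOnFailure(key);
        }
        // After eviction, the retry starts from a clean UNINIT Work.
        System.out.println("after eviction: " + getWork(key).state);
    }
}
```

Running the sketch shows the retry observing `INIT` before eviction and `UNINIT` after, which is the essence of why reusing the cached MapWork across failed attempts is unsafe.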