Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21558#discussion_r196592175
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/WriteToDataSourceV2.scala ---
    @@ -110,7 +108,7 @@ object DataWritingSparkTask extends Logging {
           useCommitCoordinator: Boolean): WriterCommitMessage = {
         val stageId = context.stageId()
         val partId = context.partitionId()
    -    val attemptId = context.attemptNumber()
    +    val attemptId = context.taskAttemptId().toInt
    --- End diff ---
    
    I was going to suggest removing the cast to int, but, well, that's in `DataWriterFactory`, and changing it would be an API breakage... hopefully it won't cause issues aside from weird output names when the value overflows the int.
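    
    As a side note, a minimal sketch of what that overflow looks like (plain Scala with made-up id values; the only Spark fact assumed is that `TaskContext.taskAttemptId()` returns a `Long`): `Long.toInt` keeps only the low 32 bits, so a sufficiently large attempt id wraps around and can even come out negative in the output name.
    
    ```scala
    object AttemptIdOverflow {
      def main(args: Array[String]): Unit = {
        // While the attempt id still fits in an Int, the cast is harmless.
        val smallId: Long = 12345L
        println(smallId.toInt)                       // 12345

        // Past Int.MaxValue, Long.toInt truncates to the low 32 bits and the
        // value wraps around -- the "weird output name" case mentioned above.
        val hugeId: Long = Int.MaxValue.toLong + 1L  // 2147483648
        println(hugeId.toInt)                        // -2147483648
      }
    }
    ```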


---
