Github user attilapiros commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22880#discussion_r229732302
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetRowConverter.scala
 ---
    @@ -202,11 +204,15 @@ private[parquet] class ParquetRowConverter(
     
       override def start(): Unit = {
         var i = 0
    -    while (i < currentRow.numFields) {
    +    while (i < fieldConverters.length) {
           fieldConverters(i).updater.start()
           currentRow.setNullAt(i)
    --- End diff ---
    
    I am also lost here. The `i` index seems to me to follow the parquet fields,
so isn't `updater.ordinal` the correct index for updating `currentRow`?
    
    I would expect something like:
    
    ```scala
    val updater = fieldConverters(i).updater
    updater.start()
    currentRow.setNullAt(updater.ordinal)
    ```
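
    To make the concern concrete, here is a minimal, self-contained sketch (the `Updater` and `FieldConverter` classes below are hypothetical stand-ins, not the real Spark classes): it only illustrates how indexing the row by the loop variable `i` can target different slots than indexing by each updater's ordinal, if the converter list does not line up one-to-one with the row's fields.

    ```scala
    object OrdinalSketch {
      // Hypothetical stand-in for an updater: remembers which row slot it owns.
      final case class Updater(ordinal: Int)
      final case class FieldConverter(updater: Updater)

      // Suppose the row has 3 fields but only 2 converters exist, and they
      // target ordinals 2 and 0 rather than 0 and 1.
      val fieldConverters: Array[FieldConverter] =
        Array(FieldConverter(Updater(2)), FieldConverter(Updater(0)))

      // Slots nulled when indexing by the loop variable `i`.
      def nulledByLoopIndex: Seq[Int] =
        fieldConverters.indices.map(i => i)

      // Slots nulled when indexing by the updater's own ordinal.
      def nulledByOrdinal: Seq[Int] =
        fieldConverters.indices.map(i => fieldConverters(i).updater.ordinal)

      def main(args: Array[String]): Unit = {
        println(nulledByLoopIndex) // Vector(0, 1) -- slots chosen by loop index
        println(nulledByOrdinal)   // Vector(2, 0) -- slots the converters actually own
      }
    }
    ```

    If the two sequences can ever differ in practice, the loop-index version nulls the wrong slots; if they provably cannot, the two forms are equivalent.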



---
