davlee1972 commented on issue #1847:
URL: https://github.com/apache/arrow-adbc/issues/1847#issuecomment-2113304353
Ok, I figured it out. There is a bug in how `adbc_ingest` handles batches. I reduced the number of records to 12,000 and the bug still happens. If I read data from a partial set of parquet files and try to write it to Snowflake, I get ZERO rows inserted. If I write my data out to a single parquet file, reread it, and then write it to Snowflake, I get all my rows inserted.

On a side note, I'm not sure why these params are STRINGs.
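For context, here is roughly what the two code paths look like. This is a minimal sketch assuming the Python `adbc_driver_snowflake` dbapi bindings; the connection URI, file paths, and table name are placeholders, not the actual values from my setup:

```python
import pyarrow.dataset as ds
import pyarrow.parquet as pq
import adbc_driver_snowflake.dbapi

# Placeholder URI -- substitute real account/credentials.
conn = adbc_driver_snowflake.dbapi.connect(
    "user:password@account/database/schema"
)

with conn.cursor() as cur:
    # FAILING path: stream record batches from a directory of parquet
    # files. adbc_ingest completes without error, but ZERO rows land
    # in Snowflake.
    reader = ds.dataset("/data/partial/", format="parquet").scanner().to_reader()
    cur.adbc_ingest("MY_TABLE", reader, mode="create")

    # WORKING path: consolidate everything into a single parquet file,
    # reread it as one Table, then ingest -- all rows are inserted.
    # ("replace" drops the empty table left by the failing path above.)
    table = pq.read_table("/data/combined.parquet")
    cur.adbc_ingest("MY_TABLE", table, mode="replace")

    # Verify the row count after each attempt.
    cur.execute("SELECT COUNT(*) FROM MY_TABLE")
    print(cur.fetchone())

conn.commit()
```

The only difference between the two paths is whether `adbc_ingest` receives a streaming `RecordBatchReader` over multiple files or a single in-memory `pyarrow.Table`, which is why this looks like a batch-handling bug rather than a data problem.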
