Hi All, I'm currently working on SPARK-13534 and trying to validate converted data for testing purposes. The data can be broken up into multiple ArrowRecordBatches, each with some number of rows (and the same columns). I need to concatenate these and compare the result with a JSON file by calling Validator.compareVectorSchemaRoot. On repeated calls to VectorLoader.load, each record batch seems to overwrite the previous one, but maybe I'm missing something. Is this possible to do on the Java side of Arrow? It could also happen that the order of batches gets mixed up, so maybe this isn't a good way to validate anyway.
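For context, here is roughly what I'm attempting. This is only a sketch, not working code: it assumes the Arrow Java library is on the classpath, and the names `schema`, `allocator`, `batches`, and `expected` are placeholders for things set up elsewhere in my test. Since VectorLoader.load replaces the root's contents on each call, the idea is to copy each loaded batch's rows into a second, accumulating root (using copyFromSafe, which I believe most vector types provide) before validating:

```java
import org.apache.arrow.vector.FieldVector;
import org.apache.arrow.vector.VectorLoader;
import org.apache.arrow.vector.VectorSchemaRoot;
import org.apache.arrow.vector.util.Validator;

// scratch: target of VectorLoader.load (overwritten per batch)
// combined: accumulates rows from all batches
VectorSchemaRoot scratch = VectorSchemaRoot.create(schema, allocator);
VectorSchemaRoot combined = VectorSchemaRoot.create(schema, allocator);
VectorLoader loader = new VectorLoader(scratch);

int rowCount = 0;
for (ArrowRecordBatch batch : batches) {
  loader.load(batch);  // replaces scratch's previous contents
  for (int col = 0; col < scratch.getFieldVectors().size(); col++) {
    FieldVector src = scratch.getFieldVectors().get(col);
    FieldVector dst = combined.getFieldVectors().get(col);
    // append this batch's rows after the rows copied so far
    for (int row = 0; row < scratch.getRowCount(); row++) {
      dst.copyFromSafe(row, rowCount + row, src);
    }
  }
  rowCount += scratch.getRowCount();
}
combined.setRowCount(rowCount);

// compare the concatenated data against the root read from the JSON file
Validator.compareVectorSchemaRoot(expected, combined);
```

The per-row copy is obviously not efficient, and as noted above it still assumes the batches arrive in a deterministic order, so I'm not sure it's the right approach.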
Thanks, Bryan
