kszucs commented on code in PR #45360:
URL: https://github.com/apache/arrow/pull/45360#discussion_r1988086607


##########
cpp/src/parquet/chunker_internal_test.cc:
##########
@@ -0,0 +1,908 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+#include <gtest/gtest.h>
+#include <algorithm>
+#include <iostream>
+#include <memory>
+#include <string>
+#include <utility>
+#include <vector>
+
+#include "arrow/table.h"
+#include "arrow/type_fwd.h"
+#include "arrow/util/float16.h"
+#include "parquet/arrow/reader.h"
+#include "parquet/arrow/reader_internal.h"
+#include "parquet/arrow/test_util.h"
+#include "parquet/arrow/writer.h"
+#include "parquet/column_writer.h"
+#include "parquet/file_writer.h"
+
+namespace parquet {
+
+using ::arrow::Array;
+using ::arrow::ChunkedArray;
+using ::arrow::ConcatenateTables;
+using ::arrow::DataType;
+using ::arrow::default_memory_pool;
+using ::arrow::Field;
+using ::arrow::Result;
+using ::arrow::Table;
+using ::arrow::io::BufferReader;
+using ::parquet::arrow::FileReader;
+using ::parquet::arrow::FileReaderBuilder;
+using ::parquet::arrow::MakeSimpleTable;
+using ::parquet::arrow::NonNullArray;
+using ::parquet::arrow::WriteTable;
+
+using ::testing::Bool;
+using ::testing::Combine;
+using ::testing::Values;
+
+// generate deterministic and platform-independent data
+inline uint64_t hash(uint64_t seed, uint64_t index) {

Review Comment:
   Well, I initially tried to adjust the generated data size and the various 
CDC parameters to get stable test outcomes with the random testing utility, 
then I switched over to `exact` random data. 
   
   The assertions try to identify the various cases visually represented in the 
evaluation repository's readme (https://github.com/kszucs/de); this is why I am 
diffing the generated data page sizes before and after modifications. With 
bigger arrays these assertions become more stable, but the test cases get 
rather slow. That is why I tried to work with smaller generated data and 
smaller chunk sizes, but then CDC becomes more sensitive to the data itself.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
