kou commented on PR #36406:
URL: https://github.com/apache/arrow/pull/36406#issuecomment-1614129087

   Thanks!
   
   Can we write code that works with both ORC 1.9.0 and 1.8.4?
   
   Our CI job that still uses ORC 1.8.4 failed with this change:
   
   
https://github.com/apache/arrow/actions/runs/5419179225/jobs/9852016855?pr=36406#step:6:2464
   
   ```text
   FAILED: 
src/arrow/adapters/orc/CMakeFiles/arrow-orc-adapter-test.dir/adapter_test.cc.o 
   /opt/conda/envs/arrow/bin/ccache 
/opt/conda/envs/arrow/bin/x86_64-conda-linux-gnu-c++ 
-DARROW_EXTRA_ERROR_CONTEXT -DARROW_HAVE_RUNTIME_AVX2 
-DARROW_HAVE_RUNTIME_AVX512 -DARROW_HAVE_RUNTIME_BMI2 
-DARROW_HAVE_RUNTIME_SSE4_2 -DARROW_HAVE_SSE4_2 -DARROW_HDFS -DARROW_MIMALLOC 
-DARROW_S3_HAS_CRT -DARROW_WITH_BENCHMARKS_REFERENCE -DARROW_WITH_BROTLI 
-DARROW_WITH_BZ2 -DARROW_WITH_LZ4 -DARROW_WITH_RE2 -DARROW_WITH_SNAPPY 
-DARROW_WITH_UTF8PROC -DARROW_WITH_ZLIB -DARROW_WITH_ZSTD 
-DAWS_AUTH_USE_IMPORT_EXPORT -DAWS_CAL_USE_IMPORT_EXPORT 
-DAWS_CHECKSUMS_USE_IMPORT_EXPORT -DAWS_COMMON_USE_IMPORT_EXPORT 
-DAWS_COMPRESSION_USE_IMPORT_EXPORT -DAWS_CRT_CPP_USE_IMPORT_EXPORT 
-DAWS_EVENT_STREAM_USE_IMPORT_EXPORT -DAWS_HTTP_USE_IMPORT_EXPORT 
-DAWS_IO_USE_IMPORT_EXPORT -DAWS_MQTT_USE_IMPORT_EXPORT 
-DAWS_MQTT_WITH_WEBSOCKETS -DAWS_S3_USE_IMPORT_EXPORT 
-DAWS_SDKUTILS_USE_IMPORT_EXPORT -DAWS_SDK_VERSION_MAJOR=1 
-DAWS_SDK_VERSION_MINOR=10 -DAWS_SDK_VERSION_PATCH=13 -DAWS_USE_EPOLL 
-DGTEST_LINKED_AS_SHARED_LIBRARY=1 -DURI_STATIC_BUILD -I/build/cpp/src -I/arrow/cpp/src 
-I/arrow/cpp/src/generated -isystem /arrow/cpp/thirdparty/flatbuffers/include 
-isystem /arrow/cpp/thirdparty/hadoop/include -isystem 
/build/cpp/jemalloc_ep-prefix/src -isystem 
/build/cpp/mimalloc_ep/src/mimalloc_ep/include/mimalloc-2.0 -isystem 
/build/cpp/googletest_ep-prefix/include -Wno-noexcept-type 
-fvisibility-inlines-hidden -fmessage-length=0 -march=nocona -mtune=haswell 
-ftree-vectorize -fPIC -fstack-protector-strong -fno-plt -O2 
-ffunction-sections -pipe -isystem /opt/conda/envs/arrow/include 
-fdiagnostics-color=always -fuse-ld=gold  -Wall -Wno-conversion 
-Wno-sign-conversion -Wunused-result -Wdate-time -fno-semantic-interposition 
-msse4.2  -g -Werror -O0 -ggdb -g1 -std=c++17 -fPIE -pthread 
-DS2N_KYBER512R3_AVX2_BMI2 -DS2N_STACKTRACE -DS2N_CPUID_AVAILABLE 
-DS2N_FEATURES_AVAILABLE -fPIC -DS2N_FALL_THROUGH_SUPPORTED 
-DS2N___RESTRICT__SUPPORTED -DS2N_MADVISE_SUPPORTED -DS2N_CLONE_SUPPORTED 
-DS2N_LIBCRYPTO_SUPPORTS_EVP_MD5_SHA1_HASH -DS2N_LIBCRYPTO_SUPPORTS_EVP_RC4 
-DS2N_LIBCRYPTO_SUPPORTS_EVP_MD_CTX_SET_PKEY_CTX -MD -MT 
src/arrow/adapters/orc/CMakeFiles/arrow-orc-adapter-test.dir/adapter_test.cc.o 
-MF 
src/arrow/adapters/orc/CMakeFiles/arrow-orc-adapter-test.dir/adapter_test.cc.o.d
 -o 
src/arrow/adapters/orc/CMakeFiles/arrow-orc-adapter-test.dir/adapter_test.cc.o 
-c /arrow/cpp/src/arrow/adapters/orc/adapter_test.cc
   /arrow/cpp/src/arrow/adapters/orc/adapter_test.cc: In function 'void 
arrow::{anonymous}::TestUnionConversion(std::shared_ptr<arrow::Array>)':
   /arrow/cpp/src/arrow/adapters/orc/adapter_test.cc:1046:31: error: no 
matching function for call to 'orc::Type::createRowBatch(int64_t, 
orc::MemoryPool&, bool, bool)'
    1046 |       orc_type->createRowBatch(array->length(), 
*liborc::getDefaultPool(),
         |       
~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    1047 |                                /*encoded=*/false, 
/*useTightNumericVector=*/false);
         |                                
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
   In file included from /opt/conda/envs/arrow/include/orc/Common.hh:23,
                    from /opt/conda/envs/arrow/include/orc/Reader.hh:23,
                    from /opt/conda/envs/arrow/include/orc/OrcFile.hh:25,
                    from /arrow/cpp/src/arrow/adapters/orc/adapter_test.cc:21:
   /opt/conda/envs/arrow/include/orc/Type.hh:73:47: note: candidate: 'virtual 
std::unique_ptr<orc::ColumnVectorBatch> orc::Type::createRowBatch(uint64_t, 
orc::MemoryPool&, bool) const'
      73 |     virtual ORC_UNIQUE_PTR<ColumnVectorBatch> 
createRowBatch(uint64_t size,
         |                                               ^~~~~~~~~~~~~~
   /opt/conda/envs/arrow/include/orc/Type.hh:73:47: note:   candidate expects 3 
arguments, 4 provided
   ```
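
   One hedged way to make a single call site build against both signatures is compile-time detection: ORC 1.9.0 added the fourth `useTightNumericVector` parameter to `createRowBatch`, while 1.8.x only has the three-argument form. The sketch below uses the C++17 detection idiom (`std::void_t`) to pick the call that compiles. The `OldType`/`NewType` structs, `MakeBatch` helper, and `args_used` field are stand-ins invented here so the example is self-contained; the real types come from the ORC headers.

   ```cpp
   #include <cassert>
   #include <cstdint>
   #include <memory>
   #include <type_traits>
   #include <utility>

   // Stub stand-ins for orc::MemoryPool / orc::ColumnVectorBatch so this
   // compiles standalone; in the adapter test these come from <orc/Type.hh>.
   struct MemoryPool {};
   struct ColumnVectorBatch { int args_used; };

   // Stand-in for the ORC 1.8.x signature (three parameters).
   struct OldType {
     std::unique_ptr<ColumnVectorBatch> createRowBatch(uint64_t, MemoryPool&,
                                                       bool) const {
       return std::make_unique<ColumnVectorBatch>(ColumnVectorBatch{3});
     }
   };
   // Stand-in for the ORC 1.9.x signature (four parameters).
   struct NewType {
     std::unique_ptr<ColumnVectorBatch> createRowBatch(uint64_t, MemoryPool&,
                                                       bool, bool) const {
       return std::make_unique<ColumnVectorBatch>(ColumnVectorBatch{4});
     }
   };

   // Detection idiom: does T::createRowBatch accept the 1.9-style
   // four-argument call?
   template <typename T, typename = void>
   struct HasTightVectorOverload : std::false_type {};
   template <typename T>
   struct HasTightVectorOverload<
       T, std::void_t<decltype(std::declval<const T&>().createRowBatch(
              uint64_t{}, std::declval<MemoryPool&>(), false, false))>>
       : std::true_type {};

   // One call site that works against either ORC version.
   template <typename T>
   std::unique_ptr<ColumnVectorBatch> MakeBatch(const T& type, uint64_t n,
                                                MemoryPool& pool) {
     if constexpr (HasTightVectorOverload<T>::value) {
       return type.createRowBatch(n, pool, /*encoded=*/false,
                                  /*useTightNumericVector=*/false);
     } else {
       return type.createRowBatch(n, pool, /*encoded=*/false);
     }
   }

   int main() {
     MemoryPool pool;
     assert(MakeBatch(OldType{}, 10, pool)->args_used == 3);
     assert(MakeBatch(NewType{}, 10, pool)->args_used == 4);
     return 0;
   }
   ```

   An alternative, if the build system can probe the installed ORC version (e.g. via a CMake compile check that defines a macro), is a plain `#ifdef` around the extra argument; the detection-idiom approach above just avoids having to name a version macro, which I have not verified ORC exports.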


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
