adamreeve opened a new issue, #47830:
URL: https://github.com/apache/arrow/issues/47830
### Describe the bug, including details regarding any error messages, version, and platform.
I ran the release verification script to test the Arrow 22.0.0 release with:
```
TEST_DEFAULT=0 TEST_WHEELS=1 TEST_YUM=1 TEST_CPP=1 TEST_GLIB=1 TEST_PYTHON=1 TEST_INTEGRATION=1 ./dev/release/verify-release-candidate.sh 22.0.0 0
```
This failed when testing the Python wheels:
```
+ python -c 'import pyarrow._s3fs'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ImportError:
/tmp/arrow-22.0.0.drsAR/venv-wheel-3.11-manylinux_2_28_x86_64/lib64/python3.11/site-packages/pyarrow/_s3fs.cpython-311-x86_64-linux-gnu.so:
undefined symbol: _ZN5arrow2fs17EnsureS3FinalizedEv
```
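For reference, the missing symbol demangles to `arrow::fs::EnsureS3Finalized()`, i.e. part of the S3 filesystem support that is only present in a libarrow built with S3 enabled. A quick way to check, using the standard `c++filt` tool from binutils:
```
# Demangle the missing symbol to see which C++ function the wheel expects
echo '_ZN5arrow2fs17EnsureS3FinalizedEv' | c++filt
# prints: arrow::fs::EnsureS3Finalized()
```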
Testing just the Python source and wheel also fails:
```
TEST_DEFAULT=0 TEST_WHEELS=1 TEST_PYTHON=1 ./dev/release/verify-release-candidate.sh 22.0.0 0
```
But if I test the Python source and the wheels in two separate runs, it works fine:
```
TEST_DEFAULT=0 TEST_PYTHON=1 ./dev/release/verify-release-candidate.sh 22.0.0 0
TEST_DEFAULT=0 TEST_WHEELS=1 ./dev/release/verify-release-candidate.sh 22.0.0 0
```
It looks like when the wheels are tested, the libarrow built earlier in the same verification run is being loaded rather than the one bundled with the wheel. This causes an error because that from-source build didn't have S3 enabled.
I'm using Fedora 42 Linux on x64.
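To narrow this down, something like the following could be run inside the wheel verification venv (a sketch only; the venv path is copied from the traceback above and is a temporary directory that changes between runs). Whether `LD_LIBRARY_PATH` is what leaks the from-source libarrow into the wheel test is a guess on my part, but the `LD_DEBUG` output should make it clear either way:
```
# Activate the wheel verification venv (temporary path taken from the error above)
source /tmp/arrow-22.0.0.drsAR/venv-wheel-3.11-manylinux_2_28_x86_64/bin/activate

# Where was pyarrow imported from?
python -c "import pyarrow; print(pyarrow.__file__)"

# Which libarrow does the _s3fs extension link against at runtime?
SO=$(python -c "import os, glob, pyarrow; print(glob.glob(os.path.join(os.path.dirname(pyarrow.__file__), '_s3fs*.so'))[0])")
ldd "$SO" | grep -i arrow

# If LD_LIBRARY_PATH still points at the libarrow installed by the earlier
# source verification step, the loader can pick that copy up instead of the
# one bundled with the wheel.
echo "${LD_LIBRARY_PATH:-<empty>}"

# LD_DEBUG shows exactly which libarrow file the dynamic loader resolved
LD_DEBUG=libs python -c "import pyarrow._s3fs" 2>&1 | grep libarrow
```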
### Component(s)
Developer Tools, Python