kolfild26 commented on issue #44513: URL: https://github.com/apache/arrow/issues/44513#issuecomment-2433010232
I also tried to reduce the size of the right table, and the working limit actually varies for me; I have not been able to pin down an exact threshold. I get either the seg fault or an incorrect join result. My system has 4 TB of memory in total, so this is not an out-of-memory issue. Here are the other system specs:

> Oracle Linux Server 7.8
>
> **ulimit -a**
> core file size (blocks, -c) 0
> data seg size (kbytes, -d) unlimited
> scheduling priority (-e) 0
> file size (blocks, -f) unlimited
> pending signals (-i) 16511255
> max locked memory (kbytes, -l) unlimited
> max memory size (kbytes, -m) unlimited
> open files (-n) 4096
> pipe size (512 bytes, -p) 8
> POSIX message queues (bytes, -q) 819200
> real-time priority (-r) 0
> stack size (kbytes, -s) unlimited
> cpu time (seconds, -t) unlimited
> max user processes (-u) 4096
> virtual memory (kbytes, -v) unlimited
> file locks (-x) unlimited
>
> Python 3.10.15
> import pyarrow as pa
> pa.__version__
> '18.0.0.dev486'
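For context, here is a minimal sketch of the kind of reduced-size join test I am running. The table sizes, column names, and key layout below are illustrative placeholders, not my actual data, and I am assuming the `pa.Table.join` path (which, as I understand it, runs through the Acero hash join):

```python
import numpy as np
import pyarrow as pa

# Placeholder sizes -- I shrink n_right to probe where the failure starts.
n_left = 10_000_000
n_right = 5_000_000

left = pa.table({
    "key": pa.array(np.arange(n_left, dtype=np.int64)),
    "lval": pa.array(np.arange(n_left, dtype=np.int64)),
})
right = pa.table({
    "key": pa.array(np.arange(n_right, dtype=np.int64)),
    "rval": pa.array(np.arange(n_right, dtype=np.int64)),
})

# Depending on the right table's size, this is where I see either a
# segfault or a silently incorrect result.
result = left.join(right, keys="key", join_type="left outer")
print(result.num_rows)
```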
