alamb opened a new issue, #17194:
URL: https://github.com/apache/datafusion/issues/17194

   ### Describe the bug
   
   - Found while verifying 49.0.1: https://github.com/apache/datafusion/issues/17036
   
   @milenkovicm reported ([email](https://lists.apache.org/thread/g94ztqk5hlgxd0lnmhofsm5lowchnn7c)):
   
   > I’m getting
   
   ```
   ---- fuzz_cases::sort_fuzz::test_sort_10k_mem stdout ----
   
   thread 'fuzz_cases::sort_fuzz::test_sort_10k_mem' panicked at datafusion/core/tests/fuzz_cases/sort_fuzz.rs:269:63:
   called `Result::unwrap()` on an `Err` value: Execution("Failed to create partition file at \"/var/folders/h_/cdg2zx090212402xhrwhv62w0000gn/T/.tmpdhEV7B/.tmpIhvMHc\": Os { code: 24, kind: Uncategorized, message: \"Too many open files\" }")
   note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
   
   
   failures:
   fuzz_cases::sort_fuzz::test_sort_10k_mem
   
   test result: FAILED. 75 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished in 73.00s
   ```
   
   
   ### To Reproduce
   
   In a clean terminal, without raising `ulimit`, run:
   
   ```shell
   cargo test --test fuzz
   ```
   
   You will see failures like:
   
   ```
   ---- fuzz_cases::sort_fuzz::test_sort_10k_mem stdout ----
   
   thread 'fuzz_cases::sort_fuzz::test_sort_10k_mem' panicked at datafusion/core/tests/fuzz_cases/sort_fuzz.rs:269:63:
   called `Result::unwrap()` on an `Err` value: Execution("Failed to create partition file at \"/var/folders/1l/tg68jc6550gg8xqf1hr4mlwr0000gn/T/.tmpj128U3/.tmp86cD6k\": Os { code: 24, kind: Uncategorized, message: \"Too many open files\" }")
   note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
   
   ---- fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_different_sizes_of_record_batch_and_changing_memory_reservation stdout ----
   Error: Execution("Failed to create partition file at \"/var/folders/1l/tg68jc6550gg8xqf1hr4mlwr0000gn/T/.tmpLsX6yT/.tmpHO0Q6l\": Os { code: 24, kind: Uncategorized, message: \"Too many open files\" }")
   
   ---- fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_large_record_batch stdout ----
   Error: Execution("Failed to create partition file at \"/var/folders/1l/tg68jc6550gg8xqf1hr4mlwr0000gn/T/.tmpRtUy2o/.tmp8OqkYC\": Os { code: 24, kind: Uncategorized, message: \"Too many open files\" }")
   
   ---- fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_sort_with_limited_memory_and_large_record_batch stdout ----
   Error: IoError(Custom { kind: Uncategorized, error: PathError { path: "/var/folders/1l/tg68jc6550gg8xqf1hr4mlwr0000gn/T/.tmp7aaX1I/.tmpX6Jzr0", err: Os { code: 24, kind: Uncategorized, message: "Too many open files" } } })
   
   ---- fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_different_sizes_of_record_batch_and_take_all_memory stdout ----
   Error: Execution("Failed to create partition file at \"/var/folders/1l/tg68jc6550gg8xqf1hr4mlwr0000gn/T/.tmp0MNZTo/.tmp31Qukn\": Os { code: 24, kind: Uncategorized, message: \"Too many open files\" }")
   
   
   failures:
       fuzz_cases::sort_fuzz::test_sort_10k_mem
       fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_different_sizes_of_record_batch_and_changing_memory_reservation
       fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_different_sizes_of_record_batch_and_take_all_memory
       fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_aggregate_with_high_cardinality_with_limited_memory_and_large_record_batch
       fuzz_cases::spilling_fuzz_in_memory_constrained_env::test_sort_with_limited_memory_and_large_record_batch
   
   test result: FAILED. 81 passed; 5 failed; 0 ignored; 0 measured; 0 filtered out; finished in 81.65s
   ```
   
   ### Expected behavior
   
   I would really like the tests to run successfully with the default `ulimit` settings on macOS. The default open-file limit on my machine appears to be 256:
   
   > open files                          (-n) 256
   
   
   ```shell
   andrewlamb@Andrews-MacBook-Pro-3:~/Software/datafusion$ ulimit -a
   core file size              (blocks, -c) 0
   data seg size               (kbytes, -d) unlimited
   file size                   (blocks, -f) unlimited
   max locked memory           (kbytes, -l) unlimited
   max memory size             (kbytes, -m) unlimited
   open files                          (-n) 256
   pipe size                (512 bytes, -p) 1
   stack size                  (kbytes, -s) 8176
   cpu time                   (seconds, -t) unlimited
   max user processes                  (-u) 10666
   virtual memory              (kbytes, -v) unlimited
   ```
   
   ### Additional context
   
   The workaround is to raise the open-file limit with `ulimit` before running the tests.
   
   For example, setting the limit to 4096 avoids the failures:
   
   ```shell
   ulimit -n 4096
   cargo test --test fuzz
   ```
   
   The run then passes:
   ```
   test result: ok. 86 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out; finished in 73.86s
   ```
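
   One possible direction, beyond documenting the `ulimit` workaround, would be for the fuzz test setup to raise the soft `RLIMIT_NOFILE` itself so the suite passes under the default macOS limit. The sketch below is only an illustration of that idea, not existing DataFusion code; it assumes the `libc` crate, and the helper name `raise_open_file_limit` is hypothetical:
   
   ```rust
   /// Hypothetical helper: raise the process's soft open-file limit to at
   /// least `min`, clamped to the hard limit. Unix-only (uses getrlimit/setrlimit).
   #[cfg(unix)]
   fn raise_open_file_limit(min: libc::rlim_t) -> std::io::Result<()> {
       // SAFETY: getrlimit/setrlimit only read and write the local `rlimit` struct.
       unsafe {
           let mut lim = libc::rlimit { rlim_cur: 0, rlim_max: 0 };
           if libc::getrlimit(libc::RLIMIT_NOFILE, &mut lim) != 0 {
               return Err(std::io::Error::last_os_error());
           }
           if lim.rlim_cur < min {
               // The soft limit may only be raised up to the hard limit.
               lim.rlim_cur = min.min(lim.rlim_max);
               if libc::setrlimit(libc::RLIMIT_NOFILE, &lim) != 0 {
                   return Err(std::io::Error::last_os_error());
               }
           }
           Ok(())
       }
   }
   
   fn main() -> std::io::Result<()> {
       // Hypothetical usage: call once before running tests that spill to disk.
       #[cfg(unix)]
       raise_open_file_limit(4096)?;
       Ok(())
   }
   ```
   
   On macOS the soft limit can only be raised up to the hard limit reported by `getrlimit`, which is why the sketch clamps against `rlim_max`.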

