As I'm unable to publish parts of the project I'm working on, I tried to come 
up with a minimal example.
For comparability (and because I didn't want to risk my currently working
setup), I used Docker:
sudo docker run -it --rm ubuntu:24.04 bash
sudo docker run -it --rm ubuntu:25.10 bash
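
To confirm which coreutils implementation each container actually uses,
something like this should do in each of them (assuming both
implementations report themselves via --version, as GNU coreutils does):
# quick sanity check inside each container
dpkg -l | grep -i coreutils
stat --version | head -n1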

I then ran the following commands in both setups:
apt update
#apt search coreutils-from-uutils
mkdir test && cd test
apt install make vim zstd pv parallel
vim Makefile # for the contents see further down
make clean ; time make -j 1024

This reproduced at least a (smaller) performance hit:
# Ubuntu 24.04 docker image with GNU coreutils:
real    0m22.570s
user    2m2.905s
sys     3m42.032s

# Ubuntu 25.10 docker image with coreutils-from-uutils:
real    0m32.018s
user    2m57.540s
sys     5m11.862s

The performance loss is therefore likely independent of the stat failure
I've seen. Unfortunately I cannot spend more time on this right now to
also try to reproduce the stat coredump (e.g. by reinstalling
coreutils-from-uutils on my main system and experimenting further with
the minimal example), so I cannot say whether the stat coredump has
already been fixed in the meantime.
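
In case someone else wants to poke at it, this is roughly the kind of
stress loop I had in mind, reusing the parallel/stat pattern from the
Makefile below (untested; whether a core file actually lands in the
working directory depends on the kernel's core_pattern and the ulimit
settings inside the container, so treat that part as an assumption):
# run from the test directory; -j64 and the 100 iterations are arbitrary
ulimit -c unlimited
make randomfiles
for i in $(seq 1 100); do
    ls -1 random_files | parallel -j64 -I{} stat random_files/{} > /dev/null
done
ls core* 2>/dev/null || echo "no core files found"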

The Makefile below (partly AI generated) doesn't have any meaning
beyond exercising some of the simpler tools used in the much larger
project I'm working on, calling them in a stress-testing fashion that
touches many files / file handles / pipes at once. Please take note of
the Makefile parameter NUM_FILES.
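
It can be increased without editing the file, since variables given on
the make command line override the ones set in the Makefile, e.g. (the
value 1000 is just an arbitrary example):
make clean ; time make -j 1024 NUM_FILES=1000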

Thank you for taking a look! And sorry if this doesn't help too much,
but maybe it is useful for optimizing performance... Otherwise, or with
regard to the stat coredump, feel free to close this issue as not
reproducible, as I don't have more time for it right now. Kind regards.

Makefile:

# Variables
RANDOM_DIR := random_files
COMPRESSED_DIR := compressed_files
NUM_FILES := 200
# Each random file will be 10 KB
FILE_SIZE := 10K
RANDOM_FILES := $(addprefix $(RANDOM_DIR)/file,$(addsuffix .bin,$(shell seq 1 $(NUM_FILES))))
COMPRESSED_FILES := $(patsubst $(RANDOM_DIR)/%.bin,$(COMPRESSED_DIR)/%.zst,$(RANDOM_FILES))

SHELL=bash

.PHONY: all randomfiles compress clean

all: compress

# Pattern rule to create the $(NUM_FILES) random files using dd (with /dev/urandom)
$(RANDOM_DIR)/file%.bin:
        @mkdir -p $(RANDOM_DIR)
        @bash -c "umask 000 ; dd if=/dev/urandom of=$@ bs=$(FILE_SIZE) count=1 
status=none"

randomfiles: $(RANDOM_FILES)

# Pattern rule: compress each random file with zstd, mixing in a few other
# random files, checksumming the result and hammering stat via parallel
$(COMPRESSED_DIR)/%.zst: $(RANDOM_DIR)/%.bin
        @mkdir -p $(COMPRESSED_DIR)
        echo -e "bash <<'EOF'\n(umask 000; cat $< $(ls -lrt $(RANDOM_DIR) |shuf 
|tail -5 |grep -oP 'file.*$'|parallel -I{} echo $(RANDOM_DIR)/{}) | pv | zstd 
-q ) | tee $@ >(( bash -c 'ls -1 $(RANDOM_DIR) | parallel -j32 -I{} stat 
$(RANDOM_DIR)/{}' > /dev/null && wc -c ) >&2) | sha512sum | awk '{print $$1}' > 
[email protected]\nEOF" |bash

compress: $(COMPRESSED_FILES)

clean:
        rm -rf $(RANDOM_DIR) $(COMPRESSED_DIR)
