Hello. I have written a C program to implement the test I described above. Currently, it checks 100 8K blocks around the 512MB mark, swapping them with each other, back to front. Running the program twice should therefore result in an identical file. Using this program, you can check the read/write ability of your filesystem around the 512MB mark in a file (before, at, and after it).

http://codepad.org/msApnHFY

If you don't have gcc installed (it isn't by default):

sudo apt-get install build-essential

To compile:

gcc -o tester tester.c
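In case the codepad paste becomes unavailable, here is a minimal sketch of what such a back-to-front block-swap tester might look like. This is not the original program: the file argument handling, exact base offset, and error handling are my assumptions; only the 8K block size, the 100-block count, and the swap-twice-to-restore idea come from the description above.

/*
 * Sketch of a block-swap tester (assumptions noted above).
 * Swaps 100 8K blocks straddling the 512MB mark, back to front.
 * Running it twice on the same file should restore the original
 * contents, so before/after hashes should match.
 */
#include <stdio.h>
#include <stdlib.h>

#define BLOCK_SIZE 8192
#define NUM_BLOCKS 100
/* Centre the 100-block window on the 512MB mark (assumed layout). */
#define BASE_OFFSET (512L * 1024 * 1024 - (NUM_BLOCKS / 2) * (long)BLOCK_SIZE)

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "r+b");
    if (!fp) {
        perror("fopen");
        return 1;
    }

    unsigned char a[BLOCK_SIZE], b[BLOCK_SIZE];
    long i;

    /* Swap block i with block (NUM_BLOCKS - 1 - i), back to front. */
    for (i = 0; i < NUM_BLOCKS / 2; i++) {
        long off_a = BASE_OFFSET + i * (long)BLOCK_SIZE;
        long off_b = BASE_OFFSET + (NUM_BLOCKS - 1 - i) * (long)BLOCK_SIZE;

        if (fseek(fp, off_a, SEEK_SET) || fread(a, 1, BLOCK_SIZE, fp) != BLOCK_SIZE)
            goto fail;
        if (fseek(fp, off_b, SEEK_SET) || fread(b, 1, BLOCK_SIZE, fp) != BLOCK_SIZE)
            goto fail;

        if (fseek(fp, off_a, SEEK_SET) || fwrite(b, 1, BLOCK_SIZE, fp) != BLOCK_SIZE)
            goto fail;
        if (fseek(fp, off_b, SEEK_SET) || fwrite(a, 1, BLOCK_SIZE, fp) != BLOCK_SIZE)
            goto fail;
    }

    fclose(fp);
    return 0;

fail:
    perror("block swap");
    fclose(fp);
    return 1;
}

Running it twice and comparing hashes (e.g. md5sum before and after) is what should reveal any corruption: if the second run does not restore the file byte-for-byte, the filesystem mishandled one of the reads or writes.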
Further, I wrote a Perl script to simplify testing:

http://codepad.org/djmLaJ0l

To run:

chmod +x tester.pl
./tester.pl

If those links disappear at some stage, the programs can also be found here:

http://starcraftmazter.net/launchpad/453579/

Both my friend and I have run the test on the Ubuntu ISO itself. I am using a 64-bit install of 9.10 and he is using a 32-bit install of 9.10. The kernel used for our tests is the default 2.6.31-14-generic, and we are both on ext4. Both of our tests came up fine: read/write works correctly and the before/after hashes are identical, so we could not observe the reported corruption.

I would encourage anyone experiencing this problem to run the above tests and report what happens, in an effort to isolate the problem.

Cheers

-- 
corruption of large files reported with linux 2.6.31-14.46 on ext4
https://bugs.launchpad.net/bugs/453579
