Many systems (e.g., Ubuntu) ship a default configuration such that a core file never appears in the current directory. Instead it is either logged and discarded, or processed as if it were due to buggy OS software, in order to notify distribution maintainers so they can fix the bug.
Depending on core dumping behavior is not portable.

Bob

On Thu, Mar 5, 2026, 2:34 PM G. Branden Robinson <[email protected]> wrote:

> Hi Zack,
>
> Thanks for the swift follow-up!
>
> At 2026-03-05T15:59:41-0500, Zack Weinberg wrote:
> > On Thu, Mar 5, 2026, at 10:58 AM, G. Branden Robinson wrote:
> > > I had to use a nasty hack to force serialization of two regression
> > > tests that both can cause core dumps, and so can race with each
> > > other in the TOCTTOU window on the existence of a "core" file.
> >
> > Are the core dumps necessary for the test?
>
> Strictly?  Probably not.  I could alter the test cases to format some
> hapax legomenon text after the applicable crash points observed in the
> past and then grep the formatter output for it.
>
> > If not, I would suggest wrapping each one with a shell script that
> > does `ulimit -c 0` and then execs the main test program.
>
> I kind of liked the idea of leaving a meaningful[1] core file in the
> build tree so that it could be immediately inspected without having to
> hack up the test script to take out the "ulimit -c 0".
>
> Was I using the documented serialization mechanism incorrectly?  Why
> didn't it work?
>
> Regards,
> Branden
>
> [1] assuming the person doing the build didn't employ "configure" and
>     "make" options so as to boil off all useful debugging information,
>     as distributors have traditionally loved to do
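For what it's worth, the wrapper Zack suggested could look something like the sketch below (the script name and usage are illustrative, not from the thread). `ulimit -c 0` sets the core-file size limit of the shell to zero, and because `exec` replaces the shell with the test program rather than forking, the test inherits that limit and cannot drop a "core" file at all:

```shell
#!/bin/sh
# no-core.sh -- hypothetical wrapper: run a test with core dumps disabled.
# Usage: no-core.sh TEST-PROGRAM [ARGS...]

ulimit -c 0    # limit core file size to zero for this shell and its children
exec "$@"      # replace the shell with the actual test program
```

Two racing tests wrapped this way can run in parallel safely, since neither ever creates a "core" file to race over; the tradeoff Branden notes still applies, though, in that nothing is left behind in the build tree to inspect after a crash.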
