John, I'll try to reduce the testcase to something appropriate for sending to the list. At least I can now reproduce the old error (from April or March 2013). I am using creduce at commit 0ac7fc39d8260c7181344042979056157d8754aa.
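For context, x.sh is the interestingness test that creduce runs against each reduced variant of x.ii. I have not posted the real script, so the sketch below is only a hypothetical stand-in (the file name, the "template" pattern, and the self-fabricated input are illustrative; a real test would invoke the compiler on the variant that creduce places in a temp dir):

```shell
#!/bin/sh
# Hypothetical sketch of an interestingness test like x.sh (the real
# script was not posted). C-Reduce copies the current variant of x.ii
# next to this script and keeps the variant exactly when the script
# exits 0. A real test would run the compiler; this demo fabricates
# x.ii and greps for a pattern so it is self-contained.
cat > x.ii <<'EOF'
template <class T> struct S {};
EOF
if grep -q "template" x.ii; then
  result=interesting   # "bug" still present -> keep this variant
else
  result=boring        # "bug" gone -> discard this variant
fi
echo "$result"
rm -f x.ii
[ "$result" = interesting ]   # the exit status is what C-Reduce consults
```

The key point is the exit-status contract: creduce shrinks the file as far as it can while the test keeps exiting 0.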
When the reduced file reaches zero bytes, creduce hangs:

$ creduce --verbose x.sh x.ii
successfully checked prereqs for pass_blank
successfully checked prereqs for pass_lines
successfully checked prereqs for pass_special
successfully checked prereqs for pass_ternary
successfully checked prereqs for pass_balanced
successfully checked prereqs for pass_clang
successfully checked prereqs for pass_peep
successfully checked prereqs for pass_ints
successfully checked prereqs for pass_indent
===< 7086 >===
running 4 interestingness test(s) in parallel
sanity check... successful
INITIAL PASSES
===< pass_blank :: 0 >===
[0 pass_blank :: 0 s:0 f:0]
[0 pass_blank :: 0 s:0 f:0] success (0.3 %, 1278 bytes)
[0 pass_blank :: 0 s:1 f:0]
[0 pass_blank :: 0 s:1 f:0] success (0.3 %, 1278 bytes)
===< pass_lines :: 0 >===
[0 pass_lines :: 0 s:0 f:0] initial granularity = 30
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0] granularity = 15
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0] failure
granularity = 8
[0 pass_lines :: 0 s:0 f:1] failure
[0 pass_lines :: 0 s:0 f:1]
[0 pass_lines :: 0 s:0 f:2]
[0 pass_lines :: 0 s:0 f:2] success (100.0 %, 0 bytes)
granularity = 15
granularity = 8
granularity = 4
granularity = 2
granularity = 1
===< pass_lines :: 0 >===
[0 pass_lines :: 0 s:0 f:0]
[0 pass_lines :: 0 s:0 f:0] initial granularity = 0
granularity = 0

At this point creduce eats memory at 10--100 KB per second and 25--35% CPU time. After I hit ^C and re-run, I get a new error:

$ time creduce x.sh x.ii
===< 28446 >===
running 4 interestingness test(s) in parallel
===< pass_blank :: 0 >===
Illegal division by zero at /usr/local/bin/creduce line 105.

real    0m0.309s
user    0m0.135s
sys     0m0.043s

Thanks,
Dmitry

2013/6/9 John Regehr <[email protected]>:
> Dmitry, can you send me instructions for reproducing a C-Reduce hang on
> the current code?
>
> Thanks,
>
> John
>
> On 06/08/2013 02:58 PM, Дмитрий Дьяченко wrote:
>>
>> John,
>> with creduce-latest + clang-3.2 I got the same crash in test #6.
>> I have now found the crash file in a temporary directory under
>> creduce/tests/. The crash file is attached.
>>
>> Sorry, I did not mention earlier why I try clang-latest +
>> creduce-latest: first, for my own fun :), and second, in the hope of
>> providing useful feedback to your research group.
>> I go gcc-bug-hunting relatively rarely, and for each new bug I try the
>> newest clang + newest creduce :)
>>
>> Maybe you will find this [preliminary] comparison of creduce-current
>> with clang-3.2/3.4-trunk interesting:
>>
>> 1. comparison of creduce/tests/run_tests
>> a) the same tests FAIL and PASS
>> b) test #6 crashes with both clang versions
>> c) elapsed time in seconds, as run_tests reports:
>>
>>        3.2   3.4-current
>> 0:     922   207
>> 1:      23    28
>> 2:     928   212
>> 3:      17    24
>> 6:    2209  1048
>>
>> 2. comparison on real-world tasks:
>> results are identical with both versions of clang -- the reductions hang
>>
>> Thanks,
>> Dmitry
>>
>> 2013/6/8 John Regehr <[email protected]>:
>>>
>>> Hi Dmitry,
>>>
>>> I'm in the middle of some more C-Reduce changes right now. It might be
>>> best to not use the current version until things stabilize a bit.
>>>
>>>> -- tests 0..3 PASS
>>>> -- tests 4, 5, 7 FAIL with
>>>
>>> I'll look into this, thanks for letting us know.
>>>
>>>> 0. <eof> parser at end of file
>>>> sh: line 1: 23784 Segmentation fault (core dumped)
>>>> /home/dimhen/build/creduce/creduce/../clang_delta/clang_delta
>>>> --transformation=return-void --counter=1
>>>> /home/dimhen/tmp/creduce-WStouG/small.c > /tmp/file0I5kCp
>>>
>>> Right now we only develop and test using Clang 3.2, so we can't help
>>> you with crashes against the LLVM top of tree. It's super easy to
>>> download the 3.2 binary distribution and compile C-Reduce against it,
>>> so I'd ask you to just do that.
>>>
>>> I have not seen any hangs like you are seeing, but it could easily be
>>> that I have introduced new bugs lately. I'm still making some changes
>>> to the core of C-Reduce but will hopefully be done soon, and then I'll
>>> try to reproduce the problems you are seeing.
>>>
>>> John
