Hi,

Every time I try to use KLEE with the "--libc=uclibc" option on x86_64, I get a segfault. This occurs regardless of the program being analyzed, so it seems to be an issue with loading uClibc:
$ llvm-gcc -emit-llvm -c -g test.c -I/home/mlcreech/local/include
$ klee --only-output-states-covering-new --libc=uclibc test.o
0  klee            0x0000000000cc414f
1  klee            0x0000000000cc5dfd
2  libpthread.so.0 0x00007f744fba19c0
3  klee            0x00000000008ed4a0
4  klee            0x00000000008eda7c
5  klee            0x00000000008eda7c
6  klee            0x00000000008eda7c
7  klee            0x00000000008ee01f
8  klee            0x00000000008f3135 llvm::Linker::LinkModules(llvm::Module*, llvm::Module*, std::string*) + 3237
9  klee            0x00000000008f7ad6 llvm::Linker::LinkInArchive(llvm::sys::Path const&, bool&) + 1462
10 klee            0x00000000008ea845 llvm::Linker::LinkInFile(llvm::sys::Path const&, bool&) + 997
11 klee            0x00000000005936c2 klee::linkWithLibrary(llvm::Module*, std::string const&) + 98
12 klee            0x000000000053eb9a
13 klee            0x000000000053fc5b main + 1035
14 libc.so.6       0x00007f744eeb2a3d __libc_start_main + 253
15 klee            0x0000000000538a39
Segmentation fault

I'm not sure which flags I'd need to get a better backtrace: "-g" is already in my CFLAGS; I don't see ENABLE_OPTIMIZED actually being used anywhere, so disabling it didn't help; and passing --with-runtime=Debug to the KLEE configure script (assuming that's a valid choice) yielded compile errors.

My uClibc config is mostly standard (for the most part I just picked the default answers). I'm using LLVM 2.6 and the latest KLEE from svn. Any ideas on where to look? Thanks!

-- 
Matthew L. Creech
