Hi Wei Shu,

The issue with glibc is also something I experienced before. The benchmark was 
able to run fine outside gem5, but inside gem5 it produced that error. Someone 
taught me to recompile the benchmark inside the disk image loaded by gem5. That 
step involves chrooting into the mounted disk image and then recompiling from 
there, so that the executable is built against the same glibc that the guest 
environment provides. Right now, assuming you are running on Ubuntu, your host 
Linux kernel is 4.x, but the gem5 disk image is running kernel version 2.x.
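
A rough sketch of that chroot-and-recompile workflow (the image path, mount 
point, and partition offset below are assumptions; adjust them to your setup, 
and the whole procedure needs root):

```shell
# Sketch only: mount the gem5 disk image, chroot in, and rebuild there.
# IMG and MNT are hypothetical paths; replace with your actual locations.
IMG=$HOME/full_system_images/disks/linux-x86.img
MNT=/mnt/gem5img

sudo mkdir -p "$MNT"
# The first partition often starts at sector 63 (offset 63*512 = 32256);
# check with `fdisk -l "$IMG"` if this mount fails.
sudo mount -o loop,offset=32256 "$IMG" "$MNT"

# Make your benchmark sources visible inside the image, then enter it.
sudo cp -r ~/splash2x "$MNT/root/"
sudo chroot "$MNT" /bin/bash
# ...inside the chroot: cd /root/splash2x && rebuild as usual...

# When done, leave the chroot and unmount.
sudo umount "$MNT"
```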

I believe once you solve the glibc issue, the segfault shouldn't appear 
anymore.
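
To confirm the mismatch before recompiling, you can list which glibc symbol 
versions a binary actually requires (a generic check; `/bin/ls` stands in here 
for one of your benchmark binaries such as `./lu_cb-dynamic`):

```shell
# List the GLIBC symbol versions a dynamically linked ELF binary requires.
# Substitute your benchmark binary for /bin/ls.
objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -u -V
```

Any version listed that is newer than the libc inside the disk image 
(libc-2.6.1 in your volrend log) will produce exactly the 
`version 'GLIBC_2.7' not found` error.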

Elena Woo Lai Leng

> On 3 Mar 2017, at 4:45 PM, Wei Shu <[email protected]> wrote:
> 
> Hi Elena,
> 
> Thank you for your quick response. It is a great help. I was surprised that 
> fmm, ocean_cp, fft, and radix pass the test after I compiled them with 
> dynamic linking. Initially I thought I should link them statically. 
> Several of them still do not work. Below you can find the error report.
> 
> Perhaps someone more experienced knows the solution. 
> 
> exeuting ./ocean_ncp -n258 -p1 -e1e-07 -r20000 -t28800
> Ocean simulation with W-cycle multigrid solver
>    Processors                         : 1
>    Grid size                          : 258 x 258
>    Grid resolution (meters)           : 20000.00
>    Time between relaxations (seconds) : 28800
>    Error tolerance                    : 1e-07
> 
> 
> ocean_ncp-dynam[953]: segfault at 0 ip 0000000000401359 sp 00007fffe6df08a0 
> error 6 in ocean_ncp-dynamic[400000+2b000]
> Segmentation fault
> 
> 
> exeuting ./volrend 1 head-scaleddown4 4
> Both computing normals, opacities and octree as well as rendering: starting 
> from .den file
> NOTE: Preprocessing (computing normals, opacities and octree) is done 
> serially by only one processor
> Gouraud shading from lookup tables used
>    1 processes
>    4x4 image block size
>    doing nonadaptive rendering
>    using precomputed normals and opacities in separate array
>    using opacity octree
>    starting angle at (90.0, -36.0, 0.0) with rotation step of 3
> rotating 4 steps in each of the three Cartesian directions
> 
> *****Entering init_decomposition with num_nodes = 1
> *****Exited init_decomposition with num_nodes = 1
> volrend-dynamic[956]: segfault at 4effa204 ip 00007fa3fe37dad0 sp 
> 00007fff06cfb858 error 4 in libc-2.6.1.so[7fa3fe30a000+136000]
> Segmentation fault
> 
> exeuting ./water_nsquared < water_nsquared_input_1
> ./water_nsquared-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found 
> (required by ./water_nsquared-dynamic)
> 
> 
> exeuting .water_spatial 1 < water_spatial_input_1
> ./water_spatial-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found 
> (required by ./water_spatial-dynamic)
> 
> 
> exeuting ./lu_cb -p1 -n512 -b16
> ./lu_cb-dynamic: /lib/libc.so.6: version `GLIBC_2.14' not found (required by 
> ./lu_cb-dynamic)
> 
> 
> exeuting ./lu_ncb -p1 -n512 -b16
> ./lu_ncb-dynamic: /lib/libc.so.6: version `GLIBC_2.14' not found (required by 
> ./lu_ncb-dynamic)
> 
> 
> exeuting ./radiosity -bf 1.5e-1 -batch -room -p 1
> ./raidosity-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found (required 
> by ./raidosity-dynamic)
> 
> 
> exeuting ./cholesky < cholesky_tk14.O
> ./cholesky-dynamic: /lib/libc.so.6: version `GLIBC_2.14' not found (required 
> by ./cholesky-dynamic)
> ./cholesky-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found (required 
> by ./cholesky-dynamic)
> 
> exeuting ./raytrace -s -p1 -a4 teapot.env
> ./raytrace-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found (required 
> by ./raytrace-dynamic)
> 
> exeuting barnes 1 < barnes_input_test
> ./barnes-dynamic: /lib/libc.so.6: version `GLIBC_2.14' not found (required by 
> ./barnes-dynamic)
> ./barnes-dynamic: /lib/libc.so.6: version `GLIBC_2.7' not found (required by 
> ./barnes-dynamic)
> 
> 
> 
> The "GLIBC not found" problem may be due to compiling with dynamic linking, 
> as my native GLIBC version is higher than the one in this kernel's image.
> 
> Thanks,
> Wei
> 
> ----- Original Message -----
> From: "Woo L.L." <[email protected]>
> To: "gem5 users mailing list" <[email protected]>
> Sent: Thursday, March 2, 2017 6:38:36 PM
> Subject: Re: [gem5-users] Compiled Splash2x benchmarks segfault under FS with 
> X86
> 
> Hi Wei Shu,
> 
> I had similar problems when I tried running benchmarks from MiBench in FS 
> mode using the X86 architecture. I think the problem was that all the 
> benchmarks were compiled statically. Once I recompiled those benchmarks 
> dynamically, the segmentation faults were solved. 
> 
> Perhaps you can try to recompile your benchmarks dynamically. 
> Thanks.
> 
> Regards,
> Elena Woo
> 
>> On 2 Mar 2017, at 9:53 PM, Wei Shu <[email protected]> wrote:
>> 
>> Hi all,
>> 
>> I compiled all of the Splash2x benchmarks statically from the PARSEC 3.0 
>> package. All of them run successfully on my native X86 machine. 
>> However, when I run these binaries in gem5 (built with the X86 ISA, and run 
>> with fs.py on a single core, with or without caches), all of the 
>> benchmarks segfault:
>> 
>> exeuting barnes 1 < barnes_input_test
>> barnes[952]: segfault at 4d218fe8 ip 0000000000400ec4 sp 00007fff4d218ff0 
>> error 6 in barnes[400000+116000]
>> Segmentation fault
>> 
>> exeuting ./fmm 1 < fmm_input_test1
>> fmm[953]: segfault at 2caf38c8 ip 0000000000400c14 sp 00007fff2caf38d0 error 
>> 6 in fmm[400000+130000]
>> Segmentation fault
>> 
>> exeuting ./ocean_cp -n258 -p1 -e1e-07 -r20000 -t28800
>> ocean_cp[954]: segfault at 24cafa38 ip 0000000000402084 sp 00007fff24cafa40 
>> error 6 in ocean_cp[400000+161000]
>> Segmentation fault
>> 
>> exeuting ./ocean_ncp -n258 -p1 -e1e-07 -r20000 -t28800
>> ocean_ncp[955]: segfault at 86e6bbf8 ip 0000000000402454 sp 00007fff86e6bc00 
>> error 6 in ocean_ncp[400000+146000]
>> Segmentation fault
>> 
>> exeuting ./radiosity -bf 1.5e-1 -batch -room -p 1
>> radiosity[956]: segfault at 8f72a4b8 ip 0000000000401614 sp 00007fff8f72a4c0 
>> error 6 in radiosity[400000+102000]
>> Segmentation fault
>> 
>> exeuting ./raytrace -s -p1 -a4 teapot.env
>> raytrace[957]: segfault at 4f8ef698 ip 00000000004010d4 sp 00007fff4f8ef6a0 
>> error 6 in raytrace[400000+13e000]
>> Segmentation fault
>> 
>> exeuting ./volrend 1 head-scaleddown4 4
>> volrend[958]: segfault at c4797548 ip 0000000000400bc4 sp 00007fffc4797550 
>> error 6 in volrend[400000+12f000]
>> Segmentation fault
>> 
>> exeuting ./water_nsquared < water_nsquared_input_1
>> water_nsquared[959]: segfault at e6defbc8 ip 0000000000401434 sp 
>> 00007fffe6defbd0 error 6 in water_nsquared[400000+12e000]
>> Segmentation fault
>> 
>> exeuting .water_spatial 1 < water_spatial_input_1
>> water_spatial[960]: segfault at 4bc8ea58 ip 0000000000401d44 sp 
>> 00007fff4bc8ea60 error 6 in water_spatial[400000+12e000]
>> Segmentation fault
>> 
>> exeuting ./cholesky < cholesky_tk14.O
>> /tmp/script: line 21: can't open tk14.O: no such file
>> exeuting ./fft -m18 -p1
>> 
>> fft[961]: segfault at 7fe73ba8 ip 0000000000401724 sp 00007fff7fe73bb0 error 
>> 6 in fft[400000+f2000]
>> Segmentation fault
>> 
>> exeuting ./lu_cb -p1 -n512 -b16
>> Segmentation fault
>> 
>> exeuting ./lu_ncb -p1 -n512 -b16
>> Segmentation fault
>> 
>> exeuting ./radix -p1 -r4096 -n262144 -m524288
>> Segmentation fault
>> 
>> I searched a little and found that this segfault might be due to a cache 
>> alignment issue, and that proper source modification is required:
>> http://www.mail-archive.com/[email protected]/msg05592.html
>> But overall there's no solution yet.
>> 
>> I am confused about how to solve this. Did I miss something when compiling? 
>> Has anyone solved this problem? I am using GLIBC 2.24 and GCC 4.8.5.
>> 
>> Thanks for your help!
>> 
>> Wei
>> _______________________________________________
>> gem5-users mailing list
>> [email protected]
>> http://m5sim.org/cgi-bin/mailman/listinfo/gem5-users
