Hi, I have been trying to get OpenGL software rasterization working on a Linux-based ARM Cortex-A15 device, but for some reason Gallium llvmpipe gives me a segmentation fault. Has anybody managed to run software rasterization with Gallium llvmpipe on ARM? Softpipe seems to work fine.
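For reference, here is a minimal sketch of how the two rasterizers can be switched at runtime, assuming the standard Mesa/llvmpipe environment variables (GALLIUM_DRIVER selects the Gallium software driver; LP_NUM_THREADS is llvmpipe-specific, and 0 keeps rasterization on the calling thread, which should make the crash easier to inspect in gdb, since the backtrace below comes from a rasterizer thread):

  GALLIUM_DRIVER=llvmpipe weston-simple-egl     # segfaults
  GALLIUM_DRIVER=softpipe weston-simple-egl     # works
  # run rasterization single-threaded under gdb to catch the fault directly
  LP_NUM_THREADS=0 GALLIUM_DRIVER=llvmpipe gdb --args weston-simple-egl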
I am using the following versions:
- Mesa 9.2.5
- LLVM 3.3
- Wayland/Weston 1.5

The issue shows up in OpenGL/GLES2 rendering: the weston-simple-egl demo and the Qt OpenGL demos segfault when run with llvmpipe enabled. The backtrace from weston-simple-egl looks something like this:

#0  0x6d7de620 in ?? ()
#1  0xb354004c in fs17_variant0_partial ()
#2  0xb6841740 in lp_rast_shade_quads_mask (task=task@entry=0x1e360, inputs=inputs@entry=0x47a00, x=128, y=80, mask=4369) at lp_rast.c:466
#3  0xb6842d04 in do_block_4_1 (c=<optimized out>, y=<optimized out>, x=<optimized out>, plane=<optimized out>, tri=<optimized out>, task=<optimized out>) at lp_rast_tri_tmp.h:61
#4  do_block_16_1 (c=<synthetic pointer>, y=<optimized out>, x=<optimized out>, plane=0xb4f4ccf0, tri=<optimized out>, task=<optimized out>) at lp_rast_tri_tmp.h:130
#5  lp_rast_triangle_1 (task=0x1e360, arg=...) at lp_rast_tri_tmp.h:232
#6  0xb6840420 in do_rasterize_bin (bin=<optimized out>, task=0x1e360, x=<optimized out>, y=<optimized out>) at lp_rast.c:607
#7  rasterize_bin (y=<optimized out>, x=<optimized out>, bin=<optimized out>, task=0x1e360) at lp_rast.c:626
#8  rasterize_scene (task=task@entry=0x1e360, scene=0xb3646008) at lp_rast.c:675
#9  0xb6840cb0 in thread_function (init_data=0x1e360) at lp_rast.c:788
#10 0xb6d98ed2 in start_thread () from /lib/libpthread.so.0
#11 0xb6c91058 in ?? () from /lib/libc.so.6

regards,
Marko
marko.s.mob...@gmail.com