flex has a --reentrant (or -R) flag. Seems necessary. Nate
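For reference, here is a minimal sketch of what a reentrant flex spec and its calling convention look like. This is illustrative only, not taken from the m5/EIO source: `%option reentrant` makes flex thread all scanner state through an explicit `yyscan_t` handle instead of file-scope globals, which is what would let independent trace lexers run concurrently.

```lex
/* Sketch only: a reentrant flex scanner spec.
 * With %option reentrant, the generated yylex() takes a yyscan_t
 * argument and keeps no global state, so one scanner per
 * system/thread is safe. */
%option reentrant noyywrap
%%
[0-9]+    { /* token action; touches only the per-scanner state */ }
.|\n      { /* skip everything else */ }
%%
/* Usage from C/C++ (one scanner instance per trace file):
 *
 *   yyscan_t scanner;
 *   yylex_init(&scanner);            // allocate per-instance state
 *   yyset_in(trace_file, scanner);   // instead of assigning yyin
 *   while (yylex(scanner)) { ... }
 *   yylex_destroy(scanner);          // free per-instance state
 */
```

The `yylex_init`/`yyset_in`/`yylex_destroy` calls are the standard reentrant-flex API; the old non-reentrant interface (global `yyin`, zero-argument `yylex()`) is exactly the shared state that breaks when two EIO workloads parse at once.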
> EIO uses a flex-generated lexer to parse the trace file, and it
> wouldn't surprise me at all if that's not thread-safe. Perhaps newer
> versions of flex enable thread-safe lexing.
>
> Steve
>
> On Thu, Nov 12, 2009 at 4:09 AM, Stijn Souffriau
> <[email protected]> wrote:
>> Hi again,
>>
>> I'm currently testing my code with eio traces. I'm going to need these
>> for benchmarking later as well. The problem is that my code fails
>> completely when running more than one eio workload concurrently, leading
>> me to think that this code is not thread-safe. I looked at the code but
>> it's too complex to see the problem at first glance. Does anyone know
>> what the problem(s) could be and how I could fix it (them)? I'm
>> currently simulating multiple systems concurrently that don't interact.
>> Each system has its own EventQueue. (By the way, I can run multiple
>> "hello world" workloads with SE.)
>>
>> Here is some of the error output:
>>
>> fatal: fatal: system1.core.workload: cannot read EIO transaction
>> @ cycle 500
>> [read_trace:build/ALPHA_SE/eio/eio.cc, line 447]
>> Memory Usage: 4369996 KBytes
>> For more information see: http://www.m5sim.org/fatal/fc06e2e5
>> system0.core.workload: cannot read EIO transaction
>> @ cycle 500
>> [read_trace:build/ALPHA_SE/eio/eio.cc, line 447]
>> Memory Usage: 4369996 KBytes
>> For more information see: http://www.m5sim.org/fatal/fc06e2e5
>>
>> -----------------
>>
>> panic: bogus token
>> @ cycle 500
>> [exo_read:build/ALPHA_SE/eio/libexo.cc, line 1284]
>> Memory Usage: 4370276 KBytes
>> For more information see: http://www.m5sim.org/panic/dc582bd7
>> Program aborted at cycle 500
>> ./m5.2.par_eio_1core.sh: line 84: 7796 Aborted
>> ./${binary} -d "${outputdir}" --stats-file=${statsfile} ${config} -E
>> CPU_TYPE=${CPU_TYPE} -E BENCHMARK1="${prog1}-eio"
>>
>> ----------------
>>
>> EXO parse error: line 27967: unexpected ',' encountered
>> flex scanner push-back overflow
>>
>>
>> _______________________________________________
>> m5-dev mailing list
>> [email protected]
>> http://m5sim.org/mailman/listinfo/m5-dev
