> I looked into this a little bit and it turns out that we can't use a
> bitset easily because we don't know at compile time how many bits we
> need.
It seems as if bitsets are only used to determine the number of destinations in a memory hierarchy (e.g. L2 sharers). We don't know that at compile time, but we do know it at configuration time (SWIG time?). That means that once Ruby builds its system, it can pass the maximum number of sharers to the gem5 builder and use that to generate a file with a typedef'd bitset. Something like:

    typedef std::bitset<MaxSharers> NetDest;

Would that make sense? It's at least a better option than being limited to 64 cores of simulation.

--
- Korey
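A minimal sketch of what such a configuration-generated header could look like (the file name, include guard, and the value of MaxSharers are all hypothetical; in practice the value would be filled in by Ruby's configuration step, not hard-coded):

    // NetDest.hh -- sketch of a header emitted at configuration time,
    // once Ruby knows the maximum number of sharers in the system.
    #ifndef __GENERATED_NETDEST_HH__
    #define __GENERATED_NETDEST_HH__

    #include <bitset>

    // Example value only; the configuration/build step would substitute
    // the real maximum sharer count here.
    static const int MaxSharers = 128;

    typedef std::bitset<MaxSharers> NetDest;

    #endif // __GENERATED_NETDEST_HH__

The upside is that the bit width tracks the configured system size instead of being capped at 64; the trade-off is that NetDest's width is fixed when the header is generated, so a rebuild is needed whenever the maximum sharer count changes.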
