On 10/09/15 22:11, Jeff Law wrote:
> On 09/10/2015 12:23 PM, Bernd Schmidt wrote:
>>> No testcase provided, as currently I don't know of targets with a high
>>> enough branch cost to actually trigger the optimisation.
>>
>> Hmm, so the code would not actually be used right now? In that case I'll
>> leave it to others to decide whether we want to apply it. Other than the
>> points above it looks OK to me.
>
> Some targets have -mbranch-cost to allow overriding the default costing.
> visium has a branch cost of 10!  Several ports have a cost of 6 either
> unconditionally or when the branch is not well predicted.

Presumably James is more interested in the ARM/AArch64 targets ;-)
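
For reference, BRANCH_COST is a target macro taking two arguments,
(speed_p, predictable_p), so a port can key the value off both.  From
memory (not verbatim from the sources), the visium definition is just a
flat constant, along the lines of:

  /* visium-style: branches cost 10 no matter what.  */
  #define BRANCH_COST(speed_p, predictable_p) 10

and the "6 when not well predicted" ports condition it on the second
argument, something like:

  /* Hypothetical predictability-sensitive definition.  */
  #define BRANCH_COST(speed_p, predictable_p) \
    ((predictable_p) ? 2 : 6)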

> I think that's probably what James is most interested in getting some
> ideas around -- the cost model.
>
> I think the fundamental problem is BRANCH_COST isn't actually relative
> to anything other than the default value of "1".  It doesn't directly
> correspond to COSTS_N_INSNS or anything else.  So while using
> COSTS_N_INSNS (BRANCH_COST) would seem to make sense, it actually
> doesn't.  It's not even clear how a value of 10 relates to a value of 1
> other than it's more expensive.
>
> ifcvt (and others) comparing to magic #s is more than a bit lame.  But
> with BRANCH_COST having no meaning relative to anything else I can see
> why Richard did things that way.

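To make the "magic #s" point concrete, the noce_* transforms in ifcvt.c
gate themselves on raw comparisons of the branch cost against small
constants, roughly of this shape (paraphrased, not a verbatim quote of
the sources):

  /* The branch cost is fetched as a bare, unitless number...  */
  int branch_cost = BRANCH_COST (optimize_insn_for_speed_p (), false);

  /* ...and compared against a magic constant, with no relation to
     COSTS_N_INSNS or to the rtx cost of the replacement sequence.  */
  if (branch_cost >= 2)
    ; /* Try the more expensive branchless sequence.  */
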
Out of interest, what was the original intended meaning of branch
costs, if it was not to be relative to instructions?

Thanks,
Kyrill

> In an ideal world we'd find some mapping from BRANCH_COST that relates
> to COSTS_N_INSNS.  I suspect a simple mapping doesn't necessarily exist
> and we'll likely regress targets with any simplistic mapping.  But maybe
> now is the time to address that fundamental problem and deal with the
> fallout.
>
> jeff

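For what it's worth, the naive mapping would presumably look something
like this (a hypothetical helper, not existing GCC code), which makes
the problem visible: it silently declares that one unit of branch cost
equals one average instruction, and nothing in the documentation
promises that:

  /* Hypothetical: translate BRANCH_COST's unitless value onto the
     COSTS_N_INSNS scale used by rtx_cost/seq_cost.  */
  static int
  branch_cost_in_insns (bool speed_p, bool predictable_p)
  {
    return COSTS_N_INSNS (BRANCH_COST (speed_p, predictable_p));
  }

On that reading visium would claim a branch is as expensive as ten
average instructions, which may or may not be what the port author
meant.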