Prefetching the data into the caches doesn't improve the performance of cmap_find_batch(). Moreover, a slight performance improvement was observed without prefetching.
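
For context, OVS_PREFETCH is only a hint to the CPU; the sketch below shows its usual definition (in lib/util.h it expands to GCC's __builtin_prefetch and to nothing on other compilers; treat the exact form as illustrative). Removing the calls therefore only drops the hint and does not change which data is read:

    /* Sketch of the usual OVS_PREFETCH definition (illustrative):
     * with GCC/Clang it emits a non-faulting prefetch hint; on other
     * compilers it compiles to nothing. */
    #ifdef __GNUC__
    #define OVS_PREFETCH(addr)       __builtin_prefetch((addr))
    #define OVS_PREFETCH_WRITE(addr) __builtin_prefetch((addr), 1)
    #else
    #define OVS_PREFETCH(addr)
    #define OVS_PREFETCH_WRITE(addr)
    #endif
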
This patch removes prefetching from cmap_find_batch().

Signed-off-by: Bhanuprakash Bodireddy <bhanuprakash.bodire...@intel.com>
Co-authored-by: Antonio Fischetti <antonio.fische...@intel.com>
Signed-off-by: Antonio Fischetti <antonio.fische...@intel.com>
---
 lib/cmap.c | 8 ++------
 1 file changed, 2 insertions(+), 6 deletions(-)

diff --git a/lib/cmap.c b/lib/cmap.c
index 8c7312d..8097b56 100644
--- a/lib/cmap.c
+++ b/lib/cmap.c
@@ -393,11 +393,10 @@ cmap_find_batch(const struct cmap *cmap, unsigned long map,
     const struct cmap_bucket *b2s[sizeof map * CHAR_BIT];
     uint32_t c1s[sizeof map * CHAR_BIT];
 
-    /* Compute hashes and prefetch 1st buckets. */
+    /* Compute hashes. */
     ULLONG_FOR_EACH_1(i, map) {
         h1s[i] = rehash(impl, hashes[i]);
         b1s[i] = &impl->buckets[h1s[i] & impl->mask];
-        OVS_PREFETCH(b1s[i]);
     }
     /* Lookups, Round 1. Only look up at the first bucket. */
     ULLONG_FOR_EACH_1(i, map) {
@@ -411,15 +410,13 @@ cmap_find_batch(const struct cmap *cmap, unsigned long map,
         } while (OVS_UNLIKELY(counter_changed(b1, c1)));
 
         if (!node) {
-            /* Not found (yet); Prefetch the 2nd bucket. */
+            /* Not found (yet). */
             b2s[i] = &impl->buckets[other_hash(h1s[i]) & impl->mask];
-            OVS_PREFETCH(b2s[i]);
             c1s[i] = c1; /* We may need to check this after Round 2. */
             continue;
         }
         /* Found. */
         ULLONG_SET0(map, i); /* Ignore this on round 2. */
-        OVS_PREFETCH(node);
         nodes[i] = node;
     }
     /* Round 2. Look into the 2nd bucket, if needed. */
@@ -453,7 +450,6 @@ cmap_find_batch(const struct cmap *cmap, unsigned long map,
             continue;
         }
 found:
-        OVS_PREFETCH(node);
         nodes[i] = node;
     }
     return result;
-- 
2.4.11
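
As a usage note (not part of the patch): a hedged sketch of how a caller might drive the batched lookup. The helper name and batch-size handling below are illustrative; only the cmap_find_batch() signature is taken from the hunks above.

    #include <limits.h>
    #include <stddef.h>
    #include <stdint.h>

    #include "cmap.h"
    #include "util.h"

    /* Illustrative helper: look up the first 'n' hashes in one batched call.
     * 'map' has one bit per slot to look up; cmap_find_batch() returns that
     * map with the bits still set for the slots that were found and stores
     * the matching candidate lists in nodes[]. */
    static unsigned long
    lookup_first_n(const struct cmap *cmap, uint32_t hashes[], size_t n,
                   const struct cmap_node *nodes[])
    {
        unsigned long map;

        ovs_assert(n <= sizeof map * CHAR_BIT);
        map = n == sizeof map * CHAR_BIT ? ~0UL : (1UL << n) - 1;

        return cmap_find_batch(cmap, map, hashes, nodes);
    }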