Alessandro Rubini wrote:
From: Alessandro Rubini rub...@unipv.it
If source and destination are aligned, this copies ulong values
for as long as possible; the trailing part is copied byte by byte. Thanks for
the details to Wolfgang Denk, Mike Frysinger, Peter Tyser, Chris Moore.
Signed-off-by: Alessandro Rubini rub...@unipv.it
Hello Chris
+unsigned long *dl = (unsigned long *)dest, *sl = (unsigned long *)src;
Nitpick: Are you sure the casts are necessary here ?
Without the cast on src it complains because of const. So I wrote
both for symmetry.
+ if ( (((ulong)dest | (ulong)src) & (sizeof(*dl) - 1)) == 0) {
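
For context, here is a minimal sketch of how the whole routine could fit
together, reconstructed from the quoted hunks (illustrative only, not the
exact patch text). The cast on src is the one discussed above: assigning the
const void * parameter to unsigned long * without it draws a qualifier
warning.

        #include <stddef.h>     /* for size_t */

        void *memcpy(void *dest, const void *src, size_t count)
        {
                unsigned long *dl = (unsigned long *)dest, *sl = (unsigned long *)src;
                char *d8, *s8;

                /* if both pointers are word-aligned, copy a word at a time */
                if ((((unsigned long)dest | (unsigned long)src) & (sizeof(*dl) - 1)) == 0) {
                        while (count >= sizeof(*dl)) {
                                *dl++ = *sl++;
                                count -= sizeof(*dl);
                        }
                }

                /* copy the remainder (or everything, when unaligned) byte by byte */
                d8 = (char *)dl;
                s8 = (char *)sl;
                while (count--)
                        *d8++ = *s8++;

                return dest;
        }
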
On Friday 09 October 2009 05:12:20 Alessandro Rubini wrote:
+ /* while all data is aligned (common case), copy a word at a time */
+ if ( (((ulong)dest | (ulong)src | count) & (sizeof(*dl) - 1)) == 0) {
I think you want to drop count from the list, otherwise we don't consume
the leading groups of 4 bytes if count isn't a multiple of 4.
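
A tiny stand-alone check makes the point concrete (hypothetical addresses and
length, just to illustrate): for an aligned pair of buffers and an odd length
such as 259, the test that ORs in count falls back to the byte-only path,
while the pointer-only test takes the word path and leaves only the tail for
the byte loop.

        #include <stdio.h>

        int main(void)
        {
                unsigned long dest = 0x1000, src = 0x2000;  /* hypothetical aligned addresses */
                unsigned long count = 259;                  /* not a multiple of sizeof(long) */
                unsigned long mask = sizeof(unsigned long) - 1;

                /* test as posted: count's low bits force the all-bytes fallback */
                printf("with count in the mask:    %s\n",
                       ((dest | src | count) & mask) == 0 ? "word path" : "byte-only path");

                /* pointer-only test: take the word path, then byte-copy the tail */
                printf("without count in the mask: %s\n",
                       ((dest | src) & mask) == 0 ? "word path" : "byte-only path");

                return 0;
        }
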
Yes, same for memset. See, Wolfgang, it was not 10% more? These micro-
optimizations are hairy, as you need to measure them to make sure they
actually work.
Ok, V4
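
For anyone who wants to reproduce such a measurement, a rough host-side
timing harness might look like the sketch below (illustrative only, plain
standard C, not how it was measured in the thread; buffer size and loop count
are arbitrary). On the actual board one would use the firmware's own timer
instead of clock(), and compare aligned vs. unaligned buffers and lengths
that are and are not multiples of the word size.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <time.h>

        #define BUF_SIZE (1024 * 1024)
        #define LOOPS    200

        int main(void)
        {
                char *src = malloc(BUF_SIZE);
                char *dst = malloc(BUF_SIZE);
                clock_t t0, t1;
                int i;

                if (!src || !dst)
                        return 1;
                memset(src, 0xa5, BUF_SIZE);

                t0 = clock();
                for (i = 0; i < LOOPS; i++)
                        memcpy(dst, src, BUF_SIZE);     /* swap in the variant under test */
                t1 = clock();

                printf("%d copies of %d bytes: %.3f s\n",
                       LOOPS, BUF_SIZE, (double)(t1 - t0) / CLOCKS_PER_SEC);

                free(src);
                free(dst);
                return 0;
        }
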