x86, mem: memmove_64.S: Optimize memmove by enhanced REP MOVSB/STOSB
author	Fenghua Yu <fenghua.yu@intel.com>
	Tue, 17 May 2011 22:29:17 +0000 (15:29 -0700)
committer	H. Peter Anvin <hpa@linux.intel.com>
	Tue, 17 May 2011 22:40:30 +0000 (15:40 -0700)
commit	057e05c1d6440117875f455e59da8691e08f65d5
tree	79c8306de2ebe31252b730170e33bd62ed18ccdb
parent	101068c1f4a947ffa08f2782c78e40097300754d
x86, mem: memmove_64.S: Optimize memmove by enhanced REP MOVSB/STOSB

Implement memmove() using enhanced REP MOVSB. On processors that support
enhanced REP MOVSB/STOSB, an alternative memmove() implementation based on
enhanced REP MOVSB overrides the original function.

The patch does not change the backward-copy case of memmove() to use
enhanced REP MOVSB.

Signed-off-by: Fenghua Yu <fenghua.yu@intel.com>
Link: http://lkml.kernel.org/r/1305671358-14478-9-git-send-email-fenghua.yu@intel.com
Signed-off-by: H. Peter Anvin <hpa@linux.intel.com>
arch/x86/lib/memmove_64.S