Linux kernel mirror (for testing) git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git

arm64: Change .weak to SYM_FUNC_START_WEAK_PI for arch/arm64/lib/mem*.S

Commit 39d114ddc682 ("arm64: add KASAN support") added .weak directives to
arch/arm64/lib/mem*.S instead of changing the existing SYM_FUNC_START_PI
macros. This can lead to the assembly snippet `.weak memcpy ... .globl
memcpy`, which produces a STB_WEAK memcpy with GNU as but a STB_GLOBAL
memcpy with LLVM's integrated assembler before LLVM 12. LLVM 12 (since
https://reviews.llvm.org/D90108) errors on such an overridden symbol
binding.
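
The conflicting directives can be reduced to a minimal standalone file
(a hypothetical reproduction for illustration, not part of the patch):

    /* repro.S: `.weak` followed by `.globl` for the same symbol.
     * GNU as keeps the STB_WEAK binding; LLVM's integrated assembler
     * before LLVM 12 kept STB_GLOBAL instead, and LLVM 12 rejects the
     * overridden binding with an error.
     */
    .weak memcpy
    .globl memcpy
    memcpy:
    	ret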

Use the appropriate SYM_FUNC_START_WEAK_PI instead.

Fixes: 39d114ddc682 ("arm64: add KASAN support")
Reported-by: Sami Tolvanen <samitolvanen@google.com>
Signed-off-by: Fangrui Song <maskray@google.com>
Tested-by: Sami Tolvanen <samitolvanen@google.com>
Tested-by: Nick Desaulniers <ndesaulniers@google.com>
Reviewed-by: Nick Desaulniers <ndesaulniers@google.com>
Cc: <stable@vger.kernel.org>
Link: https://lore.kernel.org/r/20201029181951.1866093-1-maskray@google.com
Signed-off-by: Will Deacon <will@kernel.org>

Authored by Fangrui Song, committed by Will Deacon
ec9d7807 ce3d31ad

+3 -6
arch/arm64/lib/memcpy.S (+1 -2)

@@ -56,9 +56,8 @@
 	stp	\reg1, \reg2, [\ptr], \val
 	.endm
 
-	.weak memcpy
 SYM_FUNC_START_ALIAS(__memcpy)
-SYM_FUNC_START_PI(memcpy)
+SYM_FUNC_START_WEAK_PI(memcpy)
 #include "copy_template.S"
 	ret
 SYM_FUNC_END_PI(memcpy)
arch/arm64/lib/memmove.S (+1 -2)

@@ -45,9 +45,8 @@
 D_l	.req	x13
 D_h	.req	x14
 
-	.weak memmove
 SYM_FUNC_START_ALIAS(__memmove)
-SYM_FUNC_START_PI(memmove)
+SYM_FUNC_START_WEAK_PI(memmove)
 	cmp	dstin, src
 	b.lo	__memcpy
 	add	tmp1, src, count
arch/arm64/lib/memset.S (+1 -2)

@@ -42,9 +42,8 @@
 tmp3w	.req	w9
 tmp3	.req	x9
 
-	.weak memset
 SYM_FUNC_START_ALIAS(__memset)
-SYM_FUNC_START_PI(memset)
+SYM_FUNC_START_WEAK_PI(memset)
 	mov	dst, dstin	/* Preserve return value. */
 	and	A_lw, val, #255
 	orr	A_lw, A_lw, A_lw, lsl #8