Commit c071b0f1 authored by Nick Desaulniers, committed by Linus Torvalds

x86: bitops: fix build regression

This is easily reproducible via CC=clang + CONFIG_STAGING=y +
CONFIG_VT6656=m.

It turns out that if your config tickles __builtin_constant_p via
differences in inlining decisions, these statements produce invalid
assembly:

    $ cat foo.c
    long a(long b, long c) {
      asm("orb	%1, %0" : "+q"(c): "r"(b));
      return c;
    }
    $ gcc foo.c
    foo.c: Assembler messages:
    foo.c:2: Error: `%rax' not allowed with `orb'

Use the `%b` "x86 Operand Modifier" to instead force register allocation
to select a lower-8-bit GPR operand.

The "q" constraint only has meaning on -m32 otherwise is treated as
"r".  Not all GPRs have low-8-bit aliases for -m32.

Fixes: 1651e700 ("x86: Fix bitops.h warning with a moved cast")
Reported-by: kernelci.org bot <bot@kernelci.org>
Suggested-by: Andy Shevchenko <andriy.shevchenko@intel.com>
Suggested-by: Brian Gerst <brgerst@gmail.com>
Suggested-by: H. Peter Anvin <hpa@zytor.com>
Suggested-by: Ilie Halip <ilie.halip@gmail.com>
Signed-off-by: Nick Desaulniers <ndesaulniers@google.com>
Signed-off-by: Andrew Morton <akpm@linux-foundation.org>
Tested-by: Sedat Dilek <sedat.dilek@gmail.com>
Tested-by: Nathan Chancellor <natechancellor@gmail.com>	[build, clang-11]
Reviewed-by: Nathan Chancellor <natechancellor@gmail.com>
Reviewed-By: Brian Gerst <brgerst@gmail.com>
Reviewed-by: Jesse Brandeburg <jesse.brandeburg@intel.com>
Cc: Thomas Gleixner <tglx@linutronix.de>
Cc: Ingo Molnar <mingo@redhat.com>
Cc: Borislav Petkov <bp@alien8.de>
Cc: Marco Elver <elver@google.com>
Cc: "Paul E. McKenney" <paulmck@kernel.org>
Cc: Andrey Ryabinin <aryabinin@virtuozzo.com>
Cc: Luc Van Oostenryck <luc.vanoostenryck@gmail.com>
Cc: Masahiro Yamada <yamada.masahiro@socionext.com>
Cc: Daniel Axtens <dja@axtens.net>
Cc: "Peter Zijlstra (Intel)" <peterz@infradead.org>
Link: http://lkml.kernel.org/r/20200508183230.229464-1-ndesaulniers@google.com
Link: https://github.com/ClangBuiltLinux/linux/issues/961
Link: https://lore.kernel.org/lkml/20200504193524.GA221287@google.com/
Link: https://gcc.gnu.org/onlinedocs/gcc/Extended-Asm.html#x86Operandmodifiers
Signed-off-by: Linus Torvalds <torvalds@linux-foundation.org>
parent 60858c00
@@ -52,9 +52,9 @@ static __always_inline void
 arch_set_bit(long nr, volatile unsigned long *addr)
 {
         if (__builtin_constant_p(nr)) {
-                asm volatile(LOCK_PREFIX "orb %1,%0"
+                asm volatile(LOCK_PREFIX "orb %b1,%0"
                         : CONST_MASK_ADDR(nr, addr)
-                        : "iq" (CONST_MASK(nr) & 0xff)
+                        : "iq" (CONST_MASK(nr))
                         : "memory");
         } else {
                 asm volatile(LOCK_PREFIX __ASM_SIZE(bts) " %1,%0"
@@ -72,9 +72,9 @@ static __always_inline void
 arch_clear_bit(long nr, volatile unsigned long *addr)
 {
         if (__builtin_constant_p(nr)) {
-                asm volatile(LOCK_PREFIX "andb %1,%0"
+                asm volatile(LOCK_PREFIX "andb %b1,%0"
                         : CONST_MASK_ADDR(nr, addr)
-                        : "iq" (CONST_MASK(nr) ^ 0xff));
+                        : "iq" (~CONST_MASK(nr)));
         } else {
                 asm volatile(LOCK_PREFIX __ASM_SIZE(btr) " %1,%0"
                         : : RLONG_ADDR(addr), "Ir" (nr) : "memory");
@@ -123,9 +123,9 @@ static __always_inline void
 arch_change_bit(long nr, volatile unsigned long *addr)
 {
         if (__builtin_constant_p(nr)) {
-                asm volatile(LOCK_PREFIX "xorb %1,%0"
+                asm volatile(LOCK_PREFIX "xorb %b1,%0"
                         : CONST_MASK_ADDR(nr, addr)
-                        : "iq" ((u8)CONST_MASK(nr)));
+                        : "iq" (CONST_MASK(nr)));
         } else {
                 asm volatile(LOCK_PREFIX __ASM_SIZE(btc) " %1,%0"
                         : : RLONG_ADDR(addr), "Ir" (nr) : "memory");
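
As a further illustration, here is a self-contained sketch of the
constant-mask pattern after this change.  This is not the kernel
header: set_bit_const, the bitmap array, and the local CONST_MASK
stand-in with its u8-style cast are assumptions made for the example,
and the LOCK prefix is omitted for simplicity.

    #include <stdint.h>

    /* Local stand-in for the kernel's CONST_MASK(); the uint8_t cast keeps
     * the mask byte-sized so it fits an 8-bit immediate or register. */
    #define CONST_MASK(nr) ((uint8_t)1 << ((nr) & 7))

    static unsigned long bitmap[4];

    /* Mirrors the fixed constant-nr path: a byte-sized memory destination,
     * plus %b1 so a register operand is emitted via its low-8-bit alias. */
    static inline void set_bit_const(long nr, volatile unsigned long *addr)
    {
            asm volatile("orb %b1,%0"
                         : "+m" (((volatile uint8_t *)addr)[nr >> 3])
                         : "iq" (CONST_MASK(nr))
                         : "memory");
    }

    int main(void)
    {
            set_bit_const(10, bitmap);              /* byte 1, bit 2 */
            return !(bitmap[0] & (1UL << 10));      /* 0 on success */
    }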