From: Ville Syrjälä
Date: Sun, 13 Apr 2014 09:45:03 +0000 (+0300)
Subject: x86/gpu: Fix sign extension issue in Intel graphics stolen memory quirks
X-Git-Tag: firefly_0821_release~176^2~4016^2~3
X-Git-Url: http://demsky.eecs.uci.edu/git/?a=commitdiff_plain;h=86e587623a0ca8426267dad8d3eaebd6fc2d00f1;p=firefly-linux-kernel-4.4.55.git

x86/gpu: Fix sign extension issue in Intel graphics stolen memory quirks

Have the KB(), MB(), GB() macros produce unsigned longs to avoid
unintended sign extension issues with the gen2 memory size detection.

What happens is that first the uint8_t returned by read_pci_config_byte()
gets promoted to an int, which gets multiplied by another int from the
MB() macro, and finally the result gets sign extended to size_t.

This shouldn't be a problem in practice, though, as all affected gen2
platforms are 32-bit AFAIK, so size_t will be 32 bits there.

Reported-by: Bjorn Helgaas
Suggested-by: H. Peter Anvin
Signed-off-by: Ville Syrjälä
Cc: linux-kernel@vger.kernel.org
Link: http://lkml.kernel.org/r/1397382303-17525-1-git-send-email-ville.syrjala@linux.intel.com
Signed-off-by: Ingo Molnar
---

diff --git a/arch/x86/kernel/early-quirks.c b/arch/x86/kernel/early-quirks.c
index b0cc3809723d..6e2537c32190 100644
--- a/arch/x86/kernel/early-quirks.c
+++ b/arch/x86/kernel/early-quirks.c
@@ -240,7 +240,7 @@ static u32 __init intel_stolen_base(int num, int slot, int func, size_t stolen_s
 	return base;
 }
 
-#define KB(x) ((x) * 1024)
+#define KB(x) ((x) * 1024UL)
 #define MB(x) (KB (KB (x)))
 #define GB(x) (MB (KB (x)))
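
For illustration only (not part of the patch): the standalone userspace
sketch below walks through the promotion chain the commit message
describes. The macro names OLD_*/NEW_* and the variable gms are
hypothetical, and the MB(16) factor is deliberately inflated so the int
product actually overflows; the real gen2 sizes stay within int range,
which is why the commit notes this isn't a problem in practice.

#include <stdint.h>
#include <stdio.h>

/* Illustrative names, not the kernel's: OLD_* mirror the macros before
 * this patch (int arithmetic), NEW_* mirror them after (unsigned long). */
#define OLD_KB(x) ((x) * 1024)
#define OLD_MB(x) (OLD_KB(OLD_KB(x)))
#define NEW_KB(x) ((x) * 1024UL)
#define NEW_MB(x) (NEW_KB(NEW_KB(x)))

int main(void)
{
	/* Stand-in for a value read via read_pci_config_byte(). */
	uint8_t gms = 0xff;

	/*
	 * gms is promoted to int and multiplied by the int produced by
	 * OLD_MB(16). 255 * 16 MiB overflows a 32-bit int (technically
	 * undefined behaviour; with the usual two's-complement wraparound
	 * the product goes negative), and the negative int is then
	 * sign-extended when converted to a 64-bit size_t.
	 */
	size_t old_size = gms * OLD_MB(16);

	/* With 1024UL the whole computation is done in unsigned long. */
	size_t new_size = gms * NEW_MB(16);

	printf("old macros: %zu\n", old_size); /* huge bogus value on 64-bit */
	printf("new macros: %zu\n", new_size); /* 4278190080 = 255 * 16 MiB  */
	return 0;
}

On a 32-bit build both results agree modulo 2^32, matching the commit's
observation that the affected gen2 platforms are unaffected in practice.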