Commit c6d53633 authored by Andy Whitcroft, committed by Khalid Elmously

PM / hibernate: memory_bm_find_bit -- tighten node optimisation

BugLink: https://bugs.launchpad.net/bugs/1847118

When looking for a bit by number we make use of the cached result from the
preceding lookup to speed up operation.  Firstly we check whether the requested
pfn is within the cached zone and, if not, look up the new zone.  We then
check if the offset for that pfn falls within the existing cached node.
This happens regardless of whether the node is within the zone we are
now scanning.  With certain memory layouts it is possible for this to
falsely trigger, creating a temporary alias for the pfn to a different bit.
This leads the hibernation code to free memory which it never allocated,
with the expected fallout.

Ensure the zone we are scanning matches the cached zone before considering
the cached node.

Deep thanks go to Andrea for many, many, many hours of hacking and testing
that went into cornering this bug.
Reported-by: Andrea Righi <andrea.righi@canonical.com>
Tested-by: Andrea Righi <andrea.righi@canonical.com>
Signed-off-by: Andy Whitcroft <apw@canonical.com>
Signed-off-by: Rafael J. Wysocki <rafael.j.wysocki@intel.com>
(backported from commit 39800d8fc4083cfe5c8f69333336bf03f9571070 linux-next)
Signed-off-by: Andrea Righi <andrea.righi@canonical.com>
Acked-by: Colin Ian King <colin.king@canonical.com>
Acked-by: Thadeu Lima de Souza Cascardo <cascardo@canonical.com>
Signed-off-by: Kleber Sacilotto de Souza <kleber.souza@canonical.com>
parent 8749e6f1
@@ -662,8 +662,14 @@ static int memory_bm_find_bit(struct memory_bitmap *bm, unsigned long pfn,
	 * node for our pfn.
	 */
+	/*
+	 * If the zone we wish to scan is the current zone and the
+	 * pfn falls into the current node then we do not need to walk
+	 * the tree.
+	 */
 	node = bm->cur.node;
-	if (((pfn - zone->start_pfn) & ~BM_BLOCK_MASK) == bm->cur.node_pfn)
+	if (zone == bm->cur.zone &&
+	    ((pfn - zone->start_pfn) & ~BM_BLOCK_MASK) == bm->cur.node_pfn)
 		goto node_found;
 	node = zone->rtree;
...