Which cache management algorithm is based on the assumption that data that has not been accessed for a while will not be requested by the host?

A. LRU
B. HWM
C. LWM
D. MRU
Answer: A
Explanation:
Cache Management: Algorithms
Cache is a finite and expensive resource that needs proper management. Even though modern
intelligent storage systems come with a large amount of cache, when all cache pages are filled,
some pages have to be freed up to accommodate new data and avoid performance degradation.
Various cache management algorithms are implemented in intelligent storage systems to
proactively maintain a set of free pages and a list of pages that can be potentially freed up
whenever required.
The most commonly used algorithms are discussed in the following list:
• Least Recently Used (LRU): An algorithm that continuously monitors data access in cache and
identifies the cache pages that have not been accessed for a long time. LRU either frees up these
pages or marks them for reuse. This algorithm is based on the assumption that data that has not
been accessed for a while will not be requested by the host; a minimal code sketch follows this list.
However, if a page contains write data that has not yet been committed to disk (a dirty page), the
data is first written to disk before the page is reused.
• Most Recently Used (MRU): This algorithm is the opposite of LRU: the pages that have been
accessed most recently are freed up or marked for reuse. It is based on the assumption that
recently accessed data may not be required for a while.
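To make the page-reuse behavior concrete, the following is a minimal Python sketch of an LRU page store. The class name CachePageStore, the access and _flush methods, and the print-based flush are illustrative assumptions, not part of any actual storage-array implementation. The sketch keeps pages in access order, reuses the least recently used page when the cache is full, and writes dirty (uncommitted) pages to disk before their slot is reclaimed, as described above.

from collections import OrderedDict

class CachePageStore:
    """Minimal LRU page-reuse sketch (hypothetical names, not a vendor API).

    Pages are kept in access order. When the cache is full, the least
    recently used page is reused; dirty pages (write data not yet
    committed to disk) are flushed before their slot is reclaimed.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page_id -> (data, dirty)

    def _flush(self, page_id, data):
        # Stand-in for the commit-to-disk step described in the LRU bullet.
        print("flushing dirty page %s to disk" % page_id)

    def access(self, page_id, data=None, write=False):
        if page_id in self.pages:
            # Cache hit: re-reference the page by moving it to the
            # most-recently-used end of the ordering.
            old_data, dirty = self.pages.pop(page_id)
            data = data if write else old_data
            dirty = dirty or write
        else:
            # Cache miss: if all pages are in use, reuse the least
            # recently used page (the front of the ordering).
            if len(self.pages) >= self.capacity:
                victim_id, (victim_data, victim_dirty) = self.pages.popitem(last=False)
                if victim_dirty:
                    self._flush(victim_id, victim_data)
            # A real system would stage read data in from disk here;
            # this sketch just stores whatever the caller passed.
            dirty = write
        self.pages[page_id] = (data, dirty)

if __name__ == "__main__":
    cache = CachePageStore(capacity=2)
    cache.access("p1", "blockA", write=True)  # write miss: p1 cached as dirty
    cache.access("p2", "blockB")              # read miss: p2 cached as clean
    cache.access("p1")                        # hit: p1 becomes most recently used
    cache.access("p3", "blockC")              # miss on a full cache: p2 (LRU) is reused

An MRU policy would be the mirror image of this sketch: the victim would be taken from the most-recently-used end of the ordering (popitem(last=True)), on the assumption that recently accessed data will not be needed again soon.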
EMC E10-001 Student Resource Guide. Module 4: Intelligent Storage System