Silvermane LRU Cache: Design Revisited
Difficulty: Advanced
Tags: design, lru-cache, doubly-linked-list, hashmap, data-structures
Series: CS 102: Problem Solving & Logic
Problem
The Silvermane's memory holds only the most recently accessed wisdom. Design a Least Recently Used (LRU) cache that evicts the least recently used item once its capacity is exceeded.
Implement an LRU cache with fixed capacity that supports:
- get(key) - Returns the value if the key exists, -1 otherwise; marks the key as most recently used.
- put(key, value) - Inserts or updates the key-value pair; if capacity is exceeded, evicts the least recently used item.
Both operations must run in O(1) time.
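To make the contract concrete, here is a minimal sketch of these semantics built on Python's collections.OrderedDict, whose move_to_end and popitem operations are effectively constant time. The LRUCache class name and method layout are illustrative only; they are not the required challenge_function signature.

from collections import OrderedDict

class LRUCache:
    """Illustrative sketch of the get/put contract, not the required API."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()          # rightmost entry = most recently used

    def get(self, key):
        if key not in self.items:
            return -1
        self.items.move_to_end(key)         # a hit marks the key as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)     # updating an existing key also refreshes it
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used entry

The classic interview answer replaces OrderedDict with an explicit hash map plus doubly linked list; a sketch of that combination appears later in this write-up.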
Real-World Application
LRU caches are everywhere in computing:
- Operating systems - page replacement, buffer caches
- Web browsers - caching recently viewed pages
- CDNs - content delivery, edge caching
- Databases - query result caching, buffer pools
- CPU caches - L1/L2/L3 cache replacement policies
- Web servers - session caching, API response caching
- Mobile apps - image caching, data prefetching
- Distributed systems - Redis, Memcached implementations
Input
data = {
    'cap': int,         # Cache capacity
    'ops': List[List]   # List of operations
}
Operations format:
- ['put', key, value] - Insert/update key with value
- ['get', key] - Retrieve value for key
Output
List[int] # Results of all 'get' operations (in order)
- Return value if key exists
- Return -1 if key not found
Constraints
- 1 ≤ capacity ≤ 100,000
- Operations ≤ 100,000
- O(1) time per operation required
- Keys and values are integers
Examples
Example 1: Basic LRU Operations
Input:
{
    'cap': 2,
    'ops': [
        ['put', 1, 1],
        ['put', 2, 2],
        ['get', 1],       # Returns 1
        ['put', 3, 3],    # Evicts key 2
        ['get', 2],       # Returns -1 (evicted)
        ['get', 3]        # Returns 3
    ]
}
Output: [1, -1, 3]
Explanation:
1. put(1, 1) - Cache: {1=1}
2. put(2, 2) - Cache: {1=1, 2=2}
3. get(1) → 1 - Cache: {2=2, 1=1} (1 becomes most recent)
4. put(3, 3) - Cache full, evict LRU (key 2): {1=1, 3=3}
5. get(2) → -1 (key 2 was evicted)
6. get(3) → 3
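The same sequence can be checked with a quick OrderedDict-based trace (purely illustrative; the variable names below are not part of the problem input):

from collections import OrderedDict

cache, cap = OrderedDict(), 2
cache[1] = 1                     # put(1, 1)  -> {1: 1}
cache[2] = 2                     # put(2, 2)  -> {1: 1, 2: 2}
cache.move_to_end(1)             # get(1) -> 1, order is now {2: 2, 1: 1}
if len(cache) >= cap:            # put(3, 3) on a full cache
    cache.popitem(last=False)    # evicts key 2, the least recently used
cache[3] = 3                     #            -> {1: 1, 3: 3}
print(cache.get(2, -1))          # -1 (key 2 was evicted)
print(cache[3])                  # 3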
Example 2: Update Existing Key
Input:
{
    'cap': 2,
    'ops': [
        ['put', 1, 1],
        ['put', 2, 2],
        ['put', 1, 10],   # Update existing key
        ['get', 1],       # Returns 10
        ['get', 2]        # Returns 2
    ]
}
Output: [10, 2]
Explanation:
- Updating an existing key doesn't evict anything
- The updated key becomes the most recently used
Example 3: Capacity 1
Input:
{
    'cap': 1,
    'ops': [
        ['put', 1, 1],
        ['get', 1],       # Returns 1
        ['put', 2, 2],    # Evicts key 1
        ['get', 1]        # Returns -1
    ]
}
Output: [1, -1]
Example 4: Multiple Gets
Input:
{
    'cap': 2,
    'ops': [
        ['put', 1, 1],
        ['put', 2, 2],
        ['get', 1],       # 1 becomes most recent
        ['get', 1],       # Still most recent
        ['put', 3, 3],    # Evicts 2 (not 1)
        ['get', 2]        # Returns -1
    ]
}
Output: [1, 1, -1]
What You'll Learn
- Combining hash map + doubly linked list
- O(1) insertion, deletion, and lookup
- Maintaining order with pointers
- Cache eviction policies
- Real-world system design patterns
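As a concrete illustration of the hash map + doubly linked list combination listed above, here is a minimal sketch; the Node and DLLCache names are illustrative, not given by the problem. The hash map answers "where is this key's node?" in O(1), while sentinel head/tail nodes and prev/next pointers let the list unlink a node and reinsert it at the front in O(1), which is exactly what get and put need.

class Node:
    """Doubly linked list node holding one cache entry (illustrative)."""
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None

class DLLCache:
    """Illustrative LRU cache: dict for O(1) lookup, linked list for O(1) ordering."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}                # key -> Node
        self.head = Node()           # sentinel: most recently used side
        self.tail = Node()           # sentinel: least recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _unlink(self, node):
        node.prev.next = node.next
        node.next.prev = node.prev

    def _push_front(self, node):
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._unlink(node)           # move to front = most recently used
        self._push_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._unlink(node)
            self._push_front(node)
            return
        if len(self.map) == self.capacity:
            lru = self.tail.prev     # least recently used node
            self._unlink(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._push_front(node)

Neither structure works alone: a hash map by itself cannot track recency order in O(1), and a linked list by itself cannot look up a key in O(1). Together they meet the O(1) requirement for both operations.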
Why This Matters
LRU Cache is one of the most frequently asked data-structure design questions at FAANG-style interviews:
- Tests data structure mastery
- Requires understanding of pointer manipulation
- Demonstrates O(1) optimization skills
- Shows real-world problem-solving
Companies that ask this: Google, Facebook, Amazon, Microsoft, Apple, Netflix, Uber
Starter Code
def challenge_function(data):
    """
    Implement LRU cache with O(1) get and put operations.

    Args:
        data: dict with keys:
            - 'cap' (int): cache capacity
            - 'ops' (List[List]): list of operations,
              each ['get', key] or ['put', key, value]

    Returns:
        List[int]: results of all 'get' operations
    """
    # Your implementation here
    pass
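For orientation only, here is one possible shape of a solution, sketched with OrderedDict for brevity; the explicit doubly-linked-list cache sketched earlier slots into the same operations loop without changing anything else.

from collections import OrderedDict

def challenge_function(data):
    # Sketch, not the official reference solution.
    cap = data['cap']
    cache = OrderedDict()                    # rightmost entry = most recently used
    results = []

    for op in data['ops']:
        if op[0] == 'get':
            key = op[1]
            if key in cache:
                cache.move_to_end(key)       # a hit refreshes recency
                results.append(cache[key])
            else:
                results.append(-1)
        else:                                # ['put', key, value]
            key, value = op[1], op[2]
            if key in cache:
                cache.move_to_end(key)       # updating also refreshes recency
            cache[key] = value
            if len(cache) > cap:
                cache.popitem(last=False)    # evict least recently used entry

    return results

Running this sketch on Example 1's input produces [1, -1, 3], matching the expected output.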