

🌳 Advanced Project 381779768

🦄 Stablehorn LRU Cache

Difficulty: Advanced | Tags: design, lru, hashmap, doubly-linked-list | Series: Data Structures


Problem

Design and implement an LRU (Least Recently Used) cache with a given capacity. Your cache should support two operations efficiently:

  • get(key): Return the value of the key if it exists in the cache, otherwise return -1. When you access a key, it becomes the most recently used.
  • put(key, value): Insert or update the value of the key. If the cache is at capacity, evict the least recently used item before inserting the new key.

Both operations must run in O(1) time complexity.
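
One common way to hit the O(1) bound is to pair a hash map (key → node) with a doubly linked list ordered by recency: lookups go through the map, and moving a node to the front or evicting from the back is constant-time pointer surgery. Below is a minimal sketch of that idea in Python; the class and helper names (Node, LRUCache, _remove, _add_front) are illustrative, not a prescribed interface.

class Node:
    """Doubly linked list node holding one key-value pair."""
    __slots__ = ("key", "value", "prev", "next")

    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.prev = None
        self.next = None


class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}               # key -> Node, for O(1) lookup
        self.head = Node()          # sentinel on the most-recently-used side
        self.tail = Node()          # sentinel on the least-recently-used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        """Unlink a node from the list in O(1)."""
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        """Insert a node right after the head sentinel (most recent)."""
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)          # accessing the key makes it most recent
        self._add_front(node)
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value      # update in place and mark as most recent
            self._remove(node)
            self._add_front(node)
            return
        if len(self.map) == self.capacity:
            lru = self.tail.prev    # least recently used node sits at the back
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)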

Why This Matters

LRU caching is fundamental in system design. Modern operating systems use it for memory management, databases use it for query result caching, and web browsers use it for storing recent pages. Understanding how to implement an LRU cache efficiently is a critical skill for technical interviews at top companies and for building performant systems in production.

Input

data = {'cap': int, 'ops': list}

Where:

  • cap: The maximum capacity of the cache (number of key-value pairs it can hold)
  • ops: A list of operations, each of which is either:
    • ['put', key, value]: Insert/update key with value
    • ['get', key]: Retrieve the value for key

Output

list[int]: Results of all get operations in order. Return -1 for cache misses.
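
Assuming a cache class like the LRUCache sketch above, a small driver (the name solve is just a placeholder) could translate the input dictionary into the expected output list:

def solve(data):
    cache = LRUCache(data['cap'])
    results = []
    for op in data['ops']:
        if op[0] == 'put':
            cache.put(op[1], op[2])
        else:  # 'get'
            results.append(cache.get(op[1]))
    return results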

Constraints

  • 1 ≤ cap ≤ 100_000
  • 1 ≤ len(ops) ≤ 100_000
  • Each operation must execute in O(1) time
  • Keys and values are integers

Examples

Example 1: Basic LRU Behavior

Input:

{
  'cap': 2,
  'ops': [
    ['put', 1, 1],  # Cache: {1:1}
    ['put', 2, 2],  # Cache: {1:1, 2:2}
    ['get', 1],     # Returns 1, Cache: {2:2, 1:1} (1 is now most recent)
    ['put', 3, 3],  # Evicts key 2, Cache: {1:1, 3:3}
    ['get', 2],     # Returns -1 (key 2 was evicted)
    ['get', 3]      # Returns 3
  ]
}

Output: [1, -1, 3]

Explanation:

  1. Put key 1 with value 1
  2. Put key 2 with value 2 (cache is now full)
  3. Get key 1, returns 1 (key 1 is now most recently used)
  4. Put key 3 with value 3 (cache is full, so evict least recently used key 2)
  5. Get key 2, returns -1 (was evicted)
  6. Get key 3, returns 3
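
For reference, running Example 1 through the hypothetical solve driver shown earlier would look like this:

data = {
    'cap': 2,
    'ops': [
        ['put', 1, 1],
        ['put', 2, 2],
        ['get', 1],
        ['put', 3, 3],
        ['get', 2],
        ['get', 3],
    ],
}
print(solve(data))  # expected: [1, -1, 3]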

Example 2: Update Existing Key

Input:

{
  'cap': 2,
  'ops': [
    ['put', 1, 1],
    ['put', 2, 2],
    ['put', 1, 10],  # Update key 1's value
    ['get', 1],      # Returns 10
    ['get', 2]       # Returns 2
  ]
}

Output: [10, 2]

Explanation: When you put a key that already exists, it updates the value and marks it as most recently used.

Example 3: Capacity of 1

Input:

{
  'cap': 1,
  'ops': [
    ['put', 1, 1],
    ['put', 2, 2],  # Evicts key 1
    ['get', 1],     # Returns -1
    ['get', 2]      # Returns 2
  ]
}

Output: [-1, 2]



What you'll learn

  • Design a data structure from operations/specs
  • Combine hash maps and doubly linked lists for optimal performance
  • Implement O(1) cache eviction policies (a compact OrderedDict-based alternative is sketched after this list)
  • Handle complex pointer manipulation in linked lists
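
As a point of comparison for the eviction-policy bullet above, Python's collections.OrderedDict can express the same behavior very compactly, since move_to_end and popitem(last=False) both run in O(1). This is a convenience sketch only; hand-rolling the linked list, as in the earlier sketch, is the real point of the exercise.

from collections import OrderedDict


class LRUCacheOD:
    """LRU cache on OrderedDict; keys ordered from least to most recently used."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)     # updating also refreshes recency
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used key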

Why this matters

Real systems rely on caches. Understanding LRU builds design skills used in interviews and production. This pattern appears in operating systems (page replacement), databases (buffer pools), web servers (content caching), and many other high-performance systems.
