LRU Cache
Recognize the pattern
Brute force idea
A straightforward first read of LRU Cache is this: store entries in a plain list and scan it linearly to find, refresh, or evict an entry, paying O(n) per operation. That instinct is useful because it follows the prompt literally, but it keeps redoing work the problem is begging you to organize.
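As a sketch of that brute-force reading (class and method names here are our own choices, not part of the original problem statement):

```python
# Brute-force LRU cache: a plain list scanned linearly on every operation.
# Most recent entry sits at the END of the list, least recent at the front.
class BruteForceLRU:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []  # list of (key, value) pairs

    def get(self, key):
        # O(n): scan to find the key, then move it to the end (most recent).
        for i, (k, v) in enumerate(self.items):
            if k == key:
                self.items.append(self.items.pop(i))
                return v
        return -1

    def put(self, key, value):
        # O(n): drop any stale entry, append, evict from the front if full.
        for i, (k, _) in enumerate(self.items):
            if k == key:
                self.items.pop(i)
                break
        self.items.append((key, value))
        if len(self.items) > self.capacity:
            self.items.pop(0)  # least recently used sits at the front
```

Every operation walks the list, which is exactly the repeated work the better approach eliminates.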
Better approach
The real unlock in LRU Cache comes when you notice this: pair a hash map (key → node) with a doubly linked list ordered by recency. Get: look up the node in the map and move it to the front. Put: add to the front, and evict from the back if over capacity. Both operations run in O(1). Instead of recomputing the world every time, you preserve just enough context to let the next decision become obvious.
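A minimal sketch of that structure, using sentinel head/tail nodes so removal and insertion never hit edge cases (the helper names `_remove` and `_add_front` are our own):

```python
class Node:
    __slots__ = ("key", "value", "prev", "next")
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}  # key -> Node, for O(1) lookup of any list node
        # Sentinels: head.next is most recent, tail.prev is least recent.
        self.head, self.tail = Node(), Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # Splice a node out of the list in O(1) using its prev pointer.
        node.prev.next = node.next
        node.next.prev = node.prev

    def _add_front(self, node):
        # Insert a node right after head, marking it most recent.
        node.next = self.head.next
        node.prev = self.head
        self.head.next.prev = node
        self.head.next = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)     # splice out of its current position
        self._add_front(node)  # reinsert as most recent
        return node.value

    def put(self, key, value):
        if key in self.map:
            node = self.map[key]
            node.value = value
            self._remove(node)
            self._add_front(node)
            return
        if len(self.map) >= self.capacity:
            lru = self.tail.prev  # least recent sits just before tail
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, value)
        self.map[key] = node
        self._add_front(node)
```

Note that each node stores its own key: without it, eviction could not delete the right map entry after finding the node via `tail.prev`.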
Key invariant
The compass for LRU Cache is this: the doubly linked list maintains access order, most recent at the front and least recent at the back, while the hash map gives O(1) access to any node for direct removal and reinsertion. As long as that statement keeps holding, you can trust the steps built on top of it.
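You can watch the invariant hold using Python's `collections.OrderedDict`, which internally bundles exactly this map-plus-linked-list combination (this is an illustration of the ordering invariant, not the only way to implement the cache; note that OrderedDict keeps the most recent key at the END rather than the front):

```python
from collections import OrderedDict

# OrderedDict pairs a hash map with a doubly linked list internally,
# so it can stand in for the map+list combination from the approach above.
od = OrderedDict()
for key in ["a", "b", "c"]:
    od[key] = None          # insertion counts as an access

od.move_to_end("a")          # "access" a: it becomes most recent
assert list(od) == ["b", "c", "a"]

od.popitem(last=False)       # evict from the least-recent end ("b")
assert list(od) == ["c", "a"]
```

Every operation re-establishes the same ordering statement, which is what lets get and put stay O(1).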
Watch out for
A common way to get lost in LRU Cache is using a singly linked list: removing a node from the middle requires updating its predecessor, and only a doubly linked list with previous pointers lets you reach that predecessor in O(1). Most mistakes here are not about syntax; they come from losing track of what your state, pointers, or structure are supposed to mean.