A queue with two stacks

Original publication: July 29, 2016. Updated September 16, 2024 (added interactive simulation). Updated January 25, 2025 (added optimized solution).

A few years ago, during a job interview at Google, a friend of mine was given the following challenge: “implement a queue using two stacks”. I found it an interesting problem, so I decided to give it a shot.

Before you keep reading, think about how you would approach this challenge.

Let’s begin with a brief review of stacks and queues.

Stacks and queues

A stack is a collection of elements based on the principle of Last In, First Out (LIFO): the last element added is the first to leave. It has two basic operations: push (add an element to the end of the stack) and pop (remove and return the last element of the stack).
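For a quick hands-on feel, a plain Python list already behaves like a stack:

```python
# A Python list used as a stack: append plays the role of push
stack = []
stack.append(1)
stack.append(2)
stack.append(3)

print(stack.pop())  # 3: the last element added is the first to leave
print(stack.pop())  # 2
```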


Analogously, a queue is a data structure based on First In, First Out (FIFO): the first element added is the first to leave (think of a line at a bank). It also has two basic operations: insert (add an element to the end of the queue) and remove (remove and return the first element of the queue).
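Similarly, Python’s collections.deque gives FIFO behavior out of the box, which makes the contrast with a stack easy to see:

```python
from collections import deque

# A deque used as a queue: append = insert, popleft = remove
queue = deque()
queue.append(1)
queue.append(2)
queue.append(3)

print(queue.popleft())  # 1: the first element added is the first to leave
print(queue.popleft())  # 2
```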


Implementation

Let’s use the following implementation of a stack in Python (don’t worry if you don’t know Python; the overall idea should still come across):

class Stack(object):
    # If you aren't familiar with Python, this is the constructor
    def __init__(self):
        '''Create an empty stack.'''
        self.elements = []

    def push(self, element):
        '''Add an element to the end of the stack.'''
        self.elements.append(element)

    def pop(self):
        '''Remove and return the last element of the stack.'''
        return self.elements.pop()

    def empty(self):
        '''Return True if the stack is empty, False otherwise.'''
        return len(self.elements) == 0

I added an extra function (empty) to tell us if the stack is empty or not. Now, to the implementation of the queue with two stacks:

class Queue(object):
    def __init__(self):
        self.in_order_st = Stack()
        self.reversed_st = Stack()

    def insert(self, element):
        transfer(self.reversed_st, self.in_order_st)
        self.in_order_st.push(element)

    def remove(self):
        transfer(self.in_order_st, self.reversed_st)
        return self.reversed_st.pop()

    def empty(self):
        return self.in_order_st.empty() and self.reversed_st.empty()

def transfer(origin, destination):
    while (not origin.empty()):
        destination.push(origin.pop())
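Before dissecting how this works, a quick sanity check shows FIFO behavior. The definitions are repeated here, slightly condensed, so the snippet runs on its own:

```python
class Stack:
    def __init__(self):
        self.elements = []
    def push(self, element):
        self.elements.append(element)
    def pop(self):
        return self.elements.pop()
    def empty(self):
        return len(self.elements) == 0

def transfer(origin, destination):
    # Pop everything from origin and push it onto destination
    while not origin.empty():
        destination.push(origin.pop())

class Queue:
    def __init__(self):
        self.in_order_st = Stack()
        self.reversed_st = Stack()
    def insert(self, element):
        transfer(self.reversed_st, self.in_order_st)
        self.in_order_st.push(element)
    def remove(self):
        transfer(self.in_order_st, self.reversed_st)
        return self.reversed_st.pop()
    def empty(self):
        return self.in_order_st.empty() and self.reversed_st.empty()

q = Queue()
for x in [1, 2, 3]:
    q.insert(x)
print(q.remove())  # 1: first in, first out
q.insert(4)
print(q.remove())  # 2
```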

Explanation

The “trick” is to use one stack to hold the queue in order and the other to hold it reversed. After any operation, all elements are in a single one of the two stacks, so at least one stack is always empty. This invariant is essential for our implementation to work correctly. Let’s go through each function.

The transfer function

The auxiliary function transfer pops every element from origin and pushes it onto destination. Assuming one of the stacks is empty, there are two cases:

  1. The stack origin is empty: nothing happens.
  2. The stack destination is empty: the elements of origin are popped from last to first and pushed in that order onto destination. In other words, the origin stack ends up reversed in the destination stack.

We can thus think of transfer as reversing the elements of origin into destination. Note that at the end of this operation the origin stack is always empty, so the invariant that at least one of the stacks is empty still holds.
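To make the reversal concrete, here is the same loop written with plain lists standing in for the two stacks:

```python
origin = [1, 2, 3]   # list as a stack; the top is the last element
destination = []

# Pop from origin (last to first) and push onto destination
while origin:
    destination.append(origin.pop())

print(destination)  # [3, 2, 1]: origin ends up reversed in destination
print(origin)       # []: the origin stack is always empty afterwards
```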

The insert function

The function insert must add an element to the end of the queue, and push adds an element to the end of a stack. So we just need to make sure all elements are in the in order stack, then push the new element onto it. Again, considering our invariant, there are two cases:

  1. The reversed stack is empty: the function transfer doesn’t do anything and all elements are already in order in the in order stack.
  2. The in order stack is empty: the function transfer inverts the reversed stack (thus, applying the correct order) and adds it to the in order stack.

In both cases, after the call to transfer the reversed stack is empty and the in order stack contains all elements. The new element is then added to the end of the in order stack, and the invariant still holds.

The remove function

Finally, the function remove must remove the first element of the queue. This is why we keep a stack storing the reversed queue: popping the last element of the reversed queue is the same as removing the first element of the queue. So we first make sure all elements are in the reversed stack by transferring them from the in order stack. By our invariant, once again there are two cases:

  1. The reversed stack is empty: the function transfer inverts the in order stack and transfers the elements to the reversed stack.
  2. The in order stack is empty: the function transfer doesn’t do anything.

The last element of the reversed stack (the first element of the queue) is then removed, and the invariant still holds. Since the function empty doesn’t change either stack, and both stacks start out empty, the invariant is always true.

Simulation

To help understand the algorithm, let’s walk through a simulation. You are free to play with it as much as you want. If you want a suggestion, try the following operations:

  1. Insert the numbers 0, 1, and 2;
  2. Remove the first element (the element 0 should be removed);
  3. Insert the numbers 3 and 4 (the elements will be transferred back to the in order stack);
  4. Remove all elements.

If we revisit the simulation, we see that the elements were removed from the queue in the same order they were inserted: we have a FIFO.

Can we do better?

Before we think about complexity, can you think of a sequence of operations that would make our implementation inefficient? Try to find a sequence of operations that would make the function transfer be called for the whole stack for every operation.

I’ll give you some time to work on it.

The sequence of operations that would make our implementation inefficient is:

  1. Insert the numbers 0, 1, 2, 3;
  2. Remove the first element (the element 0 should be removed);
  3. Insert the number 4;
  4. Remove the first element (the element 1 should be removed);
  5. Keep alternating between inserting and removing elements.
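We can measure how bad this gets by counting every element moved by transfer during the sequence above. Here is a small instrumented sketch, using lists as stacks:

```python
moves = 0  # total elements moved by transfer

def transfer(origin, destination):
    global moves
    while origin:
        destination.append(origin.pop())
        moves += 1

in_order, reversed_st = [], []

def insert(element):
    transfer(reversed_st, in_order)
    in_order.append(element)

def remove():
    transfer(in_order, reversed_st)
    return reversed_st.pop()

for x in [0, 1, 2, 3]:
    insert(x)
print(remove())  # 0
insert(4)
print(remove())  # 1
print(moves)     # 11: after the first removal, every operation moves the whole queue
```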

Let’s analyze the complexity of our implementation. The function transfer has a complexity of $\mathcal{O}(n)$, where $n$ is the number of elements in the stack (the function empty has a complexity of $\mathcal{O}(1)$). The functions insert and remove call transfer exactly once and then perform a constant-time operation (either a push or a pop), so they also have a complexity of $\mathcal{O}(n)$. So, can we do better than this? At first, we may think we can’t, as we need to transfer all elements from one stack to the other at least once. But it turns out we can.

Improving the complexity

The key here is in what I just said: we need to transfer all elements from one stack to the other at least once. We can improve our implementation by transferring elements only when necessary.

You can take a few minutes to think about it or go straight to the solution.

The idea is to transfer elements lazily: insert simply pushes the new element onto the in order stack, and remove transfers elements from the in order stack to the reversed stack only when the reversed stack is empty. The new implementation is as follows:
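The original post presents the optimized code interactively; a sketch of that lazy-transfer version, with the Stack class repeated so the snippet is self-contained, could look like this:

```python
class Stack:
    def __init__(self):
        self.elements = []
    def push(self, element):
        self.elements.append(element)
    def pop(self):
        return self.elements.pop()
    def empty(self):
        return len(self.elements) == 0

class Queue:
    def __init__(self):
        self.in_order_st = Stack()
        self.reversed_st = Stack()

    def insert(self, element):
        # O(1): just push; no eager transfer
        self.in_order_st.push(element)

    def remove(self):
        # Transfer only when the reversed stack has run out of elements
        if self.reversed_st.empty():
            while not self.in_order_st.empty():
                self.reversed_st.push(self.in_order_st.pop())
        return self.reversed_st.pop()

    def empty(self):
        return self.in_order_st.empty() and self.reversed_st.empty()
```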

Ok, the complexity of insert is now $\mathcal{O}(1)$, but what about remove? We are still transferring all elements from one stack to the other at least once. So it is still $\mathcal{O}(n)$, right?

If we consider a single operation in isolation, yes, the complexity is still $\mathcal{O}(n)$. But if we consider a sequence of operations, every element is transferred to the reversed stack at most once. We can do an amortized analysis here. The idea is to consider that whenever an element is inserted in the queue, it costs $2$ units of time: one unit to insert the element in the in order stack and another unit that we “save” to transfer the element to the reversed stack when necessary. So, when we need to transfer all elements from the in order stack to the reversed stack, we have enough “saved” units to do it in $\mathcal{O}(n)$ time. This way, the amortized complexity of the remove function is $\mathcal{O}(1)$.

Maybe you haven’t heard about amortized analysis before, and my explanation may have seemed like a hand-waving argument. If so, consider the following. Whenever a transfer operation moves $n$ elements, we have already inserted those $n$ elements into the queue in constant time each. Also, in the following $n-1$ remove operations we will not transfer any element, so each of those operations costs $\mathcal{O}(1)$. This way, the amortized complexity of the remove operation is $\mathcal{O}(1)$. The intuition is that any costly (linear) operation is “paid for” by a sequence of cheap (constant) operations.
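One way to back up the accounting argument empirically is to count pushes over a long random sequence of operations: each element is pushed at most twice (once by insert and at most once by a transfer), so the total number of pushes never exceeds twice the number of inserts. A small sketch:

```python
import random

pushes = 0                 # total push operations across both stacks
in_order, reversed_st = [], []

def insert(element):
    global pushes
    in_order.append(element)
    pushes += 1

def remove():
    global pushes
    if not reversed_st:    # lazy transfer, as in the optimized version
        while in_order:
            reversed_st.append(in_order.pop())
            pushes += 1
    return reversed_st.pop()

random.seed(0)
n_inserts = 0
size = 0
for _ in range(10_000):
    if size and random.random() < 0.5:
        remove()
        size -= 1
    else:
        insert(size)
        n_inserts += 1
        size += 1

# Each element is pushed at most twice: once by insert, once by a transfer
assert pushes <= 2 * n_inserts
```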

So why did I show you the first implementation? Well, it was the solution I came up with when I first faced this problem. I kept it here because I think it has a simpler invariant, so it was easier to explain. And, well, it’s about the journey, not the destination, right?

This concludes our implementation of a queue with two stacks. Can you implement a stack with two queues?