The Most Important Sorting Algorithms for Coding Interviews (Merge Sort and Quick Sort)

Daniel Park
9 min read · Sep 22, 2023


Any Sorting Algorithm can work with a few numbers. But what about a hundred? A thousand? A million? Photo by Nick Hillier on Unsplash

In the context of coding interviews, figuring out which sorting algorithms to learn can feel like getting lost in a maze: there are just so many out there! Now, if you’re aiming for the ‘Sorting Algorithm Encyclopedia’ title, by all means: memorize them all.

But if you’re normal like me and just prepping for interviews, keep it simple. According to Microsoft, candidates are expected to know at least one O(N * log(N)) sorting algorithm, where N signifies the size of the input array, and preferably two. They even list the two you should know, and they’re the same two I recommend you learn in detail: Merge Sort and Quick Sort.

Do you wish to be the “Sorting Algorithm Guru”? By all means, memorize them all! Photo by Emmanuel Ikwuegbu on Unsplash

Why do we need to understand O(N * log(N)) algorithms for interviews?

Understanding sorting algorithms, especially the O(N * log(N)) ones such as merge sort and quicksort, is fundamental in the realm of computer science and software development.

These algorithms serve as a performance benchmark, representing the best average-case time complexity achievable for comparison-based sorting methods. Their widespread use in real-world applications underscores their efficiency and reliability across diverse data sets. Moreover, they act as foundational pillars when delving into advanced data structures and computational strategies, like trees and heaps. Familiarity with these algorithms is also a common expectation in technical interviews, and can even be a question your interviewer asks you!

Beyond their direct applications, understanding these sorting algorithms hones problem-solving skills, emphasizing the ability to dissect complex problems into simpler components, and offers insights into the trade-offs between time and space, the significance of algorithm stability, and adaptivity. In essence, a comprehensive grasp of O(N * log(N)) algorithms equips individuals with a rich toolkit for both the practical applications and theoretical explorations necessary for a software engineer.

Why Merge & Quick Sort Specifically?

Merge Sort and Quick Sort hold a unique prominence in technical interviews, standing out among other O(N * log(N)) algorithms such as Heap Sort and Timsort (though it is beneficial to understand how those work as well; more on that later).

The two algorithms’ historical significance and ubiquity in computer science curricula make them almost canonical representations of divide-and-conquer techniques. Beyond their basic implementations, they encapsulate a broad range of fundamental concepts, from the “divide, conquer, and combine” strategy of merge sort to the pivotal partitioning in quicksort. This breadth often leads interviewers to use them as springboards for deeper discussions, testing candidates on variations or diving into space-time trade-offs, given the clear contrasts between the two algorithms. For instance, while merge sort’s additional space requirement offers a topic on efficiency trade-offs, quicksort’s pivot choices can lead to intricate discussions on algorithm behavior.

And if you’re still not convinced, Microsoft and Gayle Laakmann McDowell’s Cracking the Coding Interview also recommend knowing these two algorithms specifically.

The Holy Bible for Coding Interviews | Source: Amazon

Merge & Quick Sort

Complexity Analysis Comparisons

Now, before going into the implementations, it is important to recognize the benefits as well as the tradeoffs of each algorithm.

We can first start by talking about the two algorithms’ complexities. While we could talk for hours about the theory behind Merge and Quick Sort complexities, for coding interviews you generally just need to know the following complexities for each sorting algorithm. I’ve linked a quick complexity proof below if you want to dive deeper.

Merge Sort

  • (Average) Time complexity: O(N * log(N))
  • (Worst) Time complexity: O(N * log(N))
  • (Average) Space complexity: O(N)

https://www.geeksforgeeks.org/time-and-space-complexity-analysis-of-merge-sort/
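
If you want the one-line intuition behind those numbers: merge sort halves the array at every level (about log(N) levels deep), and the merge step does O(N) work per level, giving the recurrence T(N) = 2 * T(N/2) + O(N), which resolves to O(N * log(N)) in every case.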

Quick Sort

  • (Average) Time complexity: O(N * log(N))
  • (Worst) Time complexity: O(N²)
  • (Average) Space complexity: O(log(N))
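
The intuition for the worst case: with consistently bad pivots (for example, always the smallest element of an already-sorted array), each partition peels off just one element, giving T(N) = T(N - 1) + O(N), which telescopes to O(N²). The O(log(N)) space is the average depth of the recursion stack.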

General Comparisons

Outside of complexities, it’s also important to know when to use each, and you can discuss this with your interviewer. In general, both are efficient divide-and-conquer sorting algorithms, but they cater to different needs.

Merge Sort is stable (meaning the order of equal elements is maintained in the sorted list) and guarantees O(N * log(N)) performance, making it ideal for tasks like external sorting where data is too large for main memory. However, it requires additional O(N) space.

Quick Sort, on the other hand, often runs faster in practice and can be done in-place, saving memory. Yet, its performance can degrade to O(N²) with poor pivot choices, though techniques like “median-of-three” (implemented in the code example) can mitigate this. The choice between the two depends on the specific problem constraints, with merge sort being more predictable and quick sort often being more space-efficient and faster for in-memory sorts.
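
If “stable” is a new term for you, here’s a tiny illustration using Python’s built-in sorted (which, like merge sort, is stable); the records are made up for the example:

records = [("apple", 2), ("banana", 1), ("cherry", 2)]
# Sorting by the number alone keeps equal keys in their original order:
# ("apple", 2) still comes before ("cherry", 2) in the output.
print(sorted(records, key=lambda pair: pair[1]))
# [('banana', 1), ('apple', 2), ('cherry', 2)]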

Code Implementations

I’ll be using Python for these implementations because Python is the easiest language to use for coding interviews. You can watch NeetCode’s video on the topic (and definitely plenty more on the website) if you’re unconvinced.

Ranking Tier List for the Best Language for Interviews (btw it’s Python) | Source: NeetCode

The following are the code implementations with in-code explanations of how each algorithm works. While I recognize that there are shorter implementations out there, these are, more often than not, the “correct” way to implement them. For example, a lot of Quick Sort implementations use internal arrays when the algorithm itself is meant to sort the array in place.

I also included images and video links for each algorithm if you need further explanation for your learning.

Merge Sort

Merge Sort Visual Representation | Source: GeeksforGeeks
from typing import List

class Solution:
    def sortArray(self, nums: List[int]) -> List[int]:
        def merge_sort(nums):
            # Base case: a single (or zero) element array is already sorted
            if len(nums) <= 1:
                return nums

            # Split the array into two halves (Divide)
            middle = len(nums) // 2
            left_half = nums[:middle]
            right_half = nums[middle:]

            # Recursively sort both halves (Conquer)
            left_sorted = merge_sort(left_half)
            right_sorted = merge_sort(right_half)

            # Merge the sorted halves (Combine)
            return merge(left_sorted, right_sorted)

        def merge(left, right):
            result = []
            left_pointer, right_pointer = 0, 0

            # Traverse through both left and right arrays
            while left_pointer < len(left) and right_pointer < len(right):
                # If the left value is lower or equal, add it to the result.
                # Taking from the left on ties is what keeps the sort stable.
                if left[left_pointer] <= right[right_pointer]:
                    result.append(left[left_pointer])
                    left_pointer += 1
                # Otherwise, add the right value
                else:
                    result.append(right[right_pointer])
                    right_pointer += 1

            # Append any leftover items from left or right if they exist
            # (at least one of the two will always be empty)
            result.extend(left[left_pointer:])
            result.extend(right[right_pointer:])

            return result

        return merge_sort(nums)
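
A quick sanity check with a hypothetical input (run outside of LeetCode, which normally supplies the surrounding harness):

print(Solution().sortArray([5, 2, 9, 1, 5, 6]))
# Output: [1, 2, 5, 5, 6, 9]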
A Quick Introduction to Merge Sort | Source: Michael Sambol
A More In-depth Video Explanation to Merge Sort | Source: Abdul Bari

Quick Sort

Note: The visual representation below uses a simple method of choosing the pivot: the last element (a pivot can likewise be the first element). I instead use the “median of three” technique, shown in the implementation below.

This method selects the median of the first, middle, and last elements of the array as the pivot, providing a better approximation of the array’s central value than simply picking an endpoint, and reducing the chance of hitting Quick Sort’s worst-case scenario.

Quick Sort Visual Representation using Last Element as Pivot | Source: EnjoyAlgorithms
A Quick Introduction to Quick Sort | Source: Michael Sambol
A More In-depth Video Explanation to Quick Sort | Source: Abdul Bari
class Solution:
    def sortArray(self, nums):
        def quick_sort(nums, low=None, high=None):
            # If low and high are not specified, set them to cover the whole array.
            if low is None and high is None:
                low, high = 0, len(nums) - 1

            # Continue only if there's more than one element in the current section.
            if low < high:
                # Partition the array around a pivot and get the index of the pivot.
                pivot_index = partition(nums, low, high)

                # Recursively sort elements before the pivot and after the pivot.
                quick_sort(nums, low, pivot_index - 1)
                quick_sort(nums, pivot_index + 1, high)

            return nums

        def partition(nums, low, high):
            # Use the median-of-three method to get the pivot value:
            # the median of the first, middle, and last elements.
            pivot = sorted([nums[low], nums[(low + high) // 2], nums[high]])[1]

            # Move the pivot value to the 'high' position, so the partition logic
            # works seamlessly. Since we only know the pivot's value (not its
            # index), we have to check each possibility.
            if pivot == nums[low]:
                nums[low], nums[high] = nums[high], nums[low]
            elif pivot == nums[(low + high) // 2]:
                nums[(low + high) // 2], nums[high] = nums[high], nums[(low + high) // 2]

            # Boundary pointer for the elements less than or equal to the pivot
            i = low - 1

            # Traverse the array from 'low' to 'high - 1'.
            for j in range(low, high):
                # If the current element is <= the pivot, move it to the left side.
                if nums[j] <= pivot:
                    i += 1
                    nums[i], nums[j] = nums[j], nums[i]

            # Move the pivot to its correct position.
            nums[i + 1], nums[high] = nums[high], nums[i + 1]

            # Return the index of the pivot, dividing the array into two halves.
            return i + 1

        return quick_sort(nums)
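
And the same kind of hypothetical sanity check; note that, unlike merge sort, this version rearranges the input list itself since it sorts in place:

nums = [10, 7, 8, 9, 1, 5]
print(Solution().sortArray(nums))
# Output: [1, 5, 7, 8, 9, 10] (and nums itself is now sorted, too)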

What about Other Sorting Algorithms?

While these algorithms are popular in interviews, different interviewers might emphasize other algorithms. That is why I recommend not ONLY knowing Merge & Quick Sort, but also being familiar with the intuition behind other popular sorting algorithms, such as Heap Sort, as mentioned before.

Below are the five algorithms you should know intuitively (only how they work and their complexities; no implementation needed):

Bubble Sort

(Average Time Complexity: O(N²) | Average Space Complexity: O(1))

Start at the beginning of the array, swap the first two elements if the first is greater than the second, then move on to the next pair, and keep making sweeps until the array is sorted. In a sense, the smaller items will “bubble” up to the beginning of the list.
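
You don’t need to memorize an implementation, but if seeing code helps the intuition, here’s a minimal sketch:

def bubble_sort(nums):
    n = len(nums)
    for sweep in range(n - 1):
        swapped = False
        # Compare each adjacent pair and swap if out of order; the largest
        # remaining element settles at the end after each sweep.
        for i in range(n - 1 - sweep):
            if nums[i] > nums[i + 1]:
                nums[i], nums[i + 1] = nums[i + 1], nums[i]
                swapped = True
        if not swapped:
            break  # no swaps means the array is already sorted
    return nums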

A Quick Introduction to Bubble Sort | Source: Michael Sambol

Selection Sort

(Average Time Complexity: O(N²) | Average Space Complexity: O(1))

For each position in the list, find the smallest remaining element using a linear scan and move it to the front of the unsorted portion.
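
Again purely optional, but here’s a minimal sketch of that scan-and-swap loop:

def selection_sort(nums):
    n = len(nums)
    for i in range(n - 1):
        # Linear scan for the smallest element in the unsorted remainder.
        smallest = i
        for j in range(i + 1, n):
            if nums[j] < nums[smallest]:
                smallest = j
        # Move it to the front of the unsorted portion.
        nums[i], nums[smallest] = nums[smallest], nums[i]
    return nums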

A Quick Introduction to Selection Sort | Source: Michael Sambol

Insertion Sort

(Average Time Complexity: O(N²) | Average Space Complexity: O(1))

For each number in the list, compare it to the numbers to its left and swap until it sits in order.
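
As a minimal sketch:

def insertion_sort(nums):
    for i in range(1, len(nums)):
        j = i
        # Swap the current number leftwards until it is in order.
        while j > 0 and nums[j - 1] > nums[j]:
            nums[j - 1], nums[j] = nums[j], nums[j - 1]
            j -= 1
    return nums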

A Quick Introduction to Insertion Sort | Source: Michael Sambol

Heap Sort

(Average Time Complexity: O(N * log(N)) | Average Space Complexity: O(1))

This sorting algorithm uses a binary heap (typically a max heap, so the parent is always larger than its children) to store and return elements in order. By repeatedly removing the top element until the heap is empty, we effectively produce a sorted list. The space complexity of heap sort is O(1) because we can build the heap in place.
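
Here’s a minimal sketch using Python’s built-in heapq. Two caveats versus the description above: heapq is a min-heap (so we pop the smallest element first rather than the largest), and this sketch builds the result in a new list, so it uses O(N) extra space instead of sorting in place:

import heapq

def heap_sort(nums):
    heap = list(nums)
    heapq.heapify(heap)  # builds a min-heap in O(N)
    # Repeatedly pop the smallest element until the heap is empty.
    return [heapq.heappop(heap) for _ in range(len(heap))]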

A Quick Introduction to Heap Sort | Source: GeeksforGeeks

Radix Sort

(Average Time Complexity: O(K * N) | Average Space Complexity: O(K + N)), where N = number of elements and K = number of passes (one per digit position)

The complexities with the unusual variables may seem intimidating, but they’re not once you understand them! Radix sort arranges numbers by their individual digits, starting from the least significant digit and progressing to the most significant, eventually ending up with a sorted list.
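
A minimal sketch of an LSD (least-significant-digit) radix sort, assuming non-negative integers:

def radix_sort(nums):
    if not nums:
        return nums
    digit_place = 1
    # One pass per digit of the largest number (this is the K in the complexity).
    while max(nums) // digit_place > 0:
        # Bucket numbers by the current digit (0-9), preserving their order,
        # then flatten the buckets back into a single list.
        buckets = [[] for _ in range(10)]
        for num in nums:
            buckets[(num // digit_place) % 10].append(num)
        nums = [num for bucket in buckets for num in bucket]
        digit_place *= 10
    return nums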

A Quick Introduction to Radix Sort | Source: GeeksforGeeks

Conclusion

Before I get any angry comments about minor mistakes: not everything here is exactly precise, especially the complexities. For example, even though Heap Sort and Quick Sort share the same O(N * log(N)) average time complexity, in practice Quick Sort is considerably faster than Heap Sort. This guide only serves as the base knowledge you should have in the context of coding interviews, not as a scientific proof of sorting algorithms.

But with this article, I am aiming to combine everything about sorting algorithms in the context of interviews into one comprehensive guide. If there is anything I should include, please let me know!

Good luck on your interviews, and happy learning!

Disclosure: This article contains affiliate links. This means that if you click on a product or service link and make a purchase, I may receive a small commission at no extra cost to you. I only recommend products or services that I believe are valuable and relevant to my readers. Your support through these links helps keep this content free. Thank you for your support!

Written by Daniel Park

"The best way to learn is to teach" - Frank Oppenheimer