Given an array of integers nums, sort the array in ascending order and return it.
You must solve the problem without using any built-in sort functions, in O(n log n) time complexity and with the smallest space complexity possible.
Example 1:
Input: nums = [5,2,3,1]
Output: [1,2,3,5]
Explanation: After sorting the array, the positions of some numbers are not changed (for example, 2 and 3), while the positions of other numbers are changed (for example, 1 and 5).
Example 2:
Input: nums = [5,1,1,2,0,0]
Output: [0,0,1,1,2,5]
Explanation: Note that the values of nums are not necessarily unique.
Constraints:
1 <= nums.length <= 5 * 10^4
-5 * 10^4 <= nums[i] <= 5 * 10^4

The goal of #912 Sort an Array is to return the sorted version of a given integer array without relying on built-in sorting utilities. This problem is commonly used to test your understanding of classic sorting algorithms and their trade-offs.
A reliable approach is Merge Sort, a divide and conquer algorithm. The array is recursively split into smaller halves until each segment contains a single element. These segments are then merged in sorted order using a temporary structure. Merge Sort guarantees O(n log n) time complexity regardless of input distribution, making it a safe and predictable choice for interviews.
Another option is Heap Sort, which uses a priority queue (heap) to repeatedly extract the smallest element. It also runs in O(n log n) time and can be implemented with constant auxiliary space.
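One way to realize the constant-auxiliary-space variant mentioned above is to build the heap in place as a max-heap instead of using a separate priority queue. Here is a sketch under that assumption (the helper names `heap_sort` and `sift_down` are my own):

```python
def heap_sort(nums):
    n = len(nums)

    def sift_down(i, size):
        # Push nums[i] down until the max-heap property holds within nums[:size].
        while True:
            largest = i
            left, right = 2 * i + 1, 2 * i + 2
            if left < size and nums[left] > nums[largest]:
                largest = left
            if right < size and nums[right] > nums[largest]:
                largest = right
            if largest == i:
                return
            nums[i], nums[largest] = nums[largest], nums[i]
            i = largest

    # Build a max-heap bottom-up, starting from the last internal node.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly swap the current maximum to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        nums[0], nums[end] = nums[end], nums[0]
        sift_down(0, end)
    return nums
```

Because the heap lives inside the input array itself, the only extra storage is a handful of index variables, which is where the O(1) space figure comes from.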
When the value range is limited, non-comparison techniques like Counting Sort or Radix Sort can achieve near O(n) performance. Choosing the right method depends on constraints such as array size and value range.
| Approach | Time Complexity | Space Complexity |
|---|---|---|
| Merge Sort | O(n log n) | O(n) |
| Heap Sort | O(n log n) | O(1) |
| Counting Sort | O(n + k) | O(k) |
| Radix Sort | O(d × (n + k)) | O(n + k) |
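To illustrate the counting-based option from the table, here is a Counting Sort sketch tailored to this problem's stated value range (the `lo`/`hi` defaults of ±5 * 10^4 come from the constraints; the function name is my own):

```python
def counting_sort(nums, lo=-50_000, hi=50_000):
    # One counter slot per possible value; the offset maps lo..hi to 0..k-1.
    counts = [0] * (hi - lo + 1)
    for x in nums:
        counts[x - lo] += 1
    # Rebuild the array in ascending order by replaying the counts.
    out = []
    for i, c in enumerate(counts):
        out.extend([i + lo] * c)
    return out
```

This runs in O(n + k) time and O(k) space, where k is the size of the value range, matching the row in the table above; it only pays off when k is comparable to or smaller than n.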
Merge Sort is a classic divide-and-conquer sorting algorithm that runs in O(n log n) time and requires extra space proportional to the length of the array. The core idea is to divide the array into two halves, recursively sort each half, and then merge the sorted halves to produce a fully sorted array.
Time Complexity: O(n log n)
Space Complexity: O(n) (for the temporary arrays used during merging)
```python
def merge(arr, left, mid, right):
    # Merge the two sorted runs arr[left..mid] and arr[mid+1..right].
    n1 = mid - left + 1
    n2 = right - mid
    L = arr[left:left + n1]
    R = arr[mid + 1:mid + 1 + n2]
    i = j = 0
    k = left
    while i < n1 and j < n2:
        if L[i] <= R[j]:
            arr[k] = L[i]
            i += 1
        else:
            arr[k] = R[j]
            j += 1
        k += 1
    # Copy any remaining elements from either half.
    while i < n1:
        arr[k] = L[i]
        i += 1
        k += 1
    while j < n2:
        arr[k] = R[j]
        j += 1
        k += 1

def merge_sort(arr, left, right):
    if left < right:
        mid = (left + right) // 2
        merge_sort(arr, left, mid)
        merge_sort(arr, mid + 1, right)
        merge(arr, left, mid, right)

nums = [5, 2, 3, 1]
merge_sort(nums, 0, len(nums) - 1)
print(nums)  # [1, 2, 3, 5]
```

The Python version implements merge sort cleanly using Python's slicing. The recursive function merge_sort splits the list, and merge combines the sorted sections during the sort process.
Quick Sort is an efficient, in-place sorting algorithm that runs in O(n log n) time on average. The approach selects a 'pivot' element from the array and partitions the remaining elements into two subarrays according to whether they are less than or greater than the pivot; the subarrays are then sorted recursively.
Time Complexity: O(n^2) in the worst case, O(n log n) on average.
Space Complexity: O(log n) for the recursion stack.
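The quick sort described above can be sketched in Python using Hoare's partition scheme. This is a sketch of the general technique, not the site's reference solution; the function names `quick_sort` and `partition` are my own:

```python
def partition(arr, left, right):
    # Hoare's scheme: pick the middle element as pivot, then move two
    # pointers toward each other, swapping out-of-place pairs.
    pivot = arr[(left + right) // 2]
    i, j = left - 1, right + 1
    while True:
        i += 1
        while arr[i] < pivot:
            i += 1
        j -= 1
        while arr[j] > pivot:
            j -= 1
        if i >= j:
            return j
        arr[i], arr[j] = arr[j], arr[i]

def quick_sort(arr, left, right):
    # Sort arr[left..right] in place.
    if left < right:
        p = partition(arr, left, right)
        # With Hoare's scheme the returned index belongs to the left half.
        quick_sort(arr, left, p)
        quick_sort(arr, p + 1, right)
```

Note that Hoare's partition returns a split point rather than the pivot's final position, so the left recursive call includes index p, unlike the more common Lomuto scheme.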
Yes, sorting problems like this are frequently used in technical interviews at companies such as Google, Amazon, and Meta. They test understanding of fundamental algorithms like merge sort, heap sort, and counting-based techniques.
A common optimal approach is Merge Sort because it guarantees O(n log n) time complexity for all cases. It follows the divide-and-conquer strategy by splitting the array and merging sorted halves efficiently.
It depends on the chosen algorithm. Merge Sort relies on temporary arrays for merging, while Heap Sort uses a heap (priority queue) structure to repeatedly extract the smallest or largest element.
Counting Sort or Radix Sort works best when the integer range is limited or digits are processed individually. These non-comparison algorithms can outperform traditional sorts by achieving near linear time complexity under the right constraints.
The Java solution implements Hoare's partition scheme quick sort. It efficiently manages in-place partitions and recursive sorting through the quickSort and partition methods.