Given an integer array nums sorted in non-decreasing order, remove the duplicates in-place such that each unique element appears only once. The relative order of the elements should be kept the same. Then return the number of unique elements in nums.
Consider the number of unique elements of nums to be k. To get accepted, you need to do the following things:

Change the array nums such that the first k elements of nums contain the unique elements in the order they were present in nums initially. The remaining elements of nums are not important, nor is the size of nums.
Return k.

Custom Judge:
The judge will test your solution with the following code:
```java
int[] nums = [...]; // Input array
int[] expectedNums = [...]; // The expected answer with correct length

int k = removeDuplicates(nums); // Calls your implementation

assert k == expectedNums.length;
for (int i = 0; i < k; i++) {
    assert nums[i] == expectedNums[i];
}
```
If all assertions pass, then your solution will be accepted.
Example 1:
Input: nums = [1,1,2]
Output: 2, nums = [1,2,_]
Explanation: Your function should return k = 2, with the first two elements of nums being 1 and 2 respectively. It does not matter what you leave beyond the returned k (hence they are underscores).
Example 2:
Input: nums = [0,0,1,1,1,2,2,3,3,4]
Output: 5, nums = [0,1,2,3,4,_,_,_,_,_]
Explanation: Your function should return k = 5, with the first five elements of nums being 0, 1, 2, 3, and 4 respectively. It does not matter what you leave beyond the returned k (hence they are underscores).
Constraints:
1 <= nums.length <= 3 * 10^4
-100 <= nums[i] <= 100
nums is sorted in non-decreasing order.

The key observation in #26 Remove Duplicates from Sorted Array is that the array is already sorted. This means duplicate values always appear next to each other, allowing us to process them efficiently without extra data structures.
A common strategy uses the two pointers technique. One pointer tracks the position of the last unique element, while the other pointer scans through the array to find new values. Whenever a different value is encountered, it is placed at the next valid position for unique elements. This allows the array to be updated in-place, which is an important constraint of the problem.
By moving the pointers carefully and only keeping unique elements at the front portion of the array, we effectively compress the array while maintaining the original order. This approach avoids additional memory usage and processes each element only once, making it highly efficient for interview scenarios.
The overall complexity is O(n) time with O(1) extra space.
| Approach | Time Complexity | Space Complexity |
|---|---|---|
| Two Pointers (In-place) | O(n) | O(1) |
Use these hints if you're stuck. Try solving on your own first.
In this problem, the key point to focus on is the input array being sorted. As far as duplicate elements are concerned, what is their positioning in the array when the given array is sorted? Look at the image below for the answer. If we know the position of one of the elements, do we also know the positioning of all the duplicate elements? <br> <img src="https://assets.leetcode.com/uploads/2019/10/20/hint_rem_dup.png" width="500"/>
We need to modify the array in-place, and the size of the final array could potentially be smaller than the size of the input array. So, we ought to use a two-pointer approach here: one pointer that keeps track of the current element in the original array, and another for just the unique elements.
Essentially, once an element is encountered, you simply need to <b>bypass</b> its duplicates and move on to the next unique element.
The two-pointer technique is an efficient way to solve this problem in-place. We initialize two pointers, where one (called slow) tracks the position of unique elements, and the other (called fast) traverses the entire array.
Whenever we find a new unique element using the fast pointer, we move the slow pointer one step forward and place the unique element at that position. This way, the array elements from index 0 to the slow pointer are all unique.
Time Complexity: O(n), where n is the number of elements in the array, since each of the n elements is accessed once.
Space Complexity: O(1), since the solution uses only a constant amount of extra space.
```c
#include <stdio.h>

int removeDuplicates(int* nums, int numsSize) {
    if (numsSize == 0) return 0;
    int j = 0;                        // index of the last unique element
    for (int i = 1; i < numsSize; i++)
        if (nums[i] != nums[j]) nums[++j] = nums[i];  // keep new unique value
    return j + 1;                     // count of unique elements
}
```
We first handle the empty array by returning 0. We then initialize two indices: j, which tracks the last position of unique elements and starts at 0, and i, which iterates from 1 to the end of the array.
Whenever a non-duplicate element is found by comparing nums[i] with nums[j], j is incremented, and nums[j] is updated with nums[i]. Finally, we return j + 1 as it represents the count of unique elements.
An alternative phrasing avoids naming two explicit pointers and uses a simple loop that iterates over the array, comparing each element with the previous one. Unique elements overwrite duplicate positions directly in the initial part of the array.
Time Complexity: O(n)
Space Complexity: O(1)
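The original snippet for this variant did not survive extraction; as a sketch, the loop described above could look like the following (functionally equivalent to the two-pointer version, just expressed with a single write index k and a comparison against the predecessor):

```c
#include <stdio.h>

// Alternative phrasing: iterate once, compare each element with its
// predecessor, and write non-duplicates to the front of the array.
int removeDuplicates(int* nums, int numsSize) {
    if (numsSize == 0) return 0;
    int k = 1;                         // nums[0] is always unique
    for (int i = 1; i < numsSize; i++) {
        if (nums[i] != nums[i - 1]) {  // a value different from its predecessor
            nums[k++] = nums[i];       // is placed at the next write position
        }
    }
    return k;
}
```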
Yes, this is a common entry-level array problem asked in coding interviews, including FAANG-style interviews. It tests understanding of array traversal, in-place modification, and the two pointers technique.
The problem only requires the input array itself. Since the array is sorted, duplicates can be handled using pointer manipulation instead of additional data structures like hash sets.
The optimal approach uses the two pointers technique. One pointer tracks the position of the last unique element while the other scans the array for new values. Because the array is sorted, duplicates appear consecutively, making it easy to filter them in O(n) time.
Two pointers allow us to traverse the array once while keeping track of where the next unique element should be placed. This enables an in-place update without using additional memory, which satisfies the constraints of the problem.
This approach iterates from the second element, comparing it with its predecessor to check duplicates and copying only non-duplicates towards the start.