We define a harmonious array as an array where the difference between its maximum value and its minimum value is exactly 1.
Given an integer array nums, return the length of its longest harmonious subsequence among all its possible subsequences.
Example 1:
Input: nums = [1,3,2,2,5,2,3,7]
Output: 5
Explanation:
The longest harmonious subsequence is [3,2,2,2,3].
Example 2:
Input: nums = [1,2,3,4]
Output: 2
Explanation:
The longest harmonious subsequences are [1,2], [2,3], and [3,4], all of which have a length of 2.
Example 3:
Input: nums = [1,1,1,1]
Output: 0
Explanation:
No harmonious subsequence exists: every element equals 1, so the difference between maximum and minimum is 0, not 1.
Constraints:
1 <= nums.length <= 2 * 10^4
-10^9 <= nums[i] <= 10^9

Problem Overview: Given an integer array nums, find the length of the longest harmonious subsequence. A subsequence is harmonious if the difference between its maximum and minimum values is exactly 1. A subsequence need not be contiguous, but it must preserve the relative order of elements from the original array.
Approach 1: HashMap for Counting Elements (O(n) time, O(n) space)
This approach counts the frequency of every number using a hash map. After building the frequency table, iterate through each unique value x and check whether x + 1 exists. If it does, the total length of a harmonious subsequence formed by those values is count[x] + count[x+1]. Track the maximum across all such pairs. The key insight: a harmonious subsequence only depends on the counts of two consecutive integers, not their positions. Hash lookups run in constant time, so the entire process completes in O(n) time. This method heavily relies on efficient counting with a hash table and works well for unsorted arrays.
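A minimal sketch of the counting approach in Python (the function name find_lhs is illustrative, not from the original problem statement):

```python
from collections import Counter

def find_lhs(nums):
    # Count how many times each value occurs.
    counts = Counter(nums)
    best = 0
    # A harmonious subsequence uses every copy of some value x
    # and every copy of x + 1, provided x + 1 actually occurs.
    for x, c in counts.items():
        if x + 1 in counts:
            best = max(best, c + counts[x + 1])
    return best

print(find_lhs([1, 3, 2, 2, 5, 2, 3, 7]))  # 5
print(find_lhs([1, 1, 1, 1]))              # 0
```

Checking only x + 1 (and not x - 1) is enough, because each pair of consecutive values is examined once from its smaller member.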
Approach 2: Sorting + Sliding Window (O(n log n) time, O(1) extra space)
Another option sorts the array first. Once sorted, numbers with similar values appear next to each other. Use two pointers to maintain a window where the difference between the current maximum and minimum stays ≤ 1. Expand the right pointer while the condition holds. If the difference exceeds 1, move the left pointer forward. Whenever the difference equals exactly 1, update the maximum window length. Sorting costs O(n log n), while the two-pointer scan runs in O(n). This approach is intuitive if you are comfortable with sorting and sliding window patterns.
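A sketch of the sorting plus sliding-window variant (again with an illustrative function name; sorted() is used here to leave the caller's array untouched, at the cost of the O(1)-extra-space claim, which assumes an in-place sort):

```python
def find_lhs_sorted(nums):
    nums = sorted(nums)  # copy-and-sort; use nums.sort() for in-place
    left = 0
    best = 0
    for right in range(len(nums)):
        # Shrink the window until its spread (max - min) is at most 1.
        while nums[right] - nums[left] > 1:
            left += 1
        # Only windows whose spread is exactly 1 are harmonious.
        if nums[right] - nums[left] == 1:
            best = max(best, right - left + 1)
    return best

print(find_lhs_sorted([1, 3, 2, 2, 5, 2, 3, 7]))  # 5
```

Because the array is sorted, nums[left] and nums[right] are the window's minimum and maximum, so the spread check needs no extra bookkeeping.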
The sorted method works well when modifying the input array is acceptable and you prefer pointer-based reasoning. The hash map approach is typically faster in practice because it avoids sorting and directly uses element frequencies.
Recommended for interviews: The hash map counting approach is usually what interviewers expect. It demonstrates strong understanding of frequency counting and constant-time lookups. Mentioning the sorting approach shows you considered multiple strategies, but implementing the O(n) hash-based solution proves you can optimize beyond brute-force comparisons.
| Approach | Time | Space | When to Use |
|---|---|---|---|
| HashMap for Counting Elements | O(n) | O(n) | Best general solution. Works efficiently for unsorted arrays using frequency counting. |
| Sorting + Sliding Window | O(n log n) | O(1) extra | Useful when sorting is acceptable and you prefer two-pointer window logic. |