In this approach, we track the frequency of each character in each word using a fixed-size count table (a 26-slot array for lowercase letters, playing the role of a hash map keyed by character). We then maintain a common frequency table that holds the minimum frequency of each character across all words, which ensures that only characters present in every word are recorded.
Time Complexity: O(N*K), where N is the number of words and K is the average word length. Space Complexity: O(1) auxiliary space, since the two 26-element count arrays do not grow with the input (the output list is not counted).
using System;
using System.Collections.Generic;

public class CommonChars {
    public static IList<char> FindCommonChars(string[] words) {
        // Running minimum frequency of each letter across all words seen so far.
        var minCount = new int[26];
        Array.Fill(minCount, int.MaxValue);

        foreach (var word in words) {
            // Count letter frequencies for the current word.
            var count = new int[26];
            foreach (var c in word) {
                count[c - 'a']++;
            }
            // Keep only the minimum count observed for each letter.
            for (int i = 0; i < 26; i++) {
                minCount[i] = Math.Min(minCount[i], count[i]);
            }
        }

        // Emit each letter as many times as its minimum count.
        var result = new List<char>();
        for (int i = 0; i < 26; i++) {
            for (int j = 0; j < minCount[i]; j++) {
                result.Add((char)(i + 'a'));
            }
        }
        return result;
    }

    public static void Main() {
        var words = new string[] { "bella", "label", "roller" };
        var result = FindCommonChars(words);
        Console.WriteLine(string.Join(", ", result)); // prints: e, l, l
    }
}
This C# solution follows the logic described above: maintain a running minimum count per character, then populate the result list with each character repeated as many times as its minimum count across all words.
Alternatively, we can represent the frequencies directly with character-count arrays and update them as each word is processed: starting from the first word's frequencies, we iteratively take the element-wise minimum with each remaining word's counts.
Time Complexity: O(N*K). Space Complexity: O(1) for the constant-sized arrays.
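A minimal C sketch of this variant, assuming lowercase input words and using an illustrative function name find_common_chars (not taken from the original listing), might look like this:

#include <stdio.h>
#include <string.h>

/* Illustrative helper: prints the characters common to all n words. */
void find_common_chars(const char *words[], int n) {
    int minCount[26];
    int count[26];
    int i, j;

    if (n == 0) return;

    /* Seed the running minimum with the first word's frequencies. */
    memset(minCount, 0, sizeof(minCount));
    for (j = 0; words[0][j] != '\0'; j++) {
        minCount[words[0][j] - 'a']++;
    }

    /* Merge each subsequent word via an element-wise minimum. */
    for (i = 1; i < n; i++) {
        memset(count, 0, sizeof(count));
        for (j = 0; words[i][j] != '\0'; j++) {
            count[words[i][j] - 'a']++;
        }
        for (j = 0; j < 26; j++) {
            if (count[j] < minCount[j]) minCount[j] = count[j];
        }
    }

    /* Emit each character as many times as its minimum count. */
    for (i = 0; i < 26; i++) {
        for (j = 0; j < minCount[i]; j++) {
            printf("%c ", 'a' + i);
        }
    }
    printf("\n");
}

int main(void) {
    const char *words[] = {"bella", "label", "roller"};
    find_common_chars(words, 3); /* prints: e l l */
    return 0;
}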
In this C implementation, we use frequency arrays to compute the minimum occurrence of each character across all words: each word's frequencies are counted into a temporary array and then merged into the running minimum.