This approach involves using a hash map to count the frequency of each word in the paragraph after converting it to lowercase and removing punctuation. Then, the word with the highest count that is not in the banned list is selected as the result.
Time Complexity: O(N + M), where N is the length of the paragraph and M is the number of banned words. Space Complexity: O(N) for storing word frequencies.
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

class Program {
    public static string MostCommonWord(string paragraph, string[] banned) {
        var bannedSet = new HashSet<string>(banned);
        var wordCounts = new Dictionary<string, int>();

        // Lowercase the paragraph, then extract words with a regex. The
        // verbatim string (@"...") matters here: in a regular literal,
        // \b is a backspace character, not a word boundary.
        var words = Regex.Matches(paragraph.ToLower(), @"\b\w+\b");

        foreach (Match match in words) {
            var word = match.Value;
            if (!bannedSet.Contains(word)) {
                wordCounts.TryGetValue(word, out int count);
                wordCounts[word] = count + 1;
            }
        }

        // A single linear scan finds the most frequent word, keeping the
        // overall cost at O(N + M) as stated above.
        string result = "";
        int bestCount = 0;
        foreach (var kvp in wordCounts) {
            if (kvp.Value > bestCount) {
                result = kvp.Key;
                bestCount = kvp.Value;
            }
        }
        return result;
    }

    static void Main() {
        string paragraph = "Bob hit a ball, the hit BALL flew far after it was hit.";
        string[] banned = { "hit" };
        Console.WriteLine(MostCommonWord(paragraph, banned)); // ball
    }
}
In C#, the solution lowercases the paragraph and extracts the words with a regular expression. A dictionary maps each word to its frequency, skipping banned words, and the answer is the word whose count is highest.
This approach leans on the string manipulation and collection utilities built into each language for concise parsing and counting. The words are extracted, normalized, and grouped entirely with standard library calls, trading a little speed for much shorter code.
Time Complexity: O(N log N) due to sorting, where N is the number of words extracted. Space Complexity: O(N).
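A minimal C# sketch of this sort-based variant, written under the same tokenization assumptions as the first solution (it is an illustration, not the article's original multi-language listing). GroupBy buckets the occurrences and OrderByDescending supplies the O(N log N) sort:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.RegularExpressions;

class SortBasedVariant {
    static string MostCommonWord(string paragraph, string[] banned) {
        var bannedSet = new HashSet<string>(banned);
        return Regex.Matches(paragraph.ToLower(), @"\b\w+\b")
                    .Cast<Match>()
                    .Select(m => m.Value)               // normalized words
                    .Where(w => !bannedSet.Contains(w)) // drop banned words
                    .GroupBy(w => w)                    // bucket occurrences
                    .OrderByDescending(g => g.Count())  // the O(N log N) sort
                    .First().Key;
    }

    static void Main() {
        Console.WriteLine(MostCommonWord(
            "Bob hit a ball, the hit BALL flew far after it was hit.",
            new[] { "hit" })); // ball
    }
}

Sorting every group does more work than a single max scan, but the whole computation reads as one declarative pipeline.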
The JavaScript version builds the same frequency map as an object, then reduces over its Object.entries to find the word with the maximum count in a single pass.
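That reduction pattern is easy to mirror in C#: LINQ's Aggregate plays the role of JavaScript's reduce over Object.entries. This is a hedged analogue with a small hand-built count map, since the original JavaScript listing is not reproduced here:

using System;
using System.Collections.Generic;
using System.Linq;

class ReduceAnalogue {
    static void Main() {
        // A small pre-built frequency map standing in for the counted words.
        var counts = new Dictionary<string, int> {
            ["ball"] = 2, ["bob"] = 1, ["a"] = 1
        };

        // Carry the best entry seen so far, replacing it on a higher count,
        // just as a JavaScript reduce over Object.entries would.
        var most = counts.Aggregate((best, kvp) => kvp.Value > best.Value ? kvp : best);
        Console.WriteLine(most.Key); // ball
    }
}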