This approach converts the paragraph to lowercase, strips punctuation, and uses a hash map to count the frequency of each word that is not in the banned list. The word with the highest count is then returned as the result.
Time Complexity: O(N + M), where N is the length of the paragraph and M is the number of banned words. Space Complexity: O(N) for storing word frequencies.
function mostCommonWord(paragraph, banned) {
  const bannedSet = new Set(banned);
  // Lowercase the paragraph and extract the words, ignoring punctuation.
  // Fall back to an empty array in case nothing matches.
  const words = paragraph.toLowerCase().match(/\w+/g) || [];
  const wordCount = {};

  // Count occurrences of every word that is not banned.
  for (const word of words) {
    if (!bannedSet.has(word)) {
      wordCount[word] = (wordCount[word] || 0) + 1;
    }
  }

  // Scan the counts to find the most frequent word.
  let maxWord = '';
  let maxCount = 0;

  for (const [word, count] of Object.entries(wordCount)) {
    if (count > maxCount) {
      maxWord = word;
      maxCount = count;
    }
  }

  return maxWord;
}

const paragraph = "Bob hit a ball, the hit BALL flew far after it was hit.";
const banned = ["hit"];
console.log(mostCommonWord(paragraph, banned)); // "ball"
This JavaScript solution lowercases the paragraph and uses a regular expression to extract its words. Banned words are stored in a Set for fast lookups, the frequencies of the remaining words are tallied in a plain object, and the word with the maximum frequency is returned.
This approach leverages higher-level, language-specific string and collection utilities for parsing and counting. The words are extracted, normalized, and counted with built-in methods, which keeps the code more concise.
Time Complexity: O(N log N) if the counted words are sorted to find the maximum, where N is the total number of words extracted; a single reduce pass keeps it at O(N). Space Complexity: O(N).
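The code for this variant is not included above, so the following is a minimal sketch of what it could look like, assuming the same regular-expression extraction as before and a single reduce over Object.entries to pick the most frequent non-banned word. The function name mostCommonWordConcise is illustrative, not taken from the original.

function mostCommonWordConcise(paragraph, banned) {
  // Illustrative reconstruction; the name and exact structure are assumptions.
  const bannedSet = new Set(banned);

  // Extract and normalize the words, then count the non-banned ones.
  const wordCount = {};
  for (const word of paragraph.toLowerCase().match(/\w+/g) || []) {
    if (!bannedSet.has(word)) {
      wordCount[word] = (wordCount[word] || 0) + 1;
    }
  }

  // Reduce the [word, count] entries to the entry with the highest count.
  const [best] = Object.entries(wordCount).reduce(
    (max, entry) => (entry[1] > max[1] ? entry : max),
    ['', 0]
  );
  return best;
}

console.log(mostCommonWordConcise(
  "Bob hit a ball, the hit BALL flew far after it was hit.",
  ["hit"]
)); // "ball"

Using reduce avoids materializing a sorted copy of the entries, so the maximum is found in one linear pass over the counts.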
In JavaScript, this amounts to building the count object and then reducing the iterable produced by Object.entries to the entry with the highest count.