This approach involves using a hash map to count the frequency of each word in the paragraph after converting it to lowercase and removing punctuation. Then, the word with the highest count that is not in the banned list is selected as the result.
Time Complexity: O(N + M), where N is the length of the paragraph and M is the number of banned words. Space Complexity: O(N) for storing word frequencies.
#include <stdio.h>
#include <string.h>
#include <stdlib.h>
#include <ctype.h>

#define MAX_WORD_LENGTH 100
#define MAX_PARAGRAPH_LENGTH 1000

typedef struct {
    char word[MAX_WORD_LENGTH];
    int count;
} WordFrequency;

/* Returns 1 if word appears in the banned list (assumed lowercase), 0 otherwise. */
int isBanned(const char* word, const char** banned, int bannedSize) {
    for (int i = 0; i < bannedSize; i++) {
        if (strcmp(word, banned[i]) == 0) {
            return 1;
        }
    }
    return 0;
}

char* mostCommonWord(char* paragraph, const char** banned, int bannedSize) {
    /* Static buffer so the returned pointer stays valid after the local
       wordFreq array goes out of scope. */
    static char result[MAX_WORD_LENGTH];
    WordFrequency wordFreq[MAX_PARAGRAPH_LENGTH] = {0};
    int maxCount = 0;
    int wordFreqSize = 0;
    const char delimiters[] = " !?',;.";
    char* token = strtok(paragraph, delimiters);

    result[0] = '\0';

    while (token != NULL) {
        /* Copy the token and normalize it to lowercase. */
        char word[MAX_WORD_LENGTH];
        strcpy(word, token);
        for (int i = 0; word[i]; i++) {
            word[i] = (char)tolower((unsigned char)word[i]);
        }
        if (!isBanned(word, banned, bannedSize)) {
            /* Linear search for an existing entry; bump its count or append a new one. */
            int found = 0;
            for (int i = 0; i < wordFreqSize; i++) {
                if (strcmp(wordFreq[i].word, word) == 0) {
                    wordFreq[i].count++;
                    found = 1;
                    break;
                }
            }
            if (!found) {
                strcpy(wordFreq[wordFreqSize].word, word);
                wordFreq[wordFreqSize].count = 1;
                wordFreqSize++;
            }
        }
        token = strtok(NULL, delimiters);
    }

    /* Pick the entry with the highest count. */
    for (int i = 0; i < wordFreqSize; i++) {
        if (wordFreq[i].count > maxCount) {
            maxCount = wordFreq[i].count;
            strcpy(result, wordFreq[i].word);
        }
    }
    return result;
}

int main() {
    const char* banned[] = {"hit"};
    char paragraph[] = "Bob hit a ball, the hit BALL flew far after it was hit.";
    printf("%s\n", mostCommonWord(paragraph, banned, 1)); /* prints "ball" */
    return 0;
}
This C solution uses an array of structures to store word frequencies, since C has no built-in map type like higher-level languages. It tokenizes the paragraph on spaces and punctuation, converts each token to lowercase, filters out banned words, and counts the rest; because each lookup is a linear scan over that array, the worst case is closer to O(N·W) for W distinct words than the O(N + M) of a true hash map. The most frequent non-banned word is returned.
This approach leverages each language's built-in string and collection utilities for concise parsing and counting: the words are extracted, normalized to lowercase, and tallied with library methods rather than hand-written loops.
Time Complexity: O(N log N) due to sorting, where N is the number of words extracted. Space Complexity: O(N).
The JavaScript solution builds a frequency map, then reduces the entries produced by Object.entries() to the word with the highest count.
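A minimal sketch of that reduce-over-entries idea, assuming a plain object as the frequency map and a regular-expression split for tokenization (the function and variable names here are illustrative):

function mostCommonWord(paragraph, banned) {
    const bannedSet = new Set(banned);
    // Lowercase the paragraph and split on anything that is not a letter.
    const words = paragraph.toLowerCase().split(/[^a-z]+/).filter(Boolean);

    // Tally each non-banned word in a plain object keyed by word.
    const counts = {};
    for (const word of words) {
        if (!bannedSet.has(word)) {
            counts[word] = (counts[word] || 0) + 1;
        }
    }

    // Reduce the [word, count] entries to the entry with the largest count.
    return Object.entries(counts).reduce(
        (best, entry) => (entry[1] > best[1] ? entry : best),
        ["", 0]
    )[0];
}

console.log(mostCommonWord("Bob hit a ball, the hit BALL flew far after it was hit.", ["hit"])); // "ball"

The reduce pass itself is linear in the number of distinct words; the O(N log N) bound quoted above corresponds to a variant that sorts the entries by count instead of reducing them.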