Use these hints if you're stuck. Try solving on your own first.
Select any node as the root.
Let <code>dp[x][c]</code> be the maximum sum we can get for the subtree rooted at node <code>x</code>, where <code>c</code> is a boolean representing whether the edge between node <code>x</code> and its parent (if any) is selected or not.
<code>dp[x][c] = max(sum(dp[y][cy]) + v(nums[x], sum(cy) + c))</code>, where the maximum is taken over all choices of <code>cy ∈ {0, 1}</code> for each child <code>y</code> of <code>x</code>. Here <code>v(nums[x], sum(cy) + c) = nums[x] XOR k</code> when <code>sum(cy) + c</code> is odd (node <code>x</code> touches an odd number of selected edges), and <code>v(nums[x], sum(cy) + c) = nums[x]</code> when it is even.
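A minimal sketch of this DP in Python (the function name <code>max_sum_tree_dp</code> and the edge-list input format are assumptions of this sketch). Trying every combination of <code>cy</code> directly would be exponential, so the children are folded in one at a time while tracking only the parity of the number of selected child edges:

```python
import sys

def max_sum_tree_dp(nums, k, edges):
    # Sketch of the dp[x][c] recurrence from the hint above.
    n = len(nums)
    adj = [[] for _ in range(n)]
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    sys.setrecursionlimit(max(1000, 2 * n))

    NEG = float('-inf')

    def dfs(x, parent):
        # Returns (dp[x][0], dp[x][1]): best sum for the subtree of x
        # when the edge to the parent is unselected / selected.
        best = [0, NEG]  # best[p]: best child total with parity p of selected child edges
        for y in adj[x]:
            if y == parent:
                continue
            d0, d1 = dfs(y, x)
            new = [NEG, NEG]
            for p in (0, 1):
                if best[p] == NEG:
                    continue
                new[p] = max(new[p], best[p] + d0)          # skip edge (x, y)
                new[1 - p] = max(new[1 - p], best[p] + d1)  # select edge (x, y)
            best = new
        res = []
        for c in (0, 1):
            cand = NEG
            for p in (0, 1):
                if best[p] == NEG:
                    continue
                # x is XORed iff it touches an odd number of selected edges
                val = nums[x] ^ k if (p + c) % 2 else nums[x]
                cand = max(cand, best[p] + val)
            res.append(cand)
        return res[0], res[1]

    return dfs(0, -1)[0]  # the root has no parent edge, so c = 0
```

For example, with <code>nums = [1, 2, 1]</code>, <code>k = 3</code>, and edges <code>[[0, 1], [0, 2]]</code>, selecting both edges XORs all three nodes an odd number of times... wait, node 0 touches two selected edges (even), while nodes 1 and 2 each touch one (odd), giving <code>1 + 1 + 2 = 4</code>; the DP instead finds <code>6</code> by selecting only edge <code>[0, 1]</code>, turning <code>nums</code> into <code>[2, 1, 1]</code> with XOR... the DP simply returns the best over all selections.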
There’s also an easier solution: does the parity of the number of elements where <code>nums[i] XOR k > nums[i]</code> help?
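One way this observation can be turned into code (a sketch; the function name <code>maximum_value_sum</code> is chosen here). It relies on the fact that selecting every edge on the path between two nodes XORs only the two endpoints, so on a tree any even-sized subset of nodes can be flipped:

```python
def maximum_value_sum(nums, k):
    # Take the better of x and x ^ k for every element, then fix parity:
    # edges flip nodes in pairs, so the number of XORed elements must be
    # even. If we chose an odd count, undo (or add) the cheapest flip.
    total = 0
    flipped = 0               # how many elements preferred the XORed value
    min_diff = float('inf')   # cheapest cost to change one element's choice
    for x in nums:
        y = x ^ k
        total += max(x, y)
        if y > x:
            flipped += 1
        min_diff = min(min_diff, abs(x - y))
    return total - min_diff if flipped % 2 else total
```

This runs in <code>O(n)</code> time and ignores the edge list entirely, since connectivity is all the argument needs.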