It seems like in most movies I've seen lately, white men are villainized, while Black women appear to be the protagonist (in many cases). I'm OK with it occurring occasionally, but darn. Do white men have a target on their backs? Will this get worse? It reminds me of 1930s Germany's attitude toward Jews.