Why are we the only ones never allowed to say anything about anything that's said or done to us ever? If we push back against the racism that's spewed at the Obamas, we're racist. If we voice our concern about the whitewashing that's going on in Hollywood, we're imagining things. If we call out the constant black-woman bashing that's become all the rage lately, we're being overly sensitive and insecure. And if we scream for justice for the unjustified killings of our black brothers and sisters, we're race hustlers.
So when are we ever supposed to speak up? When it's too late? When white people tell us that it's okay to do so?