Apparently there is a huge new controversy about a TV show debuting in a couple of months on Netflix, a show based on the 2014 indie film "Dear White People."
Now I saw the movie and am looking forward to the TV series (though I'm not so sure it will be as effective without Tessa Thompson in the lead: she had the perfect comic snarkiness for the role). Frankly, I loved the movie. It was very funny and, at the same time, an important lesson in values. It's not a "comfortable" movie; it's not supposed to be. It's about racism in America in what was alleged to be a "post-racial" era (2014, during Obama's Presidency). The main character, the ironically named Sam White, has a radio show on the college station during which she makes such pronouncements as "Dear white people, this just in: Dating a black person to piss off your parents is a form of racism." Her comments are generally humorous, but there is always an element of anger within the snark: why, at this point in history, do I still need to tell you people this stuff?
In the trailer for the TV show, Sam calls out frat partiers dressing in blackface, echoing the actual events that inspired the movie in the first place. In this Trumpian era, when the KKK and neo-Nazis have somehow been elevated into the mainstream, when we can already see civil rights being rolled back for immigrants and LGBT+ people by Presidential fiat, when our nation's public schools are now under the control of someone whose goal is to make them more Godly (sounds great for minority students who don't fit within the fundamentalist Christian mainstream, doesn't it?), when we've followed the most diverse cabinet ever with the least diverse one in several Presidencies, when the majority of white America truly does not understand why a movement like BLM is needed and thinks it is some kind of domestic terrorist thing...when it is 2017 and we are seemingly more divided into a "white America" and a "black America" than we've been in decades...my God, why would we NOT need a show like this?
Just look at the response from the right to the show's very existence: there is actually a movement to #boycottNetflix because of it! Because of a paid streaming network's television program, one that much of the country doesn't even have access to, and that those with access might never have known about, let alone watched. (Now, of course, with the controversy, the right has ironically guaranteed better ratings for the show.) All of which leads to the simple and obvious question: WHY? What causes such an emotional and outsized response? What is the right afraid of? If the show's depiction of white people is so incredibly wrong, it should sink under its own misinformed weight, shouldn't it? Or are they afraid that the show is actually revealing an ugly truth that they'd really rather not talk about?
You know what? We need to talk about this truth. We need to talk about why race relations in this country are so totally screwed up. We need to talk about why the election of a black President so freaked out the white establishment that they abandoned any pretense of governing and made limiting him to a single term their one and only agenda. Not only that, but a significant part of the country joined them in feeling this way. Why? For all the "post-racial" talk, we quickly learned that what we actually had was a country in which racism simmered just below the surface and had, perhaps for decades, been waiting for something to cause it to boil over. Well, it's boiling over, people. And if a satire like "Dear White People" can help us examine it, GOOD! Because something definitely needs to help us. We don't seem to be doing it on our own.