The Right Hates America
For my entire adult life I've been told that I hate America.
Conservatives have informed me that I hate my freedom (I like it just fine), I hate capitalism (guilty as charged), I hate the troops (it's the imperialism I dislike, really), and I hate the founders (well, OK, that's fair).
So maybe I do hate aspects of the country in which I was born. Perhaps that makes me the first person ever born into a nation deserving of critique, or maybe I'm just an old-fashioned hater infected by the insidious woke mind virus. But there was always a hint of projection in right-wingers telling me I hate the United States. As they huffed and puffed and demanded I love our country (whatever it means to love a country), they bemoaned the civilizational progress that had transformed the nation, making it a little less white, a little less straight, a little less Christian, and a little less crushingly conventional.
The folks with spittle flying from behind their gritted teeth hate it when they turn on the TV and a major corporation advertises its products with a gay couple, possibly with a kid in tow, or a person of indeterminate gender. They hate that pro football fields and basketball courts and baseball diamonds carry messaging saying racism is not, in fact, good. They hate that their child or grandchild or the local Starbucks barista rejects suffocating gender norms and maybe dyes their hair purple or green or blue. They hate that they can't use slurs in mixed company. And they hate that a clear majority of their fellow Americans rejected the Big Boy's bid for another term in the White House.