I have a stupid question. I never thought I’d sound like the whitest guy on earth, but here goes: what the f*** does country music have to do with racism? Hell, even the whole “South’s gonna rise again” cliché has barely made it into two country songs in the past thirty years. You hear more anti-white stuff in rap music being made every day, but nobody seems to care about that.