There's an article in the Washington Post claiming that the United States is now a center-left nation politically. I disagree. The U.S. has always been a relatively conservative nation, and I need no better evidence from the 2008 election than the Proposition 8 results in California. As a Republican, I happen to support gay marriage, but even California rejected it.
Many of the groups that vote Democratic, such as black and Hispanic voters, are socially conservative churchgoers. It's usually only the college student and college professor demographics that are the true leftists. America remains a moderate-to-center-right nation: pro-gun (good), anti-gay marriage (a bad thing), pro-choice (meh), and pro-America!
1 comment:
Much as it pains me to do so, I must agree with you. The USA has shifted a bit to the left over the last 10 years or so, granted, but by any objective standard it is still far more conservative than not. The problem, of course, is that what is defined as "left" here would be described as "center-right" anywhere else in the world (i.e., Obama: the attempts to paint him as some radical socialist left me howling with laughter), while the "right" in the US is a political philosophy with which Mussolini and Franco would not be too uncomfortable.