minorsaint
Highest Rated Comments
minorsaint · 13 karma
To clarify, I didn't mean to deny that the 60's and 70's saw a leftward drift in economic policy, because for the United States viewed in a vacuum, it was a period of unprecedented liberalism and government expansion that had its roots in the New Deal. My point about it being perception is that, in the rest of the world, what qualified as American "liberalism" is still pretty centrist, so someone who lived their life in, say, Sweden would look at that period and not see it as being left of anything. But you're right to point out that, in terms of the American experience, the Great Society was more liberal than anything we've seen since.
minorsaint · 13 karma
What do you guys think of the new One World Trade Center (or "Freedom Tower," as it's colloquially known in some circles)? I would imagine seeing a new building, now the tallest in the United States, going up on that site could produce a range of emotions. Thanks for the AMA.
minorsaint · 3 karma
I have to be honest, I don't know how the rest of Europe breaks down, but I do know that the UK had nationalized various industries (including health care) and had a top tax rate of 95%, putting it considerably farther to the left than the U.S. has ever really ventured. I know many of France's social policies date back to the post-war period as well, but I just don't know about the rest of Europe.
And you're absolutely right. "Liberalism" in the economic sense refers to classical economics as often as not, although the American brand of fiscal conservatism often eschews many of the basic tenets of classical liberal capitalism, so the demarcation of definitions can get a bit hairy.
minorsaint · 106 karma
A lot of people claimed the 1960's and 70's saw too much government expansion and a top-heavy bureaucracy. Reagan's victory over Carter in 1980 is still seen (at least in the mainstream narrative) as a conservative reassertion after government had continually grown since the New Deal.
The truth is far more nuanced than that, but that's the perception. Realistically, the U.S. has always been a more conservative nation relative to its allies, so you kinda have to judge America against itself instead of against the world on that point. By the world's standards, we are and have always been conservative.