What I mean is, most American intellectuals I've met are either lefties with basically no pride or patriotism, or right-wingers who love their country in a way that's racist toward everyone else. I hate generalizing, but I just wanted to hear other points of view. I think the U.S.A. is a beautiful country with a beautiful history, and the new generations should be taught to love it in a way that isn't Nazi-like or "white trash."
My English isn't great, so sorry if anything sounds off; please don't flame me.