I'm conflicted about my feelings. I have been to the States several times, and know quite a few Americans, and all of them are good, good people. But as a country, I'm having trouble.
(Note: outside view and all that)
You seem to think everyone who visits wants to stay (I'm very happy here in the Netherlands, thank you, but my job requires me to come over often because Americans don't seem to like to travel much). There's mandatory fingerprinting and a mugshot on each visit, extensive questioning, and the occasional random denial of entry. Even if you're just transferring through an airport in the US.
You seem to be overly proud of your country, yet do not appear to take very good care of the people in it.
Your political system is batshit insane and rotten to the core. You have politicians who can spout the greatest of lies and get away with it. Your government also passes laws favouring certain small groups while disadvantaging most of the rest, and then forces extensions of those laws on other countries because you can't compete on a level field any more.
I also don't understand the obsession with guns, and I don't understand your weird interpretation of capitalism (granted, that one is going around in a lot of places now).
As I said, I'm conflicted; the above probably sounds way more negative than my overall feeling, and again, the people I've met over there are all great, and very level-headed. Also, there's certainly a crazy amount of insanity and ignorance in my own country (and the ones around us) too. But you were asking about yours.
Oh, and hamburgers from diners are awesome.