The USA has a hell of a lot more to offer people than just money.
True, I lived there and have seen it. The burden every American carries is that the rest of the world sees the USA only one way, and unfortunately it's an impression Americans themselves have promoted. What America offers the world is not the fact that you're rich; it's the social, political, and economic freedom to become what you want to be through personal achievement. You've taken in the poor and dispossessed from throughout the world and taken their shackles off. Great things have been accomplished, and that has made America wealthy.

I'm a big fan of America, but my admiration is for the country it once was, not for the country it is becoming since the banksters took over. It is not the corporations that have ruined America; a corporation is merely one way of structuring a business. What has ruined it is the fluid movement of top-level people between government and the financial industry. The regulation of the financial industry is weak because the Treasury Department and the executives of Goldman Sachs are one and the same people.