I was just wondering about the opinions of people outside the United States: do you think the US has a good or bad image in the global eye, so to speak?
Before the US there were no fatty cheeseburgers, clinical depression, copyright protection, or Dr. Phil. Of course, I'd probably be speaking Japanese right now, but what the hell, it's a cool language and I love BBQ pork ramen.
The US has a horrible outside image. I'm not sure why this needs a survey... it's pretty much common knowledge that the US is seen as a cesspit of ignorance and overindulgence/greed.
Try going to Peru in a week, dyeing your hair orange, and walking around that beach where Sloot was. Enough said? He he he.
Bad. I live here, and I think it's probably viewed badly because we never really have peacetime and are in an insane amount of debt, among other things.
Overall, I would definitely say bad. But I have always been friends with foreign exchange kids and the one thing they all say is that they always thought Americans would be more pretentious. So while our image is absolute shit, I think it's because of our political system and society at large. The individual American (in most cases) is a decent person.
I think Americans are seen as extremely friendly and polite as individuals, but inward-focused (not too much knowledge of matters outside the 50 US states). They are definitely seen as a force of good (throughout the 20th century), but there is a fear in more 'socialised' democracies of becoming 'Americanised' - i.e. less social welfare, more corporate/religious influence in government, a huge gap between the 'haves' and 'have nots'. Then there are valley girls and Republicans. Haha. BAD IMAGE!!!
America is the center of the world. That is why everyone hates/is jealous. But it's ok. We are quickly consuming ourselves into the third world.
Most of the people I know, when they think of America, have nothing good to say about it... but of course there are many Americans who feel the same way... they are acutely aware that it is NOT the land of the free and the brave after all... unfortunately, as far as I have observed, they are in the minority...
Ok, first of all, "America" includes TWO WHOLE continents: North America and South America. With Canada, the United States, and Mexico, and then Peru, Chile, Brazil, Argentina, etc. etc. I am a United Statesman.
The Americans I know are fairly normal. I only know them online, admittedly, but from what I can sense about them they could live just around the corner from me. There isn't much difference between them and us (the English)... a mixture of good, bad, and ugly (inside and out).

I do feel sorry for Americans; they have an unfair reputation, really. They are damned if they do and damned if they don't. It doesn't help that they used to project an arrogant, superior, the-world-revolves-around-them mentality through their media (films/TV etc.)... I do feel that image is being altered slowly. When I watch US films now I do sense a global collective responsibility when there is a crisis (usually the world coming to an end or an alien invasion)... the rest of the world is included in the solution rather than a cigar-chewing Yank saving the world. That might sound a little stupid, but I do think the image that the media gives out is a mirror of the attitudes of the people.

It is wrong for people who have easy access to information to continue to have a poor image of America... it is lazy not to find out what the country is actually like. Where there is a lack of information and a high degree of governmental propaganda, I can understand the poor reputation continuing to be poor. It serves another agenda.
This is the image that America portrays, and I think it still fits to some extent: https://www.youtube.com/watch?v=GuP9YClyPRY (YouTube - The Best of George Costanza - The Fire)
You can definitely tell I am from there. I said all that stuff about USA'ians and then I sit there and call it America. I'm such a hypocrite. But hey, what would the proper name be for someone from the States? United Statesian? United Statesman? There's really not a word for it that I am aware of.
They are generally called American. You're the only one I know who is so anal about such things that you dragged up "North America, South America. With Canada, United States, Mexico, and then Peru, Chile, Brazil, Argentina, etc. etc."
Is that bad? I mean, if you said "North Americans are..." you could be talking about Canada, the USA, and Mexico. All three of these countries are extremely diverse. As you get further south (I know I'm gonna get in trouble for this), the level of diversity decreases. South America is a whole group of countries with their own cultures that intermingle with each other more than North Americans' do. I think I need a history lesson here to figure out why.
It probably would have helped your cause if you had not said "America" only... in your second post. You could be talking about other countries, but I think most people think of North America when "America" is mentioned. I don't know where you used to live or live now... but North America is a mongrel nation. Everybody has mixed blood. It is a melting pot, after all.
Another continent has its own cultures outside of American culture (which refers to people from the United States of America)? I know, I'm shocked... How dare they be different... And you question if and why Americans have a bad rep? HAHAHA
The countries in South America have their own separate cultures, whereas Canada and the United States are all grouped into "white people". Canadians say "eh" and USAians carry guns.