Greetings,
Let me start by saying I probably won't articulate this correctly, but I'll do my best.
Here in the US, we have a government (and, by that government's decision, a military) that is seemingly involved in everything.
We are allied with S. Korea and Japan against N. Korean threats toward all three of these nations.
We are allied militarily with both India and Pakistan, though more so with India, and tend to side with them re: Kashmir.
We are allied with Taiwan, support them in their disputes with China, and sell them weapons.
We are allied with Israel against... most of the world, it seems.
We are allied with Saudi Arabia and Jordan against Iranian plans for influence.
We are allied with Poland over the downing of their government plane by Russia.
We are allied with Colombia and Mexico against drug lords and gangs.
Yet in most of these countries, the opinion of America and Americans seems quite low, and in some cases amounts to outright hatred.
My belief is that many want the US to "mind its own business" and stay out: stop intervening, stop making things worse, stop building military bases in other countries, stop threatening sanctions. But at the same time, THEIR governments seek out our influence, our military, our covert intelligence, and our financial aid.
What I'm interested to know from our international members is: what is your real view of American involvement in these matters, and do you think the world would be better off if the US became less involved in world affairs?
Suppose we didn't take the lead on Iran and let them overrun Saudi Arabia, didn't take on N. Korea and let them overrun S. Korea, didn't confront China and let them overrun Taiwan, let Russia overrun Ukraine and Georgia, and so on.
Do you not think that would happen?
If you think it would happen, do you care?
Would you want your nation to step up and intervene?
I'm genuinely curious whether America is perceived as keeping balance and fairness in the world, or whether its involvement is believed to do the opposite and make situations worse.