Mcface said:
zehydra said:
I'd say it was wrong because war in general is wrong. But hey, relatively speaking it was probably much less bad than if we had attempted a land invasion. Even the Japanese civilians were willing to fight to the death.
War is not wrong, to say so is ignorant.
Regardless of what country you live in, it became a country through WAR.
There is no more slavery in America because of WAR.
America is an independent country, because of WAR.
Without WAR, we would all be speaking German and saluting a Nazi flag.
War is wrong, and I fail to see how saying so is ignorant. (Ignorance is not knowing information.)
While it is true that slavery is gone in America because of it, war wasn't the only way that could have happened.
America could have become an independent nation without specifically declaring war on Great Britain.
Really, my take on war is that it is not necessarily wrong for a nation to defend itself if directly attacked, but war itself is nothing more than legal murder.
(Btw, with Nazi Germany, the reason they were a threat in the first place was because of WAR.) But even without our involvement, Nazi Germany would have fallen. Long before our intervention there were dissident groups all over the Third Reich and several assassination attempts against the Führer.