Why Does America Think It Won WW2

Several European and other nations contributed to winning World War II. Yet Americans have often claimed that they won the war for Europe and saved it from Nazi rule. This belief rests largely on the supplies America sent to Britain during the war. Many people point out that the arms and ammunition were not supplied for free; America extracted a heavy price from Britain for the favor. In fact, Britain was economically ruined by the end of the war, having drained its industrial and financial resources to pay for the supplies it received from America.

Thus, one could hardly call that kind of exchange help. By contrast, countries like Russia played a decisive part in fighting the war and destroying the German army; the Soviet contribution to defeating German forces was far greater than America's. However, America was a resourceful nation and emerged from the war stronger, both in image and in influence, which may be one reason Americans believe they won World War II single-handedly. Beyond that, it is often said that Americans enjoy singing their own praises, influenced as they are by secondary sources such as Hollywood movies.

Many people who did not witness the war learn about it through films that portray the American contribution as the decisive one, and they accept that portrayal as reality without looking into the actual history. Some Americans also believe that Britain could not have survived without America's industrial power. They focus solely on Britain's fight against Germany and overlook the other nations that supported Britain in the war. As a result, they conclude that America's might surpassed that of any other nation at the time, and that America therefore emerged as the winner of the war.