General History/Hitler


I've always been a fan of history, especially the little-known details usually not shared in school books. My impression of WW2 is that Hitler was obsessed with regaining lost German lands. Just how accurate is that? Did he have any interest in the United States? If he had won the war, would he really have taken over the USA, or is this just Hollywood propaganda we've taken for granted?

Hi William,
Hitler was a German nationalist, meaning that he believed the Germans were destined to rule the world. That's where the term 'master race' comes from. This belief in German superiority was probably stronger in the 1800s than in the 1900s, but remember, Hitler was born in the 1800s and was educated in the years before and after 1900, so his education was in the tradition of the 1800s German nationalists. Hitler saw territorial occupation as a tangible manifestation of German greatness. In other words, territorial expansion was a way for Hitler to prove Germany's greatness.

His specific territorial goals were:
1. Reoccupation of the lands Germany had lost in WWI, such as the Polish Corridor, Alsace-Lorraine, and Germany's overseas colonies in Africa, China, and the Pacific. (Most of the Pacific islands the US had to invade between 1942 and 1945 had been German before Japan captured them in WWI. That's where names like the Bismarck Archipelago come from.)

2. Hitler also spoke abstractly of 'living space' in the east. He believed Germany was too small for its population and wanted to displace the Poles and western Russians so that Germans could occupy those lands.

3. He also wanted to unify all of the German peoples under one state. This was why he annexed Austria, the Sudetenland, Schleswig-Holstein (on the border between Germany and Denmark), Alsace-Lorraine, and other regions inhabited by Germans.

Most of the talk about Hitler occupying the US was probably American propaganda. Hitler rarely, if ever, spoke of occupying the US or even most of Europe. Even when he had all of France for the taking, he did not initially occupy southern France, because he eventually saw Germany withdrawing from France (except Alsace-Lorraine). He occupied only northern and western France because he needed bases to continue the war against the British.

Like many Germans, Hitler bought into the 'stab in the back' theory. This was a conspiracy theory that circulated in Germany after WWI, seeking to explain how Germany could have lost the war when no enemy armies had yet entered German territory. According to the conspiracy theory, Germany had been betrayed by the Jews or the Communists, or possibly both. So Hitler definitely wanted to re-fight WWI, only this time it would end differently, or so he hoped. This is probably one explanation for why Hitler made one of his biggest mistakes - declaring war on the US after Pearl Harbor. Hitler wanted to re-fight WWI, and Germany and the US had been enemies in that war.

Hope this answers your question,


C.M. Aaron


My interests are pretty diverse: military history, technology, and social trends for most historical eras; general U.S. history; European history, especially Roman and early medieval history but also the 18th, 19th and 20th centuries; European colonialism in Africa; and a little bit of Asian history, especially as it relates to European and American imperialism. I'm also pretty good with presidential history.


I've read hundreds of history books on various subjects. I've also been writing historical fiction for about twenty years. Story development drives my research. I am also a tour guide at a local museum.

Bachelor's degree in history and geography.
