Description
Nazi Germany has fallen. After the Allied forces defeated Nazi Germany in World War II, Europe became a dangerous place for anyone associated with the Nazi regime, and officers, party members, and supporters of Hitler began to flee Germany.
2023-01-31
$0.1M
47 min