When Did Germany Start Being Called The Fatherland?
"Fatherland" means land of one's fathers or forefathers. The term caught on during the nationalistic fervor of the late 19th and early 20th centuries. The Germans were not the only ones using it, but the Nazis adopted it in reference to the state.