Why Is The West Called The West?
The concept of "the West" was born in Europe, originating in the Greco-Roman civilizations of antiquity. The term "West" comes from the Latin word occidens, meaning "sunset" or "west", as opposed to oriens, meaning "rising" or "east".