Westcentrism

Noun

Meaning

Westcentrism (uncountable)

The practice of viewing the world from a Western perspective, with an implied belief, either consciously or subconsciously, in the preeminence of Western culture.

Source: en.wiktionary.org
