Westcentrism

From Wiktionary, the free dictionary

English

Alternative forms

Etymology

From West + -centrism.

Noun

Westcentrism (uncountable)

  1. The practice of viewing the world from a Western perspective, with an implied belief, whether conscious or subconscious, in the preeminence of Western culture.

Meronyms

Translations
