American Century

From Wiktionary, the free dictionary

English

Proper noun

the American Century

  1. The period since the middle of the 20th century, seen as dominated largely by the United States in political, economic, and cultural terms.