Tay's law

From Wiktionary, the free dictionary

English

Alternative forms

Etymology

Named after Microsoft's interactive chatbot "Tay", which was soon found to adopt racist viewpoints and was shut down.[1] Analogies to the original scandal were later drawn for subsequent projects.[2]

Proper noun

Tay's law

  1. (slang, alt-right) The view that any sufficiently complex artificial intelligence will inevitably become racist.
    • 2022 June 22, CaveOpp's BitChute, BitChute[1]:
      Tay's Law: Any sufficiently advanced AI will inevitably become a white supremacist.
    • 2024 January 25, Liram Koblentz-Stenzler and Uri Klempner, Global Network on Extremism & Technology[2]:
      Torba explains that when AI is unrestricted, it “starts talking about taboo truths no one wants to hear.” This statement reflects an underlying and pervasive conspiratorial opinion among the far right that AI models and their outputs are being controlled and restricted by an evil agenda. This connects to what we observed as a foundational cornerstone of this mindset, in what far-right users have coined ‘Tay’s Law’.

Translations

See also

References
