It has finally come out who invented modern AI, and how!
8 Google Employees Invented Modern AI. Here’s the Inside Story
Eight names are listed as authors on “Attention Is All You Need,” a scientific paper written in the spring of 2017.
https://www.wired.com/story/eight-google-employees-invented-modern-ai-transformers-paper/
Authored by eight scientists, it was responsible for expanding the 2014 attention mechanisms proposed by Bahdanau et al. into a new deep learning architecture known as the transformer. The paper is considered the founding document for modern artificial intelligence, as transformers became the main architecture of large language models.
https://en.wikipedia.org/wiki/Attention_Is_All_You_Need
Machine learning-based attention is a mechanism which intuitively mimics cognitive attention. It calculates "soft" weights for each word, more precisely for its embedding, in the context window. These weights can be computed either in parallel (such as in transformers) or sequentially (such as in recurrent neural networks). "Soft" weights can change during each runtime, in contrast to "hard" weights, which are (pre-)trained and fine-tuned and remain frozen afterwards.
https://en.wikipedia.org/wiki/Attention_(machine_learning)
Although a Wikipedia article is not a fully reliable source, since it can be edited at any time, the general direction is clear. Meanwhile, the following was patented eight years before the paper was written:
obtaining respective weights of the context phrases using parameters related to frequency of occurrence of a context phrase relative to other context phrases or to absolute number
US8504580B2
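A toy illustration of the quoted claim language: weighting each context phrase by its frequency of occurrence relative to the other context phrases. This is my own hedged reading of that one sentence, not the patent's actual algorithm:

```python
from collections import Counter

def context_phrase_weights(phrases):
    """Assign each distinct context phrase a weight proportional to its
    frequency of occurrence relative to the other context phrases
    (an illustrative sketch, not the method claimed in US8504580B2)."""
    counts = Counter(phrases)            # absolute number of occurrences
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}  # relative frequency

w = context_phrase_weights(["new york", "big apple", "new york", "nyc"])
print(w)  # {'new york': 0.5, 'big apple': 0.25, 'nyc': 0.25}
```

Note the contrast with the transformer: these weights are fixed statistics of the corpus, whereas transformer attention weights are recomputed for every input at runtime.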