Select one statement about attention and dot products that is correct.
Dot products quantify (i.e. measure) similarity between word embeddings, which is then useful for computing attention between words.
Dot products don't have anything to do with attention in Transformers, and Marek simply got carried away when he was writing his slides for this module, and as a result he presented a lot of irrelevant linear algebra stuff in his lectures.
Python can only approximate dot products, so other measures of similarity are preferable, and dot products should be avoided when Transformers are designed and implemented in Python.
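The first option describes the standard mechanism: dot products score the similarity between embeddings, and a softmax turns those scores into attention weights. A minimal sketch with NumPy follows; the embedding values and word labels are made up for illustration.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for three words (invented values):
# similar words get similar vectors, unrelated words get dissimilar ones.
embeddings = np.array([
    [1.0, 0.0, 1.0, 0.0],   # "cat"
    [0.9, 0.1, 0.8, 0.1],   # "kitten" -- close to "cat"
    [0.0, 1.0, 0.0, 1.0],   # "economy" -- unrelated
])

# Dot products of the first word with every word: larger score = more similar.
scores = embeddings @ embeddings[0]

# Softmax converts similarity scores into attention weights that sum to 1.
weights = np.exp(scores) / np.exp(scores).sum()

print(scores)   # "cat"/"kitten" score exceeds "cat"/"economy" score
print(weights)  # most attention mass falls on the similar words
```

Note that NumPy computes dot products in ordinary floating point, the same arithmetic every numerical library uses, which is why the third option's claim that Python "can only approximate dot products" is not a reason to avoid them.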