NLP - Sequence Models for NLP
Find the bug in this code snippet, which is meant to compute a dot-product attention output:
import numpy as np
Q = np.array([1, 0])             # query vector
K = np.array([[1, 0], [0, 1]])   # keys, one per row
V = np.array([[5, 5], [10, 10]]) # values, one per row
scores = np.dot(Q, K.T)          # attention scores: query against each key
weights = np.exp(scores)         # softmax numerator
weights /= np.sum(weights)       # softmax normalization
output = np.dot(weights, V.T)    # intended: weighted sum of the value vectors
print(output)
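Answer: the bug is in the final line. The attention output is the attention-weighted sum of the rows of V, so it should be np.dot(weights, V), not np.dot(weights, V.T); transposing V mixes components across value vectors instead of blending whole vectors. A corrected sketch (which also adds the conventional 1/sqrt(d_k) scaling and a max-subtraction for softmax stability, both absent from the original snippet) might look like:

```python
import numpy as np

Q = np.array([1.0, 0.0])             # query vector
K = np.array([[1.0, 0.0], [0.0, 1.0]])  # keys, one per row
V = np.array([[5.0, 5.0], [10.0, 10.0]])  # values, one per row

d_k = K.shape[1]
scores = np.dot(Q, K.T) / np.sqrt(d_k)   # scale scores by sqrt(d_k)
weights = np.exp(scores - np.max(scores))  # subtract max for numerical stability
weights /= np.sum(weights)               # softmax over the keys
output = np.dot(weights, V)              # fix: V, not V.T -- weighted sum of value rows
print(output)  # roughly [6.65, 6.65]
```

With the transpose left in, the snippet prints [5., 10.], i.e. each value vector's components averaged with themselves, which silently looks plausible for symmetric V but is wrong in general.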
