In a transformer model, what does an attention score represent?
a. The probability of each output token given the input sequence.
b. The importance of each input token to every other token in the sequence.
c. The accuracy of the model's predictions.
d. The total number of trainable parameters in the model.
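For context, attention scores in scaled dot-product attention come from softmax(QKᵀ/√d_k): each row of the resulting matrix weights how much one token attends to every other token. The sketch below is a minimal NumPy illustration (not tied to any particular framework); the function name and toy shapes are illustrative assumptions.

```python
import numpy as np

def attention_scores(Q, K):
    """Scaled dot-product attention scores.

    Row i of the returned matrix indicates how much token i attends to
    each other token, i.e. how important each token is to token i.
    """
    d_k = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d_k)  # similarity of every query to every key
    # Softmax over the key dimension so each row sums to 1
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return exp / exp.sum(axis=-1, keepdims=True)

# Toy example: 3 tokens with 4-dimensional query/key vectors
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
print(attention_scores(Q, K))  # 3x3 matrix of token-to-token weights
```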