In natural language processing models like BERT, what do the "attention mask" and "pad token id" primarily contribute to?
A) Sentence segmentation
B) Named entity recognition
C) Masked language modeling
D) Sequence classification
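
For context on what the two inputs do, here is a minimal sketch of how a pad token id and an attention mask are built when batching variable-length sequences. It uses plain Python (no tokenizer library); the token id values are made up for illustration, and pad id 0 matches BERT's default `[PAD]` id but is hard-coded here as an assumption.

```python
# Batch of token-id sequences of unequal length (ids are hypothetical).
batch = [[101, 7592, 102], [101, 7592, 2088, 999, 102]]

PAD_TOKEN_ID = 0  # BERT's default [PAD] id, assumed here

max_len = max(len(seq) for seq in batch)

input_ids = []
attention_mask = []
for seq in batch:
    pad_len = max_len - len(seq)
    # The pad token id fills the tail so every row has the same length...
    input_ids.append(seq + [PAD_TOKEN_ID] * pad_len)
    # ...and the mask marks real tokens (1) vs. padding (0), so the
    # model's attention can ignore the padded positions.
    attention_mask.append([1] * len(seq) + [0] * pad_len)

print(input_ids)       # rectangular batch, padded with PAD_TOKEN_ID
print(attention_mask)  # 1 where a real token is, 0 where padding is
```

Both lists are rectangular and aligned: wherever `input_ids` holds the pad id, `attention_mask` holds 0.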