Complete the code to import the Transformer model from the Hugging Face library.
from transformers import [1]
The correct answer is AutoModel, the generic base class that loads the appropriate pretrained transformer architecture in Hugging Face.
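A minimal completed version (a sketch assuming the transformers package is installed) might look like:

```python
# AutoModel resolves to the right architecture class based on the
# checkpoint name later passed to from_pretrained.
from transformers import AutoModel
```

AutoTokenizer is typically imported alongside it, since the two are loaded from the same checkpoint name.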
Complete the code to tokenize input text using a pretrained tokenizer.
tokenizer = AutoTokenizer.from_pretrained('[1]')
bert-base-uncased is a common pretrained checkpoint name used for tokenization.
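One possible completion, sketched below; it downloads the bert-base-uncased vocabulary on first use, and the example sentence is an assumption:

```python
from transformers import AutoTokenizer

# Loads the vocabulary and tokenization rules for bert-base-uncased.
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

# Calling the tokenizer returns input_ids and attention_mask
# as PyTorch tensors when return_tensors="pt" is passed.
encoded = tokenizer("Hello, world!", return_tensors="pt")
```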
Fix the error in the code to correctly create attention masks for input tokens.
attention_mask = (input_ids != [1]).long()
BERT's padding token id is 0, so comparing against 0 assigns mask value 1 to real tokens (attended to) and 0 to padding tokens (ignored).
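With PyTorch tensors, the filled-in line (assuming the padding token id is 0, as in BERT; the sample ids are made up) behaves like this:

```python
import torch

# A batch of one sequence, right-padded with the id 0.
input_ids = torch.tensor([[101, 2054, 102, 0, 0]])

# Positions holding real tokens get 1; padding positions get 0.
attention_mask = (input_ids != 0).long()
print(attention_mask)  # tensor([[1, 1, 1, 0, 0]])
```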
Fill both blanks to create a dictionary of token ids and attention masks for model input.
inputs = {'input_ids': [1], 'attention_mask': [2]}
The model expects the keys 'input_ids' and 'attention_mask' with the corresponding tensors as values.
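Filled in with tensors built as in the earlier items (the sample ids are assumptions), a sketch might be:

```python
import torch

input_ids = torch.tensor([[101, 2054, 102, 0, 0]])
attention_mask = (input_ids != 0).long()

# Both blanks are the tensors above, keyed by the names the model expects.
inputs = {'input_ids': input_ids, 'attention_mask': attention_mask}
```

This dictionary can then be unpacked into the model call with `model(**inputs)`.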
Fill all three blanks to run the transformer model and get the output logits.
outputs = model([1], [2])
logits = outputs.[3]
The model is called with input_ids and attention_mask, and the logits are accessed from outputs.
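Putting the pieces together, one hedged sketch is below. Note that the plain AutoModel base class returns hidden states rather than logits, so this sketch uses AutoModelForSequenceClassification, a task-specific head that does expose outputs.logits; the checkpoint and example text are assumptions, and the checkpoint is downloaded on first use:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
# A classification head is needed for outputs to carry a .logits field.
model = AutoModelForSequenceClassification.from_pretrained('bert-base-uncased')

inputs = tokenizer("Hello, world!", return_tensors="pt")

# Blanks [1] and [2]: input_ids and attention_mask; blank [3]: logits.
with torch.no_grad():
    outputs = model(inputs['input_ids'], attention_mask=inputs['attention_mask'])
logits = outputs.logits
```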