NLP / ML · ~10 mins

Why transformers revolutionized NLP - Test Your Understanding

Practice: 5 tasks
Answer the questions below.
Question 1 - fill in the blank (easy)

Complete the code to import the Transformer model from the Hugging Face library.

from transformers import [1]
A. transformers
B. TransformerModelClass
C. AutoModel
D. TransformerModel
Common mistakes:
- Using the lowercase, plural 'transformers', which is the package name, not a class.
- Using a nonexistent class name such as 'TransformerModel'.
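To see why only one option works: the pip package and module are the lowercase 'transformers', while AutoModel is a class defined inside it. A minimal sketch, assuming the Hugging Face transformers library is installed:

```python
# Correct: AutoModel is a class inside the lowercase 'transformers' package.
from transformers import AutoModel

# Both common mistakes above fail with ImportError:
#   from transformers import transformers       (that's the package name, not a class)
#   from transformers import TransformerModel   (no such class exists)
print(AutoModel.__name__)
```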
Question 2 - fill in the blank (medium)

Complete the code to tokenize input text using a pretrained tokenizer.

tokenizer = AutoTokenizer.from_pretrained('[1]')
A. transformer-base
B. bert-base-uncased
C. nlp-transformer
D. bert-large-cased
Common mistakes:
- Using a nonexistent model name such as 'transformer-base' or 'nlp-transformer'.
- Using 'bert-large-cased', which is a real but different variant (larger and case-sensitive).
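The answer hinges on which checkpoint id actually exists on the Hugging Face Hub. A sketch, assuming transformers is installed; the loading call is left commented out because it downloads model files on first use:

```python
from transformers import AutoTokenizer

# 'bert-base-uncased' is a real Hub checkpoint; 'transformer-base' and
# 'nlp-transformer' do not exist, and 'bert-large-cased' is a different
# (larger, case-sensitive) variant.
checkpoint = "bert-base-uncased"
# tokenizer = AutoTokenizer.from_pretrained(checkpoint)  # downloads on first call
print(checkpoint)
```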
Question 3 - fill in the blank (hard)

Fix the error in the code to correctly create attention masks for input tokens.

attention_mask = (input_ids != [1]).long()
A. 0
B. 1
C. None
D. -1
Common mistakes:
- Using 1 instead of 0, which inverts the mask: padding would be attended to and real tokens ignored.
- Using None, which errors, or -1, which yields an all-ones mask because -1 never appears as a token id.
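Why 0 is the right blank: BERT-style tokenizers reserve token id 0 for padding, so the mask should be 1 wherever the id is nonzero. A plain-Python sketch of the elementwise comparison (the middle token ids below are illustrative):

```python
# [CLS]=101 and [SEP]=102 are the usual BERT special tokens; the two
# trailing zeros are padding positions.
input_ids = [101, 2023, 2003, 102, 0, 0]

# Equivalent of (input_ids != 0).long() on a tensor: 1 = attend, 0 = ignore.
attention_mask = [1 if tok != 0 else 0 for tok in input_ids]
print(attention_mask)  # [1, 1, 1, 1, 0, 0]
```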
Question 4 - fill in the blank (hard)

Fill both blanks to create a dictionary of token ids and attention masks for model input.

inputs = {'input_ids': [1], 'attention_mask': [2]}
A. input_ids
B. attention_mask
C. token_ids
D. mask
Common mistakes:
- Using incorrect variable names such as 'token_ids' or 'mask'.
- Swapping the values between the two keys.
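The dictionary keys are fixed by the model's expected argument names, and each value is the matching variable from the earlier steps. A minimal sketch with plain lists standing in for tensors:

```python
# Plain lists stand in for the tensors produced by the tokenizer.
input_ids = [101, 2023, 102]
attention_mask = [1, 1, 1]

# Keys must be the exact argument names; values are the matching variables.
inputs = {"input_ids": input_ids, "attention_mask": attention_mask}
print(inputs)
```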
Question 5 - fill in the blank (hard)

Fill all three blanks to run the transformer model and get the output logits.

outputs = model([1], [2])
logits = outputs.[3]
A. input_ids
B. attention_mask
C. logits
D. hidden_states
Common mistakes:
- Passing the wrong arguments or omitting attention_mask.
- Accessing 'hidden_states' instead of 'logits' when you need class predictions.
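To illustrate the calling convention without downloading weights, here is a toy stand-in for a sequence-classification model (the stub and its scores are invented for illustration). What matters is the pattern: both tensors are passed to the model, and predictions are read from the output's logits attribute, not hidden_states:

```python
from types import SimpleNamespace

def model(input_ids, attention_mask):
    # Toy stand-in: a real transformers model returns an output object
    # whose .logits holds per-class scores for each input sequence.
    assert len(input_ids) == len(attention_mask)
    logits = [[0.1, 0.9] for _ in input_ids]  # made-up scores for one batch
    return SimpleNamespace(logits=logits)

input_ids = [[101, 2023, 102]]
attention_mask = [[1, 1, 1]]

outputs = model(input_ids, attention_mask)
logits = outputs.logits  # per-class scores used for predictions
print(logits)  # [[0.1, 0.9]]
```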