NLP/ML · ~20 mins

Hugging Face Transformers library in NLP - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
What is the output shape of the model's last hidden state?
Given the following code using Hugging Face Transformers, what is the shape of the last hidden state tensor?
from transformers import BertModel, BertTokenizer
import torch

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

inputs = tokenizer('Hello', return_tensors='pt')
outputs = model(**inputs)
last_hidden_state = outputs.last_hidden_state

print(last_hidden_state.shape)
A. torch.Size([3, 768])
B. torch.Size([1, 3, 768])
C. torch.Size([1, 768])
D. torch.Size([768, 3])
💡 Hint
Remember that BERT outputs batch size, sequence length, and hidden size dimensions.
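The hint can be made concrete without downloading BERT. Below is a minimal pure-Python sketch of the dimension ordering; the helper name and the example token count of 5 are illustrative assumptions, not taken from the question:

```python
# Pure-Python sketch (no transformers needed): a BERT-style encoder's
# last_hidden_state is indexed [batch_size, sequence_length, hidden_size];
# hidden_size is 768 for bert-base models.
def last_hidden_state_shape(batch_size, sequence_length, hidden_size=768):
    """Shape tuple of a BERT-style last_hidden_state tensor."""
    return (batch_size, sequence_length, hidden_size)

# One sentence forms a batch of one; the tokenizer also adds special tokens
# ([CLS] and [SEP]) around the wordpieces, which count toward the length.
print(last_hidden_state_shape(batch_size=1, sequence_length=5))
```

When predicting shapes, remember that the sequence length includes those added special tokens, so it exceeds the visible word count.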
Model Choice (intermediate)
Which model is best suited for sequence classification tasks?
You want to classify movie reviews as positive or negative using Hugging Face Transformers. Which model class should you use?
A. BertModel
B. BertTokenizer
C. BertForSequenceClassification
D. BertForMaskedLM
💡 Hint
Look for the model class designed specifically for classification tasks.
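To see why a dedicated classification class exists at all, here is a toy sketch (plain Python with hypothetical names, not real transformers code) of the kind of head such classes add on top of the base encoder: a linear map from the hidden size down to the number of labels:

```python
# Toy classification head: logits = pooled_output @ W + b.
# hidden_size is 4 and num_labels is 2 here purely for illustration.
def classification_head(pooled_output, weights, bias):
    """Map one pooled encoder output to one logit per label."""
    num_labels = len(bias)
    return [
        sum(h * weights[i][j] for j, h in enumerate(pooled_output)) + bias[i]
        for i in range(num_labels)
    ]

pooled = [0.5, -1.0, 0.25, 2.0]      # stand-in for the encoder's pooled output
W = [[1, 0, 0, 0], [0, 1, 0, 0]]     # num_labels x hidden_size weight matrix
b = [0.0, 0.1]
logits = classification_head(pooled, W, b)
print(logits)  # two scores, one per class (e.g. negative / positive)
```

The base encoder alone produces hidden states, not class scores; a task-specific head like this is what turns them into per-label logits.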
Hyperparameter (advanced)
Which hyperparameter controls the learning rate in Hugging Face Trainer?
When fine-tuning a transformer model using the Hugging Face Trainer API, which argument sets the learning rate?
A. learning_rate
B. lr_scheduler_type
C. num_train_epochs
D. batch_size
💡 Hint
This hyperparameter directly affects how fast the model updates weights.
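The hint can be illustrated with a single toy gradient-descent step (plain Python, no Trainer involved); the learning rate scales how far each update moves a weight:

```python
# Toy SGD update: the learning rate multiplies the gradient, so it directly
# controls the step size of every weight update.
def sgd_step(weight, gradient, learning_rate):
    return weight - learning_rate * gradient

w = 1.0
print(sgd_step(w, gradient=0.5, learning_rate=5e-5))  # tiny step, near 1.0
print(sgd_step(w, gradient=0.5, learning_rate=0.1))   # much larger step
```

In the Trainer API, the scalar set via TrainingArguments plays exactly this role, while the scheduler option only shapes how it changes over training.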
Metrics (advanced)
Which metric is most appropriate for evaluating a multi-class text classification model?
You trained a transformer model to classify news articles into 5 categories. Which metric should you use to evaluate its performance?
A. Perplexity
B. BLEU score
C. Mean Squared Error
D. Accuracy
💡 Hint
Choose a metric that measures correct label predictions.
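A metric that "measures correct label predictions" is just the fraction of exact label matches, which a few lines of plain Python make precise (the function name and example labels are illustrative):

```python
def accuracy(predictions, labels):
    """Fraction of predicted class labels that match the gold labels."""
    correct = sum(p == g for p, g in zip(predictions, labels))
    return correct / len(labels)

# 5-category example: four of five articles classified correctly.
print(accuracy([0, 2, 1, 4, 3], [0, 2, 2, 4, 3]))  # -> 0.8
```

For imbalanced classes you would often report precision, recall, or F1 alongside it, but a plain fraction-correct metric is the natural default for balanced multi-class problems.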
🔧 Debug (expert)
What error does this code raise when loading a tokenizer?
Consider this code snippet:
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('nonexistent-model')
A. OSError: Model 'nonexistent-model' not found
B. ValueError: Invalid tokenizer name
C. TypeError: from_pretrained() missing required positional argument
D. RuntimeError: Tokenizer loading failed due to corrupted files
💡 Hint
Check what happens when you try to load a model that does not exist on the Hugging Face Hub.