
Hugging Face Transformers library in NLP - Interactive Code Practice

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (easy)

Complete the code to load a pre-trained BERT tokenizer from Hugging Face.

from transformers import [1]
tokenizer = [1].from_pretrained('bert-base-uncased')
A. BertTokenizer
B. BertModel
C. AutoModel
D. AutoTokenizer
Common Mistakes
Using BertModel instead of BertTokenizer will cause errors because BertModel is for the model, not the tokenizer.
Using AutoModel instead of a tokenizer class.
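A minimal sketch of the completed snippet, assuming the transformers library is installed (the checkpoint files are downloaded on first run). Note that either tokenizer option would load this checkpoint; AutoTokenizer simply infers the concrete class from the checkpoint name.

```python
from transformers import AutoTokenizer

# AutoTokenizer inspects the checkpoint and picks the matching tokenizer class;
# BertTokenizer.from_pretrained('bert-base-uncased') would also work here.
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

# The uncased tokenizer lowercases input and splits off punctuation.
print(tokenizer.tokenize("Hello, how are you?"))
```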
Task 2: Fill in the blank (medium)

Complete the code to tokenize a sentence using the tokenizer.

sentence = "Hello, how are you?"
tokens = tokenizer.[1](sentence, return_tensors='pt')
A. encode
B. encode_plus
C. tokenize
D. batch_encode_plus
Common Mistakes
Using tokenize returns only tokens as strings, not tensors.
Using encode returns token ids but not attention masks.
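A sketch of the completed snippet, assuming transformers and PyTorch are installed. encode_plus is the option that returns both token ids and the attention mask as tensors; tokenize and encode fall short in the ways the notes above describe.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
sentence = "Hello, how are you?"

# encode_plus returns a dict-like object with input_ids, token_type_ids,
# and attention_mask; return_tensors='pt' makes each value a PyTorch tensor.
tokens = tokenizer.encode_plus(sentence, return_tensors='pt')

# By contrast:
#   tokenizer.tokenize(sentence)  -> list of subword strings only
#   tokenizer.encode(sentence)    -> token ids only, no attention mask
print(tokens.keys())
```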
Task 3: Fill in the blank (hard)

Fix the error in the code to load a pre-trained BERT model for sequence classification.

from transformers import BertForSequenceClassification
model = BertForSequenceClassification.from_pretrained([1])
A. bert-base
B. bert-base-cased
C. bert-base-uncased-cased
D. bert-base-uncased
Common Mistakes
Using 'bert-base-uncased-cased' is invalid because it mixes casing.
Using 'bert-base' is incomplete and will cause an error.
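A sketch of the corrected line, assuming transformers is installed and the bert-base-uncased weights can be downloaded. Note that bert-base-cased is also a real checkpoint; the task's prompt (completing the tokenizer example above, which uses the uncased variant) points at bert-base-uncased.

```python
from transformers import BertForSequenceClassification

# 'bert-base-uncased' is a valid Hub checkpoint name.
# 'bert-base' alone does not exist, and 'bert-base-uncased-cased'
# mixes the two casing variants into a name that resolves to nothing.
# The classification head is freshly initialized, so transformers will
# warn that it should be fine-tuned before use.
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
print(model.config.num_labels)  # defaults to 2 labels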
Task 4: Fill in the blank (hard)

Fill both blanks to prepare inputs and get model outputs.

inputs = tokenizer([1], return_tensors=[2])
outputs = model(**inputs)
A. "This is a test sentence."
B. pt
C. tf
D. "Another example text."
Common Mistakes
Passing a list instead of a string for a single sentence.
Using 'tf' when the model expects PyTorch tensors.
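A sketch of the completed snippet end to end, assuming transformers and PyTorch are installed and continuing with the bert-base-uncased checkpoint used above. Either example sentence fits blank [1]; the key point is passing a plain string and requesting 'pt' (PyTorch) tensors for a PyTorch model.

```python
import torch
from transformers import AutoTokenizer, BertForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')

# A single sentence goes in as a plain string; 'pt' requests PyTorch tensors
# ('tf' would produce TensorFlow tensors, which this model cannot consume).
inputs = tokenizer("This is a test sentence.", return_tensors='pt')

# **inputs unpacks input_ids, token_type_ids, and attention_mask as
# keyword arguments; no_grad skips gradient tracking for inference.
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.logits.shape)  # (batch_size, num_labels)
```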
Task 5: Fill in the blank (hard)

Fill all three blanks to extract predicted class from model logits.

import torch
logits = outputs.logits
predicted_class = torch.[1](logits, dim=[2]).[3]()
A. argmax
B. 1
C. item
D. max
Common Mistakes
Using max instead of argmax returns the max value, not the index.
Using dim=0 instead of dim=1 for batch outputs.
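The completed line can be sketched with a stand-in logits tensor (a hypothetical value, not output from a real model), assuming PyTorch is installed:

```python
import torch

# Stand-in logits for a batch of one example with three classes.
logits = torch.tensor([[0.1, 2.5, -1.0]])

# argmax over dim=1 (the class dimension) returns the index of the
# highest-scoring class per batch item; .item() converts the resulting
# one-element tensor to a plain Python int.
predicted_class = torch.argmax(logits, dim=1).item()
print(predicted_class)  # 1

# torch.max(logits, dim=1) would return (values, indices) instead,
# and dim=0 would reduce over the batch dimension, not the classes.
```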