NLP · ~10 mins

Why production NLP needs engineering - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
1. Fill in the blank (easy)

Complete the code to load a pre-trained NLP model using Hugging Face Transformers.

from transformers import AutoModelForSequenceClassification
model = AutoModelForSequenceClassification.from_pretrained([1])
A. transformers
B. bert-base-uncased
C. load_model
D. "bert-base-uncased"
Common Mistakes
Forgetting to put quotes around the model name.
Using a variable name instead of a string.
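For reference, a complete, runnable version of the snippet above (a sketch: "bert-base-uncased" is the quiz's example checkpoint, and from_pretrained downloads the weights on first use):

```python
from transformers import AutoModelForSequenceClassification

# The checkpoint name must be a quoted string, not a bare identifier.
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
print(model.config.model_type)
```
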
2. Fill in the blank (medium)

Complete the code to tokenize input text for the NLP model.

from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer([1], return_tensors="pt")
A. "Hello, how are you?"
B. input_text
C. text
D. ['Hello', 'world']
Common Mistakes
Passing a variable name without defining it.
Passing a list instead of a string.
3. Fill in the blank (hard)

Complete the code to get model predictions from the tokenized inputs.

outputs = model([1])
predictions = outputs.logits.argmax(dim=1)
A. inputs['input_ids']
B. inputs.input_ids
C. inputs['tokens']
D. inputs
Common Mistakes
Passing the whole inputs dictionary instead of input_ids.
Using a wrong key like 'tokens'.
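The tokenizer returns a dict-like BatchEncoding, so the token ids live under the key 'input_ids'. A simplified plain-Python stand-in (the id values below are made up for illustration, not real BERT ids):

```python
# Simplified stand-in for a tokenizer's output (values assumed).
inputs = {"input_ids": [[101, 7592, 102]], "attention_mask": [[1, 1, 1]]}

ids = inputs["input_ids"]  # correct key
# inputs["tokens"] would raise KeyError: that key does not exist
assert "tokens" not in inputs
```
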
4. Fill in the blank (hard)

Fill both blanks to create a function that preprocesses text and returns model predictions.

def predict(text):
    inputs = tokenizer([1], return_tensors=[2])
    outputs = model(inputs['input_ids'])
    return outputs.logits.argmax(dim=1).item()
A. text
B. "pt"
Common Mistakes
Passing the wrong variable or forgetting quotes around 'pt'.
Using inconsistent variable names.
5. Fill in the blank (hard)

Fill all three blanks to add batch processing and return a list of predictions.

def batch_predict(texts):
    inputs = tokenizer([1], padding=True, truncation=True, return_tensors=[2])
    outputs = model(inputs['input_ids'])
    preds = outputs.logits.argmax(dim=[3])
    return preds.tolist()
A. texts
B. "pt"
C. 1
D. 0
Common Mistakes
Using dim=0, which takes the argmax over the batch axis instead of the class axis.
Passing a single text instead of a list.
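Why dim=1 rather than dim=0: the logits have shape (batch, num_classes), so the class axis is axis 1. A plain-Python sketch with made-up logit values:

```python
# Hypothetical logits for 3 texts and 2 classes (shape: batch x classes).
logits = [[0.2, 0.8], [0.9, 0.1], [0.3, 0.7]]

# argmax along dim=1 (the class axis): one predicted label per text.
preds = [row.index(max(row)) for row in logits]
print(preds)  # -> [1, 0, 1]
```

Taking the argmax along dim=0 instead would compare across texts, returning one index per class rather than one prediction per text.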