
Custom NER training basics in NLP - Practice Problems & Coding Challenges

Challenge - 5 Problems
Predict Output (intermediate)
Output of training loop snippet for custom NER
What will be the printed output after running this training loop snippet for 3 iterations?
import spacy
from spacy.training import Example

nlp = spacy.blank('en')
ner = nlp.add_pipe('ner')
ner.add_label('ANIMAL')

optimizer = nlp.initialize()  # spaCy v3; begin_training() is the deprecated alias

# Entity offsets are end-exclusive character positions and must align
# exactly with the entity text: 'dog' spans 9-12, 'cat' spans 11-14.
TRAIN_DATA = [
    ('I have a dog', {'entities': [(9, 12, 'ANIMAL')]}),
    ('She owns a cat', {'entities': [(11, 14, 'ANIMAL')]})
]

for i in range(3):
    losses = {}
    for text, annotations in TRAIN_DATA:
        doc = nlp.make_doc(text)
        example = Example.from_dict(doc, annotations)
        nlp.update([example], sgd=optimizer, losses=losses)
    print(f'Iteration {i+1}, Losses: {losses}')
A
Iteration 1, Losses: {'ner': 0.5}
Iteration 2, Losses: {'ner': 0.3}
Iteration 3, Losses: {'ner': 0.1}
B
Iteration 1, Losses: {'ner': 0.0}
Iteration 2, Losses: {'ner': 0.0}
Iteration 3, Losses: {'ner': 0.0}
C
Iteration 1, Losses: {'ner': 0.5}
Iteration 2, Losses: {'ner': 0.0}
Iteration 3, Losses: {'ner': 0.0}
D
Iteration 1, Losses: {'ner': 0.0}
Iteration 2, Losses: {'ner': 0.3}
Iteration 3, Losses: {'ner': 0.1}
💡 Hint
Losses usually decrease as training progresses but start from a positive value.
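The `(start, end, label)` entity spans in the snippet are end-exclusive character offsets, which are easy to get wrong by hand. As a quick sanity check, a small helper can compute them from the text itself (a minimal sketch; `entity_span` is a hypothetical name, not a spaCy API):

```python
def entity_span(text, entity, label):
    """Return a (start, end, label) tuple for the first occurrence of entity.

    Offsets are end-exclusive character positions, matching what spaCy
    expects in the 'entities' annotation.
    """
    start = text.find(entity)
    if start == -1:
        raise ValueError(f'{entity!r} not found in {text!r}')
    return (start, start + len(entity), label)

print(entity_span('I have a dog', 'dog', 'ANIMAL'))    # (9, 12, 'ANIMAL')
print(entity_span('She owns a cat', 'cat', 'ANIMAL'))  # (11, 14, 'ANIMAL')
```

Misaligned offsets are a common silent failure: spaCy drops spans that do not line up with token boundaries, and the model then trains on no entities at all.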
Model Choice (intermediate)
Choosing the right model architecture for custom NER
Which model architecture is best suited for training a custom Named Entity Recognition (NER) system from scratch?
A. A convolutional neural network (CNN) designed for image classification
B. A transformer-based model like BERT fine-tuned for token classification
C. A recurrent neural network (RNN) with LSTM layers for sequence labeling
D. A simple feedforward neural network without sequence context
💡 Hint
NER requires understanding context around each word in a sentence.
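"Token classification" means the model predicts one label per token, typically in a BIO scheme (B-egin, I-nside, O-utside). As a hedged illustration of how entity spans become per-token targets, here is a minimal sketch over whitespace tokens (`bio_tags` is a hypothetical helper, not a library function):

```python
def bio_tags(tokens, entities):
    """Convert token-index entity spans (end-exclusive) to BIO labels."""
    tags = ['O'] * len(tokens)
    for start, end, label in entities:
        tags[start] = f'B-{label}'          # first token of the entity
        for i in range(start + 1, end):
            tags[i] = f'I-{label}'          # continuation tokens
    return tags

tokens = ['She', 'owns', 'a', 'cat']
print(bio_tags(tokens, [(3, 4, 'ANIMAL')]))  # ['O', 'O', 'O', 'B-ANIMAL']
```

A transformer fine-tuned for token classification predicts exactly these per-token labels, with each prediction conditioned on the full sentence context.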
Hyperparameter (advanced)
Effect of batch size on custom NER training
What is the most likely effect of increasing the batch size during training of a custom NER model?
A. Training becomes slower but model generalizes better
B. Training becomes faster and model always achieves higher accuracy
C. Training speed and model performance remain unchanged
D. Training becomes faster but may lead to less stable updates
💡 Hint
Think about how many examples the model sees before updating weights.
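The hint can be made concrete with a little arithmetic: batch size determines how many optimizer steps one pass over the data produces, so larger batches mean fewer (and individually noisier-averaged but less frequent) weight updates. A minimal sketch, assuming a hypothetical dataset size of 1000 examples:

```python
def num_updates(n_examples, batch_size):
    """Number of optimizer steps in one pass over the data (ceiling division)."""
    return -(-n_examples // batch_size)

# Small batches: many updates per epoch; large batches: few updates per epoch.
print(num_updates(1000, 8))    # 125
print(num_updates(1000, 128))  # 8
```

Fewer updates per epoch is why large batches speed up wall-clock training, while the reduced update frequency (and changed gradient noise) can make convergence less stable without learning-rate adjustments.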
Metrics (advanced)
Choosing the right metric for custom NER evaluation
Which metric best evaluates the performance of a custom NER model on a test set?
A. Mean squared error between predicted and true entity labels
B. Accuracy of token classification ignoring entity boundaries
C. Precision, Recall, and F1-score based on exact entity matches
D. Confusion matrix of sentence-level classification
💡 Hint
NER evaluation requires matching whole entities, not just tokens.
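Exact-match entity evaluation treats a prediction as correct only if start, end, and label all agree with the gold annotation. A minimal sketch of that computation (hypothetical `entity_prf` helper; libraries like seqeval or spaCy's `Scorer` do this for you in practice):

```python
def entity_prf(gold, pred):
    """Precision/recall/F1 over exact (start, end, label) entity matches."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)                       # exact-match true positives
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = [(9, 12, 'ANIMAL'), (0, 3, 'PERSON')]
pred = [(9, 12, 'ANIMAL'), (4, 8, 'ORG')]
print(entity_prf(gold, pred))  # (0.5, 0.5, 0.5)
```

Note how a prediction with the right label but a slightly wrong boundary counts as both a false positive and a false negative, which is exactly why token-level accuracy (option B) overstates NER quality.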
🔧 Debug (expert)
Identifying cause of poor entity recognition in custom NER
After training a custom NER model, it fails to recognize any entities in new sentences. Which is the most likely cause?
A. The training data had no entity annotations or was empty
B. The model was trained with too many epochs causing overfitting
C. The optimizer was set to None during training
D. The model was trained on a different language than the test sentences
💡 Hint
If the model never saw entities during training, it cannot learn to recognize them.