NLP · ~20 mins

Multi-class text classification in NLP - Practice Problems & Coding Challenges

Challenge - 5 Problems
🎖️
Multi-class Text Classification Master
Get all challenges correct to earn this badge!
Model Choice (intermediate)
Choosing the right model for multi-class text classification

You want to classify news articles into 5 categories based on their text content. Which model is most suitable for this multi-class text classification task?

A. A k-means clustering algorithm
B. A linear regression model
C. A convolutional neural network (CNN) designed for text classification
D. A simple decision tree for regression
💡 Hint

Think about models that handle text data and can output multiple classes.
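The hint can be made concrete with a minimal sketch: turn text into bag-of-words counts, then pass them through a linear layer plus softmax so the model emits one probability per class. The vocabulary, documents, and weights below are hypothetical placeholders (the weights are random, not trained):

```python
import numpy as np

# Hypothetical toy vocabulary and documents for illustration
vocab = {"stocks": 0, "game": 1, "election": 2}
docs = [["stocks", "stocks", "game"], ["election", "game"]]

# Bag-of-words features: one count per vocabulary word
X = np.zeros((len(docs), len(vocab)))
for i, doc in enumerate(docs):
    for tok in doc:
        X[i, vocab[tok]] += 1

# Linear layer + softmax: one probability per class (5 classes here)
n_classes = 5
rng = np.random.default_rng(0)
W = rng.normal(size=(len(vocab), n_classes))
logits = X @ W
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
print(probs.shape)        # (2, 5): one probability row per document
print(probs.sum(axis=1))  # each row sums to 1
```

A real text CNN replaces the bag-of-words step with word embeddings and convolutions over the token sequence, but the output layer works the same way: a softmax over all classes.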

Metrics (intermediate)
Evaluating multi-class classification with accuracy

You trained a multi-class text classifier with 4 classes. After testing, you got the following predictions and true labels:

Predictions: [2, 0, 1, 3, 1, 0]
True labels: [2, 0, 0, 3, 1, 1]

What is the accuracy of the model on this test set?

A. 0.6667
B. 0.5
C. 0.8333
D. 0.3333
💡 Hint

Accuracy = (number of correct predictions) / (total predictions).
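The formula in the hint is a one-liner in numpy. To avoid spoiling the question, the sketch below uses different, hypothetical predictions and labels:

```python
import numpy as np

# Hypothetical predictions and true labels (not the question's data)
preds = np.array([1, 0, 2, 1])
labels = np.array([1, 1, 2, 0])

# Accuracy = (number of correct predictions) / (total predictions)
accuracy = np.mean(preds == labels)
print(accuracy)  # 2 correct out of 4 -> 0.5
```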

Predict Output (advanced)
Output of softmax activation in multi-class classification

Given the logits output from a model for 3 classes: [2.0, 1.0, 0.1], what is the softmax output vector?

Python:
import numpy as np
logits = np.array([2.0, 1.0, 0.1])
exp_logits = np.exp(logits)
softmax = exp_logits / np.sum(exp_logits)
print(np.round(softmax, 3))
A. [0.500, 0.300, 0.200]
B. [0.659, 0.242, 0.099]
C. [0.700, 0.200, 0.100]
D. [0.333, 0.333, 0.333]
💡 Hint

Softmax converts logits to probabilities that sum to 1.

Hyperparameter (advanced)
Choosing batch size for training a multi-class text classifier

You are training a multi-class text classification model on a large dataset. Which batch size choice is likely to improve training stability and speed without using too much memory?

A. Batch size of 1 (stochastic gradient descent)
B. Batch size of 1024 (very large batch)
C. Batch size equal to the entire dataset
D. Batch size of 32 (moderate size)
💡 Hint

Consider trade-offs between memory, speed, and gradient stability.
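To make the trade-off concrete, here is a minimal sketch of how a batch size slices one epoch of training data into mini-batches. The data is random placeholder input and no model is trained; only the batching mechanics are shown:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, batch_size = 1000, 32  # a moderate batch size
X = rng.normal(size=(n_samples, 10))  # placeholder features

indices = rng.permutation(n_samples)  # shuffle once per epoch
n_batches = 0
for start in range(0, n_samples, batch_size):
    batch = X[indices[start:start + batch_size]]
    n_batches += 1  # each batch holds at most 32 rows in memory
print(n_batches)  # ceil(1000 / 32) = 32 batches per epoch
```

Only `batch_size` rows need to be in memory per gradient step; batch size 1 gives very noisy gradients, while full-dataset batches cost the most memory per step.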

🔧 Debug (expert)
Identifying the cause of poor multi-class classification accuracy

You trained a multi-class text classifier with 6 classes. The training accuracy is 95%, but test accuracy is only 40%. Which issue is most likely causing this problem?

A. The model is overfitting the training data
B. The test data has fewer classes than training data
C. The model is underfitting the training data
D. The loss function used is mean squared error
💡 Hint

Think about why training accuracy is high but test accuracy is low.
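The symptom in the question can be reproduced with a deliberately memorizing model: a 1-nearest-neighbour "classifier" fit on random features with random labels (all hypothetical data) scores perfectly on its own training set but near chance on held-out data, because there is no real pattern to generalize:

```python
import numpy as np

rng = np.random.default_rng(42)
# Random features with random labels: nothing generalizable to learn
X_train = rng.normal(size=(100, 5))
y_train = rng.integers(0, 6, size=100)  # 6 classes, as in the question
X_test = rng.normal(size=(50, 5))
y_test = rng.integers(0, 6, size=50)

def predict_1nn(X, X_train, y_train):
    # 1-nearest-neighbour: memorizes the training set exactly
    dists = ((X[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    return y_train[dists.argmin(axis=1)]

train_acc = np.mean(predict_1nn(X_train, X_train, y_train) == y_train)
test_acc = np.mean(predict_1nn(X_test, X_train, y_train) == y_test)
print(train_acc)  # 1.0 — each training point is its own nearest neighbour
print(test_acc)   # near chance (~1/6): a large train/test gap
```

A large gap between training and test accuracy is the classic signature of this failure mode.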