NLP · ML · ~10 mins

Why sequence models understand word order in NLP - Test Your Understanding

Practice - 5 Tasks
Answer the questions below
Task 1: Fill in the blank (Easy)

Complete the code to create a sequence model layer that processes input words in order.

from tensorflow.keras.layers import [1]
sequence_layer = [1](units=32)
Drag options to blanks, or click a blank then click an option.
A. Conv2D
B. LSTM
C. Dense
D. Flatten
Common Mistakes
Choosing Dense or Conv2D, which do not process inputs in sequence order.
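As a quick illustration of the idea behind this task (a plain Python sketch, no TensorFlow required): an order-insensitive view of a sentence, like the bag of words a Dense layer effectively sees, cannot distinguish sentences that differ only in word order, which is exactly what a sequence layer such as LSTM addresses by reading tokens one at a time.

```python
# Two sentences with the same words in a different order.
a = "the dog bit the man".split()
b = "the man bit the dog".split()

# Order-insensitive view: identical multisets of tokens.
print(sorted(a) == sorted(b))  # True

# Order-sensitive view: different sequences.
print(a == b)  # False
```

A model that only sees which words occur (and how often) treats these two sentences as identical; a sequence model keeps them apart.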
Task 2: Fill in the blank (Medium)

Complete the code to add positional encoding to input embeddings to help the model understand word order.

import tensorflow as tf
import numpy as np

def positional_encoding(seq_len, d_model):
    pos = np.arange(seq_len)[:, np.newaxis]
    i = np.arange(d_model)[np.newaxis, :]
    angle_rates = 1 / np.power(10000, (2 * (i//2)) / np.float32(d_model))
    angle_rads = pos * angle_rates
    sines = np.sin(angle_rads[:, 0::2])
    cosines = np.cos(angle_rads[:, 1::2])
    pos_encoding = np.concatenate([sines, cosines], axis=-1)
    return tf.cast(pos_encoding, dtype=tf.float32)

seq_len = 50
d_model = 128
pos_encoding = positional_encoding(seq_len, d_model)

input_embeddings = tf.random.uniform((1, seq_len, d_model))
output = input_embeddings + [1]
A. tf.random.normal(input_embeddings.shape)
B. tf.zeros_like(input_embeddings)
C. pos_encoding
D. tf.ones_like(input_embeddings)
Common Mistakes
Adding zeros or random noise instead of positional encoding.
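For reference, the same sinusoidal scheme can be checked with NumPy alone (a minimal sketch; the `tf.cast` step from the exercise is omitted). The point is that every position gets a distinct vector, so adding the encoding to the embeddings injects order information:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # Same sinusoidal scheme as in the exercise, NumPy only.
    pos = np.arange(seq_len)[:, np.newaxis]          # (seq_len, 1)
    i = np.arange(d_model)[np.newaxis, :]            # (1, d_model)
    angle_rates = 1 / np.power(10000, (2 * (i // 2)) / np.float32(d_model))
    angle_rads = pos * angle_rates                   # (seq_len, d_model)
    sines = np.sin(angle_rads[:, 0::2])              # even columns
    cosines = np.cos(angle_rads[:, 1::2])            # odd columns
    return np.concatenate([sines, cosines], axis=-1)

pe = positional_encoding(50, 128)
print(pe.shape)                    # (50, 128)
print(np.allclose(pe[0], pe[1]))   # False: each position is distinct
```

Because the encoding has shape `(seq_len, d_model)`, it broadcasts cleanly against a batch of embeddings shaped `(batch, seq_len, d_model)` when added.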
Task 3: Fill in the blank (Hard)

Fix the error in the code that tries to create a Transformer model input layer for sequences.

from tensorflow.keras.layers import Input

sequence_input = Input(shape=([1],), dtype='int32')
A. 100
B. 1
C. 32
D. None
Common Mistakes
Using a fixed sequence length, which restricts the model to inputs of that exact length.
Task 4: Fill in the blank (Hard)

Fill both blanks to create a dictionary comprehension that maps words to their lengths only if length is greater than 3.

words = ['apple', 'cat', 'banana', 'dog']
lengths = {word: [1] for word in words if [2]}
A. len(word)
B. len(word) > 3
C. word.startswith('a')
D. word == 'banana'
Common Mistakes
Using the wrong condition or value expression.
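Filled in with options A and B (assuming those are the intended answers), the comprehension evaluates as:

```python
words = ['apple', 'cat', 'banana', 'dog']

# Value is the word's length; the filter keeps words longer than 3 chars.
lengths = {word: len(word) for word in words if len(word) > 3}
print(lengths)  # {'apple': 5, 'banana': 6}
```

'cat' and 'dog' are dropped because their length is exactly 3, which fails the strict `> 3` test.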
Task 5: Fill in the blank (Hard)

Fill all three blanks to create a dictionary comprehension that maps uppercase words to their lengths if length is less than 6.

words = ['apple', 'cat', 'banana', 'dog']
result = { [1]: [2] for word in words if [3] }
A. word.upper()
B. len(word)
C. len(word) < 6
D. word.lower()
Common Mistakes
Mixing up keys and values, or using the wrong condition.
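With options A, B, and C in place (assuming those are the intended answers), the comprehension keeps every word shorter than six characters and maps its uppercase form to its length:

```python
words = ['apple', 'cat', 'banana', 'dog']

# Key is the uppercased word, value is its length; 'banana' (6 chars) is filtered out.
result = {word.upper(): len(word) for word in words if len(word) < 6}
print(result)  # {'APPLE': 5, 'CAT': 3, 'DOG': 3}
```

Note the filter is strict: `len('banana')` is exactly 6, so it fails `< 6`.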