
Why sequence models understand word order in NLP

Introduction

Sequence models learn from the order of words, not just the words themselves, which helps them understand meaning. This lets them make sense of sentences much as we do when reading.

Word order matters whenever meaning depends on how words are arranged, for example:

When translating a sentence from one language to another
When predicting the next word in a sentence while typing
When analyzing the sentiment of a review based on word order
When recognizing speech where word order changes meaning
When summarizing a paragraph by understanding the flow of ideas
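A tiny sketch makes the point concrete: a bag-of-words representation (word counts only) cannot tell apart two sentences that use the same words in a different order, while a sequence representation can.

```python
from collections import Counter

# Two sentences with opposite meanings but identical words
s1 = "dog bites man".split()
s2 = "man bites dog".split()

# A bag-of-words view only counts words, so both sentences look identical
print(Counter(s1) == Counter(s2))  # True: order (and meaning) is lost

# A sequence view keeps the order, so the two stay distinguishable
print(s1 == s2)  # False
```

This is exactly the information sequence models are designed to keep.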
Syntax
Python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

model = Sequential()
model.add(Embedding(input_dim=vocab_size, output_dim=embedding_dim))
model.add(LSTM(units=hidden_units))
model.add(Dense(units=output_classes, activation='softmax'))

The LSTM layer processes words in order, remembering previous words.
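To see how "remembering previous words" works, here is a minimal NumPy sketch of a plain recurrent step (simpler than an LSTM, with made-up random weights for illustration): a hidden state h is updated word by word, so the final state depends on the order the words arrive in.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4
W_x = rng.normal(size=(dim, dim)) * 0.1  # input weights (illustrative values)
W_h = rng.normal(size=(dim, dim)) * 0.1  # recurrent weights (illustrative values)

def step(h, x):
    # One recurrent update: mix the new word x with the running summary h
    return np.tanh(x @ W_x + h @ W_h)

sentence = rng.normal(size=(3, dim))  # 3 word vectors

h = np.zeros(dim)
for x in sentence:          # read words left to right
    h = step(h, x)

h_rev = np.zeros(dim)
for x in sentence[::-1]:    # read the same words in reverse
    h_rev = step(h_rev, x)

# The two final states differ: the recurrence is sensitive to word order
print(np.allclose(h, h_rev))
```

An LSTM adds gates on top of this basic recurrence so the model can remember over longer distances, but the order-sensitive update is the same idea.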

Embedding converts word indices into dense vectors that capture their meaning.
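Under the hood, an embedding layer is essentially a lookup table with one learned row per word in the vocabulary. A NumPy sketch (with random values standing in for learned weights, and the same sizes as the snippet above):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embedding_dim = 10, 8

# One row per word index; in a real model these rows are learned
embedding_matrix = rng.normal(size=(vocab_size, embedding_dim))

sentence = np.array([1, 2, 3, 4])      # 4 word indices
vectors = embedding_matrix[sentence]   # look up one vector per word
print(vectors.shape)                   # (4, 8)
```

The LSTM then consumes these vectors one at a time, in sentence order.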

Examples
This model reads sentences word by word to classify them into 5 categories.
Python
model = Sequential()
model.add(Embedding(10000, 64))
model.add(LSTM(128))
model.add(Dense(5, activation='softmax'))
Using GRU instead of LSTM for a simpler sequence model to detect positive or negative sentiment.
Python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GRU, Dense

model = Sequential()
model.add(Embedding(5000, 32))
model.add(GRU(64))
model.add(Dense(1, activation='sigmoid'))  # one sigmoid unit for binary sentiment
Sample Model

This simple example shows how an LSTM model learns word order from small sentences to classify them into two groups. The prediction shows probabilities for each class and the chosen class.

Python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Sample data: 3 sentences, each with 4 words (word indices)
x_train = np.array([[1, 2, 3, 4], [4, 3, 2, 1], [1, 3, 2, 4]])
y_train = np.array([0, 1, 0])  # Two classes

vocab_size = 10
embedding_dim = 8
hidden_units = 16
output_classes = 2

model = Sequential()
model.add(Embedding(input_dim=vocab_size, output_dim=embedding_dim))
model.add(LSTM(units=hidden_units))
model.add(Dense(units=output_classes, activation='softmax'))

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

history = model.fit(x_train, y_train, epochs=5, verbose=0)

# Predict on a new sentence
x_test = np.array([[1, 2, 3, 4]])
prediction = model.predict(x_test)

print(f"Predicted probabilities: {prediction}")
print(f"Predicted class: {np.argmax(prediction)}")
print(f"Training accuracy after 5 epochs: {history.history['accuracy'][-1]:.2f}")
Important Notes

Sequence models like LSTM and GRU keep track of word order by remembering previous words.

Embedding layers turn word indices into dense vectors that capture meaning; the recurrent layer on top is what captures order.

Without sequence models, word order is lost and meaning can be misunderstood.

Summary

Sequence models understand word order by processing words one after another.

This helps models grasp sentence meaning and context better.

LSTM and GRU are common layers used to capture word order in text.