Complete the code to create a simple neural network layer using TensorFlow.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation=[1])
])
The ReLU activation function helps neural networks learn complex patterns by introducing non-linearity.
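To see why ReLU introduces non-linearity, here is a minimal sketch in plain Python (the `relu` helper and sample values are illustrative, not part of the exercise; Keras applies the same rule elementwise):

```python
def relu(x):
    # ReLU passes positive inputs through unchanged and clips
    # negative inputs to 0 -- this kink is the non-linearity.
    return max(0.0, x)

print(relu(2.5))   # positive input passes through: 2.5
print(relu(-1.0))  # negative input is clipped: 0.0
```

Because the function is not a straight line through the origin, stacking Dense layers with ReLU lets the network model curves and interactions that a purely linear stack cannot.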
Complete the code to compile the model with an appropriate loss function for classification.
model.compile(optimizer='adam', loss=[1], metrics=['accuracy'])
Categorical crossentropy is the right loss function for multi-class classification problems.
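For intuition, categorical crossentropy can be computed by hand for one sample. This is an illustrative sketch (the `categorical_crossentropy` helper and the sample label/probability lists are made up for the example), not the Keras implementation:

```python
import math

def categorical_crossentropy(y_true, y_pred):
    # Sum of -t * log(p) over classes; with a one-hot label,
    # only the true class contributes to the loss.
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred))

y_true = [0, 1, 0]        # one-hot label: the true class is class 1
y_pred = [0.1, 0.8, 0.1]  # model's predicted probabilities
print(categorical_crossentropy(y_true, y_pred))  # -log(0.8)
```

The loss shrinks toward 0 as the probability assigned to the true class approaches 1, which is exactly the behavior you want when training a multi-class classifier.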
Fix the error in the code to correctly predict classes from model output probabilities.
predictions = model.predict(test_data)
predicted_classes = tf.argmax(predictions, axis=[1])
Axis 1 selects the class dimension to find the index of the highest probability.
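To make the axis-1 behavior concrete, here is the same argmax-per-row operation written in plain Python on a small made-up probability matrix (the `probs` data is illustrative; `tf.argmax` does this across the whole tensor at once):

```python
# Each row is one sample; each column is one class probability.
probs = [
    [0.1, 0.7, 0.2],  # sample 0: class 1 has the highest probability
    [0.6, 0.3, 0.1],  # sample 1: class 0 has the highest probability
]

# Along axis 1 (the class dimension), take the index of the max value.
predicted = [max(range(len(row)), key=row.__getitem__) for row in probs]
print(predicted)  # [1, 0]
```

Taking the argmax over axis 0 instead would compare samples within each class column, which is not what class prediction needs.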
Fill both blanks to create a dictionary comprehension that maps words to their lengths only if length is greater than 3.
word_lengths = {word: [1] for word in words if [2]}
The dictionary comprehension maps each word to its length only if the length is greater than 3.
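For reference, here is what the completed pattern looks like on sample data (the `words` list is made up for illustration):

```python
words = ["apple", "cat", "banana", "dog"]

# Map each word to its length, keeping only words longer than 3 characters.
word_lengths = {word: len(word) for word in words if len(word) > 3}
print(word_lengths)  # {'apple': 5, 'banana': 6}
```

Note that the `if` clause filters which entries are created; words of length 3 or less never appear in the resulting dictionary.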
Fill all three blanks to create a dictionary comprehension that maps uppercase words to their counts only if count is greater than 1.
word_counts = {[1]: [2] for word, count in counts.items() if [3]}
This comprehension creates a dictionary with uppercase words as keys and their counts as values, filtering counts greater than 1.
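A worked version of the same pattern on sample data (the `counts` dictionary is made up for illustration):

```python
counts = {"hello": 3, "hi": 1, "world": 2}

# Uppercase each word as the key, keeping only counts greater than 1.
word_counts = {word.upper(): count for word, count in counts.items() if count > 1}
print(word_counts)  # {'HELLO': 3, 'WORLD': 2}
```

Iterating over `counts.items()` yields `(word, count)` pairs, so both the key expression and the filter condition can use either variable.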