Dense (fully connected) layers in TensorFlow
A Dense layer connects every input to every output: each output neuron computes a weighted sum of all inputs plus a bias, so the model can mix information from every feature.
tf.keras.layers.Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform')
units is the number of neurons in the layer.
activation is the function that adds non-linearity, like 'relu' or 'softmax'.
tf.keras.layers.Dense(10)
tf.keras.layers.Dense(5, activation='relu')
tf.keras.layers.Dense(3, activation='softmax')
This code creates a simple neural network with one Dense layer. It takes 3 features as input and outputs 2 numbers per sample. The ReLU activation makes sure outputs are not negative. We print the output for 4 input samples.
import tensorflow as tf
import numpy as np

# Create sample input data: 4 samples, each with 3 features
input_data = np.array([[1.0, 2.0, 3.0],
                       [4.0, 5.0, 6.0],
                       [7.0, 8.0, 9.0],
                       [10.0, 11.0, 12.0]], dtype=np.float32)

# Define a simple model with one Dense layer of 2 neurons and ReLU activation
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation='relu', input_shape=(3,))
])

# Run the model on input data to get predictions
output = model(input_data)

# Print the output values
print(output.numpy())
The Dense layer automatically adds a bias term unless you set use_bias=False.
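A quick way to see this is to build the same layer with and without a bias and count its weight tensors (a minimal sketch; the layer sizes here are arbitrary):

```python
import tensorflow as tf

# Same output size, one layer with a bias term and one without
with_bias = tf.keras.layers.Dense(4)
without_bias = tf.keras.layers.Dense(4, use_bias=False)

# Build the layers so their weight tensors are created
with_bias.build(input_shape=(None, 3))
without_bias.build(input_shape=(None, 3))

print(len(with_bias.weights))     # 2 -> kernel + bias
print(len(without_bias.weights))  # 1 -> kernel only
```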
Weights and biases are learned during training to improve model predictions.
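To illustrate, the sketch below fits a one-layer model on random data (purely illustrative values) and checks that training actually changed the kernel:

```python
import numpy as np
import tensorflow as tf

# Random toy data: 32 samples with 3 features, one target each
x = np.random.rand(32, 3).astype(np.float32)
y = np.random.rand(32, 1).astype(np.float32)

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])
model.compile(optimizer='sgd', loss='mse')

before = model.get_weights()[0].copy()  # kernel before training
model.fit(x, y, epochs=5, verbose=0)
after = model.get_weights()[0]          # kernel after training

print(np.allclose(before, after))  # False: gradient descent updated the weights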
Activation functions help the model learn complex patterns beyond simple linear combinations.
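For example, ReLU simply zeroes out negative values, which is the non-linear step that lets stacked Dense layers represent more than a single linear transformation:

```python
import tensorflow as tf

# ReLU clips negative inputs to zero and passes positive inputs through
x = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0])
print(tf.keras.activations.relu(x).numpy())  # [0.  0.  0.  1.5 3. ]
```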
Dense layers connect every input to every output neuron.
They learn weights and biases to transform data.
Activation functions add non-linearity to help learn complex patterns.