
Dense (fully connected) layers in TensorFlow

Introduction

A Dense layer connects every input to every output. It helps the model learn patterns by mixing all input information.

Use a Dense layer in situations like these:

When you want to combine features from previous layers to make a decision.
When building simple neural networks for tasks like classification or regression.
When you need a layer that learns weighted sums of inputs plus a bias.
When you want to add a final decision layer that outputs predictions.
When you want to transform data into a new space for better learning.
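The last two uses above can be sketched together in one tiny network: the first Dense layer transforms the input into a new space, and the second acts as the final decision layer. The layer sizes here are arbitrary choices for illustration, not values from this lesson.

```python
import tensorflow as tf

# Illustrative sketch: the first Dense layer mixes 4 input features into
# an 8-dimensional representation; the second is the final decision
# layer for a hypothetical 3-class problem.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(3, activation='softmax'),
])

x = tf.random.uniform((2, 4))  # 2 samples, 4 features each
print(model(x).shape)          # one 3-value prediction per sample
```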
Syntax
TensorFlow
tf.keras.layers.Dense(units, activation=None, use_bias=True, kernel_initializer='glorot_uniform')

units is the number of neurons (outputs) in the layer.

activation is the function that adds non-linearity, like 'relu' or 'softmax'. The default None applies no activation.

use_bias controls whether the layer also learns a bias vector (enabled by default).

kernel_initializer sets how the weights are initialized before training; 'glorot_uniform' is the default.
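All four arguments from the syntax above can be spelled out explicitly. The values below are just the defaults made visible (with an arbitrary choice of 16 units):

```python
import tensorflow as tf

# Equivalent to tf.keras.layers.Dense(16, activation='relu'),
# with the remaining arguments shown at their default values.
layer = tf.keras.layers.Dense(
    units=16,                             # 16 neurons in the layer
    activation='relu',                    # non-linearity applied to the output
    use_bias=True,                        # also learn a bias vector
    kernel_initializer='glorot_uniform',  # default weight initialization
)

print(layer(tf.zeros((1, 4))).shape)  # 4 input features -> 16 outputs
```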

Examples
A Dense layer with 10 neurons and no activation function.
TensorFlow
tf.keras.layers.Dense(10)
A Dense layer with 5 neurons using ReLU activation to add non-linearity.
TensorFlow
tf.keras.layers.Dense(5, activation='relu')
A Dense layer with 3 neurons using softmax activation, often used for multi-class classification.
TensorFlow
tf.keras.layers.Dense(3, activation='softmax')
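Because softmax normalizes the layer's outputs, each sample's outputs are non-negative and sum to 1, which is what makes this example suit multi-class classification. A quick check (input size chosen arbitrarily):

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(3, activation='softmax')
probs = layer(tf.random.uniform((4, 5)))  # 4 samples, 5 input features

# Each sample's 3 outputs sum to 1, like class probabilities.
print(tf.reduce_sum(probs, axis=1))  # each entry is approximately 1.0
```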
Sample Model

This code creates a simple neural network with one Dense layer. It takes 3 features per sample as input and outputs 2 values per sample. The ReLU activation clamps any negative outputs to zero. We print the output for 4 input samples.

TensorFlow
import tensorflow as tf
import numpy as np

# Create sample input data: 4 samples, each with 3 features
input_data = np.array([[1.0, 2.0, 3.0],
                       [4.0, 5.0, 6.0],
                       [7.0, 8.0, 9.0],
                       [10.0, 11.0, 12.0]], dtype=np.float32)

# Define a simple model with one Dense layer of 2 neurons and ReLU activation
model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation='relu', input_shape=(3,))
])

# Run the model on input data to get predictions
output = model(input_data)

# Print the output values
print(output.numpy())
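You can also inspect the parameters the layer holds. With 3 inputs and 2 neurons, the weight matrix (kernel) has shape (3, 2) and the bias has shape (2,). A self-contained check, rebuilding the same model as above:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(2, activation='relu', input_shape=(3,))
])

# One weight per input-output connection, one bias per neuron.
kernel, bias = model.layers[0].get_weights()
print(kernel.shape)  # (3, 2)
print(bias.shape)    # (2,)
```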
The exact values printed differ between runs because the layer's weights are randomly initialized.
Important Notes

The Dense layer automatically adds a bias term unless you set use_bias=False.

Weights and biases are learned during training to improve model predictions.

Activation functions help the model learn complex patterns beyond simple linear combinations.
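The first note above is easy to verify: with use_bias=False the layer holds a single weight tensor (the kernel) instead of two. A small check, with an arbitrary layer size:

```python
import tensorflow as tf

with_bias = tf.keras.layers.Dense(4)
without_bias = tf.keras.layers.Dense(4, use_bias=False)

# Calling each layer once builds its weights.
x = tf.zeros((1, 3))
with_bias(x)
without_bias(x)

print(len(with_bias.get_weights()))     # 2: kernel and bias
print(len(without_bias.get_weights()))  # 1: kernel only
```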

Summary

Dense layers connect every input to every output neuron.

They learn weights and biases to transform data.

Activation functions add non-linearity to help learn complex patterns.