Overview - Linear (fully connected) layers
What is it?
A linear layer is a basic building block of neural networks that connects every input to every output with a learnable weight. It multiplies the input vector by a weight matrix and adds a bias vector, so each output is a weighted sum of all the inputs plus a bias. During training, the weights and biases are adjusted so the layer captures relationships between input features and output predictions. It is often called a fully connected or dense layer.
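The matrix-multiply-plus-bias computation described above can be sketched in a few lines of NumPy. The shapes here (3 input features, 2 outputs) and the random initialization are illustrative assumptions, not part of the original text:

```python
import numpy as np

def linear(x, W, b):
    """A linear (fully connected) layer: multiply the input by the
    weight matrix and add the bias."""
    return x @ W + b

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))   # one weight per input-output pair
b = np.zeros(2)                   # one bias per output
x = np.array([1.0, 2.0, 3.0])     # example input with 3 features

y = linear(x, W, b)
print(y.shape)  # one value per output: (2,)
```

Each of the 2 outputs is a weighted sum of all 3 inputs plus its bias, which is exactly what "connects every input to every output" means.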
Why it matters
Linear layers allow neural networks to learn complex patterns by combining input features in flexible, learned ways. Without them, a model could not weigh and mix information from different inputs, limiting its ability to solve real-world problems like image recognition or language understanding. They form the foundation of most deep learning models.
Where it fits
Before learning linear layers, you should understand basic matrix multiplication and vectors. After mastering linear layers, you can learn activation functions, convolutional layers, and how to build deeper neural networks.