Activation functions help a machine learning model decide what information to keep or ignore. By introducing non-linearity, they give the model the ability to learn complex patterns.
Activation functions in ML Python
Introduction
When building a neural network to classify images like cats and dogs.
When predicting house prices using many features with complex relationships.
When creating a model that needs to understand speech or text.
When training a deep learning model to recognize handwritten digits.
When you want your model to learn from data that is not just straight lines.
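The last point is the core motivation: without an activation function, stacking linear layers gains nothing. A minimal sketch (with made-up weights) showing that two linear layers collapse into one:

```python
# Two stacked "layers" with no activation function between them.
w1, b1 = 2.0, 1.0   # first layer:  y = 2x + 1
w2, b2 = 3.0, -4.0  # second layer: z = 3y - 4

def two_linear_layers(x):
    return w2 * (w1 * x + b1) + b2

# The composition is itself linear: z = (w2*w1)*x + (w2*b1 + b2)
def single_linear_layer(x):
    return (w2 * w1) * x + (w2 * b1 + b2)

for x in [-1.0, 0.0, 2.5]:
    assert two_linear_layers(x) == single_linear_layer(x)
```

Inserting a non-linear activation between the layers breaks this collapse, which is what lets the network model curves instead of straight lines.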
Syntax
def activation_function(x):
    # apply some math to x
    return result
Activation functions take a number and return another number.
They are applied to each neuron's output in a neural network.
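As a sketch of that idea (the layer outputs below are made-up numbers for illustration), the same activation is applied element-wise to every neuron's raw output:

```python
def relu(x):
    # ReLU: zero for negative inputs, identity otherwise
    return max(0, x)

# Hypothetical raw outputs from one layer of neurons
layer_outputs = [-0.5, 1.2, 0.0, -3.1, 2.4]
activated = [relu(x) for x in layer_outputs]
print(activated)  # [0, 1.2, 0, 0, 2.4]
```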
Examples
ReLU returns zero if input is negative, else returns input itself.
def relu(x): return max(0, x)
Sigmoid squashes input to a value between 0 and 1.
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))
Tanh squashes input to a value between -1 and 1.
import math

def tanh(x):
    return math.tanh(x)
Sample Model
This program shows how ReLU and Sigmoid activation functions transform a list of numbers. ReLU clips negatives to zero, while sigmoid squashes every value into the range 0 to 1.
import math

def relu(x):
    return max(0, x)

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

inputs = [-2, -1, 0, 1, 2]
relu_outputs = [relu(x) for x in inputs]
sigmoid_outputs = [sigmoid(x) for x in inputs]

print(f"ReLU outputs: {relu_outputs}")
print(f"Sigmoid outputs: {[round(s, 3) for s in sigmoid_outputs]}")
Output
ReLU outputs: [0, 0, 0, 1, 2]
Sigmoid outputs: [0.119, 0.269, 0.5, 0.731, 0.881]
Important Notes
Choosing the right activation function affects how well your model learns.
ReLU is popular because it is simple and helps models learn faster.
Sigmoid is good for outputs that represent probabilities.
Summary
Activation functions add non-linearity to neural networks.
Common types include ReLU, Sigmoid, and Tanh.
They help models learn complex patterns beyond straight lines.