
Neural network architecture in ML Python

Introduction

A neural network architecture is like a blueprint for how a computer learns from data. It shows how layers and neurons connect to solve problems like recognizing pictures or understanding speech.

Typical situations where you would design one:

When you want a computer to recognize handwritten digits.
When building a system that understands spoken words.
When predicting house prices from features like size and location.
When classifying emails as spam or not spam.
When creating a model that recommends movies based on your past choices.
Syntax
ML Python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(units=64, activation='relu', input_shape=(input_size,)),
    Dense(units=10, activation='softmax')
])

Sequential means layers are stacked one after another.

Dense layers are fully connected layers: every neuron receives input from every neuron in the previous layer.
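Under the hood, a Dense layer is just a matrix multiplication plus a bias vector, followed by the activation. A minimal NumPy sketch of the forward pass for the 64-unit hidden layer above (the weights here are random placeholders, not trained values):

```python
import numpy as np

rng = np.random.default_rng(0)

input_size = 20   # number of input features
units = 64        # neurons in the hidden layer

# One input sample and randomly initialized parameters
x = rng.random(input_size)
W = rng.standard_normal((input_size, units))  # weight matrix
b = np.zeros(units)                           # bias vector

# Dense layer: linear transform, then ReLU activation
z = x @ W + b
hidden = np.maximum(0, z)   # relu: negative values become 0

print(hidden.shape)          # (64,)
# Parameter count for this layer: weights + biases
print(W.size + b.size)       # 20*64 + 64 = 1344
```

This is exactly the computation Keras performs for `Dense(64, activation='relu')` on a 20-feature input, which is why the layer has 1,344 trainable parameters.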

Examples
A simple network for binary classification with 20 input features, one hidden layer with 32 neurons, and one output neuron.
ML Python
model = Sequential([
    Dense(32, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid')
])
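The single output neuron uses a sigmoid, which squashes any real number into (0, 1) so it can be read as a probability of the positive class; a common convention is to predict class 1 when it exceeds 0.5. A small NumPy illustration (the logit value is made up for the example):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

logit = 1.2                      # hypothetical raw output of the final Dense layer
prob = sigmoid(logit)
predicted_class = int(prob > 0.5)

print(round(float(prob), 3))     # 0.769
print(predicted_class)           # 1
```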
A deeper network with two hidden layers for classifying into 10 categories.
ML Python
model = Sequential([
    Dense(128, activation='relu', input_shape=(100,)),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])
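Each Dense layer contributes inputs × units weights plus units biases, so you can count the parameters of the deeper network above by hand; the totals match what `model.summary()` would report:

```python
# Parameter count per Dense layer: inputs * units + units (biases)
layers = [(100, 128), (128, 64), (64, 10)]   # (inputs, units) for each layer

params = [n_in * n_out + n_out for n_in, n_out in layers]
print(params)        # [12928, 8256, 650]
print(sum(params))   # 21834
```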
Sample Model

This program creates a small neural network to classify data into 3 categories. It trains on random data and shows the training accuracy and prediction probabilities for a new sample.

ML Python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical

# Create dummy data: 100 samples, 20 features
X = np.random.random((100, 20))
# Create dummy labels for 3 classes
y = np.random.randint(3, size=(100,))
y_cat = to_categorical(y, 3)

# Build a simple neural network
model = Sequential([
    Dense(16, activation='relu', input_shape=(20,)),
    Dense(3, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# Train the model
history = model.fit(X, y_cat, epochs=5, batch_size=10, verbose=0)

# Print training accuracy after last epoch
print(f"Training accuracy after 5 epochs: {history.history['accuracy'][-1]:.4f}")

# Make a prediction on a new sample
new_sample = np.random.random((1, 20))
prediction = model.predict(new_sample)
print(f"Prediction probabilities: {prediction[0]}")
Important Notes

Input shape must match the number of features in your data.

Activation functions like 'relu' introduce non-linearity, which is what lets the network learn complex patterns.

Output layer activation depends on the task: 'softmax' for multi-class, 'sigmoid' for binary.
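The difference between the two output activations is easy to see numerically: softmax turns a vector of raw scores into probabilities that sum to 1 (pick the largest for the predicted class), while sigmoid maps each score to (0, 1) independently. A sketch with made-up scores:

```python
import numpy as np

scores = np.array([2.0, 1.0, 0.1])   # hypothetical raw outputs (logits)

# Softmax: exponentiate and normalize -> a probability distribution
probs = np.exp(scores) / np.exp(scores).sum()
print(probs.round(3))        # [0.659 0.242 0.099]
print(probs.sum())           # 1.0 (up to float rounding)
print(probs.argmax())        # 0 -> predicted class

# Sigmoid: each score mapped to (0, 1) independently; no requirement to sum to 1
sig = 1 / (1 + np.exp(-scores))
print(sig.round(3))          # [0.881 0.731 0.525]
```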

Summary

Neural network architecture defines how layers and neurons connect to learn from data.

Use Sequential models to stack layers simply.

Choose layer sizes and activations based on your problem type.