A neural network architecture is like a blueprint for how a computer learns from data. It shows how layers and neurons connect to solve problems like recognizing pictures or understanding speech.
Defining a neural network architecture in Python with Keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# input_size is the number of features in your data
model = Sequential([
    Dense(units=64, activation='relu', input_shape=(input_size,)),
    Dense(units=10, activation='softmax')
])
Sequential means layers are stacked one after another.
Dense layers are fully connected layers: every neuron in one layer connects to every neuron in the adjacent layer.
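To make "fully connected" concrete, here is a minimal NumPy sketch of what a single Dense layer computes: a matrix product of the input with a weight matrix (so every input feature contributes to every output unit), plus a bias, passed through an activation. The sizes mirror the Keras snippet above (20 input features, 64 units); the weight values here are random placeholders, not trained parameters.

```python
import numpy as np

def dense(x, W, b, activation):
    # Every input feature feeds every output unit: that is what "fully connected" means
    return activation(x @ W + b)

def relu(z):
    return np.maximum(0, z)

rng = np.random.default_rng(0)
x = rng.random((1, 20))    # one sample with 20 features
W1 = rng.random((20, 64))  # fully connected: 20 inputs -> 64 units
b1 = np.zeros(64)

h = dense(x, W1, b1, relu)
print(h.shape)  # (1, 64): one output per unit in the layer
```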
For binary classification, a single sigmoid output unit is enough:

model = Sequential([
    Dense(32, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid')
])

For multi-class classification, the output layer has one softmax unit per class:

model = Sequential([
    Dense(128, activation='relu', input_shape=(100,)),
    Dense(64, activation='relu'),
    Dense(10, activation='softmax')
])

This program creates a small neural network to classify data into 3 categories. It trains on random data and shows the training accuracy and prediction probabilities for a new sample.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical

# Create dummy data: 100 samples, 20 features
X = np.random.random((100, 20))

# Create dummy labels for 3 classes
y = np.random.randint(3, size=(100,))
y_cat = to_categorical(y, 3)

# Build a simple neural network
model = Sequential([
    Dense(16, activation='relu', input_shape=(20,)),
    Dense(3, activation='softmax')
])

# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# Train the model
history = model.fit(X, y_cat, epochs=5, batch_size=10, verbose=0)

# Print training accuracy after last epoch
print(f"Training accuracy after 5 epochs: {history.history['accuracy'][-1]:.4f}")

# Make a prediction on a new sample
new_sample = np.random.random((1, 20))
prediction = model.predict(new_sample)
print(f"Prediction probabilities: {prediction[0]}")
Input shape must match the number of features in your data.
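You can see why the shapes must match by looking at the underlying matrix multiplication. In this NumPy sketch (the 20-feature, 16-unit sizes are taken from the example above), a layer whose weight matrix expects 20 input features accepts a 20-feature sample but rejects a 21-feature one:

```python
import numpy as np

W = np.random.random((20, 16))  # a layer expecting 20 input features, with 16 units

x_good = np.random.random((1, 20))  # 20 features: shapes line up
print((x_good @ W).shape)  # (1, 16)

x_bad = np.random.random((1, 21))  # 21 features: no way to multiply
try:
    x_bad @ W
except ValueError as e:
    print("Shape mismatch:", e)
```

Keras raises a similar shape error at build or fit time when input_shape disagrees with the data.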
Activation functions like 'relu' help the network learn complex patterns.
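A quick way to see why: without an activation, stacking Dense layers gains nothing, because two linear maps compose into one linear map. The tiny hand-picked matrices below (chosen only to make the arithmetic obvious) show that relu breaks this collapse:

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

x = np.array([[1.0]])
W1 = np.array([[1.0, -1.0]])   # 1 input -> 2 hidden units
W2 = np.array([[1.0], [1.0]])  # 2 hidden units -> 1 output

# Without an activation, two stacked linear layers collapse into one linear map
two_linear = (x @ W1) @ W2
one_linear = x @ (W1 @ W2)
print(np.allclose(two_linear, one_linear))  # True

# relu zeroes the negative hidden unit, so the composition is no longer linear
nonlinear = relu(x @ W1) @ W2
print(nonlinear[0, 0])  # 1.0, vs 0.0 for the purely linear stack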
Output layer activation depends on the task: 'softmax' for multi-class, 'sigmoid' for binary.
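The difference is easy to see numerically. Softmax turns a vector of raw scores into probabilities that sum to 1 across classes (pick one class of many), while sigmoid maps each score independently into (0, 1) (a yes/no probability). A small NumPy illustration:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])

# softmax: one probability per class, summing to 1 -> multi-class output layers
p = softmax(logits)
print(p.sum())  # 1.0

# sigmoid: each output is an independent probability -> binary output layers
print(sigmoid(np.array([0.0])))  # [0.5]
```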
Neural network architecture defines how layers and neurons connect to learn from data.
Use Sequential models to stack layers simply.
Choose layer sizes and activations based on your problem type.