Activation functions introduce non-linearities into neural networks, enabling them to learn complex patterns and make non-linear predictions. An activation function determines the output of a neuron based on the weighted sum of its inputs. Commonly used activation functions include sigmoid, tanh, and ReLU (Rectified Linear Unit).
Example code for the ReLU activation function:

import numpy as np

def relu(x):
    # ReLU passes positive inputs through unchanged and maps negatives to 0
    return np.maximum(0, x)
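For comparison, here is a minimal sketch of the sigmoid and tanh functions mentioned above, again assuming NumPy; the function names and the sample input array are illustrative choices, not fixed conventions:

import numpy as np

def sigmoid(x):
    # Sigmoid squashes inputs to the range (0, 1)
    return 1 / (1 + np.exp(-x))

def tanh(x):
    # Tanh squashes inputs to the range (-1, 1); np.tanh is NumPy's built-in implementation
    return np.tanh(x)

# Illustrative usage on a small input array
x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # approximately [0.119 0.5   0.881]
print(tanh(x))     # approximately [-0.964  0.     0.964]

Unlike these two, which are bounded, ReLU is unbounded for positive inputs, which is one reason it behaves differently during training.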