Mini-batch gradient descent is an optimization algorithm used to train machine learning models, particularly neural networks. It combines the advantages of batch gradient descent and stochastic gradient descent by dividing the dataset into small subsets called mini-batches: the model updates its weights after each mini-batch, so updates are more frequent than in batch gradient descent while the averaging over several examples keeps convergence more stable than per-sample stochastic updates. Because mini-batches also map well onto vectorized hardware, this approach speeds up training on large datasets and is the standard choice in deep learning.
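To make the update loop concrete, here is a minimal sketch in Python with NumPy, assuming a simple linear-regression objective with mean-squared-error loss; the names (`lr`, `batch_size`, the synthetic data) are illustrative choices for this example, not part of any particular library.

```python
import numpy as np

# Minimal sketch of mini-batch gradient descent on linear regression
# with mean-squared-error loss. Hyperparameter values are illustrative.

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 0.1, size=1000)

w, b = 0.0, 0.0    # model parameters
lr = 0.1           # learning rate
batch_size = 32    # mini-batch size
epochs = 20

n = len(X)
for epoch in range(epochs):
    # Shuffle once per epoch so every mini-batch is a fresh random subset
    perm = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]

        # Gradient of the MSE loss computed on this mini-batch only
        err = w * xb + b - yb
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)

        # Weights are updated after every mini-batch: more frequent than
        # batch gradient descent, less noisy than per-sample SGD
        w -= lr * grad_w
        b -= lr * grad_b

print(f"learned w={w:.3f}, b={b:.3f}  (true values: w=3, b=2)")
```

The batch size controls the trade-off the definition describes: smaller batches mean noisier but more frequent updates, while larger batches give smoother gradient estimates at the cost of fewer updates per pass over the data. In practice, moderate sizes (often 32 to 256) balance these effects and make good use of vectorized hardware.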