Mini-batch gradient descent is an optimization algorithm that minimizes a function by iteratively updating parameters based on a small subset of the data, known as a mini-batch. This method strikes a balance between the efficiency of stochastic gradient descent and the stability of batch gradient descent, making it particularly effective for training large-scale machine learning models. By averaging the gradient over several examples rather than a single one, it reduces the variance of the parameter updates compared with pure stochastic gradient descent, which can lead to faster and more stable convergence.
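The update rule is easiest to see in code. Below is a minimal sketch in Python, assuming a linear-regression model trained with mean squared error; the function name, hyperparameter values, and model choice are illustrative assumptions, not part of the definition itself.

```python
import numpy as np

def minibatch_gradient_descent(X, y, batch_size=32, learning_rate=0.01, epochs=100):
    """Fit linear-regression weights with mini-batch gradient descent on MSE loss.

    Hypothetical example: the model and hyperparameters are placeholders.
    """
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        # Shuffle once per epoch so each mini-batch is a fresh random subset.
        indices = np.random.permutation(n_samples)
        for start in range(0, n_samples, batch_size):
            batch = indices[start:start + batch_size]
            X_b, y_b = X[batch], y[batch]
            # Gradient of the MSE loss over this mini-batch only,
            # not over the full dataset (batch GD) or one sample (SGD).
            error = X_b @ w + b - y_b
            grad_w = 2 * X_b.T @ error / len(batch)
            grad_b = 2 * error.mean()
            w -= learning_rate * grad_w
            b -= learning_rate * grad_b
    return w, b
```

Shuffling the indices once per epoch and then slicing gives random mini-batches without replacement, so every example is visited exactly once per epoch; this is the usual compromise between the noisy single-sample updates of SGD and the expensive full-dataset passes of batch gradient descent.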