Programming for Mathematical Applications
Mini-batch gradient descent is an optimization algorithm for training machine learning models that updates model parameters using a small subset of the training data, called a mini-batch, rather than the entire dataset or a single example. This method strikes a balance between stochastic gradient descent, which uses one sample at a time, and batch gradient descent, which processes the entire dataset at once. The result is often faster convergence than full-batch training, and the noise from sampling mini-batches can help improve the model's generalization performance.
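To make the idea concrete, here is a minimal sketch of mini-batch gradient descent applied to least-squares linear regression with NumPy. The synthetic data, learning rate, batch size, and epoch count are all illustrative assumptions, not values prescribed by the definition above.

```python
# Minimal sketch: mini-batch gradient descent for linear regression.
# All hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X @ w_true + noise
n_samples, n_features = 1000, 3
X = rng.normal(size=(n_samples, n_features))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n_samples)

w = np.zeros(n_features)   # parameters to learn
learning_rate = 0.1
batch_size = 32            # the "mini-batch" size
n_epochs = 20

for epoch in range(n_epochs):
    # Shuffle once per epoch so the mini-batches differ between passes.
    perm = rng.permutation(n_samples)
    for start in range(0, n_samples, batch_size):
        idx = perm[start:start + batch_size]
        X_b, y_b = X[idx], y[idx]
        # Gradient of the mean squared error, computed on this batch only.
        grad = 2.0 / len(idx) * X_b.T @ (X_b @ w - y_b)
        w -= learning_rate * grad

print("estimated weights:", w)  # should land close to w_true
```

Setting `batch_size = 1` in this sketch recovers stochastic gradient descent, while `batch_size = n_samples` recovers batch gradient descent, which shows how mini-batch training sits between the two extremes.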