Subgradient methods are optimization techniques for minimizing convex functions that may not be differentiable. They extend gradient descent to points where the gradient does not exist or is not unique, stepping instead along a subgradient: any vector g satisfying f(y) >= f(x) + g·(y − x) for all y. The basic iteration is x_{k+1} = x_k − α_k g_k, where g_k is a subgradient of f at x_k and α_k is a (typically diminishing) step size. These methods play a crucial role in solving vector variational inequalities, especially in problems with non-smooth or multi-dimensional objectives.
congrats on reading the definition of Subgradient Methods. now let's actually learn it.
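To make the update rule concrete, here's a minimal sketch in Python (assuming NumPy is available). The function name `subgradient_descent`, the 1/sqrt(k+1) step-size schedule, and the L1-norm example are illustrative choices, not a fixed algorithm or API; any diminishing, non-summable step size works for convex f with bounded subgradients.

```python
import numpy as np

def subgradient_descent(f, subgrad, x0, num_iters=500):
    """Minimize a (possibly non-differentiable) convex f via subgradient steps.

    Uses the diminishing step size alpha_k = 1 / sqrt(k + 1); since subgradient
    steps are not guaranteed to decrease f at every iteration, we track the
    best point seen so far and return that.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(num_iters):
        g = subgrad(x)                 # any vector in the subdifferential at x
        alpha = 1.0 / np.sqrt(k + 1)   # diminishing, non-summable step size
        x = x - alpha * g
        fx = f(x)
        if fx < best_f:                # keep the best iterate, not the last one
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Example: f(x) = ||x||_1 is convex but non-differentiable wherever a
# coordinate is zero; np.sign gives a valid subgradient (0 is in the
# subdifferential of |.| at 0).
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)
x_star, f_star = subgradient_descent(f, subgrad, x0=[2.0, -1.5])
print(x_star, f_star)  # approaches the minimizer x = (0, 0)
```

Notice the key difference from gradient descent: because a subgradient is not necessarily a descent direction, the method keeps the best iterate found rather than assuming the last one is best.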