Holdout Method: A simple form of cross-validation where the dataset is split into a training set and a separate test set, with the model trained on the training set and evaluated on the test set.
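The holdout split described above can be sketched in a few lines of plain Python. This is a minimal illustration, not a library implementation; the helper name `holdout_split` and the 30% test fraction are chosen here for the example.

```python
import random

def holdout_split(data, test_fraction=0.2, seed=0):
    """Shuffle the data, then carve off test_fraction of it as the test set."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = data[:]                 # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_fraction)
    return shuffled[n_test:], shuffled[:n_test]   # (train set, test set)

train, test = holdout_split(list(range(10)), test_fraction=0.3)
# 7 training examples, 3 test examples; the two sets never overlap
```

In practice the model is fit on `train` only and scored on `test` only; shuffling before splitting matters when the data is ordered (e.g. sorted by class label).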
K-Fold Cross-Validation: A more robust cross-validation technique where the dataset is divided into k equal-sized subsets, and the model is trained and evaluated k times, using a different subset for evaluation each time.
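The k repeated train/evaluate rounds can be made concrete by generating the fold indices directly. This is a minimal sketch, assuming unshuffled data; the generator name `k_fold_indices` is invented for this example (libraries such as scikit-learn provide an equivalent `KFold` splitter).

```python
def k_fold_indices(n, k):
    """Yield (train_indices, test_indices) once per fold.

    Each of the n observations lands in a test fold exactly once;
    when n is not divisible by k, the first n % k folds get one extra item.
    """
    indices = list(range(n))
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        yield train, test
        start += size

folds = list(k_fold_indices(10, 5))
# 5 folds; each test fold holds 2 indices, and together they cover all 10
```

The final score is typically the mean of the k per-fold evaluation scores, which uses every observation for both training and testing without ever mixing the two within a single round.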
Leave-One-Out Cross-Validation (LOOCV): A special case of K-Fold cross-validation where the number of folds is equal to the number of observations in the dataset, and each observation is used as the test set once.
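Because LOOCV is just k-fold with k equal to n, each split can be written down directly: one observation out, the rest in. A minimal sketch (the helper name `loocv_indices` is invented for this example):

```python
def loocv_indices(n):
    """Yield (train_indices, test_indices) with each observation held out once."""
    for i in range(n):
        train = [j for j in range(n) if j != i]   # everything except item i
        yield train, [i]                          # item i alone is the test set

splits = list(loocv_indices(4))
# 4 splits; every test set is a single observation, used exactly once
```

Note that this requires fitting the model n times, which is why LOOCV is usually reserved for small datasets.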