Leave P Group(s) Out cross-validator
Provides train/test indices to split data according to a third-party provided group. This group information can be used to encode arbitrary domain specific stratifications of the samples as integers.
For instance the groups could be the year of collection of the samples and thus allow for cross-validation against time-based splits.
The difference between LeavePGroupsOut and LeaveOneGroupOut is that the former builds the test sets with all the samples assigned to ``p`` different values of the groups, while the latter uses all the samples assigned to a single group.
Read more in the :ref:`User Guide <cross_validation>`.
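To make the contrast with LeaveOneGroupOut concrete, here is a minimal sketch. The toy data and variable names are illustrative, not part of the original docstring; both cross-validators come from ``sklearn.model_selection``.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, LeavePGroupsOut

X = np.arange(8).reshape(4, 2)
y = np.array([0, 1, 0, 1])
groups = np.array([1, 1, 2, 3])  # three distinct group labels

# LeaveOneGroupOut: each test set holds exactly one group -> 3 splits.
logo = LeaveOneGroupOut()
print(logo.get_n_splits(groups=groups))  # 3

# LeavePGroupsOut with p=2: every pair of groups forms a test set,
# giving C(3, 2) = 3 splits, each test set covering two groups.
lpgo = LeavePGroupsOut(n_groups=2)
print(lpgo.get_n_splits(groups=groups))  # 3
for train_idx, test_idx in lpgo.split(X, y, groups):
    print("test groups:", np.unique(groups[test_idx]))
```

With three groups the two cross-validators happen to produce the same number of splits, but the LeavePGroupsOut test sets are larger because each one pools two whole groups.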
Parameters
----------
n_groups : int
    Number of groups (``p``) to leave out in the test split.
Examples
--------
>>> import numpy as np
>>> from sklearn.model_selection import LeavePGroupsOut
>>> X = np.array([[1, 2], [3, 4], [5, 6]])
>>> y = np.array([1, 2, 1])
>>> groups = np.array([1, 2, 3])
>>> lpgo = LeavePGroupsOut(n_groups=2)
>>> lpgo.get_n_splits(X, y, groups)
3
>>> lpgo.get_n_splits(groups=groups)  # 'groups' is always required
3
>>> print(lpgo)
LeavePGroupsOut(n_groups=2)
>>> for train_index, test_index in lpgo.split(X, y, groups):
...     print('TRAIN:', train_index, 'TEST:', test_index)
...     X_train, X_test = X[train_index], X[test_index]
...     y_train, y_test = y[train_index], y[test_index]
...     print(X_train, X_test, y_train, y_test)
TRAIN: [2] TEST: [0 1]
[[5 6]] [[1 2]
 [3 4]] [1] [1 2]
TRAIN: [1] TEST: [0 2]
[[3 4]] [[1 2]
 [5 6]] [2] [1 1]
TRAIN: [0] TEST: [1 2]
[[1 2]] [[3 4]
 [5 6]] [1] [2 1]
See also
--------
GroupKFold : K-fold iterator variant with non-overlapping groups.