Approximate the feature map of an RBF kernel by Monte Carlo approximation of
its Fourier transform.

It implements a variant of Random Kitchen Sinks [1].
Read more in the :ref:`User Guide <rbf_kernel_approx>`.
Parameters
----------
gamma : float
    Parameter of RBF kernel: exp(-gamma * x^2).

n_components : int
    Number of Monte Carlo samples per original feature.
    Equals the dimensionality of the computed feature space.

random_state : int, RandomState instance or None, optional (default=None)
    Pseudo-random number generator to control the generation of the random
    weights and random offset when fitting the training data.
    Pass an int for reproducible output across multiple function calls.
    See :term:`Glossary <random_state>`.
Attributes
----------
random_offset_ : ndarray of shape (n_components,), dtype=float64
    Random offset used to compute the projection in the `n_components`
    dimensions of the feature space.

random_weights_ : ndarray of shape (n_features, n_components), dtype=float64
    Random projection directions drawn from the Fourier transform of the
    RBF kernel.
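Roughly speaking, these attributes define the fitted feature map: inputs are
projected onto `random_weights_`, shifted by `random_offset_`, and passed
through a cosine. A minimal sketch, assuming the standard cosine
random-feature construction (the scaling below is an assumption, not a
verbatim excerpt of the implementation):

>>> import numpy as np
>>> from sklearn.kernel_approximation import RBFSampler
>>> X = np.array([[0., 0.], [1., 1.]])
>>> sampler = RBFSampler(gamma=1, random_state=1).fit(X)
>>> projection = X @ sampler.random_weights_ + sampler.random_offset_
>>> manual = np.sqrt(2.0 / sampler.n_components) * np.cos(projection)
>>> # Under this assumed construction, `manual` matches sampler.transform(X).
>>> same = np.allclose(manual, sampler.transform(X))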
Examples
--------
>>> from sklearn.kernel_approximation import RBFSampler
>>> from sklearn.linear_model import SGDClassifier
>>> X = [[0, 0], [1, 1], [1, 0], [0, 1]]
>>> y = [0, 0, 1, 1]
>>> rbf_feature = RBFSampler(gamma=1, random_state=1)
>>> X_features = rbf_feature.fit_transform(X)
>>> clf = SGDClassifier(max_iter=5, tol=1e-3)
>>> clf.fit(X_features, y)
SGDClassifier(max_iter=5)
>>> clf.score(X_features, y)
1.0
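A further check, sketched here as an assumption rather than part of the
original example: inner products of the mapped features approximate the
exact RBF kernel values, and the approximation tightens as `n_components`
grows.

>>> import numpy as np
>>> from sklearn.metrics.pairwise import rbf_kernel
>>> K_exact = rbf_kernel(X, X, gamma=1)
>>> K_approx = X_features @ X_features.T
>>> max_err = np.abs(K_exact - K_approx).max()  # shrinks as n_components grows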
Notes
-----
See "Random Features for Large-Scale Kernel Machines" by A. Rahimi and
Benjamin Recht.

[1] "Weighted Sums of Random Kitchen Sinks: Replacing minimization with
randomization in learning" by A. Rahimi and Benjamin Recht.
(https://people.eecs.berkeley.edu/~brecht/papers/08.rah.rec.nips.pdf)
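For intuition, a sketch of the sampling step described above, assuming the
standard Rahimi & Recht construction (the Fourier transform of the RBF kernel
is a Gaussian, so projection directions are drawn with scale sqrt(2 * gamma)
and offsets uniformly on [0, 2*pi]); the names below are illustrative, not a
verbatim excerpt of the implementation:

>>> import numpy as np
>>> rng = np.random.RandomState(1)
>>> gamma, n_features, n_components = 1.0, 2, 100
>>> weights = np.sqrt(2 * gamma) * rng.normal(size=(n_features, n_components))
>>> offset = rng.uniform(0, 2 * np.pi, size=n_components)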