package sklearn

type tag = [
  1. | `NuSVR
]
type t = [ `BaseEstimator | `BaseLibSVM | `NuSVR | `Object | `RegressorMixin ] Obj.t
val of_pyobject : Py.Object.t -> t
val to_pyobject : [> tag ] Obj.t -> Py.Object.t
val as_estimator : t -> [ `BaseEstimator ] Obj.t
val as_regressor : t -> [ `RegressorMixin ] Obj.t
val as_lib_svm : t -> [ `BaseLibSVM ] Obj.t
val create : ?nu:float -> ?c:float -> ?kernel:[ `Linear | `Poly | `Rbf | `Sigmoid | `Precomputed ] -> ?degree:int -> ?gamma:[ `Scale | `Auto | `F of float ] -> ?coef0:float -> ?shrinking:bool -> ?tol:float -> ?cache_size:float -> ?verbose:int -> ?max_iter:int -> unit -> t

Nu Support Vector Regression.

Similar to NuSVC, for regression, uses a parameter nu to control the number of support vectors. However, unlike NuSVC, where nu replaces C, here nu replaces the parameter epsilon of epsilon-SVR.

The implementation is based on libsvm.

Read more in the :ref:`User Guide <svm_regression>`.

Parameters
----------
nu : float, default=0.5
    An upper bound on the fraction of training errors and a lower bound of the fraction of support vectors. Should be in the interval (0, 1]. By default 0.5 will be taken.

C : float, default=1.0
    Penalty parameter C of the error term.

kernel : {'linear', 'poly', 'rbf', 'sigmoid', 'precomputed'}, default='rbf'
    Specifies the kernel type to be used in the algorithm. It must be one of 'linear', 'poly', 'rbf', 'sigmoid', 'precomputed' or a callable. If none is given, 'rbf' will be used. If a callable is given it is used to precompute the kernel matrix.

degree : int, default=3
    Degree of the polynomial kernel function ('poly'). Ignored by all other kernels.

gamma : {'scale', 'auto'} or float, default='scale'
    Kernel coefficient for 'rbf', 'poly' and 'sigmoid'.

      • if ``gamma='scale'`` (default) is passed then it uses 1 / (n_features * X.var()) as value of gamma,
      • if 'auto', uses 1 / n_features.

    .. versionchanged:: 0.22
        The default value of ``gamma`` changed from 'auto' to 'scale'.

coef0 : float, default=0.0
    Independent term in kernel function. It is only significant in 'poly' and 'sigmoid'.

shrinking : bool, default=True
    Whether to use the shrinking heuristic. See the :ref:`User Guide <shrinking_svm>`.

tol : float, default=1e-3
    Tolerance for stopping criterion.

cache_size : float, default=200
    Specify the size of the kernel cache (in MB).

verbose : bool, default=False
    Enable verbose output. Note that this setting takes advantage of a per-process runtime setting in libsvm that, if enabled, may not work properly in a multithreaded context.

max_iter : int, default=-1
    Hard limit on iterations within solver, or -1 for no limit.

Attributes
----------
support_ : ndarray of shape (n_SV,)
    Indices of support vectors.

support_vectors_ : ndarray of shape (n_SV, n_features)
    Support vectors.

dual_coef_ : ndarray of shape (1, n_SV)
    Coefficients of the support vector in the decision function.

coef_ : ndarray of shape (1, n_features)
    Weights assigned to the features (coefficients in the primal problem). This is only available in the case of a linear kernel.

    `coef_` is a read-only property derived from `dual_coef_` and `support_vectors_`.

intercept_ : ndarray of shape (1,)
    Constants in decision function.

Examples
--------
>>> from sklearn.svm import NuSVR
>>> from sklearn.pipeline import make_pipeline
>>> from sklearn.preprocessing import StandardScaler
>>> import numpy as np
>>> n_samples, n_features = 10, 5
>>> np.random.seed(0)
>>> y = np.random.randn(n_samples)
>>> X = np.random.randn(n_samples, n_features)
>>> regr = make_pipeline(StandardScaler(), NuSVR(C=1.0, nu=0.1))
>>> regr.fit(X, y)
Pipeline(steps=[('standardscaler', StandardScaler()),
                ('nusvr', NuSVR(nu=0.1))])
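
The doctest above exercises the Python class directly. The following is a minimal OCaml sketch of the equivalent create/fit/score workflow using only the values documented on this page. It assumes the module is reachable as Sklearn.Svm.NuSVR (adjust to your build) and that x (shape (n_samples, n_features)) and y (shape (n_samples,)) have already been built as ArrayLike values with your Np array constructors; it skips the pipeline and scaler, which live in other modules.

    let demo ~x ~y =
      let open Sklearn.Svm in
      (* x and y are assumed to be pre-built [`ArrayLike] Np.Obj.t values. *)
      let regr = NuSVR.create ~c:1.0 ~nu:0.1 () in
      let regr = NuSVR.fit ~x ~y regr in
      Format.printf "training R^2 = %g@." (NuSVR.score ~x ~y regr)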

See also
--------
NuSVC
    Support Vector Machine for classification implemented with libsvm with a parameter to control the number of support vectors.

SVR
    epsilon Support Vector Machine for regression implemented with libsvm.

Notes
-----
**References:** `LIBSVM: A Library for Support Vector Machines <http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf>`__
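
The `of_pyobject`/`to_pyobject` and `as_*` values near the top of this page are the glue between this typed wrapper and generic code. A rough sketch of their intended use, under the same module-path assumption as above:

    let interop_demo (regr : Sklearn.Svm.NuSVR.t) =
      let open Sklearn.Svm in
      (* Widen to the generic estimator/regressor views when an API expects those tags. *)
      let _as_estimator = NuSVR.as_estimator regr in
      let _as_regressor = NuSVR.as_regressor regr in
      (* Drop down to the raw Python object (and back) when calling pyml directly. *)
      let py = NuSVR.to_pyobject regr in
      NuSVR.of_pyobject py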

val fit : ?sample_weight:[> `ArrayLike ] Np.Obj.t -> x:[> `ArrayLike ] Np.Obj.t -> y:[> `ArrayLike ] Np.Obj.t -> [> tag ] Obj.t -> t

Fit the SVM model according to the given training data.

Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features) or (n_samples, n_samples)
    Training vectors, where n_samples is the number of samples and n_features is the number of features. For kernel='precomputed', the expected shape of X is (n_samples, n_samples).

y : array-like of shape (n_samples,)
    Target values (class labels in classification, real numbers in regression).

sample_weight : array-like of shape (n_samples,), default=None
    Per-sample weights. Rescale C per sample. Higher weights force the classifier to put more emphasis on these points.

Returns
-------
self : object

Notes
-----
If X and y are not C-ordered and contiguous arrays of np.float64 and X is not a scipy.sparse.csr_matrix, X and/or y may be copied.

If X is a dense array, then the other methods will not support sparse matrices as input.
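
On the OCaml side, `fit` takes the model as its last argument and returns the fitted estimator, so it composes with `|>`. A minimal sketch, under the same assumptions as above about the module path and about x, y and weights being pre-built ArrayLike values:

    let fit_weighted ~x ~y ~weights =
      let open Sklearn.Svm in
      NuSVR.create ~nu:0.5 ~c:1.0 ()
      |> NuSVR.fit ~x ~y ~sample_weight:weights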

val get_params : ?deep:bool -> [> tag ] Obj.t -> Dict.t

Get parameters for this estimator.

Parameters
----------
deep : bool, default=True
    If True, will return the parameters for this estimator and contained subobjects that are estimators.

Returns
-------
params : mapping of string to any
    Parameter names mapped to their values.
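
A minimal sketch of reading the parameters back out (assumptions as above); the result is a Dict.t, to be inspected with whatever accessors the Dict module provides:

    let inspect_params regr =
      let open Sklearn.Svm in
      (* With ~deep:true, parameters of nested estimators are included. *)
      NuSVR.get_params ~deep:true regr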

val predict : x:[> `ArrayLike ] Np.Obj.t -> [> tag ] Obj.t -> [> `ArrayLike ] Np.Obj.t

Perform regression on samples in X.

For a one-class model, +1 (inlier) or -1 (outlier) is returned.

Parameters
----------
X : {array-like, sparse matrix} of shape (n_samples, n_features)
    For kernel='precomputed', the expected shape of X is (n_samples_test, n_samples_train).

Returns
-------
y_pred : ndarray of shape (n_samples,)
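
A minimal OCaml sketch (same assumptions as above; x_test is a pre-built ArrayLike of shape (n_samples, n_features)):

    let predict_on regr ~x_test =
      let open Sklearn.Svm in
      (* Returns the predicted targets as an [`ArrayLike] Np.Obj.t of shape (n_samples,). *)
      NuSVR.predict ~x:x_test regr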

val score : ?sample_weight:[> `ArrayLike ] Np.Obj.t -> x:[> `ArrayLike ] Np.Obj.t -> y:[> `ArrayLike ] Np.Obj.t -> [> tag ] Obj.t -> float

Return the coefficient of determination R^2 of the prediction.

The coefficient R^2 is defined as (1 - u/v), where u is the residual sum of squares ((y_true - y_pred) ** 2).sum() and v is the total sum of squares ((y_true - y_true.mean()) ** 2).sum(). The best possible score is 1.0 and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get a R^2 score of 0.0.

Parameters
----------
X : array-like of shape (n_samples, n_features)
    Test samples. For some estimators this may be a precomputed kernel matrix or a list of generic objects instead, shape = (n_samples, n_samples_fitted), where n_samples_fitted is the number of samples used in the fitting for the estimator.

y : array-like of shape (n_samples,) or (n_samples, n_outputs)
    True values for X.

sample_weight : array-like of shape (n_samples,), default=None
    Sample weights.

Returns
-------
score : float
    R^2 of self.predict(X) wrt. y.

Notes
-----
The R2 score used when calling ``score`` on a regressor uses ``multioutput='uniform_average'`` from version 0.23 to keep consistent with default value of :func:`~sklearn.metrics.r2_score`. This influences the ``score`` method of all the multioutput regressors (except for :class:`~sklearn.multioutput.MultiOutputRegressor`).
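
As a sketch (same assumptions as above), `score` returns the R^2 as a plain OCaml float, so it is a convenient hook for quick evaluation or logging:

    let report_r2 regr ~x_test ~y_test =
      let open Sklearn.Svm in
      let r2 = NuSVR.score ~x:x_test ~y:y_test regr in
      Format.printf "held-out R^2 = %g@." r2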

val set_params : ?params:(string * Py.Object.t) list -> [> tag ] Obj.t -> t

Set the parameters of this estimator.

The method works on simple estimators as well as on nested objects (such as pipelines). The latter have parameters of the form ``<component>__<parameter>`` so that it's possible to update each component of a nested object.

Parameters
----------
**params : dict
    Estimator parameters.

Returns
-------
self : object
    Estimator instance.
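
Since the binding passes parameters as an association list of raw Python values, updating a parameter looks roughly like this (assumptions as above; Py.Float.of_float is pyml's float constructor):

    let loosen_nu regr =
      let open Sklearn.Svm in
      (* Update nu on the underlying Python estimator; returns the estimator. *)
      NuSVR.set_params ~params:[ ("nu", Py.Float.of_float 0.25) ] regr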

val support_ : t -> [> `ArrayLike ] Np.Obj.t

Attribute support_: get value or raise Not_found if None.

val support_opt : t -> [> `ArrayLike ] Np.Obj.t option

Attribute support_: get value as an option.

val support_vectors_ : t -> [> `ArrayLike ] Np.Obj.t

Attribute support_vectors_: get value or raise Not_found if None.

val support_vectors_opt : t -> [> `ArrayLike ] Np.Obj.t option

Attribute support_vectors_: get value as an option.

val dual_coef_ : t -> [> `ArrayLike ] Np.Obj.t

Attribute dual_coef_: get value or raise Not_found if None.

val dual_coef_opt : t -> [> `ArrayLike ] Np.Obj.t option

Attribute dual_coef_: get value as an option.

val coef_ : t -> [> `ArrayLike ] Np.Obj.t

Attribute coef_: get value or raise Not_found if None.

val coef_opt : t -> [> `ArrayLike ] Np.Obj.t option

Attribute coef_: get value as an option.

val intercept_ : t -> [> `ArrayLike ] Np.Obj.t

Attribute intercept_: get value or raise Not_found if None.

val intercept_opt : t -> [> `ArrayLike ] Np.Obj.t option

Attribute intercept_: get value as an option.
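
The paired getters above come in a raising and an option-returning form. A short sketch (same assumptions as above); since `coef_` is only defined for a linear kernel, the `_opt` variant is the safer choice there:

    let describe_fit regr =
      let open Sklearn.Svm in
      let _support_vectors = NuSVR.support_vectors_ regr in  (* raises Not_found if unset *)
      match NuSVR.coef_opt regr with
      | Some _ -> print_endline "coef_ available (linear kernel)"
      | None -> print_endline "coef_ not available for this kernel"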

val to_string : t -> string

Return a human-readable string representation of the object.

val show : t -> string

Return a human-readable string representation of the object.

val pp : Stdlib.Format.formatter -> t -> unit

Pretty-print the object to a formatter.
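
A short sketch of the printers (assumptions as above): `show`/`to_string` return a string, while `pp` plugs into Format's `%a`:

    let print_model () =
      let open Sklearn.Svm in
      let regr = NuSVR.create ~nu:0.25 () in
      print_endline (NuSVR.show regr);
      Format.printf "%a@." NuSVR.pp regr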