Classifier using Ridge regression.

This node has been automatically generated by wrapping the ``sklearn.linear_model.ridge.RidgeClassifier`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute. Read more in the :ref:`User Guide <ridge_regression>`.

**Parameters**

alpha : float
    Small positive values of alpha improve the conditioning of the problem and reduce the variance of the estimates. Alpha corresponds to ``C^-1`` in other linear models such as LogisticRegression or LinearSVC.

class_weight : dict or 'balanced', optional
    Weights associated with classes in the form ``{class_label: weight}``. If not given, all classes are supposed to have weight one. The "balanced" mode uses the values of y to automatically adjust weights inversely proportional to class frequencies in the input data as ``n_samples / (n_classes * np.bincount(y))``.

copy_X : boolean, optional, default True
    If True, X will be copied; else, it may be overwritten.

fit_intercept : boolean
    Whether to calculate the intercept for this model. If set to false, no intercept will be used in calculations (e.g. data is expected to be already centered).

max_iter : int, optional
    Maximum number of iterations for the conjugate gradient solver. The default value is determined by scipy.sparse.linalg.

normalize : boolean, optional, default False
    If True, the regressors X will be normalized before regression.

solver : {'auto', 'svd', 'cholesky', 'lsqr', 'sparse_cg', 'sag'}
    Solver to use in the computational routines:

    - 'auto' chooses the solver automatically based on the type of data.
    - 'svd' uses a Singular Value Decomposition of X to compute the Ridge coefficients. More stable for singular matrices than 'cholesky'.
    - 'cholesky' uses the standard scipy.linalg.solve function to obtain a closed-form solution.
    - 'sparse_cg' uses the conjugate gradient solver as found in scipy.sparse.linalg.cg. As an iterative algorithm, this solver is more appropriate than 'cholesky' for large-scale data (possibility to set ``tol`` and ``max_iter``).
    - 'lsqr' uses the dedicated regularized least-squares routine scipy.sparse.linalg.lsqr. It is the fastest but may not be available in old scipy versions. It also uses an iterative procedure.
    - 'sag' uses Stochastic Average Gradient descent. It also uses an iterative procedure, and is faster than other solvers when both n_samples and n_features are large.

    .. versionadded:: 0.17
       Stochastic Average Gradient descent solver.

tol : float
    Precision of the solution.

random_state : int seed, RandomState instance, or None (default)
    The seed of the pseudo random number generator to use when shuffling the data. Used in the 'sag' solver.

**Attributes**

``coef_`` : array, shape (n_features,) or (n_classes, n_features)
    Weight vector(s).

``intercept_`` : float | array, shape = (n_targets,)
    Independent term in the decision function. Set to 0.0 if ``fit_intercept = False``.

``n_iter_`` : array or None, shape (n_targets,)
    Actual number of iterations for each target. Available only for the 'sag' and 'lsqr' solvers. Other solvers will return None.

**See also**

Ridge, RidgeClassifierCV

**Notes**

For multi-class classification, n_class classifiers are trained in a one-versus-all approach. Concretely, this is implemented by taking advantage of the multi-variate response support in Ridge.
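A minimal usage sketch follows. It is hedged: it assumes the node is exposed as ``mdp.nodes.RidgeClassifierScikitsLearnNode`` and follows the standard MDP classifier interface (``train``, ``stop_training``, ``label``), with constructor keywords forwarded to the wrapped ``RidgeClassifier``::

    import numpy as np
    import mdp

    rng = np.random.RandomState(0)
    x_train = rng.randn(100, 5)                 # 100 observations, 5 features
    labels = (x_train[:, 0] > 0).astype(int)    # toy binary labels

    # Constructor keywords are forwarded to sklearn's RidgeClassifier.
    node = mdp.nodes.RidgeClassifierScikitsLearnNode(alpha=1.0, solver='auto')
    node.train(x_train, labels)
    node.stop_training()                        # fits the wrapped estimator

    x_test = rng.randn(10, 5)
    print(node.label(x_test))                   # predicted class labels
    print(node.scikits_alg.coef_.shape)         # the fitted sklearn object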
**Methods**

The bulk of this node's interface is inherited from ClassifierCumulator, ClassifierNode, and Node; only the methods defined or overridden by this node are documented in detail below.
**Properties**

Inherited from Node:

- ``_train_seq``: List of tuples
- ``dtype``: dtype
- ``input_dim``: Input dimensions
- ``output_dim``: Output dimensions
- ``supported_dtypes``: Supported dtypes
Transform the data and labels lists to array objects and reshape them.
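As a purely illustrative sketch (the variable names below are not the node's internals), this cumulation step amounts to stacking the per-chunk data and label lists gathered during training into single, consistently shaped arrays::

    import numpy as np

    # Illustrative only: chunks collected over several train() calls.
    data_chunks = [np.random.randn(20, 5), np.random.randn(30, 5)]
    label_chunks = [np.zeros(20), np.ones(30)]

    data = np.concatenate(data_chunks, axis=0)          # shape (50, 5)
    labels = np.concatenate(label_chunks).reshape(-1)   # shape (50,)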
Predict class labels for samples in X.

This node has been automatically generated by wrapping the ``sklearn.linear_model.ridge.RidgeClassifier`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

**Parameters**

X : {array-like, sparse matrix}, shape = [n_samples, n_features]
    Samples.

**Returns**

C : array, shape = [n_samples]
    Predicted class label per sample.
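For reference, scikit-learn's ``RidgeClassifier`` obtains labels from the ridge decision scores: in the one-versus-all setting there is one score per class, and the class with the highest score wins. The sketch below reproduces that reduction from the public ``coef_``, ``intercept_`` and ``classes_`` attributes; it mirrors, but is not, the library's internal code path::

    import numpy as np
    from sklearn.linear_model import RidgeClassifier

    X = np.random.RandomState(0).randn(60, 4)
    y = np.arange(60) % 3                      # three classes: 0, 1, 2

    clf = RidgeClassifier(alpha=1.0).fit(X, y)

    # One score per class; the class with the largest score is predicted.
    scores = X @ clf.coef_.T + clf.intercept_  # shape (60, 3)
    manual = clf.classes_[np.argmax(scores, axis=1)]
    assert np.array_equal(manual, clf.predict(X))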
Fit Ridge regression model.

This node has been automatically generated by wrapping the ``sklearn.linear_model.ridge.RidgeClassifier`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

**Parameters**

X : {array-like, sparse matrix}, shape = [n_samples, n_features]
    Training data.

y : array-like, shape = [n_samples]
    Target values.

sample_weight : float or numpy array of shape (n_samples,)
    Sample weight.

    .. versionadded:: 0.17
       *sample_weight* support to Classifier.

**Returns**

self : returns an instance of self.
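As a hedged illustration of the ``sample_weight`` support noted above, using the scikit-learn estimator directly rather than through the MDP wrapper::

    import numpy as np
    from sklearn.linear_model import RidgeClassifier

    rng = np.random.RandomState(0)
    X = rng.randn(80, 3)
    y = (X[:, 0] > 0).astype(int)

    # Up-weight the positive class, e.g. to counter class imbalance.
    weights = np.where(y == 1, 2.0, 1.0)

    clf = RidgeClassifier(alpha=0.5, fit_intercept=True)
    clf.fit(X, y, sample_weight=weights)   # per-sample weights (scikit-learn >= 0.17)
    print(clf.score(X, y))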