An extremely randomized tree regressor. This node has been automatically generated by wrapping the ``sklearn.tree.tree.ExtraTreeRegressor`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the `max_features` randomly selected features and the best split among those is chosen. When `max_features` is set to 1, this amounts to building a totally random decision tree.

Warning: Extra-trees should only be used within ensemble methods.

Read more in the :ref:`User Guide <tree>`.

See also: ExtraTreeClassifier, ExtraTreesClassifier, ExtraTreesRegressor

**References**

.. [1] P. Geurts, D. Ernst, and L. Wehenkel, "Extremely randomized trees", Machine Learning, 63(1), 3-42, 2006.
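For example, the splitting behaviour described above can be exercised through the underlying ``sklearn`` class directly; per the warning, the tree is used inside an ensemble here. This is a minimal illustrative sketch, not part of this node's interface: the toy data and hyper-parameter values are assumptions, and newer ``sklearn`` releases rename ``base_estimator`` to ``estimator``::

    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import ExtraTreeRegressor

    # Toy regression problem (assumed): y is a noisy sum of the input features.
    rng = np.random.RandomState(0)
    X = rng.rand(200, 5)
    y = X.sum(axis=1) + 0.1 * rng.randn(200)

    # max_features=1 draws one random split candidate per node,
    # i.e. a totally random decision tree.
    base = ExtraTreeRegressor(max_features=1, random_state=0)

    # Extra-trees are meant to be used inside an ensemble method.
    model = BaggingRegressor(base_estimator=base, n_estimators=50, random_state=0)
    model.fit(X, y)
    print(model.predict(X[:3]))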
Instance Methods: inherited from Cumulator and Node.
Instance Variables inherited from Node:
    _train_seq : List of tuples
    dtype : dtype
    input_dim : Input dimensions
    output_dim : Output dimensions
    supported_dtypes : Supported dtypes
DEPRECATED: Support to use estimators as feature selectors will be removed in version 0.19. Use SelectFromModel instead.

This node has been automatically generated by wrapping the ``sklearn.tree.tree.ExtraTreeRegressor`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

Reduce X to its most important features. Uses ``coef_`` or ``feature_importances_`` to determine the most important features. For models with a ``coef_`` for each class, the absolute sum over the classes is used.

Parameters
----------
X : array or scipy sparse matrix of shape [n_samples, n_features]
    The input samples.

threshold : string, float or None, optional (default=None)
    The threshold value to use for feature selection. Features whose importance is greater or equal are kept while the others are discarded. If "median" (resp. "mean"), then the threshold value is the median (resp. the mean) of the feature importances. A scaling factor (e.g., "1.25*mean") may also be used. If None and if available, the object attribute ``threshold`` is used. Otherwise, "mean" is used by default.

Returns
-------
X_r : array of shape [n_samples, n_selected_features]
    The input samples with only the selected features.
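Since this feature-selection path is deprecated, here is a minimal sketch of the ``SelectFromModel`` replacement suggested above; the toy data and the chosen threshold are illustrative assumptions::

    import numpy as np
    from sklearn.feature_selection import SelectFromModel
    from sklearn.tree import ExtraTreeRegressor

    # Assumed toy data: only the first two features carry signal.
    rng = np.random.RandomState(0)
    X = rng.rand(100, 10)
    y = X[:, 0] + 2.0 * X[:, 1] + 0.05 * rng.randn(100)

    # Fit the tree, then keep features whose importance is >= 1.25 * mean
    # importance (the scaled-mean threshold form described above).
    tree = ExtraTreeRegressor(random_state=0).fit(X, y)
    selector = SelectFromModel(tree, threshold="1.25*mean", prefit=True)
    X_r = selector.transform(X)
    print(X.shape, "->", X_r.shape)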
Build a decision tree from the training set (X, y). This node has been automatically generated by wrapping the ``sklearn.tree.tree.ExtraTreeRegressor`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

Parameters
----------
X : array-like or sparse matrix of shape [n_samples, n_features]
    The training input samples.

y : array-like of shape [n_samples] or [n_samples, n_outputs]
    The target values (real numbers).

Returns
-------
self : object
    Returns self.
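A small usage sketch of the wrapped fit call, working on the underlying estimator directly (the node's own training interface, typically MDP's ``train``/``stop_training``, is not shown); the toy arrays below are assumptions::

    import numpy as np
    from sklearn.tree import ExtraTreeRegressor

    # Assumed toy data: noisy sine of the first feature.
    rng = np.random.RandomState(0)
    X = rng.rand(80, 3)                          # training input samples
    y = np.sin(X[:, 0]) + 0.1 * rng.randn(80)    # real-valued targets

    tree = ExtraTreeRegressor(max_features=1, random_state=0)
    tree.fit(X, y)                               # build the tree from (X, y)
    print(tree.predict(X[:5]))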