Sparse coding

This node has been automatically generated by wrapping the
``sklearn.decomposition.dict_learning.SparseCoder`` class from the
``sklearn`` library. The wrapped instance can be accessed through the
``scikits_alg`` attribute.

Finds a sparse representation of data against a fixed, precomputed
dictionary. Each row of the result is the solution to a sparse coding
problem. The goal is to find a sparse array `code` such that::

    X ~= code * dictionary

Read more in the :ref:`User Guide <SparseCoder>`.

**Parameters**

dictionary : array, [n_components, n_features]
    The dictionary atoms used for sparse coding. Lines are assumed to
    be normalized to unit norm.

transform_algorithm : {'lasso_lars', 'lasso_cd', 'lars', 'omp', 'threshold'}
    Algorithm used to transform the data:

    - lars: uses the least angle regression method (linear_model.lars_path)
    - lasso_lars: uses Lars to compute the Lasso solution
    - lasso_cd: uses the coordinate descent method to compute the Lasso
      solution (linear_model.Lasso). lasso_lars will be faster if the
      estimated components are sparse.
    - omp: uses orthogonal matching pursuit to estimate the sparse solution
    - threshold: squashes to zero all coefficients less than alpha from
      the projection ``dictionary * X'``

transform_n_nonzero_coefs : int, ``0.1 * n_features`` by default
    Number of nonzero coefficients to target in each column of the
    solution. This is only used by `algorithm='lars'` and
    `algorithm='omp'` and is overridden by `alpha` in the `omp` case.

transform_alpha : float, 1. by default
    If `algorithm='lasso_lars'` or `algorithm='lasso_cd'`, `alpha` is
    the penalty applied to the L1 norm. If `algorithm='threshold'`,
    `alpha` is the absolute value of the threshold below which
    coefficients will be squashed to zero. If `algorithm='omp'`,
    `alpha` is the tolerance parameter: the value of the reconstruction
    error targeted. In this case, it overrides `n_nonzero_coefs`.

split_sign : bool, False by default
    Whether to split the sparse feature vector into the concatenation
    of its negative part and its positive part. This can improve the
    performance of downstream classifiers.

n_jobs : int
    Number of parallel jobs to run.

**Attributes**

``components_`` : array, [n_components, n_features]
    The unchanged dictionary atoms.

**See also**

DictionaryLearning, MiniBatchDictionaryLearning, SparsePCA,
MiniBatchSparsePCA, sparse_encode
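As a minimal sketch of the wrapped class in action, using the underlying
``sklearn`` ``SparseCoder`` directly (the dictionary below is a random,
unit-normalized one invented purely for illustration):

```python
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.RandomState(0)

# Hypothetical dictionary: 8 atoms in a 5-dimensional space, each row
# (atom) normalized to unit norm, as the docstring requires.
dictionary = rng.randn(8, 5)
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

# Three data points built as combinations of the atoms.
X = rng.randn(3, 8).dot(dictionary)

# Orthogonal matching pursuit, targeting at most 3 nonzero
# coefficients per sample.
coder = SparseCoder(dictionary=dictionary,
                    transform_algorithm='omp',
                    transform_n_nonzero_coefs=3)
code = coder.transform(X)

print(code.shape)                # (3, 8): one coefficient per atom
print((code != 0).sum(axis=1))   # at most 3 nonzeros per row
```

Each row of ``code`` is the sparse representation of the corresponding
row of ``X``, so ``code @ dictionary`` approximates ``X``.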
**Inherited from Node**

- ``_train_seq``: List of tuples
- ``dtype``: dtype
- ``input_dim``: Input dimensions
- ``output_dim``: Output dimensions
- ``supported_dtypes``: Supported dtypes
Encode the data as a sparse combination of the dictionary atoms.

This node has been automatically generated by wrapping the
``sklearn.decomposition.dict_learning.SparseCoder`` class from the
``sklearn`` library. The wrapped instance can be accessed through the
``scikits_alg`` attribute.

The coding method is determined by the object parameter
``transform_algorithm``.

**Parameters**

X : array, shape (n_samples, n_features)
    The data to encode.

**Returns**

code : array, shape (n_samples, n_components)
    The sparse codes, one row per sample.
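The ``split_sign`` option described above changes the shape of the
encoded output. A small sketch, again calling the underlying ``sklearn``
class directly with an invented random dictionary:

```python
import numpy as np
from sklearn.decomposition import SparseCoder

rng = np.random.RandomState(0)
dictionary = rng.randn(6, 4)
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
X = rng.randn(2, 4)

plain = SparseCoder(dictionary=dictionary,
                    transform_algorithm='lasso_lars',
                    transform_alpha=0.1)
split = SparseCoder(dictionary=dictionary,
                    transform_algorithm='lasso_lars',
                    transform_alpha=0.1,
                    split_sign=True)

code = plain.transform(X)
split_code = split.transform(X)

# split_sign doubles the width: the positive parts come first, then
# the magnitudes of the negative parts, so every entry is nonnegative.
print(code.shape)               # (2, 6)
print(split_code.shape)         # (2, 12)
print((split_code >= 0).all())  # True
```

The nonnegative, doubled-width representation is what can help the
downstream classifiers mentioned in the parameter description.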
Do nothing and return the estimator unchanged.

This node has been automatically generated by wrapping the
``sklearn.decomposition.dict_learning.SparseCoder`` class from the
``sklearn`` library. The wrapped instance can be accessed through the
``scikits_alg`` attribute.

This method is just there to implement the usual API and hence work in
pipelines.
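A short sketch of why this no-op matters, using the underlying
``sklearn`` class directly (the identity dictionary here is an
illustrative assumption that makes the projection ``dictionary * X'``
equal to ``X`` itself):

```python
import numpy as np
from sklearn.decomposition import SparseCoder

# Trivial orthonormal dictionary: the identity in 4 dimensions.
dictionary = np.eye(4)
coder = SparseCoder(dictionary=dictionary,
                    transform_algorithm='threshold',
                    transform_alpha=0.5)

# fit is a no-op that returns the estimator unchanged, which is what
# lets a stateless SparseCoder participate in a Pipeline.
assert coder.fit(np.zeros((2, 4))) is coder

X = np.array([[0.9, 0.3, -0.7, 0.1]])
code = coder.transform(X)
# Coefficients with magnitude below alpha=0.5 are squashed to zero.
print(code[0, 1] == 0 and code[0, 3] == 0)  # True
```

Because the coder has no state to learn, fitting on any data (even
zeros, as above) changes nothing about how it encodes.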
Generated by Epydoc 3.0.1 on Tue Mar 8 12:39:48 2016 (http://epydoc.sourceforge.net)