|||
ARDRegressionScikitsLearnNode Bayesian ARD regression. |
|||
AdaBoostClassifierScikitsLearnNode An AdaBoost classifier. |
|||
AdaBoostRegressorScikitsLearnNode An AdaBoost regressor. |
|||
AdaptiveCutoffNode Node which uses the data history during training to learn cutoff values. |
|||
AdditiveChi2SamplerScikitsLearnNode Approximate feature map for additive chi2 kernel. |
|||
AffinityPropagationScikitsLearnNode Perform Affinity Propagation Clustering of data. |
|||
BaggingClassifierScikitsLearnNode A Bagging classifier. |
|||
BaggingRegressorScikitsLearnNode A Bagging regressor. |
|||
BayesianRidgeScikitsLearnNode Bayesian ridge regression |
|||
BernoulliNBScikitsLearnNode Naive Bayes classifier for multivariate Bernoulli models. |
|||
BernoulliRBMScikitsLearnNode Bernoulli Restricted Boltzmann Machine (RBM). |
|||
BinarizerScikitsLearnNode Binarize data (set feature values to 0 or 1) according to a threshold |
|||
BirchScikitsLearnNode Implements the Birch clustering algorithm. |
|||
CCAScikitsLearnNode CCA Canonical Correlation Analysis. |
|||
CalibratedClassifierCVScikitsLearnNode Probability calibration with isotonic regression or sigmoid. |
|||
Convolution2DNode Convolve input data with filter banks. |
|||
CountVectorizerScikitsLearnNode Convert a collection of text documents to a matrix of token counts |
|||
CuBICANode Perform Independent Component Analysis using the CuBICA algorithm. Note that CuBICA is a batch algorithm: it needs all input data before it can compute the ICs. The algorithm is provided as a Node for convenience, but it accumulates all the inputs it receives; keep this in mind to avoid running out of memory when you have many components and many time samples. (Usage sketch below.) |
|||
CutoffNode Node to cut off values at specified bounds. |
|||
DPGMMScikitsLearnNode Variational Inference for the Infinite Gaussian Mixture Model. |
|||
DecisionTreeClassifierScikitsLearnNode A decision tree classifier. |
|||
DecisionTreeRegressorScikitsLearnNode A decision tree regressor. |
|||
DictVectorizerScikitsLearnNode Transforms lists of feature-value mappings to vectors. |
|||
DictionaryLearningScikitsLearnNode Dictionary learning |
|||
DiscreteHopfieldClassifier Node for simulating a simple discrete Hopfield model |
|||
ElasticNetCVScikitsLearnNode Elastic Net model with iterative fitting along a regularization path |
|||
ElasticNetScikitsLearnNode Linear regression with combined L1 and L2 priors as regularizer. |
|||
EtaComputerNode Compute the eta values of the normalized training data. |
|||
ExtraTreeClassifierScikitsLearnNode An extremely randomized tree classifier. |
|||
ExtraTreeRegressorScikitsLearnNode An extremely randomized tree regressor. |
|||
ExtraTreesClassifierScikitsLearnNode An extra-trees classifier. |
|||
ExtraTreesRegressorScikitsLearnNode An extra-trees regressor. |
|||
FANode Perform Factor Analysis. |
|||
FDANode Perform a (generalized) Fisher Discriminant Analysis of its input. It is a supervised node that implements FDA using a generalized eigenvalue approach. |
|||
FactorAnalysisScikitsLearnNode Factor Analysis (FA) |
|||
FastICANode Perform Independent Component Analysis using the FastICA algorithm. Note that FastICA is a batch algorithm: it needs all input data before it can compute the ICs. The algorithm is provided as a Node for convenience, but it accumulates all the inputs it receives; keep this in mind to avoid running out of memory when you have many components and many time samples. |
|||
FastICAScikitsLearnNode FastICA: a fast algorithm for Independent Component Analysis. |
|||
FeatureAgglomerationScikitsLearnNode Agglomerate features. |
|||
FeatureHasherScikitsLearnNode Implements feature hashing, aka the hashing trick. |
|||
ForestRegressorScikitsLearnNode Base class for forest of trees-based regressors. |
|||
FunctionTransformerScikitsLearnNode Constructs a transformer from an arbitrary callable. |
|||
GMMScikitsLearnNode Gaussian Mixture Model |
|||
GaussianClassifier Perform a supervised Gaussian classification. |
|||
GaussianNBScikitsLearnNode Gaussian Naive Bayes (GaussianNB) |
|||
GaussianProcessScikitsLearnNode The Gaussian Process model class. |
|||
GaussianRandomProjectionHashScikitsLearnNode This node has been automatically generated by wrapping the sklearn.neighbors.approximate.GaussianRandomProjectionHash class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
GaussianRandomProjectionScikitsLearnNode Reduce dimensionality through Gaussian random projection |
|||
GeneralExpansionNode Expands the input samples by applying one or more provided functions to them. |
|||
GenericUnivariateSelectScikitsLearnNode Univariate feature selector with configurable strategy. |
|||
GradientBoostingClassifierScikitsLearnNode Gradient Boosting for classification. |
|||
GradientBoostingRegressorScikitsLearnNode Gradient Boosting for regression. |
|||
GridSearchCVScikitsLearnNode Exhaustive search over specified parameter values for an estimator. |
|||
GrowingNeuralGasExpansionNode Perform a trainable radial basis expansion, where the centers and sizes of the basis functions are learned through a growing neural gas. |
|||
GrowingNeuralGasNode Learn the topological structure of the input data by building a corresponding graph approximation. |
|||
HLLENode Perform a Hessian Locally Linear Embedding analysis on the data. |
|||
HashingVectorizerScikitsLearnNode Convert a collection of text documents to a matrix of token occurrences |
|||
HistogramNode Node which stores a history of the data during its training phase. |
|||
HitParadeNode Collect the first n local maxima and minima of the training signal which are separated by a minimum gap d. (Usage sketch below.) |
|||
ICANode ICANode is a general class to handle different batch-mode algorithms for Independent Component Analysis. More information about ICA can be found, among others, in Hyvarinen A., Karhunen J., Oja E. (2001), Independent Component Analysis, Wiley. |
|||
ISFANode Perform Independent Slow Feature Analysis on the input data. |
|||
IdentityNode Execute returns the input data unchanged; the node is not trainable. |
|||
ImputerScikitsLearnNode Imputation transformer for completing missing values. |
|||
IncrementalPCAScikitsLearnNode Incremental principal components analysis (IPCA). |
|||
IsomapScikitsLearnNode Isomap Embedding |
|||
IsotonicRegressionScikitsLearnNode Isotonic regression model. |
|||
JADENode Perform Independent Component Analysis using the JADE algorithm. Note that JADE is a batch-algorithm. This means that it needs all input data before it can start and compute the ICs. The algorithm is here given as a Node for convenience, but it actually accumulates all inputs it receives. Remember that to avoid running out of memory when you have many components and many time samples. |
|||
KMeansClassifier Employs K-Means Clustering for a given number of centroids. |
|||
KMeansScikitsLearnNode K-Means clustering |
|||
KNNClassifier K-Nearest-Neighbour Classifier. |
|||
KNeighborsClassifierScikitsLearnNode Classifier implementing the k-nearest neighbors vote. |
|||
KNeighborsRegressorScikitsLearnNode Regression based on k-nearest neighbors. |
|||
KernelCentererScikitsLearnNode Center a kernel matrix |
|||
KernelPCAScikitsLearnNode Kernel Principal component analysis (KPCA) |
|||
KernelRidgeScikitsLearnNode Kernel ridge regression. |
|||
LLENode Perform a Locally Linear Embedding analysis on the data. |
|||
LabelBinarizerScikitsLearnNode Binarize labels in a one-vs-all fashion |
|||
LabelEncoderScikitsLearnNode Encode labels with value between 0 and n_classes-1. |
|||
LabelPropagationScikitsLearnNode Label Propagation classifier |
|||
LabelSpreadingScikitsLearnNode LabelSpreading model for semi-supervised learning |
|||
LarsCVScikitsLearnNode Cross-validated Least Angle Regression model |
|||
LarsScikitsLearnNode Least Angle Regression model a.k.a. LAR. |
|||
LassoCVScikitsLearnNode Lasso linear model with iterative fitting along a regularization path |
|||
LassoLarsCVScikitsLearnNode Cross-validated Lasso, using the LARS algorithm |
|||
LassoLarsICScikitsLearnNode Lasso model fit with Lars using BIC or AIC for model selection |
|||
LassoLarsScikitsLearnNode Lasso model fit with Least Angle Regression a.k.a. Lars. |
|||
LassoScikitsLearnNode Linear Model trained with L1 prior as regularizer (aka the Lasso) |
|||
LatentDirichletAllocationScikitsLearnNode Latent Dirichlet Allocation with online variational Bayes algorithm |
|||
LibSVMClassifier The LibSVMClassifier class acts as a wrapper around the LibSVM library for support vector machines. |
|||
LinearDiscriminantAnalysisScikitsLearnNode Linear Discriminant Analysis |
|||
LinearModelCVScikitsLearnNode This node has been automatically generated by wrapping the sklearn.linear_model.coordinate_descent.LinearModelCV class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
LinearRegressionNode Compute least-squares, multivariate linear regression on the input data, i.e., learn coefficients b_j so that y_i = b_0 + b_1 x_i1 + ... + b_N x_iN minimizes the squared error over the training samples. (Usage sketch below.) |
|||
LinearRegressionScikitsLearnNode Ordinary least squares Linear Regression. |
|||
LinearSVCScikitsLearnNode Linear Support Vector Classification. |
|||
LinearSVRScikitsLearnNode Linear Support Vector Regression. |
|||
LocallyLinearEmbeddingScikitsLearnNode Locally Linear Embedding |
|||
LogOddsEstimatorScikitsLearnNode This node has been automatically generated by wrapping the sklearn.ensemble.gradient_boosting.LogOddsEstimator class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
LogisticRegressionCVScikitsLearnNode Logistic Regression CV (aka logit, MaxEnt) classifier. |
|||
LogisticRegressionScikitsLearnNode Logistic Regression (aka logit, MaxEnt) classifier. |
|||
MaxAbsScalerScikitsLearnNode Scale each feature by its maximum absolute value. |
|||
MeanEstimatorScikitsLearnNode This node has been automatically generated by wrapping the sklearn.ensemble.gradient_boosting.MeanEstimator class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
MeanShiftScikitsLearnNode Mean shift clustering using a flat kernel. |
|||
MinMaxScalerScikitsLearnNode Transforms features by scaling each feature to a given range. |
|||
MiniBatchDictionaryLearningScikitsLearnNode Mini-batch dictionary learning |
|||
MiniBatchKMeansScikitsLearnNode Mini-Batch K-Means clustering |
|||
MiniBatchSparsePCAScikitsLearnNode Mini-batch Sparse Principal Components Analysis |
|||
MultiLabelBinarizerScikitsLearnNode Transform between iterable of iterables and a multilabel format |
|||
MultiTaskElasticNetCVScikitsLearnNode Multi-task L1/L2 ElasticNet with built-in cross-validation. |
|||
MultiTaskElasticNetScikitsLearnNode Multi-task ElasticNet model trained with L1/L2 mixed-norm as regularizer |
|||
MultiTaskLassoCVScikitsLearnNode Multi-task L1/L2 Lasso with built-in cross-validation. |
|||
MultiTaskLassoScikitsLearnNode Multi-task Lasso model trained with L1/L2 mixed-norm as regularizer |
|||
MultinomialNBScikitsLearnNode Naive Bayes classifier for multinomial models |
|||
NIPALSNode Perform Principal Component Analysis using the NIPALS algorithm. This algorithm is particularly useful if you have more variables than observations, or in general when the number of variables is huge and calculating a full covariance matrix may be infeasible. It is also more efficient than the standard PCANode if you expect the number of significant principal components to be small. In that case, setting output_dim to a certain fraction of the total variance, say 90%, may help. (Usage sketch below.) |
|||
NMFScikitsLearnNode Non-Negative Matrix Factorization (NMF) |
|||
NearestCentroidScikitsLearnNode Nearest centroid classifier. |
|||
NearestMeanClassifier Nearest-Mean classifier. |
|||
NeuralGasNode Learn the topological structure of the input data by building a corresponding graph approximation (original Neural Gas algorithm). |
|||
NoiseNode Inject multiplicative or additive noise into the input data. |
|||
NormalNoiseNode Special version of NoiseNode for Gaussian additive noise. |
|||
NormalizeNode Make the input signal mean-free and unit variance. |
|||
NormalizerScikitsLearnNode Normalize samples individually to unit norm. |
|||
NuSVCScikitsLearnNode Nu-Support Vector Classification. |
|||
NuSVRScikitsLearnNode Nu Support Vector Regression. |
|||
NystroemScikitsLearnNode Approximate a kernel map using a subset of the training data. |
|||
OneClassSVMScikitsLearnNode Unsupervised Outlier Detection. |
|||
OneHotEncoderScikitsLearnNode Encode categorical integer features using a one-hot aka one-of-K scheme. |
|||
OneVsOneClassifierScikitsLearnNode One-vs-one multiclass strategy |
|||
OneVsRestClassifierScikitsLearnNode One-vs-the-rest (OvR) multiclass/multilabel strategy |
|||
OrthogonalMatchingPursuitCVScikitsLearnNode Cross-validated Orthogonal Matching Pursuit model (OMP) |
|||
OrthogonalMatchingPursuitScikitsLearnNode Orthogonal Matching Pursuit model (OMP) |
|||
OutputCodeClassifierScikitsLearnNode (Error-Correcting) Output-Code multiclass strategy |
|||
PCANode Filter the input data through the most significant of its principal components. (Usage sketch below.) |
|||
PCAScikitsLearnNode Principal component analysis (PCA) (a generic usage sketch for the ScikitsLearnNode wrappers appears below) |
|||
PLSCanonicalScikitsLearnNode PLSCanonical implements the 2 blocks canonical PLS of the original Wold algorithm [Tenenhaus 1998] p.204, referred to as PLS-C2A in [Wegelin 2000]. |
|||
PLSRegressionScikitsLearnNode PLS regression |
|||
PLSSVDScikitsLearnNode Partial Least Square SVD |
|||
PassiveAggressiveClassifierScikitsLearnNode Passive Aggressive Classifier |
|||
PassiveAggressiveRegressorScikitsLearnNode Passive Aggressive Regressor |
|||
PatchExtractorScikitsLearnNode Extracts patches from a collection of images |
|||
PerceptronClassifier A simple perceptron with input_dim input nodes. |
|||
PerceptronScikitsLearnNode Perceptron |
|||
PolynomialExpansionNode Perform expansion in a polynomial space. |
|||
PolynomialFeaturesScikitsLearnNode Generate polynomial and interaction features. |
|||
PriorProbabilityEstimatorScikitsLearnNode An estimator predicting the probability of each class. |
|||
ProjectedGradientNMFScikitsLearnNode Non-Negative Matrix Factorization (NMF) |
|||
QuadraticDiscriminantAnalysisScikitsLearnNode Quadratic Discriminant Analysis |
|||
QuadraticExpansionNode Perform expansion in the space formed by all linear and quadratic monomials. QuadraticExpansionNode() is equivalent to PolynomialExpansionNode(2). (Usage sketch below.) |
|||
QuantileEstimatorScikitsLearnNode This node has been automatically generated by wrapping the sklearn.ensemble.gradient_boosting.QuantileEstimator class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
RANSACRegressorScikitsLearnNode RANSAC (RANdom SAmple Consensus) algorithm. |
|||
RBFExpansionNode Expand input space with Gaussian Radial Basis Functions (RBFs). |
|||
RBFSamplerScikitsLearnNode Approximates feature map of an RBF kernel by Monte Carlo approximation of its Fourier transform. |
|||
RBMNode Restricted Boltzmann Machine node. An RBM is an undirected probabilistic network with binary variables whose graph is bipartite, with observed (visible) variables on one side and hidden (latent) variables on the other. (Usage sketch below.) |
|||
RBMWithLabelsNode Restricted Boltzmann Machine with softmax labels. An RBM is an undirected probabilistic network with binary variables. In this case, the node is partitioned into a set of observed (visible) variables, a set of hidden (latent) variables, and a set of label variables (also observed), only one of which is active at any time. The node is able to learn associations between the visible variables and the labels. |
|||
RFECVScikitsLearnNode Feature ranking with recursive feature elimination and cross-validated selection of the best number of features. |
|||
RFEScikitsLearnNode Feature ranking with recursive feature elimination. |
|||
RadiusNeighborsClassifierScikitsLearnNode Classifier implementing a vote among neighbors within a given radius |
|||
RadiusNeighborsRegressorScikitsLearnNode Regression based on neighbors within a fixed radius. |
|||
RandomForestClassifierScikitsLearnNode A random forest classifier. |
|||
RandomForestRegressorScikitsLearnNode A random forest regressor. |
|||
RandomTreesEmbeddingScikitsLearnNode An ensemble of totally random trees. |
|||
RandomizedLassoScikitsLearnNode Randomized Lasso. |
|||
RandomizedLogisticRegressionScikitsLearnNode Randomized Logistic Regression |
|||
RandomizedPCAScikitsLearnNode Principal component analysis (PCA) using randomized SVD |
|||
RandomizedSearchCVScikitsLearnNode Randomized search on hyper parameters. |
|||
RidgeCVScikitsLearnNode Ridge regression with built-in cross-validation. |
|||
RidgeClassifierCVScikitsLearnNode Ridge classifier with built-in cross-validation. |
|||
RidgeClassifierScikitsLearnNode Classifier using Ridge regression. |
|||
RidgeScikitsLearnNode Linear least squares with l2 regularization. |
|||
RobustScalerScikitsLearnNode Scale features using statistics that are robust to outliers. |
|||
SFA2Node Get an input signal, expand it in the space of inhomogeneous polynomials of degree 2 and extract its slowly varying components. The get_quadratic_form method returns the input-output function of one of the learned units as a QuadraticForm object. See the documentation of mdp.utils.QuadraticForm for additional information. |
|||
SFANode Extract the slowly varying components from the input data. More information about Slow Feature Analysis can be found in Wiskott, L. and Sejnowski, T.J., Slow Feature Analysis: Unsupervised Learning of Invariances, Neural Computation, 14(4):715-770 (2002). (Usage sketch below.) |
|||
SGDClassifierScikitsLearnNode Linear classifiers (SVM, logistic regression, a.o.) with SGD training. |
|||
SGDRegressorScikitsLearnNode Linear model fitted by minimizing a regularized empirical loss with SGD |
|||
SVCScikitsLearnNode C-Support Vector Classification. |
|||
SVRScikitsLearnNode Epsilon-Support Vector Regression. |
|||
ScaledLogOddsEstimatorScikitsLearnNode This node has been automatically generated by wrapping the sklearn.ensemble.gradient_boosting.ScaledLogOddsEstimator class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
SelectFdrScikitsLearnNode Filter: Select the p-values for an estimated false discovery rate |
|||
SelectFprScikitsLearnNode Filter: Select the pvalues below alpha based on a FPR test. |
|||
SelectFromModelScikitsLearnNode Meta-transformer for selecting features based on importance weights. |
|||
SelectFweScikitsLearnNode Filter: Select the p-values corresponding to Family-wise error rate |
|||
SelectKBestScikitsLearnNode Select features according to the k highest scores. |
|||
SelectPercentileScikitsLearnNode Select features according to a percentile of the highest scores. |
|||
SignumClassifier This classifier node classifies a data point as 1 if the sum of its components is positive and as -1 if it is negative. |
|||
SimpleMarkovClassifier A simple version of a Markov classifier. It can be trained on a vector of tuples, with the label being the next element in the data sequence. |
|||
SkewedChi2SamplerScikitsLearnNode Approximates feature map of the "skewed chi-squared" kernel by Monte Carlo approximation of its Fourier transform. |
|||
SparseCoderScikitsLearnNode Sparse coding |
|||
SparsePCAScikitsLearnNode Sparse Principal Components Analysis (SparsePCA) |
|||
SparseRandomProjectionScikitsLearnNode Reduce dimensionality through sparse random projection |
|||
StandardScalerScikitsLearnNode Standardize features by removing the mean and scaling to unit variance |
|||
TDSEPNode Perform Independent Component Analysis using the TDSEP algorithm. Note that TDSEP, as implemented in this Node, is an online algorithm, i.e. it is suited to being trained on huge data sets, provided that training is done by sending small chunks of data at a time. |
|||
TfidfTransformerScikitsLearnNode Transform a count matrix to a normalized tf or tf-idf representation |
|||
TfidfVectorizerScikitsLearnNode Convert a collection of raw documents to a matrix of TF-IDF features. |
|||
TheilSenRegressorScikitsLearnNode Theil-Sen Estimator: robust multivariate regression model. |
|||
TimeDelayNode Copy delayed version of the input signal on the space dimensions. |
|||
TimeDelaySlidingWindowNode TimeDelaySlidingWindowNode is an alternative to TimeDelayNode which should be used for online learning/execution. Whereas TimeDelayNode works in a batch manner, online applications require a sliding window that yields only one row per call. |
|||
TimeFramesNode Copy delayed version of the input signal on the space dimensions. (Usage sketch below.) |
|||
TruncatedSVDScikitsLearnNode Dimensionality reduction using truncated SVD (aka LSA). |
|||
VBGMMScikitsLearnNode Variational Inference for the Gaussian Mixture Model |
|||
VarianceThresholdScikitsLearnNode Feature selector that removes all low-variance features. |
|||
VotingClassifierScikitsLearnNode Soft Voting/Majority Rule classifier for unfitted estimators. |
|||
WhiteningNode Whiten the input data by filtering it through the most significant of its principal components. All output signals have zero mean, unit variance and are decorrelated. (See the PCANode usage sketch below.) |
|||
XSFANode Perform Non-linear Blind Source Separation using Slow Feature Analysis. |
|||
ZeroEstimatorScikitsLearnNode This node has been automatically generated by wrapping the sklearn.ensemble.gradient_boosting.ZeroEstimator class from the sklearn library. The wrapped instance can be accessed through the scikits_alg attribute. |
|||
_OneDimensionalHitParade Class to produce hit-parades (i.e., a list of the largest and smallest values) out of a one-dimensional time-series. |
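The following usage sketches illustrate the common MDP train/execute interface for a few of the nodes listed above. They are minimal examples under stated assumptions, not definitive implementations; anything marked as assumed should be checked against the node's actual signature.

A minimal sketch for PCANode (and, with the same interface, WhiteningNode), assuming that a fractional output_dim selects enough components to explain that fraction of the variance.

    import numpy as np
    import mdp

    x = np.random.randn(500, 10)             # 500 observations, 10 variables
    pca = mdp.nodes.PCANode(output_dim=0.9)   # keep components explaining ~90% of the variance
    pca.train(x)                              # train() may be called repeatedly with chunks
    pca.stop_training()
    y = pca.execute(x)                        # projected data
    # WhiteningNode has the same interface but additionally rescales the
    # projected components to unit variance.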
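A sketch for NIPALSNode in the more-variables-than-observations regime it is designed for, using the fractional output_dim mentioned in its description above.

    import numpy as np
    import mdp

    x = np.random.randn(50, 500)                    # far more variables than observations
    nipals = mdp.nodes.NIPALSNode(output_dim=0.9)   # keep ~90% of the total variance
    nipals.train(x)
    nipals.stop_training()
    y = nipals.execute(x)                           # leading principal components only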
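A sketch for SFANode. The input mixes a slow and a fast oscillation; the particular mixture used here is an illustrative assumption.

    import numpy as np
    import mdp

    t = np.linspace(0, 4 * np.pi, 2000)
    slow, fast = np.sin(t), np.sin(25 * t)
    x = np.column_stack([slow + fast, slow - fast])  # two observed mixtures over time
    sfa = mdp.nodes.SFANode(output_dim=1)
    sfa.train(x)                                     # rows must be consecutive time points
    sfa.stop_training()
    y = sfa.execute(x)                               # recovers the slow component,
                                                     # up to sign and scaling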
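A sketch for CuBICANode showing chunked training. As noted in its description above, the node accumulates every chunk internally, so chunking does not reduce the total memory needed. The source mixture is an illustrative assumption.

    import numpy as np
    import mdp

    s = np.column_stack([np.random.laplace(size=2000),
                         np.random.uniform(-1, 1, size=2000)])   # two independent sources
    x = s.dot(np.array([[1.0, 0.5], [0.3, 1.0]]).T)              # observed mixtures

    ica = mdp.nodes.CuBICANode()
    for chunk in np.array_split(x, 10):   # chunks are accumulated, not processed online
        ica.train(chunk)
    ica.stop_training()                   # the actual computation happens here
    sources = ica.execute(x)              # estimated independent components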
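A sketch for QuadraticExpansionNode. The node is not trainable; execute() maps each sample to all linear and quadratic monomials of its components.

    import numpy as np
    import mdp

    x = np.random.randn(100, 3)
    expansion = mdp.nodes.QuadraticExpansionNode()   # equivalent to PolynomialExpansionNode(2)
    y = expansion.execute(x)                         # 3 linear + 6 quadratic monomials = 9 columns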
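A sketch for TimeFramesNode (TimeDelayNode is analogous). The positional time_frames argument is assumed from the standard MDP signature.

    import numpy as np
    import mdp

    x = np.random.randn(1000, 2)
    frames = mdp.nodes.TimeFramesNode(3)   # embed 3 consecutive time frames (assumed signature)
    y = frames.execute(x)                  # 2 * 3 = 6 columns and slightly fewer rows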
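A sketch for HitParadeNode. The constructor arguments (n, d) and the (values, indices) return format of get_maxima()/get_minima() are assumptions to verify against the node's documentation.

    import numpy as np
    import mdp

    signal = np.sin(np.linspace(0, 10 * np.pi, 2000))[:, np.newaxis]
    hits = mdp.nodes.HitParadeNode(3, d=50)   # 3 extrema, at least 50 samples apart (assumed)
    hits.train(signal)
    hits.stop_training()
    maxima, max_idx = hits.get_maxima()       # assumed to return (values, indices)
    minima, min_idx = hits.get_minima()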
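A sketch for LinearRegressionNode. It is supervised: the regression targets are passed as the second argument to train().

    import numpy as np
    import mdp

    x = np.random.randn(200, 3)
    b = np.array([[2.0], [-1.0], [0.5]])
    y = x.dot(b) + 0.1 * np.random.randn(200, 1)    # synthetic targets

    regression = mdp.nodes.LinearRegressionNode()   # fits an intercept by default (assumed)
    regression.train(x, y)                          # targets as second training argument
    regression.stop_training()
    y_hat = regression.execute(x)                   # predictions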
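A sketch for RBMNode on binary data. The hidden_dim constructor argument and the per-call contrastive-divergence update in train() are assumptions based on the usual MDP interface.

    import numpy as np
    import mdp

    v = (np.random.rand(500, 20) > 0.5).astype('float64')   # binary visible data
    rbm = mdp.nodes.RBMNode(hidden_dim=10)                   # assumed constructor argument
    for _ in range(20):
        rbm.train(v)            # each call assumed to perform one CD weight update
    rbm.stop_training()
    h = rbm.execute(v)          # hidden-unit activation probabilities (assumed)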
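A generic sketch for the ScikitsLearnNode wrappers, here using PCAScikitsLearnNode. It assumes constructor keywords are forwarded to the wrapped scikit-learn estimator; the scikits_alg attribute is documented in the entries above.

    import numpy as np
    import mdp

    x = np.random.randn(300, 8)
    node = mdp.nodes.PCAScikitsLearnNode(n_components=3)   # assumed: kwargs go to sklearn's PCA
    node.train(x)
    node.stop_training()
    y = node.execute(x)            # corresponds to the wrapped estimator's transform()
    sk_pca = node.scikits_alg      # the underlying scikit-learn instance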