Bernoulli Restricted Boltzmann Machine (RBM).

This node has been automatically generated by wrapping the ``sklearn.neural_network.rbm.BernoulliRBM`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

A Restricted Boltzmann Machine with binary visible units and binary hidden units. Parameters are estimated using Stochastic Maximum Likelihood (SML), also known as Persistent Contrastive Divergence (PCD) [2]. The time complexity of this implementation is ``O(d ** 2)`` assuming d ~ n_features ~ n_components.

Read more in the :ref:`User Guide <rbm>`.

**Parameters**

n_components : int, optional
    Number of binary hidden units.

learning_rate : float, optional
    The learning rate for weight updates. It is *highly* recommended to tune this hyper-parameter. Reasonable values are in the 10**[0., -3.] range.

batch_size : int, optional
    Number of examples per minibatch.

n_iter : int, optional
    Number of iterations/sweeps over the training dataset to perform during training.

verbose : int, optional
    The verbosity level. The default, zero, means silent mode.

random_state : integer or numpy.RandomState, optional
    A random number generator instance to define the state of the random permutations generator. If an integer is given, it fixes the seed. Defaults to the global numpy random number generator.

**Attributes**

``intercept_hidden_`` : array-like, shape (n_components,)
    Biases of the hidden units.

``intercept_visible_`` : array-like, shape (n_features,)
    Biases of the visible units.

``components_`` : array-like, shape (n_components, n_features)
    Weight matrix, where n_features is the number of visible units and n_components is the number of hidden units.

**Examples**

>>> import numpy as np
>>> from sklearn.neural_network import BernoulliRBM
>>> X = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
>>> model = BernoulliRBM(n_components=2)
>>> model.fit(X)
BernoulliRBM(batch_size=10, learning_rate=0.1, n_components=2, n_iter=10,
       random_state=None, verbose=0)

**References**

[1] Hinton, G. E., Osindero, S. and Teh, Y. A fast learning algorithm for deep belief nets. Neural Computation 18, pp. 1527-1554. http://www.cs.toronto.edu/~hinton/absps/fastnc.pdf

[2] Tieleman, T. Training Restricted Boltzmann Machines using Approximations to the Likelihood Gradient. International Conference on Machine Learning (ICML) 2008.
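The sketch below illustrates how this wrapper node might be used from MDP rather than through sklearn directly. The node class name ``BernoulliRBMScikitsLearnNode`` and its location in ``mdp.nodes`` are assumptions for illustration; only the ``scikits_alg`` attribute and the wrapped ``BernoulliRBM`` class are documented on this page.

>>> import numpy as np
>>> import mdp  # assumed: the MDP package hosting this auto-generated wrapper
>>>
>>> # Same toy data as the sklearn example above.
>>> X = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
>>>
>>> # Assumed node name; keyword arguments are forwarded to the wrapped estimator.
>>> node = mdp.nodes.BernoulliRBMScikitsLearnNode(n_components=2, n_iter=10)
>>>
>>> node.train(X)          # accumulate training data
>>> node.stop_training()   # fit the wrapped BernoulliRBM (SML / PCD)
>>>
>>> hidden = node.execute(X)   # hidden activation probabilities, shape (4, 2)
>>> rbm = node.scikits_alg     # the wrapped sklearn estimator (documented attribute)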
**Methods**

Inherited from ``Cumulator`` and ``Node``.
**Instance Variables**

Inherited from ``Node``:

_train_seq
    List of tuples
dtype
    dtype
input_dim
    Input dimensions
output_dim
    Output dimensions
supported_dtypes
    Supported dtypes
Compute the hidden layer activation probabilities, P(h=1|v=X).

This node has been automatically generated by wrapping the ``sklearn.neural_network.rbm.BernoulliRBM`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

**Parameters**

X : array-like, shape (n_samples, n_features)
    The data to be transformed.

**Returns**

h : array, shape (n_samples, n_components)
    Latent representations of the data, i.e. the hidden layer activation probabilities.
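For reference, the returned probabilities follow the standard RBM formula P(h_j=1|v) = sigmoid(W_j · v + c_j), where W is ``components_`` and c is ``intercept_hidden_``. Below is a minimal sketch, using sklearn directly, that reproduces the value computed by ``transform`` from the learned parameters; the manual computation is illustrative and not part of this node's API.

>>> import numpy as np
>>> from sklearn.neural_network import BernoulliRBM
>>>
>>> X = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
>>> rbm = BernoulliRBM(n_components=2, n_iter=10, random_state=0).fit(X)
>>>
>>> # What transform returns: P(h=1|v=X).
>>> h_via_transform = rbm.transform(X)
>>>
>>> # The same quantity from the learned weights and hidden biases.
>>> def sigmoid(a):
...     return 1.0 / (1.0 + np.exp(-a))
>>> h_manual = sigmoid(X @ rbm.components_.T + rbm.intercept_hidden_)
>>>
>>> np.allclose(h_via_transform, h_manual)
True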
Fit the model to the data X.

This node has been automatically generated by wrapping the ``sklearn.neural_network.rbm.BernoulliRBM`` class from the ``sklearn`` library. The wrapped instance can be accessed through the ``scikits_alg`` attribute.

**Parameters**

X : array-like, shape (n_samples, n_features)
    Training data.

**Returns**

self : BernoulliRBM
    The fitted model.
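A short sketch of fitting and then inspecting the learned attributes listed above; the shapes follow directly from ``n_components`` and ``n_features``. sklearn is used directly here, since the wrapped estimator exposes the same attributes through ``scikits_alg`` after the node has been trained.

>>> import numpy as np
>>> from sklearn.neural_network import BernoulliRBM
>>>
>>> X = np.array([[0, 0, 0], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
>>> rbm = BernoulliRBM(n_components=2, learning_rate=0.1, batch_size=10,
...                    n_iter=10, random_state=0)
>>> _ = rbm.fit(X)  # parameters estimated with SML / Persistent Contrastive Divergence
>>>
>>> rbm.components_.shape         # (n_components, n_features) weight matrix
(2, 3)
>>> rbm.intercept_hidden_.shape   # hidden-unit biases
(2,)
>>> rbm.intercept_visible_.shape  # visible-unit biases
(3,)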