RBMs were initially invented under the name Harmonium by Paul Smolensky in 1986, and rose to prominence after Geoffrey Hinton and collaborators invented fast learning algorithms for them in the mid-2000s. RBMs have found applications in dimensionality reduction, classification, collaborative filtering, feature learning and topic modelling. They can be trained in either supervised or unsupervised ways, depending on the task.

As their name implies, RBMs are a variant of Boltzmann machines, with the restriction that their neurons must form a bipartite graph: a pair of nodes from each of the two groups of units (commonly referred to as the "visible" and "hidden" units respectively) may have a symmetric connection between them, and there are no connections between nodes within a group. By contrast, "unrestricted" Boltzmann machines may have connections between hidden units. This restriction allows for more efficient training algorithms than are available for the general class of Boltzmann machines, in particular the gradient-based contrastive divergence algorithm.

Restricted Boltzmann machines can also be used in deep learning networks. In particular, deep belief networks can be formed by "stacking" RBMs and optionally fine-tuning the resulting deep network with gradient descent and backpropagation.

The standard type of RBM has binary-valued (Boolean/Bernoulli) hidden and visible units, and consists of a matrix of weights $W = (w_{i,j})$ (size $m \times n$) associated with the connection between hidden unit $h_j$ and visible unit $v_i$, as well as bias weights (offsets) $a_i$ for the visible units and $b_j$ for the hidden units. Given these, the energy of a configuration (pair of Boolean vectors) $(v, h)$ is defined as

$$E(v, h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_i \sum_j v_i \, w_{i,j} \, h_j,$$

or, in matrix notation, $E(v, h) = -a^{\mathsf{T}} v - b^{\mathsf{T}} h - v^{\mathsf{T}} W h$.
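The energy function translates directly into a few lines of code. Below is a minimal sketch computing it with NumPy; the helper name `rbm_energy` and the small example values are illustrative assumptions, not taken from the original text.

```python
import numpy as np

def rbm_energy(v, h, W, a, b):
    """Energy E(v, h) of a joint configuration of a binary RBM.

    v : visible vector, shape (m,)
    h : hidden vector, shape (n,)
    W : weight matrix, shape (m, n); W[i, j] connects v_i and h_j
    a : visible biases, shape (m,)
    b : hidden biases, shape (n,)
    """
    # E(v, h) = -a^T v - b^T h - v^T W h
    return -a @ v - b @ h - v @ W @ h

# Tiny illustrative example: 3 visible units, 2 hidden units.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 2))
a = np.zeros(3)
b = np.zeros(2)
v = np.array([1.0, 0.0, 1.0])
h = np.array([0.0, 1.0])
print(rbm_energy(v, h, W, a, b))
```

Lower-energy configurations are assigned higher probability by the model, so training amounts to shaping this energy surface around the data.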

There are no connections between visible units in a restricted Boltzmann machine: each visible unit is connected only to hidden units. As a consequence of this bipartite structure, the hidden units are conditionally independent given the visible units (and vice versa), which is what makes block Gibbs sampling, and hence contrastive divergence training, efficient, as sketched below.
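To make the contrastive divergence algorithm mentioned above concrete, here is a minimal sketch of a single CD-1 update for a binary RBM, assuming NumPy; the function name `cd1_step`, the learning rate, and the sampling details are illustrative assumptions, not taken from the original text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, a, b, lr=0.1, rng=None):
    """One contrastive-divergence (CD-1) update on a binary visible vector v0.

    Exploits the bipartite structure: given v, all hidden units are
    conditionally independent (and vice versa), so each whole layer can
    be sampled in one vectorized step.
    """
    if rng is None:
        rng = np.random.default_rng()

    # Positive phase: sample hidden units given the data vector.
    ph0 = sigmoid(b + v0 @ W)                 # P(h_j = 1 | v0)
    h0 = (rng.random(ph0.shape) < ph0) * 1.0

    # Negative phase: one step of block Gibbs sampling.
    pv1 = sigmoid(a + W @ h0)                 # P(v_i = 1 | h0)
    v1 = (rng.random(pv1.shape) < pv1) * 1.0
    ph1 = sigmoid(b + v1 @ W)                 # P(h_j = 1 | v1)

    # Approximate log-likelihood gradient: data statistics minus
    # one-step reconstruction statistics.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    a += lr * (v0 - v1)
    b += lr * (ph0 - ph1)
    return W, a, b
```

In practice updates are usually averaged over mini-batches rather than applied per example, and more Gibbs steps (CD-k) give a better, though costlier, gradient approximation.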