Restricted Boltzmann machine

A restricted Boltzmann machine, commonly abbreviated as RBM, is a neural network in which the neurons beyond the visible layer (the hidden neurons) have probabilistic outputs. The machine is restricted because connections are allowed only from one layer to the next; there are no intra-layer connections.

As with contrastive Hebbian learning, there are two phases to the model: a positive phase, or wake phase, and a negative phase, or sleep phase.

Model

[Figure: Model of a neuron. j is the index of the neuron when there is more than one. For the RBM, the activation function is the logistic function, and the activation is the probability that the neuron will fire.]

We use a set of binary-valued neurons. Given a set of $k$-dimensional inputs represented as a column vector $\mathbf{x}$, and a set of $m$ neurons with (initially random, between $-0.01$ and $0.01$) synaptic weights from the inputs, represented as a matrix formed by $m$ weight column vectors (i.e. a $k$-row × $m$-column matrix):

$$W = \begin{bmatrix} \mathbf{w}_1 & \mathbf{w}_2 & \cdots & \mathbf{w}_m \end{bmatrix}$$

where $w_{ij}$ is the weight between input $i$ and neuron $j$.
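
As a concrete illustration, the following is a minimal sketch of this setup in Python with NumPy. The names `x`, `W`, `k`, and `m`, and the example sizes, are ours, not from the original:

```python
import numpy as np

rng = np.random.default_rng(0)

k, m = 6, 4   # k visible inputs, m neurons (hypothetical example sizes)

# Binary-valued input column vector x
x = rng.integers(0, 2, size=(k, 1)).astype(float)

# k x m weight matrix, initialized uniformly at random in [-0.01, 0.01]
W = rng.uniform(-0.01, 0.01, size=(k, m))
```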

During the positive phase, the output of the set of neurons is defined as follows:

$$\mathbf{p} = \sigma\left(W^{T}\mathbf{x}\right)$$

where $\mathbf{p}$ is a column vector of probabilities, in which element $i$ is the probability that neuron $i$ will output a 1, and $\sigma$ is the logistic sigmoid function:

$$\sigma(u) = \frac{1}{1 + e^{-u}}$$
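
Continuing the sketch above, the positive phase is a single matrix-vector product passed through the logistic function (assuming the `x` and `W` defined earlier):

```python
def sigmoid(u):
    """Logistic sigmoid, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-u))

# Positive (wake) phase: p[i] is the probability that neuron i outputs a 1
p = sigmoid(W.T @ x)   # m x 1 column vector of probabilities
```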

During the negative phase, a binary-valued reconstruction of the input $\mathbf{x}'$ is formed from this output as follows. First, the binary outputs of the neurons $\mathbf{y}$ are sampled according to the probabilities $\mathbf{p}$. Then:

$$\mathbf{p}' = \sigma\left(W\mathbf{y}\right)$$

The reconstructed binary inputs $\mathbf{x}'$ are then sampled according to the probabilities $\mathbf{p}'$. Next, the binary outputs $\mathbf{y}'$ are computed again according to the probabilities $\mathbf{p}''$, but this time from the reconstructed input:

$$\mathbf{p}'' = \sigma\left(W^{T}\mathbf{x}'\right)$$

This completes one wake-sleep cycle.
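
A sketch of the negative phase, continuing the variables above; each sampling step binarizes a probability vector against uniform random draws:

```python
# Sample binary outputs y from the probabilities p
y = (rng.uniform(size=p.shape) < p).astype(float)

# Reconstruction probabilities p' and sampled binary reconstruction x'
p_vis = sigmoid(W @ y)
x_rec = (rng.uniform(size=p_vis.shape) < p_vis).astype(float)

# Recompute the output probabilities p'' from the reconstructed input
p2 = sigmoid(W.T @ x_rec)
y2 = (rng.uniform(size=p2.shape) < p2).astype(float)  # binary outputs y'
```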

To update the weights, a wake-sleep cycle is completed, and the weights are updated as follows:

$$W \leftarrow W + \eta\left(\mathbf{x}\mathbf{p}^{T} - \mathbf{x}'\mathbf{p}''^{T}\right)$$

where $\eta$ is some learning rate. In practice, several wake-sleep cycles can be run before doing the weight update; this repeated alternating sampling of outputs and inputs is known as Gibbs sampling.
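
Putting the pieces together, a sketch of the single-pattern update under the assumptions above; using the probability vectors rather than the binary samples in the outer products is a common choice, and the learning-rate value is arbitrary:

```python
eta = 0.1   # learning rate (example value)

# Contrastive update: wake-phase statistics minus sleep-phase statistics
W += eta * (x @ p.T - x_rec @ p2.T)
```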

A batch update can also be used, where some number of patterns less than the full input set (a mini-batch) are presented uniformly at random, the wake and sleep results recorded, and the updates then done as follows:

$$W \leftarrow W + \eta\left(\left\langle\mathbf{x}\mathbf{p}^{T}\right\rangle - \left\langle\mathbf{x}'\mathbf{p}''^{T}\right\rangle\right)$$

where $\left\langle\cdot\right\rangle$ is an average over the input presentations. This method is called contrastive divergence.
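
The mini-batch form can be vectorized by stacking one pattern per column; averaging the outer products over the batch implements the $\langle\cdot\rangle$ operation. A sketch under the same assumptions, with a hypothetical helper name:

```python
def cd1_update(W, X, eta=0.1, rng=np.random.default_rng()):
    """One contrastive-divergence update from a k x n mini-batch X
    (one binary input pattern per column). Returns the updated weights."""
    n = X.shape[1]
    P = sigmoid(W.T @ X)                               # wake-phase probabilities
    Y = (rng.uniform(size=P.shape) < P).astype(float)  # sampled binary outputs
    X_rec = (rng.uniform(size=X.shape) < sigmoid(W @ Y)).astype(float)
    P2 = sigmoid(W.T @ X_rec)                          # sleep-phase probabilities
    # Averages of the outer products over the n presentations
    return W + eta * (X @ P.T - X_rec @ P2.T) / n
```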

