Neural Networks (Geoffrey Hinton Course)


Some Simple Models of Neurons

Notation: $y$ is the output, $x_i$ are the inputs.

Linear Neurons

$y = b + \sum_{i} x_i w_i$

$w_i$ weights, $b$ bias
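
A minimal sketch of this computation in Python with NumPy (the function name and the use of NumPy are illustrative, not from the course):

import numpy as np

def linear_neuron(x, w, b):
    # y = b + sum_i x_i * w_i
    return b + np.dot(x, w)

# e.g. linear_neuron([1.0, 2.0], [0.5, 0.25], 0.1) -> 1.1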

Binary Threshold Neurons

$z = \sum_{i} x_i w_i$

$y = 1$ if $z \geq \theta$, $0$ otherwise ($\theta$ is the threshold).

Or, equivalently, setting $b = -\theta$,

$z = b + \sum_{i} x_i w_i$

$y = 1$ if $z \geq 0$, $0$ otherwise.
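
A sketch of the biased form (hypothetical names, same NumPy convention as above):

import numpy as np

def binary_threshold_neuron(x, w, b):
    z = b + np.dot(x, w)       # total input
    return 1 if z >= 0 else 0  # hard threshold at zero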

Rectified Linear Neurons

$z = b + \sum_{i} x_i w_i$

$y = z$ if $z > 0$, $0$ otherwise. (Linear above zero, with a hard decision at zero.)
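
A corresponding sketch (again, the code is my own illustration):

import numpy as np

def rectified_linear_neuron(x, w, b):
    z = b + np.dot(x, w)
    return max(z, 0.0)  # linear above zero, clamped to zero below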

Sigmoid Neurons

Give a real-valued output that is a smooth and bounded function of their total input.

$z = b + \sum_{i} x_i w_i$

$y = \frac{1}{1 + e^{-z}}$
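
In the same illustrative style:

import numpy as np

def sigmoid_neuron(x, w, b):
    z = b + np.dot(x, w)
    return 1.0 / (1.0 + np.exp(-z))  # smooth and bounded in (0, 1)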

Stochastic Binary Neurons

Same equations as logistic units, but the output is $1$ (= spike) or $0$ at random: the logistic output is treated as the probability of producing a spike in a short time window.

$z = b + \sum_{i} x_i w_i$

$P(s = 1) = \frac{1}{1 + e^{-z}}$
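
One way to sample such a unit (the sampling code and names are an assumption on my part):

import numpy as np

rng = np.random.default_rng()

def stochastic_binary_neuron(x, w, b):
    z = b + np.dot(x, w)
    p = 1.0 / (1.0 + np.exp(-z))        # logistic output as spike probability
    return 1 if rng.random() < p else 0  # sample a spike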

We can do a similar trick for rectified linear units: the real-valued output is treated as the Poisson rate for spikes.
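
A sketch of that trick, assuming the rectified output is the rate and spike counts are drawn over some time window (the window parameter is my own assumption):

import numpy as np

rng = np.random.default_rng()

def poisson_relu_neuron(x, w, b, window=1.0):
    rate = max(b + np.dot(x, w), 0.0)  # rectified output as Poisson rate
    return rng.poisson(rate * window)  # spike count in the time window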