Neural Networks (Geoffrey Hinton Course)

== Some Simple Models of Neurons ==

Notation: $y$ is the output, $x_i$ are the inputs.

=== Linear Neurons ===

$y = b + \sum_{i} x_i w_i$

where $w_i$ are the weights and $b$ is the bias.
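A minimal Python sketch of a linear neuron (the function name and plain-list inputs are illustrative, not from the course):

<syntaxhighlight lang="python">
def linear_neuron(x, w, b):
    """Linear neuron: y = b + sum_i x_i * w_i."""
    return b + sum(xi * wi for xi, wi in zip(x, w))

# Example: two inputs, weights 0.5 and -1.0, bias 0.1
print(linear_neuron([2.0, 3.0], [0.5, -1.0], 0.1))  # -1.9
</syntaxhighlight>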

=== Binary Threshold Neurons ===

$z = \sum_{i} x_i w_i$

$y = 1$ if $z \geq \theta$, $0$ otherwise. ($\theta$ is the threshold.)

Or, equivalently, folding the threshold into the bias ($b = -\theta$):

$z = b + \sum_{i} x_i w_i$

$y = 1$ if $z \geq 0$, $0$ otherwise.
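A sketch of the biased form of the binary threshold neuron (again, names are illustrative):

<syntaxhighlight lang="python">
def binary_threshold_neuron(x, w, b):
    """Output 1 if the biased weighted sum is non-negative, 0 otherwise."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1 if z >= 0 else 0

# Fires because the weighted evidence outweighs the negative bias: z = 0.2 >= 0
print(binary_threshold_neuron([1.0, 1.0], [0.6, 0.6], -1.0))  # 1
</syntaxhighlight>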

=== Rectified Linear Neurons ===

$z = b + \sum_{i} x_i w_i$

$y = z$ if $z > 0$, $0$ otherwise. (linear above zero, decision at zero.)
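A sketch of the rectified linear neuron under the same illustrative conventions:

<syntaxhighlight lang="python">
def rectified_linear_neuron(x, w, b):
    """Pass the weighted sum through unchanged above zero, output 0 otherwise."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return z if z > 0 else 0.0

print(rectified_linear_neuron([1.0, 2.0], [0.5, 0.25], -2.0))  # 0.0 (z = -1.0 is clipped)
print(rectified_linear_neuron([1.0, 2.0], [0.5, 0.25], 0.5))   # 1.5 (z > 0 passes through)
</syntaxhighlight>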

=== Sigmoid Neurons ===

Give a real-valued output that is a smooth and bounded function of their total input.

$z = b + \sum_{i} x_i w_i$

$y = \frac{1}{1 + e^{-z}}$
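A sketch of the sigmoid (logistic) neuron; only math.exp from the standard library is needed:

<syntaxhighlight lang="python">
import math

def sigmoid_neuron(x, w, b):
    """Smooth, bounded output in (0, 1): the logistic function of the weighted sum."""
    z = b + sum(xi * wi for xi, wi in zip(x, w))
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid_neuron([1.0, 1.0], [0.0, 0.0], 0.0))  # 0.5, since z = 0
</syntaxhighlight>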