
So I'm looking at these two neural networks and walking through how the $i$, $j$, $k$ indices of $\Theta$ correspond to the layer and the node numbers.

Either there are redundant values or I'm missing how the subscripts actually map from node to node.

$\Theta^i_{jk}$ ... where this is read as "Theta superscript $i$, subscript $jk$"

As shown here: [image: example2]

It looks like the $\Theta$ value corresponding to the node circled in teal would be $\Theta^2_{12}$ ... where:

  • superscript $i = 2$ (layer 2)
  • $j = 1$ (node number within the subsequent layer?)
  • $k = 2$ (node number within the current layer?)

If I'm matching the pattern correctly, I think the $j$ value is the node to the right of the red-circled node, and the $k$ value is the teal node.

Am I getting this right?

Because between the above image and this one:

[image: example1]

that seems to be the case. Can I get a confirmation on this?
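To pin down the convention I'm assuming, here's a minimal NumPy sketch (the layer sizes and weights are made up; only the indexing matters):

```python
import numpy as np

# Layer sizes for the example network (my assumption):
# 3 input units, 3 hidden units, 1 output unit.
s = [3, 3, 1]

# Theta[0] is Theta^(1) (layer 1 -> 2), Theta[1] is Theta^(2) (layer 2 -> 3);
# each has shape (units in next layer) x (units in this layer + 1 bias column).
rng = np.random.default_rng(0)
Theta = [rng.standard_normal((s[i + 1], s[i] + 1)) for i in range(len(s) - 1)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward propagation: a^(i+1) = g(Theta^(i) @ [1; a^(i)])
a = rng.standard_normal(s[0])                     # input activations
for Th in Theta:
    a = sigmoid(Th @ np.concatenate(([1.0], a)))  # prepend the bias unit

# Under this reading, Theta^2_{12} is the weight applied to node 2 in layer 2
# when computing node 1 in layer 3. In 0-based indexing that's row 0,
# column 2 (column 0 holds the bias weight):
print(Theta[1][0, 2])
```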

1 Answer


Yes, $\Theta^i_{jk}$ is the weight applied to the activation of node $k$ in layer $i$ when computing the activation of node $j$ in the next layer, $i + 1$.