
Common Network Types

Feed Forward Networks

A network with the topology $G=({\bf N},{\bf T})$, the input nodes ${\bf I}$ and the output nodes ${\bf O}$ is called a feed forward network if it satisfies the following three conditions:


\begin{displaymath}\neg \, \mbox{cyclic}\,({\bf G}), \qquad
\{A \, \vert \, (A,B)\in {\bf T} \} \cap {\bf O} = \emptyset, \qquad
\{B \, \vert \, (A,B)\in {\bf T} \} \cap {\bf I} = \emptyset \end{displaymath}

Since the network is acyclic, the output state is a direct function of the input state. This network function $f$, where $p=\vert{\bf I}\vert$ and $q=\vert{\bf O}\vert$, defines the functionality of the network.


\begin{displaymath}O = f(I) \quad \mbox{with} \quad
I \in {\bf S}^p, \quad O \in {\bf S}^q, \quad
f: {\bf S}^p \rightarrow {\bf S}^q \end{displaymath}
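The three feed-forward conditions can be checked mechanically. The following is a minimal sketch, not from the text: the network is represented as node and edge lists (an illustrative choice, not the thesis's notation), and acyclicity is tested with an iterative depth-first search.

```python
# Sketch (assumed representation): nodes is a list of node names,
# edges a list of (source, target) pairs, inputs/outputs subsets of nodes.
def is_feed_forward(nodes, edges, inputs, outputs):
    """Check the three conditions: no edge enters an input node,
    no edge leaves an output node, and the graph is acyclic."""
    sources = {a for (a, b) in edges}
    targets = {b for (a, b) in edges}
    if targets & set(inputs) or sources & set(outputs):
        return False
    # Cycle check via iterative DFS with three-colour marking.
    succ = {n: [] for n in nodes}
    for a, b in edges:
        succ[a].append(b)
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in nodes}
    for start in nodes:
        if colour[start] != WHITE:
            continue
        colour[start] = GREY
        stack = [(start, iter(succ[start]))]
        while stack:
            node, it = stack[-1]
            nxt = next(it, None)
            if nxt is None:
                colour[node] = BLACK
                stack.pop()
            elif colour[nxt] == GREY:
                return False  # back edge found -> cycle
            elif colour[nxt] == WHITE:
                colour[nxt] = GREY
                stack.append((nxt, iter(succ[nxt])))
    return True
```

For example, the chain `i1 -> h1 -> o1` satisfies all three conditions, while adding the edge `o1 -> i1` violates them.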

n-Layer Networks

A feed forward network with the topology $G=({\bf N},{\bf T})$, the input nodes ${\bf I}$ and the output nodes ${\bf O}$ is called an $n$-layer network if it satisfies the following conditions:


\begin{displaymath}{\bf N} = \bigcup_{k=0}^{n} {\bf N}_k, \qquad
{\bf N}_i \cap {\bf N}_j = \emptyset \, \Longleftrightarrow \, i \neq j \end{displaymath}


\begin{displaymath}{\bf T} = \bigcup_{k=1}^{n} {\bf T}_k, \qquad
{\bf T}_i \cap {\bf T}_j = \emptyset \, \Longleftrightarrow \, i \neq j \end{displaymath}


\begin{displaymath}{\bf I} = {\bf N}_0, \qquad {\bf O} = {\bf N}_n, \qquad
{\bf T}_k \subseteq {\bf N}_{k-1} \times {\bf N}_k \end{displaymath}

Note that the input layer ${\bf N}_0$ is not counted as a real layer, since the propagation functions of its nodes are merely constants set to the components of the input vector.
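The layer conditions above can likewise be verified directly. A minimal sketch, assuming the layers are given as a list of node sets $[{\bf N}_0, \ldots, {\bf N}_n]$ and the edges as pairs (both invented representations for illustration):

```python
# Sketch: check the n-layer conditions on a layered node partition.
def is_n_layer(layers, edges):
    """layers: list [N_0, ..., N_n] of node sets; edges: set of (a, b).
    Checks that the layers are pairwise disjoint and that every edge
    runs from layer k-1 to layer k for some k."""
    # Pairwise disjoint layers (the partition condition).
    seen = set()
    for layer in layers:
        if layer & seen:
            return False
        seen |= layer
    # Every edge connects consecutive layers: T_k within N_{k-1} x N_k.
    index = {node: k for k, layer in enumerate(layers) for node in layer}
    return all(a in index and b in index and index[b] == index[a] + 1
               for (a, b) in edges)
```

A "skip" edge from ${\bf N}_0$ directly to ${\bf N}_2$, for instance, would violate ${\bf T}_k \subseteq {\bf N}_{k-1} \times {\bf N}_k$ and make the check fail.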

The partitions ${\bf T}_k$ can also be expressed by an adjacency matrix $M_k$, which is defined as follows:


\begin{displaymath}{\bf N}_j = \{N_{j,i} \, \vert \, 1 \leq i \leq \vert{\bf N}_j\vert \}, \qquad
s=\vert{\bf N}_{k-1}\vert, \qquad t=\vert{\bf N}_k\vert \end{displaymath}


\begin{displaymath}M_k=\left( \begin{array}{ccc}
m_{11}^{(k)} & \cdots & m_{1t}^{(k)} \\
\vdots & \ddots & \vdots \\
m_{s1}^{(k)} & \cdots & m_{st}^{(k)}
\end{array} \right), \qquad
m_{ij}^{(k)} = \left\{ \begin{array}{ll}
0, & (N_{k-1,i},N_{k,j}) \notin {\bf T}_k \\
1, & (N_{k-1,i},N_{k,j}) \in {\bf T}_k
\end{array} \right. \end{displaymath}

If $(\forall i,j)\;m_{ij}^{(k)}=1$, the layers ${\bf N}_{k-1}$ and ${\bf N}_k$ are fully connected.
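As a concrete illustration, $M_k$ and the fully-connected test can be sketched as follows; the node lists stand in for the enumerations $N_{k-1,i}$ and $N_{k,j}$, and all names are illustrative, not from the text:

```python
# Sketch: build the adjacency matrix M_k for the edge partition T_k
# between layers N_{k-1} (size s) and N_k (size t).
def adjacency_matrix(prev_layer, next_layer, t_k):
    """Return M_k as a nested list: m[i][j] = 1 iff the edge
    (prev_layer[i], next_layer[j]) is in t_k, else 0."""
    return [[1 if (a, b) in t_k else 0 for b in next_layer]
            for a in prev_layer]

def fully_connected(matrix):
    """The layers are fully connected iff every entry of M_k is 1."""
    return all(all(m == 1 for m in row) for row in matrix)
```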



(c) Bernhard Ömer - oemer@tph.tuwien.ac.at - http://tph.tuwien.ac.at/~oemer/