Forward propagation

Looking at the inference part of a feed forward neural network, we have forward propagation. Finding the asymptotic complexity of the forward propagation procedure can be done much like how we found the run-time complexity of matrix multiplication. Before beginning, you should be familiar with the forward propagation procedure.

We assume the input vector can be described as:

$$x \in \mathbb{R}^{n}$$

where the first element is the bias unit: $x_0 = 1$.

When implementing neural networks, it is often the case that all the samples are collected into a matrix with the dimensions $x \in \mathbb{R}^{\eta \times n}$, where $\eta$ is the total number of samples in the training set. This is done to save time on allocating memory, and is primarily a practical concern, which is why we won't consider it further in the theory.

The input is treated in the same way as any other activation matrix, and has the index $x = a^{(0)}$. The zeroth element, $a_0^{(0)}$, is as usual the bias unit with a value of 1.

For forward propagation, we can write:

$$z^{(k)} = \theta^{(k)} a^{(k-1)}$$

$$z^{(k)} \in \mathbb{R}^{m \times 1} \quad \text{where} \quad \theta^{(k)} \in \mathbb{R}^{m \times n}$$

$$a^{(k)} = g\big( z^{(k)} \big)$$

where $g(x)$ is the activation function, which is evaluated elementwise. We therefore know that $a^{(k)}$ has the same dimensions as $z^{(k)}$.

We see that for each layer, a matrix multiplication and an activation function are computed. We know from the previous essay that naive matrix multiplication has an asymptotic run-time of $O(n^3)$, and since $g(x)$ is an elementwise function, we know that it has a run-time of $O(n)$.

By analysing the dimensions of a feed forward neural network, we find:

$$\theta^{(0)} \in \mathbb{R}^{n^{(0)} \times 1}$$

$$\theta^{(1)} \in \mathbb{R}^{n^{(1)} \times n^{(0)}}$$

$$\theta^{(2)} \in \mathbb{R}^{n^{(2)} \times n^{(1)}}$$

More generally, we can write:

$$\theta^{(k)} = \begin{cases} \mathbb{R}^{n^{(k)} \times 1} \quad &\text{if} \space k = 0 \\ \mathbb{R}^{n^{(k)} \times n^{(k-1)}} \quad &\text{otherwise} \end{cases}$$
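To make the per-layer cost concrete, here is a minimal NumPy sketch of the procedure above, assuming a single column-vector input and a list of weight matrices (the names `forward`, `sigmoid`, and `thetas` are illustrative, not taken from the essay). Each layer performs one matrix-vector product costing $O(n^{(k)} \cdot n^{(k-1)})$ multiply-adds, followed by an elementwise activation costing $O(n^{(k)})$.

```python
import numpy as np

def sigmoid(z):
    """Elementwise activation g(z); cost is O(n) in the number of elements."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, thetas, g=sigmoid):
    """Propagate a single column vector x through the layers.

    thetas[k] has shape (n_k, n_{k-1}); each layer costs
    O(n_k * n_{k-1}) for the matrix-vector product plus
    O(n_k) for the elementwise activation.
    """
    a = x                 # a^(0) = x
    for theta in thetas:
        z = theta @ a     # z^(k) = theta^(k) a^(k-1)
        a = g(z)          # a^(k) = g(z^(k))
    return a

# Toy example (hypothetical sizes): 3 inputs, 4 hidden units, 2 outputs.
rng = np.random.default_rng(0)
thetas = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
x = np.array([[1.0], [0.5], [-0.2]])  # first element is the bias unit
print(forward(x, thetas))
```

Summing the per-layer costs over all layers gives the total forward-propagation cost; this is only a sketch of the single-sample case, not the batched matrix form mentioned earlier.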