
Output Values
output[0] = weight[0] * input[0] + weight[1] * input[1] + weight[2] * input[2]
output[1] = weight[0] * input[1] + weight[1] * input[2] + weight[2] * input[3]
output[2] = weight[0] * input[2] + weight[1] * input[3] + weight[2] * input[4]
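A minimal sketch of this forward pass in plain Python (the names conv1d_forward, input, and weight are illustrative, not from the source):

def conv1d_forward(input, weight):
    # Slide the length-3 kernel over the input: output[j] pairs weight[i] with input[j + i].
    k = len(weight)
    output = [0.0] * (len(input) - k + 1)
    for j in range(len(output)):
        for i in range(k):
            output[j] += weight[i] * input[j + i]
    return output
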
class Conv:
    @staticmethod
    def backward(ctx, d):
        ...
        # Each input position receives gradient from every output it fed into,
        # paired with the matching kernel value (the kernel is effectively flipped).
        grad_input[2] = weight[0] * d[2] + weight[1] * d[1] + weight[2] * d[0]
        ...
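
A sketch of the full backward computation under the same assumptions (illustrative only, not the actual Conv.backward from the source; d[j] is the gradient of the loss with respect to output[j]):

def conv1d_backward(input, weight, d):
    k = len(weight)
    grad_input = [0.0] * len(input)
    grad_weight = [0.0] * k
    for j in range(len(input) - k + 1):
        for i in range(k):
            # output[j] = sum_i weight[i] * input[j + i], so d[j] flows back to both factors.
            grad_input[j + i] += weight[i] * d[j]
            grad_weight[i] += input[j + i] * d[j]
    return grad_input, grad_weight

For the example above this reproduces grad_input[2] = weight[0] * d[2] + weight[1] * d[1] + weight[2] * d[0].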


Quiz

Step is derivative of ReLU
$$\begin{eqnarray*} \text{ReLU}'(x) &=& \begin{cases} 0 & \text{if } x \leq 0 \\ 1 & \text{otherwise} \end{cases} \\ \text{step}(x) &=& \text{ReLU}'(x) \end{eqnarray*}$$
Mathematically,
$$\text{step}'(x) = \begin{cases} 0 & \text{if } x \leq 0 \\ 0 & \text{otherwise} \end{cases}$$
The derivative of step is zero everywhere it is defined, so step is not a useful function to differentiate.
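
A small sketch of these functions in plain Python (names are illustrative):

def relu(x):
    return max(x, 0.0)

def step(x):
    # Derivative of ReLU, taking the value 0 at x = 0.
    return 0.0 if x <= 0 else 1.0

def step_derivative(x):
    # Zero everywhere it is defined, so it gives no learning signal.
    return 0.0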

It would be nice to have a version with a useful derivative:
$$\text{sigmoid}(x) = \text{softmax}([0, x])[1]$$
Softmax is a useful soft version of argmax.
Max is a binary associative operator
$\max(a, b)$ returns the larger of $a$ and $b$
Generalizes ReLU: $\text{ReLU}(a) = \max(a, 0)$
The gradient through argmax is zero almost everywhere (as with the gradient into the input of ReLU at 0), so we use a soft version instead:
$$\text{softmax}(\textbf{x}) = \frac{\exp \textbf{x}}{\sum_i \exp x_i}$$
$$\text{softmax}([0, x])[1] = \frac{\exp x}{\exp x + \exp 0} = \sigma(x)$$
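
A quick numerical check of this identity in plain Python (illustrative sketch):

import math

def softmax(xs):
    exps = [math.exp(v) for v in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

x = 1.5
assert abs(softmax([0.0, x])[1] - sigmoid(x)) < 1e-12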
