# Q7 – Jacobian of Softmax and Sigmoid

Contributed by Louis-Guillaume Gagnon

1. Compute the Jacobian matrix of the softmax function, $S(x)_i = \frac{e^{x_i}}{\sum_k e^{x_k}}$. Express it as a matrix equation. (A numerical check is sketched in the first code block after this list.)
2. Compute the Jacobian matrix of the sigmoid function, $\sigma(x) = 1/(1 + e^{-x})$, applied elementwise to a vector $x$. (See the second sketch below.)
3. Let $y$ and $x$ be vectors related by $y = f(x)$. Let $L$ be an unspecified loss function, and let $g_x$ and $g_y$ be the gradients of $L$ with respect to $x$ and $y$. Let $J_y(x)$ be the Jacobian of $y$ with respect to $x$. Eq. 6.46 of the Deep Learning book tells us that $g_x = J_y(x) \cdot g_y$. Show that if $f(x) \equiv \sigma(x)$, this can be rewritten as $g_x = g_y \odot \sigma'(x)$, where $\odot$ denotes the elementwise (Hadamard) product. If $f(x) \equiv S(x)$, can $g_x$ be written the same way? (The third sketch below compares the two forms numerically.)
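
For item 1, here is a minimal numerical sketch, assuming NumPy is available: it builds the softmax Jacobian from the closed form $J_{ij} = S_i(\delta_{ij} - S_j)$ and checks it against a central finite-difference approximation. The function names and the test vector are illustrative, not part of the question.

```python
# Sketch for item 1 (illustrative, assumes NumPy): compare the closed-form
# softmax Jacobian J_ij = S_i * (delta_ij - S_j) with a finite-difference estimate.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())             # subtract max for numerical stability
    return e / e.sum()

def softmax_jacobian(x):
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)  # diag(S) - S S^T

x = np.array([0.5, -1.0, 2.0])
J = softmax_jacobian(x)

# Central finite differences: column j approximates dS/dx_j.
eps = 1e-6
I = np.eye(len(x))
J_fd = np.stack(
    [(softmax(x + eps * I[j]) - softmax(x - eps * I[j])) / (2 * eps) for j in range(len(x))],
    axis=1,
)
print(np.allclose(J, J_fd, atol=1e-6))  # expected: True
```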
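
For item 2, a similar sketch (again assuming NumPy, with illustrative names): since the sigmoid acts on each component independently, its Jacobian is diagonal with entries $\sigma'(x_i) = \sigma(x_i)(1 - \sigma(x_i))$.

```python
# Sketch for item 2 (illustrative): the Jacobian of an elementwise sigmoid is
# diagonal, with sigma'(x_i) = sigma(x_i) * (1 - sigma(x_i)) on the diagonal.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_jacobian(x):
    s = sigmoid(x)
    return np.diag(s * (1.0 - s))       # off-diagonal entries are zero

x = np.array([0.5, -1.0, 2.0])
eps = 1e-6
I = np.eye(len(x))
J_fd = np.stack(
    [(sigmoid(x + eps * I[j]) - sigmoid(x - eps * I[j])) / (2 * eps) for j in range(len(x))],
    axis=1,
)
print(np.allclose(sigmoid_jacobian(x), J_fd, atol=1e-6))  # expected: True
```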
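
For item 3, a sketch of the numerical comparison: because the sigmoid Jacobian is diagonal, the matrix product $J_y(x) \cdot g_y$ collapses to the elementwise product $g_y \odot \sigma'(x)$, whereas the softmax Jacobian has non-zero off-diagonal terms, so keeping only its diagonal does not reproduce $g_x$ in general. The upstream gradient `g_y` below is an arbitrary illustrative vector.

```python
# Sketch for item 3 (illustrative): compare the full Jacobian product J @ g_y
# with the elementwise shortcut g_y * f'(x) for both activations.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

x = np.array([0.5, -1.0, 2.0])
g_y = np.array([0.3, -0.7, 1.1])        # arbitrary upstream gradient dL/dy

# Sigmoid: diagonal Jacobian, so the matrix product equals the Hadamard product.
s = sigmoid(x)
J_sig = np.diag(s * (1.0 - s))
print(np.allclose(J_sig @ g_y, g_y * s * (1.0 - s)))    # expected: True

# Softmax: off-diagonal terms -S_i * S_j matter, so using only the diagonal
# S_i * (1 - S_i) does not give the same result in general.
p = softmax(x)
J_soft = np.diag(p) - np.outer(p, p)
print(np.allclose(J_soft @ g_y, g_y * p * (1.0 - p)))   # expected: False
```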