Contributed by Louis-Guillaume Gagnon
- Compute the Jacobian matrix of the softmax function, $\mathrm{softmax}(\boldsymbol{x})_i = e^{x_i} / \sum_j e^{x_j}$. Express it as a matrix equation.
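A quick numerical sanity check of the closed-form answer, $J = \mathrm{diag}(\boldsymbol{s}) - \boldsymbol{s}\boldsymbol{s}^\top$ with $\boldsymbol{s} = \mathrm{softmax}(\boldsymbol{x})$, using numpy; the test point `x` and the finite-difference step are arbitrary choices:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

def softmax_jacobian(x):
    # Analytic Jacobian: J = diag(s) - s s^T, with J[i, j] = s_i (delta_ij - s_j)
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

# Central finite differences: column j approximates d softmax / d x_j
x = np.array([0.5, -1.0, 2.0])
eps = 1e-6
J_num = np.stack([(softmax(x + eps * e) - softmax(x - eps * e)) / (2 * eps)
                  for e in np.eye(3)], axis=1)
assert np.allclose(softmax_jacobian(x), J_num, atol=1e-6)
```

Note that each column of $J$ sums to zero, since the softmax outputs always sum to 1.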
- Compute the Jacobian matrix of the sigmoid function, $\sigma(\boldsymbol{x})_i = 1 / (1 + e^{-x_i})$, applied elementwise to a vector.
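The same finite-difference check applied to the sigmoid; because $\sigma$ acts elementwise, its Jacobian is diagonal, $J = \mathrm{diag}\big(\sigma(\boldsymbol{x}) \odot (1 - \sigma(\boldsymbol{x}))\big)$. The test point `x` below is an arbitrary choice:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_jacobian(x):
    # Elementwise function => diagonal Jacobian: J = diag(sigma(x) * (1 - sigma(x)))
    s = sigmoid(x)
    return np.diag(s * (1.0 - s))

# Central finite differences: column j approximates d sigmoid / d x_j
x = np.array([0.5, -1.0, 2.0])
eps = 1e-6
J_num = np.stack([(sigmoid(x + eps * e) - sigmoid(x - eps * e)) / (2 * eps)
                  for e in np.eye(3)], axis=1)
assert np.allclose(sigmoid_jacobian(x), J_num, atol=1e-6)
```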
- Let $\boldsymbol{x}$ and $\boldsymbol{y}$ be vectors related by $\boldsymbol{y} = g(\boldsymbol{x})$. Let $z = f(\boldsymbol{y})$ be an unspecified loss function. Let $\nabla_{\boldsymbol{x}} z$ and $\nabla_{\boldsymbol{y}} z$ be the gradients of $z$ with respect to $\boldsymbol{x}$ and $\boldsymbol{y}$. Let $\frac{\partial \boldsymbol{y}}{\partial \boldsymbol{x}}$ be the Jacobian of $\boldsymbol{y}$ with respect to $\boldsymbol{x}$. Eq. 6.46 (of the Deep Learning book) tells us that: $\nabla_{\boldsymbol{x}} z = \left(\frac{\partial \boldsymbol{y}}{\partial \boldsymbol{x}}\right)^\top \nabla_{\boldsymbol{y}} z$. Show that if $g$ is applied elementwise, so that $y_i = g(x_i)$, the above can be rewritten $\nabla_{\boldsymbol{x}} z = g'(\boldsymbol{x}) \odot \nabla_{\boldsymbol{y}} z$. If $g$ is the softmax, can $\nabla_{\boldsymbol{x}} z$ be defined the same way?
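A numerical illustration of the point of this exercise, using numpy and the Jacobians from the two previous exercises; the test point `x` and the stand-in upstream gradient `grad_y` are arbitrary choices. For the elementwise sigmoid the diagonal Jacobian makes $J^\top \nabla_{\boldsymbol{y}} z$ collapse to an elementwise product, while for the softmax the off-diagonal terms of $J$ prevent this:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([0.2, -0.7, 1.5])
grad_y = np.array([0.3, -1.0, 0.8])   # stand-in for the upstream gradient dz/dy

# Sigmoid: diagonal Jacobian, so J^T grad_y equals the elementwise product
s = sigmoid(x)
J_sig = np.diag(s * (1 - s))
assert np.allclose(J_sig.T @ grad_y, s * (1 - s) * grad_y)

# Softmax: Jacobian has off-diagonal terms, so the elementwise shortcut fails
p = softmax(x)
J_soft = np.diag(p) - np.outer(p, p)
assert not np.allclose(J_soft.T @ grad_y, p * (1 - p) * grad_y)
```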