# Q15 – Softmax and Cross Entropy

The softmax function for $m$ classes is given by

$p_i = \frac{e^{x_i}}{\sum_{j=1}^m e^{x_j}} \text{ for } i = 1\ldots m$.

It transforms a vector $(x_1, \ldots, x_m)$ of real values into a probability mass vector for a categorical distribution. It is often used in conjunction with the cross-entropy loss
$L(x, y) = - \sum_{i=1}^m y_i \log p_i,$
where $y$ is a target probability vector (often a one-hot encoding of the true class).
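The two definitions above can be sketched numerically; this is a minimal NumPy illustration (the max-subtraction trick for stability is a standard addition, not part of the formulas above):

```python
import numpy as np

def softmax(x):
    # p_i = exp(x_i) / sum_j exp(x_j).
    # Subtracting max(x) avoids overflow and leaves the result unchanged.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def cross_entropy(x, y):
    # L(x, y) = -sum_i y_i * log p_i, with p = softmax(x).
    return -np.sum(y * np.log(softmax(x)))

x = np.array([2.0, 1.0, 0.1])
y = np.array([1.0, 0.0, 0.0])  # one-hot target for class 1
p = softmax(x)
print(p, p.sum())              # probabilities summing to 1
print(cross_entropy(x, y))     # equals -log p_1 for a one-hot y
```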

1. Find a simplified expression for $p_i$ when $m = 2$.
2. Differentiate $p_i$ with respect to $x_k$.
3. Differentiate $L$ with respect to $x_k$.
## Solutions

1. $p_1 = \frac{e^{x_1}}{e^{x_1} + e^{x_2}} = \frac{1}{1 + e^{-(x_1 - x_2)}} = \sigma(x_1 - x_2)$, the logistic sigmoid of the difference $x_1 - x_2$ (and $p_2 = 1 - p_1$).
2. $\displaystyle \frac{\partial p_i}{\partial x_k} = \begin{cases} p_i(1 - p_i) & \text{if } i = k \\ -p_i p_k & \text{if } i \neq k \end{cases}$
3. $\displaystyle \frac{\partial L}{\partial x_k} = \sum_{i=1,\, i \neq k}^{m} y_i p_k - y_k(1 - p_k) = p_k \sum_{i=1}^{m} y_i - y_k$, which simplifies to $p_k - y_k$ when the targets satisfy $\sum_i y_i = 1$.
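The gradient from part 3 can be checked against central finite differences; this sketch assumes a one-hot $y$, so the analytic gradient reduces to $p_k - y_k$:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def loss(x, y):
    return -np.sum(y * np.log(softmax(x)))

rng = np.random.default_rng(0)
m = 5
x = rng.normal(size=m)
y = np.zeros(m)
y[2] = 1.0  # one-hot target, so sum_i y_i = 1

# Analytic gradient from part 3: dL/dx_k = p_k - y_k.
analytic = softmax(x) - y

# Central finite differences, one coordinate at a time.
eps = 1e-6
numeric = np.array([
    (loss(x + eps * np.eye(m)[k], y) - loss(x - eps * np.eye(m)[k], y)) / (2 * eps)
    for k in range(m)
])
print(np.max(np.abs(analytic - numeric)))  # should be close to 0
```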