# Q13 – Activation Functions II

Contributed by Pulkit Khandelwal.

Consider the neural network shown in the figure below. The network has linear activation functions. The various weights are defined as shown in the figure, and the output of each unit is multiplied by some constant $k$.

4. Now let the hidden units use sigmoid activation functions and let the output unit use a threshold activation function. Find weights that cause this network to compute the XOR of $X_{1}$ and $X_{2}$ for binary-valued $X_{1}$ and $X_{2}$. Assume that there are no bias terms.
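Since the figure is not reproduced here, the sketch below assumes the usual architecture for this exercise: two inputs, two sigmoid hidden units, and one threshold output unit that fires 1 when its net input is positive, with no bias anywhere. The weight values are one illustrative solution, not the unique answer. The key observation is that with no bias, the $(0,0)$ input still produces hidden activations of $\sigma(0)=0.5$, so the output weights must be chosen to map $(0.5, 0.5)$ to the negative side of the threshold while keeping the $(0,1)$ and $(1,0)$ cases positive.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def step(z):
    # Threshold activation with no bias: fire iff net input is positive.
    return 1 if z > 0 else 0

# Hypothetical weights (the figure is unavailable; this assumes a
# 2-input, 2-hidden-unit, 1-output architecture with no bias terms):
W_hidden = [(10.0, 10.0),   # hidden unit 1: saturates near 1 for any active input
            (1.5, 1.5)]     # hidden unit 2: rises more gradually with the input sum
w_out = (0.9, -1.0)         # output weights: v1 > 0, v2 < 0

def network(x1, x2):
    h = [sigmoid(w1 * x1 + w2 * x2) for (w1, w2) in W_hidden]
    z = w_out[0] * h[0] + w_out[1] * h[1]
    return step(z)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"XOR({x1}, {x2}) = {network(x1, x2)}")
```

Checking the four cases: for $(0,0)$, $z = 0.9 \cdot 0.5 - 1 \cdot 0.5 = -0.05 < 0$; for $(0,1)$ or $(1,0)$, $z \approx 0.9 \cdot 1.0 - 0.82 > 0$; for $(1,1)$, $z \approx 0.9 \cdot 1.0 - 0.95 < 0$. The second hidden unit's small weights are what distinguish $(1,1)$ from a single active input: its sigmoid has not yet saturated at a net input of $1.5$, but nearly has at $3$.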