Q19 – Linear Regression

Consider a linear regression problem with input data \mathbf{X} \in \mathbb{R}^{n\times d}, weights \mathbf{w} \in \mathbb{R}^{d \times 1}, and targets \mathbf{y} \in \mathbb{R}^{n \times 1}. Now suppose that dropout is applied to the input units with probability p.

1) Rewrite the input data matrix taking into account the probability of each unit being dropped out (Hint: whether a unit is dropped out is a Bernoulli random variable with probability p).

2) What is the cost function of linear regression with dropout?

3) Show that applying dropout to the linear regression problem above can be seen as using L2 regularization in the loss function.
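For reference, here is a brief sketch of the expectation computation that parts 1–3 build on (the notation \mathbf{R}, \tilde{\mathbf{X}}, and \Gamma is introduced here and is not part of the original question). Assume a mask matrix \mathbf{R} with independent entries r_{ij} \sim \mathrm{Bernoulli}(1-p), so that each input unit is dropped with probability p, and a squared-error loss:

\tilde{\mathbf{X}} = \mathbf{R} \odot \mathbf{X}, \qquad J(\mathbf{w}) = \mathbb{E}_{\mathbf{R}}\left[ \| \mathbf{y} - (\mathbf{R} \odot \mathbf{X})\mathbf{w} \|^2 \right].

Using \mathbb{E}[r_{ij}] = 1-p, \mathrm{Var}(r_{ij}) = p(1-p), and the independence of the mask entries,

J(\mathbf{w}) = \| \mathbf{y} - (1-p)\mathbf{X}\mathbf{w} \|^2 + p(1-p) \sum_{j=1}^{d} w_j^2 \sum_{i=1}^{n} x_{ij}^2 = \| \mathbf{y} - (1-p)\mathbf{X}\mathbf{w} \|^2 + p(1-p) \| \Gamma \mathbf{w} \|^2,

where \Gamma = \left( \mathrm{diag}(\mathbf{X}^\top \mathbf{X}) \right)^{1/2}. In other words, the expected dropout objective is the ordinary least-squares loss (with rescaled predictions) plus a feature-scaled L2 penalty on \mathbf{w}.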


Q17 – ConvNet Invariances

Question 1: A convolutional neural network (CNN) has the ability to be “insensitive” to slight spatial variations in the input data, such as translations. Compared with regular feed-forward neural networks, the CNN architecture has two components responsible for providing this kind of insensitivity. Explain what those components are and how a CNN can ignore small translations in the input data.
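As a toy illustration of one part of the answer (a sketch added here, not part of the question): because the convolution’s shared weights make the feature response translate along with the input, and pooling then summarizes each local region, a small shift that stays inside a pooling window leaves the pooled output unchanged. A minimal NumPy sketch, with a hypothetical helper max_pool_1d:

import numpy as np

def max_pool_1d(x, size=2):
    # Non-overlapping 1D max pooling: keep the largest value in each window.
    return x.reshape(-1, size).max(axis=1)

# One strong feature activation, and the same activation shifted by one position.
feature_map = np.array([0., 0., 5., 0., 0., 0., 0., 0.])
shifted = np.roll(feature_map, 1)  # small translation of the input pattern

print(max_pool_1d(feature_map))  # [0. 5. 0. 0.]
print(max_pool_1d(shifted))      # [0. 5. 0. 0.]  -> same pooled output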

09 – Demonstration of Implementing Convnets

In this lecture I’ll walk us through training a convnet to do MNIST classification. If time permits, I’ll take requests on demonstrating other methods for trying to improve results.

Code will be posted here beforehand but I’ll try to implement it in class without using any notes.

https://github.com/alexmlamb/convnet_demo_ift6266
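To give a rough idea of what such a demo typically looks like, here is a minimal sketch of an MNIST convnet written in Keras for illustration only; the actual code in the repository above may use a different framework and architecture.

# Minimal convnet for MNIST classification (illustrative sketch, not the repo's code).
from tensorflow import keras
from tensorflow.keras import layers

# Load MNIST, add a channel dimension, and scale pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0
x_test = x_test[..., None].astype("float32") / 255.0

# Two conv/pool blocks followed by a small fully connected classifier.
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(2),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, batch_size=128, epochs=3, validation_split=0.1)
print(model.evaluate(x_test, y_test))  # [test loss, test accuracy]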

I’ll also explain the class project!

https://ift6266h17.wordpress.com/project-description/