07 – Theano Tutorial

Your TA, Alex Lamb, will present a Theano tutorial and review the first assignment.

Intro to Theano (slides from Ian Goodfellow)

Advanced Theano:

https://drive.google.com/file/d/0B64011x02sIkOVpPY1B2WmVYa3c/edit

Theano Internals:

https://drive.google.com/file/d/0B64011x02sIkWW9LLVV6QWtzRTg/edit

Extra fun Theano resources:

http://deeplearning.net/tutorial/logreg.html#logreg

http://deeplearning.net/tutorial/mlp.html#mlp

https://arxiv.org/abs/1605.02688


2 thoughts on “07 – Theano Tutorial”

  1. Thursday’s tutorial was very insightful; thank you for organizing it. However, when we discussed ways to build convnets in Theano, you mentioned (I think) that one of the inputs would be a 4D tensor shared variable. Could you please elaborate on that? Which input would that be, and why 4D?

    Thank you,
    Mahmoud Nassif


    • In a convolutional layer you have a unit for each “feature” in each position. This makes the value a 3D tensor for a single example. If you use a minibatch, it becomes a 4D tensor.

      The canonical way to access this tensor is:

      (minibatch example, filter index, x position, y position).

      For example, if you have a batch of 64 images of size 256×256 with three colour channels, then the input tensor will have shape:

      (64, 3, 256, 256)

      If you downsample 2× in the first convolutional layer (either by striding or pooling) and use 100 feature maps for that layer, the next layer would typically have a shape like:

      (64, 100, 128, 128)
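      A minimal sketch of the shape convention above, using NumPy arrays as stand-ins for Theano's 4D shared variables (in Theano itself you would wrap such an array with `theano.shared` before feeding it to a convolutional layer):

```python
import numpy as np

# Axis convention from the reply above:
# (minibatch example, filter index, x position, y position)

# A minibatch of 64 RGB images of size 256x256.
batch = np.zeros((64, 3, 256, 256), dtype=np.float32)

# After a hypothetical first conv layer with 100 feature maps and
# 2x downsampling (by striding or pooling), the activations for the
# same minibatch would have this shape:
activations = np.zeros((64, 100, 128, 128), dtype=np.float32)

print(batch.shape)        # (64, 3, 256, 256)
print(activations.shape)  # (64, 100, 128, 128)
```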

