diff --git a/notebooks/08_Introduction_to_TensorFlow.ipynb b/notebooks/08_Introduction_to_TensorFlow.ipynb
index efd311f..72eb7e7 100644
--- a/notebooks/08_Introduction_to_TensorFlow.ipynb
+++ b/notebooks/08_Introduction_to_TensorFlow.ipynb
@@ -526,7 +526,7 @@
    },
    "outputs": [],
    "source": [
-    "show_graph(tf.get_default_graph())"
+    "show_graph(tf.get_default_graph())"
    ]
   },
   {
@@ -622,8 +622,30 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "That completes our basic introduction to TensorFlow. If you want more to explore more of TensorFlow before beginning your project for this semester, you may wish to go through some of the [official tutorials](https://www.tensorflow.org/tutorials/) or some of the many sites with unofficial tutorials e.g. the series of notebooks [here](https://github.com/aymericdamien/TensorFlow-Examples)."
+    "That completes our basic introduction to TensorFlow. If you want to explore more of TensorFlow before beginning your project for this semester, you may wish to go through some of the [official tutorials](https://www.tensorflow.org/tutorials/) or some of the many sites with unofficial tutorials, e.g. the series of notebooks [here](https://github.com/aymericdamien/TensorFlow-Examples). If you have time, you may also wish to have a go at the optional exercise below."
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Optional exercise: multiple-layer MNIST classifier using `contrib` modules\n",
+    "\n",
+    "As well as the core officially supported codebase, TensorFlow is distributed with a series of contributed modules under [`tensorflow.contrib`](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib). These tend to provide higher-level interfaces for constructing and running common forms of computational graphs, which can allow models to be built with much more concise code. The interfaces of the `contrib` modules tend to be (even) less stable than the core TensorFlow Python interface, and they are also more restricted in the sorts of models that can be created. It is therefore worthwhile to also be familiar with constructing models using the operations available in the core TensorFlow codebase; you can also often mix and match 'native' TensorFlow operations and functions from `contrib` modules.\n",
+    "\n",
+    "As an optional extension exercise, construct a deep MNIST classifier model, either using TensorFlow operations directly as above or using one (or more) of the higher-level interfaces defined in `contrib` modules such as [`tensorflow.contrib.learn`](https://www.tensorflow.org/tutorials/tflearn/), [`tensorflow.contrib.layers`](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/layers) or [`tensorflow.contrib.slim`](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/slim). You should choose an appropriate model architecture (number and width of layers) and activation function based on your experience fitting models last semester.\n",
+    "\n",
+    "As well as exploring the use of the interfaces in `contrib` modules, you may wish to explore the more advanced optimizers available in [`tensorflow.train`](https://www.tensorflow.org/versions/r0.11/api_docs/python/train) such as [`tensorflow.train.AdamOptimizer`](https://www.tensorflow.org/versions/r0.11/api_docs/python/train/optimizers#AdamOptimizer) and [`tensorflow.train.AdagradOptimizer`](https://www.tensorflow.org/versions/r0.11/api_docs/python/train/optimizers#AdagradOptimizer), corresponding to the adaptive learning rules implemented in the first coursework last semester."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "collapsed": true
+   },
+   "outputs": [],
+   "source": []
+  }
  ],
  "metadata": {
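For reference, here is a minimal sketch of the kind of model the new optional-exercise cell asks for, mixing `tf.contrib.layers` with 'native' TensorFlow operations and training with `tf.train.AdamOptimizer`. It assumes a 1.x-era TensorFlow installation in which `tensorflow.contrib` and the `tensorflow.examples.tutorials.mnist` data helpers still exist; the layer widths, learning rate, batch size and number of training steps are illustrative choices, not values from the notebook.

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST with one-hot targets (downloads the data on first run).
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

# Placeholders for flattened 28x28 images and 10-class targets.
inputs = tf.placeholder(tf.float32, [None, 784])
targets = tf.placeholder(tf.float32, [None, 10])

# Two ReLU hidden layers built with tf.contrib.layers plus a linear
# output layer; 200 units per layer is an arbitrary illustrative width.
hidden_1 = tf.contrib.layers.fully_connected(inputs, 200, activation_fn=tf.nn.relu)
hidden_2 = tf.contrib.layers.fully_connected(hidden_1, 200, activation_fn=tf.nn.relu)
logits = tf.contrib.layers.fully_connected(hidden_2, 10, activation_fn=None)

# Mean softmax cross-entropy error and an Adam training step.
error = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=targets))
train_step = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(error)

# Proportion of correctly classified examples, for monitoring.
accuracy = tf.reduce_mean(tf.cast(
    tf.equal(tf.argmax(logits, 1), tf.argmax(targets, 1)), tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(5000):
        batch_inputs, batch_targets = mnist.train.next_batch(50)
        sess.run(train_step, {inputs: batch_inputs, targets: batch_targets})
    print('Test accuracy: {0:.3f}'.format(
        sess.run(accuracy, {inputs: mnist.test.images,
                            targets: mnist.test.labels})))
```

Swapping in `tf.train.AdagradOptimizer` (or any other optimizer from `tf.train`) is a one-line change to the `train_step` definition, which makes it straightforward to compare the adaptive learning rules from the first coursework.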