Minor textual modifications

Steve Renals 2015-10-12 09:32:54 +01:00
parent a618c4c9dc
commit 576f9d335b


@@ -97,7 +97,7 @@
"source": [
"## Task 1 - Sigmoid Layer (15%)\n",
"\n",
"In this task you need to create a class `Sigmoid` which encapsulates a layer of `sigmoid` units. You should do this by extending the `mlp.layers.Linear` class (in file `mlp/layers.py`), which implements a a layer of linear units (i.e. weighted sum plus bias). The `Sigmoid` class extends this by applying the `sigmoid` transfer function to the weighted sum in the forward propagation, and applying the derivative of the `sigmoid` in the gradient descent back propagation and computing the gradients with respect to layer's parameters. Do **not** copy the implementation provided in `Linear` class but rather, **reuse** it through inheritance.\n",
"In this task you need to create a class `Sigmoid` which encapsulates a layer of sigmoid units. You should do this by extending the `mlp.layers.Linear` class (in file `mlp/layers.py`), which implements a a layer of linear units (i.e. weighted sum plus bias). The `Sigmoid` class extends this by applying the sigmoid transfer function to the weighted sum in the forward propagation, and applying the derivative of the sigmoid in the gradient descent back propagation and computing the gradients with respect to layer's parameters. Do **not** copy the implementation provided in `Linear` class but rather, **reuse** it through inheritance.\n",
"\n",
"When you have implemented `Sigmoid` (in the `mlp.layers` module), then please test it by running the below code cell.\n"
]
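For reference, a minimal sketch of what such a subclass could look like is given below. The `Linear` method names and signatures used here (`fprop`, `bprop`, and a `(deltas, ograds)` return pair) are assumptions inferred from the test cell above, not the definitive framework API.

```python
import numpy

from mlp.layers import Linear


class Sigmoid(Linear):
    """Layer of sigmoid units; reuses the affine transform inherited from Linear."""

    def fprop(self, inputs):
        # assumed: Linear.fprop returns the weighted sum plus bias
        a = super(Sigmoid, self).fprop(inputs)
        return 1.0 / (1.0 + numpy.exp(-a))

    def bprop(self, h, igrads):
        # sigmoid'(a) = h * (1 - h), where h is the forward-pass output
        deltas = igrads * h * (1.0 - h)
        # assumed: Linear.bprop propagates the deltas back through the weights
        # and returns the (deltas, ograds) pair printed in the test cell
        return super(Sigmoid, self).bprop(h, deltas)
```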
@@ -123,7 +123,11 @@
"\n",
"print fp.sum()\n",
"print deltas.sum()\n",
"print ograds.sum()"
"print ograds.sum()\n",
"%precision 3\n",
"print fp\n",
"print deltas\n",
"print ograds\n"
]
},
{
@@ -151,7 +155,7 @@
"source": [
"## Task 2 - Softmax (15%)\n",
"\n",
"In this task you need to create a class `Softmax` which encapsulates a layer of `softmax` units. As in the previous task, you should do this by extending the `mlp.layers.Linear` class (in file `mlp/layers.py`).\n",
"In this task you need to create a class `Softmax` which encapsulates a layer of softmax units. As in the previous task, you should do this by extending the `mlp.layers.Linear` class (in file `mlp/layers.py`).\n",
"\n",
"When you have implemented `Softmax` (in the `mlp.layers` module), then please test it by running the below code cell.\n"
]
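A possible shape for this subclass is sketched below, under the same assumed `Linear.fprop` interface as in Task 1; the per-row max subtraction is a standard numerical-stability trick, and the interaction with the cross-entropy cost is noted only as a comment.

```python
import numpy

from mlp.layers import Linear


class Softmax(Linear):
    """Layer of softmax units; reuses the affine transform inherited from Linear."""

    def fprop(self, inputs):
        # assumed: Linear.fprop returns the weighted sum plus bias
        a = super(Softmax, self).fprop(inputs)
        # subtract the per-row maximum before exponentiating for numerical stability
        a = a - a.max(axis=1, keepdims=True)
        exp_a = numpy.exp(a)
        return exp_a / exp_a.sum(axis=1, keepdims=True)

    # Note: when softmax is paired with the cross-entropy cost (mlp.costs.CECost),
    # the output-layer deltas simplify to (y - t), so the backward pass can reuse
    # the inherited Linear machinery once those deltas are formed.
```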
@@ -177,7 +181,11 @@
"\n",
"print fp.sum()\n",
"print deltas.sum()\n",
"print ograds.sum()"
"print ograds.sum()",
"%precision 3\n",
"print fp\n",
"print deltas\n",
"print ograds\n"
]
},
{
@@ -206,10 +214,10 @@
"## Task 3 - Multi-layer network for MNIST classification (40%)\n",
"\n",
"**(a)** (20%) Building on the single layer linear network for MNIST classification used in lab [02_MNIST_SLN.ipynb](02_MNIST_SLN.ipynb), and using the `Sigmoid` and `Softmax` classes that you implemented in tasks 1 and 2, construct and learn a model that classifies MNIST images and:\n",
" * Has one hidden layer with a `sigmoid` transfer function and 100 units\n",
" * Uses a `softmax` output layer to discriminate between the 10 digit classes (use the `mlp.costs.CECost()` cost)\n",
" * Has one hidden layer with a sigmoid transfer function and 100 units\n",
" * Uses a softmax output layer to discriminate between the 10 digit classes (use the `mlp.costs.CECost()` cost)\n",
"\n",
"Your code should print the final values of the error function and the classification accuracy for train, validation, and test sets (please keep also the log information printed by default by the optimiser). Limit the number of training epochs to 30. You can, of course, split the solution at as many cells as you think is necessary."
"Your code should print the final values of the error function and the classification accuracy for train, validation, and test sets (please keep also the log information printed by default by the optimiser). Limit the number of training epochs to 30. You can, of course, split your code across as many cells as you think is necessary."
]
},
{
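As a rough, non-authoritative sketch of how the Task 3 pieces fit together: the class and call names below (`MLP`, `SGDOptimiser`, `LearningRateFixed`, `add_layer`, `train`, `validate`, and the data provider variables `train_dp`, `valid_dp`, `test_dp`) are assumptions modelled on the 02_MNIST_SLN lab conventions; check the actual notebook and framework for the exact API.

```python
# Hypothetical assembly of the Task 3 model; names follow the lab conventions
# and are assumptions, not the definitive mlp framework API.
from mlp.costs import CECost
from mlp.layers import MLP, Sigmoid, Softmax
from mlp.optimisers import SGDOptimiser
from mlp.schedulers import LearningRateFixed

cost = CECost()
model = MLP(cost=cost)
model.add_layer(Sigmoid(idim=784, odim=100))  # hidden layer: 100 sigmoid units
model.add_layer(Softmax(idim=100, odim=10))   # output layer: 10 digit classes

# limit training to 30 epochs, as required by the task
lr_scheduler = LearningRateFixed(learning_rate=0.1, max_epochs=30)
optimiser = SGDOptimiser(lr_scheduler=lr_scheduler)

# train_dp / valid_dp / test_dp stand for the MNIST data providers set up
# as in 02_MNIST_SLN.ipynb
optimiser.train(model, train_dp, valid_dp)
cost_test, acc_test = optimiser.validate(model, test_dp)
```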