From cafed216d76b1fcfa8076e238f3e846535b31552 Mon Sep 17 00:00:00 2001
From: Matt Graham
Date: Tue, 11 Oct 2016 13:40:29 +0100
Subject: [PATCH] Fixing typo in description of back-propagation.

---
 notebooks/03_Multiple_layer_models.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/notebooks/03_Multiple_layer_models.ipynb b/notebooks/03_Multiple_layer_models.ipynb
index cbb1e60..26efdae 100644
--- a/notebooks/03_Multiple_layer_models.ipynb
+++ b/notebooks/03_Multiple_layer_models.ipynb
@@ -103,7 +103,7 @@
     " \\pd{\\bar{E}}{x^{(b)}_d} = \\sum_{k=1}^K \\lpa \\pd{\\bar{E}}{y^{(b)}_k} \\pd{y^{(b)}_k}{x^{(b)}_d} \\rpa.\n",
     "\\end{equation}\n",
     "\n",
-    "Mathematically therefore the `bprop` method takes an array of gradients with respect to the outputs $\\pd{y^{(b)}_k}{x^{(b)}_d}$ and applies a sum-product operation with the partial derivatives of each output with respect to each input $\\pd{y^{(b)}_k}{x^{(b)}_d}$ to produce gradients with respect to the inputs of the layer $\\pd{\\bar{E}}{x^{(b)}_d}$.\n",
+    "Mathematically therefore the `bprop` method takes an array of gradients with respect to the outputs $\\pd{\\bar{E}}{y^{(b)}_k}$ and applies a sum-product operation with the partial derivatives of each output with respect to each input $\\pd{y^{(b)}_k}{x^{(b)}_d}$ to produce gradients with respect to the inputs of the layer $\\pd{\\bar{E}}{x^{(b)}_d}$.\n",
     "\n",
     "For the affine transformation used in the `AffineLayer` implemented last week, i.e a forwards propagation corresponding to \n",
     "\n",
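
For an affine layer with outputs $y^{(b)} = W x^{(b)} + b$, the partial derivatives are $\pd{y^{(b)}_k}{x^{(b)}_d} = W_{kd}$, so the sum-product operation described in the corrected sentence reduces to a single matrix product of the gradients with respect to the outputs and the weight matrix. The Python below is a minimal sketch of that idea; it reuses the `AffineLayer` / `fprop` / `bprop` naming mentioned in the surrounding notebook text, but the body is an illustrative reconstruction rather than the notebook's actual implementation.

    import numpy as np

    class AffineLayer(object):
        """Layer computing an affine transformation of its inputs (sketch)."""

        def __init__(self, weights, biases):
            # weights: shape (output_dim, input_dim); biases: shape (output_dim,)
            self.weights = weights
            self.biases = biases

        def fprop(self, inputs):
            # Forward propagation y^(b) = W x^(b) + b applied to each row of `inputs`:
            # (batch_size, input_dim) -> (batch_size, output_dim)
            return inputs.dot(self.weights.T) + self.biases

        def bprop(self, inputs, outputs, grads_wrt_outputs):
            # Sum-product operation: dE/dx_d = sum_k (dE/dy_k) * (dy_k/dx_d), and for an
            # affine layer dy_k/dx_d = W_kd, so this is a single matrix product.
            # grads_wrt_outputs: (batch_size, output_dim) -> returns (batch_size, input_dim)
            return grads_wrt_outputs.dot(self.weights)

In a stack of such layers, forward propagation calls each layer's `fprop` in order, and back-propagation calls `bprop` in reverse order, with each layer's gradients with respect to its inputs becoming the preceding layer's gradients with respect to its outputs.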