From 174445af8fb141909bff323f1c27bcbc8bc92430 Mon Sep 17 00:00:00 2001
From: pswietojanski
Date: Thu, 1 Oct 2015 13:26:08 +0100
Subject: [PATCH 1/2] changes to dataset
---
00_Introduction.ipynb | 389 ------------------------------------------
mlp/dataset.py | 105 +++++++++++-
2 files changed, 97 insertions(+), 397 deletions(-)
delete mode 100644 00_Introduction.ipynb
diff --git a/00_Introduction.ipynb b/00_Introduction.ipynb
deleted file mode 100644
index 989191f..0000000
--- a/00_Introduction.ipynb
+++ /dev/null
@@ -1,389 +0,0 @@
-{
- "cells": [
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# Introduction\n",
- "\n",
- "This notebook shows how to set-up a working python envirnoment for the Machine Learning Practical course.\n",
- "\n",
- "\n",
- "# Setting up the software\n",
- "\n",
- "Within this course we are going to work with python (using some auxiliary libraries like numpy and scipy). Depending on the infrastracture and working environment (e.g. DICE), root permission may not be not available so the packages cannot be installed in default locations. A convenient python configuration, which allows us to install and update third party libraries easily using package manager, are so called virtual environments. They can be also used to work (and test) the code with different versions of software.\n",
- "\n",
- "## Instructions for Windows\n",
- "\n",
- "The fastest way to get working setup on Windows is to install Anaconda (http://www.continuum.io) package. It's a python environment with precompiled versions of the most popular scientific python libraries. It also works on MacOS, but numpy is not linked without a fee to a numerical library, hence for MacOS we recommend the following procedure.\n",
- "\n",
- "## Instructions for MacOS\n",
- "\n",
- " * Install macports following instructions at https://www.macports.org/install.php\n",
- " * Install the relevant python packages in macports\n",
- "\n",
- " ```\n",
- " sudo port install py27-scipy +openblas\n",
- " sudo port install py27-ipython +notebook\n",
- " sudo port install py27-notebook\n",
- " sudo port install py27-matplotlib\n",
- " sudo port select --set python python27\n",
- " sudo port select --set ipython2 py27-ipython\n",
- " sudo port select --set ipython py27-ipython\n",
- " ```\n",
- "\n",
- "Make sure that your `$PATH` has `/opt/local/bin` before `/usr/bin` so you pick up the version of python you just installed.\n",
- "\n",
- "## Instructions for DICE:\n",
- "\n",
- "### Directory structure and getting things organised\n",
- "\n",
- "To get things somehow standarized between people, and make life of everyone easier, we propse to organise your DICE setup in the following directory structure:\n",
- "\n",
- " * `~/mlpractical/` -- for a general course repository\n",
- " * `~/mlpractical/repos-3rd` -- for stuff you download, build and install (numpy, OpenBlas, virtualenv)\n",
- " * `~/mlpractical/repo-mlp` -- this is the actual course repository you clone from our website (do not create a dir for it yet!)\n",
- " * `~/mlpractical/venv` -- this is where virutal envirnoment will make its dir (do not create a dir for it yet!)\n",
- "\n",
- "Create now repos-3rd directory (option -p in the below command will automatically create (non-existing) **p**arent directories (mlpractical):\n",
- "\n",
- " * `mkdir -p ~/mlpractical/repos-3rd`\n",
- "\n",
- "And now, let us set an MLP_WDIR environmental variable (MLP Working DIRectory) that will keep an absolute path of working dir pointing to `~/mlpractial`, **add the below line** to your `~/.bashrc` file (if it does not exists, create one!):\n",
- "\n",
- "```\n",
- "export MLP_WDIR=~/mlpractical\n",
- "```\n",
- "\n",
- "Now re-source `~/.bashrc` by typing (so the env variables get updated!): `source ~/.bashrc`\n",
- "\n",
- "Enter the `repos-3rd` directory by typing: `cd ~/mlpractical/repos-3rd` (or ```cd $MLP_WDIR/repos-3rd``` if you want)\n",
- "\n",
- "### Configuring virtual environment\n",
- "\n",
- "Make sure you are in `repos-3rd` directory and that MLP_WDIR variable has been exported (you may type export in the terminal and examine the list of availabe variables in the current session), then type:\n",
- "\n",
- " * `git clone https://github.com/pypa/virtualenv`\n",
- " * Enter the cloned repository and type ```./virtualenv.py --python /usr/bin/python2.7 --no-site-packages $MLP_WDIR/venv```\n",
- " * Activate the environment by typing `source ~/mlpractical/venv/bin/activate` (to leave the virtual environment one may type `decativate`)\n",
- " * Environments need to be activated every time ones start the new session so we will now create a handy alias to it in `~/.bashrc` script, by typing the below command (note, MLP_WDIR export needs to preceed this command):\n",
- " \n",
- " ```alias activate_mlp=\"source $MLP_WDIR/venv/bin/activate\"```\n",
- " \n",
- "Then every time you open new session and want to activate the right virtual environment, simply type `activate_mlp` instead `source ~/mlpractical/venv/bin/activate`. Note, you need to re-soure the .bashrc in order alias to be visible in the current session.\n",
- "\n",
- "### Installing remaining packages\n",
- "\n",
- "Then, before you follow next, install/upgrade the following packages:\n",
- "\n",
- "```\n",
- "pip install --upgrade pip\n",
- "pip install setuptools\n",
- "pip install setuptools --upgrade\n",
- "pip install ipython\n",
- "pip install notebook\n",
- "```\n",
- "\n",
- "### Installing numpy\n",
- "\n",
- "Note, having virtual environment properly installed one may then run `pip install numpy` to use pip to install numpy, though this will most likely lead to the suboptimal configuration where numpy is linked to ATLAS numerical library, which on DICE is compiled in multi-threaded mode. This means whenever numpy use BLAS accelerated computations (using ATLAS), it will use **all** the available cores at the given machine. This happens because ATLAS can be compiled to either run computations in single *or* multi threaded modes. However, contrary to some other backends, the latter does not allow to use an arbitrary number of threads (specified by the user prior to computation). This is highly suboptimal, as the potential speed-up resulting from paralleism depends on many factors like the communication overhead between threads, the size of the problem, etc. Using all cores for our exercises is not-necessary.\n",
- "\n",
- "For which reason, we are going to compile our own version of BLAS package, called *OpenBlas*. It allows to specify the number of threads manually by setting an environmental variable OMP_NUM_THREADS=N, where N is a desired number of parallel threads (please use 1 by default). You can set an environment variable in the current shell by running\n",
- "\n",
- "```\n",
- "export OMP_NUM_THREADS=1\n",
- "```\n",
- "\n",
- "(note the lack of spaces around the equals sign and use of `export` to define an environment variable which will be available in sub-shells rather than just a variable local to the current shell).\n",
- "\n",
- "#### OpenBlas\n",
- "\n",
- "Enter again repos-3rd directory (`cd ~/mlpractical/repos-3rd) and copy into terminal the following commands (one at the time):\n",
- "\n",
- "```\n",
- "OBDir=$MLP_WDIR/repos-3rd/OpenBLAS\n",
- "git clone git://github.com/xianyi/OpenBLAS\n",
- "cd OpenBLAS\n",
- "make\n",
- "make PREFIX=$OBDir/lib install\n",
- "```\n",
- "\n",
- "Once OpenBLAS is finished compiling we need to ensure the compiled shared library files in the `lib` subdirectory are available to the shared library loader. This can be done by appending the absolute path to the `lib` subdirectory to the `LD_LIBRARY_PATH` environment variable. To ensure this changes persist we will change the bash start up file `~/.bashrc` by opening it in a text editor (e.g. by running `gedit ~/.bashrc`) and adding the following line\n",
- "\n",
- "```\n",
- "export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$MLP_WDIR/repos-3rd/OpenBLAS/lib\n",
- "```\n",
- "\n",
- "Note, we again are using MLP_WDIR here, so the above line needs to be placed after you set MLP_WDIR.\n",
- "\n",
- "After you have edited `.bashrc` run\n",
- "\n",
- "```\n",
- "source ~/.bashrc\n",
- "source ~/mlpractical/venv/bin/activate\n",
- "```\n",
- "\n",
- "to rerun the bash start up script make sure the new environment variable is available in the current shell and then reactivate the virtual environment.\n",
- "\n",
- "#### Numpy\n",
- "\n",
- "To install `numpy` linked against the OpenBLAS libraries we just compiled, first run the following commands (one at a time)\n",
- "\n",
- "```\n",
- "cd ~/mlpractical/repos-3rd/\n",
- "wget http://downloads.sourceforge.net/project/numpy/NumPy/1.9.2/numpy-1.9.2.zip\n",
- "unzip numpy-1.9.2.zip\n",
- "cd numpy-1.9.2\n",
- "echo \"[openblas]\" >> site.cfg\n",
- "echo \"library_dirs = $OBDir/lib\" >> site.cfg\n",
- "echo \"include_dirs = $OBDir/include\" >> site.cfg\n",
- "python setup.py build --fcompiler=gnu95\n",
- "```\n",
- "\n",
- "Assuming the virtual environment is activated, the below command will install numpy in a desired space (`~/mlpractical/venv/...`):\n",
- "\n",
- "```\n",
- "python setup.py install\n",
- "```\n",
- "\n",
- "Now use pip to install remaining packages: `scipy`, `matplotlib`, `argparse`, `nose`,\n",
- "\n",
- "### Getting the mlpractical repository\n",
- "\n",
- "Clone the course repository from the github, by navigating to `~/mlpractical` directory and typing:\n",
- "\n",
- "```\n",
- "git clone https://github.com/CSTR-Edinburgh/mlpractical.git repo-mlp\n",
- "```\n",
- "\n",
- "When download is ready, enter the repo-mlp directory (`cd repo-mlp`) and start the actual interactive notebook session by typing:\n",
- "\n",
- "```\n",
- "ipython notebook\n",
- "```\n",
- "\n",
- "This should start a ipython server which opens a new browser window listing files in `repo-mlp` directory, including `00_Introduction.ipynb.`. Open it and run (from the browser interface) the following examples and exercies."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false,
- "scrolled": true
- },
- "outputs": [],
- "source": [
- "%clear\n",
- "import numpy\n",
- "# show_config() prints the configuration of numpy numerical backend \n",
- "# you should be able to see linkage to OpenBlas or some other library\n",
- "# in case those are empty, it means something went wrong and \n",
- "# numpy will use a default (slow) pythonic implementation for algebra\n",
- "numpy.show_config()\n",
- "#numpy.test()"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Also, below we check whether and how much speedup one may expect by using different number of cores:"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "import os\n",
- "import multiprocessing\n",
- "import timeit\n",
- "\n",
- "num_cores = multiprocessing.cpu_count()\n",
- "N = 1000\n",
- "x = numpy.random.random((N,N))\n",
- "\n",
- "for i in xrange(0, num_cores):\n",
- " # first, set the number of threads OpenBLAS\n",
- " # should use, the below line is equivalent\n",
- " # to typing export OMP_NUM_THREADS=i+1 in bash shell\n",
- " print 'Running matrix-matrix product on %i core(s)' % i\n",
- " os.environ['OMP_NUM_THREADS'] = str(i+1)\n",
- " %%timeit numpy.dot(x,x.T)"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "Test whether you can plot and display the figures using pyplot"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "import matplotlib.pyplot as plt\n",
- "\n",
- "x = numpy.linspace(0.0, 2*numpy.pi, 100)\n",
- "y1 = numpy.sin(x)\n",
- "y2 = numpy.cos(x)\n",
- "\n",
- "plt.plot(x, y1, lw=2, label=r'$\\sin(x)$')\n",
- "plt.plot(x, y2, lw=2, label=r'$\\cos(x)$')\n",
- "plt.xlabel('x')\n",
- "plt.ylabel('y')\n",
- "plt.legend()\n",
- "plt.xlim(0.0, 2*numpy.pi)\n",
- "plt.grid()\n",
- "plt.show()"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "# Exercises\n",
- "\n",
- "Today exercises are meant to get you familiar with ipython notebooks (if you haven't used them so far), data organisation and how to access it. Next week onwars, we will follow with the material covered in lectures.\n",
- "\n",
- "## Data providers\n",
- "\n",
- "Open (in the browser) `mlp.dataset` module (go to `Home` tab and navigate to mlp package, then click on the link `dataset.py`). Have a look thourgh the code and comments, then follow to exercises.\n",
- "\n",
- "General note: you can load the mlp code into your favourite python IDE but it is totally OK if you work (modify & save) the code directly in the browser by opening/modyfing the necessary modules in the tabs.\n",
- "\n",
- "### Exercise 1 \n",
- "\n",
- "Using MNISTDataProvider, write a code that iterates over the first 5 minibatches of size 100 data-points. Print MNIST digits in 10x10 images grid plot. Images are returned from the provider as tuples of numpy arrays `(features, targets)`. The `features` matrix has shape BxD while the `targets` vector is of size B, where B is the size of a mini-batch and D is dimensionality of the features. By deafult, each data-point (image) is stored in a 784 dimensional vector of pixel intensities normalised to [0,1] range from an inital integer values [0-255]. However, the original spatial domain is two dimensional, so before plotting you need to convert it into 2D matrix (MNIST images have the same number of pixels for height and width).\n",
- "\n",
- "Tip: Useful functions for this exercise are: imshow, subplot, gridspec"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "import matplotlib.pyplot as plt\n",
- "import matplotlib.gridspec as gridspec\n",
- "import matplotlib.cm as cm\n",
- "from mlp.dataset import MNISTDataProvider\n",
- "\n",
- "def show_mnist_image(img):\n",
- " fig = plt.figure()\n",
- " gs = gridspec.GridSpec(1, 1)\n",
- " ax1 = fig.add_subplot(gs[0,0])\n",
- " ax1.imshow(img, cmap=cm.Greys_r)\n",
- " plt.show()\n",
- "\n",
- "def show_mnist_images(batch):\n",
- " raise NotImplementedError('Write me!')\n"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "# An example for a single MNIST image\n",
- "mnist_dp = MNISTDataProvider(dset='valid', batch_size=1, max_num_examples=2, randomize=False)\n",
- "\n",
- "for batch in mnist_dp:\n",
- " features, targets = batch\n",
- " show_mnist_image(features.reshape(28, 28))"
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "#implement here Exercise 1"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "### Exercise 2\n",
- "\n",
- "`MNISTDataProvider` as `targets` currently returns a vector of integers, each element in this vector represents an id of the category `features` data-point represent. Later in the course we are going to need 1-of-K representation of targets, for instance, given the minibatch of size 3 and the corresponding targets vector $[2, 2, 0]$ (and assuming there are only 3 different classes to discriminate between), one needs to convert it into matrix $\\left[ \\begin{array}{ccc}\n",
- "0 & 0 & 1 \\\\\n",
- "0 & 0 & 1 \\\\\n",
- "1 & 0 & 0 \\end{array} \\right]$. \n",
- "\n",
- "Implement `__to_one_of_k` method of `MNISTDataProvider` class. Then modify (uncomment) an appropriate line in its `next` method, so the raw targets get converted to `1 of K` coding. Test the code in the cell below."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": true
- },
- "outputs": [],
- "source": []
- },
- {
- "cell_type": "markdown",
- "metadata": {
- "collapsed": true
- },
- "source": [
- "### Exercise 3\n",
- "\n",
- "Write your own data provider `MetOfficeDataProvider` that wraps the weather data for south Scotland (could be obtained from: http://www.metoffice.gov.uk/hadobs/hadukp/data/daily/HadSSP_daily_qc.txt). The file was also downloaded and stored in `data` directory for your convenience. The provider should return a tuple `(x,t)` of the estimates over an arbitrary time windows (i.e. last N-1 days) for `x` and the N-th day as the one which model should be able to predict, `t`. For now, skip missing data-points (denoted by -99.9) and simply use the next correct value. Make sure the provider works for arbitrary `batch_size` settings, including the case where single mini-batch is equal to all datapoints in the dataset. Test the dataset in the cell below.\n",
- "\n",
- "Tip: To follow with this exercise, copy MNISTDataProvider in dataset.py, rename it to `MetOfficeDataProvider` and reimplement necesarry parts (including the arguments you pass to the constructor)."
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": true
- },
- "outputs": [],
- "source": []
- }
- ],
- "metadata": {
- "kernelspec": {
- "display_name": "Python 2",
- "language": "python",
- "name": "python2"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 2
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython2",
- "version": "2.7.9"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 0
-}
diff --git a/mlp/dataset.py b/mlp/dataset.py
index 2c7ff12..01156f5 100644
--- a/mlp/dataset.py
+++ b/mlp/dataset.py
@@ -42,11 +42,16 @@ class DataProvider(object):
raise NotImplementedError()
def __iter__(self):
+ """
+ This method says an object is iterable.
+ """
return self
def next(self):
"""
- Data-specific iteration mechanism.
+ Data-specific iteration mechanism. Called each step
+ (i.e. each iteration in a loop)
+ unitl StopIteration() exception is raised.
:return:
"""
raise NotImplementedError()
@@ -94,7 +99,7 @@ class MNISTDataProvider(DataProvider):
def __randomize(self):
assert isinstance(self.x, numpy.ndarray)
- return numpy.random.permute(numpy.arange(0, self.x.shape[0]))
+ return numpy.random.permutation(numpy.arange(0, self.x.shape[0]))
def next(self):
@@ -117,18 +122,100 @@ class MNISTDataProvider(DataProvider):
self._curr_idx += self.batch_size
- #return rval_x, self.__to_one_of_k(rval_y)
+ return rval_x, self.__to_one_of_k(rval_y)
return rval_x, rval_t
def __to_one_of_k(self, y):
- raise NotImplementedError('Write me!')
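+ # Build the 1-of-K (one-hot) matrix row by row; for example (hypothetical
+ # values), y = numpy.array([2, 0]) with self.num_classes == 3 gives
+ # [[0., 0., 1.], [1., 0., 0.]].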
+ rval = numpy.zeros((y.shape[0], self.num_classes), dtype=numpy.float32)
+ for i in xrange(y.shape[0]):
+ rval[i, y[i]] = 1
+ return rval
+class MetOfficeDataProvider_(DataProvider):
+ """
+ The class iterates over South Scotland Weather, in possibly
+ random order.
+ """
+ def __init__(self, window_size,
+ batch_size=10,
+ max_num_batches=-1,
+ randomize=True):
+
+ super(MetOfficeDataProvider_, self).\
+ __init__(batch_size, randomize)
+
+ dset_path = './data/HadSSP_daily_qc.txt'
+ assert os.path.isfile(dset_path), (
+ "File %s was expected to exist!." % dset_path
+ )
+
+ raw = numpy.loadtxt(dset_path, skiprows=3, usecols=range(2, 32))
+
+ self.window_size = windows_size
+ #filter out all missing datapoints and
+ #flatten a matrix to a vector, so we will get
+ #a time preserving representation of measurments
+ #with self.x[0] being the first day and self.x[-1] the last
+ self.x = raw[raw < 0].flatten()
+ self._max_num_examples = max_num_examples
+
+ self._rand_idx = None
+ if self.randomize:
+ self._rand_idx = self.__randomize()
+
+ def reset(self):
+ super(MetOfficeDataProvider_, self).reset()
+ if self.randomize:
+ self._rand_idx = self.__randomize()
+
+ def __randomize(self):
+ assert isinstance(self.x, numpy.ndarray)
+ # we generate random indexes starting from window_size, i.e. 10th absolute element
+ # in the self.x vector, as we later during minibatch preparation slice
+ # the self.x container backwards, i.e. given we want to get a training
+ # data-point for 11th day, we look at 10 preeceding days.
+ # Note, we cannot do this, for example, for the 5th day as
+ # we do not have enough observations to make an input (10 days) to the model
+ return numpy.random.permutation(numpy.arange(self.window_size, self.x.shape[0]))
+
+ def next(self):
+
+ has_enough = (self._curr_idx + self.batch_size) <= self.x.shape[0]
+ presented_max = (self._max_num_examples > 0 and
+ self._curr_idx + self.batch_size > self._max_num_examples)
+
+ if not has_enough or presented_max:
+ raise StopIteration()
+
+ if self._rand_idx is not None:
+ range_idx = \
+ self._rand_idx[self._curr_idx:self._curr_idx + self.batch_size]
+ else:
+ range_idx = \
+ numpy.arange(self._curr_idx, self._curr_idx + self.batch_size)
+
+ #build slicing matrix of size minibatch, which will contain batch_size
+ #rows, each keeping indexes that selects windows_size+1 [for (x,t)] elements
+ #from data vector (self.x) that itself stays always sorted w.r.t time
+ range_slices = numpy.zeros((self.batch_size, self.window_size + 1))
+ for i in xrange(0, self.batch_size):
+ range_slices[i,:] = \
+ numpy.arange(range_idx[i], range_idx[i] - self.window_size - 1, -1)[::-1]
+
+ #here we use advanced indexing to select slices from observation vector
+ #last column of rval_x makes our targets t
+ rval_x = self.x[range_slices]
+
+ self._curr_idx += self.batch_size
+
+ return rval_x[:,:-1], rval[:,-1]
+
+
class FuncDataProvider(DataProvider):
"""
- Function gets as an argument a list of functions random samples
- drawn from normal distribution which means are defined by those
- functions.
+ Function gets as an argument a list of functions defining the means
+ of a normal distribution to sample from.
"""
def __init__(self,
fn_list=[lambda x: x ** 2, lambda x: numpy.sin(x)],
@@ -138,6 +225,8 @@ class FuncDataProvider(DataProvider):
points_per_fn=200,
batch_size=10,
randomize=True):
+ """
+ """
super(FuncDataProvider, self).__init__(batch_size, randomize)
@@ -164,7 +253,7 @@ class FuncDataProvider(DataProvider):
def __randomize(self):
assert isinstance(self.x, numpy.ndarray)
- return numpy.random.permute(numpy.arange(0, self.x.shape[0]))
+ return numpy.random.permutation(numpy.arange(0, self.x.shape[0]))
def __iter__(self):
return self
From e5ffdfeb60dcc16f4c8b99f1d079e9333bd84aa1 Mon Sep 17 00:00:00 2001
From: pswietojanski
Date: Mon, 5 Oct 2015 09:07:04 +0100
Subject: [PATCH 2/2] 2nd lab
---
00_Introduction.ipynb | 111 +++++-
01_Linear_Models.ipynb | 651 +++++++++++++++++++++++++-------
mlp/dataset.py | 59 +--
res/singleLayerNetBP-1.png | Bin 0 -> 70872 bytes
res/singleLayerNetPredict.png | Bin 0 -> 63208 bytes
res/singleLayerNetWts-1.png | Bin 0 -> 62483 bytes
res/singleLayerNetWtsBP.pdf | Bin 0 -> 216889 bytes
res/singleLayerNetWtsEqns-1.png | Bin 0 -> 75201 bytes
res/singleLayerNetWtsEqns.pdf | Bin 0 -> 258668 bytes
9 files changed, 652 insertions(+), 169 deletions(-)
create mode 100644 res/singleLayerNetBP-1.png
create mode 100644 res/singleLayerNetPredict.png
create mode 100644 res/singleLayerNetWts-1.png
create mode 100644 res/singleLayerNetWtsBP.pdf
create mode 100644 res/singleLayerNetWtsEqns-1.png
create mode 100644 res/singleLayerNetWtsEqns.pdf
diff --git a/00_Introduction.ipynb b/00_Introduction.ipynb
index 7f7f0ee..896e532 100644
--- a/00_Introduction.ipynb
+++ b/00_Introduction.ipynb
@@ -175,12 +175,61 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 2,
"metadata": {
"collapsed": false,
"scrolled": true
},
- "outputs": [],
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "\u001b[H\u001b[2Jatlas_3_10_blas_info:\n",
+ " NOT AVAILABLE\n",
+ "atlas_3_10_blas_threads_info:\n",
+ " NOT AVAILABLE\n",
+ "atlas_threads_info:\n",
+ " libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas']\n",
+ " library_dirs = ['/usr/lib64/atlas']\n",
+ " define_macros = [('ATLAS_INFO', '\"\\\\\"3.8.4\\\\\"\"')]\n",
+ " language = f77\n",
+ " include_dirs = ['/usr/include']\n",
+ "blas_opt_info:\n",
+ " libraries = ['ptf77blas', 'ptcblas', 'atlas']\n",
+ " library_dirs = ['/usr/lib64/atlas']\n",
+ " define_macros = [('ATLAS_INFO', '\"\\\\\"3.8.4\\\\\"\"')]\n",
+ " language = c\n",
+ " include_dirs = ['/usr/include']\n",
+ "openblas_info:\n",
+ " NOT AVAILABLE\n",
+ "atlas_blas_threads_info:\n",
+ " libraries = ['ptf77blas', 'ptcblas', 'atlas']\n",
+ " library_dirs = ['/usr/lib64/atlas']\n",
+ " define_macros = [('ATLAS_INFO', '\"\\\\\"3.8.4\\\\\"\"')]\n",
+ " language = c\n",
+ " include_dirs = ['/usr/include']\n",
+ "lapack_opt_info:\n",
+ " libraries = ['lapack', 'ptf77blas', 'ptcblas', 'atlas']\n",
+ " library_dirs = ['/usr/lib64/atlas']\n",
+ " define_macros = [('ATLAS_INFO', '\"\\\\\"3.8.4\\\\\"\"')]\n",
+ " language = f77\n",
+ " include_dirs = ['/usr/include']\n",
+ "openblas_lapack_info:\n",
+ " NOT AVAILABLE\n",
+ "lapack_mkl_info:\n",
+ " NOT AVAILABLE\n",
+ "atlas_3_10_threads_info:\n",
+ " NOT AVAILABLE\n",
+ "atlas_3_10_info:\n",
+ " NOT AVAILABLE\n",
+ "blas_mkl_info:\n",
+ " NOT AVAILABLE\n",
+ "mkl_info:\n",
+ " NOT AVAILABLE\n"
+ ]
+ }
+ ],
"source": [
"%clear\n",
"import numpy\n",
@@ -201,11 +250,26 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 4,
"metadata": {
"collapsed": false
},
- "outputs": [],
+ "outputs": [
+ {
+ "name": "stdout",
+ "output_type": "stream",
+ "text": [
+ "Running matrix-matrix product on 0 core(s)\n",
+ "1 loops, best of 3: 145 ms per loop\n",
+ "Running matrix-matrix product on 1 core(s)\n",
+ "10 loops, best of 3: 140 ms per loop\n",
+ "Running matrix-matrix product on 2 core(s)\n",
+ "10 loops, best of 3: 141 ms per loop\n",
+ "Running matrix-matrix product on 3 core(s)\n",
+ "10 loops, best of 3: 141 ms per loop\n"
+ ]
+ }
+ ],
"source": [
"import os\n",
"import multiprocessing\n",
@@ -233,11 +297,22 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 3,
"metadata": {
"collapsed": false
},
- "outputs": [],
+ "outputs": [
+ {
+ "data": {
+ "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYYAAAEPCAYAAABGP2P1AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzsnXd4FUUXh99NpQakV+kdNDQFBCEggoBIkyZIRxQEBD6Q\nHiCAdAgggkhTlF4UaQqJdGmGFkB6J5RQEtKT+f4YkgCm37J3b+Z9nvtkN3d39neyN/fszJk5RxNC\noFAoFApFLA56C1AoFAqFbaEcg0KhUCheQjkGhUKhULyEcgwKhUKheAnlGBQKhULxEsoxKBQKheIl\ndHUMmqYt0TQtQNO0U0kc461p2hlN045rmlbZmvoUCoUiPaJ3j2Ep0DixNzVNaw28LoSoAPR4frxC\noVAoLIiujkEIsRd4lMQhTYAfnx/7D+CkaVoha2hTKBSK9IrePYbkKATceGH/5vPfKRQKhcJC2Lpj\nANBe2Vc5PBQKhcKCOOktIBluAoWBv5/vF3r+u5fQNE05C4VCoUgDQohXH75tvsewFfgEQNO0KkC0\nEOJWQgcKIRBCEBMTw9VHV1nvv54B2wZQaGYh8CTu9d6K99h5cScxMTFx59jia+zYsSa3sWePoHVr\ngYODQHa0BLlyCT79VLBqleDiRUFUVNJtREUJTp4UfPutoGNHgZtbfFtOToIePQSXLlnPJlt82aNd\nyiZjvGJtOh1wmi4bu+A03inuu85tshudN3Rmud9yTgecJio66j/nJ4qeRgG/ALeBCGQsoTvwGfDZ\nC8fMA84Ax4EqibQjEiM6JlocvHFQ9N/aX2SZlEXgicATUWVhFeF7xTfR8/SmS5cuaT731CkhGjUS\nAuTLyUmIDh2E+OsvIaKiTNMVEiLE6tVCNGsmhIODbN/RUYguXYS4dCnpc02xyZaxR7uUTcagZfuW\nosWqFnHfaw7jHMTHaz4WW85vEWGRYcme//y787/fqQn90mivpBzDiwSGBIpJeyaJvNPyxv0h269r\nL248uZGi861JWj7EAQFC9OoV/4Xt5ibE6NFC3Lplfn1CCPHvv0J07SodAwiRIYMQXl5ChIcnfLw9\n/mMKYZ92KZtsm+DwYDHizxFCc9cEnoiMXhlF39/7ikuByTydvYJyDC8QEhEixvuOFxm8Mgg8EZkm\nZhJzDs0RMTExqWrHkvj4+KTq+A0bhMidO/4Jvm9fIe7ds4y2V7l4UYiOHeN7KOXKyd7Jq6TWJqNg\nj3Ypm2yXHRd3iEIzC8mH2y6ILhu7iDtBd9LUVmKOQZPvGRtN00Ra7Lj2+BqDdw5m/dn1AHxQ8gOW\nfrSUvFnymluixXj8GAYMgBUr5L6HB3z7LZQta30tu3bB55/DhQugaTBiBHh6gpOtT3FQKAxAeFQ4\nw3cNZ9ahWQBUzV+VuR/MpWbhmmluU9M0RALBZ92f9s3xIpU9hlfZ4L9B5JiSQ+CJyDMtj9j671aT\n2jMHKXm6OX5ciCJF5FN6xoxCeHsLER1tcWlJEhoqxKhR8cNZdeoIcfOmfM9enthexR7tsmWbiJ39\noF6peiX2txQJfKfa+qwkq9CyXEtO9jlJ/WL1uffsHk1/bsrU/VNjnY5NsmoVvPMOXLsG1auDnx98\n+SU46HxHM2SACRNk7yF/fti7F9zdYfdufXUp7IuEvszUK8mH51SRroeSXiVGxDBp7yRG+4wGoKt7\nV75r+h2uTq4mt20uoqNh5EiYMkXu9+gB8+eDq+1IjOPePejcGXbuBGdnWLwYPv1Ub1UKo/N8+ENv\nGYYisb9ZYkNJyjEkwHr/9XTe2JnQqFDqvF6Hje02kjNTTrO1n1YiIqBTJ1i7FhwdYfZs6NtXjufb\nKjExMHQozJgh9z09YcwY29assG2UY0g9qXUMaigpAVqXb83ebnspkLUAe6/vxWO5B/ee3bOqBl9f\n35f2nz2D5s2lU3Bzgz/+gH79bP8L1sEBpk+HefNA03zx9JS9nOhovZWZj1fvlT1gjzYpUo5yDIlQ\ntUBV/u75N2VyluHUvVO8u/Rdbj1NcNG1xXn0CBo2hB07IHdu8PWVs4+MRN++4OUFmTLB0qXQpQtE\nRemtSqFQJIQaSkqGgOAAGv7YkFP3TlH8teLs+nQXRbMXtci1EuLxY6hfH/75BwoXhj//hNKlrXZ5\ns7NvH3zwAQQHQ/v28OOPajqrInWooaTUo4aSzEzeLHnx7epLtQLVuPzoMh7LPazWcwgOhiZNpFMo\nWRL27ze2UwCoXVv2fLJmlTOrPvlE9RwUile5dOlSssfcuXOHkJAQi1xfOYYUkCNjDnZ9uou3Cr7F\n1cdXef+n93kY8tCi19yxw5fmzeHgQXj9dTn9s3Bhi17S4sSOW9eqJWcqubnBmjXQu7dcM21U7HE8\n3h5tMgrnz5/n2LFjyR6XN29eZs+ebRENyjGkEDdXN7Z23Er53OXxv+9Pk5+bEBwRbJFrRUbC2LHg\n4wP58kmn8PrrFrmUbtSoAdu3Q8aMMuYwcqTeihQKyxAZGUmhQoW4cOFCio5ftGgRbdu2TfY4BwcH\nmjRpworYtAdmRDmGVJAzU052dtpJ0exFOXzrMC1WtSAiOsKs1xBCppX4++965MwpYwolS5r1ErpR\nr169l/Zr1oR16+TU28mTwdtbH12m8qpd9oA92qQXzs7OTJ06lddT8HR37NgxSqbiH97d3d0ivTvl\nGFJJQbeC/NH5D/JmzsuuK7v4fMvnZg2EffMN/PCDfJLeuhUqVDBb0zZJkyawZIncHjhQDi0pFPZG\nx44dcU3BKtRNmzbx3nvvpartfPnycfHixbRKSxDlGNJAyRwl2dJxCxmdMrLEbwnTD0w3S7urVsnE\nc5oGw4f78tZbZmnWZkjsyebTT2HqVNlb6tIFjhyxri5TscfxeCPbpGnmeaWVY8eOsXDhQjZu3Ejz\n5s3ZvXs3b7zxBvv37wfgxx9/JG/evGzfvp1169bRrVs3rl27BsCJEycoVapUqq5Xvnz5FMUkUoNy\nDGmkWoFqrGgpx/aG/TmMzec2m9TewYPQtavcnj4d6tQxUaDBGDJEBqHDwqBFC7h9W29FCkXaWLJk\nCU2bNqVly5Y0a9aM+vXrU7lyZaKeT7/r3Lkz5cqVIyoqijZt2lCjRg3WrVsHkOAso40bN7J161a+\n+uorli5dSvv27Tl16lTc+3ny5OHmzf9UPDYJ5RhMoE35Nnh5eCEQdNzQEb+7fmlq584daNUKwsNl\nfOGrr+xzjDcpmzQN5s6Fd9+VTqFlS+kkjEB6u1e2jhDmeaWVZs2aUb16dVq3bh339O/wSnZLTdOo\n8HycOEOGDAQHy4ksMTExLx138eJF3N3dadKkCXv27KF169a0bduWokWLxh3j7OxMmJn/WZRjMJER\ndUbQ6Y1OhESG0HpNax6FPkrV+RER8PHHcPcu1K0rA7C2nubCUri4yGB0kSJw+DD06mXsaayK9Emp\nUqU4ceIErVu3pm/fvly/fh2QzuBFH
B0d47Zj33NxcXnpmJIlS1KsWDECAgLImjUrbm5utGrViqxZ\ns8YdExwcjJubm1ltUI7BRDRN4/sPv6dK/ipcfnSZLpu6ECNikj/xOYMHy4VrBQvC6tXxq4CNPMab\nGCmxKXdu+PVXyJwZfvoJvvvO8rpMJb3eK0XCLFy4kGzZstGxY0cGDBhAQEAAwH8mqcTuv5gaO1eu\nXISGhsYdc/LkSY4fP8727dvjenFbtmx5qZ2bN2+mOi6RHMoxmIEMThlY9/E6smfIzm///saUfVNS\ndN6PP8rkci4usH495DVO4TiL8sYbcmYWyJlKZo6rKRQWxcXFhQULFrB8+XICAwN59OgRBw4cYNGi\nRYSEhLBmzRrOnj3LlClT8PPzY9WqVWzbto2jR49Sp06dlwLJO3fuZPPmzYSGhhIcHMzatWvJnTv3\nS9c7efIktWrVMq8ReheQMFMRCmELbDm/ReCJcBjnIHZd3pXksf7+suoaCLFwoZUEGowvvpB/n2LF\nhHj0SG81ClvBVv7fLcGDBw/EqFGjUnx8ZGSk6Nu3b7LHJfY3Q1VwszxNSzdlVJ1RxIgYOqzvQEBw\nQILHhYXJBHKhobKQTa9eVhZqEGbOhKpV4coV6NZNxRsU9k/OnDnJnj07gYGBKTr+559/5vPPPze7\nDuUYzIxnPU88isr6Dd1/7Z7g4rchQ+DkSbmief78hIPN9jjGm1qbXF3lgrds2WDTJjlryRZR90ph\nTgYMGMDq1auTPe7mzZu4urrGzW4yJ8oxmBlHB0dWtFzBaxleY+uFrcw7PO+l9zdvls7A2VkuaHth\ncoEiAYoXj18ZPXQonDmjrx6FwtI4OTmlqBdQqFAh2rVrZxENqh6DhdhwdgOt17TG1dGVI72OUClv\nJW7dkoHVwEA5TPLVV3qrNA49e8qA9Jtvwt9/22aNa4V1UPUYUo+qx2AjtCrXil5VehEeHU6H9R0I\niQilRw/pFD74QM62UaSc2bOhRAk4cQJGj9ZbjUJh3yjHYEFmNZpFmZxlOHP/DB/NGc2OHZAjhxwa\nSW4Rmz2O8ZpiU5Yscl2Do6NMGeLjYz5dpqLulcLeUI7BgmR2ycyPLX/EUXPkz2czofABFiyQNRYU\nqadGDRg1Ss5O6toVnj7VW5FCYZ+oGIOFiY6GYj1HcKPoZLKElyJgnB+ZnDPpLcuwREXJOg5Hj0Kf\nPrBggd6KFNZGxRhSj4ox2Bje3nDjx7E4BVYg2PUCo3aP0luSoXFykhXfnJ1luozdu/VWpFDYH8ox\nWJDLl5+XrIx2ZVqt5Thqjsw+NJt91/cle649jvGay6aKFeMD0D17QrBlKqymGHWvFPaGcgwWQgj4\n7DO5urljRxjYripf1/4agaDnrz0JizJITmkb5euvwd1drooeMUJvNQqFfaFiDBZi2TKZxiFnTjh7\nVmYNDY8Kp/LCypx9cJYx745hnMc4vWUaGj8/qF5dxh327YN33tFbkcIaqBhD6lExBhsgIAAGDZLb\ns2ZJpwDg6uTKog8XATB532T87/vrpNA+cHeHYcPkdp8+EBmprx6FwtxcunQpyffv3LmTYNU3U1GO\nwQIMGACPHsH770OnTi+/V/v12vSu0pvImEh6/9Y70doN9jjGawmbRo6UC99On4YZM8zefIpQ90ph\nCc6fP59sLee8efMye/Zss19bOQYzs327LLiTKZOcNZPQQrYpDaeQL0s+9t/Yz/fHvre+SDsiY8b4\nKavjx8uAv0JhDyxatIi2bdsmeYyDgwNNmjRhxYoVZr22cgxmJCwM+vWT256eUKxYwsdlz5Ad78be\nAAz9cyh3g+/+5xgj19xNDEvZ1LAhfPKJDPR/8YX103Ore6UwN8eOHaNkyZIpOtbd3d3sPTzlGMzI\nlClw6RKUL598LqQ25dvQpFQTnoY/Zdifw6wj0I6ZMQOyZ4cdO2SqboXCyGzatIn33nsvxcfny5eP\nixcvmu36alaSmbh0CSpUgPBw8PWFunVTcE7gJSp8W4Hw6HD2dttL7ddrx73n6+trd09tlrZp0SI5\nRbhAATh/XuZXsgbqXlmX5GYlaeOSSUSWQsTYtH+nLFiwgGfPnuHq6hqXRnvp0qVER0ejaRqaptG9\ne3eOHTvG0aNHyZMnD0uXLuXXX38FoHnz5nHbKeGnn37C2dk50TTcalaSDggB/ftLp9CpU8qcAkCJ\nHCUY9o7sLfTd2peomCgLqrR/evaU01dv3wYvL73VKNIre/fu5c8//2TIkCG89dZb7N+/nwMHDnD4\n8GF69uxJjx49OHfuHHv27GHJkiU0bdqUli1b0rRp07g2Xp1ptHHjRrZu3cpXX33F0qVLad++PadO\nnYp7P0+ePNy8edN8RiRU79NoL3SuAbtxo6xNnC2bEHfupO7ckIgQUXR2UYEnYs6hOZYRmI44fFgI\nTRPC2VmI8+f1VqOwBHr/vyfHwIEDxbRp0/7zO29v77j9xYsXiz59+oitW7eKfPnyiVatWondu3fH\nve/h4RG3feHCBXH58mUhhBBVqlQRT548EevXrxdPnz6NO2b37t3Cy8srUU2J/c1QNZ8tQ1hY/JqF\nCRNSnzk1o3NG5jSeA8Bon9GJ1olWpIzq1aF7d7mmoX9/VSdaYX1E/APrS7+LfGGhTVhYGJGRkZQq\nVYoTJ07QunVr+vbty40bNwBwcXGJO7ZkyZIUK1aMgIAAsmbNipubG61atSLrC+Ufg4ODcXNzM5sN\nyjGYyMyZMi1DxYqQ1prcH5b+kKalmvI0/Ckjdsn8DvY4j9xaNk2eHB+ITsUwbZpR90rxIh999NFL\nf7/ffvuN1q1bc/LkybjfnTx5ktatW7Nw4UKyZctGx44dGTBgAAEB8sEwV65chIaGxh17/Phxtm/f\nHhf32bJly0vXvHnzJqVKlTKbDcoxmMCtWzBpktyePVtm/kwLmqYxq9EsnB2cWeq3lON3jptPZDok\nd27ZewM5OyxMpaVSWBEPDw8aNWrE2LFjWb58OXny5KFOnTrUrVuXefPmsXDhQqpUqcIHH3yAi4sL\nCxYsYPny5QQGBlKtWjUA6tSpE7e4befOnWzevJnQ0FCCg4NZu3YtuWPTKTzn5MmT1KpVy3xGJDS+\nZLQXOo05duokYwstW5qnvcE7Bgs8EbWX1BYxMTHmaTSdEhkpRKVK8v5Mnqy3GoU50ev/3Zo8ePBA\njBo1KkXHRkZGir59+yZ5TGJ/M1SMwbwcPChLTbq6ylKT5mD0u6PJnSk3+67vY63/WvM0mk5xcpLD\nfCB7dQEqdKMwEDlz5iR79uwEBgYme+zPP//M52kdx04E5RjSQEyMzIcEMHgwFC9unnazZciGV305\nz/LLBV8SGhlqnoZtBGuPW7/3Hnz4IQQFxddvsAT2OB5vjzYZjQEDBrB69eokj7l58yaurq5UqFDB\nrNfW1TFomtZY07RTmqb5a5r2n+W/mqZ11TTtvqZp/zx/dddD56usXg1HjsgZSMOHm7ftHpV78E
be\nN7gXfI8ZB3XKCmdHTJ8uew+LF8OJE3qrUShSTuzCuKQoVKhQoovaTEG3lc+aprkC54DaQABwEOgt\nhPjnhWO6AFWFEP2TaUtYy46wMChbFq5dk182PXqY/xo+V3yov6I+mZ0zc7H/RfJlSeUcWMVLDBwI\nc+ZA/frw558JJzZUGAdVjyH1GGnl89vAGSHELSFEFLAaaPrKMdrzl83g7S2dQqVK0LWrZa7hUcyD\n5mWa8yzyGZ6+npa5SDpizBh47TVZH/q33/RWo1DYPno6hkLAjRf2bz7/3YsIoJWmaWc0TftV07Qi\nVlOXAPfvw8SJcnv6dHB0tNy1WmdsjaPmyOLjizl7/6zlLmRF9Bq3zpEDxo6V219/LSu+mRN7HI+3\nR5sUKSeNM+/NQkr6gr8CK4UQUZqm9QBWIoee/kPXrl0pWrQoANmzZ8fd3T1uMUjsh9zU/fXr6/H0\nKVSv7otcmGje9l/cD7wSSK8qvfju2Hf08O7BpAaTzG6Ptfdj0eP65ctD8eL1OHsWvv7al2bNzNe+\nn5+f1e2x9L6fn59N6XlxX5F2fH19WbZsGUDc92VC6BljqAMME0I0e77/P8BFCDExiXOChBBZE/i9\nxWMM//4rs6fGxMggZsWKFr0cAAHBAZScW5LgiGB8uvhQr2g9y1/UjlmzBtq1k5MGLl6EzJn1VqRI\nCyrGkHqMFGM4AlTUNK2gpmnOQFtg24sHaJqW+4XtD4EL1pUYz8iRcgiiWzfrOAWAvFnyMrTWUACG\n7BySaBlQRcr4+GOZS+nu3fg1DgpjEpu6Wr1S9kotujkGIUQY8DmwAzgBbBBCHNc0bdxzJwAwWNO0\nk5qmnQGGAZ310Hr4MKxbBxkywLhx1rlmbPd5UM1BFMhagGN3jrHmjLEr0Og9bq1pMG2a3J461XyL\n3vS2yxLYsk0JrdRNycvHxyfZYy4+vIjTeCc0T42Td0/qntXBHDbFvlKDrusYhBDbhBAVhRDlhRCT\nn/9urBDit+fbXwsh3hBCVBBC1BZCnLG+Rhj2fIXFgAFQsKB1r5/ZJTOedT0BmX01Mjoy6RMUSVK3\nrlz0Fhwsa0QrFC8y2mc0UTFRdHHvQqW8lfSWoxuqglsybN8OH3wgpzteuiR/WpuomCgqfFuBfx/+\ny4KmC+hTrY/1RdgR/v5yurGDA5w7ByVK6K1IYQv43fWj8sLKuDi6cOHLC7ye7XW9JVkcW4wx2Dwx\nMXJ6I8gVzno4BQAnBycm1pcx+XF/jeNZxDN9hNgJ5cvDp5/KmFHsNFaFYvgumcagb/W+6cIpJIVy\nDEnwyy9yBlKhQtCvn3Wv/eoYb+tyralWoBp3g+/i/be3dcWYCVsat/b0BBcX+PlneCFNfpqwJbvM\nRXqzyfeqL9svbierS1ZG1BlhPVEmYqn7pBxDIkRGyhWzIAPOGTPqq0fTNL5p8A0AU/ZPITA0+ayL\nisQpUkQWVhJCzjhTpF+EEHG9hf/V+h+5MuXSWZH+qBhDIixcCH36QJkycPp02ovwmJuGPzbkz8t/\nMrTWUKY0nKK3HENz757MjPvsGezbB++8o7cihR5sPreZFqtbkCdzHi71v0QWlyx6S7IaKsaQCkJD\n42esjB9vO04BYHKDyQDMPTyXO0F3dFZjbPLkia/X/fXXqj50eiRGxDDaR+ZkH1VnVLpyCkmhHEMC\nLFgAt2+Duzu0aaOPhsTGDqsVqEbLsi0JjQpl8r7J1hVlIrY4bj14MOTMKXsMO3akrQ1btMtU0otN\na86s4dS9UxR2K0zvqr2tL8pEVIzBSgQFyWLyAF5eckqjrTGu3jg0NBYeW8j1J9f1lmNosmWLX6cy\nerTqNaQnomKiGOsrp6WNqTsGVydXnRXZDirG8AoTJsigc61a8inSVnP3d1zfkV9O/0LPyj35vvn3\nessxNCEhMtYQEACbNsFHH+mtSGENlv6zlO6/dqfEayU42/cszo7OekuyOirGkAIePYIZz4umTZpk\nu04BwLOeJ46aI0v9lnLhoW4ppOyCTJlgxPMZimPGyPUrCvsmPCqccX/J/Dbj6o1Ll04hKZRjeIGZ\nM+HJE2jQQKZO0JPkxg5L5yxNlze7EC2i8fzL0yqaTMWWx61795brVU6elHmxUoMt25VW7N2mH/75\ngWtPrlE+d3naV2yvnygTUTEGC/PwIcyeLbeNkkNnTN0xODs488upX/C/76+3HEOTIYOMMYBcDR0d\nra8eheUIjQxl4l6ZSWB8vfE4Oliw4pZBUTGG5wwfDt98A40bw7ZtyR9vK3zx+xcsOLqAthXasrrN\nar3lGJrISFnP+/JlWL5cps1Q2B9zDs1h4I6BuOdz53jv42lKS20vJBZjUI6Blxc6/f03vPWWGcVZ\nmJtPb1LCuwSR0ZGc/PwkFfNYqViEnbJ8uazlXbIknD1rW2tYFKYTGhlKce/i3A2+y+b2m2leprne\nknRFBZ+TYOpU6RSaNbMdp5DSscNCboXoXaU3AhEXTLNVjDBu/cknUKqUrPC2cmXKzjGCXanFXm36\n7uh33A2+S9X8Vfmw9IfJn2TjqBiDhbh7F779Vm5bqwiPuRleZziujq6s81/Hibsn9JZjaJyc4mMN\nEybIDKwK+yA0MpRv9st8Y571PNP1EFJypHvHMHWqTIHx0UdQpYreauJJTeHzAlkLxNVosOVeg1GK\nuXfoIHsNly7Bjz8mf7xR7EoN9mjTmcxnuPfsHtULVKdpqaZ6yzELlrpP6TrGcPcuFCsGYWFw/DhU\nrmwBcVbiTtAdinsXJywqjOO9j1M5v4GNsQF++gk6d5afj/PnwVlNczc0zyKeUXROUR6EPGBrx618\nUOoDvSXZBCrGkADTpkmn8NFHtucUUjt2mD9rfj6v9jkA4/fY5nxbI41bd+ggM+teuZJ8r8FIdqUU\ne7Pp2yPf8uDMA94u+DaNSzbWW47ZUDEGMxMQIJPlgf1U8Rr6zlAyOGVg07lNKtZgIo6O8fU4vLzk\nVFaFMXkW8YxpB6YBMLbuWBVbSAHp1jFMmyZjC82b215vAdI2dpgvSz4+q/oZYJu9BqONW7drF99r\n+OmnxI8zml0pwZ5sWnhsIfdD7lP9nep21VsAFWNIktTGGO7dg6JFpWM4dsy2gs6mcjvoNsXnFCc8\nOpwTfU7wRt439JZkaGJjDSVKwLlzal2D0QiJDKH4nOIEPAtgS4ctNC1tH0Fnc6FiDC8wfbp0Ch9+\naLtOIa1jhwWyFojLKz9hzwQzKjIdI45bt28vF7tduiRrgCeEEe1KDnuxadGxRQQ8C6BagWpkupVJ\nbzlmR8UYzMT9+zB/vtyOHUO2N4a9MwwXRxfW+a/j9L3TessxNE5OMGqU3PbyUjmUjERoZChT9svy\nt2PeHaNiC6kg3Q0ljRghC/E0aQK//25hYTrSb2s/5h+ZT7sK7VjVZpXecgzNizmUVq6Ejh31VqRI\nCXP/nkv/7f2pnK8yx3ofU44hAVSuJCAwUMYWgoLg4
EGoUcPy2vTixpMblPAuQVRMFP59/Smbq6ze\nkgzNDz9Az55QrhycOiVnLSlsl/CocEp4l+BW0C02tttIi7It9JZkk6gYAzBnjnQKDRvavlMwdeyw\ncLbCdHPvhkAwae8k84gyESOPW3fuDEWKyMR669e//J6R7UoMo9u0zG8Zt4JuUSlPpbhEeUa3KSFU\njMFEnjyRjgHic+HYO1/X/hpHzZGfT/3MpcBLessxNC4uMjU7yFiDqvJmu0RGRzJ5nyzcPurdUTho\n6eZrzmykm6EkLy/pEOrVAx8f6+iyBbpt7sYyv2X0qNyDxc0X6y3H0ISHy2mrt26p2tC2zJJ/ltDj\n1x6UzVWW05+fVoV4kiBdDyUFBcGsWXI7vfQWYhlRewQOmgPLTyzn2uNressxNK6uMHSo3PbyAjt4\nprI7omKi4oZOR9YZqZxCGkkXjuG772TguVYt8PDQW03KMNfYYamcpWhfsT1RMVFxU/f0wh7GeHv2\nhDx54OhR2LFD/s4e7HoVo9q06vQqLj26RInXSvynlrNRbUoKFWNII6GhckEbyPno6XHG2ojaIwBZ\nAP120G2d1RibTJlgyBC5PWGC6jXYEjEiJq63MLz2cJwc1DL1tGL3MYa5c6F/f7nC+ejR9OkYANqs\nacP6s+sZVm5NAAAgAElEQVT5qsZXzGw0U285hiYoSE57DgyE3buN0wu1d9b5r+PjtR/zerbXufDl\nBVwcXfSWZPOkyxhDRIQsxAPpt7cQy8g6I4HnCcWe3ddZjbHJmhW++kpue3npq0UhEUIwce9EIH7l\nvyLt2LVjWLECbt6EChWMN4PE3GOHlfNXpkmpJoREhjD70Gyztp1S7GmMt18/cHOTPYb58331lmN2\njHavtl7Yit9dP/JlyUf3yt0TPMZoNqUEFWNIJVFRMvUFwMiR4GC3lqacUXVk0p95R+bxOOyxzmqM\nTfbs0Lev3E4qJbfC8rzYWxhcczAZnDLorMj42G2MITZdcqlScrWqSmEgabCiAbuv7MbLw4uR747U\nW46huX9froYODYV//gF3d70VpU98rvhQf0V9cmTMwbWB18jikkVvSYYhXcUYYmLiewtff62cwovE\nxhpmHZpFcESwzmqMTe7c8Jmsi8Qk28g6ki7x2isDPQPfHqicgpmwS8ewaRP4+0PhwtCpk95q0oal\nxg49inpQs1BNHoY+ZNGxRRa5RmLY4xjvkCHg5OTLunWykI+9YJR7dejmIXZf2Y2bqxtfvv1lksca\nxabUoGIMKUSI+Ke3oUNljhtFPJqmxfUaph+YTlhUmM6KjE3BgtC4sfzcxfZSFdYjNrbQt3pfsmfI\nrrMa+8HuYgw7dsh/1Dx54OpVyJhRX222iBCCygsrcyLgBAuaLqBPtT56SzI0ly9D6dJy++JFucZB\nYXlO3D2B+0J3Mjpl5OrAq+TJnEdvSYYj3cQYJsoHCAYNUk4hMTRNY0QduRp6yv4pRMVE6azI2BQv\nLov3REfHr5tRWJ7YDKq9qvRSTsHM2JVj2LtXvrJnh88/11uNaVh6PLR1udaUzlmaq4+v8supRIoZ\nmxl7HOMFadfXX8vtJUvgzh199ZgDW79X/z78lzVn1uDs4Mz/3vlfis6xdZvSgooxpIDY2EL//nLx\nkSJxHB0c+fod+W02ed9kYoQqMGAK5ctDq1YyNfdMlXHE4nyz7xsEgi5vdqGQWyG95dgddhNjOHpU\nUK0aZM4M165Bzpx6q7J9IqMjKTm3JNefXGd92/W0KtdKb0mG5tgx1GfQClx/cp0S3iWIETGc73ee\nkjlK6i3JsNh9jCF2RkifPuofMqU4OzoztJYsMDBx70Ts4SFBT6pWhUaN4NkzmbxRYRmm7Z9GVEwU\n7Sq0U07BQtiNY9iwQU5NHTRIbyXmwVrjod0rdydv5rwcv3OcnZd2WvRa9jjGCy/bNULG9PH2lllY\njYqt3quA4AAW/yMrEQ6vPTxV59qqTaZglzEGTdMaa5p2StM0f03ThiXwvqumaaufH7Nf07QiibUl\nBHTvDgUKWFazvZHROSODakpvOmmfWr5rKu++C7Vrw6NHsGCB3mrsj9mHZhMWFcZHZT6iUt5Kesux\nW3SLMWia5gqcA2oDAcBBoLcQ4p8XjhkMFBZCDNQ0rQXQTQjxnzypmqYJR0fBv//KqYOK1PE0/ClF\nZhfhcdhj9nbbS+3Xa+stydBs2wZNmkDevHItTQaV080sPAp9RJHZRQiKCOJQj0O8XehtvSUZnjTH\nGDRN669p2msW0PQ2cEYIcUsIEQWsBpq+ckwT4Mfn278CtTQt4aoKHToop5BW3Fzd+PItmU4gtgKW\nIu00bgyVK0NAACxdqrca+2H+kfkERQTRoFgD5RRMRAhZryYxUjKUlBc4omnamudDP+Yqd1MIuPHC\n/s3nv0vwGCFEDPAQSHAlS+w8cnvB2uOh/d/uTybnTGy7uI1/7vyT/AlpwB7HeOG/dmlafKxh6lSI\njLS+JlOxtXv1LOJZXB2R2MWZqcXWbDIHabVp9+6kH6STLYoqhBipadpo4H2gKzBP07Q1wBIhxMU0\nqXretAnn/odp07pS9HkuguzZs+Pu7k69evWA+D+ekfb9/Pysfv0+Vfsw89BMBi4cyLh648zefiy2\n8Pc1576fn99/3n/tNShTph7nz8OYMb40amQ7em3185fU/toza3kY+pAahWqgXdXwvearPn9p2Pf1\n9WXZsmXs2AF37xYlMVIcY9A0zR3oBjQGdiOHgnyFEGmaB6RpWh1gmBCi2fP9/wEuQoiJLxyz6/kx\nRzVNc0DGIvI+7z282FaiNZ8VKefW01sU9y5OZHQk/n39KZurrN6SDM3y5dC1K5QtC2fOqGJRaSU8\nKpzi3sW5HXSbX9v/yodlPtRbkqE5dAhq1pSLgJ8+TXuMYYCmaceAqcB+oKIQ4nOgKv+NCaSGI0BF\nTdMKaprmDLQFtr1yzFYgNnH2R8DBV52CwnwUdCtI1ze7IhBM2T9FbzmGp2NHeP11mY570ya91RiX\nFSdWcDvoNpXyVKJpaVO+chQQnyEitgJhQqTkGSYH0EoI8b4QYo0QIhLg+SN6y7SKE0KEAZ8DO4AT\nwAYhxHFN08Zpmhb7SDAPKKBp2ingf0D/tF7PaLza/bUWw2oPw0Fz4KeTP3Ht8TWztq2XTZYmMbuc\nnWXqd5D/jEbq1NrKvYqKiYp7SBlRZwQOWtq7XbZikzlJrU0nT8Jvv8kEowMHJn5csn9lIcRYIUSC\n3xBCCP9Uqfrv+duEEBWFEOWFEJNfuN5vz7fDhRBthRCVhBC1hBBXTbmeInmKv1acDhU7EBUTxbQD\n0/SWY3i6d5fTVo8dg52WXT9ol6w5s4ZLjy5RMkdJPi7/sd5yDE9shohevWRpgsSwm1xJ9mCHrXD6\n3mkqLaiEq6MrVwdeJV+WfHpLMjRTp8KwYXLx219/6a3GOMSIGN787k1O3zvN9x9+T88qPfWWZGgu\nXJDxLgcH
WUOkcOF0kCtJYT4q5qlIi7ItCI8Oj5siqEg7ffrIVPB79sC+fXqrMQ5b/t3C6XunKeRW\niE/f/FRvOYZn6lSIiYFPP5VOISmUY7BR9B4PHVFbzhX/9si3PAp9ZJY29bbJUiRnl5ubTAUP8YE/\nW0fveyWEiCvbOaTmEFwcTa/Rq7dNliClNt24IWfJaZrsvSaHcgyKBKlesDoNizckKCKIuYdVqlBT\n6d9fpuPetg2OH9dbje2z+8puDt86TK5MuehVtZfecgzP9OlyoWXbtvFlaJNCxRgUieJ71ReP5R7k\nyJiDawOvkcUli96SDM3gwbKIT+vWsG6d3mpsm/rL6+Nz1YeJ9SemeaWzQnLvnqxDHhoKJ07AG2/E\nv6diDIpUU7dIXWoVrkVgaCALjy7UW47hGTxYpobfsAHOntVbje1y8MZBfK76kM01G32rJzHZXpEi\nZs+WTuHDD192CkmhHIONYgvjoZqmMbLOSACmH5xOWFSYSe3Zgk2WIKV2FSgA3brJ9QzffGNZTaai\n572KjS30e6sf2TJkM1u79vj5S86mx49h/ny5PSIVHS/lGBRJ8kHJD6icrzJ3g++y9B+VKtRUhg0D\nR0dYuVJOGVS8jN9dP36/8DuZnDMxsEYSK7AUKWLePHj6FOrXhxo1Un6eijEokmWd/zo+XvsxRbIV\n4cKXF3B2dNZbkqHp0gVWrIDPPoPvvtNbjW3Rdm1b1vqv5asaXzGz0Uy95RiaZ8+gSBF4+BB27ZLO\n4VVUjEGRZlqVa0XZXGW59uQaK0+t1FuO4Rk+XE4bXLoUbt3SW43tcPb+Wdb5r8PF0YUhtYboLcfw\nLFwonUKNGuDhkbpzlWOwUWxpPNRBc4irrzt532SiY6LT1I4t2WROUmtX2bJyZlJEBMyYYRlNpqLH\nvfpm/zcIBN3cu1Egq/lr9Nrj5y8xm8LC5BRVgFGj5INIalCOQZEiOlTsQLHsxfj34b+sP7tebzmG\nZ6SM6bNwITx4oK8WW+DKoyusPLkSR82Roe8M1VuO4Vm6FO7cAXd3WWY2tagYgyLFLDq2iM+2fEal\nPJXw6+NnUqZLBTRrBr//LmeLTJyY/PH2TJ8tfVh4bCFd3uzCshbL9JZjaCIjoWRJuH4d1q6FNm0S\nPzaxGINyDIoUEx4VTgnvEtwKusWmdpv4qOxHeksyNC8WTLl2TeZTSo/cfHqTEt4liIyO5Gzfs5TJ\nVUZvSYZm6VKZ1bdcOTh9OukCUSr4bDBscTzU1ck1rpvvtdeL1DpjW7TJHKTVrho1oEEDOZ3Q29u8\nmkzFmvdq2v5pRERH0LZCW4s6BXv8/L1qU3R0fGrtESPSXjVQOQZFquhZpSd5Mufh6O2j7LykCgyY\nyujR8ufs2RAUpK8WPQgIDmDR8UUAKvWFGVi7VqbXLl4c2rdPeztqKEmRaqbun8qwP4fxTuF32Ntt\nL1pqpzwo4hBC1mnYt0+uhk5J5kt7Ytgfw5h6YCoflfmITe1V/VNTiImRKS/OnIFFi2QxnuRQMQaF\n2QgKD6LI7CI8CnuETxcf6hWtp7ckQ7NzJzRqBLlzw5UrMgtreuBhyEOKzilKcEQwh3sepnrB6npL\nMjTr18tAc+HCcPGizMuVHCrGYDBseTw0q2vWuHQFE/ZMSPF5tmyTKZhqV8OGUL063L8P339vHk2m\nYo17NfvQbIIjgnm/xPtWcQr2+PmLtUkImPD8X3HYsJQ5haRQjkGRJvq/3R83Vzd2X9nN/uv79ZZj\naDQtPtYwdapcnGTvPA57jPdhGXEfW3eszmqMz2+/yZTa+fNDjx6mt6eGkhRpZvTu0Xjt9aJRiUZs\n77RdbzmGRgioUgX8/GTis752nm16/F/jGes7lvrF6rPr0116yzE0QsBbb8HRozBrFgxMRe5BFWNQ\nmJ0Xx4gP9TjE24Xe1luSodmwQabKKFRIjhG7uuqtyDI8DX9KkdlFeBz2GN8uvtQtWldvSYZm2za5\nujlPHhmjypQp5eeqGIPBMMJ4aM5MOelXvR+QsliDEWxKC+ayq0ULqFgRbt6Ui5T0xJL3at7heTwO\ne8y7Rd61qlOwx8+fj49vXGxh8ODUOYWkUI5BYRKDag4ik3Mmfr/wO8fvqGLGpuDgAGPGyO3Jk2WS\nPXsjOCKYmQdlOu0x747RWY3xOXYMDh6EHDng88/N165yDDZKvXr19JaQInJnzs3n1eQncvxf45M8\n1ig2pRZz2tW6NZQvL/PcLF9utmZTjaXu1YIjC3gY+pCahWpSv1gCBQIsiL19/oSATZvqATBkCGTN\nar62VYxBYTJ3g+9SfE5xQqNCOd77OJXzV9ZbkqH55Rfo2FEWcP/3X3C2k7pIzyKeUXROUR6EPGDb\nJ9toXLKx3pIMza5d8N57srdw9WraHIOKMRgMI42H5suSL77XsCfxXoORbEoN5rarbVsoU0b+s//4\no1mbTjGWuFffHvmWByEPqFGoBo1KNDJ7+8lhT58/IcDTE8CXwYPN21sA5RgUZuJ/7/yPjE4Z2XRu\nE//c+UdvOYbG0TF+XYOXl0yjbHSeRTxj6oGpAHjW9VRpVEzEx0emUcmSBfr1M3/7aihJYTYG7xjM\nzEMzaVG2BRvbbdRbjqGJjoYKFeD8eVi82DyLlvRk2v5pDP1zKDUK1eBA9wPKMZhI3bqwZ498cIgt\n+pQW1DoGhcVRsQbz8vPP8Mknxo81vBhb2P7JdhqVtP4wkj3h6ytrOL/2mhxudHNLe1sqxmAwjDge\nmlyswYg2pQRL2dWunawPffWq9WcomdOmF2ML75d432ztphZ7+PwJET+ledAgOH7c1yLXUY5BYVZe\njDWodQ2m4egY/yXg5WXMdQ3BEcEqtmBGdu2CvXvlTKT+/S13HTWUpDA7Q3YOYcbBGTQt1ZQtHbfo\nLcfQREdDpUpw9iwsXAi9e+utKHVM3juZEbtHqNiCGRAC3nlHLmibPBm+/tr0NlWMQWE17j27R/E5\nxXkW+YyDPQ5So1ANvSUZmlWroEMHeP11GWswSg6lJ2FPKDanGI/CHvFH5z94r/h7eksyNLE5kXLl\nkjmRsmQxvU0VYzAYRh4PzZM5D1++9SUAY3zi0x4Y2aaksLRdbdvKHErXr8sZStbAHDbNPjSbR2GP\neLfIuzQo1sB0USZi5M/fi7GFYcPinYKlbFKOQWERhtQaQlaXrPxx+Q/2XturtxxD4+AA45/H8r28\nICREXz0pITA0kJmHZE6kCR4T1BCSiWzZItNq580LX3xh+eupoSSFxRjrM5bxe8ZTr2g9fLr46C3H\n0AgB1arB8eMwfbrMpGnLjNw1kkn7JvFe8ff4o/MfessxNDExULWqrNUxezYMGGC+tlWMQWF1Hoc9\npticYjwOe8yfnf+kQXH9hxOMzItjzJcvmz8Ngrl4EPKAorOLqhiTmVizRk5dLlgQLlyAjBnN17aK\nMRgMI4+HxpI9Q3aG1BwCwCifUfj42GevwVr3qnFjqFkTHjyAuXMte
y1TbPpm3zc8i3xGk1JNbMop\nGPF/KioqPrYwevR/nYKKMSgMSf+3+5M7U24O3TzEgRsH9JZjaDRNxhgApk2DR4/01ZMQN5/eZN7h\neQB4eXjprMb4/PSTTItSvDh0726966qhJIXFmXNoDgN3DKRinoqc6HMCB009j5hC/foyidrw4TBp\nkt5qXuaz3z5j0fFFtK3QltVtVustx9CEh8ssu9euySy7nTqZ/xoqxqDQjbCoMErPLc2NpzdY2Wol\nHSt11FuSoTl0SA4pZcoka0Pnz6+3IsmFhxcoN78cAGe+OEOZXGV0VmRs5s+XmVMrVIATJ+RKeHOj\nYgwGw4jjoYmRwSkDnvU84Ypc1xAZbQd5pF/A2veqRg1ZHzokJH5oydykxaaxvmOJFtF0de9qk07B\nSP9TL97bCRMSdwoqxqAwNJ+++SmFsxXm0qNLLPlnid5yDI+Xl4w5LFokZyjpzYm7J/jl9C+4OLow\npq6q5Wwqc+bA3btyinKLFta/vhpKUliNtWfW0nZdWwpkLcCFLy+QyTmT3pIMTZcusGKFHHvWq9Jb\nLM1+bsbvF35n4NsDmdV4lr5iDM7DhzLY/PSpTJpX34KlsdVQkkJ3WpdvTdX8VbkddJs5h+boLcfw\njBsnazSsXAknT+qn46+rf/H7hd/J4pKF4XWG6yfETpg8WTqF99+3rFNICuUYbBQjjYemlD1/7WHK\ne1MA+Gb/NzwIeaCzIvOg170qWhT69JGrooeb+fs4pTYJIRj651AAhtYaSp7MecwrxIwY4X/q2rX4\nNSrffJP88XYVY9A0LYemaX9omnZS07QdmqZlT+S4aE3T/nn+2mRtnQrz06B4A94v8T5Pw58yaa+N\nzbU0IKNGyRXQW7fKKazWZv3Z9Ry+dZi8mfPyVc2vrC/Azhg7Vtbd6NABKutYAFGXGIOmaXOBS0KI\n2ZqmDQSKCSH+kwFE07QgIUSyC/9VjMFY+N31o8rCKjg7OnO+33mKZi+qtyRDM3GidBDVqsHff8uk\ne9YgMjqSCt9W4ELgBRY0XUCfan2sc2E75fRpeOMNcHKCc+dknMHS2FqMoQkQGy77CWiqkw6FDrjn\nc+eTNz4hIjqCUbtH6S3H8Hz1FRQoILNvrlljvet+f/x7LgReoHTO0vSo3MN6F7ZThg6Vw4KffWYd\np5AUejmG3EKIhwBCiAdAYgOTGTRNO6pp2nFN09paT57+GGE8NLW8aNMEjwm4OLqw8tRKw5cA1fte\nZcoUn5Z7xAi5YtZUkrMpKDyIcX+NA2BS/Uk4OzqbflELo/d9Soo//pBJErNmlTmRUoqlbHKySKuA\npml/APkSeGtkKpopKIS4p2laMWC3pmknhBDnEzqwa9euFC1aFIDs2bPj7u5OvXr1gPg/npH2/fz8\nbEqPOfZjid3vV70fMw/NpMecHsxsNBMPDw+b0pvSfT8/P931FC0KFSrU48wZGDzYlzZtLPv5++H4\nD9x7do+3C75NjoAc+N7ztZn7kdLPn956Yvd37fKlTx+AeowYAf7+vvj7W+Z6vr6+LFu2DCDu+zIh\n9IoxXALeFkI80DQtN3BQCFEymXMWAr5CiF8SeE/FGAzIo9BHlJxbksDQQDa330zzMs31lmRofv8d\nmjWD116TqTJy5LDMda4/uU6ZeWUIiwrjQPcD1Cxc0zIXSicsXSoT5L3+uowtmDOtdnLYWoxhKxCb\nEqrT8/2X0DQtm6Zpzs+3cwJ1gTNWU6iwOK9lfA3Pup4ADNk5hIjoCH0FGZwmTeS890eP4oeWLMGI\nXSMIiwqjXYV2yimYyLNncuIAyISI1nQKSaGXYxgLNNU07STwATAGQNO0qpqmff/8mArAcU3TTgD7\nAW8hhI7LeKzLq91feyAhm/pU60PpnKW5EHiB745+Z31RZsBW7pWmwcyZ8uf8+TJdc1pJzKYjt46w\n8tRKXBxdmNxgctovoAO2cp9eZOZMuH1bzijr0CH151vKJl0cgxAiUAjRUAjxhhDifSHE4+e/PyaE\n6PV8+4AQopIQ4k0hRFkhxLd6aFVYFmdHZ6Y1nAaAp68ngaGBOisyNm++CT17ygIvQ4aYt20hBIN2\nDgJg4NsDKfZaMfNeIJ1x6xZMkes9mT7detOMU4LKlaTQHSEEDVY0wOeqj8q1YwYCAqBUKQgKgp07\noWFD87S7zn8dH6/9mFyZcnHxy4tky5DNPA2nUzp3loV4WraEDRv00WBrMQaFIg5N05jZaCYaGvOO\nzMP/vr/ekgxN3rxy2irAoEGy92AqIZEhDN45GIDx9cYrp2AiBw9Kp+DqCjNm6K3mvyjHYKPY4nio\nqSRlk3s+d3pX7U1UTBQDtg/ASD1AW7xXAwfKXEqnT8vU3KnlVZum7Z/G9SfXeTPvm/Su2tssGq2N\nrdynmBjo319uDxkCxUwYkbOrGINCkRBe9b3IniE7f17+k83nN+stx9BkyCADmwAjR8L9+2lv69rj\na3yzX2Z08/7AG0cHC5QSS0csXy5XqRcoAF9/rbeahFExBoVNMe/wPL7c9iVFsxfF/wt/MjrbyPw9\nAyIENGokV9X26pW2ngNAmzVtWH92PR0qduDn1j+bV2Q64+lTKF1axoF++gk++URfPSrGoDAEfar1\noWKeilx9fJUZB21w8NVAaBp4e8ukbIsXy6fU1LLr8i7Wn11PJudMTG041fwi0xmentIp1KwJHW24\n9LlyDDaKrYyHmpOU2OTk4IR3Y28AJu2dxLXH1yysynRs+V6VLSuT7AkhC8vHxKTsPF9fXyKiI+i/\nXQ6Gj6wzkkJuhSyo1PLofZ9OnpSO2sEB5s2TjttUVIxBkW7wKOZBuwrtCI0KjftiUqSd0aMhf36Z\nkvt5mpwUMevgLPzv+1MyR0kG1RxkMX3pgZgY+PxziI6GL76AKlX0VpQ0KsagsEluB92m7LyyBEUE\nsandJj4q+5HekgzNypWyNnTOnDIfT65cSR9/9fFVys8vT2hUKDs77aRhCTMthkinxOZDyptX/v2z\nJ1iazPqoGIPCUBTIWoCJ9ScC8OW2LwmOCNZZkbHp2FHmUXr4EP73v6SPFULw5bYvCY0KpV2Fdsop\nmEhgoKy1AHLNgq04haRQjsFG0Xs81BKk1qYvqn9BlfxVuPH0BuP/smBWOBMxwr3SNFiwAFxc5HBS\nUpI3n9/Mlh1bcHN1Y2ajmdaSaHH0uk/Dh8ODB1CvnvkDzirGoEh3ODo48l3T79DQmHVoFqcCTukt\nydCULi3XNAD06ZNwQZ/giGD6b5NxHS8PLwpkLWBFhfbHnj1ymrCzs0xsaI6AszVQMQaFzdNvaz/m\nH5nPWwXf4kD3A2qBlQmEh8tEe+fPw7hxMGbMy+8P2DYA78PeVMlfhcM9D6u/tQmEhcm/9b//wtix\ncqqqraFiDArDMqnBJAq5FeLwrcPM+XuO3nIMjasrfPc8u/nEieD/Qlqq/df3M/fwXBw1RxZ/uFg5\nBROZMEE6hXLl5HCSkVCO
wUYxwrh1akmrTW6ubnzXVH6bjdo9ikuBl8yoynSMdq/q1YMePSAiQs6U\niY6GsKgwev7WE4Fg2DvDeHL+id4yzY4179OJEzB1qhw6WrxYOmRLoGIMinRN09JN+aTSJ4RGhdLr\nt16GSrJni8yYAQULyrUNs2aB1x4vzj04R5mcZRhdNxXV6BX/ISoqvibGF19ArVp6K0o9KsagMAwP\nQh5Qbn45HoQ8YFGzRfSq2ktvSYZm61Zo2hRcXvcjpkd1okU0e7vt5Z3X39FbmqGZNEkG+QsVgjNn\nwM1Nb0WJo2IMCsOTK1Mu5n4wF4DBOwdz9fFVfQUZnCZN4JNPw4lo+ilRIoq+1fspp2AiJ07EB5l/\n+MG2nUJSKMdgoxht3DolmMOmdhXa0apcK4IiguiyqQvRMdGmCzMRI9+rHG3GQN5TEFiC/P6T4n5v\nZJsSw9I2RUTAp59CZKRMf/H++xa9HKBiDAoFILu+C5stJG/mvOy5todZh1QZ0LSy99pe5h2fhgMO\nsOFHxo3MwsmTeqsyLuPHy0R5xYvLwLORUTEGhSH5/d/fafZLM1wcXTja6yiV8lbSW5KhCAoP4s3v\n3uTK4yuMqD2CB2smsmgRVKwIR47IQj+KlPP33zLILAT89RfUqaO3opShYgwKu6Jp6ab0rtKbiOgI\nOm3sRHhUAst4FYny1Y6vuPL4CpXzVWZsvbHMnAmlSslSoLH1ohUp48kT6NBBZlAdNMg4TiEplGOw\nUdQYb/LMaDSDEq+V4GTASYb+MdSsbacGo92r1adX88M/P+Dq6MqPLX/ExdGFzJllRTFHRzl9dcYM\nX71lmh1L3CchZHqRK1egcmW5aNCaqBiDQvEKWVyy8HPrn3F2cMb7sDebzm3SW5LNcynwEr1+k9N8\nZzaaSYU8FeLee+ut+Bk1EyfC3bs6CDQYy5bBqlWQObP8aamFbNZGxRgUhmfWwVkM2jmI7Bmy4/eZ\nH0WyF9Fbkk0SHhVOrSW1OH7nOG3Kt2FNmzVor2R1i4qC996T4+QeHrJetKPKjJEg58/LgjshIdJB\ndOmit6LUo2IMCrtlYI2BNCvdjMdhj2m/vj2R0ZF6S7JJhv4xlON3jlMsezG+//D7/zgFkPWhf/kF\n8uQBHx+ZaE/xX0JCoG1b+fOTT+Q0VXtCOQYbxWjj1inBUjZpmsayj5ZR2K0wh24eYtifwyxyncQw\nwnk28E8AAA5GSURBVL1a578O78PeODs4s7rNarJnSLxaTP78MGyYL5oGXl6wc6cVhVoQc90nIaBX\nLzk1tVQp+PZb/dJpqxiDQpEEOTPlZFWbVTg5ODHr0CxWnlyptySb4VTAKbpu6grAtIbTqF6werLn\nVKkiewtCyCfia9csLNJAeHvDzz/LuMLGjcZd3ZwUKsagsCsWHFnAF1u/IINTBvZ120fVAlX1lqQr\ngaGBVFtUjSuPr9DpjU6saLEiwSGkhIiJkWkzduyQdQX275dfhumZv/6CBg1kRto1a+Djj/VWZBoq\nxqBIF/Sp1oeelXsSFhVGy9Utuffsnt6SdCMqJor269pz5fEVquavyqJmi1LsFAAcHGS8oVQpmQOo\nSxfpLNIr16/LuEJ0tKybbXSnkBTKMdgoRhi3Ti3WsEnTNOY1mUfNQjW58fQGbda0sfjiN1u9V0P/\nGMofl/8gd6bcbGy3kYzOGVN8bqxNr70Gv/4K2bLB+vWy+IxRMeU+PXkiM9Heuyd7DJMmJX+ONVAx\nBoUihbg6ubK+7XoKZC3A3ut76bq5KzEifT3qzjk0h1mHZuHk4MS6tusonK1wmtsqW1b2HBwc5DqH\n1avNp9MIREbK3sHp0/JvsXatnL1lz6gYg8Ju8bvrR52ldQiOCOZ/tf7H1IYGz2yWQtb7r+fjtR8j\nEKxosYLOb3Y2S7szZ8LgweDiAtu2Qf36ZmnWphECPvsMvv8ecueGQ4dkkjx7QcUYFOkO93zurG+7\nHicHJ6YdmMbcv+fqLcni7L++n082fIJAMLH+RLM5BYCvvoIBA2R66RYt4J9/zNa0zTJxonQKGTLI\nITV7cgpJoRyDjWKr49amoIdN75d4n8UfLgZgwPYBrDq9yuzXsJV7dTLgJM1XNSc8OpzPqn7G8Npp\nr0CfkE2aJnsNHTpAUBB88AFcsq3y20mS2vs0Zw6MHi3t/vFHqFHDMrpMQcUYFIo00sW9CxPrT0Qg\n6LShE+v81+ktyeycuXeGBisaEBgaSPMyzZnXZF6qZiClFAcHmf7hvfcgIAAaNrTPNQ6LF8PAgXJ7\n0SJo00ZfPdZGxRgU6QIhBKN9RjNx70ScHJxY+/FaWpRtobcss3D2/lnqLa/HvWf3aFyyMZvabcLV\nybLZ3IKCpHM4fBiKFpXpM4oWteglrcbPP0OnTjK+MGcO9O+vtyLLoWIMinSNpmlM8JjAsHeGERUT\nRdu1bdl8brPeskzm3INz1F9Rn3vP7tGweEM2tN1gcacAkDWrTJXx9ttw9SrUrStTTxudJUugc2fp\nFCZNsm+nkBTKMdgotjJubU70tknTNCY3mMygGoOIjImk9ZrWLPlnicnt6mXX3zf/pvaS2twNvotH\nUQ82td+UqrUKSZESm7Jlk86hZk25+KtuXTh3ziyXtwjJ2TRjBvToIRfxjR8Pw9MeorEaKsagUJgB\nTdOY/v50RtYZSbSIpsevPZi4ZyJGG4r8/d/f8VjuwcPQhzQp1YTfOvxGJudMVtfh5iZTZrzzDty4\nIctb7tljdRkmIQSMHAlDhsj9uXNl0Dk9o2IMinTLt0e+pd/WfggEX1T7gjkfzMHJwfZXLv1w/Ac+\n2/IZ0SKabu7dWNhsIc6OzrpqevZMJtvbvFmuc1i6FDp21FVSiggJgZ495QI+R0dYvlzakV5ILMag\nHIMiXbPefz2fbPiE8Ohw3i3yLqvbrCZflnx6y0qQ8KhwBmwfwMJjCwEYWWckEzwmWGT2UVqIjpY1\nj7295f6oUXKltK0W+rl+PX49RpYssgJb06Z6q7IuKvhsMPQej7cEtmhT6/Kt2d1lN/mz5GfPtT1U\nWViFfdf3paoNa9h1/cl16iytw8JjC3F1dGXxh4vxqu9lMaeQFpscHeUsntmziavl0LAh3Lljfn1p\n4UWbdu+GatWkUyhRQq5oNqJTUDEGhcJC1Cpci+OfHefdIu9yJ/gOHss98NrjZROV4IQQrDmzhioL\nq3Dk9hGKZCvC/u776VGlh97SEmXAAFkSNG9eOY3V3d12iv2EhsoV3A0awP378P77cspthQrJn5ue\nUENJCsVzIqMjGb5rODMOzgDgzbxvsuSjJVTJX0UXPXeC7tB3a182ntsIQOOSjfmp5U/kzJRTFz2p\n5e5dOV6/e7fc79oVpk2DXLn00XPkiCzBee6c7N2MGiVf9p4QLylUjEGhSCG7r+ym5689ufL4Co6a\nIwNrDGREnRHkyJjDKtePiI5g8fHFjNo9ikdhj8jqkpVpDafRq2ovHDRjdfKjo2HKF
Dn9MzwccuaU\nzqFLF7mK2hrcvi1nGS1dKmcglSsHK1bIoaT0TmKOASGE1V/Ax8AZIBqoksRxjYFTgD8wLInjhL3h\n4+OjtwSzYySbgsODxYBtA4TmqQk8EW6T3cSEvyaIoPCg/xxrLruioqPECr8VotjsYgJPBJ6IJiub\niOuPr5ul/dRg7nv1779CNGgghPxqFqJiRSFWrxYiKsqsl3mJx4+F8PQUIlMmeU0HBx8xeLAQISGW\nu6a1MfU+Pf/u/M93ql6PH6eAlkCiM541TXMFFiCdwxtAG03TKltHnv74+fnpLcHsGMmmzC6Zmd14\nNkd6HaFh8YY8DX/KaJ/RFJtTjKF/DOXCwwtxx5pq14OQB8w8OJMK31bg002fcuXxFcrlKsf6tuvZ\n0mGLSbUU0oq571WpUjLu8NNPUKiQrG3Qrh1UqiTzEj15Yr5rnTsHfftCwYJyVlRICLRsCcOG+TF9\nOmQ0zxpAm8BS/1O6OAYhxDkhxL/JHPY2cEYIcUsIEQWsBgw4byBtPH78WG8JZseINlUtUJWdnXey\n+9Pd1ChUgwchD5h2YBql55XGY7kH8w/P5+Lti6leIHc3+C4rT66k3bp2FJxZkME7B3P+4XmKZCvC\nso+WcerzU7Qq10q3qaiWuFeaJmMOFy/Cd9/B66/D2bPQqxfkyycdxaZNEBiYunaFkO1MnQp16sih\nom+/lWsrPDxkneYNG8DFxXifv+Sw1P+ULYddCgE3Xti/CdTTR4oiveNRzIMD3Q9w6OYhvj/+PatO\nr8L3qi++V33hMGyevZlahWtR4rUSFH+tOIXdCsctOosRMdwNvsvlR5e5/Ogyx+4c4/S903Fta2g0\nKdWEXlV60bRUU90Xq1kaV1dZ/KZbN7mwbNky8PWFNWvkC6B8ebmKukQJKFAA8ueXT/qhoRAWJp2H\nv798nTwp1yTEkjGjzHf05ZdQsaIeFhofizkGTdP+ABJaKTRCCPFbCppI19Hkq1ev6i3B7BjdJk3T\nqFm4JjUL12RWo1msP7uePy7/wcbfNnLz6U3WnFmT4rYyOWfi3SLv8l6x9/i4wv/bu7tXK6o4jOPf\nR+vkKwUakqlUUBdhIImgvUoZhojQhRIUSgRFUASFhV10E11EF/4DvcAJX0gLEcQiyKgo0NB8Kaub\nilMXahSVoN34dDFjNbKFo3t268zx+dyc2fschmfY+8xv1pq11qxm3pXzBpj8wv0fn9XQUHUTet26\n6sS+aVP1ZLi9e/896Y/WzJmwYgWsXAnLl1dLdZyr69+/XgZ1TEVHJUnaAzxre3+P391JdcN5Zf16\nPTBk++Uef3tJF5GIiIvlHqOSxkJX0vk6UfcB8yVdCxwH1gCP9/rDXgcWEREXp8jNZ0kPSBoBFgO7\nJO2u358taReA7dPAE8D7wEHg3V4ti4iIaNe4mOAWERHt6dY0ynNIul/SYUlfS3q+dJ42SHpD0jFJ\nh0tnaYukuZI+rj+rbyU9VzpTvyRNkrRP0gFJ30naWDpTWyRNrI9rNINEOkHSD5IO1ce1t3SeNki6\nStI2SQclHZW0pLV9d7XFUE+A+wa4AzgGfA48ZvtA0WB9qm+6nwSGbd9SOk8bJM0CrrZ9RNI0YD+w\n2vbBwtH6Immy7VOSLgM+BTbY3lM6V78kPQMsBKbbXlU6TxskfQ8stH2BsyTGLknbqLrYt0iaAEyz\n/Ucb++5yi2FcToCz/QnwW+kcbbJ9zPaRevskcAiYXTZV/2yfqjeHgIlUFyidJmkOsAJ4jfMPDOmq\ncXM8kmYAC2xvAbB9pq2iAN0uDL0mwM0plCVGSdJ1wCKqK+xOkzRB0pdUBWGP7QsYeT9mbQTWA2dK\nB2mZgQ/q7qQnS4dpwY3ACUlvSzoiabhujbeiy4Whm31gl7D6i7sNeNr2n6Xz9Ku+SltAdUFyl6Sl\nhSP1RdJK4HjdHTturq5ri23fCtwLPCJpWelAfZpAdYH1qu35wK9Aa0+q7nJh+An47+pic2m2IGIM\nkXQ58A6w2faO0nnaZPt3YBfV8Osuuw1YVffHbwHukTRcOFMrbB+vf54AtlOdVLtsBPjZ9r769XZg\nQVs773Jh+GcCXH3SWQPsLpwpelC1EtzrwNe2x8XoHUkzJE2vtycD91GtGtxZtl+wPdf29cCDwIe2\n15bO1S9JUyRNqbenUq3Y/FXZVP2xPQL8Iumm+q1lwNG29j8WZj5fFNunJZ2dADcBeGs8TICTtAW4\nG5hRTwJ80fabhWP163bgYeCQpLOjxjbYfq9gpn7NBobrojeJqiW0q3Cmto2X7tpZwI566ZwpwFbb\nOwtnasOjwKa66P0IPNTWjjs7XDUiIgajy11JERExACkMERHRkMIQERENKQwREdGQwhAREQ0pDBER\n0ZDCEBERDSkMERHRkMIQ0TJJi+qHp1whaWq9+uXNpXNFjFZmPkcMgKSXqJbKmAyM2H6lcKSIUUth\niBiAemHHL4BTwBLnHy06JF1JEYMxE5gKTKNqNUR0RloMEQMgaSewGbgBuMb2U4UjRYxaZ5fdjhir\nJK0F/rK9tX5I+2eSltr+qHC0iFFJiyEiIhpyjyEiIhpSGCIioiGFISIiGlIYIiKiIYUhIiIaUhgi\nIqIhhSEiIhpSGCIiouFvo0yDC4nhUVMAAAAASUVORK5CYII=\n",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
"source": [
"# Remove the below line if not running this code in an ipython notebook\n",
"# It's a special command allowing the notebook to display plots inline\n",
@@ -319,17 +394,6 @@
" show_mnist_image(features.reshape(28, 28))"
]
},
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {
- "collapsed": false
- },
- "outputs": [],
- "source": [
- "#implement here Exercise 1"
- ]
- },
{
"cell_type": "markdown",
"metadata": {},
@@ -344,6 +408,17 @@
"Implement `__to_one_of_k` method of `MNISTDataProvider` class. Then modify (uncomment) an appropriate line in its `next` method, so the raw targets get converted to `1 of K` coding. Test the code in the cell below."
]
},
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "#implement here Exercise 1"
+ ]
+ },
{
"cell_type": "code",
"execution_count": null,
@@ -392,7 +467,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython2",
- "version": "2.7.5"
+ "version": "2.7.9"
}
},
"nbformat": 4,
diff --git a/01_Linear_Models.ipynb b/01_Linear_Models.ipynb
index b34eeb9..7fc5611 100644
--- a/01_Linear_Models.ipynb
+++ b/01_Linear_Models.ipynb
@@ -8,7 +8,7 @@
"\n",
"This tutorial is about linear transforms - a basic building block of many, including deep learning, models.\n",
"\n",
- "# Short recap and syncing repositories\n",
+ "# Virtual environments and syncing repositories\n",
"\n",
"Before you proceed onwards, remember to activate you virtual environments so you can use the software you installed last week as well as run the notebooks in interactive mode, no through github.com website.\n",
"\n",
@@ -22,134 +22,408 @@
"\n",
"## On Synchronising repositories\n",
"\n",
- "I started writing this, but do not think giving students a choice is a good way to progess, the most painless way to follow would be to ask them to stash their changes (with some meaningful message) and work on the clean updated repository. This way one can always (temporarily) recover the work once needed but everyone starts smoothly the next lab. We do not want to help anyone how to resolve the conflicts...\n",
+ "Enter your git mlp repository you set up last week (i.e. ~/mlpractical/repo-mlp) and once you synced the repository (in one of the two below ways), start the notebook session by typing:\n",
"\n",
- "Enter your git mlp repository you set up last week (i.e. ~/mlpractical/repo-mlp) and depending on how you want to proceed you either can:\n",
- " 1. Overridde some changes you have made (both in the notebooks and/or in the code if you happen to modify parts that were updated by us) with the code we have provided for this lab\n",
- " 2. Try to merge your code with ours (for example, if you want to use `MetOfficeDataProvider` you have written)\n",
- " \n",
- "Our recommendation is, you should at least keep the progress in the notebooks (so you can peek some details when needed)\n",
- " \n",
"```\n",
- "git pull\n",
+ "ipython notebook\n",
"```\n",
"\n",
- "## Default Synchronising Strategy\n",
+ "### Default way\n",
"\n",
- "Need to think/discuss this."
+ "To avoid potential conflicts between the changes you have made since last week and our additions, we recommend `stash` your changes and `pull` the new code from the mlpractical repository by typing:\n",
+ "\n",
+ "1. `git stash save \"my 1st lab work\"`\n",
+ "2. `git pull`\n",
+ "\n",
+ "Then, once you need you can always (temporaily) restore a desired state of the repository.\n",
+ "\n",
+ "### For advanced github users\n",
+ "\n",
+ "It is OK if you want to keep your changes and merge the new code with whatever you already have, but you need to know what you are doing and how to resolve conflicts.\n",
+ " "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "# Linear and Affine Transforms\n",
+ "# Single Layer Models\n",
"\n",
- "Depending on the required level of details, one may need to. The basis of all linear models is so called affine transform, that is the transform that implements some (linear) rotation of some input points and shift (translation) them. Denote by $\\vec x$ some input vector, then the affine transform is defined as follows:\n",
+ "***\n",
+ "### Note on storing matrices in computer memory\n",
+ "\n",
+ "Consider you want to store the following array in memory: $\\left[ \\begin{array}{ccc}\n",
+ "1 & 2 & 3 \\\\\n",
+ "4 & 5 & 6 \\\\\n",
+ "7 & 8 & 9 \\end{array} \\right]$ \n",
+ "\n",
+ "In computer memory the above matrix would be organised as a vector in either (assume you allocate the memory at once for the whole matrix):\n",
+ "\n",
+ "* Row-wise layout where the order would look like: $\\left [ \\begin{array}{ccccccccc}\n",
+ "1 & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 \\end{array} \\right ]$\n",
+ "* Column-wise layout where the order would look like: $\\left [ \\begin{array}{ccccccccc}\n",
+ "1 & 4 & 7 & 2 & 5 & 8 & 3 & 6 & 9 \\end{array} \\right ]$\n",
+ "\n",
+ "Although `numpy` can easily handle both formats (possibly with some computational overhead), in our code we will stick with modern (and default) `c`-like approach and use row-wise format (contrary to Fortran that used column-wise approach). \n",
+ "\n",
+ "This means, that in this tutorial:\n",
+ "* vectors are kept row-wise $\\mathbf{x} = (x_1, x_1, \\ldots, x_D) $ (rather than $\\mathbf{x} = (x_1, x_1, \\ldots, x_D)^T$)\n",
+ "* similarly, in case of matrices we will stick to: $\\left[ \\begin{array}{cccc}\n",
+ "x_{11} & x_{12} & \\ldots & x_{1D} \\\\\n",
+ "x_{21} & x_{22} & \\ldots & x_{2D} \\\\\n",
+ "x_{31} & x_{32} & \\ldots & x_{3D} \\\\ \\end{array} \\right]$ and each row (i.e. $\\left[ \\begin{array}{cccc} x_{11} & x_{12} & \\ldots & x_{1D} \\end{array} \\right]$) represents a single data-point (like one MNIST image or one window of observations)\n",
+ "\n",
+ "In lecture slides you will find the equations following the conventional mathematical column-wise approach, but you can easily map them one way or the other using using matrix transpose.\n",
+ "\n",
+ "***\n",
+ "\n",
+ "## Linear and Affine Transforms\n",
+ "\n",
+ "The basis of all linear models is so called affine transform, that is a transform that implements some linear transformation and translation of input features. The transforms we are going to use are parameterised by:\n",
+ "\n",
+ " * Weight matrix $\\mathbf{W} \\in \\mathbb{R}^{D\\times K}$: where element $w_{ik}$ is the weight from input $x_i$ to output $y_k$\n",
+ " * Bias vector $\\mathbf{b}\\in R^{K}$ : where element $b_{k}$ is the bias for output $k$\n",
+ "\n",
+ "Note, the bias is simply some additve term, and can be easily incorporated into an additional row in weight matrix and an additinal input in the inputs which is set to $1.0$ (as in the below picture taken from the lecture slides). However, here (and in the code) we will keep them separate.\n",
"\n",
"![Making Predictions](res/singleLayerNetWts-1.png)\n",
"\n",
- "$\n",
+ "For instance, for the above example of 5-dimensional input vector by $\\mathbf{x} = (x_1, x_2, x_3, x_4, x_5)$, weight matrix $\\mathbf{W}=\\left[ \\begin{array}{ccc}\n",
+ "w_{11} & w_{12} & w_{13} \\\\\n",
+ "w_{21} & w_{22} & w_{23} \\\\\n",
+ "w_{31} & w_{32} & w_{33} \\\\\n",
+ "w_{41} & w_{42} & w_{43} \\\\\n",
+ "w_{51} & x_{52} & 2_{53} \\\\ \\end{array} \\right]$, bias vector $\\mathbf{b} = (b_1, b_2, b_3)$ and outputs $\\mathbf{y} = (y_1, y_2, y_3)$, one can write the transformation as follows:\n",
+ "\n",
+ "(for the $i$-th output)\n",
+ "\n",
+ "(1) $\n",
"\\begin{equation}\n",
- " \\mathbf y=\\mathbf W \\mathbf x + \\mathbf b\n",
+ " y_i = b_i + \\sum_j x_jw_{ji}\n",
"\\end{equation}\n",
"$\n",
"\n",
- "Note: the bias term can be incorporated as an additional column in the weight matrix, though in this tutorials we will use a separate variable to for this purpose.\n",
+ "or the equivalent vector form (where $\\mathbf w_i$ is the $i$-th column of $\\mathbf W$):\n",
"\n",
- "An $i$th element of vecotr $\\mathbf y$ is hence computed as:\n",
- "\n",
- "$\n",
+ "(2) $\n",
"\\begin{equation}\n",
- " y_i=\\mathbf w_i \\mathbf x + b_i\n",
+ " y_i = b_i + \\mathbf x \\mathbf w_i^T\n",
"\\end{equation}\n",
"$\n",
"\n",
- "where $\\mathbf w_i$ is the $i$th row of $\\mathbf W$\n",
+ "The same operation can be also written in matrix form, to compute all the outputs $\\mathbf{y}$ at the same time:\n",
"\n",
- "$\n",
+ "(3) $\n",
"\\begin{equation}\n",
- " y_i=\\sum_j w_{ji}x_j + b_i\n",
+ " \\mathbf y=\\mathbf x\\mathbf W + \\mathbf b\n",
"\\end{equation}\n",
"$\n",
"\n",
- "???\n",
- "\n"
+ "When $\\mathbf{x}$ is a mini-batch (contains $B$ data-points of dimension $D$ each), i.e. $\\left[ \\begin{array}{cccc}\n",
+ "x_{11} & x_{12} & \\ldots & x_{1D} \\\\\n",
+ "x_{21} & x_{22} & \\ldots & x_{2D} \\\\\n",
+ "\\cdots \\\\\n",
+ "x_{B1} & x_{B2} & \\ldots & x_{BD} \\\\ \\end{array} \\right]$ equation (3) effectively becomes to be\n",
+ "\n",
+ "(4) $\n",
+ "\\begin{equation}\n",
+ " \\mathbf Y=\\mathbf X\\mathbf W + \\mathbf b\n",
+ "\\end{equation}\n",
+ "$\n",
+ "\n",
+ "where both $\\mathbf{X}\\in\\mathbb{R}^{B\\times D}$ and $\\mathbf{Y}\\in\\mathbb{R}^{B\\times K}$ are matrices, and $\\mathbf{b}$ needs to be broadcased $B$ times (numpy will do this by default). However, we will not make an explicit distinction between a special case for $B=1$ and $B>1$ and simply use equation (3) instead, although $\\mathbf{x}$ and hence $\\mathbf{y}$ could be matrices. From implementation point of view, it does not matter.\n",
+ "\n",
+ "The desired functionality for matrix multiplication in numpy is provided by numpy.dot function. If you haven't use it so far, get familiar with it as we will use it extensively."
+ ]
+ },
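
To make the layout note and equations (3)/(4) above concrete, here is a minimal, self-contained sketch (the shapes, seed and variable names are illustrative only, not part of the lab code) that flattens a small matrix in row-wise and column-wise order, and then applies the affine transform with `numpy.dot`, letting numpy broadcast the bias over the rows of a mini-batch:

```
import numpy

# Row-wise (C) vs column-wise (Fortran) flattening of the same 3x3 matrix
A = numpy.arange(1, 10).reshape(3, 3)
print(A.flatten(order='C'))   # [1 2 3 4 5 6 7 8 9]  (numpy's default, row-wise)
print(A.flatten(order='F'))   # [1 4 7 2 5 8 3 6 9]  (column-wise)

# Affine transform y = xW + b, equations (3) and (4), with row-wise data
D, K, B = 5, 3, 4                                # illustrative sizes
rng = numpy.random.RandomState(123)              # illustrative seed
W = rng.uniform(-0.1, 0.1, (D, K))               # weights: D inputs -> K outputs
b = rng.uniform(-0.1, 0.1, (K,))                 # one bias per output
x = rng.uniform(-1.0, 1.0, (D,))                 # a single data-point (B = 1)
X = rng.uniform(-1.0, 1.0, (B, D))               # a mini-batch of B data-points

y = numpy.dot(x, W) + b                          # shape (K,)
Y = numpy.dot(X, W) + b                          # shape (B, K); b broadcast over rows
print(y.shape)
print(Y.shape)
```

Note that the same `numpy.dot(..., W) + b` expression covers both the single data-point and the mini-batch case, which is exactly why the text above sticks to equation (3) for both.
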
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Note on random number generators (could move it somewhere else)\n",
+ "\n",
+ "It is generally a good practice (for machine learning applications **not** cryptography!) to seed a pseudo-random number generator once at the beginning of the experiment, and use it later through the code where necesarry. This allows to avoid hard to reproduce scenariors where a particular action happens only for a particular sequence of numbers (which you cannot reproduce easily due to unknown random seeds sequence on the way!). As such, within this course we are going use a single random generator object. For instance, the one similar to the below:"
]
},
{
"cell_type": "code",
- "execution_count": 1,
+ "execution_count": null,
"metadata": {
"collapsed": false
},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[ 0.06875593 -0.69616488 0.08823301 0.34533413 -0.22129962]\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
"import numpy\n",
- "x=numpy.random.uniform(-1,1,(4,)); \n",
- "W=numpy.random.uniform(-1,1,(5,4)); \n",
- "y=numpy.dot(W,x);\n",
- "print y"
+ "\n",
+ "#initialise the random generator to be used later\n",
+ "seed=[2015, 10, 1]\n",
+ "random_generator = numpy.random.RandomState(seed)"
+ ]
+ },
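
As a small illustration of why the seed matters (a throwaway sketch, separate from the notebook's `random_generator`), two generators constructed with the same seed produce identical draws, which is what makes an experiment reproducible:

```
import numpy

seed = [2015, 10, 1]
rng_a = numpy.random.RandomState(seed)
rng_b = numpy.random.RandomState(seed)

# identical seeds give identical sequences of pseudo-random numbers
print(numpy.allclose(rng_a.uniform(-0.1, 0.1, (3,)),
                     rng_b.uniform(-0.1, 0.1, (3,))))   # True
```
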
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Exercise 1 \n",
+ "\n",
+ "Using numpy.dot, implement **forward** propagation through the linear transform defined by equations (3) and (4) for $B=1$ and $B>1$. As data ($\\mathbf{x}$) use `MNISTDataProvider` from previous laboratories. For case when $B=1$ write a function to compute the 1st output ($y_1$) using equations (1) and (2). Check if the output is the same as the corresponding one obtained with numpy. \n",
+ "\n",
+ "Tip: To generate random data you can use `random_generator.uniform(-0.1, 0.1, (D, 10))` from the preceeding cell."
]
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": null,
"metadata": {
"collapsed": false
},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[[ 0.63711 0.11566944 0.74416104]\n",
- " [-0.01335825 0.46206922 -0.1109265 ]\n",
- " [-0.37523063 -0.06755371 0.04352121]\n",
- " [ 0.25885831 -0.53660826 -0.40905639]]\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
- "def my_dot(x, W, b):\n",
- " y = numpy.zeros_like((x.shape[0], W.shape[1]))\n",
+ "from mlp.dataset import MNISTDataProvider\n",
+ "\n",
+ "mnist_dp = MNISTDataProvider(dset='valid', batch_size=3, max_num_batches=1, randomize=False)\n",
+ "\n",
+ "irange = 0.1\n",
+ "W = random_generator.uniform(-irange, irange, (784,10)) \n",
+ "b = numpy.zeros((10,))\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "\n",
+ "mnist_dp.reset()\n",
+ "\n",
+ "#implement following functions, then run the cell\n",
+ "def y1_equation_1(x, W, b):\n",
+ " raise NotImplementedError()\n",
+ " \n",
+ "def y1_equation_2(x, W, b):\n",
+ " raise NotImplementedError()\n",
+ "\n",
+ "def y_equation_3(x, W, b):\n",
+ " #use numpy.dot\n",
+ " raise NotImplementedError()\n",
+ "\n",
+ "def y_equation_4(x, W, b):\n",
+ " #use numpy.dot\n",
+ " raise NotImplementedError()\n",
+ "\n",
+ "for x, t in mnist_dp:\n",
+ " y1e1 = y1_equation_1(x[0], W, b)\n",
+ " y1e2 = y1_equation_2(x[0], W, b)\n",
+ " ye3 = y_equation_3(x, W, b)\n",
+ " ye4 = y_equation_4(x, W, b)\n",
+ "\n",
+ "print 'y1e1', y1e1\n",
+ "print 'y1e1', y1e1\n",
+ "print 'ye3', ye3\n",
+ "print 'ye4', ye4\n",
+ " "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "collapsed": true
+ },
+ "source": [
+ "## Exercise 2\n",
+ "\n",
+ "Modify (if necessary) examples from Exercise 1 to perform **backward** propagation, that is, given $\\mathbf{y}$ (obtained in previous step) and weight matrix $\\mathbf{W}$, project $\\mathbf{y}$ onto the input space $\\mathbf{x}$ (ignore or set to zero the biases towards $\\mathbf{x}$ in backward pass). Mathematically, we are interested in the following transformation: $\\mathbf{z}=\\mathbf{y}\\mathbf{W}^T$"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": true
+ },
+ "outputs": [],
+ "source": []
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "***\n",
+ "## Exercise 3 (optional)\n",
+ "\n",
+ "In case you do not fully understand how matrix-vector and/or matrix-matrix products work, consider implementing `my_dot_mat_mat` function (you have been given `my_dot_vec_mat` code to look at as an example) which takes as the input the following arguments:\n",
+ "\n",
+ "* D-dimensional input vector $\\mathbf{x} = (x_1, x_2, \\ldots, x_D) $.\n",
+ "* Weight matrix $\\mathbf{W}\\in\\mathbb{R}^{D\\times K}$:\n",
+ "\n",
+ "and returns:\n",
+ "\n",
+ "* K-dimensional output vector $\\mathbf{y} = (y_1, \\ldots, y_K) $\n",
+ "\n",
+ "Your job is to write a variant that works in a mini-batch mode where both $\\mathbf{x}\\in\\mathbb{R}^{B\\times D}$ and $\\mathbf{y}\\in\\mathbb{R}^{B\\times K}$ are matrices in which each rows contain one of $B$ data-points from mini-batch (rather than $\\mathbf{x}\\in\\mathbb{R}^{D}$ and $\\mathbf{y}\\in\\mathbb{R}^{K}$)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "def my_dot_vec_mat(x, W):\n",
+ " J = x.shape[0]\n",
+ " K = W.shape[1]\n",
+ " assert (J == W.shape[0]), (\n",
+ " \"Number of columns of x expected to \"\n",
+ " \" to be equal to the number of rows in \"\n",
+ " \"W, bot got shapes %s, %s\" % (x.shape, W.shape)\n",
+ " )\n",
+ " y = numpy.zeros((K,))\n",
+ " for k in xrange(0, K):\n",
+ " for j in xrange(0, J):\n",
+ " y[k] += x[j] * W[j,k]\n",
+ " \n",
+ " return y"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "irange = 0.1 #+-range from which we draw the random numbers\n",
+ "\n",
+ "x = random_generator.uniform(-irange, irange, (5,)) \n",
+ "W = random_generator.uniform(-irange, irange, (5,3)) \n",
+ "\n",
+ "y_my = my_dot_vec_mat(x, W)\n",
+ "y_np = numpy.dot(x, W)\n",
+ "\n",
+ "same = numpy.allclose(y_my, y_np)\n",
+ "\n",
+ "if same:\n",
+ " print 'Well done!'\n",
+ "else:\n",
+ " print 'Matrices are different:'\n",
+ " print 'y_my is: ', y_my\n",
+ " print 'y_np is: ', y_np"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": true
+ },
+ "outputs": [],
+ "source": [
+ "def my_dot_mat_mat(x, W):\n",
+ " I = x.shape[0]\n",
+ " J = x.shape[1]\n",
+ " K = W.shape[1]\n",
+ " assert (J == W.shape[0]), (\n",
+ " \"Number of columns in of x expected to \"\n",
+ " \" to be the same as rows in W, got\"\n",
+ " )\n",
+ " #allocate the output container\n",
+ " y = numpy.zeros((I, K))\n",
+ " \n",
+ " #implement here matrix-matrix inner product here\n",
" raise NotImplementedError('Write me!')\n",
+ " \n",
" return y"
]
},
{
"cell_type": "markdown",
"metadata": {},
- "source": []
+ "source": [
+ "Test whether you get comparable numbers to what numpy is producing:"
+ ]
},
{
"cell_type": "code",
- "execution_count": 22,
+ "execution_count": null,
"metadata": {
"collapsed": false
},
- "outputs": [
- {
- "name": "stdout",
- "output_type": "stream",
- "text": [
- "[ 0 1 2 3 4 5 6 7 8 9 10]\n"
- ]
- }
- ],
+ "outputs": [],
"source": [
+ "irange = 0.1 #+-range from which we draw the random numbers\n",
"\n",
- "for itr in xrange(0,100):\n",
- " my_dot(W,x)\n",
- " \n"
+ "x = random_generator.uniform(-irange, irange, (2,5)) \n",
+ "W = random_generator.uniform(-irange, irange, (5,3)) \n",
+ "\n",
+ "y_my = my_dot_mat_mat(x, W)\n",
+ "y_np = numpy.dot(x, W)\n",
+ "\n",
+ "same = numpy.allclose(y_my, y_np)\n",
+ "\n",
+ "if same:\n",
+ " print 'Well done!'\n",
+ "else:\n",
+ " print 'Matrices are different:'\n",
+ " print 'y_my is: ', y_my\n",
+ " print 'y_np is: ', y_np"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now we benchmark each approach (we do it in separate cells, as timeit currently can measure whole cell execuiton only)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": true
+ },
+ "outputs": [],
+ "source": [
+ "#generate bit bigger matrices, to better evaluate timings\n",
+ "x = random_generator.uniform(-irange, irange, (10, 1000))\n",
+ "W = random_generator.uniform(-irange, irange, (1000, 100))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "print 'my_dot timings:'\n",
+ "%timeit -n10 my_dot_mat_mat(x, W)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "print 'numpy.dot timings:'\n",
+ "%timeit -n10 numpy.dot(x, W)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "**Optional section ends here**\n",
+ "***"
]
},
{
@@ -158,93 +432,212 @@
"source": [
"# Iterative learning of linear models\n",
"\n",
- "We will learn the model with (batch for now) gradient descent.\n",
+ "We will learn the model with stochastic gradient descent using mean square error (MSE) loss function, which is defined as follows:\n",
"\n",
- "\n",
- "## Running example\n",
- "\n",
- "![Making Predictions](res/singleLayerNetPredict.png)\n",
- " \n",
- "\n",
- " * Input vector $\\mathbf{x} = (x_1, x_1, \\ldots, x_d)^T $\n",
- " * Output vector $\\mathbf{y} = (y_1, \\ldots, y_K)^T $\n",
- " * Weight matrix $\\mathbf{W}$: $w_{ki}$ is the weight from input $x_i$ to output $y_k$\n",
- " * Bias $w_{k0}$ is the bias for output $k$\n",
- " * Targets vector $\\mathbf{t} = (t_1, \\ldots, t_K)^T $\n",
- "\n",
- "\n",
- "$\n",
- " y_k = \\sum_{i=1}^d w_{ki} x_i + w_{k0}\n",
- "$\n",
- "\n",
- "If we define $x_0=1$ we can simplify the above to\n",
- "\n",
- "$\n",
- " y_k = \\sum_{i=0}^d w_{ki} x_i \\quad ; \\quad \\mathbf{y} = \\mathbf{Wx}\n",
- "$\n",
- "\n",
- "$\n",
+ "(5) $\n",
"E = \\frac{1}{2} \\sum_{n=1}^N ||\\mathbf{y}^n - \\mathbf{t}^n||^2 = \\sum_{n=1}^N E^n \\\\\n",
" E^n = \\frac{1}{2} ||\\mathbf{y}^n - \\mathbf{t}^n||^2\n",
"$\n",
"\n",
- " $ E^n = \\frac{1}{2} \\sum_{k=1}^K (y_k^n - t_k^n)^2 $\n",
- " set $\\mathbf{W}$ to minimise $E$ given the training set\n",
+ "(6) $ E^n = \\frac{1}{2} \\sum_{k=1}^K (y_k^n - t_k^n)^2 $\n",
" \n",
- "$\n",
- " E^n = \\frac{1}{2} \\sum_{k=1}^K (y^n_k - t^n_k)^2 \n",
- " = \\frac{1}{2} \\sum_{k=1}^K \\left( \\sum_{i=0}^d w_{ki} x^n_i - t^n_k \\right)^2 \\\\\n",
- " \\pderiv{E^n}{w_{rs}} = (y^n_r - t^n_r)x_s^n = \\delta^n_r x_s^n \\quad ; \\quad\n",
- " \\delta^n_r = y^n_r - t^n_r \\\\\n",
- " \\pderiv{E}{w_{rs}} = \\sum_{n=1}^N \\pderiv{E^n}{w_{rs}} = \\sum_{n=1}^N \\delta^n_r x_s^n\n",
+ "Hence, the gradient w.r.t (with respect to) the $r$ output y of the model is defined as, so called delta function, $\\delta_r$: \n",
+ "\n",
+ "(8) $\\frac{\\partial{E^n}}{\\partial{y_{r}}} = (y^n_r - t^n_r) = \\delta^n_r \\quad ; \\quad\n",
+ " \\delta^n_r = y^n_r - t^n_r \n",
"$\n",
"\n",
+ "Similarly, using the above $\\delta^n_r$ one can express the gradient of the weight $w_{sr}$ (from the s-th input to the r-th output) for linear model and MSE cost as follows:\n",
"\n",
- "\\begin{algorithmic}[1]\n",
- " \\Procedure{gradientDescentTraining}{$\\mvec{X}, \\mvec{T},\n",
- " \\mvec{W}$}\n",
- " \\State initialize $\\mvec{W}$ to small random numbers\n",
- "% \\State randomize order of training examples in $\\mvec{X}\n",
- " \\While{not converged}\n",
- " \\State for all $k,i$: $\\Delta w_{ki} \\gets 0$\n",
- " \\For{$n \\gets 1,N$}\n",
- " \\For{$k \\gets 1,K$}\n",
- " \\State $y_k^n \\gets \\sum_{i=0}^d w_{ki} x_{ki}^n$\n",
- " \\State $\\delta_k^n \\gets y_k^n - t_k^n$\n",
- " \\For{$i \\gets 1,d$}\n",
- " \\State $\\Delta w_{ki} \\gets \\Delta w_{ki} + \\delta_k^n \\cdot x_i^n$\n",
- " \\EndFor\n",
- " \\EndFor\n",
- " \\EndFor\n",
- " \\State for all $k,i$: $w_{ki} \\gets w_{ki} - \\eta \\cdot \\Delta w_{ki}$\n",
- " \\EndWhile\n",
- " \\EndProcedure\n",
- "\\end{algorithmic}"
+ "(9) $\n",
+ " \\frac{\\partial{E^n}}{\\partial{w_{sr}}} = (y^n_r - t^n_r)x_s^n = \\delta^n_r x_s^n \\quad\\\\\n",
+ " \\frac{\\partial{E}}{\\partial{w_{sr}}} = \\sum_{n=1}^N \\frac{\\partial{E^n}}{\\partial{w_{rs}}} = \\sum_{n=1}^N \\delta^n_r x_s^n\n",
+ "$\n",
+ "\n",
+ "and the gradient for bias parameter at the $r$-th output is:\n",
+ "\n",
+ "(10) $\n",
+ " \\frac{\\partial{E}}{\\partial{b_{r}}} = \\sum_{n=1}^N \\frac{\\partial{E^n}}{\\partial{b_{r}}} = \\sum_{n=1}^N \\delta^n_r\n",
+ "$"
]
},
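
To sanity-check equations (8)-(10), the following hypothetical snippet (not part of the lab code; the sizes, seed and finite-difference step are arbitrary) computes the analytic gradients for one random data-point and compares a single weight's gradient against a central finite difference:

```
import numpy

rng = numpy.random.RandomState(0)
D, K = 4, 3
x = rng.uniform(-1.0, 1.0, (D,))
t = rng.uniform(-1.0, 1.0, (K,))
W = rng.uniform(-0.1, 0.1, (D, K))
b = rng.uniform(-0.1, 0.1, (K,))

def error(W, b):
    y = numpy.dot(x, W) + b
    return 0.5 * numpy.sum((y - t) ** 2)      # E^n, equation (6)

delta = (numpy.dot(x, W) + b) - t             # delta^n, equation (8)
grad_W = numpy.outer(x, delta)                # dE^n/dw_{sr} = delta_r * x_s, equation (9)
grad_b = delta                                # dE^n/db_r, equation (10)

# central finite difference for one arbitrary weight, w_{2,1}
eps = 1e-6
W_plus, W_minus = W.copy(), W.copy()
W_plus[2, 1] += eps
W_minus[2, 1] -= eps
numerical = (error(W_plus, b) - error(W_minus, b)) / (2 * eps)
print(numpy.allclose(grad_W[2, 1], numerical))   # expected: True
```
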
{
"cell_type": "markdown",
"metadata": {},
"source": [
- "# Excercises"
+ "\n",
+ "![Making Predictions](res/singleLayerNetPredict.png)\n",
+ " \n",
+ " * Input vector $\\mathbf{x} = (x_1, x_2, \\ldots, x_D) $\n",
+ " * Output scalar $y_1$\n",
+ " * Weight matrix $\\mathbf{W}$: $w_{ik}$ is the weight from input $x_i$ to output $y_k$. Note, here this is really a vector since a single scalar output, y_1.\n",
+ " * Scalar bias $b$ for the only output in our model \n",
+ " * Scalar target $t$ for the only output in out model\n",
+ " \n",
+ "First, ensure you can make use of data provider (note, for training data has been normalised to zero mean and unit variance, hence different effective range than one can find in file):"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "from mlp.dataset import MetOfficeDataProvider\n",
+ "\n",
+ "modp = MetOfficeDataProvider(10, batch_size=10, max_num_batches=2, randomize=False)\n",
+ "\n",
+ "%precision 2\n",
+ "for x, t in modp:\n",
+ " print 'Observations: ', x\n",
+ " print 'To predict: ', t"
]
},
{
"cell_type": "markdown",
"metadata": {},
- "source": []
+ "source": [
+ "## Exercise 4\n",
+ "\n",
+ "The below code implements a very simple variant of stochastic gradient descent for the weather regression example. Your task is to implement 5 functions in the next cell and then run two next cells that 1) build sgd functions and 2) run the actual training."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": true
+ },
+ "outputs": [],
+ "source": [
+ "\n",
+ "#When implementing those, take into account the mini-batch case, for which one is\n",
+ "#expected to sum the errors for each example\n",
+ "\n",
+ "def fprop(x, W, b):\n",
+ " #code implementing eq. (3)\n",
+ " #return: y\n",
+ " raise NotImplementedError('Write me!')\n",
+ "\n",
+ "def cost(y, t):\n",
+ " #Mean Square Error cost, equation (5)\n",
+ " raise NotImplementedError('Write me!')\n",
+ "\n",
+ "def cost_grad(y, t):\n",
+ " #Gradient of the cost w.r.t y equation (8)\n",
+ " raise NotImplementedError('Write me!')\n",
+ "\n",
+ "def cost_wrt_W(cost_grad, x):\n",
+ " #Gradient of the cost w.r.t W, equation (9)\n",
+ " raise NotImplementedError('Write me!')\n",
+ " \n",
+ "def cost_wrt_b(cost_grad):\n",
+ " #Gradient of the cost w.r.t to b, equation (10)\n",
+ " raise NotImplementedError('Write me!')\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": true
+ },
+ "outputs": [],
+ "source": [
+ "\n",
+ "def sgd_epoch(data_provider, W, b, learning_rate):\n",
+ " mse_stats = []\n",
+ " \n",
+ " #get the minibatch of data\n",
+ " for x, t in data_provider:\n",
+ " \n",
+ " #1. get the estimate of y\n",
+ " y = fprop(x, W, b)\n",
+ "\n",
+ " #2. compute the loss function\n",
+ " tmp = cost(y, t)\n",
+ " mse_stats.append(tmp)\n",
+ " \n",
+ " #3. compute the grad of the cost w.r.t the output layer activation y\n",
+ " #i.e. how the cost changes when output y changes\n",
+ " cost_grad_deltas = cost_grad(y, t)\n",
+ "\n",
+ " #4. compute the gradients w.r.t model's parameters\n",
+ " grad_W = cost_wrt_W(cost_grad_deltas, x)\n",
+ " grad_b = cost_wrt_b(cost_grad_deltas)\n",
+ "\n",
+ " #4. Update the model, we update with the mean gradient\n",
+ " # over the minibatch, rather than sum of particular gradients\n",
+ " # in a minibatch, to do so we scale the learning rate by batch_size\n",
+ " mb_size = x.shape[0]\n",
+ " effect_learn_rate = learning_rate / mb_size\n",
+ "\n",
+ " W = W - effect_learn_rate * grad_W\n",
+ " b = b - effect_learn_rate * grad_b\n",
+ " \n",
+ " return W, b, numpy.mean(mse_stats)\n",
+ "\n",
+ "def sgd(data_provider, W, b, learning_rate=0.1, max_epochs=10):\n",
+ " \n",
+ " for epoch in xrange(0, max_epochs):\n",
+ " #reset the data provider\n",
+ " data_provider.reset()\n",
+ " \n",
+ " #train for one epoch\n",
+ " W, b, mean_cost = \\\n",
+ " sgd_epoch(data_provider, W, b, learning_rate)\n",
+ " \n",
+ " print \"MSE training cost after %d-th epoch is %f\" % (epoch + 1, mean_cost)\n",
+ " \n",
+ " return W, b\n",
+ " \n",
+ " "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "collapsed": false
+ },
+ "outputs": [],
+ "source": [
+ "\n",
+ "#some hyper-parameters\n",
+ "window_size = 12\n",
+ "irange = 0.1\n",
+ "learning_rate = 0.01\n",
+ "max_epochs=40\n",
+ "\n",
+ "# note, while developing you can set max_num_batches to some positive number to limit\n",
+ "# the number of training data-points (you will get feedback faster)\n",
+ "mdp = MetOfficeDataProvider(window_size, batch_size=10, max_num_batches=-100, randomize=False)\n",
+ "\n",
+ "#initialise the parameters\n",
+ "W = random_generator.uniform(-irange, irange, (window_size, 1))\n",
+ "b = random_generator.uniform(-irange, irange, (1, ))\n",
+ "\n",
+ "#train the model\n",
+ "sgd(mdp, W, b, learning_rate=learning_rate, max_epochs=max_epochs)\n"
+ ]
},
{
"cell_type": "markdown",
- "metadata": {},
+ "metadata": {
+ "collapsed": true
+ },
"source": [
- "# Fun Stuff\n",
+ "## Exercise 5\n",
"\n",
- "So what on can do with linear transform, and what are the properties of those?\n",
+ "Modify the above regression problem so the model makes binary classification whether the the weather is going to be one of those \\{rainy, sunny} (look at slide 12 of the 2nd lecture)\n",
"\n",
- "Exercise, show, the LT is invertible, basically, solve the equation:\n",
- "\n",
- "y=Wx+b, given y (transformed image), find such x that is the same as the original one."
+ "Tip: You need to introduce the following changes:\n",
+ "1. Modify `MetOfficeDataProvider` (for example, inherit from MetOfficeDataProvider to create a new class MetOfficeDataProviderBin) and modify `next()` function so it returns as `targets` either 0 (sunny - if the the amount of rain [before mean/variance normalisation] is equal to 0 or 1 (rainy -- otherwise).\n",
+ "2. Modify the functions from previous exercise so the fprop implements `sigmoid` on top of affine transform.\n",
+ "3. Modify cost function to binary cross-entropy\n",
+ "4. Make sure you compute the gradients correctly (as you have changed both the output and the cost)\n"
]
},
{
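
For tips 2 and 3 above, a minimal sketch of the two new ingredients (function names are illustrative, not part of the lab code; the data-provider change in tip 1 and the gradient changes in tip 4 are deliberately left to the exercise):

```
import numpy

def sigmoid(a):
    # squash the affine output into (0, 1) so it can be read as P(rainy)
    return 1.0 / (1.0 + numpy.exp(-a))

def bce_cost(y, t, eps=1e-12):
    # binary cross-entropy summed over the mini-batch; eps guards against log(0)
    y = numpy.clip(y, eps, 1.0 - eps)
    return -numpy.sum(t * numpy.log(y) + (1.0 - t) * numpy.log(1.0 - y))
```
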
diff --git a/mlp/dataset.py b/mlp/dataset.py
index 01156f5..5ca346a 100644
--- a/mlp/dataset.py
+++ b/mlp/dataset.py
@@ -64,7 +64,7 @@ class MNISTDataProvider(DataProvider):
"""
def __init__(self, dset,
batch_size=10,
- max_num_examples=-1,
+ max_num_batches=-1,
randomize=True):
super(MNISTDataProvider, self).\
@@ -74,6 +74,10 @@ class MNISTDataProvider(DataProvider):
"Expected dset to be either 'train', "
"'valid' or 'eval' got %s" % dset
)
+
+ assert max_num_batches != 0, (
+ "max_num_batches should be != 0"
+ )
dset_path = './data/mnist_%s.pkl.gz' % dset
assert os.path.isfile(dset_path), (
@@ -83,7 +87,7 @@ class MNISTDataProvider(DataProvider):
with gzip.open(dset_path) as f:
x, t = cPickle.load(f)
- self._max_num_examples = max_num_examples
+ self._max_num_batches = max_num_batches
self.x = x
self.t = t
self.num_classes = 10
@@ -104,8 +108,7 @@ class MNISTDataProvider(DataProvider):
def next(self):
has_enough = (self._curr_idx + self.batch_size) <= self.x.shape[0]
- presented_max = (self._max_num_examples > 0 and
- self._curr_idx + self.batch_size > self._max_num_examples)
+ presented_max = (0 < self._max_num_batches < (self._curr_idx / self.batch_size))
if not has_enough or presented_max:
raise StopIteration()
@@ -122,8 +125,7 @@ class MNISTDataProvider(DataProvider):
self._curr_idx += self.batch_size
- return rval_x, self.__to_one_of_k(rval_y)
- return rval_x, rval_t
+ return rval_x, self.__to_one_of_k(rval_t)
def __to_one_of_k(self, y):
rval = numpy.zeros((y.shape[0], self.num_classes), dtype=numpy.float32)
@@ -132,7 +134,7 @@ class MNISTDataProvider(DataProvider):
return rval
-class MetOfficeDataProvider_(DataProvider):
+class MetOfficeDataProvider(DataProvider):
"""
The class iterates over South Scotland Weather, in possibly
random order.
@@ -142,7 +144,7 @@ class MetOfficeDataProvider_(DataProvider):
max_num_batches=-1,
randomize=True):
- super(MetOfficeDataProvider_, self).\
+ super(MetOfficeDataProvider, self).\
__init__(batch_size, randomize)
dset_path = './data/HadSSP_daily_qc.txt'
@@ -152,27 +154,35 @@ class MetOfficeDataProvider_(DataProvider):
raw = numpy.loadtxt(dset_path, skiprows=3, usecols=range(2, 32))
- self.window_size = windows_size
+ self.window_size = window_size
+ self._max_num_batches = max_num_batches
#filter out all missing datapoints and
#flatten a matrix to a vector, so we will get
         #a time preserving representation of measurements
#with self.x[0] being the first day and self.x[-1] the last
- self.x = raw[raw < 0].flatten()
- self._max_num_examples = max_num_examples
+ self.x = raw[raw >= 0].flatten()
+
+ #normalise data to zero mean, unit variance
+ mean = numpy.mean(self.x)
+ var = numpy.var(self.x)
+ assert var >= 0.01, (
+ "Variance too small %f " % var
+ )
+        self.x = (self.x - mean)/numpy.sqrt(var)
self._rand_idx = None
if self.randomize:
self._rand_idx = self.__randomize()
def reset(self):
- super(MetOfficeDataProvider_, self).reset()
+ super(MetOfficeDataProvider, self).reset()
if self.randomize:
self._rand_idx = self.__randomize()
def __randomize(self):
assert isinstance(self.x, numpy.ndarray)
# we generate random indexes starting from window_size, i.e. 10th absolute element
- # in the self.x vector, as we later during minibatch preparation slice
+ # in the self.x vector, as we later during mini-batch preparation slice
# the self.x container backwards, i.e. given we want to get a training
        # data-point for the 11th day, we look at the 10 preceding days.
# Note, we cannot do this, for example, for the 5th day as
@@ -182,8 +192,7 @@ class MetOfficeDataProvider_(DataProvider):
def next(self):
has_enough = (self._curr_idx + self.batch_size) <= self.x.shape[0]
- presented_max = (self._max_num_examples > 0 and
- self._curr_idx + self.batch_size > self._max_num_examples)
+ presented_max = (0 < self._max_num_batches < (self._curr_idx / self.batch_size))
if not has_enough or presented_max:
raise StopIteration()
@@ -198,18 +207,24 @@ class MetOfficeDataProvider_(DataProvider):
#build slicing matrix of size minibatch, which will contain batch_size
            #rows, each keeping indexes that select window_size+1 [for (x,t)] elements
#from data vector (self.x) that itself stays always sorted w.r.t time
- range_slices = numpy.zeros((self.batch_size, self.window_size + 1))
+ range_slices = numpy.zeros((self.batch_size, self.window_size + 1), dtype=numpy.int32)
+
for i in xrange(0, self.batch_size):
- range_slices[i,:] = \
- numpy.arange(range_idx[i], range_idx[i] - self.window_size - 1, -1)[::-1]
+ range_slices[i, :] = \
+ numpy.arange(range_idx[i],
+ range_idx[i] - self.window_size - 1,
+ -1,
+ dtype=numpy.int32)[::-1]
#here we use advanced indexing to select slices from observation vector
- #last column of rval_x makes our targets t
- rval_x = self.x[range_slices]
-
+            #last column of tmp_x makes our targets t (as we slice window_size + 1 elements)
+ tmp_x = self.x[range_slices]
+ rval_x = tmp_x[:,:-1]
+ rval_t = tmp_x[:,-1].reshape(self.batch_size, -1)
+
self._curr_idx += self.batch_size
- return rval_x[:,:-1], rval[:,-1]
+ return rval_x, rval_t
class FuncDataProvider(DataProvider):
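
A hedged usage sketch of the providers after this change (run from the repository root so the `./data/...` files resolve; the shapes in the comments are the expected ones, assuming the standard 784-dimensional MNIST features and 10 classes):

```
from mlp.dataset import MNISTDataProvider, MetOfficeDataProvider

# max_num_batches now limits how many mini-batches an epoch yields;
# a negative value (e.g. -1) means "iterate over everything", and 0 is rejected by the assert
mnist_dp = MNISTDataProvider(dset='valid', batch_size=100, max_num_batches=5, randomize=False)
for x, t in mnist_dp:
    print(x.shape)   # (100, 784) pixel features
    print(t.shape)   # (100, 10) one-of-K targets

# each MetOfficeDataProvider row holds window_size past observations (x)
# and the following observation as the target (t)
met_dp = MetOfficeDataProvider(7, batch_size=10, max_num_batches=3, randomize=False)
for x, t in met_dp:
    print(x.shape)   # (10, 7)
    print(t.shape)   # (10, 1)
```
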
diff --git a/res/singleLayerNetBP-1.png b/res/singleLayerNetBP-1.png
new file mode 100644
index 0000000000000000000000000000000000000000..122ee36763fa6b86e50623458552aee6c7c7d949
GIT binary patch
literal 70872
[... base85-encoded binary PNG data omitted ...]