pytorch-widedeep, deep learning for tabular data IV: Deep Learning vs LightGBM (2021-05-28)
https://jrzaurin.github.io/infinitoml/2021/05/28/pytorch-widedeep_iv
<!--
#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: _notebooks/2021-05-28-pytorch-widedeep_iv.ipynb
-->
<div class="container" id="notebook-container">
<div class="cell border-box-sizing code_cell rendered">
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Here we go with yet another post in the series. I started planning this post a few months ago, as soon as I released what was then the last beta version (<code>0.4.8</code>) of the library <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>. However, since then a few other things took priority, which meant that running the hundreds of experiments behind this post (probably over 1500) took me considerably more time than I expected. Nevertheless, here we are.</p>
<p>Let me start by saying thanks to the guys at the <a href="https://aws.amazon.com/developer/community/community-builders/">AWS community builders</a>, and especially to <a href="https://www.linkedin.com/in/cameronperon/">Cameron</a>, for making my life a lot easier around AWS.</p>
<p>All the Deep Learning models for this project were run on a <code>p2.xlarge</code> instance, and all the <code>LightGBM</code> experiments were run on my Mid-2015 Mac.</p>
<p>Once the proper acknowledgments have been made, let me tell you a bit about the context of all those experiments and eventually this post.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="1.-Introduction:-why-all-this?">1. Introduction: why all this?<a class="anchor-link" href="#1.-Introduction:-why-all-this?"> </a></h2><p>Over the last couple of years, and in particular during the last year, I have put a lot of effort into improving <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>. This has been <strong>really</strong> entertaining, and I have learned a lot. However, as I was adding models to the library, especially for the tabular component (see <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html">here</a>), I wondered if there was a purpose to it other than learning those models themselves. You see, I am a scientist by education and I spent over a decade in academia. There we used to do <em>a lot</em> of not-very-useful things; cool (sometimes), but not very useful. One of the aspects that drove me to the private sector, a few years back now, was the search for a sense of "usefulness", somewhere I could build things that have a scientific aspect and at the same time are useful. With that in mind, I wanted the library to be, forgive the redundancy, useful. Here the adjective "useful" can mean a number of things. It could mean using the library directly, forking the repo and using the code, or just copying and pasting some portion of the code for a given project. Eventually, however, a question that I wanted to answer was: <em>do these models compare well with, or even improve on, the performance of other more "standard" models like GBMs?</em> Note that I write "<em>a question</em>" and not "<em>the question</em>". More on this later in the post.</p>
<p>Of course, I am not the first to compare Deep Learning (hereafter DL) approaches with GBMs for tabular data, and I won't be the last. In fact, by the time I am writing these lines, a new paper, <a href="https://arxiv.org/pdf/2106.03253.pdf">Tabular Data: Deep Learning is Not All You Need</a> [1], has been published. This post and that paper are certainly very similar, and the conclusions are entirely consistent. However, there are some differences. They compare DL algorithms against <code>XGBoost</code> [2] and <code>CatBoost</code> [3], while I use <code>LightGBM</code> [4] (see Section 2.3 for an explanation of the use of this algorithm). Also, I would say that three of the four datasets that I use here are a bit more challenging than the datasets in their paper, but that might be just my perception. Finally, with the exception of <code>TabNet</code>, the DL models I use here and those in that paper are different. Nonetheless, in the Conclusion section I will write some thoughts on ways to tackle these benchmarking/testing exercises.</p>
<p>Aside from that paper, in <em>all</em> papers releasing new models there are comprehensive comparisons between the new DL architectures and GBMs. My main caveats with some of these publications are the following: I often do not manage to reproduce the results in the paper (which of course might be my fault), and I sometimes find that the effort placed in optimizing the DL models is a bit more "<em>intense</em>" than that placed in optimizing the GBMs. Last but not least, the lack of consistency in the results tables of some papers is, at times, confusing. For example, Paper A will use DL Model A and find that it performs better than all GBMs, normally <code>XGBoost</code>, <code>CatBoost</code> and <code>LightGBM</code>. Then Paper B will come along with a new DL Model B that also performs better than all GBMs, but in their paper it turns out that Model A no longer beats the GBMs.</p>
<p>Considering all that, I decided to use <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a> and run a sizable set of experiments comprising different DL models for tabular data and <code>LightGBM</code>.</p>
<p>Before I move on, let me comment on the code "quality" in that repo. One has to bear in mind that the goal here is to test algorithms in a rigorous manner, not to write production code. If you want to see better code you can go to <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a> itself, or maybe to some of my other repos. Just saying, in case some "purist" is tempted to waste the universe's time.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="2.-Datasets-and-Models">2. Datasets and Models<a class="anchor-link" href="#2.-Datasets-and-Models"> </a></h2><p>For the experiments here I have used four datasets and four DL models.</p>
<h3 id="2.1-Datasets">2.1 Datasets<a class="anchor-link" href="#2.1-Datasets"> </a></h3><ol>
<li><a href="https://archive.ics.uci.edu/ml/datasets/adult">Adult Census</a> (binary classification) </li>
<li><a href="https://archive.ics.uci.edu/ml/datasets/Bank+Marketing">Bank Marketing</a> (binary classification)</li>
<li><a href="https://www.kaggle.com/neomatrix369/nyc-taxi-trip-duration-extended">NYC taxi ride duration</a> (regression)</li>
<li><a href="https://archive.ics.uci.edu/ml/datasets/Facebook+Comment+Volume+Dataset">Facebook Comment Volume</a> (regression)</li>
</ol>
<p>The bash script <code>get_data.sh</code> in the <a href="https://github.com/jrzaurin/tabulardl-benchmark">repo</a> has all the info you need to get those datasets in case you wanted to explore them yourself. Of course, all the code used to run the experiments and reproduce the results is also available in that repo.</p>
<p>Here is some basic information about the datasets:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="n">basic_info</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s2">"../../tabulardl-benchmark/raw_data/basic_stats_df.csv"</span><span class="p">)</span>
<span class="n">basic_info</span><span class="p">[</span><span class="n">basic_info</span><span class="o">.</span><span class="n">Dataset</span> <span class="o">!=</span> <span class="s2">"airbnb"</span><span class="p">]</span><span class="o">.</span><span class="n">reset_index</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>Dataset</th>
<th>n_rows</th>
<th>n_cols</th>
<th>objective</th>
<th>neg_pos_ratio</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>adult</td>
<td>45222</td>
<td>15</td>
<td>binary_classification</td>
<td>0.3295</td>
</tr>
<tr>
<th>1</th>
<td>bank_marketing</td>
<td>41188</td>
<td>20</td>
<td>binary_classification</td>
<td>0.1270</td>
</tr>
<tr>
<th>2</th>
<td>nyc_taxi</td>
<td>1458644</td>
<td>26</td>
<td>regression</td>
<td>NaN</td>
</tr>
<tr>
<th>3</th>
<td>facebook_comments_vol</td>
<td>199029</td>
<td>54</td>
<td>regression</td>
<td>NaN</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 1</strong>. Basic information for the datasets used in this post</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>There are reasons why I chose these datasets.</p>
<p>In general, I looked for binary, multi-class and regression datasets that had a good number of categorical features, if not were dominated by them. This is because, in my experience, DL models for tabular data become more useful and competitive on sizeable datasets where categorical features are present (although [5] suggests that better results are obtained by encoding the numerical features as well), and even more so if those categorical features have a lot of categories. In that setting the embeddings acquire a more significant value, i.e. we learn representations of the categorical features that encode relationships with all the other features, and with the target, for a specific dataset. Note that this does not happen when using GBMs. Even if one used <a href="https://maxhalford.github.io/blog/target-encoding/">target encoding</a>, in reality there is not much of a learning element there (still useful, of course).</p>
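<p>To make the contrast concrete, here is a minimal target-encoding sketch in pandas (the column names and values below are made up for illustration): each category is simply replaced by a fixed per-category statistic of the target, and no representation is learned.</p>

```python
import pandas as pd

# Toy data: one categorical feature and a binary target (illustrative only)
df = pd.DataFrame(
    {
        "city": ["a", "a", "b", "b", "b", "c"],
        "target": [1, 0, 1, 1, 0, 1],
    }
)

# Target encoding: map each category to the mean of the target within that
# category. The mapping is a per-category statistic, not a learned embedding.
encoding = df.groupby("city")["target"].mean()
df["city_encoded"] = df["city"].map(encoding)
```

<p>In a real setting one would compute the encoding on the training split only (often with smoothing) to avoid target leakage.</p>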
<p>Of course, one could take datasets dominated by numerical features and somehow bin those features to turn them into categorical ones. However, this seemed a bit too "forced" to me. Since I wanted to keep the content of this post as close as possible to real use cases, it is hard for me to think of many "real world" scenarios where we are given datasets dominated by numerical features that are then turned/binned into categorical ones before being fed to an algorithm. In other words, I did not want to consider datasets where I had to bin the numerical features into categorical ones just to compare GBMs and DL models.</p>
<p>On the other hand, I also looked for datasets that were already familiar to me or that did not require too much feature engineering to get to a stage where the data could be passed to a model. This way I could save some time on that aspect and focus a bit more on the experimentation, since I intended to run a large number of experiments. Finally, I looked for datasets that, to some extent, resemble as much as possible the datasets one would find in the "real world", but have a tractable size, so that I could experiment within a reasonable time frame.</p>
<p>While I did manage to find suitable datasets for binary classification and regression, I did not find datasets that I particularly liked for multi-class classification (if anyone has any suggestion, please comment below and I will be happy to give it a go). Perhaps I will include the <a href="https://archive.ics.uci.edu/ml/datasets/covertype">CoverType</a> dataset in the future, although the one at the UCI ML repository, not Kaggle's balanced version. For now, I will move on with the four enumerated above. Let me briefly comment on each dataset.</p>
<p>I would refer to the <em>Adult Census</em> dataset as the "<em>easiest dataset</em>", in the sense that simple models (e.g. a Naive Bayes classifier) will already lead to accuracies of $\sim$ 84$\%$ without any feature engineering. Personally, I don't normally find such nice datasets in the real world. However, it is one of the most popular and well-known datasets for ML tutorials, posts, etc., and I eventually decided to include it.</p>
<p>The <em>Bank Marketing</em> dataset is also well known. The data is related to direct marketing campaigns based on phone calls, and the task is to predict whether or not a client will subscribe to a product. In this case it is important to mention a couple of relevant aspects. In the first place, I used the <a href="https://archive.ics.uci.edu/ml/datasets/Bank+Marketing">original dataset</a>, which is a bit imbalanced (the positive to negative class ratio is 0.127). Secondly, you might look around and find that some people obtained better results than those I will show later in the post. All such cases that I found use either a balanced dataset from Kaggle, a feature called <code>duration</code>, or both. The <code>duration</code> feature, which refers to the duration of the call, is something you only know <strong>after</strong> the call, and it highly affects the target. Therefore, I have not used it in my experiments. This dataset resembles a real use case more than the Adult dataset, in the sense that the data is imbalanced and the prediction is not an easy task at all. Still, the dataset is small and not that imbalanced.</p>
<p>The <em>NYC taxi ride duration</em> dataset is also well known and is the largest of all datasets I used. Here our goal is to predict the total ride duration of taxi trips in New York City. Instead of getting the dataset from the <a href="https://www.kaggle.com/c/nyc-taxi-trip-duration">Kaggle site</a> I manually downloaded an extended version from <a href="https://www.kaggle.com/neomatrix369/nyc-taxi-trip-duration-extended">here</a>, where all the feature engineering had already been done.</p>
<p>Finally the <em>Facebook Comment Volume</em> dataset was another ideal candidate, since it has a good size and all the feature engineering was done for me. Our goal here is to predict the comment volume that posts will receive. In fact this dataset was originally used to compare decision trees versus neural networks. A very detailed description of the dataset and the pre-processing can be found in the <a href="https://uksim.info/uksim2015/data/8713a015.pdf">original publication</a> [6]. In particular, I used their training Variant - 5 dataset for the experiments in this post, which has 199029 rows and 54 columns.</p>
<p>All the code for the data preparation steps, before the data is fed to the algorithms, can be found <a href="https://github.com/jrzaurin/tabulardl-benchmark/tree/master/prepare_datasets">here</a>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="2.2.-The-DL-Models">2.2. The DL Models<a class="anchor-link" href="#2.2.-The-DL-Models"> </a></h3><p>As I mentioned earlier in the post, all DL models were run via <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>. This library offers four wide and deep model <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html">components</a>: <code>wide</code>, <code>deeptabular</code>, <code>deeptext</code>, <code>deepimage</code>. Let me briefly comment on each one of them. For more details, please see the <a href="https://jrzaurin.github.io/infinitoml/">companion posts</a>, the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html">documentation</a> or the <a href="https://github.com/jrzaurin/pytorch-widedeep/tree/tabnet/pytorch_widedeep/models">source code</a> itself.</p>
<ol>
<li><code>wide</code>: this is just a linear model implemented via an <code>Embedding</code> layer</li>
</ol>
<ol>
<li><p><code>deeptabular</code>: this component will take care of the "standard" tabular data and has 4 alternatives:</p>
<p>2.1 <code>TabMlp</code>: a simple standard MLP. Very similar to, for example, the <a href="https://docs.fast.ai/tabular.learner.html">tabular api</a> implementation in the fastai library.</p>
<p>2.2 <code>TabResnet</code>: similar to the MLP, but using Resnet blocks instead of plain dense layers.</p>
<p>2.3 <code>Tabnet</code> [7]: this is a very interesting implementation. It is hard to explain in a few sentences, so I strongly suggest reading the <a href="https://arxiv.org/abs/1908.07442">paper</a>. <code>Tabnet</code> is meant to be competitive with GBMs and offers model interpretability via feature importance. <code>pytorch-widedeep</code>'s implementation of <code>Tabnet</code> is fully based on the fantastic <a href="https://github.com/dreamquark-ai/Tabnet">implementation</a> by the guys at dreamquark-ai; therefore, <strong>ALL</strong> credit to them. I have simply adapted it to work within a wide and deep frame and added a couple of extra features, such as internal dropout in the GLU blocks and the possibility of not using ghost batch normalization [8].</p>
<p>Note that the original implementation allows training in two stages: first, self-supervised training via a standard encoder-decoder approach, and then supervised training or fine-tuning using only the encoder. Only the supervised training (i.e. the encoder) is implemented in <code>pytorch-widedeep</code>. The authors showed that unsupervised pre-training improves the performance mostly in the low-data regime or when the unlabeled dataset is much larger than the labeled dataset. Therefore, if you are in one of those scenarios (or simply as a general recommendation), you had better use dreamquark-ai's implementation.</p>
<p>2.4 <code>TabTransformer</code> [9]: this is similar to <code>TabResnet</code>, but instead of Resnet blocks the authors use Transformer [10] blocks. As with <code>Tabnet</code>, the <code>TabTransformer</code> allows for a two-stage training process: unsupervised pre-training followed by supervised training or fine-tuning. <code>pytorch-widedeep</code>'s implementation of the <code>TabTransformer</code> is designed to be used in a "standard" way, i.e. supervised training. Note that, consistent with Sercan Ö. Arık and Tomas Pfister's results for <code>Tabnet</code>, the authors found that unsupervised pre-training improves the performance mostly in the low data volume regime or when the unlabeled dataset is much larger than the labeled dataset. The <code>TabTransformer</code> implementation available in <code>pytorch-widedeep</code> is partially based on that in the <a href="https://github.com/awslabs/autogluon/tree/058398b61d1b2011f56a9dce149b0989adbbb04a/tabular/src/autogluon/tabular/models/tab_transformer">autogluon</a> library and that from Phil Wang <a href="https://github.com/lucidrains/tab-transformer-pytorch">here</a>.</p>
</li>
</ol>
<ol>
<li><code>deeptext</code>: a standard text classifier/regressor comprised of a stack of RNNs (LSTMs or GRUs). In addition, there is the option to add a set of dense layers on top of the stack of RNNs, plus some other extra features. </li>
</ol>
<ol>
<li><code>deepimage</code>: a standard image classifier/regressor using a pretrained network (in particular ResNets) or a sequence of 4 convolution layers. In addition, there is the option to add a set of dense layers on top of the CNN, plus some other extra features. </li>
</ol>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="2.3.-Why-LightGBM?">2.3. Why <code>LightGBM</code>?<a class="anchor-link" href="#2.3.-Why-LightGBM?"> </a></h3><p>If you have worked with me, or even had a chat with me about some ML project, you will know that one of my favorite algorithms is <code>LightGBM</code>. I have used it extensively. In fact, the last 3 ML systems that I productionised all relied on <code>LightGBM</code>. It performs similarly to, when not better than, its brothers and sisters (e.g. <code>XGBoost</code> or <code>CatBoost</code>), is significantly faster, and offers native support for categorical features (see <a href="https://www.tandfonline.com/doi/abs/10.1080/01621459.1958.10501479">here</a>; although when it comes to support for categorical features, <code>CatBoost</code> is probably the superior solution). In addition, it offers the usual flexibility and performance of GBMs.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="2.4.-Experiments-setup-and-other-considerations">2.4. Experiments setup and other considerations<a class="anchor-link" href="#2.4.-Experiments-setup-and-other-considerations"> </a></h3><p>As I mentioned earlier in the post, I ran many experiments (not all were recorded and/or made it to the post) for the four datasets, focusing on the different models available for the <code>deeptabular</code> component. All the experiments run can be found <a href="https://github.com/jrzaurin/tabulardl-benchmark/tree/master/run_experiments">here</a> in the repo.</p>
<p>The experiments considered not only different parameters for the models (i.e. number of units, layers, etc.) but also different optimizers, learning rate schedulers, and training processes. For example, all experiments were run with early stopping, with a <code>patience</code> of 30 epochs in most cases. I used three different optimizers (<code>Adam</code> [11], <code>AdamW</code> [12] and <code>RAdam</code> [13]) and three different learning rate schedulers (<code>ReduceLROnPlateau</code>, <code>OneCycleLR</code> [14], <code>CyclicLR</code> [15]). The following command corresponds to one of the experiments run:</p>
<div class="highlight"><pre><span></span>python adult/adult_tabmlp.py --mlp_hidden_dims <span class="o">[</span><span class="m">100</span>,50<span class="o">]</span> --mlp_dropout <span class="m">0</span>.2 --optimizer Adam --early_stop_patience <span class="m">30</span> --lr_scheduler CyclicLR --base_lr 5e-4 --max_lr <span class="m">0</span>.01 --n_cycles <span class="m">10</span> --n_epochs <span class="m">100</span> --save_results
</pre></div>
<p>The command above will run a <code>TabMlp</code> model on the Adult dataset. Most <code>args</code> are straightforward to understand. Perhaps the only aspect worth commenting on is that this particular experiment was run with a <code>CyclicLR</code> scheduler, where the learning rate oscillates between 0.0005 and 0.01, 10 times over 100 epochs (i.e. a cycle every 10 epochs).</p>
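<p>As a rough sketch of what such a scheduler does, here is a plain triangular cycle in pure Python (an illustration only, not PyTorch's actual <code>CyclicLR</code> implementation, which steps per batch and supports several cycle policies):</p>

```python
def triangular_lr(step, base_lr=5e-4, max_lr=0.01, cycle_len=100):
    """Triangular cyclic learning rate: rises linearly from base_lr to
    max_lr over the first half of a cycle, falls back over the second
    half, and repeats every `cycle_len` steps."""
    pos = (step % cycle_len) / cycle_len   # position within the cycle, in [0, 1)
    frac = 1.0 - abs(2.0 * pos - 1.0)      # goes 0 -> 1 -> 0 over one cycle
    return base_lr + (max_lr - base_lr) * frac

# At the start of a cycle the rate is base_lr; mid-cycle it reaches max_lr.
```

<p>With <code>cycle_len</code> set so that 10 cycles fit in the training run, this reproduces the oscillation described above.</p>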
<p>It is worth mentioning that when running the experiments, I assumed that there is an inherent hierarchy in the DL model parameters and training setups. Therefore, rather than optimizing all parameters at once, I chose those that I considered more relevant and ran experiments that reproduced that hierarchy. For example, when running a simple <code>MLP</code>, I assumed that the number of neurons in the layers is a more important parameter than whether or not I use <code>BatchNorm</code> in the last layer. It may well be (or surely is) that the best thing to do is to optimize all parameters at once, but following this "hierarchical" approach also gave me a sense of how changing individual parameters affected the performance of the model. Nonetheless, around 100 experiments were run per model and per dataset on average, so the exploration was relatively exhaustive (just relatively).</p>
<p>On the other hand, <code>LightGBM</code> was optimized using <code>Optuna</code> [16], <code>Hyperopt</code> [17], or both, choosing the parameters that led to the best metrics. All the code can be found <a href="https://github.com/jrzaurin/tabulardl-benchmark">here</a>. Note that the experiments, and the code in the repo, represent a very detailed and thorough tutorial on how to use <code>pytorch-widedeep</code> (in case you wanted to use the library).</p>
<p>It is also worth mentioning that when running the experiments, the early stop criterion for both the DL models and <code>LightGBM</code> was based on the validation loss. Alternatively, one can monitor a metric, such as accuracy or the F1 score. Note that accuracy (or F1) and loss are not necessarily exactly inversely correlated. There might be edge cases where the algorithm is really unsure about some predictions (i.e. predictions close to the classification threshold, leading to high loss values) yet ends up making the right prediction (higher accuracy). Of course, ideally we want the algorithm to be sure and make the right predictions, but you know, the real world is messy and noisy. Nonetheless, out of curiosity, I tried monitoring metrics in some experiments. Overall, I found that the results were consistent with those obtained when monitoring the loss, although slightly better metrics could be achieved in some cases.</p>
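<p>The early stop logic itself is simple. A minimal sketch monitoring the validation loss might look like this (an illustration, not the actual <code>pytorch-widedeep</code> or <code>LightGBM</code> callback):</p>

```python
def early_stop(val_losses, patience=30):
    """Return True if the validation loss has not improved on its best
    value for at least `patience` consecutive epochs.

    val_losses: list of per-epoch validation losses seen so far.
    """
    # Index of the best (lowest) validation loss so far; ties keep the earliest.
    best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__)
    # Stop when `patience` epochs have elapsed since that best epoch.
    return len(val_losses) - 1 - best_epoch >= patience
```

<p>One would call this at the end of each epoch and break out of the training loop when it returns <code>True</code>, restoring the weights from the best epoch.</p>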
<p>Another relevant piece of information relates to the dimension of the embeddings used to represent the categorical features. As one can imagine, the number of possibilities here is endless, and I had to find a way to consistently automate the process across all experiments. To that end I decided to use fastai's <a href="https://github.com/fastai/fastai/blob/90e009b90b9843dde8c02b0268ab9021ebef342f/fastai/tabular/model.py#L10">empirical rule of thumb</a>. For a given categorical feature, the embedding dimension will be:</p>
$$
n_{embed} = \min\big(600,\ \mathrm{int}(1.6 \times n_{cat}^{0.56})\big)
$$<p>The exception is the <code>TabTransformer</code>. The <code>TabTransformer</code> treats the categorical features as if they were part of a sequence (i.e. contextual) where the order is irrelevant, i.e. no positional encoding is needed. Therefore, rather than concatenating them "one beside the other", they are stacked "one on top of the other". This means that all categorical features must have embeddings of the same dimension, which is a bit of an inconvenience when the numbers of categories of the categorical features in the dataset cover a wide range.</p>
<p>For example, let's say we have a dataset with just 2 categorical features, with 50 and 3 different categories respectively. While using embeddings of 16 dimensions, for example, seems appropriate for the former, it certainly seems like an "over-representation" in the latter case. One could still use fastai's rule of thumb and zero-pad the lower-dimensional embeddings, but that would imply that some of the attention heads attend to zeros/nothing throughout the entire training process, which seems like a waste to me. Despite this potential "waste", I am considering bringing this in as an option for <code>pytorch-widedeep</code>'s <code>TabTransformer</code> implementation. In the meantime, "<em>all</em>" <code>TabTransformer</code> experiments were run with an additional setup where categorical features with a small number of categories were passed through the <code>wide</code> component.</p>
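<p>In code, the rule of thumb above is simply the following (a direct transcription of the formula as written; I believe fastai's own implementation rounds rather than truncates, so the result may differ by one in some cases):</p>

```python
def embed_dim(n_cat: int) -> int:
    """Embedding dimension for a categorical feature with `n_cat`
    distinct categories, following the rule of thumb above, capped
    at 600 for very high-cardinality features."""
    return min(600, int(1.6 * n_cat ** 0.56))
```

<p>For the two-feature example above, this gives dimension 14 for the feature with 50 categories and only 2 for the one with 3 categories, which is exactly the mismatch the padding discussion refers to.</p>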
<p>Finally, for all experiments I used 80% of the data for training and 10% for validation/parameter tuning. These two datasets were then combined for one last training run, and the algorithm was tested on the remaining 10% of the data. The datasets were split at random unless there was a temporal component, in which case I used a chronological train/test split (note that in the case of the <em>Facebook Comment Volume</em> dataset I did not use the test set used in the paper; all train, validation and test datasets are splits of the Variant - 5 dataset described in the paper).</p>
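<p>A sketch of that split scheme for the random case (pure Python; the seed is arbitrary, and the exact splitting code in the repo may differ):</p>

```python
import random

def train_valid_test_split(n_rows, seed=42):
    """80/10/10 random split by row index. For datasets with a temporal
    component the split would be chronological instead of random."""
    idx = list(range(n_rows))
    random.Random(seed).shuffle(idx)
    n_train, n_valid = int(0.8 * n_rows), int(0.1 * n_rows)
    train = idx[:n_train]
    valid = idx[n_train:n_train + n_valid]
    test = idx[n_train + n_valid:]
    return train, valid, test

# After tuning on `valid`, train and valid are combined for one final
# training run, and the model is evaluated once on `test`.
```
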
<p>And that's all, without further ado, let's move to the results.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="3.-Results">3. Results<a class="anchor-link" href="#3.-Results"> </a></h2><p>The previous sections provide context for this "project" and details on the experiments that I ran. In this section I will simply show the top 5 results for all data and model combinations, along with some comments where I consider them necessary. The complete tables with the results for <em>all</em> experiments can be found <a href="https://github.com/jrzaurin/tabulardl-benchmark/tree/master/analyze_experiments/leaderboards">here</a>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.1-Adult-Census-Dataset">3.1 Adult Census Dataset<a class="anchor-link" href="#3.1-Adult-Census-Dataset"> </a></h3><h4 id="3.1.1-TabMlp">3.1.1 <code>TabMlp</code><a class="anchor-link" href="#3.1.1-TabMlp"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">adult_tabmlp</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"adult_tabmlp.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">adult_tabmlp</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>[400,200]</td>
<td>relu</td>
<td>0.5</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0010</td>
<td>128</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2857</td>
</tr>
<tr>
<th>1</th>
<td>[400,200]</td>
<td>relu</td>
<td>0.5</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>128</td>
<td>0.0</td>
<td>Adam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>0.2860</td>
</tr>
<tr>
<th>2</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.2</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>128</td>
<td>0.0</td>
<td>Adam</td>
<td>OneCycleLR</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2860</td>
</tr>
<tr>
<th>3</th>
<td>[400,200]</td>
<td>relu</td>
<td>0.5</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0010</td>
<td>128</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2861</td>
</tr>
<tr>
<th>4</th>
<td>[400,200]</td>
<td>relu</td>
<td>0.5</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>0.2862</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 2</strong>. Results obtained for the Adult Census dataset using <code>TabMlp</code>.</p>
<p>The first comment to make relates to the columns/parameters: not all of them apply to every experiment/row. For example, parameters/columns like <code>base_lr</code>, <code>max_lr</code>, <code>div_factor</code> or <code>final_div_factor</code> apply only when the learning rate scheduler is either <code>CyclicLR</code> or <code>OneCycleLR</code>.</p>
<p>On the other hand, the dense layers of the MLP are built following an approach very similar to that of the <code>fastai</code> library. This approach offers flexibility in terms of the operations that occur within each dense layer of the MLP (see <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html#pytorch_widedeep.models.tab_mlp.TabMlp">here</a> for details). In that context, the columns <code>mlp_batchnorm_last</code> and <code>mlp_linear_first</code> set the order in which these operations occur. For example, if for a given dense layer we set <code>mlp_linear_first = True</code>, the implemented dense layer will look like this: <code>[LIN -> ACT -> DP]</code>. If, on the other hand, <code>mlp_linear_first = False</code>, the dense layer will perform the operations in the following order: <code>[DP -> LIN -> ACT]</code>.</p>
<p>In the case of the Adult Census dataset, cyclic learning rate schedulers produce very good results. In fact, a one-cycle learning rate schedule with adequate parameters already leads to an acceptable validation loss in just one epoch (provided that the batch size is small enough), which perhaps illustrates that this dataset is not particularly difficult. Nonetheless, the best result (by a negligible amount) was obtained with the <code>ReduceLROnPlateau</code> learning rate scheduler. This is actually common across the experiments for the different datasets, and is also consistent with my experience running DL models in many different scenarios, for tabular data or text. The <code>ReduceLROnPlateau</code> scheduler was run with a "<em>patience</em>" of 10 epochs. This, along with the <code>EarlyStopping</code> patience of 30 epochs, means that when <code>ReduceLROnPlateau</code> is used the learning rate will be reduced 3 times before the experiment is forced to stop.</p>
<p>For full details on the experiment setup, the model implementations and the meaning of each parameter/column, please have a look at <code>pytorch-widedeep</code>'s <a href="https://pytorch-widedeep.readthedocs.io/en/latest/index.html">documentation</a> and the experiments <a href="https://github.com/jrzaurin/tabulardl-benchmark">repo</a>.</p>
</div>
</div>
</div>
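<p>To make the effect of <code>mlp_linear_first</code> concrete, here is a minimal sketch (my own illustration, not the library's actual code) that returns the ordered operations for one dense layer; the placement of batch norm when it is enabled is an assumption on my part:</p>

```python
def dense_layer_ops(linear_first: bool, batchnorm: bool = False,
                    dropout: float = 0.1) -> list:
    """Return the ordered operation names for one MLP dense layer.

    Sketch of the fastai-style layer construction described above:
    `linear_first=True`  -> [LIN -> ACT -> DP]
    `linear_first=False` -> [DP -> LIN -> ACT]
    (BN placement here is an assumption for illustration only.)
    """
    if linear_first:
        ops = ["LIN", "ACT"]
        if batchnorm:
            ops.append("BN")
        if dropout > 0.0:
            ops.append("DP")
    else:
        ops = ["BN"] if batchnorm else []
        if dropout > 0.0:
            ops.append("DP")
        ops += ["LIN", "ACT"]
    return ops

print(dense_layer_ops(linear_first=True))   # ['LIN', 'ACT', 'DP']
print(dense_layer_ops(linear_first=False))  # ['DP', 'LIN', 'ACT']
```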
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.1.2-TabResnet">3.1.2 <code>TabResnet</code><a class="anchor-link" href="#3.1.2-TabResnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">adult_tabresnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"adult_tabresnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">adult_tabresnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>blocks_dims</th>
<th>blocks_dropout</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0004</td>
<td>32</td>
<td>0.0</td>
<td>Adam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2850</td>
</tr>
<tr>
<th>1</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>32</td>
<td>0.0</td>
<td>Adam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2853</td>
</tr>
<tr>
<th>2</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0004</td>
<td>128</td>
<td>0.0</td>
<td>AdamW</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2854</td>
</tr>
<tr>
<th>3</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0004</td>
<td>64</td>
<td>0.0</td>
<td>AdamW</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2855</td>
</tr>
<tr>
<th>4</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.0004</td>
<td>32</td>
<td>0.0</td>
<td>AdamW</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2856</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 3</strong>. Results obtained for the Adult dataset using <code>TabResnet</code>.</p>
<p><code>blocks_dims = same</code> in Table 3 indicates that the Resnet blocks, which are composed of dense layers, have the same dimension as the incoming embeddings (see <a href="https://github.com/jrzaurin/pytorch-widedeep/blob/tabnet/pytorch_widedeep/models/tab_resnet.py">here</a> for details on the implementation).</p>
<p>On the other hand, the <code>TabResnet</code> model offers the possibility of using an MLP "on top" of the Resnet blocks. <code>mlp_hidden_dims = None</code> indicates that no MLP was used and that the output of the last Resnet block was "plugged" directly into the output neuron. As shown in Table 3, the top 5 results obtained using <code>TabResnet</code> correspond to architectures that have no MLP. In consequence, all MLP-related parameters/columns are redundant for those experiments.</p>
<p>I find it interesting that, whether with <code>Adam</code> or <code>AdamW</code>, the best results are obtained using <code>OneCycleLR</code>. When using this scheduler I normally set the number of epochs to between 1 and 10, and I normally obtain the best results with a small number of epochs ($\leq 5$) and a small batch size, which implies that the increase/decrease of the learning rate will be more gradual (i.e. spread over a higher number of steps) than with large batch sizes. Finally, note that the parameter/column <code>n_cycles</code> only applies to the <code>CyclicLR</code> scheduler. Since that scheduler is not used in any of the top 5 experiments, the column can be ignored in Table 3.</p>
</div>
</div>
</div>
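<p>The point about small batch sizes spreading the one-cycle schedule over more steps can be sketched with a little arithmetic. The 44,000-row figure below is an approximation of the Adult training set size, used purely for illustration:</p>

```python
import math

def onecycle_total_steps(n_samples: int, batch_size: int, n_epochs: int) -> int:
    """Total optimizer steps over which a OneCycleLR-style schedule is spread:
    the smaller the batch size, the more steps, hence a more gradual LR ramp."""
    steps_per_epoch = math.ceil(n_samples / batch_size)
    return steps_per_epoch * n_epochs

# ~44k training rows (illustrative figure), 5 epochs
print(onecycle_total_steps(44000, 32, 5))    # 6875 steps: a very gradual ramp
print(onecycle_total_steps(44000, 1024, 5))  # 215 steps: a much coarser ramp
```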
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.1.3-Tabnet">3.1.3 <code>Tabnet</code><a class="anchor-link" href="#3.1.3-Tabnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">adult_tabnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"adult_tabnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">adult_tabnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>n_steps</th>
<th>step_dim</th>
<th>attn_dim</th>
<th>ghost_bn</th>
<th>virtual_batch_size</th>
<th>momentum</th>
<th>gamma</th>
<th>dropout</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>lambda_sparse</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>5</td>
<td>32</td>
<td>32</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.1</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2916</td>
</tr>
<tr>
<th>1</th>
<td>5</td>
<td>64</td>
<td>64</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2938</td>
</tr>
<tr>
<th>2</th>
<td>5</td>
<td>32</td>
<td>32</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.1</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2939</td>
</tr>
<tr>
<th>3</th>
<td>5</td>
<td>64</td>
<td>64</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2945</td>
</tr>
<tr>
<th>4</th>
<td>5</td>
<td>64</td>
<td>64</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.05</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2962</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 4</strong>. Results obtained for the Adult dataset using <code>Tabnet</code>.</p>
<p><code>Tabnet</code> has received some attention lately for being competitive with GBMs, and even outperforming them. In addition, it is a very elegant implementation that offers model interpretability via feature importances obtained using attention mechanisms.</p>
<p>The reality is that for the Adult Census dataset I obtain the worst validation loss values with <code>Tabnet</code> (though, as we will see later, not the worst metric). Maybe I simply missed "that precise" set of parameters that leads to better results. However, it is worth emphasizing that I explored <code>Tabnet</code> with the same level of detail as any of the other 3 model alternatives.</p>
<p>On the other hand, it is interesting that, across all the experiments run, the best results are consistently obtained without Ghost batch normalization. Therefore, the parameter/column <code>virtual_batch_size</code> can be ignored in Table 4. Similarly, since the best results are all obtained using <code>ReduceLROnPlateau</code>, all the parameters related to cyclic learning rate schedulers can also be ignored in Table 4.</p>
<p>Finally, and consistent with some experiments I ran in the past, the best results obtained using <code>RAdam</code> normally involve relatively high learning rates.</p>
</div>
</div>
</div>
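<p>For readers unfamiliar with Ghost batch normalization: it computes normalization statistics over small "virtual" sub-batches of the full batch rather than over the batch as a whole. The sketch below only illustrates the splitting step (real implementations operate on tensors and keep per-chunk running statistics):</p>

```python
def virtual_batches(batch: list, virtual_batch_size: int) -> list:
    """Split a batch into the virtual sub-batches that Ghost Batch Norm
    would normalize independently (illustrative sketch only)."""
    return [batch[i:i + virtual_batch_size]
            for i in range(0, len(batch), virtual_batch_size)]

chunks = virtual_batches(list(range(1024)), 128)
print(len(chunks))  # 8 virtual batches of 128 samples each
```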
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.1.4-TabTransformer">3.1.4 <code>TabTransformer</code><a class="anchor-link" href="#3.1.4-TabTransformer"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">adult_tabtransformer</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"adult_tabtransformer.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">adult_tabtransformer</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>embed_dropout</th>
<th>full_embed_dropout</th>
<th>shared_embed</th>
<th>add_shared_embed</th>
<th>frac_shared_embed</th>
<th>input_dim</th>
<th>n_heads</th>
<th>n_blocks</th>
<th>dropout</th>
<th>ff_hidden_dim</th>
<th>transformer_activation</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>with_wide</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.010</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2879</td>
</tr>
<tr>
<th>1</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>same</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.010</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2885</td>
</tr>
<tr>
<th>2</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>True</td>
<td>0.010</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2888</td>
</tr>
<tr>
<th>3</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>8</td>
<td>0.2</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>True</td>
<td>0.001</td>
<td>128</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2892</td>
</tr>
<tr>
<th>4</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>2</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.010</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2894</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 5</strong>. Results obtained for the Adult Census dataset using the <code>TabTransformer</code>.</p>
<p>As with all the previous models, for details on the meaning of each parameter/column please have a look at the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/index.html">documentation</a> or the <a href="https://github.com/jrzaurin/pytorch-widedeep">source code</a> itself.</p>
<p>It is perhaps worth mentioning that when the feed-forward hidden dimension (<code>ff_hidden_dim</code>) is set to <code>NaN</code>, the model defaults to a <code>ff_hidden_dim</code> equal to 4 times the input embedding dimension (16 in all the experiments/rows shown in the table). This results in a feed-forward layer with dimensions <code>[ff_input_dim -> 4 * ff_input_dim -> ff_input_dim]</code>. Similarly, when <code>mlp_hidden_dims = None</code> the model defaults to 4 times the input dimension, resulting in an MLP of dimensions <code>[mlp_input_dim -> 4 * mlp_input_dim -> 2 * mlp_input_dim -> output_dim]</code>.</p>
<p>In addition, and as mentioned before, the <code>TabTransformer</code> was also run with a set-up that includes a <code>wide</code> component. This is specified by the <code>with_wide</code> parameter.</p>
<p>It is worth noting that the best loss values, which are similar to those of the rest of the DL models, are normally obtained using the <code>RAdam</code> optimizer.</p>
</div>
</div>
</div>
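<p>The default dimensions described above can be written down as two small helper functions (my own sketch of the rules stated in the text, not the library's code; the example MLP input dimension of 64 is hypothetical):</p>

```python
def default_ff_dims(input_dim: int) -> list:
    """Feed-forward block dims when `ff_hidden_dim` is left unset:
    expand to 4x the embedding dim, then project back."""
    return [input_dim, 4 * input_dim, input_dim]

def default_mlp_dims(mlp_input_dim: int, output_dim: int = 1) -> list:
    """MLP dims when `mlp_hidden_dims = None`: 4x then 2x the input dim."""
    return [mlp_input_dim, 4 * mlp_input_dim, 2 * mlp_input_dim, output_dim]

print(default_ff_dims(16))    # [16, 64, 16]
print(default_mlp_dims(64))   # [64, 256, 128, 1]
```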
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.1.5-DL-vs-LightGBM">3.1.5 DL vs <code>LightGBM</code><a class="anchor-link" href="#3.1.5-DL-vs-LightGBM"> </a></h4><p>After having gone through the results obtained for each of the DL models, this is the moment of truth: let's see how the DL results compare with those obtained with <code>LightGBM</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">lightgbm_vs_dl_adult</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"lightgbm_vs_dl_adult.csv"</span><span class="p">)</span>
<span class="n">lightgbm_vs_dl_adult</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>model</th>
<th>acc</th>
<th>runtime</th>
<th>best_epoch_or_ntrees</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>lightgbm</td>
<td>0.8782</td>
<td>0.9086</td>
<td>408.0</td>
</tr>
<tr>
<th>1</th>
<td>tabmlp</td>
<td>0.8722</td>
<td>205.3576</td>
<td>62.0</td>
</tr>
<tr>
<th>2</th>
<td>tabtransformer</td>
<td>0.8718</td>
<td>288.6406</td>
<td>32.0</td>
</tr>
<tr>
<th>3</th>
<td>tabnet</td>
<td>0.8704</td>
<td>422.2967</td>
<td>26.0</td>
</tr>
<tr>
<th>4</th>
<td>tabresnet</td>
<td>0.8698</td>
<td>388.9325</td>
<td>25.0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 6</strong>. Results obtained for the Adult Census dataset using four DL models and <code>LightGBM</code>. <code>runtime</code> units are seconds.</p>
<p>Let me emphasise again that the metrics shown in Table 6 are <em>all</em> obtained, of course, on the test dataset. The <code>runtime</code> column shows the training time, in seconds, on the final train dataset (i.e. a dataset comprising 90% of the data) using the best parameters obtained during validation. The DL models were run on a <code>p2.xlarge</code> instance on AWS and all the <code>LightGBM</code> experiments were run on my Mac Mid 2015.</p>
<p>There are a few aspects worth commenting on. In the first place, all DL models obtain results that are competitive with, but not better than, those of <code>LightGBM</code>. Secondly, the best-performing DL model (by a rather marginal amount) is the simplest one, the <code>TabMlp</code>. And finally, the training time of <code>LightGBM</code> is simply "<em>gigantically</em>" better than that of any of the DL models.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.2-Bank-Marketing-Dataset">3.2 Bank Marketing Dataset<a class="anchor-link" href="#3.2-Bank-Marketing-Dataset"> </a></h3><p>Most of the comments in the previous section apply to the tables shown in this section.</p>
<p>Note that, as I mentioned earlier in the post, the Bank Marketing dataset is slightly imbalanced. Therefore, I also ran some experiments using the <a href="https://arxiv.org/abs/1708.02002">focal loss</a> [18] (which is accessible in <code>pytorch_widedeep</code> via a parameter or as a loss function input. See <a href="https://pytorch-widedeep.readthedocs.io/en/latest/trainer.html">here</a>). Overall, the results obtained were similar to, but not better than, those without the focal loss. This is consistent with my experience with other datasets, where I find that the focal loss leads to notably better results only when the dataset is highly imbalanced (for example, around a 2% positive-to-negative class ratio).</p>
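<p>For reference, the focal loss of Lin et al. [18] for a single binary prediction can be sketched as follows (a minimal scalar version following the paper, not <code>pytorch_widedeep</code>'s implementation):</p>

```python
import math

def binary_focal_loss(p: float, y: int, alpha: float = 0.25,
                      gamma: float = 2.0) -> float:
    """Focal loss for one binary prediction with probability p and label y.

    The modulating factor (1 - p_t)^gamma down-weights easy, well-classified
    examples so that training focuses on the hard (often minority-class) ones.
    """
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)

# A confident correct prediction contributes far less than a hard one
print(binary_focal_loss(0.95, 1) < binary_focal_loss(0.30, 1))  # True
```

With <code>alpha = 1</code> and <code>gamma = 0</code> the expression reduces to the plain binary cross-entropy, which is a quick sanity check on the formula.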
<h4 id="3.2.1-TabMlp">3.2.1 <code>TabMlp</code><a class="anchor-link" href="#3.2.1-TabMlp"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">bank_marketing_tabmlp</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"bank_marketing_tabmlp.csv"</span><span class="p">)</span>
<span class="c1"># focal loss values are on a different scale</span>
<span class="n">bank_marketing_tabmlp</span> <span class="o">=</span> <span class="n">bank_marketing_tabmlp</span><span class="p">[</span><span class="n">bank_marketing_tabmlp</span><span class="o">.</span><span class="n">val_loss_or_metric</span> <span class="o">></span> <span class="mf">0.2</span><span class="p">]</span>
<span class="p">(</span><span class="n">bank_marketing_tabmlp</span>
<span class="o">.</span><span class="n">sort_values</span><span class="p">(</span><span class="s2">"val_loss_or_metric"</span><span class="p">,</span> <span class="n">ascending</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="o">.</span><span class="n">reset_index</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="o">.</span><span class="n">head</span><span class="p">(</span><span class="mi">5</span><span class="p">))</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>True</td>
<td>True</td>
<td>False</td>
<td>0.1</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2638</td>
</tr>
<tr>
<th>1</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>True</td>
<td>False</td>
<td>True</td>
<td>0.1</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2639</td>
</tr>
<tr>
<th>2</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>True</td>
<td>True</td>
<td>False</td>
<td>0.1</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2643</td>
</tr>
<tr>
<th>3</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2643</td>
</tr>
<tr>
<th>4</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>True</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2646</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 7</strong>. Results obtained for the Bank Marketing dataset using <code>TabMlp</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.2.2-TabResnet">3.2.2 <code>TabResnet</code><a class="anchor-link" href="#3.2.2-TabResnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">bank_marketing_tabresnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"bank_marketing_tabresnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">head</span><span class="p">(</span><span class="mi">5</span><span class="p">)</span>
<span class="n">bank_marketing_tabresnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>blocks_dims</th>
<th>blocks_dropout</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>64</td>
<td>0.0</td>
<td>Adam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2660</td>
</tr>
<tr>
<th>1</th>
<td>[50,50,50,50]</td>
<td>0.2</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0010</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2661</td>
</tr>
<tr>
<th>2</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>64</td>
<td>0.0</td>
<td>RAdam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2663</td>
</tr>
<tr>
<th>3</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2664</td>
</tr>
<tr>
<th>4</th>
<td>same</td>
<td>0.5</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0004</td>
<td>128</td>
<td>0.0</td>
<td>Adam</td>
<td>OneCycleLR</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>0.2667</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 8</strong>. Results obtained for the Bank Marketing dataset using <code>TabResnet</code>.</p>
<p>Again, and very interestingly, the <code>RAdam</code> optimizer combined with the <code>OneCycleLR</code> scheduler leads to some of the best results for this DL model.</p>
</div>
</div>
</div>
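<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>To see what the <code>OneCycleLR</code>-related columns in the tables (<code>max_lr</code>, <code>div_factor</code>, <code>final_div_factor</code>) actually control, here is a minimal, pure-Python sketch of a cosine one-cycle schedule following PyTorch's conventions. The function name and the <code>pct_start</code> value are illustrative, not taken from the experiments; the hyperparameter values mirror the first row of the table above.</p>

```python
import math

def one_cycle_lr(step, total_steps, max_lr=0.01, div_factor=25,
                 final_div_factor=1000, pct_start=0.3):
    """Cosine-annealed one-cycle schedule (PyTorch-style conventions):
    warm up from max_lr/div_factor to max_lr, then anneal down to
    (max_lr/div_factor)/final_div_factor."""
    initial_lr = max_lr / div_factor
    min_lr = initial_lr / final_div_factor
    up_steps = int(pct_start * total_steps)
    if step < up_steps:
        # cosine warm-up from initial_lr up to max_lr
        t = step / max(1, up_steps)
        return initial_lr + (max_lr - initial_lr) * (1 - math.cos(math.pi * t)) / 2
    # cosine anneal from max_lr down to min_lr
    t = (step - up_steps) / max(1, total_steps - up_steps)
    return max_lr + (min_lr - max_lr) * (1 - math.cos(math.pi * t)) / 2

# One "cycle" over 100 steps: starts at 0.01/25 = 4e-4, peaks at 0.01,
# and decays to 4e-4/1000 = 4e-7
lrs = [one_cycle_lr(s, 100) for s in range(101)]
```

<p>In a real training loop this schedule is stepped once per batch, not per epoch, which is how <code>OneCycleLR</code> is used in <code>pytorch-widedeep</code>'s training loop as well.</p>
</div>
</div>
</div>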
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.2.3-Tabnet">3.2.3 <code>Tabnet</code><a class="anchor-link" href="#3.2.3-Tabnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">bank_marketing_tabnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"bank_marketing_tabnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">head</span><span class="p">(</span><span class="mi">5</span><span class="p">)</span>
<span class="n">bank_marketing_tabnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>n_steps</th>
<th>step_dim</th>
<th>attn_dim</th>
<th>ghost_bn</th>
<th>virtual_batch_size</th>
<th>momentum</th>
<th>gamma</th>
<th>dropout</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>lambda_sparse</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>True</td>
<td>128</td>
<td>0.75</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2714</td>
</tr>
<tr>
<th>1</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>True</td>
<td>64</td>
<td>0.25</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2722</td>
</tr>
<tr>
<th>2</th>
<td>5</td>
<td>64</td>
<td>64</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2726</td>
</tr>
<tr>
<th>3</th>
<td>5</td>
<td>64</td>
<td>64</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>128</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2738</td>
</tr>
<tr>
<th>4</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>True</td>
<td>128</td>
<td>0.98</td>
<td>2.0</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2739</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 9</strong>. Results obtained for the Bank Marketing dataset using <code>Tabnet</code>.</p>
<p>Note that the top 5 results obtained with <code>Tabnet</code> in this case all involve a relatively high learning rate (<code>lr = 0.03</code>). Also, as with the Adult Census dataset, <code>Tabnet</code> produces the worst validation loss values.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.2.4-TabTransformer">3.2.4 <code>TabTransformer</code><a class="anchor-link" href="#3.2.4-TabTransformer"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">bank_marketing_tabtransformer</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"bank_marketing_tabtransformer.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">head</span><span class="p">(</span><span class="mi">5</span><span class="p">)</span>
<span class="n">bank_marketing_tabtransformer</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>embed_dropout</th>
<th>full_embed_dropout</th>
<th>shared_embed</th>
<th>add_shared_embed</th>
<th>frac_shared_embed</th>
<th>input_dim</th>
<th>n_heads</th>
<th>n_blocks</th>
<th>dropout</th>
<th>ff_hidden_dim</th>
<th>transformer_activation</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>with_wide</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>32</td>
<td>8</td>
<td>6</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2646</td>
</tr>
<tr>
<th>1</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>32</td>
<td>8</td>
<td>6</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2647</td>
</tr>
<tr>
<th>2</th>
<td>0.0</td>
<td>False</td>
<td>True</td>
<td>False</td>
<td>4</td>
<td>16</td>
<td>4</td>
<td>6</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.010</td>
<td>128</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2668</td>
</tr>
<tr>
<th>3</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>32</td>
<td>8</td>
<td>6</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.010</td>
<td>1024</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2672</td>
</tr>
<tr>
<th>4</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>32</td>
<td>8</td>
<td>6</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.001</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>0.2672</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 10</strong>. Results obtained for the Bank Marketing dataset using the <code>TabTransformer</code>.</p>
<p>It is perhaps worth noting that, consistent with some of the previous results, the best results obtained here using <code>RAdam</code> involve relatively high learning rates (a factor of 10 higher than those used with <code>Adam</code> or <code>AdamW</code>).</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.2.5-DL-vs-LightGBM">3.2.5 DL vs <code>LightGBM</code><a class="anchor-link" href="#3.2.5-DL-vs-LightGBM"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">lightgbm_vs_dl_bank_marketing</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"lightgbm_vs_dl_bank_marketing.csv"</span><span class="p">)</span>
<span class="n">lightgbm_vs_dl_bank_marketing</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>model</th>
<th>f1</th>
<th>auc</th>
<th>runtime</th>
<th>best_epoch_or_ntrees</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>tabresnet</td>
<td>0.4298</td>
<td>0.6501</td>
<td>92.5175</td>
<td>11.0</td>
</tr>
<tr>
<th>1</th>
<td>tabtransformer</td>
<td>0.4200</td>
<td>0.6440</td>
<td>31.6938</td>
<td>4.0</td>
</tr>
<tr>
<th>2</th>
<td>tabmlp</td>
<td>0.3855</td>
<td>0.6281</td>
<td>9.5721</td>
<td>7.0</td>
</tr>
<tr>
<th>3</th>
<td>lightgbm</td>
<td>0.3852</td>
<td>0.6265</td>
<td>0.4614</td>
<td>57.0</td>
</tr>
<tr>
<th>4</th>
<td>tabnet</td>
<td>0.3087</td>
<td>0.5943</td>
<td>77.8781</td>
<td>13.0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 11</strong>. Results obtained for the Bank Marketing dataset using four DL models and LightGBM.</p>
<p>I must admit that the results shown in Table 11 surprised me at first (to say the least). I went back, re-ran a few of the DL models and ran <code>LightGBM</code> multiple times to double-check, and finally concluded (spoiler alert) that this is the only case among all the experiments I ran for this post where DL models perform better than <code>LightGBM</code>. In fact, combining these experiments with my experience at work, this is only the second time ever that I have found DL models performing better than <code>LightGBM</code> (more on this later). Furthermore, the improvement obtained using <code>TabResnet</code> or the <code>TabTransformer</code> is significant enough that, if this were a "real world" problem, one might consider using a DL model and accepting the trade-off between running time and the success metric.</p>
<p>Of course, one could dive a bit deeper into <code>LightGBM</code>, setting sample weights or even using a custom loss, but the same can be said of the DL models. Therefore, and overall, I consider the comparison fair. However, I am surprised enough that I consider the possibility that there is a bug in the code that I have not been able to find. If anyone goes through the code at some point and does find a bug, please let me know 🙂.</p>
<p>Finally, someone might feel disappointed by <code>Tabnet</code>'s performance, as I was. There is a possibility that I have not implemented it correctly, although the code is fully based on dreamquark-ai's implementation (<strong>ALL</strong> credit to them), and when tested on easier datasets I obtain results similar to those of GBMs. I find <code>Tabnet</code> to be a very elegant implementation and somehow I believe it should perform better. I will come back to this point in the Conclusions section.</p>
</div>
</div>
</div>
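<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>On the "setting sample weights" idea for imbalanced targets such as the Bank Marketing one: a common recipe is to give each sample a weight inversely proportional to its class frequency. The sketch below uses synthetic labels (the ~11% positive rate is only illustrative) and computes scikit-learn-style "balanced" weights, which could then be passed to <code>LightGBM</code> via the <code>weight</code> argument of <code>lgb.Dataset</code>.</p>

```python
import numpy as np

# Hypothetical imbalanced binary labels (~11% positives, for illustration)
y = np.array([0] * 89 + [1] * 11)

# "Balanced" weights a la scikit-learn: n_samples / (n_classes * class_count)
classes, counts = np.unique(y, return_counts=True)
class_weight = {c: len(y) / (len(classes) * n) for c, n in zip(classes, counts)}
sample_weight = np.array([class_weight[label] for label in y])

# With these weights, each class contributes equally to the weighted loss;
# in LightGBM they would go in, e.g., lgb.Dataset(X, label=y, weight=sample_weight)
```

<p>Note that this only rebalances the loss; whether it improves the F1 or AUC scores in Table 11 would still need to be verified experimentally.</p>
</div>
</div>
</div>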
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.3-NYC-Taxi-trip-duration">3.3 NYC Taxi trip duration<a class="anchor-link" href="#3.3-NYC-Taxi-trip-duration"> </a></h3><p>As I mentioned earlier this is the largest dataset, and in consequence, I experimented with larger batch sizes. While this might slightly change some of the individual results, I believe it will not change the overall conclusion in this section.</p>
<h4 id="3.3.1-TabMlp">3.3.1 <code>TabMlp</code><a class="anchor-link" href="#3.3.1-TabMlp"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">nyc_taxi_tabmlp</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"nyc_taxi_tabmlp.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">nyc_taxi_tabmlp</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>auto</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>True</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>79252.7786</td>
</tr>
<tr>
<th>1</th>
<td>auto</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>True</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>79440.6025</td>
</tr>
<tr>
<th>2</th>
<td>auto</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>79477.5653</td>
</tr>
<tr>
<th>3</th>
<td>auto</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>79710.8551</td>
</tr>
<tr>
<th>4</th>
<td>auto</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>80214.7197</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 12</strong>. Results obtained for the NYC Taxi trip duration dataset using the <code>TabMlp</code>.</p>
<p>The validation loss in this case is the <code>MSE</code>. The standard deviation (std hereafter) of the target variable in the validation set is $\sim$599. Given that the std is the <code>RMSE</code> we would obtain if we always predicted the mean, we can see that this is not a very powerful model; in other words, the task of predicting taxi trip duration is indeed relatively challenging.</p>
<p>Let's see how the other DL models perform.</p>
</div>
</div>
</div>
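<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>The claim that the std of the target equals the <code>RMSE</code> of always predicting its mean is easy to check numerically. The values below are synthetic stand-ins for the validation target (the std of 599 is the only number borrowed from the text):</p>

```python
import numpy as np

rng = np.random.default_rng(0)
# Mock validation target with std ~599, standing in for trip durations
y_val = rng.normal(loc=950.0, scale=599.0, size=10_000)

# Predicting the mean for every sample...
baseline_pred = np.full_like(y_val, y_val.mean())
# ...gives an RMSE exactly equal to the (population) std of the target
baseline_rmse = np.sqrt(np.mean((y_val - baseline_pred) ** 2))
```

<p>This follows directly from the definitions: $\text{RMSE} = \sqrt{\frac{1}{n}\sum_i (y_i - \bar{y})^2}$ is the std with <code>ddof=0</code>, so the std is the natural "no-skill" baseline to compare any model's <code>RMSE</code> against.</p>
</div>
</div>
</div>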
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.3.2-TabResnet">3.3.2 <code>TabResnet</code><a class="anchor-link" href="#3.3.2-TabResnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 13</strong>. Results obtained for the NYC Taxi trip duration dataset using the <code>TabResnet</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.3.3--Tabnet">3.3.3 <code>Tabnet</code><a class="anchor-link" href="#3.3.3--Tabnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">nyc_taxi_tabnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"nyc_taxi_tabnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">nyc_taxi_tabnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>n_steps</th>
<th>step_dim</th>
<th>attn_dim</th>
<th>ghost_bn</th>
<th>virtual_batch_size</th>
<th>momentum</th>
<th>gamma</th>
<th>dropout</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>lambda_sparse</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>5</td>
<td>8</td>
<td>8</td>
<td>False</td>
<td>128</td>
<td>0.75</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>144819.1190</td>
</tr>
<tr>
<th>1</th>
<td>5</td>
<td>8</td>
<td>8</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>146057.8078</td>
</tr>
<tr>
<th>2</th>
<td>5</td>
<td>8</td>
<td>8</td>
<td>False</td>
<td>128</td>
<td>0.50</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>146201.3771</td>
</tr>
<tr>
<th>3</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>146461.7343</td>
</tr>
<tr>
<th>4</th>
<td>5</td>
<td>8</td>
<td>8</td>
<td>False</td>
<td>128</td>
<td>0.25</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>148636.8888</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 14</strong>. Results obtained for the NYC Taxi trip duration dataset using the <code>Tabnet</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.3.4--TabTransformer">3.3.4 <code>TabTransformer</code><a class="anchor-link" href="#3.3.4--TabTransformer"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">nyc_taxi_tabtransformer</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"nyc_taxi_tabtransformer.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">nyc_taxi_tabtransformer</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>embed_dropout</th>
<th>full_embed_dropout</th>
<th>shared_embed</th>
<th>add_shared_embed</th>
<th>frac_shared_embed</th>
<th>input_dim</th>
<th>n_heads</th>
<th>n_blocks</th>
<th>dropout</th>
<th>ff_hidden_dim</th>
<th>transformer_activation</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>with_wide</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>180162.4087</td>
</tr>
<tr>
<th>1</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.01</td>
<td>256</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>186017.1888</td>
</tr>
<tr>
<th>2</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.01</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>196144.0674</td>
</tr>
<tr>
<th>3</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>32</td>
<td>8</td>
<td>4</td>
<td>0.4</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.01</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>357869.3703</td>
</tr>
<tr>
<th>4</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>64</td>
<td>16</td>
<td>4</td>
<td>0.4</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.01</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>357884.9043</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 15</strong>. Results obtained for the NYC Taxi trip duration dataset using the <code>TabTransformer</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.3.5-DL-vs-LightGBM">3.3.5 DL vs <code>LightGBM</code><a class="anchor-link" href="#3.3.5-DL-vs-LightGBM"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">lightgbm_vs_dl_nyc_taxy</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"lightgbm_vs_dl_nyc_taxi.csv"</span><span class="p">)</span>
<span class="n">lightgbm_vs_dl_nyc_taxy</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>model</th>
<th>rmse</th>
<th>r2</th>
<th>runtime</th>
<th>best_epoch_or_ntrees</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>lightgbm</td>
<td>262.7099</td>
<td>0.8044</td>
<td>42.7211</td>
<td>504.0</td>
</tr>
<tr>
<th>1</th>
<td>tabmlp</td>
<td>271.3422</td>
<td>0.7913</td>
<td>568.4309</td>
<td>24.0</td>
</tr>
<tr>
<th>2</th>
<td>tabresnet</td>
<td>292.8908</td>
<td>0.7569</td>
<td>471.2650</td>
<td>24.0</td>
</tr>
<tr>
<th>3</th>
<td>tabtransformer</td>
<td>336.5826</td>
<td>0.6789</td>
<td>5779.0314</td>
<td>54.0</td>
</tr>
<tr>
<th>4</th>
<td>tabnet</td>
<td>376.0530</td>
<td>0.5992</td>
<td>1844.4723</td>
<td>15.0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 16</strong>. Results obtained for the NYC Taxi trip duration dataset using four DL models and LightGBM.</p>
<p>The <code>TabTransformer</code> and <code>Tabnet</code> are, in this case, the models with the worst performance. As mentioned earlier, I will reflect on the potential reasons in the Conclusion section.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.4-Facebook-comments-volume">3.4 Facebook comments volume<a class="anchor-link" href="#3.4-Facebook-comments-volume"> </a></h3><p>This is the last of the four datasets we will be discussing in this post, a second regression problem.</p>
<h4 id="3.4.1-TabMlp">3.4.1 <code>TabMlp</code><a class="anchor-link" href="#3.4.1-TabMlp"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">fb_comments_tabmlp</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"fb_comments_tabmlp.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">fb_comments_tabmlp</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>True</td>
<td>0.0</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>32.5931</td>
</tr>
<tr>
<th>1</th>
<td>[100,50]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.3515</td>
</tr>
<tr>
<th>2</th>
<td>[200, 100]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.001</td>
<td>256</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.4140</td>
</tr>
<tr>
<th>3</th>
<td>[200, 100]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.1</td>
<td>0.001</td>
<td>256</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.5679</td>
</tr>
<tr>
<th>4</th>
<td>[200, 100]</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.001</td>
<td>512</td>
<td>0.0</td>
<td>RAdam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.6284</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 17</strong>. Results obtained for the Facebook comments volume dataset using <code>TabMlp</code>.</p>
<p>As in the case of the NYC Taxi trip duration dataset, the validation loss is the <code>MSE</code> loss. The <code>std</code> of the target variable in the Facebook comments volume dataset is ~13. Therefore, following the same reasoning, we can see that predicting the volume of Facebook comments with this particular dataset is a challenging task.</p>
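<p>A quick way to make this reasoning concrete: the standard deviation of the target is exactly the RMSE of a baseline that always predicts the mean, so any useful model must come in well below it. A minimal sketch (the numbers below are synthetic stand-ins, not the actual dataset):</p>

```python
import numpy as np

# Synthetic stand-in for the target column: the real dataset has std ~13
rng = np.random.default_rng(0)
y = rng.normal(loc=7.0, scale=13.0, size=100_000)

# A "model" that always predicts the mean has RMSE equal to the std of y
baseline_rmse = np.sqrt(np.mean((y - y.mean()) ** 2))

# A validation MSE of ~33 corresponds to an RMSE of ~5.7: clearly better
# than the ~13 baseline, but still large relative to the target's scale
model_rmse = np.sqrt(33.0)
print(baseline_rmse, model_rmse)
```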
<p>Let's see how the other DL models perform.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.4.2--TabResnet">3.4.2 <code>TabResnet</code><a class="anchor-link" href="#3.4.2--TabResnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">fb_comments_tabresnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"fb_comments_tabresnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">fb_comments_tabresnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>blocks_dims</th>
<th>blocks_dropout</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_dropout</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>[100, 100, 100]</td>
<td>0.1</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.03</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>34.4972</td>
</tr>
<tr>
<th>1</th>
<td>[100, 100, 100]</td>
<td>0.1</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.03</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>34.8520</td>
</tr>
<tr>
<th>2</th>
<td>[100, 100, 100]</td>
<td>0.1</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.03</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>34.9504</td>
</tr>
<tr>
<th>3</th>
<td>[100, 100, 100]</td>
<td>0.1</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>512</td>
<td>0.0</td>
<td>Adam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>35.1668</td>
</tr>
<tr>
<th>4</th>
<td>[100, 100, 100]</td>
<td>0.1</td>
<td>None</td>
<td>relu</td>
<td>0.1</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0</td>
<td>0.0005</td>
<td>512</td>
<td>0.0</td>
<td>AdamW</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>35.2503</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 18</strong>. Results obtained for the Facebook comments volume dataset using <code>TabResnet</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.4.3--Tabnet">3.4.3 <code>Tabnet</code><a class="anchor-link" href="#3.4.3--Tabnet"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">fb_comments_tabnet</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"fb_comments_tabnet.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">fb_comments_tabnet</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>n_steps</th>
<th>step_dim</th>
<th>attn_dim</th>
<th>ghost_bn</th>
<th>virtual_batch_size</th>
<th>momentum</th>
<th>gamma</th>
<th>dropout</th>
<th>embed_dropout</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>lambda_sparse</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>35.8122</td>
</tr>
<tr>
<th>1</th>
<td>3</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>37.6417</td>
</tr>
<tr>
<th>2</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>AdamW</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>38.9771</td>
</tr>
<tr>
<th>3</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.2</td>
<td>0.0</td>
<td>0.03</td>
<td>512</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>39.5899</td>
</tr>
<tr>
<th>4</th>
<td>5</td>
<td>16</td>
<td>16</td>
<td>False</td>
<td>128</td>
<td>0.98</td>
<td>1.5</td>
<td>0.0</td>
<td>0.0</td>
<td>0.03</td>
<td>256</td>
<td>0.0</td>
<td>0.0001</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.001</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5</td>
<td>40.9462</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 19</strong>. Results obtained for the Facebook comments volume dataset using <code>Tabnet</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.4.4-TabTransformer">3.4.4 <code>TabTransformer</code><a class="anchor-link" href="#3.4.4-TabTransformer"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">fb_comments_tabtransformer</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"fb_comments_tabtransformer.csv"</span><span class="p">)</span><span class="o">.</span><span class="n">iloc</span><span class="p">[:</span><span class="mi">5</span><span class="p">]</span>
<span class="n">fb_comments_tabtransformer</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>embed_dropout</th>
<th>full_embed_dropout</th>
<th>shared_embed</th>
<th>add_shared_embed</th>
<th>frac_shared_embed</th>
<th>input_dim</th>
<th>n_heads</th>
<th>n_blocks</th>
<th>dropout</th>
<th>ff_hidden_dim</th>
<th>transformer_activation</th>
<th>mlp_hidden_dims</th>
<th>mlp_activation</th>
<th>mlp_batchnorm</th>
<th>mlp_batchnorm_last</th>
<th>mlp_linear_first</th>
<th>with_wide</th>
<th>lr</th>
<th>batch_size</th>
<th>weight_decay</th>
<th>optimizer</th>
<th>lr_scheduler</th>
<th>base_lr</th>
<th>max_lr</th>
<th>div_factor</th>
<th>final_div_factor</th>
<th>n_cycles</th>
<th>val_loss_or_metric</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>2</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0005</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>CyclicLR</td>
<td>0.0005</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>10.0</td>
<td>33.0946</td>
</tr>
<tr>
<th>1</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>2</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0005</td>
<td>4096</td>
<td>0.0</td>
<td>AdamW</td>
<td>OneCycleLR</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>1000.0</td>
<td>5.0</td>
<td>33.1283</td>
</tr>
<tr>
<th>2</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>2</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0010</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.2175</td>
</tr>
<tr>
<th>3</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>2</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>same</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0010</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.4698</td>
</tr>
<tr>
<th>4</th>
<td>0.0</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>8</td>
<td>16</td>
<td>4</td>
<td>4</td>
<td>0.1</td>
<td>NaN</td>
<td>relu</td>
<td>None</td>
<td>relu</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>False</td>
<td>0.0010</td>
<td>1024</td>
<td>0.0</td>
<td>Adam</td>
<td>ReduceLROnPlateau</td>
<td>0.0010</td>
<td>0.01</td>
<td>25</td>
<td>10000.0</td>
<td>5.0</td>
<td>33.7950</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 20</strong>. Results obtained for the Facebook comments volume dataset using the <code>TabTransformer</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h4 id="3.4.5--DL-vs-LightGBM">3.4.5 DL vs <code>LightGBM</code><a class="anchor-link" href="#3.4.5--DL-vs-LightGBM"> </a></h4>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">lightgbm_vs_dl_fb_comments</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="n">TABLES_DIR</span> <span class="o">/</span> <span class="s2">"lightgbm_vs_dl_fb_comments.csv"</span><span class="p">)</span>
<span class="n">lightgbm_vs_dl_fb_comments</span><span class="o">.</span><span class="n">round</span><span class="p">(</span><span class="mi">4</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>model</th>
<th>rmse</th>
<th>r2</th>
<th>runtime</th>
<th>best_epoch_or_ntrees</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>lightgbm</td>
<td>5.5290</td>
<td>0.8232</td>
<td>6.5259</td>
<td>687.0</td>
</tr>
<tr>
<th>1</th>
<td>tabmlp</td>
<td>5.9085</td>
<td>0.7981</td>
<td>250.4768</td>
<td>43.0</td>
</tr>
<tr>
<th>2</th>
<td>tabtransformer</td>
<td>5.9256</td>
<td>0.7969</td>
<td>533.3908</td>
<td>27.0</td>
</tr>
<tr>
<th>3</th>
<td>tabresnet</td>
<td>6.2138</td>
<td>0.7767</td>
<td>70.4661</td>
<td>9.0</td>
</tr>
<tr>
<th>4</th>
<td>tabnet</td>
<td>6.4285</td>
<td>0.7610</td>
<td>935.0205</td>
<td>59.0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 21</strong>. Results obtained for the Facebook comments volume dataset using four DL models and LightGBM.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="4.-Summary">4. Summary<a class="anchor-link" href="#4.-Summary"> </a></h2><p>I have used four datasets and run over 1500 experiments (i.e. runs with a given parameter set-up) comparing four DL models with <code>LightGBM</code>. This is a summary of some of the results.</p>
<ul>
<li><p><code>LightGBM</code> wins, and there was never a fight</p>
<p>With one exception, <code>LightGBM</code> performs better than the DL models, and that one exception is precisely that: exceptional. To the experiments run and discussed here I can add two occasions where I used DL for tabular data at companies I worked with, in particular the model referred to here as <code>TabMlp</code>, with a <code>wide</code> component in one case and on its own in the other.</p>
<p>The Wide &amp; Deep model was used in the context of a recommendation algorithm, shortly after the popular <a href="https://arxiv.org/abs/1606.07792">Wide and Deep</a> [19] paper was published in 2016. Back then I was using XGBoost to predict a measure of interest and rank offers based on that measure. The Wide and Deep model, implemented at the time with <code>Keras</code>, obtained slightly better MAP and NDCG than XGBoost (almost identical, although slightly lower, metrics were obtained using just the deep component). Given the number of additional considerations one needs to take into account when going to production, we eventually stayed with XGBoost.</p>
<p>On the second occasion, a more recent project, <code>TabMlp</code> on its own obtained RMSE and R2 values very similar to, although still slightly worse than, those obtained with <code>LightGBM</code>. Even though <code>TabMlp</code>'s predictions were not directly used, we found the embeddings useful for a number of additional projects, and we built a production system around <code>TabMlp</code>.</p>
<p>Up to this point, I have focused on performance as measured by success metrics. However, when it comes to training (and prediction) time, the difference is so significant that it makes some of these algorithms, at this stage, useful only for research purposes and/or Kaggle competitions. Don't get me wrong, you only push an industry forward technologically by challenging current solutions and established concepts. I am simply stating that, at this stage, it would be hard to envision a robust production system built around some of these algorithms. This is why I wrote "<em>there was never a fight</em>". When you go live, it is often not only about success metrics, but also about speed and resilience. All things considered, it seems to me that DL models for tabular data are still a bit far from being routinely used in production systems (but read below).</p>
<p>Finally, you might read here and there that with the proper feature engineering, noise removal, balancing and "who-knows-what-else", DL models outperform GBMs. In my experience, the truth is actually the opposite: when one manages to engineer good, powerful features, GBMs perform even better relative to DL models. This is also consistent with the results of some recent competitions. For example, in the <a href="https://recsys-twitter.com/">RecSys Challenge 2020</a> the guys at NVIDIA won using <a href="https://medium.com/rapids-ai/winning-solution-of-recsys2020-challenge-gpu-accelerated-feature-engineering-and-training-for-cd67c5a87b1f">clever feature engineering</a> (e.g. target-oriented encoding) "plugged" into XGBoost on steroids (or rather, on GPUs). I am not sure that using those features with a DL model would actually improve their results.</p>
<p>Overall, combining the results in this post with what I have found trying DL models on real tabular datasets in industry, I can only conclude that DL models for tabular data "are not quite there yet" in terms of overall performance.</p>
</li>
</ul>
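<p>As a side note, the "target-oriented encoding" mentioned above can be sketched in a few lines of <code>pandas</code>. Note that this is a minimal illustration, not the winning pipeline: competition-grade implementations use out-of-fold and/or smoothed variants to limit target leakage.</p>

```python
import pandas as pd

# Minimal sketch of target (mean) encoding: replace each category with the
# mean of the target computed on the training split only
train = pd.DataFrame({"brand": ["a", "a", "b", "b", "c"],
                      "y": [1.0, 3.0, 2.0, 2.0, 5.0]})
test = pd.DataFrame({"brand": ["a", "b", "d"]})

means = train.groupby("brand")["y"].mean()
global_mean = train["y"].mean()

train["brand_te"] = train["brand"].map(means)
# Categories unseen at training time fall back to the global mean
test["brand_te"] = test["brand"].map(means).fillna(global_mean)
print(test["brand_te"].tolist())  # [2.0, 2.0, 2.6]
```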
<ul>
<li><p><code>TabNet</code> and the <code>TabTransformer</code></p>
<p>One rather surprising result was the poor performance of <code>Tabnet</code> and, perhaps to a lesser extent, the <code>TabTransformer</code>.</p>
<p>One possibility is that I have not found the set of parameters that leads to good metrics. In fact, the amount of overfitting when using <code>Tabnet</code> and the <code>TabTransformer</code> was very significant, higher than in the case of <code>TabResnet</code> and, even more so, <code>TabMlp</code>. This makes me believe that with a better set of regularization parameters, or simply a different embedding dimension per categorical feature, I might be able to improve the results shown in the tables above. However, I should also say that, given the good reception these algorithms are having and the poor results I obtained, I placed extra emphasis on trying additional parameter combinations. Unfortunately, none of my attempts led to a significant improvement.</p>
<p>A second possibility is, of course, that the implementation in <code>pytorch-widedeep</code> is wrong. I guess I will find out as I keep releasing versions and using the package.</p>
<p>Overall, I find <code>TabNet</code> to be the worst (and slowest) performer, and I will certainly devote some extra time in the coming weeks to see whether this is related to the input parameters.</p>
</li>
</ul>
<ul>
<li><p>Simplicity over complexity.</p>
<p>It is interesting to see that, overall, the DL algorithm that achieves performance closest to that of <code>LightGBM</code> is a simple MLP. As I write this, I wonder if this is somehow related to the emerging trend that is bringing MLPs back (e.g. [20], [21] or [22]), and whether the advent of more complex models is simply the result of hype rather than a proper exploration of existing solutions.</p>
<p>Of course, for more complex models there is more room for exploration and hyperparameter optimization. While this is something I intend to keep exploring, there is a moment in space and time when one wonders: "<em>is this really worth it?</em>".</p>
<p>Let's see if I manage to answer this question in the next section.</p>
</li>
</ul>
<h2 id="5-Conclusion">5 Conclusion<a class="anchor-link" href="#5-Conclusion"> </a></h2><p>When I started thinking about this post, a part of me already knew that DL models were, overall, not a real challenge for <code>LightGBM</code>. If we focused only on performance metrics and running time, the only possible conclusion would be that DL models for tabular data are still no competition for GBMs in real-world environments. However, at this stage in the industry/market, is that really <em>the question</em> to answer? I don't think so.</p>
<p>This is not a competition, and it should not be; this should be a coalition. The question to answer is: "how can DL models for tabular data help in the industry and complement the current systems?". Let's reflect a bit on this question.</p>
<p>In my experience, DL models for tabular data perform best on sizeable datasets that involve many categorical features, where these features themselves have many categories. In those scenarios, one could try DL models with the initial aim of using the predictions directly. However, even if the predictions are eventually not used, the embeddings contain a wealth of useful information: information on how the categorical features interact with each other, and on how each categorical feature relates to the target variable. These embeddings can be used for a number of additional products.</p>
<p>For example, let's assume that you have a dataset with metadata for thousands of brands and prices for their corresponding products, and your task is to predict how the price changes over time (i.e. price forecasting). The embeddings for the categorical feature <code>brand</code> will give you information about how a particular brand relates to the rest of the columns in the dataset and to the target (price). In other words, if, given a brand, you find the closest brands as defined by embedding proximity, you would "naturally" and directly be finding competitors within a given space (assuming that the dataset is representative of the market).</p>
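<p>The "closest brands" idea reduces to a nearest-neighbour search in the learned embedding space. A minimal sketch with a made-up embedding matrix (in practice the rows would be extracted from the trained model's embedding layer for <code>brand</code>; the names and values here are purely illustrative):</p>

```python
import numpy as np

# Hypothetical learned embeddings for the categorical feature `brand`:
# one row per brand, as would be extracted from a trained embedding layer
brands = ["acme", "globex", "initech", "umbrella"]
emb = np.array([[0.9, 0.1, 0.0],
                [0.8, 0.2, 0.1],
                [0.0, 0.9, 0.4],
                [0.1, 0.0, 0.9]])

def closest_brands(name, k=2):
    """Return the k nearest brands to `name` by cosine similarity."""
    normed = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sims = normed @ normed[brands.index(name)]
    order = np.argsort(-sims)
    return [brands[i] for i in order if brands[i] != name][:k]

print(closest_brands("acme"))  # the nearest brand is 'globex'
```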
<p>In addition, GBMs do not allow for transfer learning, but DL models do. Furthermore, as mentioned in the <code>TabNet</code> and <code>TabTransformer</code> papers, self-supervised pre-training leads to better performance in regimes where labeled data is scarce or the unlabeled dataset is much larger than the labeled one. Therefore, there are scenarios where DL models can be extremely useful.</p>
<p>For example, let's assume you have a large dataset for a given problem in one country but a much smaller dataset for the exact same problem in another country. Let's also assume that the datasets are, column-wise, rather similar. One could train a DL model on the large dataset and "transfer the learnings" to the second, much smaller dataset, with the hope of obtaining much better performance than using the small dataset alone.</p>
<p>There are some other scenarios I can think of, but I will leave it here. In general, I simply wanted to illustrate that, if you came here to enjoy the fact that GBMs perform better than DL models, I hope you enjoyed the ride (and that you start thinking about a good therapist), but in my opinion that is not the point.</p>
<p><strong>In terms of metrics, GBMs perform better than DL models, that is correct, but the latter bring some functionalities to the table that GBMs don't have and, therefore, complement them perfectly.</strong></p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="6.-Future-Work">6. Future Work<a class="anchor-link" href="#6.-Future-Work"> </a></h2><p>I started thinking about this post months ago. Then some other things took priority in my life (plus a lot of work) and it became a bit of a longer journey. I now hope I can get a bit of help from the very clever people in my team to improve the DL-vs-GBM benchmark code in the repo, perhaps automating some processes so I can easily add more datasets in the future.</p>
<p>This has also been a good test for the <code>pytorch-widedeep</code> library (if you like it, or find it useful, please give it a star 😊). All the links in this post point to the <code>tabnet</code> branch in the repo, which is the most up to date. Over the next few days I will merge it, release v1 of the package, and then update the links and the post. From there, there is a series of algorithms we would like to add (such as SAINT), as well as some different forms of training.</p>
<p>Beyond adding more algorithms to the library or improving the benchmark code, I wanted to close with one final thought. As I mentioned at the beginning of the post, there is an element of inconsistency between papers. Different papers find different results for all the algorithms considered, GBMs or DL-based. When reading them, one gets the feeling that there is some rush, some urgency to publish something that obtains SoTA. For someone like me, coming from a background other than computer science, this reminds me, in a sense, of my days as an astronomer. Back then I found that most of the publications in my field were not very good, but since all you are judged on are publications and citations, one would publish anything, and the faster, the better.</p>
<p>At this stage, leaving publications and citations aside, I think there is an opportunity for some of us, and some companies as well, to collaborate and properly benchmark DL algorithms for tabular data. I believe the potential of these algorithms in the industry is enormous and with proper benchmarks we could learn not only where they perform better, but how to use them more efficiently.</p>
<p>And that's it! If you made it this far, I hope you enjoyed it and/or found it useful.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="References">References<a class="anchor-link" href="#References"> </a></h2><p>[1] Tabular Data: Deep Learning is Not All You Need. Ravid Shwartz-Ziv, Amitai Armon, 2021, <a href="https://arxiv.org/abs/2106.03253">arXiv:2106.03253</a></p>
<p>[2] XGBoost: A Scalable Tree Boosting System. Tianqi Chen, Carlos Guestrin 2016, <a href="https://arxiv.org/abs/1603.02754">arXiv:1603.02754</a></p>
<p>[3] CatBoost: unbiased boosting with categorical features. Liudmila Prokhorenkova, Gleb Gusev, Aleksandr Vorobev, Anna Veronika Dorogush, Andrey Gulin, <a href="https://arxiv.org/abs/1706.09516">arXiv:1706.09516</a></p>
<p>[4] LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Guolin Ke, Qi Meng, Thomas Finley, Taifeng Wang, 2017, <a href="https://papers.nips.cc/paper/2017/file/6449f44a102fde848669bdd9eb6b76fa-Paper.pdf">31st Conference on Neural Information Processing Systems</a></p>
<p>[5] SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training. Gowthami Somepalli, Micah Goldblum, Avi Schwarzschild, C. Bayan Bruss, Tom Goldstein, 2021, <a href="https://arxiv.org/abs/2106.01342">arXiv:2106.01342</a></p>
<p>[6] Comment Volume Prediction using Neural Networks and Decision Trees, Kamaljot Singh, Ranjeet Kaur, 2015 17th UKSIM-AMSS International Conference on Modelling and Simulation.</p>
<p>[7] TabNet: Attentive Interpretable Tabular Learning, Sercan O. Arik, Tomas Pfister, <a href="https://arxiv.org/abs/1908.07442">arXiv:1908.07442v5</a></p>
<p>[8] Train longer, generalize better: closing the generalization gap in large batch training of neural networks.
Elad Hoffer, Itay Hubara and Daniel Soudry, 2017, <a href="https://arxiv.org/abs/1705.08741">arXiv:1705.08741</a></p>
<p>[9] TabTransformer: Tabular Data Modeling Using Contextual Embeddings. Xin Huang, Ashish Khetan, Milan Cvitkovic, Zohar Karnin, 2020. <a href="https://arxiv.org/abs/2012.06678">arXiv:2012.06678v1</a></p>
<p>[10] Attention Is All You Need, Ashish Vaswani, Noam Shazeer, Niki Parmar, et al., 2017. <a href="https://arxiv.org/abs/1706.03762">arXiv:1706.03762v5</a></p>
<p>[11] Adam: A Method for Stochastic Optimization, Diederik P. Kingma, Jimmy Ba, 2014, <a href="https://arxiv.org/abs/1412.6980">arXiv:1412.6980</a></p>
<p>[12] Decoupled Weight Decay Regularization, Ilya Loshchilov, Frank Hutter, 2017.<a href="https://arxiv.org/abs/1711.05101">arXiv:1711.05101</a></p>
<p>[13] On the Variance of the Adaptive Learning Rate and Beyond, Liyuan Liu, Haoming Jiang, Pengcheng He, Weizhu Chen, Xiaodong Liu, Jianfeng Gao, Jiawei Han, 2019, <a href="https://arxiv.org/abs/1908.03265">arXiv:1908.03265</a></p>
<p>[14] Cyclical Learning Rates for Training Neural Networks, Leslie N. Smith, 2017, <a href="https://arxiv.org/abs/1506.01186">arXiv:1506.01186</a></p>
<p>[15] Super-Convergence: Very Fast Training of Neural Networks Using Large Learning Rates, Leslie N. Smith, Nicholay Topin, 2017, <a href="https://arxiv.org/abs/1708.07120">arXiv:1708.07120</a></p>
<p>[16] Optuna: A Next-generation Hyperparameter Optimization Framework. Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, Masanori Koyama, 2019, <a href="https://arxiv.org/abs/1907.10902">arXiv:1907.10902</a></p>
<p>[17] Algorithms for Hyper-Parameter Optimization, James Bergstra, Rémi Bardenet, Yoshua Bengio, Balázs Kégl, 2011, <a href="https://papers.nips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf">25th Conference on Neural Information Processing Systems</a></p>
<p>[18] Focal Loss for Dense Object Detection, Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, Piotr Dollár, 2017, <a href="https://arxiv.org/abs/1708.02002">arXiv:1708.02002</a></p>
<p>[19] Wide & Deep Learning for Recommender Systems, Heng-Tze Cheng, Levent Koc, Jeremiah Harmsen, et al., 2016, <a href="https://arxiv.org/abs/1606.07792">arXiv:1606.07792</a></p>
<p>[20] FNet: Mixing Tokens with Fourier Transforms, James Lee-Thorp, Joshua Ainslie, Ilya Eckstein, Santiago Ontanon, 2021, <a href="https://arxiv.org/abs/2105.03824">arXiv:2105.03824</a></p>
<p>[21] Pay Attention to MLPs, Hanxiao Liu, Zihang Dai, David R. So, Quoc V. Le, 2021, <a href="https://arxiv.org/abs/2105.08050">arXiv:2105.08050</a></p>
<p>[22] ResMLP: Feedforward networks for image classification with data-efficient training, Hugo Touvron, Piotr Bojanowski, Mathilde Caron, et al., 2021, <a href="https://arxiv.org/abs/2105.03404">arXiv:2105.03404</a></p>
</div>
</div>
</div>
</div>Javier Rodriguezpytorch-widedeep, deep learning for tabular data III: the deeptabular component2021-02-18T00:00:00-06:002021-02-18T00:00:00-06:00https://jrzaurin.github.io/infinitoml/2021/02/18/pytorch-widedeep_iii<!--
#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: _notebooks/2021-02-18-pytorch-widedeep_iii.ipynb
-->
<div class="container" id="notebook-container">
<div class="cell border-box-sizing code_cell rendered">
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>This is the third of a <a href="https://jrzaurin.github.io/infinitoml/">series</a> of posts introducing <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>, a flexible package to combine tabular data with text and images (that could also be used for "standard" tabular data alone).</p>
<p>While writing this post I will assume that the reader is not familiar with the previous two <a href="https://jrzaurin.github.io/infinitoml/">posts</a>. Reading them would of course help, but it is not a requirement to understand the content of this post or to use <code>pytorch-widedeep</code> on tabular data.</p>
<p>To start with, as always, just install the package:</p>
<div class="highlight"><pre><span></span><span class="n">pip</span> <span class="n">install</span> <span class="n">pytorch</span><span class="o">-</span><span class="n">widedeep</span>
</pre></div>
<p>This will install <code>v0.4.8</code>, hopefully the last beta version*. Code-wise I think this could be already <code>v1</code>, but before that I want to try it in a few more datasets and select good default values. In addition, I also intend to implement other algorithms, in particular <a href="https://arxiv.org/abs/1908.07442">TabNet</a> [1], for which a very nice <a href="https://github.com/dreamquark-ai/tabnet">implementation</a> already exists.</p>
<p>Moving on, and as I mentioned earlier, <code>pytorch-widedeep</code>'s main goal is to facilitate the combination of images and text with tabular data via wide and deep models. To that aim, <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html">wide and deep models</a> can be built with up to four model components: <code>wide</code>, <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code>, that will take care of the different types of input datasets ("standard" tabular, i.e. numerical and categorical features, text and images). This post focuses only on the so-called <code>deeptabular</code> component, and the 3 different models available in this library that can be used to build that component. Nonetheless, and for completion, I will briefly describe the remaining components first.</p>
<p>The <code>wide</code> component of a wide and deep model is simply a linear model, and in <code>pytorch-widedeep</code> such a model can be created via the <code>Wide</code> class. In the case of the <code>deeptext</code> component, <code>pytorch-widedeep</code> offers one model, available via the <code>DeepText</code> class. <code>DeepText</code> builds a simple stack of LSTMs, i.e. a standard DL text classifier or regressor, with flexibility regarding the use of pre-trained word embeddings, of a Fully Connected Head (FC-Head), etc. For the <code>deepimage</code> component, <code>pytorch-widedeep</code> includes two alternatives: a pre-trained Resnet model or a "standard" stack of CNNs to be trained from scratch. The two are available via the <code>DeepImage</code> class which, as in the case of <code>DeepText</code>, offers some flexibility when building the architecture.</p>
<p>To clarify the use of the term "<em>model</em>" and Wide and Deep "<em>model component</em>" (in case there is some confusion), let's have a look to the following code:</p>
<div class="highlight"><pre><span></span><span class="n">wide_model</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="o">...</span><span class="p">)</span>
<span class="n">text_model</span> <span class="o">=</span> <span class="n">DeepText</span><span class="p">(</span><span class="o">...</span><span class="p">)</span>
<span class="n">image_model</span> <span class="o">=</span> <span class="n">DeepImage</span><span class="p">(</span><span class="o">...</span><span class="p">)</span>
<span class="c1"># we use the previous models as the wide and deep model components</span>
<span class="n">wdmodel</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide_model</span><span class="p">,</span> <span class="n">deeptext</span><span class="o">=</span><span class="n">text_model</span><span class="p">,</span> <span class="n">deepimage</span><span class="o">=</span><span class="n">image_model</span><span class="p">)</span>
<span class="o">...</span>
</pre></div>
<p>Simply put, a wide and deep model has model components that are (of course) models themselves. Note that <strong>any</strong> of the four wide and deep model components can be a custom model provided by the user. In fact, while I recommend using the models available in <code>pytorch-widedeep</code> for the <code>wide</code> and <code>deeptabular</code> model components, it is very likely that users will want to use their own models for the <code>deeptext</code> and <code>deepimage</code> components. That is perfectly possible as long as the custom models have an attribute called <code>output_dim</code> with the size of the last layer of activations, so that <code>WideDeep</code> can be constructed (see this <a href="https://github.com/jrzaurin/pytorch-widedeep">example notebook</a> in the repo). In addition, any of the four components can be used in isolation. For example, you might want to use just a <code>wide</code> component, which is simply a linear model. To that aim, simply:</p>
<div class="highlight"><pre><span></span><span class="n">wide_model</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="o">...</span><span class="p">)</span>
<span class="c1"># this would not be a wide and deep model but just wide</span>
<span class="n">wdmodel</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide_model</span><span class="p">)</span>
<span class="o">...</span>
</pre></div>
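<p>As a concrete illustration of the <code>output_dim</code> contract mentioned above, here is a minimal, schematic sketch. The class name and its constructor arguments are hypothetical; a real custom component would be a <code>torch.nn.Module</code> with an actual forward pass.</p>

```python
# Schematic sketch (hypothetical names) of the contract a custom component
# must satisfy so that WideDeep can build the final output layer: expose an
# `output_dim` attribute with the size of the last layer of activations.
class MyDeepText:
    def __init__(self, vocab_size: int, embed_dim: int, hidden_dim: int):
        self.vocab_size = vocab_size
        self.embed_dim = embed_dim
        # WideDeep reads this attribute when constructing the output layer
        self.output_dim = hidden_dim

    def forward(self, X):
        # embedding lookup, LSTM stack, etc. would go here
        raise NotImplementedError


text_model = MyDeepText(vocab_size=30000, embed_dim=100, hidden_dim=64)
print(text_model.output_dim)  # 64
```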
<p>If you want to learn more about the different model components and the models available in <code>pytorch-widedeep</code>, please have a look at the <a href="https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples">Examples</a> folder in the repo, the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html">documentation</a> or the <a href="https://jrzaurin.github.io/infinitoml/">companion posts</a>. Let's now take a deep dive into the models available for the <code>deeptabular</code> component.</p>
<p>$^*$ <em>check the repo or this <a href="https://jrzaurin.github.io/infinitoml/2020/12/06/pytorch-widedeep.html">post</a> for a caveat in the installation if you are using Mac, python 3.8 or Pytorch 1.7+. <strong>Note that this is not directly related with the package</strong>, but the interplay between Mac and OpenMP, and the new defaults of the <code>multiprocessing</code> library for Mac).</em></p>
<h2 id="1.-The-deeptabular-component">1. The <code>deeptabular</code> component<a class="anchor-link" href="#1.-The-deeptabular-component"> </a></h2><p>As I was developing the package I realised that perhaps one of the most interesting offerings in <code>pytorch-widedeep</code> is related to the models available for the <code>deeptabular</code> component. Remember that each component can be used in isolation. Building a <code>WideDeep</code> model comprising only a <code>deeptabular</code> component would be what is normally referred to as DL for tabular data. Of course, such a model is not a wide and deep model; it is "just" deep.</p>
<p>Currently, <code>pytorch-widedeep</code> offers three models that can be used as the <code>deeptabular</code> component. In order of complexity, these are:</p>
<ul>
<li><p><code>TabMlp</code>: this is very similar to the <a href="https://docs.fast.ai/tutorial.tabular.html">tabular model</a> in the fantastic <a href="https://docs.fast.ai/">fastai</a> library, and consists simply of embeddings representing the categorical features, concatenated with the continuous features, and then passed through an MLP.</p>
</li>
<li><p><code>TabResnet</code>: this is similar to the previous model, but the embeddings are passed through a series of ResNet blocks built with dense layers.</p>
</li>
<li><p><code>TabTransformer</code>: Details on the TabTransformer can be found in: <a href="https://arxiv.org/pdf/2012.06678.pdf">TabTransformer: Tabular Data Modeling Using Contextual Embeddings</a>. Again, this is similar to the models before but the embeddings are passed through a series of Transformer encoder blocks.</p>
</li>
</ul>
<p>A lot has been (and is being) written about the use of DL for tabular data, and certainly each of these models would deserve a post of its own. Here, I will describe them in some detail and illustrate their use within <code>pytorch-widedeep</code>. A proper benchmark exercise will be carried out in the not-so-distant future.</p>
<h3 id="1.1-TabMlp">1.1 <code>TabMlp</code><a class="anchor-link" href="#1.1-TabMlp"> </a></h3><p>The following figure illustrates the <code>TabMlp</code> model architecture.</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/tabmlp_arch.png" alt="" style="max-width: 400px" />
</figure>
</p>
<p><strong>Fig 1</strong>. The <code>TabMlp</code>: this is the simplest architecture and is very similar to the tabular model available in the fantastic fastai library. In fact, the implementation of the dense layers of the MLP is mostly identical to fastai's.</p>
<p>The dashed-border boxes indicate that these components are optional. For example, we could use <code>TabMlp</code> without categorical components, or without continuous components, if we wanted.</p>
<p>Let's have a look and see how this model is used with the well known <a href="http://archive.ics.uci.edu/ml/datasets/Adult">adult census dataset</a>. I assume you have downloaded the data and placed it at <code>data/adult/adult.csv.zip</code>:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">from</span> <span class="nn">sklearn.model_selection</span> <span class="kn">import</span> <span class="n">train_test_split</span>
<span class="n">adult</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s2">"data/adult/adult.csv.zip"</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">columns</span> <span class="o">=</span> <span class="p">[</span><span class="n">c</span><span class="o">.</span><span class="n">replace</span><span class="p">(</span><span class="s2">"-"</span><span class="p">,</span> <span class="s2">"_"</span><span class="p">)</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">]</span>
<span class="n">adult</span><span class="p">[</span><span class="s2">"income_label"</span><span class="p">]</span> <span class="o">=</span> <span class="p">(</span><span class="n">adult</span><span class="p">[</span><span class="s2">"income"</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">">50K"</span> <span class="ow">in</span> <span class="n">x</span><span class="p">))</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="nb">int</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">drop</span><span class="p">(</span><span class="s2">"income"</span><span class="p">,</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">inplace</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">:</span>
<span class="k">if</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">dtype</span> <span class="o">==</span> <span class="s1">'O'</span><span class="p">:</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">"unknown"</span> <span class="k">if</span> <span class="n">x</span> <span class="o">==</span> <span class="s2">"?"</span> <span class="k">else</span> <span class="n">x</span><span class="p">)</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">str</span><span class="o">.</span><span class="n">lower</span><span class="p">()</span>
<span class="n">adult_train</span><span class="p">,</span> <span class="n">adult_test</span> <span class="o">=</span> <span class="n">train_test_split</span><span class="p">(</span><span class="n">adult</span><span class="p">,</span> <span class="n">test_size</span><span class="o">=</span><span class="mf">0.2</span><span class="p">,</span> <span class="n">stratify</span><span class="o">=</span><span class="n">adult</span><span class="o">.</span><span class="n">income_label</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">adult</span><span class="o">.</span><span class="n">head</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>age</th>
<th>workclass</th>
<th>fnlwgt</th>
<th>education</th>
<th>educational_num</th>
<th>marital_status</th>
<th>occupation</th>
<th>relationship</th>
<th>race</th>
<th>gender</th>
<th>capital_gain</th>
<th>capital_loss</th>
<th>hours_per_week</th>
<th>native_country</th>
<th>income_label</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>25</td>
<td>private</td>
<td>226802</td>
<td>11th</td>
<td>7</td>
<td>never-married</td>
<td>machine-op-inspct</td>
<td>own-child</td>
<td>black</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>1</th>
<td>38</td>
<td>private</td>
<td>89814</td>
<td>hs-grad</td>
<td>9</td>
<td>married-civ-spouse</td>
<td>farming-fishing</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>50</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>2</th>
<td>28</td>
<td>local-gov</td>
<td>336951</td>
<td>assoc-acdm</td>
<td>12</td>
<td>married-civ-spouse</td>
<td>protective-serv</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>3</th>
<td>44</td>
<td>private</td>
<td>160323</td>
<td>some-college</td>
<td>10</td>
<td>married-civ-spouse</td>
<td>machine-op-inspct</td>
<td>husband</td>
<td>black</td>
<td>male</td>
<td>7688</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>4</th>
<td>18</td>
<td>unknown</td>
<td>103497</td>
<td>some-college</td>
<td>10</td>
<td>never-married</td>
<td>unknown</td>
<td>own-child</td>
<td>white</td>
<td>female</td>
<td>0</td>
<td>0</td>
<td>30</td>
<td>united-states</td>
<td>0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># define the embedding and continuous columns, and target</span>
<span class="n">embed_cols</span> <span class="o">=</span> <span class="p">[</span>
<span class="p">(</span><span class="s1">'workclass'</span><span class="p">,</span> <span class="mi">6</span><span class="p">),</span>
<span class="p">(</span><span class="s1">'education'</span><span class="p">,</span> <span class="mi">8</span><span class="p">),</span>
<span class="p">(</span><span class="s1">'marital_status'</span><span class="p">,</span> <span class="mi">6</span><span class="p">),</span>
<span class="p">(</span><span class="s1">'occupation'</span><span class="p">,</span><span class="mi">8</span><span class="p">),</span>
<span class="p">(</span><span class="s1">'relationship'</span><span class="p">,</span> <span class="mi">6</span><span class="p">),</span>
<span class="p">(</span><span class="s1">'race'</span><span class="p">,</span> <span class="mi">6</span><span class="p">)]</span>
<span class="n">cont_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"age"</span><span class="p">,</span> <span class="s2">"hours_per_week"</span><span class="p">,</span> <span class="s2">"fnlwgt"</span><span class="p">,</span> <span class="s2">"educational_num"</span><span class="p">]</span>
<span class="n">target</span> <span class="o">=</span> <span class="n">adult_train</span><span class="p">[</span><span class="s2">"income_label"</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># prepare deeptabular component</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">TabPreprocessor</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult_train</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's pause for a second, since the code up to here is common to all models, with some minor adaptations for the <code>TabTransformer</code>. So far, we have simply defined the columns that will be represented by embeddings and the numerical (aka continuous) columns. Once they are defined, the dataset is prepared with the <code>TabPreprocessor</code>. Internally, the preprocessor label-encodes the "embedding columns" and standardizes the numerical columns. Note that one could choose not to standardize the numerical columns and instead use a <code>BatchNorm1d</code> layer when building the model. That is also a valid approach. Alternatively, one could use both, as I will.</p>
<p>At this stage the data is prepared and we are ready to build the model.</p>
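<p>To make those two transformations explicit, here is a minimal, dependency-free sketch of what the <code>TabPreprocessor</code> does internally. The helper names are mine, and the real preprocessor also handles details such as unseen categories and padding indices:</p>

```python
# Simplified versions of the two steps: label-encode the "embedding" columns
# and standardize (z-score) the continuous columns.
def label_encode(values):
    # map each category to an integer index, in order of first appearance
    mapping = {}
    for v in values:
        mapping.setdefault(v, len(mapping))
    return [mapping[v] for v in values], mapping

def standardize(values):
    # subtract the mean and divide by the (population) standard deviation
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

encoded, mapping = label_encode(["private", "private", "local-gov", "private"])
print(encoded)  # [0, 0, 1, 0]
scaled_age = standardize([25.0, 38.0, 28.0, 44.0])  # mean ~0, std ~1
```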
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabMlp</span><span class="p">,</span> <span class="n">WideDeep</span>
<span class="n">tabmlp</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">200</span><span class="p">,</span> <span class="mi">100</span><span class="p">],</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">,</span>
<span class="n">batchnorm_cont</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/Users/javier/.pyenv/versions/3.7.9/envs/wdposts/lib/python3.7/site-packages/ipykernel/ipkernel.py:283: DeprecationWarning: `should_run_async` will not call `transform_cell` automatically in the future. Please pass the result to `transformed_cell` argument and any exception that happen during thetransform in `preprocessing_exc_tuple` in IPython 7.17 and above.
and should_run_async(code)
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's have a look at the model we just built and how it relates to Fig 1.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tabmlp</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>TabMlp(
(embed_layers): ModuleDict(
(emb_layer_education): Embedding(17, 8, padding_idx=0)
(emb_layer_marital_status): Embedding(8, 6, padding_idx=0)
(emb_layer_occupation): Embedding(16, 8, padding_idx=0)
(emb_layer_race): Embedding(6, 6, padding_idx=0)
(emb_layer_relationship): Embedding(7, 6, padding_idx=0)
(emb_layer_workclass): Embedding(10, 6, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(norm): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=44, out_features=200, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=200, out_features=100, bias=True)
(2): ReLU(inplace=True)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we can see, we have a series of columns that are represented as embeddings. The embeddings from all these columns are concatenated to form a tensor of dim <code>(bsz, 40)</code>, where <code>bsz</code> is the batch size. Then the "<em>batchnormed</em>" continuous columns are concatenated as well, resulting in a tensor of dim <code>(bsz, 44)</code> that is passed to the 2-layer MLP <code>(200 -> 100)</code>. In summary: <code>Embeddings</code> + continuous + MLP.</p>
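<p>The arithmetic behind these dimensions can be checked directly from the embedding sizes defined earlier in <code>embed_cols</code>:</p>

```python
# Embedding dimensions as defined in `embed_cols` earlier in the post
embed_dims = {
    "workclass": 6,
    "education": 8,
    "marital_status": 6,
    "occupation": 8,
    "relationship": 6,
    "race": 6,
}
n_continuous = 4  # age, hours_per_week, fnlwgt, educational_num

concat_embed_dim = sum(embed_dims.values())      # concatenated embeddings
mlp_input_dim = concat_embed_dim + n_continuous  # input to the first MLP layer
print(concat_embed_dim, mlp_input_dim)  # 40 44
```

This matches the <code>in_features=44</code> of <code>dense_layer_0</code> in the printed model.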
<p>One important thing to mention, common to all models, is that <code>pytorch-widedeep</code> models do not build the last connection, i.e. the connection with the output neuron or neurons, depending on whether this is a regression, binary or multi-class classification problem. That connection is built by the <code>WideDeep</code> constructor class. This means that even if we wanted to use a single-component model, the model still needs to be built with the <code>WideDeep</code> class.</p>
<p>This is because the library is, a priori, intended to build <code>WideDeep</code> models (and hence its name). Once the model is built it is passed to the <code>Trainer</code> (as we will see now). The <code>Trainer</code> class is coded to receive a parent model of class <code>WideDeep</code> with children that are the model components. This is very convenient for a number of aspects in the library.</p>
<p>Effectively this simply requires one extra line of code.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">deeptabular</span><span class="o">=</span><span class="n">tabmlp</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(deeptabular): Sequential(
(0): TabMlp(
(embed_layers): ModuleDict(
(emb_layer_education): Embedding(17, 8, padding_idx=0)
(emb_layer_marital_status): Embedding(8, 6, padding_idx=0)
(emb_layer_occupation): Embedding(16, 8, padding_idx=0)
(emb_layer_race): Embedding(6, 6, padding_idx=0)
(emb_layer_relationship): Embedding(7, 6, padding_idx=0)
(emb_layer_workclass): Embedding(10, 6, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(norm): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=44, out_features=200, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=200, out_features=100, bias=True)
(2): ReLU(inplace=True)
)
)
)
)
(1): Linear(in_features=100, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we can see, our <code>model</code> has the final connection now and is a model of class <code>WideDeep</code> formed by one single component, <code>deeptabular</code>, which is a model of class <code>TabMlp</code> formed mainly by the <code>embed_layers</code> and an MLP very creatively called <code>tab_mlp</code>.</p>
<p>We are now ready to train it. The code below simply runs with defaults; one could use any <code>torch</code> optimizer, learning rate scheduler, etc. Just have a look at the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/trainer.html">docs</a> or the <a href="https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples">Examples</a> folder in the repo.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep</span> <span class="kn">import</span> <span class="n">Trainer</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.metrics</span> <span class="kn">import</span> <span class="n">Accuracy</span>
<span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[(</span><span class="n">Accuracy</span><span class="p">)])</span>
<span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">5</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 123/123 [00:02<00:00, 59.30it/s, loss=0.4, metrics={'acc': 0.8073}]
valid: 100%|██████████| 31/31 [00:00<00:00, 111.33it/s, loss=0.392, metrics={'acc': 0.807}]
epoch 2: 100%|██████████| 123/123 [00:02<00:00, 61.05it/s, loss=0.363, metrics={'acc': 0.827}]
valid: 100%|██████████| 31/31 [00:00<00:00, 122.68it/s, loss=0.376, metrics={'acc': 0.8253}]
epoch 3: 100%|██████████| 123/123 [00:01<00:00, 71.14it/s, loss=0.359, metrics={'acc': 0.8283}]
valid: 100%|██████████| 31/31 [00:00<00:00, 120.26it/s, loss=0.368, metrics={'acc': 0.8281}]
epoch 4: 100%|██████████| 123/123 [00:01<00:00, 73.66it/s, loss=0.354, metrics={'acc': 0.8321}]
valid: 100%|██████████| 31/31 [00:00<00:00, 122.50it/s, loss=0.361, metrics={'acc': 0.832}]
epoch 5: 100%|██████████| 123/123 [00:01<00:00, 73.94it/s, loss=0.353, metrics={'acc': 0.8329}]
valid: 100%|██████████| 31/31 [00:00<00:00, 119.44it/s, loss=0.359, metrics={'acc': 0.833}]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Once we understand what <code>TabMlp</code> does, <code>TabResnet</code> should be pretty straightforward.</p>
<h3 id="1.2-TabResnet">1.2 <code>TabResnet</code><a class="anchor-link" href="#1.2-TabResnet"> </a></h3><p>The following figure illustrates the <code>TabResnet</code> model architecture.</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/tabresnet_arch.png" alt="" style="max-width: 400px" />
</figure>
</p>
<p><strong>Fig 2</strong>. The <code>TabResnet</code>: this model is similar to the <code>TabMlp</code>, but the embeddings (or the concatenation of embeddings and continuous features, normalised or not) are passed through a series of Resnet blocks built with dense layers. The dashed-border boxes indicate that the component is optional and the dashed lines indicate the different paths or connections present depending on which components we decide to include.</p>
<p>This is probably the most flexible of the three models discussed in this post, in the sense that many variants can be defined via its parameters. For example, we could choose to concatenate the continuous features, normalised or not via a <code>BatchNorm1d</code> layer, with the embeddings, and then pass the result of such a concatenation through the series of Resnet blocks. Alternatively, we might prefer to concatenate the continuous features with the result of passing the embeddings through the Resnet blocks. Another optional component is the MLP before the output neuron(s). If no MLP is present, the output of the Resnet blocks, or the result of concatenating that output with the continuous features (normalised or not), will be connected directly to the output neuron(s).</p>
<p>Each Resnet block comprises the following operations:</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/resnet_block.png" alt="" style="max-width: 400px" />
</figure>
</p>
<p><strong>Fig 3</strong>. "Dense" Resnet Block. <code>b</code> is the batch size and <code>d</code> the dimension of the embeddings.</p>
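<p>To make the skip connection in Fig 3 explicit, here is a minimal plain-Python sketch of the block's logic. The names and stand-in functions are illustrative, not the library's internals: <code>dense_path</code> stands for the Lin/BN/ReLU/Dropout stack, and the resize branch is only needed when input and output dims differ.</p>

```python
def basic_block(x, dense_path, resize=None):
    # out = dense_path(x) + identity; the identity is resized only when the
    # input and output dimensions differ (the "resize" branch of the block)
    identity = x if resize is None else resize(x)
    return [a + b for a, b in zip(dense_path(x), identity)]

# toy run: a block that maps dim 4 -> dim 2, so a resize branch is required
x = [1.0, 2.0, 3.0, 4.0]
dense_path = lambda v: [sum(v), sum(v)]        # stand-in for Lin -> BN -> ReLU -> Dp -> Lin -> BN
resize = lambda v: [v[0] + v[1], v[2] + v[3]]  # stand-in for the Lin -> BN resize branch
out = basic_block(x, dense_path, resize)       # [13.0, 17.0]
```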
<p>Let's build a <code>TabResnet</code> model:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabResnet</span>
<span class="n">tabresnet</span> <span class="o">=</span> <span class="n">TabResnet</span><span class="p">(</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">,</span>
<span class="n">batchnorm_cont</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">blocks_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">200</span><span class="p">,</span> <span class="mi">100</span><span class="p">,</span> <span class="mi">100</span><span class="p">],</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">100</span><span class="p">,</span> <span class="mi">50</span><span class="p">],</span>
<span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">deeptabular</span><span class="o">=</span><span class="n">tabresnet</span><span class="p">)</span>
<span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/Users/javier/.pyenv/versions/3.7.9/envs/wdposts/lib/python3.7/site-packages/ipykernel/ipkernel.py:283: DeprecationWarning: `should_run_async` will not call `transform_cell` automatically in the future. Please pass the result to `transformed_cell` argument and any exception that happen during thetransform in `preprocessing_exc_tuple` in IPython 7.17 and above.
and should_run_async(code)
</pre>
</div>
</div>
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(deeptabular): Sequential(
(0): TabResnet(
(embed_layers): ModuleDict(
(emb_layer_education): Embedding(17, 8, padding_idx=0)
(emb_layer_marital_status): Embedding(8, 6, padding_idx=0)
(emb_layer_occupation): Embedding(16, 8, padding_idx=0)
(emb_layer_race): Embedding(6, 6, padding_idx=0)
(emb_layer_relationship): Embedding(7, 6, padding_idx=0)
(emb_layer_workclass): Embedding(10, 6, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(norm): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(tab_resnet): DenseResnet(
(dense_resnet): Sequential(
(lin1): Linear(in_features=44, out_features=200, bias=True)
(bn1): BatchNorm1d(200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(block_0): BasicBlock(
(lin1): Linear(in_features=200, out_features=100, bias=True)
(bn1): BatchNorm1d(100, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=100, out_features=100, bias=True)
(bn2): BatchNorm1d(100, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=200, out_features=100, bias=True)
(1): BatchNorm1d(100, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(block_1): BasicBlock(
(lin1): Linear(in_features=100, out_features=100, bias=True)
(bn1): BatchNorm1d(100, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=100, out_features=100, bias=True)
(bn2): BatchNorm1d(100, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
)
(tab_resnet_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=100, out_features=100, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=100, out_features=50, bias=True)
(2): ReLU(inplace=True)
)
)
)
)
(1): Linear(in_features=50, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we did previously with the <code>TabMlp</code>, let's "walk through" the model. In this case, <code>model</code> is an instance of a <code>WideDeep</code> object formed by a single component, <code>deeptabular</code>, which is a <code>TabResnet</code> model. <code>TabResnet</code> is formed by a series of <code>Embedding</code> layers (e.g. <code>emb_layer_education</code>), a series of so-called dense Resnet blocks (<code>tab_resnet</code>) and an MLP (<code>tab_resnet_mlp</code>). The embeddings are concatenated together and then further concatenated with the normalised continuous columns. The resulting tensor, of dim <code>(bsz, 44)</code>, is then passed through two dense Resnet blocks. The output of one Resnet block is the input of the next. Therefore, when setting <code>blocks_dims = [200, 100, 100]</code> we are generating two blocks with input/output 200/100 and 100/100 respectively. The output of the second Resnet block, of dim <code>(bsz, 100)</code>, is passed through <code>tab_resnet_mlp</code>, the 2-layer MLP, and finally "plugged" into the output neuron. In summary: Embeddings + continuous + dense Resnet + MLP.</p>
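<p>How <code>blocks_dims</code> translates into blocks can be sketched with a line of bookkeeping (plain Python, purely illustrative): consecutive pairs of the list define the input/output sizes of each block.</p>

```python
blocks_dims = [200, 100, 100]
# consecutive pairs -> one (input_dim, output_dim) tuple per Resnet block
block_in_out = list(zip(blocks_dims[:-1], blocks_dims[1:]))
# -> [(200, 100), (100, 100)]: two blocks, as in the printout above
```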
<p>To run it, the code is, as one might expect, identical to the one shown before for the <code>TabMlp</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[(</span><span class="n">Accuracy</span><span class="p">)])</span>
<span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">5</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 123/123 [00:04<00:00, 30.40it/s, loss=0.385, metrics={'acc': 0.8108}]
valid: 100%|██████████| 31/31 [00:00<00:00, 105.50it/s, loss=0.36, metrics={'acc': 0.8144}]
epoch 2: 100%|██████████| 123/123 [00:04<00:00, 30.05it/s, loss=0.354, metrics={'acc': 0.8326}]
valid: 100%|██████████| 31/31 [00:00<00:00, 97.42it/s, loss=0.352, metrics={'acc': 0.8337}]
epoch 3: 100%|██████████| 123/123 [00:03<00:00, 30.95it/s, loss=0.351, metrics={'acc': 0.834}]
valid: 100%|██████████| 31/31 [00:00<00:00, 105.48it/s, loss=0.351, metrics={'acc': 0.8354}]
epoch 4: 100%|██████████| 123/123 [00:03<00:00, 31.33it/s, loss=0.349, metrics={'acc': 0.8352}]
valid: 100%|██████████| 31/31 [00:00<00:00, 108.03it/s, loss=0.349, metrics={'acc': 0.8367}]
epoch 5: 100%|██████████| 123/123 [00:03<00:00, 31.99it/s, loss=0.346, metrics={'acc': 0.8359}]
valid: 100%|██████████| 31/31 [00:00<00:00, 107.30it/s, loss=0.348, metrics={'acc': 0.8378}]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And now, last but not least, the latest addition to the library: the <code>TabTransformer</code>.</p>
<h3 id=" 1.3-TabTransformer"> 1.3 <code>TabTransformer</code><a class="anchor-link" href="# 1.3-TabTransformer"> </a></h3><p>The <code>TabTransformer</code> is described in detail in <a href="https://arxiv.org/pdf/2012.06678.pdf">TabTransformer: Tabular Data Modeling Using Contextual Embeddings</a> [2], by the clever guys at Amazon. It is an entertaining paper that I, of course, strongly recommend if you are going to use this model on your tabular data (and also, in general, if you are interested in DL for tabular data).</p>
<p>My implementation is not the only one available. Given that the model was conceived by researchers at Amazon, it is also available in their fantastic <code>autogluon</code> library (which you should definitely check out). In addition, you can find another implementation <a href="https://github.com/lucidrains/tab-transformer-pytorch">here</a> by Phil Wang, whose entire GitHub is simply outstanding. My implementation is partially inspired by these, but has some particularities and adaptations so that it works within the <code>pytorch-widedeep</code> package.</p>
<p>The following figure illustrates the <code>TabTransformer</code> model architecture.</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/tabtransformer_arch.png" alt="" style="max-width: 400px" />
</figure>
</p>
<p><strong>Fig 4</strong>. The <code>TabTransfomer</code>, described in <a href="https://arxiv.org/pdf/2012.06678.pdf">TabTransformer: Tabular Data Modeling Using Contextual Embeddings</a>. The dashed-border boxes indicate that the component is optional.</p>
<p>As in previous cases, there are a number of variants and details to consider as one builds the model. I will describe some here but, for a full view of all the possible parameters, please have a look at the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/model_components.html#pytorch_widedeep.models.tab_transformer.TabTransformer">docs</a>.</p>
<p>I don't want to go into the details of what a Transformer [3] is in this post. There is an overwhelming amount of literature if you want to learn about it, with the most popular resource being perhaps <a href="https://nlp.seas.harvard.edu/2018/04/03/attention.html">The Annotated Transformer</a>. Also check this <a href="https://elvissaravia.substack.com/p/learn-about-transformers-a-recipe">post</a>, and if you are a math "maniac" you might like this <a href="https://arxiv.org/abs/2007.02876">paper</a> [4]. However, let me briefly describe it here so I can introduce the little math we will need for this post. In one sentence, a Transformer consists of a multi-head self-attention layer followed by a feed-forward layer, with element-wise addition and layer-normalisation after each layer.</p>
<p>As most of you will know, a self-attention layer comprises three matrices: Key, Query and Value. Each input categorical column, i.e. each embedding, is projected onto these matrices (although see the <code>fixed_attention</code> option later in the post) to generate its corresponding key, query and value vectors. Formally, let $K \in R^{e \times d}$, $Q \in R^{e \times d}$ and $V \in R^{e \times d}$ be the Key, Query and Value matrices of the embeddings, where $e$ is the embeddings dimension and $d$ is the dimension of all the Key, Query and Value matrices. Then every input categorical column, i.e. every embedding, attends to all other categorical columns through an attention head:</p>
$$
Attention(K, Q, V ) = A \cdot V, \hspace{5cm}(1)
$$<p>where</p>
$$
A = softmax( \frac{QK^T}{\sqrt{d}} ), \hspace{6cm}(2)
$$<p>And that is all the math we need.</p>
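<p>For the curious, Eqs (1) and (2) fit in a few lines of plain Python. This is an illustrative, single-head sketch (no batching, no torch; all names are made up for the example), with $Q$, $K$ and $V$ as <code>(s, d)</code> lists-of-lists where <code>s</code> is the number of categorical columns:</p>

```python
import math

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    # (n, k) @ (k, m) -> (n, m)
    return [[sum(a * b for a, b in zip(row, col)) for col in transpose(B)]
            for row in A]

def softmax(row):
    m = max(row)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    d = len(K[0])
    scores = matmul(Q, transpose(K))                                  # Q K^T
    A = [softmax([s / math.sqrt(d) for s in row]) for row in scores]  # Eq (2)
    return matmul(A, V)                                               # Eq (1)

# two "columns" (s = 2) with embeddings of dim d = 2
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = attention(Q, K, V)  # each output row is a weighted mix of the rows of V
```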
<p>As I was thinking of a figure to illustrate a Transformer block, I realised that there is a good chance the reader has already seen every possible representation. Therefore, I decided to illustrate the Transformer block in a way that relates directly to how it is implemented.</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/transformer_block.png" alt="" style="max-width: 600px" />
</figure>
</p>
<p><strong>Fig 5</strong>. The Transfomer block. The letters in parenthesis indicate the dimension of the corresponding tensor after the operation indicated in the corresponding box. For example, the tensor <code>attn_weights</code> has dim <code>(b, h, s, s)</code>.</p>
<p>As the figure shows, the input tensor ($X$) is projected onto its key, query and value matrices. These are then "<em>re-arranged into</em>" the multi-head self-attention layer, where each head attends to part of the embeddings. We then compute $A$ (Eq 2), which is multiplied by $V$ to obtain what I refer to as <code>attn_score</code> (Eq 1). <code>attn_score</code> is then re-arranged, so that we "<em>collect</em>" the attention scores from all the heads, and projected again to obtain the result (<code>attn_out</code>), which is added to the input and normalised (<code>Y</code>). Finally, <code>Y</code> goes through the feed-forward layer and a further Add + Norm.</p>
<p>Before moving on to the code for building the model itself, there are a couple of details in the implementation that are worth mentioning:</p>
<p><strong><code>FullEmbeddingDropout</code></strong></p>
<p>When building a <code>TabTransformer</code> model, there is the possibility of entirely dropping the embedding corresponding to a categorical column. This is set by the parameter <code>full_embed_dropout: bool</code>, which points to the class <code>FullEmbeddingDropout</code>.</p>
<p><strong><code>SharedEmbeddings</code></strong></p>
<p>When building a <code>TabTransformer</code> model, it is possible for all the embeddings that represent a categorical column to share a fraction of their embeddings, or to define a common, separate embedding per column that will be added to the column's embeddings.</p>
<p>The idea behind this so-called "<em>column embedding</em>" is to enable the model to distinguish the classes in one column from those in the other columns. In other words, we want the model to learn representations not only of the different categorical values in the column, but also of the column itself. This is attained via the <code>shared_embed</code> group of parameters: <code>share_embed: bool</code>, <code>add_shared_embed: bool</code> and <code>frac_shared_embed: int</code>. The first simply indicates whether embeddings will be shared, the second sets the sharing strategy and the third the fraction of the embeddings that will be shared, depending on that strategy. They all relate to the class <code>SharedEmbeddings</code>.</p>
<p>For example, let's say that we have a categorical column with 5 different categories that will be encoded as embeddings of dim 8. This will result in a lookup table for that column of dim <code>(5, 8)</code>. The two sharing strategies are illustrated in Fig 6.</p>
<p><img src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/shared_embeddings.png" alt="" />
<!-- <img src="figures/pytorch-widedeep/shared_embeddings.png" width="600"/>
-->
<strong>Fig 6</strong>. The two sharing embeddings strategies. Upper panel: the "<em>column embedding</em>" replaces <code>embedding dim / frac_shared_embed</code> (4 in this case) of the total embeddings that represent the different values of the categorical column. Lower panel: the "<em>column embedding</em>" is added (well, technically broadcasted and added) to the original embedding lookup table. Note that <code>n_cat</code> here refers to the number of different categories for this particular column.</p>
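<p>The two strategies of Fig 6 can be sketched in plain Python for a single <code>(5, 8)</code> lookup table with <code>frac_shared_embed = 2</code> (so <code>8 / 2 = 4</code> shared dimensions). This is an illustration of the idea only, not the library's internals:</p>

```python
n_cat, embed_dim, frac_shared_embed = 5, 8, 2
n_shared = embed_dim // frac_shared_embed  # 4 shared dimensions per row

# a toy (5, 8) lookup table and a toy "column embedding" of dim 8
lookup = [[float(r * embed_dim + c) for c in range(embed_dim)] for r in range(n_cat)]
col_embed = [100.0] * embed_dim

# upper panel: the first n_shared dims of every row are replaced by the
# column embedding
replaced = [col_embed[:n_shared] + row[n_shared:] for row in lookup]

# lower panel: the column embedding is broadcast and added to every row
added = [[v + c for v, c in zip(row, col_embed)] for row in lookup]
```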
<p><strong><code>fixed_attention</code></strong></p>
<p><code>fixed_attention</code>: this is inspired by the <a href="https://github.com/awslabs/autogluon/blob/master/tabular/src/autogluon/tabular/models/tab_transformer/modified_transformer.py">implementation</a> at the Autogluon library. When using "fixed attention", the key and query matrices are not the result of any projection of the input tensor $X$, but learnable matrices (referred to as <code>fixed_key</code> and <code>fixed_query</code>) of dim <code>(number of categorical columns x embeddings dim)</code>, defined separately as you instantiate the model. <code>fixed_attention</code> does not affect how the Value matrix is computed.</p>
<p>Let me go through an example with numbers to clarify things. Let's assume we have a dataset with 5 categorical columns that will be encoded by embeddings of dim 4 and we use a batch size (<code>bsz</code>) of 6. Figure 7 shows how the key matrix will be computed for a given batch (same applies to the query matrix) with and without fixed attention.</p>
<p><img src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/fixed_attn.png" alt="" />
<!-- <img src="figures/pytorch-widedeep/fixed_attn.png" width="700"/>
-->
<strong>Fig 7</strong>. Key matrix computation for a given batch with and without fixed attention (same applies to the query matrix). The different color tones in the matrices are my attempt to illustrate that, while without fixed attention the key matrix can have different values anywhere in the matrix, with fixed attention the key matrix is the result of the repetition of the "fixed-key" <code>bsz</code> times. The project-layer is, of course, broadcasted along the <code>bsz</code> dimension in the upper panel.</p>
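<p>The key-matrix shapes in Fig 7 can be sketched with some plain-Python bookkeeping (illustrative only): with fixed attention, the same <code>(5, 4)</code> key is simply repeated <code>bsz</code> times, so every sample in the batch shares it.</p>

```python
bsz, n_cat_cols, embed_dim = 6, 5, 4

# without fixed attention: X, of shape (6, 5, 4), is projected per sample,
# so the key can differ from sample to sample; its shape is still (6, 5, 4)
k_shape = (bsz, n_cat_cols, embed_dim)

# with fixed attention: a single learnable (5, 4) fixed_key repeated bsz times
fixed_key = [[0.1] * embed_dim for _ in range(n_cat_cols)]
K = [fixed_key for _ in range(bsz)]  # shape (6, 5, 4), identical per sample

k_fixed_shape = (len(K), len(K[0]), len(K[0][0]))
same_for_all_samples = all(k == K[0] for k in K)
```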
<p>As I mentioned, this implementation is inspired by the one at the Autogluon library. Since the guys at Amazon came up with the <code>TabTransformer</code>, it is only logical to think that they found a use for this implementation of attention. However, at the time of writing, that use is not 100% clear to me. It is known that, in problems like machine translation, most attention heads learn redundant patterns (see e.g. <a href="https://arxiv.org/abs/2002.10260">Alessandro Raganato et al., 2020</a> [5] and references therein). Therefore, maybe the fixed attention mechanism discussed here helps reduce redundancy for problems involving tabular data.</p>
<p>Overall, the way I interpret <code>fixed_attention</code> in layman's terms is the following: when using fixed attention, the key and query matrices are defined as the model is instantiated, and know nothing of the input until the attention weights (<code>attn_weights</code>) are multiplied by the value matrix to obtain what I refer to as <code>attn_score</code> in Fig 5. Those attention weights, which are in essence the result of a matrix multiplication between the key and query matrices (plus softmax and normalisation), are going to be the same for all heads and all samples in a given batch. Therefore, my interpretation is that when using fixed attention we reduce the attention capabilities of the Transformer, which will focus on fewer aspects of the input, reducing potential redundancies.</p>
<p>Anyway, enough speculation. Time to have a look at the code. Note that, since we are going to stack the embeddings (instead of concatenating them), they must all have the same dimension. That dimension is set as we build the model, rather than at the pre-processing stage. To avoid input format conflicts we use the <code>for_tabtransformer</code> parameter at pre-processing time.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">embed_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'workclass'</span><span class="p">,</span> <span class="s1">'education'</span><span class="p">,</span> <span class="s1">'marital_status'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">,</span> <span class="s1">'relationship'</span><span class="p">,</span> <span class="s1">'race'</span><span class="p">]</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span>
<span class="n">embed_cols</span><span class="o">=</span><span class="n">embed_cols</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">,</span>
<span class="n">for_tabtransformer</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult_train</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/Users/javier/.pyenv/versions/3.7.9/envs/wdposts/lib/python3.7/site-packages/ipykernel/ipkernel.py:283: DeprecationWarning: `should_run_async` will not call `transform_cell` automatically in the future. Please pass the result to `transformed_cell` argument and any exception that happen during thetransform in `preprocessing_exc_tuple` in IPython 7.17 and above.
and should_run_async(code)
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabTransformer</span>
<span class="n">tabtransformer</span> <span class="o">=</span> <span class="n">TabTransformer</span><span class="p">(</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">,</span>
<span class="n">shared_embed</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">num_blocks</span><span class="o">=</span><span class="mi">3</span><span class="p">,</span>
<span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">deeptabular</span><span class="o">=</span><span class="n">tabtransformer</span><span class="p">)</span>
<span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/Users/javier/.pyenv/versions/3.7.9/envs/wdposts/lib/python3.7/site-packages/ipykernel/ipkernel.py:283: DeprecationWarning: `should_run_async` will not call `transform_cell` automatically in the future. Please pass the result to `transformed_cell` argument and any exception that happen during thetransform in `preprocessing_exc_tuple` in IPython 7.17 and above.
and should_run_async(code)
</pre>
</div>
</div>
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(deeptabular): Sequential(
(0): TabTransformer(
(embed_layers): ModuleDict(
(emb_layer_education): SharedEmbeddings(
(embed): Embedding(17, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
(emb_layer_marital_status): SharedEmbeddings(
(embed): Embedding(8, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
(emb_layer_occupation): SharedEmbeddings(
(embed): Embedding(16, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
(emb_layer_race): SharedEmbeddings(
(embed): Embedding(6, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
(emb_layer_relationship): SharedEmbeddings(
(embed): Embedding(7, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
(emb_layer_workclass): SharedEmbeddings(
(embed): Embedding(10, 32, padding_idx=0)
(dropout): Dropout(p=0.1, inplace=False)
)
)
(tab_transformer_blks): Sequential(
(block0): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block1): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block2): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
)
(tab_transformer_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Linear(in_features=196, out_features=784, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.1, inplace=False)
)
(dense_layer_1): Sequential(
(0): Linear(in_features=784, out_features=392, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.1, inplace=False)
)
)
)
)
(1): Linear(in_features=392, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we can see, the model is an instance of a <code>WideDeep</code> object formed by a single component, <code>deeptabular</code>, which is a <code>TabTransformer</code> model. <code>TabTransformer</code> is formed by a series of embedding layers (e.g. <code>emb_layer_education</code>), a series of transformer encoder blocks$^*$ (<code>tab_transformer_blks</code>) and an MLP (<code>tab_transformer_mlp</code>). The embeddings here are of class <code>SharedEmbeddings</code>, which I described before. These embeddings are stacked and passed through the three transformer blocks. The output for all the categorical columns is concatenated, resulting in a tensor of dim <code>(bsz, 192)</code>, where 192 is the number of categorical columns (6) times the embedding dim (32). This tensor is then concatenated with the "layernormed" continuous columns, resulting in a tensor of dim <code>(bsz, 196)</code>. As usual, this tensor goes through <code>tab_transformer_mlp</code>, which, following the guidance in the paper ("<em>The MLP layer sizes are set to {4 × l, 2 × l}, where l is the size of its input.</em>"), is <code>[784 -> 392]</code>, and "off we go". In summary: <code>SharedEmbeddings</code> + continuous columns + transformer encoder blocks + MLP.</p>
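To make the dimensions above explicit, here is a quick back-of-the-envelope check in plain Python. It assumes the setup used in this post (6 categorical columns embedded with dim 32, and 4 continuous columns); this is just arithmetic, not pytorch-widedeep code:

```python
# Dimension check for the TabTransformer head described above (plain arithmetic,
# assuming 6 categorical columns, embedding dim 32, and 4 continuous columns)
n_cat, embed_dim, n_cont = 6, 32, 4

cat_out = n_cat * embed_dim    # concatenated embeddings: (bsz, 192)
mlp_input = cat_out + n_cont   # after adding the continuous columns: (bsz, 196)

# Per the paper, the MLP hidden sizes are {4 x l, 2 x l}, with l the input size
mlp_hidden = [4 * mlp_input, 2 * mlp_input]
print(cat_out, mlp_input, mlp_hidden)  # 192 196 [784, 392]
```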
<p>To run it, the code is, as one might expect, identical to the one shown before for <code>TabMlp</code> and <code>TabResnet</code>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[</span><span class="n">Accuracy</span><span class="p">])</span>
<span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">5</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 123/123 [00:09<00:00, 13.42it/s, loss=0.376, metrics={'acc': 0.8236}]
valid: 100%|██████████| 31/31 [00:00<00:00, 34.98it/s, loss=0.373, metrics={'acc': 0.8228}]
epoch 2: 100%|██████████| 123/123 [00:09<00:00, 13.31it/s, loss=0.353, metrics={'acc': 0.8331}]
valid: 100%|██████████| 31/31 [00:00<00:00, 37.92it/s, loss=0.368, metrics={'acc': 0.8313}]
epoch 3: 100%|██████████| 123/123 [00:09<00:00, 13.30it/s, loss=0.349, metrics={'acc': 0.8354}]
valid: 100%|██████████| 31/31 [00:00<00:00, 34.20it/s, loss=0.372, metrics={'acc': 0.833}]
epoch 4: 100%|██████████| 123/123 [00:09<00:00, 12.91it/s, loss=0.347, metrics={'acc': 0.8376}]
valid: 100%|██████████| 31/31 [00:00<00:00, 36.76it/s, loss=0.369, metrics={'acc': 0.8351}]
epoch 5: 100%|██████████| 123/123 [00:10<00:00, 12.20it/s, loss=0.344, metrics={'acc': 0.8404}]
valid: 100%|██████████| 31/31 [00:00<00:00, 36.31it/s, loss=0.367, metrics={'acc': 0.8376}]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>$^*$ <em>Note that there is a small inconsistency in the naming of the</em> <code>TabTransformer</code> <em>main components relative to the other two models. If you installed the package via PyPI, the transformer encoder blocks are named</em> <code>blks</code>. <em>A name more consistent with the other models would be, for example,</em> <code>tab_transformer_blks</code>. <em>I noticed this inconsistency just after publishing <code>v0.4.8</code> to PyPI. Such a small issue does not warrant another sub-version. However, it is fixed if you install the package from GitHub (as I have done for this post), and the PyPI and GitHub versions will be consistent in future releases.</em></p>
<h2 id=" 2.-Conclusion-and-future-work"> 2. Conclusion and future work<a class="anchor-link" href="# 2.-Conclusion-and-future-work"> </a></h2><p>In this post my intention was to illustrate how one can use <code>pytorch-widedeep</code> as a library for "standard DL for tabular data", i.e. without building wide and deep models and for problems that do not involve text and/or images (if you want to learn more about the library, please visit the <a href="https://github.com/jrzaurin/pytorch-widedeep">repo</a>, the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/index.html">documentation</a>, or the <a href="https://jrzaurin.github.io/infinitoml/">previous posts</a>). To that aim, the only component we need is the <code>deeptabular</code> component, for which <code>pytorch-widedeep</code> comes with three models implemented "out of the box": <code>TabMlp</code>, <code>TabResnet</code> and <code>TabTransformer</code>. In this post I have explained their architectures in detail and how to use them within the library. In the not-so-distant future I intend to implement <a href="https://arxiv.org/abs/1908.07442">TabNet</a> and perhaps NODE, as well as perform a proper benchmarking exercise so I can set robust defaults and then release version <code>1.0</code>. Of course, you can help me by using the package on your datasets 🙂. If you found this post useful and you like the library, please give the <a href="https://github.com/jrzaurin/pytorch-widedeep">repo</a> a star. Other than that, happy coding.</p>
<h2 id=" 3.-References"> 3. References<a class="anchor-link" href="# 3.-References"> </a></h2><p>[1] TabNet: Attentive Interpretable Tabular Learning, Sercan O. Arik, Tomas Pfister, <a href="https://arxiv.org/abs/1908.07442">arXiv:1908.07442v5</a></p>
<p>[2] TabTransformer: Tabular Data Modeling Using Contextual Embeddings. Xin Huang, Ashish Khetan, Milan Cvitkovic, Zohar Karnin, 2020. <a href="https://arxiv.org/abs/2012.06678">arXiv:2012.06678v1</a></p>
<p>[3] Attention Is All You Need, Ashish Vaswani, Noam Shazeer, Niki Parmar, et al., 2017. <a href="https://arxiv.org/abs/1706.03762">arXiv:1706.03762v5</a></p>
<p>[4] A Mathematical Theory of Attention, James Vuckovic, Aristide Baratin, Remi Tachet des Combes, 2020. <a href="https://arxiv.org/abs/2007.02876">arXiv:2007.02876v2</a></p>
<p>[5] Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation. Alessandro Raganato, Yves Scherrer, Jörg Tiedemann, 2020. <a href="https://arxiv.org/abs/2002.10260">arXiv:2002.10260v3</a></p>
<p>[6] Neural Oblivious Decision Ensembles for Deep Learning on Tabular Data. Sergei Popov, Stanislav Morozov, Artem Babenko, <a href="https://arxiv.org/abs/1909.06312">arXiv:1909.06312v2</a></p>
</div>
</div>
</div>
</div>Javier Rodriguezpytorch-widedeep, deep learning for tabular data II: advanced use2020-12-11T00:00:00-06:002020-12-11T00:00:00-06:00https://jrzaurin.github.io/infinitoml/2020/12/11/pytorch-widedeep_ii<!--
#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: _notebooks/2020-12-11-pytorch-widedeep_ii.ipynb
-->
<div class="container" id="notebook-container">
<div class="cell border-box-sizing code_cell rendered">
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>This is the second of a series of posts introducing <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>, a flexible package to combine tabular data with text and images (that could also be used for "standard" tabular data alone).</p>
<p>In the first post I described <code>pytorch-widedeep</code>'s data preprocessing utilities, the main components of a <code>WideDeep</code> model and a quick example to illustrate the basic use of the library. In this post I will use a series of examples to dig deeper into the many options <code>pytorch-widedeep</code> offers as we build wide and deep models.</p>
<h2 id="1.-Binary-classification-with-varying-parameters">1. Binary classification with varying parameters<a class="anchor-link" href="#1.-Binary-classification-with-varying-parameters"> </a></h2><p>Let's start by using again the <a href="http://archive.ics.uci.edu/ml/datasets/Adult">adult census</a> dataset.</p>
<p>Before moving any further, let me emphasize that, as we go through the examples, one should not pay excessive (or any) attention to the loss or the metrics: the input parameters are not selected to obtain state-of-the-art results, but to illustrate usability.</p>
<p>A proper benchmarking exercise will be carried out in a future post. Having said that, and without further ado, let's start.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="n">adult</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s2">"data/adult/adult.csv.zip"</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">columns</span> <span class="o">=</span> <span class="p">[</span><span class="n">c</span><span class="o">.</span><span class="n">replace</span><span class="p">(</span><span class="s2">"-"</span><span class="p">,</span> <span class="s2">"_"</span><span class="p">)</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">]</span>
<span class="n">adult</span><span class="p">[</span><span class="s2">"income_label"</span><span class="p">]</span> <span class="o">=</span> <span class="p">(</span><span class="n">adult</span><span class="p">[</span><span class="s2">"income"</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">">50K"</span> <span class="ow">in</span> <span class="n">x</span><span class="p">))</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="nb">int</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">drop</span><span class="p">(</span><span class="s2">"income"</span><span class="p">,</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">inplace</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">:</span>
<span class="k">if</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">dtype</span> <span class="o">==</span> <span class="s1">'O'</span><span class="p">:</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">"unknown"</span> <span class="k">if</span> <span class="n">x</span> <span class="o">==</span> <span class="s2">"?"</span> <span class="k">else</span> <span class="n">x</span><span class="p">)</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">str</span><span class="o">.</span><span class="n">lower</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">adult</span><span class="o">.</span><span class="n">head</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>age</th>
<th>workclass</th>
<th>fnlwgt</th>
<th>education</th>
<th>educational_num</th>
<th>marital_status</th>
<th>occupation</th>
<th>relationship</th>
<th>race</th>
<th>gender</th>
<th>capital_gain</th>
<th>capital_loss</th>
<th>hours_per_week</th>
<th>native_country</th>
<th>income_label</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>25</td>
<td>private</td>
<td>226802</td>
<td>11th</td>
<td>7</td>
<td>never-married</td>
<td>machine-op-inspct</td>
<td>own-child</td>
<td>black</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>1</th>
<td>38</td>
<td>private</td>
<td>89814</td>
<td>hs-grad</td>
<td>9</td>
<td>married-civ-spouse</td>
<td>farming-fishing</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>50</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>2</th>
<td>28</td>
<td>local-gov</td>
<td>336951</td>
<td>assoc-acdm</td>
<td>12</td>
<td>married-civ-spouse</td>
<td>protective-serv</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>3</th>
<td>44</td>
<td>private</td>
<td>160323</td>
<td>some-college</td>
<td>10</td>
<td>married-civ-spouse</td>
<td>machine-op-inspct</td>
<td>husband</td>
<td>black</td>
<td>male</td>
<td>7688</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>4</th>
<td>18</td>
<td>unknown</td>
<td>103497</td>
<td>some-college</td>
<td>10</td>
<td>never-married</td>
<td>unknown</td>
<td>own-child</td>
<td>white</td>
<td>female</td>
<td>0</td>
<td>0</td>
<td>30</td>
<td>united-states</td>
<td>0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>If you read the first post, you will be familiar with the code below:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep</span> <span class="kn">import</span> <span class="n">Trainer</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">WidePreprocessor</span><span class="p">,</span> <span class="n">TabPreprocessor</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">Wide</span><span class="p">,</span> <span class="n">TabMlp</span><span class="p">,</span> <span class="n">TabResnet</span><span class="p">,</span> <span class="n">WideDeep</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.metrics</span> <span class="kn">import</span> <span class="n">Accuracy</span><span class="p">,</span> <span class="n">Recall</span>
<span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'relationship'</span><span class="p">,</span><span class="s1">'workclass'</span><span class="p">,</span><span class="s1">'occupation'</span><span class="p">,</span><span class="s1">'native_country'</span><span class="p">,</span><span class="s1">'gender'</span><span class="p">]</span>
<span class="n">crossed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">),</span> <span class="p">(</span><span class="s1">'native_country'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">)]</span>
<span class="n">cat_embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'relationship'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'workclass'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'occupation'</span><span class="p">,</span><span class="mi">32</span><span class="p">),(</span><span class="s1">'native_country'</span><span class="p">,</span><span class="mi">32</span><span class="p">)]</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"age"</span><span class="p">,</span><span class="s2">"hours_per_week"</span><span class="p">]</span>
<span class="n">target_col</span> <span class="o">=</span> <span class="s1">'income_label'</span>
<span class="c1"># TARGET</span>
<span class="n">target</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">target_col</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
<span class="c1"># WIDE</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">crossed_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
<span class="c1"># DEEP</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">cat_embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
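As a side note, the crossed columns defined above are simply the co-occurrence of two categorical values treated as a new categorical value, which then receives its own weight in the linear (wide) component. A minimal sketch of the idea (illustration only, not pytorch-widedeep's internal implementation):

```python
# Illustration of a crossed column: combine two categorical values into one
# new category (this conveys the idea, not the library's actual implementation)
def cross(row, cols=("education", "occupation")):
    return "-".join(str(row[c]) for c in cols)

example = {"education": "hs-grad", "occupation": "farming-fishing"}
print(cross(example))  # hs-grad-farming-fishing
```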
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="n">np</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">X_wide</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="c1"># We can add dropout and batchnorm to the dense layers, as well as chose the order of the operations</span>
<span class="n">deeptabular</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span><span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">64</span><span class="p">,</span><span class="mi">32</span><span class="p">],</span>
<span class="n">mlp_dropout</span><span class="o">=</span><span class="p">[</span><span class="mf">0.5</span><span class="p">,</span> <span class="mf">0.5</span><span class="p">],</span>
<span class="n">mlp_batchnorm</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">mlp_linear_first</span> <span class="o">=</span> <span class="kc">True</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span> <span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's have a look at the model that we will be running:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(wide): Wide(
(wide_linear): Embedding(797, 1, padding_idx=0)
)
(deeptabular): Sequential(
(0): TabMlp(
(embed_layers): ModuleDict(
(emb_layer_education): Embedding(17, 32, padding_idx=0)
(emb_layer_native_country): Embedding(43, 32, padding_idx=0)
(emb_layer_occupation): Embedding(16, 32, padding_idx=0)
(emb_layer_relationship): Embedding(7, 32, padding_idx=0)
(emb_layer_workclass): Embedding(10, 32, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Linear(in_features=162, out_features=64, bias=False)
(1): ReLU(inplace=True)
(2): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(3): Dropout(p=0.5, inplace=False)
)
(dense_layer_1): Sequential(
(0): Linear(in_features=64, out_features=32, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.5, inplace=False)
)
)
)
)
(1): Linear(in_features=32, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Now we will define the setup for each model component, including optimizers, learning rate schedulers and initializers:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.initializers</span> <span class="kn">import</span> <span class="n">KaimingNormal</span><span class="p">,</span> <span class="n">XavierNormal</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.callbacks</span> <span class="kn">import</span> <span class="n">ModelCheckpoint</span><span class="p">,</span> <span class="n">LRHistory</span><span class="p">,</span> <span class="n">EarlyStopping</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.optim</span> <span class="kn">import</span> <span class="n">RAdam</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># Optimizers</span>
<span class="n">wide_opt</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">wide</span><span class="o">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.03</span><span class="p">)</span>
<span class="n">deep_opt</span> <span class="o">=</span> <span class="n">RAdam</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deeptabular</span><span class="o">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.01</span><span class="p">)</span>
<span class="c1"># LR Schedulers</span>
<span class="n">wide_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">StepLR</span><span class="p">(</span><span class="n">wide_opt</span><span class="p">,</span> <span class="n">step_size</span><span class="o">=</span><span class="mi">3</span><span class="p">)</span>
<span class="n">deep_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">StepLR</span><span class="p">(</span><span class="n">deep_opt</span><span class="p">,</span> <span class="n">step_size</span><span class="o">=</span><span class="mi">5</span><span class="p">)</span>
<span class="c1"># Component-dependent settings as Dict</span>
<span class="n">optimizers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">wide_opt</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">deep_opt</span><span class="p">}</span>
<span class="n">schedulers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">wide_sch</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">deep_sch</span><span class="p">}</span>
<span class="n">initializers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">KaimingNormal</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">XavierNormal</span><span class="p">}</span>
<span class="c1"># General settings as List</span>
<span class="n">callbacks</span> <span class="o">=</span> <span class="p">[</span><span class="n">LRHistory</span><span class="p">(</span><span class="n">n_epochs</span><span class="o">=</span><span class="mi">10</span><span class="p">),</span> <span class="n">EarlyStopping</span><span class="p">,</span> <span class="n">ModelCheckpoint</span><span class="p">(</span><span class="n">filepath</span><span class="o">=</span><span class="s1">'model_weights/wd_out'</span><span class="p">)]</span>
<span class="n">metrics</span> <span class="o">=</span> <span class="p">[</span><span class="n">Accuracy</span><span class="p">,</span> <span class="n">Recall</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
</div>
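<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>The two <code>StepLR</code> schedulers above divide each component's learning rate by 10 (PyTorch's default <code>gamma=0.1</code>) every <code>step_size</code> epochs. The rule they apply can be sketched in plain Python; the base rate <code>0.01</code> mirrors the <code>deeptabular</code> optimizer above:</p>
</div>
</div>
</div>

```python
# Sketch of the decay rule torch.optim.lr_scheduler.StepLR applies:
# lr(epoch) = base_lr * gamma ** (epoch // step_size), with gamma=0.1 by default
def step_lr(base_lr, step_size, epoch, gamma=0.1):
    return base_lr * gamma ** (epoch // step_size)

# deeptabular component (step_size=5): lr 0.01 for epochs 0-4, 0.001 for epochs 5-9
deep_lrs = [step_lr(0.01, 5, e) for e in range(10)]
```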
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Build the trainer and fit!</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span>
<span class="n">objective</span><span class="o">=</span><span class="s1">'binary'</span><span class="p">,</span>
<span class="n">optimizers</span><span class="o">=</span><span class="n">optimizers</span><span class="p">,</span>
<span class="n">lr_schedulers</span><span class="o">=</span><span class="n">schedulers</span><span class="p">,</span>
<span class="n">initializers</span><span class="o">=</span><span class="n">initializers</span><span class="p">,</span>
<span class="n">callbacks</span><span class="o">=</span><span class="n">callbacks</span><span class="p">,</span>
<span class="n">metrics</span><span class="o">=</span><span class="n">metrics</span><span class="p">,</span>
<span class="n">verbose</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
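<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that the settings above mix plain classes (<code>EarlyStopping</code>, <code>Accuracy</code>, <code>KaimingNormal</code>) with instances (<code>LRHistory(...)</code>, <code>ModelCheckpoint(...)</code>). A trainer can accept both by instantiating whatever arrives as a class. A minimal sketch of that normalization, using stand-in callback classes rather than <code>pytorch-widedeep</code>'s actual internals:</p>
</div>
</div>
</div>

```python
# Hedged sketch: normalize a list that may mix classes and instances,
# instantiating classes with their default arguments.
import inspect

def instantiate_if_class(obj):
    return obj() if inspect.isclass(obj) else obj

class EarlyStopping:  # stand-in for the real callback class
    pass

class ModelCheckpoint:  # stand-in for the real callback class
    def __init__(self, filepath):
        self.filepath = filepath

callbacks = [EarlyStopping, ModelCheckpoint(filepath="model_weights/wd_out")]
callbacks = [instantiate_if_class(c) for c in callbacks]
# both entries are now instances, regardless of how they were passed
```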
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">10</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
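<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>The <code>EarlyStopping</code> callback passed to the trainer monitors the validation loss during this fit and interrupts training once the loss has not improved for a number of consecutive epochs (its <em>patience</em>). A simplified sketch of that stopping rule, leaving out extras such as <code>min_delta</code> or restoring the best weights:</p>
</div>
</div>
</div>

```python
# Simplified early-stopping rule: stop once the monitored metric has not
# improved for `patience` consecutive epochs.
def stopping_epoch(val_losses, patience=3):
    best, wait = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch  # training would stop at this epoch
    return None  # ran all epochs without triggering
```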
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">matplotlib.pyplot</span> <span class="k">as</span> <span class="nn">plt</span>
<span class="o">%</span><span class="k">matplotlib</span> inline
<span class="kn">import</span> <span class="nn">seaborn</span> <span class="k">as</span> <span class="nn">sns</span>
<span class="n">sns</span><span class="o">.</span><span class="n">set</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">plt</span><span class="o">.</span><span class="n">figure</span><span class="p">(</span><span class="n">figsize</span><span class="o">=</span><span class="p">(</span><span class="mi">15</span><span class="p">,</span><span class="mi">8</span><span class="p">))</span>
<span class="n">plt</span><span class="o">.</span><span class="n">subplot</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">,</span><span class="mi">1</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">history</span><span class="p">[</span><span class="s1">'train_loss'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"train"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">history</span><span class="p">[</span><span class="s1">'val_loss'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"val"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">legend</span><span class="p">(</span><span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">xlabel</span><span class="p">(</span><span class="s2">"n epochs"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">ylabel</span><span class="p">(</span><span class="s2">"Loss"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">subplot</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">history</span><span class="p">[</span><span class="s1">'train_acc'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"train"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">history</span><span class="p">[</span><span class="s1">'val_acc'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"val"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">legend</span><span class="p">(</span><span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">xlabel</span><span class="p">(</span><span class="s2">"n epochs"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">ylabel</span><span class="p">(</span><span class="s2">"Accuracy"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">subplot</span><span class="p">(</span><span class="mi">2</span><span class="p">,</span><span class="mi">2</span><span class="p">,</span><span class="mi">3</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">lr_history</span><span class="p">[</span><span class="s1">'lr_wide_0'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"wide"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">trainer</span><span class="o">.</span><span class="n">lr_history</span><span class="p">[</span><span class="s1">'lr_deeptabular_0'</span><span class="p">],</span> <span class="n">label</span><span class="o">=</span><span class="s2">"deeptabular"</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">legend</span><span class="p">(</span><span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">xlabel</span><span class="p">(</span><span class="s2">"n epochs"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
<span class="n">plt</span><span class="o">.</span><span class="n">ylabel</span><span class="p">(</span><span class="s2">"learning rate"</span><span class="p">,</span> <span class="n">fontsize</span><span class="o">=</span><span class="mi">13</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>Text(0, 0.5, 'learning rate')</pre>
</div>
</div>
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA4gAAAHkCAYAAAB8GQHGAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAACbaUlEQVR4nOzdeXhU5dnH8e+ZNXuGhEkCJIBAAqhsioJooVopyiIUsEVssbZFtFKU16pUtFqrWBVFEGsVrXXBFqwKYhVR1FYLVQERVJAdEpbsO1lmOe8fCQORIAkkmZnk97kumDn7fR5CztzzbIZpmiYiIiIiIiLS5lmCHYCIiIiIiIiEBiWIIiIiIiIiAihBFBERERERkVpKEEVERERERARQgigiIiIiIiK1lCCKiIiIiIgIALZgBxAshYXl+P2nPsNHYmIM+fllTRhR66cyazyVWeOpzBqnNZeXxWLQrl10sMMIO6f7fITW/XPVHFRejacyazyVWeO15jL7rmdkm00Q/X7ztB+Ap3t8W6QyazyVWeOpzBpH5SXHaorn45HzSMOpvBpPZdZ4KrPGa4tlpiamIiIiIiIiAihBFBERERERkVpKEEVERELQihUrGDlyJMOHD2fx4sXHbf/qq6+YMGECV1xxBdOmTaOkpKTO9kOHDnH++eeTlZXVUiGLiEgroARRREQkxGRnZzNv3jxefvllli9fzpIlS9ixY0edfe6//35mzJjBG2+8wRlnnMGzzz4b2Ob3+5k9ezYej6elQxcRkTDXZgepEREJJxUV5ZSVFeHzeZv0vDk5Fvx+f5OesyVZrTZiYlxERrau0UrXrFnD4MGDcblcAIwYMYKVK1cyffr0wD5+v5/y8nIAKioqiI+PD2x75plnGDJkCLt3727RuEVEJPwpQTwFhaVVYFPRiUjLqKgop7S0EJfLjd3uwDCMJju3zWbB6w3PBNE0TTyeaoqKcgFaVZKYk5OD2+0OLCclJbFp06Y6+8yaNYtrr72WOXPmEBkZydKlSwH48ssv+eSTT1i0aFG9TVNPJjEx5vSCr+V2xzbJedoKlVfjqcwaT2XWeC1VZqbfh+n11P6pxvTVvnq9mL5qPFVVVFdWUl1ZhSMmjoSMfk36eeBYynJOwZL3t1NUXs2syecEOxQRaQPKyopwudw4HM5ghxJSDMPA4XDicrkpLs5rVQmiaR4/rPqxHwQqKyuZPXs2zz//PH379uW5557j9ttvZ/78+dx777089thjWCyn1oskP7/stId1d7tjyc0tPa1ztCUqr8ZTmTVeWysz0/SD6QfTrPvq92Nigt8PR16/td+RY9vFOynMK8b0ecDnqXn1HnnvPbquNpnze734PNX4PdX4vDWvNYneMcf7vBh+L4bfg8XvxWLW/qHhv3dLTQulVz1GdFzcKZePxWKc8AvBkEwQV6xYwZNPPonH4+HnP/85V199dWDbli1bmDVrVmC5oKCA+Ph43nzzzRaLLyUhinVbczhc6SUqIiSLUERaEZ/Pi93uCHYYIctudzR509tgS05OZt26dYHlnJwckpKSAsvbtm3D6XTSt29fAH7yk58wf/581q1bR15eHjfccEPguOuuu46FCxfSrVu3lr0JEWl2pt8H3mpMb1Xta3XdZU8V1VWVeKqqqLBBeXklpt+PafprX03M2gTpyDK1yVGdbaZZk0DVHns0kTq6f91EzAS+a52JYdYkaIFXTAyz9hU/BrX7BbbVrquz77H7+Y85vmnmLjzciH29pgUvVjymte77Y169pgUPEXjNmnV+iw3TYgOLDdNqB4sdrHYsNhuGzYFhs2OxObDaHVjtdmx2B1ank7jE9vSKbb6azZDLbo50zH/ttddwOBxMmjSJQYMG0aNHDwB69+7N8uXLgZo+F1deeSX33HNPi8bYM83FGybs2F9M3+6JLXptEWmbmqsZSWvQGstmyJAhPP744xQUFBAZGcmqVav4
4x//GNjepUsXDh06xK5du+jWrRurV6+mT58+fO973+P9998P7HfJJZfw9NNPk5qaGozbEGmzTL8fvFW1NU5VxyRu1Q1YrknwfNVV+D1V+DxVmJ6jSZ/h82D4qrGYHixmw7oIWAEvcLJ2KH4zkHLhr63TMgPr6qRumObRdUfXG7XrOf4Y08A0jq6DI8uWuvvUvvfXObYmnmPfm7XxHnvtmvjBxILf5Fvrv72PceLz1Mbix4JptWPY7BhWOxb7sQlbzR+b3YHN4cDptOOwW4mwW3E6rDjsFpx2K067ldjaV4fDWrvOgsNuxRKiz6+QSxAb0jH/iKeeeorzzjuPgQMHtmiM3TrFY7UYbMssUoIoInKMyspKDh8uJyFBvxtPR3JyMjNnzmTKlCl4PB4mTpxI3759mTp1KjNmzKBPnz488MAD3HzzzZimSWJiInPmzAl22CJBV1Oj5QPv0eZ8xzYHPLaZ35Gmgcevq9t08Nh9j9/+7fN4wVsN/sa3avBj4DFteLBS5bdRbVrxYKPaPPLeQbUZhce0Uo0Nj2nFZ7GD1QE2R02Nk92J1e7E6qj5Y3NE4IiIxB7hJCEhjrLDHiwWC4bFgsWwgLXm1WKxYFgtWAwDi1HT/NBiGIFX45h1NkvtsnHsPmAcOcYwsFg47vhQ/TLvSJN+M/AXNU1QgSR3HPn5ZcEJLIhCLkFsSMd8gJKSEpYuXcqKFStO6Tqn2wm/R5qL3YdK1dm3kVRejacya7zWVmY5ORZstuablagpzz19+lR+9atpXHTR0EYdd/PN0/n+9y9m3LgJp3Rdi8XS6v7dx4wZw5gxY+qsW7RoUeD9sGHDGDZs2Hee49jaRJFQYZp+8FRheiprXr2VmNWV4K3EDKyveX/kNbDOW/O+yvTiraqsN7ELfMo/DX4s+A0bPsOKDytew1bzemxTwdpmgtWmA4/fQrVpodpnocq0Ul2b6NUkdzXJXE2yV7POtNqxBBK5COzOCBwOO5ERdiKdNiKdViIdNiKdNiJq37dz1ixHOqxERtiIcFixNqKvcVvrg9hQRxJXI/DX0TcWS2gmtc0t5BLEk3XMP2LFihVceumlJCae2rfUp9sJ/+xuiSz79072HyjCYbee8nnaEv1iajyVWeO1xjLz+/3NNtJoU49iWlRUhM9nNvqcc+cuADjlWPx+/3H/7t/VAV9EGsY0/bV92SqPT9aOJHDeKszqipqmknUSu28ngLXHeqsbfH2/YcVrceA1HHiwU42dKuw1yZYvuibh8lvwmBaq/Vaq/AZVPkttEmfFU5vMeev0BbMcs+3YvmFHli3YbDbsVgt2Wz1/rBbsNiuO2mVbnfUWIhxW2jltRDhttUmetTbRsxHlrEnsbFZNRS6hK+QSxJN1zD/ivffeY9q0aS0ZWh1ndUvk1Q92sOtACb26tAtaHCIioeJ3v/st2dmHuOuuWdxww2/44IP38Hg8HDiQxVNP/Y1Dhw7y7LN/ITNzH9XVHs4/fzB33vkHIiIimD79Oi6++AdMmPATJk4cw9ixE/jXv96gsDCffv0GcOed9xJ3GqO1iUj9TNPErCjGX5KDWZKDvzgbf0ku/pJs/CU5UFXe8HMZVnxWJz6LHa/hwGscSeicVPmjqTRtVPhtHMbGYa+Fcq+Vcq+VKtNOlWk7+krNa7VZU2sHEOGoTbJqX2OjHeA3a5M2a50ELqI2UTs+gbPWWW//VmJ35Dw2qxGyzSFFWkLIJYgn65gPNb/MvvrqKwYMGBCkKKH3GYkYwLbMIiWIItLi/rv5IB9vOnja5zGM2oHlvsNFfTtwYZ8OJz3XAw/MZeLEMcyceRvFxUVs3vwF8+Y9Qa9eZ2K1WvnlL3/KXXfdy0UXDSMnJ5tf//pXvPfeSkaPHnfcuT766EOefPIZ/H4/06dfx/Llr/Kzn117Svco0taZfj9meT7+4hz8JTn4S7IxA0lgLnirju5rWPBFtKPM5iLflk6JEUGF30a518Jhr5VSr5WyaguHfdZA
IncksTuSzB1hMQwinVYiamvRIpw2IqNtddYlOI7UtFnrNKc8dp3TcfxgHq2xtYhIqAi5BLEhHfMLCgqw2+04ncGbEywm0k5aUgzfZBYFLQYRkVCWmNiegQPPB8Dn8/HXvy6mU6dUysrKyMvLJT7eRW5ubr3Hjh07nnbtEgAYNGgImZn7WixukXBk+jz4S3Mxj0kCj9QEmqV54Pcd3dlqgxg3VRGJFLfvQrYnhn2HI9hW5GBfuRM/Nc0fY6PstIt11jSTjKlJ3mIcNtof2z+uNomLdNQmgMe8d9gsqokTCUMhlyDCyTvmJyYm8t///relwzpOepqLjzYdwOvzqy25iLSoC/s0rFbvZJq6D+Kxjh3J1Gq18t///oclS14GoEePdCorK/D767+2y3W0ZYbNZqu3f7pIW2NWV9Qmfzm1TUJrk8DibMzyQuoMzmKPxBKXhCWhMxUp/cj3x3GgMoqdpU62Fxjk7q4M7O2wW+jUPprU7jEMdseQ6o4m1R1DXLTmXxVpi0IyQQwXPdNcrF6fxd7sUrp3jA92OCIiIeXYmoPNm7/gr39dxKJFz5OW1hmAGTOuD1ZoIiHJNE3MytKavoBH/hRn19YMZmNW1m1SaUTGYcQlYe3QE0tcEhWOBHJ8sew7HMmeAj9ZeeUc3H0YT+2XQIYBye2sdE6OZsjZHejkjiE1KRq3KzJk52MTkZanBPE0pKe5gJp+iEoQRUTAbrdTXn78oBbl5eVYrRacTic+n49Vq97miy8+56yz+gQhSpHQYFaV492zgeyPtlCRs79mUBhPxTF7GBjR7bDEJ2PtOgAjLhlLXBKeyEQOVkWTWeRlf045WQfL2L+5nLIKD1AEFBEf4yDVHUPvLu1IdceQ6o6hQ2KURl4XkZNSgnga4qMdpCREsW1fEZcP6hLscEREgu7yy0fz0EP38dOf/rzO+vPPH8zFF1/KlCmTsFot9Ox5JpdfPpq9e/cEJU6RYDGrK/Du24hnxyf4sr4EvxdrbCKGqxP2lB41zULjkjHikvBHJZJT4iErt5ys3DL276h5zSveEzif02EltX0052S4A01DU5NiiIm0B+8mRSSsGWYb7dhxuvMgHhk9629vb2Hd1lwW3Pw9Nc84CY041ngqs8ZrjWV26NBeUlKa50uo5uyD2JLqKyPNg3hqTvf5CK3z/+HpML1VePdtwrvzE7z7vgCfByO6HbZu52PvPoik3n3YvjufrNyyQDKYlVPOoYJyvL6afwuLYZCSGEWqO7qmaWhtMpgYH9EmP3/oZ6zxVGaN15rL7LuekapBPE0ZaS7+88VB9ueWk5akDyIiIiJSM6qoL/NLPDs/wbv3c/BWYUTGYe81FFv3QViTe5CZU84HG/az/u8ra5uH1mgX6yTVHUOfbgmkumPo5I6mQ2I0dpsGxBOR5qcE8TRlHNMPUQmiiIhI22X6vfj2f12TFO7ZANUVGM4Y7D0uwNb9fKwdeuHxmXyyNYcPV21g54ES7DYLF/brSGpiVCAZjI5Q81ARCR4liKepfXwkiXFOvsks4gfnpgY7HBEREWlBpt+P7+DWmuaju9djVpWBIxJb13Oxdz8fa6czMSw2sgsO8+GHO/l400HKK72kJEQx6QfpXNgnha5pCa22GZuIhB8liE0gI83FV3sKMU1TE8KKiIi0cqbpx3doO96dn+Ld/RlmRQnYnNi6DsDebRDWtLMxrHZ8fj8btufx4ef7+WpPIVaLwYD09lw8oBO9urTTZwYRCUlKEJtAepqLtV9lk1NYQXJCVLDDERERkSZmmib+3N01zUd3fVozMb3Vjq1zP2zdB2Hr3BfD5gSgsLSKf2/cxX++OEBRWTUJcU5+9L0z+F6/jrhinEG+ExGR76YEsQn0rO2H+E1mkRJEERGRVsI0Tfz5+/Du/ATPrs8wS3PBYsOW1gfboB9j69wfwxEJgN80+Xp3AR98vp+N2/MwTZOzuiXwsxGd6Ns9EatFA8yISHhQgtgEUhKiiI2ysy2z
iKH9OgY7HBERETkNvsL9NUnhzk8xiw+BYcGaehb2c67A1vUcDGd0YN+yCg8fbzrIhxv3k1NYQUyknRHnpzFsQCeSXJFBvAsRkVOjBLEJGIZBRpqLbZlFwQ5FREREToG/+BCenZ/i3fkp/sIsMAysHXph63sZtjPOxRIRG9jXNE12HSjhg8/38+mWHLw+Pz1S4xl70RkM7OnGbrMG8U5ERE6PEsQmkpHqYv03uRSUVJIQFxHscEREROQk/KW5eHZ+hnfXJ/jz9gJgTcnAOeSn2LoNxBLlqrN/ZbWX/32dzYcb9rMvpwynw8r3+nbg+wM6aaorEWk1lCA2kWPnQxx8VkpwgxERCRNvvbWCV19dyrPPvhjsUKSN8JcX4t31KZ6dn+LP2QmAxd0N5+BJ2LqdhyUm8bhjsnLL+PDz/az58hCV1T5S3TH8bERPBp+ZTKRTH6VEpHXRb7UmkpYUQ6TTqgRRREQkBHn3f031huX4Dm4DTCyJnXGcPxF7t/OxxCUdt7/H62f9Nzl88Pl+tmcVY7ManNcriYsHpNK9U5ymqBCRVksJYhOxWAzSU118o36IItICPNv+i+eb/5z2eQzDwDTN79zH3nMo9owLT3quP/zhTtq3d3PjjTcBcPjwYa644oc8+uhCXn/9n2ze/AWFhQWkpqZxyy2z6Nu3/2nHL9IQpumn8sNFgIHj3HHYu5+PxdWh3n1ziyr4cON+Pt50kNLDHpJckVx5cXcu6tOB2ChHywYuIhIEShCbUEaai0078yk5XE2cHiIi0saMGDGShx+ew69/PQPDMPjoow/p2rUb//rXGwAsXvwKFouV+fMf4S9/Wcif//xMUOOVtsOfuwezvJCI70+t98sOv99k0858Pvh8P1/uygcD+veomdD+zDMSsKi2UETaECWITSgj1QXA9sxizu3pDm4wItKq2TMubFCt3snYbBa8Xn8TRATnnTcIr9fL5s1f0Ldvf959dyUjRozkkksuJSIiAqvVxsGDB4iNjSU3N7dJrinSEN7d68CwYuvSv8764rIq/rPpIP/ZuJ/8kiriYxyMHtKVYf07asA5EWmzlCA2oa4dYrHbLGzLLFKCKCJtjtVqZfjwy1i9ehWdO3fl88/XM3v2PeTk5DB//lz27NlNly5diI2NxzSbJikVORnTNPHs2YC1Yy8MZzSmafLNviI++Hw/G7bl4vOb9O7Sjp9ckk7/9PbYrJrQXkTaNiWITchmtdC9Y5zmQxSRNmvEiJH89rczOOOMbpx77nm0a5fADTf8irFjx/PEE4swDIO3336TXbt2BDtUaSP8RQcwiw9h9voB767L5MPP93Mw/zDRETZ+cG4qw/p3pENi9MlPJCLSRihBbGIZaS5WrNlDRZVXQ1+LSJuTnp6By9WOF1547pjBasqJjIzAMAz27NnNyy+/gNfrDXKk0lZ4d68H4MUtkXy6ZztndIjjFyN7c37vJBx2TWgvIvJtakfRxDLSXJgm7NhfHOxQRESC4rLLRlFeXsZFFw0F4Lbb7uDll1/khz8cxuzZt3L55aMpKiqkuLgouIFKm+DdswHD3Y31+zz88Lw07rpmIBf17aDkUETkBFTF1cS6d4zHajHYlllEn27HT7YrItLaXXXVT7nqqp8Gli+6aBgXXTSszj6TJ08BYOTIMYwcOaZF45O2w1+ahz9vDwU9RuHzm3oui4g0gGoQm5jTYaVrSqzmQxQREQky754NAGz2dMFqMeiRGh/kiEREQp8SxGaQkeZi94ESqj2+YIciIiJhasWKFYwcOZLhw4ezePHi47Z/9dVXTJgwgSuuuIJp06ZRUlICwM6dO5k8eTJjx47lJz/5CVu2bGnp0EOGd896LO06sf6gQfdO8TjVrFRE5KSUIDaDjDQXPr/JrgMlwQ5FRETCUHZ2NvPmzePll19m+fLlLFmyhB076o78ev/99zNjxgzeeOMNzjjjDJ599lkA7rzzTqZOncry5cu5+eabuf3224NxC0HnryjB
d2gb/rQB7DtUSu8u7YIdkohIWFCC2AzSU+MxgG1ZRcEORURaCdM0gx1CyGqNZbNmzRoGDx6My+UiKiqKESNGsHLlyjr7+P1+ysvLAaioqCAiomZi9yuvvJKhQ2sGCOrZsycHDx5s2eBDhHfv52Ca7LP3wAQliCIiDaRBappBVISd1KQYzYcoIk3CarXh8VTjcDiDHUpI8niqsVpb1+MsJycHt9sdWE5KSmLTpk119pk1axbXXnstc+bMITIykqVLlwIwfvz4wD4LFizg0ksvbdS1ExNjTiPyo9zu2CY5z6k69P4mbPFutlXE47AXc37fjthtodvENNjlFY5UZo2nMmu8tlhmreuJGkIy0lx8tOkAXp8fm1UVtSJy6mJiXBQV5eJyubHbHRiGEeyQQoJpmng81RQV5RIb27pqh+qrFT32372yspLZs2fz/PPP07dvX5577jluv/12nn766cDxDz30EF988QUvvPBCo66dn1+G3396tbJudyy5uaWndY7TYVZXcHjXF9jPvIQNW3JI7xRHUeHhoMVzMsEur3CkMms8lVnjteYys1iME34hqASxmfRMc7F6fRZ7s0vp3lGjponIqYuMjAaguDgPn69pJ5i3WCz4/f4mPWdLslptxMa2C5RRa5GcnMy6desCyzk5OSQlJQWWt23bhtPppG/fvgD85Cc/Yf78+QB4vV5uv/12srOzeeGFF4iNbXvffnszN4PfS1VKX/b/J4fBZyYHOyQRkbChBLGZpKe5ANieWawEUUROW2RkdLMkQa3529FwNmTIEB5//HEKCgqIjIxk1apV/PGPfwxs79KlC4cOHWLXrl1069aN1atX06dPHwAefPBBysrK+Otf/4rD4QjWLQSVd896jIhYvqlMBHLo3SUh2CGJiIQNJYjNJD7aQXJCFNsyi7hsUOdghyMiImEkOTmZmTNnMmXKFDweDxMnTqRv375MnTqVGTNm0KdPHx544AFuvvlmTNMkMTGROXPmUFBQwOLFi0lNTeXKK68MnG/58uVBvJuWZfo8ePd9gb37+WzdV0yk00qXlKbpVyki0hYoQWxGPdPiWbc1F79pYlGfIRERaYQxY8YwZsyYOusWLVoUeD9s2DCGDRt23HFff/11s8cWynz7vwZPJbau57Ll7UIyUl1YLRoLQESkofQbsxllpLk4XOVlf255sEMRERFpE7x71oM9gpLYbmQXVmh6CxGRRlKC2IwyavsharoLERGR5mf6/Xj3fI6tcz+2ZJUB0EsJoohIoyhBbEbt4yNJiHMqQRQREWkBvuztmJWl2Lqey9a9hcRE1sxLLCIiDacEsZllpLnYlllU75xWIiIi0nS8u9eD1YY19Wy27iukV2eXxgAQEWkkJYjNLCPNRXF5NTmFFcEORUREpNUyTRPvnvVYO51F3mHIL6lS81IRkVOgBLGZ9azth/iNmpmKiIg0G3/+PsyyfGxdz2HL3kIADVAjInIKlCA2s5SEKGKj7OqHKCIi0oy8e9aDYWDrMoAtewuJj3GQkhAV7LBERMJOSCaIK1asYOTIkQwfPpzFixcft33Xrl387Gc/44orruCXv/wlxcXFQYiyYQzDICPVpQRRRESkGXl3r8eakoEREcvWfUX07tIOQ/0PRUQaLeQSxOzsbObNm8fLL7/M8uXLWbJkCTt27AhsN02TG264galTp/LGG2/Qu3dvnn766SBGfHIZaS7yiispKKkMdigiIiKtjr/4EP7C/di6nsuB/MOUlFfTu7Oal4qInIqQSxDXrFnD4MGDcblcREVFMWLECFauXBnY/tVXXxEVFcXQoUMBuP7667n66quDFW6DaD5EERGR5uPZvQEAW9dz2Frb/1AD1IiInJqQSxBzcnJwu92B5aSkJLKzswPL+/bto3379tx+++2MGTOGu+++m6io0O5jkJYUQ6TTqgRRRESkGXj3rMfSvguW2PZs2VtI+/gI3K7IYIclIhKWbMEO4Nvqmy/w2D4EXq+XTz/9
lJdeeok+ffrw2GOP8ac//Yk//elPjbpOYuLpT5zrdsc2eN8zz0hk58HSRh3TGrX1+z8VKrPGU5k1jspLwpm/vBB/zk4cA8fj95t8s6+QARnukx8oIiL1CrkEMTk5mXXr1gWWc3JySEpKCiy73W66dOlCnz59ABg9ejQzZsxo9HXy88vw+0998nq3O5bc3NIG7981OYb1W3PYuTefuCjHKV83nDW2zERldipUZo3TmsvLYjGa5MtACW3ePbXNS884l8ycMsorvZreQkTkNIRcE9MhQ4awdu1aCgoKqKioYNWqVYH+hgADBgygoKCArVu3AvD+++9z1llnBSvcBuuZVvOw2p4ZuiOuioiIhBvvng0Y8SlYXB0D8x/20gA1IiKnLCRrEGfOnMmUKVPweDxMnDiRvn37MnXqVGbMmEGfPn144oknuPPOO6moqCAlJYWHHnoo2GGfVNcOsdhtFrZlFnFuTzV9EREROV1mVTm+A1tx9B2BYRhs3VdISkIU7WKdwQ5NRCRshVyCCDBmzBjGjBlTZ92iRYsC7/v168c///nPlg7rtNisFrp3jNNANSIiIk3Eu3cjmD5sZwzE6/PzTWYRQ85KCXZYIiJhLeSamLZmGWku9uWUUlHlDXYoIiIiYc+7Zz1GdDss7q7sOVRKVbVP/Q9FRE6TEsQWlJHmwjRhx371QxQRETkdprcKb+aX2Lqcg2FYAv0Pe3Z2BTcwEZEwpwSxBXXvGI/VYqiZqYiIyGnyZn4JvmpsZ5wLwNa9haQlxRDbRkcKFxFpKkoQW5DTYaVLSizfKEEUERE5Ld4968EZjbVDBh6vjx37izV6qYhIE1CC2MIy0lzsOVhCtccX7FBERETCkun34t27EVuX/hgWGzv3l+Dx+tX/UESkCShBbGEZaS68PpPdB0uCHYqIiEhY8h3YCtWHsXWtaV66ZW8hhlHzjBURkdOjBLGFpafGY4CamYqItGKbN28OdgitmnfPBrA5sKWeDcCWfYV0TYkjKiIkZ+8SEQkrShBbWHSEndSkGA1UIyLSil1//fWMGDGChQsXsm/fvmCH06qYph/vng3YUvtg2BxUVfvYfaBEzUtFRJqIEsQgyEh1sWN/MV6fP9ihiIhIM/joo4+488472b9/P+PHj+fHP/4xL774IgUFBcEOLez5c3ZhHi4KjF66PasIn9+kVxdXcAMTEWkllCAGQUZnF9UeP/uyy4IdioiINAOLxcL3vvc9HnjgAdasWcN1113Hq6++ytChQ5k6dSqrVq0Kdohhy7tnAxhWbJ37ATX9D60Wg/ROruAGJiLSSqixfhBkpMYDsC2ziG4d44IcjYiINAe/38+aNWt46623WL16NW63m9/85jd07NiRJ598knfffZeHH3442GGGFdM08exej7VTbwxnNFCTIHbvGIfTYQ1ydCIirYMSxCCIj3GSnBDFtswiLhvUOdjhiIhIE7vrrrt49913cTgcjBo1ir/97W/07t07sL1Hjx5Mnjw5iBGGJ3/hfsySbGx9RwBwuNLD3uxSxgzpGtzARERaESWIQZKRGs+Gbbn4TROLYQQ7HBERaUI+n4/HHnuMQYMGYdTzOz4tLY0XXnghCJGFN++e9YCBres5QM2I4KaJBqgREWlC6oMYJBlpLsorvRzILQ92KCIi0sTuuece1q5dS1ZWFgAvvfQS8+fPx+v1AhATE0OfPn2+8xwrVqxg5MiRDB8+nMWLFx+3/auvvmLChAlcccUVTJs2jZKSmvl1S0pKuO6667j88su5+uqryc3NbeK7Cx7v7g1YkrtjiXIBNc1L7TYL3TrGBzcwEZFWRAlikPSsncxX8yGKiLQ+99xzD1988QUOhwOA/v37s3HjRh544IEGHZ+dnc28efN4+eWXWb58OUuWLGHHjh119rn//vuZMWMGb7zxBmeccQbPPvssAI899hgDBw7k7bff5sorr+T+++9v2psLEn9JLv78vdi7nhtYt3VvIemp8dht+jgjItJU
9Bs1SBLjI0iIc2o+RBGRVuj999/niSeeIDk5GYCzzz6bBQsW8Pbbbzfo+DVr1jB48GBcLhdRUVGMGDGClStX1tnH7/dTXl7TCqWiooKIiAgAPvzwQ8aMGQPA6NGj+c9//oPH42mqWwsa754NAIHpLUrKq8nKLVfzUhGRJqY+iEFiGAYZaS627CnENM16+6iIiEh4MgyDiooKoqOjA+uqq6uxWhs20mZOTg5utzuwnJSUxKZNm+rsM2vWLK699lrmzJlDZGQkS5cuPe5Ym81GTEwMBQUFgWT1ZBITYxq038m43bFNcp4jDuzfiCOpM8nduwOwdf9+AC7o16nJrxUMreEeWprKrPFUZo3XFstMCWIQZaS6+N9X2eQUVpCcEBXscEREpImMGjWKX//619xwww0kJyeTnZ3NU089xejRoxt0vGmax6079ovEyspKZs+ezfPPP0/fvn157rnnuP3223n66afrPZ/F0vAGQ/n5Zfj9x1+/MdzuWHJzS0/rHMfyV5RQmbkVxzlXBM776ZcHiXBYiY+wNum1gqGpy6stUJk1nsqs8VpzmVksxgm/EFQT0yDKqO2HqGamIiKty2233cb555/PH//4RyZNmsScOXMYMmQIM2fObNDxycnJ5OXlBZZzcnJISkoKLG/btg2n00nfvn0B+MlPfsKnn34K1NQ2HjnW6/VSVlaGy+VqojsLDu/ezwEz0LwUagaoyUhzYW1E8isiIien36pB1CExiphIuxJEEZFWxuFw8Nvf/pb333+fTZs28e677zJjxozAoDUnM2TIENauXUtBQQEVFRWsWrWKoUOHBrZ36dKFQ4cOsWvXLgBWr14dGBV12LBhLFu2DIC33nqLgQMHYrfbm/YGW5h393qMWDeWhDQACkoqyS44rP6HIiLNQE1Mg8gwDHqmuTSSqYhIK1NQUMBLL71EdnY2fr8fqKnN27lzJ6+99tpJj09OTmbmzJlMmTIFj8fDxIkT6du3L1OnTmXGjBn06dOHBx54gJtvvhnTNElMTGTOnDkA3HTTTcyaNYtRo0YRGxvL3Llzm/Vem5tZXYFv/9fYz/pBoJnt1n2FgOY/FBFpDs2WIBYVFfHiiy/ym9/8hk2bNjFr1ixcLhcPPPAAXbp0aa7Lhp30NBfrt+VSUFJJQlxEsMMREZEmcOutt1JaWkq7du0oLCykR48erF69mkmTJjX4HGPGjAmMRnrEokWLAu+HDRvGsGHDjjvO5XLxl7/85dSDDzHefV+A33tc89LoCBupSU0zoI6IiBzVbE1Mf//737N582ZM0+See+7hwgsv5LzzzuOuu+5qrkuGpZ7qhygi0ups2LCBRYsWMXPmTOLi4pgzZw7z5s1j48aNwQ4t7Hj3bMCIjMOa1AOoGcBn695CenVph0UjgIuINLlmq0HcuHEj7777LocOHeKbb77hueeeIzY2lvPOO6+5LhmW0pJiiHBY2ZZVzOCzUoIdjoiINIHo6Gji4+NxOBxs27YNqKnxu+2224IcWXgxvdV4Mzdh7z4Yo3YwmtziSvJLqrhskJqXiog0h2arQayurgbggw8+4MwzzyQ+Pp7CwkKcTmdzXTIsWSwG6aku1SCKiLQi6enpLF68mIiICKKioti8eTPbt29v1HQTAr79X4OnEtsZ5wTWbd2r/ociIs2p2WoQL7nkEq655hr27NnDzTffzO7du7nlllsYMWJEc10ybGWkxfPqv/MpOVxNXFTDRrgTEZHQdeutt3LzzTczdOhQpk+fzlVXXQXAr3/96yBHFl68e9aDPRJrxzMD67bsLSQ+2kGHRM0fLCLSHJotQfzDH/7A8uXLcTqdjBkzhr179zJ69GimTJnSXJcMW0fmQ9yeWcy5Pd3BDUZERE5beXk5b7/9NlarlbS0NM477zzKy8vp1q1bsEMLG6bfh3fvRmyd+2FYaz6umKbJlr2FnNmlXWBEUxERaVrNliDa7XYuv/xyoqOj8fl8fPXVV/Tu3RubTTNrfFvX
lDjsNgvbs4qUIIqItALTp0/nP//5D1arFaiZtkIax3doO2ZlaZ3RSw/mH6akvJpeal4qItJsmq0zxBtvvBGY1Hfu3Lncf//93HrrrTz99NPNdcmwZbdZ6N4xTvMhioi0Ev369ePtt9/G4/EEO5Sw5d2zHqw2bGl9Auu21PY/VIIoItJ8mq0675lnnuGJJ57A4/GwdOlSnn32WdxuN1dddRXXXXddc102bGWkuVixZg8VVV4inaplFREJZ1lZWcyaNYvZs2cTGxtbpznk2rVrgxhZeDBNE++eDVg7nY1hPzpH8Na9hSTGReCO17zBIiLNpdkykUOHDjF48GD+97//ERERQf/+/QEoKytrrkuGtfQ0F6YJO/YX06dbYrDDERGR03DPPfcEO4Sw5s/bi1mWj/3ccUfXmSZb9xUyIN2t/ociIs2o2RLElJQU3n33XVasWMGFF14IwCuvvELXrl2b65JhrUfHeKwWg22ZRUoQRUTC3Pnnnx/sEMKad896MCxYu/QPrMvMLqO80qvpLUREmlmzJYizZs3ijjvuwOl08uyzz7JmzRrmzp3LwoULm+uSYc3psNIlJVbzIYqItAK9evU6YS3Xli1bWjia8OPdsx5rh55YImID67buU/9DEZGW0GwJ4pAhQ/jwww8Dy8nJyXz88cfY7fbmumTYy0hz8d66TKo9Phx2a7DDERGRU7RixYo6y4WFhTz//PN8//vfD05AYcRfdBB/4QGcvS+us37L3kKSE6JoF+sMUmQiIm1Ds46GsmTJEl5//XUOHTpEYmIiV1xxBddcc01zXjKsZaS5WPnJPnYfLKFnZ31DKiISrtLT049bd+aZZzJ27FiuvPLKIEQUPjx71gNg63pOYJ3X5+ebzCIuOCslWGGJiLQZzTqK6ZIlS/jVr35Fx44dyczM5K9//StVVVUaxfQE0lPjMYBvMouUIIqItDKHDx+mvLw82GGEPO/uDVjcZ2CJOdoff++hUqqqfep/KCLSApotQVyyZAlPPfUU3bp1C6wbNGgQv/jFL5QgnkB0hJ1O7hi2qx+iiEhYmzFjRp0+iB6Ph02bNnHxxRd/x1HiLyvAn7sLx3kT66w/Mv9hz86uIEQlItK2NFuCWFxcTOfOneusS0tLo6Kiorku2Sr0THPx8eaDeH1+bFZLsMMREZFTkJGRUWfZYrEwevRohg8fHqSIwoN3zwYAbGecU2f91n2FpLpjiItyBCMsEZE2pdkSxHPOOYf58+czc+ZMLBYLfr+fBQsWBOZDlPpldHaxekMW+7LL6NYxLtjhiIjIKZg+fTq7d+8mKSmJ6OhoNm3aRExMjAZqOwnvnvVYXB2wujoG1nm8frZnFTOsf8fvOFJERJpKs1VR3XHHHbz11ltccMEFjBkzhgsuuIAPPviAO++8s7ku2SpkpMYDaLoLEZEwtmLFCiZMmEBmZiYAX375JVdddRXvvfdekCMLXWZlGb6D32Drem6d9bsOFOPx+tX/UESkhTRbDWLnzp1ZuXIl69ato6CggA4dOtC3b19stmYdODXsxcc4SW4XybbMIi4b1PnkB4iISMhZsGABzz//PL169QJg8uTJnH322dx2221ceumlQY4uNHn3bQTTj+2Mugnilr2FGEZNFwwREWl+zdrJzW63c8EFFzBq1CjOOeccSkpKuPrqq0963IoVKxg5ciTDhw9n8eLFx21fuHAhF198MWPHjmXs2LH17hPOMtJcbM8qwm+awQ5FREROQX5+Pr17966z7qyzziI/Pz9IEYU+7+71GNEJWNp3rbN+y95CuqbEEhWh5rkiIi2hRavzPB4PGzZs+M59srOzmTdvHq+99hoOh4NJkyYxaNAgevToEdjnyy+/5NFHH2XAgAHNHXJQZKS5+GjTQQ7klpOaFBPscEREpJHOOussFi1axA033BBY9+yzz3LWWWcFMarQZXqq8GZ9ib3XsDqjv1ZV+9h1oIQfnp8WxOhERNqWkGvvuWbNGgYPHozL5QJg
xIgRrFy5kunTpwf2+fLLL1m0aBGZmZmcd9553H777TidziBF3PSONKP5JrNICaKISBj6/e9/z7Rp03jhhRdwu93k5OQQHx/PX/7yl2CHFpK8mZvA5zmueen2/UX4/Ca9NTewiEiLCbkEMScnB7fbHVhOSkpi06ZNgeXy8nJ69+7N7bffTqdOnZg1axZ//vOfmTlzZqOuk5h4+omX2x172ueoT/v2MbSPj2BvTlmzXSNYWtv9tASVWeOpzBpH5dX00tPTeeedd9iwYQP5+fkkJSXRr18/jWJ6At49GzCcMVhT6k4PsmVvIVaLQXqqKziBiYi0QU2eIO7YseOE23Jzc096vFlPv7tjm5tER0ezaNGiwPIvfvEL7rjjjkYniPn5Zfj9p97Hz+2OJTe39JSPP5keneLZvDOPnJySOvcfzpq7zFojlVnjqcwapzWXl8ViNMmXgaeiqKiI++67jxtuuIFBgwaxcOFCli5dyu9//3tiYtQy5Fimz4t330ZsXQdiWKx1tm3dW0i3jnE4HdYTHC0iIk2tyRPE0aNHYxhGvYkecNJkJzk5mXXr1gWWc3JySEpKCiwfOHCANWvWMHHiRKAmoWyNI6NmpLn439fZ5BRVkNwuKtjhiIhII9x5551ERESQmJgIwLhx41iwYAF33303jzzySJCjCy2+g1uhugL7GefUWX+40sueQ6WMGdI1OIGJiLRRTZ5Zbd269bSOHzJkCI8//jgFBQVERkayatUq/vjHPwa2R0RE8PDDDzNo0CBSU1NZvHgxw4cPP92wQ05GbT/EbfuKlCCKiISZTz/9lP/+97+BJqWpqan88Y9/ZOjQoUGOLPR4d68DmxNrp7oD+GzLLMI00fyHIiItrFmnuTgVycnJzJw5kylTpjBu3DhGjx5N3759mTp1Kps3byYhIYF7772XG264gcsuuwzTNLn22muDHXaT65AYRUyknW2ZRcEORUREGikiIoIDBw7UWZeTk0N0dHSQIgpNpunHu+dzbGl9MGyOOtu27C3EbrPQrWN8kKITEWmbQrJt5pgxYxgzZkyddcf2OxwxYgQjRoxo6bBalGEYZKS5+EYJoohI2Pnxj3/M1KlT+dnPfkZKSgrZ2dm8+OKLTJo0KdihhRR/9k7MiuLjRi+FmgSxR6d47LaQ+y5bRKRVC8kEUWpkpLnYsC2XgpJKEuIigh2OiIg00I033khiYiJvvfUWeXl5pKSk8OMf/xifzxfs0EKKZ896sFixde5XZ33J4WqycssYP7RbkCITEWm79LVcCDsyH+K2rKKgxiEiIo1jGAZXXXUVL774InPnziUlJYX58+fz8ssvBzu0kGGaJt7d67F2OhPDUbev/Tf7igD1PxQRCQbVIIawtKQYIhxWtmUWM/jMlGCHIyIiDeT1elm5ciUvvfQSX3zxBZdffjlPPvkkQ4YMafA5VqxYwZNPPonH4+HnP/85V199dWDbli1bmDVrVmC5oKCA+Ph43nzzTbKysrj99tspKysjLi6OP/3pT3Tq1KlJ768p+AuyMEtzsfUfddy2rXsLiXBY6dpBc3SKiLQ0JYghzGIx6JEar4FqRETCRG5uLn//+99ZunQpCQkJTJo0iT179jB79uzAlBcNkZ2dzbx583jttddwOBxMmjSJQYMG0aNHDwB69+7N8uXLAaioqODKK6/knnvuAWD+/PmMGjWKyZMn8+KLLzJv3jzmzp3b5Pd6urx71gMGti4Djtu2ZW8hGWkurBY1dBIRaWn6zRvieqa5OJBXTunh6mCHIiIiJ3HxxReTmZnJwoULeeONN5g8efIpzdW7Zs0aBg8ejMvlIioqihEjRrBy5cp6933qqac477zzGDhwIAB+v5+ysjKgJnmMiAjNPuzePeuxpqRjiao7SmlhaRWHCg7Tq7Oal4qIBINqEEPckfkQt2cVc06GO7jBiIjId7r88sv5z3/+Q0VFBRMnTmTYsGGndJ6cnBzc7qO/85OSkti0adNx+5WUlLB0
6VJWrFgRWHfTTTcxadIkXnzxRTweD0uWLGnUtRMTY04p5m9zu0/cPNRTeIjS/EwSLr0G17f2+7K2/+GQ/p2+8xytTVu616aiMms8lVnjtcUyU4IY4rqmxGG3WdiWWaQEUUQkxD388MOUlJSwbNkyHnnkEe655x5KS0vJzMxsVBNT0zSPW2cYxnHrVqxYwaWXXlrn3Lfffjv33nsvl156Ke+88w7Tp0/njTfeqPf4+uTnl+H3H3/9xnC7Y8nNLT3h9upN/wGgqv3Zx+336eaDREfYiHFYvvMcrcnJykuOpzJrPJVZ47XmMrNYjBN+IagmpiHObrPQrUOc5kMUEQkTcXFxTJkyhRUrVvDoo49y+eWX8/Of/5xx48bxzDPPNOgcycnJ5OXlBZZzcnJISko6br/33nuPkSNHBpYLCgrYtWsXl156KVAzb3Bubi6FhYWneVdNy7t7A5bENCxxx3/xuXVfIb06t8PSwIRWRESalhLEMJCR5mJfdikVVd5ghyIiIo1wzjnn8Kc//YmPP/6YiRMn1mkK+l2GDBnC2rVrKSgooKKiglWrVjF06NA6+5imyVdffcWAAUcHeWnXrh1Op5N169YBsH79eqKjo0lISGi6mzpN/sPF+LJ3YOt67nHbcosqyCuupJemtxARCRo1MQ0DGZ1dmGtg5/5izu7W8CZKIiISGmJiYvjpT3/KT3/60wbtn5yczMyZM5kyZQoej4eJEyfSt29fpk6dyowZM+jTpw8FBQXY7XacTmfgOMMwWLhwIX/84x+prKwkOjqaxx9/vLlu65R4934OmNjOOD5B3LK3pqZTCaKISPAoQQwDPTrGY7UYfJNZpARRRKSNGDNmDGPGjKmzbtGiRYH3iYmJ/Pe//z3uuL59+/LKK680e3ynyrtnPUZcEpZ2qcdt27q3kLhoBx0To4IQmYiIgJqYhgWnw0qXlFjNhygiImHNrD6Mb//X2Lqec9ygOaZpsmVvIb27tGvwgDoiItL0lCCGiYxUF7sPllDt8QU7FBERkVPi3fcF+H3Y6+l/eKjgMMXl1fRW81IRkaBSghgmMtJceH0muw+WBDsUERGRU+LdvR4jMh5Lcvfjtqn/oYhIaFCCGCbS0+IxQM1MRUQkLJnearyZm2ublx7/8WPL3kIS45y44yOCEJ2IiByhBDFMREfY6eSOUYIoIiJhyZf1FXir6h291G+abN1bSC/1PxQRCToliGEkIy2eHftL8Pr8wQ5FRESkUTx71oMjEmuHXsdty8opo7zSq/6HIiIhQAliGMlIc1Hl8bEvuyzYoYiIiDSY6ffh27sRW+f+GNbjZ9jaeqT/YWcliCIiwaYEMYxkpLkA9UMUEZHw4ju0DbOqrN7mpVDT/zA5IYqEOPU/FBEJNiWIYcQV4yS5XaQSRBERCSve3evBaseW2ue4bT6/n28yi+jd2dXygYmIyHGUIIaZjDQX27OK8JtmsEMRERE5KdM08e7ZgC31bAy787jtew6VUlnt0/QWIiIhQglimMlIc1Fe6eVAbnmwQxERETkpf+5uzPKCEzYvVf9DEZHQogQxzBzph/iNmpmKiEgY8O7ZAIYFW+f+9W7fureQVHc0cdGOlg1MRETqpQQxzLSPj6BdrJPtWUXBDkVEROSkvHvWY+3YCyMi5rhtHq+f7VnFal4qIhJClCCGGcMw6Jnm4pvMIkz1QxQRkRDmKzyAv+ggtq7n1Lt914Fiqr1+eqt5qYhIyFCCGIbS01wUl1WTU1QR7FBEREROyLtnPQC2riee3sIwoKdGMBURCRlKEMNQYD7EfUVBjUNEROS7ePdswOLuhiW6/hrCrXsL6ZIcS1SEvYUjExGRE1GCGIY6JkYRE2lnm/ohiohIiPKX5ePP3X3C0UurPD52Hiiht/ofioiEFCWIYcgwDDLSXGzTSKYiIhKivHs2AGA/QfPSHVnF+PymEkQRkRCjBDFMZaS5yC2qpKCkMtihiIiIHMe7ez2Wdh2xuFLq3b5lbyFW
i0GP1PgWjkxERL6LEsQwlZFW80BVM1MREQk1vsMl+A59c8LBaaAmQTyjYxwRDlsLRiYiIiejBDFMpSXFEOGwsi2zONihiIiI1HF4+zowzRMmiIcrvew5VKLpLUREQpC+tgtTVouFHqnxbFc/RBERCTHl33yCEZOIpX2XerdvyyrCNFH/QxFpEhUV5ZSVFeHzeZv0vDk5Fvx+f5Oes+UYOBwRtGvnxjCMRh2pBDGM9Uxz8eq/d1F6uJrYKEewwxEREcH0VFKx6wtsvb9/wg8lW/cWYrdZ6N4proWjE5HWpqKinNLSQlwuN3a7o9HJ0Hex2Sx4veGZIJqmn6KiPMrKiomNdTXqWDUxDWPpqS4AtmepmamIiIQGb+YmTJ/npP0Pe3SKx26ztmBkItIalZUV4XK5cTicTZochjvDsBAb246KirJGH6sEMYyd0SEOm9Wi6S5ERCRkeHdvwBIVhzUlo97tpYerycwpo5eal4pIE/D5vNjtaklXH6vVht/va/RxShDDmN1moXvHOCWIIiISEkyfF+++L4hOPw/DUv9HjG/2FQHqfygiTUc1h/U71XJRghjmMtJc7M0upaKqaTvlioiINJbpqQDTT0zfYSfcZ8u+QpwOK11TYlswMhERaSgliGEuo7ML04Sd+9UPUUREgssSEUvMz58gsvNZJ9xn695Ceqa5sFn1EURE2pbKykoKCvKDHcZJ6bdzmOveMQ6LYfCNmpmKiEgIMCwnHiC9sLSKg/mH6aX5D0WkDbrxxqls2fJ1o4+75ZYZLF/+WjNEVL+QTBBXrFjByJEjGT58OIsXLz7hfh9++CGXXHJJC0YWeiIcNrqkxKofoohIK/Ndz8ItW7YwduzYwJ/vfe97jB49GoCcnByuu+46xo0bx6RJk8jKygpG+PXauq8QUP9DEWmbiouLTum4Rx5ZwNix45s2mO8QcvMgZmdnM2/ePF577TUcDgeTJk1i0KBB9OjRo85+eXl5PPjgg0GKMrT0THPx3vpMPF6fhgwXEWkFTvYs7N27N8uXLwegoqKCK6+8knvuuQeA2267jREjRnDVVVfx97//nblz5/LYY48F6U7q2rK3kOgIG2lJMcEORURaqf9uPsjHmw42ybkMA0zzxNsv6tuBC/t0aNC5fve735KdfYi77prFDTf8hg8+eA+Px8OBA1k89dTfOHToIM8++xcyM/dRXe3h/PMHc+edfyAiIoLp06/j4ot/wIQJP2HixDGMHTuBf/3rDQoL8+nXbwB33nkvcXFNN69syNUgrlmzhsGDB+NyuYiKimLEiBGsXLnyuP3uvPNOpk+fHoQIQ09Gmguvz2TXgZJghyIiIk2goc9CgKeeeorzzjuPgQMHUlBQwNatW5k0aRIAEyZM4Oabb27ByL/b1r2F9OzcDotFIw6KSNvywANzSU5O4Y9//BPR0dFs3vwF06bdyJIly0lMbM/s2bdy9dXX8Oab7/HSS0vZuvVr3nuv/t/7H330IU8++Qwvv/wqmZn7WL781SaNNeRqEHNycnC73YHlpKQkNm3aVGefF154gTPPPJN+/fqd8nUSE0//20u3OzRGYBsc7WTBq5vYX1DBRed2DnY43ylUyiycqMwaT2XWOCqv0NOQZyFASUkJS5cuZcWKFQBkZmbSsWNH5syZwyeffELHjh256667GnXtpng+wvE/V4fyy8krrmTCJen6mauHyqTxVGaN1xrLLCfHgs12tM5r2IBODBvQKYgRfTer1cBiMWjfvj2DBw8GwOfz8fzzL5OamkZZWSmFhfm4XC7y8/Ow2SwYRs0xR+5z/PgJuN3tAbjgggvZvz+zThkcy2KxNPrfPeQSRLOeetxj5/DYtm0bq1at4m9/+xuHDh065evk55fh939HnfFJuN2x5OaWnvLxTS3VHc3nW7O5pH/HYIdyQqFWZuFAZdZ4KrPGac3lZbEYTZbstLSTPQuPWLFiBZdeeimJiYkAeL1evv76a37zm98we/ZsXnnlFWbN
msWLL77Y4Guf7vMR6v+5WvPFAQBSEyJb7c/cqWrN/w+bi8qs8Vprmfn9frxef7Oc22azNPm5fT4Tv9+kXbvEY85t8J///JslS14GoEePdCoqKvB6fXi9fkyz5pgj+8fGugLvLRYrPt+Jy8Dv99f77/5dz8iQa2KanJxMXl5eYDknJ4ekpKTA8sqVK8nNzWXChAlcd9115OTkMHny5GCEGlIy0lzs2F+Cz988/0FERKTlnOxZeMR7773HyJEjA8tut5vo6GguvvhiAEaPHl1vzWMwbNlXSFyUnY7to4MdiohI0B37pd/mzV/w178u4rHH/syrr77Jgw/OIzGxfdBiC7kEcciQIaxdu5aCggIqKipYtWoVQ4cODWyfMWMG77zzDsuXL+fpp58mKSmJl19+OYgRh4aMNBdVHh/7ssuCHYqIiJymkz0LoaaW8auvvmLAgAGBdZ07dyY5OZl///vfAHzwwQecddaJ5yRsKaZpsmVvIb26tKu3JlREpC2w2+2Ul5cft768vByr1YLT6cTn8/H222/yxRef4/V6gxBlCCaIycnJzJw5kylTpjBu3DhGjx5N3759mTp1Kps3bw52eCErI80FwDf7ioIah4iInL6GPAsLCgqw2+04nc46xy5cuJBnnnmG0aNH88ILLzBnzpxg3EIdhwoOU1xWrektRKRNu/zy0Tz00H3k5GTXWX/++YO5+OJLmTJlEldc8UPeffcdLr98NHv37glKnIZZX0eHNqC19UEEmPXUWjomRjNjYt9gh1KvUCyzUKcyazyVWeO05vIK5z6IwdQcfRA/2JDFi6u28adpg0lqF3W6IbY6rfn/YXNRmTVeay2zQ4f2kpLSpVnO3Rx9EFvaiconrPogyqnLSHOxPasIf9vM+UVEJERt2VtIQpwTtysy2KGIiMhJKEFsRXp1dlFe6eXev33Giv/uJiu3rN6R8ERERFqK3zTZuq+I3p3V/1BEJByE3DQXcuoGn5lC6WEP67bm8PpHu3n9o90ktYvknAw352S46dYxDoseziIi0oKycsooq/DQS/0PRUTCghLEVsRiMRhxfmdGnN+ZwtIqNu7IY8O2XN79LJOVn+wjPsbBgHQ356S3p1eXdtisqkAWEZHmtbV28DQNUCMiEh6UILZS7WKdXDygExcP6MThSg9f7Mxnw7Zc1nx5kA8/30+k00a/7omck+Hm7G4JRDj0oyAiIk1v695CkttFkhAXEexQRESkAZQVtAFREXYuOCuFC85Kodrj4+s9hWzYlsvGHXn87+tsbFYLZ3VtxzkZbvqntyc2yhHskEVEpBXw+f18k1nI+b2Tgx2KiIg0kBLENsZht9I/vT3909vj8/vZnlnMhu25fL4tly925mOshIxUF+dkuBmQ0Z728RpxTkRETs3eQ2VUVPnUvFREJIwoQTwFvoIsSvbvx2dLxNKuI4YjPOd0slos9OrSjl5d2nHVD9LZl13G+m01yeLfV2/n76u30zk5JjDITaf20RqBTkREGmzL3gIAenZWgigiEi6UIJ4Cz7aPydu0MrBsRLfD4uqIpV0nLAmdsLo61iSOzuggRtk4hmHQJSWWLimxjB/ajeyCw2zYnsuGbbks+2g3y46MiJpeOyJqJ42IKiIi323rviI6uaOJj1bXBRGRxnrrrRW8+upSnn32xRa9rhLEU+Ac9GNSLhpD3s5t+AoP4C/cj79wP54tH4KvOrCfEeWqSRrbdcLSriPW2tdwSByTE6K4fFAXLh/UhaKyKj7fXjsi6rpMVn66j/hoBwPS23NOhlsjooqIyHG8Pj/bM4sY2q9jsEMREZFGUIJ4CgzDgr1dCrYu0di6DAisN00/Zmke/sID+GqTRn/hATxbPwRvfYljTa1jqCeOrpi6I6Juqh0Rde1X2Xy48QCRTit9u9cki300IqqIiAC7DpRQ7fVr/kMREeAPf7iT9u3d3HjjTQAcPnyYK674IY8+upDXX/8nmzd/QWFhAampadxyyyz69u0f
tFj1Sb4JGYYFIy4JS1wSti79A+trEsd8/IX7axPHmlpHz9Z/f2fiWJM8hlbiGBVhZ/BZKQw+KwWP18dXR0ZE3Z7HJ98aEbVfenviNCKqiEibtGVvIQbQs7Mr2KGISBvi2fZfPN/8p0nOZRgGpmmecLu951DsGRc26FwjRozk4Yfn8Otfz8AwDD766EO6du3Gv/71BgCLF7+CxWJl/vxH+MtfFvLnPz/TBHdwapQgtoCaxNGNJc79HYnjMU1VwyRxtNus9O/Rnv49akZE3ZFVXDvITV5gRNT02hFRz0lvj9sdG9R4RUSk5WzZW0jnlFiiI+zBDkVEJOjOO28QXq+XzZu/oG/f/rz77kpGjBjJJZdcSkREBFarjYMHDxAbG0tubm5QY1WCGETNkThaXCkYjmgMS8v2CbRaLPTs3I6enY+OiLphWy4btufyj9Xb+cfq7STEOWkX4yQxPoLEuAgS4yNof8x7NU0VEWkdKqu97DpQzKUD04Idioi0MfaMCxtcq3cyNpsFr9ffJOeyWq0MH34Zq1evonPnrnz++Xpmz76HnJwc5s+fy549u+nSpQuxsfGYZtNc81TpE3kIOmniWLQfX0Ft4lh04LjEEQCbA8MeCY4IDHskhiMSwx4B9oij72tfDUdkzfra/ersY3diGI1LNo8dEfVHQ7uRXXiYjdvzKCirJiu7lD0HS1n/TS4+f90q++gIWyB5bB8fecz7mgQyOsKmaTZERMLA1j0FeH2m5j8UETnGiBEj+e1vZ3DGGd0499zzaNcugRtu+BVjx47niScWYRgGb7/9Jrt27QhqnEoQw0idxLFz/8B60/RjltXUOPqLszGrKzA9lVBdcfS9pxJ/WR5mde16TwX4fQ27sD3imEQyEqM26axJJCPqJqJHEs8jyac9Erczgh8OSMad0o68/MMA+E2T4rJq8ksqyS+uJK+4gvySKvKLK8kurODrPYVUeerG57RbA8nikVrHY1/jYxyaekNEJARs2pGH1WKQnhof7FBEREJGenoGLlc7XnjhuWMGqyknMjICwzDYs2c3L7/8Al6vN6hxKkFsBQzDghHrxhLrbtRxps9TTyJZgVldeXS9pxKzugI8lZieo/uZh0tqlmv3owFV4WU10YLFAhYrNsNKisVKisUCFltgvRFvBZcVPxa8pkG1Dzx+qPYZVHlNKougMheqfVCOhVIMdpoWTMOC3WHH6bDjjHAQ4XQQEeEgKtJJVKSTyEgnFqsVLDV/jNpXjq0hPWFH5G+tP9F+TXx8aXYsnsN+DLuzplbY5sCwHXlf84rVrppVEQkpm7bncUaHOHUdEBH5lssuG8Xf/raIiy4aCsBtt93BggWP8uc/P47b7WbUqCt4+uk/U1xcFLQYDfO7huZpxfLzy/D7T/3W3e5YcnNLmzCi8GWaJvg8xyWS1CaXNUlmJdERBuVlNTWXpt9XU4Pp94HfX7NsHrvOV9P++thlv68mEfV7we/H7/Pi8/kwvV78x57P9GMxfVgwsRht4cfbCCSPNU2LnWBz1i47j66vs+ysk3QeTTidtUlo3WPDKQnV/83Gac3lZbEYJCbGBDuMsHO6z8eKKi+/mf8RIwd3YfzQbk0YWevVmv8fNheVWeO11jI7dGgvKSldmuXcTdkHMVhOVD7f9YzUV3ty2gzjmASFEzcnaueOxduCv5g8Xh8FxRXkFR+msOgwBcWHKSo5TFFJBcWlFZSVV9Ykk9T9j29SfyJ07HrDAKvVgtVSM0CPxWJgtViwWWteLVYDm8WC1WJgsda8Ht1uYLUaNccblqPvLTXbbFYLFqsFmwUSYh34KitwGF6cFh8OvDgMLza82PFgNb0YvmpMTxV4qzG91eCtOvpaWYrprcb0HrPdU8VxNZonYxh1k02LrWYdBjXF8q339a47Un7HrzPqbOdb+xrHrDPqbq9n3SGnnerqY5onH3uuOsvfOv6461LPNiOw+eg2o/awb2+rZ996k2yjnrf1xXSC4060/STX
OpLw50c5qDr8rT7Mxx3bgC8HGvIFQmPOaxjYuw/G4ko5+XklpGzLLMLvV/9DEZFwpQRRWi27zUpyYgzJJ/h2xOf3U1RaTVF5FT6fidfnx+evfa1d9vpMvP5jl4/sY+Krs732fe12n8/E4/NT4fPj9Zv4qo/Z7jPx+f14fd6ac/qPnqvhDMABOLDbYnDarUQ4rDjtVpxHXo+si/zWst1ChM0kwuIn0urDYXiJsPhwWHzY8db+8YDPA56q45NObzX4vIB5tElsoCHC0XU1jRNOvP2k66CmxtisWW/WOZdZu8vx67xVFvxe3zH7cPz+9Rx/tPTrO3d9y9+x7dvrv31cPevM+pL2OseY9bxtSFNls87Ltxc83772caesL66TrTiVY45nRMTiUIIYdrbsLcRus9CjU1ywQxERkVOgBFHaLKvFUjPATXxEsEMBahKqI8mlz+/H4zOJiY3g4KESqjw+Kj0+qqp9VNW+Vlb7qK5vfe1rWYXn6HLtusZw2Cw4HZE47TE4HVYijkk+7TYLZiCPqk0Ia/8yOZIcHrmvo/cX2K92vXlM8nLkXCZ8Kwcz657n2P0COx09l8Nuw+fzYzFqmk9YDONbr0fXG4ZR0/X12H2OvD9mvWEY9Z/vyLpj1hsG9V+z9noYYDFqavAMOLqu9r1hHK3ds9QuYxx9bwRqLI/dXnON49Zx7LFHznnMNQ0Dd/sY8vPLv3W/dY8JBtM0w6ZZs9S1dW8hvbsmYLdZgx2KiIicAiWIIiHCMAxsVoOaz1Q1H6zcidFY/U3T9t1vmng8/lNKNmuWvVR5/JSUe/D4/LXJzdHY4VstTI/8Xafl6dFml0ePP5rw1NnvmHVG7YmOvDcsR85dt9kv1PQX8NfW5Hp8fvz+mns3/Sb+2iTcbxJY9psm/tp1fr+JaR6zn78mUfH7663fa/UMOJogW6hNlI9PlgNJsWFgHJOEG9SXgNeep3Y/45jE3DjmvFaLwSXnptKjk0bBDCdlFR725ZQx9JzUYIciIiKnSAmiSBthMYyaGkCHldbc8Ku5OuGbgWSSQFJZs47aZNIMJJN+82jCeXTfmlpOs7ZFrWl+e90xr3CCbcduP3J8fcc2fP+oaCelpZV14jw2aTbrXX98Qh1Ipk9wnm+XmcfnP5qkH0niv3XegpJKUIIYVuxWC2d1bcfQAak0ZHRrEZGmoFYn9TvVsUiVIIqINIBhGFgNA6vl5PuGk9Y6qp0Eh9Nh5ZZJA3C3j9bPlYi0CKvVhsdTjcPhDHYoIcfn82KxNL65fyv7qCMiIiIiIm1FTIyLoqJcqqurTrnGrDUyTT+lpYVERjZ+uifVIIqIiIiISFiKjIwGoLg4D5/P26Tntlgs+JtoLIiWZ+BwRBAT0/iuGkoQRUREREQkbEVGRgcSxabUVrthqImpiIiIiIiIAEoQRUREREREpJYSRBEREREREQHacB9Ei+X050ppinO0NSqzxlOZNZ7KrHFaa3m11vtqbk1Vbir/xlF5NZ7KrPFUZo3XWsvsu+7LMDUerIiIiIiIiKAmpiIiIiIiIlJLCaKIiIiIiIgAShBFRERERESklhJEERERERERAZQgioiIiIiISC0liCIiIiIiIgIoQRQREREREZFaShBFREREREQEUIIoIiIiIiIitZQgNtKKFSsYOXIkw4cPZ/HixcEOJywsXLiQUaNGMWrUKB566KFghxNWHnzwQWbNmhXsMMLC+++/z/jx47nsssu47777gh1OWFi+fHng/+aDDz4Y7HCkFdAzsvH0jDw1ej42nJ6PjdfWn49KEBshOzubefPm8fLLL7N8+XKWLFnCjh07gh1WSFuzZg0ff/wxr7/+OsuWLeOrr77i3XffDXZYYWHt2rW8/vrrwQ4jLGRmZnL33Xfz5z//mRUrVvD111/z73//O9hhhbSKigruv/9+XnzxRZYvX866detYs2ZNsMOSMKZnZOPpGXlq9HxsOD0fG0/PRyWIjbJmzRoGDx6My+UiKiqK
ESNGsHLlymCHFdLcbjezZs3C4XBgt9vp3r07Bw4cCHZYIa+oqIh58+Zx/fXXBzuUsPDuu+8ycuRIUlJSsNvtzJs3j379+gU7rJDm8/nw+/1UVFTg9Xrxer04nc5ghyVhTM/IxtMzsvH0fGwcPR8bT89HJYiNkpOTg9vtDiwnJSWRnZ0dxIhCX3p6Ov379wdgz549vPXWWwwbNiy4QYWB3//+98ycOZO4uLhghxIW9u7di8/n45e//CVXXHEFL7/8MvHx8cEOK6TFxMRw0003cfnllzN06FA6derEOeecE+ywJIzpGdl4ekY2np6PjaPnY+Pp+agEsVFM0zxunWEYQYgk/Gzfvp1f/OIX3H777XTt2jXY4YS0V155hQ4dOnDBBRcEO5Sw4fP5WLt2LQ8//DBLly5l8+bNan50Elu3buXVV1/lgw8+4OOPP8ZisfDss88GOywJY3pGnjo9IxtGz8fG0/Ox8fR8VILYKMnJyeTl5QWWc3JySEpKCmJE4WH9+vX8/Oc/55ZbbuFHP/pRsMMJeW+99Rb//e9/GTt2LAsWLOD9999nzpw5wQ4rpLVv354LLriAhIQEIiIi+MEPfsCmTZuCHVZI+/jjj7ngggtITEzE4XAwfvx4Pv3002CHJWFMz8hTo2dkw+n52Hh6Pjaeno9KEBtlyJAhrF27loKCAioqKli1ahVDhw4Ndlgh7eDBg9x4443MnTuXUaNGBTucsPDcc8/x5ptvsnz5cmbMmMEll1zCHXfcEeywQtrFF1/Mxx9/TElJCT6fj48++oizzjor2GGFtF69erFmzRoOHz6MaZq8//779OnTJ9hhSRjTM7Lx9IxsHD0fG0/Px8bT8xFswQ4gnCQnJzNz5kymTJmCx+Nh4sSJ9O3bN9hhhbRnn32Wqqoq/vSnPwXWTZo0iauuuiqIUUlr069fP371q18xefJkPB4PF154IRMmTAh2WCHtoosu4uuvv2b8+PHY7Xb69OnDddddF+ywJIzpGdl4ekZKc9PzsfH0fATDrK/TgIiIiIiIiLQ5amIqIiIiIiIigBJEERERERERqaUEUURERERERAAliCIiIiIiIlJLCaKIiIiIiIgAShBF5DtkZWXRs2dPysvLgx2KiIhIyNDzUVozJYgiIiIiIiICKEEUCbqsrCwGDhzI008/zYUXXsgFF1zAnDlzTrj/Z599xoQJExg4cCBXXnklmzZtCmzr2bMnTz/9NEOGDGHQoEE8+uij+P1+APLy8rjlllsYNGgQw4YN46GHHqK6uhqAqqoq7rvvPgYPHsygQYP43e9+R1VVVeC8zz//PD/4wQ8499xz60zovGLFCn74wx9y3nnnMWHCBD7++OOmLh4REWmj9HwUCQ4liCIhoLS0lKysLD744AOefPJJXn75ZT7//PPj9jtw4ADTpk3jhhtu4H//+x+/+MUvmDp1KkVFRYF9PvzwQ958801eeeUV3nzzTZYsWQLA9OnTAVi9ejVLly7l008/ZcGCBQA8/vjjbNy4keXLl7N69Wr279/PE088EThnTk4Ob7/9Ni+99BIvvfQS69evp6Kigt/97nc8+uijfPbZZ0yePJm77roL0zSbsaRERKQt0fNRpOUpQRQJEVOnTsXhcNC/f3+6devG3r17j9vnzTffZNCgQVx66aXYbDYuv/xyMjIyeOeddwL73HLLLSQkJNC5c2emTJnCv/71L/bt28fnn3/O7NmziYmJITk5mZtuuonXX38dgH/9619cf/31JCcnExMTw0MPPcTEiRMD55w2bRoOh4PevXtzxhlnkJWVBYDT6WTp0qV8/vnnjB07lvfffx/DMJq5pEREpC3R81GkZSlBFAkRCQkJgfc2my3Q9OVYBw4c4KOPPmLgwIGBP5s3b+bgwYOBfbp06RJ4n5KSQm5uLvn5+URFRdW5RseOHcnLy8Pj8ZCXl0dKSkqd4zp37hxYjouLC7y32+34fD4iIyN54YUXKCgo4Fe/+hUXXnghixYt
Ov2CEBEROYaejyItyxbsAESk4dxuNyNHjuShhx4KrMvMzKRdu3aB5ZycHNq3bw/UPDA7dOhAx44dOXz4MAUFBYGHYFZWFi6XC7vdTnJyMtnZ2Zx99tkAbN68mY0bN3LxxRefMJaysjLKy8tZuHAhXq+XNWvWcOONN3L++efTv3//Zrh7ERGR+un5KNJ0VIMoEkZGjRrFBx98wNq1azFNk/Xr13PFFVewefPmwD4LFiygrKyM3bt38+KLLzJu3DiSk5O54IILuP/++ykvLyc7O5sFCxYwZswYAMaMGcPTTz9NXl4epaWlPPLII+Tl5X1nLIcPH+ZXv/oVH330ETabjaSkJAzDID4+vlnLQERE5Nv0fBRpOqpBFAkjXbt25bHHHuPhhx9mz549JCQk8Lvf/Y4LLrggsE9qaiqjRo3C5/NxzTXXMG7cOADmzp3L/fffzw9+8AMArrjiCm655RYAbrjhBioqKhg3bhxer5fLLruMG2+8kZycnBPGkpSUxEMPPcScOXM4dOgQ7dq14/e//z1nnHFG8xWAiIhIPfR8FGk6hqkhlURajZ49e7JixQoyMjKCHYqIiEjI0PNRpOHUxFREREREREQAJYgiIiIiIiJSS01MRUREREREBFANooiIiIiIiNRSgigiIiIiIiKAEkQRERERERGppQRRREREREREALAFO4BgKSwsx+8/9fF5EhNjyM8va8KIWj+VWeOpzBpPZdY4rbm8LBaDdu2igx2GiIhIWGmzCaLfb55WgnjkHNI4KrPGU5k1nsqscVReIiIicoSamIqIiIiIiAigBFFERERERERqtUiCuGLFCkaOHMnw4cNZvHjxcdu3bNnChAkTGDFiBLNnz8br9QKwbt06xo8fz5gxY7j++uspLi4GoKSkhOuuu47LL7+cq6++mtzc3Ja4DRERERERkVbNME2zWTufZGdnc9VVV/Haa6/hcDiYNGkSjz76KD169AjsM3r0aO677z769+/PHXfcwdlnn83kyZMZPnw4Tz75JD169GDu3LlYLBb+7//+j3vvvZeUlBSuu+46li1bxocffshjjz3WqLjy88tOq9+N2x1Lbm7pKR/fFqnMGk9l1ngNLTPTNCkszKW6uhJou33wLBYLfr8/2GGcMqvVRkyMi8jI4wejsVgMEhNjghCViIhI+Gr2QWrWrFnD4MGDcblcAIwYMYKVK1cyffp0APbv309lZSX9+/cHYPz48SxYsIDJkyfz1ltvYbfb8Xg8ZGdn07NnTwA+/PDDQE3k6NGjuffee/F4PNjt9ua+HQB8fj8H88opKDzcItdrLRISNJqghI6ysmIMwyA5ORXDaLut7W02C15veCaIpmni8VRTVFTTiqS+JFFEREQap9kTxJycHNxud2A5KSmJTZs2nXC72+0mOzsbALvdzjfffMO1116LzWbj//7v/447xmazERMTQ0FBAcnJyc19OwAsXrWNDzceaJFrtSajLzyD8d87I9hhiABQUVFGQkJym04Ow51hGDgcTlwuN8XFeUoQRUREmkCzJ4j1tWA1DKPB23v27MmaNWv4xz/+wcyZM/nHP/5R73UslsZ9yDudZkdTRp/NgN4pp3x8W/TWmt18vi2HaeP7BjuUsON2xwY7hLDTkDLLyTFxOh11ft+0VTZbeCfJVmsERUV+/V8RERFpAs2eICYnJ7Nu3brAck5ODklJSXW25+XlBZZzc3NJSkqiqqqKjz76iEsvvRSAK664ggcffBCoqYXMy8sjJSUFr9dLWVlZoAlrQ51uH8RLBqapb1gjZB1K4JUPdrJjTz7x0Y5ghxM21Aex8RpaZn6/H5/PpC33P4TwbmJ6LL/ff9y/u/ogioiINF6zf208ZMgQ1q5dS0FBARUVFaxatYqhQ4cGtnfq1Amn08n69esBWLZsGUOHDsVms/GHP/yBL7/8EoC3336bc845B4Bhw4axbNkyAN566y0GDhzYYv0P5dRkpLoA2J5ZFNQ4RERERETkxFqk
BnHmzJlMmTIFj8fDxIkT6du3L1OnTmXGjBn06dOHuXPncuedd1JeXs6ZZ57JlClTsFqtzJs3j9///vf4fD6Sk5O5//77AbjpppuYNWsWo0aNIjY2lrlz5zb3bchp6pISi8NuZVtWEQN7JZ38ABFpkFtumcHQod9n7Njxx21buPAxiouLmD37npYPTERERMJSs09zEao0zUXLm/fKJkrKqrj72vOCHUrY0M9Z4zW0zA4d2ktKSpcWiCh4GpIgtpYmpvX9e6qJqYiISOOF98gEElbO7JbAvpxSKqq8wQ5FJCz84hdXs2rVSgAqKir4/vcHs2zZPwHweDz88IfDmDBhNK++ugSAgwcPcNNNNzB8+Pe44YZfkJOTXed8r7/+TyZN+hEjR/6A3/3ut+Tn5yEiIiJyrGZvYipyxFlnJGKasPNAMWefkRjscETq+O/mg3y86WCLXOuivh24sE+Hk+53wQUXsW7dJ/zwh5fxxRefY7Va+fzz9YwbN5FNmzaSnJxMfLwrsP9dd83irLPO5uGH5/PNN1v4v//7Dd///iUAvP/+e7z44nPMnbuATp1SefrpP3P33Xfwl78801y3KSIiImFINYjSYnp1TcBiGGzLLA52KCJh4YILLmL9+s8A2LDhM0aPHsvGjRsAWLv2vwwZ8r3Avvv3Z7F169dcd92vcTgc9OnTj0sv/WFg+5tvLucnP5lMt27dcTqdXH/9dL7++kv27dvbsjclIiIiIU01iNJiIp02OifHaCRTCUkX9mlYrV5LOvPMs6iqqmLfvr2sW/cZd9xxNx9++D579+7hf/9bw2233cFXX20GoKAgn8jIKKKjj/a5S0npQFZWJgA5OYdYtOhJnntu0TFXMDh48CAdO6a15G2JiIhICFOCKC0qPdXFhxv34/X5sVlVgS3yXSwWC4MHD+HDD1eTm5tD9+49OOecgbz99psUFRVw9tl9A/u2b++mouIwxcVFgWanubm5ge2Jie2ZNOmnjB49NrBuz57ddOnSucXuR0REREKfPqFLi8pIi8fj9bPnkEbmFGmIIUO+x5Ili+nXrz+GYXDuuQP55z//waBBQ7BYjv4K79ChI3379mfhwseoqqpky5avePfdtwPbL7tsFP/4x2KysjLx+/3885//YNq0n1NRURGM2xIREZEQpRpEaVHpqS4AtmcW0aNTfHCDEQkD558/mPLycgYMOBeAc845j8rKyjr9D4+4994/8ac/3cvo0cPp2DGVoUMvDmy77LJRlJaW8NvfzqCgoIAuXbrw0EPziYuLaxXTXIiIiEjT0DyIp0jz0zXekTK74+n/kdwukpuu7BfskEKefs4aT/MgNo7mQRQREZFjqYmptLj01Hh27C/G3za/mxARERERCVlKEKXFZaS5KK/0ciCvPNihiIiIiIjIMZQgSotLT3MBaLoLEREREZEQowRRWpw7PgJXjINtWcXBDkVERERERI6hBFFanGEYpKe62J5VFOxQRERERETkGEoQJSgy0lwUlFSRV6w52EREREREQoUSRAmK9NSaORC3Z6qZqYiIiIhIqFCCKEGR6o4h0mljm5qZioiIiIiEDCWIEhQWi0F6ajzbNJKpSMjxer3k5GQHOwwREREJAiWIEjTpqfEczD9M6eHqYIciElZ++cuf8dZbK5rt/PfccwcfffRhg/a96KKB7Nq1o9HXePbZp7jzztsafZyIiIg0LyWIEjTpqS4Admi6C5GQUlRUFOwQREREJEhswQ5A2q4zOsRhs1rYllXEgAx3sMMRCVmfffYJ8+c/Qnb2QS6++FI8nppa96qqSp588nE+/PB9TNNk+PDLmDbtRux2OwCvv/5PlixZTElJCf36DeC3v51FYmJ7NmxYx6OPPkT//uewatVbxMe3Y9q0X3PppSOYP/8RNm3ayFdfbebAgQP85jczeeWVf7BixetkZx/C4XAybtwEfvnLaYH43n33Hd59dybl5eVMmPBjrr12KlarlenTr+Pii3/AhAk/
AeDVV5fwwQerWbjw6Tr3V1VVyeOPP8Znn/2P/Pw82rd38+tf38TQod9nw4Z1PPLIn+jQoSNfffUl99//EOecM7CFSl5ERKTtUYIoQWO3WejWIZZtGslUQoBn23/xfPOfFrmWvedQ7BkXNmjfgoJ87rjjVm677Q4uvvhSli9/LdC8dOHC+WRl7eP55/+O329y112388ILf+WXv5zG+++/x4svPsfcuQvo1CmVp5/+M3fffUcgOduzZxfnnTeIlSvfZ926dcya9X9069aDm266he3bvwkkdl988TkvvPBX/vznZ0hL68wXX3zO9OnXMWLESFJT0wD48stN/PWvL1FaWsrNN99IUlIyV1zxowaXx9///hJ79+7m2WdfIjIyksWLn+exxx5m6NDvA7B37x4mT57Cffc9hM2mx5aIiEhzapEmpitWrGDkyJEMHz6cxYsXH7d9y5YtTJgwgREjRjB79my8Xi8A69evZ8KECYwdO5ZrrrmG/fv3A/DZZ58xaNAgxo4dy9ixY/nd737XErchzSA9zcW+7FKqqn3BDkUkJK1Z8zFpaWkMH34ZNpuNCRN+TGpqGqZp8tZbb3DDDb8hPt5Fu3bt+OUvp/HGG68D8Oaby/nJTybTrVt3nE4n118/na+//pJ9+/YCEBkZxfXXT8fhcHD++YMZNOgCPvjgveOu37Nnb5599kXS0jpTUJCPx+PB6XSSl5cb2GfatBuJi4unU6dUrrzyJ6xevapR9zh+/JXcd9+DREZGkpOTTVRUFLm5OYHtFouF4cMvIyIiQgmiiIhIM2v2J212djbz5s3jtddew+FwMGnSJAYNGkSPHj0C+9x6663cd9999O/fnzvuuIOlS5cyefJkbr31Vv785z/Tq1cv/vnPf3Lffffx5JNPsnnzZn7xi18wbdq077iyhIP0VBf/WruXXQeK6d01IdjhSBtmz7iwwbV6LamgIJ/27ZPqrEtJ6UBRUSFVVVX85jfTMAwDANM08Xi8VFVVkZNziEWLnuS55xYdc6RBdvZBrFYbSUlJOJ3OwBa3O4n8/Lzjrm8YBn/72zP8+9/v065dAj179gbA7/fXiedk5/kuZWVlPPLIg3z99Zd06pRKx46dME0zsD0mJhaHw9Goc4qIiMipafYEcc2aNQwePBiXywXAiBEjWLlyJdOnTwdg//79VFZW0r9/fwDGjx/PggULmDhxIjfddBO9evUCoGfPnrz00ksAbN68mfz8fN5++21SUlK4++676dChw3HXltDXo1M8BrAtSwmiSH3at3eTnX2wzrq8vFzi4+Ox2+389a+L6dQpFYCKigoKCvJxOp0kJrZn0qSfMnr02MBxe/bsplOnVDZv/oKCggJ8Ph82W01DkkOHDnHmmWcdd/0lSxaze/dOlixZTkxMDF6vl/fff7fOPvn5+bRv7w6cJzm55vex1WrF4/EE9isurr85+cMPz6Fr1248+OCj2Gw2Nm7cUOcatfmviIiItIBmb2Kak5OD2310AJKkpCSys7NPuN3tdpOdnY3D4WDs2JoPNn6/n4ULF3LppZcCEBsby5QpU1i2bBnDhg1j5syZzX0b0kyiImykJcVoPkSRExgy5HtkZ2ezbNmreL1eVqxYxp49u7FYrAwffhl/+ctCSktLqaio4OGH53D//fcAcNllo/jHPxaTlZWJ3+/nn//8B9Om/ZyKigoASktLeOmlv+H1eli79mM2bPiMSy8dAYDD4aC8vByA8vJybDY7druNw4cPs3DhY3g8Hnw+byDGZ555ktLSUvbt28Mrr/ydUaOuACAtrTOffLKWqqoq9u/PYtWqt+u9x/LycpxOJ1arlezsQzzzzF8AAt0NREREpOU0ew3isc2EjjCO+Tr4ZNurq6uZNWsWXq830KT03nvvDWy/6qqreOSRRygtLSU2NrbBcSUmxjR43xNxuxt+PalRX5n1TXfz3mf7aJcQjc2qmVe+TT9njdeQMsvJsQRqz0JZ+/YJPPLIfObO/RML
F87jvPMG0a9ffywWg1tuuY0nnljAlCk/prKykn79BnD//Q9is1kYPXoM5eWl/Pa3MygoKKBr16488sgCEhJcWK0WYmNjycvLZeTIH5KQkMCcOQ/RtWsXAEaMuJxHHnmQ7OyDTJt2I3ffPZsxY35IZGQU3/veUPr27c++fXu54IIhAJx55plcddWPcDicTJp0NT/84Q8BuOaaa7nvvnu44ooRdOqUysiRY/jss0+w2SxYLAaGYWCzWZg58xb+9Kf7ee21pbhc7fjRjybwzTdbyMzcg9VqAYyT/ltZLBb9XxEREWkChllfhtaEXn/9ddatW8f9998PwBNPPIFpmnWamP785z/n3XdrmhOtW7eOBQsW8MILL1BeXs4NN9yAy+Vi7ty5OBwO/H4/Tz31FNdddx1WqxWAgQMH8tFHHxEZGdnguPLzy/D7T/3W3e5YcnNLT/n4tuhEZfbplmz+svwr7rpmIGd0iAtCZKFLP2eN19AyO3RoLykpXVogotCzYcM67rrrdv71r9XYbBa8Xv/JDwpx9f17WixGk3wZKCIi0pY0+9fnQ4YMYe3atRQUFFBRUcGqVasYOnRoYHunTp1wOp2sX78egGXLlgW233rrrXTp0oX58+cHBiiwWCy8++67vPPOO4H9+/Xr16jkUEJLRpoLQM1MRURERESCrNmbmCYnJzNz5kymTJmCx+Nh4sSJ9O3bl6lTpzJjxgz69OnD3LlzufPOOykvL+fMM89kypQpfP3116xevZoePXowbtw4oKb/4qJFi3jwwQe56667eOKJJ0hISOChhx5q7tuQZuSKcZLkimRbZhEjzu8c7HBERERERNqsZm9iGqrUxLTlfVeZPfuvr/liRz7zZ1xUpw9qW6efs8ZTE9PGURNTEREROVboj9AgbUJ6qouyCg+HCg4HOxQRERERkTZLCaKEBPVDlGBoow0oWh39O4qIiDQdJYgSEpLbRRIXZWdbZv0TaYs0NYvFWmcuPwlfHk81Vmuzd6kXERFpE5QgSkgwDIP0NBfbs4qCHYq0EZGRMZSWFmGa4d//rq0yTZPq6iqKinKJiXEFOxwREZFWQV+5SshIT3Wx/ptcCkoqSYiLCHY40srFxMRTWJhLdnYW0HabKFosFvz+8E2SrVYbsbHtiIyMDnYoIiIirYISRAkZGWnxAGzPKmbQmUoQpXkZhkFCQlKwwwg6jZQrIiIix1ITUwkZaUkxOB1WtqmZqYiIiIhIUChBlJBhtVjo0Sme7RrJVEREREQkKJQgSkhJT41nf2455ZWeYIciIiIiItLmKEGUkJKR6sIEdmRpugsRERERkZamBFFCSreOcVgthvohioiIiIgEgRJECSkOu5WuHWLZnqkaRBERERGRlqYEUUJORqqL3QdLqPb4gh2KiIiIiEibogRRQk56qguf32T3wZJghyIiIiIi0qYoQZSQ0yM1HoBtGqhGRERERKRFKUGUkBMTaaeTO1rzIYqIiIiItDAliBKSMlJd7NhfjN9vBjsUEREREZE2QwmihKT01Hgqq31k5pQFOxQRERERkTZDCaKEpIw0F4DmQxQRERERaUFKECUkJcRFkBgXoX6IIiIiIiItqFEJYnV1NXv37sU0Tfx+f3PFJAJARlo827KKMU31QxQRERERaQkNShDLy8uZNWsW/fv3Z+zYsezZs4cRI0awa9euBl1kxYoVjBw5kuHDh7N48eLjtm/ZsoUJEyYwYsQIZs+ejdfrBWD9+vVMmDCBsWPHcs0117B//34ASkpKuO6667j88su5+uqryc3Nbej9ShhJT3VRUl5NTlFFsEMREREREWkTGpQgzpkzB4/Hw7vvvovdbqdz58788Ic/5A9/+MNJj83OzmbevHm8/PLLLF++nCVLlrBjx446+9x6663cddddvPPOO5imydKlSwPr77//fpYvX86YMWO47777AHjssccYOHAgb7/9NldeeSX3339/Y+9bwkD6kX6IamYqIiIiItIiGpQgfvjh
h/zxj3+kU6dOGIaB1Wrl5ptv5uuvvz7psWvWrGHw4MG4XC6ioqIYMWIEK1euDGzfv38/lZWV9O/fH4Dx48ezcuVKqquruemmm+jVqxcAPXv25ODBg4F4xowZA8Do0aP5z3/+g8fjadSNS+jrmBhFTKSd7ZnFwQ5FRERERKRNaFCC6HQ6KS0trbOuqKiI2NjYkx6bk5OD2+0OLCclJZGdnX3C7W63m+zsbBwOB2PHjgXA7/ezcOFCLr300uOOsdlsxMTEUFBQ0JBbkTBiGAbpqfEayVREREREpIXYGrLT+PHjuf7667nxxhvx+Xx88sknLFy4kCuuuOKkx9Y3wIhhGA3eXl1dzaxZs/B6vUybNu2E17FYGjcga2JiTKP2r4/bffIEWepqbJkN6JXM5yu+wua00y4uopmiCm36OWs8lVnjqLxERETkiAYliL/+9a+JiIjgkUcewefzcddddzF27Fiuv/76kx6bnJzMunXrAss5OTkkJSXV2Z6XlxdYzs3NDWwvLy/nhhtuwOVy8eSTT2K324GaWsi8vDxSUlLwer2UlZXhcrkadMNH5OeX4fef+uiYbncsubmlJ99RAk6lzDq0q0kK//fFfgb2SjrJ3q2Pfs4aT2XWOK25vCwWo0m+DBQREWlLGlTttm3bNn71q1/x9ttvs3HjRlatWsWNN97I+vXrT3rskCFDWLt2LQUFBVRUVLBq1SqGDh0a2N6pUyecTmfgXMuWLQtsv/XWW+nSpQvz58/H4XAEjhk2bBjLli0D4K233mLgwIGB5FFaly7JsTjsFg1UIyIiIiLSAk5Yg+j3+6mqqsI0TSZPnsyaNWsCzUENw6C0tJRp06bx+eeff+cFkpOTmTlzJlOmTMHj8TBx4kT69u3L1KlTmTFjBn369GHu3LnceeedlJeXc+aZZzJlyhS+/vprVq9eTY8ePRg3bhxQU3O4aNEibrrpJmbNmsWoUaOIjY1l7ty5TVciElJsVgvdO6ofooiIiIhISzDME8xCnp2dzWWXXUZlZSWmadbpF3jE0KFDeeqpp5o9yOagJqYt71TLbNlHu1ixZg8Lbx5KpLNBraJbDf2cNZ7KrHFac3mpiamIiEjjnfDTdnJyMu+99x4VFRVMmDCB1157rU6i6HA46ow+KtJc0tNcmCbs3F/M2d0Sgx2OiIiIiEir9Z3VMYmJNR/GP/nkk3q3FxcXEx8f3/RRiRyje8c4LIbBtqwiJYgiIiIiIs2oQe31Pv/8cx555BGys7Px+/0AeL1eCgoK2Lx5c7MGKBLhsNElJYZtmcXBDkVEREREpFVr0Cim99xzD+np6YwcOZL09HR+85vfEBcXx8yZM5s7PhEA0lNd7DpQgsfrD3YoIiIiIiKtVoMSxL179zJ79mzGjx9PSUkJ48aN47HHHuPVV19t7vhEgJoE0evzs/dQ6xxMQ0REREQkFDQoQUxISMDv99OpUyd27doFQPfu3cnOzm7W4ESOSE+r6euq6S5ERERERJpPgxLEAQMGcOedd1JZWUn37t3529/+xpIlS2jXrl1zxycCQFyUgw6JUWzLLAp2KCIiIiIirVaDEsS77roLu91OVVUVs2fP5u9//zuPP/44d9xxR3PHJxKQnupiR1Yx/vqn7hQRERERkdPUoFFMly5dyh133EF0dDSJiYm88847zR2XyHEy0uL5zxcHOJBbTmqSJr8WEREREWlqDapBfOaZZ4iIiGjuWES+U3qqC1A/RBERERGR5tKgGsTRo0dz9913M2rUKNq3b49hGIFtPXr0aLbgRI7VPj6CdrFOtmUWcck5qcEOR0RERESk1WlQgvjyyy8D8M9//rPOesMw2LJlS9NHJVIPwzBIT41ne1YxpmnW+aJCREREREROX4MSxK1btzZ3HCINkpHm4tMtOeQVV+J2RQY7HBERERGRVqVBfRBFQsWRfojb1Q9RRERERKTJKUGUsNLJHU2U08a2zOJghyIiIiIi0uooQZSwYjEMeqTGqwZR
RERERKQZKEGUsJOR5uJg/mFKDlcHOxQRERERkValQYPU/O53v6t3vd1up127dnzve99j4MCBTRqYyImkp8YDsCOrmHMy3EGORkRERESk9WhQDaLNZuPNN9+kurqa9u3b4/V6eeutt8jOzmbfvn1MmzaNV155pbljFQGga0ocNquFbZlFwQ5FRERERKRVaVAN4r59+3jqqacYMmRIYN2Pf/xjnnzySebNm8emTZu49dZbufLKK5stUJEj7DYL3TrGqR+iiIiIiEgTa1AN4ldffcX5559fZ92AAQPYuHEjAH379iUvL6/JgxM5kYy0ePYeKqOy2hvsUEREREREWo0GJYjp6ek89dRTmKYJgGmaPP3003Tr1g2Af//736Smpp7w+BUrVjBy5EiGDx/O4sWLj9u+ZcsWJkyYwIgRI5g9ezZeb90P/fPnz+fxxx8PLH/22WcMGjSIsWPHMnbs2BP2kZTWKyPVhd802XWgJNihiIiIiIi0Gg1qYnrfffdxww038OKLL5KUlEROTg4JCQnMmzePdevW8X//938sXLiw3mOzs7OZN28er732Gg6Hg0mTJjFo0CB69OgR2OfWW2/lvvvuo3///txxxx0sXbqUyZMnU1paygMPPMC//vUvfvWrXwX237x5M7/4xS+YNm3aad6+hKvuneIxDNiWWcSZXROCHY6IiIiISKvQoASxe/fuvPXWW2zcuJGcnBxSUlLo378/FouFyspK/ve//2G32+s9ds2aNQwePBiXywXAiBEjWLlyJdOnTwdg//79VFZW0r9/fwDGjx/PggULmDx5MqtXr6Zr165ce+21dc65efNm8vPzefvtt0lJSeHuu++mQ4cOp1gEEo4inTbSkmLYnlUc7FBERERERFqNBs+D+NVXX7F//36qq6vZt28fb7zxBsuWLSMiIuKEySFATk4ObvfRqQiSkpLIzs4+4Xa32x3YPm7cOK677jqsVmudc8bGxjJlyhSWLVvGsGHDmDlzZkNvQ1qRjFQXOw8U4/X5gx2KiIiIiEir0KAaxAceeICXX36Zbt26YbMdPcQwDMaNG/edxx7pt3gswzAavL0+9957b+D9VVddxSOPPEJpaSmxsbHfedyxEhNjGrzvibjdDb+e1GjKMht4VgfeW59FSZWPnl3im+y8oUY/Z42nMmsclZeIiIgc0aAE8c033+Sll16iX79+jb5AcnIy69atCyzn5OSQlJRUZ/uxI6Dm5ubW2f5tfr+fp5566riaxWMT14bIzy/D7z8+OW0otzuW3NzSUz6+LWrqMkuOcwDw6eaDJESduBY7nOnnrPFUZo3TmsvLYjGa5MtAERGRtqRBTUwNw+DMM888pQsMGTKEtWvXUlBQQEVFBatWrWLo0KGB7Z06dcLpdLJ+/XoAli1bVmf7cQFbLLz77ru88847gf379etHZGTkKcUn4Ss+xklSu0jNhygiIiIi0kQalCBec8013H///Rw4cICKioo6f04mOTmZmTNnMmXKFMaNG8fo0aPp27cvU6dOZfPmzQDMnTuXBx54gMsvv5yKigqmTJnyned88MEHeeGFFxg1ahSvvvoq9913X0NuQ1qhjFQX27OK8dfTVFlERERERBrHMOvrBPgt5513HqWlpcf1HTQMgy1btjRrgM1FTUxbXnOU2UebDvDcW1v5468G0al9dJOeOxTo56zxVGaN05rLS01MRUREGq9BHfeWLVvWzGGInJqMVBcA27OKWmWCKCIiIiLSkr6ziemuXbsAjmtW2pgmpiLNKaldJHHRDrZnFgU7FBERERGRsPedNYgTJ05kw4YNjB49ut7t4dzEVFoHwzDISI1nW2ZxsEMREREREQl735kgbtiwAYCtW7e2SDAipyI9zcW6b3IpKKkkIS4i2OGIiIiIiIStBk8emJ2dTWZmZp2J7Q3DYODAgc0SmEhDHemHuC2riMFnpgQ3GBERERGRMNagBPGZZ57h0UcfJSoqqs6E9IZhsHbt2mYLTqQh
0pJiiHBY2Z5ZrARRREREROQ0NChBfOmll1iwYAGXXnppc8cj0mgWi0GPTvFsyyoKdigiIiIiImHtO0cxPaKiooJLLrmkuWMROWXpaS7255ZTVuEJdigiIiIiImGrQQnij370IxYtWoTP52vueEROSUZqPAA79ms0UxERERGRU9WgJqZr1qxh27ZtPP7448TGxtbZpj6IEgrO6BCH1WKwPbOI/j3aBzscEREREZGw1KAE8Y477sBiaVBlo0hQOOxWzugQp36IIiIiIiKnoUEJ4gMPPMDixYuJiYlp7nhETll6WjyrPs2k2uPDYbcGOxwRERERkbDToGrB0tJSKisrmzsWkdOSkerC5zfZdaAk2KGIiIiIiISlBtUg9u/fnx/96EcMHjyY9u3bYxhGYNttt93WbMGJNEaP1HgMYHtWEb26tAt2OCIiIiIiYadBCaLT6eSiiy4CoKioqDnjETll0RF2Ormj2ZalkUxFRERERE5Fg/sgioSD9DQXa748hM/vx6qBlUREREREGqVBCWJBQQEvvfQS2dnZ+P1+ALxeLzt37uS1115r1gBFGiMj1cUHG/aTmVNG15S4YIcjIiIiIhJWGlTFcuutt/Lxxx+Tl5fHzp07MQyD1atXM3jw4OaOT6RR0lPjAdieqWamIiIiIiKN1aAEccOGDSxatIiZM2cSFxfHnDlzmDdvHhs3bmzm8EQaJyEugvbxEZoPUURERETkFDQoQYyOjiY+Pp4uXbqwbds2AIYNG8bOnTubNTiRU5Ge6mJ7ZhGmaQY7FBERERGRsNKgBDE9PZ3FixcTERFBVFQUmzdvZvv27Vg0CIiEoIy0eEoOe8gurAh2KCIiIiIiYaXBfRCff/55srKymD59OldddRU/+tGP+NnPftagi6xYsYKRI0cyfPhwFi9efNz2LVu2MGHCBEaMGMHs2bPxer11ts+fP5/HH388sFxSUsJ1113H5ZdfztVXX01ubm6D4pC2ISPNBcD2zKKgxiEiIiIiEm4alCCeeeaZrFq1irS0NEaPHs3q1at54403+PWvf33SY7Ozs5k3bx4vv/wyy5cvZ8mSJezYsaPOPrfeeit33XUX77zzDqZpsnTpUgBKS0u54447+Otf/1pn/8cee4yBAwfy9ttvc+WVV3L//fc39H6lDUhJiCIm0q5+iCIiIiIijdSgaS4A9u7dy2uvvUZOTg6zZs3i448/plu3bic9bs2aNQwePBiXywXAiBEjWLlyJdOnTwdg//79VFZW0r9/fwDGjx/PggULmDx5MqtXr6Zr165ce+21dc754YcfBmoiR48ezb333ovH48Futzf0dk6L6a2icv9BfEWHW+R6rUVldVSLldkFyYfJzczHl+Nskes1C8OK2f6sYEchIiIiIm1IgxLEf//739x2221ccsklvPPOO9x8883Mnz+fvLw8pk2b9p3H5uTk4Ha7A8tJSUls2rTphNvdbjfZ2dkAjBs3DqBO89JvH2Oz2YiJiaGgoIDk5OSG3M5pq/rvSxz45qMWuVZr0pLp9GgACxxetqwFr9r0cndejDH4mmCHISIiIiJtRIMSxEceeYSFCxdy3nnn8d5775GcnMxzzz3HL3/5y5MmiPWNJGkYRoO3N1RjB8xJTIxp9DWO8I36JVX9h57y8dL89ueWsWj5l0y8JJ2zuyUGO5xTcnjHekrWrySp+wBizrww2OGEFbc7NtghhBWVl4iIiBzRoATx4MGDDBw4EDiavJ1xxhmUl5ef9Njk5GTWrVsXWM7JySEpKanO9ry8vMBybm5une31SUpKIi8vj5SUFLxeL2VlZYEmrA2Vn1+G33/q0yC4e5xDbm7pKR/fFrndsS1WZjExfnZSzJq8BM4YkN4i12xq5oAzcB7cSc6/nqQ8shOWmPBMdFtaS/6ctQatubwsFuO0vgwUERFpixpU7darVy+WLFlSZ93bb79Nz549T3rskCFDWLt2LQUFBVRUVLBq1SqGDj1a+9apUyecTifr
168HYNmyZXW212fYsGEsq206+NZbbzFw4MAW638o4cFmtdC9Y3xYj2RqWGwkjbsZTJPKD57G9PuDHZKIiIiItHINShDvvPNOFi5cyLhx4zh8+DA/+9nPmDNnDrNnzz7pscnJycycOZMpU6Ywbtw4Ro8eTd++ffn/9u48Oqo6T//4U5VakpCQEKiEkLAqQZQlSGRtg4DKEgUacEaxB/3RRPDg0WZoT4Pr0UEQRPDQoq0OvbgwAx4aMiDN0gQYkCgQQaKgYiuYzYQkhJC9qlK/P4AaadkKUrmVyvt1Th2q7r1167mfUwl8uN97v+np6crJyZEkLVmyRAsXLtSYMWNUU1OjqVOnXnafTzzxhA4dOqS0tDStWrVKzz333NUcBlqYpI7Ryi2uVHWt68obByhrm/YKHfKg3IVfq/7zTUbHAQAAQJAzeS52EeBFVFVVadeuXSooKJDD4dAdd9yhqKgof+fzm+seYhrEw7L8palrdvR4mV7570P6zX191eeG5jk80+GIVHFxhWq3vynX99kKH/+0QmKvfPfgloyfTd8Ec70YYgoAgO8uew1iTU2N97nZbNbw4cN/tj4sLMw/yYDr1K1DlELMJh3LK2+2DaJ09rrf0NsfUlXRt6rZ8ZZaTXxBJmuo0bEAAAAQhC7bIPbr1++SdxT1eDwymUw6evSoX4IB18tuC1GnuMhmfR3ieSZ7K4UOf0Q1Gxepbu8qhQ6bZnQkAAAABKHLNojbt29vqhyAXyR1jNL27Hw5XQ2yWnybCiXQWDrcJFtymuoPbVRIx96ydrvN6EgAAAAIMpdtEBMSEpoqB+AXSYnR2rIvV98XViipY7TRca6brf8EufK/VO3uPysk9gaZI2KMjgQAAIAg0rxPqQBXcGPi2RspHcsrNzZIIzGFWBQ2fIbkdqp25zvyeJj6AgAAAI2HBhFBLTLcpvi24TqWd9roKI3GHN1e9iEPyl1wVM7Dm42OAwAAgCBCg4igl9QxWsfyTl/XtCaBxtojVZYu/VW3f63cJceNjgMAAIAgQYOIoJeUGK2aOpfyTlYaHaXRmEwmhab+P5nCWqt2+x/kcdYZHQkAAABBgAYRQa97x/PXIQbPMFNJMoVGKPSOdDWcLlLdJ/9ldBwAAAAEARpEBL22rUPVJtIeNDeq+SlLws2y9hkt59Gdch7/zOg4AAAAaOZoEBH0TCaTkjpG65vccnk8wXMd4nn22ybJ3Laz6nb9UQ1Vp4yOAwAAgGaMBhEtQlJilMor63XydK3RURqdKcSi0JEz5HHVq3bnfzL1BQAAAK4ZDSJahO4doyVJx3LLDc3hLyHRHWQf/IDc+V/KmbPN6DgAAABopmgQ0SJ0aNdKrUItQXkd4nnWnnfI0rmf6vZ9KHfpD0bHAQAAQDNEg4gWwWwy6caEKH2TG1x3Mv0pk8kk+7BpMtlbqTbzD/K46o2OBAAAgGaGBhEtRlLHaP1YVq2KquBtnMyhkQodnq6GUwWq+2S10XEAAADQzNAgosXwXocYxMNMJcmS2EvW3qPkPLJdrhOHjI4DAACAZoQGES1Gl/aRslrMOpYXvMNMz7PfNknmmI6q3bVSDdXlRscBAABAM0GDiBbDEmJWt/jW+iZI72T6UyaLTaEjZsrjrFXtrpVBOf8jAAAAGh8NIlqU7h2j9UNRpWrrXUZH8buQmATZB/2r3Lk5cn75d6PjAAAAoBmgQUSLktQxSg0ej/6RX2F0lCZhvXmkQjr1Vd2nq+UuyzM6DgAAAAJckzSIGzZs0NixY3XXXXfpgw8++Nn6o0ePatKkSRo1apSefvppuVxnz+4UFBTowQcf1OjRo/Xoo4+qqqpKkrR//34NHDhQ48eP1/jx4zVv3rymOAwEgRs6RMlkCv4b1ZxnMpkUOuzXMtnCVbudqS8AAABweX5vEIuKirRs2TKtWrVKGRkZWr16tb799tsLtnnyySf17LPPasuWLfJ4PFqzZo0k6YUXXtCUKVO0efNm9erVS2+88YYk
KScnR9OmTVNGRoYyMjK0cOFCfx8GgkSY3aJOsZEt4jrE88xhrRU6bLoaTuWpbt+HRscBAABAAPN7g7h3714NGjRI0dHRCg8P16hRo7R582bv+vz8fNXW1io5OVmSNHHiRG3evFlOp1P79+/XqFGjLlgunW0QP/74Y02YMEEzZ85UYWGhvw8DQaR7xyh9V1Ahl7vB6ChNxtKpj6y33CnnF9vkyj1sdBwAAAAEKL83iMXFxXI4HN7XsbGxKioquuR6h8OhoqIinTp1ShEREbJYLBcsl6TIyEhNnTpV69ev17BhwzR79mx/HwaCSFJitOpdDTrx4xmjozQp+8B/kblNomp3/qcaalrGNZgAAADwjcXfH3Cx2+ubTKYrrr/c+1588UXvsgceeECvvvqqzpw5o8jIyKvO1bZtxFVveykOx9V/Hs4KhJoNCrXqjfVfqOBUjQYlJxod54oas2b1k/9d+X/8nTxZf1G7f5l3wc9iMAmE71lzQr0AAMB5fm8Q4+LidODAAe/r4uJixcbGXrC+pKTE+/rkyZOKjY1VTEyMKisr5Xa7FRIS4l3e0NCgt956S4888ohCQkL+70Asvh1KaWmlGhqufW44hyNSJ0+2rDNQ1yuQahYXE66DXxXr9l7tjY5yWY1eM1OMbAPuU3XWKhX87//IdvOIxtt3gAik71lzEMz1MptNjfKfgQAAtCR+H2I6ZMgQZWVlqaysTDU1Ndq6datSU1O96xMSEmS325WdnS1JWr9+vVJTU2W1WpWSkqJNmzZdsNxsNmvbtm3asmWLd3nfvn0VFhbm70NBEOmeGKVjeeVqaIETyFt73aWQjr1Vl/Vfcp8qMDoOAAAAAojfG8S4uDjNnj1bU6dO1YQJE3TPPfeoT58+Sk9PV05OjiRpyZIlWrhwocaMGaOamhpNnTpVkvT8889rzZo1Gjt2rA4cOKDf/OY3kqRFixbp3XffVVpamtauXav58+f7+zAQZJISo1VV61JhSZXRUZqcd+oLa6hqM9+Ux+00OhIAAAAChMlzsYv9WgCGmDa9QKpZ8alqzX3rE/3bqB4a3i/B6DiX5M+auU4cUs2W12TtM1qhg+73y2cYIZC+Z81BMNeLIaYAAPjO72cQgUDkiA5TVIRNx/LKjY5iGEvnZFlvHiHn4c1y5X1pdBwAAAAEABpEtEgmk0ndE6N1LLfc6CiGsg/6V5mjO6h25ztqqA3Os0gAAAC4ejSIaLGSEqNUWlGn0tO1RkcxjMliV+iIGfLUVqpu1x8vOr0MAAAAWg4aRLRYSR2jJUnftOBhppIU0q6z7AMmy3XioJxf7TI6DgAAAAxEg4gWK9ERoTB7iI7lnTY6iuGsve9WSMItqstapYbyQqPjAAAAwCC+zS4PBBGz2aQbEqJ09MQpff3DKaPjXFRRRZ3Ky6ub5LPM3e9TbPFindq8QiWDnpDMl/71EB1pV1yb8CbJBQAAgKZDg4gWrWfnNvriu39o0aqDRkcJCL2tAzQ9cqeOZPxZG2r6X3bbmzpFa2T/RCV3b6cQM4MRAAAAggENIlq0O/t3VLf41tc1J6Y/RUWH63QTnUE8K1mVR2o1Mu9T9b09VXUx3S+61XeFFdp5MF8r1n2hNpF23dEvQcP6dlDrVrYmzAoAAIDGZvK00NsWlpZWXldTEMyTS/sLNfOdETXzOOtU9dfnJVedWk36D5lCLz7ReEODR5//o0SZ2Xn68vgpWUJMuu2mWI3on6hu8a1lMpmaNPd5fM98E8z1MptNatv24t9fAABwcZxBBHABk9WusBEzVZ3xH6rd/WeF3jnros2e2WxSv+4O9evuUGFplTI/y9fHOYXK+rJIXdpHamT/RA3oGSurJcSAowAAAMC14MIhAD8T4ugiW8okub4/INc3e664fXzbVnrwriS9OmuofnV3kuqcbq386KjmrNirD3d+q5LTNU2QGgAAANeLM4gALsrWd7TceTmq/fh9hbRPkjkq7orvCbNbNOLWRA3vl6Cv
fihXZnaeNn/6gzZ/+oOSb2ynEf0TdXPnNoYNPwUAAMDl0SACuCiTyazQO9JVtfZZ1WT+QeHjn5bpMlNfXPhek3p2bqOenduo9HStdh7K165DBTp4rETxbcM14tZEDenVXmF2fgUBAAAEEoaYArgkc0SMQm9/WA0nv1d9dsY17aNtVKgmDbtBr84aoun39FSY3aIPtn2jf1/xsd7b+rXyS6oaOTUAAACuFf99D+CyrN1uk7vH7ao/uFEhib1kie9xbfuxhGhIr3gN6RWv7wsrlJmdp92fF2rHZ/nq2bmNRtyaqOTubZlTEQAAwEBMc3GNgvnW8P5CzXwXKDXzOGtVtfZ5qcGlVpNelMneqlH2W1Fdr92fF2jnwXyVVtQpprVddyQnKPU65lQMlJo1F8FcL6a5AADAdzSI1yiY/1HlL9TMd4FUM3fxd6rOeEmWrv0VOvLRRr3RjLuhQZ9/W6rMz/J0xDunYpxG9k9Utw6tfdpXINWsOQjmetEgAgDgO4aYArgqIbHdZEuZoPr9a+V0dJHZ0bVR998nQuqTGqrSvm108JsS5Xx3WP/99SHFtw1Xv+4O3dQ5WpaQKw8/rakOl6u8ulGzBbPArZdJIbHdZLJc25lkAABwbWgQAVw1W980ufO+UN2na/z2GeGShkoaGiYpTJJT0hHJeeTs0ythxkXfBHK9bLeOlz3ll0bHAACgRaFBBHDVTGazwsbMkbv4O0lNMzrd45Fyi84o+5sSfZtfLkm6MSFKtyY51CkuUv880jU6OlzlAXlGLDAFbr1MCom7wegQAAC0ODSIAHxisthk6XBTk35mtwSp2626YE7Fvx2vVHzbhp/NqRjmiFRleHBeU+cP1AsAAPxUk9ykZsOGDXrzzTfldDr18MMP68EHH7xg/dGjR/XMM8+osrJSKSkpeuGFF2SxWFRQUKAnn3xSpaWl6tq1q5YsWaJWrVqpoqJCv/3tb5Wbm6uYmBi99tprcjgcPmXiJjVNj5r5jppdnNPl1r6jxcr8LF/fF1bIbgvR0F7tdUe/BPXo1k4lJZVGR2w22rWLUMXp6qCcXoSb1AAA4Du/N4hFRUV64IEH9Ne//lU2m03333+/li5dqhtvvNG7zT333KP58+crOTlZTz31lHr16qUpU6ZoxowZGjdunNLS0rRixQpVV1frySef1Isvvqj27dvrkUce0fr167Vz50699tprPuWiQWx61Mx31OzKviuoUOZnedp3tEgud4u8KXOjsISYZLeGyHbuYbeaZbeGeB+2c69t55fZQmSzmP/v+T9t99OH1WqWuRHvenu1aBABAPCd3xvEdevWaf/+/VqwYIEkacWKFfJ4PHrsscckSfn5+XrooYf097//XZJ04MABLV++XCtXrtTAgQO1b98+WSwWFRYW6le/+pW2b9+uESNG6IMPPlB8fLxcLpcGDBigTz/9VFar9apz0SA2PWrmO2p29Sqq6/XZ1ydltVtUWVlndJxmIzzcrlOnq1XndKu+vkF1TvfZ5+f+rHM2eJ/X/+S1r789bRbzFZrLn6w/9xjQM1YxrUOv+dhoEAEA8J3fr0EsLi6+YPhnbGysDh8+fMn1DodDRUVFOnXqlCIiImSxWC5Y/s/vsVgsioiIUFlZmeLi4q46V2P8o8HhiLzufbQ01Mx31OzqOCTd0Lmt0TFaBI/Hc7Z5rD/7qK13qbbe7V1WW+9SbZ1bdfUu1Tndqq0//3D97D3VdS6dqqw7u/7ce+pdDZKkyEi7Jgzz7fIBAABwffzeIF7sBOVPJ9i+1Porve+fmX28foYziE2PmvmOmvmOmvnmeutlkhQWYlJYmEUKa5y/Uho8HjmdDbLbQq4rG2cQAQDwnd/vShAXF6eSkhLv6+LiYsXGxl5y/cmTJxUbG6uYmBhVVlbK7XZfsFw6exby/HtcLpcqKysVHR3t70MBADQBs8kkuy3E6BgAALRIfm8QhwwZoqysLJWVlammpkZbt25Vamqqd31CQoLsdruy
s7MlSevXr1dqaqqsVqtSUlK0adOmC5ZL0rBhw7R+/XpJ0qZNm5SSkuLT9YcAAAAAgJ9rsmku3nrrLTmdTk2ePFnp6elKT0/X448/rt69e+urr77SM888o6qqKt18881auHChbDab8vPzNXfuXJWWlio+Pl5Lly5VVFSUysvLNXfuXOXm5ioyMlJLlixRYmKiT5kYYtr0qJnvqJnvqJlvgrleDDEFAMB3TdIgBiIaxKZHzXxHzXxHzXwTzPWiQQQAwHfBNzMyAAAAAOCa0CACAAAAACQ1wTQXgcpsvvSUGU25j5aGmvmOmvmOmvkmWOsVrMcFAIA/tdhrEAEAAAAAF2KIKQAAAABAEg0iAAAAAOAcGkQAAAAAgCQaRAAAAADAOTSIAAAAAABJNIgAAAAAgHNoEAEAAAAAkmgQAQAAAADn0CACAAAAACTRIAIAAAAAzqFB9NGGDRs0duxY3XXXXfrggw+MjtMsvP7660pLS1NaWpoWL15sdJxmZdGiRZo7d67RMZqFzMxMTZw4UaNHj9b8+fONjtMsZGRkeH82Fy1aZHQcAAAQAGgQfVBUVKRly5Zp1apVysjI0OrVq/Xtt98aHSug7d27V3v27NG6deu0fv16ffnll9q2bZvRsZqFrKwsrVu3zugYzUJubq6ef/55vfHGG9qwYYOOHDmiXbt2GR0roNXU1Oill17Se++9p4yMDB04cEB79+41OhYAADAYDaIP9u7dq0GDBik6Olrh4eEaNWqUNm/ebHSsgOZwODR37lzZbDZZrVbdcMMNKigoMDpWwCsvL9eyZcs0c+ZMo6M0C9u2bdPYsWPVvn17Wa1WLVu2TH379jU6VkBzu91qaGhQTU2NXC6XXC6X7Ha70bEAAIDBaBB9UFxcLIfD4X0dGxuroqIiAxMFvu7duys5OVmSdPz4cW3atEnDhg0zNlQz8Nxzz2n27Nlq3bq10VGahRMnTsjtduvXv/61xo0bp1WrVikqKsroWAEtIiJCTzzxhMaMGaPU1FQlJCTo1ltvNToWAAAwGA2iDzwez8+WmUwmA5I0P8eOHdO0adP0u9/9Tl26dDE6TkD78MMPFR8fr8GDBxsdpdlwu93KysrSK6+8ojVr1ignJ4fhuVfw1Vdfae3atdqxY4f27Nkjs9mslStXGh0LAAAYjAbRB3FxcSopKfG+Li4uVmxsrIGJmofs7Gw9/PDDmjNnjn75y18aHSfgbdq0SR9//LHGjx+v5cuXKzMzUwsWLDA6VkBr166dBg8erJiYGIWGhmrkyJE6fPiw0bEC2p49ezR48GC1bdtWNptNEydO1L59+4yOBQAADEaD6IMhQ4YoKytLZWVlqqmp0datW5Wammp0rIBWWFioWbNmacmSJUpLSzM6TrPwpz/9SRs3blRGRoYef/xxjRgxQk899ZTRsQLa8OHDtWfPHlVUVMjtdmv37t265ZZbjI4V0G666Sbt3btX1dXV8ng8yszMVO/evY2OBQAADGYxOkBzEhcXp9mzZ2vq1KlyOp2aPHmy+vTpY3SsgLZy5UrV1dXp5Zdf9i67//779cADDxiYCsGmb9++mj59uqZMmSKn06mhQ4dq0qRJRscKaL/4xS905MgRTZw4UVarVb1799YjjzxidCwAAGAwk+diF9YBAAAAAFochpgCAAAAACTRIAIAAAAAzqFBBAAAAABIokEEAAAAAJxDgwgAAAAAkESDCOAy8vLy1KNHD1VVVRkdBQAAAE2ABhEAAAAAIIkGETBcXl6eUlJS9Pbbb2vo0KEaPHiwFixYcMnt9+/fr0mTJiklJUX33XefDh8+7F3Xo0cPvf322xoyZIgGDhyopUuXqqGhQZJUUlKiOXPmaODAgRo2bJgWL16s+vp6SVJdXZ3mz5+vQYMGaeDAgZo3b57q6uq8+/3LX/6ikSNHqn///nr55Ze9yzds2KC7775bt912myZNmqQ9e/Y0dnkAAADQhGgQgQBw5swZ5eXlaceOHXrzzTe1
atUqHTx48GfbFRQUaMaMGXr00Uf1ySefaNq0aUpPT1d5ebl3m507d2rjxo368MMPtXHjRq1evVqS9Nhjj0mStm/frjVr1mjfvn1avny5JOn3v/+9Dh06pIyMDG3fvl35+flasWKFd5/FxcX629/+pvfff1/vv/++srOzVVNTo3nz5mnp0qXav3+/pkyZomeffVYej8ePlQIAAIA/0SACASI9PV02m03Jycnq1q2bTpw48bNtNm7cqIEDB+rOO++UxWLRmDFjlJSUpC1btni3mTNnjmJiYtSpUydNnTpVH330kX744QcdPHhQTz/9tCIiIhQXF6cnnnhC69atkyR99NFHmjlzpuLi4hQREaHFixdr8uTJ3n3OmDFDNptNPXv2VNeuXZWXlydJstvtWrNmjQ4ePKjx48crMzNTJpPJz5UCAACAv9AgAgEiJibG+9xisXiHhv5UQUGBdu/erZSUFO8jJydHhYWF3m06d+7sfd6+fXudPHlSpaWlCg8Pv+AzOnTooJKSEjmdTpWUlKh9+/YXvK9Tp07e161bt/Y+t1qtcrvdCgsL07vvvquysjJNnz5dQ4cO1TvvvHP9hQAAAIBhLEYHAHD1HA6Hxo4dq8WLF3uX5ebmqk2bNt7XxcXFateunaSzDWV8fLw6dOig6upqlZWVeZvEvLw8RUdHy2q1Ki4uTkVFRerVq5ckKScnR4cOHdLw4cMvmaWyslJVVVV6/fXX5XK5tHfvXs2aNUsDBgxQcnKyH44eAAAA/sYZRKAZSUtL044dO5SVlSWPx6Ps7GyNGzdOOTk53m2WL1+uyspKff/993rvvfc0YcIExcXFafDgwXrppZdUVVWloqIiLV++XPfee68k6d5779Xbb7+tkpISnTlzRq+++qpKSkoum6W6ulrTp0/X7t27ZbFYFBsbK5PJpKioKL/WAAAAAP7DGUSgGenSpYtee+01vfLKKzp+/LhiYmI0b948DR482LtNYmKi0tLS5Ha79dBDD2nChAmSpCVLluill17SyJEjJUnjxo3TnDlzJEmPPvqoampqNGHCBLlcLo0ePVqzZs1ScXHxJbPExsZq8eLFWrBggX788Ue1adNGzz33nLp27eq/AgAAAMCvTB5uOQgEjR49emjDhg1KSkoyOgoAAACaIYaYAgAAAAAk0SACAAAAAM5hiCkAAAAAQBJnEAEAAAAA59AgAgAAAAAk0SACAAAAAM6hQQQAAAAASKJBBAAAAACc8/8BUhZ6ffdUg3MAAAAASUVORK5CYII=
" />
</div>
</div>
</div>
</div>
</div>
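For reference, the stepwise decay visible in these curves is just PyTorch's `StepLR` rule: the learning rate is multiplied by `gamma` every `step_size` epochs. A minimal standalone sketch of that rule (plain Python, no torch dependency; `step_lr` is a hypothetical helper written for illustration, not part of the library):

```python
def step_lr(base_lr, step_size, epoch, gamma=0.1):
    # StepLR-style decay: multiply the LR by `gamma` once per `step_size` epochs
    return base_lr * gamma ** (epoch // step_size)

# With base_lr=0.01 and step_size=3, the LR drops by 10x at epochs 3, 6, ...
schedule = [step_lr(0.01, 3, e) for e in range(7)]
```

Plotting `schedule` against the epoch index reproduces the staircase shape seen above.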
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we can see from the plots, the learning rate decreases by a factor of 0.1 (the default) after the corresponding <code>step_size</code> for each model component. Note that the keys in the <code>model.lr_history</code> dictionary have a suffix <code>_0</code>. This is because, if you pass different parameter groups to the torch optimizers, the learning rate of each group will also be recorded. We'll see this in the regression example later in the post.</p>
<p>Before I move on to the next section, let me just mention that the <code>Trainer</code> class comes with a useful method to "rescue" the learned embeddings, very creatively called <code>get_embeddings</code>. For example, let's say I want to use the embeddings learned for the different levels of the categorical feature <code>education</code>. These can be accessed via:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">education_embed</span> <span class="o">=</span> <span class="n">trainer</span><span class="o">.</span><span class="n">get_embeddings</span><span class="p">(</span>
<span class="n">col_name</span><span class="o">=</span><span class="s1">'education'</span><span class="p">,</span>
<span class="n">cat_encoding_dict</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">label_encoder</span><span class="o">.</span><span class="n">encoding_dict</span>
<span class="p">)</span>
<span class="n">education_embed</span><span class="p">[</span><span class="s1">'doctorate'</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>array([ 0.41479743, 0.08521606, 0.2710749 , -0.17924106, -0.07241581,
-0.2514616 , -0.24809864, -0.20624267, -0.12701468, -0.00737057,
-0.17397854, 0.03000254, -0.06039784, 0.28008303, -0.35625017,
0.00706905, 0.18486224, -0.05701892, -0.05574326, -0.08269893,
-0.15482767, 0.30681178, -0.23743518, 0.08368678, 0.20123835,
0.30058601, -0.15073103, -0.08352864, 0.07049613, -0.28594372,
-0.05307232, -0.17094977], dtype=float32)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="2.-Using-the-Focal-Loss">2. Using the Focal Loss<a class="anchor-link" href="#2.-Using-the-Focal-Loss"> </a></h2><p>The Focal Loss (hereafter FL) was introduced by Tsung-Yi Lin et al. in their <a href="https://arxiv.org/pdf/1708.02002.pdf">2018 paper</a> “Focal Loss for Dense Object Detection” [1]. It is designed to address scenarios with extremely imbalanced classes, such as one-stage object detection, where the imbalance between foreground and background classes can be, for example, 1:1000.</p>
<p>The adult census dataset is not really imbalanced, so it is not the best dataset on which to test the performance of the FL. Nonetheless, let me illustrate how easy it is to use the FL with <code>pytorch-widedeep</code>.</p>
</div>
</div>
</div>
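<p>For intuition, here is a minimal sketch of the binary focal loss following the formulation in the paper. This is illustrative only, not <code>pytorch-widedeep</code>'s internal implementation, which may differ in details such as numerical stabilisation:</p>

```python
import math

def binary_focal_loss(y_true, p, alpha=0.2, gamma=1.0):
    """FL(p_t) = -alpha_t * (1 - p_t)**gamma * log(p_t), per Lin et al.

    p is the predicted probability of the positive class; p_t and alpha_t
    take the value corresponding to the true class. gamma down-weights
    well-classified examples so training focuses on the hard ones.
    """
    p_t = p if y_true == 1 else 1.0 - p
    alpha_t = alpha if y_true == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

<p>With <code>gamma=0</code> and <code>alpha=0.5</code> this reduces to half the standard binary cross-entropy; increasing <code>gamma</code> shrinks the loss contribution of confident, correct predictions.</p>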
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span> <span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span>
<span class="n">model</span><span class="p">,</span>
<span class="n">objective</span><span class="o">=</span><span class="s2">"binary_focal_loss"</span><span class="p">,</span>
<span class="n">optimizers</span><span class="o">=</span><span class="n">optimizers</span><span class="p">,</span>
<span class="n">lr_schedulers</span><span class="o">=</span><span class="n">schedulers</span><span class="p">,</span>
<span class="n">initializers</span><span class="o">=</span><span class="n">initializers</span><span class="p">,</span>
<span class="n">callbacks</span><span class="o">=</span><span class="n">callbacks</span><span class="p">,</span>
<span class="n">metrics</span><span class="o">=</span><span class="n">metrics</span><span class="p">,</span>
<span class="n">alpha</span><span class="o">=</span><span class="mf">0.2</span><span class="p">,</span> <span class="c1"># the alpha parameter of the focal loss</span>
<span class="n">gamma</span><span class="o">=</span><span class="mf">1.0</span><span class="p">,</span> <span class="c1"># the gamma parameter of the focal loss</span>
<span class="n">verbose</span><span class="o">=</span><span class="kc">False</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>To learn more about the losses available at <code>pytorch-widedeep</code> have a look at the <code>losses</code> module in the library or the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/losses.html">docs</a>.</p>
<h2 id="3.-Regression-combining-tabular-data,-text-and-images">3. Regression combining tabular data, text and images<a class="anchor-link" href="#3.-Regression-combining-tabular-data,-text-and-images"> </a></h2><p>For this example we will use a small sample (so you can run it locally on a laptop) of the <a href="http://insideairbnb.com/get-the-data.html">Airbnb listings dataset</a> in London.</p>
<p>In case you are interested in all the details, I prepared the original dataset for this post, and all the code can be found in <code>airbnb_data_preprocessing.py</code>, <a href="https://github.com/jrzaurin/pytorch-widedeep/blob/master/examples/airbnb_data_preprocessing.py">here</a>. After such preprocessing the data looks like this:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">airbnb</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s1">'data/airbnb/airbnb_sample.csv'</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">airbnb</span><span class="o">.</span><span class="n">head</span><span class="p">(</span><span class="mi">1</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>id</th>
<th>host_id</th>
<th>description</th>
<th>host_listings_count</th>
<th>host_identity_verified</th>
<th>neighbourhood_cleansed</th>
<th>latitude</th>
<th>longitude</th>
<th>is_location_exact</th>
<th>property_type</th>
<th>room_type</th>
<th>accommodates</th>
<th>bathrooms</th>
<th>bedrooms</th>
<th>beds</th>
<th>guests_included</th>
<th>minimum_nights</th>
<th>instant_bookable</th>
<th>cancellation_policy</th>
<th>has_house_rules</th>
<th>host_gender</th>
<th>accommodates_catg</th>
<th>guests_included_catg</th>
<th>minimum_nights_catg</th>
<th>host_listings_count_catg</th>
<th>bathrooms_catg</th>
<th>bedrooms_catg</th>
<th>beds_catg</th>
<th>amenity_24-hour_check-in</th>
<th>amenity__toilet</th>
<th>amenity_accessible-height_bed</th>
<th>amenity_accessible-height_toilet</th>
<th>amenity_air_conditioning</th>
<th>amenity_air_purifier</th>
<th>amenity_alfresco_bathtub</th>
<th>amenity_amazon_echo</th>
<th>amenity_baby_bath</th>
<th>amenity_baby_monitor</th>
<th>amenity_babysitter_recommendations</th>
<th>amenity_balcony</th>
<th>amenity_bath_towel</th>
<th>amenity_bathroom_essentials</th>
<th>amenity_bathtub</th>
<th>amenity_bathtub_with_bath_chair</th>
<th>amenity_bbq_grill</th>
<th>amenity_beach_essentials</th>
<th>amenity_beach_view</th>
<th>amenity_beachfront</th>
<th>amenity_bed_linens</th>
<th>amenity_bedroom_comforts</th>
<th>...</th>
<th>amenity_roll-in_shower</th>
<th>amenity_room-darkening_shades</th>
<th>amenity_safety_card</th>
<th>amenity_sauna</th>
<th>amenity_self_check-in</th>
<th>amenity_shampoo</th>
<th>amenity_shared_gym</th>
<th>amenity_shared_hot_tub</th>
<th>amenity_shared_pool</th>
<th>amenity_shower_chair</th>
<th>amenity_single_level_home</th>
<th>amenity_ski-in_ski-out</th>
<th>amenity_smart_lock</th>
<th>amenity_smart_tv</th>
<th>amenity_smoke_detector</th>
<th>amenity_smoking_allowed</th>
<th>amenity_soaking_tub</th>
<th>amenity_sound_system</th>
<th>amenity_stair_gates</th>
<th>amenity_stand_alone_steam_shower</th>
<th>amenity_standing_valet</th>
<th>amenity_steam_oven</th>
<th>amenity_stove</th>
<th>amenity_suitable_for_events</th>
<th>amenity_sun_loungers</th>
<th>amenity_table_corner_guards</th>
<th>amenity_tennis_court</th>
<th>amenity_terrace</th>
<th>amenity_toilet_paper</th>
<th>amenity_touchless_faucets</th>
<th>amenity_tv</th>
<th>amenity_walk-in_shower</th>
<th>amenity_warming_drawer</th>
<th>amenity_washer</th>
<th>amenity_washer_dryer</th>
<th>amenity_waterfront</th>
<th>amenity_well-lit_path_to_entrance</th>
<th>amenity_wheelchair_accessible</th>
<th>amenity_wide_clearance_to_shower</th>
<th>amenity_wide_doorway_to_guest_bathroom</th>
<th>amenity_wide_entrance</th>
<th>amenity_wide_entrance_for_guests</th>
<th>amenity_wide_entryway</th>
<th>amenity_wide_hallways</th>
<th>amenity_wifi</th>
<th>amenity_window_guards</th>
<th>amenity_wine_cooler</th>
<th>security_deposit</th>
<th>extra_people</th>
<th>yield</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>13913.jpg</td>
<td>54730</td>
<td>My bright double bedroom with a large window has a relaxed feeling! It comfortably fits one or t...</td>
<td>4.0</td>
<td>f</td>
<td>Islington</td>
<td>51.56802</td>
<td>-0.11121</td>
<td>t</td>
<td>apartment</td>
<td>private_room</td>
<td>2</td>
<td>1.0</td>
<td>1.0</td>
<td>0.0</td>
<td>1</td>
<td>1</td>
<td>f</td>
<td>moderate</td>
<td>1</td>
<td>female</td>
<td>2</td>
<td>1</td>
<td>1</td>
<td>3</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>...</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>0</td>
<td>1</td>
<td>0</td>
<td>0</td>
<td>100.0</td>
<td>15.0</td>
<td>12.0</td>
</tr>
</tbody>
</table>
<p>1 rows × 223 columns</p>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's define what will go through the wide and deep components:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># There are a number of columns that are already binary. Therefore, no need to one hot encode them</span>
<span class="n">crossed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'property_type'</span><span class="p">,</span> <span class="s1">'room_type'</span><span class="p">)]</span>
<span class="n">already_dummies</span> <span class="o">=</span> <span class="p">[</span><span class="n">c</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">airbnb</span><span class="o">.</span><span class="n">columns</span> <span class="k">if</span> <span class="s1">'amenity'</span> <span class="ow">in</span> <span class="n">c</span><span class="p">]</span> <span class="o">+</span> <span class="p">[</span><span class="s1">'has_house_rules'</span><span class="p">]</span>
<span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'is_location_exact'</span><span class="p">,</span> <span class="s1">'property_type'</span><span class="p">,</span> <span class="s1">'room_type'</span><span class="p">,</span> <span class="s1">'host_gender'</span><span class="p">,</span>
<span class="s1">'instant_bookable'</span><span class="p">]</span> <span class="o">+</span> <span class="n">already_dummies</span>
<span class="n">cat_embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="n">c</span><span class="p">,</span> <span class="mi">16</span><span class="p">)</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">airbnb</span><span class="o">.</span><span class="n">columns</span> <span class="k">if</span> <span class="s1">'catg'</span> <span class="ow">in</span> <span class="n">c</span><span class="p">]</span> <span class="o">+</span> \
<span class="p">[(</span><span class="s1">'neighbourhood_cleansed'</span><span class="p">,</span> <span class="mi">64</span><span class="p">),</span> <span class="p">(</span><span class="s1">'cancellation_policy'</span><span class="p">,</span> <span class="mi">16</span><span class="p">)]</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'latitude'</span><span class="p">,</span> <span class="s1">'longitude'</span><span class="p">,</span> <span class="s1">'security_deposit'</span><span class="p">,</span> <span class="s1">'extra_people'</span><span class="p">]</span>
<span class="c1"># it does not make sense to standardise latitude and longitude. Here I am going to "pass", but you</span>
<span class="c1"># might want to check the LatLongScalarEnc available in the autogluon tabular library.</span>
<span class="n">already_standard</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'latitude'</span><span class="p">,</span> <span class="s1">'longitude'</span><span class="p">]</span>
<span class="c1"># text and image colnames</span>
<span class="n">text_col</span> <span class="o">=</span> <span class="s1">'description'</span>
<span class="n">img_col</span> <span class="o">=</span> <span class="s1">'id'</span>
<span class="c1"># path to pretrained word embeddings and the images</span>
<span class="n">word_vectors_path</span> <span class="o">=</span> <span class="s1">'data/glove.6B/glove.6B.100d.txt'</span>
<span class="n">img_path</span> <span class="o">=</span> <span class="s1">'data/airbnb/property_picture'</span>
<span class="c1"># target</span>
<span class="n">target_col</span> <span class="o">=</span> <span class="s1">'yield'</span>
</pre></div>
</div>
</div>
</div>
</div>
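<p>As the comment in the snippet above notes, running latitude and longitude through <code>StandardScaler</code> makes little sense. A minimal, hypothetical alternative (autogluon's <code>LatLongScalarEnc</code> is a more principled option) is a plain min-max rescale, which bounds the values while preserving relative positions:</p>

```python
def min_max_scale(values):
    """Rescale a list of numbers to [0, 1], preserving relative distances.

    Hypothetical helper for pre-scaling latitude/longitude before passing
    the columns to TabPreprocessor as already-standardised.
    """
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# e.g. three London latitudes mapped onto [0, 1]
scaled_lat = min_max_scale([51.568, 51.470, 51.523])
```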
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note the following: columns that are already dummies (defined as <code>already_dummies</code>) are treated like any other wide column. Internally, nothing will really happen to them. They will just add one entry each to the embedding lookup table.</p>
<p>On the other hand, you will see that among the columns that will be passed through the <code>deeptabular</code> component we have the <code>already_standard</code> columns, which are longitude and latitude in this case. These are columns that it makes no sense to standardize via <code>sklearn</code>'s <code>StandardScaler</code>, which is what <code>TabPreprocessor</code> uses internally. A solution would be to pre-process them beforehand (using, for example, the <a href="https://github.com/awslabs/autogluon/blob/master/tabular/src/autogluon/tabular/models/tab_transformer/tab_transformer_encoder.py">LatLongScalarEnc</a> available in the <code>autogluon</code> library) and then pass them to the <code>TabPreprocessor</code>.</p>
<p>Nonetheless, in this case I am going to "ignore" this issue and move on, since I just want to illustrate the use of the package.</p>
</div>
</div>
</div>
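<p>To make the "one entry in the embedding lookup table" remark concrete, here is a hedged sketch (with made-up weights, not <code>pytorch-widedeep</code> internals) of what the wide component computes: a linear model where each observed (column, value) pair looks up a single learnable weight, and the prediction is the sum of those weights plus a bias:</p>

```python
# Toy weights: one scalar per (column, value) pair, exactly like an
# embedding lookup table with embedding dimension 1. Values are made up.
weights = {
    ("room_type", "private_room"): 0.30,
    ("amenity_wifi", 1): 0.10,
    ("instant_bookable", "f"): -0.05,
}

def wide_predict(row, weights, bias=0.0):
    # Sum the looked-up weights for the pairs present in this row;
    # unseen pairs contribute nothing, as would a zero-initialised entry.
    return bias + sum(weights.get(pair, 0.0) for pair in row)

score = wide_predict([("room_type", "private_room"), ("amenity_wifi", 1)], weights)
```

<p>A column that is already a 0/1 dummy fits this scheme with no extra work, which is why nothing special needs to happen to the <code>already_dummies</code> columns.</p>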
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">import</span> <span class="nn">os</span>
<span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">WidePreprocessor</span><span class="p">,</span> <span class="n">TabPreprocessor</span><span class="p">,</span> <span class="n">TextPreprocessor</span><span class="p">,</span> <span class="n">ImagePreprocessor</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">Wide</span><span class="p">,</span> <span class="n">TabMlp</span><span class="p">,</span> <span class="n">DeepText</span><span class="p">,</span> <span class="n">DeepImage</span><span class="p">,</span> <span class="n">WideDeep</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.initializers</span> <span class="kn">import</span> <span class="o">*</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.callbacks</span> <span class="kn">import</span> <span class="o">*</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">target</span> <span class="o">=</span> <span class="n">airbnb</span><span class="p">[</span><span class="n">target_col</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">crossed_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">cat_embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">text_preprocessor</span> <span class="o">=</span> <span class="n">TextPreprocessor</span><span class="p">(</span><span class="n">word_vectors_path</span><span class="o">=</span><span class="n">word_vectors_path</span><span class="p">,</span> <span class="n">text_col</span><span class="o">=</span><span class="n">text_col</span><span class="p">)</span>
<span class="n">X_text</span> <span class="o">=</span> <span class="n">text_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">image_processor</span> <span class="o">=</span> <span class="n">ImagePreprocessor</span><span class="p">(</span><span class="n">img_col</span> <span class="o">=</span> <span class="n">img_col</span><span class="p">,</span> <span class="n">img_path</span> <span class="o">=</span> <span class="n">img_path</span><span class="p">)</span>
<span class="n">X_images</span> <span class="o">=</span> <span class="n">image_processor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>The vocabulary contains 2192 tokens
Indexing word vectors...
Loaded 400000 word vectors
Preparing embeddings matrix...
2175 words in the vocabulary had data/glove.6B/glove.6B.100d.txt vectors and appear more than 5 times
Reading Images from data/airbnb/property_picture
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre> 4%|▎ | 36/1001 [00:00<00:02, 346.67it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Resizing
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>100%|██████████| 1001/1001 [00:02<00:00, 372.15it/s]
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Computing normalisation metrics
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>At this stage the data is ready to be passed through the model. However, instead of building a "simple" model that combines the <code>wide</code>, <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code> components, I am going to use this opportunity to illustrate <code>pytorch-widedeep</code>'s flexibility when building wide and deep models. I like to call this getting into <em>Kaggle mode</em>.</p>
<p>First we define the components of the model...</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="n">np</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">X_wide</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="c1"># deeptabular: 2 Dense layers</span>
<span class="n">deeptabular</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span>
<span class="n">column_idx</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">128</span><span class="p">,</span><span class="mi">64</span><span class="p">],</span>
<span class="n">mlp_dropout</span> <span class="o">=</span> <span class="mf">0.1</span><span class="p">,</span>
<span class="n">mlp_batchnorm</span> <span class="o">=</span> <span class="kc">True</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">embed_dropout</span> <span class="o">=</span> <span class="mf">0.1</span><span class="p">,</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="n">continuous_cols</span><span class="p">,</span>
<span class="n">batchnorm_cont</span> <span class="o">=</span> <span class="kc">True</span>
<span class="p">)</span>
<span class="c1"># deeptext: a stack of 2 LSTMs</span>
<span class="n">deeptext</span> <span class="o">=</span> <span class="n">DeepText</span><span class="p">(</span>
<span class="n">vocab_size</span><span class="o">=</span><span class="nb">len</span><span class="p">(</span><span class="n">text_preprocessor</span><span class="o">.</span><span class="n">vocab</span><span class="o">.</span><span class="n">itos</span><span class="p">),</span>
<span class="n">hidden_dim</span><span class="o">=</span><span class="mi">64</span><span class="p">,</span>
<span class="n">n_layers</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span>
<span class="n">rnn_dropout</span><span class="o">=</span><span class="mf">0.5</span><span class="p">,</span>
<span class="n">embed_matrix</span><span class="o">=</span><span class="n">text_preprocessor</span><span class="o">.</span><span class="n">embedding_matrix</span><span class="p">)</span>
<span class="c1"># Pretrained Resnet 18 (default is all but last 2 conv blocks frozen) plus a FC-Head 512->256->128</span>
<span class="n">deepimage</span> <span class="o">=</span> <span class="n">DeepImage</span><span class="p">(</span><span class="n">pretrained</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="n">head_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">512</span><span class="p">,</span> <span class="mi">256</span><span class="p">,</span> <span class="mi">128</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>...and, as we build the model, add a fully connected <em>head</em> via the <code>head_hidden_dims</code> parameter (a custom head could alternatively be passed via the <code>deephead</code> parameter)</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span>
<span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span>
<span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">,</span>
<span class="n">deeptext</span><span class="o">=</span><span class="n">deeptext</span><span class="p">,</span>
<span class="n">deepimage</span><span class="o">=</span><span class="n">deepimage</span><span class="p">,</span>
<span class="n">head_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">128</span><span class="p">,</span> <span class="mi">64</span><span class="p">]</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's have a look at the model</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(wide): Wide(
(wide_linear): Embedding(357, 1, padding_idx=0)
)
(deeptabular): TabMlp(
(embed_layers): ModuleDict(
(emb_layer_accommodates_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_bathrooms_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_bedrooms_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_beds_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_cancellation_policy): Embedding(6, 16, padding_idx=0)
(emb_layer_guests_included_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_host_listings_count_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_minimum_nights_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_neighbourhood_cleansed): Embedding(33, 64, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(norm): BatchNorm1d(4, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): BatchNorm1d(196, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(1): Dropout(p=0.1, inplace=False)
(2): Linear(in_features=196, out_features=128, bias=False)
(3): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=128, out_features=64, bias=True)
(2): ReLU(inplace=True)
)
)
)
)
(deeptext): DeepText(
(word_embed): Embedding(2192, 100, padding_idx=1)
(rnn): LSTM(100, 64, num_layers=2, batch_first=True, dropout=0.5)
)
(deepimage): DeepImage(
(backbone): Sequential(
(0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
(1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(2): ReLU(inplace=True)
(3): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
(4): Sequential(
(0): BasicBlock(
(conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
(1): BasicBlock(
(conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(5): Sequential(
(0): BasicBlock(
(conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(6): Sequential(
(0): BasicBlock(
(conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(7): Sequential(
(0): BasicBlock(
(conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(8): AdaptiveAvgPool2d(output_size=(1, 1))
)
(imagehead): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=512, out_features=256, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=256, out_features=128, bias=True)
(2): ReLU(inplace=True)
)
)
)
)
(deephead): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=256, out_features=128, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=128, out_features=64, bias=True)
(2): ReLU(inplace=True)
)
)
(head_out): Linear(in_features=64, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>This is a big model, so let me go component by component.</p>
<ol>
<li><p><code>wide</code>: simple linear model implemented via an <code>Embedding</code> layer</p>
</li>
<li><p><code>deeptabular</code>: the embeddings of the categorical columns, concatenated with the continuous columns, are passed through two dense layers with sizes [196 $\rightarrow$ 128 $\rightarrow$ 64].</p>
</li>
<li><p><code>deeptext</code>: two stacked LSTMs that receive the pre-trained GloVe word vectors and output a last hidden state of dim 64 (this would be 128 had we used <code>bidirectional = True</code>)</p>
</li>
<li><p><code>deepimage</code>: a pre-trained ResNet 18 model where only the last <code>Sequential</code> block (7) will be trained; the rest remains "frozen". On top of it sits <code>imagehead</code>, an <code>MLP</code> composed of two dense layers with sizes [512 $\rightarrow$ 256 $\rightarrow$ 128]</p>
</li>
<li><p><code>deephead</code>: on top of the 3 deep components we have a final component referred to as <code>deephead</code>. It receives the concatenated output of all the deep components and passes it through a further collection of dense layers, in this case with sizes [256 $\rightarrow$ 128 $\rightarrow$ 64 $\rightarrow$ 1]. The input dim is 256 because the output dims of <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code> are 64, 64 and 128 respectively. The final <code>deephead</code> output dim is 1 because we are performing a regression, i.e. one output neuron with no activation function.</p>
</li>
</ol>
<p>Let's go a step further and use different optimizers, initializers and schedulers for the different components. Moreover, let's use different learning rates for different parameter groups within the <code>deeptabular</code> component. Remember, this is <em>Kaggle mode</em>.</p>
</div>
</div>
</div>
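<p>As a quick sanity check of the wiring described above, the input dim of the <code>deephead</code> is simply the sum of the output dims of the three deep components. A minimal sketch in plain Python:</p>

```python
# Output dims of each deep component, read off the model printout above
component_out_dims = {"deeptabular": 64, "deeptext": 64, "deepimage": 128}

# The deephead receives their concatenation, so the in_features of its
# first Linear layer must equal the sum of the component output dims
deephead_in_features = sum(component_out_dims.values())
print(deephead_in_features)  # 256
```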
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># Optimizers. Different parameter groups for the deeptabular component will use different lr</span>
<span class="n">tab_params</span> <span class="o">=</span> <span class="p">[]</span>
<span class="k">for</span> <span class="n">childname</span><span class="p">,</span> <span class="n">child</span> <span class="ow">in</span> <span class="n">model</span><span class="o">.</span><span class="n">named_children</span><span class="p">():</span>
<span class="k">if</span> <span class="n">childname</span> <span class="o">==</span> <span class="s1">'deeptabular'</span><span class="p">:</span>
<span class="k">for</span> <span class="n">n</span><span class="p">,</span><span class="n">p</span> <span class="ow">in</span> <span class="n">child</span><span class="o">.</span><span class="n">named_parameters</span><span class="p">():</span>
<span class="k">if</span> <span class="s2">"emb_layer"</span> <span class="ow">in</span> <span class="n">n</span><span class="p">:</span> <span class="n">tab_params</span><span class="o">.</span><span class="n">append</span><span class="p">({</span><span class="s1">'params'</span><span class="p">:</span> <span class="n">p</span><span class="p">,</span> <span class="s1">'lr'</span><span class="p">:</span> <span class="mf">0.01</span><span class="p">})</span>
<span class="k">else</span><span class="p">:</span> <span class="n">tab_params</span><span class="o">.</span><span class="n">append</span><span class="p">({</span><span class="s1">'params'</span><span class="p">:</span> <span class="n">p</span><span class="p">,</span> <span class="s1">'lr'</span><span class="p">:</span> <span class="mf">0.03</span><span class="p">})</span>
<span class="n">wide_opt</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">wide</span><span class="o">.</span><span class="n">parameters</span><span class="p">(),</span> <span class="n">lr</span><span class="o">=</span><span class="mf">0.03</span><span class="p">)</span>
<span class="n">tab_opt</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">tab_params</span><span class="p">)</span>
<span class="n">text_opt</span> <span class="o">=</span> <span class="n">RAdam</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deeptext</span><span class="o">.</span><span class="n">parameters</span><span class="p">())</span>
<span class="n">img_opt</span> <span class="o">=</span> <span class="n">RAdam</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deepimage</span><span class="o">.</span><span class="n">parameters</span><span class="p">())</span>
<span class="n">head_opt</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">AdamW</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deephead</span><span class="o">.</span><span class="n">parameters</span><span class="p">())</span>
<span class="n">optimizers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">wide_opt</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">tab_opt</span><span class="p">,</span> <span class="s1">'deeptext'</span><span class="p">:</span><span class="n">text_opt</span><span class="p">,</span> <span class="s1">'deepimage'</span><span class="p">:</span> <span class="n">img_opt</span><span class="p">,</span> <span class="s1">'deephead'</span><span class="p">:</span> <span class="n">head_opt</span><span class="p">}</span>
<span class="c1"># schedulers</span>
<span class="n">wide_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">StepLR</span><span class="p">(</span><span class="n">wide_opt</span><span class="p">,</span> <span class="n">step_size</span><span class="o">=</span><span class="mi">5</span><span class="p">)</span>
<span class="n">deep_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">MultiStepLR</span><span class="p">(</span><span class="n">tab_opt</span><span class="p">,</span> <span class="n">milestones</span><span class="o">=</span><span class="p">[</span><span class="mi">3</span><span class="p">,</span><span class="mi">8</span><span class="p">])</span>
<span class="n">text_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">StepLR</span><span class="p">(</span><span class="n">text_opt</span><span class="p">,</span> <span class="n">step_size</span><span class="o">=</span><span class="mi">5</span><span class="p">)</span>
<span class="n">img_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">MultiStepLR</span><span class="p">(</span><span class="n">tab_opt</span><span class="p">,</span> <span class="n">milestones</span><span class="o">=</span><span class="p">[</span><span class="mi">3</span><span class="p">,</span><span class="mi">8</span><span class="p">])</span>
<span class="n">head_sch</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">optim</span><span class="o">.</span><span class="n">lr_scheduler</span><span class="o">.</span><span class="n">StepLR</span><span class="p">(</span><span class="n">head_opt</span><span class="p">,</span> <span class="n">step_size</span><span class="o">=</span><span class="mi">5</span><span class="p">)</span>
<span class="n">schedulers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">wide_sch</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">deep_sch</span><span class="p">,</span> <span class="s1">'deeptext'</span><span class="p">:</span><span class="n">text_sch</span><span class="p">,</span> <span class="s1">'deepimage'</span><span class="p">:</span> <span class="n">img_sch</span><span class="p">,</span> <span class="s1">'deephead'</span><span class="p">:</span> <span class="n">head_sch</span><span class="p">}</span>
<span class="c1"># initializers</span>
<span class="n">initializers</span> <span class="o">=</span> <span class="p">{</span><span class="s1">'wide'</span><span class="p">:</span> <span class="n">KaimingNormal</span><span class="p">,</span> <span class="s1">'deeptabular'</span><span class="p">:</span><span class="n">KaimingNormal</span><span class="p">,</span>
<span class="s1">'deeptext'</span><span class="p">:</span><span class="n">KaimingNormal</span><span class="p">(</span><span class="n">pattern</span><span class="o">=</span><span class="sa">r</span><span class="s2">"^(?!.*word_embed).*$"</span><span class="p">),</span> <span class="c1"># do not initialize the pre-trained word-vectors!</span>
<span class="s1">'deepimage'</span><span class="p">:</span><span class="n">KaimingNormal</span><span class="p">}</span>
<span class="c1"># transforms and callbacks</span>
<span class="n">mean</span> <span class="o">=</span> <span class="p">[</span><span class="mf">0.406</span><span class="p">,</span> <span class="mf">0.456</span><span class="p">,</span> <span class="mf">0.485</span><span class="p">]</span> <span class="c1">#BGR</span>
<span class="n">std</span> <span class="o">=</span> <span class="p">[</span><span class="mf">0.225</span><span class="p">,</span> <span class="mf">0.224</span><span class="p">,</span> <span class="mf">0.229</span><span class="p">]</span> <span class="c1">#BGR</span>
<span class="n">transforms</span> <span class="o">=</span> <span class="p">[</span><span class="n">ToTensor</span><span class="p">,</span> <span class="n">Normalize</span><span class="p">(</span><span class="n">mean</span><span class="o">=</span><span class="n">mean</span><span class="p">,</span> <span class="n">std</span><span class="o">=</span><span class="n">std</span><span class="p">)]</span>
<span class="n">callbacks</span> <span class="o">=</span> <span class="p">[</span><span class="n">LRHistory</span><span class="p">(</span><span class="n">n_epochs</span><span class="o">=</span><span class="mi">10</span><span class="p">),</span> <span class="n">EarlyStopping</span><span class="p">,</span> <span class="n">ModelCheckpoint</span><span class="p">(</span><span class="n">filepath</span><span class="o">=</span><span class="s1">'model_weights/wd_out'</span><span class="p">)]</span>
</pre></div>
</div>
</div>
</div>
</div>
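<p>As a reminder of what <code>StepLR(wide_opt, step_size=5)</code> implies over the 10 epochs tracked by <code>LRHistory</code> above, here is the decay rule in plain Python (using <code>StepLR</code>'s default <code>gamma=0.1</code>; the 0.03 base learning rate is the one set for the <code>wide</code> optimizer):</p>

```python
def step_lr(base_lr, step_size, gamma, epoch):
    # StepLR decays the learning rate by `gamma` every `step_size` epochs
    return base_lr * gamma ** (epoch // step_size)

# Learning rate per epoch for the wide component: 0.03 for epochs 0-4,
# then decayed by a factor of 10 for epochs 5-9
lrs = [step_lr(0.03, step_size=5, gamma=0.1, epoch=e) for e in range(10)]
```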
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that, since we will use pre-trained word embeddings, we do not want to initialize these embeddings. However, you might still want to initialize the other layers in the <code>deeptext</code> component. This is not a problem: you can do it with the <code>pattern</code> parameter and a bit of regular-expression knowledge. In the <code>deeptext</code> initializer definition above:</p>
<div class="highlight"><pre><span></span><span class="n">KaimingNormal</span><span class="p">(</span><span class="n">pattern</span><span class="o">=</span><span class="sa">r</span><span class="s2">"^(?!.*word_embed).*$"</span><span class="p">)</span>
</pre></div>
<p>I am NOT initializing parameters whose name contains the string <code>word_embed</code>.</p>
<p>So...let's compile and run, which is as easy as:</p>
</div>
</div>
</div>
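<p>To see that negative lookahead in action, here is a quick check with Python's <code>re</code> module. The parameter names below are made up for the example; only the pattern itself comes from the snippet above:</p>

```python
import re

# Match any name that does NOT contain "word_embed" anywhere
pattern = re.compile(r"^(?!.*word_embed).*$")

# Hypothetical parameter names, just to show which ones would be initialized
param_names = ["word_embed.weight", "rnn.weight_hh_l0", "rnn.bias_ih_l0"]
to_initialize = [n for n in param_names if pattern.search(n)]
print(to_initialize)  # ['rnn.weight_hh_l0', 'rnn.bias_ih_l0']
```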
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"regression"</span><span class="p">,</span> <span class="n">initializers</span><span class="o">=</span><span class="n">initializers</span><span class="p">,</span> <span class="n">optimizers</span><span class="o">=</span><span class="n">optimizers</span><span class="p">,</span>
<span class="n">lr_schedulers</span><span class="o">=</span><span class="n">schedulers</span><span class="p">,</span> <span class="n">callbacks</span><span class="o">=</span><span class="n">callbacks</span><span class="p">,</span> <span class="n">transforms</span><span class="o">=</span><span class="n">transforms</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">X_text</span><span class="o">=</span><span class="n">X_text</span><span class="p">,</span> <span class="n">X_img</span><span class="o">=</span><span class="n">X_images</span><span class="p">,</span>
<span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">32</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 25/25 [02:11<00:00, 5.28s/it, loss=1.27e+4]
valid: 100%|██████████| 7/7 [00:15<00:00, 2.25s/it, loss=9.2e+3]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As I mentioned earlier in the post, please, <strong>do not focus on the success metric/loss</strong> (<code>mse</code> in this case). I am just using a very small sample of the dataset and a somewhat "random" setup. I just want to illustrate usability. A benchmark post will come in the "not-so-distant future".</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="4.-Finetune/Warmup-routines">4. Finetune/Warmup routines<a class="anchor-link" href="#4.-Finetune/Warmup-routines"> </a></h2><p>Let's place ourselves in two possible scenarios.</p>
<ol>
<li><p>Let's assume we have run a model and want to transfer what it learned (you know...transfer learning) to another dataset, or we have received new data and do not want to train each component from scratch. We simply want to load the pre-trained weights and fine-tune.</p>
</li>
<li><p>Or, we just want to "warm up" individual model components before the joint training begins.</p>
</li>
</ol>
<p>This can be done with the <code>finetune</code> set of parameters (all aliased as <code>warmup</code> parameters, if you prefer). There are 3 fine-tuning routines:</p>
<ol>
<li><p>Fine-tune all trainable layers at once with a triangular one-cycle learning rate (referred to as slanted triangular learning rates in Howard & Ruder, 2018 [3])</p>
</li>
<li><p>Gradual fine-tuning inspired by the work of Felbo et al., 2017 [2]</p>
</li>
<li><p>Gradual fine-tuning based on the work of Howard & Ruder 2018 [3]</p>
</li>
</ol>
<p>Currently fine-tuning is only supported without a fully connected head, i.e. if <code>deephead=None</code>. In addition, the Felbo and Howard routines apply, of course, only to the <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code> components. The <code>wide</code> component can also be fine-tuned, but only in an "all at once" mode.</p>
<p>Let me briefly describe the "Felbo" and "Howard" routines before showing how to use them.</p>
<h3 id="4.1-The-Felbo-finetune-routine">4.1 The Felbo finetune routine<a class="anchor-link" href="#4.1-The-Felbo-finetune-routine"> </a></h3><p>The Felbo fine-tune routine can be illustrated by the following figure:</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/felbo_routine.png" alt="resnet_block" style="max-width: 500px" />
</figure>
</p>
<p><strong>Figure 1</strong>. The figure can be described as follows: fine-tune (or train) the last layer for one epoch using a one-cycle triangular learning rate. Then fine-tune the next deeper layer for one epoch, with a learning rate that is a factor of 2.5 lower than the previous one (the 2.5 factor is fixed), while freezing the already warmed-up layer(s). Repeat until all individual layers are warmed. Then run one last epoch with all warmed layers trainable. The vanishing color gradient in the figure attempts to illustrate the decreasing learning rate.</p>
<p>Note that this is not identical to the fine-tuning routine described in Felbo et al., 2017, which is why I used the word 'inspired'.</p>
<h3 id="4.2-The-Howard-finetune-routine">4.2 The Howard finetune routine<a class="anchor-link" href="#4.2-The-Howard-finetune-routine"> </a></h3><p>The Howard routine can be illustrated by the following figure:</p>
<p><figure>
<img class="docimage" src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/howard_routine.png" alt="resnet_block" style="max-width: 500px" />
</figure>
</p>
<p><strong>Figure 2</strong>. The figure can be described as follows: fine-tune (or train) the last layer for one epoch using a one-cycle triangular learning rate. Then fine-tune the next deeper layer for one epoch, with a learning rate that is a factor of 2.5 lower than the previous one (the 2.5 factor is fixed), while keeping the already warmed-up layer(s) trainable. Repeat. The vanishing color gradient in the figure attempts to illustrate the decreasing learning rate.</p>
<p>Note that I write "fine-tune (or train) the last layer for one epoch [...]". In practice, however, the user will have to specify the order of the layers to be fine-tuned. This is another reason why I wrote that the fine-tune routines I have implemented are inspired by the work of Felbo and Howard and not identical to their implementations.</p>
<p>The Felbo and Howard routines can be accessed via the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/trainer.html">finetune parameters</a> (aliased as <code>warmup</code> parameters in case the user wants consistent naming). Let me go back to the adult dataset and have a look:</p>
</div>
</div>
</div>
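<p>The difference between the two gradual routines can be sketched as a per-epoch plan in plain Python. This is illustrative only: the layer names and the 0.01 max learning rate are assumptions, the 2.5 divisor comes from the description above, and Felbo's final all-layers epoch is omitted for brevity:</p>

```python
MAX_LR, FACTOR = 0.01, 2.5
layers = ["last", "middle", "first"]  # fine-tuned from last to first

def felbo_plan(layers):
    # Felbo: at each step only the current layer is trainable;
    # previously warmed layers are frozen again
    return [(layer, MAX_LR / FACTOR**i, [layer]) for i, layer in enumerate(layers)]

def howard_plan(layers):
    # Howard: previously warmed layers stay trainable as we go deeper
    return [(layer, MAX_LR / FACTOR**i, layers[: i + 1]) for i, layer in enumerate(layers)]

for layer, lr, trainable in howard_plan(layers):
    print(f"fine-tune {layer} at lr={lr:.4f}, trainable={trainable}")
```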
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'relationship'</span><span class="p">,</span><span class="s1">'workclass'</span><span class="p">,</span><span class="s1">'occupation'</span><span class="p">,</span><span class="s1">'native_country'</span><span class="p">,</span><span class="s1">'gender'</span><span class="p">]</span>
<span class="n">crossed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">),</span> <span class="p">(</span><span class="s1">'native_country'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">)]</span>
<span class="n">cat_embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'relationship'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'workclass'</span><span class="p">,</span><span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s1">'occupation'</span><span class="p">,</span><span class="mi">32</span><span class="p">),(</span><span class="s1">'native_country'</span><span class="p">,</span><span class="mi">32</span><span class="p">)]</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"age"</span><span class="p">,</span><span class="s2">"hours_per_week"</span><span class="p">]</span>
<span class="n">target_col</span> <span class="o">=</span> <span class="s1">'income_label'</span>
<span class="c1"># TARGET</span>
<span class="n">target</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">target_col</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
<span class="c1"># WIDE</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">crossed_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
<span class="c1"># DEEP</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">cat_embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
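<p>In case the crossed columns are new to you: a crossed column is just a new categorical feature built by concatenating the values of two (or more) columns, so the <code>wide</code> model can learn interaction-specific weights. A hand-rolled sketch of the idea (the exact separator <code>WidePreprocessor</code> uses internally may differ; the row values are made up):</p>

```python
# One illustrative row; column names match the adult dataset used above
row = {"education": "Bachelors", "occupation": "Sales", "native_country": "United-States"}

crossed_cols = [("education", "occupation"), ("native_country", "occupation")]

# Build each crossed feature by joining the individual column values
crossed_features = {
    "_".join(cols): "-".join(str(row[c]) for c in cols) for cols in crossed_cols
}
print(crossed_features)
```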
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="n">np</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">X_wide</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="n">deeptabular</span> <span class="o">=</span> <span class="n">TabResnet</span><span class="p">(</span>
<span class="n">blocks_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">128</span><span class="p">,</span> <span class="mi">64</span><span class="p">,</span> <span class="mi">32</span><span class="p">],</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span> <span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(wide): Wide(
(wide_linear): Embedding(797, 1, padding_idx=0)
)
(deeptabular): Sequential(
(0): TabResnet(
(embed_layers): ModuleDict(
(emb_layer_education): Embedding(17, 32, padding_idx=0)
(emb_layer_native_country): Embedding(43, 32, padding_idx=0)
(emb_layer_occupation): Embedding(16, 32, padding_idx=0)
(emb_layer_relationship): Embedding(7, 32, padding_idx=0)
(emb_layer_workclass): Embedding(10, 32, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(tab_resnet): DenseResnet(
(dense_resnet): Sequential(
(lin1): Linear(in_features=162, out_features=128, bias=True)
(bn1): BatchNorm1d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(block_0): BasicBlock(
(lin1): Linear(in_features=128, out_features=64, bias=True)
(bn1): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=64, out_features=64, bias=True)
(bn2): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=128, out_features=64, bias=True)
(1): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(block_1): BasicBlock(
(lin1): Linear(in_features=64, out_features=32, bias=True)
(bn1): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=32, out_features=32, bias=True)
(bn2): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=64, out_features=32, bias=True)
(1): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
)
)
)
(1): Linear(in_features=32, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
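<p>Note the <code>resize</code> branch inside each <code>BasicBlock</code>: because the main path changes the feature dimension (e.g. 128 $\rightarrow$ 64 in <code>block_0</code>), the skip connection needs a linear projection so the residual addition is shape-compatible. A dimension-only sketch of that logic:</p>

```python
def basic_block_dims(d_in, d_out):
    # main path: lin1 (d_in -> d_out) followed by lin2 (d_out -> d_out)
    main_path = [d_in, d_out, d_out]
    # skip path: identity if shapes already agree, otherwise a resize Linear
    needs_resize = d_in != d_out
    return main_path, needs_resize

print(basic_block_dims(128, 64))  # block_0 above: resize needed
print(basic_block_dims(64, 32))   # block_1 above: resize needed
```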
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[</span><span class="n">Accuracy</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.1</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 172/172 [00:06<00:00, 26.32it/s, loss=0.415, metrics={'acc': 0.8016}]
valid: 100%|██████████| 20/20 [00:00<00:00, 74.72it/s, loss=0.364, metrics={'acc': 0.8044}]
epoch 2: 100%|██████████| 172/172 [00:06<00:00, 26.31it/s, loss=0.372, metrics={'acc': 0.8249}]
valid: 100%|██████████| 20/20 [00:00<00:00, 76.28it/s, loss=0.356, metrics={'acc': 0.8256}]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">save_model</span><span class="p">(</span><span class="s2">"models_dir/model.t"</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Now we are going to fine-tune the model components. In the case of the <code>deeptabular</code> component, we will fine-tune the resnet blocks and the linear layer, but NOT the embeddings.</p>
<p>For this, we need to access the model component's children: <code>deeptabular</code> $\rightarrow$ <code>tab_resnet</code> $\rightarrow$ <code>dense_resnet</code> $\rightarrow$ <code>blocks</code></p>
</div>
</div>
</div>
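As a side note (this is not from the original post): the chained `list(...children())` calls used below work, but the same sub-modules can also be located by their dotted names via PyTorch's `named_modules()`, which is often easier to read. A minimal sketch with a hypothetical toy model that mimics the nested structure:

```python
import torch.nn as nn

# Hypothetical toy model mimicking a nested structure such as
# deeptabular -> dense_resnet -> blocks (names here are illustrative only)
model = nn.Sequential()
model.add_module(
    "deeptabular",
    nn.Sequential(nn.Linear(8, 4), nn.Sequential(nn.Linear(4, 2), nn.Linear(2, 2))),
)

# dict mapping dotted names -> sub-modules; avoids chained list(...children()) calls
modules = dict(model.named_modules())
block = modules["deeptabular.1.0"]  # the Linear(4, 2) inside the inner Sequential
print(block)
```

The same idea applies to the real `WideDeep` model: printing the model (as done above) shows the nesting, and `named_modules()` lets you pick out any block by name.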
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># you can just load the model as any pytorch model or use the Trainer's staticmethod `load_model`</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">Trainer</span><span class="o">.</span><span class="n">load_model</span><span class="p">(</span><span class="s2">"models_dir/model.t"</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tab_lin_layers</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deeptabular</span><span class="o">.</span><span class="n">children</span><span class="p">())[</span><span class="mi">1</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tab_deep_layers</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span>
<span class="nb">list</span><span class="p">(</span><span class="nb">list</span><span class="p">(</span><span class="nb">list</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">deeptabular</span><span class="o">.</span><span class="n">children</span><span class="p">())[</span><span class="mi">0</span><span class="p">]</span><span class="o">.</span><span class="n">children</span><span class="p">())[</span><span class="mi">2</span><span class="p">]</span><span class="o">.</span><span class="n">children</span><span class="p">())[</span>
<span class="mi">0</span>
<span class="p">]</span><span class="o">.</span><span class="n">children</span><span class="p">()</span>
<span class="p">)[::</span><span class="o">-</span><span class="mi">1</span><span class="p">][:</span><span class="mi">2</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tab_layers</span> <span class="o">=</span> <span class="p">[</span><span class="n">tab_lin_layers</span><span class="p">]</span> <span class="o">+</span> <span class="n">tab_deep_layers</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tab_layers</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>[Linear(in_features=32, out_features=1, bias=True),
BasicBlock(
(lin1): Linear(in_features=64, out_features=32, bias=True)
(bn1): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=32, out_features=32, bias=True)
(bn2): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=64, out_features=32, bias=True)
(1): BatchNorm1d(32, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
),
BasicBlock(
(lin1): Linear(in_features=128, out_features=64, bias=True)
(bn1): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=64, out_features=64, bias=True)
(bn2): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=128, out_features=64, bias=True)
(1): BatchNorm1d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)]</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">new_trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[</span><span class="n">Accuracy</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">new_trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span>
<span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span>
<span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span>
<span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span>
<span class="n">val_split</span><span class="o">=</span><span class="mf">0.1</span><span class="p">,</span>
<span class="n">finetune</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">finetune_epochs</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span>
<span class="n">finetune_deeptabular_gradual</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">finetune_deeptabular_layers</span> <span class="o">=</span> <span class="n">tab_layers</span><span class="p">,</span>
<span class="n">finetune_deeptabular_max_lr</span> <span class="o">=</span> <span class="mf">0.01</span><span class="p">,</span>
<span class="n">n_epochs</span><span class="o">=</span><span class="mi">2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre> 0%| | 0/1374 [00:00<?, ?it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Training wide for 2 epochs
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 1374/1374 [00:09<00:00, 150.31it/s, loss=0.421, metrics={'acc': 0.7995}]
epoch 2: 100%|██████████| 1374/1374 [00:08<00:00, 160.97it/s, loss=0.361, metrics={'acc': 0.8158}]
0%| | 0/1374 [00:00<?, ?it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Training deeptabular, layer 1 of 3
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 1374/1374 [00:23<00:00, 58.62it/s, loss=0.385, metrics={'acc': 0.8172}]
0%| | 0/1374 [00:00<?, ?it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Training deeptabular, layer 2 of 3
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 1374/1374 [00:26<00:00, 51.08it/s, loss=0.373, metrics={'acc': 0.8193}]
0%| | 0/1374 [00:00<?, ?it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Training deeptabular, layer 3 of 3
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 1374/1374 [00:24<00:00, 55.97it/s, loss=0.368, metrics={'acc': 0.8207}]
0%| | 0/1374 [00:00<?, ?it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Fine-tuning of individual components completed. Training the whole model for 2 epochs
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 1374/1374 [00:33<00:00, 41.35it/s, loss=0.352, metrics={'acc': 0.8373}]
valid: 100%|██████████| 153/153 [00:01<00:00, 113.01it/s, loss=0.35, metrics={'acc': 0.8368}]
epoch 2: 100%|██████████| 1374/1374 [00:31<00:00, 43.85it/s, loss=0.344, metrics={'acc': 0.8398}]
valid: 100%|██████████| 153/153 [00:01<00:00, 129.62it/s, loss=0.348, metrics={'acc': 0.8395}]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="5.-Custom-model">5. Custom model<a class="anchor-link" href="#5.-Custom-model"> </a></h2><p>So far we have used the components that come with <code>pytorch-widedeep</code>. However, as I mentioned in the first post, it is very likely that users will want to use custom models for the <code>deeptext</code> and <code>deepimage</code> components. This is easily attainable by...well...simply passing your own model.</p>
<p>You should just remember that the model must return the last layer of activations (and NOT the predictions) and must contain an attribute called <code>output_dim</code> with the output dimension of that last layer.</p>
<p>For example, let's say we want to use as <code>deeptext</code> a <strong>very</strong> simple stack of 2 bidirectional GRUs. Let's see how to do this with the Airbnb dataset.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">crossed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'property_type'</span><span class="p">,</span> <span class="s1">'room_type'</span><span class="p">)]</span>
<span class="n">already_dummies</span> <span class="o">=</span> <span class="p">[</span><span class="n">c</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">airbnb</span><span class="o">.</span><span class="n">columns</span> <span class="k">if</span> <span class="s1">'amenity'</span> <span class="ow">in</span> <span class="n">c</span><span class="p">]</span> <span class="o">+</span> <span class="p">[</span><span class="s1">'has_house_rules'</span><span class="p">]</span>
<span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'is_location_exact'</span><span class="p">,</span> <span class="s1">'property_type'</span><span class="p">,</span> <span class="s1">'room_type'</span><span class="p">,</span> <span class="s1">'host_gender'</span><span class="p">,</span>
<span class="s1">'instant_bookable'</span><span class="p">]</span> <span class="o">+</span> <span class="n">already_dummies</span>
<span class="n">cat_embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="n">c</span><span class="p">,</span> <span class="mi">16</span><span class="p">)</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">airbnb</span><span class="o">.</span><span class="n">columns</span> <span class="k">if</span> <span class="s1">'catg'</span> <span class="ow">in</span> <span class="n">c</span><span class="p">]</span> <span class="o">+</span> \
<span class="p">[(</span><span class="s1">'neighbourhood_cleansed'</span><span class="p">,</span> <span class="mi">64</span><span class="p">),</span> <span class="p">(</span><span class="s1">'cancellation_policy'</span><span class="p">,</span> <span class="mi">16</span><span class="p">)]</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'latitude'</span><span class="p">,</span> <span class="s1">'longitude'</span><span class="p">,</span> <span class="s1">'security_deposit'</span><span class="p">,</span> <span class="s1">'extra_people'</span><span class="p">]</span>
<span class="n">already_standard</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'latitude'</span><span class="p">,</span> <span class="s1">'longitude'</span><span class="p">]</span>
<span class="n">text_col</span> <span class="o">=</span> <span class="s1">'description'</span>
<span class="n">img_col</span> <span class="o">=</span> <span class="s1">'id'</span>
<span class="n">word_vectors_path</span> <span class="o">=</span> <span class="s1">'data/glove.6B/glove.6B.100d.txt'</span>
<span class="n">img_path</span> <span class="o">=</span> <span class="s1">'data/airbnb/property_picture'</span>
<span class="n">target_col</span> <span class="o">=</span> <span class="s1">'yield'</span>
<span class="n">target</span> <span class="o">=</span> <span class="n">airbnb</span><span class="p">[</span><span class="n">target_col</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">crossed_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">cat_embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">text_preprocessor</span> <span class="o">=</span> <span class="n">TextPreprocessor</span><span class="p">(</span><span class="n">word_vectors_path</span><span class="o">=</span><span class="n">word_vectors_path</span><span class="p">,</span> <span class="n">text_col</span><span class="o">=</span><span class="n">text_col</span><span class="p">)</span>
<span class="n">X_text</span> <span class="o">=</span> <span class="n">text_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="n">image_processor</span> <span class="o">=</span> <span class="n">ImagePreprocessor</span><span class="p">(</span><span class="n">img_col</span> <span class="o">=</span> <span class="n">img_col</span><span class="p">,</span> <span class="n">img_path</span> <span class="o">=</span> <span class="n">img_path</span><span class="p">)</span>
<span class="n">X_images</span> <span class="o">=</span> <span class="n">image_processor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>The vocabulary contains 2192 tokens
Indexing word vectors...
Loaded 400000 word vectors
Preparing embeddings matrix...
2175 words in the vocabulary had data/glove.6B/glove.6B.100d.txt vectors and appear more than 5 times
Reading Images from data/airbnb/property_picture
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre> 4%|▍ | 39/1001 [00:00<00:02, 389.27it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Resizing
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>100%|██████████| 1001/1001 [00:02<00:00, 381.95it/s]
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Computing normalisation metrics
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>
<span class="k">class</span> <span class="nc">MyDeepText</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Module</span><span class="p">):</span>
<span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">vocab_size</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">embed_dim</span><span class="o">=</span><span class="mi">100</span><span class="p">,</span> <span class="n">hidden_dim</span><span class="o">=</span><span class="mi">64</span><span class="p">):</span>
<span class="nb">super</span><span class="p">(</span><span class="n">MyDeepText</span><span class="p">,</span> <span class="bp">self</span><span class="p">)</span><span class="o">.</span><span class="fm">__init__</span><span class="p">()</span>
<span class="c1"># word/token embeddings</span>
<span class="bp">self</span><span class="o">.</span><span class="n">word_embed</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">Embedding</span><span class="p">(</span>
<span class="n">vocab_size</span><span class="p">,</span> <span class="n">embed_dim</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="n">padding_idx</span>
<span class="p">)</span>
<span class="c1"># stack of RNNs</span>
<span class="bp">self</span><span class="o">.</span><span class="n">rnn</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">GRU</span><span class="p">(</span>
<span class="n">embed_dim</span><span class="p">,</span>
<span class="n">hidden_dim</span><span class="p">,</span>
<span class="n">num_layers</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span>
<span class="n">bidirectional</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">batch_first</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="p">)</span>
<span class="c1"># Remember, this must be defined. If not, WideDeep will throw an error</span>
<span class="bp">self</span><span class="o">.</span><span class="n">output_dim</span> <span class="o">=</span> <span class="n">hidden_dim</span> <span class="o">*</span> <span class="mi">2</span>
<span class="k">def</span> <span class="nf">forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">X</span><span class="p">):</span>
<span class="n">embed</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">word_embed</span><span class="p">(</span><span class="n">X</span><span class="o">.</span><span class="n">long</span><span class="p">())</span>
<span class="n">o</span><span class="p">,</span> <span class="n">h</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">rnn</span><span class="p">(</span><span class="n">embed</span><span class="p">)</span>
<span class="k">return</span> <span class="n">torch</span><span class="o">.</span><span class="n">cat</span><span class="p">((</span><span class="n">h</span><span class="p">[</span><span class="o">-</span><span class="mi">2</span><span class="p">],</span> <span class="n">h</span><span class="p">[</span><span class="o">-</span><span class="mi">1</span><span class="p">]),</span> <span class="n">dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
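A quick note on the indexing in `forward` above (shapes below are illustrative, not from the original post): for a bidirectional, multi-layer GRU, the hidden state `h` has shape `(num_layers * 2, batch, hidden_dim)`, so `h[-2]` and `h[-1]` are the final forward and backward hidden states of the top layer. Concatenating them yields a tensor of size `hidden_dim * 2`, which is exactly why `output_dim` is set to `hidden_dim * 2`:

```python
import torch
import torch.nn as nn

# A 2-layer bidirectional GRU with the same dimensions as MyDeepText above
rnn = nn.GRU(input_size=100, hidden_size=64, num_layers=2,
             bidirectional=True, batch_first=True)
x = torch.randn(8, 20, 100)  # (batch, seq_len, embed_dim)
o, h = rnn(x)

print(h.shape)  # (num_layers * 2, batch, hidden_dim) -> torch.Size([4, 8, 64])
# h[-2] / h[-1]: the top layer's final forward / backward hidden states
out = torch.cat((h[-2], h[-1]), dim=1)
print(out.shape)  # torch.Size([8, 128]) == (batch, hidden_dim * 2)
```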
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And from here, "<em>proceed as usual</em>".</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="n">np</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">X_wide</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="n">deeptabular</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">64</span><span class="p">,</span><span class="mi">32</span><span class="p">],</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span>
<span class="p">)</span>
<span class="n">mydeeptext</span> <span class="o">=</span> <span class="n">MyDeepText</span><span class="p">(</span><span class="n">vocab_size</span><span class="o">=</span><span class="nb">len</span><span class="p">(</span><span class="n">text_preprocessor</span><span class="o">.</span><span class="n">vocab</span><span class="o">.</span><span class="n">itos</span><span class="p">))</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span> <span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">,</span> <span class="n">deeptext</span><span class="o">=</span><span class="n">mydeeptext</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>WideDeep(
(wide): Wide(
(wide_linear): Embedding(357, 1, padding_idx=0)
)
(deeptabular): Sequential(
(0): TabMlp(
(embed_layers): ModuleDict(
(emb_layer_accommodates_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_bathrooms_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_bedrooms_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_beds_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_cancellation_policy): Embedding(6, 16, padding_idx=0)
(emb_layer_guests_included_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_host_listings_count_catg): Embedding(5, 16, padding_idx=0)
(emb_layer_minimum_nights_catg): Embedding(4, 16, padding_idx=0)
(emb_layer_neighbourhood_cleansed): Embedding(33, 64, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=196, out_features=64, bias=True)
(2): ReLU(inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=64, out_features=32, bias=True)
(2): ReLU(inplace=True)
)
)
)
)
(1): Linear(in_features=32, out_features=1, bias=True)
)
(deeptext): Sequential(
(0): MyDeepText(
(word_embed): Embedding(2192, 100, padding_idx=1)
(rnn): GRU(100, 64, num_layers=2, batch_first=True, bidirectional=True)
)
(1): Linear(in_features=128, out_features=1, bias=True)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"regression"</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">X_text</span><span class="o">=</span><span class="n">X_text</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">64</span><span class="p">,</span> <span class="n">val_split</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 13/13 [00:03<00:00, 3.77it/s, loss=1.79e+4]
valid: 100%|██████████| 4/4 [00:00<00:00, 13.34it/s, loss=1.49e+4]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="6.-Conclusion">6. Conclusion<a class="anchor-link" href="#6.-Conclusion"> </a></h2><p>In this second post I tried to illustrate in detail the different functionalities of the <code>pytorch-widedeep</code> package, and how these can be used to customize each of the four potential components of the <code>WideDeep</code> model that can be built with <code>pytorch-widedeep</code>. I have also described the fine-tuning (or "warm-up") routines that can be used to fine-tune each individual component before the joint training, and finally, how custom models, "external" to <code>pytorch-widedeep</code>, can be used in combination with the package.</p>
<p>However, this is not the end of the journey. As you will have seen, there is an "<em>imbalance in the <code>pytorch-widedeep</code> force</em>", in the sense that while fully pre-trained models are incorporated for the <code>deepimage</code> component, this is not the case for the <code>deeptext</code> component, where only pre-trained word embeddings are considered. Of course, as illustrated in Section 5, you could build your own pre-trained <code>deeptext</code> component and pass it to the <code>WideDeep</code> constructor class, but eventually I want to allow that option within the package.</p>
<p>This means that, eventually, I will need to integrate the library with some of the pre-trained language models available, or simply code a custom version for <code>pytorch-widedeep</code>.</p>
<p>On the other hand, I want to bring more DL models to the <code>deeptabular</code> component, such as <a href="https://arxiv.org/pdf/1908.07442.pdf">TabNet</a>. There is already a fantastic <a href="https://github.com/dreamquark-ai/tabnet"><code>Pytorch</code> implementation</a>, which I highly recommend.</p>
<p>If you made it this far, thanks for reading! And if you use the package, let me know your thoughts!</p>
<h4 id="References">References<a class="anchor-link" href="#References"> </a></h4><p>[1] Tsung-Yi Lin, Priya Goyal, Ross Girshick, et al., 2018: Focal Loss for Dense Object Detection. <a href="https://arxiv.org/pdf/1708.02002.pdf">arXiv:1708.02002v2</a></p>
<p>[2] Bjarke Felbo, Alan Mislove, Anders Søgaard, et al., 2017: Using millions of emoji occurrences to learn any-domain representations for detecting sentiment, emotion and sarcasm. <a href="https://arxiv.org/abs/1708.00524">arXiv:1708.00524</a></p>
<p>[3] Jeremy Howard, Sebastian Ruder, 2018: Universal Language Model Fine-tuning for Text Classification. <a href="https://arxiv.org/abs/1801.06146">arXiv:1801.06146v5</a></p>
</div>
</div>
</div>
</div>Javier Rodriguezpytorch-widedeep, deep learning for tabular data I: data preprocessing, model components and basic use2020-12-06T00:00:00-06:002020-12-06T00:00:00-06:00https://jrzaurin.github.io/infinitoml/2020/12/06/pytorch-widedeep<!--
#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: _notebooks/2020-12-06-pytorch-widedeep.ipynb
-->
<div class="container" id="notebook-container">
<div class="cell border-box-sizing code_cell rendered">
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>This is the first of a series of posts introducing <a href="https://github.com/jrzaurin/pytorch-widedeep">pytorch-widedeep</a>, which is intended to be a flexible package to use Deep Learning (hereafter DL) with tabular data and combine it with text and images via wide and deep models. <code>pytorch-widedeep</code> is partially based on Heng-Tze Cheng et al., 2016 <a href="https://arxiv.org/abs/1606.07792">paper</a> [1].</p>
<p>In this post I describe the data preprocessing functionalities of the library, the main components of the model, and the basic use of the library. In a separate post I will show a more advanced use of <code>pytorch-widedeep</code>.</p>
<p>Before I move any further I just want to emphasize that there are a number of libraries that implement functionalities to use DL on tabular data. To cite a few, the ubiquitous and fantastic <a href="https://docs.fast.ai/tutorial.tabular.html">FastAI</a> (and their tabular api), NVIDIA's <a href="https://github.com/NVIDIA/NVTabular">NVTabular</a>, the powerful <a href="https://github.com/dreamquark-ai/tabnet">pytorch-tabnet</a> based on the work of Sercan O. Arik and Tomas Pfister [2], which is starting to take victories in Kaggle competitions, and perhaps my favourite, <a href="https://arxiv.org/abs/2003.06505">AutoGluon Tabular</a> [3].</p>
<p>It is not my intention to "compete" against these libraries. <code>pytorch-widedeep</code> started as an attempt to package and automate an algorithm I had to use a couple of times at work and ended up becoming the entertaining process that is building a library. Needless to say that if you want to apply DL to tabular data you should go and check all the libraries I mentioned before (as well as this one 🙂). You can find the source code <a href="https://github.com/jrzaurin/pytorch-widedeep">here</a>.</p>
<h2 id="1.-Installation">1. Installation<a class="anchor-link" href="#1.-Installation"> </a></h2><p>To install the package simply use pip:</p>
<div class="highlight"><pre><span></span>pip install pytorch-widedeep
</pre></div>
<p>or directly from github</p>
<div class="highlight"><pre><span></span>pip install git+https://github.com/jrzaurin/pytorch-widedeep.git
</pre></div>
<p><strong>Important note for Mac Users</strong></p>
<p>Note that the following comments are not directly related to the package, but to the interplay between <code>pytorch</code> and <code>OSX</code> (more precisely <code>pytorch</code>'s dependency on <code>OpenMP</code> I believe) and in general parallel processing in Mac.</p>
<p>In the first place, at the time of writing the latest <code>pytorch</code> version is <code>1.7</code>. This version is known to have some <a href="https://stackoverflow.com/questions/64772335/pytorch-w-parallelnative-cpp206">issues</a> when running on Mac and the data-loaders might not run in parallel.</p>
<p>On the other hand, since <code>Python 3.8</code> the <code>multiprocessing</code> library start method changed from <a href="https://docs.python.org/3/library/multiprocessing.html#contexts-and-start-methods">'fork' to 'spawn'</a>. This also affects the data-loaders (for any torch version) and they will not run in parallel.</p>
<p>Therefore, for Mac users I suggest using <code>python 3.7</code> and <code>torch <= 1.6</code> (with its corresponding <code>torchvision</code> version, i.e. <code><= 0.7.0</code>). I could have enforced this versioning via the <code>setup.py</code> file. However, there are a number of unknowns and I preferred to leave it as it is. For example, I developed the package using <em>macOS Catalina</em> and maybe some of these issues are not present in the new release, <em>Big Sur</em>. I also hope that a patch for <code>pytorch 1.7</code> is released soon so that some, if not all, of these problems disappear.</p>
<p>Installing <code>pytorch-widedeep</code> via <code>pip</code> will install the latest version. Therefore, if these problems are present and the dataloaders do not run in parallel, one can easily downgrade manually:</p>
<div class="highlight"><pre><span></span>pip install <span class="nv">torch</span><span class="o">==</span><span class="m">1</span>.6.0 <span class="nv">torchvision</span><span class="o">==</span><span class="m">0</span>.7.0
</pre></div>
<p><em>None of these issues affect Linux users</em></p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="2.-pytorch-widedeep--architectures">2. <code>pytorch-widedeep</code> architectures<a class="anchor-link" href="#2.-pytorch-widedeep--architectures"> </a></h2><p>In general terms, <code>pytorch-widedeep</code> is a package to use deep learning with tabular data. In particular, it is intended to facilitate the combination of text and images with corresponding tabular data using wide and deep models. With that in mind, there are a number of architectures that can be implemented with just a few lines of code. The main components of those architectures are shown in the Figure below:</p>
<p><img src="/infinitoml/images/copied_from_nb/figures/pytorch-widedeep/widedeep_arch.png" alt="" /></p>
<p>The dashed boxes in the figure represent optional, overall components, and the dashed lines/arrows indicate the corresponding connections, depending on whether or not certain components are present. For example, the dashed, blue-arrows indicate that the <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code> components are connected directly to the output neuron or neurons (depending on whether we are performing a binary classification or regression, or a multi-class classification) if the optional <code>deephead</code> is not present. Finally, the components within the faded-pink rectangle are concatenated.</p>
<p>Note that it is not possible to illustrate the number of architectures and components available in <code>pytorch-widedeep</code> in one Figure. This is why I wrote before "overall components", because within the components represented by the boxes, there are a number of options as well. Therefore, for more details on possible architectures (and more) please, see the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/index.html">documentation</a>, or the Examples folders and the notebooks in the <a href="https://github.com/jrzaurin/pytorch-widedeep">repo</a>.</p>
<p>In math terms, and following the notation in the <a href="https://arxiv.org/abs/1606.07792">paper</a>, the expression for the architecture without a <code>deephead</code> component can be formulated as:</p>
$$
preds = \sigma(W^{T}_{wide}[x, \phi(x)] + W^{T}_{deeptabular}a^{(l_f)}_{dense} + W^{T}_{deeptext}a^{(l_f)}_{text} + W^{T}_{deepimage}a^{(l_f)}_{image} + b)
$$<p>Where $W$ are the weight matrices applied to the wide model and to the final activations of the deep models, $a$ are these final activations, and $\phi(x)$ are the cross product transformations of the original features $x$. In case you are wondering what are "<em>cross product transformations</em>", here is a quote taken directly from the paper: "<em>For binary features, a cross-product transformation (e.g., “AND(gender=female, language=en)”) is 1 if and only if the constituent features (“gender=female” and “language=en”) are all 1, and 0 otherwise</em>".</p>
<p>While if there is a <code>deephead</code> component, the previous expression turns into:</p>
$$
preds = \sigma(W^{T}_{wide}[x, \phi(x)] + W^{T}_{deephead}a^{(l_f)}_{deephead} + b)
$$<p>It is important to emphasize that <strong>each individual component, <code>wide</code>, <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code>, can be used independently</strong> and in isolation. For example, one could use only <code>wide</code>, which is simply a linear
model. In fact, one of the most interesting offerings in <code>pytorch-widedeep</code> is the <code>deeptabular</code> component, and I intend to write a dedicated post focused on that component alone.</p>
<p>Finally, while I recommend using the <code>wide</code> and <code>deeptabular</code> models in <code>pytorch-widedeep</code>, it is very likely that users will want to use their own models for the <code>deeptext</code> and <code>deepimage</code> components. That is perfectly
possible as long as the custom models have an attribute called <code>output_dim</code> with the size of the last layer of activations, so that <code>WideDeep</code> can be constructed. Again, examples on how to use custom components can be found in the Examples folder in the repo. Just in case, <code>pytorch-widedeep</code> includes standard text (stack of LSTMs) and image
(pre-trained ResNets or stack of CNNs) models.</p>
<h2 id="3.-Quick-start-(TL;DR)">3. Quick start (TL;DR)<a class="anchor-link" href="#3.-Quick-start-(TL;DR)"> </a></h2><p>Maybe I should have started with this section, but I thought that knowing at least the architectures one can build with <code>pytorch-widedeep</code> was more or less necessary first. In any case, before diving into the details of the library, let's say you just want to quickly run one example and get a feel for how <code>pytorch-widedeep</code> works. Let's do so using the <a href="http://archive.ics.uci.edu/ml/datasets/Adult">adult census dataset</a>.</p>
<p>In this example we will be fitting a model composed of two components: <code>wide</code> and <code>deeptabular</code>.</p>
</div>
</div>
</div>
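Before moving on, here is a minimal sketch of the cross-product transformation quoted above. This is illustrative code, not the package's internal implementation: for raw categorical columns, the binary "AND" of two one-hot features amounts to forming a new categorical feature from the concatenation of the two values (which matches the <code>education_occupation</code> column shown later by <code>inverse_transform</code>).

```python
# Illustrative sketch of a cross-product transformation (not the
# package's internal code). AND(education=11th, occupation=x) is 1
# iff both constituent one-hot features are 1, which for raw
# categorical columns is equivalent to crossing the two values.
def cross_product(row, cols):
    """row: dict mapping column name -> category; cols: columns to cross."""
    return "-".join(str(row[c]) for c in cols)

row = {"education": "11th", "occupation": "machine-op-inspct"}
print(cross_product(row, ("education", "occupation")))
# 11th-machine-op-inspct
```

The resulting crossed column is then label-encoded like any other categorical column of the wide component.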
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">from</span> <span class="nn">sklearn.model_selection</span> <span class="kn">import</span> <span class="n">train_test_split</span>
<span class="kn">from</span> <span class="nn">sklearn.metrics</span> <span class="kn">import</span> <span class="n">accuracy_score</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">adult</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s2">"data/adult/adult.csv.zip"</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">columns</span> <span class="o">=</span> <span class="p">[</span><span class="n">c</span><span class="o">.</span><span class="n">replace</span><span class="p">(</span><span class="s2">"-"</span><span class="p">,</span> <span class="s2">"_"</span><span class="p">)</span> <span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">]</span>
<span class="n">adult</span><span class="p">[</span><span class="s2">"income_label"</span><span class="p">]</span> <span class="o">=</span> <span class="p">(</span><span class="n">adult</span><span class="p">[</span><span class="s2">"income"</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">">50K"</span> <span class="ow">in</span> <span class="n">x</span><span class="p">))</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="nb">int</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">drop</span><span class="p">(</span><span class="s2">"income"</span><span class="p">,</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">inplace</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="k">for</span> <span class="n">c</span> <span class="ow">in</span> <span class="n">adult</span><span class="o">.</span><span class="n">columns</span><span class="p">:</span>
<span class="k">if</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">dtype</span> <span class="o">==</span> <span class="s1">'O'</span><span class="p">:</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">apply</span><span class="p">(</span><span class="k">lambda</span> <span class="n">x</span><span class="p">:</span> <span class="s2">"unknown"</span> <span class="k">if</span> <span class="n">x</span> <span class="o">==</span> <span class="s2">"?"</span> <span class="k">else</span> <span class="n">x</span><span class="p">)</span>
<span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span> <span class="o">=</span> <span class="n">adult</span><span class="p">[</span><span class="n">c</span><span class="p">]</span><span class="o">.</span><span class="n">str</span><span class="o">.</span><span class="n">lower</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">adult_train</span><span class="p">,</span> <span class="n">adult_test</span> <span class="o">=</span> <span class="n">train_test_split</span><span class="p">(</span><span class="n">adult</span><span class="p">,</span> <span class="n">test_size</span><span class="o">=</span><span class="mf">0.2</span><span class="p">,</span> <span class="n">stratify</span><span class="o">=</span><span class="n">adult</span><span class="o">.</span><span class="n">income_label</span><span class="p">)</span>
<span class="n">adult</span><span class="o">.</span><span class="n">head</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>age</th>
<th>workclass</th>
<th>fnlwgt</th>
<th>education</th>
<th>educational_num</th>
<th>marital_status</th>
<th>occupation</th>
<th>relationship</th>
<th>race</th>
<th>gender</th>
<th>capital_gain</th>
<th>capital_loss</th>
<th>hours_per_week</th>
<th>native_country</th>
<th>income_label</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>25</td>
<td>private</td>
<td>226802</td>
<td>11th</td>
<td>7</td>
<td>never-married</td>
<td>machine-op-inspct</td>
<td>own-child</td>
<td>black</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>1</th>
<td>38</td>
<td>private</td>
<td>89814</td>
<td>hs-grad</td>
<td>9</td>
<td>married-civ-spouse</td>
<td>farming-fishing</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>50</td>
<td>united-states</td>
<td>0</td>
</tr>
<tr>
<th>2</th>
<td>28</td>
<td>local-gov</td>
<td>336951</td>
<td>assoc-acdm</td>
<td>12</td>
<td>married-civ-spouse</td>
<td>protective-serv</td>
<td>husband</td>
<td>white</td>
<td>male</td>
<td>0</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>3</th>
<td>44</td>
<td>private</td>
<td>160323</td>
<td>some-college</td>
<td>10</td>
<td>married-civ-spouse</td>
<td>machine-op-inspct</td>
<td>husband</td>
<td>black</td>
<td>male</td>
<td>7688</td>
<td>0</td>
<td>40</td>
<td>united-states</td>
<td>1</td>
</tr>
<tr>
<th>4</th>
<td>18</td>
<td>unknown</td>
<td>103497</td>
<td>some-college</td>
<td>10</td>
<td>never-married</td>
<td>unknown</td>
<td>own-child</td>
<td>white</td>
<td>female</td>
<td>0</td>
<td>0</td>
<td>30</td>
<td>united-states</td>
<td>0</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>The following lines are all you need:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep</span> <span class="kn">import</span> <span class="n">Trainer</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">WidePreprocessor</span><span class="p">,</span> <span class="n">TabPreprocessor</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">Wide</span><span class="p">,</span> <span class="n">TabMlp</span><span class="p">,</span> <span class="n">WideDeep</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.metrics</span> <span class="kn">import</span> <span class="n">Accuracy</span>
<span class="c1"># define wide, crossed, embedding and continuous columns, and target</span>
<span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"education"</span><span class="p">,</span> <span class="s2">"relationship"</span><span class="p">,</span> <span class="s2">"workclass"</span><span class="p">,</span> <span class="s2">"occupation"</span><span class="p">,</span> <span class="s2">"native_country"</span><span class="p">,</span> <span class="s2">"gender"</span><span class="p">]</span>
<span class="n">cross_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s2">"education"</span><span class="p">,</span> <span class="s2">"occupation"</span><span class="p">),</span> <span class="p">(</span><span class="s2">"native_country"</span><span class="p">,</span> <span class="s2">"occupation"</span><span class="p">)]</span>
<span class="n">embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s2">"education"</span><span class="p">,</span> <span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s2">"workclass"</span><span class="p">,</span> <span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s2">"occupation"</span><span class="p">,</span> <span class="mi">32</span><span class="p">),</span> <span class="p">(</span><span class="s2">"native_country"</span><span class="p">,</span> <span class="mi">32</span><span class="p">)]</span>
<span class="n">cont_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"age"</span><span class="p">,</span> <span class="s2">"hours_per_week"</span><span class="p">]</span>
<span class="n">target</span> <span class="o">=</span> <span class="n">adult_train</span><span class="p">[</span><span class="s2">"income_label"</span><span class="p">]</span><span class="o">.</span><span class="n">values</span>
<span class="c1"># prepare wide component</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">cross_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult_train</span><span class="p">)</span>
<span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="n">np</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">X_wide</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="c1"># prepare deeptabular component</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult_train</span><span class="p">)</span>
<span class="n">deeptabular</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">200</span><span class="p">,</span> <span class="mi">100</span><span class="p">],</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">embeddings_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">cont_cols</span><span class="p">,</span>
<span class="p">)</span>
<span class="c1"># build, compile and fit</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">WideDeep</span><span class="p">(</span><span class="n">wide</span><span class="o">=</span><span class="n">wide</span><span class="p">,</span> <span class="n">deeptabular</span><span class="o">=</span><span class="n">deeptabular</span><span class="p">)</span>
<span class="c1"># Train</span>
<span class="n">trainer</span> <span class="o">=</span> <span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">objective</span><span class="o">=</span><span class="s2">"binary"</span><span class="p">,</span> <span class="n">metrics</span><span class="o">=</span><span class="p">[</span><span class="n">Accuracy</span><span class="p">])</span>
<span class="n">trainer</span><span class="o">.</span><span class="n">fit</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab</span><span class="p">,</span> <span class="n">target</span><span class="o">=</span><span class="n">target</span><span class="p">,</span> <span class="n">n_epochs</span><span class="o">=</span><span class="mi">2</span><span class="p">,</span> <span class="n">batch_size</span><span class="o">=</span><span class="mi">256</span><span class="p">)</span>
<span class="c1"># predict</span>
<span class="n">X_wide_te</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">transform</span><span class="p">(</span><span class="n">adult_test</span><span class="p">)</span>
<span class="n">X_tab_te</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">transform</span><span class="p">(</span><span class="n">adult_test</span><span class="p">)</span>
<span class="n">preds</span> <span class="o">=</span> <span class="n">trainer</span><span class="o">.</span><span class="n">predict</span><span class="p">(</span><span class="n">X_wide</span><span class="o">=</span><span class="n">X_wide_te</span><span class="p">,</span> <span class="n">X_tab</span><span class="o">=</span><span class="n">X_tab_te</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch 1: 100%|██████████| 153/153 [00:03<00:00, 43.06it/s, loss=0.428, metrics={'acc': 0.802}]
epoch 2: 100%|██████████| 153/153 [00:03<00:00, 44.41it/s, loss=0.389, metrics={'acc': 0.8217}]
predict: 100%|██████████| 39/39 [00:00<00:00, 149.41it/s]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="4.-Preprocessors">4. Preprocessors<a class="anchor-link" href="#4.-Preprocessors"> </a></h2><p>As you can see in Section 3, and as with any ML algorithm, the data need to be prepared/preprocessed before going through the model. This is handled by the <code>pytorch-widedeep</code> preprocessors. There is one preprocessor per <code>WideDeep</code> model component:</p>
<pre><code>WidePreprocessor
TabPreprocessor
TextPreprocessor
ImagePreprocessor</code></pre>
<p>"Behind the scenes", these preprocessors use a series of helper functions and classes that are in the <code>utils</code> module. Initially I did not intend to "expose" them to the user, but I believe they can be useful for all sorts of preprocessing tasks, even if they are not related to <code>pytorch-widedeep</code>, so I made them available. The <code>utils</code> tools are:</p>
<pre><code>deep_utils.LabelEncoder
text_utils.simple_preprocess
text_utils.get_texts
text_utils.pad_sequences
text_utils.build_embeddings_matrix
fastai_transforms.Tokenizer
fastai_transforms.Vocab
image_utils.SimplePreprocessor
image_utils.AspectAwarePreprocessor</code></pre>
<p>They are accessible directly from <code>utils</code>, e.g.:</p>
<div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.utils</span> <span class="kn">import</span> <span class="n">LabelEncoder</span>
</pre></div>
<p>Note that here I will be concentrating directly on the preprocessors. If you want more details on the <code>utils</code> tools, have a look to the <a href="https://github.com/jrzaurin/pytorch-widedeep/tree/master/pytorch_widedeep/utils">source code</a> or read the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/index.html">documentation</a>.</p>
<h3 id="4.1.-WidePreprocessor">4.1. <code>WidePreprocessor</code><a class="anchor-link" href="#4.1.-WidePreprocessor"> </a></h3><p>The wide component of the model is a linear model that, in principle, could be implemented as a linear layer receiving the one-hot encoded categorical columns. However, this is not memory efficient (at all). Therefore, we implement the linear layer as an Embedding layer plus a bias. I will explain it in a bit more detail later. For now, just know that <code>WidePreprocessor</code> simply encodes the categories numerically so that they are the indexes of the lookup table that is an Embedding layer.</p>
</div>
</div>
</div>
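To see why the embedding-plus-bias trick works, consider the following numpy sketch (illustrative names, not the package's internals): a linear layer applied to a one-hot (or multi-hot) input selects exactly the weight rows of the active categories, so summing the corresponding rows of a lookup table and adding a bias gives the same result without materializing the one-hot vector.

```python
import numpy as np

# Sketch: linear layer over one-hot input vs embedding lookup + bias.
# All names here are illustrative, not pytorch-widedeep internals.
rng = np.random.default_rng(0)
n_categories, out_dim = 5, 1
W = rng.normal(size=(n_categories, out_dim))  # weight matrix == lookup table
b = 0.1                                       # bias

x_idx = np.array([0, 3])            # indexes of the active categories
x_onehot = np.zeros(n_categories)
x_onehot[x_idx] = 1.0               # multi-hot encoded observation

via_matmul = x_onehot @ W + b             # linear layer on the encoded input
via_lookup = W[x_idx].sum(axis=0) + b     # embedding lookup + sum + bias

assert np.allclose(via_matmul, via_lookup)
```

The lookup version never builds the (potentially huge) one-hot vector, which is the memory saving mentioned above.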
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">WidePreprocessor</span>
<span class="n">wide_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'relationship'</span><span class="p">,</span><span class="s1">'workclass'</span><span class="p">,</span><span class="s1">'occupation'</span><span class="p">,</span><span class="s1">'native_country'</span><span class="p">,</span><span class="s1">'gender'</span><span class="p">]</span>
<span class="n">crossed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">),</span> <span class="p">(</span><span class="s1">'native_country'</span><span class="p">,</span> <span class="s1">'occupation'</span><span class="p">)]</span>
<span class="n">wide_preprocessor</span> <span class="o">=</span> <span class="n">WidePreprocessor</span><span class="p">(</span><span class="n">wide_cols</span><span class="o">=</span><span class="n">wide_cols</span><span class="p">,</span> <span class="n">crossed_cols</span><span class="o">=</span><span class="n">crossed_cols</span><span class="p">)</span>
<span class="n">X_wide</span> <span class="o">=</span> <span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
<span class="c1"># From here on, any new observation can be prepared by simply running `.transform`</span>
<span class="c1"># new_X_wide = wide_preprocessor.transform(new_df)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">X_wide</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>array([[ 1, 17, 23, ..., 89, 91, 316],
[ 2, 18, 23, ..., 89, 92, 317],
[ 3, 18, 24, ..., 89, 93, 318],
...,
[ 2, 20, 23, ..., 90, 103, 323],
[ 2, 17, 23, ..., 89, 103, 323],
[ 2, 21, 29, ..., 90, 115, 324]])</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">X_wide</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>array([ 1, 17, 23, 32, 47, 89, 91, 316])</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that the label encoding starts from 1. This is because it is convenient to leave 0 for padding, i.e. unknown categories. Let's take, for example, the first entry:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">wide_preprocessor</span><span class="o">.</span><span class="n">inverse_transform</span><span class="p">(</span><span class="n">X_wide</span><span class="p">[:</span><span class="mi">1</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>education</th>
<th>relationship</th>
<th>workclass</th>
<th>occupation</th>
<th>native_country</th>
<th>gender</th>
<th>education_occupation</th>
<th>native_country_occupation</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>11th</td>
<td>own-child</td>
<td>private</td>
<td>machine-op-inspct</td>
<td>united-states</td>
<td>male</td>
<td>11th-machine-op-inspct</td>
<td>united-states-machine-op-inspct</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As we can see, <code>wide_preprocessor</code> numerically encodes the <code>wide_cols</code> and the <code>crossed_cols</code>; the original values can be recovered using the method <code>inverse_transform</code>.</p>
</div>
</div>
</div>
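Conceptually, the encoding above can be sketched with a few lines of plain Python. This is a toy illustration, not the library's actual implementation: crossed columns are built by concatenating the values of the two columns, and a single vocabulary is then built starting at 1, reserving 0 for unseen categories.

```python
# Toy sketch of the WidePreprocessor idea (illustration only):
# 1) build crossed columns by concatenating values
# 2) label-encode (column, value) pairs starting at 1, keeping 0 for padding
rows = [
    {"education": "11th", "occupation": "machine-op-inspct"},
    {"education": "hs-grad", "occupation": "sales"},
]

# build the crossed column
for r in rows:
    r["education_occupation"] = f'{r["education"]}-{r["occupation"]}'

# one vocabulary over all (column, value) pairs, ids starting at 1
vocab = {}
for r in rows:
    for col, val in r.items():
        vocab.setdefault((col, val), len(vocab) + 1)

encoded = [[vocab[(col, val)] for col, val in r.items()] for r in rows]
# keeping the reverse mapping is what makes inverse_transform possible
inverse = {idx: pair for pair, idx in vocab.items()}
```

Note that no id is ever 0, so an unknown category at inference time can safely map to the padding index.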
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.2-TabPreprocessor">4.2 <code>TabPreprocessor</code><a class="anchor-link" href="#4.2-TabPreprocessor"> </a></h3><p>Simply, <code>TabPreprocessor</code> label-encodes the categorical columns and normalizes the numerical ones (unless otherwise specified).</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">TabPreprocessor</span>
<span class="c1"># cat_embed_cols = [(column_name, embed_dim), ...]</span>
<span class="n">cat_embed_cols</span> <span class="o">=</span> <span class="p">[(</span><span class="s1">'education'</span><span class="p">,</span><span class="mi">10</span><span class="p">),</span> <span class="p">(</span><span class="s1">'relationship'</span><span class="p">,</span><span class="mi">8</span><span class="p">),</span> <span class="p">(</span><span class="s1">'workclass'</span><span class="p">,</span><span class="mi">10</span><span class="p">),</span> <span class="p">(</span><span class="s1">'occupation'</span><span class="p">,</span><span class="mi">10</span><span class="p">),(</span><span class="s1">'native_country'</span><span class="p">,</span><span class="mi">10</span><span class="p">)]</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"age"</span><span class="p">,</span><span class="s2">"hours_per_week"</span><span class="p">]</span>
<span class="n">tab_preprocessor</span> <span class="o">=</span> <span class="n">TabPreprocessor</span><span class="p">(</span><span class="n">embed_cols</span><span class="o">=</span><span class="n">cat_embed_cols</span><span class="p">,</span> <span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">tab_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">adult</span><span class="p">)</span>
<span class="c1"># From here on, any new observation can be prepared by simply running `.transform`</span>
<span class="c1"># new_X_tab = tab_preprocessor.transform(new_df)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="nb">print</span><span class="p">(</span><span class="n">X_tab</span><span class="p">[:</span><span class="mi">5</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>[[ 1. 1. 1. 1. 1. -0.99512893
-0.03408696]
[ 2. 2. 1. 2. 1. -0.04694151
0.77292975]
[ 3. 2. 2. 3. 1. -0.77631645
-0.03408696]
[ 4. 2. 1. 1. 1. 0.39068346
-0.03408696]
[ 4. 1. 3. 4. 1. -1.50569139
-0.84110367]]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that, as before, the label encoding starts from 1, leaving 0 for padding, i.e. unknown categories.</p>
<p>Behind the scenes, <code>TabPreprocessor</code> uses <a href="https://pytorch-widedeep.readthedocs.io/en/latest/utils/dense_utils.html">LabelEncoder</a>, simply a custom numerical encoder for categorical features, available via</p>
<div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.utils</span> <span class="kn">import</span> <span class="n">LabelEncoder</span>
</pre></div>
<h3 id="4.3.-TextPreprocessor">4.3. <code>TextPreprocessor</code><a class="anchor-link" href="#4.3.-TextPreprocessor"> </a></h3><p>This preprocessor returns the tokenized, padded sequences that will be directly "fed" to the <code>deeptext</code> component.</p>
<p>To illustrate the text and image preprocessors I will use a small sample of the Airbnb listing dataset, which you can get <a href="http://insideairbnb.com/get-the-data.html">here</a>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">airbnb</span><span class="o">=</span><span class="n">pd</span><span class="o">.</span><span class="n">read_csv</span><span class="p">(</span><span class="s2">"data/airbnb/airbnb_sample.csv"</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">texts</span> <span class="o">=</span> <span class="n">airbnb</span><span class="o">.</span><span class="n">description</span><span class="o">.</span><span class="n">tolist</span><span class="p">()</span>
<span class="n">texts</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>"My bright double bedroom with a large window has a relaxed feeling! It comfortably fits one or two and is centrally located just two blocks from Finsbury Park. Enjoy great restaurants in the area and easy access to easy transport tubes, trains and buses. Babies and children of all ages are welcome. Hello Everyone, I'm offering my lovely double bedroom in Finsbury Park area (zone 2) for let in a shared apartment. You will share the apartment with me and it is fully furnished with a self catering kitchen. Two people can easily sleep well as the room has a queen size bed. I also have a travel cot for a baby for guest with small children. I will require a deposit up front as a security gesture on both our parts and will be given back to you when you return the keys. I trust anyone who will be responding to this add would treat my home with care and respect . Best Wishes Alina Guest will have access to the self catering kitchen and bathroom. There is the flat is equipped wifi internet,"</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">TextPreprocessor</span>
<span class="n">text_preprocessor</span> <span class="o">=</span> <span class="n">TextPreprocessor</span><span class="p">(</span><span class="n">text_col</span><span class="o">=</span><span class="s1">'description'</span><span class="p">)</span>
<span class="n">X_text</span> <span class="o">=</span> <span class="n">text_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="c1"># From here on, any new observation can be prepared by simply running `.transform`</span>
<span class="c1"># new_X_text = text_preprocessor.transform(new_df)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>The vocabulary contains 2192 tokens
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="nb">print</span><span class="p">(</span><span class="n">X_text</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>[ 29 48 37 367 818 17 910 17 177 15 122 349 53 879
1174 126 393 40 911 0 23 228 71 819 9 53 55 1380
225 11 18 308 18 1564 10 755 0 942 239 53 55 0
11 36 1013 277 1974 70 62 15 1475 9 943 5 251 5
0 5 0 5 177 53 37 75 11 10 294 726 32 9
42 5 25 12 10 22 12 136 100 145]
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><code>TextPreprocessor</code> uses the utilities within the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/utils/text_utils.html">text_utils</a> and the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/utils/fastai_transforms.html">fastai_transforms</a> modules. Again, all the utilities within those modules are directly accessible from <code>utils</code>, e.g.:</p>
<div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.utils</span> <span class="kn">import</span> <span class="n">simple_preprocess</span><span class="p">,</span> <span class="n">pad_sequences</span><span class="p">,</span> <span class="n">build_embeddings_matrix</span><span class="p">,</span> <span class="n">Tokenizer</span><span class="p">,</span> <span class="n">Vocab</span>
</pre></div>
</div>
</div>
</div>
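The pipeline those utilities implement (tokenize, build a vocabulary, numericalise, pad) can be sketched in plain Python. This is a simplified illustration of the idea, not the library's actual implementation, which relies on the fastai-based <code>Tokenizer</code> and <code>Vocab</code> shown above; the helper names here are made up.

```python
# Simplified sketch of text preprocessing: tokenize, build vocab, pad.
# Index 0 is reserved for padding, 1 for unknown tokens.
def simple_tokenize(text):
    return text.lower().split()

def pad(seq, maxlen, pad_idx=0):
    # left-pad with pad_idx and keep at most the last maxlen tokens
    return [pad_idx] * max(0, maxlen - len(seq)) + seq[-maxlen:]

texts = ["my bright double bedroom", "enjoy great restaurants in the area"]

vocab = {"<pad>": 0, "<unk>": 1}
for t in texts:
    for tok in simple_tokenize(t):
        vocab.setdefault(tok, len(vocab))

# unknown tokens at inference time map to index 1
X_text = [pad([vocab.get(tok, 1) for tok in simple_tokenize(t)], maxlen=6)
          for t in texts]
```

The result is a fixed-width integer matrix, which is exactly the shape of input the <code>deeptext</code> component expects.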
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.4-ImagePreprocessor">4.4 <code>ImagePreprocessor</code><a class="anchor-link" href="#4.4-ImagePreprocessor"> </a></h3><p>Finally, <code>ImagePreprocessor</code> simply resizes the images while being aware of the aspect ratio. By default they will be resized to <code>(224, 224, ...)</code>. This is because the default <code>deepimage</code> component of the model is a pre-trained <code>ResNet</code> model, which requires inputs of height and width 224.</p>
<p>Let's have a look</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.preprocessing</span> <span class="kn">import</span> <span class="n">ImagePreprocessor</span>
<span class="n">image_preprocessor</span> <span class="o">=</span> <span class="n">ImagePreprocessor</span><span class="p">(</span><span class="n">img_col</span><span class="o">=</span><span class="s1">'id'</span><span class="p">,</span> <span class="n">img_path</span><span class="o">=</span><span class="s2">"data/airbnb/property_picture/"</span><span class="p">)</span>
<span class="n">X_images</span> <span class="o">=</span> <span class="n">image_preprocessor</span><span class="o">.</span><span class="n">fit_transform</span><span class="p">(</span><span class="n">airbnb</span><span class="p">)</span>
<span class="c1"># From here on, any new observation can be prepared by simply running `.transform`</span>
<span class="c1"># new_X_images = image_preprocessor.transform(new_df)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Reading Images from data/airbnb/property_picture/
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre> 4%|▍ | 41/1001 [00:00<00:02, 396.72it/s]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Resizing
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>100%|██████████| 1001/1001 [00:02<00:00, 354.70it/s]
</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>Computing normalisation metrics
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">X_images</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span><span class="o">.</span><span class="n">shape</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>(224, 224, 3)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><code>ImagePreprocessor</code> uses two helpers: <a href="https://pytorch-widedeep.readthedocs.io/en/latest/utils/image_utils.html"><code>SimplePreprocessor</code> and <code>AspectAwarePreprocessor</code></a>, available from the <code>utils</code> module, e.g.:</p>
<div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.utils</span> <span class="kn">import</span> <span class="n">SimplePreprocessor</span><span class="p">,</span> <span class="n">AspectAwarePreprocessor</span>
</pre></div>
<p>These two classes are directly taken from Adrian Rosebrock's fantastic book "Deep Learning for Computer Vision". Therefore, all credit to Adrian.</p>
</div>
</div>
</div>
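The idea behind aspect-aware resizing can be sketched with plain arithmetic: scale the image so its shorter side matches the target size, then crop the excess along the longer side. The function below is a simplified sketch of that logic (the name <code>aspect_aware_dims</code> is made up for illustration); the actual <code>AspectAwarePreprocessor</code> performs the resize and crop on the image data itself.

```python
# Sketch of aspect-aware resizing: scale the shorter side to the target,
# then crop the overshoot on the longer side. Dimension arithmetic only.
def aspect_aware_dims(width, height, target=224):
    if width < height:
        scale = target / width
        new_w, new_h = target, round(height * scale)
    else:
        scale = target / height
        new_w, new_h = round(width * scale), target
    # how many pixels must be cropped from the longer dimension
    crop_w, crop_h = new_w - target, new_h - target
    return (new_w, new_h), (crop_w, crop_h)

# a 640x480 image: the shorter side (480) is scaled to 224,
# and the resulting width overshoot is cropped away
(new_w, new_h), (crop_w, crop_h) = aspect_aware_dims(640, 480)
```

Because the shorter side is scaled first, no dimension ever ends up below the target, and the crop never distorts the image the way a naive direct resize to <code>(224, 224)</code> would.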
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="5.-Model-Components">5. Model Components<a class="anchor-link" href="#5.-Model-Components"> </a></h2>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's now have a look at the components that can be used to build a wide and deep model. The 5 main components of <code>WideDeep</code> are:</p>
<pre><code>wide
deeptabular
deeptext
deepimage
deephead</code></pre>
<p>The first 4 will be collected and combined by the <code>WideDeep</code> class, while the 5th one can be optionally added to the <code>WideDeep</code> model through its corresponding parameters: <code>deephead</code> or alternatively <code>head_layers</code>, <code>head_dropout</code> and <code>head_batchnorm</code>.</p>
<h3 id="5.1.-wide">5.1. <code>wide</code><a class="anchor-link" href="#5.1.-wide"> </a></h3><p>The wide component is a Linear layer "plugged" into the output neuron(s).</p>
<p>The only particularity of our implementation is that we have implemented the linear layer via an Embedding layer plus a bias. While the implementations are equivalent, the latter is faster and far more memory efficient, since we do not need to one-hot encode the categorical features.</p>
<p>Let's have a look:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">from</span> <span class="nn">torch</span> <span class="kn">import</span> <span class="n">nn</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">df</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">({</span><span class="s1">'color'</span><span class="p">:</span> <span class="p">[</span><span class="s1">'r'</span><span class="p">,</span> <span class="s1">'b'</span><span class="p">,</span> <span class="s1">'g'</span><span class="p">],</span> <span class="s1">'size'</span><span class="p">:</span> <span class="p">[</span><span class="s1">'s'</span><span class="p">,</span> <span class="s1">'n'</span><span class="p">,</span> <span class="s1">'l'</span><span class="p">]})</span>
<span class="n">df</span><span class="o">.</span><span class="n">head</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>color</th>
<th>size</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>r</td>
<td>s</td>
</tr>
<tr>
<th>1</th>
<td>b</td>
<td>n</td>
</tr>
<tr>
<th>2</th>
<td>g</td>
<td>l</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>One-hot encoded, the first observation (<code>color: r, size: s</code>) would be:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">obs_0_oh</span> <span class="o">=</span> <span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">array</span><span class="p">([</span><span class="mf">1.</span><span class="p">,</span> <span class="mf">0.</span><span class="p">,</span> <span class="mf">0.</span><span class="p">,</span> <span class="mf">1.</span><span class="p">,</span> <span class="mf">0.</span><span class="p">,</span> <span class="mf">0.</span><span class="p">]))</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="s1">'float32'</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>if we simply numerically encode (or label encode) the values:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">obs_0_le</span> <span class="o">=</span> <span class="p">(</span><span class="n">np</span><span class="o">.</span><span class="n">array</span><span class="p">([</span><span class="mi">0</span><span class="p">,</span> <span class="mi">3</span><span class="p">]))</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="s1">'int64'</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that in the implementation of the package we start from 1, saving 0 for padding, i.e. unseen values.</p>
<p>Now, let's see if the two implementations are equivalent</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># we have 6 different values. Let's assume we are performing a regression, so pred_dim = 1</span>
<span class="n">lin</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">Linear</span><span class="p">(</span><span class="mi">6</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">emb</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">Embedding</span><span class="p">(</span><span class="mi">6</span><span class="p">,</span> <span class="mi">1</span><span class="p">)</span>
<span class="n">emb</span><span class="o">.</span><span class="n">weight</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">Parameter</span><span class="p">(</span><span class="n">lin</span><span class="o">.</span><span class="n">weight</span><span class="o">.</span><span class="n">reshape_as</span><span class="p">(</span><span class="n">emb</span><span class="o">.</span><span class="n">weight</span><span class="p">))</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">lin</span><span class="p">(</span><span class="n">torch</span><span class="o">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">obs_0_oh</span><span class="p">))</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([0.0656], grad_fn=<AddBackward0>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">emb</span><span class="p">(</span><span class="n">torch</span><span class="o">.</span><span class="n">tensor</span><span class="p">(</span><span class="n">obs_0_le</span><span class="p">))</span><span class="o">.</span><span class="n">sum</span><span class="p">()</span> <span class="o">+</span> <span class="n">lin</span><span class="o">.</span><span class="n">bias</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([0.0656], grad_fn=<AddBackward0>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And this is precisely how the linear component <code>Wide</code> is implemented:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">Wide</span>
<span class="n">wide</span> <span class="o">=</span> <span class="n">Wide</span><span class="p">(</span><span class="n">wide_dim</span><span class="o">=</span><span class="mi">10</span><span class="p">,</span> <span class="n">pred_dim</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="n">wide</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>Wide(
(wide_linear): Embedding(11, 1, padding_idx=0)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Again, let me emphasize that even though the input dim is 10, the <code>Embedding</code> layer has 11 weights. This is because we save 0 for padding, which is used for unseen values during the encoding process.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="5.2.-deeptabular">5.2. <code>deeptabular</code><a class="anchor-link" href="#5.2.-deeptabular"> </a></h3><p>There are 3 alternatives for the so-called <code>deeptabular</code> component of the model: <code>TabMlp</code>, <code>TabResnet</code> and <code>TabTransformer</code>:</p>
<ol>
<li><p><code>TabMlp</code>: this is almost identical to the <a href="https://docs.fast.ai/tutorial.tabular.html">tabular model</a> in the fantastic <a href="https://docs.fast.ai/">fastai</a> library, and consists simply of embeddings representing the categorical features, concatenated with the continuous features and then passed through an MLP.</p>
</li>
<li><p><code>TabResnet</code>: This is similar to the previous model, but the embeddings are passed through a series of ResNet blocks built with dense layers.</p>
</li>
<li><p><code>TabTransformer</code>: Details on the TabTransformer can be found in: <a href="https://arxiv.org/pdf/2012.06678.pdf">TabTransformer: Tabular Data Modeling Using Contextual Embeddings</a></p>
</li>
</ol>
<p>For details on these 3 models and their options please see the examples in the <a href="https://github.com/jrzaurin/pytorch-widedeep/tree/master/examples">Examples folder</a> and the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/">documentation</a>.</p>
<p>Through the development of the package, the <code>deeptabular</code> component became one of its core components. The possibilities are numerous, so I will describe this component in detail in a separate post.</p>
<p>For now let's have a quick look:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's have a look first to <code>TabMlp</code>:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabMlp</span>
<span class="c1"># fake dataset</span>
<span class="n">X_tab</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">cat</span><span class="p">((</span><span class="n">torch</span><span class="o">.</span><span class="n">empty</span><span class="p">(</span><span class="mi">5</span><span class="p">,</span> <span class="mi">4</span><span class="p">)</span><span class="o">.</span><span class="n">random_</span><span class="p">(</span><span class="mi">4</span><span class="p">),</span> <span class="n">torch</span><span class="o">.</span><span class="n">rand</span><span class="p">(</span><span class="mi">5</span><span class="p">,</span> <span class="mi">1</span><span class="p">)),</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="n">colnames</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'a'</span><span class="p">,</span> <span class="s1">'b'</span><span class="p">,</span> <span class="s1">'c'</span><span class="p">,</span> <span class="s1">'d'</span><span class="p">,</span> <span class="s1">'e'</span><span class="p">]</span>
<span class="n">embed_input</span> <span class="o">=</span> <span class="p">[(</span><span class="n">u</span><span class="p">,</span><span class="n">i</span><span class="p">,</span><span class="n">j</span><span class="p">)</span> <span class="k">for</span> <span class="n">u</span><span class="p">,</span><span class="n">i</span><span class="p">,</span><span class="n">j</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">colnames</span><span class="p">[:</span><span class="mi">4</span><span class="p">],</span> <span class="p">[</span><span class="mi">4</span><span class="p">]</span><span class="o">*</span><span class="mi">4</span><span class="p">,</span> <span class="p">[</span><span class="mi">8</span><span class="p">]</span><span class="o">*</span><span class="mi">4</span><span class="p">)]</span>
<span class="n">column_idx</span> <span class="o">=</span> <span class="p">{</span><span class="n">k</span><span class="p">:</span><span class="n">v</span> <span class="k">for</span> <span class="n">v</span><span class="p">,</span><span class="n">k</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="n">colnames</span><span class="p">)}</span>
<span class="n">continuous_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s1">'e'</span><span class="p">]</span>
<span class="c1"># my advice would be to not use dropout in the last layer, but I add the option because you never </span>
<span class="c1"># know... there are crazy people everywhere.</span>
<span class="n">tabmlp</span> <span class="o">=</span> <span class="n">TabMlp</span><span class="p">(</span>
<span class="n">mlp_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">16</span><span class="p">,</span><span class="mi">8</span><span class="p">],</span>
<span class="n">mlp_dropout</span><span class="o">=</span><span class="p">[</span><span class="mf">0.5</span><span class="p">,</span> <span class="mf">0.</span><span class="p">],</span>
<span class="n">mlp_batchnorm</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">mlp_activation</span><span class="o">=</span><span class="s2">"leaky_relu"</span><span class="p">,</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">embed_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">)</span>
<span class="n">tabmlp</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>TabMlp(
(embed_layers): ModuleDict(
(emb_layer_a): Embedding(5, 8, padding_idx=0)
(emb_layer_b): Embedding(5, 8, padding_idx=0)
(emb_layer_c): Embedding(5, 8, padding_idx=0)
(emb_layer_d): Embedding(5, 8, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(tab_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): BatchNorm1d(33, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(1): Dropout(p=0.5, inplace=False)
(2): Linear(in_features=33, out_features=16, bias=False)
(3): LeakyReLU(negative_slope=0.01, inplace=True)
)
(dense_layer_1): Sequential(
(0): Linear(in_features=16, out_features=8, bias=True)
(1): LeakyReLU(negative_slope=0.01, inplace=True)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tabmlp</span><span class="p">(</span><span class="n">X_tab</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[-2.0658e-03, 5.0888e-01, 2.1883e-01, -3.1523e-03, -3.2836e-03,
8.3450e-02, -3.4315e-03, -8.6029e-04],
[-2.8116e-03, 2.1922e-01, 5.0364e-01, -1.3522e-03, -9.8741e-04,
-1.2356e-03, -1.4323e-03, 2.7542e-03],
[ 1.1020e-01, 4.0867e-01, 4.3776e-01, 3.1146e-03, 2.7392e-01,
-1.2640e-02, 1.2793e-02, 5.7851e-01],
[-4.4498e-03, 2.0174e-01, 1.1082e+00, 2.3353e-01, -1.9922e-05,
-4.9581e-03, 6.1367e-01, 9.4608e-01],
[-5.7167e-03, 2.7813e-01, 7.8706e-01, -3.6171e-03, 1.5563e-01,
-1.1303e-02, -7.6483e-04, 5.0236e-01]], grad_fn=<LeakyReluBackward1>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Let's now have a look at <code>TabResnet</code>:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabResnet</span>
<span class="n">tabresnet</span> <span class="o">=</span> <span class="n">TabResnet</span><span class="p">(</span>
<span class="n">blocks_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">16</span><span class="p">,</span> <span class="mi">8</span><span class="p">],</span>
<span class="n">blocks_dropout</span><span class="o">=</span><span class="mf">0.1</span><span class="p">,</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">embed_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span><span class="p">,</span>
<span class="p">)</span>
<span class="n">tabresnet</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>TabResnet(
(embed_layers): ModuleDict(
(emb_layer_a): Embedding(5, 8, padding_idx=0)
(emb_layer_b): Embedding(5, 8, padding_idx=0)
(emb_layer_c): Embedding(5, 8, padding_idx=0)
(emb_layer_d): Embedding(5, 8, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(tab_resnet): DenseResnet(
(dense_resnet): Sequential(
(lin1): Linear(in_features=33, out_features=16, bias=True)
(bn1): BatchNorm1d(16, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(block_0): BasicBlock(
(lin1): Linear(in_features=16, out_features=8, bias=True)
(bn1): BatchNorm1d(8, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(leaky_relu): LeakyReLU(negative_slope=0.01, inplace=True)
(dp): Dropout(p=0.1, inplace=False)
(lin2): Linear(in_features=8, out_features=8, bias=True)
(bn2): BatchNorm1d(8, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(resize): Sequential(
(0): Linear(in_features=16, out_features=8, bias=True)
(1): BatchNorm1d(8, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tabresnet</span><span class="p">(</span><span class="n">X_tab</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[-1.7038e-02, -2.2898e-03, 6.7239e-01, -1.1374e-02, -1.4843e-03,
-1.0570e-02, 5.0264e-01, -1.3277e-02],
[ 2.2679e+00, -5.1538e-04, -2.6135e-02, -2.9038e-02, -2.2504e-02,
5.5052e-01, 1.0497e+00, 1.3348e+00],
[ 2.5005e-01, 7.7862e-01, 4.0052e-01, 7.6070e-01, 5.2203e-01,
6.5057e-01, -2.3226e-02, -4.0509e-04],
[-1.3928e-02, -6.9325e-03, 1.6976e-01, 1.3968e+00, 5.9813e-01,
-9.4279e-03, -9.0917e-03, 7.7908e-01],
[ 5.7862e-01, 1.9515e-01, 1.3709e+00, 1.8836e+00, 1.2787e+00,
7.9873e-01, 1.6794e+00, -7.4565e-03]], grad_fn=<LeakyReluBackward1>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>and finally, the <code>TabTransformer</code>:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">TabTransformer</span>
<span class="n">embed_input</span> <span class="o">=</span> <span class="p">[(</span><span class="n">u</span><span class="p">,</span><span class="n">i</span><span class="p">)</span> <span class="k">for</span> <span class="n">u</span><span class="p">,</span><span class="n">i</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">colnames</span><span class="p">[:</span><span class="mi">4</span><span class="p">],</span> <span class="p">[</span><span class="mi">4</span><span class="p">]</span><span class="o">*</span><span class="mi">4</span><span class="p">)]</span>
<span class="n">tabtransformer</span> <span class="o">=</span> <span class="n">TabTransformer</span><span class="p">(</span>
<span class="n">column_idx</span><span class="o">=</span><span class="n">column_idx</span><span class="p">,</span>
<span class="n">embed_input</span><span class="o">=</span><span class="n">embed_input</span><span class="p">,</span>
<span class="n">continuous_cols</span><span class="o">=</span><span class="n">continuous_cols</span>
<span class="p">)</span>
<span class="n">tabtransformer</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>TabTransformer(
(embed_layers): ModuleDict(
(emb_layer_a): Embedding(5, 32, padding_idx=0)
(emb_layer_b): Embedding(5, 32, padding_idx=0)
(emb_layer_c): Embedding(5, 32, padding_idx=0)
(emb_layer_d): Embedding(5, 32, padding_idx=0)
)
(embedding_dropout): Dropout(p=0.1, inplace=False)
(blks): Sequential(
(block0): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block1): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block2): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block3): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block4): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
(block5): TransformerEncoder(
(self_attn): MultiHeadedAttention(
(dropout): Dropout(p=0.1, inplace=False)
(inp_proj): Linear(in_features=32, out_features=96, bias=True)
(out_proj): Linear(in_features=32, out_features=32, bias=True)
)
(feed_forward): PositionwiseFF(
(w_1): Linear(in_features=32, out_features=128, bias=True)
(w_2): Linear(in_features=128, out_features=32, bias=True)
(dropout): Dropout(p=0.1, inplace=False)
(activation): GELU()
)
(attn_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
(ff_addnorm): AddNorm(
(dropout): Dropout(p=0.1, inplace=False)
(ln): LayerNorm((32,), eps=1e-05, elementwise_affine=True)
)
)
)
(tab_transformer_mlp): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Linear(in_features=129, out_features=516, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.1, inplace=False)
)
(dense_layer_1): Sequential(
(0): Linear(in_features=516, out_features=258, bias=True)
(1): ReLU(inplace=True)
(2): Dropout(p=0.1, inplace=False)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">tabtransformer</span><span class="p">(</span><span class="n">X_tab</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[0.0000, 0.0000, 0.0000, ..., 0.0399, 0.2358, 0.3762],
[0.1373, 0.0000, 0.0000, ..., 0.0550, 0.0000, 0.0000],
[0.0000, 0.0000, 0.0000, ..., 0.0000, 0.0212, 0.0000],
[0.3322, 0.0000, 0.0000, ..., 0.0000, 0.0000, 0.0000],
[0.2914, 0.0000, 0.0000, ..., 0.0000, 0.0000, 0.6590]],
grad_fn=<MulBackward0>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="5.3.-deeptext">5.3. <code>deeptext</code><a class="anchor-link" href="#5.3.-deeptext"> </a></h3><p><code>pytorch-widedeep</code> offers one model that can be passed to <code>WideDeep</code> as the <code>deeptext</code> component: <code>DeepText</code>, a standard, simple stack of LSTMs on top of word embeddings. You can also add an FC-Head on top of the LSTMs, and the word embeddings can be pre-trained. In the future I aim to include some simple pre-trained models, so that the combination of text and images is fair.</p>
<p>On the other hand, while I recommend using the <code>wide</code> and <code>deeptabular</code> models within this package when building the corresponding model components, it is very likely that users will want to use custom text and image models. That is perfectly possible: simply build them and pass them via the corresponding parameters. Note that custom models MUST return the last layer of activations (i.e. not the final prediction), so that <code>WideDeep</code> can collect and combine these activations accordingly. In addition, custom models MUST contain an attribute <code>output_dim</code> with the size of this last layer of activations.</p>
<p>I will illustrate all of the above in more detail in the second post of this series.</p>
<p>Let's have a look at <code>DeepText</code>:</p>
</div>
</div>
</div>
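<p>To make the two requirements for custom components concrete, here is a minimal sketch of a custom text model (the class and its internals are illustrative, not part of the library): any <code>nn.Module</code> qualifies as long as its forward pass returns activations rather than predictions, and it exposes an <code>output_dim</code> attribute.</p>

```python
import torch
from torch import nn

# Hypothetical custom text component. The two requirements are:
# 1. forward() returns the last layer of activations (not a prediction)
# 2. the module exposes an `output_dim` attribute with their size
class MyTextModel(nn.Module):
    def __init__(self, vocab_size: int = 4, embed_dim: int = 4, out_dim: int = 8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.linear = nn.Linear(embed_dim, out_dim)
        self.output_dim = out_dim  # read by WideDeep when combining components

    def forward(self, X: torch.Tensor) -> torch.Tensor:
        # mean-pool the token embeddings, then project: activations, not logits
        return torch.relu(self.linear(self.embed(X.long()).mean(dim=1)))
```

<p>An instance of this class could then be passed as the <code>deeptext</code> argument when building the <code>WideDeep</code> model.</p>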
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">import</span> <span class="nn">torch</span>
<span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">DeepText</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">X_text</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">cat</span><span class="p">((</span><span class="n">torch</span><span class="o">.</span><span class="n">zeros</span><span class="p">([</span><span class="mi">5</span><span class="p">,</span><span class="mi">1</span><span class="p">]),</span> <span class="n">torch</span><span class="o">.</span><span class="n">empty</span><span class="p">(</span><span class="mi">5</span><span class="p">,</span> <span class="mi">4</span><span class="p">)</span><span class="o">.</span><span class="n">random_</span><span class="p">(</span><span class="mi">1</span><span class="p">,</span><span class="mi">4</span><span class="p">)),</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">)</span>
<span class="n">deeptext</span> <span class="o">=</span> <span class="n">DeepText</span><span class="p">(</span><span class="n">vocab_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">hidden_dim</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">n_layers</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span> <span class="n">embed_dim</span><span class="o">=</span><span class="mi">4</span><span class="p">)</span>
<span class="n">deeptext</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/Users/javier/.pyenv/versions/3.7.9/envs/wdposts/lib/python3.7/site-packages/torch/nn/modules/rnn.py:60: UserWarning: dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1, but got dropout=0.1 and num_layers=1
"num_layers={}".format(dropout, num_layers))
</pre>
</div>
</div>
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>DeepText(
(word_embed): Embedding(4, 4, padding_idx=0)
(rnn): LSTM(4, 4, batch_first=True, dropout=0.1)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">deeptext</span><span class="p">(</span><span class="n">X_text</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[ 0.1727, -0.0800, -0.2599, -0.1245],
[ 0.1530, -0.2874, -0.2385, -0.1379],
[-0.0747, -0.1666, -0.0124, -0.1875],
[-0.0382, -0.1085, -0.0167, -0.1702],
[-0.0393, -0.0926, -0.0141, -0.1371]], grad_fn=<SelectBackward>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>You could, if you wanted, add a Fully Connected Head (FC-Head) on top of it</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">deeptext</span> <span class="o">=</span> <span class="n">DeepText</span><span class="p">(</span><span class="n">vocab_size</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span> <span class="n">hidden_dim</span><span class="o">=</span><span class="mi">8</span><span class="p">,</span> <span class="n">n_layers</span><span class="o">=</span><span class="mi">3</span><span class="p">,</span> <span class="n">padding_idx</span><span class="o">=</span><span class="mi">0</span><span class="p">,</span> <span class="n">embed_dim</span><span class="o">=</span><span class="mi">4</span><span class="p">,</span>
<span class="n">head_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">8</span><span class="p">,</span><span class="mi">4</span><span class="p">],</span> <span class="n">head_batchnorm</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="n">head_dropout</span><span class="o">=</span><span class="p">[</span><span class="mf">0.5</span><span class="p">,</span> <span class="mf">0.5</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">deeptext</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>DeepText(
(word_embed): Embedding(4, 4, padding_idx=0)
(rnn): LSTM(4, 8, num_layers=3, batch_first=True, dropout=0.1)
(texthead): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.5, inplace=False)
(1): Linear(in_features=8, out_features=4, bias=True)
(2): ReLU(inplace=True)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">deeptext</span><span class="p">(</span><span class="n">X_text</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[0.4726, 0.0555, 0.0000, 0.1431],
[0.4907, 0.1357, 0.0000, 0.2591],
[0.4019, 0.0831, 0.0000, 0.1308],
[0.3942, 0.1759, 0.0000, 0.2517],
[0.3184, 0.0902, 0.0000, 0.1955]], grad_fn=<ReluBackward1>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="5.4.-deepimage">5.4. <code>deepimage</code><a class="anchor-link" href="#5.4.-deepimage"> </a></h3><p>Similarly to <code>deeptext</code>, <code>pytorch-widedeep</code> offers one model that can be passed to <code>WideDeep</code> as the <code>deepimage</code> component: <code>DeepImage</code>, which is either a pre-trained ResNet (18, 34, or 50; the default is 18) or a stack of CNNs, to which one can add an FC-Head. If it is a pre-trained ResNet, you can choose how many layers deep into the network to unfreeze with the parameter <code>freeze_n</code>.</p>
</div>
</div>
</div>
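<p>The layer-freezing idea behind <code>freeze_n</code> can be sketched in plain PyTorch. This is an illustrative sketch of the mechanism on a toy backbone, not the library's actual implementation:</p>

```python
import torch.nn as nn

# Toy backbone standing in for the child modules of a pre-trained network
backbone = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.Conv2d(8, 16, kernel_size=3),
    nn.Conv2d(16, 32, kernel_size=3),
)

def freeze_first_n(model: nn.Module, n: int) -> None:
    # Freeze the first n child modules; deeper layers remain trainable
    for child in list(model.children())[:n]:
        for param in child.parameters():
            param.requires_grad = False

freeze_first_n(backbone, 2)
```

<p>After this call, gradients flow only through the last (unfrozen) layers during fine-tuning.</p>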
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="kn">from</span> <span class="nn">pytorch_widedeep.models</span> <span class="kn">import</span> <span class="n">DeepImage</span>
<span class="n">X_img</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">rand</span><span class="p">((</span><span class="mi">2</span><span class="p">,</span><span class="mi">3</span><span class="p">,</span><span class="mi">224</span><span class="p">,</span><span class="mi">224</span><span class="p">))</span>
<span class="n">deepimage</span> <span class="o">=</span> <span class="n">DeepImage</span><span class="p">(</span><span class="n">head_hidden_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">512</span><span class="p">,</span> <span class="mi">64</span><span class="p">,</span> <span class="mi">8</span><span class="p">],</span> <span class="n">head_activation</span><span class="o">=</span><span class="s2">"leaky_relu"</span><span class="p">)</span>
<span class="n">deepimage</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>DeepImage(
(backbone): Sequential(
(0): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
(1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(2): ReLU(inplace=True)
(3): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
(4): Sequential(
(0): BasicBlock(
(conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
(1): BasicBlock(
(conv1): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(5): Sequential(
(0): BasicBlock(
(conv1): Conv2d(64, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(64, 128, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(6): Sequential(
(0): BasicBlock(
(conv1): Conv2d(128, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(128, 256, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(7): Sequential(
(0): BasicBlock(
(conv1): Conv2d(256, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(downsample): Sequential(
(0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
(1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(1): BasicBlock(
(conv1): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(relu): ReLU(inplace=True)
(conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
)
)
(8): AdaptiveAvgPool2d(output_size=(1, 1))
)
(imagehead): MLP(
(mlp): Sequential(
(dense_layer_0): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=512, out_features=64, bias=True)
(2): LeakyReLU(negative_slope=0.01, inplace=True)
)
(dense_layer_1): Sequential(
(0): Dropout(p=0.1, inplace=False)
(1): Linear(in_features=64, out_features=8, bias=True)
(2): LeakyReLU(negative_slope=0.01, inplace=True)
)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">deepimage</span><span class="p">(</span><span class="n">X_img</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>tensor([[ 0.0965, 0.0056, 0.1143, -0.0007, 0.3860, -0.0050, -0.0023, -0.0011],
[ 0.2437, -0.0020, -0.0021, 0.2480, 0.6217, -0.0033, -0.0030, 0.0566]],
grad_fn=<LeakyReluBackward1>)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="5.5.-deephead">5.5. <code>deephead</code><a class="anchor-link" href="#5.5.-deephead"> </a></h3><p>There are two possibilities when defining the so-called <code>deephead</code> component.</p>
<ol>
<li><p>When defining the <code>WideDeep</code> model there is a parameter called <code>head_hidden_dims</code> (and the corresponding related parameters; see the package documentation) that defines an FC-Head on top of the <code>deeptabular</code>, <code>deeptext</code> and <code>deepimage</code> components.</p>
</li>
<li><p>Of course, you could also choose to define it yourself externally and pass it using the parameter <code>deephead</code>. Have a look at the <a href="https://pytorch-widedeep.readthedocs.io/en/latest/wide_deep.html">documentation</a>.</p>
</li>
</ol>
</div>
</div>
</div>
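<p>For the second option, a custom head is just a standard PyTorch module. A minimal sketch (the input size of 8 is an assumption; it must match the concatenated output size of the components the head sits on top of, and the exact <code>deephead</code> signature is in the docs):</p>

```python
import torch
from torch import nn

# A custom FC-Head that could be passed to WideDeep via its `deephead`
# parameter. in_features of the first layer must equal the combined
# output_dim of the underlying components (8 is an illustrative value).
deephead = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Dropout(0.1),
    nn.Linear(16, 4),
    nn.ReLU(),
)
```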
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="6.-Conclusion">6. Conclusion<a class="anchor-link" href="#6.-Conclusion"> </a></h2><p>This is the first of a series of posts introducing the python library <code>pytorch-widedeep</code>. This library is intended to be a flexible framework to combine tabular data with text and images via wide and deep models. Of course, it can also be used directly on "traditional" tabular data, without text and/or images.</p>
<p>In this post I have shown how to quickly start using the library (Section 3) and explained the utilities available in the <code>preprocessing</code> module (Section 4) and the model component definitions available in the <code>models</code> module (Section 5).</p>
<p>In the next post I will show more advanced uses that will hopefully illustrate <code>pytorch-widedeep</code>'s flexibility to build wide and deep models.</p>
<h4 id="References">References<a class="anchor-link" href="#References"> </a></h4><p>[1] Wide & Deep Learning for Recommender Systems. Heng-Tze Cheng, Levent Koc, Jeremiah Harmsen, et al. 2016. <a href="https://arxiv.org/abs/1606.07792">arXiv:1606.07792</a></p>
<p>[2] TabNet: Attentive Interpretable Tabular Learning. Sercan O. Arik, Tomas Pfister, 2020. <a href="https://arxiv.org/abs/1908.07442">arXiv:1908.07442</a></p>
<p>[3] AutoGluon-Tabular: Robust and Accurate AutoML for Structured Data. Nick Erickson, Jonas Mueller, Alexander Shirkov, et al., 2020. <a href="https://arxiv.org/abs/2003.06505">arXiv:2003.06505</a></p>
<p>[4] Universal Language Model Fine-tuning for Text Classification. Jeremy Howard, Sebastian Ruder, 2018. <a href="https://arxiv.org/abs/1801.06146">arXiv:1801.06146</a></p>
<p>[5] Single Headed Attention RNN: Stop Thinking With Your Head. Stephen Merity, 2019. <a href="https://arxiv.org/abs/1911.11423">arXiv:1911.11423</a></p>
</div>
</div>
</div>
</div>Javier RodriguezRecoTour III: Variational Autoencoders for Collaborative Filtering with Mxnet and Pytorch2020-05-15T00:00:00-05:002020-05-15T00:00:00-05:00https://jrzaurin.github.io/infinitoml/2020/05/15/mult-vae<!--
#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: _notebooks/2020-05-15-mult-vae.ipynb
-->
<div class="container" id="notebook-container">
<div class="cell border-box-sizing code_cell rendered">
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>This post and the code here are part of a larger repo called <a href="https://github.com/jrzaurin/RecoTour">RecoTour</a>, where I normally explore and implement some recommendation algorithms that I consider interesting and/or useful (see <a href="https://medium.com/datadriveninvestor/recotour-a-tour-through-recommendation-algorithms-in-python-52d780628ab9">RecoTour</a> and <a href="https://towardsdatascience.com/recotour-ii-neural-recommendation-algorithms-49733938d56e">RecoTourII</a>). In every directory, I have included a <code>README</code> file and a series of explanatory notebooks that I hope help explain the code. I keep adding algorithms from time to time, so stay tuned if you are interested.</p>
<p>As always, let me first acknowledge the relevant people that did the hard work. This post and the companion repo are based on the papers “<a href="https://arxiv.org/pdf/1802.05814.pdf">Variational Autoencoders for Collaborative Filtering</a>” [1] and "<a href="https://arxiv.org/pdf/1312.6114.pdf">Auto-Encoding Variational Bayes</a>" [2]. The code here and in that repo is partially inspired by the implementation from <a href="https://github.com/younggyoseo/vae-cf-pytorch">Younggyo Seo</a>. I have adapted the code to my coding preferences and added a number of options and flexibility to run multiple experiments.</p>
<p>The reason to take a deep dive into variational autoencoders for collaborative filtering is that they seem to be one of the few Deep Learning based algorithms (if not the only one) that obtain better results than those using non-Deep Learning techniques <a href="https://arxiv.org/abs/1907.06902">[3]</a>.</p>
<p>All the experiments in this post were run using a p2.xlarge EC2 instance on AWS.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="1.-Variational-Autoencoders-for-collaborative-filtering">1. Variational Autoencoders for collaborative filtering<a class="anchor-link" href="#1.-Variational-Autoencoders-for-collaborative-filtering"> </a></h2><p>I must admit that when it comes to variational autoencoders (VAEs) I find that there is a "notable" difference between the complexity of the math and that of the code (or maybe it is just me, since I am not a mathematician). Nonetheless, I think that speaking about VAEs and not mentioning log likelihoods, the Evidence Lower Bound (ELBO) or the Kullback–Leibler divergence ($\text{D}_{\text{KL}}$) is almost like "cheating". With that in mind I will try to give some mathematical context to the "Partially Regularized Multinomial Variational Autoencoder" ($\text{Mult-VAE}^{\text{PR}}$) for collaborative filtering and then move to the code. The whole purpose of the math below is to ultimately justify the loss function we will be using when training the $\text{Mult-VAE}^{\text{PR}}$ as well as the architecture of the algorithm.</p>
<p>Before diving into the problem scenario and the mathematical formulation, let me describe the notational convention. Following <a href="https://arxiv.org/pdf/1802.05814.pdf">Liang et al., 2018</a>, I will use $u \in \{1,\dots,U\}$ to index users and $i \in \{1,\dots,I\}$ to index items. The user-by-item <strong>binary</strong> interaction matrix (i.e. the click matrix) is $\mathbf{X} \in \mathbb{N}^{U\times I}$ and I will use lower case $\mathbf{x}_u =[X_{u1},\dots,X_{uI}]^\top \in \mathbb{N}^I$ to refer to the click history of an individual user $u$.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="1.1-Problem-scenario">1.1 Problem scenario<a class="anchor-link" href="#1.1-Problem-scenario"> </a></h3><p>We are given a dataset $\mathbf{X} = \{ {\mathbf{x}_u} \}^{U}_{u=1}$ of user clicks (a more general scenario is described in "<a href="https://arxiv.org/pdf/1312.6114.pdf">Auto-Encoding Variational Bayes</a>" [2]). Our job is to estimate the parameters of the underlying probability distribution so that we can do inference. In other words, we need to find a statistical model of the data (like in any other ML problem). To do this, we need to maximize the likelihood function $p_{\theta}(\mathbf{X})$ so that under the assumed statistical model the observed data is most probable.</p>
<p>To find the maximum likelihood we could assume that the statistical model of the data involves some latent variable $\bf{z}$, so that the marginal likelihood can be written as:</p>
$$
p_{\theta}(\mathbf{x}_u) = \int {p_{\theta}(\mathbf{z}_u)p_{\theta}(\mathbf{x}_u \vert \mathbf{z}_u) d\mathbf{z}_u} \hspace{1cm} (1)
$$<p>where $\theta$ are the parameters of the distribution. Eq (1) is solvable if we assume that both the prior $p_{\theta}(\mathbf{z}_u)$ and the conditional probability $p_{\theta}(\mathbf{x}_u \vert \mathbf{z}_u)$ come from parametric families of distributions and that their PDFs are differentiable almost everywhere w.r.t. both $\theta$ and $\mathbf{z}_u$. However, for "<em>moderately</em>" complicated likelihood functions $p_{\theta}(\mathbf{x}_u \vert \mathbf{z}_u)$, such as a neural network with a nonlinear layer, Eq (1) is intractable (it is not possible to evaluate or differentiate the marginal likelihood). Furthermore, the true posterior $p_{\theta}(\mathbf{z}_u \vert \mathbf{x}_u) = p_{\theta}(\mathbf{x}_u \vert \mathbf{z}_u)p_{\theta}(\mathbf{z}_u)/p_{\theta}(\mathbf{x}_u)$ is also intractable, and therefore we cannot use an EM algorithm (since the E-step involves the computation of the true posterior at a given iteration).</p>
<p>To address these and some other limitations, and find a general solution to this problem, <a href="https://arxiv.org/pdf/1312.6114.pdf">Kingma and Welling 2014</a> proposed a flexible neural network based approach.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="1.2-Auto-Encoding-Variational-Bayes">1.2 Auto-Encoding Variational Bayes<a class="anchor-link" href="#1.2-Auto-Encoding-Variational-Bayes"> </a></h3><p>The following Section is both a summary and my understanding of the paper "<a href="https://arxiv.org/pdf/1312.6114.pdf">Auto-Encoding Variational Bayes</a>" to which I keep referring and that I strongly recommend reading.</p>
<p>Let me remind you: our goal is to maximize the likelihood, or more conveniently the log likelihood $\log p_{\theta}(\mathbf{X})$, where:</p>
$$
\log p_{\theta}(\mathbf{X}) = \sum_u \log p_{\theta}(\mathbf{x}_u) \hspace{1cm} (2)
$$<p>Each term in the summation can be re-written as:</p>
$$
\log p_{\theta}(\mathbf{x}_u) = D_{KL}\left(q_\phi(\textbf{z}_u\vert \textbf{x}_u) \| p_\theta(\textbf{z}_u \vert \textbf{x}_u)\right) + \underbrace{\mathbb{E}_{q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)} \left[ -\log q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u) + \log p_{\theta}(\mathbf{x}_u, \mathbf{z}_u) \right]}_{\text{ELBO } \mathcal L(\textbf{x}_u, \phi,\theta)} \hspace{1cm} (3)
$$<p>where the first element on the right-hand side is the <a href="https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence">Kullback–Leibler divergence</a> ($\text{D}_{\text{KL}}$) and $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$ is the approximate posterior of the true posterior $p_\theta(\textbf{z}_u \vert \textbf{x}_u)$. Eq (3) is our "<em>point of entry</em>" from which we will derive the remaining equations. If you want a proof of Eq (3) I would recommend reading <a href="https://vannevar.ece.uw.edu/techsite/papers/documents/UWEETR-2010-0002.pdf">this tutorial</a> or this <a href="https://medium.com/@jonathan_hui/machine-learning-summary-proof-terms-8ca7c588905e">"crazy" post</a>.</p>
<p>Moving on, given that $\text{D}_{\text{KL}}$ is non-negative, $\log p_{\theta}(\mathbf{x}_u) \geq \mathcal L(\textbf{x}_u, \phi,\theta)$ and therefore $\mathcal L$ is referred to as the Evidence Lower Bound (ELBO). It is straightforward to understand from Eq (3) that maximizing $\log p_{\theta}(\mathbf{x}_u)$ implies maximizing the ELBO $\mathcal L$. If we re-order the terms in that equation, we could also think of the problem as follows: maximizing the ELBO $\mathcal L$ implies minimizing $\text{D}_{\text{KL}}$, which makes sense, since $D_{KL}$ measures the dissimilarity between the approximate posterior $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$ and the true posterior $p_{\theta}(\textbf{z}_u\vert \textbf{x}_u)$.</p>
<p>ELBO $\mathcal L$ in Eq (3) can also be re-written as:</p>
$$
\mathcal L(\textbf{x}_u, \phi,\theta) = - D_{KL}\left(q_\phi(\textbf{z}_u\vert \textbf{x}_u) \| p_\theta(\textbf{z}_u)\right) + \mathbb{E}_{q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)} \left[ \log p_{\theta}(\textbf{x}_u\vert \textbf{z}_u) \right] \hspace{1cm} (4)
$$<p>We can see that Eq (4) involves sampling $\tilde{\mathbf{z}_u} \sim q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)$. When sampling is involved, backpropagation is not trivial (how would one take gradients with respect to $\phi$?). To remedy this situation Kingma & Welling introduced the so-called "<em>reparameterization trick</em>". Instead of sampling from the approximate posterior $q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)$, the authors used a differentiable transformation $g_{\phi}(\mathbf{\epsilon}, \mathbf{x}_u)$ of a noise variable $\epsilon$, such that:</p>
$$
\tilde{\mathbf{z}_u} = g_{\phi}(\mathbf{\epsilon}, \mathbf{x}_u) \hspace{1cm} with \hspace{1cm} \mathbf{\epsilon} \sim p(\epsilon) \hspace{1cm} (5)
$$<p>where $p(\epsilon)$ can be, for example, a standard normal distribution (see Section 1.3 for the selection of $g_{\phi}$ in the particular case of the $\text{Mult-VAE}^{\text{PR}}$). With this formulation, one can use Monte Carlo estimates of expectations of some function $f(\mathbf{z})$ with respect to $q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)$ such that:</p>
$$
\mathbb{E}_{q_{\phi}(\mathbf{z}_u \vert \mathbf{x}_u)}\left[ f(\mathbf{z}_u) \right] = \mathbb{E}_{p(\epsilon)}\left[ f(g_{\phi}(\mathbf{\epsilon}, \mathbf{x}_u)) \right] \simeq \frac{1}{L} \sum_{l=1}^{L} f(g_{\phi}(\mathbf{\epsilon}^l, \mathbf{x}_u))
\\
\text{where} \hspace{1cm} \mathbf{\epsilon}^l \sim p(\epsilon) \hspace{1cm} (6)
$$<p>Replacing the second term in Eq (4) with the result in Eq (6), we see that the ELBO $\mathcal L$ can be approximated by what Kingma and Welling called "<em>Generic Stochastic Gradient Variational Bayes</em>" (SGVB) estimator $\tilde{\mathcal L}(\textbf{x}_u, \phi,\theta) \simeq \mathcal L(\textbf{x}_u, \phi,\theta)$:</p>
$$
\tilde{\mathcal L}(\mathbf{x}_u, \phi,\theta) = - D_{KL}\left(q_\phi(\textbf{z}_u\vert \textbf{x}_u) \| p_\theta(\textbf{z}_u)\right) + \frac{1}{L} \sum_{l=1}^{L} \log p_{\theta}(\mathbf{x}_u \vert \mathbf{z}^l_u) \\
\text{where} \hspace{1cm} \mathbf{z}^l_u = g_{\phi}(\epsilon^l, \mathbf{x}_u) \hspace{1cm} \text{and} \hspace{1cm} \epsilon^l \sim p(\epsilon) \hspace{1cm} (7)
$$<p>Of course, when running a practical application, we will be using minibatches. With that in mind, we can re-write ELBO $\mathcal{L}$ in "minibatch form" as:</p>
$$
\mathcal L(\mathbf{\text{X}}^M, \phi,\theta) \simeq \tilde{\mathcal L}^{M}(\mathbf{\text{X}}^M, \phi,\theta) = \frac{1}{M} \sum_{u=1}^{M} \tilde{\mathcal L}(\mathbf{x}_u, \phi,\theta) \hspace{1cm} (8)
$$<p>where $\mathbf{X}^M = \{\mathbf{x}_u \}_{u=1}^M$ is a minibatch of $M$ users. In their experiments the authors found that the number of samples $L$ can be set to 1 as long as the minibatch size is large enough, e.g. $M$ = 100. Therefore, as long as our batch sizes are 100 or more, Eq (7) can be re-written as:</p>
$$
\mathcal L(\mathbf{\text{X}}^M, \phi,\theta) \simeq \frac{1}{M} \sum_{u=1}^{M} \left[ - D_{KL}\left(q_\phi(\textbf{z}_u\vert \textbf{x}_u) \| p_\theta(\textbf{z}_u)\right) + \log p_{\theta}(\mathbf{x}_u \vert \mathbf{z}^s_u) \right] \hspace{1cm} (9)
$$<p>Note that $\mathbf{z}^s_u$ signifies that $\mathbf{z}_u$ still needs to be sampled once from $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$, but using the reparameterization trick this will be rather easy, as we will see in the next section. Finally, now that we have a "nice looking" mathematical expression, this is how Auto-Encoding Variational Bayes works:</p>
<ol>
<li>Select a prior for the latent representation of $\textbf{x}_u$, $p_{\theta}(\textbf{z}_u)$.</li>
<li>Use a neural network to parameterize the distribution $p_{\theta}(\textbf{x}_u\vert \textbf{z}_u)$. Because this part of the model maps the latent variable/representation $\textbf{z}_u$ to the observed data $\textbf{x}_u$, it is referred to as the "<em>decoder</em>" network. </li>
<li>Rather than explicitly calculating the intractable posterior $p_{\theta}(\textbf{z}_u\vert \textbf{x}_u)$, use another neural network to parameterize the distribution $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$ as the approximate posterior. Since $q_\phi$ maps the observed data $\textbf{x}_u$ to the latent space of $\textbf{z}_u$'s, it is referred to as the "<em>encoder</em>" network.</li>
<li>Maximize the ELBO $\mathcal{L}$ in Eq (9) using Stochastic Gradient Descent or any of its cousins.</li>
</ol>
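To make steps 3 and 4 a bit more concrete, here is a minimal NumPy sketch of the reparameterization trick and the analytic Gaussian $D_{KL}$ (the toy shapes and variable names are mine, assuming the Gaussian choices that will be made in Section 1.3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoder outputs" for a minibatch of M users, latent dim d
M, d = 4, 3
mu = rng.normal(size=(M, d))       # mean of q_phi(z_u | x_u)
log_var = rng.normal(size=(M, d))  # log sigma^2 of q_phi(z_u | x_u)

# Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
# so gradients can flow through mu and sigma
eps = rng.normal(size=(M, d))
z = mu + np.exp(0.5 * log_var) * eps

# Analytic D_KL( N(mu, sigma^2 I) || N(0, I) ), one value per user
kl = -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var), axis=1)
```

The `kl` values are always non-negative (since $e^t \geq 1 + t$), which is precisely what makes the ELBO a lower bound.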
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="1.3-Partially-Regularized-Autoencoder-for-Collaborative-Filtering">1.3 Partially Regularized Autoencoder for Collaborative Filtering<a class="anchor-link" href="#1.3-Partially-Regularized-Autoencoder-for-Collaborative-Filtering"> </a></h3><p>or $\text{Mult-VAE}^{\text{PR}}$...</p>
<p>In the previous Section we obtained Eq (9), which is a generic form of the function we need to maximize to solve the problem described in Section 1.1. Now let's see a particular case of that equation for the set up used by <a href="https://arxiv.org/pdf/1802.05814.pdf">Liang and co-authors</a> in their paper. Such set up is described as follows: for each user $u$, the latent representation $\textbf{z}_u$ is assumed to be drawn from a standard Gaussian prior $p(\textbf{z}_u) \sim \mathcal N(0, I)$. Such representation is then transformed by a multi-layer perceptron (MLP), and the output is normalized via a Softmax function to produce a probability distribution over all items <strong>$I$</strong>, $\pi(\mathbf{z}_u) = Softmax(MLP(\mathbf{z}_u))$. Then, the click history of user $u$ is assumed to be drawn from a Multinomial distribution with probability $\pi(\mathbf{z}_u)$:</p>
$$
\textbf{x}_u \sim \text{Mult}(N_u, \pi(\mathbf{z}_u)) \hspace{1cm} (10)
$$<p>where $N_u = \sum_i x_{ui}$ is the total number of clicks for user $u$. In this set up, the log-likelihood of the click history $\mathbf{x}_u$ conditioned to the latent representation $\mathbf{z}_u$ is simply:</p>
$$
\begin{equation*}
\log(p_{\theta}(\textbf{x}_u\vert \textbf{z}_u)) = \mathbf{x}_u \log(\pi(\mathbf{z}_u)) \hspace{1cm} (11)
\end{equation*}
$$<p>The posterior $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$ is also chosen to be a Gaussian with diagonal covariance, $q_\phi(\textbf{z}_u\vert \textbf{x}_u) \sim \mathcal N(\mu_\phi(\textbf{x}_u), \sigma_\phi(\textbf{x}_u) I)$, where $\mu_\phi(\textbf{x}_u)$ and $\sigma_\phi(\textbf{x}_u)$ are functions implemented as neural networks. Then, we use the reparameterization trick and choose $g_{\phi}(\mathbf{\epsilon}, \mathbf{x}_u) = \mu(\textbf{x}_u) + \sigma(\textbf{x}_u) \cdot \epsilon$, where $\epsilon \sim \mathcal{N}(0,I)$. This way $\mathbf{z}^s_u = \mu(\textbf{x}_u) + \sigma(\textbf{x}_u) \cdot \epsilon$, where we sample $\epsilon$ directly.</p>
<p>At this stage we have defined the Gaussian prior, the Gaussian approximate posterior and our sampled latent representation. We are finally ready to "plug the terms" into Eq (9) and write the loss function that we will minimize when training the Mult-VAE:</p>
$$
Loss = -\frac{1}{M} \sum_{u=1}^{M} \left[ \mathbf{x}_u \log(\pi(\mathbf{z}_u)) + \frac{\beta}{2} \sum_j ( 1 + \log(\sigma_{uj}^2) - \mu_{uj}^2 - \sigma_{uj}^2 ) \right] \hspace{1cm} (12)
$$<p>Note that the expression above is the negative ELBO $\mathcal L$ (maximizing $\mathcal L$ is equivalent to minimize -$\mathcal L$) with a multiplicative factor $\beta$ applied to the $D_{KL}$. For the math behind the $D_{KL}$ expression given this set up have a look <a href="https://stats.stackexchange.com/questions/318748/deriving-the-kl-divergence-loss-for-vaes">here</a>.</p>
<p>Let me just comment on that $\beta$. Looking at the loss function in Eq (12) within the context of VAEs, we can see that the first term is the reconstruction loss, while the $D_{KL}$ term acts as a regularizer. With that in mind, Liang et al. add a factor $\beta$ to control the strength of the regularization, and propose $\beta < 1$.</p>
<p>Let's pause for one second and think about what this means. First of all, we are no longer optimizing a lower bound for a given log likelihood. In addition, remember that the $D_{KL}$ divergence measures the dissimilarity between the approximate posterior $q_\phi(\textbf{z}_u\vert \textbf{x}_u)$ and the prior $p_\theta(\textbf{z}_u)$. Therefore, by using $\beta < 1$ we are weakening the influence of the prior constraint $q_\phi(\textbf{z}_u\vert \textbf{x}_u) \approx p_\theta(\textbf{z}_u)$ on the loss. This means that we are less able to generalize to novel user clicks from historical data. However, when building recommendation systems we are often not interested in reproducing click histories precisely (i.e. achieving the best loss) but in making good recommendations (i.e. achieving the best ranking metrics). As the authors show in the <a href="https://arxiv.org/pdf/1802.05814.pdf">paper</a> (and we will see here later), the best ranking metrics are obtained when using $\beta < 1$, and in consequence they name the algorithm Partially Regularized Multinomial Variational Autoencoder or $\text{Mult-VAE}^{\text{PR}}$.</p>
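Putting Eq (11) and Eq (12) together, the loss can be sketched in NumPy as follows. The shapes, the random stand-in for the decoder MLP and the variable names are mine, for illustration only; the actual implementations use proper neural networks:

```python
import numpy as np

rng = np.random.default_rng(0)
M, n_items, d = 4, 10, 3                  # minibatch, num items, latent dim

x = (rng.random((M, n_items)) > 0.5).astype(float)  # binary click matrix
mu = rng.normal(size=(M, d))              # toy encoder outputs
log_var = rng.normal(size=(M, d))
z = mu + np.exp(0.5 * log_var) * rng.normal(size=(M, d))  # reparam. sample

# Stand-in for the decoder MLP: a random linear map to item logits
logits = z @ rng.normal(size=(d, n_items))

# log pi(z_u): numerically stable log-softmax over items
m = logits.max(axis=1, keepdims=True)
log_pi = logits - m - np.log(np.exp(logits - m).sum(axis=1, keepdims=True))

# Eq (11): multinomial log-likelihood, plus the Gaussian D_KL per user
log_lik = (x * log_pi).sum(axis=1)
kl = -0.5 * np.sum(1 + log_var - mu**2 - np.exp(log_var), axis=1)

beta = 0.5                                # beta < 1: "partial" regularization
loss = np.mean(-log_lik + beta * kl)      # negative ELBO, Eq (12)
```

Note that `-log_lik` and `kl` are both non-negative, and that setting `beta = 1` would recover the standard (fully regularized) negative ELBO.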
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="2.-Preparing-the-data">2. Preparing the data<a class="anchor-link" href="#2.-Preparing-the-data"> </a></h2><p>Throughout this exercise I will use two datasets: the <a href="http://jmcauley.ucsd.edu/data/amazon/">Amazon Movies and TV</a> dataset [4] [5] and the <a href="https://grouplens.org/datasets/movielens/20m/">MovieLens</a> dataset. The latter is used so I can make sure I am obtaining results consistent with those in the paper. As we will see throughout the notebook, the Amazon dataset is significantly more challenging than the MovieLens dataset.</p>
<p>The data preparation is fairly simple, and is identical for both datasets. Therefore, I will focus here only on the Amazon dataset.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">import</span> <span class="nn">os</span>
<span class="kn">import</span> <span class="nn">sys</span>
<span class="kn">import</span> <span class="nn">pandas</span> <span class="k">as</span> <span class="nn">pd</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">import</span> <span class="nn">pickle</span>
<span class="kn">from</span> <span class="nn">tqdm</span> <span class="kn">import</span> <span class="n">trange</span>
<span class="kn">from</span> <span class="nn">typing</span> <span class="kn">import</span> <span class="n">Tuple</span><span class="p">,</span> <span class="n">Dict</span><span class="p">,</span> <span class="n">Union</span>
<span class="kn">from</span> <span class="nn">pathlib</span> <span class="kn">import</span> <span class="n">Path</span>
<span class="n">sys</span><span class="o">.</span><span class="n">path</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">os</span><span class="o">.</span><span class="n">path</span><span class="o">.</span><span class="n">abspath</span><span class="p">(</span><span class="s1">'/Users/javier/ml_experiments_python/RecoTour/Amazon/mult-vae/'</span><span class="p">))</span>
<span class="n">rootpath</span> <span class="o">=</span> <span class="n">Path</span><span class="p">(</span><span class="s2">"/Users/javier/ml_experiments_python/RecoTour/Amazon/mult-vae/"</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">DATA_DIR</span> <span class="o">=</span> <span class="n">Path</span><span class="p">(</span><span class="n">rootpath</span> <span class="o">/</span> <span class="s1">'data'</span><span class="p">)</span>
<span class="n">new_colnames</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"user"</span><span class="p">,</span> <span class="s2">"item"</span><span class="p">,</span> <span class="s2">"rating"</span><span class="p">,</span> <span class="s2">"timestamp"</span><span class="p">]</span>
<span class="n">inp_path</span> <span class="o">=</span> <span class="n">DATA_DIR</span> <span class="o">/</span> <span class="s2">"amazon-movies"</span>
<span class="n">filename</span> <span class="o">=</span> <span class="s2">"reviews_Movies_and_TV_5.p"</span>
<span class="n">raw_data</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">read_pickle</span><span class="p">(</span><span class="n">inp_path</span> <span class="o">/</span> <span class="n">filename</span><span class="p">)</span>
<span class="n">keep_cols</span> <span class="o">=</span> <span class="p">[</span><span class="s2">"reviewerID"</span><span class="p">,</span> <span class="s2">"asin"</span><span class="p">,</span> <span class="s2">"overall"</span><span class="p">,</span> <span class="s2">"unixReviewTime"</span><span class="p">]</span>
<span class="n">raw_data</span> <span class="o">=</span> <span class="n">raw_data</span><span class="p">[</span><span class="n">keep_cols</span><span class="p">]</span>
<span class="n">raw_data</span><span class="o">.</span><span class="n">columns</span> <span class="o">=</span> <span class="n">new_colnames</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="nb">print</span><span class="p">(</span><span class="n">raw_data</span><span class="o">.</span><span class="n">shape</span><span class="p">)</span>
<span class="n">raw_data</span><span class="o">.</span><span class="n">head</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>(1697533, 4)
</pre>
</div>
</div>
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>user</th>
<th>item</th>
<th>rating</th>
<th>timestamp</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>ADZPIG9QOCDG5</td>
<td>0005019281</td>
<td>4</td>
<td>1203984000</td>
</tr>
<tr>
<th>1</th>
<td>A35947ZP82G7JH</td>
<td>0005019281</td>
<td>3</td>
<td>1388361600</td>
</tr>
<tr>
<th>2</th>
<td>A3UORV8A9D5L2E</td>
<td>0005019281</td>
<td>3</td>
<td>1388361600</td>
</tr>
<tr>
<th>3</th>
<td>A1VKW06X1O2X7V</td>
<td>0005019281</td>
<td>5</td>
<td>1202860800</td>
</tr>
<tr>
<th>4</th>
<td>A3R27T4HADWFFJ</td>
<td>0005019281</td>
<td>4</td>
<td>1387670400</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="2.1-Filter-triples-(user,-item,-score)">2.1 Filter triples (user, item, score)<a class="anchor-link" href="#2.1-Filter-triples-(user,-item,-score)"> </a></h3><p>The first thing we do is "filter triples" (hereafter referred to as <code>tp</code>) based on the number of times a user interacted with items (<code>min_user_click</code>) or the number of times an item was "interacted with" by users (<code>min_item_click</code>).</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">get_count</span><span class="p">(</span><span class="n">tp</span><span class="p">:</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="nb">id</span><span class="p">:</span> <span class="nb">str</span><span class="p">)</span> <span class="o">-></span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">:</span>
<span class="sd">"""</span>
<span class="sd"> Returns `tp` groupby+count by `id`</span>
<span class="sd"> """</span>
<span class="n">playcount_groupbyid</span> <span class="o">=</span> <span class="n">tp</span><span class="p">[[</span><span class="nb">id</span><span class="p">]]</span><span class="o">.</span><span class="n">groupby</span><span class="p">(</span><span class="nb">id</span><span class="p">,</span> <span class="n">as_index</span><span class="o">=</span><span class="kc">False</span><span class="p">)</span>
<span class="n">count</span> <span class="o">=</span> <span class="n">playcount_groupbyid</span><span class="o">.</span><span class="n">size</span><span class="p">()</span>
<span class="k">return</span> <span class="n">count</span>
<span class="k">def</span> <span class="nf">filter_triplets</span><span class="p">(</span>
<span class="n">tp</span><span class="p">:</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="n">min_user_click</span><span class="p">,</span> <span class="n">min_item_click</span>
<span class="p">)</span> <span class="o">-></span> <span class="n">Tuple</span><span class="p">[</span><span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">,</span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">]:</span>
<span class="sd">"""</span>
<span class="sd"> Returns triplets (`tp`) of user-item-rating for users/items with </span>
<span class="sd"> more than min_user_click/min_item_click counts</span>
<span class="sd"> """</span>
<span class="k">if</span> <span class="n">min_item_click</span> <span class="o">></span> <span class="mi">0</span><span class="p">:</span>
<span class="n">itemcount</span> <span class="o">=</span> <span class="n">get_count</span><span class="p">(</span><span class="n">tp</span><span class="p">,</span> <span class="s2">"item"</span><span class="p">)</span>
<span class="n">tp</span> <span class="o">=</span> <span class="n">tp</span><span class="p">[</span><span class="n">tp</span><span class="p">[</span><span class="s2">"item"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">itemcount</span><span class="o">.</span><span class="n">index</span><span class="p">[</span><span class="n">itemcount</span> <span class="o">>=</span> <span class="n">min_item_click</span><span class="p">])]</span>
<span class="k">if</span> <span class="n">min_user_click</span> <span class="o">></span> <span class="mi">0</span><span class="p">:</span>
<span class="n">usercount</span> <span class="o">=</span> <span class="n">get_count</span><span class="p">(</span><span class="n">tp</span><span class="p">,</span> <span class="s2">"user"</span><span class="p">)</span>
<span class="n">tp</span> <span class="o">=</span> <span class="n">tp</span><span class="p">[</span><span class="n">tp</span><span class="p">[</span><span class="s2">"user"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">usercount</span><span class="o">.</span><span class="n">index</span><span class="p">[</span><span class="n">usercount</span> <span class="o">>=</span> <span class="n">min_user_click</span><span class="p">])]</span>
<span class="n">usercount</span><span class="p">,</span> <span class="n">itemcount</span> <span class="o">=</span> <span class="n">get_count</span><span class="p">(</span><span class="n">tp</span><span class="p">,</span> <span class="s2">"user"</span><span class="p">),</span> <span class="n">get_count</span><span class="p">(</span><span class="n">tp</span><span class="p">,</span> <span class="s2">"item"</span><span class="p">)</span>
<span class="k">return</span> <span class="n">tp</span><span class="p">,</span> <span class="n">usercount</span><span class="p">,</span> <span class="n">itemcount</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">filtered_raw_data</span><span class="p">,</span> <span class="n">user_activity</span><span class="p">,</span> <span class="n">item_popularity</span> <span class="o">=</span> <span class="n">filter_triplets</span><span class="p">(</span>
<span class="n">raw_data</span><span class="p">,</span> <span class="n">min_user_click</span><span class="o">=</span><span class="mi">5</span><span class="p">,</span> <span class="n">min_item_click</span><span class="o">=</span><span class="mi">0</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that, since I am using the "reviews_Movies_and_TV_5" (i.e. the 5-core dataset, where users and items have at least 5 reviews each) <code>filtered_raw_data</code> has no effect on the Amazon dataset. It does however filter some users/items in the case of the Movilens dataset.</p>
<p>Let's now have a look to the sparsity of the dataset:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">sparsity</span> <span class="o">=</span> <span class="p">(</span>
<span class="mf">1.0</span>
<span class="o">*</span> <span class="n">filtered_raw_data</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
<span class="o">/</span> <span class="p">(</span><span class="n">user_activity</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span> <span class="o">*</span> <span class="n">item_popularity</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span>
<span class="p">)</span>
<span class="nb">print</span><span class="p">(</span>
<span class="s2">"After filtering, there are </span><span class="si">%d</span><span class="s2"> watching events from </span><span class="si">%d</span><span class="s2"> users and </span><span class="si">%d</span><span class="s2"> movies (sparsity: </span><span class="si">%.3f%%</span><span class="s2">)"</span>
<span class="o">%</span> <span class="p">(</span>
<span class="n">filtered_raw_data</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span>
<span class="n">user_activity</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span>
<span class="n">item_popularity</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span>
<span class="n">sparsity</span> <span class="o">*</span> <span class="mi">100</span><span class="p">,</span>
<span class="p">)</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>After filtering, there are 1697533 watching events from 123960 users and 50052 movies (sparsity: 0.027%)
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Comparing these numbers to those of the Movilens dataset (9990682 watching events from 136677 users and 20720 movies: sparsity: 0.353%. see the <a href="https://github.com/dawenl/vae_cf/blob/master/VAE_ML20M_WWW2018.ipynb">notebook</a> corresponding to the original publication, or the <a href="https://arxiv.org/pdf/1802.05814.pdf">original publication</a> itself) one can see that the Amazon dataset is $\sim$13 times more sparse than the Movielens dataset. In consequence, I one would expect that the algorithm finds it more challenging, resulting in lower ranking metrics.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="2.2-Train,-validation-and-test-split">2.2 Train, validation and test split<a class="anchor-link" href="#2.2-Train,-validation-and-test-split"> </a></h3><p>Once the raw data is filtered, we follow the same procedure than that of the original paper to split the users into training, validation and test users.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">split_users</span><span class="p">(</span>
<span class="n">unique_uid</span><span class="p">:</span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">,</span> <span class="n">test_users_size</span><span class="p">:</span> <span class="n">Union</span><span class="p">[</span><span class="nb">float</span><span class="p">,</span> <span class="nb">int</span><span class="p">]</span>
<span class="p">)</span> <span class="o">-></span> <span class="n">Tuple</span><span class="p">[</span><span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">,</span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">,</span> <span class="n">pd</span><span class="o">.</span><span class="n">Index</span><span class="p">]:</span>
<span class="n">n_users</span> <span class="o">=</span> <span class="n">unique_uid</span><span class="o">.</span><span class="n">size</span>
<span class="k">if</span> <span class="nb">isinstance</span><span class="p">(</span><span class="n">test_users_size</span><span class="p">,</span> <span class="nb">int</span><span class="p">):</span>
<span class="n">n_heldout_users</span> <span class="o">=</span> <span class="n">test_users_size</span>
<span class="k">else</span><span class="p">:</span>
<span class="n">n_heldout_users</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">test_users_size</span> <span class="o">*</span> <span class="n">n_users</span><span class="p">)</span>
<span class="n">tr_users</span> <span class="o">=</span> <span class="n">unique_uid</span><span class="p">[:</span> <span class="p">(</span><span class="n">n_users</span> <span class="o">-</span> <span class="n">n_heldout_users</span> <span class="o">*</span> <span class="mi">2</span><span class="p">)]</span>
<span class="n">vd_users</span> <span class="o">=</span> <span class="n">unique_uid</span><span class="p">[(</span><span class="n">n_users</span> <span class="o">-</span> <span class="n">n_heldout_users</span> <span class="o">*</span> <span class="mi">2</span><span class="p">)</span> <span class="p">:</span> <span class="p">(</span><span class="n">n_users</span> <span class="o">-</span> <span class="n">n_heldout_users</span><span class="p">)]</span>
<span class="n">te_users</span> <span class="o">=</span> <span class="n">unique_uid</span><span class="p">[(</span><span class="n">n_users</span> <span class="o">-</span> <span class="n">n_heldout_users</span><span class="p">)</span> <span class="p">:]</span>
<span class="k">return</span> <span class="n">tr_users</span><span class="p">,</span> <span class="n">vd_users</span><span class="p">,</span> <span class="n">te_users</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">unique_uid</span> <span class="o">=</span> <span class="n">user_activity</span><span class="o">.</span><span class="n">index</span>
<span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">seed</span><span class="p">(</span><span class="mi">98765</span><span class="p">)</span>
<span class="n">idx_perm</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">permutation</span><span class="p">(</span><span class="n">unique_uid</span><span class="o">.</span><span class="n">size</span><span class="p">)</span>
<span class="n">unique_uid</span> <span class="o">=</span> <span class="n">unique_uid</span><span class="p">[</span><span class="n">idx_perm</span><span class="p">]</span>
<span class="n">tr_users</span><span class="p">,</span> <span class="n">vd_users</span><span class="p">,</span> <span class="n">te_users</span> <span class="o">=</span> <span class="n">split_users</span><span class="p">(</span>
<span class="n">unique_uid</span><span class="p">,</span> <span class="n">test_users_size</span><span class="o">=</span><span class="mf">0.1</span>
<span class="p">)</span>
<span class="nb">print</span><span class="p">(</span><span class="n">tr_users</span><span class="o">.</span><span class="n">shape</span><span class="p">,</span> <span class="n">vd_users</span><span class="o">.</span><span class="n">shape</span><span class="p">,</span> <span class="n">te_users</span><span class="o">.</span><span class="n">shape</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>(99168,) (12396,) (12396,)
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And this is how the authors set up the experiment: for validation and test they consider "<em>only</em>" items that have been seen during training</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1"># Select the training observations raw data </span>
<span class="n">tr_obsrv</span> <span class="o">=</span> <span class="n">filtered_raw_data</span><span class="o">.</span><span class="n">loc</span><span class="p">[</span><span class="n">filtered_raw_data</span><span class="p">[</span><span class="s2">"user"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">tr_users</span><span class="p">)]</span>
<span class="n">tr_items</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">unique</span><span class="p">(</span><span class="n">tr_obsrv</span><span class="p">[</span><span class="s2">"item"</span><span class="p">])</span>
<span class="c1"># Save index dictionaries to "numerise" later one</span>
<span class="n">item2id</span> <span class="o">=</span> <span class="nb">dict</span><span class="p">((</span><span class="n">sid</span><span class="p">,</span> <span class="n">i</span><span class="p">)</span> <span class="k">for</span> <span class="p">(</span><span class="n">i</span><span class="p">,</span> <span class="n">sid</span><span class="p">)</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="n">tr_items</span><span class="p">))</span>
<span class="n">user2id</span> <span class="o">=</span> <span class="nb">dict</span><span class="p">((</span><span class="n">pid</span><span class="p">,</span> <span class="n">i</span><span class="p">)</span> <span class="k">for</span> <span class="p">(</span><span class="n">i</span><span class="p">,</span> <span class="n">pid</span><span class="p">)</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="n">unique_uid</span><span class="p">))</span>
<span class="n">vd_obsrv</span> <span class="o">=</span> <span class="n">filtered_raw_data</span><span class="p">[</span>
<span class="n">filtered_raw_data</span><span class="p">[</span><span class="s2">"user"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">vd_users</span><span class="p">)</span>
<span class="o">&</span> <span class="n">filtered_raw_data</span><span class="p">[</span><span class="s2">"item"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">tr_items</span><span class="p">)</span>
<span class="p">]</span>
<span class="n">te_obsrv</span> <span class="o">=</span> <span class="n">filtered_raw_data</span><span class="p">[</span>
<span class="n">filtered_raw_data</span><span class="p">[</span><span class="s2">"user"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">te_users</span><span class="p">)</span>
<span class="o">&</span> <span class="n">filtered_raw_data</span><span class="p">[</span><span class="s2">"item"</span><span class="p">]</span><span class="o">.</span><span class="n">isin</span><span class="p">(</span><span class="n">tr_items</span><span class="p">)</span>
<span class="p">]</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Now that we have the validation and test users and their interactions, we will split such interactions into so-called "<em>validation and test train and test sets</em>".</p>
<p>I know that this sounds convoluted, but is not that complex. The "<em>validation_train and test_train sets</em>", comprising here 80% of the total validation and test sets, will be used to build what we could think as an input binary <em>"image"</em> (i.e. the binary matrix of clicks) to be "<em>encoded -> decoded</em>" by the trained auto-encoder. On the other hand the "<em>validation_test and test_test sets</em>", comprising here 20% of the total validation and test sets, will be used to compute the ranking metrics at validation/test time. If you want more details along with a toy example please go to the corresponding <a href="http://localhost:8790/notebooks/notebooks/01_prepare_data.ipynb">notebook</a> in the repo. I will discuss this further in Section 4.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">split_train_test</span><span class="p">(</span>
<span class="n">data</span><span class="p">:</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="n">test_size</span><span class="p">:</span> <span class="nb">float</span>
<span class="p">)</span> <span class="o">-></span> <span class="n">Tuple</span><span class="p">[</span><span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">]:</span>
<span class="n">data_grouped_by_user</span> <span class="o">=</span> <span class="n">data</span><span class="o">.</span><span class="n">groupby</span><span class="p">(</span><span class="s2">"user"</span><span class="p">)</span>
<span class="n">tr_list</span><span class="p">,</span> <span class="n">te_list</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(),</span> <span class="nb">list</span><span class="p">()</span>
<span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">seed</span><span class="p">(</span><span class="mi">98765</span><span class="p">)</span>
<span class="k">for</span> <span class="n">i</span><span class="p">,</span> <span class="p">(</span><span class="n">nm</span><span class="p">,</span> <span class="n">group</span><span class="p">)</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="n">data_grouped_by_user</span><span class="p">):</span>
<span class="n">n_items_u</span> <span class="o">=</span> <span class="nb">len</span><span class="p">(</span><span class="n">group</span><span class="p">)</span>
<span class="k">if</span> <span class="n">n_items_u</span> <span class="o">>=</span> <span class="mi">5</span><span class="p">:</span>
<span class="n">idx</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">zeros</span><span class="p">(</span><span class="n">n_items_u</span><span class="p">,</span> <span class="n">dtype</span><span class="o">=</span><span class="s2">"bool"</span><span class="p">)</span>
<span class="n">idx</span><span class="p">[</span>
<span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">choice</span><span class="p">(</span>
<span class="n">n_items_u</span><span class="p">,</span> <span class="n">size</span><span class="o">=</span><span class="nb">int</span><span class="p">(</span><span class="n">test_size</span> <span class="o">*</span> <span class="n">n_items_u</span><span class="p">),</span> <span class="n">replace</span><span class="o">=</span><span class="kc">False</span>
<span class="p">)</span><span class="o">.</span><span class="n">astype</span><span class="p">(</span><span class="s2">"int64"</span><span class="p">)</span>
<span class="p">]</span> <span class="o">=</span> <span class="kc">True</span>
<span class="n">tr_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">group</span><span class="p">[</span><span class="n">np</span><span class="o">.</span><span class="n">logical_not</span><span class="p">(</span><span class="n">idx</span><span class="p">)])</span>
<span class="n">te_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">group</span><span class="p">[</span><span class="n">idx</span><span class="p">])</span>
<span class="k">else</span><span class="p">:</span>
<span class="n">tr_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">group</span><span class="p">)</span>
<span class="n">data_tr</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">concat</span><span class="p">(</span><span class="n">tr_list</span><span class="p">)</span>
<span class="n">data_te</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">concat</span><span class="p">(</span><span class="n">te_list</span><span class="p">)</span>
<span class="k">return</span> <span class="n">data_tr</span><span class="p">,</span> <span class="n">data_te</span>
<span class="k">def</span> <span class="nf">numerize</span><span class="p">(</span><span class="n">tp</span><span class="p">:</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">,</span> <span class="n">user2id</span><span class="p">:</span> <span class="n">Dict</span><span class="p">,</span> <span class="n">item2id</span><span class="p">:</span> <span class="n">Dict</span><span class="p">)</span> <span class="o">-></span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">:</span>
<span class="n">user</span> <span class="o">=</span> <span class="p">[</span><span class="n">user2id</span><span class="p">[</span><span class="n">x</span><span class="p">]</span> <span class="k">for</span> <span class="n">x</span> <span class="ow">in</span> <span class="n">tp</span><span class="p">[</span><span class="s2">"user"</span><span class="p">]]</span>
<span class="n">item</span> <span class="o">=</span> <span class="p">[</span><span class="n">item2id</span><span class="p">[</span><span class="n">x</span><span class="p">]</span> <span class="k">for</span> <span class="n">x</span> <span class="ow">in</span> <span class="n">tp</span><span class="p">[</span><span class="s2">"item"</span><span class="p">]]</span>
<span class="k">return</span> <span class="n">pd</span><span class="o">.</span><span class="n">DataFrame</span><span class="p">(</span><span class="n">data</span><span class="o">=</span><span class="p">{</span><span class="s2">"user"</span><span class="p">:</span> <span class="n">user</span><span class="p">,</span> <span class="s2">"item"</span><span class="p">:</span> <span class="n">item</span><span class="p">},</span> <span class="n">columns</span><span class="o">=</span><span class="p">[</span><span class="s2">"user"</span><span class="p">,</span> <span class="s2">"item"</span><span class="p">])</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">vd_items_tr</span><span class="p">,</span> <span class="n">vd_items_te</span> <span class="o">=</span> <span class="n">split_train_test</span><span class="p">(</span><span class="n">vd_obsrv</span><span class="p">,</span> <span class="n">test_size</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
<span class="n">te_items_tr</span><span class="p">,</span> <span class="n">te_items_te</span> <span class="o">=</span> <span class="n">split_train_test</span><span class="p">(</span><span class="n">te_obsrv</span><span class="p">,</span> <span class="n">test_size</span><span class="o">=</span><span class="mf">0.2</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And that's it regarding data preparation. We can now move on to the model itself.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="3-$\text{Mult-VAE}^{\text{PR}}$:-the-code">3 $\text{Mult-VAE}^{\text{PR}}$: the code<a class="anchor-link" href="#3-$\text{Mult-VAE}^{\text{PR}}$:-the-code"> </a></h2><p>After the explanation in Section 1 you might expect the code to look rather complex. However, you might feel disappointed/pleased when you see how simple it really is.</p>
<p>In the <a href="https://arxiv.org/pdf/1802.05814.pdf">original publications</a> the authors used a one hidden layer MLP as generative model. There they say that deeper architectures do not improve the results, which I find it to be true after having run over 60 experiments. With that it mind, let's first have a look the model $ I \rightarrow 600 \rightarrow 200 \rightarrow 600 \rightarrow I$, where $I$ is the total number of items:</p>
<p><img src="/infinitoml/images/copied_from_nb/figures/mvae/multvae_arch.png" alt="" /></p>
<p><strong>Figure 1</strong>. $\text{Mult-VAE}^{\text{PR}}$ architecture. The colors in the Figure are my attempt to emphasize the <em>reparameterization trick</em>.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>In code, the figure above is:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">from</span> <span class="nn">typing</span> <span class="kn">import</span> <span class="n">List</span>
<span class="kn">import</span> <span class="nn">numpy</span> <span class="k">as</span> <span class="nn">np</span>
<span class="kn">import</span> <span class="nn">mxnet</span> <span class="k">as</span> <span class="nn">mx</span>
<span class="kn">from</span> <span class="nn">mxnet</span> <span class="kn">import</span> <span class="n">autograd</span><span class="p">,</span> <span class="n">gluon</span><span class="p">,</span> <span class="n">nd</span>
<span class="kn">from</span> <span class="nn">mxnet.gluon</span> <span class="kn">import</span> <span class="n">nn</span><span class="p">,</span> <span class="n">HybridBlock</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>/usr/local/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
from ._conv import register_converters as _register_converters
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.1-Encoder">3.1 Encoder<a class="anchor-link" href="#3.1-Encoder"> </a></h3>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">class</span> <span class="nc">VAEEncoder</span><span class="p">(</span><span class="n">HybridBlock</span><span class="p">):</span>
<span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">q_dims</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">int</span><span class="p">],</span> <span class="n">dropout</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">float</span><span class="p">]):</span>
<span class="nb">super</span><span class="p">()</span><span class="o">.</span><span class="fm">__init__</span><span class="p">()</span>
<span class="c1"># last dim multiplied by two for the reparameterization trick</span>
<span class="n">q_dims_</span> <span class="o">=</span> <span class="n">q_dims</span><span class="p">[:</span><span class="o">-</span><span class="mi">1</span><span class="p">]</span> <span class="o">+</span> <span class="p">[</span><span class="n">q_dims</span><span class="p">[</span><span class="o">-</span><span class="mi">1</span><span class="p">]</span> <span class="o">*</span> <span class="mi">2</span><span class="p">]</span>
<span class="k">with</span> <span class="bp">self</span><span class="o">.</span><span class="n">name_scope</span><span class="p">():</span>
<span class="bp">self</span><span class="o">.</span><span class="n">q_layers</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">HybridSequential</span><span class="p">(</span><span class="n">prefix</span><span class="o">=</span><span class="s2">"q_net"</span><span class="p">)</span>
<span class="k">for</span> <span class="n">p</span><span class="p">,</span> <span class="n">inp</span><span class="p">,</span> <span class="n">out</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">dropout</span><span class="p">,</span> <span class="n">q_dims_</span><span class="p">[:</span><span class="o">-</span><span class="mi">1</span><span class="p">],</span> <span class="n">q_dims_</span><span class="p">[</span><span class="mi">1</span><span class="p">:]):</span>
<span class="bp">self</span><span class="o">.</span><span class="n">q_layers</span><span class="o">.</span><span class="n">add</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Dropout</span><span class="p">(</span><span class="n">p</span><span class="p">))</span>
<span class="bp">self</span><span class="o">.</span><span class="n">q_layers</span><span class="o">.</span><span class="n">add</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Dense</span><span class="p">(</span><span class="n">in_units</span><span class="o">=</span><span class="n">inp</span><span class="p">,</span> <span class="n">units</span><span class="o">=</span><span class="n">out</span><span class="p">))</span>
<span class="k">def</span> <span class="nf">hybrid_forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">F</span><span class="p">,</span> <span class="n">X</span><span class="p">):</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">L2Normalization</span><span class="p">(</span><span class="n">X</span><span class="p">)</span>
<span class="k">for</span> <span class="n">i</span><span class="p">,</span> <span class="n">layer</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">q_layers</span><span class="p">):</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">layer</span><span class="p">(</span><span class="n">h</span><span class="p">)</span>
<span class="k">if</span> <span class="n">i</span> <span class="o">!=</span> <span class="nb">len</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">q_layers</span><span class="p">)</span> <span class="o">-</span> <span class="mi">1</span><span class="p">:</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">tanh</span><span class="p">(</span><span class="n">h</span><span class="p">)</span>
<span class="k">else</span><span class="p">:</span>
<span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">split</span><span class="p">(</span><span class="n">h</span><span class="p">,</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="n">num_outputs</span><span class="o">=</span><span class="mi">2</span><span class="p">)</span>
<span class="k">return</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.2-Decoder">3.2 Decoder<a class="anchor-link" href="#3.2-Decoder"> </a></h3>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">class</span> <span class="nc">Decoder</span><span class="p">(</span><span class="n">HybridBlock</span><span class="p">):</span>
<span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">p_dims</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">int</span><span class="p">],</span> <span class="n">dropout</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">float</span><span class="p">]):</span>
<span class="nb">super</span><span class="p">()</span><span class="o">.</span><span class="fm">__init__</span><span class="p">()</span>
<span class="k">with</span> <span class="bp">self</span><span class="o">.</span><span class="n">name_scope</span><span class="p">():</span>
<span class="bp">self</span><span class="o">.</span><span class="n">p_layers</span> <span class="o">=</span> <span class="n">nn</span><span class="o">.</span><span class="n">HybridSequential</span><span class="p">(</span><span class="n">prefix</span><span class="o">=</span><span class="s2">"p_net"</span><span class="p">)</span>
<span class="k">for</span> <span class="n">p</span><span class="p">,</span> <span class="n">inp</span><span class="p">,</span> <span class="n">out</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">dropout</span><span class="p">,</span> <span class="n">p_dims</span><span class="p">[:</span><span class="o">-</span><span class="mi">1</span><span class="p">],</span> <span class="n">p_dims</span><span class="p">[</span><span class="mi">1</span><span class="p">:]):</span>
<span class="bp">self</span><span class="o">.</span><span class="n">p_layers</span><span class="o">.</span><span class="n">add</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Dropout</span><span class="p">(</span><span class="n">p</span><span class="p">))</span>
<span class="bp">self</span><span class="o">.</span><span class="n">p_layers</span><span class="o">.</span><span class="n">add</span><span class="p">(</span><span class="n">nn</span><span class="o">.</span><span class="n">Dense</span><span class="p">(</span><span class="n">in_units</span><span class="o">=</span><span class="n">inp</span><span class="p">,</span> <span class="n">units</span><span class="o">=</span><span class="n">out</span><span class="p">))</span>
<span class="k">def</span> <span class="nf">hybrid_forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">F</span><span class="p">,</span> <span class="n">X</span><span class="p">):</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">X</span>
<span class="k">for</span> <span class="n">i</span><span class="p">,</span> <span class="n">layer</span> <span class="ow">in</span> <span class="nb">enumerate</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">p_layers</span><span class="p">):</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">layer</span><span class="p">(</span><span class="n">h</span><span class="p">)</span>
<span class="k">if</span> <span class="n">i</span> <span class="o">!=</span> <span class="nb">len</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">p_layers</span><span class="p">)</span> <span class="o">-</span> <span class="mi">1</span><span class="p">:</span>
<span class="n">h</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">tanh</span><span class="p">(</span><span class="n">h</span><span class="p">)</span>
<span class="k">return</span> <span class="n">h</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="3.3-The-model">3.3 The model<a class="anchor-link" href="#3.3-The-model"> </a></h3>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">class</span> <span class="nc">MultiVAE</span><span class="p">(</span><span class="n">HybridBlock</span><span class="p">):</span>
<span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span>
<span class="bp">self</span><span class="p">,</span>
<span class="n">p_dims</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">int</span><span class="p">],</span>
<span class="n">dropout_enc</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">float</span><span class="p">],</span>
<span class="n">dropout_dec</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">float</span><span class="p">],</span>
<span class="n">q_dims</span><span class="p">:</span> <span class="n">List</span><span class="p">[</span><span class="nb">int</span><span class="p">]</span> <span class="o">=</span> <span class="kc">None</span><span class="p">,</span>
<span class="p">):</span>
<span class="nb">super</span><span class="p">()</span><span class="o">.</span><span class="fm">__init__</span><span class="p">()</span>
<span class="bp">self</span><span class="o">.</span><span class="n">encode</span> <span class="o">=</span> <span class="n">VAEEncoder</span><span class="p">(</span><span class="n">q_dims</span><span class="p">,</span> <span class="n">dropout_enc</span><span class="p">)</span>
<span class="bp">self</span><span class="o">.</span><span class="n">decode</span> <span class="o">=</span> <span class="n">Decoder</span><span class="p">(</span><span class="n">p_dims</span><span class="p">,</span> <span class="n">dropout_dec</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">hybrid_forward</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">F</span><span class="p">,</span> <span class="n">X</span><span class="p">):</span>
<span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span> <span class="o">=</span> <span class="bp">self</span><span class="o">.</span><span class="n">encode</span><span class="p">(</span><span class="n">X</span><span class="p">)</span>
<span class="n">std</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">exp</span><span class="p">(</span><span class="mf">0.5</span> <span class="o">*</span> <span class="n">logvar</span><span class="p">)</span>
<span class="n">eps</span> <span class="o">=</span> <span class="n">F</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">normal_like</span><span class="p">(</span><span class="n">std</span><span class="p">)</span>
<span class="n">sampled_z</span> <span class="o">=</span> <span class="n">mu</span> <span class="o">+</span> <span class="nb">float</span><span class="p">(</span><span class="n">autograd</span><span class="o">.</span><span class="n">is_training</span><span class="p">())</span> <span class="o">*</span> <span class="n">eps</span> <span class="o">*</span> <span class="n">std</span>
<span class="k">return</span> <span class="bp">self</span><span class="o">.</span><span class="n">decode</span><span class="p">(</span><span class="n">sampled_z</span><span class="p">),</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Before I move on, let me mention (and appreciate) one of the many nice "little" things that <code>MXNet</code>'s <code>Gluon</code> has to offer. You will notice the use of <code>HybridBlock</code> and of the input <code>F</code> (the backend) when we define the forward pass, or, more precisely, the <code>hybrid_forward</code> pass. One could write a full post on the joys of <code>HybridBlock</code>s and how nicely and easily the developers of <code>Gluon</code> brought together the flexibility of imperative frameworks (e.g. <code>Pytorch</code>) and the speed of declarative frameworks (e.g. <code>Tensorflow</code>). If you want to learn the details, go <a href="https://gluon.mxnet.io/chapter07_distributed-learning/hybridize.html">here</a>, but believe me, this is <strong>FAST</strong>.</p>
<p>Having said that, there is only one more piece left to complete the model: the loss function in Eq (12).</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">vae_loss_fn</span><span class="p">(</span><span class="n">inp</span><span class="p">,</span> <span class="n">out</span><span class="p">,</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span><span class="p">,</span> <span class="n">anneal</span><span class="p">):</span>
<span class="c1"># first term</span>
<span class="n">neg_ll</span> <span class="o">=</span> <span class="o">-</span><span class="n">nd</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">log_softmax</span><span class="p">(</span><span class="n">out</span><span class="p">)</span> <span class="o">*</span> <span class="n">inp</span><span class="p">,</span> <span class="o">-</span><span class="mi">1</span><span class="p">))</span>
<span class="c1"># second term without beta</span>
<span class="n">KLD</span> <span class="o">=</span> <span class="o">-</span><span class="mf">0.5</span> <span class="o">*</span> <span class="n">nd</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span><span class="mi">1</span> <span class="o">+</span> <span class="n">logvar</span> <span class="o">-</span> <span class="n">nd</span><span class="o">.</span><span class="n">power</span><span class="p">(</span><span class="n">mu</span><span class="p">,</span> <span class="mi">2</span><span class="p">)</span> <span class="o">-</span> <span class="n">nd</span><span class="o">.</span><span class="n">exp</span><span class="p">(</span><span class="n">logvar</span><span class="p">),</span> <span class="n">axis</span><span class="o">=</span><span class="mi">1</span><span class="p">))</span>
<span class="c1"># "full" loss (anneal is beta in the expressions above)</span>
<span class="k">return</span> <span class="n">neg_ll</span> <span class="o">+</span> <span class="n">anneal</span> <span class="o">*</span> <span class="n">KLD</span>
</pre></div>
</div>
</div>
</div>
</div>
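<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>To make the loss more tangible outside of <code>MXNet</code>, here is a minimal <code>NumPy</code> sketch of the same expression (my own illustration, not part of the repo): the negative multinomial log-likelihood plus the <code>anneal</code>-weighted KL divergence.</p>

```python
import numpy as np

def log_softmax(x, axis=-1):
    # numerically stable log-softmax
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def vae_loss_np(inp, out, mu, logvar, anneal):
    # first term: negative multinomial log-likelihood
    neg_ll = -np.mean(np.sum(log_softmax(out) * inp, axis=-1))
    # second term: KL divergence with the standard Gaussian prior (anneal is beta)
    kld = -0.5 * np.mean(np.sum(1 + logvar - mu ** 2 - np.exp(logvar), axis=1))
    return neg_ll + anneal * kld
```

<p>With <code>mu = 0</code> and <code>logvar = 0</code> the KL term vanishes, a quick sanity check that the regularizer only penalizes posteriors that drift away from the prior.</p>
</div>
</div>
</div>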
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>In the <a href="https://arxiv.org/pdf/1802.05814.pdf">paper</a> the authors also use a Multinomial Denoising Autoencoder (Mult-DAE). The architecture is identical to that of the $\text{Mult-VAE}^{\text{PR}}$ apart from the fact that there is no variational aspect. I have implemented the Mult-DAE and run multiple experiments with it. However, given its simplicity and an already lengthy post, I will not discuss the corresponding code here.</p>
<p>Let's have a look at the <code>MultiVAE</code>:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">I</span> <span class="o">=</span> <span class="mi">50000</span>
<span class="n">q_dims</span> <span class="o">=</span> <span class="p">[</span><span class="n">I</span><span class="p">]</span> <span class="o">+</span> <span class="p">[</span><span class="mi">600</span><span class="p">,</span> <span class="mi">200</span><span class="p">]</span>
<span class="n">p_dims</span> <span class="o">=</span> <span class="p">[</span><span class="mi">200</span><span class="p">,</span> <span class="mi">600</span><span class="p">]</span> <span class="o">+</span> <span class="p">[</span><span class="n">I</span><span class="p">]</span>
<span class="n">dropout_enc</span> <span class="o">=</span> <span class="p">[</span><span class="mf">0.5</span><span class="p">,</span> <span class="mf">0.</span><span class="p">]</span>
<span class="n">dropout_dec</span> <span class="o">=</span> <span class="p">[</span><span class="mf">0.</span><span class="p">,</span> <span class="mf">0.</span><span class="p">]</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">vae_model</span> <span class="o">=</span> <span class="n">MultiVAE</span><span class="p">(</span>
<span class="n">p_dims</span><span class="o">=</span><span class="n">p_dims</span><span class="p">,</span>
<span class="n">q_dims</span><span class="o">=</span><span class="n">q_dims</span><span class="p">,</span>
<span class="n">dropout_enc</span><span class="o">=</span><span class="n">dropout_enc</span><span class="p">,</span>
<span class="n">dropout_dec</span><span class="o">=</span><span class="n">dropout_dec</span><span class="p">,</span>
<span class="p">)</span>
<span class="n">vae_model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>MultiVAE(
(encode): VAEEncoder(
(q_layers): HybridSequential(
(0): Dropout(p = 0.5, axes=())
(1): Dense(50000 -> 600, linear)
(2): Dropout(p = 0.0, axes=())
(3): Dense(600 -> 400, linear)
)
)
(decode): Decoder(
(p_layers): HybridSequential(
(0): Dropout(p = 0.0, axes=())
(1): Dense(200 -> 600, linear)
(2): Dropout(p = 0.0, axes=())
(3): Dense(600 -> 50000, linear)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Note that, following the original implementation, I apply dropout at the input layer for both $\text{Mult-VAE}^{\text{PR}}$ and $\text{Mult-DAE}$ to avoid overfitting. I also include the option of applying dropout throughout the network.</p>
<p>Even though I have explored different dropout values, the best way of addressing the interplay between dropout, weight decay, $\beta$, etc., and the architecture is "proper" hyperparameter optimization.</p>
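<p>As an illustration of what such a search could look like (the value grids and names below are hypothetical, not the settings used in these experiments), a naive random search over those interacting knobs might be sketched as:</p>

```python
import random

random.seed(42)

# hypothetical search space over the interacting hyperparameters mentioned above
search_space = {
    "dropout_enc": [0.3, 0.5, 0.7],
    "weight_decay": [0.0, 1e-5, 1e-4],
    "anneal_cap": [0.1, 0.2, 0.5],
}

# sample a handful of random configurations; each one would then be trained
# and scored on the validation metric (e.g. NDCG@100)
trials = [{k: random.choice(v) for k, v in search_space.items()} for _ in range(5)]
```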
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="4.-Train,-validate-and-test-model">4. Train, validate and test model<a class="anchor-link" href="#4.-Train,-validate-and-test-model"> </a></h2><p>So far we have explained (a bit) the math behind the model, prepared the data and built the (relatively) simple model. Now it is time to train it.</p>
<p>If you go to <code>prepare_data.py</code> in the <a href="https://github.com/jrzaurin/RecoTour/tree/master/Amazon/mult-vae">repo</a>, you will see that the results of the data preparation in Section 2 are saved into pickle files and are later loaded in <code>main_mxnet.py</code> by a class called <code>DataLoader</code> in the <code>utils</code> module. This is of course inspired by the <a href="https://github.com/dawenl/vae_cf/blob/master/VAE_ML20M_WWW2018.ipynb">original</a> implementation and the already mentioned <a href="https://github.com/younggyoseo/vae-cf-pytorch">Pytorch implementation</a>.</p>
<p>Let's have a look, this time using the MovieLens dataset:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">from</span> <span class="nn">utils.data_loader</span> <span class="kn">import</span> <span class="n">DataLoader</span>
<span class="kn">from</span> <span class="nn">utils.metrics</span> <span class="kn">import</span> <span class="n">NDCG_binary_at_k_batch</span><span class="p">,</span> <span class="n">Recall_at_k_batch</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">data_loader</span> <span class="o">=</span> <span class="n">DataLoader</span><span class="p">(</span><span class="n">DATA_DIR</span> <span class="o">/</span> <span class="s2">"movielens_processed"</span><span class="p">)</span>
<span class="n">n_items</span> <span class="o">=</span> <span class="n">data_loader</span><span class="o">.</span><span class="n">n_items</span>
<span class="n">train_data</span> <span class="o">=</span> <span class="n">data_loader</span><span class="o">.</span><span class="n">load_data</span><span class="p">(</span><span class="s2">"train"</span><span class="p">)</span>
<span class="n">valid_data_tr</span><span class="p">,</span> <span class="n">valid_data_te</span> <span class="o">=</span> <span class="n">data_loader</span><span class="o">.</span><span class="n">load_data</span><span class="p">(</span><span class="s2">"validation"</span><span class="p">)</span>
<span class="n">test_data_tr</span><span class="p">,</span> <span class="n">test_data_te</span> <span class="o">=</span> <span class="n">data_loader</span><span class="o">.</span><span class="n">load_data</span><span class="p">(</span><span class="s2">"test"</span><span class="p">)</span>
<span class="n">train_data</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre><116677x20108 sparse matrix of type '<class 'numpy.float32'>'
with 8538846 stored elements in Compressed Sparse Row format></pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>As you can see, the training data is the binary sparse matrix of interactions (the same applies to validation and test). Have a look at the class <code>DataLoader</code> if you want a few more details on how it is built.</p>
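<p>For intuition, a toy version of such a binary interactions matrix can be built directly with <code>scipy</code> (a made-up example, unrelated to the actual datasets):</p>

```python
import numpy as np
from scipy import sparse

# toy (user, item) interactions; implicit feedback, so all values are 1.0
users = np.array([0, 0, 1, 2, 2, 2])
items = np.array([1, 3, 0, 1, 2, 3])
X = sparse.csr_matrix(
    (np.ones_like(users, dtype=np.float32), (users, items)),
    shape=(3, 4),
)
```

<p>Each row is a user and each column an item, exactly like the <code>116677x20108</code> matrix above, only smaller.</p>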
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.1-Annealing-schedule">4.1 Annealing schedule<a class="anchor-link" href="#4.1-Annealing-schedule"> </a></h3><p>As mentioned before, we can interpret the Kullback-Leibler divergence as a regularization term. With that in mind, and in a procedure inspired by <a href="https://arxiv.org/abs/1511.06349">Samuel R. Bowman et al., 2016</a> [6], Liang and co-authors linearly anneal the KL term (i.e. slowly increase $\beta$) over a large number of training steps.</p>
<p>More specifically, the authors anneal the KL divergence all the way to $\beta$ = 1, reaching that value at around 80% of the total number of epochs used during the process. They then identify the best performing $\beta$ based on the peak validation metric, and retrain the model with the same annealing schedule, but stop increasing $\beta$ after reaching that value.</p>
<p>If we go to <a href="https://github.com/dawenl/vae_cf/blob/master/VAE_ML20M_WWW2018.ipynb">their implementation</a>, these are the specifics of the process: using a batch size of 500 they set the total number of annealing steps to 200000. Given that the training dataset has a size of 116677, every epoch has 234 training steps. Their <code>anneal_cap</code> value, i.e. the maximum annealing reached during training, is set to 0.2, and during training they use the following approach:</p>
<div class="highlight"><pre><span></span><span class="k">if</span> <span class="n">total_anneal_steps</span> <span class="o">></span> <span class="mi">0</span><span class="p">:</span>
<span class="n">anneal</span> <span class="o">=</span> <span class="nb">min</span><span class="p">(</span><span class="n">anneal_cap</span><span class="p">,</span> <span class="mf">1.</span> <span class="o">*</span> <span class="n">update_count</span> <span class="o">/</span> <span class="n">total_anneal_steps</span><span class="p">)</span>
<span class="k">else</span><span class="p">:</span>
<span class="n">anneal</span> <span class="o">=</span> <span class="n">anneal_cap</span>
</pre></div>
<p>where <code>update_count</code> increases by 1 every training step/batch. They use 200 epochs; therefore, if we do the math, <code>anneal</code> will stop increasing when <code>update_count / total_anneal_steps</code> = 0.2, i.e. after 40000 training steps, or, in other words, after around 170 epochs ($\sim$85% of the total number of epochs).</p>
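<p>The arithmetic above can be verified in a couple of lines (numbers taken from the text: a batch size of 500, 116677 training users and 200000 total annealing steps):</p>

```python
def anneal_at(update_count, total_anneal_steps, anneal_cap=0.2):
    # linear warm-up of beta, capped at anneal_cap (mirrors the snippet above)
    if total_anneal_steps > 0:
        return min(anneal_cap, update_count / total_anneal_steps)
    return anneal_cap

steps_per_epoch = -(-116677 // 500)        # ceiling division: 234 batches per epoch
steps_to_cap = 0.2 * 200000                # beta stops growing after 40000 steps
epochs_to_cap = steps_to_cap / steps_per_epoch  # roughly 171 epochs
```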
<p>With that in mind, my implementation looks like this:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">batch_size</span> <span class="o">=</span> <span class="mi">500</span>
<span class="n">anneal_epochs</span> <span class="o">=</span> <span class="kc">None</span>
<span class="n">anneal_cap</span> <span class="o">=</span> <span class="mf">0.2</span>
<span class="n">constant_anneal</span> <span class="o">=</span> <span class="kc">False</span>
<span class="n">n_epochs</span> <span class="o">=</span> <span class="mi">200</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">training_steps</span> <span class="o">=</span> <span class="nb">len</span><span class="p">(</span><span class="nb">range</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">train_data</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">],</span> <span class="n">batch_size</span><span class="p">))</span>
<span class="k">try</span><span class="p">:</span>
<span class="n">total_anneal_steps</span> <span class="o">=</span> <span class="p">(</span>
<span class="n">training_steps</span> <span class="o">*</span> <span class="p">(</span><span class="n">n_epochs</span> <span class="o">-</span> <span class="nb">int</span><span class="p">(</span><span class="n">n_epochs</span> <span class="o">*</span> <span class="mf">0.2</span><span class="p">))</span>
<span class="p">)</span> <span class="o">/</span> <span class="n">anneal_cap</span>
<span class="k">except</span> <span class="ne">ZeroDivisionError</span><span class="p">:</span>
<span class="k">assert</span> <span class="p">(</span>
<span class="n">constant_anneal</span>
<span class="p">),</span> <span class="s2">"if 'anneal_cap' is set to 0.0, 'constant_anneal' must be set to 'True'"</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Once <code>total_anneal_steps</code> is set, the only thing left is to define the training and validation steps. If you are familiar with <code>Pytorch</code>, the next two functions will look very familiar:</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.2-Train-Step">4.2 Train Step<a class="anchor-link" href="#4.2-Train-Step"> </a></h3>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">train_step</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">optimizer</span><span class="p">,</span> <span class="n">data</span><span class="p">,</span> <span class="n">epoch</span><span class="p">):</span>
<span class="n">running_loss</span> <span class="o">=</span> <span class="mf">0.0</span>
<span class="k">global</span> <span class="n">update_count</span>
<span class="n">N</span> <span class="o">=</span> <span class="n">data</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
<span class="n">idxlist</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span><span class="nb">range</span><span class="p">(</span><span class="n">N</span><span class="p">))</span>
<span class="n">np</span><span class="o">.</span><span class="n">random</span><span class="o">.</span><span class="n">shuffle</span><span class="p">(</span><span class="n">idxlist</span><span class="p">)</span>
<span class="n">training_steps</span> <span class="o">=</span> <span class="nb">len</span><span class="p">(</span><span class="nb">range</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">N</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">))</span>
<span class="k">with</span> <span class="n">trange</span><span class="p">(</span><span class="n">training_steps</span><span class="p">)</span> <span class="k">as</span> <span class="n">t</span><span class="p">:</span>
<span class="k">for</span> <span class="n">batch_idx</span><span class="p">,</span> <span class="n">start_idx</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">t</span><span class="p">,</span> <span class="nb">range</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">N</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">)):</span>
<span class="n">t</span><span class="o">.</span><span class="n">set_description</span><span class="p">(</span><span class="s2">"epoch: </span><span class="si">{}</span><span class="s2">"</span><span class="o">.</span><span class="n">format</span><span class="p">(</span><span class="n">epoch</span> <span class="o">+</span> <span class="mi">1</span><span class="p">))</span>
<span class="n">end_idx</span> <span class="o">=</span> <span class="nb">min</span><span class="p">(</span><span class="n">start_idx</span> <span class="o">+</span> <span class="n">batch_size</span><span class="p">,</span> <span class="n">N</span><span class="p">)</span>
<span class="n">X_inp</span> <span class="o">=</span> <span class="n">data</span><span class="p">[</span><span class="n">idxlist</span><span class="p">[</span><span class="n">start_idx</span><span class="p">:</span><span class="n">end_idx</span><span class="p">]]</span>
<span class="n">X_inp</span> <span class="o">=</span> <span class="n">nd</span><span class="o">.</span><span class="n">array</span><span class="p">(</span><span class="n">X_inp</span><span class="o">.</span><span class="n">toarray</span><span class="p">())</span><span class="o">.</span><span class="n">as_in_context</span><span class="p">(</span><span class="n">ctx</span><span class="p">)</span>
<span class="k">if</span> <span class="n">constant_anneal</span><span class="p">:</span>
<span class="n">anneal</span> <span class="o">=</span> <span class="n">anneal_cap</span>
<span class="k">else</span><span class="p">:</span>
<span class="n">anneal</span> <span class="o">=</span> <span class="nb">min</span><span class="p">(</span><span class="n">anneal_cap</span><span class="p">,</span> <span class="n">update_count</span> <span class="o">/</span> <span class="n">total_anneal_steps</span><span class="p">)</span>
<span class="n">update_count</span> <span class="o">+=</span> <span class="mi">1</span>
<span class="k">with</span> <span class="n">autograd</span><span class="o">.</span><span class="n">record</span><span class="p">():</span>
<span class="k">if</span> <span class="n">model</span><span class="o">.</span><span class="vm">__class__</span><span class="o">.</span><span class="vm">__name__</span> <span class="o">==</span> <span class="s2">"MultiVAE"</span><span class="p">:</span>
<span class="n">X_out</span><span class="p">,</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">X_inp</span><span class="p">)</span>
<span class="n">loss</span> <span class="o">=</span> <span class="n">vae_loss_fn</span><span class="p">(</span><span class="n">X_inp</span><span class="p">,</span> <span class="n">X_out</span><span class="p">,</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span><span class="p">,</span> <span class="n">anneal</span><span class="p">)</span>
<span class="n">train_step</span><span class="o">.</span><span class="n">anneal</span> <span class="o">=</span> <span class="n">anneal</span>
<span class="k">elif</span> <span class="n">model</span><span class="o">.</span><span class="vm">__class__</span><span class="o">.</span><span class="vm">__name__</span> <span class="o">==</span> <span class="s2">"MultiDAE"</span><span class="p">:</span>
<span class="n">X_out</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">X_inp</span><span class="p">)</span>
<span class="n">loss</span> <span class="o">=</span> <span class="o">-</span><span class="n">nd</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">log_softmax</span><span class="p">(</span><span class="n">X_out</span><span class="p">)</span> <span class="o">*</span> <span class="n">X_inp</span><span class="p">,</span> <span class="o">-</span><span class="mi">1</span><span class="p">))</span>
<span class="n">loss</span><span class="o">.</span><span class="n">backward</span><span class="p">()</span>
<span class="n">trainer</span><span class="o">.</span><span class="n">step</span><span class="p">(</span><span class="n">X_inp</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">])</span>
<span class="n">running_loss</span> <span class="o">+=</span> <span class="n">loss</span><span class="o">.</span><span class="n">asscalar</span><span class="p">()</span>
<span class="n">avg_loss</span> <span class="o">=</span> <span class="n">running_loss</span> <span class="o">/</span> <span class="p">(</span><span class="n">batch_idx</span> <span class="o">+</span> <span class="mi">1</span><span class="p">)</span>
<span class="n">t</span><span class="o">.</span><span class="n">set_postfix</span><span class="p">(</span><span class="n">loss</span><span class="o">=</span><span class="n">avg_loss</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.3-Validation-step">4.3 Validation step<a class="anchor-link" href="#4.3-Validation-step"> </a></h3>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="k">def</span> <span class="nf">eval_step</span><span class="p">(</span><span class="n">data_tr</span><span class="p">,</span> <span class="n">data_te</span><span class="p">,</span> <span class="n">data_type</span><span class="o">=</span><span class="s2">"valid"</span><span class="p">):</span>
<span class="n">running_loss</span> <span class="o">=</span> <span class="mf">0.0</span>
<span class="n">eval_idxlist</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span><span class="nb">range</span><span class="p">(</span><span class="n">data_tr</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">]))</span>
<span class="n">eval_N</span> <span class="o">=</span> <span class="n">data_tr</span><span class="o">.</span><span class="n">shape</span><span class="p">[</span><span class="mi">0</span><span class="p">]</span>
<span class="n">eval_steps</span> <span class="o">=</span> <span class="nb">len</span><span class="p">(</span><span class="nb">range</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">eval_N</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">))</span>
<span class="n">n100_list</span><span class="p">,</span> <span class="n">r20_list</span><span class="p">,</span> <span class="n">r50_list</span> <span class="o">=</span> <span class="p">[],</span> <span class="p">[],</span> <span class="p">[]</span>
<span class="k">with</span> <span class="n">trange</span><span class="p">(</span><span class="n">eval_steps</span><span class="p">)</span> <span class="k">as</span> <span class="n">t</span><span class="p">:</span>
<span class="k">for</span> <span class="n">batch_idx</span><span class="p">,</span> <span class="n">start_idx</span> <span class="ow">in</span> <span class="nb">zip</span><span class="p">(</span><span class="n">t</span><span class="p">,</span> <span class="nb">range</span><span class="p">(</span><span class="mi">0</span><span class="p">,</span> <span class="n">eval_N</span><span class="p">,</span> <span class="n">batch_size</span><span class="p">)):</span>
<span class="n">t</span><span class="o">.</span><span class="n">set_description</span><span class="p">(</span><span class="n">data_type</span><span class="p">)</span>
<span class="n">end_idx</span> <span class="o">=</span> <span class="nb">min</span><span class="p">(</span><span class="n">start_idx</span> <span class="o">+</span> <span class="n">batch_size</span><span class="p">,</span> <span class="n">eval_N</span><span class="p">)</span>
<span class="n">X_tr</span> <span class="o">=</span> <span class="n">data_tr</span><span class="p">[</span><span class="n">eval_idxlist</span><span class="p">[</span><span class="n">start_idx</span><span class="p">:</span><span class="n">end_idx</span><span class="p">]]</span>
<span class="n">X_te</span> <span class="o">=</span> <span class="n">data_te</span><span class="p">[</span><span class="n">eval_idxlist</span><span class="p">[</span><span class="n">start_idx</span><span class="p">:</span><span class="n">end_idx</span><span class="p">]]</span>
<span class="n">X_tr_inp</span> <span class="o">=</span> <span class="n">nd</span><span class="o">.</span><span class="n">array</span><span class="p">(</span><span class="n">X_tr</span><span class="o">.</span><span class="n">toarray</span><span class="p">())</span><span class="o">.</span><span class="n">as_in_context</span><span class="p">(</span><span class="n">ctx</span><span class="p">)</span>
<span class="k">with</span> <span class="n">autograd</span><span class="o">.</span><span class="n">predict_mode</span><span class="p">():</span>
<span class="k">if</span> <span class="n">model</span><span class="o">.</span><span class="vm">__class__</span><span class="o">.</span><span class="vm">__name__</span> <span class="o">==</span> <span class="s2">"MultiVAE"</span><span class="p">:</span>
<span class="n">X_out</span><span class="p">,</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">X_tr_inp</span><span class="p">)</span>
<span class="n">loss</span> <span class="o">=</span> <span class="n">vae_loss_fn</span><span class="p">(</span><span class="n">X_tr_inp</span><span class="p">,</span> <span class="n">X_out</span><span class="p">,</span> <span class="n">mu</span><span class="p">,</span> <span class="n">logvar</span><span class="p">,</span> <span class="n">train_step</span><span class="o">.</span><span class="n">anneal</span><span class="p">)</span>
<span class="k">elif</span> <span class="n">model</span><span class="o">.</span><span class="vm">__class__</span><span class="o">.</span><span class="vm">__name__</span> <span class="o">==</span> <span class="s2">"MultiDAE"</span><span class="p">:</span>
<span class="n">X_out</span> <span class="o">=</span> <span class="n">model</span><span class="p">(</span><span class="n">X_tr_inp</span><span class="p">)</span>
<span class="n">loss</span> <span class="o">=</span> <span class="o">-</span><span class="n">nd</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">sum</span><span class="p">(</span><span class="n">nd</span><span class="o">.</span><span class="n">log_softmax</span><span class="p">(</span><span class="n">X_out</span><span class="p">)</span> <span class="o">*</span> <span class="n">X_tr_inp</span><span class="p">,</span> <span class="o">-</span><span class="mi">1</span><span class="p">))</span>
<span class="n">running_loss</span> <span class="o">+=</span> <span class="n">loss</span><span class="o">.</span><span class="n">asscalar</span><span class="p">()</span>
<span class="n">avg_loss</span> <span class="o">=</span> <span class="n">running_loss</span> <span class="o">/</span> <span class="p">(</span><span class="n">batch_idx</span> <span class="o">+</span> <span class="mi">1</span><span class="p">)</span>
<span class="c1"># Exclude examples from training set</span>
<span class="n">X_out</span> <span class="o">=</span> <span class="n">X_out</span><span class="o">.</span><span class="n">asnumpy</span><span class="p">()</span>
<span class="n">X_out</span><span class="p">[</span><span class="n">X_tr</span><span class="o">.</span><span class="n">nonzero</span><span class="p">()]</span> <span class="o">=</span> <span class="o">-</span><span class="n">np</span><span class="o">.</span><span class="n">inf</span>
<span class="n">n100</span> <span class="o">=</span> <span class="n">NDCG_binary_at_k_batch</span><span class="p">(</span><span class="n">X_out</span><span class="p">,</span> <span class="n">X_te</span><span class="p">,</span> <span class="n">k</span><span class="o">=</span><span class="mi">100</span><span class="p">)</span>
<span class="n">r20</span> <span class="o">=</span> <span class="n">Recall_at_k_batch</span><span class="p">(</span><span class="n">X_out</span><span class="p">,</span> <span class="n">X_te</span><span class="p">,</span> <span class="n">k</span><span class="o">=</span><span class="mi">20</span><span class="p">)</span>
<span class="n">r50</span> <span class="o">=</span> <span class="n">Recall_at_k_batch</span><span class="p">(</span><span class="n">X_out</span><span class="p">,</span> <span class="n">X_te</span><span class="p">,</span> <span class="n">k</span><span class="o">=</span><span class="mi">50</span><span class="p">)</span>
<span class="n">n100_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">n100</span><span class="p">)</span>
<span class="n">r20_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">r20</span><span class="p">)</span>
<span class="n">r50_list</span><span class="o">.</span><span class="n">append</span><span class="p">(</span><span class="n">r50</span><span class="p">)</span>
<span class="n">t</span><span class="o">.</span><span class="n">set_postfix</span><span class="p">(</span><span class="n">loss</span><span class="o">=</span><span class="n">avg_loss</span><span class="p">)</span>
<span class="n">n100_list</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">concatenate</span><span class="p">(</span><span class="n">n100_list</span><span class="p">)</span>
<span class="n">r20_list</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">concatenate</span><span class="p">(</span><span class="n">r20_list</span><span class="p">)</span>
<span class="n">r50_list</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">concatenate</span><span class="p">(</span><span class="n">r50_list</span><span class="p">)</span>
<span class="k">return</span> <span class="n">avg_loss</span><span class="p">,</span> <span class="n">np</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">n100_list</span><span class="p">),</span> <span class="n">np</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">r20_list</span><span class="p">),</span> <span class="n">np</span><span class="o">.</span><span class="n">mean</span><span class="p">(</span><span class="n">r50_list</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>I have discussed the evaluation metrics (NDCG@k and Recall@k) at length in a number of notebooks in this <a href="https://github.com/jrzaurin/RecoTour">repo</a> (and the corresponding posts). With that in mind, and to avoid making this an even more "infinite notebook", I will not describe their implementation here. If you want details on those evaluation metrics, please go to the <code>metrics.py</code> module in <code>utils</code>. The code there is a very small adaptation of the one in the <a href="https://github.com/dawenl/vae_cf/blob/master/VAE_ML20M_WWW2018.ipynb">original implementation</a>.</p>
</div>
</div>
</div>
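For reference, here is a minimal, self-contained sketch of batched NDCG@k and Recall@k, closely following the original implementation that <code>metrics.py</code> adapts (the function names here are illustrative and do not necessarily match those in <code>utils</code>):

```python
import numpy as np
from scipy import sparse


def ndcg_binary_at_k_batch(X_pred, heldout_batch, k=100):
    """NDCG@k for a batch of users with binary relevance.

    X_pred: (batch, n_items) dense scores; heldout_batch: sparse 0/1 matrix.
    """
    batch_users = X_pred.shape[0]
    # partial sort: indices of the k highest-scored items, then order those k
    idx_topk_part = np.argpartition(-X_pred, k, axis=1)
    topk_part = X_pred[np.arange(batch_users)[:, None], idx_topk_part[:, :k]]
    idx_part = np.argsort(-topk_part, axis=1)
    idx_topk = idx_topk_part[np.arange(batch_users)[:, None], idx_part]
    # discounted gains 1 / log2(rank + 2) for ranks 0..k-1
    tp = 1.0 / np.log2(np.arange(2, k + 2))
    dcg = (
        heldout_batch[np.arange(batch_users)[:, None], idx_topk].toarray() * tp
    ).sum(axis=1)
    # ideal DCG: all held-out items ranked at the very top
    idcg = np.array([tp[: min(n, k)].sum() for n in heldout_batch.getnnz(axis=1)])
    return dcg / idcg


def recall_at_k_batch(X_pred, heldout_batch, k=50):
    batch_users = X_pred.shape[0]
    idx = np.argpartition(-X_pred, k, axis=1)
    X_pred_binary = np.zeros_like(X_pred, dtype=bool)
    X_pred_binary[np.arange(batch_users)[:, None], idx[:, :k]] = True
    X_true_binary = (heldout_batch > 0).toarray()
    hits = np.logical_and(X_true_binary, X_pred_binary).sum(axis=1).astype(np.float32)
    # normalize by the best achievable number of hits
    return hits / np.minimum(k, X_true_binary.sum(axis=1))
```

Both functions take the dense score matrix that comes out of the model (with training items already masked to <code>-np.inf</code>, as in <code>eval_step</code> above) and the sparse held-out interaction matrix, and return one value per user.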
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h3 id="4.4-Running-the-process">4.4 Running the process<a class="anchor-link" href="#4.4-Running-the-process"> </a></h3><p>Let's define the model, prepare the setup and run a small sample (of course, ignore the printed results; I only want to illustrate how to run the model).</p>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span> <span class="o">=</span> <span class="n">MultiVAE</span><span class="p">(</span>
<span class="n">p_dims</span><span class="o">=</span><span class="p">[</span><span class="mi">200</span><span class="p">,</span> <span class="mi">600</span><span class="p">,</span> <span class="n">n_items</span><span class="p">],</span>
<span class="n">q_dims</span><span class="o">=</span><span class="p">[</span><span class="n">n_items</span><span class="p">,</span> <span class="mi">600</span><span class="p">,</span> <span class="mi">200</span><span class="p">],</span>
<span class="n">dropout_enc</span><span class="o">=</span><span class="p">[</span><span class="mf">0.5</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">],</span>
<span class="n">dropout_dec</span><span class="o">=</span><span class="p">[</span><span class="mf">0.0</span><span class="p">,</span> <span class="mf">0.0</span><span class="p">],</span>
<span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">model</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_text output_subarea output_execute_result">
<pre>MultiVAE(
(encode): VAEEncoder(
(q_layers): HybridSequential(
(0): Dropout(p = 0.5, axes=())
(1): Dense(20108 -> 600, linear)
(2): Dropout(p = 0.0, axes=())
(3): Dense(600 -> 400, linear)
)
)
(decode): Decoder(
(p_layers): HybridSequential(
(0): Dropout(p = 0.0, axes=())
(1): Dense(200 -> 600, linear)
(2): Dropout(p = 0.0, axes=())
(3): Dense(600 -> 20108, linear)
)
)
)</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">ctx</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">gpu</span><span class="p">()</span> <span class="k">if</span> <span class="n">mx</span><span class="o">.</span><span class="n">context</span><span class="o">.</span><span class="n">num_gpus</span><span class="p">()</span> <span class="k">else</span> <span class="n">mx</span><span class="o">.</span><span class="n">cpu</span><span class="p">()</span>
<span class="n">model</span><span class="o">.</span><span class="n">initialize</span><span class="p">(</span><span class="n">mx</span><span class="o">.</span><span class="n">init</span><span class="o">.</span><span class="n">Xavier</span><span class="p">(),</span> <span class="n">ctx</span><span class="o">=</span><span class="n">ctx</span><span class="p">)</span>
<span class="n">model</span><span class="o">.</span><span class="n">hybridize</span><span class="p">()</span>
<span class="n">optimizer</span> <span class="o">=</span> <span class="n">mx</span><span class="o">.</span><span class="n">optimizer</span><span class="o">.</span><span class="n">Adam</span><span class="p">(</span><span class="n">learning_rate</span><span class="o">=</span><span class="mf">0.001</span><span class="p">,</span> <span class="n">wd</span><span class="o">=</span><span class="mf">0.</span><span class="p">)</span>
<span class="n">trainer</span> <span class="o">=</span> <span class="n">gluon</span><span class="o">.</span><span class="n">Trainer</span><span class="p">(</span><span class="n">model</span><span class="o">.</span><span class="n">collect_params</span><span class="p">(),</span> <span class="n">optimizer</span><span class="o">=</span><span class="n">optimizer</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing code_cell rendered">
<div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">stop_step</span> <span class="o">=</span> <span class="mi">0</span>
<span class="n">update_count</span> <span class="o">=</span> <span class="mi">0</span>
<span class="n">eval_every</span> <span class="o">=</span> <span class="mi">1</span>
<span class="n">stop</span> <span class="o">=</span> <span class="kc">False</span>
<span class="k">for</span> <span class="n">epoch</span> <span class="ow">in</span> <span class="nb">range</span><span class="p">(</span><span class="mi">1</span><span class="p">):</span>
<span class="n">train_step</span><span class="p">(</span><span class="n">model</span><span class="p">,</span> <span class="n">optimizer</span><span class="p">,</span> <span class="n">train_data</span><span class="p">[:</span><span class="mi">2000</span><span class="p">],</span> <span class="n">epoch</span><span class="p">)</span>
<span class="k">if</span> <span class="n">epoch</span> <span class="o">%</span> <span class="n">eval_every</span> <span class="o">==</span> <span class="p">(</span><span class="n">eval_every</span> <span class="o">-</span> <span class="mi">1</span><span class="p">):</span>
<span class="n">val_loss</span><span class="p">,</span> <span class="n">n100</span><span class="p">,</span> <span class="n">r20</span><span class="p">,</span> <span class="n">r50</span> <span class="o">=</span> <span class="n">eval_step</span><span class="p">(</span><span class="n">valid_data_tr</span><span class="p">[:</span><span class="mi">1000</span><span class="p">],</span> <span class="n">valid_data_te</span><span class="p">[:</span><span class="mi">1000</span><span class="p">])</span>
<span class="nb">print</span><span class="p">(</span><span class="s2">"="</span> <span class="o">*</span> <span class="mi">80</span><span class="p">)</span>
<span class="nb">print</span><span class="p">(</span>
<span class="s2">"| valid loss </span><span class="si">{:4.3f}</span><span class="s2"> | n100 </span><span class="si">{:4.3f}</span><span class="s2"> | r20 </span><span class="si">{:4.3f}</span><span class="s2"> | "</span>
<span class="s2">"r50 </span><span class="si">{:4.3f}</span><span class="s2">"</span><span class="o">.</span><span class="n">format</span><span class="p">(</span><span class="n">val_loss</span><span class="p">,</span> <span class="n">n100</span><span class="p">,</span> <span class="n">r20</span><span class="p">,</span> <span class="n">r50</span><span class="p">)</span>
<span class="p">)</span>
<span class="nb">print</span><span class="p">(</span><span class="s2">"="</span> <span class="o">*</span> <span class="mi">80</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_subarea output_stream output_stderr output_text">
<pre>epoch: 1: 100%|██████████| 4/4 [00:04<00:00, 1.25s/it, loss=737]
valid: 100%|██████████| 2/2 [00:01<00:00, 1.12it/s, loss=562]</pre>
</div>
</div>
<div class="output_area">
<div class="output_subarea output_stream output_stdout output_text">
<pre>================================================================================
| valid loss 561.928 | n100 0.006 | r20 0.003 | r50 0.006
================================================================================
</pre>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>And with a few more bells and whistles (e.g. an optional learning rate scheduler, early stopping, etc.) this is exactly the code that you will find in <code>main_mxnet.py</code>.</p>
<p>Before I move on to the next, final section, just a quick comment about something I often find in scientific publications: once the best hyperparameters have been found on the validation set, the model is tested directly on the test set.</p>
<div class="highlight"><pre><span></span><span class="c1"># Run on test data with best model</span>
<span class="n">model</span><span class="o">.</span><span class="n">load_parameters</span><span class="p">(</span><span class="nb">str</span><span class="p">(</span><span class="n">model_weights</span> <span class="o">/</span> <span class="p">(</span><span class="n">model_name</span> <span class="o">+</span> <span class="s2">".params"</span><span class="p">)),</span> <span class="n">ctx</span><span class="o">=</span><span class="n">ctx</span><span class="p">)</span>
<span class="n">test_loss</span><span class="p">,</span> <span class="n">n100</span><span class="p">,</span> <span class="n">r20</span><span class="p">,</span> <span class="n">r50</span> <span class="o">=</span> <span class="n">eval_step</span><span class="p">(</span>
<span class="n">test_data_tr</span><span class="p">,</span> <span class="n">test_data_te</span><span class="p">,</span> <span class="n">data_type</span><span class="o">=</span><span class="s2">"test"</span>
<span class="p">)</span>
</pre></div>
<p>In "real-life" scenarios there would be one additional step: merging the train and validation sets, re-training the model with the best hyperparameters and then testing on the test set. In any case, since my goal here is not to build a real-life system, I will follow the same procedure as the original <a href="https://github.com/dawenl/vae_cf/blob/master/VAE_ML20M_WWW2018.ipynb">implementation</a>.</p>
<p>Time now to have a look at the results obtained with both <code>Pytorch</code> and <code>Mxnet</code>.</p>
</div>
</div>
</div>
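That extra "real-life" step amounts to stacking the two interaction matrices before the final refit. A sketch (the matrix names and shapes here are hypothetical stand-ins for the actual user-item sparse matrices, not variables from the scripts):

```python
from scipy import sparse

# Hypothetical user-item matrices: 100 train users, 20 validation users, 50 items
train_data = sparse.random(100, 50, density=0.1, format="csr", random_state=0)
valid_data_tr = sparse.random(20, 50, density=0.1, format="csr", random_state=1)

# Stack train and validation interactions row-wise, then refit the model with
# the best hyperparameters before the final evaluation on the test set
full_train = sparse.vstack([train_data, valid_data_tr], format="csr")
```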
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<h2 id="5.-Summary-of-the-results">5. Summary of the results<a class="anchor-link" href="#5.-Summary-of-the-results"> </a></h2><p>Let me first recap the annealing schedule described in Section 4.1. Basically, we gradually anneal to $\beta = 1$, reaching that value at around 80% of the total number of epochs, and record the best anneal parameter ($\beta_{best}$). We then apply the same annealing schedule, but this time annealing to $\beta_{best}$, again reaching that value at around 80% of the total number of epochs.</p>
</div>
</div>
</div>
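The schedule itself boils down to a couple of lines. A minimal sketch, assuming a per-batch <code>update_count</code> and a pre-computed <code>total_anneal_steps</code> (both names are assumptions, not necessarily those used in the scripts):

```python
def kl_anneal(update_count, total_anneal_steps, anneal_cap=1.0):
    """Linearly increase beta from 0, capping it at anneal_cap.

    With total_anneal_steps set to ~80% of all training updates,
    anneal_cap is reached at around 80% of training.
    """
    if total_anneal_steps > 0:
        return min(anneal_cap, update_count / total_anneal_steps)
    return anneal_cap
```

In the first run <code>anneal_cap</code> is 1; in the second run it is set to $\beta_{best}$, the value that gave the best validation NDCG@100.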
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="kn">from</span> <span class="nn">utils.plot_utils</span> <span class="kn">import</span> <span class="n">plot_anneal_schedule</span><span class="p">,</span> <span class="n">find_best</span><span class="p">,</span> <span class="n">plot_metric_vs_loss</span><span class="p">,</span> <span class="n">plot_ndcg_vs_pdims</span>
<span class="n">plot_anneal_schedule</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA4YAAAEpCAYAAADPpjwfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzdd5xU1d348c+d3rf3xrKwVOlNF0EFUYGgaEw00Zj8jImJz5OoifGxoGiwRIkmxsQSY4OoMSoBuxKlKtJ7W7awbC8zO73P+f0xMLjSFlnYBc779ZrXlFvme87MnHO/9557RxFCCCRJkiRJkiRJkqSzlqq7A5AkSZIkSZIkSZK6l0wMJUmSJEmSJEmSznIyMZQkSZIkSZIkSTrLycRQkiRJkiRJkiTpLCcTQ0mSJEmSJEmSpLOcTAwlSZIkSZIkSZLOcjIxlCRJkqTDKC8vZ8KECVxzzTUEAoFOLzdnzhxGjx5Nv379uOiii05ihJIkSZLUdWRiKEmSJEmHsWzZMpqamtiwYQN79uzp9HL33nsvkyZNOomRSZIkSVLX03R3AJIkSZLUE02dOpVly5aRk5PDgAEDujscSZIkSTqpZGIoSZIk9TjPP/88L730Ena7HYD/+7//47PPPmPz5s0MGzaMxx9/nBUrVvDaa69RVVXFBRdcwO9//3tMJhMAb7/9Nv/85z9pbm4mFotRWFjIjTfeyMUXX8y8efP485//jNvtRqPRMG3aNB577DGWLVvGPffcQygU4rbbbuP555+nrq4OgJkzZzJ27FgAqqureeKJJ1i/fj0GgwGLxcIPf/hDvvvd76IoylHL9d577/Hiiy9it9tRqVT06dOH3/72t5SWluLxeLjmmmsoLy8H4Pvf/z7BYJAVK1agKAo/+tGP+NnPfpZY12uvvcbrr7+O0+kkNTWVaDTKgAEDeOyxx7r885AkSZLOAkKSJEmSeqCnnnpKlJaWitLSUvHOO+8IIYS48sorRWlpqbj00kvF6tWrRSgUEmVlZaK0tFQ8//zzQggh/vrXv4rS0lJx4YUXCo/HI9ra2sSYMWNEaWmpeOutt4QQQrz00kuitLRU9OvXT+zbty/xnnfffbd47733hBBCrFq1KvH+q1atEkII0dTUJMaNGydKS0vF559/LkKhkLj44otFaWmpeOWVVxLrufPOOxMxHPDmm2+K0tJSMXHiRBEIBBLrHz58uGhsbEzMd+A9p0yZInw+n9i2bVvitc2bNwshhFi+fLkoLS0VkydPFsFgUAghxPbt28WgQYO6/HOQJEmSzg7yHENJkiSpx7v44osB6NWrFwB2u53Ro0ej1WrJz88HYNOmTfh8Pp577jkAxo0bh9lsJjU1lSFDhgDw5JNPIoRg+vTpaDQahBAsXLgQgEAgwMqVK5k8efIR4/jnP/+ZOIo5fPhwtFotAwcOBOJHOY/mL3/5CwADBgxAr9czdOhQALxeL/Pnzz9k/vHjx2M0GiktLU28tnHjRgB27twJQEtLC4sXL8btdjNgwAD++Mc/HjUGSZIkSToSOZRUkiRJ6vEsFgsAWq0WALPZnJh24DWn08mePXsSVxBNSkpKzJOcnAzEE6nGxkZycnIoKytj6dKlLFq0iFtuuYXFixdTVlaGXq8/Yhzbt29PPL7++utRFAW32016ejoAHo8nEevXtbW10dTUBMDatWu5/PLLARLLtba2HrLMgfg1moNdtdPpBGDMmDGoVCr8fj+33XYbarWacePG8Ytf/OKIsUuSJEnS0cjEUJIkSTorXXHFFSxdupTq6mo2btzIggULjplYCSESj+fNm9ch+ezscqNGjeKZZ5455jKHO1/xwHqGDBnCa6+9xrx581i5ciXt7e2sXLmS1atX8/7771NUVNSpuCRJkiTpADmUVJIkSTpj9OnTB4PBABw8ugbgcDgAyMjIIDs7G4BJkyZhs9mA
+DDQffv2MXLkyKOuv3///onHDQ0NicerVq3itttuO+Jy6enpZGRkANDY2Nhh2iOPPMKiRYuOWbavW716NZFIhCeeeIIvv/wyMYQ0HA6zY8eO41qXJEmSJIFMDCVJkqQziMlk4uabbwbiyZrX68Vut7N582YAbr/99sSROL1ez6WXXgrAf//7X6ZPn37Mq4pee+21WK1WAF5//XUgfm7iM888w4gRI4667E033QTAjh072LBhQ+Lx+++/z+jRo4+rnNu3b+cPf/gDbrcblUqV+DuNr5/zKEmSJEnHQxFfH98iSZIkST3AN/+uYvz48Vx00UX86U9/wuVyodVqmTp1Kjk5Obzyyiv4/X6MRiMTJkzgqaee4q233uK111477N9VfN26dev4wQ9+AMCnn35KYWEhACtXrmTWrFmJv6vIy8vjoYce4txzz2XXrl08+eSTrF27FpvNRkZGBlOnTuWGG24AYM6cOSxcuDAR5+DBg3njjTcA+Pe//828efPYu3cvvXv3Ji0tjV/96lcMGTLkkL+rSE1NZdasWbz55pt8+eWXidd+8pOfMHjwYJ555hnq6uowGAy0trZSVFTEzTffzKRJk07ypyNJkiSdiWRiKEmSJEmSJEmSdJaTQ0klSZIkSZIkSZLOcjIxlCRJkiRJkiRJOsvJxFCSJEmSJEmSJOksJxNDSZIkSZIkSZKks5xMDCVJkiRJkiRJks5yMjGUpB5g5cqVXH755fTr14/rrruOa6+9lmnTpvHqq692a1ybN2/m8ssv56KLLurWOCRJkiQJ4n8rc6C/fPfddw+Z7vF4GDlyJBdeeCFPPfVUN0QoSacvmRhKUg9QVlbG3XffDcDLL7/M66+/zp/+9Ccee+wxVq5cedRl+/XrR21t7UmJa8iQIYm4JEmSJKm7XXzxxdx9990YDAbmzZt3yPT//Oc/RCIRZsyYwa9+9auTFsdFF13EV199ddLWL0ndQSaGktRD9e3bl9LSUpYvX97doUiSJElSjzJ16lS2bt3K5s2bE68JIVi5ciXnnHNON0YmSacvTXcHIEnSkXk8Hj777DNee+01vvOd7/DQQw/xwgsv8Pe//52ZM2eyZ88eAG6//Xb0ej1z584lMzOTf/zjH3zyySeo1Wp69erFPffcg8ViYfbs2bz33ntcd9117Nmzh61bt3LVVVfxv//7v7z44ot89NFH6PV6DAYDt956K4MGDUrE8sILL7BkyRKcTidPPfUUxcXF3VUtkiRJ0lkuNzeXSZMm8eqrrzJ37lwAVqxYQVlZGR9//DEQ77eee+45LBYLL7/8Mv/4xz/4+OOPuemmm7BarTz33HMMHToUq9XKli1bSE9P5+mnn0av1wOwfPlynn76abRaLRaLhQceeICsrCzuuusuWlpaePjhh7HZbNx5550MHjy42+pCkrqMkCSpR1i1apUoLS0V4XA48XzAgAFi48aNYtCgQaKhoUEIIUQwGBS33HJLYrnS0lKxb9++xPMFCxaIqVOnCp/PJ4QQ4u677xZ33XVXYvp1110nfvKTn4hIJCIqKirEm2++KRYtWiSmTZuWWOaFF14QTz31VCKOQYMGiTVr1gghhLj//vvFrFmzTmJNSJIkSdKRrVq1Sjz11FPiq6++EoMGDRLNzc1CCCFuv/124fF4xHXXXSeeeOIJIYQQ77//vrjggguE2+0WL774oliyZEliPU899ZQYP368aG9vF9FoVEybNk28++67QgghampqxLBhw0RFRYUQQoj58+eLG264IbHshRdeKFatWnWKSixJp4YcSipJPcyPf/xjrr32Wv7yl7/w5z//maFDh1JWVsaiRYsAWLp0KRMmTDji8gsXLuSyyy7DaDQCcOWVV7Jo0SKi0WhinokTJ6JWq+nduzdXX30177zzDpdeemlime9973tccskliflNJhOjRo0CoH///iftnEZJkiRJ6qwxY8ZQUlLCG2+8QU1NDRkZGZjN5g7zTJ06lYEDB3LHHXdQ
XV3NxIkTO0wfOnQoSUlJqFQq+vbtm+jf3nvvPQYPHkzv3r0BmD59Ol9++SXNzc2npnCS1A3kUFJJ6mFefvllNJqOP83LL7+cv/71r/zsZz/jww8/ZPbs2UdcvrGxkdTU1MTz1NRUwuEwra2tZGVlAWC1Wo+6jNVq7TCPxWJJPNbpdITD4W9VNkmSJEnqStdddx1//vOfaW9v50c/+tFh57n77ruZNGkSf/vb3w6Z9vX+Ta/XJ/q3xsZGKioquP766xPT8/LyaGtrIzMzs4tLIUk9gzxiKEmngUmTJtHc3Mzy5ctRFAWbzXbEeXNycrDb7YnndrsdrVZLenp6p5fx+XxUVlZ2TfCSJEmSdJLMmDGDcDhMXV0dRUVFh51nwYIF/PCHP+SRRx7B7/d3ar05OTkMHjyYefPmJW4LFiygtLS0K8OXpB5FJoaSdBrQ6/Vceuml3HXXXUydOrXDNJPJRCAQYOHChXz00UfMnDmTjz76iEAgAMQv3T1jxgzUavUR139gmQMd5iuvvCKvhipJkiT1eHq9nocffphbb731sNN37tyJ1+tl1qxZFBcX88QTT3RqvdOmTWPTpk3U1dUB0NbWxvXXX08sFgPAbDYTCARYtWoVr7zyStcURpK6mXr20cakSZJ0SqxcuZJHH32U1tZW1qxZQ35+Pnl5eR3msdlsvPvuuzzwwAMdkjyPx8Nzzz3H7t27ufHGGxk+fDh+v58//vGPvPPOO9hsNu655x50Oh2PPfYYS5cuZceOHYTDYYYPHw7E/wsxEAgwd+5cFi5ciFqt5le/+hWVlZXMmjWLuro6GhsbSUtL49FHH6Wmpob29nbKyspOaT1JkiRJZ7cD/eWmTZsIh8OMGDGC3r17J0bF/O53v2PNmjVUV1fj9Xp59NFH0Wq1TJ8+neeff54lS5ZQUVFBOBzmpZdeorKyEqPRyPbt2/n3v/9NeXk5qampjB49mgEDBjBnzhwWLlzIxx9/zN13353om2OxGM8++yzr1q3jxz/+MWlpad1ZLZLUJRQhhOjuICRJOraKigrmz5/P/fff392hSJIkSZIkSWcYOZRUknq49957j0gkwjvvvMPMmTO7OxxJkiRJkiTpDCQTQ0nq4bZt28bMmTNxOBwMGTKku8ORJEmSJEmSzkByKKkkSZIkSZIkSdJZTh4xlCRJkiRJkiRJOsvJxFCSJEmSJEmSJOksp+nuAHqi+vp65syZQ3p6Os3Nzdxzzz0UFBR0mEcIweOPP05bWxsej4dJkyZx5ZVXdlPER9aZsgCsW7eO2bNnM3HiRH772992Q6TH1pmyPP/88+zZs4fU1FQqKyu5/vrrOf/887sp4iPrTFk+//xz3nnnHfLz86murqZPnz7cfvvtKIrSTVEfXme/YwC1tbXMmDGDe++997T9vbzzzjuJy58DZGdn8/bbb3dHuEfV2c/l/fffZ926dQCUl5dz6623MnLkyFMd7jF1pjw33HADe/bsSTz3+Xzccsst/PSnPz3V4Z5RTrRPPF36y2M50Xo4XdqOzuiKbYuXXnqJTZs2oSgK/fv35+c///mpCr9LnWhdfPXVV/zyl7/EYDAkXlu5cuUpib2rneh22pnSVsCJ10W3tRdCOsRNN90kPv30UyGEEJ9//rm44YYbDpnngw8+EL/85S+FEEIEAgFx4YUXin379p3KMDulM2XZs2ePeOGFF8RvfvMb8fjjj5/iCDuvM2W57rrrRDgcFkIIsXv3bjFs2DARCAROZZid0pmyvPnmm6KyslIIIUQwGBQjR44UmzZtOpVhdkpnyiKEELFYTPz6178W06dPF2+//fYpjLDzOlOWt99+W6xateoUR3b8OlOWbdu2iTlz5iSe19XVicbGxlMV4nHpTHkefPDBDs//53/+R9TV1Z2K8M5oJ9onni795bGcaD2cLm1HZ5zotsWmTZvEzJkzRTQaFbFYTFx11VVizZo1pyL0LneidbFq1aoe2ycerxPdTjtT2gohTrwu
uqu9kENJv8HhcLBixYpExn7eeeexdu1ampqaOsy3cOFCJkyYAIBer2fMmDG8//77pzzeo+lsWUpKSrjxxhvRaHruAeTOluWVV15JlCM/Px+fz4fb7T7l8R5NZ8ty9dVXU1xcDEBTUxNarZbc3NxTHu/RdLYsAPPnz+eyyy4jOTn5VIfZKcdTlrfeeos//OEPPPDAA+zatetUh3pMnS3Lq6++SnZ2Nk888QQPPPAAGzZsICsrqztCPqrOlmfWrFmJx/X19SiK0uN+M6ebrugTT4f+8li6atugp7cdndEV2xaLFi1i/PjxqFQqFEVh4sSJLFq06JTE35W6ajtr8eLFPProo8yePZs1a9ac9LhPhq7YTjsT2groum3W7mgvem4m0E3q6+sxmUzo9XoAdDodNpuNurq6DhtMdXV1pKWlJZ6npaVRW1t7yuM9ms6W5XTQ2bKoVAf3dSxZsoSLL76Y9PT0Ux7v0Rzv53Lfffexbt06Hn744dO2LNXV1ezYsYPrr7+e+fPnd1e4R9XZspSWllJSUsLQoUOpqanhmmuuYcGCBT3qN9XZslRUVNDQ0MCLL75INBrl2muvRa/XM3ny5O4K/bC+TVv2xhtvcO21157KMM9IXdEnng795bF0RT2cDm1HZ3TFtkVtbS3jxo1LPE9LS2PDhg0nJd6TqSvqIjc3l2uuuYYJEybgcDiYOXMmzz77LP379z+ZoXe5rthOOxPaCuiauuiu9kIeMZTOSPX19bz55pv8/ve/7+5QTtiDDz7Ia6+9xmOPPcaqVau6O5zjFovFmDt3LnfccUd3h9IlBg8ezNChQwEoLCykf//+LFmypHuD+pa8Xi+TJk1CrVaj0+mYMmUKH3zwQXeHdcJCoRDr16/n3HPP7e5QJCnhTGo7pK5TUFCQOEqWkpLC+eeff0a0w8dyJm2nnajD1UV3tRcyMfyG3NxcfD4fwWAQiG9guFwu8vLyOsyXl5dHW1tb4nlbW9sh83S3zpbldHA8Zamrq+Ohhx5i7ty5pKSknOpQj6mzZfn6cIKkpCTOO+88Pv3001Ma67F0piy7du0iGAzy5JNPct9991FVVcWCBQt4/PHHuyvsw+rs51JVVdXhuVarJRAInLI4O6OzZcnOzkatVieea7XaxDI9yfG2ZR9++CGXXnrpqQzxjNUVfeLp0F8eS1fUw+nQdnRGV2xb5OfnY7fbE89Px+8EdE1dVFdXd3h+NnwvjrSddia0FdA1ddFd7YVMDL8hJSWFsrIyli9fDsAXX3zBiBEjyMrKYvHixYmN9RkzZrBs2TIAgsEgq1evZtq0ad0W9+F0tiyng86WpaamhkceeYSHHnqItLQ0PvjgA9avX9+doR+is2W57bbbOnxG5eXlFBYWdkvMR9KZsgwYMIC///3vPPjggzz44IMUFxczc+bMHncEsbOfy5w5c3A6nUD8qpdbt25lzJgx3Rb34XS2LJdddhlfffVVYrm1a9dSVlbWLTEfzfG2ZQsXLuSKK67ojlDPOF3RJ54O/eWxdEU9nA5tR2d0xbbFjBkzWLFiBbFYDCEES5cuZcaMGSc79C7XFXXx7LPPJq6mHI1GWbNmTYdhtqeLrthOOxPaCuiauuiu9kIRQoiT/i6nmdraWh566CEyMjJobm7mrrvuoqioiOnTpzN79mxGjRqFEII//OEPOBwO3G43F110Ed/97ne7O/RDdKYssViMOXPm8OWXX2I0GjnvvPN65F9WdKYsU6ZMweFwoNPpAAgEAvztb39j7Nix3Rx9R50py6uvvsrKlSvp1asXzc3NpKWlceeddyYuXdxTdKYsAJFIhIcffphPP/2UkpISpk2bxtVXX93N0XfU2c/liy++oKioiH379jFlypQemYR0pizRaJQnn3wSl8tFLBYjKSmJ3/zmNx3Oe+gpOvs927ZtG2+99Rb3339/N0d85jjRPvF06S+P5UTr4XRpOzqjK7Yt/vGPf7B582YURaFfv3784he/6MYSfXsn
Whfvv/8+CxYsoKSkhMbGRgYPHsxNN93UzaX6dk50O+1MaSvgxOuiu9oLmRhKkiRJkiRJkiSd5XrebmFJkiRJkiRJkiTplJKJoSRJkiRJkiRJ0llOJoaSJEmSJEmSJElnOZkYSpIkSZIkSZIkneVkYthJLpeLv/zlL7hcru4OpUucSeWRZemZZFl6JlkW6WSQn8VBsi4OknURJ+vhIFkXB/XEupCJYSe5XC6efvrpHvXhnYgzqTyyLD2TLEvPJMsinQzyszhI1sVBsi7iZD0cJOvioJ5YF5ruDkCSJEmSznT19fXMmTOH9PR0mpubueeeeygoKDjsvLW1tcyYMYN7772XK6+8MvH6Sy+9xKZNm1AUhf79+/Pzn/+8U9MkSZIkqTNkYihJkiRJJ9ns2bP53ve+x+TJk1myZAmzZs3i5ZdfPmQ+IQRz584lLy+vw+ubN2/m3Xff5a233kJRFK6++mpGjhzJqFGjjjpNkiRJkjpLDiWVJEmSpJPI4XCwYsUKzj//fADOO+881q5dS1NT0yHzzp8/n8suu4zk5OQOry9atIjx48ejUqlQFIWJEyeyaNGiY06TJEmSpM6SiWEn6XQ6RowYgU6n6+5QusSZVB5Zlp5JlqVnkmU59err6zGZTOj1eiAet81mo66ursN81dXV7Nixg0suueSQddTW1pKampp4npaWRm1t7TGnnSqny2dxKsi6OEjWRZysh4NkXRzUE+tCDiXtpMzMTF5//fXuDqPLnEnlkWXpmWRZeiZZlp4pFosxd+5cfv/733d3KN/KmfRZnChZFwfJuoiT9XCQrIuDemJdnFWJodPpIxYT33r5lBQzDoe3CyPq+c62Mp/u5TVU/5VAr1s6PX9nyvvsxr9y87DOr7MnOx0/32ef1XLzzeFvtezpWN4TcaC8KpVCUpKpu8NJyM3NxefzEQwG0ev1hEIhXC5Xh/MId+3aRTAY5MknnwSgqqqKBQsWUFFRwR133EF+fj52uz0xf1tbW2L5o007HrKPPHmO1Y6ezLo73n7hdNLd37kTaZ+7W3fX3enqdK+3Y/WPZ1ViGInETqjTO7COs83ZVubTubyxoOO44z/W/Hbf8a+zJzvdymK3ixOK+XQr74mKRGKoVEp3h9FBSkoKZWVlLF++nMmTJ/PFF18wYsQIsrKyWLx4MWPHjmXAgAH8/e9/TyxTVVXFzJkzE1clnTFjBrNnz+bWW29FURSWLl3KnXfeecxpx0P2kSdPZ9rRk1V336ZfOJ10Z9lOtH3ubqdz7N3pdK63Y/WPZ1ViKEmSJEnd4f777+ehhx5i2bJlNDc3J4aM/ulPf2L27NmJK4hGIhEefvhhqqurWbRoEdFolKuvvpohQ4Ywbdo0brvtNhRFYdKkSYwePRrgqNMkSZIkqbNkYihJkiRJJ1l+fj7PPPPMIa+/9957HZ5rNBruu+8+7rvvvkPmvfHGG4+4/qNNkyRJkqTOkFcllSRJkiRJkiRJOsvJxFCSJEmSJEmSJOksJxNDSTpDxDxtiHCwu8OQJEmSeoiYz4nwuxDixC4qJEnS2UGeYyhJ3cgd8rCtbScDUvuRpLfS6G0ixZCCXn18f3YaqduO/8Mn0CVvI+gfjW7IpSjqQ3/eoWgYlaKgUR37px+Khlha+wVVzr0EoyG0HjsoalS2jOOKTZIkSTp1Yu5WItXriFStI9pYji55G94td2C6fBYqU1J3h3da8wcjNDv8tLT7aXUGqG3Jpr41THaqqcddDVmSvg2ZGErScWjyNvPJ3iVYdRau6DOVcCzCl/WrGZE1FIvWfMzlq101tPja6JfaB7PGxIOrHscX8TN7XPzS8s9ufhlf2M8tw26kyFYAgBCCmLMBRaVBsWagKAqOQDu+iJ88Sw7R5kr8nzwFsQjEolRv+g/zWpeRbEjmZ8N+is2aGV9PNMwnez9jSe1KxuWM4rt9Z8RfD3gAUAwWAGJeBwiBXQmzqPIjytsrCez5guDyeagn/wLzNxJDEQkR
NXUOT548yX333QfAo48+yqxZs3jppZdQqVR8//33TJs2rU6Fra8Pj37J78WnuUfnTHGRBh/3W/sVXwLKdshnJYRjM0cd6AgMioHTl85RoRhwcXYxTTyjdvMhM9c46hroLR1DIWydtItsh0qlpq6DUDXOGZ2QkFBl24YNGxg4cCD+/v58+eWX1a7xdFVsbCwff/wxr732Gps2beL1118HYNmyZfz+u/Gh9L59+zJlyhTmzZvH5MmTadu2LU888QQAnTp14uGHH2by5MlMnjyZ++67j+7du9epsPV1PPMUhylAr1ZRomjwljUMhRDCrpmrDrR3KlQsGzCLyV3HoFE7Q0U5Kp2HccQw1zhiGODjYuVcCiFEZQcO/MKoUSNYt+7NG6bHxk5n3769ABw8uJ8xY16o1/kOHtzP+PEv1usYllTjiGFISIjp9enTp4mJieHChQvEx8czYMCAWp8oPDyc1atXV9n+5Zdfml6PHDmSkSNH3vQYo0aNqvX5LGHonY9w11fphJRmUqJo8JMZSQFjUK1atYJevXozalTVtSVjY6fz0EOP0LNnLw4e3M/bb69h5co1t3w+cxxDCCFqw1x1oL1TqVQ08QhEfaXzp2kbhaZtFIqikPnRYQACZMRQCNHIdOvWnV69et80fcKEl/H2dpxldmr1FLjBYGDNmjWsWrWKvn37smrVKvz9/S2dt0bn9oDWuJRARYVCseIsI4ZXSFAJIeyZ1IE1+zXzKJ/9/F+6BHTikVYPmrarVKprnjGUEUMhhPmUl5cTEzOVZs2aU1pagre3D6NGjWb69FfYvft7XnnlVb799mtOnz7F1q3f8M0329m9+3uaNGnCqVMnGTHiWbp06QZARkY6sbHTOXnyD4YMGc5jjw3m4MH9rF69gp49o3j88WF8+OEmkpIusGTJAjp27MwDD/Tnm2++Zv/+n/Dy8iYrK4OxYyfh7x9Abm4uiYlL8PX1Izc3B2dnZ8aOnXjlGBdNx8jJyWbdujVs3PgeKpWaefPm0KRJE2bMmMXmzR+xYcNbPPDAQ6SkJHHo0EFefXUmbdvezrp1b+LvH0BGRjr33/8gd99983Z4XdTYMTxx4gQxMTEkJyczd+5c/vrXv1bZZ9WqVYwdO9YsGWrM3vhpIzkuRTyidqJE0eCm01g7S/VWU1BNnTqdHTu2mzmoLtp1UAkh7IfUgbWTdfkS6YWZXPYxdgINBZmotG4oGleyrjxjKCOGQghzGzDgr9x7r3E+kqlTJ/Hbb0eZN28xvXtH0KRJEImJb/Lxxx9w7txZVq5cxvvvf4pWq2Xv3j2cOXPK1IZNSUlmxYp/cfHiBSZMeJHHHhtM164R9OwZBYCXlzePPz6M9evX8vLLxjlOzp8/x/r1a3nnnU2o1Wq++OJTVq1awcyZc1i+fBF33NGBIUOGAzBlykTTMd5+e43pGAAffrgJgODgYB588CEOHToAwN/+Nphjx46SmppCfPxCjhz5FRcXF+bMmcno0ePp3Pku9Poihg79G//3fx+b5mipjxo7hoMHD0ZRFP7+979z7tw5Vq5cWWWfzZs3O0SleDDlCAXOZahCx5OUdQlXnZO1s2QW1QVVUJD5g+ragGhsQeXrK4svCyH+JHVg7fQJ60mf27pRkGucQVe/ZRFKfjqGh2dRWm7A3cUZNxdZqkIIYT5OTk6kp6cxb94c3NzcSU1N4eLF83To0BGA7t0jAfj734fw0UebuO22tmi1xrv9rr/TrUOHO1GpVISFVV4FoTr79/9EaWkJixfPB0Cv11NWZvw/8KeffmTo0CdM+y5atOKWyxkR0QOAO+/sjF5fxOHD/+O///2C7du3AhAWFk5GRnrDdAzbtGlDTExMtfvs2LGj3hmxBRN6Pkda9iX2/agCVLjqbL+SqymoevQwriHpKEElHUMhxLWkDqwdrZOWQC9/MksKUBQFRW9criKrzHj7aIAsVSGEMLMdO7bx1VdbeOutf+Pk5ER8/KxKi89fba9CzbOp
ajTGuwCdnJxuuDzRzTRt2pypU/+sI/R6/ZXz1foQqFQqU77Ly8urpF9bjqtefHEsfn7GRxpKSopxdjbPXYw19mzGjx9Pjx49atzHEdwV0p6M8lR2FxtnUbWHhXprE1RX1/WToBJCOBqpA2tnxaE1FFUU8cwdTxCs8YTyUnDWkVlorAsCveX5QiGEeeXn5+Hu7oGTk/EOvvT0tJvuGxERybvvbqS0tNR011tWViaPPvq3Wp9Pq9VhMFQAsGXL50RERLJ+/Vr0+iLc3Nz5448TbN78EdOm/ZPIyF4cPvwr7dq1R1EUXnttOrGxcVeOYTAd4+GHH8XPz5/s7CzCwsI5efKPavPg5ubOnXd25uef99G//8MYDAZeeWUiixatMF2H+qixZxMdHV3jQWqzj60rLi9m3vcbUf+xn2fzC5nMU3ZxK6l1gurPgGhsQaWzg+dGhRDmI3Vg7aQVpZNXWoDOSYuhyLiGocrdh0xZ3F4IYSEPPvgwe/bs4p//nEZwcAgFBfls2/ZffvrpRwCWLFnAM888j5+fP82bt2DcuEnEx8cSENCEoqJCJk2awtGjh9m37wcAeve+h59+Mi5NsW7dm3Tp0s2U1qVLN9q374iiwNy5swkNDaN58xa89NJUXn/9NUJDwyksLGDs2IkATJz4ComJi1mxYjGFhYXcf39/nJ2dadPmNhRFMR0D4Mknn2b16kTuvLMTimLg2LGj7Nz5DRqNlmPHjpKRkYGXlxe9e98DwMyZc1i5chknThynuPgyTz31DC4u5vnxzfaHvBpIcUUJh9J+w9NNSxkaQIWrHYwY1hRUixbNZ+TIUWYOKsWug0oIIRzNjMhXwLUM1zJPDNmpwJXF7fOurmEoHUMhhHl5eHiwdOkbN0ybPXtulW3R0Q8SHf1gpW0dO3Zi7dp3TO9vv70dI0c+Z3p/bRrAmjVvm+6ku9kxAXx8fJg58/Uq211cXFi9el2lbX373kvfvvfesBxRUX2qbAsNDWPu3IU33L++bL9n00DcnF2Z2vVJMr58gxLFOKrkYgcjhjUFlbOzusYAqGtQXR8Q9hZUQgjhaNw1bgT6epKZWUBFkfH5QpW7L1lpxhlJ5VZSIYRo/NTWzoCt0Dpp6ezTjDv0pRQrxv60PUw+I4QQQpiTQX/lVlI3H9MahjJiKIQQjZ/0bGopqSCFNac+I8DPnbYZxsvmorX9EUMhhBANIyUlhbi4OAICAsjIyGDGjBk0bdq00j5r1qzh1KlT+Pn5cebMGUaMGEGfPsa7Hp5++mlOnTpl2lev1zNu3Dief/55EhMT+c9//mN6XrxLly43XFqjIajUTqg8/MHdn+z8YlSAv5eMGAohRGMnHcNayi3J49ecs7TVOdNc0aDVqHFSy4CrEEKI2pk1axZDhgwhOjqanTt3MnPmTDZs2FBpn927d7N+/XqcnZ05efIkQ4YMYd++feh0Otq0acPGjRtN+06YMIEBAwaY3n/00UeEh4c3VHFuStupP9pO/cnIvYyi/Iivpw6Ns9SXQgjR2Mn/1LXU3Kspk1r+hb/k6ClWNHYx8YwQQoiGkZOTw549e0yjf7169WL//v2kp6dX2m/jxo04Oxvrl/DwcPR6PQUFBQDMnDnTtF9KSgoqlYrQ0FDTtnXr1rFgwQLi4uJISUmxdJFqlJUrzxcKIYQtkd5NLXlqPWjiGUZ6uRO/KBpc5PlCIYQQtZSSkoKbmxs6nQ4wrq3q5eVFcnIyQUFBpv3U19yJsnPnTu6//34CAgKqHG/Tpk0MHz7c9D4iIoLg4GBatmzJgQMHeOqpp9iyZQuurg3/bF9F1nlULp5k5RrXpJXnC4UQwjZI76aW/pd5lAPZ/yOw5xN8+nk5LTzl+UIhhBCWkZKSwgcffMCSJUuqpJWWlnLw4EFefvll07a7777b9Lpbt25oNBoOHDhA7969a31Of3+P+mUaCAjw4Oxbr4OhnMudXwOgRag3gYGe9T62LXN319V4DSx2jVJ1uNvx9bfm
d8vdHQIDdVY7f33V9dplZKhxltvCbeoaqNXqOn3O0jGspfSiDA6mHKGLtzvgIzOSCiGEqLXQ0FD0ej0lJSXodDpKS0vJz88nLCysyr7JycnMnTuXRYsW4evrWyX9v//9L/3796+07ezZs7Rs2dL0XqPRUFxcXKc8ZmcXYjAodfqbawUGepJxMQUM5aBz52yaccTQVaMmM7Pglo9rD4qKSqq9BoGBnha7Rm76EvR2ev0ted1qo6hIS2ZmqdXOXx+3cu0MBkOlJcysrXfvCAYOHETfvv2IjDT+OHb48P9YvHg+PXtGMWbMhEr7b9r0LseO/YZKBW3atGXEiGdrlXatFSsWo9frcXNz49Spk4wc+RwRET0AKCgoYNGiubi7e5CZmcGoUaNp1659jWlLlyZw9OgRevXqzahRo816jQwGQ6XPWa1WVfsjoPRuaqlrk860DQzlzKnL/Mglu5mRtKagmjBhUqX9zRFUy5Ytorj4ss0ElRBC1Jevry9RUVHs3r2b6Oho9u7dS9euXQkKCmLHjh1ERkbi6enJhQsXSEhIID4+Hh8fH7Zu3UpwcDBdu3Y1Heuzzz5jxYoVlY4/ffp0/v3vf6PRaEhLSyMjI4POnTs3dDFRrixVob5mqYpAuZVUCGEhU6fGmF6fO3eWo0eP0Lr1bVX2O378N7Zv/4q33noHlUrFiy8+TadOXejc+a5q066n1WqZOPEVAL755muWLVvEu+9+AMCaNavo0KETQ4YM58yZU8TE/IP33vsYlUpVbdrkyf9g3bo3LXSF6kY6hrUU6OZPk4M7CNj/Xy7qulOuC7F2lszGGkH10ktTAPsMKiGEuJHY2Fji4+PZtWsXGRkZvP766wAsW7aMWbNmERERwfPPP09OTg4PP/wwAMXFxaxatcp0jN9++43mzZvj4VH5F98ePXowefJkQkNDuXDhAgsXLiQwMLDhCneFUnRlDUN3X7JSjZPPBMjkM0KIBtCiRUtatGhJfPysKmnbtm0lMvJu03PcPXtGsW3bFjp3vqvatOuNHz/JNGp68eJ5WrdubUrbvn0ra9e+A0CrVm0oLy/jt9+O0LFjp2rTGhPpGNaBodRYyZUoGjzMNCvpD0dS2XM41SzHul7vTiFE3Vm3DmxDBNXYsRNNr+0xqIQQ4kbCw8NZvXp1le1ffvml6fX27durPUaHDh3o0KFDle3XPm9oTQZ9jvFfFy/y9WU4O6nw8bTdZ7CEEDfX2Nqw1UlNTaFbt+6m935+/hw9erjGtBs5ceI4Gze+RUFBAXFxCQDk5+dRVFSEr6+faT9fXz9SUlJo1qz5TdMaWxvWdp6ebAQMJcaOYbGiwUVnH7eS1kVqakql5138/PxJTU2pMe1GTpw4zvTpr7B//89MnjwNqD6oqksTQgjROFwdMbysNo5o+nu7olaprJklIYQwq3bt7mDevMUMHz6C8eNfoLTUNp8zvREZMawDpdT4vEQJzmabfCbqTvP+ImIrrgbVDz/sZvz4F3j77f9YO0tCCCHqSbkyYliguAOyhqEQ9syW2rAhIaHk5OSY3l+6lE1wcGiNadeqqKigtLQYrdb4/1pUVB9mzZrBmTOnadfuDtzc3MnJuYSnp3EW0JycS4SEhODl5X3TtMZGRgzr4OqtpMYF7h1vxNBcQaXX603vo6L6kJ6ezpkzpysFzlU3Cqrr04QQQjQOKlcv1L5hZBuMI4ayhqEQojF44IGH+OmnHzEYDCiKwr59P/Dggw/VmHatjIx05s+PM71PTU2hoqKcoKBg03H27dsLwJkzp3FycqJDhztrTGtMpGNYB9c+Y+iIC9ybK6gSEuJN7+0xqIQQwlHpIgbh/ng8J5WmAAT6yIihEKJhGAwGlixZwLFjR/nll59YvTrRlNa+fUeiox8kNjaG2NgYeve+h7vu6lpj2smTf/D008MB8PLyoqKigrlzZ5OYuISFC+cRGxtvepTqxRfHcOTIryQkxLNq1XJiY+NMc29Ul9aYOF7vph6U
kj87hq5mmnymsTEYDCxbtpBjx46i07nwxhsrGD16PFA5cFQq1U2D6vq0kyf/IC4ulo0b36sUVJ6enpw9e7ZKUC1cOI9z586QkZFeJahuliaEEKLxyMq9slSFt4wYCiEahlqt5uWXp900/YknRtQ57dChA/Ttey8A7u4exMcvuOlajl5e3rz++vw6pzUm9tm7sRCVRkcZzhQrzrja6eQz1weVs7O6UgCYI6iqCwx7CCohhHBEiqGC8uRjqN19ycq7slSFjBgKISykffuOLF++mKioPqb1sM3JYDCQl5fLM888b/ZjX+uNN5aTmppMZGQvi56nNqRjWAdN/98Kxi74Br1SiIudjBhKUAkhhDCHisJcLm9JQOXqRWbe3wEIkBFDIYSFrFmzwaLHV6vVvPDCGIueA2DcuEkWP0dt2UfvpgFdLikHsJsRQwkqIYQQ5lBeeGUNQ1cfSkorcNU54+4izQwhhLAV8oBWLSmGCsoLc6goLQFwyMlnhBBCiJupKDDOGl3qbJyOPdDbBZWsYSiEEDZDOoa1pBRmc2H580zQfgxgt5PPCCGEELei/ErHUK82rmEoS1UIIYRtkY5hLSllVxa3V5xxdlKhcZZLJ4QQQlxVUWjsGOYZ3AAIkMXthRDCpkjvppaudgyLFY3dTDwjhBBCmEt5gfEZw+wy40hhoIwYCiGETZEeTm2VXh0x1NjNxDMAvXtHMHDgIPr27Udk5N28++4Gzp49g4+PLxcunGPo0OFERPQEQFEUVq9ewaVLlygqKqJPn3sYMOCRGtOul5eXy4YN61CpICcnB7VazcyZcwBIS0tj+fKF+Pn5k5WVycSJrxAWFl5tWlFRIatWreDgwf2MGPHsTc8rhBDCcq6OGKYXawBZ3F4IYVk1tWEHDx5GZOTdgHnbsG+9tdZu27DSMawl04gh9re4/dSpMabXP/30I0uXvoGzszNnzpxm9Ohn+PLLHeh0Or777hsuXrzIvHmLKCkp4cknB9OlSzdCQkKrTbve0qULGTduEoGBTQA4cuRXU9rixfN49NG/0afPvezdu4eEhLksX76q2jR3dw+mTo0hPn6WZS+UEEKIm9IEhFNSkMfFdB0gS1UIISyvoduwixcnMGbMRLttw9pXD8eClDLjYr0lijOudjwj6fLlq1GrjXcYh4aGcfnyZYqKCtHpdGzbtoVevfoAoNPp6NKlGzt2bGPEiGerTbtWSkoyqakpfPvt12RnZ1NeXsZTTz0DGH+F+fnnfcTFJQDQvXskMTFTyMrKRKPR3DQtICCwIS6NEEKIagTc/ywV6fn8vmgnoMgzhkI4AP0X82643e2R6QAU7/0PhuwLVdJ1dz+BU0Bzyn7fTdkfe27693XREG3YlJRku27D2m8Px9zKrr2V1LyXrTEF1dWAAti7dw/33NMPPz9/ANLSUvH19TOl+/r6kZKSUmPatc6fP8dvvx3hxRfHMnRod77//lumTJnIW2/9m7S0NFxdXdHpjL82azQaPD29SE1NQavV3TStsQWVEEI4qksFxVQYFLw9tGg19vPYhRCi8bu+Ddu3r/nbsEePHuGFF+y3DSsdw9pSqanQuKPX63Cxo2cMbyYtLY3PP/+E+Pj5Zj2uXq/H09OLbt26A9C3bz9mz55JUlLVjq8QQgjboJTqKfrjBLnJFQAEym2kQjiEmgYhXHo9WW265vY+aG7vY84smdqws2fPNetx9Xo9Xl723YaVjmEtaTvez/+0Xfjqk8P0M/Mzho0tqNLSUlm+fDGxsXF4e/tQXm4AIDg4hJycS6b9cnIu0bRpsxrTrtWkSROcnP78RUelUuHk5ERJSSnBwcFcvnyZkpISdDodZWVlFBTkExISikajuWmaEEII66q4lEz65/Px8GgK9CNAJp4RQljB9W3Yq8zVhr12VNIe27CyXEUd6IvLAOx6xDA5OYkVK5YwffpMfH39+PrrbaYHax94YAD79u0FoKSkhEOHDhAd/WCNade6444OuLq6c+bM
KQBOnvwdT09PWrRoibe3D927R/Lzzz8C8MsvP9Gp010EBARWmyaEEMK6lCLjUhWFXFncXkYMhRAN7Po27DffbDd7G9bNzb7bsDJiWAeXS8oB7G5W0mu98soEcnNzeeqpIYAxQObNWwTAX/4SzbFjR4mLi6WoqJBnnnme0NCwGtN2797Jli2fM3/+EpydnZk7N4F1694kODiE1NRU5s1bjEajuXL+V1m+fBH79u0lMzOTf/xjxjV5u3maEEII61H0xo5hnuHqGoYyYiiEaFhV27DFzJu3GDBfG3b+/EV23Ya13x6OBeiLr3QM7XhW0k2bNld67+ysNt1KqlKpmDBh8g3/rrq0Q4cO0KfPvab3t912O/HxC2+4b0hIKPPnL6lzmhBCNHYpKSnExcUREBBARkYGM2bMoGnTppX2WbNmDadOncLPz48zZ84wYsQI+vQxPirwySefMH/+fFMjJDg4mI8//hgwrsO1cOFCsrOzKSws5L777mPQoEENVjZDUS4AWaXGDqE8YyiEaGjXt2GvZa42bNu29t2GbbAejiUrxIZiupVUaz+3krZv35HlyxcTFdWHiIgeZj++Xq/Hw8OThx9+1OzH/vMcRaxd+y/y8vLw8fG12HmEEKI+Zs2axZAhQ4iOjmbnzp3MnDmTDRs2VNpn9+7drF+/HmdnZ06ePMmQIUPYt2+faTa7xMREIiMjqxz7q6++4vz587zxxhuUlJTw0EMP0aNHD8LDwxuiaCh6Y8cw7cri9vKMoRDC0qQNa34N1jG0ZIXYUOxxxHDNmg0WPb6bmxvPPfeihc/hzqRJr1j0HEIIUR85OTns2bOHxMREAHr16sX48eNJT08nKCjItN/GjRtNkxuEh4ej1+spKCgw1YMfffQRO3fupLi4mGHDhnH77bcD8Nlnn9GvXz/AuA5Xjx492LJlC6NHj26Q8l19xjBNr8VJrcLPUzqGQgjLkjas+TVID8fSFWJD+fMZQ/sZMRRCCGF5KSkpuLm5meozrVaLl5cXycnJlerBa2e827lzJ/fffz8BAQEAtG3bltatW9O5c2cuXLjAsGHD2Lx5M0FBQSQnJ+Pv72/6W39/f5KSkhqodGC4MmKYa3DD38sFtVrVYOcWQghhHg3SMbR0hdhQikyzktrPiKEQQojGJyUlhQ8++IAlS/58JqVjx46m182aNaNdu3bs3LmToUOHmuWc/v4et/7Hre7kUpIX+Tlu3B7oQWCgp1nyZC/c3XU1XhOLXbNUHe52/HlY87vm7g6BgTqrnb++6nrtMjLUODvLgga2dA3UanWdPudG2cOxVIVYr0qPP28lDQvxJjDw1o5li0Fla/mtr+vLW9egsqpbaADUVLbaNGhsia2Vpb4ND1srb3011vKGhoai1+tN61iVlpaSn59PWFhYlX2Tk5OZO3cuixYtwtf3z2dOzp49S8uWLU3vNRoNxcXFAISFhZGdnW1Ky87OpkWLFnXKY3Z2IQaDUseSXdH9SU55ZXL59yN4uWnIzCy4tePYqaKikmqvSWCgp8WumZu+BL2dfh6WvG61UVSkJTOz1Grnr49buXYGg8E0IaGjunZSRltgMBgqfc5qtara/lCDdAwtXSHWVr0qPeDylY7h5cJiMrm149haUNlaANTXjcp7fVA1ZnVtANSmYqipQWNLrN2IuBX1aXjYYnnr42p5a6r4rMHX15eoqCh2795NdHQ0e/fupWvXrgQFBbFjxw4iIyPx9PTkwoULJCQkEB8fj4+PD1u3biU4OJiuXbsSFxfHkiVL8Pb2Rq/Xc/ToUV566SUAHn30UbZs2cLQoUMpKSnh559/Zty4cQ1axrRLekCWqhBCCFvVIB1DS1eIDUUvt5IKIYS4RbGxscTHx7Nr1y4yMjJ4/fXXAVi2bBmzZs0iIiKC559/npycHB5++GEAiouLWbVqFQD33HMP06ZNo3nz5ly8eJGpU6dyxx13APDQQw9x+PBhpk2bRkFBAWPHjq0y87elpZs6hrJU
hRDC8nr3jmDgwEH07duPyMi7+eGH3Wzd+jmhoeFcvHieFi1aMXr0OFQq4zPPmza9y7Fjv6FSQZs2bRkx4lnTsapLu9aPP/7Ap59+QkhIGJmZ6YwcOYrWrdsAUFBQwKJFc3F39yAzM4NRo0bTrl37GtOWLk3g6NEj9OrVm1GjGmbCsJtpsB6OJSvEhlBeYaC03IBapUJrR7dW1hRUrVq15oUXxpo1qPbt28sXX2y226ASQogbCQ8PZ/Xq1VW2f/nll6bX27dvv+nfjxw5kpEjR94wTaVS8eqrr9Y/k/WQnm3sGAbIGoZCiAYydWqM6XVOTjajR4+nWbPmlJWV8cgj93PPPf24444OHD/+G9u3f8Vbb72DSqXixRefplOnLnTufFe1adfKz89jxoxp/N//fUxAQCDJyUlMnjyO9977BCcnJ9asWUWHDp0YMmQ4Z86cIibmH7z33seoVKpq0yZP/gfr1r3Z0JfuhhqsY2jJCrEhFJdWAOCqczJ1kuxFTUHVp8+9Zg2qmTNf5b337DeohBDC0SiKQtqlIkDWMBRCWMdf//qY6XVGRjrOzhqCgoIB2LZtK5GRd5smuuzZM4pt27bQufNd1aZdKyUlGbVaTUBAIABhYeFkZWVy/PhvdOzYie3bt7J27TsAtGrVhvLyMn777UiNaY2J/Qx9WdjVpSpctObvSy87+C9+TN1v9te34q9/fYxmzZoDxqDSaG4cVCqVyhQ4NaVdKyUlGSenGwcVwPbtW+nZsxdQOXBqShNCCGE9RcXl6IvL0Wmc8HTVWDs7QogG0pjasFctXDiXadNeZvr01/DzMy7jk5qaUmnuEj8/f1JTU2pMu1azZi3QaDScOHEMgKNHD1NaWkp6ejr5+XkUFRXh6+tn2t/X14+UlJRq0xobd14MXAAAEC1JREFU6RjWkmkNQ51jrGF4NahmzIg1e1A5O9t3UAkhhKPJyrsMGCeesbe7aoQQtmXq1BhWrXqLVauWc/DgfrMd183NjcTEf/HFF5/y5ptvcPz4MZo3b4G7u7vZzmFtMotKLV29ldQSE8+81PX/WfT1rZg6NYb8/HzGjHkOrdaFrl0j6nW8q9zc3Fi+fDWffPIB33//HX5+/nYXVEII4Wiyco2zhMvzhUI4lsbUhi0sLMTDwzgjtZeXFxERPfj++2/p2jWCkJBQcnJyTPteupRNcHAoQLVp12vT5jbTI1gGg4ENG9bSrFlzvLy8cXNzJyfnEp6exmWTcnIuERISUm1aYyMjhrVkGjG0wK2kjUlhYaHptZeXF927R/L9998C1QdOXYKqdes2TJ0aw+jR4/j734eQm5tTJaiuulFQXZ8mhBDCujKvjBjK84VCCGuJjY2p1I49e/YMYWHG2ZkfeOAhfvrpRwwGA4qisG/fDzz44EM1pl1v8eIEDAbjsmZ79+6mU6cuhIaGmY6zb99eAM6cOY2TkxMdOtxZY1pjIh3DWrpc6hi3klYNqtNmD6plyxbadVAJIYSjybwyYhgoI4ZCCCvp2bMXc+b8k8TEJcTGTqdVq9YMGvQ4AO3bdyQ6+kFiY2OIjY2hd+97uOuurjWmnTz5B08/Pdx0jry8PP75z2ksXryAXbt28uqr/zSlvfjiGI4c+ZWEhHhWrVpObGycaUKb6tIaE/se/jKj4pIrt5La+Yjh1aBq2rQZWVmZtGrV5oZBpVKpbhpU16edPPkHcXGxbNz4HvBnUPn7B1BSUlwlqBYunMe5c2fIyEivElQ3SxNCCGE9WblXnzGUjqEQwjoef3wYjz8+7KbpTzwxos5phw4doG/fe03v58yJp7zccMN9vby8ef31+XVOa0zsu5djRo4yYnh9UDk7qysFgDmCKjY27qbHsIegEkIIR5OZd+UZQ7mVVAjRQNq378jy5YuJiupDREQPsx/fYDCQl5fLM888b/ZjX+uNN5aTmppMZGQvi56nNqRjWEtOV2ZZ8/HQWTkn5iVBJYQQor5KyyrQOqvlVlIhRINZs2aD
RY+vVqt54YUxFj0HwLhxkyx+jtqSjmEt9ekcSmiwF62DPKydFbOSoBJCCFFfYx/riIenCzqtfd9VI4QQ9kwe0KolV50z/bo1xdUMy1UoimKGHImGIJ+VEELUrHWYNx1bB1g7G0IIC5N2ke1QFANQt3VlpWPYwJydtRQV5Utg2QBFUSgqysfZWWvtrAghhBBCWJW0YW2DoiiUl5eRm5uFVlu3577lVtIG5usbSE5OJoWFudbOSq2o1WrT0hKO4PryOjtr8fUNtGKOhBBCCCGsz9basJZgK+1itdoJV1cPPDy86/R30jFsYE5OzgQE2M6i7IGBnmRmFlg7Gw3G0corhBBCCFEbttaGtQR7byfKraRCCCGEEEII4eCkYyiEEEIIIYQQDs6hbiVVq+s2M4+ljmFrHK3Mtlxelc6nzvmvaX9f17ofszGztbL4+tYvz7ZW3vpSq1UOV2ZzkTrScmrz/6ilrt2t1Au2xJplq+//z9Zmy3m3Jlu+bjXlXaXI1EJCCCGEEEII4dDkVlIhhBBCCCGEcHDSMRRCCCGEEEIIBycdQyGEEEIIIYRwcNIxFEIIIYQQQggHJx1DIYQQQgghhHBw0jEUQgghhBBCCAcnHUMhhBBCCCGEcHDSMRRCCCGEEEIIBycdQyGEEEIIIYRwcM7WzoCtSElJIS4ujoCAADIyMpgxYwZNmza1drbMIj09nUWLFuHr60tJSQm5ubnExsbi5+dn1+UGePvtt1mwYAG///47APn5+cTGxuLp6UlaWhoTJkzgzjvvtHIuzaO4uJjExETKy8vJz88nLS2N9evX222Zv/76az744ANatWrF+fPneeqpp+jdu7fdfKfLy8vZuHEjiYmJfPzxx7Ru3Rqo/jtsy5/1jcpbVFTE/Pnz0Wg0qNVqkpKSmD59Os2bNwdsu7y2xF5iqiHcStwKx26nmENcXByXL1/G3d2dEydOMGbMGO6++2753tWSI7UVUUStvPDCC8rXX3+tKIqifPfdd8rTTz9t3QyZ0b59+5SlS5ea3s+fP1+JiYlRFMW+y33q1CnlhRdeUNq2bWvaNmvWLGXDhg2KoijK77//rjzwwAOKwWCwVhbNKj4+Xjl69Kjp/YEDBxRFsc8yGwwGpVu3bsqvv/6qKIqi/Prrr0rPnj0VRbGf7/SmTZuUAwcOKG3btlVOnTpl2l7d52nLn/WNynvx4kVlypQppn3efffdSp+nLZfXlthLTDWEW4lb4bjtFHNJSEgwvd6yZYsyYMAARVHke1cbjtZWlFtJayEnJ4c9e/bQp08fAHr16sX+/ftJT0+3cs7Mo0ePHkyaNMn0Pjw8nPT0dLsud0VFBUuXLuXll1+utP3zzz+nb9++ALRt25aysjL+97//WSOLZlVcXMzOnTs5duwYixcvZs6cOfj7+wP2WWaVSkVAQABZWVkAZGVloVKp7Oo7PXToULp27Vple3Wfpy1/1jcqb3h4OAkJCZXeX/tZ2nJ5bYU9xVRDuJW4FY7ZTjGnqVOnml6fO3eOtm3bAvK9q4mjtRVBnjGslZSUFNzc3NDpdABotVq8vLxITk62cs7MQ6VSoVKpTO937drFsGHD7Lrca9euZciQIXh4eJi25ebmUlhYaOowAfj7+5OUlGSNLJpVcnIy58+fR6VS8corrzBo0CBGjhxJenq63ZZ55cqVrFy5kpiYGJYtW8aKFSvs+jsN1X+H7fX7fe3/XTt37mT48OGAfcdzY2LvMdUQ5LtaM0dsp5jb0aNHGTt2LHv37mXmzJnyvasFR2srgnQMxXU+/PBDbrvtNqKjo62dFYs5ceIE6enppl97HEFRUREA/fv3B6Bjx464uLhw4MABa2bLYoqLixk9ejQzZ85k7ty5zJ07l+XLl1NRUWHtrAkL2bVrF4WFhYwYMcLaWRFCWJAjtFMsoWPHjqxatYpRo0bx1FNPUV5ebu0sNWqO2FYE6RjWSmhoKHq9npKSEgBKS0vJz88nLCzMyjkzr82bN5OU
lMSUKVMA+y33t99+S1FREa+99hpLly4F4LXXXmP//v24u7uTnZ1t2jc7O9vmywsQFBQEgFr9Z8hrtVq0Wq1dlvmPP/4gLy+PLl26AMYK8fTp05SVldnld/oqHx+fm36e1aXZuj179rB9+3bmz59vGlWw5/I2JvZaTzQk+a7WnqO0U8ypoqLC9OMwQL9+/UhNTSUtLU2+d9VwxLYiSMewVnx9fYmKimL37t0A7N27l65du5oa2/bg/fffJzk5mcmTJwPGGazstdxjx44lISGBOXPmmMo7Z84coqOjefTRR9m1axcAJ0+exMnJibvuusua2TWLoKAgunXrxs8//wwYn7nLzMykS5cudlnm8PBwSktLSUtLA4zlLSgoIDg42C6/09eq7vO0x8/6u+++45tvvmHOnDk4OTkRFxdnSrPH8jY29lpPNDT5rtbMkdop5pSamsprr71mep+UlER5eTmhoaHyvauGI7YVAVSKoijWzoQtSEpKIj4+nsDAQDIyMipNiW7r9u/fz4gRI/Dz8zNt8/DwYNu2bXZf7g8//JBPP/2UJ598kuHDhxMYGEhsbCze3t6kpqYyYcIEOnXqZO2smkVSUhIJCQmEhISQkpLCsGHDiIqKMk37bW9l3rp1K5s3b6Zly5acOXOG/v37M3jwYLv5Th86dIgvvviC//znPzzyyCP079+f6Ojoaj9PW/6sb1Te22+/nYceeggvLy/TSGFBQQGHDx8GbLu8tsReYqoh3ErcCsdtp5hDYWEhM2bMwM3NDS8vL06dOsWwYcO4//775XtXC47WVpSOoRBCCCGEEEI4OLmVVAghhBBCCCEcnHQMhRBCCCGEEMLBScdQCCGEEEIIIRycdAyFEEIIIYQQwsFJx1AIIYQQQgghHJx0DIVwYCtXriQqKorExERrZ0UIIYRoNKR+FI7I2doZEEJYz/jx40lKSrJ2NoQQQohGRepH4YhkxFAIIYQQQgghHJyMGArRyFy4cIFZs2ZRWlqKwWBgypQpnDx5kjfffJP27duj0+lISkrCycmJBQsW0LRpUwAOHz5MQkICiqKgUqn4xz/+QadOnQDIzs5m9uzZZGdnU15eTufOnXn55ZdxcXEBICcnh6lTp3L8+HE6dOjAggULADh9+jSzZ88GoLy8nMGDBzNo0CArXBUhhBCOTupHISxMEUI0GmVlZUr//v2VDz/8UFEURTl+/LjSo0cPpaCgQFmxYoXSrVs3JT09XVEURVm9erUydOhQRVEUJT8/X+nRo4eyb98+RVEU5ZdfflF69Oih5OXlKYqiKM8++6ySmJioKIqilJSUKI899phy8eJFRVEUZdq0acrAgQOVkpISpbi4WOnRo4dy8OBBRVEUZeLEicqWLVsURVGUjIwMZdSoUQ10JYQQQog/Sf0ohOXJraRCNCK//vorFy9eZODAgQC0a9eOoKAgdu7cCUD37t1p0qQJAAMHDuTQoUOkpKTw3Xff4eHhQWRkJAARERF4e3vz7bffkp6ezg8//GD6JVOr1TJ37lz8/PxM542MjESr1aLT6WjRooXpuQpvb2+++uorkpKSCAwMlIfwhRBCWIXUj0JYntxKKkQjkp6eDsBzzz1n2lZaWkpBQQFgrIiu8vHxASAzM5O0tLRKFRmAn58faWlppKWlmd5fdccdd1Ta18PDw/Raq9VSVlYGQExMDG+//TZPP/00TZo0YeLEidx99931LqcQQghRF1I/CmF50jEUohEJDg5Go9Hw73//27RNr9ejVqtZu3Ytubm5pu05OTkABAYGEhISwqVLlyod69KlSwQHBxMcHGx6HxoaCsDFixfx8vKqVJHeSH5+PmPHjmXMmDF89tlnjBkzhr179+Lm5maW8gohhBC1IfWjEJYnt5IK0Yh07tyZkJAQtm/fDhgfaB83bhznzp0D4NChQ2RkZADw6aef0qVLF0JDQ+nXrx9FRUX88ssvABw4cIC8vDz+8pe/EBQURFRUFJ988glg/IV10qRJ
pl89qzN9+nSysrJQqVR0796d8vJyVCqVBUouhBBC3JzUj0JYnkpRFMXamRBC/OnChQvMnj2bkpISDAYDgwYNYvDgwSQmJnL69GlcXFw4e/ZslVnXjh49yoIFCzAYDNXOulZRUcEzzzxD//79Wb9+PW+99RY6nY7p06fz+++/s2HDBgICAoiNjSUtLY33338frVZLYWEhzz//PAMGDLDm5RFCCOGgpH4UwrKkYyiEjUhMTCQ5OZn58+dbOytCCCFEoyH1oxDmIbeSCiGEEEIIIYSDk46hEDbg/fffZ/PmzezevZvVq1dbOztCCCFEoyD1oxDmI7eSCiGEEEIIIYSDkxFDIYQQQgghhHBw0jEUQgghhBBCCAcnHUMhhBBCCCGEcHDSMRRCCCGEEEIIBycdQyGEEEIIIYRwcNIxFEIIIYQQQggH9/8BKQMmAaAktqUAAAAASUVORK5CYII=
" />
</div>
</div>
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA4YAAAEpCAYAAADPpjwfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzdd3wUdf748dds3/TeeyC0EHoRRKqFIh5iP7zTn+dZznZ69lOxnvXw1LP3cnp6iiDwBVF6b5LQIQkhpPey2exmy/z+WFyMtAVCEsz7+XjwCJudmX3Pezf7mfd8PvMZRVVVFSGEEEIIIYQQXZamowMQQgghhBBCCNGxpDAUQgghhBBCiC5OCkMhhBBCCCGE6OKkMBRCCCGEEEKILk4KQyGEEEIIIYTo4qQwFEIIIYQQQoguTgpDIYQQQgghhOjipDAUQgghhBBCiC5OCkMhhBBCCCGE6OJ0HR2AEEIIcTreeOMN5s+fj1arpaysjMDAQCZOnMjtt99OUVERt9xyCwUFBQBcc8011NTUsG7dOvz9/XnggQdIT0/n1VdfZfny5SQlJfHkk0+SlZUFQF5eHs8++yyFhYUA1NbW0qdPH2677TYGDRoEwDfffMMLL7xAeHg4AI2NjZSVlQEwffp0nnnmGQC+/vprPvvsMyoqKnC73SQlJXHDDTdw/vnnA3DHHXewYsUKmpubGTBgAIMGDWLp0qWUl5czbtw4nnjiCcxmc7vlVQghRNciPYZCCCHOaosXL+bWW29lzpw5LFmyhMDAQN5++21mzZpFWloa7733nnfZ/fv3M2vWLG6//XaKi4u5//77mT9/Pi+//DJXXXUVu3fv5oEHHvAuX1hYSHV1NXPmzGHRokU88sgjrFmzhptvvpny8nLvctdccw3z5s1j7ty5JCUlAaDX65kxYwYAr7/+Og899BB1dXUsWrSIefPmsX//fm677Ta+/vprAF555RX69u0LQHZ2NtOnT2f27NkYjUbmzp3Lf/7znzOeSyGEEF2XFIZCCCHOaq+++ioTJ04EwN/fn3PPPReARYsWHbHs+PHj0Wg0pKSkAGC1WsnMzATw/i4vLw+LxQLAsGHDePfddzGZTABceOGFADQ0NLB69WoAxo4dyxVXXAHAhx9+yIYNGwC49dZb6d27N1arlbfeeguA4cOH4+/vT1hYmLdXctasWaiq2irObt26kZaWhtFoJDk5GYCtW7eeTpqEEEKI45KhpEIIIc5qlZWVPPfcc+Tm5gJQX18PQElJyRHL+vv7A57evOP9rr6+noCAAHQ6HV988QXLli2jtrYWPz8/7zI/bz80NBSAffv2MWvWLACysrK46aabAMjNzcVmswEQHBzsXT8kJMQbf1lZGbGxsd7nfrmcTudpquvq6nxNiRBCCHHSpMdQCCHEWau0tJTrr7+eRYsWcemll7JgwQKuuuoqgCN64U7Gz+s+//zz/Otf/6KiooIvv/ySOXPmHLEMgMPh4L777qOlpQWTycRzzz2HVqs95ddXFOWU1xVCCCFOhfQYCiGEOGvl5ORgtVoBGDduHAAul6vNtr9+/XoABg4cSGho6DG3/dprr7Fz504A/va3v5GWlkZ1dTVFRUV0794dk8mEzWbz9maCZyIbgMjISGJiYtosZiGEEOJUSI+hEEKIs9Yvh1/m5+fjcrnYtGlTm23/54KtoKAAt9vtLRR/aevWrbzzzjsAjBgxwjvhTG5uLp9//jl+fn7cfPPNAKxbt46mpiZqamrIyckB4O6775YeQiGEEB1OO3PmzJkdHYQQQghxKqKjo3E6neTm5rJmzRpKSkrw9/f39t698847LF68mIaGBgC2b99OYmIiM2fOpLGxEYANGzYQGxvL888/T1NTEwArVqxg5MiRnHPOOWzbto28vDw2bdpEUlISixcvBjyFn9vtZtWqVd7Xs1gsfP7553z00Ucs
WLCA5ORkJkyYwJAhQ4iNjWXfvn289dZbfPbZZ6SkpPD3v/+dKVOmAJ7bVWzcuBGn00lNTQ319fWsX7+exYsXe3+3c+dO70Q7QgghRFtS1NO5CEMIIYQQQgghxFlPhpIKIYQQQgghRBcnhaEQQgghhBBCdHFSGAohhBBCCCFEFyeFoRBCCCGEEEJ0cVIYCiGEEEIIIUQXJ4WhEJ3A6tWrueSSS+jRowczZszg6quvZvLkyXz88ccdGldOTg6XXHKJ98bhQgghREdavHixt7387rvvjnjeYrEwaNAgxo4dyyuvvNIBEQpx9pLCUIhOYOTIkTz00EMAfPjhh3z++ee8/PLLPP/886xevfq46/bo0YOioqIzEldWVpY3LiGEEKKjnX/++Tz00EOYTCY++eSTI57/9ttvcTqdTJ06lTvuuOOMxTFu3DjWr19/xrYvREeQwlCITqp79+5kZGSwcuXKjg5FCCGE6FQmTZrE9u3bycnJ8f5OVVVWr15N3759OzAyIc5euo4OQAhxbBaLhSVLlvCf//yHiy++mKeffpp3332Xd955h2nTppGbmwvA3XffjdFo5MUXXyQqKor33nuP77//Hq1WS0pKCg8//DABAQHMnDmTefPmMWPGDHJzc9m+fTvTp0/n9ttv5/3332fhwoUYjUZMJhN33XUXffr08cby7rvvsmzZMurr63nllVdITU3tqLQIIYTo4uLi4hg/fjwff/wxL774IgCrVq1i5MiRLFq0CPC0W2+99RYBAQF8+OGHvPfeeyxatIgbb7yRwMBA3nrrLfr160dgYCDbtm0jIiKC1157DaPRCMDKlSt57bXX0Ov1BAQE8PjjjxMdHc2DDz5IZWUlzzzzDEFBQdx///1kZmZ2WC6EaDOqEKJTWLdunZqRkaE6HA7v4169eqlbt25V+/Tpo5aWlqqqqqp2u139y1/+4l0vIyNDPXjwoPfx7Nmz1UmTJqlWq1VVVVV96KGH1AcffND7/IwZM9Trr79edTqdal5envrll1+qc+fOVSdPnuxd591331VfeeUVbxx9+vRRN27cqKqqqj722GPqI488cgYzIYQQQhzbunXr1FdeeUVdv3692qdPH7WiokJVVVW9++67VYvFos6YMUP95z//qaqqqs6fP18dM2aM2tjYqL7//vvqsmXLvNt55ZVX1HPPPVetq6tTXS6XOnnyZPW7775TVVVVCwsL1f79+6t5eXmqqqrqp59+qv7xj3/0rjt27Fh13bp17bTHQrQPGUoqRCdz3XXXcfXVV/Pqq6/yr3/9i379+jFy5Ejmzp0LwPLlyznvvPOOuf6cOXOYOHEiZrMZgEsvvZS5c+ficrm8y4wePRqtVktaWhqXX34533zzDRdddJF3nSuuuIILL7zQu7yfnx+DBw8GoGfPnmfsmkYhhBDCV0OHDiU9PZ0vvviCwsJCIiMj8ff3b7XMpEmT6N27N/feey8FBQWMHj261fP9+vUjODgYjUZD9+7dve3bvHnzyMzMJC0tDYApU6awdu1aKioq2mfnhOgAMpRUiE7mww8/RKdr/ad5ySWX8O9//5s///nP/N///R8zZ8485vplZWWEhYV5H4eFheFwOKiqqiI6OhqAwMDA464TGBjYapmAgADv/w0GAw6H45T2TQghhGhLM2bM4F//+hd1dXX84Q9/OOoyDz30EOPHj+f1118/4rlftm9Go9HbvpWVlZGXl8e1117rfT4+Pp7q6mqioqLaeC+E6Bykx1CIs8D48eOpqKhg5cqVKIpCUFDQMZeNjY2lpqbG+7impga9Xk9ERITP61itVvLz89smeCGEEOIMmTp1Kg6Hg+LiYpKTk4+6zOzZs/n973/PP/7xD5qbm33abmxsLJmZmXzyySfef7NnzyYjI6MtwxeiU5HCUIizgNFo5KKLLuLBBx9k0qRJrZ7z8/PDZrMxZ84cFi5cyLRp01i4cCE2mw3wTN09depUtFrtMbf/8zo/N5gfffSRzIYq
hBCi0zMajTzzzDPcddddR31+9+7dNDU18cgjj5Camso///lPn7Y7efJksrOzKS4uBqC6upprr70Wt9sNgL+/PzabjXXr1vHRRx+1zc4I0cG0M483Jk0I0S5Wr17Ns88+S1VVFRs3biQhIYH4+PhWywQFBfHdd9/x+OOPtyryLBYLb731Fnv37uWGG25gwIABNDc389JLL/HNN98QFBTEww8/jMFg4Pnnn2f58uXs2rULh8PBgAEDAM+9EG02Gy+++CJz5sxBq9Vyxx13kJ+fzyOPPEJxcTFlZWWEh4fz7LPPUlhYSF1dHSNHjmzXPAkhhOjafm4vs7OzcTgcDBw4kLS0NO+omPvuu4+NGzdSUFBAU1MTzz77LHq9nilTpvD222+zbNky8vLycDgcfPDBB+Tn52M2m9m5cydfffUV+/btIywsjCFDhtCrVy+eeuop5syZw6JFi3jooYe8bbPb7ebNN99k8+bNXHfddYSHh3dkWoRoE4qqqmpHByGEOLG8vDw+/fRTHnvssY4ORQghhBBC/MbIUFIhOrl58+bhdDr55ptvmDZtWkeHI4QQQgghfoOkMBSik9uxYwfTpk2jtraWrKysjg5HCCGEEEL8BslQUiGEEEIIIYTo4qTHUAghhBBCCCG6OCkMhRBCCCGEEKKL03V0AJ1RSUkJTz31FBEREVRUVPDwww+TmJjYahlVVXnhhReorq7GYrEwfvx4Lr300g6KuPPwJXdvv/02ubm5hIWFkZ+fz7XXXsuoUaM6KOLOw5fc/ayoqIipU6fy97//XT53h/iav/nz57N582YA9u3bx1133cWgQYPaO9xOxZfcVVRU8OijjxIXF4fFYiE8PJz77rsPRVE6KGrRXk63TexK7eXp5uqbb77x3l4BICYmhq+//rrd96M9+PqdvXnzZmbOnMno0aP529/+1uq5Dz74gOzsbBRFoWfPntx0003tFX67O918rV+/nltvvRWTyeT93erVq9sl9vZ2useiXek76wiqOMKNN96oLl68WFVVVV26dKn6xz/+8YhlFixYoN56662qqqqqzWZTx44dqx48eLA9w+yUfMndjBkzVIfDoaqqqu7du1ft37+/arPZ2jPMTsmX3KmqqrrdbvXOO+9Up0yZon799dftGGHn5kv+duzYoT711FPex8XFxWpZWVl7hdhp+ZK7J598Un3ssce8jydOnKguXbq0fQIUHep028Su1F6ebq6+/vprdd26de0Wb0fyJVe5ubnqu+++q95zzz3qCy+80Oq57Oxsddq0aarL5VLdbrc6ffp0dePGje0Reoc43XytW7euyxwznO6xaFf6zvo1GUr6K7W1taxatcp71mDEiBFs2rSJ8vLyVsvNmTOH8847DwCj0cjQoUOZP39+u8fbmfiau48++gidztNZnZCQgNVqpbGxsd3j7Ux8zR3Ap59+ysSJEwkJCWnvMDstX/P38ccfExMTwz//+U8ef/xxfvrpJ6Kjozsi5E7D19xFRUVRU1MDgM1mw2KxSG9hF9AWbWJXaS/b6vjhf//7H8899xyPP/44e/bsab8daEe+5io9PZ0bbrjBe8zwS3PnzuXcc89Fo9GgKAqjR49m7ty57RJ/e2uLfAH88MMPPPvss8ycOZONGzee8bg7Qlsci3aV76yjkcLwV0pKSvDz88NoNAJgMBgICgqiuLi41XLFxcWEh4d7H4eHh1NUVNSusXY2vuZOozn8sVu2bBnnn38+ERER7RprZ+Nr7goKCti1axcXXnhhR4TZafmav7y8PFasWMGdd97Jgw8+yPvvv88PP/zQESF3Gr7m7k9/+hN6vZ5bbrmFP/7xj0yfPp3Ro0d3RMiiHbVFm9hV2su2yFVGRgYzZszg/vvv5/rrr+f6668/6gnCs52vuTqeoqIiwsLCvI9/q58raJt8xcXFcdVVV/HAAw9w5513cu+997J79+4zFXKHaYtj0a7ynXU0UhiKDlNSUsKXX37Jk08+2dGhnBXc
bjcvvvgi9957b0eHctZqampi/PjxaLVaDAYDF1xwAQsWLOjosM4Ks2bNwt/fnzfeeINPPvmErVu3kpOT09FhCfGbkpmZSb9+/QBISkqiZ8+eLFu2rGODEr8JiYmJ3l6w0NBQRo0aJe0fciz6a1IY/kpcXBxWqxW73Q5AS0sLDQ0NxMfHt1ouPj6e6upq7+Pq6uojlulqfM0deM7GPP3007z44ouEhoa2d6idji+527NnD3a7nVmzZvHoo4+yf/9+Zs+ezQsvvNBRYXcavn72YmJi0Gq13sd6vd67Tlfla+6WLFniPagwGAz07t2br776qt3jFe2rLdrErtJetkWu9u/f32pZvV6PzWY7w5G3v5M5XjiWhIQE7/B2+O1+rqBt8lVQUNDqsXy2jn0s2lW+s45GCsNfCQ0NZeTIkaxcuRKANWvWMHDgQKKjo/nhhx+844+nTp3KihUrALDb7WzYsIHJkyd3WNydga+5Kyws5B//+AdPP/004eHhLFiwgC1btnRk6B3Ol9z16tWLd955hyeeeIInnniC1NRUpk2bJj2I+P7ZmzhxIuvXr/eut2nTJkaOHNkhMXcWvuYuJSWF3Nxc73p5eXnExMR0SMyi/bRFm9hV2su2yNVTTz1FfX09AFarle3btzN06NAO2Jszy9dcHc/UqVNZtWoVbrcbVVVZvnw5U6dOPdOhd4i2yNebb77p/Q53uVxs3LiR4cOHn9G4O0JbHIt2le+so1FUVVU7OojOpqioiKeffprIyEgqKip48MEHSU5OZsqUKcycOZPBgwejqirPPfcctbW1NDY2Mm7cOC677LKODr3D+ZK7Cy64gNraWgwGA+CZyOL1119n2LBhHRx9x/IldwBOp5NnnnmGxYsXk56ezuTJk7n88ss7OPqO50v+XC4Xs2bNoqGhAbfbTXBwMPfcc0+raw26Il9yV1xczJNPPklcXBxNTU243W4ef/xx/Pz8Ojp8cYadbpvYldrL083Vxx9/zJo1a0hOTubgwYNccMEF/O53v+vgvTozfMmV2+3mqaeeYu3atZjNZkaMGNHqFgzvvfceOTk5KIpCjx49uOWWWzpwj86s083X/PnzmT17Nunp6ZSVlZGZmcmNN97YwXt1ZpzusWhX+s76NSkMhRBCCCGEEKKL69qnyYUQQgghhBBCSGEohBBCCCGEEF2dFIZCCCGEEEII0cVJYSiEEEIIIYQQXZwUhj5qaGjg1VdfpaGhoaNDOStJ/k6d5O70SP5OneRO+Eo+K76TXJ0cyZfvJFcnR/J1JCkMfdTQ0MBrr70mH55TJPk7dZK70yP5O3WSO+Er+az4TnJ1ciRfvpNcnRzJ15GkMBRCCCGEEEKILk4KQyGEEEIIIYTo4qQwFEIIIYQQQoguTgpDHxkMBgYOHIjBYOjoUM5Kkr9TJ7k7PZK/Uye5E76Sz4rvJFcnR/LlO8nVyZF8HUlRVVXt6CCEEEIIIYQQQnQcXUcH0J7q66243adeB4eG+lNb29SGEbWfzhr7m2/quflmx3GX6ayx++Jsjf1sjRt++7G/ufXf3Nz/Lyfclqng39hSTrxcW+lMeddoFIKD/To6jLNOV2sj2ypeX/8mT8SXv9mumuO2cqJjjs4Wry/Otpgl3jPveDGfqH3sUoWh0+k+rUbv522crTpj7DU1qk9xdcbYfXW2xn62xg2/7dhrrLU+7Z/b7ttybamz5F2jUTo6hLNSV2wj2yJeX/8mT8TXv9mumOO24ssxR2eK11dnW8wS75l3rJhP1D7KNYZCCCGEEEII0cVJYSiEEEIIIYQQXZwUhkIIIYQQQgjRxUlhKIQQQgghhBBdXJeafEYIIXxVbq3ET2cm0BDQ0aF4tbgcNDnOrtnRhBCiIzicLqx2F812J812Jw6nG6croqPDEqJTk8JQCHHWcrid6DVt8zVmc9rZXL6V5KBEEgLj+LFwBevLNjMtfTJjEke2WrbCWolRayTYGNQmr+2r97Z/wvKiNRxoOEhy
UOIJl/9633eUNVVwc9Z1aDXadohQCCHaj6qq7Cyo5cfNRdRa7N4isNnuxOk6cobdvet6onxaSmZqGJlp4SRHB8osxkL8ghSGQoizjsvt4tu8BSw9uIpb+/0/eof3aPW8zWmjqrmGWP/o4xZEpU3lNDmsdAtJZdGBJXx/YCnDYwdzba8rSA9OYXXJeqL9IgHYXL6VEGMI6SEp5FTt5Lu8hYxPGs3U9ItabbPF5UCraHwqxJocVn6qyKFnWAYR5rAjnttff4BuIWmYdEZsTjtNDisA8QGxtLgc5FTtYHB0/2NuPz0kle1Vu1h0YAmTUs8/6jKqqmJxNOGv90OjnPrVBW732TedtxDi7LWnsJbZK/ez92DdUZ/XahTMRh1+Rh1mow4U2AvsK6pnX1E9s1fuJ8Csp3dKKH1Sw8hMDScyMvCMxauqKoXlFnLyqti+vwZ/k57fjUolKfrMvaYQJ0sKQyHEMdXZ61lfupkAvT8j44cdd9kWVwsFDYV0C0k7boGxomgtG8q2MCZhBIOi+6MoCqqqoiies7Y2px2j1uB9bHe1YNQa2FW9l/VlW8gITWdQdD921+xDRaVHaDccbifvb/+MuIAY/l/kZawp3cjX+74jI7Qbdw7481HjyK8/wKwtb9AtJI07B/yZc2KHkFu3n95hniJzWOwgUoKTiDJHYHPa+XzPN4QYg3lo6F8ZEJnFt7kLUPGckW5sseCnM6PVaHln28eUNpXz+16X0Ssso9Vrbq3YxoritQyI6svIuGG8kf0++xsKuSRtIqMTR7K+6CeKKis5L+EcSpvKeSPnA1KDkrhzwE2YdEbuGfQXmloq0CpaPtjxCVsrt1PTXMv4pPMoaSrH4rDQI7Qbu2r2EVq3H1OokWaXjfiAWFxuF40OCyHGYFxuF1ZnM4GGAH48uILZufOJ8Y/mkWH3sL++kA93fk6sfzQ3Z10HgFt1e99TS4tnKGuAwd+7X83OZu5Z9E9GxgxnVPxw77I/v69f7Z1Dfv0B7h9yx3E/Q0KIzs/tVtm+v4bC8kZMBi1+Jl2rAszP5Pm/yahDVVVaHC7sDhf2Fhe2X/20t7gIDjCQHheM0eDbqIa8knq+XZHPjoJaAPxNOi4alkSv5DDMRq03Dr1O421HfqYr0XHRND079lezfX8NVfU2NuyqYMOuCgCSYgLpmRhCZmoYGYkhGPSnN9Ki2e5kZ0ENOXnV5ORXU29pafV8dl4Vo/vHM21UKoF+htN6LSHaghSGQohjKm0qZ27+QqL9IhkRN5Tt1bvYXJ7NpNTziTCHsatmL/tq87kkfSJuVeWVn97hnNghXNNzeqtC78fC5RxoLOKWrOvJiuzNN7nzWFO6kcExA9hRvZtvcudzVcY04gNieXXr26QEJXFx2oV8kzufYksp9w+5gyi/CDaWb6HcWs6IuCHclPVH6uwNaDValhYuJ6dqB27VBUC3kFT0Gj2Do/sBUN1cg7/eD5PO5N23AL0fSYEJGA4NRY3yi+CeQbe22v+fewtdqotzYoewu2YfJZYyEgLjeGDInYSZQlBVlfd3/AedouXWfv+PMYnn8nr2exRbSskISee1re+ionLngJvoEdadT3Z9SWOLhVHx53Bhyjg+2/0/xiSeS5OjiZfWvI1Ja2Rk3FCSAxMA6BXegz21ueTXH2BC0mjMOhOKopAV0Yfcuv1kRvRiT20u/85+j1BjCI8Ov5f/7P4f5zfnYoyt44lzHkCjaPh413+psFZx7+DbWFq0iu8PLOXRYfdyXvw5rChaS7gpFAAVN1XN1USawwHIry/gy71z+FPmteTV7eeLPd+QGpzMHQP+zMritXQLSSO3bj/FDWVsYAuj4oeTXbmdefnfc1n3qfQI60Zji4XCxiIsLU2tCkpxWElJCU899RQRERFUVFTw8MMPk5jYeriwqqq88MILVFdXY7FYGD9+PJdeeikAFRUVPProo8TFxWGxWAgPD+e+++474sBYiFNlaXawMqeEpVuKqaq3+bSO
RgH3kSM6j6DVKKTEBtIjMZSeSSGkxwd7evl+4UBZI9+uzCc7rxoAs1HLBUOSOH9wIn4m3w5n9ToNg3pEMqhHJKqqUlHbzPb9NWzPr2Z3YR2FZY0UljXy/caD6LQaeiQG0yc1nMzUMOIj/Y/79+Rwuqi1tFDXaGd/aQM5edXsPViH6xcJCA000jctnL5p4ew5WMuSzcUs+6mYDTvL+d2oVMYOjEerkXkhRcdpt8LQl0YPYPPmzcycOZPRo0fzt7/9rdVzH3zwAdnZ2SiKQs+ePbnpppvaK3whOpUmhxWn23nENW6NLZZTGhJY1VxDi6uFuIAYKqyVfLb2Sy5PnUZGSDoj44bSN6I3btXNN/vmUdFcxeDo/oSbQvloxxc0Oa0MiRlAhDkcrUZLYWMRdpcdjaJBr9GjKArLi9fQ5LBS0HCQ+IAYLut+sbfoKmg4SFlTOcuKVjMmYQQlTeVYHc2cGz+crZXbcLidNDttBBuDuDLjdxi1RgAizOFEHCpe+ob3It4/Fn+9HwBJgQm8eN7jaBQNjS0WXvnpbcw6E7f2v4EgQyB19nqi/CK5e+At3l6/4/HX+zG9+8WtfpcQGAd4elWrm6upszdQZq2gZ2g3ft/zcobHDkKjaChtKqfRYSGvvoDkwASu7DENreI5C903ojf/GPkIiqJg0IYwKnkowZoQHG4nJp2RF897HLPOzHf5i1h0YAlBxsNDjobFDqJfZCYmnZEQYzAR5nDOjRuGQavn2l5XoOzZRvdDMeyrzSO3bj86RUudvZ6C+kKaHFa2Ve/inNjBPDHiAVxuT1GdEBDPY8PvRafxnO3/Lm8RBxuLWVW8jpTgJFrcDsYnnUdji4UVRWv5Ln8RM4ffR0JEJAaH57NX0HCQkqYyNpRvoUdYNyalTmBi6gTMvyjMRWszZ87kiiuuYMKECSxbtoxHHnmEDz/8sNUyCxcu5MCBA/z73//GbrczceJEhg4dSkJCAm+//TYxMTE8+uijAEyaNIlhw4YxZsyY9t8ZcVJcNcW0bJmDQbcdd1wDGnP7Xrt8IvklDSzdUsT6XRU4XZ4h4xHBJgZ0j8TpdtNsc2I9dF2f96fNia3FhVsFnVbBqNdiMmgxGnSH/6/XYtBrKK9tprC8kbziBvKKG1iw7gAaRRhAYfgAACAASURBVCE5JpAeSSGkxgaxYVc5m/dUAmDUa5kwOIELhyYRYNaf8n4pikJ0mB/RYX6MH5SAw+mmqqmFNVuL2Z5fw4HyRnYU1LKjoJYvl0JwgIHMlDCSYgKxWB3UWuzUNdqps9ipbbTTZHMe5TWgW0Iw/dI9xWBiVIC3uBzUI5LR/eP54oe97Cio5T8/7GP51hKuntCd3ilhR2xLiPbQboWhL41eXl4eW7dupUePHkesn5OTw3fffcf//vc/FEXh8ssvZ9CgQQwePLid9kCIM0tVVSqslUT7R3l/53A50Gs9DV+xpZQYvyg0iobXtr5LSVMZf8qcQfeQdJyqE7fq5qXNr5MenMLve17m82QjZU0VvLzlTbQaLfcOvo0NZVtYU7gJjVPL1T2nc03Py7zL3tLvelYUrSUlKAmtRsu4pFG4VTcBen+MWgO397+RpMB4KpureSP7A67qMY3MiF5cmTGNIEMgqcFJAJwbP9y7zSHR/UkOTPBeS3dr1v8jwhxOuDmU63pfTaQ53FtQnJcw4qj7EO0f1SpvALpDPYF2lx1FUVAUDQaNge8PLGVO3v9xabcpjE86z6ccHU+IMZi/D7uHndV7CDIEotVoGRE3xPv8TVl/xF/vR6Q5AkVRGBozsNX6vzwDffvw66msbPQ+NuvMANTYaokyRzAybhgbSpZ7nzfpPEWyn97M4+fcD4C7uYHU3K2EBKTQfOgEQYQ5nEvSJ9I/si8GrZ4bMmewu2YfBu3hoUs/f14MWj1Rh4p2gD9n/YElhSu5KGU8Wo2Wvw36C6nByVRYq6hsruKS9En46f0YGtnfG/vQmIF0C0mle0g6ADH+0aea
3i6htraWVatW8eqrrwIwYsQIbrvtNsrLy4mOPpy7OXPmMHbsWACMRiNDhw5l/vz53HTTTURFRbF9+3YAbDYbFotFegs7ObetkZZNs3HsWgqqijskl6bP78WQeT6GfhNRjB3Xu97icLF+VzlLtxRTUOb5u1aArPRwxg6Ip29a+AknbXG7VcLCA6irPfFMylabk9ziOvYU1rHnYB0FpY3sL21gf2mDdxm9TsPYAfFMGp5MkH/bD7vU6zRkdYskNtjE9NHpNFhb2FlQw478GrYX1FBvaWH19jJWby876vpajUJIgIGQACNRoX70Tfdct3i84jU+wp+7r+zP1n1VfLFkH8VVTbz4xVYGZkRy5bhuRIaY23w/hTiedikMfW300tPTSU9P54EHHjhiG3PnzuXcc89Fc6iLffTo0cydO1cKQ3HWqrc38N+93xJuCmV694tZV7qJ/+z5mimpF3B+8hje2fYJeXX7eWLEgwC8tPnfDIkZyFUZ0zg3bhj/2fM1Rq2Bj3f9l2JLKecnjaahpZESSykOt4Pv8heRXbWdC5LGck7ckFbXif1SuCmUSL8IDBo9Rq2BWP9oxqScw4io4UcsG+UXyWUZU72PL0oZ3+r5biGpABTUF9LQ0sjmimwyI3ox6NCQzqOJ8otsVYj0COvm/X9mRC8fs3lsEeZw7hn0F1RUbyEFtOlBs0FroH9U36M+lxqcDIDqdmLfMh/F6I++zzgURYOqqqj15SjB0SiKgsvaQMv2H3BbqtGnD0cbmYJqszDVHYouOgXdoZ5G1WHDWbAFXWIWiqn17TQUvQnH3tU4zBtxRe5DE5aAecO39MpfjyuxH4y/GUVR6BWecUSsR2NyurhACceduw7VP5TksARc5bn47VvNPbYwYtUjJ06I9Y8m9jjFoNtaD24nmoBwn2L4rSspKcHPzw+j0fP5NBgMBAUFUVxc3KqNLC4uJjz8cM7Cw8MpKioC4E9/+hP33nsvt9xyCzU1NUyfPp3Ro0e3744In7mqCrDOex5arKBo0Pcag6a5FurstGydhza6G7rkY08sdaZU19tYsqWIFdkl3h4wf5OOUf3iGNM/jqhQP5+3pdEo6HW+jV7xM+nISo8gK91zO4lmu5O84nr2HKwjv6SB+Ah/Jg5PJjTQeIIttZ0gPwPDe8cwvHcMqqpSXNXE9vwaymutBPsbCAk0EhJgJDTASGigkQA/PZpTaFcURWFARiSZaWF8v/Eg89YcYMveSnLyqhneJ5oeiSF0TwwhMth00u2Ww+mios5GsL/htHpXRdfRLoWhr43e8RQVFTF8+OED1fDwcH766aczEq8QbcXtdreaWGVX9V7WlW3ij72vwq26ya7cjl6jY2LKBKptNaiqSqgpxDsE0upsZn/9AYKNQQQbgzzX96VMYETcUNJCUgg2BFJjq6XJ0UT3kDTuGnAToaYQTDoTdpedCmsVhkM9jrNz53OwsZire1xKtH8USw6uRKtoGZ0wgluyrseoNaDVaBkU3Z+LIke16rk6qX1W3dhcdq7vcw39IzPbLJen45f3IhydMJJz44bjp299JlZVVZwHDhVbWt8bULe1Dmfuelzl+9Al9UOXMRLlVwW46rTT/MPruAqzAXAWbsWQNZGWn+biri8nYMbLAFh2rMK+5jMAHDkL0YQl4K4rQ+f2HKA5x3qGcTUveAlX+T40Ecn4TX0Y3E5QNCh6E2j1aCKSoWEN1gUvopgCUC2ea3Jw2r0xuSoLcBZsxlW6B1dNEZqQGLRhCaDRoehNGIdd4dm/hgpsi1896r6HAA59ALokT+Hvri8DnRGNf+hxc2Zf8xnOwmyMg34HOj3Og9tRFAVdykB0KQM7tKfkbDVr1iz8/f156aWXaGlp4aabbiInJ4esrCyftxEefvr37DyTszqeCW0Rr7+/8aS3o4b1pCggGF1Id8InXIchMglyqjCOuQbLjlWEDzrXMzGX20VjzjICMkeh0RnaLOZWsagqO/fXMHdlHuu2lXqv
CeyWGMLkEamMGhCP8TQmYTnVeJMSQhl7/DnPTpq/P0RGHr+4PFa8UVFBDOgd27YB/cp1U0O4eHQ3Ppy/k2Wbi1iVU8qqnFLAc31i79RweqeG0Ss1jLS4YLRaT1sTFh5AeXUTBaUNHCht4EBZIwWlDZRWWbzvZ6CfgYSoAOIi/YmPDCAhKoD4yABiIwJ8LuDbSlf8nmhvpxpzl5p8pis2er/UGWP35UsaOib2elsDWkVLwHEOUt2qm+dXvcnguCzGpJ7D1tLtfLfnR0YlD2FC+igW7lvGgr1LuG3YdXQLT2HVzjVsLd/J8JT+DE8cyF3n/Im4wGiSQ6O4Pu4yJvQcQUJQLIqicMvw3xNsCiLUHAzAqylPUGapJNQcjElnJArPdShPR91LSUMZaWHJrWKbEfg7pvQZS5R/BCa9kZx1O6iy1hAQbCQs2I8NmzdT1FDKyG4DSI6MOmLfTifnV0VNPuV1AVTVjep0oNGf/Nlhp6UO/b4l2A7uxq/bIAL7j0fRHv6qU90u7KV5GEKCvdtvLthGzZJPsZfmEn7+9QQPneLTa7msDRS+fx+q0zPTnHP/Jox7l2FM7EVLaR664EiiLrkTl1WlpKkS1ezJqatoO81FnmF/puQ+RER4rjtx+Y+iOX8ruuBIGrevwF1TBCiYUvqiOh3EDh2H/6q1hJ97IRVfv4i76gDKli9xNdXjqCsj5rL70IfFof7+Yaxfr4U6O6rFjiEmncgpt6IPj0OjM+CyNVH81Zs4aw8PiXJX5OOuyAdAGxhG5JQbPPtkjKcqYwiK3oizroKWykI0Rj8Ceo9E6xdEQJ9R6II9++Xe9CXNeVvQBoRijElDFxqDPiQKVVXRBYYR0HskqstBhUGD02nHvv6/rd+7Az8R4qwjbMw1J/u2n9Xi4uKwWq3Y7XaMRiMtLS00NDQQHx/farn4+Hiqq6u9j6urq0lJSQFgyZIl/PWvfwU8J1979+7NV199dVKFYXW1BbcvM4UcQ2Rk4CmfUOoIbRVvU5P9hNtx1RzEvvYLDAOmoIvzjIQwTHkIxRhAPQpUNuJntWONjYUBl1NVZQHAsXc1tmXvUL3sCwyDLiHunAupqvVt4pcTcTjdbNhVzuJNByks97yeVqMwvFcU4wcnkB7naXsa6qyn/Bqd7TPR1GSgsrLlmM93lnj/cH4GY/vFsWN/DfuK6thXVE9to53VOSWszikBPNdapsYG4nSrFJY10uI88pZBiuK5HrTR6qDR2sKughp2FdQcsUxkiJm0uCC6xwfTPSGEuEj/U+r99EVnyXFto50120sZkRl73J7ozhLvyThezBqNctx6qF0KQ18bveNJSEigpubwh7m6uvqk1ves07UavV/qrLGf6EsaTj52l9vFf/fOBhSu6Tn9lGP7cu8c1pZs5Oqe0xkaM5BvcxcQZAhgdMJILI4mdBod+fUFbCnZRn51Ib39e1NSVc2uyn1o3Vr6BfVnR8VeyiyVZBfuJVSNJNoYw9S0JDLMPaivsdHdnAFOvPtnItB7QOBPCE4LVFoO77sOM422FhppnbNAwo6SIwV/Qmiqd9KEkwcG38nm8mwMdj8q9xfgamrk2m6XoLOZqbS1XrcjPy/O4p3YVn2E2lSH8Zyr0fcc3Wr4jGP3ClxVBzAOvxJF1/o6E8fe1dhWfODpQQOsuZupXjsH45DL0KUORlEU3PXlNH3zOIpWj77naFxVBbgOFWmKOZgmp56WykZUtwtUtVVRCeC21KDoDIeGcCroeo1FbaxEE9UNx/bvsZfmYS/NA0AfEH0ojwqGC+9BddhR9EZsS97CVZGHPvN8dAMu9r7nkZGBaMfdjgr4Z/0OV+keNGEJaALCUVWVqlobTU12msMz8Zs2E+vcp2jc+oMnMIMfNbXNaFye980voR+GkAEoGi36zPE0aHRQawfsqDYLhCahDYxBnz4UbWwP3A0VuGuKQVFQjP6/eP+NaMf8xfMSgP5QD7gbcAO1LUBlI5GRgTg0JjCYcVlq
seZubv3GKhoam0GXlIXmvJsxpwynJXshijkQXWIWqtuFM38jLTH9Tvuzd6KGr7MJDQ1l5MiRrFy5kgkTJrBmzRoGDhxIdHQ0P/zwA8OGDSMwMJCpU6cyf/58rrzySux2Oxs2bOAvf/G8NykpKeTm5jJhwgTAc81+375HH94s2oaqqjh2LcV5cBsM9fzObanGtuoT+HlSK1UFtxNXyS5QVVpUt7cw1JhOfPJN8QvxjByoKcK+4gMK1nyGJjINbWwPz7/odBTdyZ1Aq7fYWfqTZzbMBqsDgACznjED4hk7IL5dh2uKY0uMCiAxKoCLhiWhqiplNdZD92D0FIoVtc3sLjx8D8fQQCPxkf4kRAYQH+H5GRfhh16nRVVV6iwtlFU3UVZjpaym+dDPJqrqbVTUNlNR28y6HeUAmI060uOD6J4QQvf4YFLjgk6r17iz2X2gljfnbKfB6uCnfVU8NGPQCa+Z7SrapTD0tdE7nqlTpzJz5kzuuusuzyyHy5dz//33t0f44iyzoWwLq0s2MCjKM7yt2FLKxrKfmJp+EZ/s+pLM8J6kBadwsLGYrMg+3vVUVWV92WaKLaVc2m0K9fYGWtwO4gM8Q0e2Ve+irKmcKL9IFhcuo7q5lj9n/YHr+1zjmYFTqyczohc3ZM6gZ2h3AP56zp/YkLfDO3nKxWkXnva1baqq4ti2CHdTLcbB0zzDB31g1pm9k764HDbu2J2HUtKI6/xEtOFHzhB87Nd3HzFU8mSoh4YzKjojzuKduKsOYOg30RNXZQHN85/3Lmtf+SGuwmyMo65D4xeMu7kB24r3PU/qDJiGX9lq25qoVFDd6JIHoI3vTcuOH1Hry7D98G80kWmYx96IJiQWQ9ZFtGz+lpat8zwr6s0Y+k/CkHkBit7oeZ0fXsdVuhvl0MGbJjIVjV8Ijtw16HuNwzTC06tlOudq7+sb+ozDsXs5qsOGNiIVTWTK4dh+cT2d+eIHweU4orD9JUVv8g7RhCOvidRGpmAceS32FR94tjnuz2iCftHzqygY+086+rZNAZgntL41hyYgHOJOfE3n8T6/5rF/9vT2NlTgqj6Iu6ES1VIFGi2KwQy/mBBJl9QfXVLra6gMvcac8PV/qx577DGefvppVqxYQUVFBU8++SQAL7/8MjNnzmTw4MFMnDiRnJwc7r//fhobG7n11lu9s3s/9NBDPPnkkzzxxBM0NTURGBjI9ddf35G79JumOuzYlr9HVeEmGrQVh3/fYsNVuPXIFRQN+j7jMA6adlKvo0vog3b6EzjzNtCSPR939UFcpbtxle4GwDT+VvTpnqrUVVOMxi8YNBrUlmbcdit1tfVUV9VRX1tHtiOJoiobJVVNhFKPxR1AYlQQEwYnMLx3NHrdb+fA/7dGURRiw/2JDffnvH6eGbHrm1rYX9JAXEwQ/noFf9OxL4FQFIXQQM+1kL1+NeOpw+mmtLqJ3OJ69hXVk1tUR3WDne35NWzP93TI/Hw7kfEDExjaK/qsLaJUVWXRhoP8b1keblVFwTPr7vLsEsYOOLnOpt+qdhtK6kuj53a7eeqpp8jOzsZsNvPiiy96b1mRlZXF5MmT+etf/4qiKIwfP54hQ4Yc7yXFWaDWXs/CgrX0Ds8g6dB9206F3dVCfl0BMf5RDI8dTI2tlviAWBxuJ69nv0+dvZ6SpjJ2VO9mQ9kWwDNhyMzh93lv+WB1NvPV3jnYXHYyw3txY98/UNVcQ4TZ8yU6PvE8qpqrSQlKwul24VJdhBpDWsUdaAhgYNThoVsajYb0kBTv47aY8MSx/Xvs674AwFW8E/OFd6AJjDzBWqDam8Dgh6IoaEIT0ITE4q4rxfrtk2hjuqFa60BnQt9tOK7hFxx1Gy3bF2Pf8BWoKvruIzCd5znwVJ0toGhAo8FZ8BOO3ctQ9GZ0aYPRJfZDOTRk01VXgvXbJ6GlGQxmz09FQZvQB214Eu66EtDqMAyYiiYwAtvqT3Ae+Aln6W6MQ6/A0HsspjE3
Ylv2Do5tC9ElZeHYuwbj4GloAsLQhsSRfOe71Fg9hau+91gcu1fSsvlb3PWl3iLPMPAStHG9cO5dhWIOxpB1UatJXNSmGlyHhlWqh3pTXQdzcP38vK2x1bWjP1P0Jgx9Lzzhe6EoChynKPSVoedoFK0exWA+osjqKIqiQQmOQRMc09GhnFUSEhJ44403jvj9vHnzvP9XFOWok7OBZ5jpm2++ecbiE4e5Gyqwfv8KS901/JAcjstyeKilJiAM0wV3oKB4pvE89FMTEtf6xM1JUBQN+m7D0XcbTpi/SvmOLbhK9+Iq3Y029vAkUs0/voFaW9RqXSMQd+jfp7WXY1HNKIrK3eE/YNY40Sdmojfa0bb4gS7klOITHSPY30D/7hGnPcJHr9OQFB1IUnQg4wZ6jmdqGmyeQvFgPfuK6zhYYTl0O5GdzFm1n8nnpDC8TzQ67dlzz8Vmu5MPFuxi06Hbnkw+J5nEqADenLOD/y3LY2D3CIIDpLe83QpDXxo9jUbjvQfT0dxwww1nJDbRvhxuJ+VNFSQExlHVXM13+QvZXbOXOwfc1OpAe3vVLgxaAxER/XC4HHyTOw+TzsQFyWMw68xsq9qJ1dHMsNhB1NpqeS37XUbEDuWqHtOYnHa4sJnR83JWlaznut5XsaZ0IyHGINaXbkZz6J57y3au5sKUcWSEpnNx2kUYtHoyQj3T7P9cFAKtbkFw98BbqGqubjWpSXtwFvyEfa2nKFT8w3DXHKTpm5kYh17u7W1xlu1FtdSgjeuJxs/T0LttjTTPeRptXE+MI/+AotHgd+nj2FZ9jHPvKlzFO72vYa/MpyUtA/xTAM8MmIrehCN3rXdiFE8AvxjembsW++rPUMyBhyc6AZz5G1DMQZgn3YsmNA7b0nc9xSCK56dWh2HgJWhCPGdAtZGp+F/5PJoAT961MRnYVn2M62AO7gbPEBd9xkhcNQdx5Cykef4LoLpRG8oxX/wgiqKg9Q8G66Hp1TU6DL3Hou8+And1obf4UxQFXWwPdLFH3hoHQBuRgv9Vz6HarZ51XC24yvNw15ehS+qP9hc9gR1N3/3ot/AQQrQ9Z9EOrD++jmJvoiY+CoPOiF9EN1pcLby/4zMmpkwgOWXgiTd0irR+QehTBqFPGdTq9xZrM0X1biJVLW402FQ9NlWPQzGC3oTW5M9VA7sTHRtNrL8L9fsfcdeV4Nq/Cdf+TQBoIpLRJWahS8xCE93t0OQ3bnDaPCf/nC2oTge4WlCdLSimALSh0svyWxQWZGJokImhvTwTRDbbnWzcXcH8tQWU1zbz/oJdzF29n0nDkxnZN7bdJ685WSVVTfx79jZKq62YjVr+NLk3AzIiUVWV1dvK2JZfzX+X5PLnqX1OvLHfuC41+YxoPw6Xgw3lWxgROxRFUVhY8OOh3rR+zM6dx/rSzfyh95WEGbsxJPFcpqR6CrmVxWsZETsUrUbLf/d+S42tlgGp/0Sn0bG5Ipsmh5Xzk8YA8NHO/+JwtdArPAONoqVnaHeqbTUUW0pJCjrci9crPMM7Nf/oQ/fB6xPeE51Gx4L9i9lduw9FUcgITWdM4kif9k+r0R5x37y24CzZhW35+yimAHTxvdGlDEIblQZ4hmB6hlGqGAZfiqHPeJqXvIXrYE6r2Sad+zfj2LYIAE1ILIpfCO6mGtT6ctDqPcsazCg6A6bRN+DqORoczSh+objry3AWZmNK7oOlqgnV7aTpywfRBMfgKtsLgHH41eh7jQH18IXuan2552DBUo3iH4Yh6yJQ3Tj2rALVjWIKoCX7/3BX5qP4h+F/2ZOeIalafauhsJqQ1jO+aQIjMF/0V5wHfvJelwNgHHwpzgM/eV7XYMY46rrj9sYqeiPamO4n9V5o/EPhF7Nr+tIrK4T4bVJVFUv2fObnL6QhRMsMYz8uO+8PDLPVMDf3K34sXMm2ql3U2Op4YMidR7010JlSb7Hz0n+3UVQ1gbAgI5OHJxMfGUBchP+xb1FwxTO4Gypx
HszBeTAHV/Eu3FUHaKk6gCNvPf5XPgeAqyiH5oUvH/O1dWlDMY2/Re6Z+RtnNuo4r18cI/vGsH5nOfPWHKCsxsrHi/bw3ZoCLhqWxOh+cRh+dR2iy+2m3tJCVb2NmgYb1Q02IsMD6J0Y3G63z9i0u4L3FuzC3uIiPsKfv1zal5gwz21XFEVhxgUZ/P3d9azbWc7IvrH0SQ07wRZ/26QwFG1OVVVe2fo2+fUH0ChahsUM5PsDS7G7WsgI6YZW0aIoCpF+EYSbw7isu+e+eGtKNvLFntnYnHYmJI0mIySdyuZqaqx1GJUArsz4HRXWakw6I6qq0i0klTBTKKqqEuUXwe0DbjwyFpcT1dZ4xPT5P9/8fFziebhUN2MTzz2lfXXVFqM2VqK2NHsmGdAb0ZiD0USlea/D88xYqR51ggDV7UY5dG9Oz8Qp74PbhdpYSUvlflq2zkffZzzGYVeg6IyYxt2Mc/9GDAMuRlEUzBf91dMTZg7yblMb3c1zJrhkD+66UqjzTHWtBEZgnnSP51qvQxRFQfeLgkkbnog+bYg3dnf1QVRbI66mWgD0fS/EkHXkUEnjsCvQZ12E2lCBJiLFO2GLvs94T4+jMQD3oSFOptH/D8Xoj6+HEYqioP/VGXhFZ8A8/lZatsxBnzURbWicj1sTQohT4HJQn7eW9cEmHBoNdUOvIs4vnJ5+4cwFJiSdR7OzmSExA9AoGrIrtxPjF3VGTiD+UmVdMy99sZWKumaiw/z425X9CQ/27bpzTVAkhj7jMfQZj+pswVW6G2dhDopf8OFCT2cEvRlFpwedAUVrAJ0eRWvAVXMQTVCUd9mjDa8Xvy1ajYYRmbEM7x3Dpj0VzFtTQFFlE5//sI/5aw8wtGcUTTYH1fU2qhvs1DbacatHTvqo12kY1iuasQPjSY0NOsorHZ+l2YHD6cZs1GLUa4/6uXO53Xy9LJ+FGwoBGNoriusn9sJoaF28RoaYmToyha+X5/PJ93t48oahXfp6WykMRZtyup3oNDpGxZ+D1dFMtF8ELreL36VPptxaQaRfOFf2mMb4pPOIMHsm43DkrsVdU0yFs4wInT9Ben/PWZxel+M8sIWg0gKsob0ZFN36Gqqbs647ZhxqSzOO3ctoyVmEaq3HNP5m9OlH3hDJpDNycdqxrwlzFmbTsu170OrRhiWgjeuJNr4PqCr29f/19sz9miYiBb/feYZFW+c+jbuhEuPwK9H3GIW7cj+O3ctxlu5Bl5iFacTvUVU3jty14Hahz7wAXWImzsJsHDuX4djxI6q1HvP5t6FL6IMu4fBQB0VR0Ea0vk2FPm0I+rQhqC4H7tpiVLsV1WFDF9cTxeD7zYnBM7Qz4Pcve2JzOdEfpSj07rM5CMytv+AV7f9n787jo6rPxY9/zjmzZSaTbbLvCdkgC/u+iIAKIqCoqLXVam21altrveqtVWnV1t/1Wr21vXa7rVpbd62VrYiI7LITDAQSyL7vmUwy6zm/PwaCMQSQLBD4vl8vXq9k5izPDEnOPOf7/T6PvrsnoOnyu1FzruweAe0vJTyJgCt/OCDHEgRBOBW7u4N/Hl3FrLipJFzxI24u20J00gRiA3vObtArepam+9vc1Hc28krBG2jAzyY/2H2tG2hVDR08/9Y+WjvcJEVZ+fGy0QRZzm3tsqQzdE8j/TJd7Eisd/ReBgSgdrX3qNrs2vQqoPmXBwRe2qMuFztZlpg0MooJWZHsL2rkX1tLKau1s253Za9tgy0GwoJM2IKMhAWZaGhzsvdIA5sP1LD5QA3J0VYuHxfHpJFRfVY+7ejycLi8lcKyFgrLW6hqdHQ/J0kQYNARYFQwGXUEGHUEGHS0OVyU13WgyBLL5qQxb3x8nzcurpqUyLaCOqobHazcVsa1Mwfmc8pwJBJD4Zy5fW5WlaxjauxEoswRfFa5lY2VW/lh7Bxyy75gzOjbMBxfbD8rfmqPfU9cKNXWWpzr/wDA3OP/lJa1qHNTkQwBuDb/jfrO
VuTgaHSZs9Aczf5kp8sOeiOWax8HTt6p1LxuPAfX4967As3lbwGAIQDlFNUW3Yc24D22E8O4xb3WmvlaqnFtf9M/VyECSQAAIABJREFUTfPEY+X7kEt2YV72S9T2WjwHPwVJQYkb6W/ILUlobidqUxlKZGr3SKAuPhf3vhW4Nv4V98730brauo+pmkoBf3GBgHn34i3b171mTJeQhz5zJs6Nf8UwfsnX/v+RFD1KePLX3q/XcUyBGHKu6P9xJGnAkkJBEITBoGkaTc5m2lx2kux21hW8w3aDkxpHHf8x/n4m5565qmig3sKYyFxUTSU8wIbT68TpcxFiDB6wOI9Vt/PC2/twOL1kJoTwwxvyCDAO7Uc6+Us3AtWudjxHNoPqxVO0BX32PAxjFgIXXv9kYeDIksTYjAjGpIdTUNJMSa2d0ECjPwkMNhFmNfVafxgRYeWLw3Vs2FfF5vwaSmvt/HVVIW99UsyMvBhmj40j2GLgcMXJRLCiroMvjzsadDImow6ny4vbq9Lp8tLp8gKuHucKthj4/rU5ZCScvrCSTpG57apMnv37HlZtL2PyqChibH33sL6YicRQ6FNRy1H+XfYpYyNyGR81BpPOiE/10e62E2oKYUXJWj4p30h5eyU/GPtdiluPUdtZz0FvKznHPsdbvA2m3owcFInaUoUcaENJHH1ymmJnK96yvZAMuvRpSAFBeIu346svRm1vQBebhWnmt+n6+CXUtlrcO97uGeCXpkSqDcfoXPH/QFLA0wWAHJWGcew1KPF5SLKM5vOAzwM6A66t/8BzcD0AXdUHMYxeiGHi9YCG87P/w1u09fjU0ACM4xYjBdpQm8qRwxORJBklJBbTnLuRjBZ0sVk9wtJ8XvCd7DNomLgUOSwe19a/+5NCowVD1mXoUicif6lNhGQw9yokooQnY75uuZieIwiCMAjcPje76/bT5rYzP3kODV1N/Hz7fxGEwk+La5glQUtWNgtH3nTWf4fN+gBuH3UzPtVfx/jdoo/YVbePb468kQlR/a8eXFDazG/fO4DL42NMWjj3LMnutbZrqMkBQVhueArXrvfxHtuJJ38NnkMbaMidhUsXhBI5onud+Ikq1l/tESsMX5IkkZNqIyf17EbHo8LM3DQnnetmprLjUD2f7q2kpMbO2p0VrN1ZgSxJPaag6hSJEbHBjEwKJSsplJSYoO6E0+tTcbp9OF1eutw+ulxenG4vbo9KVlLoWa9lzEgIYUZeDJvza3h97REeunnMJfnZS/xWXqJaXW2UtVcwOiKHNlc7Gyq3+AvAtNfRuf4PSKqPuMQx6CQXbxx+n5G2DLq8Tn629Rki5QAen/QgV8ROp7xyL/MKdtOxbzcj0rKZkHs7eaHpOKMP4C3b291j7QQ5LAHTzNtRotJwbXkdvKkocdmYZt+FJMmoYxbiqyzoTrZ0SWNIefgf1Gxbi6+2CCk4CiUsAckS6h+lO07rsoPXffwc8Rgn3YCSMLrHL7WvupCu1c+DYvAnbooOXcpEvMXb0brajm8rgerz95zKmoVh/LX+vlAAx3tFnaBP6VkV7gRJ0cGXLngnSo3rEnLx1RWhxI78Wg2JL8U/TIIgCIOpwl5FREQWIPH3wneRJInL42cQXFVEuEclzO3Co9MROPZa7hh9dfcMkK9DkRW8qpdOTyde1dvd2mjlsbXEW2PJDR/1tYvUbM2v5n/e2Y/XpzE1O4o7rh55wbQMkENiCJh3H76GUly73sNXcQD7nrUA6LPndSeGnsKNuLa+DkYLckAQSkyW/waqSYwuXmoMeoUZeTHMyIuhpKadT/dW8fnBOlRVIy02mKykEEYmhjIiLrjPmx86RSYwQB6QYjY3zh7BvqJGDpW1sK2glmk5Mafd3uXxsWFvFWW1duZPTiQx6tx/hmubO9E07byPVIrE8BLU7Gzhqc+fR9M0lk99mD8feJ2S9jJ8bbVM/mIHb4YZSXJ6uHLfCnIDjYRHJxNmCqXL60TSwOe00/rm
QxiMgXzneFESgCmGSEzHG8abrvwh3qKtuPevAr0JOSQaX9VB1OYKUPQ43V52WWahWj2YZk/tHkWUTVbktCndx6xudNDl0wjInIk+c2afr0lJHE3gt19Gc3f6k8ZTXGw1l6M7KZTMIQRccT9KVBreUXN6NHg3TroR48TrB7wKpWS0XDC95gRBEC5VK0s+ZlXJx9zDN8m15nF5wgwsmoxj3Uvoyw/wEKDEZGJaegdySP/6cepkHd/Lu50WZyuhphAau5pYXfoJiiTz1PSfEmSw4vQ6MelOXzBG1TQ259fw2ppCVA3mjo/nlnnpyBfgjUMlIhnzgp/gqy3C5KjA3lCPLvrkcg3N6/T3vXU5UF0O1NYavMd2Ypx6M7r06eJm6CUqJSaIlJggvnlFBppGryIxQ8FqNrDs8jT+suoQb60vJm9EOKf6JOg+nhCu+rycdod/UGLHoXqumBjPkhkpmAxnn161Odx8sPEom/bXoCgSDy4bQ1ZS6Jl3HCQiMbwEaF4XncVHqDyyhwB7C9aYkYwKTscnaWiaxvXp1/DOzj8ybc9GJOCY2UqVxcz06HGMqS5mYrQ/UTMpRl7I/i6evR/h8+ajeVvwBsVRGHMNLQTj7vBi3lnBqKRQ4iIs6DOmo8/wt3+obe5El+slxHEMwhJ4+d0DHDjWRFnrKO71mAg/RdyHy1v47zf34VM1xqaHs2RGSp93YyRJot2rsHJ7M22OGnyqhkmvEGI1EhtuYdLISP+o3YjJaE47ktGMdLwyqRKVRovdhVXyodcpyIEnp0J4fSotdhe2IBOyLC5WgiAIw51RMSAhYVD8hVqWjrgax1uPoNkb/e1vptyMPnPmKW8wnqtQk3+Nk0Vv4fr0RXS4OwgyWClvr+SFPS9zWfx0vvGVS0x9axcHS5s5WNpCYVkLHV0eAJbMSGHx9OQLPoFSotMJiRiH5yvN141jrsEw+mo0ZweavQHXjnfxVR/CueHPKIc3Y5pzd69K4sKl43xPi56eG82WAzUcrmjl3Q1H+Y/bThZScnt8fLavmlXby2g7nhAmR1tJiAxkc34N/95Rwc7Cem69IoOx6acfXPB4VdbtruCjLaU43f5p516fxkvvH+A/vzmO+Iih7ZN9gkgMLxADVeZZ1VRkSaapzcmRilZCnFXEH3mDrdh5OzKIMR1Obij+nGs1mXe986gP1YiPjGVmZRg6Xx2btPFcFpbB+Phs9h11UWGagrFMxlRbhMuj0un00NF1OUZlJDpHHbub41FLu4Cu4xE0AhAWZOSbV2YyJi2cTfuree3fhwFYODUJ+6EiDhzzN0F3un08/9Y+vrc4m5omBx2dHkanh6OqGi+9dwCfqiFLsLeokX1FjcydEM/1l42gud3Jqu1l6BWZJTNTQdN47s19VH+pUtWXfX6wjnuWZGMy6HApZo6WtVNU2crRqjZKa+04nF4UWSI+MhBbkAlV1ejo8lBeZ8ftVUmIDOTea3OICjNTUd9BZX0Hk0dFiWRREARhmNhVu5fRkbnMS7yMHNtIchJTaWzsQJIVjOOvw1u2F+P0byKbT1+ooj8CdKYe7ZEOtxTjVj34NB9ur49NBWUUlzs4VNpKY5uzx75hQUZuvjKLCWmDU+V0KEmS7G+zFBBEwMKH8RZtxbX9TdT2+h4tlQRhqEmSxLeuyuTJv+xg4/5qrilpJtgks3F/DSu2ldLW4U8Ik6KsLJmRwug0G5IkMXtsHK+tOUxZnZ2X3jvA2PRwvjEvo1f7GE3T2FfUyFvri6lv9X92zhthY9nlaXyw6Ri7Dzfwwtv7eexb4wkLOrvWMwNJJIbnmeq049r6D2RrOMaJ13+tfRu7mviscivhATYui5/GsdZS/rT/DTxVI2guDydGaeGusFXg82ExhaFJEg36aI54AsjQ15PkK+O/3thLsMWA2zESt5aHFwXWwL85SO/OM19mBBKJDA0gIz6EEKsRo16mpqmTg6XNNLe7+M27+WQmhHC4orV7r39tKQX8C4nv
X5rLo/s16lq6eOrVXd3bvLm+GKNBweX2MTY9nB/dMo7XVx1k/e4q1u2qZGdhPe0ONyfWJX9+qB6LSUdjm5O4cAsLpyYhyxJOt4/mdifr91SRf7SJZ/++B1uQiQPHmvH6VL4sMECPo8tDWa2dstqv3N3UK1TUd/DzV3aSFGXtfj1Wi56clOF/gRYEQbjYfVy2gX8eXUVefT7fy70NW10pdTveRp59n7+fa/q07hkuQ+mKpNmkBqbz9rpSVjvLec/6FpLOg9uZh8UUTFZSKKOSQhmVHEZkaACRkUE0fGUEbriTJAl9xnR0iaNROxqR9P4Pw77mCrQuO7q4Uec5QuFSExtuYcGURFZsLePXb+zG5fbRYvdXPE2MCmTJjBTGpIX3GNBJiQni8dsn8MmeSj7YeIy9RY0cLG1hyYwUrpgYjyLLVNZ38MYnRRwq8y/DirGZuWVuenfRnu8tGsV/O/ZRVNnGi+/s59Fbx2M2DW2qJhLD88TfZ+8z3PtWojntoA/AkDe/u6DKjto9bK7azoLkeWSFpILsb+DpVb0UtR5jZFgG7fZ61ldswiYZSNVG8Mf9q7GbW0BrIMAYhS7NxfNmG1OLQthQPRG10UWh28wxncztE6Pw2d1oO5to7XCTGBXOvdflsruwnn9uLkHTNEaPCCd3hA3f8YpPBr2C2aQjMEBPkNlAiNVI8Cl6Jqmaxsc7K3jvs6McrmhFliS+eVUGMWFmXlldSH1rF3cuHEneiHCm5ih4Ikpod7hIjQ3GaFDYV9SIy+0jJcbK9xZlYwsO4BvzMpiWE83/rThEVaMDRZaYMTqG5nYXB4410eXykhAZyE9uHkOQuWdMU7KjeeHtfZTXdVBe14GEf+g/IyGEtLhgUmODCLUacbp9lNfZsXd6UBQJk14hIcqKIkv8ddUhdh1u4HBFK0aDwmWjY8lKFFNdBEEQhoORYRl8Ur6R8ZF5oPlwbXsTzdFMQGYhutiR521apten8u6aOooqXExK8GGwdiIrPu67YTyj4mJA0r52gZrhSjIFopj80+c0VcX52V9RG46hS5+GIfdK5MBwMFou+Cm0wsXhmqnJfH6wjtqmTgASIv0J4dj08D5/BmVZ4ooJCUzIjOSNT4rYVVjP258Ws62glpQYK5vya9A0sJh0LJmRwuyxcT2KR+l1Cj+4Po9fvb6bygYHv30/nx8vG9Or5cdgEonhENFUH65Nr6K214Ek42sqB5d/2qMSOxLTrDuQjBZUewOune9Tpu/gqKeOL7b8kfjaJiRzMPKIKfzd3MW+pkPcEjuLtL2fMF9xEeP2Ih9+kttVEw0hXYQHdJL5o5n88cCrtLXoib1qLv9hGsWx8hYcTg9TRkURd3zucnpaCyU17cwZH49Rr7BgShJzxsejqto590SSJYmrJiUyKjmMtTvLmZIdTXayf472U3dNxtHlITjQX5XTqFd4/Ds9q3063V6OVLSRHh/cY/FxcnQQT3x7InuLGkiNCSI8JKB7SL64qo0FU5JOWZUqOszMY9+awIptpUSFmhmXEUGotXdV0ACjjsw+kr3vX5vD1i9q6XR5mZ4TjdnU/+pXgiAIwuBaV/4Z2bYs4q2xLJ/6MCadCc+RLWiOZvS2OJSv9LAdSpqm8dqawxyuaCU40MC8sclMy3iYGkcdqcGxHGk5yvvFK/hm1o3EW2PPW5znh4ouaQzu5nK8RVv9LaQAdEZkqw0pMJyA+Q8gSTKapvkrfkekICni2iwMDINe4Z4lOWzYX0NeSihjMyLOuthTqNXIvdfmkH+0kdfXHqGivoOK+g5kSWLOuDiWzEzps4pqYICeH984mmf+tpvC8lb+suoQ3100asgKTYnEcIhoLgeaz42v5nD3Y3JUGsYxC1ES/b1SfE4721c/TXZrG3MlcNoCuaqxAxVo8NqJKvyM5Nk3crithIi2Diz2FhI9EajIBOnrCJKdxBOBedYyFFlmyYgFWA2BWA2BRERYyYjpXbgl63hPmC8zDtDC34TI
QL6zsOcUEJ0idyeFfTEZdOSNOPUUTb1OZtLIqO7vpePNVcdmnH6Rb5DFwDfmZZxl5L1JksT03NOXLRYEQRAuHJ/X7OaD4pWsL9/Ek8eTQk1T/dWygeApS3Cdx9G41Z+Xs/lADQadzI9uyCPAsRFNZyI1OAmANaWfUGGvYn9jwSWXGEqyDuO4xehHTMa1+wPUpkrUjkbwOFFbqpFcnd3FgTSnna5//RLJHIIhbz76kbO7p6MKQn+kxAQxKS/unKdv540I56m7Qlm9vYy6li6umZrUPTBzOuEhAfx42Wh+9fc9fH6wjlCrkWWXp51TDF+XSAwHiVf1srFyK3WdDdySdT1eQwBVY68keeTlqF43R7ytjEqe3mOKyM6Ww/wj3EhWaDzfCxnLLZGpeIIi+P2xf1HcVsp/xy0iJ2gcDkMMf97dTLpzKgdI57rL0lEsFSj2GgyjFyAZzADEBvavzLYgCIIgDFdjI/PYVbePydHjMB6vQOorz0dtqUKyhGLNmYWrxXmGowyO3YfreXfDUQC+u2gUydFBcLTnNt/LvZ0NlVuYlzgLVVN5o/B9JnTmkGnOOg8Rnx9ycBQBc+4B/COsuDtRO5rR3J0nN/I4kYKj0dpqcW1/E/feFehzr8CQPa9Hv+Mz0XzegQ5fEDDqFa6dmfq190uMsnLfdTn8zzv5rPm8nDCrkXkTEs68Yz+JxHCQfFqxmRUla/GqXmbFT2Nr9Q42VG7hySkPE2kO573t/8WH9Tu5M+dWYiz+ETCzLoBwUxiTUq7AGONvnt7a1URDVzOhxhD+VKSw74PPu88RFDOOxxdnExlqBgb/h0UQBEEQLnQbK/3TDmfFT+Pe0Xf2WA90YrTQkHslkk4PDH1iWFLTzp8+OgjADbNHMD4z8pTbmXRG5ifPAWBL9edsrdlBvauezDFZVHfU8sbh98m2ZTE/eQ4+1V/uXpHPb6n/wSRJEhgtKF9J9uSgSCzLfoWvfD+uvR+h1h/FvesD3PtXYxg1B33uVcjm4F7H05wdeKsP4qsswFt1EF3KBFh0FwC++qN4S/egxOegRKUjKeLjsjD0clJsfHtBFv+38hBvrCsi1Grs8+/FQBE/6YPkiqTZTIudRFl7BZFtLRjrS5GQKG0vx2YKJcYSTXHrMUKMwWiaRoW9iryIbEbZMnuMIoYH2Hh6+k95b8MxVn9RjlGvMC4jnLHpEYxJD++xaFUQBEEQLmXNzhbeL16BR/WSGZpGlOXkhyhvbRG+2iNgMKPPmn1+4mt38pt383F7VWbkxbBgcuJZ7TcxaixdXicRIf5WGqXtFRxrKyXEGARAQVMhfyn4B9NiJ7EsY8mgxX+hkiQJXdIYlMTR+GoKce/9CF/VQdz7VyGHxCBnzkTzeVDtbbh2rMVbWYDaWAZfqr+uNhzr/tpzbCee/DWwbyXoTSgxWegSctDF5yIHR50igrOnubv8P4eSjBKbJdZFCqc1PTeGZruLDzYe448fHeQhi4H0+MFrqSMSw0HQ6mrD7nYQFxjNKFsmXZ/+icuL9jJt3DUYwzJQZIU7sr9BraOOAJ2JA40H+X3+K9yQvpjL4qf1qkC27Ys6Vn9ejixJ/OD6XEYlh/VxZkEQBEG4dIUaQ7gpcykV9qoeSSGA5mxHMoegz5hxXnrlOd1e/ufdfNocbrISQ7jtqsyzrrBpUAzMS7yMiAgrDQ12xkTkEGoMxqTzr9mv72rEo3rQy/6PdbWOOkw6EyHG3iNlFzNJktDFjkQXOxJf/VE8hz5Dlz4VALWpAu+RQtz7Vvo3lnUo0ekocdno4rORbUndx9EnjwdNw1f5BWpLFb7yffjK9+ECJGs4phm3oUvIA/wVVCX5zDfpfc0VODe9ilp/DLTjLbP0AeiSxqBLmYAuIRdJ17vSuyBcMzWJlnYnG/ZV8491RTz57YmDdi6RGA6CnbV7+efRVcyKm8aytIV4S/cAEJY2A9ngX3Sql3UkWOMAsLsdmBQThc1HmB1/so+Spmls2FvF
P9YVAXDrlRkiKRQEQRCEUzjQeJDM0DSmxkxgasyEXs/rk8f7P8yfh7Vkqqrxhw8LqKjvICrMzL3X5fZrxo9ZH8BI28mCavMSL2NqzER8mg+v6uUvBf+g2dnCfaO/Q0pw0mmOdPFSIkegRI7o/l4OT0YKbECfNx9dXDZKTAaS7tTF8JTodJTodADUjmZ8lV/grfwCb1UBmr2xu5YDQNe/X0CzNyJHpKBEpKCEJ4Oiw1t1EM3egGnmtwGQTEGodcUgychRaeB1+ZPV4m14i7ehxOdgvvqhQXs/hOFLkiRuvTKD4EAjUWGDe1NLJIaDQJEVbKYwUoIT8VV+AZ4uZFsCcsipi8FMi53I1JgJaGjddw+9PpXX/n2Yzfk1AFwzLYnLx8YN2WsQBEEQhOHiWFspf8h/lShzBI9O/BH6PqbnSYoezsPUvbc/LWb/0SYsJh0P3JDXZ6n6/rDo/cmKw9NJqDEYl9dFXGAMmqbR7rYTfHza6aVKkmX0mTMxTZn8tfaTA8OQs2ahz5qFpqqoLZXIIf4q5ZqmoTZVoHW2orbWnGyr8SWGCUuRA4KQzcEEXPMISnhy94i12laHp2QX3pJd6JLGdO/jPrQBX0U++qzZ6BLz+vGqhYuFIsssmZEy6OcRieEgmJMwkzkJM1E1FfeW1wH8i5pPQ5IkJE5OKVm1vYzN+f4y1t9ekMWUbFFhVBAE4WJVXV3N008/TXh4OPX19Tz22GMkJPQsKqZpGs899xxNTU10dHQwd+5cli5dCsDtt99OcXFx97adnZ3cd9993HXXXUP6Os4Xo2Ik2hJJjm1kr6RQ7WjCteV19KOvRnd8FGgofbq3irU7K1BkifuX5hIVZj7zTv1g0Zu5J+8O7J4ODIqBz2t28+aRD7gxfTHTYied+QBCnyRZRrGdXBcqSRKWW55Dba7E11CC2lCCr6EEzedBF5OJEjuqx6ikLnZkj+PJwVEYxyzEOGahv+rqcd7ibfhqDvsL4CSOwTTtVuSg07flEoSBIBLDAdbqaqOsvYLU4GSshkB81YUA6GJHnWHPkzxeH5/srgTg/qW55KSeuqefIAiCcHFYvnw5y5YtY968eWzYsIHHH3+cV155pcc2a9asoaysjN/97ne4XC4WLFjApEmTiI+PJy0tjVdffbV72x/84AdcffXVQ/wqzg+7u4O4wBgemfBDOMWaPfeBtXjL9oLOMOSJ4ZYDNby+1t+/+Pb5WWQmhp5hj4EhSRJBBn/v4mNtpbh97u6+fz7Vd1FXLx1qkqL3TyGN6N9ozpfXm5rm3IOnaKu/iE75PhxVBRjGLvK3JBPFaoRBJEpaDrAvGg/xxwOv8faRf6J2tqG2VoPOgBx59n8wthXUYe/0kBgVSHaKWFMoCIJwMWtpaWHz5s3MnDkTgGnTprFr1y7q6up6bPfhhx8ya9YsAIxGI5MmTWLlSn8hjccff7x7u+rqaiRJIjb24m+KfqDxIE9se5at1TvRK/ru4isnaM4OPIc2AGAYPbSJ8sb91fxl5SE0DZbOSmVGXsyQnv+EW7Ku54Gx9zAlejxun5tf7nyRP+S/2t3i4utocbayrXpn9752d0ePkS5hYMiWUIxjFmK56Vl0aVPA58G9630c7/wMb8WB8x2ecBETieEAC9RbyAgZQVZoOr4a/11CJToDST67wVlN0/h4ZwUAV01MPOuKZYIgCMLwVF1djdlsxmj0TzkzGAwEBQVRVVXVY7uqqipstpMzSGw2G5WVlb2O9+abb3LLLbcMbtAXiMPNxbh9bpzerlM+7z64Hrwufz+68KErwrJ+TyWvrC5EA26cPYJrpiUP2blPJT00FUmSONB4iFpHHa2uVhRZobGrmac+f55VJR+fcj+n18mBxoN0eBwA/Hbfn3m98B1K2stRNZUX9/ye53b/lmZny1C+nEuGbA4hYM49BFzzCHJILFp7Hd6SXec7LOEiJqaSDrAxkbmMicwFQPN5CVj82Nfav6C0mapGByGBBiaOHNwm
loIgCMLFxe12s2fPHh588MGvva/NFtjv80dEWPt9jK/j+xG3Mr12HDlRmb1aPakeF+UH1wEQedkNBJwitoGI12Ix9jjOhxuP8vraIwDctSSHJbNG9LVrTzVGLGcRT39inh8xg0mpObQ624kIs7K3eA+1jjqaPE1ERFgpbCjmo8PruDl3MQnBsTy78TX21HzB/ZO/zazYyUxNGkdFWzXhYVZ8Biedvk58eBkR5y+Ot7/2IGNjsnv8Xwz1z8TpWCwQEXHqSqQnXEjxdouYhJYzlrZdq7HmzkYx+2N0VhaielMvzJhPQ8Q7+M41ZpEYDqAOt4MddXtID0klwRqHpOi+1nqGxrYu3lrvLx4wd3y8aF4vCIJwCYiNjaWzsxOXy4XRaMTtdtPe3k5cXM9K1HFxcTQ1NXV/39TURHJyco9tVq9ezfz5888pjqamDlT13KcFnuixNxS2Vu/gaFspN6YvJkYXT1Ojo9c27oJ1qJ3tyBEp2M1JdHwltoGK1+FwdR9n9edlvPPpUQBuvSKDaSMjz/oc5k4XnWfYdmBiVrASSkODnVxrHj8aG4xRMdDQYGdj8S52Vu0nyhDFgpR5jAgcQUtQOy6Hj4YGO/Ni5kAMoAJOWD7lURq7mmhp6mR7zS7+duhtxkTk8N3c2wYw3oHjcBhoaHD3+fyFFm8vqZfT7AAcdtT2ehzvPIkkyf5WGdEZ/n9RaeelT+fZuuDf468YbvHC6WOWZem0NwFFYjiAKjuqea/oI0YEJ/PjsXeDJHUv9j6TgtJm/vBhAR1dHiJDApgtWlMIgiBcEkJDQ5k+fTqbNm1i3rx5bN26lXHjxhEVFcW6deuYPHkyVquVxYsXs3LlSm666SZcLhc7duzgvvvu63GsDz/8kN/85jfn6ZUMDbfPzYdHV9PhcTAqLJPxUaN7baOpPtz5awD/2sKhWJbx0ZYSPthUggTcNj+Ty8Zc+NdxnawjI/TkiObs+OlEBNi636/ZCdOZnTC9r90xKgbiAv1rJ3WSQogxmLzwbAD21OfFRidwAAAgAElEQVRj6TSQHpDRazRX6D/N2YEcEofaVIav5nD38iUkCTksEV3KeIzjFndvq7bXoznb0brsqF3taCf+Oe0osSMxjlkIgLfqIN6SXf7kUh+AZDAhGcxI5hCU2Kyz/lwrDE8iMRxAFr2ZKdETiDJH4C3ZhXPzaxhyr8Q4bslp92vtcPE/7+Tj9ankpIbxvUXZWEyi6pQgCMKl4sknn+SZZ55h48aN1NfX89RTTwHw4osvsnz5ciZMmMCCBQvIz8/nkUcewW63c++99/ZoaVFQUEBSUhKBgf2fEnohMygGHhh3D7vr9jEu8tQ93jSnHTkoClVW0CWPH/SYPth4jI+2liIBd1w98rwVmumvUFMIM+KmnNO+E6LHMjoyFxkJr+rl/aIVtLha+V7ubYyOyBngSAUlMhXL9T8nzAL1BXvx1h7GV1uE2lCK2lSGGnryZ9BzZBOu7W/1fawvteBQG0vxHFx/6u1iMjHNuhM5OGrgXohwQRGJ4QBKsMbxrVHLAHBuehVcDpDOXBL6UGkLXp9KVmIID9wwGlkWBWcEQRAuJfHx8bz88su9Hl+xYkX315Ik8eijj/Z5jOzsbLKzswclvgtFVUcNRsVAjCWKa1Kv6nM72RyCeeF/oLk7keTBG+HQNI2DpS0UHitFkuC714y6pPsOn6gK61N9XJ0yj0Pth8kNH4XH5+Fvh97m8oQZpAQPXRGgS4FitqJLHosueSwAmteFr/4Ykv7kdFLJGokcnoQUEIRkCkIKsCIHBHV/L4edHN1W4rIxTvsmmqcL3F1oHieauwtfVYG/t2L5Pgy5ff/uCcObSAwH0I7aPWiaRnZ4FlJtEQC6mMwz7ldY7q/mlTciXCSFgiAIgtCHD4pXcqj5CHflfIuxxwu9nY5kGNxm8u9+dpSiqlZGShJ3L8lmYpYoGgegyArTYiexZPRcGhrsfFa9ld31+6ntrOc/Jz4w5BXX
NU27ZKq8SzojutiRPR7Tp4xHn3J2I+dKeNIpK/hqzg7cBZ+gz76i+zHV0YJsGZrenMLQEBOFB9Dq0nW8dugtWjtbUFtrAAn5S8PzfTlc3gpAZmLIIEcoCIIgCMOTqqkEGayYdQE91sV9Vdf6P+D+4mM0b99FRgbCpv3VrN5ejixJfP/aHJEUnsaUmPHMT5rDtSP86z0/rdjM3w69TYuz9ZTb+1QfnZ7OHt+fi/rORv5a8A/ePvJPwL8+taSt7JyOdamTTIEYxy/pHoH3lu/D8eZ/4Nq7Ak31nufohIEiRgwH0ISosdQ46rB5vHg1H1JQJJL+9GWRm9ud1Ld2EWBUSIy6uNeFCIIgCMK5kiWZ20bdhFf1ouujN7C3tghv8Ta85fvQZ8wYtFiOVrXxt7X+Yh95I2yMz4wYtHNdDAJ0ASwa4a+W61W9rC37lHa3nbERuQTqLTy/53/xqT4em/wgLp+bBz/7GXpZx4uzf4mqqTy57f8RY4nizpxvEKA7fcVNTdOo7KgmwRqHqqnsqtuHSTGyeMR89tQXUb/nDe7MuZUxYt1jv/hqi8Hnxb3zXbwlOzFd9p0eaxWF4UkkhgNoYYp/eN1z9HO8gBJ65opkJ0YL0+NDUAZxHYQgCIIgDGdrSteTFpJC6mnWqLn3+ddkGrLnDVrJ/ha7i99+cACvT2POuDiqQ4Zfj7PzSSfr+PG477Ozbi/ZtiwAKu3VaGj4VB8GWY9OUtDJenyqj9rOelpdbciShEkx0enp4uX8v5AXns0VSbN7HNun+nhu10tUdtTw86mPEG2J5NasGxgZloFBNhBktBKoN5MS5E9gVE0VFVPPkXHSDSixWTg3/hW1sYzO93+Ocdo3MGTPPd+hCf0gEsMBUtVRQ0FTIanBySQ0VwIgn0VieGJ9YVaimKMtCIIgCKdS19nAR8fWYFKM/GrGExhO0efX11iGr3w/6Azoc68clDg8Xh+/ff8AbR1uMhNCuHluOr/ePSinuqhFmsO7b6YDPDzxBxgVI5IkIUkSL87+ZfeawLjAGH4143Eau5qRJImDTYUcaytDkRSuSJpNXWcDK4+tZX7yXGIDo4kJjMbucdDQ1YQtIIxpsZO6zzMqLJN7Jz9IoN5CYXMR/zq2hu9k34otIGzI34OLgS4+B8sNT+Pa8S6eQ+uRw+LPd0hCP4nEcIAcaTnKh0dXMyNuCvFOf1PJs/kFEesLBUEQBOH0jIqBKxJnI0kSBuXU7Zzc+1YCoM+ajWwa+FE8TdN4bc1hSmrasQWZ+P51OehOkaAKX1+itefnpa8WirEaArEa/MttciOy+V7ubRhkAwD76g+wu34/OlnHbaNu4vq0RZh0xj6nGwfqLWiaxkfH/k1ZewU7aveyIEWMcp0ryRCAaca3MORdhRwk1tkOd0OWGFZXV/P0008THh5OfX09jz32WI/+S+D/o/vcc8/R1NRER0cHc+fOZenSpQDU19fzxBNPEBsbS0dHBzabjYcffviCqTKVYI3j8oQZpIeMwJSZjXHKLXCG2MT6QkEQBEE4sxBjMNemXd3n82prLd5jO0FWMOTNH5QY1u2qZMsXtRj0Mj+4Ppcgs2FQziOcnlEx9OiLONKWgYZGUpD/M2WgwXLGY0iSxH2j72RD5RauSr4cVVPZU5/P/PDBW5d6sRNJ4cVhyBLD5cuXs2zZMubNm8eGDRt4/PHHeeWVV3pss2bNGsrKyvjd736Hy+ViwYIFTJo0ifj4eP74xz8SHR3NE088AcDVV1/N5MmTmT179lC9hNNKC0khLSSl+/szFZ0B+KKkGRDrCwVBEAShL2XtFawq+ZjpsZPJizh1n0b3/lWAhj5jOnLgwE8LLCht5q31xQDcefVIEqPEusILRaI1vteI49kw681cfXw660dH17CmbD0VXeVcl7y4X/FU2KvZ3/AF02MnEWoSs8GE4WVIspGWlhY2b97MzJkzAZg2bRq7du2irq6ux3Yffvghs2bNAsBoNDJp0iRWrvRP
DYmMjKS52Z9IOZ1OOjo6LpjRQoAVx9ayoXILrq521M5WNE077fatHS7e3XAUgLHp4UMRoiAIgiAMO5/X7uGLpkKKWo/1uY1kCQGDGcPovkcVz1V9axe//+cXqJrGwqlJTBoZNeDnEM6v5OBELDoz0xL9vf6cXtfXPobT60LTNKLNEeQ3FvDEtmep62wY6FAFYVANSWJYXV2N2WzGaPSPohkMBoKCgqiqquqxXVVVFTabrft7m81GZaW/kMtdd92FXq/n+9//PrfffjvXX389l1122VCEf0Yen4c1pZ/wXtFHaOX7cbz+AM7P/tzn9qqm8aePDtLR5WFUcigzR8cOYbSCIAiCMHwsSJ7LjRlLmBozsc9tjBOWEvjNF5GDowf03E63l5fey8fh9DJ6hI3rZqUO6PGFC0Nu+Ch+Pu1RcqKyaHO18/Pt/8XqknWomnpW+5e1V/CrHS+wqWobkiSRGz6K5KBEIgPCcXg6+dOB1yhq6fvGxpf5VB/1IqEUzpNhU3zmhRdewGKx8Pzzz+N2u7n77rvJz88nLy/vrI9hs/V/HV9ERO/pI06Pk1tHX0eH24G5sYk2wBodT+gptgV4/9MiDpW1EGQx8MjtkwgLMvU7rrNxqtjPN4sFIiLOPO32Qoz9bA3X2Idr3HBxx26xGM/u9dUYsQzx+zCc33fhwuTxebAaApkdP/2M20q6gV3zp2oaf15xiKoGBzE2M99dlI18Ac1UEgZWgM7/Wexg02Ha3XaKW0u46iz3bXK20OhsZkftXmbETWFR6lX4VB+SJLGpahv7Gr7A5XOTHpqK2+dGJ+u622S0udqp7Kgm25ZFp6eTn255BkWSeW7WzwHYVLWdaTET0fdRdEkQBtKQJIaxsbF0dnbicrkwGo243W7a29uJi+vZziEuLo6mpqbu75uamkhOTgZg/fr1/PjHPwb8I46jRo3inXfe+VqJYVNTB6p6+imepxMRYaWhwX7K56bapgDQse/XAHQZwvGeYltV03jvU/86hW/Pz8Ln8tDQ4DnnmM7W6WI/nxwOAw0N7tNuc6HGfjaGa+zDNW64+GN3OFxn9frMnS46h/B9uJDed1mWBuRGoHD+/b3wXaodtdyceR2pwcm9nnft+gC1oxHjuCUDXvxi5dZS9hxpIMCo4wfX52E2DZt76UI/TI2dSKgphGhLJLIks6Z0PREBYYyPGtNjO7u7g38dXc3S9GsYF5mHlv0N8iJyuhM+RVYAmB47GZ/qIyM0DYCVJR+zv+ELHpn4IwyynuXb/h9u1cOzM57orr6qkxTa3XZ21u7ln0dXkd9QwA/GfnfAX6umaUiSJPo5Ct2G5KcgNDSU6dOns2nTJgC2bt3KuHHjiIqKYt26ddjt/g8TixcvZuPGjQC4XC527NjBwoULAUhOTqa4uLj7mEePHiU6emCnjJyr/IYCPjy6mpK2MtQW//RYpY9WFeV1dtodbsKCjIxOs51yG0EQBEG41KmaSnFrCVUdNVj1vUejNXcn7gNr8R7ZgtbVPqDnPlzewj83lyABdy/OJjrMPKDHFy5sWWHphBiDqbRXs+LYv/lrwRu91gv+7dDbbK3ZyQfF/loY46PGoD9FiwyrIZCFqVeSHpqKqqkcaj5CQ1cT9Z0NKLJCZlgaWaHpdHo6AXh88k94curDhBiDSQ9NJSLAxpxEf42Opq6WM9aw6EunpxO7uwOAzyq38uTWZ9lUtR3grKfMChe/Ibv99eSTT/LMM8+wceNG6uvreeqppwB48cUXWb58ORMmTGDBggXk5+fzyCOPYLfbuffee7tbWvz0pz/lqaee4he/+AUOhwOr1codd9wxVOGf1pGWo3xauRmLbCC8owkUHVIfdy7zj/pHRPNSbRdU8RxBEARBuJDIksyTUx+mpK2UCHPvG6nugvXg6UKJHYkSlTZg57V3uvnDvwrQNLhmWhJ5I8RN3EtVXGAMN2Vei8PTRZQ5gqauZvIbD3J5wgxuSF+EBCxInnfWx5MlmUcm
/JADTYcwKv6pz/fk9fwsa1BOTolODkrkZ5N/gk7W0eF28N+7f0t8YCx35nyDAF3Aac/V7GzBq3qJNEewrvwzPiheyRWJs7k27WpUTaXR2Uxlh38wQ0J8HhX8zioxbG5u5l//+hcHDhygubkZTdOw2Wzk5uayaNGiHgVj+hIfH8/LL7/c6/EVK1Z0fy1JEo8++ugp94+Li+P3v//92YQ75HLDR2HRW0iV/VOX5KBopONTCL7qRGKYKy40giAIw95AXB+FUytrryA+MLZ7Ct6XaV4XngP/BsAw5poBO6emafzfykO0drhJiw9myYyUM+8kXLQkSWJm3FTAP6r26sE3OdpWSlpICgnWOL4/+s6vfUxFVhjzpT6MZ6I7PgpZ19mAT/Xh8rkwKsbuaaAnuH0e2t3thAfY2Fazi9cPvc2EqDHckf0NIgLC0cs63Kp/6dK4yNFkhI4g2hzZHZMgwFlMJd2yZQsLFixgw4YNhIaGMmbMGMaOHUtoaCgbNmxg4cKFbNu2bShivWBlhqWxIGUuCar/F0uynvqDQHunm5LqdnSKxKikge+zJAiCIAwdcX0cPHZ3B/+9+3c8se1ZPL7e6/A9hRvRnHbkiBSUuFEDdt6Pd1aQf7QJi0nH3YuyRY9hoZuqqaQEJ5EUlIBH9Q75+UeEJPPTyT/mjuxvHF/7+AlvHv4At8/DsbYyHt70JK8UvAlAalAiRsWAXvYXrMmxZfH8rKdYlrEEgGCjlbjAGJEQCr2cccTwhRde4K233uouAvNVpaWlPPTQQ7z77rsDHduwseLYWkBjhi8QXXA0ckjMKbcrONaMBmQmhmI0iF9GQRCE4UxcHwdPs7OFiAAb4QG2XtUYNZ8X9/7VABjGXjNgyzJKatp553h/4TuvHokteGgqhgvDg07WcV3awvMaQ4gxGPDfOFlb9ikBOhN6WUesJQpN01A1FZ/qI9IcwX/NXN492igSQOFsnTExVBSlz4se+IvCKMql/QP3WeUWOr1dXDbjCQLT+y6pvf9oI+BfXygIgiAMb+L6OHiSghJ4fPJDOH29G417i7ehOZqRQ2PRJY0dkPN1ubz8/sMv8Kkac8fHMzYjYkCOKwiDwWoI5Cfj72NV6Tqau1ox6Uz8asYTmPUn1x3qJFFFV/j6zvhTk5qayn/+539y/fXXk5qaitVqRZIk2tvbKSkp4b333iM19dJt+KppGotS59PutmPR9121zKeqFJQ0A5AnqpEKgiAMe+L6ODianS0caj7C2IhczKe4rkrmEGRbIoa8+UgDUGJf0zReXVNIQ6uTxKhAll0+cIVsBGGwxFtj+V7ubdjMVhoc9h5JoSCcqzMmhj//+c/5zW9+w/33309bW1uP54KDg1m2bBn333//oAV4oZMkiVnx/oXJvoZSVKMFKdCG9JV1CYXlrTicXqJCA4gKFWWvBUEQhjtxfRwcn9fsYUXJvyluLeH2UTf3el6XkIsSnwOce1/iL9uUX8OOQ/UY9Qr3LMlBrxPrCgVBuDSdMTE0GAw89NBD/OQnP6GiooLGRv90SJvNRmJi4iXfcqG+s5FPKzYRHxjL6E9eR3PasXzzRSRzSI/ttn1RC8DkUVHnI0xBEARhgA3U9bG6upqnn36a8PBw6uvreeyxx7pbNZ2gaRrPPfccTU1NdHR0MHfuXJYuXdr9/MqVK9m9ezcARUVFPPDAA4wfP36AXunQig2MJjM0jYlRfU8T9b+3/f/8UdXQwT8+PgLAbVdlin6FgiBc0r7WBGRVVdE0zb/A9fjXIjFsYGPVNkaGppHntIOsQwoI6rGNy+1j9xF/Y9Sp2dHnI0xBEARhEPXn+rh8+XKWLVvGvHnz2LBhA48//jivvPJKj23WrFlDWVkZv/vd73C5XCxYsIBJkyYRHx/PwYMH2bdvH0888QTgTzSH89rG0RHZjI7I7vW45vPS9fFL6OJz0GfP7fc0UpfHx+8/LMDtVZmeE83UHHF9FgTh0nbGxNDtdvPiiy/y7rvv
0t7e3uO54OBgbrzxRn74wx9iMBj6OMLFLdoSxY3pSwj0eoGtSIFhvS5We4sacLl9jIgNIkrcjRQEQbgoDMT1saWlhc2bN/PSSy8BMG3aNO6//37q6uqIijo5w+TDDz/k8ssvB8BoNDJp0iRWrlzJ3XffzWuvvUZ6ejq//vWvsdvtTJgwgYULz2/1xHO1rvwzXF4X0+Mmd1dgPMFXfQhf+X40exOGnCv6fa4//fMAVY0OosPM3HplRr+PJwiCMNydMTF88skn0TSN//3f/yUtLY2gIP9oWHt7O8XFxbz33ns88cQTPPvss4Me7IUoPCCM2QnT8VYcoAuQA3sXltla4J9GKu5GCoIgXDwG4vpYXV2N2WzGaDQC/umpQUFBVFVV9UgMq6qqsNlOXl9sNhuVlZUAHD16lJqaGv7yl7/g8/m45ZZbMBqNzJs3bzBe9qBRNZX15Rtpc9sZZcvslRh6S/xTZXUp/Z8iu6uwnn9vL0OnyNyzJBuTQVRwFARBOONfwpKSEt58881ej4eEhDBhwgQmTJjAzTf3Xhx+qdhWs4tKexVjujSiAekriWFbh4uCkmYUWWJiVuT5CVIQBEEYcBfK9dHhcLBo0SIURUFRFK688kpWrVr1tRNDmy2w37FERFjPeV9VU/nB1DvYV1PAxBHZPabiaqqP8oq9/nOMm4WxH+fx+VQ+2FwCwJ2LshmfE3vOxwKwWIz9et3daoxYzuI4A3KuIXQhxWuxQESE8bTbXEjxnq3hFrOId/Cda8xnTAy9Xi/l5eUkJiae8vny8nK8Xu85nfxicLCpkD31+cSZRhBN7xHDnYX1aBrkjrBhNV+a020FQRAuRgNxfYyNjaWzsxOXy4XRaMTtdtPe3k5cXFyP7eLi4mhqaur+vqmpqbuHYnR0dI81hXq9Hperd/+/M2lq6kBVz73SZ0SElYYG+znvDxCjxBMTH09jY0ePx701h/E52pCCImmTwpD6cZ7tB2upaXQQbTMzMcPW75gdDle/jwFg7nTReYbjDMR7PJQutHgdDgMNDe4+n7/Q4j0bwy1mEe/gO13Msiyd9ibgGRPDBx98kGXLlpGVldXdpwnAbrdTUlJCYWEhv/71r88x9OFveuxkkoMSiS/OB0C2hvd4/mBpCwDjRLNcQRCEi8pAXB9DQ0OZPn06mzZtYt68eWzdupVx48YRFRXFunXrmDx5MlarlcWLF7Ny5UpuuukmXC4XO3bs4L777gNgwYIFbNq0iVtvvRWAXbt2MWPGjMF98YNgxbG1NHQ1Mi/xMhKsPRPj7mmkyeP7VfRO1TRWbisD4IY5GSiyaE0hCIJwwhkTw2nTprFq1SpWrFjBgQMHKC0tBSAsLIzZs2fz/PPPExYWNthxXrCywtLJCkvHVdeA15aIFHRyuqiqaRRVtvq3Swzp6xCCIAjCMDRQ18cnn3ySZ555ho0bN1JfX89TTz0FwIsvvsjy5cuZMGECCxYsID8/n0ceeQS73c69997b3dJi6dKllJWV8cQTT6CqKikpKcNyiceBxoNUdlQzK25aj8c1TcNb6k8M9f1cX7ivqJGqBgehViNzJiTQ2uLo1/EEQRAuJme12josLIzbbrttsGMZlv5R+C4mxcSi8UuwTLy+x3OV9R04nF5sQUbCQwLOU4SCIAjCYBmI62N8fDwvv/xyr8dXrFjR/bUkSTz66KOn3F9RFB566KF+xXAhuHXkDVS0VxFv7bnmT7M3oHW1IZlDkCNTz/n4mqaxYmspAAsmJ4pG9oIgCF8hynD1g0f1sqV6B7Ikc23a1b2eP1zuHy3MTAwd6tAEQRAEYVhJtMaTaI3v9bgcFEngt15Cba/rV+/CgpJmSmvtBJn1zBrdv4IzgiAIF6MBuV32i1/8YiAOMyx9a+Qyrk+Zj1qRj9pa0+O5wxXHE8MEMY1UEAThUnQpXx+/jh21e/j17v9le82uUz4vGQJQwpP7dY4To4VXTkrEoFdOv7EgCMIl6Iwjhjt37jzjQfbu
3TsgwQw3elnHlJgJeGuP0PWvXyJHpGK57gnAv77wyInEMEmMGAqCIFxsxPVx4JS0lXG0rZTc8FE9Hlc7W8HdhRwS06/jHy5v4UhlGxaTjsvHxp15B0EQhEvQGRPDBx54AJ1O16MU9ld9uYT2paSkrZxNVdtIdamMBmTryVYV1Q0OOro8hFqNRASbzl+QgiAIwqAQ18eBc3XKFeSGjyLS3LOyt6dwI+5d72MYsxDjpBvP+fgrjlcinTs+ngCjWEUjCIJwKmf86/irX/2Kzz77jMcff7zPbb71rW8NaFDDRbWjhs9rd6MaohhNz+b23dNIE0P6VVpbEARBuDCJ6+PAsRoCGWXL7PX4iWqkSlTaOR+7pKadgpJmjAaFeRMSzvk4giAIF7szrjGcNWsWqamp1NXV9bnNXXfdNaBBDRfpIancmnUDEzR/o0jZHNz93OFyf//CLFF4RhAE4aIkro8Do85Rz8+2/JI3Ct/r8bhqb0BtLAOdESUu+5yPf2Jt4ZyxcQQG6PsTqiAIwkXtrOZTnGia25fLLrtsQIIZbiLNEUSaI+gq3IMXkExB3c+V1LQDkB4f3MfegiAIwnAnro/9V2GvosXVSpvb3uNxb8keAHSJo5F0hnM6dmV9B3uLGtHrZK6clNjvWAVBEC5mYqJ9P3xctoH6zgYmO1uIAqQAKwBdLi9N7S7+P3v3Hd90tT9+/PVJ0qZtugfdlCWyZNMiU6+It+rlOsFx4Q6/V6/gQuWCIBRlqSCCA664wOu98tOLXLygqHAvCiJg2Xu2he42dKY0bZrP749ApIwmpWmatu/n4+HDJueTk3dCm/M+OUun1dAmRM4vFEIIIa6mb2Qv4gJisarWWvdfmEaqa8Ch9uu22dYWDusVQ5Dh2jqXQgjRWjjdMdy1axffffcdBoMBRVEoLy/npptuIikpqTHj82gHjIc5UZxGj2rb1JQLI4Y5xgoAokL90GrkAF0hhGjJpH1suChDm1q3rRXF1OQeB40OXXzPa6oz72wFOw7nodUoJCfJaKEQQjjiVMdwxYoVKIrCpEmT0FzU0VmyZAne3t706dOn0QL0ZMntRlBwzkjMyYNoNf4o59cYZhWWAxAbYWjK8IQQQjQyaR8bRlVVpv04h0DvAJ7p+xi+OtssG0v6bkBFG9cdxfvaZt6s25aBqsLgnlGEBsru4EII4YjDjuHevXvJz8/n6aefZufOnbXKevfuzT/+8Q/69OnDnDlz+NOf/kR0dMPOGmpOuoReRxeug9iBte7PLjQBEBPm1xRhCSGEcANpHxuupKqUsqpyrKoVH+0vnTdteAJeXYahje1xTfUWlpzjpwO5KAokD0xwVbhCCNGiOewYfvrpp0yaNAmTycQ//vEPUlNTSUpKQlVVtm/fzqhRowDbltzLli0jJSWl0YP2FB8d/Cd6rTdjOt+NVvPLOVbZhbappDHh/k0VmhBCiEYm7WPDBeuDWDDsJc5WFtc62knbpgPaNh2uud71209TY1UZ2D2SyBD5klYIIZzhsGOYl5dHWJjtfD6DwcCqVauIjIy0l7399tsAtG3bltOnTzdiqJ6luqaa1Lw96BQt93m1hYBweyOWLVNJhRCixZP2seEqLWb0Wj0x/lEuq7O43MwPe3MAuENGC4UQwmkOO4aVlZX2n0+dOmVvBAFCQkI4fvy4/fa5c+dcHJ7nUhSFP3V/CHNRFuaNS9C06YDhrhkX7UiqEBEsaxqEEKKlkvax4ZYf+ienijN4pMfvuD7Udoh95Za/g0aDd89kNP6h9a5z485MLDVW+nWOIDZCZu4IIYSzHHYMO3XqRGpqKv379+fmm29m7Nix3HHHHQCsW7eOESNGAHDw4EFiYmIaN1oPotPo6BfZm+pzNVRypR1JDbIjqRBCtGDSPjbc2cpiTJYKgvW2NlStNlN9dDPUVOHd6/Z611dVXcP3e7IBuE12IhVCiHpx2DF86KGHmPMN3gEAACAASURBVD59OitXruTRRx/l+uuv
Z+vWrQD85S9/Yfjw4VRXV/Paa6/xzDPPNHrAniLXlMdXaRuIPlfJEEBz/gzDCzuSxoTLmgYhhGjJpH1suBcGPEOxuYSg8x1DS+Z+qKlC06YjGkNIvevbfjiP8nPVJEQF0DEm0NXhCiFEi+awY9i1a1dGjhzJww8/zJw5cxg+fDjDhw+3l585c4bp06czYMCAVrUtt7GymJ35e7leF8wQQPGxdQxzzm88Exsu6wuFEKIlk/axYSqqz6EoCiE+wfb7LGm23V29ruFQe1VV2ZiaCcCIfnG1NrMRQgjhmFPnGD766KO0bduW8ePH4+vrS0JCAl5eXqSnp1NWVsZTTz3FnXfe2dixepRY/yj+2O1BvI5vB0DxtX0zmXXhqArpGAohRIsn7eO125K1jTWnvub2diO4o8NI1BoLlow9AOja1b9jeDyzhNP55QT4eZHYNdLV4QohRIvnVMcQ4Ne//jW//vWvOXLkCOnp6aiqSkJCAl27dm2V38oF64PoH9WHc4e2Y+GXEcNs+1RS6RgKIURrIO3jtamwnEOnaAn1sU0Zrck+DNXn0ITGoQmqf8du407baOHw3jF46WSNvxBC1JfTHcMLunTpQpcuXRojlmZlV/4+9hYcoJu5kG6A4htQa0fSNiG+TR2iEEIIN5L2sX7u6nQ7d3YYiaqqAFhOnx8tTKj/tNuiMjM7jxagURRu6h3r0jiFEKK1cNgxTE9P57PPPsPHx4ennnoKgIcffpjc3FwANBoNy5cvJza27g/i7OxsZs+eTXh4OPn5+UybNo34+Pha16iqyvz58zEajZSXl3PLLbdwzz332MvXrVvHzp229QfHjx/nmWeeoV+/+k83cYXTpZmk5u2hTWBbtG17ofiHk19k2448MsRPdiQVQogWzlXtIzS8jfziiy945ZVX8PLyAiAqKopVq1a58uW6VHVNNYWVZ2njG45Oa0tFrMYzAOgSete7vv/tzsKqqvTv0obQQDkqSgghroXDjuHKlSvZtWsXEydOtN9XWVnJX/7yFwD27dvHhx9+yPTp0+usZ+bMmYwePZoRI0awadMmpk+fzvLly2tds379ejIyMnjnnXcwm80kJyeTmJhIXFwchw4dYs+ePcyYMQOwNaJarba+r9dlBkT1IcY/ilj/aPz8owEoPJoPQESwjBYKIURL56r2ERreRgK89dZbJCUlue4FNqKMskze2LWU9oFteb7/EwD4/uYFrGfPoAmNq1dd1ZYavt+TBdg2nRFCCHFtHA5r7dq1i/fff58bb7zRfp+fnx/3338/999/P9OmTePnn3+us46ioiK2bNnC0KFDARg0aBCpqank5eXVum7NmjUMGzYMAL1eT2JiIuvWrQPg448/JioqioULF/LSSy+xe/duIiObbnF5rH80iVF9iT3fKQQoLLEddhwWJN9WCiFES+eK9hFc00YC/Otf/+LVV1/lpZde4ujRo654iY2m0lJJmE8I0YZf2nFFUdCGtUVR6jfjZsfhfMoqqmnbxp/r4oJcHaoQQrQaDkcM9Xo9gYG1zwJasWKF/WcfHx/8/Oo+sy87Oxs/Pz/0ej0A3t7eBAYGkpWVVatzl5WVRVhYmP12WFgYmZm2xeQnT54kJyeHDz/8kJqaGh588EH0er39AGF3W3vqGworCvmVEkJMUFt0cT3sHcNw6RgKIUSL54r2EVzTRnbu3JmOHTvSq1cvTp8+zQMPPMDq1aub9AvUuvQI70qP8K7UWGsAqMk9jiY8AUXnXa96VFW1bzpzixxRIYQQDeKwY2i1Wi+7T9ME6+dMJhO/+c1v0Gq1aLVaRo4cyVdffVWvjmFYmH+D44iIsO0+emLPSY6fTad/5lkifEKJnrCUsnMWADrEh9iv8ySeGJPBABEReofXeWLszmqusTfXuKFlx24w6J17fTl6DG5+H5rz+34tPKV9BOjRo4f957Zt29KlSxc2bdrEmDFjnK7DlW2kI4cLjhPt34YI32BqKkrJ+M88NHo/Ep5+
H0Xn5fTzHck4S3puGQF+3twxvBN6r/otMXHF76zTf5OOOPk329z+zjwpXmdyDk+K11nNLWaJt/Fda8wOO4axsbFs2LDhqh2wr7/++rIF8peKiYmhoqICs9mMXq+nqqqK0tLSyxbkx8bGYjQa7beNRiPt2rUDbAvpL15T6OXlhdlsdhR+LUZjOVarWq/HXCwiIoCCgjIAbk8YidHnGOGnPsca4E9BQRnZ58u8FezXeYqLY/ckJpM3BQVVdV7jqbE7o7nG3lzjhpYfu8lkdur1+VWYqXDj++BJ77tGo7ikk+OIK9pHcE0bmZaWRvv27e1lXl5eVFZW1uv1uLKNrEu11cJL3y9CVVVeHz4LzalUUK0oYQkUFlUCzse9asMxAIb0jKK0uKJR4nXE2b9JR5z5m/WkvzNneFq8jnIOT4vXGc0tZom38dUVs6P20WHHcOLEiTzwwAOsX7+eIUOGEBkZiaqq5OXlsXnzZnbu3MnKlSvrrCMkJITBgwezefNmRowYwdatW+nbty+RkZFs2LCBpKQkAgICGDVqFOvWrWPMmDGYzWZ27NjBhAkTAEhOTmbz5s08/PDDAKSmpjJkyBBH4TeaziGdqC4podKqovgGoKqqrDEUQohWxBXtI7imjZw9ezYLFy4kKCiIiooKDhw4wDPPPNPYb8E1MVWb6BCUQLXVgl7rzbkLh9rXczfS4nIzPx/JR1Hg5j5yRIUQQjSUw45hdHQ0n332GQsXLmT27NmUl9sOcDcYDIwYMYLPPvvMqTUMKSkpzJkzhx9++IH8/HxmzZoFwKJFi5g5cyb9+/cnOTmZffv2MXnyZMrKyhg/frz929Z77rmHjIwMZsyYgdVqpX379jzwwAMNee0Nsmz/x3iVF3E3oPMJxFRpobKqBh9vLQafeh8PKYQQoplxVfsIDW8jhw8fzuTJk0lISODMmTNMmjSJrl27Ns4Lb6BgfRAT+z4OgGq1YDmzHwBd2171qmfT7ixqrCp9O0cQHiS7gQshREM51YOJjIzk1VdfRVVVzp49C0BoaGi9FnnHxcWxdOnSy+5fu3at/WdFUZgyZcoVH6/Vann++eedfr7GVF1Tzd6CA2hRuBfQ+AaQV2I7wzA8yEcWvwshRCvhivYRGt5Gjhs3jnHjxtXrOZvKieI0dBotsYZolLwTUH0OTUgMmsA2TtdhqbHy/Z5sQI6oEEIIV3G4Sr6iooKNGzeyceNGampqCAsLIywsDEVR+P777zGZTO6I06NoFA2P3vB7HvRqiwIovgEUFl/YkVS+tRRCiNZA2sdr8+8TXzE/9W1OlqRjuTCNtG39ppGmHsmnxFRFbISB69sGN0aYQgjR6jjsGK5evZrJkyezb98+VLX2ovTdu3dz3333kZWV1WgBeiKtRkuviO70rrYNuCo+gbK+UAghWhlpH69NrH8U0YZI4gJisJy2dQy19ZxGukGOqBBCCJdzOJV0/fr1rFixgu7du19W9swzz9CvXz/mz5/PokWLGiVAT5Rdnst/Tn1DjAFGdEhEExyNMd3WMYyQjqEQQrQK0j5emwe73AuAaq2h6vqh1GQfQRvZyenHp+WUciq7FD+9jhu7RTVWmEII0eo4HDFUVfWKjd4FQ4cOta+raC2KzCXsKzxIhl6L74jxaNt0oPD8GsMwmUoqhBCtgrSP9XemLJtd+fsoNpegaLToe9+J3+3Po2icP39wQ6pttHBor2j03vU7t1AIIcTVORwxdGaKxpUO+W3J4gNi+PMN4/DT/TI6WFh6YY2hjBgKIURrIO1j/f2cu4uNZ37g9va3ckf7W+v9+FJTFT8fyUMBbu4rm84IIYQrORwx9Pf358CBA1ct379/PwaDwaVBebpA7wB6R/SgXW4m1Sd3YK2psa8xDA+WjqEQQrQG0j7WX7Qhkm5h19PBL4qK9YuoOvJ9vR6/7WAulhqVnh3DaBMsM3SEEMKVHI4YPv300/z5z3/m3nvvZfDgwfYzmfLy8tiyZQtr1qzh/fff
b/RAPUlq7m525e+l+/6f6FlhgYeXYq6qwVevxU8vZxgKIURrIO1j/d0YM4AbYwZQfXI7laf3QHUl3l2GO/34nccKABjYXdYWCiGEqznsxXTp0oX33nuPOXPm8Le//c0+dUZVVQYNGsQHH3xAx44dGz1QT5JZnsPewkNEeWtRrHoKLuxIGugru6MJIUQrIe1j/ZRWlbEjdxftAxOIuXBMRYLzu5GWlJs5kVmCTquhZ8ewxgpTCCFaLaeGt7p06cLf//53ioqKOHPmDADx8fGEhIQ0anCeamB0P+K1fgRn/B3FEISxRNYXCiFEayTto/PSSjJYfWIdXUI68Ycz+4D6nV+461gBKtCjfSi+MjtHCCFcrl6frDk5OWRnZwPg7e3dahu+KEMk4b4lnKuuAb3hl/WF0jEUQohWSdpHx4L1QQyJHUh0jQbMJpSgSDTB0U4//sI00n7XRzRWiEII0ao51TH8+eefefHFF8nIyKh1f0JCArNmzSIxMbFRgvNUa05+TV7hCW721hKrN1BiMgMQHKBv4siEEEK4k7SPzksIjCchMB7z9s+oon6jheXnqjmSUYxWo9CrU3jjBSmEEK2Yw11Jjxw5wvjx47npppv497//TWpqKjt27GD16tUMHTqUCRMmcOTIEXfE6jGOFZ1kr+kMZo2Coven/Fw1AP6+Xk0cmRBCCHeR9rF+vjy5np+yf7ZtOgPo2jq/vnD38QKsqkqXtsHS1gohRCNxOGK4ZMkS5s6dy6231j5vKDAwkBdffJHExESWLFnCm2++2WhBepp7r7uTwmM/EFH1PxS9AZPRAkjHUAghWhNpH51XVlXONxn/Ra/xJqU4B7x80UZ1dvrxO49emEbaprFCFEKIVs9hxzAnJ+eyRu9iI0eOZNmyZS4NytN1CGpH26hSLBXVaCM7UXayCpCOoRBCtCbSPjpPQeG3HZMxW8wEDhxCTVEWita5bQ7OmS0cSj+LAvTpLOsLhRCisTj8VPb1dXyArJ+fn0uCaQ5UVWXJvg/Ra/X8cegf0Gq0lG/cBoBBOoZCCNFqSPvoPH9vAyMTbrbf1kVf7/Rj954sxFKj0jkuiCCDd2OEJ4QQAic6hpmZmSxevNjhNa1FtdXCIeNRdBodWo0WANP5NYYB0jEUQohWQ9pH5/37xFd4KxoGh3YnKCS+Xo+VaaRCCOEeDjuGZWVlpKam1nlNeXm5ywLydFpFw4Rej3Au+zDVRzejieuJqdLWMTT4yrlKQgjRWkj76Jwaaw3fZ/5IlbWavv/7f3h3GoTvTX926rHm6hr2nzICckyFEEI0Noc9meHDh7NgwYI6r3n++eddFpCn02q0dAu7HtP3n1JZmA7JU1FV8NXr0GocbvIqPFRNjYWiogIsliqX1pufr8Fqtbq0Tnfw5Lg1Gi2+vv74+wehKEpThyNaMWkfnaOi8ruu95N56BsM1vx6nV144NRZqqqttI8OJDRQzgoWoim5Ilfy5PziSppbvGCLGZRrypUcdgzPnTvnsBJHDWNLkl2eyxcn1hLlZeI2oEK1nV0o00ibt6KiAnx8/DAYolza2dDpNFgszesDBTw3blVVqamxUFZWTFFRAaGhMrVMNB1pH52j0+jo26Yn13/7ISr1O79w57F8QEYLhfAErsiVPDW/uJrmFi+AVqtgNlddU67ksGO4Y8cOxo0bh6qql5Vd+KVQFIUVK1bUI+Tmq6SqlMNnj1GtqQGg3GpbCC8bzzRvFkuVyzuFwvUURUGn8yI4OIy8PFm7JZqWtI/OWXPya0rK8xhsMRFpCEETEuvU46otVvaeKASkYyiEJ5BcqXloSK7ksGPYpUsXPv7448vuP336NNOmTePkyZO89NJL9XrS5izeP5bHb/gD6tdvAApl1bYNaOSoiuZPPuiaD0XRAJcn40K4k6vax+zsbGbPnk14eDj5+flMmzaN+PjaG7Soqsr8+fMxGo2Ul5dzyy23cM8999S6JjMzk1GjRvHiiy9eVtaUdubtxVh5
liRFQRvezunP2sMZRZwz1xAX4U9kiOzuKoQnkFyp+biWXMnhorjXXnvtsvuWL1/Ob3/7W8LCwli7dm2d5zi1NP7eBroFxNOushr0fpSfs40c+svGM8LD7Nz5M488MpYPPnj3iuUpKS+wbdtWAHbtSuWJJx5t0PO5og4hmhNXtY8zZ87knnvu4eWXX+aBBx5g+vTpl12zfv16MjIyePXVV1m4cCFvv/12rR1PVVVlwYIFxMY6NxrnTn/s/hB3eccTWWVBE9bW6cftPGqbRtpfRguFEI3EUa704otTWlWu5LBjGB39yyLxkydPMmbMGN59913mzJnDokWLCA0NbdQAPc2u/H0sOfB3UgN8UPT+lJ8/qsLfV85WEp6lX78BDBo05KrlTz75LP36DXBjREK0LK5oH4uKitiyZQtDhw4FYNCgQaSmppKXl1frujVr1jBs2DAA9Ho9iYmJrFu3zl7+ySefkJycTHBwsCtemku1D2rL4LIqtIAmzLmjKmqsVnYft00j7SsdQyFEI3GUKz399HOtKldyapjLarWybNkylixZwrBhw1iyZAlhYWGNHZtHyjPlc7g0nWgvLYrWYD+qQkYMhStZLBamTp1E27YJVFWZCQoK5pFHHuOFF55j8+bvee65Kfz3v99x8uQJvvpqIxs3fsvmzd/Tpk0bTpw4ztixf6RPn34A5OfnkZLyAsePH2P06Ae566772LUrlaVL32TgwMHcf/8DfP75SjIzz7Bw4av06NGL22+/nY0bvyM1dTuBgUEUFuYzfvzThIWFU1xczFtvLSQkJJTi4iJ0Oh3jxz91WR1FRUY++GAZK1Z8iqJomDfvZdq0acO0aTNZvfpfLF/+PiNHJpOdncnu3buYMmU6nTtfzwcfvEtYWDj5+Xnceutt3Hjj1T+whWhqDW0fs7Oz8fPzQ6+3bWTm7e1NYGAgWVlZREZG2q/LysqqVW9YWJh9xDA9PZ3Dhw8zduxYPvnkExe9Mtf4Om0jx4pPcpOfno6BbdA6OWJ47HQx5eeqiQz1Izbc0MhRCiGaI3fkSn/721skJQ26Yq40cuSvW1yu5LA3c+TIEaZOnUpWVhZz587lzjvvvOyaJUuWMH78eJcE5OkGRPUhTutHYMYhdAExlOVc6BjKGkPhWrfffic33XQLAJMmPc3BgweYN+91hgzpT5s2kbz11rusWvUZ6elpvP32Iv7f//s33t7ebN26hVOnTtg/7LKzs3jzzb9x5sxpnnzyUe666z769u3PwIGDAQgMDOL++x/gww+X8eyzkwFIT0/jo4/e4+OPV6LRaPjPf/7NkiVvMn36yyxevICuXbszevSDADz//FNXrAPg889XAhAVFcVttyWze/dOAO6++z4OHTpATk42c+bMZ//+vfj4+PDyy9N57LEn6NWrNxUVJsaMuZt//nMVAQEB7nnThagHT2gfrVYrCxYsYNasWQ2uKyzMv8F1RETU/ls9uf8Ux4pOcNeQx2kf29PpelZtTgNgWJ9Y2rQJbHBcV3NpvNfCYNC7pB5y9BicqMclz+VGnhSvwQAREfo6r/GkeJ3lrpjz8zXodA0/ms0VddhouPPOUfzqV7Zc6dlnn+LIkYPMn/8GAwf2JTo6iqVL3+Nf//p/nDmTzttvL2LVqi/x9vbmxx83k55+kgEDBqDRKOTkZPPOO+9y5sxpHn/8z9x332gSExPZu9eWK4WGhvDAAw/x/vvv8te/vgDYcqXly9/jH//4DI1Gw5o1q/nb394iJWUWb775Ot27d2fMmIcAmDjxySvWAbZcSavVEBMTQ3Ly7ezatROdTsP994/m8OGD5OXl8Oqrr7Nvny1XmjVrOo8//iS9e/fBZDJx332/5bPPVtfKlS68xxqNpl6/Hw47hvfddx+qqnLvvfeSnp7O22+/fdk1q1evbjUdw3DfMMLbDoK2gwAwndoPgL+fTCUVrqPVasnLy2XevJfx8zOQk5PNmTMZdO/eA4ABA5IAuPfe0fzrXyu5
7rrOeHvbfgcvnRLRvfsNKIpCbGwcZ8+eder5f/55O1VVZl5//RUAKioqqK62nVu0fftP9g86gAUL3rzm19m/fyIAN9zQi4oKE/v27eHrr//Dt99+BUBsbBz5+XnSMRQeyRXtY0xMDBUVFZjNZvR6PVVVVZSWll62VjA2Nhaj0Wi/bTQaadeuHUePHsVsNvPGG28AkJaWxurVqzl58iSTJk2q1+sxGsuxWq99U6eIiAAKCspq3Tfu+gdIi8ogQom6rOxqrKrKlr1ZAHSND3L6cfV1pXivhclkdkk9fhVmKhzU46qY3cXT4jWZvCkouPoZfJ4WrzPcGbPVam3w0Q2uPP5BVW0dulmzZuLnZyA7O4v09HS6dOkOQN++iVgsVu666357rqTR6LBYrCQl2Tp8FosVq1WlW7ce1NSoREbGcPassVaMVquKxWKlpsaKqqr2su3bt2E2m3n11bnAL7mSxWJl27atjB79oP3a+fMXX7GOC2pqrPZYLi5XVZW+fQdgsVjp1u0GKipM7N27h7Vrv+Trr23LCWJj48jOzqFjR8Nl77HVaq31+6HRKHV+CeiwY9ipUyemTp1a5zUbNmxwVE2LsS7tOzJKz3Bbwq/oGNzulzWGPjKVVLjOhg3fsH79Ot5//+9otVrmzJlZ64DVC51AcLxDmJeXbTRbq9VecVv9q4mPT2DSpF/+9isqKs4/n9NVoCiKPW6LxXJZ+cWv44JHHx1PaKhtypzZXIlOJ6PxwjO5on0MCQlh8ODBbN68mREjRrB161b69u1LZGQkGzZsICkpiYCAAEaNGsW6desYM2YMZrOZHTt2MGHCBOLj43nvvffs9aWlpXH33Xd7xK6kqqoS6B1AD9UPNfcEakR7FB/Ho5KnskopKa8iLNCHhEj5UkgIcWWSK7k+V3I4lvvEE0+QmJhY539PPPGES4JpDjJKz3DQeITSkz9RYzxD+fk1hnKOoXCl0tISDAZ/tFrbcSh5eblXvbZ//ySOHz9GVZXtW9CtW7fw5Zer6/V83t56+4fSunVfMmBAEkePHqKiwgTAsWNHeOuthQAkJQ1i3769gC3xmz59ChaL5bI6AEJDwzAabRtIHD9+rM4Y/PwM3HBDL3bs2AbYvuV67rmnqK6urtdrEcJdXNU+pqSksGrVKmbMmMHKlSvt00IXLVrE0aNHAUhOTiY+Pp7JkyczceJExo8fX+tIC4vFwssvv0x6ejpffvkln3/+eeO86Hr4PmsrM7a+wqYD/+Lc169TnZbq1OMuPtRetsYXQlxNU+dK/fu3vFzJ4TDXiBEjHFbizDUtxW86/JqBJZVE7fmWGu8wyitsZyvJGkPhSrfddgdbtvzAiy9OJioqmrKyUr755mu2b/8JgIULX+UPf/g/QkPDSEhox4QJTzNnTgrh4W0wmcp5+unnOXBgH9u2/QjAkCHD2b7dtt3yBx+8S58+/exlffr0o1u3Hqiqyty5LxETE0u7du155plJzJo1g5iYOMrLyxg//ikAnnrqOd5663XefPN1ysvLufXWX6PT6ejU6bpadQA8/PDvWbr0LW64oSeqauXQoQNs2rQRLy9vDh06QH5+PoGBgQwZMhyA6dNf5u23F3HkyGEqK8/xu9/9AR8fH7e+90I4y1XtY1xcHEuXLr3s/rVr19p/VhSFKVOmXLUOnU7HjBkzmDFjhsPnc5fTpZkYK8+iltu+g3Zm4xlVVdl5tACQQ+2FEHVzR670008/oqrqFXOlhIR2LS5XUtT6jJc2c65aP3Huv+9iOfET+pv+zIR/V1FjVfnbc8Px9tK6MFrX8tR586+95s1f/3r1+f7gnthzczOIikpweb2unEvvTs0h7qv9m3nq77oznIn9tR1z+Wti3dMXAfxOzqWio+PrXMWT3ndHayjElbl6jWGNtYbs0kx0X7yMf40V/z++i6Krez1+Rm4ZLy3/mSB/b16fMBhNI44Yuup31tm/SUec+Zv1pL8zZ3havI5yDk+L1xnujNkV
uVJzyC8u1tzihdoxX/pv1uA1hqK29w98QmXNGe7VKGi0vtRYzXh7aTy6UyiEEEK4k8VqWycTU61SYalBExzjsFMIkHr+UPu+nSMatVMohBDictIxrKejZ49Toa1CAc6pti2PZRqpEEII8YuDxqN8dPAfDPSN4w6cP9h+17Hz00g7yzRSIYRwN1cdJNJqPHrD7/l9MfhYVSpU27ef0jEUQgghfpFfUUC11YJ3pW1TBo0T6wszcsvIMVbg7+vF9W2DGztEIYQQl5ARw3q6LqQDZWW2hq68RjqGQgghxKVuTbiJQTGJmNbbzld0ZuOZNVtsh9rf2D0KrUa+txZCCHdzW8cwOzub2bNnEx4eTn5+PtOmTau11TbYdiObP38+RqOR8vJybrnllsvOYsrMzGTUqFG8+OKLbj+nqdRczhu7lhIYpGV0PpRabG+fdAyFEEIImxprDaVVZYT4BOOV0I8afaDDEcOT2SXsOVGIt5eG2290/UZgQgghHHNbx3DmzJmMHj2aESNGsGnTJqZPn87y5ctrXbN+/XoyMjJ45513MJvNJCcnk5iYSFxcHGDrOC5YsIDY2Fh3hV1LeZWJE8VpRAQF4RWRSFmlbccf6RgKIYQQNlmmHF79+U06BrXn2X6PQ89fO3zM6h9OAXBr/3iCDI43qRFCCOF6bpmrUVRUxJYtWxg6dCgAgwYNIjU1lby8vFrXrVmzhmHDhgGg1+tJTExk3bp19vJPPvmE5ORkgoObZu1BqG8wT/d5lId6/wGfwb/DdM52mKR0DIUQQgib4soSfHW+hPmGOHX9kYwiDqUX4avXclui4ymnQgghGodbRgyzs7Px8/NDr7ft4unt7U1gYCBZWVlERkbar8vKyiIsLMx+OywsjMzMTADS09M5fPgwY8eOdNFNKwAAIABJREFU5ZNPPnFH2Jfx0enpHNLJfrv8fMfQIB1D4WJDhvTnt7+9h2HDbiYp6UYA9u3bw+uvv8LAgYN5/PEna12/cuUnHDp0EEWBTp06M3bsH50qu9iiRQuorDyHn58fJ08eZ+zYP9G/fyIAZWVlLFgwF4PBn4KCfB555DG6dOnmsOyNN17jwIH9DBo0hEceeczl75MQwvP0jOjOa+EpmE5tx7znK7za9UETHH3Fa1VV5YvNttHC2xLbyhetQginNXWudOLEccaNa1m5UrPYfMZqtbJgwQJmzZrVoHoaeuDxwfxjfL7nCzpbvbir081Unz/vMqZNABERAQ2q2x08MUaDASIi9A6va+zY8/M16HSNM4B+rfW+8MKL9p/T0k5x+PABOnW6Do1GqVXnoUMH+e679Xz00ScoisIjj4yjT5++9O7dp86yS/n46Hn++b8CsGHDtyxevIBPP/0XAO+/v4SePXsxZsxDnDx5gilTnuezz1ajKEqdZZMmTeG99/7WoPfhajQazVV/Lzzxd91ZjmI3GPTOvb4cPQY3vw/N+X0XrpNWcppY/2h0J3ZQlbEbjX/IVTuG+0+d5URmCf6+Xtza37kjLYQQ4oJJk6baf05PT+PAgf107HjdZdcdPnyQb79dz/vvf4yiKDz66O/p2bMPvXr1rrPsUt7e3jzzzPMAbNz4HYsWLeCTTz4DYNmyJXTv3pPRox/k1KkTTJ36Vz79dBWKotRZNnHiX/ngg3cb6R2qH7d0DGNiYqioqMBsNqPX66mqqqK0tPSytYKxsbEYjUb7baPRSLt27Th69Chms5k33rDtbpaWlsbq1as5efIkkyZNcjoOo7Ecq1W95tdRYDJyqCgDfek5CoutGIuTAFAtNRQUlF1zve4QERHgkTGaTN4UFFTVeY07YrdarVgsVpfXq9Nprrneix8XH9+OMWPaMWfOTKxWtVbZV1+tJTHxRqxWAJWkpEF89dVaevToVWfZpf7ylyft9Z4+fZoOHTrab69f/xXvvfcxFouVhIQOVFdXs3fvXnr06FlnGWD/m3P1+2u1Wq/4e+Gpv+vOcCZ2k8ns1OvzqzBT4cb3
wZPed41GafAXgeLalFWVs2Dn2/jpfJl+1oTC1Y+qUFXVvrbw9oEJ+OqbxXfVQggP1a5de9q1a8+cOTMvK/vmm69ISroRzfkdjwcOHMw336yjV6/edZZdavz4p+w/nzmTQceOHe23v/3Wlg8BdOjQCYulmoMH99OjR886yzyJWz6FQ0JCGDx4MJs3b2bEiBFs3bqVvn37EhkZyYYNG0hKSiIgIIBRo0axbt06xowZg9lsZseOHUyYMIH4+Hjee+89e31paWncfffdbt+VtHd0d8YH9UV3egNKhIHyCplK2lL9uD+HLftyGlyPooB6yXcRQ3pGM/iGK397fi1ycrLp12+A/XZoaBgHDuxzWHYlR44cZsWK9ykvL2PWrNcAKC0twWQyERISar8uJCSU7Oxs2rZNuGqZp33YCSEaX2lVGbH+0fhrfVHKtoFWhyYo6orX7jxaQEZeGUH+3vyqb9NsKieEuHbXkitdKS+6kuaQK5WVlTF7dsvKldx2UFBKSgqrVq1ixowZrFy50j4tdNGiRRw9ehSA5ORk4uPjmTx5MhMnTmT8+PG1jrSwWCy8/PLLpKen8+WXX/L555+7K3wAgn0C6aR6E1NlweplINtYAUBEsK9b4xCisXTp0pV5817n4YfH8cQTf6aqqu7RXCGEuFisfzRTEyfyl+jhAGhC41E02suus1pVVp9fW/ibQe3w9rr8GiGE8EQXcqUHHxzb4nIlt83biIuLY+nSpZfdv3btWvvPiqIwZcqUq9ah0+mYMWMGM2bMaJQYHVl/fBM/lO0nyaAnocoLS42VmHCDLJZvgQbf4JpvqhoyldRZ0dExFBUV2W+fPWskKirGYdnFampqMJvN+Pn5ATBkyDCmT5/KqVMn6dKlK35+BoqKzhIQYFtDVlR0lujoaAIDg65aJoRofQ4ZjxLuG0aQ8QwA2tArrxvcdiiXHGMF4UE+DOt1+WeSEMLzXUuu5I686EoaI1caPHgoM2dOa1G5kttGDFuCMyXZnFBNlGs15JQrAHSKDWriqERrN3JkMtu3/4TVakVVVbZt+5Hbbkt2WHax/Pw8Xnttjv12dnY2NTUWIiOj7PVs27YVgFOnTqLVaune/QaHZUKI1sNirWHZ/o95adtrlBnTgCuvL7TUWFmzxVY+anB7dFpJRYQQjasxcqWcnJaXK8lK73r4zfUjaHdoH2EVhexQbd92XBcnHUPR+KxWK4sWzefQoQPo9T4A9m2Yu3XrwYgRt5GSMhVFURgyZDi9e/d1WHb8+DFmz05hxYpPCQwMpKamhrlzXyIgIICMjDRSUuYQEmI7h+zRRx9n/vx5pKefIj8/j5SU2fZF2nWVCSFaj4qqCrqGdqa8uhzfjBysgCbs8hHDLftzKCiuJCrUjxt7RF5ekRBCXIPGypVmzpx+xVwpLa3l5UrSMayHqIA2VFZUYbVYOVFoAaRjKNxDo9Hw7LOTr1r+0ENj6122e/dOhg27CQCDwZ9Zs16xl1061SMwMKhW+cXqKhNCtB6BPgE81vP3AFgi9lBTkI72khHDaksN//kxHYC7hrZH64GJkRCieWqMXGnXrtSr5kqXagm5knwi18N7qf/k/UhfMjsNIqtCT6DBWzaeEY2iW7ceLF78OqmpOxqlfqvVSklJMePG/alR6r/gnXcWk5Z2kjZtZFRAiJZuR+Yejpw9TlVNFbq2vdH3uwvFu3Yb+b/d2RSVmYlv40//Lm2aKFIhREvgjlypuLh15UoyYlgPx4xpZFSdpV3QrylXz9IvNghFUZo6LNECLVu2vFHr12g0/PnPjzfqcwBMmPB0oz+HEMIzfLznX+SbjEwZ8AzxAZdv3FBZZWHdT+kA3D20AxppP4UQDeCOXOmxx8Y3+mY5npQrScewHiYkjuN0fj5btlUCMo1UCCGEAKix1tA/pifHCtIJT9vLucIv8e52C9qo6+zXbEjNpKyimg4xgfTqFNaE0QohhLgS6RjWQ7xPCLqTG9mXWQok0CkuuKlD
EkII0cxlZ2cze/ZswsPDyc/PZ9q0abXO8AVQVZX58+djNBopLy/nlltu4Z577gHgf//7H1988QVxcXGkp6fTqVMnnn32WbfOaNFqtPyh72gKCsqoWDefmqyD6DokcuF0QnNVDeu3nwbgnmEdZLaNEEJ4IOkYOklVVVK+fxM17xS/tqj8T9eetpH+TR2WEEKIZm7mzJmMHj2aESNGsGnTJqZPn87y5ctrXbN+/XoyMjJ45513MJvNJCcnk5iYSFxcHIWFhTz77LO0b9+eqqoqBg0axK233krPnj3d9hp25+9HW2ol3rsdXkZbB1B70Y6kB9LOUmG20C4qgK4JIW6LSwghhPNk8xknWawWjpVkcsrXi3Oqnuhwg5y9JIQQokGKiorYsmULQ4cOBWDQoEGkpqaSl5dX67o1a9YwbNgwAPR6PYmJiaxbtw6A+++/n/bt2wOQl5eHl5cXMTHuPTR+U+YW3k39B6eNx1Ary8DbF8U/3F6+53gBAP2uj5DRQiGE8FAyYugkjaJhaqdk8r//JxVqFEEG76YOSQghRDOXnZ2Nn58fer0eAG9vbwIDA8nKyiIy8pcd6rKysggL+2VdXlhYGJmZmbXqmjFjBjt37mTu3LmEh4fjTv3a9CY8IISEKhUAbVhbewewxmplz4lCAPp2jnBrXEIIIZwnHUMnaTVaOig+BJ6r5ierngA/r6YOSbRgQ4b057e/vYdhw24mKelGPvlkOWlppwgODuH06XTuu+8BkpJuBGzTnJcufZOzZ89iMpkYOnQ4t9/+G4dllyopKWb58g9QFCguLkZRFKZPfxmA3NxcFi+eT2hoGIWFBTz11HPExsbVWWYylbNkyZvs2pXK2LF/vOrzCiFc4+WXX6akpIQHHngAX19fBg4cWK/Hh4Vd+/KIeyNGAlC89QvOAoa4joRHBACw/0QhpkoLsREGenaJuubnaAwR52NsCINB75J6yNFjcKIelzyXG3lSvAYDRETo67zGk+J1lrtizs/XoNM1fLacK+oAGDiwL3fffS/Dh9/MwIGD+Pjj5aSlnSQ4OISMjHRGj36AgQMHAbZ86O23F1NU9Es+dOedoxyWXRpvSUkxH374HoqicPZsEVqthpSUWQDk5uawcOF8wsLCKCgoYOLE5y/Kla5cZjKV8/bbi9m5M5Vx4/5Y63kb4kLMGo2mXr8f0jF0Uo4pj39m/Jc24f74n9ET6CcjhqJxTZo01f7z9u0/8cYb76DT6Th16iSPPfYH1q7dgF6v53//28iZM2eYN28BZrOZhx++jz59+hEdHVNn2aXeeGM+EyY8TUREG3Q6Dbt377aXvf76PEaNupuhQ29i69YtvPbaXBYvXlJnmcHgz6RJU5kzZ2ajv1dCNFcxMTFUVFRgNpvR6/VUVVVRWlpKbGxsretiY2MxGo3220ajkXbt2gFQVlZGQICt4Q8KCmLQoEF899139e4YGo3lWK3qNb+WiIgASk+fAKDKN4qCgjIA/vezbc1hzw5h9vs8QUREgEviMZnMLqnHr8JMhYN6XBWzu3havCaTNwUFVVct97R4neHOmK1Wa4OPbtDpNC49/uG5514AwGKx8tNPP141V/rvfzdw+vTpWvlQr159iY6OqbPs0njnz3/VnisB7N+/117+6qtza+VDc+fOtudKVyvT6/147rkXmDNnJlar6pL35uKYrVZrrd8PjUap80tA6Rg6qayqnFNVJah6HYqqJ1w6hi1exX/mXfF+v9/YPoQqt/4D6/lNFi6mv/EhtOEJVB/dzLnjP6Kq6hUfXx+LFy9Fo7F9+xMTE8u5c+cwmcrR6/V88806Bg2yrU/S6/X06dOPDRu+YezYP9ZZdrHs7CxycrL573+/w2g0YrVaeOih3wO2b8d27NjG7NmvATBgQBJTpz5PYWEBXl5eVy0LD5cpY0I4EhISwuDBg9m8eTMjRoxg69at9O3bl8jISDZs2EBSUhIBAQGMGjWKdevWMWbMGMxmMzt27GDChAkA
TJw4kTfeeMPeOTx+/Di33HJLk7yeC5+JmvC2gO2b+N3n1xf2uU4+E4RoaeqbKymKgqqqtXKl6mNbrvr4+nB3rmSxVPO73/0BaDm5kuye4qT4gBgm6ttzR2E5FTKVVLjZhQ86gK1btzBs2M2EhtrWG+Xm5hASEmovDwkJJTs722HZxTIy0jl4cD+dOnVm/Pin6NWrD88//xRWq5Xc3Fx8fX3ta6C8vLwICAgkJye7zjIhhHNSUlJYtWoVM2bMYOXKlcyaZZuWtGjRIo4ePQpAcnIy8fHxTJ48mYkTJzJ+/Hj7kRbDhg3j+eefZ968eUycOJHOnTvz0EMPNclr8U1+Dp+RT6EJsY14nskvp7CkkkCDNx1iApskJiFE6yC5UsPJiKGTfHW+9Ox2M18fVzldE8aNsvlMi+fo2yqfQQ/XWe51/VB8uw936ZSJ3NxcvvzyC156aa7L6gSoqKggICCQfv0GAHDTTb8iJeVFMjMvHxEVQrhWXFwcS5cuvez+tWvX2n9WFIUpU6Zc8fHjxo1j3LhxjRZffWgCwtEEXLwbqW3Tmd6dwtFoZDdSIVqa+uZKl07N9Lp+KF7XD3VpTO7KlYYNu5mXXpreonIl6Rg6aWfeHn7M3UamIZjC3FBZYyjcLjc3h8WLXyclZTZBQcH2+6OioikqOmu/XVR0lvj4tg7LLtamTRu0Fx2/oigKWq0Ws7mKqKgozp07Z18DVV1dTVlZKdHRMXh5eV21TAjRulw6bR5gl30aqXt3SRVCtE6SKzWMTCV1kvFcEUeNpzinlgPIVFLhVllZmbz55kJeeGE6ISGhbNz4Lfv37wVg5Mjb2bZtKwBms5ndu3cyYsRtDssu1rVrd3x9DZw6Zds44tixowQEBNCuXXuCgoIZMCCJHTt+AuDnn7fTs2dvwsMj6iwTQrQuxu8+wvT5i1hO77HdLqnkdF45ei8t3drJofZCiMbl7lzp+PGWlyvJiKGTkqL70elsJj8czWMrNQTIiKFwo+eee5Li4mJ+97vRAJjNlcyb9zoAv/rVCA4dOsDs2SmYTOX84Q//R0xMrMOyzZs3sW7dl7zyykJ0Oh1z577GBx+8S1RUNLm5Ocyb9zpeXl7nn38KixcvYNu2rRQUFPDXv067KLarlwkhWg9zzkmsRZmgsaUWFzad6dEhFC+dtilDE0K0Ao2RK/3wwya+/PLfV8yVcnJaXq4kHUMnBXr7o9n5X+7VquzU98PLRWewCOGMlStXX7VMURSefHJivct2797J0KE32W9fd931zJkzH7h8DUB0dAyvvLLwivXUVSaEaB1U1UpVfjoAmjDbFKzd59cX9pXdSIUQbtAYudKuXalXzZUu1RJyJendOMtcAarKOasXBj+fpo5GtHDduvVg8eLXSU3d0Sj1V1RU4O8fwB13uOYg1Ss/h4nFi1+npKSE4GCZRiZES6aWFaJWVaL4BaPxDcRUWc3R08VoFIWencKaOjwhRAskuZLryYihk1SzbW2hSdUTKDuSika2bNnyRq3fz8+PP/3p0UZ+DgNPP/1coz6HEMIz1BRmAL+MFu47YcSqqnRNCMHgI2vyhRCu545c6f/+7zGX7i5/+XN4Vq4kI4ZOUittHcMKVc4wFEIIIS524QBrbZjtbMXdshupEEI0O9IxdJJ9xNAqI4ZCCCHExWqMZwDbiGG1pYb9p2xbv/eWjqEQQjQb0jF0klppAmxTSWVHUiGEEOIXaplthFATFs+h9CLM1TW0jfQnPMi3iSMTQgjhLFlj6KQLU0lNMpVUCCGEqMXvvtmEeFdSZNaz+6djgOxGKoQQzY10DJ2kje7MvqDhHCz14iYZMRSNbMiQ/vz2t/cwbNjNJCXdyI8/buarr74kJiaOM2cyaNeuA489NgFFUQBYufITDh06iKJAp06dGTv2j/a66iq72LZtW/nPf1YTHR1LYWE+Y8f+iY4dOwFQVlbGggVzMRj8KSjI55FHHqNLl24Oy9544zUOHNjPoEFDeOSR
xxrzLRNCNCFFUfAKboOaV8qe8+sLZRqpEKIxNXWuVFCQx7hxj7SoXEk6hk7SRrRnu6aUoxYjo2TEULjBpElT7T8XFRl57LEnaNs2gerqan7zm1sZPvxmunbtzuHDB/n22/W8//7HKIrCo4/+np49+9CrV+86yy5WWlrC9OlT+PTTVYSHR5Cbm8VTT43n00+/QKvVsmzZErp378no0Q9y6tQJpk79K59+ugpFUeosmzjxr3zwwbvufuuEEE3kVHYppRXVhAf5EN/Gv6nDEUK0cE2ZK2VlZTJx4oQWlSvJGsN6KCk3AxAgm88IN7vzzrto2zYBgPz8PHQ6LyIjowD45puvSEq6EY1Gg6IoDBw4mG++Weew7GLZ2VlotRrCw21Tv+Li4iksLODw4YMAfPvtVwwcOAiADh06YbFUc/DgfodlQojW5ZfdSCPs39ILIYQ7uDtXio2Na3G5knQM66G4rAqAQJlK2ios2vU3fspJdfnPDTF//lwmT36WF16YQWio7dDonJxsQkJ+ORQ1NDSMnJxsh2UXa9u2HTqdF0eOHAJg//69VFVVkZeXR2lpCSaTiZCQUPv1ISGhZGdn11kmhGh9dh0vBOSYCiFai/rmQa//vKTF5EoHDuxrcbmSTCV1Uo3VSvm5KhTA31emkoqmMWnSVEpLS3n88T/h6+tL3779XVKvn58fixcv5YsvPuP77/9HeHg4CQntMBgMLqlfCNHynckrI+9sBQYfHdfFBzV1OEKIVspduVJoaFiLy5WkY+ik8nMWVNXWKdRoZHpMa/BM37806s/1UV5ejr+/bb1OYGAg/fsn8v33/6Vv3/5ER8dQVFRkv/bsWSNRUTEAdZZdqmPHTva5+hoNfPjhMtq2TSAwMAg/PwNFRWcJCAgAoKjoLNHR0XWWCSFal20HcgDo1SkcrUYmJAnRGtQ393luwHgsFqvT19eHu3Mlq9XK8uXvtahcST65nVRmOj+NVNYXiiaQkjKV8vJy++20tFPExsYDMHJkMtu3/4TVakVVVbZt+5Hbbkt2WHapRYvmY7XaPqy3bNlMz559iImJtdezbdtWAE6dOolWq6V79xsclgkhWo/tB3IB2/pCIYRwN3fnSlu3trxcSUYMnVRacWF9oUwjFe43cOAgXn75ReLj21JYWECHDh255577AejWrQcjRtxGSspUFEVhyJDh9O7d12HZ8ePHmD07hRUrPgWgpKSEF1+cTFhYONXVZqZMedH+/I8++jjz588jPf0U+fl5pKTMRnN+RKCuMiFE61Bcbubo6SK8dBp6tA91/AAhhHCxxsqVZs6cfsVcyWyubHG5knQMnXShYxggG8+IJnD//Q9w//0PXLX8oYfG1rts9+6dDBt2k/12Ssps+886ncY+1QMgMDCIWbNeuWI9dZUJIVqHPec3neneLhS9t7aJoxFCtEaNkSvt2pV61VzpUi0hV/K8rqqHKquoBmRHUuEe3br1YPHi10lN3dEo9VutVkpKihk37k+NUv8F77yzmLS0k7RpE9mozyOEaFq77MdUyG6kQgj3cEeuVFzcunIlt40YZmdnM3v2bMLDw8nPz2fatGnEx8fXukZVVebPn4/RaKS8vJxbbrmFe+65B4Bly5Zx4sQJQkNDOXXqFGPHjmXo0KHuCh9Vtf0/LMjHbc8pWq9ly5Y3av0ajYY///nxRn0OgAkTnm705xCiJWjubWR2oQmdVqFXJ+kYCiHcwx250mOPja81g6oxeFKu5LaO4cyZMxk9ejQjRoxg06ZNTJ8+neXLl9e6Zv369WRkZPDOO+9gNptJTk4mMTGRuLg4Nm/ezEcffYROp+P48eOMHj2abdu2odfr3RL/0J7RRLcJoFOUv1ueTwghROvR3NvI8XfdgH+Aj2zQJoQQzZhbppIWFRWxZcsW+7eXgwYNIjU1lby8vFrXrVmzhmHDhgGg1+tJTExk3bp1AKxYsQKdztaPjYuLo6KigrKyMneED4CvXsev+sfjq5dlmS2V
emFYWHg8VbUCcmyMaBlaQhvZISaQ7h3C3PZ8QoimIblS83EtuZJbOobZ2dn4+fnZv7n09vYmMDCQrKysWtdlZWURFvZLwxIWFkZmZqYt0It27tm0aRO33nor4eEyZUW4hk7njclUKh94Hk5VVSyWaoqLC/H2lmndomWQNlII0RxIrtQ8NCRXanbDX9nZ2Xz22WcsXLiw3o8NC2v4NNCIiIAG19FUPDF2gwEiIhxPdWrs2IODfThz5gwFBZmN+jyi4XQ6LSEhIYSHh191q2dP/F13lqPYDQa9c68vR4/Bze9Dc37fWwppI+vHFfE6/TfpiJN/s63xPXYVZ3IOT4rXWe6KWXKl5sOZXOmKj2vEmOxiYmKoqKjAbDaj1+upqqqitLSU2NjYWtfFxsZiNBrtt41GI+3atbPfzsrKYu7cuSxYsICQkJB6x2E0lmO1Xvu3HBERARQUuG9qjit5auwmkzcFBVV1XuOu2AMCIghw8Werp77vjjSHuI1G0xXvbw6xX40zsZtMZqden1+FmQo3vg+e9L5rNIpLOjnuIm1k03BVvM7+TTrizN9sa32PXcVRzuFp8TrD3TE3NFdqbu9xc4sXasd8aa7kqH10y1TSkJAQBg8ezObNmwHYunUrffv2JTIykg0bNtjXQYwaNYoffvgBALPZzI4dO7jjjjsAOH36NPPmzWPOnDmEhYXx1VdfsWvXLneEL4QQQjQaaSOFEEJ4ArdNJU1JSWHOnDn88MMP5OfnM2vWLAAWLVrEzJkz6d+/P8nJyezbt4/JkydTVlbG+PHj7dt1/9///R9FRUX2RrCyspIlS5a4K3whhBCi0UgbKYQQoqm5rWMYFxfH0qVLL7t/7dq19p8VRWHKlClXfPy3337b4Bg0mobvYuiKOpqKJ8YeEuJcXJ4Yu7Oaa+zNNW5o2bGH+AY79foUvXPXuZKnvO+eEkd9SBvZNFwRr7N/k444+zfbGt9jV3Em5/CkeJ3V3GKWeBvf1WJ29FoUVbYWEkIIIYQQQohWzS1rDIUQQgghhBBCeC7pGAohhBBCCCFEKycdQyGEEEIIIYRo5aRjKIQQQgghhBCtnHQMhRBCCCGEEKKVk46hEEIIIYQQQrRy0jEUQgghhBBCiFZOOoZCCCGEEEII0cpJx1AIIYQQQgghWjldUwfQXGRnZzN79mzCw8PJz89n2rRpxMfHN3VYl8nLy2PBggWEhIRgNpspLi4mJSWF0NDQZvMaPvzwQ1599VWOHj0KQGlpKSkpKQQEBJCbm8uTTz7JDTfc0MRR1lZZWclbb72FxWKhtLSU3NxcPvroo2YR+3fffcdnn31Ghw4dyMjI4He/+x1DhgzxyN8Xi8XCihUreOutt1i1ahUdO3YE6v4d8ZR/gyvFbjKZeOWVV/Dy8kKj0ZCZmckLL7xAQkKCx8T+/9u5+5gq6/+P48+DeXDC0FAEkbZ0izKdt4kp60ZnDl2lOUyt1NKsSaZ5F2p5x9Rx2BwmLtds5VZuOVbeNFvqFKfmXEKmc3kH3nFADoKo4A0HOJ/vH84rSUB//cRznc7r8de5Ls70dX3O+3xurnNdV2Ntfsf27duZPn06u3btIi4uDgCv18uSJUsICQnh0qVLvPXWW7z00kuPNLc8OnbsK+5n4sSJ5OfnW9uTJk1i8uTJfkxU37/p6/ypsbzz5s1j37591vuGDRvG559/7q+YlkCbKzWV1861vGzZMm7evElYWBgnTpxg6tSpDBgwwLZ13Fheu9bxHQ913mzkgUyZMsXs3LnTGGNMTk6OmThxon8DNeLgwYMmMzPT2k5PTzcLFiwwxgTGMeTn55spU6aY+Ph4a9+SJUvM+vWiNLmCAAALpUlEQVTrjTHGnDx50gwdOtT4fD5/RWzQ8uXLzbFjx6ztvLw8Y4z9s/t8PtO3b19z5MgRY4wxR44cMc8//7wxxp718sMPP5i8vDwTHx9v8vPzrf1NtbNdPoOGshcWFpo5c+ZY7/n+++/rtbMdsjfW
5sYYU15ebj755BMTHx9vCgsLrf3r1q0zy5cvt94zcOBAU1VV9Uhzy6Njx77iflJTU/0doUn/pq/zp8by2rWdA22u1FReu7axMcZkZGRYr7dt22aGDx9ujLFvHTeW185t/LDnzbqU9AFUVFSwf/9+XnjhBQAGDhxIbm4uHo/Hz8nulZCQwIwZM6ztuLg4PB5PQBxDXV0dmZmZzJo1q97+rVu38uKLLwIQHx9PTU0Nf/75pz8iNujWrVvs2bOHv/76i5UrV5KWlka7du0A+2d3OBy0b9+esrIyAMrKynA4HLatlzFjxtCnT5979jfVznb5DBrKHhcXR0ZGRr3tu9vYDtkba3MAl8vF7Nmz79m/ZcsWK3dkZCRdunQhJyenWXOKf9i1r7ifGzdu4HK5SE9PZ/Xq1dy8edPfker5N32dPzXVT6xcuRKXy0VGRgYVFRWPOFnDAm2u1FhesHctz50713p97tw54uPjAfvWcWN5wZ513BzzZi0MH0BxcTGtW7cmNDQUAKfTSUREBEVFRX5Odi+Hw4HD4bC29+7dy9ixYwPiGNatW8ebb75JeHi4te/KlStUVVVZCy2Adu3a4Xa7/RGxQUVFRZw/fx6Hw8Hs2bMZNWoUEyZMwOPx2D47wJo1a1izZg0LFixg1apVrF69OiDq5Y6maiQQ6ufu7+uePXsYN24cYP/a37p1K7169bIuH71bUVGRbXPLwxVIfcXdBg8ezLRp05g3bx5Op5PU1FR/R7ovu/cJDRk8eDATJkwgNTWVrl278sEHH+Dz+fwdK+DmSo3lBfvX8rFjx0hJSeHAgQMsXLjQ9nX8z7xg3zpujnmzFob/YdnZ2Tz11FMMGTLE31Hu68SJE3g8HusMRyC5fv06AElJSQB0796dVq1akZeX589YD+TWrVt8+OGHLFy4kBUrVrBixQq++OIL6urq/B0t6Ozdu5eqqirGjx/v7yj35fF42L17t7WIFQk0I0eOJCwsDIA33niDnTt3Ul1d7edU/z1Dhw4lKioKgFdffZUTJ05w/vx5P6eqL5DmSnBvXrvXcvfu3fnyyy+ZPHky77zzDrW1tf6O1KR/5vV6vbas4+aaN2th+ABiY2O5ceOG9UXzer1cu3aNTp06+TlZ4zZt2oTb7WbOnDmA/Y9h9+7dXL9+nUWLFpGZmQnAokWLyM3NJSwsjPLycuu95eXltskNEB0dDUBIyN9fJ6fTidPptH32U6dOcfXqVXr37g3c7hALCgqoqamxdb3crW3bto22c1N/s5P9+/ezY8cO0tPTrbPCds6+d+9e4PZ3dNGiRQBkZmaSnZ0NQKdOnWyZWx4+u48tDfF6vRQXF1vbLVu2xOfz2Woy3RA79wmNOXv2rPXa4XDw2GOP2aqdA22u9M+8dq7luro668Q5wKBBg7h48SIlJSW2rOPG8p46dcqWddxc82YtDB/A448/TmJiovVEogMHDtCnTx9rQWA3GzdupKioiJkzZwK3n7Jk92NISUkhIyODtLQ0K3daWhpDhgzh9ddftyaip0+fpkWLFvTq1cufceuJjo6mb9++/P7778Dt+/QuXbpE7969bZ89Li4Or9dLSUkJcDt7ZWUlMTExtq6Xf2qqne3+GeTk5LBr1y7S0tJo0aIFy5Yts/5m1+yjR49m1apVpKWlkZaWBsDMmTMZPXo0UD/35cuXOXPmDIMGDfJbXmk+dh9bGlJaWorL5bK2Dx48SLdu3YiIiPBjqgdj1z6hMXffs3X06FHCw8Pp0qWLHxP9LdDmSg3ltXMtX7x40TpxCOB2u6mtrSU2NtaWddxUXjvWcXPNmx3GGNNsqf9D3G43y5cvJyoqitLS0nqPlLeT3Nxcxo8fT2RkpLUvPDyc7du3B8Qx5Obmkp2dzebNm3n77bcZN24cUVFRLF68mDZt2nDx4kU+/vhjevTo4e+o9bjdbjIyMujYsSPFxcWMHTuWxMRE65HSds7+yy+/sGnTJjp37syZM2dISkoiOTnZlvVy
+PBhfv75ZzZs2MBrr71GUlISQ4YMabKd7fIZNJT96aefZtiwYURERFi/FFZWVnL06FHbZG+szQEKCgrYsGEDGzZsYOTIkSQnJ9OvXz+8Xi+LFy8mJCSEsrIyxo0bx8svv/xIc8ujY8e+oilVVVV89tlntG7dmrCwMIqLi5k7dy6dO3f2dzTLv+nr7Jh3/vz5eL1e2rVrx4ULF0hJSbFF3kCbKzWW98cff7RtLd/9PYuIiCA/P5+xY8fyyiuv2LKOm8pr1zqGhz9v1sJQREREREQkyOlSUhERERERkSCnhaGIiIiIiEiQ08JQREREREQkyGlhKCIiIiIiEuS0MBQREREREQlyWhiKBLE1a9aQmJhIVlaWv6OIiIjYhsZHCUaP+TuAiPjPtGnTcLvd/o4hIiJiKxofJRjpF0MREREREZEgp18MRWzmwoULLFmyBK/Xi8/nY86cOZw+fZqvvvqKZ599ltDQUNxuNy1atMDlcvHEE08AcPToUTIyMjDG4HA4+PTTT+nRowcA5eXlLF26lPLycmpra+nZsyezZs2iVatWAFRUVDB37lyOHz9Ot27dcLlcABQUFLB06VIAamtrSU5OZtSoUX5oFRERCXYaH0WamRER26ipqTFJSUkmOzvbGGPM8ePHTUJCgqmsrDSrV682ffv2NR6PxxhjzNq1a82YMWOMMcZcu3bNJCQkmIMHDxpjjDl06JBJSEgwV69eNcYY895775msrCxjjDHV1dVm5MiRprCw0BhjTGpqqhkxYoSprq42t27dMgkJCeaPP/4wxhgzffp0s23bNmOMMaWlpWby5MmPqCVERET+pvFRpPnpUlIRGzly5AiFhYWMGDECgGeeeYbo6Gj27NkDQL9+/ejQoQMAI0aM4PDhwxQXF5OTk0N4eDj9+/cH4LnnnqNNmzbs3r0bj8fDb7/9Zp3JdDqdrFixgsjISOv/7d+/P06nk9DQUJ588knrvoo2bdrw66+/4na7iYqK0k34IiLiFxofRZqfLiUVsRGPxwPApEmTrH1er5fKykrg9kB0R9u2bQG4dOkSJSUl9QYygMjISEpKSigpKbG27+jatWu994aHh1uvnU4nNTU1ACxYsIBvvvmGiRMn0qFDB6ZPn86AAQP+38cpIiLyf6HxUaT5aWEoYiMxMTG0bNmS7777ztp348YNQkJCWLduHVeuXLH2V1RUABAVFUXHjh25fPlyvX/r8uXLxMTEEBMTY23HxsYCUFhYSERERL2BtCHXrl0jJSWFqVOnsmXLFqZOncqBAwdo3br1QzleERGRB6HxUaT56VJSERvp2bMnHTt2ZMeOHcDtG9o/+ugjzp07B8Dhw4cpLS0FYPPmzfTu3ZvY2FgGDRrE9evXOXToEAB5eXlcvXqVwYMHEx0dTWJiIj/99BNw+wzrjBkzrLOeTZk/fz5lZWU4HA769etHbW0tDoejGY5cRESkcRofRZqfwxhj/B1CRP524cIFli5dSnV1NT6fj1GjRpGcnExWVhYFBQW0atWKs2fP3vPUtWPHjuFyufD5fE0+da2uro53332XpKQkvv32W77++mtCQ0OZP38+J0+eZP369bRv357FixdTUlLCxo0bcTqdVFVV8f777zN8+HB/No+IiAQpjY8izUsLQ5EAkZWVRVFREenp6f6OIiIiYhsaH0UeDl1KKiIiIiIiEuS0MBQJABs3bmTTpk3s27ePtWvX+juOiIiILWh8FHl4dCmpiIiIiIhIkNMvhiIiIiIiIkFOC0MREREREZEgp4WhiIiIiIhIkNPCUEREREREJMhpYSgiIiIiIhLktDAUEREREREJcv8Dwnt3Ixij4jMAAAAASUVORK5CYII=
" />
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Figure 2</strong>. The same annealing schedule as in <a href="https://arxiv.org/pdf/1802.05814.pdf">Liang et al. (2018)</a> for 3 different architectures and for the Movielens and the Amazon datasets, using <code>Pytorch</code> and <code>Mxnet</code>. During the annealing schedule, $\beta=1$ is reached at 170 epochs (out of 200, i.e. 85%)</p>
</div>
</div>
</div>
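<p>For reference, the $\beta$ annealing schedule shown in the figure can be sketched as a linear ramp capped at <code>anneal_cap</code>. This is a minimal illustration, not the actual code in the repo, and <code>steps_per_epoch</code> is a made-up value:</p>

```python
# Minimal sketch of linear KL annealing (not the repo's actual code):
# beta grows linearly with the number of updates until it reaches
# `anneal_cap`, after which it stays constant.
def beta_at_step(step, total_anneal_steps, anneal_cap):
    """Return the KL weight (beta) for a given training step."""
    if total_anneal_steps > 0:
        return min(anneal_cap, step / total_anneal_steps)
    return anneal_cap

steps_per_epoch = 100                    # hypothetical value
total_steps = 170 * steps_per_epoch      # beta = 1 reached at epoch 170
print(beta_at_step(85 * steps_per_epoch, total_steps, 1.0))   # prints 0.5
print(beta_at_step(200 * steps_per_epoch, total_steps, 1.0))  # prints 1.0
```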
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">best_results</span> <span class="o">=</span> <span class="n">pd</span><span class="o">.</span><span class="n">concat</span><span class="p">([</span>
<span class="n">find_best</span><span class="p">(</span><span class="n">dl_frame</span><span class="o">=</span><span class="s1">'pt'</span><span class="p">,</span> <span class="n">model</span><span class="o">=</span><span class="s1">'vae'</span><span class="p">),</span>
<span class="n">find_best</span><span class="p">(</span><span class="n">dl_frame</span><span class="o">=</span><span class="s1">'pt'</span><span class="p">,</span> <span class="n">model</span><span class="o">=</span><span class="s1">'dae'</span><span class="p">),</span>
<span class="n">find_best</span><span class="p">(</span><span class="n">dl_frame</span><span class="o">=</span><span class="s1">'mx'</span><span class="p">,</span> <span class="n">model</span><span class="o">=</span><span class="s1">'vae'</span><span class="p">),</span>
<span class="n">find_best</span><span class="p">(</span><span class="n">dl_frame</span><span class="o">=</span><span class="s1">'mx'</span><span class="p">,</span> <span class="n">model</span><span class="o">=</span><span class="s1">'dae'</span><span class="p">),</span>
<span class="p">])</span><span class="o">.</span><span class="n">reset_index</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
<span class="n">best_results</span><span class="o">.</span><span class="n">sort_values</span><span class="p">([</span><span class="s2">"dataset"</span><span class="p">,</span> <span class="s2">"model"</span><span class="p">])</span><span class="o">.</span><span class="n">reset_index</span><span class="p">(</span><span class="n">drop</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_html rendered_html output_subarea output_execute_result">
<div>
<style scoped="">
.dataframe tbody tr th:only-of-type {
vertical-align: middle;
}
.dataframe tbody tr th {
vertical-align: top;
}
.dataframe thead th {
text-align: right;
}
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;">
<th></th>
<th>dataset</th>
<th>dl_frame</th>
<th>model</th>
<th>p_dims</th>
<th>weight_decay</th>
<th>lr</th>
<th>lr_scheduler</th>
<th>anneal_cap</th>
<th>best_epoch</th>
<th>loss</th>
<th>n100</th>
<th>r20</th>
<th>r50</th>
</tr>
</thead>
<tbody>
<tr>
<th>0</th>
<td>amazon</td>
<td>Pytorch</td>
<td>dae</td>
<td>[50, 150]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>NA</td>
<td>28</td>
<td>87.588</td>
<td>0.091</td>
<td>0.120</td>
<td>0.182</td>
</tr>
<tr>
<th>1</th>
<td>amazon</td>
<td>Mxnet</td>
<td>dae</td>
<td>[100, 300]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>NA</td>
<td>18</td>
<td>85.985</td>
<td>0.090</td>
<td>0.119</td>
<td>0.182</td>
</tr>
<tr>
<th>2</th>
<td>amazon</td>
<td>Pytorch</td>
<td>vae</td>
<td>[300, 900]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>0.7</td>
<td>170</td>
<td>92.263</td>
<td>0.101</td>
<td>0.137</td>
<td>0.204</td>
</tr>
<tr>
<th>3</th>
<td>amazon</td>
<td>Mxnet</td>
<td>vae</td>
<td>[200, 600]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>0</td>
<td>8</td>
<td>85.310</td>
<td>0.090</td>
<td>0.118</td>
<td>0.179</td>
</tr>
<tr>
<th>4</th>
<td>movielens</td>
<td>Pytorch</td>
<td>dae</td>
<td>[200, 600]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>NA</td>
<td>136</td>
<td>349.714</td>
<td>0.418</td>
<td>0.386</td>
<td>0.530</td>
</tr>
<tr>
<th>5</th>
<td>movielens</td>
<td>Mxnet</td>
<td>dae</td>
<td>[200, 600]</td>
<td>0.0</td>
<td>0.005</td>
<td>True</td>
<td>NA</td>
<td>184</td>
<td>348.841</td>
<td>0.424</td>
<td>0.393</td>
<td>0.536</td>
</tr>
<tr>
<th>6</th>
<td>movielens</td>
<td>Pytorch</td>
<td>vae</td>
<td>[200, 600]</td>
<td>0.0</td>
<td>0.005</td>
<td>True</td>
<td>0.2</td>
<td>155</td>
<td>365.372</td>
<td>0.427</td>
<td>0.398</td>
<td>0.538</td>
</tr>
<tr>
<th>7</th>
<td>movielens</td>
<td>Mxnet</td>
<td>vae</td>
<td>[200, 600]</td>
<td>0.0</td>
<td>0.001</td>
<td>False</td>
<td>0</td>
<td>101</td>
<td>350.479</td>
<td>0.417</td>
<td>0.388</td>
<td>0.531</td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Table 1</strong>. Best performing experiments (in terms of NDCG@100) among all the experiments I ran (which can be found in the <code>run_experiment.sh</code> file in the repo). A csv file with the results for <em>all</em> the experiments can be found in the <code>all_results.csv</code> file in the repo.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Figure 2 shows the same annealing schedule for 3 different architectures and for the Movielens and the Amazon datasets, using <code>Pytorch</code> and <code>Mxnet</code>. During the annealing schedule, $\beta=1$ is reached at 170 epochs (out of 200, i.e. 85%). In addition, I have also used early stopping with a "<em>patience</em>" of 20 epochs, which is why none of the experiments reaches 200 epochs. The vertical lines in the figure indicate the epoch at which the best <code>NDCG@100</code> is reached, and the corresponding $\beta$ value is indicated on the top x-axis.</p>
<p>On the other hand, Table 1 shows the best results I obtained across all the experiments I ran, which you can find in this repo in the file <code>run_experiments.sh</code>.</p>
<p>At first sight it is apparent how differently the two deep learning frameworks behave. I find <code>Pytorch</code> to perform a bit better than <code>Mxnet</code> and to be more stable across experiments. This is something I keep finding every time I use these two frameworks for the same exercise, for example <a href="https://github.com/jrzaurin/nlp-stuff/tree/master/amazon_reviews_classification_HAN">here</a>, using Hierarchical Attention Networks. I actually believe this is because I know (and have used) <code>Pytorch</code> more than <code>Mxnet</code>. Nonetheless, at this stage it is clear to me that I need to run a proper benchmark between these two deep learning libraries.</p>
<p>Focusing on the results shown in Figure 2, the first apparent result is that the <code>Mxnet</code> implementation performs better with little or no regularization. In fact, I ran over 60 experiments and, as shown in Table 1, the best results when using the <code>Mult-VAE</code> with <code>Mxnet</code> are obtained with no regularization at all, i.e. a denoising autoencoder with the reparametrization trick. Furthermore, the best overall metrics with <code>Mxnet</code> are obtained using the <code>Mult-DAE</code> (NDCG@100 = 0.424).</p>
<p>If we also focus on the differences between datasets, it is first apparent that the metrics are significantly smaller for the Amazon dataset than for the Movielens dataset. This was of course expected since, as I mentioned in Section 2.1, the Amazon dataset is 13 times sparser than the Movielens dataset, i.e. significantly more challenging. In addition, we see that the <code>Pytorch</code> implementation shows very stable behavior for both datasets and architectures, reaching the best <code>NDCG@100</code> later in training in the case of the Amazon dataset. Again, this is different for the <code>Mxnet</code> implementation, where we see less consistency, with the maximum <code>NDCG@100</code> being reached very early during training for both datasets.</p>
</div>
</div>
</div>
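<p>The early stopping criterion used in these experiments can be sketched as follows. This is a minimal illustration under the "patience of 20 epochs" setting described above; the function names and the toy metric are made up for the example:</p>

```python
# Minimal sketch of early stopping on a validation metric
# (not the repo's actual code): stop when the metric has not
# improved for `patience` consecutive epochs.
def run_with_early_stopping(eval_epoch, n_epochs=200, patience=20):
    """eval_epoch(epoch) -> validation NDCG@100 for that epoch."""
    best_score, best_epoch = float("-inf"), 0
    for epoch in range(1, n_epochs + 1):
        score = eval_epoch(epoch)
        if score > best_score:
            best_score, best_epoch = score, epoch
        elif epoch - best_epoch >= patience:
            break  # no improvement in `patience` epochs: stop early
    return best_epoch, best_score

# Toy metric that improves until epoch 30 and then plateaus:
best_epoch, best_score = run_with_early_stopping(lambda e: min(e, 30) / 100)
print(best_epoch, best_score)  # prints 30 0.3
```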
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">plot_ndcg_vs_pdims</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA40AAAEcCAYAAABj8VH2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzdeVxU9f748deZGQYYGPZFcc9dc0dBQdGkkixLS+te87Znt+vtVrdu3cwll7q3uuXN+pXfuq3eW1ZmoqkpGAiau2nuG4iA7NuwzTDM+f2BjaCioMCwvJ+PBw9hzjLvczwzn/M+n01RVVVFCCGEEEIIIYS4DI2jAxBCCCGEEEII0XxJ0iiEEEIIIYQQolaSNAohhBBCCCGEqJUkjUIIIYQQQgghaiVJoxBCCCGEEEKIWknSKIQQQgghhBCiVpI0CtFGrFmzhtmzZ191vdjYWCZMmMCMGTOaICohhBDi+kVHRxMcHIzFYnF0KEK0SpI0CtFG3Hbbbbz00ktXXW/8+PE8/vjjTRCREEII0TBiY2OxWq0kJiY6OhQhWiVJGoVoI7RaLW5ubo4OQwghhGhQJpMJrVbLuHHjWL9+vaPDEaJV0jk6ACEErFixgmXLljFo0CCMRiN79+6lT58+zJo1i7feeosjR47w4IMPMn36dCoqKnjrrbfYt28fAEOGDOHZZ59ly5YtvPzyy7i6uvLcc89x2223MWPGDE6fPs0TTzzBt99+i8lkYvPmzQDk5uYyb9488vPzqays5NFHHyUyMvKy8dW27oEDB5gzZw4mk4nf//73xMXFUVhYyDvvvEO3bt2w2Wy88sorHD9+HI1GQ9euXZk9ezYGg6HJzq0QQojWLTY2lltuuQWdTsff/vY3LBYLer2eWbNmER8fz5///Gf27dvH8ePHee655zCbzXz33Xfk5+ezdOlSunbtCsC7777Lzp07AXB1dWXBggUEBgayY8cO5s2bh7+/PwDJyckYjUbWrVuHqqr85z//YePGjWi1Wns55+7uzvz581m7di33338/p0+f5tixY9x66608++yzjjpVQlw7VQjRLLzzzjvqmDFj1KKiItVsNqsjR45UX375ZdVms6kHDx5UBw8erFZUVKjvvvuu+sADD6hWq1W1Wq3qww8/rL777ruqqqrqZ599pj700EP2fW7atEldsWKFqqqqun37dnXcuHH2ZQ8++KC6ZMkSVVVVNTMzUx0xYoR69uxZVVVVdeXKler9999fp3W3b9+u9u/fX921a5eqqqo6b948dc6cOaqqqmpcXJz6yCOP2Pfz5JNP2rcTQgghGsKzzz6rlpeXq2azWQ0ODlY3bdpkXzZu3Dh14cKFqqpWlYmhoaHqxo0bVVVV1YULF9rLK1VV1c8//1y12WyqqlaVg88995yqqqq6a9cuNTo6WlVVVc3KylJDQkLULVu2qKqqqqtWrVJvu+02tbS0VFVVVX3ppZfUv//97/Z93n///epjjz2m2mw2NTMzU+3Xr5+akZHRWKdCiEYjzVOFaEYGDhyI0WhEr9fTpUsXevXqhaIo9O7dm9LSUnJzc1m9ejV33XUXWq0WrVbLnXfeyXfffQfA7bffzq5du8jMzARgw4YNREVFXfI+mZmZbNu2jXvuuQeAgIAAhg4dyg8//HBN6xoMBoKDgwHo3bs3qampAHh4eHD8+HG2bt2KzWbjrbfeIigoqAHPmBBCiLasqKgINzc3nJ2d0ev13HLLLaxbt67GOqNGjQKgZ8+e5OXlMXLkSKBmeQXQvn17/vCHPzB9+nQ+++wzDh06BEBwcDB33HEHAC+//DITJkxg9OjRAKxevZqoqChcXV0BmDJlCtHR0VRWVtr3Gx4ejqIoBAQE4OXlRVpaWiOdDSEajzRPFaIZqd7nUKfT2f/W6ao+qhUVFWRkZODt7W1fz8fHx54k+vj4EBYWxurVq7n3
3nvRarUYjcZL3icjIwOAF154AUVRAMjPz6dXr17XtK67u7v9d2dnZyoqKoCqprMLFy7kww8/5KWXXuLee+9l5syZ9T0tQgghxGXFxMSwb98++4jfhYWFnD17lvLyclxcXIALZatWqwUulFlardZeXiUnJ/P000/zv//9j4EDB7Jjxw7+/ve/13ivFStWkJSUxNtvv21/LSMjAx8fH/vfPj4+VFRUkJOTQ2BgYI33g5plpBAtiSSNQrQw7du3Jz8/3/53Xl6evWACuOuuu1i6dClGo/GytYwA7dq1A+Cdd96xF3Zmsxmr1Xpd617MZDIxYsQIIiIiSElJ4dFHHyUwMJC77767jkcrhBBC1C4xMZHvvvsOJycnACwWCyNHjiQuLo4JEybUeT+HDx/Gzc2NgQMHAlxSxqWkpPDGG2/w4Ycf1uiX3759e/Ly8ux/5+Xl4eTkhJ+f3/UclhDNjjRPFaKFmTx5sr3pi81mIzo6milTptiX33TTTWRnZ/P1118THh5+2X0EBgbaayR/M2/ePHbs2HFd615s06ZNrFixAoDOnTsTGBiIzWar87EKIYQQtSksLESr1doTRgC9Xk9ERES9R1Ht0qULRUVFJCUlAZCQkGBfZrPZeOGFF5g+fTpDhgwBYO7cuUBVmbxhwwbKy8sB+P7775k0aZK9VlOI1kI7f/78+Y4OQoi2bs2aNXzyySecPn0aV1dX4uPjiYmJ4ciRI/Tv35/Fixdz+vRp9u/fz/PPP09qairvvvsuK1eupH///syaNcteQGm1WlJTU+ncuTMREREAHDt2jLlz55KWlsbRo0eJiooiPDycr776ii+++IKVK1cyaNAgpk6dSmxsLEuXLiUlJYXMzEwiIiJqXffkyZPMmTOHtLQ0MjIy8PX15R//+AcpKSkUFBRw00038eWXX/LNN9/w3//+l+7duzNz5kwpTIUQQlwXk8nEjBkzOHPmDF26dLGPgBoXF8c333zDwYMH7SOmHjx4kJCQEJ5//nkyMzM5fPgwnTt3rlFeTZ48GavVyhtvvMH27dvR6/Xs2bOH5ORkSktL+eKLL/Dw8GDjxo2sX7+e3bt388ADD9C7d2/Kysr417/+xXfffYeHhwezZ89Gr9fz+uuvEx8fby/Lly1bxu7duzl48CDDhw+v0axViOZOUVVVdXQQQgghhBBCCCGaJ2meKoQQQgghhBCiVpI0CiGEEEIIIYSolSSNQgghhBBCCCFqJUmjEEIIIYQQQohaSdIohBBCCCGEEKJWkjQKIYQQQgghhKiVztEBNBf5+SXYbNc3+4ivrzu5ucUNFFHrIOdE1IVcJ6Kurvda0WgUvL3dGjCituF6y0j5jF9KzomoK7lWRF00dvkoSeN5Npt63Unjb/sRNck5EXUh14moK7lWml5DlJHy/3YpOSeiruRaEXXRmNeJNE8VQgghhBBCCFErSRqFEEIIIYQQQtRKmqcKIUQzo6oq+fnZWCzlgDRJqi4rS4PNZrvKWgp6vQve3v4oitIkcQkhhGh8Uj7Wrm7lI1xrGSlJoxBCNDPFxYUoikJgYEcURRqEVKfTabBar1woqqqNgoIciosLMRq9migyIYQQjU3Kx9rVpXyEay8j5WwLIUQzU1ZWjNHoJQXiNVIUDUajN2VlMtqgEEK0JlI+Xr9rLSOlplEI4TA2i4XKvFzKygtQnT2lKeF5NlslWq18PV8PrVaHzVbp6DAEVZ9zc2YmaAyODkUI0cJJ+dgwrqWMlLMuhGg0qqpiKzZhzc3BmpONNTcXa17V75W5OVQWFQKQATh17Ixx7HgMg4aiaLWODbwZkAT6+sj5az4K164ibesW/B59Ete+/R0djhCihZPv9+t3LedQkkYhxHVRrRVY8/KqEsPzP5W52ed/z0W1mGvfWKNB5+OLWl5GRWoKecs/ofCH1RhHj8MtdBQaF9emOxBxWQcO/MKHH75PcnISY8aMxWw2k5WVydNPP88NN3S/7DZ79+7m44//j3ff
/b8GjWXPnl28//47jBwZziOPzGzQfYvGo/MLAFWlIPo7XHr1kYdCQohWozmVkbt372Tp0n8zalTjlJGSNAohrkhVVWylJeeTwQuJoTUnG2teLpUF+aDWPoKZxtWA1tcPnZ8fOh8/dH7+6Hz90Pn6ofXyRtFo8PV05syGWExxMVizsyiIXknhxh9wDw3HffRYdN4+TXjEorqBAwcTFXU733+/kueffwmAL79czquvvsJHH33epLEMGzacsLDRMsl1C+M+KpzSxJ+wZJ6jZNd23EPDHB2SEEI0iOZURgYHj2DUqPBG23+TJY3p6eksWrQIPz8/srKymD17Np06dbrsuqmpqUyaNImXX36ZKVOm2F8/ceIE8+fPJyAggLffftv+emVlJYsWLcJkMuHp6Ul2djYLFizAy0tGzROiLtTKSioL8i9KCHOw5lT9rZaX1b6xoqD18bUnglU/FxJDjeHq/Zg0ej3uI8NxCxlF+ZGDmOJiMZ86gSkuBtOWzRgGD8M4NhJ9x8t/Z4im1bVrN44ePUxkZDgREeOYM2chcXGxfPDBezz22BPExGwkNfUsb731T268cRC33DKB2NiNJCZuwc/Pn7y8HGbNehZvb2/+/ve/kpAQz1//+iKbN2/i1KmTrFsXS2zsRhIS4gkICODkyRPMmPEQQ4YMAyArK5N58/7OiRPHmTbtd9x11z0OPiPiShSdE0FTp5L8/vsUbViLYehwNHq9o8MSQohG4agycvjw4UDjlZFNljTOnz+fadOmERkZSVxcHHPmzOHTTz+9ZD1VVXnzzTfp0KFDjdfz8/NZu3YtgwcPJj09vcay2NhYduzYwbp16+zvtWzZMl544YVGOx4hWhpbeZk9CbTam4+eb06anwdXmNtHcXaukQjqztccan380Pn4NlhzM0WjwbX/QFz7D8Ry9gymuFhK9++ldO8uSvfuwrlnb4xjI3Hp06/N9GnI/vA9yo8capR9u/Ttj/9jf6r3drt2bWfkyDD8/ALs39V9+/ZnzJgIxo+/BW9vHz7++P949tmq7+Dk5CTefXcJK1Z8j16v5/vvV/LWW/9k4cJ/8Npr/yI8PJiAgECWLl3GypVfX7L+tm2JnD590p40pqen8c47H3D2bAp//vPjkjS2AN6hoaStWUtF6lmKt2zGI3KCo0MSQrQCUkZeKCN/Sxobq4xskqQxPz+fxMREli5dCsCoUaOYNWsWmZmZBAYG1lh3+fLlREVFsXz58hqve3t788wzz9j3UZ2/vz8lJSWUl5fj4uJCbm5urbWYQrRWqs1GZWFBtWQwm8rcXHutoa2k5Irba7287M1Htb6/NSU9X1vo5t7kSZq+Uxd8ZzyM58Q7MSX8RMn2rZhPHMN84hi6wPYYI27CbdgIFCenJo2rrTp3Lp033niV8vJy3NzcmD37FbKzM3nppef5/e//QHT0KiZNmnLZbXfv3kHPnr3Qn69dGjhwEO+99+8a6wwfHgLA3XdP49tvv6qx/sXNbfr3H4CiKHTo0JG8vLyGPlTRCBSNBq/bJ5P9wTsUxW7ELTQMrbvR0WEJIUSDaAtlZJMkjenp6RgMBpydnQHQ6/V4eHiQlpZWI2lMTk7myJEjzJgx45Kk8UqGDBnCww8/zIMPPoivry8Af/zjHxv2IIRoBmxmc1XNYN5vzUir1Rzm5UGltdZtFSenqmTQ3nzU90LtoY9vs02+dD6+eN95D5633Ebxz4kUJ8RhzTxH/tf/pXD9GtzDI3AfNRqtm7ujQ20U1/KUszG0bx9k76/xGy8vL/z8/IiP30xGxjk61tJ8uC4PHPTVmitebX2n89eqVqtFvUJ/WtG8uPTqg0uffpQfPUzRpvV4T57m6JCEEC2clJGXaqwystkMhGOz2XjzzTdZuHBhvbeNj49n1apVrFixAmdnZ1599VXWrFnD73//+zrvw9e3YW44/f3lyenF5JzUnaqqWAsLMWdmYs7OxpyVhSUrC/P5H2th4RW313l64hwQgHNA
APrz//72o/Ns3vMgXv06MRLY+W5sd99J/vbtZK1fT1lKCkXr12CK/RG/MWMImDAB54taL7REWVkadLrmM3GxRqOgKFw2prvvnsYbb7zKyy+/Yl9uMLigqjZ0Og1r164mJCSU5cs/w2azotfrOXToAKNGhdXYX/XfL15/69YEsrOzueuuKfZ4atu2Ztwa+f5pZjxvv4vyY0co3paAcfQ4dH7+jg5JCCEazeTJU3n99VeZPXue/TW93hnb+S5BP/wQTXBwCMuXf4bFYkGv13PgwH5CQ0fVus+L19+2LZGcnGymTLm7UY+lSZLGoKAgSktLMZvNODs7Y7FYKCoqqtFv8dixY5jNZvsAN0lJSaxatYpTp07x/PPPX3H/cXFxDB8+3F6TGRYWxquvvlqvpDE3t/i6R+Tz9zeSnW26rn20NnJOLqVWVGDNy73QrzAnp2rQmfOjk6oVFbVvrNVV1RBWazqq8/VD6+uPzscXzfnPwG9sQBlQVgHkFDfmYV2Xel8nvQfh22sg5hPHMMXFUH70MNkxMWTHxuJ64yCMY8fj3O3yQ123BDabDau19j6mTengwQOsX7+Oc+fO8eabr/P008/VWB4RMZ7PPvuY0NAwe8xdu/bAZlNZsGAeQUEdmDChC3/601945ZU5+Pr6UVBQwNNP/w2r1caHH74PwOuvv8aDDz6Kj48vHTteWN/PL4CSkmL+8pfn+OWXX/j5562oqsqoUWPYsWMbAMuWvX/Z4cVtNtsl15VGozTYQ0JRf/qgjhiCQyjdtZ2CddH4/eERR4ckhBDX7ODBA/z443oyMs6xZMmbl5SR48ZF8sUXnzBq1Gj7az169ERVVV599RWCgjrQpUtX/vSnv7B48Tx7Gflbf8ffysi33vqnvYysvn71MvLXX/ezfftWAMLDI+xl5H/+s6xBpuBQ1CZq2/PYY49x77332gfC+fjjj/n888+JiYkhJCQEo7Hm0+AZM2YwefLkGqOnAixdupTTp0/XGD31s88+Iz4+no8//hiATz75hLi4OD777LM6xydJY+Noi+dEVVVsJcU1E8Lzk9lbc3OoLCy44vYaN7cag85oqw0+o/X0QtE0nxqohnK914nlXDrF8bGU7Nllb6Kr79oNY0QkrgMGtbhzlpFxhnbtujg6jCsymUyoqo3MzAy2b9/GjBkPNcn76nSaOifUlzuPkjRem+stI6t/xq35eZx7bT5YrQQ8/TecO3dtmCBbmLZYPoprI9fKBS2hfATHlJH1KR/h0nN5tfKxyZqnzps3j8WLF7NlyxaysrLszVCXLFnC/PnzCQ4OBsBqtfLqq6+SnJxMdHQ0lZWVTJ061b5uYmIiJpOJBQsW8OKLL6LX6/nd735nr5H08PDgzJkzzJs3r9ZYhLheqtWKNT/vorkLs7Hm5lZNUWEur31jjQadty9aX9+a01Ocn8dQ4yoT2teXvn0QPvfNwPO2SZgS4yjZmoAlOYnc5A/R+vphHHMTbiNGXlITK65dSsoZ3ntvCT4+Prz44lxHhyNaEJ23D8bR4zD9tInCNavwf/LpZt10Xggh6qs1lpFNVtPY3ElNY+NoyefEVlpac3qK6s1I8/OuOKG94uJ6ofno5Sa0b6ApKlqLhr5ObGYzJTt/xrRlM5W5OQBoXA24hY3GGD4WrYdng71XY2gpT1IdQWoaHaMhaxoBbGWlnFs8D1tpCX6P/hHXfgMaIswWpSWXj6JpybVygZSPtWs1NY1CNDeqzXZhQvuci+YtzM3BVlZa+8aKgtbb56IJ7X9rSuqPxmCQJ+cOpHF2xjh6LO5hYyj7dT+muBgsZ5IwxfyI6adY3IYNxz1iPPr2QY4OVYg2SeNqwOPmCRSsXknB2u9x6dO/xTUjF0KItkSSRtGq2crLa8xbWKM5aV7ulSe01zvbp6XQVms+qvPzR+ftg6KTj09zp2g0GAYNwTBoCOakU5jiYik7uJ+SnT9TsvNnXPr0wzg2EueevSXJF6KJuYeNwZQQhzXj
HCW7tuMeUvtogUIIIRxL7npFi6babFQWFdaoIbTXHOblYCu+8oihWg/P8wmh/0UjkvqjcW/6Ce1F43Hu1h3nbt2pyM6ieMtPlOzcRvnRw5QfPYxTUEeMY8djGDxMHgYI0UQUnROeUZPI++8nFK1fg2FIMJpqc5EJIYRoPuTuSDR7Novl0sns835LDHPBWvuE9uh0lzQh/W3gGa2Pr9ygtEFO/gF4330vHhMmUrItAVNiPBXpqeT97zMKf1iN++ixuI8MR+NqcHSoQrR6hiHDMMXHUJF6luItm/GInODokIQQQlyGJI3C4VRVxWYqqtGnsPqPrejKE9pr3I01k0K/35JCP7QeHtJPRlyW1s0dj5ujMI6NpGTvLkxxsVgzz1G49nuKNq3HLSQM45hx6Hx8HR2qEK2WotHgdftksj94h6LNG3ELDUfrLgMVCSFEcyNJo2gSqrUCa15ejf6FldVGJFUtlto31mrR+VyY0L76vIU6Hz80Li5NdyCi1VGcnHAPGYXb8FDKjx3BFBeD+cQxirdspjgxDteBQ/AYF4m+U9scrW3Xrh18/vnHHDx4gK++WkVgYLsay2fOfIiCgnymTr2Pe+65r0Hf+5577mDp0mW0lwGLWjWXXn1w6d2P8mOHKdq0Hu/JUx0dkhBC1ElbKiMlaRSNQlVVijatJ//MKcrOZVRNaH+FKSo0Brfz8xb6XzoiqZe31BaKRqdoNLj27Y9r3/5YUs9iiouh9Jc9lJ3/cb6hB8Zxkbj0vbFNXY/Dh4eQnZ1FZmYGy5d/xl//+oJ92c6d28nPz6Nv334NXhiKtsXz9rsoP36E4m1bqmr4ff0cHZIQQlxVWyojJWkUjcJ8/AhFG9ZeeEGjqTZFRbXE8PzAM9J/TDQn+o6d8L3/ITwn3klxQhzF2xMxnz6J+fRJdAGBGCPGYxg2okn6xC75Zj8HTuU2yr4Hdvfl6amD6rTu9OkPsHTpW/zhDw/h7x8AwI8/riMy8lbS0s6yceMG3ntvCb169WbOnAX89a9PERTUgRdfnMOzz84iOzuLCRMmsmfPTlxdDbz++hJ0Oh1FRYW8//5SPD29yMvLZdCgIUycOImvv/6SoqIi/vOfZbi7G3n66eca5RyI5kHfoSOG4BBKd22ncF00vjMednRIQogWQMrIpisjJWkUjaJo0wYAAqKi0A4JRevtIxPaixZH5+2D16QpeNwSRcmObZi2/IQ1K5P8b/5H4fpo3MMicA8bg9bd6OhQG12nTp0ZM2Yc//3vZzz99PPs3r2TIUOGkpmZCcAtt0zAw8ODf//7TYqLi+nRoxcvvDAbgLlzF3LffZO55ZYoHn30CZ544mH27NlFSMhI/v3vNxkxYiS33noblZWV3HffFAYMGMS0ab/j66//xyOPzGw1zVPT09NZtGgRfn5+ZGVlMXv2bDp16lRjHVVVeeONN8jNzaW4uJjx48czZcoUAHJycnjttdfw8fGhrKyMnj178sADD9i3/eSTT9i/fz+KotCnTx9mzpwJwI4dO3jyySdxqdaUf+vWrU1wxPXjOeF2SvftpnTfbtwjbsK5c1dHhySEEHXSFspISRpFgys/eRzz6ZNoXA20nzyZvOIrjG4qRAugcXHFGDEe9/CxlO7fiykulorUFIp+/AFT7EYMw0MwRozHKSCwwd+7rk85m8IDDzzCI4/cz/33P8SGDT/w4otz+Oyz/9iXh4aOYsuWYcya9TgfffR5jW29vLzo3LmqX2iHDh3Iy6t6Mrx9+zYqKqwcOPALAO3atSMj45x93dZk/vz5TJs2jcjISOLi4pgzZw6ffvppjXU2bNjAmTNneO+99zCbzURFRTFixAg6duzI4sWL6du3L48//jgA06dPp3fv3oSGhnLgwAHWrFnDt99+i6IoTJ06lWHDhhEcHAzA7Nmz7clnc6Xz9sE4ehymnzZRuGYV/k8+LdMeCSGuSMrIpiNJo2hwRTFVtYzuY25C6+oKxSYHRyREw1C0WtyGDscw
JBjzqROY4mIoP3yQkp8TKfk5EZf+AzGOHY/zDT1a5c1uly5dCQsbw9y5L3Lrrbehu8yclt26dWfbtkROnTqJT7WRZ52cLjTl1Wi0qNX6ON9773T6978RAIvFgqYV9hnNz88nMTGRpUuXAjBq1ChmzZpFZmYmgYEXHjasXr2acePGAeDs7MyIESP44YcfmDlzJidOnGDixIn2dbt37866desIDQ0lOjqa8PBw+7mLiIggOjranjTGxMRw/PhxysvLmThxIsOHD2+qQ68Xj/G3UrJ9K+ZTJyg/cgjXfjc6OiQhhKiT1l5Gtr6SWTiUOTkJ8/GjKC4uGMeMdXQ4QjQKRVFw6dEL/0efpN0Lc3ALDQOdjvJDB8h+722ylrxO6b7dqJWVjg61wT344KNotVpuu+2OS5alpCSTn5/HggWv8c9/LqK4uPiq+wsJGcWuXdvtf7/yymxycrIB0Ov12Gw2du3aQVZWZsMdhAOkp6djMBhwdnYGqo7Nw8ODtLS0GuulpaXh63vhRsLX15fU1FQAgoOD2bZtG1B147Bv3z7OnTsHQGpqKj4+PpfdLigoiPvuu48XX3yRv/zlLzz//PMcPXq08Q72OmgMBjxujgKgcO33qDabgyMSQoi6a81lpNQ0igZVFLMeAPfwCBncRrQJToHt8Zk2Hc+oSRRvjad4azyWs2fI/eJjtN4+GMfchFvIqBY7NczBgwf48cf1GAwG/Pz86NbtBpYuXQbA1q0JbN++FZOpmK+//pI1a1YxZco0DAY3bDYbL7zwDE899Ve++aaqw/53331D585dOHz4IFlZWQwYMIinnvorb7/9Ov/61z9RVRvh4RG0a9cegMjIW3n33SWAyty5ixx4FpqHv/3tb3z00Ue8+uqrGAwGQkJCyM29+gAQnTp1sved9Pb2ZvTo0axbt44+ffrU+b19fa9/7kR//7r1/fW9ayKHt8VjyUhHe2w/vmPGXPd7N1d1PSdCyLVSJStLg07XfOq8fv11Pxs3rsfNzY3AwAB69uzB++9/CEBi4hZ27NiGyWRi5cqvWL16FXffPQ2j0R2bzcaLLz7D008/Zx/U5vvvv6VLl6oyMjs7i8GDh/Dss8/xr3+9zttv/xObTSUiYhwdO3YAqvpJ/r//929UVeWVV4r2rD0AACAASURBVBYD1OvcaDSael1XiqpeYR6ENiQ3txib7fpOhb+/kezsttsU05KaQuZb/0DR62n/8iK07u5t/pyIumlN14nNYqF093ZM8ZuxZmcBoLi44j4yHPfR49B5eV11HxkZZ2jXrvX16WsIOp0Gq7VutU+XO48ajdIgCVB95efnExYWxr59+3B2dsZisTB06FBiY2NrNE994oknGDduHPfeey8AL774Il27duWJJ564ZJ+LFy/G09OTWbNmsWjRIgwGA88++ywAS5cuJSsri4ULF5KcnEzXrl3t2y1YsACdTsdLL71U5/ivt4ys72e8ZM9O8v77KVpPL9r9fX6TjFTc1FrT955oXHKtXCDlY+3qUz7CpefyauVj80nVRYtXtOl8LeOo0Wjdm/6mTIjmQKPX4z5qDO1emIvfwzNxvqEHankZpp82cW7Ry+T+71MsaamODlM0MW9vb8LCwkhISABg27ZtDB06lMDAQGJiYjCZqm4IJ02axJYtWwAwm83s3LnT3o9xzZo17Nu3D4CSkhISEhKYNm2afbvExERsNhuqqhIfH8+kSZMA+OCDDzh58iQAlZWV7Nq1i9DQ0KY7+GtgGBKMU4dOVBYWULxls6PDEUKINk+ap4oGYTmXTtmv+0Gnwzg20tHhCOFwikaD642DcL1xEOYzyZjiYyjbv4/S3Tsp3b0T5159MI6NxKV331Y5aI641Lx581i8eDFbtmyx1wICLFmyhPnz5xMcHExUVBQHDhzghRdewGQy8eSTT9qbljo5OfHaa68xcOBAcnNzmTdvHgEBVfOBDRw4kIkTJ/LMM8+gKArjx4+3D3YzevRo/vGPf9C9
e3cyMjKYNGkSN910k2NOQh0pGg1ed0wm+4N3KNq8EbfQcHkYKYQQDiTNU8+T5qnXJ/eLj6vm1gqPwHvKvfbX2/I5EXXXVq4Ta24Opi0/UbJjG6rFDIBTuyCMY8djGBqMonMCqpqMBAZ2lmTyMura/EZVVTIzU5pN89SWrqmbp/4me9m7lB87jPvocXhPnnrN798ctZXvPXH95Fq5QMrH2tWneerlyshm0zw1PT2dJ598krlz5/LEE09w9uzZWtdNTU1l6NChfPfddzVeP3HiBNOnT+eZZ56p8frSpUsJDQ0lLCyMsLAwQkJC7PNYicZXkZVJ6S97QKvFOO5mR4cjRLOl8/XDe/JUguYuxnPiXWg9PKnISCfvqy9IXziHopgNVJaUoNFoqayU+U2vR2WlFY1G6+gwxHXyvP0uUBSKt23Bmpvj6HCEEA4m5WPDuJYyssmap9ZlUmOoynzffPNNOnToUOP1/Px81q5dy+DBg0lPT6+xzMXFhVWrVtG+fdWIe59++mmNIctF4yqK/RFUFbfhoei8fa6+gRBtnMZgwGP8LRgjbqJ0325McTFUnEuncF00RTEbcLrjTop6a/H2DURRpOt5famqDZMpH1dXqVFs6fQdOmIYNoLS3TsoXBeN74yHHR2SEMKBXF3dMZkK8PLylfLxGl1rGdkkSWNdJzUGWL58OVFRUSxfvrzG697e3jzzzDP2fVT32GOP2X9XVZVNmzbxySefNMKRiItZc3Mo3bMTNBo8xt/q6HCEaFEUnQ634aEYgkMwHz+KKS6G8mNHMK9aieWWWyjrdgMaV1eUy0wQ3FZpNBpsV527T0Gvd8Hd3bNJYhKNyzPqDkp/2UPpvt0Yx45H30lGThSirXJ39yQ/P5vMzFRAethVV7fyEa61jGySO5ErTWpcPWlMTk7myJEjzJgx45Kksa62bt3K8OHD0ddzeO6G6uPS1ubRSVnzDdhs+ISH075Pt8uu09bOibg2bf46CRgB4SMoTUkha/168jdtwlZZiQ1w69WLwKgoPIcORdHIk1XRtui8fTCOHovppxgKor/D/8mnpT+TEG2Uoij4+AQ4OoxmqbH7vjabx9c2m40333zTPprctVqxYgWzZ8+u93YyEE79WQvyydmyBRQFffj4yx57Wzsn4trIdVKNqzduU36Py/jbMCXEUfxzAiXHj3P6+HF0fv64R9yE2/CRrXLeurq43mtFBsJpmTzGT6Bk+zbMp05QfuQQrv1udHRIQgjRpjRJ0hgUFERpaSlms9k+qXFRUVGNfovHjh3DbDbz9ttvA5CUlMSqVas4deoUzz//fJ3eJy0tDa1WS7t27RrlOERNps2boLKyaj6tgMCrbyCEqDOtpxdet9+Fx80TKNnxM6Ytm7HmZFOwcgVFG9biPmoM7uERaI0ejg5ViEanMRjwuDmKguiVFK79Hpc+/aTWXQghmlCTJI3VJzWOjIy8ZFLjkJAQ+vbty4cffmjfJikpicmTJzNlypQ6v89XX33Ffffd1xiHIC5SWVRIyfatABilL6MQjUbj7IJxzDjcw8ZQ9ut+THExWFKSKdq0nqLNm3ALHoExYjxO7do7OlQhGpV7+BhMCT9RkZFOye4duI8Y6eiQhBCizWiyx3Tz5s1j5cqVzJ07l6+++qrGpMbHjh2zr2e1WlmwYAHJyclER0fzzTff2JctWbKE+Ph4Dh8+zIIFC7BYLPZlFouFX375hdDQ0KY6pDbNFBeLaq3AdcAg9EEdrr6BEOK6KFothsFDCfjL8wTMehbXGweBrZKSHdvIeH0h2R++R/mJY8jUu6K1UnROeN42CYCi9WuwVbsHEEII0bgUVe4wAOnTWB+VxcWcWzQH1WIm8JkX0XfqXOu6beWciOsj18m1qcjOwhQfS+nO7ajWCgCcOnTCOHY8hsHDULStb55C6dPoGNdbRjbUZ1y12ch8+59UpJ3Fc+KdLXrUbvneE3Ul14qoi8YuH6VDgKi34i2bUS1mXPr2v2LC
KIRoXE7+Afjc8zvaz12Mx4Tb0bgbqUg7S95/P+Xc4jkUxcVgKytzdJhCNBhFo8HrjslA1RzBlcXFDo5ICCHaBkkaRb3YykoxJcYB4HFzlGODEUIAoHV3x/OW2wiaswjvadPRBQRSWVBAYfR3pC+YTf7qlVjz8xwdphANwqVXH1x690UtL6coZoOjwxFCiDah2Uy5IVoGU0Icank5zj1749z1BkeHI4SoRnFywj00DLcRIyk/eghTXCzmk8cpjo+lOOEnDIOGYhwXib6jtBAQLZvn7XdRfvwoxVvjMY4ei87Xz9EhCSFEqyZJo6gzW3k5xVt+AqSWUYjmTNFocO03ANd+A7CkpmCKi6X0lz2U7ttN6b7dOHfviXFcJC59+su0BaJF0nfohGHYCEp376BwXTS+Mx52dEhCCNGqyd2CqLPibVuwlZag79Yd5+49HR2OEKIO9B0743v/Q7SfvQDj2PEozi6YT50g56P3yXh9IcXbt6JWVDg6TCHqzTPqDtDpKN23G8vZM44ORwghWjVJGkWd2CwWTHGxQFUto6IoDo5ICFEfOm8fvCbdTdDcxXhOmoLWywtrVib5X/+X9IUvU7hxnQwqIloUnbcPxtFjAShYs0qmmxFCiEYkSaOok5KfE7EVm9B36oJL776ODkcIcY00rq54jI2k/eyF+Ex/CKcOnbAVmyjasJZzC2eT9+2XVGRnOTpMIerEY/wENK4GzCePU370sKPDEUKIVkv6NIqrUisqMP20CQCPW6SWUYjWQNFqcRs2HMPQYMwnj2OKi6H8yCFKtiVQ8nMirv0HYhw7Hn237vKZF82WxmDAePMECqO/o3DNKlx695V+ukII0QgkaRRXVbLzZyqLCnEK6ohLvwGODkcI0YAURcGlZ29cevamIuMcpvhYSnbvpOzgfsoO7kffuSvGsZG4DhiEotU6OlwhLmEMi6A4IY6KjHRKd+/AbcRIR4ckhBCtjjyOE1ekVlZStHkjAB43T5AaByFaMad27fG5936C5izE4+YoNAY3LCnJ5H7+Eedem49py0/YzOWODlOIGhQnp6pBcYDCDWuwWSwOjkgIIVofSRrFFZXs3kFlfh66wHa4Dhjs6HCEEE1A6+GJZ9QdtJ+zCK+770Xn509lXi4F339D+oLZFKz9nsrCAkeHKYSdYehwnDp0pLKggOKEOEeHI4QQrY4kjaJWamUlptgfAfCInCD9RIRoYzTOzhjDImj34jx8H3wcfbfuqGVlmDZvJH3RHHK//BxLepqjwxQCRaPB6/bJABTFbpCRgIUQooFJn0ZRq9Jf9mDNyUbn649h8LB6b//zoQzyilOwmK1otQo6rQadRkGr1aDVKmg1Va/V+m/1dbQKOo3Gvp+q1xW0Gg0ajTSZFaIxKRoNhoGDMQwcjDk5CVN8DGUHfqF013ZKd23HpXdfjGMjce7VR5qwC4dx6d0Xl959KT92hKKYDXjfdY+jQxJCiFZDkkZxWarNRlHMBgCMkbfWewCM0+lFfLimaYY/VxTQajTnk8hLk0z7sssknlp7InthvSstq/5ajfWqr3NxLNUS5Qvvr6BRFLnBFi2Oc9duOHd9DGtuDqb4zZTs3Eb5sSOUHzuCU/sOGMeOxzAkGEUnxYtoep6330X58aMUb43HOHosOl8/R4ckhBCtgpTq4rLKfv0Fa2YGWm8f3IaNqPf267afAWB4v0CCfAxYK21U2lT7v5WVNiorVaznf7dWqpcsr7nswjoXL1NVsFbasFY29FlofDUT0Zo1qBcnoNUTz4sTWO2Vamx/22cttbm6yyS1vy2/uFb3t2WS7Aqdrx/eU6bhcetESn5OwJQYT8W5NPK+/JzCH1bjPnoc7iPD0RgMjg5VtCH6Dp0wDBtO6e6dFK6LxnfGw44OSQghWgVJGsUlVFWlaFNVLaPHTbfUu8YgPaeEvcez0Wk1zJo6mEpzRWOECVTFalPVqgS0RnJ5uaT0on+rr3dRUnvh3/MJ7lWWXS7xtV5l
WVWyq2KtrMTcaGeocWg1V6+5rS2BvVzNrb+vGx18XOnRwRO9k0zr0JJo3dzwiJyAcex4SvfuxhQXS0VGOoU/fE/RpvW4hYzCGHETOh9fR4cq2gjPCXdQ+steSvftxjg2En2nzo4OSQghWrwmSxrT09NZtGgRfn5+ZGVlMXv2bDp16nTZdVNTU5k0aRIvv/wyU6ZMsb9+4sQJ5s+fT0BAAG+//XaNbQ4fPsyKFStwcnLi7NmzhIeHM2PGjEY9ptaq/PCvVKSnovHwvKb5rtbvqKplDB/QDh8PF7KzGy9pVBQFraKg1YDeqdHeplHYbJepQb2oVtV6vkb2koS32jLr+cT30uT28jW31RNea41ll0+Qq8dXaVNr/FiwNeg50WkVenTwpE8Xb/p18aFreyM6rQzA1BIoOifcRozEMDyU8mNHMMXFYD5+lOKEnyhOjMN10BCMEZE4d+nq6FBFK6fz8cUYPhZTXAwFa1fh/8RT0jpCCCGuU5MljfPnz2fatGlERkYSFxfHnDlz+PTTTy9ZT1VV3nzzTTp06FDj9fz8fNauXcvgwYNJT0+vsay8vJx///vfvPfee+h0OsrLy0lKSmrMw2m1VFWlaON6ADzG3YziVL9MLLewnO2HMlEUmBAiT3evRKNR0Gi0OLWw+n6bqmK7TI2rPSmtltTWWMd2UZJbLfEts9rYezSTs5nFHE0p4GhKAd8nJOHspKVXJy/6dvGmbxdvOgW6o5Gbv2ZNURRc+/TDtU8/LGmpmOJjKN27m7Jf9lL2y16cb+iBcex4XPoNkBGZRaPxiJxAyY5tmE8co/zoYVz79nd0SEII0aI1ye1qfn4+iYmJLF26FIBRo0Yxa9YsMjMzCQwMrLHu8uXLiYqKYvny5TVe9/b25plnnrHvo7p169YRGBjIe++9R1lZGV5eXjzyyCONd0CtmPn4ESxnz6Bxd8dtZHi9t/9xVwqVNpURfQMI8Ja+TK2RRlHQnG9+2lD8/Y1MGtmF4rIKjqXkc/hMPkfP5HMut5RfT+fy6+lcANxcdPTp7F1VE9nVm3Y+BqlBaMb0HTri+/sH8bztTooT4ij+ORHz6ZOYT59E5x+AMeImDMGhaPR6R4cqWhmNwYDx5gkURn9H4dpVuPTuKw8phBDiOjRJ0pieno7BYMDZ2RkAvV6Ph4cHaWlpNZLG5ORkjhw5wowZMy5JGq/k1KlT/Pjjj6xfvx4fHx8WLVrE66+/zuzZsxv8WFozVVUpPF/LaIyIrPeNnKnUwpb9VbXAt4V2afD4ROvn7urEsN4BDOsdAEC+yczRlHyOJOdz5EweuUVm9hzPZs/xbAA83fX2Wsi+Xbzx83R1ZPiiFjovb7zumIzHzVGU7NiGactmrNlZ5H/7FYXr1+AeFoF7WARao9HRoYpWxBgWQXFCHBXn0indveOaulsIIYSo0mwaxtlsNt58800WLlxY721LSkoYPnw4Pj4+AEycOJE///nP9UoafX3d6/2+l+Pv33JvekxHjmBJOoXWzY2ud96G1rV+N+AbNxzFUmFjWJ8Aht0YZH+9JZ8T0XQud534+xvpdYMfk6h6qJGZV8r+EzkcOJHNgZM5FBSb2X4ok+2HMgFo52tgYA9/BvX0Y0APP7yNLk18FOLKjNDpTtTJt1OwezeZ69ZRevo0RRvXYfppEz5hYQRGReESFHTFvch3iqgLxckJz6g7yPvfZxRuWIPr4GFSqy2EENeoSZLGoKAgSktLMZvNODs7Y7FYKCoqqtFv8dixY5jNZvsAN0lJSaxatYpTp07x/PPPX3H/7dq1Iy8vz/63k5MTZnP9xqPMzS3GZlPrtc3F/P2NZGebrmsfjpT1zUoA3MLHkVdsheK6H0uZ2cqahFMA3Dyso/08tPRzIppGXa8TLTC0uw9Du/ugqr1IzynhyJl8jpzJ52hKARm5pWTknmHj+cGYOvi5nR9Ux5venb0wuLSw0ZJasxv64fOnvridPokpLpbyw7+SGxdHblwcLv1u
xDg2EufuPS9pfny93ykajdJgDwlF82cYOhxTfCwVaakUJ8ThMf4WR4ckhBAtUpMkjd7e3oSFhZGQkEBkZCTbtm1j6NChBAYGEhMTQ0hICH379uXDDz+0b5OUlMTkyZNrjJ5am1tvvZWvvvoKi8WCXq9n9+7dhIWFNeYhtTrm5NOYTxxDcXHBOGZsvbffsj+dknIrPTp40rOjZ8MHKMRFFEWhg787HfzdiQzuhM2mcibTxNEzVX0iT5wtIC2nhLScEmL3pKIo0CXQaG/K2rOjF856md7DkRRFwaV7T1y696QiKxNTfCylu3ZQfvgg5YcP4tSxMx7jInEdOARFK/9Xov4UjQav2yeTvWwpRbE/4hY6Cq2bPDQQQoj60s6fP39+U7zR0KFDWbZsGTt37mTHjh3MnTsXLy8vnnrqKQYMGEDQ+eZIVquVRYsWsXfvXrKzs6moqKB//6pRz5YsWUJCQgJpaWkkJSUxcuRItFot3t7e+Pr68tFHH7F7927OnDnDnDlzcK1H88qyMgvq9VU04ubmTGmp5fp24iD53/wPa042xrHjce03oF7bVlhtfLD6IOWWSu6/tTftfd3sy1ryORFNpyGuE0VR8DY607OjF6NubMetIzpzYzcffD1dUG0qBcUW8k1mTqYV8vOhTDbsSOFwUh45heVoNAqe7no0GhlUx1G0bu649h+A28hwFL0zFZnnqMzJpuzAPkp27wBUnNq1x93T7bquFUVRMBikiWJ9XW8Z6ciyQOfnjznpFNbMDFRbJa59+jkkDoCiEgv7TmSzYWcK38WfwtfDGX8v6YstrkzupURdXO91crXyUVHV602VWoe23DzVcjaFzLf/gaLX0/7lRWjd6/cUdsv+dD5df5QO/m688vCIGlMitNRzIppWU1wnZkslJ9IKqpqzJudzJtNU4yZY76ShZ8cL03t0CTRKEulANouF0j07McXHYs2q6rOquLjQ/amnMLfres37leap1+Z6y0hHlwWW1LNkvv0P0Ghp/+JcdL5+TfK+FVYbJ9MKOZiUy6GkPFIyi2ss12k1/PnuAQy4wbdJ4hEtk6M/P6JlaOzuG81mIBzhOEUxVSOmuo8aXe+E0WZTWb+9qv/YbaFdZA490Ww567Xc2M2XG7tV3ZyVlFdwPKXA3icyLaeEQ0l5HEqq6h/t6qyjT2cve5/IID83md6jCWn0etxHhuMWMoryIwcxxcViPnWC8vR0lOtIGkXbpO/YCcOw4ZTu3knh+jX43v9Qo7yPqqpk5JVy8HQeh5LzOJqSj6XCZl/upNPQq5MX/bv6UFBawcYdZ1i68gCzpgxgYPemSWSFEOJaSNLYxlnS0yj7dT/odBjHRtZ7+73Hs8nML8PP04URfQMaIUIhGoebixNDevkzpJc/AIUlFo6eTyCPnMkju6CcfSdy2HciBwAPgxN9fpveo6sP/p4ukkQ2AUWjwbX/QFz7D6SypAT/LoHk5BRffUMhLuI54Q5K9+2ldO8ujBHj0Xfq3CD7LS6r4HBy1QOnQ8l55BXVHIivo78b/bv5cGM3X3p29ETvVNU/18/PncqKSmL3prJ05a/8afIABveUxFEI0TxJ0tjGmWJ/BMA9NAytR/0GsFFVlR9+rqplnBDSGa1MnCxaME83PSH9AgnpVzV3bE5BGUdS8u0D6xQWW9h5JIudR7IA8PVwsTdl7dPFG2+jsyPDbxO0blLbK66dzscX4+ixmOJiKFi7Cv8nnrqm68laaeN0ehEHz7dMSD5XRPWGu0aDE/27+tC/mw/9uvrU+t2gKAq/v7knigZidqfy3qpf+eNdNzL0/IMsIYSoC1VVKdm+FUtZEU7johqtnJSksQ2ryMqk9Jc9oNViHHdzvbc/fL5fmIfBifAB7RshQiEcx8/LldFeroweGGRvcmaf3uNMPrlF5ST+eo7EX88B0N7XUFUT2bkqiXR3lek9RE3p6eksWrQIPz8/srKymD17Np06daqxjqqqvPHGG+Tm5lJcXMz4
8ePto4jn5OTw2muv4ePjQ1lZGT179uSBBx6wb/vJJ5+wf/9+FEWhT58+zJw5s07L2hLj+Fsp3rEV84ljlB89jGvf/nXaLjO/1N58/ciZfMotlfZlOq1Cz45e9O/mQ/+uPnQKdK9zVw1FUfjd+J5oNQo/7jzL+98fZOak/gT3kZY7Qoirs1ks5H/zP0r37ETRagkaOQ6lnvOs15UkjW1YUeyPoKq4DR+Jztun3tv/8HMyADcP72RvbiNEa6QoCu193Wjv68ZNQztiU1XOZhafnx8yn2NnCziXW8q53FJ+2puGAnQKdK8xvYers3zdtnXz589n2rRpREZGEhcXx5w5c/j0009rrLNhwwbOnDnDe++9h9lsJioqihEjRtCxY0cWL15M3759efzxxwGYPn06vXv3JjQ0lAMHDrBmzRq+/fZbFEVh6tSpDBs2jODg4Csua2u0bm54RE6gcM0qCtd+j0vvviiXaSVTWm7lyJl8DiXncSgpl+yC8hrL2/sazjc59aF3J+/rmr5HURSmjeuBRqOwfnsKH6w+xOOqyoi+gde8TyFE62ctKCD3k2VYzp5B0evp+sQTWBspYQRJGtssa24OpXt2gkZzTZMdn0ov5GhKAS56LeOGdGiECIVovjSKQpd2Rrq0MzIhpDPWShvJ50wcOVNVC3EyrZCUzGJSMov5cedZtBqFbu097H0ie3TwwEknD1rakvz8fBITE1m6dCkAo0aNYtasWWRmZhIYeCE5WL16NePGjQPA2dmZESNG8MMPPzBz5kxOnDjBxIkT7et2796ddevWERoaSnR0NOHh4WjOJ0ARERFER0cTHBx8xWVtkTF8LMWJ8VScS6N0z07chodSaav6DB9KyuNgUh6n04uwVRte2c1FR7/zTU77d62ayqchKYrCPRHd0WoU1m47w7LoQ9hsKqH92zXo+wghWgdzchI5nyzDZipC6+OL38Mz8R7Ut1FH2ZWksY0qit0INhuG4aHXNPT4uvN9GccN7YDBRZrhibZNp9XQo6MnPTp6ckdYNywVlZxMK7Q3ZU06Z+JkWiEn0wpZuy0ZJ52GHh087TWRXdsbpU9wK5eeno7BYMDZuap/m16vx8PDg7S0tBpJY1paGr6+F6Zf8PX1JTU1FYDg4GC2bdtGZGQkFouFffv20a5dVVKRmppKaGhoje327dt31WV11RDTlPj7G697Hw1FO20q+z/8gl827iUj08iBU3mUlFVcWK5R6N/Nt2qwrN4BdO/ohbYRpuC5+Jw8PmUQRncXvtx4jI/WHsbN3YWbgjvVsrVoS5rT50c4Vm5CAtkff4xqteLety83/PnP6IxV10djXieSNLZB1vw8Snb9DIqCx/hb6719ek4J+07koNNquEUKMyEuoXfS0q9r1SAYAGVmK8fOFthHZz2bVWzvHwngotfSu5OXfWTWDv5uMn2NuMTf/vY3PvroI1599VUMBgMhISHk5uY2yXu39HkaAcotVo6mFHDodB4Hk0rJ9LutasHBqnlAA7xdq5qcdvWhTxfvGk3K83IbfsTe2s7JzUM7UFZm4fuEJJZ8uZfCwjLCB8q4AW1Zc/j8CMdTKyspWLOK4i2bAXAPG4PXXVPJLwfKTTJPo2h4pp82QWUlhiHBOAXUv8/Eb/Myhg9sj6e7jBgpxNW4OusY3MOPwT2qavWLSi0cqzZHZGZeKftP5bL/VFUC4O5abXqPLt4EervKqKEtXFBQEKWlpZjNZpydnbFYLBQVFdGhQ83m/R06dKiRCObm5tK1a1cADAYDTz31lH3Z4sWL6d69OwAdO3YkLy+vxna/7ftKy1ozm6qSkmmqmjMxKY+TaYVUVkt8XXQK3YpT6aHmMXLmdNoFNZ/pLiaFdUOrUVgZf5pP1h3BpqqMGRTk6LCEEA5SWVJC7hf/wXz8KGg0eN99H+4jw5s0Bkka25jKokKKt28FqkaRq6/cwnK2H85EUWDCCKllFOJaeBj0DO8TwPDzIyTmFZXbm7IePpNP
vsnM7qNZ7D5aNb2Ht9HZnkD27eKNj0fD9qcSjc/b25uwsDASEhKIjIxk27ZtDB06lMDAQGJiYggJCcFoNDJp0iR+Ea0JxwAAIABJREFU+OEH7r33XsxmMzt37uRPf/oTAGvWrKFjx44MGTKEkpISEhIS+PzzzwGYNGkS8+fP5+mnn0ZRFOLj43nhhReuuqy1yTeZOZiUy6GkPA4n51NcrcmpokD3II+qfondfLghyIPc/3sX8/FjOO+KhzvvdmDkl5o4sisajcI3P53i0/VHsdlUxsoYAkK0ORUZ58j5+AOsOdlo3N3xe/BxnG/o0eRxSNLYxpjiYsBqxXXAIPRB9S98ftyZQqVNJaRfIAHehkaIUIi2x8fDhbAB7Qkb0B5VVcnKL7PXQh45n0T+f/buPDyq8mz8+PfMTGaSmewrkIQAIUBYAgQSwr4FIWwKCoKW2lJbrVor7evy/lTkVXGlhVZtXSmtVpFFkB0FZScQ9iUQQkzIBtmXWZKZzJzz+yOaSmXJPpPk+VyX1+U1M+fknuEkz9zneZ77PnTuGofOXQMgxM+jrj9knwg/vPVaJ78DoT5eeOEFli5dyr59+ygsLOSll14CYMWKFSxZsoShQ4eSlJTEmTNnePrppzEajTzyyCN1bTnc3Nx49dVXiYmJoaSkhBdeeIHg4NobDzExMUybNo1FixYhSRITJ04kLi7uts+1ddYaB5dyyuvaYeQVm697PsBbR7/uAfTv7k90Nz8M/7UH33f6LAqWv4bpwF68Ro9D4x+AK0kaFoFaklj9zWX+tTMNh6wwcUiYs8MSBKGVVJ0/Q8knq1Cs1biFhhG48OFGdTxoDpKiKI3fpNCONHW/Brj+mnOHycTVl59DsdkIWfQM2vCuDTreaLHx5N8OYbPLLPllHF1Dbr/Z1tU/E8E1iOvk5mRFIa/IXDcTmZZTRpXVcd1rwoL+096jd9f23d6jpfdsCDfmKnsaFUUhp9D0fSuMUi7lVGB3yHXP69zU9Ola2zOxf4+Aei3tLvn3KizHj6KPjSPgZ79scoz11ZDP5OtjOXy2Kx2A+YlRTBL1BDoUMUZ2PIqiYNy9k4rtm0FR8BgUi/+9C1Dpbr4tTOxpFJqNcd9uFJsN9+h+DU4YAXYdy8Vml4mJDKhXwigIQtOpJInwYE/Cgz25Iy68tjXANWNdUZ303Apyi0zkFpn4+lgOKkmiW2evupnIqFAf0UdVaNMqzDZSv2+FcT6rlEqzre45CYjo5EX/73smRob6oFE3rBKxT9IMLKdOYDmRgte4iWjDGj4+trRJQ8NRSRL//voSn+1KR5EV7oh3vTgFQWg62WajdPXHVJ06DoDP1Jl4TZzs9NoGImnsIGSLBdP+vQB4T0pq8PFVVjvfnKgt+z41IaJZYxMEof7UKhWRXXyI7OLDtOHdqLHLZHzf3uNCdhmZ+ZV89/1/Ww9fQaOW6BnqU1dYp3tn7wZ/qRaE1lRjd5CeW1HXMzGn8PrKpb6eWvp3D6Df90tOm7o8W+MfgNfosRj37KZ88waCHn7c6V/ObmTikDDUKol/7Uxj9TeXcSgKScPEeCwI7Ym9rJTile9Rk5eDpNMRcP8v8egf4+ywAJE0dhjG/XtQrNXoonqj69ajwcfvPZWPudpOzzAfeoX7tkCEgiA0hptGVbe3cRa1bQUu5VTUzURmFxi5mF3OxexyNu7PROemptcP7T0i/AgP9kTVAv3nBKG+FEUhv8TyfZJYwqXscmz2/yw51WpU9OrqS/9utQVsugQamj2p85o4BdORQ1jT06hOu4BHn77Nev7mMm5wKCqVxD+3X2TttxnIssK04d2cHZYgCM3A+t1lild9gGwyogkIIvBXD+HWyXWqJouksQOQq6vrero0Zpaxxi7zVUo2IGYZBcHVuWs1xEQGEBNZW9DDVFVDWvZ/iupcLbFw9rsSzn5X29bB4K6hT1e/upnIzgF6l5xlEdoXo8XGhStlte0wskopM1qvez482LOuymmv
MB/cNC27xFptMOCdOIWKzRuo2LwB9159kFSuOSM/ZmAXVJLEP7ZdYP3e73DICjNHdnd2WIIgNIHp8AHKvvgcHA50vfoQsOBXqA0GZ4d1HZE0dgCmg/uQqyxou0eii4xq8PGHz1+j3GQjNMjAwEjXqiwnCMKteXq4MaR3MEN611bZLDdZ61p7XMgqo6SymuOXijh+qQgAH09t7Sxk19okMtDXw5nhC+1EjV0mLbusdl9iZilXrhn5cVkdb71bXZLYt5s/vk7oAew1ahymA3upuZqH5fhRDHEJrR5DfY2K6YxaJfHh1lQ27s9ElhXuHNVd3PARhDZGcTgo37gO08HaLWSeYybgO2MWktr1ahG0WtKYn5/Pyy+/TGBgIIWFhTz77LN1ZcT/W25uLjNnzuS5555j9uzZdY+np6ezZMkSgoODWb58+XWvnzFjBnr9f1pArFu3js6dO7fcG2ojZKsV495dAPjckdTgAUWWFbYnXwFqZxnFgCQIbZuvp46Efp1I6NcJgKLy69t7VJhsJJ8vIPl8AQBBvu51RXWiu/rh44Qv80Lbdvj8NT75Ku26qr8atURUmC/9v08Uw4I9UTl5fJHc3PBJmkHpp/+kYvtmPAbGotK6bjub4f07Iangg82pbDqYhawozBrdQ4zTgtBGOEwmSv71IdbLl0CtwW/OfDzjhzs7rJtqtaRxyZIlzJ07l8TERPbs2cPzzz/PqlWrfvI6RVFYtmwZoaHX9xAsKytjy5YtDBo0iPz8/J8ct3DhQn73u9+1VPhtljn5ILLJhDY8Al2v6AYff/xSEQVlVQT6uBMfHdwCEQqC4ExBvh4E+XowZmCXur1lF7JKuXCljLTscorKqykqv8q+01cBCA001C1l7d3V9yd97wThv+UVmamyOugSaKDf9/sSe3f1ReeCVX31sXEY9+ymJj8X04E9eE+4w9kh3VJC306oJIn3N6Wy5dAVHLLCPWMjReIoCC7Olp9H8cp3cZSWoPLyJvCXv2lUzZHW1CpJY1lZGQcOHOCtt94CYMSIETz22GMUFBQQEhJy3Ws/+eQTkpKS+OSTT6573M/Pj0WLFtWd478dO3aMV199FZvNxsiRI0lMTGyZN9OGKDU1VH77NQDejZhlVBSFbYdrZxmnDOuK2kX3dwiC0DwkSSI00EBooIHEoeHIskJ2oZELWbWzkJdyy8krNpNXbGb38VwkCSJCvOqK6kSF+aLTul4iIDjX3WN78MCMfpiN1c4O5bYklQrfGXdR9N7bVO7aiWHYCNQG1+7rGR8dgkqSeG/TebYnZyPLCnPH9xSJoyC4KMuZk5R++i8UmxW38K4E/vIhNL5+zg7rtlolaczPz0ev16P7viGlVqvF29ubvLy865LGrKwsLly4wIIFC36SNN6Kl5cXc+bMYfr06VRVVTFv3jzUajXjx4+v9zmaq9lzUJDr9C8s2rULubICj65dCR87osEDyIm0Qq4UGPH11HHXhF6NvivsSp+J4LrEdeKaQkK8iRtQu/Kjxi5zKbuMM+lFnL5cTNqVUrKuGcm6ZmT7kWw0aoleXf0YGBVETM9Aekf4tUgBE3GttC2SJKF3d2sTSSOAe+++6Hr1wXrpIpW7duJ3593ODum2hvYJRq2S+NvGc+w8moNDVpg/MUokjoLgQhRZpvLr7VTu3ArUrmzwm3u/Sy+D/zGXKYQjyzLLli3jpZdeavCxPj4+TJ8+HQAPDw+SkpLYsmVLg5LGkhITsqzc/oW3EBTkRVGRsUnnaC6K3c7VTZsB0I+/g+Ji022O+KnPdlwAYOKQUCrLLY2Kw5U+E8F1ieuk7Qj20pIYG0pibCjWGgeXcytIvVLKxStlZF0zkppZSmpmKZ99lYZWoyLqR+09IkK8mtzeo6nXikolNdtNQqH98p0+i4I/v4rpwF68Ro9D4+/6ReAG9wri0VkD+NvGs+w6lossK9w/qZdIHAXBBcjWako//RdVZ0+BJOEz7S68xie2qd/PVkkau3Tp
gsViwWq1otPpsNlsVFZWXrdvMS0tDavVWlfgJjMzkw0bNpCRkcGTTz55y/Pn5+cTEBBQN5Pp5uZGdXXbuKPZUszHj+IoK0UT0gmPAYMafHxGfgUXs8vx0KkZPzisBSIUBKGt07mp6ypeAliqa0jLKa9dzppdRl6RmfPfV8sE8NBp6NPVt25PZGgL9NsThOagDQtHPyQey/GjVGzbRMDPfunskOplUFQgj80ewNtfnOObE3nICvzsjl5OLzIkCB2ZvaSY4pXvUXM1D8ndg4AFC/GI7ufssBqsVZJGPz8/Ro4cyf79+0lMTOTQoUPExsYSEhLCrl27GDZsGNHR0XzwwQd1x2RmZjJr1qzrqqfezPr164mJiWHs2LEAJCcnM2bMmBZ7P65OcTgw7toJgHfilEb1mvphL+O4waHo3V1mQloQBBemd3djcFQQg6OCAKgw20jLLiM1q4yLV8ooLK/iZHoxJ9OLgdo2Cz8kkNERfgT5eogkUnAZPkkzsJw6geVECl7jEtGG3bjiu6uJiQzk8bsH8Nf1Z9lzMg9ZVvj5lN4icRQEJ6hOT6PkXx8im81ogoIJ/NVvcQsOuf2BLui22UBNTQ179uzh3LlzlJSUoCgKAQEBDBgwgLFjx6Kt5zrcF154gaVLl7Jv3z4KCwvrlqGuWLGCJUuWMHToUADsdjuvvPIKWVlZbNq0CYfDwZw5c+pee+DAAYxGIy+++CLPPPMMWq2WIUOG8NFHH3Hw4EHKy8vp2bMn8+fPb+xn0uZZTh3HXlKEJiAI/aAhDT4+r9jMyfRiNGoVdwxtG4OkIAiux8egJT46hPjo2gGyuKKKi1fKuXCltjprucnG0QuFHL1QCECAt47oCP+6Fh9+Xs5v79FcY6DQ9mj8A/AaPRbjnt2Ub/6CoIcfbzM3Nfr3COD3c2L467oz7Dudjywr/CKpT5OXhwuCUD+KomA6tI/yDWtBlnHv05eABQtReehvf7CLkhRFuelGvtTUVB555BHc3NyIjIzE29sbgMrKSjIyMrDb7fz973+nT58+rRZwS2kvexoVWebamy9jL7iG370/w3PYiAaf46MtqRw8d41xg0P5+eTeTYrHFT4TwfWJ66TjURSFa6UWLl4pI/VK7Uykudp+3Ws6B+jr+kP2ifDD08OtVfc0dqQx8HaaOka21d9xh9nM1VcWo1RVEfibx/Do07fZzt0an8mFrFL+sv4MthqZEf07sXBqtEgc26C2+vvTUSl2O2VfrMGcfAAAr/GJ+Ey7q1Er/xqipcfHW840vvjii7z++usMGzbshs8fPXqUJUuWsHr16kYHKDSvqrOnsBdcQ+3nj2Hojf/dbqWkoprk1AIkqbbNhiAIQkuQJInOAQY6BxgYHxuGrCjkFpq4cKW2vUdaTjlXSyxcLbHw7Yk8JCA8xJPfz4vFz6N1lsyLMVBQGwx4T5xCxZYNVGzZiHuvPi3+xa85RXfzZ9GcgaxYe4ZD564hywq/mh4tWmgJQgtxGI0Ur3ofW2YGaDT4z72/Ud/HXdEtR16VSnXTwRIgPj4elfjD4zIURaHy6x0AeE+4A0nd8FL3O49m45AVhvUNIdjXo7lDFARBuCGVJNE1xIuuIV5Mju+K3SGTdc1Ym0RmlXI5r5LsAhOn04sYF9O5dWISY6AAeI0eh+ngXmryc7EcP4ohLsHZITVI765+LJo7kOVrT5OcWoCsKPx6Rl+ROApCM7Pl5lC88l0c5WWovX0IWPgQuq7dnB1Ws7nlXwwfHx/efvttrl279pPnCgoKeOedd/Dx8Wmx4ISGqT5/lpr8XNTePhjihzf4+EqLjX2n8wGYmhDR3OEJgiDUm0atomeoDzNGdOOp+2J5+4nR/N/CeO4a27PVYhBjoAAgubnhM6W2rVfF9s0oNTVOjqjheoX78sd7B+GuVXP0QiHvfXkeu0N2dliC0G5YTh6j8K1lOMrL0EZ0J+QPz7SrhBFuM9P42muv8X//939MmDABnU6Hp6cnkiRh
NBqx2WxMmTKF1157rbViFW6hdpZxOwBe4ychubk1+By7j+Vis8vERAYQHiz6mAmC4Dq0bmrCgz1x07Te7IgYA4Uf6IfEY9z7DTX5uRj378F7wiRnh9RgPUN9+OO8Qfz589McSytC/vI8D9/ZD41azDgKQmMpskzFjs11XQv0cQn4z5mPpGn493BXd8tCOD8oLy8nNTWVkpISAAICAujbty++vr4tHmBraeuFcKouplL8/tuoPD3p/NzLqBpY0a/KaufJvx3CYrXzzP2x9Apvnn9bsXlbqA9xnQj11ZqFcH7QEcbA2+mohXB+7IdxVvLwoPP/exG1wdCk8znrM8m8WsmfVp/CYrUzqGcgv72rf6vejBEarj38/rRHcnUVJf9eRfX5syBJ+M68G88x451WZdmphXB+4Ovry9ChQykrK0NRFPz9/UWZcRdy3Szj2MQGJ4wAe0/lY7Ha6Rnm02wJoyAIQnsgxkABwL13NLpefbBeuohx9w58Z97t7JAapXtnb56cP5hlq09y6nIx72w4y6Oz+uOmaXgdBEHoqGqKCile+R72gquoPPQE/PxXuPeOdnZYLeq2SeOnn37KZ599RkZGBj9MSkqSVNcLsSP3Q3QV1ox0bJkZqDz0eI4c0+Dja+wyX6VkAzBN7GUUBEGoI8ZA4QeSJOE7fRYFf34V4/69eI4ah8Y/wNlhNUpEJ6/vE8dTnMko4a0vzvK72QNE4igI9VB96SIl//wQucqCJqQTgQsfxi0o2NlhtbhbJo2vv/4658+f58EHH6Rnz57X9ahKT09n48aN5OTk8NRTT7VKsMKN/TDL6DlmAip39wYff/j8NcpNNsKCDMREts0BUBAEobmJMVD4b9qwcPSxcVhOpFCxfTMB9//C2SE1WtcQL56aP5g3V5/k3Hel/HXdGX53dwxaN5E4CsKNKIqCad+3lG9aD4qCe78BBNz/C1TuHaPbwC2TxpSUFNatW3fD5/r168fMmTO59957WyQwoX6sWd9hTU9DcnfHa8y4Bh8vywrbk68AtRVTnbUOWxAEwdWIMVC4EZ+pM7GcPonl+FG8xk5EGxbu7JAaLSzYszZx/Owk57PK+Mu6Mzx+Tww6kTgKwnUUew2laz/DkpIMgHfiFLynTG9TfVub6pbvVJZlKioqbvq80WjE4XA0e1BC/dXNMo4ai8pD3+Djj18qoqCsikAfd+Ki2//UuiAIQn2JMVC4EY1/AJ6jxgJQvmWDk6NputAgT566LxYfg5YLV8r4y9rTVNvszg5LEFyGo7KCwndWYElJRnJzI2DBQnymzuxQCSPcZqbxF7/4BdOmTWPs2LH06NEDLy8vJEmisrKSzMxM9u7dy9NPP91asQr/xZaTTfWF80haHV5jJjb4eEVR2Ho4C4CkYV1Fo19BEIQfEWOgcDPeiVMwHz2E9dJFqtNSce/d19khNUmXQANP3Vc743gxu5wVa07z+zkD8dDVq16iILRb1uwsSv7xPo6KctS+fgQufLhNry5oilv+NZg5cyZ9+vRh/fr17N69u67cuL+/PwMGDGDlypVERUW1SqDCT9XNMo4Yjdqz4X0Vz2eVkl1gwlvvxsgBnZs7PEEQhDZNjIHCzagNBrwnTqFiywbKN28kJKpPm5916Bxg4On7Y3nj05Ncyq1g+ZrTLJorEkeh4zIfO0Lpmn+D3Y62eySBv/gNai8vZ4flNLf9S9CrVy/+93//tzViERrAlp9H1bnTSBo3vMY1fJYRYNvh2r2Mk+LCxcZ3QRCEGxBjoHAzXqPHYTqwh5r8XCwnUjAMHebskJosxE/P0/fH8uanJ7icV8GfPj/FH+YOQu8uEkeh41BkmYqtGzF+uwsAQ8Io/GbPRdJ07N+Dtn1brAOr3LUDAEPCSNTePg0+PiOvgovZ5Xjo1IwfHNbc4QmCIAhCuya5ueGTNAOAiu2bUWpqnBxR8wj29eDp+2IJ9HHnu/xK/vT5SczV7eO9CcLtyFUWij/8e23CqFLhe/e9
+M2Z3+ETRmiGpPFvf/tbc8QhNEBNYQFVp0+AWo3X+MRGnWPb9xVTxw8OE3cQBUEQGkmMgR2bfkg8bp1DcZSVYjywx9nhNJtAXw+eum8wQb7uZF41suyzU5iqROIotG81hQUUrHiD6ovnURkMBD38OF4jx4rOAt+7ZdKYn59/2//27dvXWrEK36vctQMUBUPccDR+/g0+Pq/YzMn0YjRqFZOGillGQRCEGxFjoHA7kkqFz4xZQO3Y7DCbnRxR8wn0qZ1xDPbz4EqBkWWfncRosTk7LEFoEVUXzlOw4g3sRYW4de5CyBNP496zl7PDcim3nGKaPn06VVVVKIpy09fUN/vOz8/n5ZdfJjAwkMLCQp599lnCw29cfSg3N5eZM2fy3HPPMXv27LrH09PTWbJkCcHBwSxfvvwnx9XU1HDvvffSq1cvXnvttXrF1dbYS4qxnEgBlQrviXc06hw/9GUcHdMZH09dc4YnCILQbjTnGCi0X+69o9FF9caanoZx9w58Z97t7JCajb+3O0/fF8sbn50ku9DEm5+d5H/mD8Zbr3V2aILQLBRFwfjtLiq2bgRFwWPAIPzv+zkqnbuzQ3M5t0waFy9eTF5eHo8++uhNX7NgwYJ6/aAlS5Ywd+5cEhMT2bNnD88//zyrVq36yesURWHZsmWEhoZe93hZWRlbtmxh0KBB5Ofn3/BnvPfeexgMhnrF01ZV7t4Jsow+LgFNQGCDjy+uqOJIagGSBJOHdW2BCAVBENqH5hwDhfZLkiR8Z8yi4M+vYdy/F89R49D4Bzg7rGbj56Xj6e/bceQWmXnz09rE0ccgEkehbZNtNsrW/Lt2MgbwnjwN70lJbb4Scku55ady1113UV1dTUFBwU1fc+edd972h5SVlXHgwAFGjx4NwIgRIzh27NgNz/vJJ5+QlJSEr6/vdY/7+fmxaNEi3N1vnPmfP38eo9FIfHz8beNpq+xlpZhTkkGS8J44uVHn2Hk0B4esMCw6hGBfj2aOUBAEof1orjFQaP+0YV3Rx8aBw07F9s3ODqfZ+XrqeGr+YLoEGsgrNvPGpyeoMFmdHZYgNJq9vJzCd/6M5UQKklZHwC9+jc/kaSJhvIXbfjJ//OMfCQkJuenzHh63Tzzy8/PR6/XodLVLIbVaLd7e3uTl5V33uqysLC5cuMDkyQ1LiGw2G2+//Ta///3vG3RcW2P89mtwONAPGoJb8M3/TW6m0mJj/+naWdqkhIjmDk8QBKHdaY4xUOgYfKbOBLUGy4kUbLk5zg6n2fl8nziGBhm4WmLh9U9PUmYUiaPQ9lizvqNg+WvU5GSj9g8g+PH/QR8z2Nlhubx6l8202Wzs2rWL3NxcbLb/bITesGED06ZNa3IgsiyzbNkyXnrppQYf+9Zbb7Fw4UL0en2jf35AgGejj/2xoKCWafpZU15O7pFDAETMmY1HI37Ozu0XsNllhkaHENuvc3OHeFMt9ZkI7Yu4ToT6csa10tJjoND2afwD8Bw1FtPe3ZRv2UDww487O6Rm523Q8tT8wSxbfYqcQhNvfHqCJ+cPxt9b7P8S2gbT0cOUrf0MHHZ0kVEEPPBr1J7NkwO0d/VOGh977DGuXr1KVFRU3YwhgNV6+7tMXbp0wWKxYLVa0el02Gw2Kisrr9u3mJaWhtVqrStwk5mZyYYNG8jIyODJJ5+85flPnTpFRUUFmzdv5ty5c5jNZhYvXszjjz9OYGD99v2VlJiQ5ZsXO6iPoCAvioqMTTrHzZRv2ohSU4PHgIGY3H0xNfDnVFntbN7/HQCJsaEtFud/a8nPRGg/xHUi1FdTrxWVSmrUTcKmjIFCx+GdOBnzkUNYL12kOi0V9959nR1Ss/PSa3ly/mCWrT5JdoGJ1z89wVPzYwnwEYmj4LoUh4PyTV9g2v8tAJ6jxuJ75z1IarWTI2s76p00FhcXs2nTpp9UiluzZs1tj/Xz82PkyJHs37+fxMREDh06RGxsLCEhIezatYthw4YRHR3NBx98
UHdMZmYms2bNuq566s18/PHHdf//1ltvkZeXx4svvljft+byHCYjpkP7AfCelNSoc+w9lY/FaicqzIde4b63P0AQBEGo05Qx8Af1qSKuKApvvvkmJSUlmEwmJk6cWDcOFhYWsnjxYrp06YLJZCIgIICnnnoKSZIoLi7m1Vdfxd/fn6qqKqKionjggQcAOHLkCI888sh1NQEOHjzY2I9CuAW1wRPvxMlUbNlI+eaNhET1aZd7pDw93Hhy/mD+tPoUWdeM3yeOgwkUtRIEF+Qwmyn5+COsly6CWo3f7HvxHD7K2WG1OfX+S9a/f3/MN+g/JMtyvY5/4YUXWL9+PYsXL2b16tV1y1BXrFhBWlpa3evsdjsvvvgiWVlZbNq0ibVr19Y9t2LFCvbu3UtqaiovvvjidUuEAFauXMnevXs5ffo0y5Ytq+9bc3nGfd+g2Gy4R/dDG9bwiqc1dpmdKdkATBV7GQVBEBqsqWMg1FYRnz17Ni+++CLz5s3j+eef/8lrduzYwZUrV3j99df585//zNtvv01ubi4A77//Pp06dWLx4sW88cYb7N27l7179wKwdOlSevfuzbPPPsvLL7/MV199RXJyct15n332WQ4ePFj3n9ByPEeNQ+3rR01+bl1VxvbI4O7G/8wbRPfO3hRXVPP6pycoLK9ydliCcJ2aa/kUrngD66WLqDw9Cf7t70XC2Ej1nmk0mUxMmzaNgQMHXtfWYv/+/cybN++2x4eFhfH3v//9J49v2bLl+oA0GhYvXszixYt/8tonnniCJ5544qY/Y+HChSxcuPC2sbQlssWCaX/tl4LGzjIeOneVCpONsCADMZHtpwy4IAhCa2nqGPhDFfG33noLqK0i/thjj1FQUHBdoZ0vv/yS8ePHA6DT6YiPj2fr1q089NBDBAcHc+7cOQCqq6sxmUx1M5/p6enX7a2MjIxk27ZtJCQkALBr1y4uXbpEdXU106ZNIy4uromfiHAzKq0Wn6QZlH72Lyq2b0Y/MBbJzc3ZYbUIvbsbf7xPKjpUAAAgAElEQVR3EMvXniIjr7Juj2OIX+NrTAhCc6k6d4aSf/8DxWrFLTScwIUPofHzd3ZYbVa9k8ZTp04xZ86cnzyu1Yo+PS3JuH8PirUaXVRvdN16NPh4WVbYfuQ/s4yiEbUgCELDNXUMvFUV8R8njXl5eQQE/OfmXkBAQN1M44MPPsiTTz7Jb3/7W0pLS7n77rsZO3YsAEOHDuXQoUMkJiZis9k4efIknTp1AmrrCsybN48xY8ZQVlbGrFmzePfdd+nTp0/jPgzhtvRD4jHu2U3N1TyMB/bgPX6Ss0NqMXp3DX+YO4gVa0+TnlvBG5+e5Mn5g+nkLxJHwTkURcG4awcVO7aAouAxaAj+8xagEjlLk9Q7aXz44YeZO3fuTx7v1q1bc8Yj/IhcXYVp3zdA42cZj6UVUlhWRaCPO3HRwc0ZniAIQofhCmPg8uXLMRgM/OlPf8Jms/HQQw9x5swZYmJieOqpp/jwww955ZVX0Ov1DBs2jJKSEgDCw8Pr9k76+fkxevRotm3b1qCksTkqjHe0Csm6n91HxptvYtq9k4ipd6C5QYXG9vSZLH1kFP/3YTLnvyth2eqTvPzwSMJD2s/7c7b2dK20JEd1NVc+/JCKI0dAkugyZw4hM2Z0mEmTlrxO6p003miwBJg+fXqzBSNcz3RwP3KVBW33SHSRUQ0+XlEUtiVfASBpWFfU7XAzviAIQmto6hhYnyriAKGhoXXJHkBJSUldYvrNN9+waNEioHamsm/fvqxdu5aYmBj0ej2PP/6fFg9Lly4lMjISqO2B/OPk1s3Njerq6nrF/Z84mlZhvCNWSFY6dUMX1RtrehqZn6/Hd+b1hf3a42fy2F39+cu601zMLueZdw7w5PzBhAYabn+gcEvt8VppCfayUopXvktNXi6Szp2An/0Cdb8YiotNzg6tVbR0dXGRRbgo2WrFuHcXAD53JDXqDsn5zFKyC0x4
G7SMimm9voyCIAjC9X5cRRz4SRVxo7F2oJ85cyb79u0Datt5HD16tG6vYrdu3bh8+XLdOTMyMuqWoG7evJmTJ08CYDab2b9/f12i++6779Yd53A4SElJqdvrKLQcSZLwnTELAOOBPdhLS25zRNun06r5/ZyB9O3mR6XZxpufniC3qGN8YRecy/rdZQqWv0ZNXi6agCBCfv8kHv1inB1Wq7HWOKgwtWwLKPWSJUuWtOhPaCOqqmwoTWvTiMGgw2Kx3f6F9WA6uJeqs6fRhkfgM+2uRiWNq7ZfpLiimukjIugT4ZyNv835mQjtl7hOhPpq6rUiSRJ6vXP2tcTGxvLee+9x9OhRjhw5wuLFi/H19eXxxx9nwIABdOnShZ49e3L27Fk2b97Mpk2bmDt3LsOHDwdg0KBBrFmzhhMnTrBt2za0Wi2LFi3Czc2NzMxM3nrrLdLS0ti6dSuPPvpo3fJTq9XKhx9+yPnz51m/fj2jR4/mnnvuaVDsTR0jO+rvuNrbB3tRITV5ucgWM/oBg+qea6+fiUatYmjvYDKvGcktMpNysZD+PQLwMYj9ZI3VXq+V5mI6fICSjz9CsVrR9epD0EO/Q+PfMQreOGSZvSfz+Ov6s3y5L4MJsWG4aRo3J3i78VFSlKamSu1DU5feQPMtH1Bqashfuhi5soLAXz3cqDslGXkVLP34OB46NW/+diR693qvRG5WYkmFUB/iOhHqq6WX3wg3JpanNp69pJirr70IsoOQPzyDNrR2f2l7/0xq7A7e/uIcZ78rweCu4X/mDSaik9iX1xjt/VppLMXhoHzjWkwHa1dneI6dgO/0WUhqtZMjax0Xskr5bHc6uUW17aCG9AnmoRl90agblzSK5altkOnoIeTKCty6hOHed0CjzvHDXsbxg8OcljAKgiAIQkenCQjEc9QYUBQqtmx0djitxk2j5rHZAxgYGYC52s6y1SfJulbp7LCEdsJhMlH07l9rE0a1Bv95C/C7854OkTAWlll4a/0Z3lx9itwiM4E+7jxyV39eeDCh0QljfYik0cUodjvGb74GwHvSlEYtS80rMnEyvRiNWsWkuPDmDlEQBEEQhAbwTpyC5O5BddoFqtMuODucVuOmUfHo7AEMjgrEXG3nzc9O8V2+SByFprHl51Kw4nWsGemovLwJfnQRhvjhzg6rxVVZ7azdc5nnPjzCyfRidG5qZo/pwdJfD2Non+AWrxArkkYXYz52BEdZKZqQTnj8aO9DQ/zQl3F0TGexh0AQBEEQnExt8MQ7cTIA5Vs2oMiykyNqPRq1it/e1Z8hvYOostr50+cnycircHZYQhtlOXOSwr/+CUdpCdrwCEIWPYOuW3dnh9WiZEVh/+l8/vf9ZLYnZ2N3KIzo34lXfpPA9BHdcNO0zuyqSBpdiOJwYNz9FfD9XclGtMgorqjiSGoBKkliyrCuzR2iIAiCIAiN4DlqHGpfX2rycrGcSHF2OK1Ko1bx0Mx+xPUJpsrq4E+fnyI9t9zZYQltiCLLVOzYQsmqD1BsVvRD4gl6dBEaX19nh9aiLuWU89I/j/GP7RepNNuI7OLNsz8fwoPT++LnpWvVWMRmNxdiOXkMe0kRmsAg9IOGNOocO4/m4JAVEvqGEOTr0cwRCoIgCILQGCqtFp+kmZR+9i8qtm9GnjjG2SG1Ko1axW9m9kWlkjiSWsCfPz/NE3Ni6N3Vz9mhCS5OtlZT+uk/qTp7GiQJn+l34TUuscWXYzpTSUU1a/dc5uiFQgD8vHTcMy6SYX1DUDnpfYuk0UUoskzlrh0AeE2c3KiNvJUWG/tP5wMwNSGiWeMTBEEQBKFp9EPiMe7ZRc3VfIp27UIVN9rZIbUqtUrFr6f3RSVJHD5/jeVrT/PEPQPpEyESR+HG7CXFFK98l5qr+UjuHgQsWIhHdD9nh9VirDYH249cYceRbGx2GTeNiinxXZmaEIFO69wiPyJpdBFVZ05hLyxA7eePYeiwRp1j17FcbHaZmMgA
woJFSXlBEARBcCWSSoXP9FkUf/AO1778kuCo/mh8O1bCpFJJ/GpaNCoVHDx7jRVrT/P4PTH07dYx+uoJ9VednkbJvz5ENpvRBIcQuPBh3IJDnB1Wi1AUhSOpBazdk0GZ0QpAfHQw94yLJNDHNVYOiqTRBSiyTOXX2wHwnnBHo2YZq6x2vjmeC8C04WKWURAEQRBckXufvuiiemNNT+Pqy8/j3qcvhrgEPPoNQNK4OTu8VqFSSfxyajRqlcS+01f5y7oz/O7uAfTvHuDs0AQXoCgKpgN7Kf9yHcgy7n36EbBgISoP10iemlvm1Uo+3XWJjLzaysIRIV7MT4yiV7hr7dcUSaMLqE49R83VPNTePo0uGbz3VD4Wq52oMB+iwlzrIhMEQRAEoZYkSfjPW4Bl2wYqTp6kOvUc1annUHno0ccORR+XgDY8ol3v1wJQSRI/n9IHlSSx51Q+f113lt/dPYABPUTi2JEpdjtl6z/HfOQgAF7jJ+Ez7c5GFYd0dWVGK1/szeDguWsAeOvduHtsJCMHdEalcr3ff5E0OpmiKFR8vQ2o/cWQ3Bp+l7HGLrMzpbbNhphlFARBEATXpvHzJ/KJJ7iWmY/lRArmlCPU5OVgOrgP08F9aEI6YYhLwDAkHrVP+70RrJIkfja5N5JK4tsTeby1/gyPzhrAwJ6Bzg5NcAKHsZLiVR9gy8xA0rjhd+/9GIbEOzusZldjd7DzaA5bD1/BWuNArZK4Iy6c6SO64aFz3dTMdSPrIKrTLlCTk43K0xPD8FGNOsehc1epMNkIC/IUd+gEQRAEoY1Qe3rhNWYCXmMmYMvLxXwsGcvxFOwF16jYspGKrV/i3jsaQ1wC7v1iUGnbX+9llSTxs0m9UEsSu47n8vYXZ3lkVn8GRwU5OzShFdlysyle+R6O8jLUPr4ELnwIbXj7mghRFIXjaUWs+fYyxRXVAAyOCmTuhJ6E+OmdHN3ttVrSmJ+fz8svv0xgYCCFhYU8++yzhIeH3/C1ubm5zJw5k+eee47Zs2fXPZ6ens6SJUsIDg5m+fLldY9fuXKFN954g65du1JeXo7RaOSVV17B29u7xd9XUyiKUreX0WtsYqMGA1lW2H6kdpZx6vCu7X45iyAIgiC0R9rQMLSh9+A7fRbVF89jTjlC1fkzVF9MpfpiKpK7B/pBQzDEJ6CN6N6uxntJkpifGIVKJfFVSg5/23COh+/sz5DeInHsCCwnj1G6+mOUmhq03boT+IvfoPb2cXZYzSq7wMjq3elczK7tTxoaZGD+xKg2VQCq1ZLGJUuWMHfuXBITE9mzZw/PP/88q1at+snrFEVh2bJlhIaGXvd4WVkZW7ZsYdCgQeTn51/3nNFoZNq0aUydOhWAX//616xZs4YHH3ywxd5Pc7BmpGPLzEClN+A5snH9mo6lFVJYVkWQrztxfYKbOUJBEARBEFqTpFbj0S8Gj34xOMwmLCePY045TE1ONubkA5iTD6AJCsYQl4B+SDwav7bzpfNWJEni3gk9UakkdhzJ5u8bz/HQnf3Ed5t2TJFlKrZvxrh7JwCG+OH43TOvXRWEqrTY2LDvO/adzkdRwNPDjbtGd2fsoC6o29g+zVZJGsvKyjhw4ABvvfUWACNGjOCxxx6joKCAkJDrS+d+8sknJCUl8cknn1z3uJ+fH4sWLao7x4/179+f/v37A2C1Wrl27RpRUVEt9G6azw+zjJ5jxqNyd2/w8YqisO3wFQCmDItocxefIAiCIAg3pzZ44jVqLF6jxlJzLR9zSjLm4ynYiwqp2LaJiu2b0UX1xjB0GB4xg9v88lVJkpgzLhK1SmLr4Su89+V5ZFlhWN/22WahI5Orqyj55B9Up54DlQrfmbPxHD2+3cyg2x0yu4/nsulgFlVWOypJYuLQUO4c1R2De9tMilslaczPz0ev16PT6QDQarV4e3uTl5d3XdKYlZXFhQsXWLBgwU+Sxvr4+OOPWb9+PVOmTGHs2LENOjYgoHn6GgYF
edXrdaZLl7Cmp6Hy8KDbXdPRGAwN/lknLhaSXWjC10vHXeOj0Lo5t+nnzdT3MxE6NnGdCPUlrhWhI3Lr1AXfGbPxmXon1ZcuYk5JpurcaayXLmK9dBFp/efoBw7GED8cbffINvvlW5IkZo/pgUqS2Hwoi/c3n0dWFIb36+Ts0IRmUlNUSPHKd7EXXEPloSfggQdx79XH2WE1C0VROJ1RwuffXKag1AJA/x7+zJsQRZfAhn/XdyUuUwhHlmWWLVvGSy+91OhzLFiwgPvuu48//vGPvPPOOzz66KP1PrakxIQsK43+2VD7RaaoyFiv1xat/QIAz5FjKbPIYKnfcT/26Y4LACQOCaOi3NLg41tDQz4ToeMS14lQX029VlQqqdluEgqCM0hqNR7R/fCI7odssWA5dRxzSjK2K5mYjx7GfPQw6oBADEMTMMQNQ+Pf9grkSZLErDE9UKskNh7I5MPNqciywsgBnZ0dmtBE1WmpFP/rI5SqKjQhnQn61cNoAtvH3tX8YjOrd6dzLrMUgBB/PfMn9iQmsn1UA26VpLFLly5YLBasVis6nQ6bzUZlZeV1+xbT0tKwWq11BW4yMzPZsGEDGRkZPPnkk7c8v9lsxt3dHbVajVqtJikpibfffrtBSWNrsuVcofrieSStDs8xExp1jst5FaTllOOh0zB+cOjtDxAEQRAEoV1R6fV4jhiN54jR1BQWYE5JxnLsCI6SYip3bqFy5xZ0kVEY4hLwGDgYla7hW2Gcaeao7kgqiQ37vmPl1gvIssLogV2cHZbQCIqiYNr3DeWbvgBFwb1fDAH3P4DK3cPZoTWZqaqGTQcy+eZEHrKi4KHTcOfIbkwYEoZG3X62jrVK0ujn58fIkSPZv38/iYmJHDp0iNjYWEJCQti1axfDhg0jOjqaDz74oO6YzMxMZs2adV311JtZuXIlgwcPZtSo2pYV6enpdO3atcXeT1NVfr0DAM8Ro1F7Nu6O9w97GSfEhrp0TxdBEARBEFqeW3AIvtPuxCdpBtb0tNrlq2dOYc1Ix5qRjvTFGjwGDsYwdBi6yKg20yx9xohuqFUS6/Zk8I/tF5EVhbGDxM3ytkSpqaF03WdYUpIB8J6UhPfkaW3mGrwZhyyz52Q+G/d/h7najiTBuEFduGtMD7z1bXt/8Y20WrbxwgsvsHTpUvbt20dhYWHdMtQVK1awZMkShg4dCoDdbueVV14hKyuLTZs24XA4mDNnTt1rDxw4gNFo5MUXX+SZZ55Bq9UyZMgQPvroI/bu3YvVaqW0tJTnn3++td5ag9jyc6k6dxpJ44bXuImNOkdekYlTl4tx06hIHHrjtiWCIAiCIHQ8kkqFe+9o3HtHI1dVYTl9onb5amYGlpRkLCnJqP38MQwdhiEuoU0sDZyaEIFKkljz7WX+uSMNWUGssmojHJUVFK98D1t2FpKbG/7zH0A/KNbZYTXZ+axSVu9KJ6/YDECfrr7MT+xFeHD73f4gKYrStI187URr7Wks/tdHVJ06jueocfjNntuon/PB5lQOn7/G+NhQFtzRu1HnaC1ir5pQH+I6EepL7Gl0jqaOkeJ3/Kda+zOpKSrEcuwI5mNHcJSV1j2u7R5Z275jUKzLLxX8KiWH1bvTAbh/Ui8mDglzckSto63+/lizsyhZ+R6OygrUfv4ELnwIbWjbnuwoKLPw+e7LnLpcDECgjzv3TuhJbK8gpxefaunxUaxrbEU1BdeoOn0C1Gq8xic26hzFFVUcSS1AJUlMiXfdJbiCIAiCILgOt6BgfJJm4D15GtaM9O+Xr57ElpmBLTOD8g1r8BgwCENcArqo3i65dPCOuHDUKol/f32Jf399CYescEdc205C2ivzsSOUrvk32O3oevQk4IFfo/Zqu5Wvq6x2Nh/K4uuUHByygk6rZvrwCO6IC8dN45rdC5qbSBpbUeXunaAoGOKGN7oZ784jOciKQkK/EIJ8XfuOoCAIgiAIrkVSqXCP6o17VG/k2fdSdeYk
5pRkrBnpWE6kYDmRgtrXF/2Q2uWrbsGu1SNx4pAwVBJ8/NUlVu9OR5YVpgwTN9FdhSLLVGzZiHHPLgAMCaPwmz0XSdM2Uw5ZVjhw9ipf7M2g0lIDwMgBnbh7bCS+njonR9e62ua/YBtkLynGciIFVCq8J97RqHNUmm3sO5MPwNRhEc0ZniAIgiAIHYzK3R1D/HAM8cOxlxRjPnYU87FkHCXFGHfvxLh7J9qI7rXLVwcPQeWhd3bIAIyPDUOlkvjnjjTWfHsZWVGYmiC+FzmbXGWh5OOVVF9MBZUKv1lz8Rw5xtlhNdqlnHI+3XWJ7AITAD1DfZifGEX3zt5Ojsw5RNLYSip37wRZRh+XgCagcf1adh3PocYuMzAygLB2vNFWEARBEITWpQkIxGfyVLzvSMKWmYH56GEsp09gu5KJ7UomZRvX4tF/IIa4BNx7Rzt9+erYQaGoJIlV2y+ybk8GDllhxohuTo2pI6spuEbxynexFxWiMhgIeODXuPfs5eywGqW4ooq132aQcrEQAD8vHXPGRzIsOsTp+xadSSSNrcBeVoo5JRkkCe+Jkxt1jiqrnW+O5wEwbXi3ZoxOEARBEAShliRJ6Hr0RNejJ76z5lJ19hTmY0ewpqdRdeo4VaeOo/L2wTAkvnb5aqfOTot19MAuqFQSK7deYMO+75BlhTtHdXdaPB1VVeo5Sj5ZiVJdjVvnUAJ/9TAa/wBnh9VgVpuDrclX2Hk0mxq7jFajYsqwriQlRKBz6xj7Fm9FJI2twPjt1+BwoB88tNF7A/acysNitdMrzIeeYT7NHKEgCIIgCML1VDpdbWuOocOwl5XWVV+1FxVi/PZrjN9+jTY8An1cAvrBQ1EbDK0e48gBnVGpJD7cksqXBzKRZYW7Rnfv0DNCrUVRFIzffk3F1i9BUfCIGYz//AWodO7ODq1BZEXhyPkC1u65TLnJBsCwviHMGReJv3fbei8tSSSNLcxRWYEp+SAAXolTGnWOGruDr47mADBVzDIKgiAIgtDKNH7+eE9KwitxCras7zCnHMFy6hi2nCvYcq5Q/uV6PPoNqF2+2qcvkrr1ZmaG9+uEWiXx/qZUNh/KQlYUZo/pIRLHFiTbbJSt+XdtvQ7Ae/J0vCdNcfqy5YbKyK/gs13pfJdfCUBEJy/uS4wiKszXyZG5HpE0trDKb3eB3Y7HgIFoO3dp1DkOnrtGhdlGeLAnA3o0ruqqIAiCIAhCU0mShK57JLrukfjedQ/V589gPppM9aULVJ05SdWZk6g8vdAPiccQNwxtl9bppRgfHYJKknhv03m2Hr6CQ1aYMy5SJI4twF5eRvHK96jJzUbS6vC/7wH0MYOcHVaDlBmtrNuTweHz1wDwMWi5e2wkIwZ0QiWumRsSSWMLcpiMmA/vB8B7UlKjziHLCjuSswGYmhAh/vgJgiAIguASVFot+sFD0Q8eir28HMvx2uqr9oJrmPbuxrR3N26h4bXVV2OHovZs2T59Q/sEI0kS7355jh1HspFlhXsn9BTfnZqRNTOD4lUfIBsrUfsHELjwYbRdQp0dVr3ZahzsTMlh6+EsbDUyGrXEHXFdmTY8Ag+dSItuRXw6Lci49xsUmw336H5owxrXQ+hYWiGF5VUE+boztE9QM0coCIIgCILQdBpfX7wn3oHXhEnYsq9gOZaM5cQxavJyKM/LoXzTejz69kcfl4BHdP8W69s3pHcQj8zqz982nOOrlBxkWWF+YpRIHJuB6cghytatBocdXc9eBPz8QdSebaOav6IoHEsrYs03lymprAYgtlcQcyf0JFj0Pa8XkTS2ENliwXRgL9D4WUZFUdh2+AoAScMiULexdeKCIAiCIHQskiShi+iGLqIbvnfeTdX5s5hTkqm+mErVuTNUnTuDyuCJPnZobfXV0PBmT+gGRwXx6OwB/G3DWXYdz8WhKNw/qZdYdthIisNB+aYvMO3/FgDPUWPxvfOeVt232hRXrhn5bHc6l3LKAQgL8mR+YhTR
EX5OjqxtEUljCzHu/xbFWo0uqje6bj0adY5zmaVkF5rwMWgZOaBTM0coCIIgCILQciSNG/qBsegHxuKorMB8IgXL0WRqruVj2r8H0/49uHXugj4uAUNsHGrv5qsOP6hnII/NjuHtL87y7Yk8FFnhZ5N7i8SxgRxmEyX/+ghrehqo1fjdPQ/PhJHODqteKs02vtiXwf7TV1EATw83Zo/pwZjvW7UIDSOSxhYgV1dh2ld7N6axs4xA3SzjHXHhuGnaxt0cQRAEQRCE/6b29sF7XCJeYydSk5eD+WgylpMp1FzNp2LTF1Rs2Yh7n74Y4hLw6DcASePW5J8ZExnA4/cM4K31Z9lzKh+HrPBAUh+RONZTzbV8ij56F0dJMSpPLwJ/8Wt0PXo6O6zbsjtkdh3LZfOhTKqsDtQqiYlDwpg5sht696ZfVx2VSBpbgOngPuQqC9rukegioxp1jst5FaTllOOh0zBucNvZYCwIgiAIgnAzkiShDeuKNqwrvjNnU5V6DsuxZKpSz1H9/X8qDz362KHo4xLQhjetCGD/7gH8/p4Y/rruDPvPXEVWFH6ZFC1mmm6j6txpSv69CsVqxS0snMBfPoTGz7Ur+CuKwunLJaz+Jp3Csiqg9sbBvRN60jmg9XuItjciaWxmstWKcc9uAHzuSGr0H7ofZhknxIaKak6CIAiCILQ7kkaDPmYQ+phBOIxGLCdTMKckU5OXi+ngPkwH96EJ6YQhLgHDkHjUPo3rnde3mz9PzBnIinWnOXj2GrKs8KtpfUXieAOKolC5aweV2zcDoB88FL97f4ZKq3VyZLeWV2Ri9e50zmeVAdA5QM+8iVEM6BHg5MjaD5GNNDNz8gFkswlt127oekU36hy5RSZOXS7GTaNi0tDwZo5QEARBEATBtai9vPAaMwGvMROw5eViPpaM5XgK9oJrVGzZSMXWL3HvHY0hLgH3fjENTmL6RPjxh7mDWL7mNIfPFyAr8OD0aFFk8Edkq5XS1R9TdfoESBI+STPwmjjZpSvPmqpq2Lj/O/aczEdWFPQ6DXeO6s742FA0avFv25xaLWnMz8/n5ZdfJjAwkMLCQp599lnCw2+cEOXm5jJz5kyee+45Zs+eXfd4eno6S5YsITg4mOXLl9c9/s0337Bp0yY6d+5MTk4OsbGxLFy4sMXf03+TbTYqv90FgPekKY3+Jdv+fV/G0TGd8Ta49p0dQRAEoX7qMw4qisKbb75JSUkJJpOJiRMn1o2DhYWFLF68mC5dumAymQgICOCpp55CkiSKi4t59dVX8ff3p6qqiqioKB544IG68/7jH//g9OnTSJJEnz59eOihh1r1vQtCQ2hDw9CG3oPv9FlUXzyP+WgyValnqb6YSvXFVCR3D/SDhmCIT0Ab0b3e37d6hfvyh3sHsnzNaY6kFuCQFX4zo69ILgB7aQnF/3iPmrxcJJ07AT/7JR79Bjg7rJuyO2T2nMzjywOZmKvtSBKMjw3lrlHd8dKL784todWSxiVLljB37lwSExPZs2cPzz//PKtWrfrJ6xRFYdmyZYSGXr+Pr6ysjC1btjBo0CDy8/Ove+7rr79m0aJFREREYLPZmDhxIjExMQwdOrQl39JPlOzbh1xZgVuXMNz7Nu4Xrbi8iiOpBagkiSnxjevtKAiCILie+oyDO3bs4MqVK7zzzjtYrVaSkpKIj48nLCyM999/n06dOrF48WIApk6dyrBhwxg3bhxLly4lOjqa3/zmNwDcf//99O7dm4SEBM6cOcPmzZtZt24dkiQxZ84chgwZ0upjpCA0lKRW49EvBo9+MTjMJiwnjmE+lkxNTjbm5AOYkw+gCQrGEJeAfkh8vfbcRYX58sd7B/HnNac4drEQRVZ46M5+HTpxtH53meJV7yObTGgCgwj81cO4hXR2dlg3dS6zhNW7L5NfbAYgOsKP+ROjCAtuGz0j26pW+Q0pKyvjwIEDjB49GpXHifwAACAASURBVIARI0Zw7NgxCgoKfvLa
Tz75hKSkJHx9r1+37ufnx6JFi3B3d//JMUuXLiUiIgIArVZLSEgIhYWFLfBObk6x27m2ZQvQtFnGnUdzkBWFYX2DCRTNRgVBENqF+o6DX375JWPGjAFAp9MRHx/P1q1bAQgODqa0tBSA6upqTCZT3ViTnp5Ojx7/ae8UGRnJtm3bANi0aROjRo1CpVIhSRJjx45l06ZNLfuGBaGZqQ2eeI0eR6dFz9DpqefwGp+Iyssbe1EhFds2cfXl5yl896+Yjx9Fttluea7IUB/+Z95g9DoNxy8V8feN57A75FZ6J67FdGg/hX9bgWwy4d47mpAnnnbZhPFaqYW/rD3Nnz8/TX6xmWBfD343ewD/M2+QSBhbQaskjfn5+ej1enQ6HVCb2Hl7e5OXl3fd67Kysrhw4QKTJ09u0PlVP1qPXlBQQHl5OePGjfv/7d17QFR1/v/x5wz3GbmDEKiIKN4JSbxkqSlZWWmat2rb+nWvXdts81abWmrfTMv9bu1uu9V+q62sr5qZ2tfSytQ1VBQ1yxvkjYuACHKfAeb8/qBoXQURgeHyevzFzDlz5n0OnzPvec/5nM/nsuO+FMVJ2ynPzcU1JBSvvrH12kZBsZ3N+6quot40KKIhwxMRESeqax5MT08nMPCXgRsCAwNJS0sD4IEHHsDNzY1HH32Ue+65h9tvv51hw4YB0L9/f7Zt2waA3W4nOTmZzMxMoOqWj4CAgAtuU6QlcgsNw+/W8YTNWUjQg7/BKzYOXFywHT7ImfffJmPuLM589B62H1MwDOOC24i8woen7ojF6ulK8pHT/GXVfsor2k7haFRWcmbFMvJWLAOHg3bDRhL0wGOYLRZnh3aekrIKPvrqCM++uZ29qbl4urswcXgU8x8YSL/o4GZ9z2Vr0mwGwnE4HCxZsoT58+fXext2u53nnnuOV155BcslNvrAwPr/QmFUVpK1aQMAHcbdRkBI/SanXZ90gPIKBwN6hdKvV/P8lac+goO9nR2CtABqJ1JXbbWtLF26FKvVyssvv4zdbufhhx9m3759xMTEMGPGDN58801eeOEFLBYLAwcOJDc3t8He+3Jy5M/a6v+tNjomDSB0EAwdREVxMXmJiZzZupXilBSKt2+jePs2PNq3J+Caawi45ho8goPPeWlwsDcL/a08+7dt7Ek5zRvrDjD7nnjc3Zrf3NgN2VbKCwo4+uqfKT54EJOrK53uu4/An3pBNCeVDoMN24/z3voDnC2yYzLB9QM6cfdNPfH3Ob/noTTuZ0qTFI1hYWGUlJRgs9nw8PDAbrdTUFBwzn2Lhw4dwmazVQ9wc/ToUVatWkVqairTp0+/6HvY7XaefvppHnroIWJiYi45xtzcIhyOC/8addH3zszAnp2NR0gIFVG9yckpvORtlNoqWLPlRwAS4sLrtY3mKDjYu9XsizQetROpq8ttK2azqUEKoEtVlzwIEB4efk6xl5ubS+fOnYGqQd+mTZsGVF2p7NWrF8uXLycmJgaLxcLjjz9e/bqFCxcSFRUFQIcOHaq7tf68zf9834u5nBwJOscvRMekEcQMICBmAN5ZpyhO2k5J0nZs2dlkfvwxmR9/jEdUN6wDBuMVE4vZo6ro8PFw4feTY1ny4R6SDmQx92/b+O34vs2qcGzItmJPT+P0P16nMu8MZh9fgv7fQzgiIptdWzx0Io8PNh7hZHYRAN06+HJHQjc6h/pQYSsnJ6fcyRE2P42dH5uke6q/vz9Dhgxhy5YtAGzbto24uDhCQkLYuHEjhYWF9OzZkzfeeIPnn3+e559/nsjISMaNG1engrGsrIxZs2Zx7733EhsbS2ZmJm+99VZj71Y1t9Ar8J90F12mTcPkUr8PmU170im1VRDd0Y+uHep3pVJERJqnuuRBgDFjxrB582YAbDYbO3bs4Oabbwagc+fOpKSkVG8zNTWV0NBQANasWUNycjIAxcXFbNmyhUmTJlVvc+vWrTgcDgzD4JtvvmHMmDFNs+MiTuAWEorf
zWO54tkFBD88FUtcPCZXN2ypRziz7F0y5s4md9m7lB05hOFw0CnEmxl39sPb4sb+o2f408p92Mornb0bDa5kbzLZry6hMu8M7h0jCJ02E4+ISGeHdY7T+aX8ZdV3LPogmZPZRQT6ePDI2N7MuiuOzqE+zg6vTTMZNXX2bmBpaWksXLiQ4OBgsrOzmT17NhEREdxyyy3MmzevehS3iooKXnjhBTZs2EBUVBQ333wzEydOBOCPf/wjW7dupbCwkCFDhjBr1izc3d35wx/+wNq1a7FarQBUVlZy1113MXXq1DrHd7m/okL9K/zyikpm/PVbzhbbeWLilcREtZ6JSPVLqtSF2onUVUu90gh1y4OGYbBo0SLy8vIoLCxkxIgRTJgwAai633H+/PmEhYVRXFyMw+Hgueeew2KxsH79ev7xj38QExNDbm4ukyZNYvDgwdXv/dZbb7Fv3z5MJhPdu3fn0UcfvaTYdaWx4emYNC1HaSkle3dTvDMR+9HU6udd/AOwxg/C2n8gWVhYvCyZgmI7PTr58bsJV+Lh7vwrjpfbVgyHg4IvPqPgi6rBsSz9BxAw8S5Mbm4NFeJlK7NXsO7b43y+4yQVlQ7c3cyMHhTBjQM6Naurvs1ZY+fHJisamztnFo2b9qTz7vpDdGzfjnn/L75V3dCrpCh1oXYiddWSi8aWTEVjw9MxcZ7ynGxKkrZTnLSdyrxfum57dOlKYc94XvsBzhaXE93RjycmxuDp7twhQC6nrTjKyjiz7B1Kv9sLJhO+t47De9jIZvNd02EYfLv/FCu+SeVsUdWot4N6hzBhWBQBum/xkjR2fmw2A+G0VZUOB+sTTwAwelBEszmJRURERFojt+D2+N50Kz433Iwt9QjFOxMp3ZeM7ccU3H9M4V4PP94OGM7hk/m88tEepk2Kxcuj5X1lrsg9zem3Xqf8VAYmTy8Cf30/Xj16OTusainpZ1m28QhHMwuAqhFt70zoRlS4btNqjlreGdDK7DqUQ3Z+Ke39vOjfI/jiLxARERGRy2Yym/Hs1h3Pbt1xjJ9M6b5kincmEpB6hHuzN/C23zBS0mHx377mifG98ekQ5uyQ66zsyCFy33kTR0kxru1DCLrvEdzahzg7LADOFJSx4ptUEr+vmqfWt507E4ZFMbhPKGZdPGm2VDQ6kWEYrPv2OAA3DuyEi7lJxiUSERERkX9j9vTEOmAw1gGDqcg9jU/Sdu7fuYd/VMZwrMTK4v/Zxv3eJwgaEI+l31WYvZrffIZQ9d2yaOs35K9eAQ4Hnj17E/ir+zB7eTk7NOzllazfcYLPEo9jL3fg6mLmhgEduXlwhNO7AMvF6T/kRPuPnuFkdhG+VneG9A11djgiIiIibZ5rYBC+N9yMz/U38fv9h1j6xUnSCeDNQvj1yhV4fbIcrz5XYo0fhGf3npiayY/+RkUFeSs/onj7vwDwHjEK39FjnB6fYRjsPJjN8q9TyC2wAdC/ezATr+tKsJ/zi1mpGxWNTvTzVcZR8R1xc9XIUCIiIiLNhclsJjymJ7MjInnpg91knA3g3ZBR3J29EfbsonTPLsw+vlivGoA1fhBuoVc4LdbKwgJOv/0G9qOpmFzd8J/8K6xXxTstnp8dP1XIBxsPcyTtLAAd27fjzoRudO/k7+TI5FKpaHSSlLSzHD6Zj5eHK8P7XdokyyIiIiLSNAJ9PZl5VxyLlyWTkQfvd5/IIx3OYtqznYqcbAq/3kDh1xtw7xiBJX4Qln79cflpGrimYD95gtP/8zqV+fm4+PoRdN/DuHeMaLL3v5CzRTZWbv6Rf+3LxAC8LW6MH9qFa2PCMJt132JLpKLRST5LrLrKOCIuvEWOyCUiIiLSVgT4eDLjzqrCMS23hNfNAfx+6mw8c9Io3rmdkj1J2E8ex37yOPmrV+LVu29V99UevTC5NF5vspLkJM4s+ydGRTnunSMJuvchXHycN/poeYWDjUknWbPtGGX2SlzMJhL6d+DWqyOxeOr7bkum
/54TpOUUsSflNG6uZq7v39HZ4YiIiIjIRfh7ezDjzn5VhWNOMYs/3MP0O/oRMCkKv9smULp/LyU7t1N2+ACl+5Ip3ZeMuZ03lqsGYI0fiHtYhwaLxXA4OPvZpxR+9QUA1gGD8Z8wBZOrW4O9xyXFYxgkHznN/36VQnZ+KQCxXYOYPKIrIQHNc9AguTQqGp3g/36al/HamCvwsbo7ORoRERERqQu/dh7MuDOOJcuSST9dzEsf7Gb6Hf3wa+eBNS4ea1w8Ffn5lOzaQfHOb6nIzqLomy8p+uZL3MI7Yo0fhCWuPy7tvOsdg6O0lNz3/4eyH/aD2Yzf2Ntpd81wp831nZZTxLKNRzhwPA+AsCArU0Z2pU9koFPikcahorGJnc4vZfsPWZhNJm4c0MnZ4YiIiIjIJfC1ujP9zn4s+emK46IPkplxRz/8vT0AcPXzw2fkKLxHXI/9xHGKd35LSXIS5eknyU8/Sf6nK/Hq1RdL/EC8evbB5Fr3r+PlOdmcfuuvVGRnYfayEHjPA3hG92isXa1VYYmdT7YeZVNyOoYBVk9Xxl4TyfB+4bi6NI8RZaXhqGhsYp/vOInDMBjcO4QgDTMsIiIi0uL4WNyZfkc/Xv5wDyeyi1j0/m5m3NmPAB/P6nVMJhMeEZ3xiOiM/9gJlH7/HcVJiZQd/IHS/Xsp3b8Xs7Udlrj+WOMH4xbeodarhWWHfuD0u29hlJbiGnoFwfc9gmtQcFPs7jkqKh18vTud1VuPUmKrwGwyMSIunLHXRtLOyzndY6XxqWhsQgXFdjbvywDgpkHOHdVKREREROrP2+LOUz8VjsezCnnxp8IxyPf8iwImNzcssXFYYuOoLDhL8e6dlOxIpPxUBkVbNlG0ZRNuV4RhjR+M5ap4XLx9ql9rGAaFm74kf83HYBh49Ykh4M57MXt6nvc+je27H3P58MsjZOaWANC7sz9TRnYjPLhdk8ciTUtFYxPauOsk5RUOYrsG0UEnl4iIiEiL1s7LjafuiOWVj/ZwNLOQRe8nM+POfrVOWu/i44vP8AS8h42kPP0kxTsSKUneSXlmBvmfriR/7So8e/SqGn01uifH//4B+Vu3AuBz/U343HAzJnPTdv/MzC3mo69S2JeaC0B7fy+mjOjGlV0DnXYvpTQtFY1NpNRWwZe70gEYPVhXGUVERERaA6unG7+f3I+l/7uH1IyC6sFx2vvXPmqoyWTCvUMn3Dt0wm/MeEp/2E/xzkTKDuyn7If91QPd4HBgcncnYMqvscTGNdFeVSkpK+fTfx3jy11pVDoMvDxcuPXqSBL6d9B9i22MisYmsmlPOqW2CqI7+tE13Hnz54iIiIhIw7J4uvLk5FiW/u9eUtLPVg2Oc2c/Qi5SOP7M5OqKJSYWS0wslYWFlOzeSXFSIuXpabgHBuJ/70O4hzfdNG0Oh8HmvRl8vPlHikrLMQFDrwxj/NAuGvm/jVLR2ATKKyr5YsdJAG7WVUYRERGRVsfLw5Vpk67kv5fv5XDa2Z8Gx4kj9BLnKXTx9sZ72Ai8h42gPCebkC7hnCksb6Soz3fgeB7LNh4hLacIgOiOftwxshsRofWfJkRaPhWNTeBf+09xtthOp/bt6BMZ4OxwRERERKQRVBWOsfz3ir0cPJFfParqFYHWem3PLbg9Lp6e0ARFY3Z+Kcu/SmHX4RwAAn08mTyiK1d1D9Z9i0KTdUbOyMjgscceY86cOTzyyCOcPHmyxnXT0tKIi4vj448/Puf5I0eOcNdddzFt2rQatz9p0qQGj/1yVDocrE88AVTdy6iTTkRERKT18nB34XcTr6RnhD9ni+0s+iCZ9J+u2jVHpbYKVmxK5Q9vJLLrcA4ebi6MG9qFhQ8OpH+P9vruKkATFo3z5s1j/PjxPP/880yZMoVnn332gusZhsGSJUsIDw8/5/m8vDzWrl1LbGzsBV/3zjvvEB8f3+BxX65dh3LIzi+lvZ8XV3Vv+rl0RERERKRpebi58PiEGHp39qeg
2M5Ly5JJy25ehaPDMNi6L5On/57IZ4nHqag0GNw7lBceGsStV3fG3c3F2SFKM9IkRWNeXh5bt27l2muvBeDqq68mKSmJrKys89Z97733uOmmm/Dz8zvneX9/f6ZNm4ZnDXPSzJ49G1/f5jXAjGEYrPv2OAA3DuqESxMPjywiIiIizuHh5sLU22Po0yWAwpJyXlqWzImsQmeHBUBK2lkWvJPEPz47wNliO1FhPjzz66t48NZe+Ht7ODs8aYaa5J7GjIwMLBYLHh5VjdDd3R0fHx/S09MJCQmpXu/YsWMcOHCAu+++m/fee68pQqsWGNgw8yYGB/9yk/Cug1mczC7C39uDscO7tdlfbP79mIjURO1E6kptRURaCnc3F6aO78ufV+1nX2oui5cl89SUfk4bVOZMQRnLN6Wy/YeqCzd+7dyZeF1XBvYKwaxuqFKLZjMQjsPhYMmSJcyfP98p75+bW4TDYVzWNoKDvcnJ+eUXpA/WHwQgoX8HzuaXXNa2W6r/PCYiF6J2InV1uW3FbDY12I+EIiJ14ebqwm/G9eWvn+xnT8ppFi9L5vdTYom8wqfJYrCVV7J++wn+L/E49goHbq5mbhzQidGDIvBwb5sXNeTSNEnRGBYWRklJCTabDQ8PD+x2OwUFBefct3jo0CFsNhtLly4F4OjRo6xatYrU1FSmT5/eFGE2qJS0sxw+mY/Fw5XhseEXf4GIiIiItEpurmYeG9eH11d/z+7DOSz5cA9PTr6SqLDGvbXKMAy2H8hixaZUzhTYAIjv0Z6J10UR5OvVqO8trUuTFI3+/v4MGTKELVu2kJCQwLZt24iLiyMkJISNGzcycOBAevbsyRtvvFH9mqNHjzJu3DjGjx/fFCE2uM8Sq+5lHHFVOF4ezeaCroiIiIg4gauLmUfG9ubvn35P0qEcXv5wD09OiqVrh8YpHI9mFrBs4xFS0s8C0CmkHXcmRBPd0e8irxQ5n8u8efPmNcUbxcXF8be//Y0dO3awfft25syZg5+fH48//jh9+/YlLCwMgIqKChYsWMDu3bvJycmhvLyc3r17A/DHP/6RLVu2kJ6eztGjRxk8eDAuLlWX1N9++20+//xzjh07Rnp6Oj179sRqrfucOKWldozL652K1epBSYmdtJwiln15BDdXM4+M6dOmL/v/fExEaqN2InV1uW3FZDJhsbg3YERtw+XmSJ3j59MxaZvMZhNx3YPJOlPC8awidhzMJrqDH4G+Fx7oES69reQX2Xh/w2He++IwZwpt+FjcuCMhmrtHdSfIT1cXW6vGzo8mw7jcUql1aMh7Gt9Y8z3ffp/FyLgO3DUquoEibJl0r5rUhdqJ1JXuaXSOy82ROsfPp2PStlU6HLy17gCJ32fh4ebCExNj6N7J/4Lr1rWtlFdU8sXOk6z99jg2eyUuZhOj4jtyy9Wd1eutDWjs/KgW1MBO55ey/YdszCYTNwzo6OxwRERERKSZcTGbeeDmXphNJrbtP8XS5Xv53YQr6Rlx4cKxNoZhsPvwaT766ginz5YB0K9bEJNGdCXE39LQoUsbpaKxga3fcQKHUTU5qroAiIiIiMiFmM0m7hvdE7PZxNZ9mfz38r1MnRBD784Bdd7Gyewilm08zMET+QCEB1mZktDtkrYhUhcqGhtQfqGNLfsyARg9qJOToxERERGR5sxsNnHvTT0wm0xs3pvBn1bsY+r4vvTpEljr6wpK7Hyy+Ue+2ZuBYYDV05VxQ7swLDYMF7O5iaKXtkRFYwP6dEsq5RUOYrsGER6se2ZEREREpHZmk4lf39gds9nEpuR0/rTyO347vg8xUUHnrVtR6eCrXWms/tcxSm0VmE0mRl4VzphrImnn5eaE6KWtUNHYQEptFXz2r6MAjB4c4eRoRERERKSlMJtM3D0qGheTiS93p/Hax9/x2G19ie32S+G4L/U0H36ZwqkzJQD0iQxgyshuhAXVfbYAkfpS0dhANiWnU1xWQfeOfnQNb9yJWkVERESk
dTGZTNx5fTdMZtiYlMafV33Ho7f1oZcD/rJiD/t/PANASICFKSO6EhMViMlkcnLU0laoaGwAPw9xDLrKKCIiIiL1YzKZuGNkN1zMJj7fcZK/frIfgEqHgZeHK2OHdGbEVR1wddF9i9K0VDQ2gJz8Ms4W24nq4EufSI1WJSIiIiL1YzKZmHRdV8xmE/+XeAKzCYbHhnHb0C741DL5ukhjUtHYAMKCrDw56UpieoTgsFc4OxwRERERacFMJhMThkXRvaM/XTsHYHFRN1RxLl3bbiB9ugQS6Kt5GUVERETk8plMJmKiAokI9XF2KCIqGkVERERERKRm6p4qIiLSBDIyMliwYAFBQUFkZ2fzzDPP0LFjx3PWMQyDxYsXk5ubS1FRESNHjmT8+PEAZGdnM2fOHMLCwigqKiIwMJAZM2ZgMplqXfbxxx/z4osv4uZWNYdbaGgoK1eubPL9FxGRlktFo4iISBOYN28ekyZNIiEhgU2bNvHss8/y9ttvn7PO+vXrOX78OH/+85+x2WzcdNNNDBgwgA4dOvD3v/+d0NBQ5syZA8Do0aMZOHAgw4cPr3UZwKuvvsrAgQObcndFRKQVUfdUERGRRpaXl8fWrVu59tprAbj66qtJSkoiKyvrnPVWr17N0KFDAfDw8GDAgAGsW7cOgPbt23PmTNU8bWVlZRQVFVXP0VbbMoAVK1awaNEinnvuOQ4dOtS4OysiIq2OrjSKiIg0soyMDCwWCx4eHgC4u7vj4+NDeno6ISEh1eulp6cTGBhY/TgwMJC0tDQAHnjgAaZPn86jjz7KmTNnuP322xk2bNhFl0VHRxMVFcWVV17JiRMnmDJlCqtWrTrnfS8mMLDdZR+D4GDvy95Ga6NjInWltiJ10ZjtREWjiIhIC7B06VKsVisvv/wydrudhx9+mH379hETE1Prsj59+lRvo1OnTvTo0YNNmzYxefLkOr93bm4RDodR79iDg73JySms9+tbIx0TqSu1FamLy20nZrOp1h8IVTT+xGxumPlvGmo7rYmOidSF2onU1eW0FWe1s7CwMEpKSrDZbHh4eGC32ykoKCA8PPyc9cLDw8nNza1+nJubS+fOnQH46quvmDZtGlB1pbJXr14sX76cmJiYWpcdPXqUyMjI6m26ublRVlZ2SfE3xHHTOX4+HROpK7UVqYvGzI8qGn/i729tkO00RBee1kbHROpC7UTqqiW2FX9/f4YMGcKWLVtISEhg27ZtxMXFERISwsaNGxk4cCDe3t6MGTOGdevWMXnyZGw2Gzt27OA3v/kNAJ07dyYlJYWEhAQAUlNT6du370WXLViwgFdeeQVfX19KSkrYv38/TzzxxCXGf/k5siX+3xqbjonUldqK1EVjthOTYRj1728iIiIidZKWlsbChQsJDg4mOzub2bNnExERwS233MK8efPo378/hmGwaNEi8vLyKCwsZMSIEUyYMAGout9x/vz5hIWFUVxcjMPh4LnnnsNisdS67N1332Xbtm1ERERw8uRJRo0axW233ebkoyEiIi2JikYRERERERGpkabcEBERERERkRqpaBQREREREZEaqWgUERERERGRGqloFBERERERkRqpaBQREREREZEaqWgUERERERGRGqloFBERERERkRq5OjuAluqee+4hJSWl+vF9993H/fffT0ZGBgsWLCAoKIjs7GyeeeYZOnbs6MRIG09FRQXvvPMOr776KitXriQqKgqAgoIC5s6di7e3N6dOnWLq1Kn07dv3osuk9cnKymLJkiX4+/tjs9nIz89n7ty5BAQE1HqutKXzSH6xYMECSktLsVqtHDx4kEcffZTBgwfrM6WFUX5UfpS6UY6UumoW+dGQepk5c+YFn3/wwQeNDRs2GIZhGF9//bVxzz33NGFUTevDDz80du3aZURHRxspKSnVz8+bN894++23DcMwjEOHDhmjRo0yHA7HRZdJ65OYmGgsXbq0+vGLL75oPP3004Zh1H6utKXzSH7x
0ksvVf+9bt06Y/To0YZh6DOlpVF+VH6UulGOlLpqDvlR3VPrqaSkhEWLFvHiiy/ypz/9idLSUvLy8ti6dSvXXnstAFdffTVJSUlkZWU5OdrGMXnyZOLi4s57/tNPP2Xo0KEAREdHU15ezp49ey66TFqfAQMG8Lvf/a76cYcOHcjKyqr1XGlr55H8Yvr06dV/Hzt2jOjoaECfKS2N8qPyo9SNcqTUVXPIjyoa62nEiBH89re/ZdasWbi7uzNz5kwyMjKwWCx4eHgA4O7ujo+PD+np6U6Otunk5+dTVFREYGBg9XOBgYGkpaXVukxaJ5PJhMlkqn68efNmpkyZUuu5ovOobdu/fz+PPfYY27Zt49lnn9VnSguk/Hhhasvyn5Qj5VI4Oz+qaKyn2267DavVCsC4cePYsGEDNpvNyVGJNF/Lly+nW7duJCQkODsUacb69OnDX/7yF+6//35+9atfUVFR4eyQ5BIpP4pcOuVIuRhn50cVjfVgt9vJyMiofuzm5obD4SAyMpKSkpLq5Gi32ykoKCA8PNxZoTY5Pz8/rFYrubm51c/l5uYSHh5e6zJp3VatWkVaWhpPPfUUAGFhYTWeK7Utk9arsrKS4uLi6sfXXXcdmZmZnDp1Sp8pLYjyY82UH6UmypFSm+aSH1U01kN2djaLFi2qfpyYmEjv3r3x9/dnyJAhbNmyBYBt27YRFxdHSEiIs0J1ijFjxrB582YAjhw5gouLC7GxsRddJq3TRx99RHp6OtOmTQOqRgCr7VzRedQ2ZWZmMmfOnOrHaWlpVFRUEBYWps+UFkT5sXZqy/KflCPlYppLfjQZhmFcxn60SUVFRTzzzDNYLBasVisZGRlMnz6dyMhI0tLSWLhwIcHBwWRnZzN79mwiIiKcHXKjSE5OD0Ns5gAABIpJREFUZs2aNbz//vvceuut3HjjjSQkJFQPGe3r60tmZiZTp04lJiYGoNZl0vokJSVx9913ExAQUP1cu3bt+Pzzz2s9V9rSeSRV/v1z1cfHh5SUFKZMmcL111+vz5QWRPmxivKj1IVypNRFc8mPKhpFRERERESkRuqeKiIiIiIiIjVS0SgiIiIiIiI1UtEoIiIiIiIiNVLRKCIiIiIiIjVS0SgiIiIiIiI1UtEo0gZkZWUxadIkunfvXv3cmjVreOaZZ5wYlYiIiPMpR4pcnKbcEGkj0tLSGDlyJIcOHQKgsrKSsrIyrFarkyMTERFxLuVIkdrpSqNIG+Xi4qJkKCIicgHKkSLncnV2ACJSN6+99hrLli1j+PDh5OXlkZWVRWBgIC+++CIBAQEXfM3rr7/OmjVrCA0NZdiwYdXPHzp0iBkzZlBYWMhXX33Fl19+yeLFiwkKCiImJobExET8/f2ZP38+S5cu5bvvvuOGG25g2rRpAOzevZvFixfj5uaGYRjcd999XHfddU1yHERERP6TcqRIIzNEpMWYOXOmkZCQYBQWFhqGYRh/+MMfjCeffPKC627atMkYMmSIkZeXZxiGYbz00ktGdHR09fLExETjuuuuq368cuVKIzY21khPTzccDocxduxY48EHHzRsNptx+vRpo3fv3kZWVpZhGIZx++23G3v27DEMwzAOHDhgzJw5s1H2V0REpK6UI0Uaj7qnirQww4YNo127dgCMHTuWzz//nMrKyvPWW79+PUOHDsXPzw+A0aNHX3TbkZGRhIWFYTKZ6Nq1K126dMHd3Z3AwEACAgJIS0sDwNfXl9WrV3P69Gl69OjB3LlzG3APRURE6kc5UqRxqGgUaWF8fX2r//bz86O8vJy8vLzz1svOzsbf3/+Cr6vJv9+/4erqet7j8vJyAF5++WU8PT0ZN24c999/P8eOHavProiIiDQo5UiRxqGiUaSFOXv2bPXfeXl5uLm5nZP4fta+fXvOnDlT/Tg/P7/BYrDb7cyYMYOvv/6a+Ph4HnvssQbbtoiISH0pR4o0DhWNIi3M1q1bKSoqAuCTTz7hhhtuwMXF5bz1
brzxRjZv3lz9C+vatWsbLIbHH3+c0tJSXF1diYuLu2DXHxERkaamHCnSOFzmzZs3z9lBiEjdbNy4kcjISNauXcubb76Jw+Hg+eefx8vL67x1O3fuTHl5Of/1X//Fxo0b6datG99++y07duygV69ezJ07l/T0dA4ePIifnx+vvPIKJ06coKysjGPHjrF8+XKOHDlCWFgY//znP0lKSmL//v3Ex8djMplYunQpq1evZvPmzcyZM4eOHTs64YiIiIhUUY4UaTwmwzAMZwchInUza9YswsPDmTp1qrNDERERaVaUI0Uaj7qnioiIiIiISI1cnR2AiNTNa6+9xpYtW/Dw8CA0NJSJEyc6OyQREZFmQTlSpHGpe6qIiIiIiIjUSN1TRUREREREpEYqGkVERERERKRGKhpFRERERESkRioaRUREREREpEYqGkVERERERKRGKhpFRERERESkRv8fmAyfnBYkVXwAAAAASUVORK5CYII=
" />
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Figure 3</strong> NDCG100 (aka <code>n100</code>) plotted against the first dimension of the decoder for the following architectures: $\small{ i) \space I \rightarrow 150 \rightarrow 50 \rightarrow 150 \rightarrow I}$, $\small{ ii) \space I \rightarrow 300 \rightarrow 100 \rightarrow 300 \rightarrow I}$, $\small{ iii) \space I \rightarrow 600 \rightarrow 200 \rightarrow 600 \rightarrow I}$ and $\small{ iv) \space I \rightarrow 900 \rightarrow 300 \rightarrow 900 \rightarrow I}$</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>On the other hand, Liang et al. mention in their paper that deeper architectures did not lead to any improvement. This is consistent with the results of my experiments. In fact, Figure 3 shows the NDCG100 (referred to in the figure as <code>n100</code>) vs the first dimension of the decoder for four different architectures. As the figure shows, even restricting the comparison to architectures with the same number of layers, adding neurons per layer was not particularly helpful beyond a certain number (50 and 200 for the MovieLens and Amazon datasets, respectively).</p>
</div>
</div>
</div>
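To make the four symmetric architectures in Figure 3 concrete, here is a minimal sketch that counts the parameters of each $I \rightarrow h \rightarrow z \rightarrow h \rightarrow I$ MLP. The input dimension <code>I</code> (the number of items) is set to a hypothetical 5000 purely for illustration; it does not correspond to either dataset used in the experiments.

```python
def n_params(dims):
    # weights + biases of a fully-connected net with layer sizes `dims`
    return sum(d_in * d_out + d_out for d_in, d_out in zip(dims, dims[1:]))

I = 5000  # input dim = number of items; hypothetical value for illustration
for h, z in [(150, 50), (300, 100), (600, 200), (900, 300)]:
    print(f"I -> {h} -> {z} -> {h} -> I: {n_params([I, h, z, h, I]):,} params")
```

Running this shows how quickly the parameter count grows with layer width, which makes the flat NDCG100 curves beyond $h=600, z=200$ (Amazon) and $h=150, z=50$ (MovieLens) all the more notable.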
<div class="cell border-box-sizing code_cell rendered">
<details class="description">
<summary class="btn btn-sm" data-open="Hide Code" data-close="Show Code"></summary>
<p><div class="input">
<div class="inner_cell">
<div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="c1">#collapse-hide</span>
<span class="n">plot_metric_vs_loss</span><span class="p">()</span>
</pre></div>
</div>
</div>
</div>
</p>
</details>
<div class="output_wrapper">
<div class="output">
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA40AAAEpCAYAAAA3RsfoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzdd3hUVf7H8ffMZDLpPYSQQKihV6UoSkcpCmIB3AUsWNfuyrKiCK6AqIgKlhW7KPJThAWxuxqqiLAqKqAUaaEkpLdJJjP398fAQICQCSaZBD6v58nz5M49997vnClnvvece67JMAwDERERERERkVMw+zoAERERERERqb2UNIqIiIiIiEi5lDSKiIiIiIhIuZQ0ioiIiIiISLmUNIqIiIiIiEi5lDSKiIiIiIhIuZQ0ioiIVNK2bdvo1asXo0ePxm63e73dtGnT6Nq1Ky1btqRfv37VGKGIiEjVUdIoIiJSSStXruTQoUP88MMPbN++3evtHn74Yfr371+NkYmIiFQ9P18HICIiUtcMGTKElStXEh8fT+vWrX0djoiISLVS0igiInXKvHnzeOONN8jMzATgn//8J19//TWbNm2iU6dOPPXUU6xevZoFCxbwxx9/0KdPHx577DGCgoIA+PDDD3n33XdJS0vD5XLRqFEjxo8fz8CBA5k/fz7PPfcceXl5+Pn5MXToUJ588klWrlzJQw89RElJCffddx/z5s0jNTUVgBEjRtC9e3cAdu3axezZs/nf//5HQEAAISEh/PWvf+Xqq6/GZDKd9nktX76c119/nczMTMxmM82bN+eBBx4gOTmZ/Px8Ro8ezbZt2wAYNWoUxcXFrF69GpPJxLhx47jllls8+1qwYAHvvfceOTk5REVF4XQ6ad26NU8++WSVvx4iInIOMEREROqYOXPmGMnJyUZycrKxePFiwzAM48orrzSSk5ONQYMGGevXrzdKSkqMnj17GsnJyca8efMMwzCMF154wUhOTjb69u1r5OfnGxkZGUa3bt2M5ORkY9GiRYZhGMYbb7xhJCcnGy1btjT27t3rOeakSZOM5cuXG4ZhGOvWrfMcf926dYZhGMahQ4eMHj16GMnJycY333xjlJSUGAMHDjSSk5ONt956y7OfiRMnemI46v333zeSk5ON3r17G3a73bP/zp07GwcPHvSUO3rMSy65xCgsLDR+/fVXz2ObNm0yDMMwVq1aZSQnJxsDBgwwiouLDcMwjM2bNxtt27at8tdBRETODbqmUURE6rSBAwcC0LhxYwAyMzPp2rUrVquVxMREAH766ScKCwt5+eWXAejRowfBwcFERUXRoUMHAJ555hkMw+Cyyy7Dz88PwzBYunQpAHa7nTVr1jBgwIBy43j33Xc9vZ+dO3fGarXSpk0bwN07ejpz584FoHXr1thsNjp27AhAQUEB77zzzknlL7roIgIDA0lOTvY89uOPPwKwdetWANLT0/nqq6/Iy8ujdevWPP3006eNQUREpDwanioiInVaSEgIAFarFYDg4GDPuqOP5eTksH37ds9Mp+Hh4Z4yERERgDvJOnjwIPHx8fTs2ZMVK1awbNky7rjjDr766it69uyJzWYrN47Nmzd7/h87diwmk4m8vDxiYmIAyM/P98R6vIyMDA4dOgTAhg0bGD58OIBnu8OHD5+0zdH4/fyONeM5OTkAdOvWDbPZTFFREffddx8Wi4UePXpw++23lxu7iIjI6ShpFBEROcEVV1zBihUr2LVrFz/++CNLliypMOkyDMPz//z588skpt5ud/755/PSSy9VuM2pro88up8OHTqwYMEC5s+fz5o1a8jOzmbNmjWsX7+ejz/+mKSkJK/iEhEROUrDU0VE5JzQvHlzAgICgGO9cgBZWVkAxMbGUr9+fQD69+9PWFgY4B5aunfvXs4777zT7r9Vq1ae/w8cOOD5f926ddx3
333lbhcTE0NsbCwABw8eLLPu8ccfZ9myZRU+t+OtX7+e0tJSZs+ezbfffusZlupwONiyZUul9iUiIgJKGkVE5BwRFBTEbbfdBrgTuYKCAjIzM9m0aRMA999/v6cHz2azMWjQIAD++9//ctlll1U4++m1115LaGgoAO+99x7gvhbypZdeokuXLqfd9uabbwZgy5Yt/PDDD57/P/74Y7p27Vqp57l582aeeOIJ8vLyMJvNnluCHH+NpYiISGWYjOPHxYiIiNRyJ95y46KLLqJfv348++yz5ObmYrVaGTJkCPHx8bz11lsUFRURGBhIr169mDNnDosWLWLBggWnvOXG8TZu3Mhf/vIXAL788ksaNWoEwJo1a5g8ebLnlhsJCQlMnz6dCy64gN9++41nnnmGDRs2EBYWRmxsLEOGDOG6664DYNq0aSxdutQTZ7t27Vi4cCEAH3zwAfPnz2f37t00bdqU6Oho7r77bjp06HDSLTeioqKYPHky77//Pt9++63nsRtuuIF27drx0ksvkZqaSkBAAIcPHyYpKYnbbruN/v37V/OrIyIiZyMljSIiIiIiIlIuDU8VERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUERERERGRcilpFBERERERkXIpaRQREREREZFyKWkUqeXWrFnD8OHDadmyJWPGjOHaa69l6NChvP322z6Na9OmTQwfPpx+/fr5NA4RERGAL7/80tNefvTRRyetz8/P57zzzqNv377MmTPHBxGK1F1KGkVquZ49ezJp0iQA3nzzTd577z2effZZnnzySdasWXPabVu2bMm+ffuqJa4OHTp44hIREfG1gQMHMmnSJAICApg/f/5J6//zn/9QWlrKsGHDuPvuu6stjn79+vHdd99V2/5FfEFJo0gd1KJFC5KTk1m1apWvQxEREalVhgwZwi+//MKmTZs8jxmGwZo1a2jfvr0PIxOpu/x8HYCInJn8/Hy+/vprFixYwOWXX8706dN59dVXeeWVVxgxYgTbt28H4P7778dmszFr1izq1avHa6+9xhdffIHFYqFx48Y89NBDhISEMHXqVJYvX86YMWPYvn07v/zyC1dddRV33XUXr7/+Op999hk2m42AgADuvfde2rZt64nl1VdfJSUlhZycHObMmUOTJk18VS0iInKOa9CgAf379+ftt99m1qxZAKxevZqePXvy+eefA+526+WXXyYkJIQ333yT1157jc8//5ybb76Z0NBQXn75ZTp27EhoaCg///wzMTExPP/889hsNgBWrVrF888/j9VqJSQkhEcffZS4uDgefPBB0tPTmTFjBmFhYUycOJF27dr5rC5EqowhIrXeunXrjOTkZMPhcHiWW7dubfz4449G27ZtjQMHDhiGYRjFxcXGHXfc4dkuOTnZ2Lt3r2d5yZIlxpAhQ4zCwkLDMAxj0qRJxoMPPuhZP2bMGOOGG24wSktLjR07dhjvv/++sWzZMmPo0KGebV599VVjzpw5njjatm1rfP/994ZhGMaUKVOMyZMnV2NNiIiIlG/dunXGnDlzjO+++85o27atkZaWZhiGYdx///1Gfn6+MWbMGGP27NmGYRjGxx9/bPTp08fIy8szXn/9dSMlJcWznzlz5hgXXXSRkZ2dbTidTmPo0KHGRx99ZBiGYezZs8fo1KmTsWPHDsMwDOOdd94xrrvuOs+2ffv2NdatW1dDz1ikZmh4qkgdcv3113Pttdcyd+5cnnvuOTp27EjPnj1Z
tmwZACtWrKBXr17lbr906VIGDx5MYGAgAFdeeSXLli3D6XR6yvTu3RuLxULTpk255pprWLx4MYMGDfJsM3LkSC699FJP+aCgIM4//3wAWrVqVW3XUIqIiHirW7duNGvWjIULF7Jnzx5iY2MJDg4uU2bIkCG0adOGCRMmsGvXLnr37l1mfceOHQkPD8dsNtOiRQtP+7Z8+XLatWtH06ZNAbjsssv49ttvSUtLq5knJ+IDGp4qUoe8+eab+PmV/dgOHz6cF154gVtuuYVPP/2UqVOnlrv9wYMHiYqK8ixHRUXhcDg4fPgwcXFxAISGhp52m9DQ0DJlQkJCPP/7+/vjcDjO6LmJiIhUpTFjxvDcc8+RnZ3NuHHjTllm0qRJ9O/fnxdffPGkdce3bzabzdO+HTx4kB07djB27FjP+oSEBDIyMqhXr14VPwuR2kE9jSJ1XP/+/UlLS2PVqlWYTCbCwsLKLRsfH09mZqZnOTMzE6vVSkxMjNfbFBYWsnPnzqoJXkREpJoMGzYMh8NBamoqSUlJpyyzZMkS/vrXv/L4449TVFTk1X7j4+Np164d8+fP9/wtWbKE5OTkqgxfpFZR0ihSx9lsNgYNGsSDDz7IkCFDyqwLCgrCbrezdOlSPvvsM0aMGMFnn32G3W4H3NOPDxs2DIvFUu7+j25ztDF96623NGuriIjUejabjRkzZnDvvfeecv3WrVspKChg8uTJNGnShNmzZ3u136FDh/LTTz+RmpoKQEZGBmPHjsXlcgEQHByM3W5n3bp1vPXWW1XzZER8zDL1dGPZRMTn1qxZw8yZMzl8+DDff/89iYmJJCQklCkTFhbGRx99xKOPPlomAczPz+fll1/m999/Z/z48XTu3JmioiKefvppFi9eTFhYGA899BD+/v48+eSTrFixgi1btuBwOOjcuTPgvtej3W5n1qxZLF26FIvFwt13383OnTuZPHkyqampHDx4kOjoaGbOnMmePXvIzs6mZ8+eNVpPIiJybjvaXv700084HA66dOlC06ZNPaNp/vGPf/D999+za9cuCgoKmDlzJlarlcsuu4x58+aRkpLCjh07cDgcvPHGG+zcuZPAwEA2b97MBx98wLZt24iKiqJr1660bt2aadOmsXTpUj7//HMmTZrkaZtdLhf//ve/2bhxI9dffz3R0dG+rBaRKmEyDMPwdRAi8ufs2LGDd955hylTpvg6FBERERE5y2h4qkgdtnz5ckpLS1m8eDEjRozwdTgiIiIichZS0ihSh/3666+MGDGCrKwsOnTo4OtwREREROQspOGpIiIiIiIiUi71NIqIiIiIiEi5lDSKiIiIiIhIuZQ0ioiIiIiISLn8fB1AbZGVVYDLVXsv74yODiEjI9/XYZwxxe9bit+3FL9vHR+/2WwiMjLYxxHVPbW9jTwTdf19XV1UL6emejmZ6uTU6mq9VNQ+Kmk8wuUyan2DWNvjq4ji9y3F71uK37fqevy+VhfayDNxNj6nqqB6OTXVy8lUJ6d2NtaLhqeKiIiIiIhIudTTKCLlCnYWgd1eccGAAAosgdUfkIick5x+FopLXRWWs/mZsZQ6ayAiEZFzi5JGkbNcsLOI4r3ZBDsrGCpxqsTPbufXCy6q8Bhtv10NwUoaRaR6FJe66DNjRYXlUib1JqgG4hEROdcoaRQ529ntbPqTiV/z117GEhFR7rZmw0lwQZZ6HEVERETOQkoaRaRClogIfrtqVIXlaqLHUUNmRURERGqWkkYRqVs0ZFZERESkRilpFJGqY5x9U0yfiRN7Q4vt5VxTqt5QkQrZHU5cFu9+rmQWOLj5vZ945LKWtKwfQmGJkyB/y2m30SQ7IiIVU9IoIlXGsNshBNIm/J2i1auInfkkQb37kDH9MewbNxI18Z8Edu9B9ksvUvzLz4TfcisBHTuRu3ABju3bCb1mJP4tW1Hw+Wc49u0leMAlWJOSyF2zlvztewg473wI8ff106yYekNFqsTq7RlMXfYb
b9/WzavyDqeLbWkFRB/5nhj7+v8wDIPnr+1Ag4gADucXEx3sj8lk8mxz4iQ7L91wHhFB1pP2XT88ABPHtlMSKSLnEiWNIlJ1jvwQMwoKMAoKwOy+FWxpaiqO7dswiosBKP7lZ4pWrSR01GgAilatomjVSgIv7oV/y1bkf7SMopUr8G/eAmtSEgf/PY+cr76m3pznoWsn3zw3Eal2pU4XK7dlsPVgPn/r04QmMUHk2Utxenmj7HqhNl4d14mYEH/y7KVk5JdgGAb1wmw4nC5GvPg9YQF+LLqtKwA70wuoHxtSZh8RQVaufWFdhcfSTK1VT72+IrWXkkap1TTpSd1iCggAoN6zczCKizEF2ACIfmQqrtxc/Bo0ACDijjsJHTkKW9u2AISO/os7YWyRDEDwwEuwNmuGtVGSe/0FPSj1s2GpX7+mn5KIVNKZ/PA/2gNYUOJkyrLfKC51cXmH+jSMCuSD27p6eg4rYjJBh8QwAEID/Pjq/gtJzSrCz2xid0YRAVYzwTYLgf4WDmQXMfqVjbx/1wVn/mSlSunWKiK1l5LGWuR0CVKZa6LOpQRJw/zqJJO/Pyb/Yz/y/OrXh+MSPlubtmXKB118cZnlkOFXlFmO/9tt+KXnuRcKsqo4Wt8xSorJ++R9bJ274N+8OUZpKVgsZYbOidQ1lf3hP/2T31n+00HmjetE+4Qwxl3QkLAAPyKD3UNEG0UFUniGsfiZTSRFu9OLpOggvrj3AjILHAAcyLZTP8yGn1mfNxGRiihprE2UIEltExBA229XY3ZpGJC3DMMAV8W9LABGkZ2Mxx4l8t778W/enOwX5pL73gIi772PsNF/ofCbrynesoWgPn2wtWmLMycbk9WKOSi4mp+FSPUrLnUR5GciLMAPTCa2HsinfUIYN1+cdFJZm5+ZlEm9K9ynzc8Mpxm2aDKZPL2WXRpHsuzO7hQYZ5Y02h0uHv9kK32SY+jfOpbV2zNY9XsGPZtH0ys5mnU7M1m3M4vuTSK5oFkUG3dns3F3Np0bhdO1cSSb9uXw495cOiSG0alhOJv357H5QB6t40No2yCM3w/lsy2tgBb1gkmOC2FnegG7MgppHB1E09hg9mYWsS+7iIaRgSRGBnIwx86h3GLqh9uICwvgcH4JmQUlRAX7ExPiT3ahgzx7KeGBfoQFWikoLqXI4SLY393zWlLqotRl4HC6v79choHpSJ2JiJh9HUBdFOwsIrggq+I/Z9E5GY/UMgEBdFi/hrbfrqbtt6tps/Jr2qz8hrZrV3kea/vtajgytPR4BZZAsrMKcJlPP/uguJXs2M7+kVfhzMzwqrzJ35+QEVfi364dAM6MDIzCQkyB7pNChSnfkPPvFyn59VcAsufMYU+PbuT+30IA8hZ9wOHHHsX+008AOPbtxbFnN0ZJSVU/NZEqV1DsTu7+2j2Rj+7szjXnNyi3rKXUSRBGhX9ncp3bmeZEDqfB57+msz2tAIAtB/JZ8uNBthxwj4r4OTWPBetT2bQvF4D/7cnh1dV7+N/uHAC++yOb57/5g+92ukdPrNmRyZOfb2f1tkwAVvyewaMf/cY3Ww8D8N+th/nn4i18uTkdgM9+TeOehb/w8aZDAHy06RA3z/+J//xwEIAlPxxgzGv/48ON+wH4YON+rvr397y3PhWABetTGTJnHW9/uxeAt77dS59Za5j7xTYAXlu9h+6Pr+LlFbsAeHPtHno9tZrXVu8G4N3v9jHo2W892//f96mMeHE9763fB8CH/9vPX17ZyKIjx1/200HGv/Uj//nxAACf/nKIu97bxMc/u+P/cnMaWUd6gb1hdzgpdXp3gk5E/jz1NJ6J2tYjWNviKYdRWgpmMyazmZI/duLKysK/TVvMAQHkf7wcZ1oaIVddjSUsjMxZT1Gauo8m/5rs1b5dBQXkLllO8NDLsERGUnpgP6bgEMyhoef8WdICSyCx9eux99NvyJz1JIG9ehP5tzu82taxby+pQwcTtPBd2n67Cqig
Lk+ReJ7tSrZto+CLz4n42x34xdWndO9e93vdC6bAQGIefcyzHP3oY0Q+MAGTn3tYXlD/AViiY/Bv3x4Aw+UEf38sMTEAFK1ZTeF/vyKwazfo2JHsOc9R8NmnxEx/nJDLh7lnqd2ymfAbxxPQqTPFP2/CKHHgn5yMOTS0imtCpHKO3gojKrgOzIh8CgFWM48Oa0mzWHfP/0XNo4gKttK6vvuz1b1JJIFWM+0T3NdYntcoHC5OonPDcMB97eWY7ol0aOhe3zo+hCs7x9OmgXv75vWCGdyuHi3i3PtvHB1In+RomsS4h9smRATQvUkEiVHudr1eqD/tE8KIC3dfSx4ZZKV5vWBPz2qozY+EiADCA62e+KOCrATZ3K+DxWwiwGrG38/dn2AcuYXS0Sa0pNSF3eHyTEpUUFxKZqEDu8OdqOcUOUjNtpNnd3//ZeSXsD29gMwC90msgzl2fk7NpXuTCAD2Zhbx3R/Znvr543AhTRMivK7/d7/bx8srd3PLxUncdHESn/x8iK9/O8yQdvXo1yqWzQfy2J1RSJv4UJKigygscWIxm9y90SJSaUoafcSZm4Pz4EEs0TFYoqMp+PILAtu28HVY5XLl5eHKz8MSE4PJ6o99w/c4Dx8msOdFmENDyZn/NqWpqYTfdBN+MbGkPzwJx84dxD75NNbERH4dOJjCX34lYelyrE2akHbP3ZTu+oMG/1mGf9Nm5LzyMo6dOwns1QtLWBhFa1bh2LEDY8pDXsVn5OeR+eRMAnv3xhIZycGbbqR0714aLP0I/yZNSZvwdwy7nejJU/CrV4/8Tz7GZLW64w8KwmW3Y7LZzuoE07HrD0o2/4pRVERRytdYmzQl9omnTr/Ntm2YgoPZ/85CYqc/XkORVuDIkFkAo9iOMyMDk82GJTrmpHLVxSgpcc8MazJx6PZbcKalEdCtG4Fdu1H/1dfxizuzCXtMJhOWsHDPclCv3gT1OjYkL2bKo0Q/MtUz/DXsr2MJ6NYN//YdADBHRODXIAG/eHePjX3jBuzrvyPsyCy12S++QNGa1dR7bi5BfftxeMojlGz7jaiJDxLQsROFK1Iw7HYCunbDEhWFYRjV9pnQJFdl7d+/n2nTphETE0NaWhoPPfQQDRs2LFNm8eLFzJw5E6vV/aO/fv36fPjhhwDMmzeP7du3ExUVxc6dOxk7diwXH7lW+HTb+VKA1QzU3Xu7Wi0mBreL8yy3jg+ldfyxkzEdEsM8k/IAdEmKoEvSsaSoe5NIujeJ9Cxf1Dyai5pHe5b7toyhb8tj32sD29RjYJt6nuUh7eMY0v7Y8Yd3imd4p3jP8tXnNeDq84713o7ulsDobgme5bE9GjK2x7H32I09G3Fjz0bExoaSnp7HLb0alxkqfGPPRozp0dBzDejYHg25sksDAq3uJGxU1wSGtI8jxOb+aXlVlwb0aRlD1JFrU4d3qk/3JpHUC7N54m+fGEZiRKDn+YWf4lYn5bE7XJhNEHwk6d16MJ+Vv2fQ6Uidf7k5nXe/28cdfZtw3QVBvL5mD29/u/fIckMW/28/q7ZnMqJTPL2So/lpbw57MovokBhGUnQQ+fZSzGYTgVazT34baCZZqW2UNFYjo6CAktQ0/JNbkvvOfPI/WU749TcSfMmlZD39NPlLPiR68hRCrxmJfd06XI3LH5pzoswnZ2LfuJHIu+8huFPbijfAPemGiwLMwcEUfPE5pfv3E3L55ViiY8h6fg6OHTuI/PsErImJHLrrDko2/0rcv1/Bv0ULDlw/Dse232nwwYf4t2xFxhOP4/jtN+L/bxG21q0p+GgpJVu3EjJsGH4xsTi2baNky2ZcOdmQmAgWC5hMuIrc0xnYWrXCEh7uOYUZPPRyXDnZmIPdU59H3nMfRkkJJot3wyJNwcGEjr7WkziYg4MxBQV5lu1r1+LKy8X0r2nu+ntiJq6sTBL/m4I5KIjUoYNw5eWT8PGn+MXGcvjRKZjMFiLvvc+dFKesxF5qxtamDSar
941abRJ0yaVYWyRTum8vhx+c6NUQxqC+/Wj45de4Cs90GoqqV2AJ9PSYO7L2kv/51/g1bEjo8BGV3teZJC65CxeQ/dJLRE96iOBLBxH217GU7k91T/YD2Nq1r9bJekwmk/vzBAScfz4B55/vWRc96WGYdKxs1D8m4ti9G/8jEw9Zm7fAlZeHX6L7h2LJ1i2UbNmMyeT+0Zfz6isU//Qj9V9/C0tUFIduuQnH7l0EvvEKJDQlb9H7GPZiggYNwi8mFldeHqbgYEzmMzhzX0dGSNSUqVOnMnLkSAYMGEBKSgqTJ0/mzTffPKnc3Llz6d69+0mPr1q1ijfeeAM/Pz+2bdvGyJEjWbduHTab7bTbiZzO8cmSn8WM33FNcuCRayGPCg+0enoxAaJD/MvMehsXFkBc2LGTeYlHrsU8qklMEIUVjWY5zh19m3B7n8a4jvR8Xtk5ns4Nw2ka6+6JbRkXzCVtYkmu5+6pdThd+JlNhAe4f/puPZjPmu2Z9GwWBbiH+374vwM8cEkzkqKDeGnFLj7YuJ+/D2zGqK4JvJayk69+Psjorglc2CyK9X9ksT/bTudG4SRFB5Fd6MBscs/aWxVJpmaSldqmxpLGc/EsqjMnm6LV3+Gf3JLSw+mU/PILjl27ALAmNcLatKlnhsmggQMxH9fDUJHiLe4fe5W5GMN5+DCOojRsHTqQ89ablPy8CVvnzliiY7CvW0fxpp8Iu+56rImJuHKycaan48pzX4vhFxeHKy/XM+wusMeFWBs3wXzk2qvQv4zBlZeLJdZ9FjR68iMYThfWJk0BaL1kEYdziz0/LmOfnFUmtoibbymzHNSnr/sfL398m0NC3T+Yj2jwftnXv96LL+E8fBhzeDiGYRA8cCDOw4exREZiuFy4Cgsx7EVYItzr85ctBYeDyAcmALD9xptxFRXRaO13mKxW9l7SH5MtgAYL38ccHEzmM09jDgkl/LrrMfn7U7x1C5aISCxxcT7rvTw+ISq2ZxNmMaB5I2jWkHorvsbk5+eu33J6corWrgGTiYAeF+AXXDsnXrEmNiTyzrvPfAdeJi5tvvmKwl++I6h3HwBcWZkUrV1L8KWDCL/hxpM3OK43FMBiMeF0nqJHpZqH8/ont8Q/uaVnOervD5RZX++5ua92E54AACAASURBVJQe2I+1eXMAAnv2xBITg19iIgCl+1PdIyJC3Cdzct+Zj2PnTgK6dYOYWA7ceB2OHTto8N7/4d+yFVkvPI/hKCFszFj3yaPduzGHhWGOiDire/H/rKysLFavXs3cuXMBuPDCC7nzzjs5dOgQcXFxZcouWrSIlJQU7HY7o0ePpmVL9+v71ltvYT7y/ZqYmEhhYSF5eXmepLG87eTkSXaMSiQu4ltmkwmzxf16NY4JonHMsfRpULs4Bh3XE3zfgGbc278pR2/5+ZduiVzUPJpm9dzbtEsIo7DESfMjw40Nw8DmZyY80P1T+ed9OazbmcWlbd2/cz755RCf/JzGw0OTSYoOYu7XO/lo0yEmDW7BFZ3jeW31bjbuzuG6CxvSvUkka7ZnciDHTvcmkTSMCiQjvwSzCcICrVjOwhl81VN69qmxpPFcPItqDgry/BgLvfIqgvr09dx3LvzGmwi/8SZP2cAeF2CuRO9EvWeew7F7F/7NmgPeXThu8vcHhzvpC770UgI6d8YS6R4aE3H733AVFGBNagzgHrZotmCJcp+Bi3vx32X2deKPz9Aryvby2Nq1L7NsDgzAlO/9Be5VLaBj2RvCRz/8SJnlRmu/wygowGT1x3A6iX38CZxZWZgDAzEcDkIvuhD74UxMwcG47HacBw+Cnx+moCAMh4PcN14Hs5nwG8djOJ0cGD0SXC6SNv6IYbGwf+RVWCIiiXv5FTCbyX3zDSxRUQRfPgyT2Uzp4XQsEZHuRK6q/ImeHMMwyJz1FI7t24h95jmC+w+ouriqUOnhdPI/XIQ5MoqwkaOq7TjOzAyynnmawF69CblsGP6tWmM7
4T11vON7QwHPcK/axq9+fU8PKUDEbX8rsz5hyTJK09KwJTUiP9tO6NUjceze5Rn+atiLobTUc41l/uJFONPTPcNhD916M6X7U0n46BOsSUmkP+zuBo26/wGwnX0/ks7U/v37CQoK8rRn/v7+hIWFkZqaWiZpTE5OplmzZnTs2JE9e/YwevRolixZQlxcnCdhBEhJSWHgwIHEHHldTredHJ1k5xhnFc3UKrWPyWTiSI55UpI5tH0cQ48b7vuPQS34x6AWuI5c23nXJS3o0yyK5kd6Ls9rFIHFZKLZkZ5Ns9lEsL+FiCNDbLcezGfD7myu6uIeMvzRTwf5+rfDTL+iFQ2jApn95Q6+3JLOv4a3YlDbesz9eic/p+Zya6/GnJcUgd1Rtyf5UU/p2adGksZz9SyqOSKSoF7unjZroyRPwlgVLBERWCKO/Gj1Mtm0xMRiS3InieHjri+zLrBn2eTi6I/Cc4XJZMJ0pDfFZLEQfMmlx9ZZrSS//caxH/02Gw1Xr8WVlY3JZMLlchE5YSJGYQEmiwVXXh7+yS0xHA5MVivO7Gwcv/9OaWioe31+PlnPPI0pMJCQ4VfgKipiX78+mAICaPTdBnA4OHTXHfjFxRHzr2kYpaUUfPYplphYAnv0cE9OYBhnNiTQW6UOggcNpvAbf4J69aq+4/xJrowMsl94HmuL5GpNGk3+/oQMHwGlDswhIQR06lxtx6pNTP7+WBMTMVutgJ2wMWPLrE/86GNcRUWYjvSYRt5zH6UH9mOJicUwDMxhYZjycrHUr49hGBR+8bn72uJ/TsLbk11yTLsjs+wCNGrUiFatWpGSksKoUcfe+/v37+f9999n9uzZldquItHRIV6X3Zfp3XB2i8VMbJRvfy7GxmpCqFPxVb1k5hezanLfCssF+VuICrHVQETHxALN6h37HNw4oGwdPXvdeQCea8Efuaod+zKLaJ0QRmyojUs6xRMXFUTnFrHExoYSFmIjPMhKk/hwYmND2ZVl58e9uQQE24iNDSV1t5e/7Xz8OSrvvVKZ7wGLv4XCkopPwPjidT+dzPziU8a9L7PQcxkJ1L64z1SNJI06iypn7IRhfqcrV1OOTlZydMISs81G+NhxnvXm0FAavL/o2HJICA0WLcFVkA+A4XIRdt0NcOTeh67cXMxR0ZgDAjCZTJRmHMb+7Vos9dxDYJxZWRye9E/MkVE0WrEKV3Y2e/v1xtq4MQlLluEqKCBz5gz8EhKIuO1vGI4S7D/+SHBykzN/jlZ/Im65lYhbbj3jfdQES0wM4Tff4hkWXX3HiT31MFTxDFEHCBk2vMy64z8HhstFvWeeozQ9HXNICORn1liMtV2DBg0oLCykuLgYm81GSUkJubm5JCQklCn3xx9/0KTJsc+11WrFftw1uampqcyYMYNZs2YRGRnp9XbeyMjI91w7VhE/P4tXPXV+JnzaA19bRwD4mq/rxZuf1s4iJ+lFNXtrocrWS6gJWkcHgL2EdHsJ/ZpF0q+Z+3OZnp7HxIFNmTiwqWf5nr5NGH1ePAlBfqSn53k9y6vT6fLZ63W6OnF6Oczb6XSRV+R9r6Szhl/30ynEVCfjLo/ZbDrtCcJaNRFOXTmLWmzP9qqcxWKq1Nk6h7mEDuvXVFjOHBREbPSx/VZXPFXtzI7t/TbVfZ7tT9Vd/HnH7SiU+jMfLbMc/+sPGC4XJrMZZ3Ajgt59G5ejhMjYUEqcBRSOGI45IIDY2FCKMg+w1+nEYnK/nkU5aexZ+h9sTZvQYvJE7H/sYvf4G4j8j3fX9Z74vijel8rOO+4m7qYbiLr8sjN/zlXslPUfG0r9fz188uNeqsnPTl3v0aiS+K8Y7Pm3pr+3anP9R0ZG0rNnT1atWsWAAQNYu3YtXbp0IS4ujq+++oru3bsTGhrKtGnTmD17NuHh4RQWFvLLL79w7733ArBnzx6efPJJpk+fTkREBJ988gn169enS5cup92uOpw43LNcGtop4tEoKpBGUcdOwgX5
1837IzucLvZn24mJqp1zIciZq5Gk8Ww7ixp8qgktTsHpNCp59scfAk59v6oyZ3NcwHH7rb54qo6vz1z+WTUef3t3kpmengeWYMIenXFsOSqepI0/4MrLJz09D6fhT/SURzH5uc9OOtJzsXU5D7yc5fXE90XWvDfIX/89RnQszh4V9xbUhPLq35WfT+678zEFBBJ+3fWV3m9NfXb0/j9ZTX5vHR9/RWdSfWXKlClMnz6dlStXkpaWxmOPue/f+eyzzzJ16lTOP/98evfuzcSJE0lKSmLv3r1MmDCB1q1bA3DTTTeRlZXF0KFDAbDb7bz44osAp91ORKSynC7Dc7OcT385xOb9efyleyLx4QFc/8YPbEsr4MuJ3l3acji/RFNP1RE1kjSebWdRa92QydoWj1Q7k9XfM0mRJSKC0Kuu9qyzNmlC/JtvYznD2z6Ej78JS1z9kyYzqo1chYVkv/C8e5hqJZNGV3Exrtyc6glMpJISExN56aWXTnp8+fLlnv/HjRvHuHHjTioD8MUXX5S779NtJyJyOpsP5LH1QB49mkbRICKAR5Zu5ast6Tw7qh1D48L49Jc01u3MomvjSOLDA2gYFUh+calnAqGKOLw8gQhgGDDm9Y34W8y8fn1niktdjH/rB58te/kUzxo1Njz1bDqLeuLMiL5W2+KRuubYt15pejqW6OhqnVSmKplDggm/+RbMoWEVFz5B5hOPE3jt1RUXFBER8aGXbjjPMyvriQxMHJ1y5kxuX5Fd6GBXRiHx4QHEhdl4a+0ePt+czu29G3Nxi2jeWbePr7akM/XyljSICMBqMVHqMkjLKwbgsvZxdGsc4ZmJ9vERrTGZTF7fczMuzEZpJRLH3w8VeK73NAzDp8veOpxfQmKwH4YBa3dk0rxeMPHhda8jp8aSRp1FFamt3F/shmFw6G+3YtiLiZvzPNYmZz6RTk0xBwUTedc9ldrGVVCAOTiYiFtvx2RXT6PPlDtCwsCw24/MyGrSCAkROScdf/9QAxN9Z6RUuE15t6/ILXLPVh0WaOWbrYf579Z0Lm1bj4tbRPNSyh8s+fEgfx/YjFFdE8gsdLA9rYCd6QVc3CKabk0iCLCaqR/mnqLozr5NeOCS5gQeuebykrZlJ6Kr7D15zSbw9/NuG5MJ5t/YxXOLcn8/s0+XvX2qhuG+p+ierCL+/sGv1Av1Z/ldPcgudPBSyh+0aRDK8E7x3u3sCF/cB7NWTYQjIj5wZHyFMz0dV3YORkkJfidcb1xbGaWl5Lz2ChjGSfcYPJXClSs4/NCDxM5+lsCu3bA4wzS020dOO0IiGAxHCXnvv0/QgIH4xWkkhYicW46fUMq7m1e4bTmQx9odmbSJD+WCZlG8vGIXr63Zw629khh/URI70gv4YnM68eEBXNwimhZxIbSJDyXY5k4Cr+rSgEvb1KNRtPt794pO8VxxXEITGXzquTdqSsv6x65JN5tMPl32VmSwu5e4pNRF18YRxIS463BbWj5LfjzItvQChneKZ+vBPO5//1cubh7Fg0OSybeXsi+7iMbRQQRYy06M5Iv7YCppFDlbHdeTY7GYcJ4w/MOVm4OroABnQSGEROFXrx6Jn36OY/cuTP6+bRQqI/uF58Fi8SpptG/4HldODoWff0Zg124a2l2LZT75BHn/t5CSbb8TM/Vfvg5HRKRO+GlvLi+v3M2IzvFc0CyKuHAbAVazp1eqV3I09cNttGvgvqzj6vMacPV5x+7NffwMrn/G8T2lFZXzpsesLvMzmwCD5vWCeeEvHTyPN4wM5L4BTQkLdCeVO9ILOZxfQl6xu2dw455sJizaTI+mkcwZ3Z4daQV8/dthujQKp3VS5KkOVb3Po8aPKCI14viE6FSzX2a//R7ZL71AxN/uIGTEVdg3fE/wJZfg36y5L8I9MxYL4TffgsnP6rmh8okMh4O8JYsJvfoaIu+5D/9WrQkePMQHwUplhP11LPYf/kdQn4pv9C0iIm4dG4Yxpkci5zWK
ANzXHA7vWN/TPibHhZAcV/0zSFfu1jvn5vyp9cMDuLZbomd5cLt6dEwMw3nkbg5Ol0GT6CDP67VhdzavrNrNsI71lTSKSM0J/esYwsaOwxwSQtac58h5dR7279fXqV4dk8lU4TWN6f94gML/foXz0EEi77qHkCFDayg6+TOsjRvT4IPFmEwmXHY7Zg0RFhGpUOv4UFrHH7svrZ+lcpO2+EJleiVr0/1dy4vbYjHjdLrKlPMmbrPJRGLksZ7efq1i6dcqFuPIZURtGoQypkciHRMrP/lfVVDSKHIOCXYWwdF7mFoAnJCXQdBfrib+8kGYI8IxFWRBQIC7p7IOyHn9NVxFhUTcfGuZYbWGy4XJbCZ01GiKf/lFPVZ1zNH3qlFYgDM3F0tkFCab7eSCdei9KiIiJ6tcr2TtUV7csVFBZUd3/cm4j/YSt08Io32CO2GszHWuVUVJo8i5xG7n1wsuqrBY229X15lr/XJeexVXXi5hY8dhOZI0Fn23jsynniDupXkE9riAhOWfYD5VwiG111n4XhUREamrlDSKSJ1SprcUaPnh/wEGZnMpFGQBBgH1owmd+HfSF7xL5D33KmEUERER+ROUNIpI3eJlD1TrLz4l4q4LayAgERERkbObkkYROSuZQ0MxmWv/BAAiIiKnU1cnipGzi5JGEREREZFaqq5OFCPVxxcnEpQ0ioiIiIiI1BG+OJGgsVsiIiIiIiJSLiWNIiIiIiIiUi4ljSIiIiIiIlIuXdMoci4JCHDfDN2LciI+pfeqiIhIraGkUeQcUmAJhOBAX4chUiG9V0VERGoPJY0iUreoB0rOQvv372fatGnExMSQlpbGQw89RMOGDcuUWbx4MTNnzsRqtQJQv359PvzwQwAMw+Cpp54iIyOD/Px8+vfvz5VXXlnhOhEREW/UWNKoBlFEqoJ6oORsNHXqVEaOHMmAAQNISUlh8uTJvPnmmyeVmzt3Lt27dz/p8c8++4zdu3fzwgsvUFxczODBg+nWrRuJiYmnXSciIuKNGksa1SCKiIicLCsri9WrVzN37lwALrzwQu68804OHTpEXFxcmbKLFi0iJSUFu93O6NGjadmyJQBLly6lb9++ANhsNrp168bHH3/Mrbfeetp1IiIi3qiR2VOPNogXX3wx4G4QN2zYwKFDh04qu2jRIp544gkeffRRfvvtN8/jS5cupVevXkDZRq+idSIiIrXZ/v37CQoKwmazAeDv709YWBipqallyiUnJzNmzBgmTpzIDTfcwA033OBpR1NTU4mOjvaUjY6OZt++fRWuExER8UaN9DSerkE8/ixqcnIyzZo1o2PHjuzZs4fRo0ezZMkS4uLi1CCKiMg5rV27dp7/GzVqRKtWrUhJSWHUqFHVfuzo6JBqP4YvxMaG+jqEWkn1cmqql5OpTk7tbKyXWjURjhrE06vrb0DF71uK37cUv2/V5vgbNGhAYWEhxcXF2Gw2SkpKyM3NJSEhoUy5P/74gyZNmniWrVYrdrsdgISEBDIyMjzrMjIyaNy4cYXrvJWRkY/LZVTymdVusbGhpKfn+TqMWkf1cmqql5OpTk6trtaL2Ww6bT5UI0mjGsQ/r66+AY9S/L6l+H1L8fvW8fFX1Cj6QmRkJD179mTVqlUMGDCAtWvX0qVLF+Li4vjqq6/o3r07oaGhTJs2jdmzZxMeHk5hYSG//PIL9957LwDDhg3j448/ZtSoURQXF7N+/XruuOOOCteJiIh4o0auaTy+QQROahDz8tyN+bRp08jJyQHwNIjdunUD3I3eypUrATyN3tChQytcJyIiUttNmTKFDz/8kEceeYSFCxfy2GOPAfDss896ru/v3bs3EydO5PHHH+eBBx5gwoQJtG7dGoDBgwfTsGFDJk6cyH333cff/vY3zwzlp1snIiLiDZNhGDXSvbZv3z6mT59ObGwsaWlpPPjggyQlJXHZZZcxdepUzj//fN5++23Wrl1LUlISe/fu5ZJLLuGKK64A3LfV
eOKJJ8jKyiIvL49+/fpx9dVXV7jOW+pprF6K37cUv28pft+q7T2NdUFtbyPPRF1/X1cX1cupqV5Opjo5tbpaLxW1jzWWNNZ2tb1BrKtvwKMUv28pft9S/L6lpPHPq+1t5Jmo6+/r6qJ6OTXVy8lUJ6dWV+ulovaxRoanioiIiIiISN2kpFFERERERETKpaRRREREREREyqWkUURERERERMqlpFFERERERETKpaRRREREREREylWppPGnn35ixYoVlJaWkpWVVV0xiYiI1DlqI0VE5GzlVdK4c+dOBg0axPjx45kxYwZ2u53x48ezYsWK6o5PRESkVlMbKSIiZzuvksbHHnuMBx98kA0bNlCvXj1CQkJ45513ePXVV6s7PhERkVpNbaSIiJztvEoaS0tL6d27NwAmkwmAoKCg6otKRESkjlAbKSIiZzs/bwoZhsH69evp1q2b57Gff/652oISEZGTGYZBVlY6JSV2wKix46almXG5XDV2vD/PhL9/AJGRsZ4krjqpjRQR8T1ftZEnqu1tpsXiR0hIBIGBwZXazquk8Z///Cc333wzwcHBZGdnc/nll5OVlcW8efPOKFgREam8/PwcTCYTcXGJmEw1N/m1n5+Z0tLa2wCeyDBcZGcfJj8/h9DQiGo/ntpIERHf81UbeaLa3GYahoHDUUJ2djpApRJHr5LGdu3a8eWXX/LNN99w8OBB4uPj6dOnDyEhIWcWsYiIVFpRUT5RUXE+bQzrApPJTGhoJJmZh2okaVQbKSLie2ojK2YymfD3txEREUtOzuGqTxoBQkJCuPzyy8s8tmLFCs91HCIiUr1cLicWi9df2+c0i8UPl8tZY8dTGyki4ltqI71ntfrjdJZWahuvavY///nPKR+fN2+eGkQRkRpUE9fonQ1qsp7URoqI1A5qI71zJvXkVdI4ffp0WrVq5VnOy8tj9+7dtGvXrtIHFBEROerOO2/hxhtvoUuX88st43K5GD16BK++Op+wsLAajM47aiNFRKSq1bb20aukcdy4cdx1111lHtu9ezcffPBBtQQlIiJylNls5tlnX6yVCSOojRQREd+oyfbRq6TxxMYQICkpiQ0bNnh9oP379zNt2jRiYmJIS0vjoYceomHDhqcsu2/fPoYNG8bDDz/MlVdeCbhnp1u1apWnTHFxMZdffjlTpkxh8eLFzJw5E6vVCkD9+vX58MMPvY5NREQqb8mSRbz55qsMGHAphw4dZMeObdx2211s3bqZH3/cSHBwCDNnzsbpLOWll+bi72+joCCf+vXjGTv2Bj7//BP27dvLBx8sJCXlv4wdewOPPPIg6elpDB06jHXr1lJSUsyIEdfw+uvzeOSRx+jS5XwOHNjPv/89l7i4+hw8eJBGjZK46abbfFYPaiNFROR4Z2P76FXS+Pzzz5dZdjgc/P7775U60NSpUxk5ciQDBgwgJSWFyZMn8+abb55UzjAMZs2aRUJCQpnHg4ODWbNmjWd5+vTpDB482LM8d+5cunfvXqmYRETkzI0YcTWbN/9CTk4206Y9wYYN63nwwQd47bX53HrrHdx++3h++GEDP/74AxERkVx//U0A3H77eNq0acellw7ho4/+wzXXjPYMv3nkkccYPXoEF13UixtuuJkPP3yfYcNG8MUXn3qO+69/Pcxf/jKOiy/ug8PhYPLkiT55/kepjRQRkeOdje2jV0njwoULufjiiz3LVquVzp07c9VVV3l1kKysLFavXs3cuXMBuPDCC7nzzjs5dOgQcXFxZcq+8847DB48mHfeeafM45MnT/b8X1hYyNatW3nooYc8jy1atIiUlBTsdjujR4+mZcuWXsUmIiJ/Trt2HQBo0CCBoKBAGjVKAiAhIZHDhw+zbt1aIiMjeeqpGQAEBgZy6NDBcvcXERFBixbu7/CrrhpZZl1hYQE//7yJ9u07Ae72aObM2VX+nCpDbaSIiJzK2dQ+epU03nTTTVx//fVnfJD9+/cTFBSEzWYDwN/f
n7CwMFJTU8s0iLt27WLLli2MHTv2pAbxeB999BHDhg3zLCcnJ9OsWTM6duzInj17GD16NEuWLDmpsRURkarn7+8PuGdjs1r9PY+bTCYMwwBg8ODL6N//EgBKS0txucq/8fHx+6gL1EaKiMipnE3to1dJY3mN4bRp03j44YerJBCXy8WsWbN47LHHKiy7fPlyXnnlFc/y8TPUNWrUiFatWpGSksKoUaO8Pn50dO2/CXNsbKivQ/hTFL9vKX7fqor409LM+Pn55qbF5R3XZDJhNpvw8zNjsZjLlD267oILLmTDhu+49NJBAMyZ8xz9+w+kY8dOBATYMJkMtm/fip+flaCg4FMez2QyYbGYCQsLpUOHTmzevIlevfpQXFzMv/71CNOnP3FSbGaz2VPv1fn+URtZd9X175Xqono5NdXLyWpTnfiyjTyRn5+5VrePULaN9Oo5lbdi3LhxFW68detWrxrEBg0aUFhYSHFxMTabjZKSEnJzc8tck/Hbb79RXFzMM888A8Aff/zBkiVL2LFjBxMmTPCU27BhA+3atSMgIMDz2B9//EGTJk08y1arFbvdXmFcx8vIyMflMiq1TU2KjQ0lPT3P12GcMcXvW4rft6oqfpfLRWlp+Wcgq4ufn/mUx12zZhW//vozhw4dom3bDrz11mvk5uayYMG7NGzY0LPuzjvvZcmSD5g+/V/YbDbq1atP27YdKC110adPf+bPfxs/Pz8mTHiQl19+gdzcXJ588nHuvvvv+Pn5sWzZEvbu3cPChQto0CCRhx9+lH//+3k2btxATk4OV1016pTxuVwu0tPzytS/2WyqkgRIbWTdV9e/V6qL6uXUVC8nq2114qs28kR+fmZWrFhRq9tHONZGHlVR+2gyjvaNnmDw4MHccsst5W5oGAavvPIKn376ablljnfzzTczatQoz0X+r7/+Om+//TZfffUV3bt3JzS0bKY7duxYRowY4ZkZ7qi///3v3HPPPTRq1Mjz2Pjx45k9ezbh4eEUFhYycOBAXn31VVq3bu1VbFD7G8Ta9sGsLMXvW4rft6oq/oMHd1O/flIVRFQ55SWNtd3R+qqOpFFtZN1X179Xqovq5dRULyerbXXiqzbyRHWlzTyxvipqH8vtaZwwYQL9+vU77cEiIiK8DmzKlClMnz6dlStXkpaW5hli8+yzzzJ16lTOP989M1BpaSkzZsxg165dLFu2DKfTyTXXXANAeno6dru9TGMI0Lt3byZOnEhSUhJ79+5lwoQJlWoMRUREKkNtpIiInEvK7Wn0xvjx43nttdeqMh6fqe1nUWvb2ZzKUvy+pfh9Sz2NvlGdPY3eUBtZu9X175Xqono5NdXLyWpbnainsXKqrKfxeDt27OCRRx5h8+bNlb4OQkRE5GymNlJERM52XiWN06ZN45577uHpp59m9uzZOBwOVq1aRWpqanXHJyIiUqupjRQRkbOdV0kjQLdu3bBarZ7Z3Bo3bsxtt91WbYGJiIjUFWojRUTkbObVzUxcLhcOhwObzcaXX35JUVER69atY9u2bdUdn4iISK2mNlJERM52XvU0Dh06lCVLlnD77bdz2223UVBQgMViYfLkydUdn4iI1EGlpaW8//57vP76y7z22jskJTX2rMvLy2PWrBkEB4eQnp7G+PG30qpVmwrX1VZqI0VExFt1tX30Kmns378/0dHRAKSkpLBz504SEhI8j4mIiBzv44+X0b59h1NODDNv3ou0bduBkSOvZefO7Uya9A/ee+9DTCbTadfVVmojRUTEW3W1ffRqeOrYsWPZu3cvACEhIXTo0EGNoYhIHeCfuZfwzZ8R+dMSwjd/hn/m3irb95Ilixg+fBDPPfc0kyZNYPToK1m9egUAw4dfSfv2HU+53RdffEKPHhcC0LRpc0pLHfz6688Vrqut1EaKiNRNuSUuduU62Z7jZFeuk9ySqrlVxtnYPnqVNJrNZhYsWMBdd93FwoULycurPfdkERGRU/PP3Evwvh+w
OIowARZHEcH7fqiyxHHEiKvp1q0HeXm5zJjxFPff/w/mz3/ztNvk5uZQUFBAZGSU57HIyCj2799/2nW1mdpIEZG6TgFMPQAAIABJREFUJ7fERXqRQemRW9CWGpBeZFRJ4ng2to9eDU997rnnaNasGS6Xi1WrVjFjxgxcLheDBg2ib9++1R2jiIicwD9zD7bM3act41eYicko2/iZDCfB+/6HLXNXudsVRyVREtXI61jatesAQEJCIpmZmV5vd7ZQGykiUrvklrjILTFOW8buPPkxA0grMsgtOcXKI8L8TYT5e9Xvdla1j149Y4vF4i5sNhMZGUlAQAApKSnMmjWrWoMTEZE/wSjnbGl5j58hf39/wN1GGBXsOyws/P/ZO/M4Oco6/7+fqupzrkwmk8lMMrmB3JALJBBAyIIQNhIwiOuiXAq6sAu6/HAFQhZBcTca5FjQ6IJrUNxF44ZDgkFBIAsG5ApXyEWSuSdz9Mz0XfX8/qienu6Znpmes7snz/v16ld3P/VU1VNPd9VTn/oeD15vHs3NXYNnc3MT5eXlfS7LZtQYqVAoFIpUjKXxMS1L4ze+8Q0+85nPsHXrVlpaWli9ejWbN29m0aJFI90+hUKhUKQgPH5qv9bAovefRY8EepRbDg9ts1eOVNP65dxzz+fVV3cydeo09u/fh67rzJ+/sN9l2YoaIxUKhSK7KHRqFDr7rnPQZ8ZdUxMxBEzJ10emYf2QzeNjWqJxz549TJkyhW9+85ucddZZGEZaqykUCoUigwQmzSfvyJsI2eVmI4VOYNL8Ydn+K6+8xPvv76a+vp7Fi5fyX//1CD6fj//+718xb958nnvu9wD8/Oc/46yzzuGMM84C4Ktf/Rr//u/f4+DB/dTX13HHHXehaVq/y7IVNUYqFApF7jHeLWgISBJ1o4iVD5WxOD4KKWXfDr/Agw8+yD/8wz+MRnsyxtGj7VhWv12RMUpLC2hoyN3kCqr9mUW1P7MMV/traz9h0qRpA1rH2XQYT+17aJEAlsNDYNJ8wuMrB7QNw9CIRofXpXU06OyvxP7XNEFJSf6w7keNkblJrl9XRgrVL6lR/dKTbOuTwYyRvrBFU9BOhmMIWzCmG6/YG7kyZnbvr/7Gx7Qeh471wVChUCjGKuHxlQMWiYqBocZIhUKhyE3ScWNV2GS3z49CoVAoFAqFQqFQKDKKEo0KhUKhUCgUCoVCoegVJRoVCoVCoVAoFAqFQtEraYnGZ599li9/+cu8+eabALz33nvccMMN1NXVpb2j6upqvv71r7N+/Xquu+46Dh8+3GvdI0eOsGTJEn7729/Gy1577TWWLl3KaaedFn8l8sgjj3DjjTdy00038eMf/zjtdikUCoVCMRSGY4xUKBQKhSKbSUs0PvbYY/zzP/8zixcvBmD+/PlcccUVrF+/Pu0dbdiwgYsvvpg777yTyy67jNtvvz1lPSklGzduZPLkyT2W3XrrrbzyyivxVyfvvPMOTz75JD/84Q/54Q9/yB/+8Adef/31tNumUCgUCsVgGY4xUqFQKBSKbCYt0ahpGgsXJk8euXTpUoLBYFo7aW5u5uWXX2blSnsy6RUrVvD666+nfAq7ZcsWzj//fMaNG9dj2Y4dO7jnnnvYsGEDu3btipdv27aN008/HU3TEEJw5plnsm3btrTaplAoFArFUBjqGKk8cRQKhUKR7aQ15UY4HObw4cNUVnalbT98+DDhcDitnVRXV+P1enG5XAA4nU4KCwupqqqirKwsXu/gwYN88MEHXH755WzZsiVpGxUVFVx22WWcccYZNDc3s3btWh5++GHmzJnDkSNH+NSnPhWvW1JSEncTUigUCsXosWXLoxw4sJ9x44o5dOggn/vcZZxyyqmA7Uny0EP30dTUREdHBytXnskFF/xthls8dIY6Rm7YsIFLL72UVatW8cILL3D77bfz6KOP9qjXnyfOxRdf3KO80xPniSeeQAjBunXrWLp0KcuWLUv/ABUKhUIx
ZHJ9fExLNF5//fVcdNFFLFy4kPHjx9PU1MTu3bu57777hq0hlmWxceNGvvOd76RcXllZGR+Qi4uLWblyJc888wxz5swZlv0P92TPI0FpaUGmmzAkVPszi2p/ZhmO9tfXaxjGwPKX6ft343ryEQJX3Y7WcCT+WRb09Oboi3T3+5e/vMp99/0HhmGwf/8+rr76Szz77B9xuVw8//wfqKo6wve//wNCoRCf//zFLFu2nIqKigG1JV00TYv3+0j+f4YyRnZ64tx///2A7Ylz/fXXU1dXl/RQFbo8cbo/VAXbE2fPnj0Eg0FWr17N8uXLgWRPHCDuiaNEo0KhUMCBliC/39/C5QtKafRH4p8LnPqw7+u11/6PTZsejI+P1157BU89tQOXy8Wf/vQ8hw8f5nvf20goFOKLX/wcixcvpbx8ZMbHwZCWaDzttNP43e9+x9NPP01tbS0nnHACd911F1OmTElrJxUVFfj9fkKhEC6Xi3A4jM/nS3pa+tFHHxEKhdi0aRMABw4cYOvWrezbt4+bb76ZgwcPMn369Hh9h8MRd/2ZMmUKTU1N8WVHjx5N+SS2L44ebcey5IDWGU1KSwtoaGjLdDMGjWp/ZlHtzyzD1X7LsohGrbTrGwfew/PY98GM4vr1jzCOfAxmFOP5/yFw4dXpb8fQUu5369YnePTRn3LuuedTXX2EN9/8K7fccitg1584sZxAIEBrq4/x40t45pmnWLFiJdGoha47WLx4Kdu3/57LL78y7bYMBMuyaGhoS+p/TRPD/pBwKGOk8sRRKBSKzHCgJcgv3z9K1JL85qMmqtrCRC3Ji4d8XDi7eEjb7m18NAxbelVUTCYQCNDR0Y7L5WL79qdZscIO43O5XCxevJQdO7aP2Pg4GNISjWBb+q677rqkst/85jdccskl/a5bXFzMaaedxksvvcSqVavYuXMnS5YsoaysjB07dnDKKacwd+5cNm/eHF/nwIEDrF27Nu5u8/DDD3PNNdcwe/ZsTNNk165d3HTTTQCsWbOGDRs2cOONNyKE4MUXX+SWW25J99AUCoUiJ8l/5F9TlrdfeQcAeb/eBJEwAjAOvo+Q9oMx57s7CVx4Nc43X8D51ou9rt8fa9d+jvff301NTTV33/3vvPvu27jd7vjynTtf5owzPs348SUA1NbWUFw8Pr68uHg81dXVae0r2xnKGNkfmfbEgdzwxhkMue7BMFKofkmN6peeZFOfdPfG+embqTNYX7PYfiD33x82EYkZjD5pDdFpOtrd6OeiOSX8taadv9Z29Lp+X6xbdykffPAedXU1fP/7P+Cdd+zxsbN9r732Cmee+WkmTiwF7PFxwoSS+PKSkhJqa6sH7F00EBK9cdKhV9G4a9euuHvLAw88kLLO1q1b0x4Q77jjDu6++27+/Oc/U19fHx/87r33XjZs2BB3lYlGo3z3u9/l4MGDbNu2DdM0WbduHStXruSee+5h1qxZ1NbWsmbNGs4++2wAFi1axOrVq7npppsQQnDOOefE265QKBTHKuG5y3G+/xqEAnHBKIUguPKiYd3PsmUnA7Bw4YnxstraWrZt+y3/+q/fHdZ9ZQvDNUbmgieOvV52e+MMhlz3YBgpVL+kRvVLT7KtT7p740iZ+prVWWdOiZsPGgOETBkXjAJYOaWAaNTCtGTKbfTn8dPpnSOlZMmS5USjFvPmLYyvW1tby9atv+Ff//W7SdsyTRn/blkSy+p/X0Oh0xunk/48cXoVjZs3b2bevHnk5eXx+OOPxzOfJhIKhdJu2JQpU3jooYd6lD/11FPJDTIM1q9f3yNV+erVq1m9enWv27/66vRdrRQKhWIs0J9FMLLodFzvvhIXjABoOlpzPQDhxWcRXnzWkNvhdDqTvtfW1vCjH/2AO+64i6KirtjJSZPKaW7uEjDNzU1UVk4d8v4zwXCNkcoTR6FQKEaGKxdN7HP5olIvuxsCJMpCTUBz0ARgcVkei8vyhtSGsTQ+9ioaf/KTn8Q/
X3XVVVx11VU96vzsZz8bmVYpFAqFYsh4n3kUzCgA0nCCtBBmFOf7rxL425F50FZVdYQHH/wR//Ivt1NYWMTzzz/HxIllLFx4IueeewE7dmzns5+9mFAoxJtvvsGVV35lRNox0gznGKk8cRQKhWL0+f3+FqIxDwpDAynBlPB+Y2DIMY2pyPXxUcjebLcJzJ07l6uuuoqbb755NNqUEbLd9SbbXAAGimp/ZlHtzyzD1f7a2k+YNGla2vVFWwvuF3+D871X8V94Ncb+93C+/yodl95IdMb8tLfTWyKcV155if/4jx8xYcJE1q37PKeffiaXXbaWlpaW+NPVUCjI9773A5YsWYaUkgceuJfW1hY6Oto57bQzuPDCz6bdjoHS2V8jnQhHjZG5Sa5fV0YK1S+pUf3Sk2zrk4GOkW1hkxcP+Xi/McDq2eM40BLi/cYA6+aMZ8Y4d/8b6AXD0HjxxRezenyEnv3V3/iYlmi8/PLL+cUvftGj3OfzUVhYOMimZhfZPiBm24k5UFT7M4tqf2bJlGgcLnoTjdnOaIlGNUbmJrl+XRkpVL+kRvVLT7KtTzI1RnYnV8bMgYrGtFLynHXWWbz66qs9yq+//vpBNFGhUCgUirGDGiMVCoVCMdZJa8qNxx57jIaGBrxeL/n5tgKVUnL06NERbZxCoVAoFNmOGiMVCoVCMdZJSzQWFhZyzz33JJVJKfne9743Io1SKBQKhSJXUGOkQqFQKMY6aYnGO++8k0WLFiWV1dXV8YMf/GBEGqVQKBQKRa6gxkiFQqFQjHXSimn8z//8zx5lmzdv5v777x/2BikUCoVCkUuoMVKhUCgUY520RGNTU1OPsttuu43a2tphb5BCoVAoFLmEGiMVCoVCMdbp0z317LPPRghBY2Mj55xzTtKyYDDIcccdN6KNUygUCkVu8cwzT3L//ZtwOBwATJxYxk9/+l+AHef30EP30dTUREdHBytXnskFF/xtJps7JNQYqVAoFIp0yfXxsU/ReM8998SD+b/97W8nLcvLy2POnDkj2jiFQqFQ5B533/1vLFmyrEf5n/70PIcPH+Z739tIKBTii1/8HIsXL6W8vCIDrRw6aoxUKBQKxUDI5fGxT9F48sknA7Bp0yZmzJgxKg1SKBQKxfBw6NNnYKWY9kErKWHqn/485O1v3foEjz76U84993yqq4/w5pt/5Yorrubpp//Czp0vEwqFuOiiS5g1azYA27c/zYoVKwFwuVwsXryUHTu2c/nlVw65LZlAjZEKhUKRu3zmR/9HU0ekR/n4PAfP/tOpQ9r2WBwf08qeOmPGDJ588kn+93//FyklmzZt4qGHHuLGG2/E5XKNdBsVCoVCMQhSCca+ygfK2rWf4/33d1NTU83dd/877777Ng6Hk/nzFzF//gKqqo5w3XVX8cgjjzFhQim1tTUUF4+Pr19cPJ7q6uphaUsmUWOkQqFQ5B6pBGNf5QNhLI6PaSXCeeCBB3jsscc4/fTTaW1tpbCwkNmzZ7N+/fqRbp9CoRgmhACPS6PYKyj2Soq9Ao9LQ4hMt0wxWGquuoK2/93a5+fBrDtQli2zLW4LF57InDlzmT9/AQCTJ09h9uzjeOWVlwa8zVxCjZEKhUKRfVy35W2eeqe2z8+DWXcgjKXxMS3R+H//93889thjXHHFFXg8HgAuueQSlRlOocghir0CT9W7iNe2If7vd4jXtuGpepdir0DX07oUKBQpcTqd8c+HDn2StMzhcBAKhQCYNKmc5uauTKPNzU2Ul5ePTiNHEDVGKhQKhSIVY2l8TOtO0TRNdF0HQMTMEpZlEQwGR65lCoViWBACZNAPb2yHqj1gRe0FVtT+/sZ2itxSWRxzkPL/fJSCz67t8/Ng1h0K99777/h8PgACgQAffvgBixcvAeDccy/g1Vd3AhAKhXjzzTdYteq8Ie0vG1BjpEKhUGQfD//9iVy4aFKfnwez7mDJ9fExrZjGJUuWcMUVV3DRRRfh8/nYvn07
Tz31VDwJgEKhyF7cTg158F0IdaSuEOqAQ7txVSwkGLJGt3GKnOaVV17i/fd3U19fT2FhIaeffiannno6d911B5WVlVRXV/G1r93AccedAMDZZ6/i/fd3c9ddd9DR0c4VV1xDRcXkDB/F0FFjpEKhUCgSGYvjo5BSyv4qRaNRNm/ezO9+9ztqa2uZNGkSF198MVdffTWGkZbupLq6mrvuuosJEyZQX1/PrbfeSmVlZcq6R44cYc2aNdx2221cfPHFAPzmN79h586dTJw4kQMHDvCZz3yGiy66CIDXXnuNr3/967jd7vg2XnnllbTa1cnRo+1YVr9dkTFKSwtoaGjLdDMGjWp/5ij2CsRr27osjKnQDOQpa2j2pz4HhLDFp1uXCE0gkGBJLCEIRgXBsEX/V5LBk8v9D8PX/traT5g0aVra9Ycre6phaESjufdAobO/Evtf0wQlJfnDup/hGCOznWwfIwdDrl9XRgrVL6lR/dKTbOuTgY6RI5U9NVfGzO791d/4mNZoZhgGX/va1/ja176WVO7z+SgsLEyrYRs2bODSSy9l1apVvPDCC9x+++08+uijPepJKdm4cSOTJyer6+3bt/PDH/6Q/Px8WlpaOPPMM1m+fHm83q233hoXmAqFIhGrb8EIseWpbwh1XaPILaF2L1rJZGTVHmTtQbCiCM3AUz4Tz7QFtAYFppn9F8ljieGYVkPRP8MxRioUCoVidBnqtBrHGmlnv7Asi4aGBqqrq6mpqaG6upqvfvWraa3b3NzMyy+/zMqV9vwjK1as4PXXX6eurq5H3S1btnD++eczbty4pPKHH36Y/Hxb/Y4bNw6v10tjY2N8+Y4dO7jnnnvYsGEDu3btSvewFIpjAA20fp4PaQbQM6hRCGzB+NbzaCUVWO+8iKzeq+IiFYpuDGWMVCgUCoUi20nL0vjLX/6SjRs34vf7k8pFmneI1dXVeL3e+HxVTqeTwsJCqqqqKCsri9c7ePAgH3zwAZdffjlbtmxJ2oamdenbd999l/LychYtWgRARUUFl112GWeccQbNzc2sXbuWhx9+mDlz5qTVPoViLBM0Bd6KWcgjH/VeqWImAVPQ3drodmrwybuIknJk1cdZFReZ6DILFqARNEfeVVah6M5wjJHZHL6RTQgBTpeBw+lASvt7JBwhHIqq816hUChGkLRE489//nO2bNnCCSecEM8QB3DDDTcMW0Msy2Ljxo185zvf6bOez+dj06ZN/OhHP4oPyJWVlfEBtri4mJUrV/LMM88MSDQOd4zLSFBaWpDpJgwJ1f7MIZ0LkQ1HUos+Vx7a9EUUuL10P0IZ8mPV7EecdDbWW3/seyfV+8kvm45372vgdCOcHnC6IfYe/+5yg8ODSLiWpEP3/pdBP/Lgu8jqfbbFUzPwVswib/pChNs7oG2PBsPx/6mv1zCMzEyPkqn9DgVN0+L9PpLn71DHSBW+kR66ruHJc1HdGqaxvh1LgiZgQoGTiiIPgY6QcpFXKBSKESIt0ThnzhzmzZvXo/yaa65JaycVFRX4/X5CoRAul4twOIzP50sa+D766CNCoRCbNm0C4MCBA2zdupV9+/Zx8803A9Da2sptt93G+vXrk57CHjx4kOnTp8e/OxyOAac6z/Yg/2wLNh4oqv2ZpbS0AJaeBwffhdoDcZFFxUyYuoBmv8Rs63l8xV6JsKK252pacZFARyt0tPaIkOz+3dIMpOHCcrjtd8OFZbiRDvtzYlnppOJ4/wthJ/fhje3JItiKIo98ZIvjpefR7JcZtzx0WkPzXAIzGmWo1lDLsohEzLQtWMNFrgT1JyKljLmMto14IpyhjJGd4Rv3338/YIdvXH/99dTV1SV54kBX+EZ3T5yHH3447o2TGL7ROcbu2LGDPXv2EAwGWb16NcuXLx/UcWYSIcCT5+L9mg7C0a6Tx5JQ7wvT4o8wrzyPjrZAxs97hUKROaSUoz5G5iJSWqQKS+qLtETjN77xDdavX88JJ5xAXl5e
vPwnP/kJzzzzTL/rFxcXc9ppp/HSSy+xatUqdu7cyZIlSygrK2PHjh2ccsopzJ07l82bN8fXOXDgAGvXro0/HW1qamL9+vXccsstVFZW8te//pXa2louuOACHn74Ya655hpmz56NaZrs2rWLm266aUAdkU2kcruTZtSeb+8YHQyVK+LQae4wKdY0tJM+DUJgGm4C0kHIb9F7EuVYPKTEfu8nA6vpyMN3/NmIaAgtGkKLBhER+7OIfdcisc9WFMJR9HAvLq8JmO8bFBlOpOFCVMyG+pascpVNRTyB0Cfv2tbamFAfSuIgw3DS0eEjL69QDYp9IKWko8OHYTj7rzwMDGWMVOEb6eF0GVS3hpMEYyLhqKSmNcwEj0Eo2M8DLoVCMSZRY2T/SCkxzShtbc04ne7+V0ggLdH4wAMPsHPnTvbu3ZvkepOYiKY/7rjjDu6++27+/Oc/U19fH3dDvffee9mwYQPLli0D7NTl3/3udzl48CDbtm3DNE3WrVvHN7/5Td5++20uu+wyACKRCN/61rcAWLlyJffccw+zZs2itraWNWvWcPbZZ6fdtmwi8UaTmv1xi5CsmEXx1PnHZIbK3vpEZe0cGCIcgCMfYcViGwPlCwhOPK7PdYKmwFM+E9lwCDFpup0EpzcqZhKwNExPUf+NkRJhhrvEZaRLVCaXBW3BaUXRw1EI+9EKi7H2v9n39qv346lcQKamVo8nEEphDaVqDzRWUbT0PJr9A3sQVFxcSnNzA+3tLcPf6D7QNA3Lyq1zzDCcFBeXjsq+hmOM7ItMh29A5kM4wlGLxnpfn3Ua2sJMGV9IYYEn7e3mctjBSKL6JTWqX3qSTX0ybpybw4cP09BwJNNNyWoMQ6e4uJgJEyYkPXTsd710Kn344Ye88MILOByOpPIHH3ww7R1NmTKFhx56qEf5U089ldwgw2D9+vWsX78+qfyRRx7pddurV69m9erVabdluBhu61dfN5ryyEfQcKTPG82xaI0bqZvvYxEtHOj2vb3fdYJhC8+0Bci3nkebvwJ5tKbXuEimLiDkT1NYCIHsdEHtr66UTBjvpqn2KFokSKHhGtIUIqNBZwKhfq2hk+YTDMeyeaSBrhtMmFA+jC1Nj5F0zx4L162hjJG5EL4BmQ/hyCvw0N/uLWmLy6aj7eha/+dUrocdjBSqX1Kj+qUn2dgnBQWlFGRYx2Zjv6Ti6NHke5Rhmadx8eLFhMPhHgNiSUnJIJo4NhgJ61faN5op3O7GpDVOStxOAZ/sznpXxFxAC9uZHS3dgWZG0Hvr0wSkhI6jRymYvwKrsQpt0RnIqr3IFHGRrUHRh5vrEBACYTixXPlYrnykLhBpuMravvqZucl169I+D/uiej/5E6fheWcHUtNtjwJNj70MiL0nf+8q669+Z51sngdlrFy3hjJGqvCNvjEtSUtY4syTaII+haMm7PoH2iw8BuQ7BPmGSEtAKhQKhaJv0hKN7e3tXHjhhSxcuDApXuOll16Ku4seSwza+mVF0SJBtEjMBS8SjMV42S/nCUuQadxoeifPIeL3YbryQXeMrjVOSpAmwjLBshAyirAsO14r9p3Yd5Hw3WrV8bb74+va65sImfA59r3rs4W25FysNPokr2wmjn1/xYrFvVmGE6k7YwlVnEg9VmY4QeReFsjhQIvYlsZo/gScrTVoaYhGpMR16G2scAfhE1bgMDyIGYsQMxYipURKQcAU/cRFDi+dLrNU7em9Ui9TiIweVnrWUCEQxJIN9Vd/kEihDUKIdqujd6D7wz3EKkIftCgdS14EQx0jVfhGTyKWpCUk8YUlEnB1RJhQ4KTeF+51ndICJ23+CACBKASikgakEpAKhUIxDKQlGt966y0uueSSHuVO5+gkGcg20rIIHnyXwrxirJr9tkiMBO3EH30gsJBp3Ghq0SBFH79gfzVcMHUe1Lf33Z5PdpNXPIVIc1M3YWcOWMgNBgkMLNw2hhBp3XwLLBzt9Wlt0tIdSN0WkJbh
ShKXlu6MuU12LZO6o/+N5gCdlsZI3gQcrbVoET9YJmi9T33haKvFCLRgGS7a9ULwS5KFWPfvI0+nyyyNVcPjKjsiiPQSBznzaV702ZhoNO1z0uw856Kx87LrPX4edp678WVm8jYSv0sLYVpgRgZ9NNYBSBWpKiF9ISp0pN4lXB3FJfDJ4THhRTDUMXIshm8MlqBpi8X2SNd1xWuAYUaZUOylxR9JmQzHaQjKi5x0tAWYUaDREbW34Y/2FJAFDkGe0VM8qjkgFQqFonfSEo3XXXcdl156aY/yxDiJY4m0XM9qD+A4aTrWx12JEKQQsSkF3FiGG8sRe8XK8h3etNzupNCJugvRQ+12wpBxpVhvvdt3e2r24y6bhrPqrQEcaWo6LRcIvcvi0HlT2Pk59r3zszffS0cgmnb9zu0XO7W0+sR0eGifsQJhhhDRMFo0lmQllmxFRMNoncvMiH0DnUbWTgCJwHS6KNQcSN3VTWwmWjY7lzlj7pHZgRAgzSiemXMRM44nz3AjHSCOfIgW9mO5e3H+lxJP7YcABCYenzXHJCW0BgVFS8+zXZcT3BpH3FU2DRy+OkRzS3qJg0wBQkPqTtBHQH5LCdJKW2Qm14nGHx45dUk0FO65jrTiQnagaBPT8yLIZEKjdFFj5NCQUhKIQnPIIpDwVypwCMa5BC7dFniBjhDzyvOoaQ3T0BaOz9NYWuCkvMhJoCOElKBrgkKnoNBpu6t2RCVtERkTj7aABEmz2Y5TWuQ5BE5D7zEHpEOHqSUeCou8WJZEoETkaKEEvEKRfaR1F5hqMAS48MILh7UxuUN6rmeW4cY38zSk4cZy2CKjL1euoKWl5XbXoXkJnnAOSIkWCTDOkV5iEKk7CI6f3kOYpfzch6gbjDtafmkBwUEEBaftimgZRArLeq/TiZQIM4IwQ3FhaQvJzkyeiWIznDA1RBCDIJDeMUihxwVlKnHZ6SorDVfMujkybrOdMWPy4zeQ1ftsS7ZmICbNQFuyCoevg1Av6yZaGUMl04e9bUP6OXZhAAAgAElEQVTBNC2a/eCdMANP2TSkZmA5PKPuKpuEZeGpfQ9Pw17QHfa8mMOVOGiwCNF1Xg9hM6WlBTSlOn+lHKBFtEuIevR+HgZBxhMapYsaI1PT342/lLY1sDkkCcdOBQEUOW2xaHRzJTVNi462ABM8BuVF+Unb7G1+xu4Csj1mgQxEocVv///0kGRehTdpDkiPU2PWRC/1vjAHGgJxgTqhwElFkYdARygn4m1zEV3Xegh41fcKRebJDtNBzqGl5XomdQfRgolpb3XAbndCYDm9SC29xCCW4cZfuTjt9mQDw+6KKERMrDmxXGk2wrKYMM5Bc11TksXStmiGEGbsPXGZNNEjAYgE+t9+525ibrNdcZgJbrPd3Gil4URqjj4FfGLMmOweM1b9MdbRavIXn0M4mCJmLEutjIlICaF2P659LxHJK6Ft9hlkSmBooXbyP9mFEWhBIgiUHkck4rStoYd2Q3V2WUOHDSFAdyB1x4B73mVkf0IjxeDp78a/vjlAfYdFp6epLmCcU1DoEuh9XNekhFAwOqi5GHVNUOQUFMUEpHC7qGkOUJTnpM7XNQekrsGsiV721HYkucJaEup9YVr8EeaV5/UqVBWDRwjw5LmSBDyovlcosoHsuxPMAUYqEUeS2123G01RMQsZm6ex+41mbiQGGRx99cmo3XxrGsLlxfSk6YInJVhmlztsTFCmEpdJy2Jus+lMdg+222xS7KWRGJvpwlFS2m/MmDz0Aa4pi3rEjCVbGWekd9wZQMass4ONtR0OnM2HyDvyNsKKYjq8dExbRjSvBCxJsx9cFQvJn7EIM2oCo584KFsZy9etY510bvyPn5RHXUc7Dg3GuQQFDoE2ipl+dU1QWuRCC4fJK3DyblXXFESlsYQ7qWInAcJRSU1rmAKHHk+8M6ZoC9MWzsw1tcDroLq1/76f4DEG9eBAoVAMHiUaB8FIJuLodLtzVSzE
U7kA+2ZJoLndNDUHUt5o5kZikMHTW59k7c23EKAbWLoBzjzSkprxye4TrZcx99kksZngRmtFY59TO5hqk9KIGas9gGfawuSYsR5Wxt4T5WSczpvMTIhGM0pe1Vu4mg8DECqajL/yJNsNPYaUEAxZFBQW0OxrIxOJg7KVsX7dOpZxuox+b/zrfWFmlLiQ4QgiC6aFSZzKY3yegw9r+n5419AWpqQ8j7qjvWdzzVXqAv6M7buk2EFjU/99X1aYjwxkx39HoThWUKJxEIy09avzRrPrRl5Smmf06oqRFda4ESZVn4ypm++kye7TnJXWMpPEpdbNmjnYmDGHLzesjEBXHOgo/7d1fwv5n/wFPdyBFDr+yYsIjZ+W1XMiZht9XrcmzYDpC3P+unWs4nA6aKxv77NOY1uYiin5tEcyby0SgqQ5IIUQfc4HCcTcbQX5jrF3zrtdBsFQZn4XLc2+NyXs91m4dHDpArcObkNgCJSQVChGCCUaB0m2Wb+yrT2KUUDTkZoH0+FJuXhQMWNS4qnLESsjdkZiGEX3VClxNe7DW7MbISVRdyHt05ZjuQtHZ/9jjFTXLS3sh7qDtNY3YuaVZLqJikEgJWnd+GfLsBQJJ88BKaVMEpGp0AQYGkzyjr15f0tL82gYROK64cDQSKvvwZ6/M2ja07S0YpfpAlw6uHURF5Nqbs7MojLhjh2UaBwC2Wb9yrb2KDJLOjFjYtIMgqEInZeCnLIywqhaGkU0RN6hN3C21QEQLJmBv2Jh1gvrbKf7dctTvQ9Pw14cpaYdG6rIObpb7lKhiewxzIdDUSqKPPE5IJs6kkVkKkoLnETCYzCeMcN0F/CpKC1wQjTKjAKNoAkhUxI0JUHTtkD6o+CPdt3/OLQEa6QucOnKGjlaqEy4YwslGhWKMUo6MWNi8mwiR5ugYGLOWRmBBNE4soOO0dZA/qHX0aJBLN1BR+USIkUVI7rPY5VIYTmehr04W2sIlC/IHmWhSJt0b/yzRXRJmTwH5NH2MCeU58dFZHechqC8yElHW/rZsRXp0V3Adyex73VNkKdBXsxFWEpJVEIwKuNiMmRCxIKIJWmPQKeQ7LRGumPurQ5NCcnhRmXCHXso0ahQjFESY8bEofeQ1fuSYl3F5BOwdr+EVjQVCibmnpWR4cueKgS4nRpuXQIWoBE0BcFQFHfNB7jr99gTe+eV0DF1GZbTO+S2K1ITzRuPpTvRwx1ooXYsd5oxvoqsYSA3/tlC9zkgAeZPzqe6OURDWzhuISktcFJe5CTQEVI3uiNAdwE/kL4XQuAQ4HCKeGYAKSUhC0IJQjJsQSj2udOtVROJQlK5tQ4H6STEUplwcwslGhWKMUxnzFjJcUuxKueTGOvKkT14O1rQPSW5aWWEYcmequuaPZ/lJ+9CTVcyFk/5TLyTj0N21COBQNkcAmUndFk3FSOD0IgUluFqPozTV0NQicacYyg3/pmk+xyQQhAXkYmxWMoyMrJ0F/BD6XshutxSizq3L20LZDAqY66ttltrIAqBBLdWQ9jJdTrFpNlfoK4iCcPRf0KshrYw5UX5SjTmCEo0KhRjHClB6AbN/s4BTyKExFsyCa2sHI+m49YtxIRJmPVmzlgZIdHSOLjBXAhswfjG9mQXXisKVXuQjVWIRWfia25X8XWjSLiwHFfzYRytNQQnHp/p5igGwXDe+GeK7iJSMXqMZN/rQuA1wGsku7V2CslgzK01KqE90uXWWrW/FZdmJ5nrFKLKrdXuv4gFYQvCMUtu2JQcl59bCbEU/aNEo0JxjNFlWfvEnscxPs3BdPTFn0YPO3InMH2IMY1up2ZbGFPFfAKEOpA1ezEqFhIN5UifjAEiBWVIoWH4mxDRENJwZbpJikGgRJciF4i7tWrEp1CR0hY/QVMSitrvYQvb1TUs8dm1EGDHRRpdbq3GGHVr7RTXYbNLHFYfaiMQtlKmXEw3C/ExrrlzCiUaFYpjiL4sa7J6LxytoWjpeTT7c+Tp3xDd
U926tF1S+6J6P57KBQlZiRUjjm4QyS/F2VaHw1dLePy0TLdIoVAcQwgh4nNA4rTLxpfkU1XXZmdqjXZZIwMmBMxkt1ZXbN7IzmytWg4pIyklZszyGrakLRJj7z1vC0zAPmanDk5N2O+6gGi034RYEwqctPnDIOUxb7HNBUZNNFZXV3PXXXcxYcIE6uvrufXWW6msrExZ98iRI6xZs4bbbruNiy++OF7+yCOP8PbbbyOEYM6cOVx77bVpLVMoFKDrgkLDhMN7ugSjpwAxfSHy410Qjdjlh3bjqlhIMCcsayIWpQlx/7eBIK2+57GE2PJcUNBji0hhOc62OpytNUo0KhSKjKNrAo9hv4g5P0StWHxkNyEZjUJHQnykU4tN92HY784BuLV2n+dQ1wQSiSXtsW8ocx5GLdnDrTRs2ungUvZBN3FYVpKH3+dPKYoj4f4TYk0sdPJBVTtuHSZ6NAxdqDkds5hRE40bNmzg0ksvZdWqVbzwwgvcfvvtPProoz3qSSnZuHEjkydPTip/5513ePLJJ3niiScQQrBu3TqWLl3KsmXL+lymUChsiqQftm9FLF9pD2OeArRZS7D+/Hu7bN8bdsVcsqwJYbuoSst+ifQS+IhIEHfjfrTKaUjN6Fs4agb20KxGrNEkXDiJvCpwtNeDZeZOciaFQnHMYGgCo9u0H5GYW2vitB9hy7bWEYuPFCRka40l2zFETyGZOM9hR4ufGaVealpCNCYkl0pnzkNTyiS30s53s5dhTRO20HXFBK4z9t49o2yB2yDYllr8ppMQq7ElgCWhIwoNYZg10UNNDs7p2F3Yh6MWLrcx5sTuqIjG5uZmXn75Ze6//34AVqxYwfXXX09dXR1lZWVJdbds2cL555/Pli1bksq3bdvG6aefjqbZMUxnnnkm27ZtY9myZX0uUyiOdXRdIJsbkffdCg3ViBlzYPLxaMXlmA/dZZfNnAvjy6GpJvcsa0LYzZUW0Lew0AOtuBv24mw5Yk/T4RIwaTpU7+19pYqZdrbZXOqTMYB0eoh6xmEEWnC0NxApnJTpJikUCkWfCNHlnlkYK7Ni2Vo7M7UGTUnUIv6ZsD226IL4vJFuQ+A2ID82z6FpSeZW5LOntu85D9t8fkJRW6Amupf2Kg7p6Vbq1Oy2DIe7aH8JsdwaTM3XqA9aTC/18kEOzumYKOwzIXa7C9aRtM6Oimisrq7G6/Xictn2fKfTSWFhIVVVVUmi8eDBg3zwwQdcfvnlPUTjkSNH+NSnPhX/XlJSwptvvtnvsnQpKckf8HGNNqWluZ16XrU/M0h/O9bTv4GGagCsXz2IdvUtccEIYD2/Fe1LNyKbakAz0A2d0lJvj+0Ib36/ZSNFb/1vajpYJhNK8hCOnglTpJTQVIP1yXu2KI5vcCqibAbCnYd1tCZ1MhxXHtr0RRS4vQz118/V/08nmWi/1TYNub+FwnAjWulxQ9pWrve/QqHITTQh8BjYbq0xTKtLQCZO+9HR6dYakkwqctIem+dwUpEdG9jXnIfVrWE0YVDb0TOGUNBNHMbeU1k3h5v+EmI5dMGM8a5+j6+mNUyJ2yAUjADZkbVWCPDEhH0mxO5oC9asSYRjWRYbN27kO9/5Tkb2f/RoO1YWz8FTWlpAQ0NbppsxaFT7M4cQUHLBFzCPWwS/2ASBDqwH1ndVKK1A//JNWPv+an+vmEl7SBL02cer68J2bf3jVqLnrMNnOlKWjSSp+l8IO/up98RPI5CY4TDBQJRg2LIvzpaJs+UI7oa9GMFYrjtNJ1Q8jWDpLCxXPkRAtyRFS8+DQ7uhOiGbbMVMmLqAZr/EbBvab5/L/x/IXPt1YzxFgFl3mKMl8wedZi+x/Zomsu4hoYr5VyiOLXRNkJfCrTXRGjk+z8GHNfbDzMTPvdHYFmZOeR5NbeG4xdClj544HApOl4PGhv7ndCzJz2OfLxQvE8SiVLp9ppfyzj4YzHqplud5HVS39i92J3iMYc8inQnBOiqisaKi
Ar/fTygUwuVyEQ6H8fl8SXGLH330EaFQiE2bNgFw4MABtm7dyr59+7j55puZMmUKTU1N8fpHjx6Nr9/XMoXiWEdKEHmFBGaeSN6V/w/rP+5IWq59+RtYn+yGQBu48mDqAkJ++8lUpzjsdG3V559M8ZSZiGg4qWzc5Jm0RrRRcxvpmjbkXWTNfqQVRWgGnvKZeKYtIFh1EGfVe2hRe3CxDDfBCTMJlcxAGs6kbZmmRbMfXBUL8VQugFhqnYApCPkt21KpyAimuwjT4UGPBNADLZje4kw3aURQMf8KxbFNoltrp0+EQxPx6SqEEGnNeejQBFMLci/+W8r05nTsLnwl3TK99ztcD3Y8T73evCKDxsa+xXyn2P2kLRQXohrJgjT5e2wqkoQyLSZau75Dnmf0Bas2LFvph+LiYk477TReeuklAHbu3MmSJUsoKytjx44dtLW1MXfuXDZv3sydd97JnXfeyYwZM1i7di0333wzAGvWrOHll1/GsuybuBdffJE1a9b0u0yhUNh4I+1Y//PjHuXy2V8jymbAlONh6Xm0BkX8Ipyvm/DHrbYba2kFuteLePZxxIEPusryC9Cee5wCLTIqxxGfNuS9VxCFE+yrJ9gWwqYaRGsD3omT0KRF1F1Ie+USWuaeS7DshB6CMd4HEoIhi2a/pNkPzX5JMGRlXezEMYcQRArLAXC21vRTOTfpjPlfuXIlYMf8v/7669TV1fWo2xnzP27cuKTyxLh+IUQ8rr+/ZQqFInvpFA/QNedhX+TynIeJx9obWmwuzdlFOrMKNWYVasws1JhRoDG9QGNagcbUfI3KfI0p+RpT8jQm52lU5GmUezUmeTXKPIKJsVepWzDBLShxC8a7BMUuwTiXYJxTUOQUFDoEBQ5BvkOQZ4A39vLonbGn6Yt5IQQRy06IFDLtaVr8UdsduT0i8UUkrWFJS1jSHJIcDUoag5KGgKQ+IKn1W9T4Lao7LKo6LA63W2gOg8a23qczAVuwOpzD5wk2au6pd9xxB3fffTd//vOfqa+vj7uh3nvvvWzYsCH+1DMajfLd736XgwcPsm3bNkzTZN26dSxatIjVq1dz0003IYTgnHPOYfny5QB9LlMojnW6J8Lpjty9C7HiPEKzFuHvZlnzRTQKzlmHPv9kdK8X85GNduKcypmIMy9E+9QqzJ9937Y4HrcI59QFhM2RPR63U4Pqj9Eq5yRnfk3IBsvK8wktOJsO6cndUVQBQLioHPfR/Th8NQTK52W6OcNOLsT8KxSK0ScSjsTnOWzqiPQ752FpgZNIeHQe3g43icfaG4nHl+hmyoCG+OG9H3BotpjtSzhqwnYPnpqvIbHrdlpI7e+y67u0pzuRCXXs77LHuukK1uF8+D1qonHKlCk89NBDPcqfeuqp5AYZBuvXr2f9+vU96l599dW9br+vZQrFsUy+bmI915UIh9IKxJU3I3//K3j3LwBY23+N49o5SJl8SZASfKaD4ikzsZ59vCtxzhOb0b50E+Z//lvXdrf/mrwrp2NW1YDuQBoOZP44pCfPXm6asceJQ3NwcBug5Y/HfOg7XZlfe2SDnYdr3hI6OjJrKtR1YVtrU5T5RtGdN5cx8ydA5VycpZWMd1pIoRM0RVfs6jFApmP+ITeSxQ0GlSApNapfUjPa/eL2WLT4IzS0hZlbkd/nnIdTxrtxGhpDzto2QIarTzqPNduOry9MS1Ja4KSuH7HrcupMnlTYa53BEI5a6QlWQxu23yhrEuEoFIqRwRfRKDn/C5izF8H2x9E//zWi//tzrEu/jn7qufDs44hrvo3PcpDKb1/XBSIaRsyah6icifXEZjuZzo/v6qpUWoF+6XWYD66nMMGa6f/MlwidegEAnu2/wP3as0hdB8OJ1A0wbHGJ4SC48iLCi04HwPn68zj2vhVb7sTM9+KJSDAcaKefjfXq892ywf6/btlgf4t2/EIyeYmLJwt6bivy/C8AIqmsYBQSCOU6duyqgMYo1lt/hG6x
q61BkbXzdqVLrsT8Z3uyuMGQ6wmqRgrVL6nJRL/ouhaf5/BAg5/jJ+VR7wsnzdPYOedhuy8w6tfD4eyTxGNNNadjJo6vP4SA8iIPzX2I3fIiJ81N7cP+kNPlNtKyzgYDIVrTjGnsL1GcEo2KnCCVdUZZbNJDShD5hXRMnE7BRVdiPvoDREM1gY8/RCw4Ge91G/BZDswUEzklJsKRDdW2S+qXbkoWjNjJdEKv/AHpLURM9kA0gohGkHkJT9akfbEXpglmoKeTSNAf/2jUHMD5wa6uVQF37LP1xvNoV9+MmDkX61cPxrLBJiT3Ka1A//I3MA0nhDIzwHRPICSPPxH3jEV4I+1dCYRGyZ032+nr3NYdAt7YnjwdihWFqj3QWEXR0vNo9g+v+81okxjzv2rVqh4x/6eccko85r+TAwcOsHbt2nj21DVr1rBhwwZuvPFGhBC8+OKL3HLLLf0uUygU2U3yPIdepIQp41xMHue0Y+XomvMwl6+D0P+cjtl4fFJCoCPUp9gNdIRGpO3hUJSKIk+f1tnyIicdbYFh26cSjYqsJ5V1RllsBo5VfQjrkTvj37Wjdfbkv6ZBb5nB8nUTnutKhKN9apXtktp929v/G/H5G2jr43cIrL6KwAVXghmNi0qiEYRpv8v8ruQeoeV/Q2TWwnidArdGe0s7IhrBQRSj/jCirALtypux/mND0n60L38DK9hKwFsysA4aRpL6DbD+64d4r/kX5OP/keTO6/3K8YSJJecJBQABhgGafkzEYvZ3bouVn0Gmmj8TbCF5aDeuioUEM/RwYLhQMf/po+sC3eUkEgglPWToXqZQjBX6m+dwLJGLx5pK7BqGRjAQGlGxmwnBKqTKJw9kv+tNrruLDLb93S02fPU2AjNPTLLY8NXb6Bhhi81Y6P+ju9/D/dqzOPa8idbWTGjpOfjXfKXP9YSAgnALjrpDiJKyeNKblIzg75DY/0JAcZ6G1no0HteY1OYFyxFf/CeaIkbGbiCFgAItgv7Jh7DlXgh0Ez6lFehXfBNfk4/w1LkA5G+5B8fHbwEghQDdsF13dQcYBh1rv050xnwAPM89hl6933bf1R0xN14jHksaWrYKa+IUABzv/wXNdzRWz7AzyBpG/Ls5aTrSa8c7CF8TWJYtXGPbQjcoLSsa9v9/Oue2dtUtSC0ETb1kTtUM5ClraPb3/UNn+zyNuUA2jJG6LpBOB3/c18w5M8dhBiMpy9Il16/rI4Xql9SofumJ6pPUjGa/CAFOl4HD6UiyzoZD0QHfAyn31BykN7fLIqd1zLlidrfY8ItNeK/5NvLxB5MtNtfOiVnMFL1hlU3Fv+arGB+/RcGWe9CaavtdR0roqK5lXMmEpDhCSisQV38L+fSWeDKd0fodNE2AvyOlYAQ7GywHPsSRQdfPzgRChR3tOFK5837xHwnv+B3m0r/pWkd3IJ0uiEYRltllje3E6joYveYAjgPv9br/yPGL46LRtesPOPa/22vdtsv/hejsEwHI+91DOPb1rBvVDcbpBu1f+CbRmQsB8G77CXrV/gQB2iUypeEgePoarLKpADjffAGtpTFpuffEpbDz2eRz+6u3IX95f3Js6pduRPYmGq0og593S5FLdIrD+3cepqEjwnETvMwscBDRtKSyqV4DM3qM+3wrFIpjhtG0zqq77CyjN3ct2dyIvuM3fbpi5lrcn15zAM3XhAi0I/ztCH8bWsB+F4F2/Gu+gq9kEgXnrMNROdueYzDQgbz/1q6NxBKwtFZVw0T7BtX72wfRmuvB5UW63EiXB+n02O8uD+FFp8VdIbW6Qwgp48uk02NbWcYQui6Q/naEiKVvHj8JSitwnn9pvKwvzGlzaEJScPYl6McvshPnfOVWfFoe3s/fgH7qR/0m0xlOejxIKK1A+/t/RO74LTImYGWmHiRIiWg9ihw3AV0XGMcvxHqwZyZo6/nfwudvwEw4lzu+8M2EClaC624UYUawvF3xoYHzLifY4QMzgohG
IRpGJLj9mhO6Ep1E5i7HnFDRta1oGMyoLUjNKDJhu9JbgFU4vmt5NGJv14za76Ir8612tAaj9mCvXRFe8mk6nUadb7/UU+T+8deIz30FcfmN8eRK8kf/0rU8Fptq7Xuj132gGdhRPVl2cVMMO7rLyXN7m2josB+kbHmzhmuWT+bxt6viZdv3HOXakycr0ahQKBQjwNi6Ox5lhlukdXfX0o9bhDvmrmX+qO/kGRmJ+wuHEIE2+6bTYcdlOd57Fb2xOib8OhCBNjR/O9FwB0VtPtov/xfMybMA8D75U4yqfb1uXmtrJjp+Ej7TQVFHG8aXvoH14+SU89rfXY/5Pz/BXH1NvMyoPoDecKTX7UZnLsCMica83zyAUXcoabk0HEinLTbbv3gLVql9A+7e8Ss0f3tMiHrjdTpf0akngMtjbyTot/tEz+wp1vkfsJ7eStGSs/Dv24M5/xS0K76J/MufKJg0HR+u3jcQ83WQCHzShXPqgq7EOVELH47kshTJdIab+NyRx9kCVv/81wi99BwiAwI2Eb1qn+02WneI9m8+QJEmkQ+uT+3O++5f0E/9qPdEOJoGThcy9tt0Pwpz0rS02xU6+dy063Z87h97FkrJhGIPjbVN8fMcwP/Z6xBBf0y4JgjMTuFa2iVcwyedQXTqCXGhGn9/6y8Yc09C/8q37eteAto138IKtEKgDxefipkETCUajwUigRDnzBzHcRO8bHmzhkDE4v6dh+PLS/McXLG0An9HCMcwhQOr+EmFQqHoQonGQTISIi2lK2Y3d61ULoB9ic20MjVaFiLYEbfwaf42opNnI/OLAHC98iRG1b6k5SLQjojYaX7brrg9Hmvl2vWHXt3mNED4u24Ao5XH25YNTz7SW4CMvVvefKSnADNmOdR1gX7SCqxuN5UA1h9/h3nNbVjRrv5oX/dPaIE2RCgAoQAiFECEgoiQHxEOYiUkXLFKyjGltOuFY3U73QL9bUlzCjrffw39aO8una3X/yAuMAs334beWG0LUJcnJjC7LJ/+1Vcjx02wt7vrD4hIOEmAdr5wurEKS0DXe91vb/T4X0yZRX7zEXAux/rJD6ChGuO4RTinL0r5v9Aaq8l74j6CZ6wlMu8UAMIpEuekKhtJOl0/nVMXkHfVtzAfXI/0FNBmjr6ABdCa6vA8/2ucu3cCYHnyKdAj8Mcns8Kdd8gIgXA4wO1NKrbGl/WyQk/CJ52ZsjzpP9oN66ktaH/3j8gje5Kzp3biyoOpCwj5czsJjqJ/wqbFe40Bit0GJ0zM4ysnT+a+Vw4n1fnCSeU8+kY1h1qClOU5uOi4Ysrznb1sMTWJglDTbHfY5/Y28TfHlRD1B5PKBho/qRgYSrArFNlHlt+tZCdDFWmitRH9aC1aewuirRmtrQWtvRkzHEA/7TzEzHm2UEzhrqV94XrCWx/FGw4hPfmEF64gf+asnmLzK99G/io57i/v6lmEsS1hnt//HMfHb9puocEORLcrcPvf3UzkhKUAGAc/wLnnrz2OQ+qGnUDD7PKjDs8/1Rac3i4haHkLKJ5cxtGghvR0BdgGzv/ygPu6ByksNlZZJeneRnZ8/qZuByUhEo4LSGtcaVd7/+bvEO2tSQJUxEVpoGsSewChITWtS4B2+JJ2E7jAikss98vb0Fsaem1j6z/ei1UyCYCCzbchOnxJApS4662b4MqL4klNCkIt8NJTXfFhT/zEni7jgdvjZfIP/4P32nkphYv75W0YNQdxfPx2XDRmE2ETzKoaChuqETHr9WgKWOFvw/3ib3Hteg5hmnYCmlM+Q3DlRaDnJVtDr72dFjwZcefNVtI5t+WBD2HpeXBoN1Tvt2MYNQMqZsJUe55Glctt7NIYiPB6TQdv1XcQjEqWlucxc1IBv3qrqkfd5/ce5dzjx/Ofu6qp64iQ7+x60PbL9xvxGhrTilxMLXQx3q0jumUn7oyZfG5vE38zqxgT4gx/tEcAACAASURBVLGS08d7mDPBQ9iUKn5y
FEj8LbonPFKCXaHIHEo0DoLUFsFbkb98IEmk5X9hIqFtj6G1txA85TNEFpwKgOfFrbjeeD7ltuVHbxK59l9xfvW2ZMEIaF/4B6xf3o9xaG/8hzPLZ+CrPI6Cc9ZhTDse+euHbLF5X8+4v7aPP4LZJ9nbam9NsppZ7jxb6HlirwSrQujU8wkvOj22vADpzcfyFIDT1WNagPDyVSmPS5QWIAeRSSpV/NqIWmzsNFRIpytpCgiAyNyT096M7/qNtgCNRmyBGYoJzE4xmji9xLJV9gOEBAHaWY9QMOm30Job0Dpae91v8NTV8c/y4TsRn1mXFDOWlJCltKJX4aK1NOB8+yWkEARPX5P2cY821riJ+C+4AqugeFT3qx/6iPzHvo8W9COFIHTiGQTOvjRuPSbBGuq9bgNiXDHRhraMuPNmKynP7Sv+GZ75JfK91wE7NlVOn0OwYiGeygXY/1NBwBSE/JYSjGMQ05J81BRgV00HB1pD8fL5E71cMG9iXLR15726DlZMG8f6lVM53BKgICYa/RGLPU1BAN6qt+eCzXdqTCt0Mc8XoUSTVBS6kpLsnDmzmBf3N8f38+u3a7nm5Mk8/latip8cYVTCI4Uie1FTbsQYSDrxeDr9Qx/BLzb1mk7ffPQH8RuiwKfXETzrEgBcr23H8d6ryIJxWPnjsArG258LxsH4iRQWFyDvv63n0/cFJ2N+9koCB/bbrqL+djtLYswlsuDgWzhdTqyH70xaTfvGvxHY+xGhitmY5TPssqa6WBKMAqQ7b1DujwNhsOmHk/o6MQGLDCWVtQrviN6AZ0taaREXl8nWTmJloVPOi8edeX/7IFp7K8aCpejllT3+F+Jb9+HLKyOa4n/vefoR3H/ZTnjhitRxbqNMVvR/Zy5rgFCQovv+CbNsKoG/+SJm+fQ+V82K9g+BkWh/ynNbeCnc/1dEJIT14lPDdm6rKTeGzmCm3Bioi2HYtHjgjTp8MbcRQxMsLPWwvDyf6RMLeG5vEy/sbwbsGMarlk/m6Q8b2F1rj8FTx7m59uTJRPxdYtOSkpr2CId8IT5pDXHIF8Yf7fJF8Roat549nT/sbY5vu9hjcMNpUznQHOCJd+oIRJJ9V0rzHFz7qSn891u1tAQiCCHs8AtB/PNJZXksnWR7n7xT7+edBn+8jiaE/Y5df06Jm/kT7IeDe5oC7GkKoiVsK3GdGUUuZo5zA3DIF+KwL5y0rcTP5fkOKmIuuvUdERoCEQQitm07hVTndovdBuPd9oNXR76bgzW+pG11rqMh8Dg0PIYW/80ilkzarkasHbF9dLfq9oXT60r6nT0Oja+cPJlfJQj2VL/zaJDr1/GRQPVJanK1X/obH5VojDGYAdHt0vHW7O1pEfzGvxH8+EOi4QhWQTGyoBizpBxZMK6XLdn0664Fvc6F1+e6C0/G/PwNw58IZwAM9QRy6uDVorQlWGdSlY0UuXoB6PN/MX8ZnH4BzRXzkopFewtFm25ARCO0fv3fscoqR7HFqRnp/u8vqZW+9x08f/gV/s9eGxeIoq0ZmaaFM1f/P52MZPu7n8fuP/433sZDcOHf01pQPiznthKNQ2egY2Q6cypGAmH2t4Qoz3eQ57AfXD72XiPNwSjLyvM4cWJeXJwIAZrLwSFfmO17jnLN8gq0SBRp6EllIhzp8z8jpaQxEOUTX4i6kIWMmPztccVoLgcHW0P88q3auEg8c2YxJ5Tm8ZPXkhOr3XLWdD6q8fG7j5p63c+ZlQV8epqdF+BPn7Ty4uHez58zKgs4O1b3hUM+Xjjk67Vu4nYHUncgbfhLY5BnPmwclvYm1n3psI8/H27rIS47P39qcj6nTSlAcznY3xLi8bdrUwr261dUovXzO48EuX4dHwlUn6QmV/tFzdM4Qui6sGMYf3l/j2XWH55A//wNdAw1EU6arpiDifvLNbIhAUuu0e//4r3X0ZaegVMn6X/h3vk0IhohPGdZVgjGvhCBDtx//DU4XATO/eKgttFn
UqvtWxl3wknwX98FwLXzKfyXXA+QtmBU9E338zh84ko8992E/Hg35s0/tt3gFTlFOi6GM8Z7eK+2jb8caeOcaYWsrLSnfrn4hPG4ddHDOiUlmMEIU70G1548GSsUJmpKiFpJZf0JCSEEpV6H/Yrd2HVuu8ih8aWlFfz41SOU5jlYXlnEz1/vGT/59IcNXDKvlAqvAwuJlPa/15K2KLUkFLm6vHcWTfQyucAZW058HUuCRDLR23WvcFyxG6+h9ajTue7Uwq7kPlMKnJw6OT+pXuLnxERApV4Hc0s8dvti/Zm43U4rI0CR26A839Hrdr1GV4I4QxN4DC1pu5aU8f5I/BWjliSS9OAh+bcKmzL+WwjL4vIlFT0E+xdOKudnu6qZWejkrKn2f6a23bYel3od5Du0AVk2FQpF+ijROAhGSqR1n0ogcS48x4o9WL//VcoYtFGP+1PkBOn8L6wXn8R7wmLC2Dct4v+3d+fxUVTp3sB/VdV7p7N3dsIWAwwhQGSRqCCDyzBX8XWuCFFcBicqvuKMjoqIQD4KvuIdcQRfuYqO+Lpw5w6CongVGMgrwhUEBwFBZJ0AWTp7k/Rede4fnVTSSXdn7aXi8/188km6utJ5+qS7qp4+5zzH1gTttzsAAI6pt0Uk7h7xuKE7sB2SMa5XSWPAolZOK1hLwSB+0DBIYwthMw+G86qZIXgSpD0pKR2erBwIVRcgVJ6HmD0i0iGRHvK3pqJ3iGHbmop/P12HW0en4Mdqm9ybCMDnZ39Ej9hpLpu/bT2OWeAQr1fjvX0XYDaqce+ETLx78JLf+ZPHKpsxJTse2XG6bv3dJL0aSfrufYicadIg09S9qq85CTrkJOi6tW+e2YA8s6HrHQFcMzQBI2K6d61wTZYJ12SZAt7ffjDb1EGxKMwy+U2GJcagbfnfCwKH7CQD/u++C50e7++nazFtWAKamhzytgMVzfiuyjtEWadq+VBAr4bZoILZ4B2ia1AHf10RQromlJSUlEQ6iGhgt7u6XcI5Vi2B37kJaCnWAHMGuEdXAo21gKXlU8maCqgnToVD6tmByskESAkpUE+cCit0EEUGJxNgHDwEUv4UeVt7LomDevhI8DmjgepycA+XwKpLhGr0lW3bipfACl3EylQbjVrYbK7I/PF+oMT4u/O6EOYsgLXZDUnnnXvDOWzgmxrATAlwXnNLhJ9Bm0Dtz4ke6PZuBRNUcF57a48ft9N7+fghaIflgm34t7YKs421YHc/juZBo3s991eJr5/2wh2/Z/BI2KfPBmupGtxX7ePnOA4GQ8+WYiA9O0dKHhHDzUbkJBtxwtIMu1vC/gtW2FqGGnqTsgzUNdhx09A4ZJoi05vc+rro2DM6ryADRyou47ilWY73fxcOwmWnB5YmbxJZ0+zGxOx4SG6FDt8Joj/f7+17/XiOg4r3fqkFDhqBh1bgoVXx0Kl4qHlO/l+8FqDgUXWzG5MHxSE3QYfWF2S13QO3yOAUJThFBqtTREWzG2canDhSbUNajBqpRm/S/l1lM842OOHwSOA5Dlo/vdqBKP04HgrUJv4ptV26Oj/SnMYWvS6EE6biLN0ZHx3peX/BKHV8dyslx68RgFiNhHqX4PO6iGm4BPbuy7h8493w5I73/SVJ8lmjMtICtr/TgYQX7gNTa9Hw7Ls9ftzuFLXiHl3Z5/eykl8/wMCKn+Y09k5v5v1rtSpUOMROayo+NW0IYjkJHk9k19hsfV10LL4yKE6L+ydl4lydA7vP1OGOsWn46mwdbv1FCs7X27s9f1KpIvl+7/i/6G7BI8Dbq9nkllBtc6Pa5mn57sa/5CTIQ4DXf2/BpcttF/MqnoNZr2oZsqzC2BQjYrX+PxxU+nEwFKhN/FNqu0TNnMby8nKsWLECycnJsFgsWLJkCQYN8p0vtXv3bmzevBlZWVk4f/48cnJy8Pjjj4PjODz99NPYs2ePvK/T6cQtt9yC5cuXY/PmzXjxxRehVnsPCmlpafjoo49C
9lzaLy4ul873SBEvp0/z/og/LhHgYuIgtjuAuUTAUfpf0FaXQ6irhAfwTRSjKGEMStVyCGu3VmhPcFUX4fnvbZCuut7vMjfc/U/DyhshRvji9ueKr7oA3tEEz+BRkQ6F9JAgcHDzvN81FT8/WY05eSlAlLyv3HYnZgyLxxXJBp8iOyOSdBiamIktx6rwr3kpkBzOHs2fJD0X6H8xJy8FU7LbCh5JTj8jTzgOJo0Ak0bAsAB1ByemGZEZo/EmlHY3mlwSKprdqGjp1RyZpAfgTRo3/VgLBsgJ5QitBpAYVDzNmeyJnlZRJtErbEljSUkJ7rjjDlx//fUoLS3F0qVLsWHDBp99ampq8Pjjj2Po0KFwuVwoLCzEDTfcgPz8fBiNRuzdu1fed+XKlZg5s21+0dq1azF5cngXIKckjSiZmOgd+sfXVQFuF2LXPQX3yImwT58tL9sR9XjvyZ2TxO73jjIG1blj0O3bBvWpw4A5A/wNsyD95d8677rtfRjmLIQVkas8/HOlOnUYpvdfhCd9CC4/9GKkwyE90HG4Z0feOYGuqFlrL3CRHRcElYA7xqS2SxL7Pn+SBNZfBY8CGZdqxLjUttt2j2/PZGtBIIkxnKxztBTusXt3/rEOPOctGmQ2qDHrigR5Dq4oMQiUTHbSeizYfrquUxXl9tuIMoQlaayvr8fXX3+NtWu9lUYLCwvxyCOPoKqqCqmpbe/e2bNnyz9XVVVBrVYjIyMDALB06VL5PpvNhh9//BFLlrQtYL9p0yaUlpbC4XBg7ty5GDGCiicQEoggcNBPvgbYswV8cyO0/yiFwPNQFUyGQ61WzkceHAcmqMCJHm9vIx8k2fV4oDm2D9r/3gZV5T8BACxjMIT7noS0/oUBW3lYqTxDfgFJZ4Cq4jx4ywVIKdFdyZe06VgIx98Qwy9/qsWDkzKjKgELVZEd0nPh+l/oVTyyY7XIju08r/beMWZ5iGu1zYM6p4hamxs1dg/qHR5ohERvXIzhxW/KYdIIcvGd1u/JehU0Qt9G7ii1p647VZSvSDZEzYdHpGthSRrLy8thMBig1XrflBqNBrGxsbh06ZJP0thq2bJlOHToEF544QUkJyd3uv/TTz/FrFmz5Nu5ubkYPnw4xo4di7KyMsydOxdbtmzx+9iE/NzJy0ns+Ryexa+j2SMg7oP/A+G+P0L6thSm6wdHdE3P7mpdRxExcd4iVKIHgk7bab1FAIAkIfbfF0Go9g6Vk2Li4Jx0E7S/vBns75uo8nA0UmvgHn0VtId2Qfv9HthvuDPSEZFu6ssQQ0KiAc9xyDJpkNWukq3ZbMKlykbU2D2wOkUILQV0rE4RHomhzuFBncODk3UOn8cyG1R4eHwqOI6DyBgqmtww61VytdhglNxT150qytH44REJLCqvgp577jk0NjZi7ty50Ov1uOqqq3zu/+yzz7B+/Xr5dl5envxzdnY2Ro4cidLSUsyZM6fbf1MJhRHM5sBlrZWA4o+s1vhZfQ3EV71LTKhHjEWSKQ7cb+ZD3PByy7ZxSM6bCK6XlUJDpWP7s/oaSNs/ArdgGfDPU0hOSwCarJC2f4SkmUXeoja8AC4pBQAg5k8CO/Ed+Bm3QZhwHTQtc6DZzCKw3LGQ/msjhAeXgktIBrvnMbDTx73bHngWXHwCzP0cv9JEIn427SaIh3ZBd2wfjHOLwfVhrq3S219JQj3EkJBI0Qg8MmI0yGh3yZigU+GZwkzU2T0+PZPVdjdq7R6ouLYKrXV2D9763gIAiPXTM2k2qOUhr0rsqWstRlRr98BqsSE/zYicZAM++EcF7G7JpyiW2ageMB8eKbU3uKfCkjRmZGTAZrPB6XRCq9XC5XLBarUiMzPTZ7/Lly/DZPKe2OPi4lBYWIgdO3b4JI0HDx5EXl4edLq2tYnOnTuHoUOHyrfVajUcDt9PerrSm8pw4aTUSkytKP7Iao0/TiNB2PmR3LMm/b/V4O95HOLmt9u2ffEfYINHoMEdPZ8p
tW//jmsrsivyYR87DQZrg7xNys4FO1gKpyYGtv/1kPdBptwGXHuHt2RqgwNA6zGCg2bQaBgeKkG9pG4pGORvW//Er0QRi980CLHxZggN1Wj4dj88w/K6/h0/qHpqZNBwT/JzoeY5pBrblvZoJUoMtnYFn5wiQ6pRjRqbG1aXCKtLxJmGtiqwmSYNisd6P+hkajX+fqY+6nrqWhddaE2ED5Q34Z9WJ2rt3p5WV/sPg36qwxNXZ6F4UmanKsrzCjLQ3OSEXmibC6rE5CtQz2+4eoPD2WZhKZGYkJCAq6++Wq5+um/fPhQUFCA1NRU7d+7E5cvek/ljjz0m/wwAp06dQnZ2ts9jbdy4EUVFRT7bVqxYgcbGRgDe+Y7Hjh3DpEmTQvmUCFEkq5uHZ8Zs4IFnAb0RsDdDeuN53+GZv3sGl6XoHZ4aI4jAri1tMb/3CgwVp+WEEQBY6SfgZ9zmLZTTesRUa7wJox8uEWhwq3x6PvxtI2HG83AX/hr8g89Cc+KAvFkQOMRppED/TkIIiQoC763o2irLpMGC8alYUpiJhVemYe6oJMwYHIt8swHpMWpkxLSde49fasSgeB3mFaRDr+Zhd0vYeLgSt+WlQq/m5Z46uN3QGLT9fjx0eCSUN7lwtNqG0jIrNp+sw/rDVVi1vxyn6ts6Zs42OPBDjR2VzW64RAa9ikemSYN8swG3XJGAGL0GGw9Xdnr87T/V4JJTwmen69Ho9PgkX7zW2w7+tkWT9r3BpWfrUWZ1QatVod7u9tkmqEIzcivcbRa2roTly5dj5cqV+Oqrr2CxWPD8888DAP785z+jpKQEEyZMwNSpU/HEE09gyJAhsFgsyM3NxZ13ts1jqa6uhsPh6JRITps2DYsWLcLgwYNx4cIFPPnkkxg1ikq0E9JR63IxumFjYVDoEhNWNw/TjNkQrsiX11b0eR7mDPDzHkODqIY4qzhygZI+8xZsuhZs18fQj8yDzeWEoNd55+Ru3wLTjNmKmH/bHQNpWSpCSHA8xyFJr0KSXtWyzEdnBjWP7y42IjvJgLsLMrDlWBXunZCJby804l/HpCIzVguVKMKjUmFXL3u0XKLknYtp9yDFoEZyy3qWn56ux6HK5oC/V+9o69mckB6DUcl6JOq8z8eg9iZIXVVR/qGqGeMzY2FnwMYTtbh/UhbeVNBQXCDIvM1vQt8bHInhyxxj0djZG340PDW0KP7ICja808eYSRDnLIy6C3F/7a/TCt4exo6J76JXYY1Nj/jC4e0NpNdPuHR8nXL3L4Zt+HgYxOa21+4Dz6K5G5VtlTA89YEHHvBZlmrDhg2dlqX629/+hgkTJvgsS/WXv/wF+fn5eP75532qjLcuS1VQUIDNmzcjMzOzT8tSRfs5sjeU/r4MFWoX/yJ1HGQaNT44XIlbfpGCdw96k5F5BelwixJGp8Zg7d4yVDe7UTwpM2iCUNHkwrkGJ2odHu9QUrsH1nYHz+uHxOKarFgAwO5/NmLvpctI1KmQqFchSa9GkvyzCjFqHhzHBW0TjUGL7afrUHq2HoD/KsrZ8TrMGZuKRoeIk9XN+P8t++rVfMtQ3Eo5+cqO1+HBSZlw25x+/157jDFIzFvZtrV6rSgxNDhFiBKDyFq+JLR8Z+A5DsMTvFPfnB4JR6tt8HTYR2x9TJ7D9MFx4DjvMNRzDQ7855Eq2N2+1x1moxoPTM6CTvRAFBlcooTDVTaoeA5qgfN+59u+awQO5pbEXWLev6fi2oYCB2vfvrYZ0PX5MXomLRFCQi5owggoZokJQeBgcDeBfbi2033s8w9obcUBIEYQge1tw5DZh2u8veMfrm177Q6Qyra0LBUhpKP2PUm35aXi2wuNcjLw0dEq3F2QISeMgLdH674r03GgrAF1dm9iOCJRj0ktVXtO1Tuw659Wn7/Bc95CPkk6FeK1bcfRa7JiMS07Fnwfxrx2t4oy53LD4fHg+mHxyE024P0ARXOKxqXhP7+vhMMtougX3pUVLrtE
vPW9xSeha/0Z8BYbenxSOgCgyS1i7aHOw2RbmTQC/tiyr0OU8NmZhiD78pg+OM47esvqwPl6B+4uyMCb+y/67Fc0Lh2bj1Zhzgjv8ix2j4TPzwZ/3D9OypCf2yvfeuPtmFyqeA4pRjXmjE3FFUHarL8LDSn7TEsI6ZGOF+JKXGJioCS+JLjuDEPmfvcMrJIaUM7Kon4pYVmqaOyd7Q9UVdc/ahf/wtkuNpeIbSeqUd3sxnvflePOcem4ol0l0vYJitmoxh1j07Dum4s+Q0ETTVo55rGcAFEQkBKjgTlGgxSjBokGNQS+b5Mhu2qTOKMWOckGxGhVALR+tyV6cyrEG7UoVgtYs7fM5zGKxqVj4+FKlDU4YNIK8t/UOjxodFb4/bs8B6hUvLyvzulBsrEWKp6D0JJ4CRwHVUuPX4xGJe8b4xZx9RCHz77tf9a3e9x4UUJMvB7r9l3oFMOuM7WYNToF5gS9HMM1Qx1wiwxuUWr57u2BdEsMMZq254YmF1Q8B4/E5K/2RACJJp23zTQC1nzt22a/m5SFpBit3Ob9ITqvCgkhIeFzIf7Ff4ArXgIrb4RhzkIIU056t0X5hfhASHxJ1wbC/NtQicSyVDQ89eeD2sW/cLcLx8Gnp25wvBZaDiienNUpQSgal45txy0wqnhkpxnloaSpRrUccxyA6RnGlt9ggN2JOnv3hi0G0pM2sXexrbVndePhzkngrtO1uHNsKirrbFDznPw3JcbwhwlpEHgOAgcInDex4znIvaTt43tkfPAPytrve0NW8A/KqqsvyzGvCzBv81hlM6ZkxyMGTB42fH1m14/b6tnCTEjMmzC6pXbfW7pS6+qavG32j85t9ukJC+bkpfRonmtXw1PDUj2VEBIdWi/Em7PzwB4qQSNngMcjddoWzRVDfSrAZueAe3QlrKY0iHMWtm2L8gqwpHuCDkPe9j4MrG8XPNGi/bJUAIIuS9Wq/bJU7QValqq93ixLRQgJr47rnQoeD5wMfhOEXWe8SdV9Y8y4JScBhVkmjEzSI0GnjA9Ouyqac6yqGbV2D4YnGZBp0sjbeY5DvE4Fk0aAQS1Aq+Kh4rk+DavtUdxaDXa1WxbFbFRj0XVDMCbNKO/z5U+14DW9/z/wHAeNwMOoFhCnVSFZr0ZajAaZcdrgbVbZ3O+VWylpJORnSMlLTAyExJd0rVvDkMtOQhOaSuZhRctSEUICET0iJKcLokoV1gQhnAIlX3n9mHyFQuu8zeJJmciO12Fh4SCYIOHuggx5W3/PK2wViTYTSkpKSvrt0RTMbndF5aKhrYxGLWy2/n/RhQvFH1kDMX6RAQ6J93nf+tsWDQZi+4darFoCv3MT8MNB7wZzBrhHVwKNtYDlkndbTQXUE6fCIQX//LN9/BzHwWDQBN0/EgoKCvDGG2/gwIED2L9/P5YtW4b4+Hg8+uijGDNmDDIyMtDQ0IC3334bx44dw+bNmzFo0CAsWLAAguC9UKyursbOnTt9lqoCgMbGRrzzzjs4evQoNm3ahHvuuQfXXnttj+KL9nNkbyj9fRkq1C7+RbJd1Hotdp6px3GLt+qo2ajGwquz0ehww9LkTRpqmt2YmB0PyR2+yfz91SaSR8RwsxE5yUbUNLux4Kos6CQRo9NM8rbfTcwAXO6oOw4xj4QErQoTs+MBlxuiyBAfq4dGFH229bdQtFlX50dacqNFtM/XUPocA4o/sij+yKL4e47jABPvhlB20nf+LXP6bOtOr7ISltyIdtF+juwNpb8vQ4Xaxb9ItgvHAbxWjTKry6cSKVMJPtu4ECUogfR3mwgqAbxG5e1ZbXke/rZFu3C+VvqzzWjJDUIIIYrTOgxZk50Hw0MlsEpqiB4JVnTYppCLCEII6a2O8xslp8tbDMUj+WxT+vFQ9Iid1pn0t420CWebUdJICCEkarlEtFTBZUG3EULIQEdJFYkkKoRDCCGEEEIIISQg6mlswfdxcdNwUEKMwVD8kUXxRxbFH1mt8Sv9
eUTKQG23gfq8+oraxT9ql86oTfxTYrt0FTMVwiGEEEIIIYQQEhANTyWEEEIIIYQQEhAljYQQQgghhBBCAqKkkRBCCCGEEEJIQJQ0EkIIIYQQQggJiJJGQgghhBBCCCEBUdJICCGEEEIIISQgShoJIYQQQgghhARESSMhhBBCCCGEkIAoaSSEEEIIIYQQEpAq0gEQrxUrVsBut8NoNOLHH3/EggULMGXKFDz99NPYs2ePvN/MmTPx7LPPAgCsViuWL18Ok8mEyspKLFy4EGPGjFFM/MHui4bYJUnC66+/DqvVCrvdjjNnzuDNN99ETEwMXC4XSkpKwPM8qqurceedd2LatGlhj7238a9duxYffPABBEEAAIwfPx6vvfZaVMWfn58Pk8kk79fY2IgPP/wQ+fn5KC8vx4oVK5CcnAyLxYIlS5Zg0KBBion/3nvvxenTp+X75s+fj/vvvz8S4QeM/+DBg1i3bh1ycnJw8eJF3Hjjjbj11lsBKOPYEyz+aDn2kNBS+nk1FJR8rg4lpV8HhILSry1CRenXLH3CSFR46aWX5J+3bdvGfv3rXzPGGFu0aFHA3ykpKWEbNmxgjDF28uRJduONNzJJkkIbaAC9iT/YfeEUKPZ33nmHbd++Xb7v6NGjzOl0MsYYW79+PVu5ciVjjLHa2lpWWFjImpqawhh1m97Ev2bNGnbhwoXwBhpAoPife+45ebvT6WRFRUXy67u4uJjt2LGDMcbY7t272b333hu+gDvoTfzR8tpnLHD8N998M9u2bRtjjLGamho2atQoZrVaGWPKOPYEiz+a2p+EjtLPq6Gg5HN1KCn9OiAUlH5tESpKv2bpC+ppjBJPPvmk/PP58+eRm5sr33755Zfh8XjAcRyKi4uRkJAAANi6dSs2bdoEez+S9wAACNRJREFUAMjNzYXb7cbhw4cxfvz48AaP3sXf1X3hEij29957D3/4wx/wyiuvoKGhAbNnz4ZGowEAfPLJJ1i0aBEAIDExEcOGDcPu3btx8803KyJ+AHj77beh0+ngdrsxf/58ZGRkhD12IHD8S5culbd/8cUXuOmmm8BxHOrr6/H1119j7dq1AIDCwkI88sgjqKqqQmpqaniDR8/jBwCbzYZVq1aBMQaDwYDi4mLo9frwBt4iUPwpKSmora0FANTU1IDneUiSBEAZx55g8QPRcewhoaX082ooKPlcHUpKvw4IBaVfW4SK0q9Z+iTSWStpc/ToUbZgwQJ21113sdraWsYYY19++SWzWCyMMca2bt3Kbr/9diaKIquvr2e5ubmssbFR/v3bb7+dbd26NSKxM9az+Lu6L9Kx22w2NmLECPbqq68yxhgrLy9nV111FausrGSMMTZ+/Hh2/Phx+fd///vfs3Xr1kUkdsZ6Hv++ffvY2bNnGWOMHTx4kE2fPp3ZbLaoib+j3/72t3Iv0bFjx9iVV17pc/+UKVPYoUOHwhKrPz2JnzHGtmzZIn8ivW7dOrZw4cKwxeqPv/gtFgubPXs2e/rpp9ktt9wif7KslGNPoPgZi65jDwktpZ9XQ0HJ5+pQUvp1QCgo/doiVJR+zdJblDRGoV27drGZM2fK3f2tJElieXl57OzZs1F9cutO/B0Fuy+cWmOvrq5mubm57MSJE/J98+fPZxs3bmSMRe/Jorvxd3TjjTeyPXv2hCvMgPy9dn744Qe2bNky+XY0H4C7E39HlZWVbOTIkczhcIQjxKDax3/HHXewTz/9lDHmvTD4zW9+w6xWq2KOPYHi7yhajj0ktJR+Xg0FJZ+rQ0np1wGhoPRri1BR+jVLT1H11CggiiKam5vl29OnT0dFRQV++uknnDt3Tt7OcRxUKhWcTifi4+NhNBrl4VcAUFtbi8zMzLDGDvQufgBB7wuXQLFXVlZCo9HIk7kBQK1Wy/FlZmZGddt3FX/7tm+9z+FwhCfodoK9dlp98MEHuOuuu+TbGRkZsNls8nNxuVywWq1R1f7B4ne5XCgvL5dvq9Vq
SJIU9tc+EDj+EydO4PDhw5g6dSoAID09HTzPo7S0VBHHnmDxA9Fx7CGhpfTzaigo+VwdSkq/DggFpV9bhIrSr1n6ipLGKFBRUYFly5bJty9evAiPx4OMjAyfsdNHjhxBTEwMhg0bBgCYNWsWvvrqKwDAqVOnIAgCxo0bF97g0fv4g90XLsFi/9WvfoUDBw4AAJxOJ44fP44pU6YA8G37uro6nD17FtOnTw9r7H2Jf/HixXC73QCAyspKWCwWjB07NqriB7zVx6qqqnzm3SQkJODqq6+Wq/nt27cPBQUFEZkb0Jv4LRYLVq1aJd/+5ptvMHr0aMTGxoYv8BaB4s/MzER8fLxc4dXpdOLixYtIT08HEP3Hnq7ij4ZjDwktpZ9XQ0HJ5+pQUvp1QCgo/doiVJR+zdJXHGOMRTqIn7umpiYsWbIEBoMBsbGxOH36NObOnYsbbrgBixcvhsvlQlJSEsrKyvDwww8jPz8fANDQ0IDly5cjLi4OFRUVWLhwoXyfEuIPdl80xF5fX48VK1YgKSkJVVVVmDFjBmbNmgXA+0nR8uXLwfM8ampqUFRUhOuuuy6ssfcl/tWrV+Ps2bPIyMhAWVkZioqKIlIqPFj8APDOO+8gLS0NM2fO9Pm9ixcvYuXKlTCbzbBYLFi8eDEGDx6siPjb/47RaER5eTmefPJJDB06NKri/+abb7B+/XoMHz4cZWVlGD9+PB588EEAyjj2BIs/Go49JLSUfl4NBSWfq0NJ6dcBoaD0a4tQUfo1S19R0kgIIYQQQgghJCAankoIIYQQQgghJCBKGgkhhBBCCCGEBERJIyGEEEIIIYSQgChpJIQQQgghhBASECWNhBBCCCGEEEICoqSRkAFg7969uPXWWzFixAjMmzcPDQ0NkQ6JEEIIiQp0jiSk72jJDUIGiP379+Oee+7BDz/8AJVKFelwCCGEkKhB50hC+oZ6GgkhhBBCCCGEBEQftRDyM3DkyBG89NJLYIyB4zg89dRTyM/PBwC89tpr2LNnDzQaDZKSkvDMM88gJSUFf/3rX/HRRx9Br9dDp9PhqaeewvDhwyP8TAghhJD+RedIQrpGSSMhA9zly5dRXFyMNWvWYPLkyTh48CCKi4uxY8cOWCwWfP7559i2bRs4jsMLL7yAc+fOwWg0YvXq1fKJ8t1338X3339PJ0RCCCEDCp0jCekeGp5KyAC3e/duxMTEYPLkyQCACRMmIC4uDrt27YLRaERNTQ22b98Ot9uNJ554AldeeSUEQQAAfPzxx7Db7bjrrrtw8803R/JpEEIIIf2OzpGEdA8ljYQMcJWVlUhMTPTZlpiYiMrKSqSnp+ONN97AJ598guuuuw6rV6+G2+2GTqfD+++/jwMHDmDGjBlYtmwZmpqaIvQMCCGEkNCgcyQh3UNJIyED2OXLl5GWloa6ujqf7XV1dUhLS4PdbkdOTg5ef/11fPzxxzh8+DDWr18Pt9uNpKQk/OlPf8KXX36JxsZGrFq1KkLPghBCCOl/dI4kpPsoaSRkAGtsbMT27dvR3NyMb7/9FgBw6NAhNDY24pe//CWOHDmCNWvWAADMZjOGDh0KURRRVVWFpUuXAgBMJhNGjRoFURQj9jwIIYSQ/kbnSEK6jwrhEDIAHDx4EGvXrgUAPPbYY+A4DgBgt9uRlJSEt956C6tWrYIkSeA4DuvXr0dsbCyGDRsGi8WCefPmwePxIDk5GYsXL4ZKpUJcXByKiorA8zy0Wi1WrFgRyadICCGE9AqdIwnpO44xxiIdBCGEEEIIIYSQ6ETDUwkhhBBCCCGEBERJIyGEEEIIIYSQgChpJIQQQgghhBASECWNhBBCCCGEEEICoqSREEIIIYQQQkhAlDQSQgghhBBCCAmIkkZCCCGEEEIIIQFR0kgIIYQQQgghJCBKGgkhhBBCCCGEBPQ/+E80Qt12PkUAAAAASUVORK5CYII=
" />
</div>
</div>
<div class="output_area">
<div class="output_png output_subarea ">
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA4YAAAEpCAYAAADPpjwfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzdd3gU1frA8e/sbN9s+iahNwGlSlVAioBXBQERRURUECug159yFVAkKHBRERSsFMWCcvGK0hTEAlJEBKUqCkFBElJITzabbfP7I7CYS1skySbk/TwPz8PZPTPzzskmM++eM+comqZpCCGEEEIIIYSotnShDkAIIYQQQgghRGhJYiiEEEIIIYQQ1ZwkhkIIIYQQQghRzUliKIQQQgghhBDVnCSGQgghhBBCCFHNSWIohBBCCCGEENWcJIZCCCGEEEIIUc1JYiiEEEIIIYQQ1ZwkhkIIIYQQQghRzelDHYAQQghxIV5//XVWrVqFqqqkpqZit9u5/vrreeihhzhy5AgPPvggf/zxBwBDhw4lKyuLLVu2YLPZGDduHI0aNWLOnDmsX7+eunXr8uyzz9KqVSsAkpKSmD59OocPHwYgOzub5s2bM2bMGNq1awfA0qVLeeGFF4iJiQEgPz+f1NRUAAYNGsS0adMA+Pjjj1m0aBHp6en4/X7q1q3LyJEjueaaawB4+OGH+fbbbykqKqJNmza0a9eOb775hrS0NHr27MkzzzyDxWKpsHYVQghRvUiPoRBCiCpt7dq1jBo1imXLlvH1119jt9uZO3cus2bNomHDhixYsCBQ9/fff2fWrFk89NBDJCcn88QTT7Bq1SpeeuklhgwZwr59+xg3blyg/uHDh8nMzGTZsmWsWbOGiRMnsnnzZh544AHS0tIC9YYOHcrKlStZvnw5devWBcBgMDBs2DAAXnvtNSZMmEBOTg5r1qxh5cqV/P7774wZM4aPP/4YgNmzZ9OyZUsAdu7cyaBBg/jkk08wmUwsX76cDz74oNzbUgghRPUliaEQQogqbc6cOVx//fUA2Gw2rrrqKgDWrFlzSt1evXqh0+moX78+AE6nkxYtWgAEXktKSqKgoACAK664gvnz52M2mwG49tprAcjLy2PTpk0AXH311QwePBiAhQsXsnXrVgBGjRpFs2bNcDqdvPnmmwBceeWV2Gw2oqOjA72Ss2bNQtO0UnFecsklNGzYEJPJRL169QDYsWPHhTSTEEIIcVYylFQIIUSVlpGRwXPPPceBAwcAyM3NBSAlJeWUujabDSjpzTvba7m5uYSFhaHX61m8eDHr1q0jOzsbq9UaqHNi/1FRUQDs37+fWbNmAdCqVSvuv/9+AA4cOIDL5QIgIiIisH1kZGQg/tTUVGrUqBF476/19PqSS3VOTk6wTSKEEEKcN+kxFEIIUWUdPXqUESNGsGbNGm666SY+++wzhgwZAnBKL9z5OLHt888/z8svv0x6ejpLlixh2bJlp9QB8Hg8PP7447jdbsxmM8899xyqqv7t4yuK8re3FUIIIf4O6TEUQghRZe3atQun0wlAz549AfD5fGW2/++//x6Atm3bEhUVdcZ9v/LKK/z8888AjB07loYNG5KZmcmRI0do3LgxZrMZl8sV6M2EkolsABwOBwkJCWUWsxBCCPF3SI+hEEKIKuuvwy8PHjyIz+dj27ZtZbb/EwnbH3/8gd/vDySKf7Vjxw7mzZsHQOfOnQMTzhw4cIAPP/wQq9XKAw88AMCWLVsoLCwkKyuLXbt2AfDoo49KD6EQQoiQUxMTExNDHYQQQgjxd8THx+P1ejlw4ACbN28mJSUFm80W6L2bN28ea9euJS8vD4A9e/ZQp04dEhMTyc/PB2Dr1q3UqFGD559/nsLCQgC+/fZbunTpQqdOndi9ezdJSUls27aNunXrsnbtWqAk8fP7/WzcuDFwvIKCAj788EPe
eecdPvvsM+rVq0fv3r3p0KEDNWrUYP/+/bz55pssWrSI+vXr89RTT3HDDTcAJctV/PDDD3i9XrKyssjNzeX7779n7dq1gdd+/vnnwEQ7QgghRFlStAt5CEMIIYQQQgghRJUnQ0mFEEIIIYQQopqTxFAIIYQQQgghqjlJDIUQQgghhBCimpPEUAghhBBCCCGqOUkMhRBCCCGEEKKak8RQCCGEEEIIIao5SQyFEEIIIYQQopqTxFAIIYQQQgghqjlJDIUQQgghhBCimpPEUAghhBBCCCGqOUkMhRBCCCGEEKKak8RQCCGEEEIIIao5SQyFEEIIIYQQopqTxFAIIYQQQgghqjlJDIUQQgghhBCimpPEUAghhBBCCCGqOUkMhRBCCCGEEKKak8RQCCGEEEIIIao5SQyFEEIIIYQQopqTxFAIIYQQQgghqjlJDIUQQgghhBCimpPEUAghhBBCCCGqOUkMhRBCCCGEEKKak8RQCCGEEEIIIao5SQyFqAQ2bdrEgAEDaNq0KcOGDeO2226jb9++vPvuuyGNa9euXQwYMICePXuGNA4hhBACYO3atYHr5YoVK055v6CggHbt2nH11Vcze/bsEEQoRNUliaEQlUCXLl2YMGECAAsXLuTDDz/kpZde4vnnn2fTpk1n3bZp06YcOXKkXOJq1apVIC4hhBAi1K655homTJiA2WzmvffeO+X9Tz/9FK/XS//+/Xn44YfLLY6ePXvy/fffl9v+hQgFSQyFqKQaN25MkyZN2LBhQ6hDEUIIISqVPn36sGfPHnbt2hV4TdM0Nm3aRMuWLUMYmRBVlz7UAQghzqygoICvv/6aDz74gH79+jF16lTmz5/PvHnzGDhwIAcOHADg0UcfxWQyMWPGDOLi4liwYAFffPEFqqpSv359nnzyScLCwkhMTGTlypUMGzaMAwcOsGfPHgYNGsRDDz3EW2+9xerVqzGZTJjNZh555BGaN28eiGX+/PmsW7eO3NxcZs+eTYMGDULVLEIIIaq5mjVr0qtXL959911mzJgBwMaNG+nSpQtr1qwBSq5bb775JmFhYSxcuJAFCxawZs0a7r33Xux2O2+++SatW7fGbreze/duYmNjeeWVVzCZTABs2LCBV155BYPBQFhYGJMnTyY+Pp7x48eTkZHBtGnTCA8P54knnqBFixYhawshyowmhKgUtmzZojVp0kTzeDyB8mWXXabt2LFDa968uXb06FFN0zStuLhYGz16dGC7Jk2aaH/++Weg/Mknn2h9+vTRnE6npmmaNmHCBG38+PGB94cNG6aNGDFC83q9WlJSkrZkyRJt+fLlWt++fQPbzJ8/X5s9e3YgjubNm2s//PCDpmmaNmnSJG3ixInl2BJCCCHEmW3ZskWbPXu29v3332vNmzfX0tPTNU3TtEcffVQrKCjQhg0bps2cOVPTNE1btWqV1qNHDy0/P1976623tHXr1gX2M3v2bO2qq67ScnJyNJ/Pp/Xt21dbsWKFpmmadvjwYe3yyy/XkpKSNE3TtPfff1+76667AtteffXV2pYtWyrojIWoGDKUVIhKZvjw4dx2223MmTOHl19+mdatW9OlSxeWL18OwPr16+nWrdsZt1+2bBnXX389FosFgJtuuonly5fj8/kCdbp3746qqjRs2JBbbrmFpUuXct111wW2GTx4MNdee22gvtVqpX379gBceuml5fZMoxBCCBGsjh070qhRIxYvXszhw4dxOBzYbLZSdfr06UOzZs3417/+xR9//EH37t1Lvd+6dWsiIiLQ6XQ0btw4cH1buXIlLVq0oGHDhgDccMMNfPfdd6Snp1fMyQkRAjKUVIhKZuHChej1pX81BwwYwKuvvsp9993H559/TmJi4hm3T01NJTo6OlCOjo7G4/Fw7Ngx4uPjAbDb7Wfdxm63l6oTFhYW+L/RaMTj8fytcxNCCCHK0rBhw3j55ZfJycnhzjvvPG2dCRMm0KtXL1577bVT3vvr9c1kMgWu
b6mpqSQlJXHHHXcE3q9VqxaZmZnExcWV8VkIUTlIj6EQVUCvXr1IT09nw4YNKIpCeHj4GevWqFGDrKysQDkrKwuDwUBsbGzQ2zidTg4ePFg2wQshhBDlpH///ng8HpKTk6lXr95p63zyySfcfvvt/Pvf/6aoqCio/daoUYMWLVrw3nvvBf598sknNGnSpCzDF6JSkcRQiCrAZDJx3XXXMX78ePr06VPqPavVisvlYtmyZaxevZqBAweyevVqXC4XUDJ1d//+/VFV9Yz7P7HNiQvmO++8I7OhCiGEqPRMJhPTpk3jkUceOe37+/bto7CwkIkTJ9KgQQNmzpwZ1H779u3Lzp07SU5OBiAzM5M77rgDv98PgM1mw+VysWXLFt55552yORkhQkxNPNuYNCFEhdi0aRPTp0/n2LFj/PDDD9SuXZtatWqVqhMeHs6KFSuYPHlyqSSvoKCAN998k99++42RI0fSpk0bioqKePHFF1m6dCnh4eE8+eSTGI1Gnn/+edavX88vv/yCx+OhTZs2QMlaiC6XixkzZrBs2TJUVeXhhx/m4MGDTJw4keTkZFJTU4mJiWH69OkcPnyYnJwcunTpUqHtJIQQono7cb3cuXMnHo+Htm3b0rBhw8ComMcff5wffviBP/74g8LCQqZPn47BYOCGG25g7ty5rFu3jqSkJDweD2+//TYHDx7EYrHw888/89FHH7F//36io6Pp0KEDl112GVOmTGHZsmWsWbOGCRMmBK7Nfr+fN954g+3btzN8+HBiYmJC2SxClAlF0zQt1EEIIc4tKSmJ999/n0mTJoU6FCGEEEIIcZGRoaRCVHIrV67E6/WydOlSBg4cGOpwhBBCCCHERUgSQyEqub179zJw4ECys7Np1apVqMMRQgghhBAXIRlKKoQQQgghhBDVnPQYCiGEEEIIIUQ1J4mhEEIIIYQQQlRzkhgKIYQQQgghRDWnD3UAFSk7uxC//+J8pDImJozMzIJQh1GlSJudP2mz8ydtdv4utM10OoWoKFsZRlQ9lMc1sqp8/qtKnFB1Yq0qcULVibWqxAkSa3koizjPdX2sVomh369dtIkhcFGfW3mRNjt/0mbnT9rs/EmbVbzyukZWlZ9lVYkTqk6sVSVOqDqxVpU4QWItD+UdpwwlFUIIIYQQQohqrlr1GAohqiabrwhcrnNXNJspVC3lH5AQ4rR8epVirz9QPpLlxIdySj2TXofq9VVkaEIAp35G4fSfU/mMiupIEkMhROXncrG301XnrNb8u41gk8RQiFAp9vrpMW39Oeutm9AdawXEI8T/qsjP6OmS0NORJFRUFpIYCiGEEEIIUcbkixJR1UhiKIQQQghxAaRnqHqQn7O42EliKIQQQoRYSkoKU6ZMITY2lvT0dJ588knq1KlzSr3t27eTmJhI9+7dGTt2bOD1wsJCJk6ciNVqRafT4fF4mDx5MkajsSJPI2iaBihw+/zt6BSFuXe0xmJUmfrZbxh0Cv/s3QiTXscnPx3FqNdxbTMHelXHr6kFmA066kRb0CkKmqahKKc+w1jRpGeoepCfs7jYyaykQgghRIglJiZy00038cwzzzBkyBAmTpx4Sp2kpCR27NhB06ZNT3lvyZIl5OfnM2XKFJ555hmOHTvGRx99VBGh/y2KAn5NY396Ib+mFWDU6/BrGst3pPLfH4+i6kqSvudW72fyil8B0DSN4W//yC1vbsPn19A0jW4vbKLXzM14fCW9OPe/t5N/Lt6N93h59tcHmbfhEN7jU7x/vS+D75Ky8Gsl5bQ8F/kuL5pWNaaqFyV8ehUnyjn/+fRqYJuk9EJeW/c7lXFVAo+vJCivz0+Byxv4fFZWwbT/kSxnqfYXVYP0GAohhBAhlJ2dzcaNG5kzZw4AnTt3ZsyYMaSlpREfHx+o16hRIxo1asS4ceNO2UdcXBzZ2dn4/f7APis7BfjvAx1wur2oOgWfX2NSv6Y43T70OgWvz89NbWpQ7PWjV3W4vX4a
x4fh9voxqDqKvX6KvX78mobh+Ps//ZmLXqeg6hQ8Pj/vbzmCqsA9V9XF6/Mzbukv6BT4blxXfH6Nfq9sRQEOvNgHv6bRb8732Ewq/7mvPRow7uOfsZn0PH1DEwDe3XIEm1FlUNsaAPx0OBebSU+dBHvI2rE6Crbn7qtx3XA73cSEGXnpqyS+/z2HwZ3rVUCE5yevyENEmIE9Kfnc995OWtUOZ/6dl7MvNZ+pq/bTqnY4/7r2Eg4dK+TNL5No5LDRv3UCOU4Pu5PziLObaJoQhtfnx+3TsBh05dqTLj2nF68KSwyDGSYzd+5cDhw4QHR0NAcPHuSOO+6ga9euQMk3hS+88AKZmZkUFBTQq1cvbrrppooKXwghhCgXKSkpWK1WTCYTAEajkfDwcJKTk0slhmfTt29f9u7dy4gRI1AUhWbNmnHLLbeUW8xnetZKO83SFGeiKAp1o0/OIqzqFPq0PHm+elXH49c1DpSNeh3v3t02UDbpdWx+4iqc7pJnuXS6kiGpLo8P5fgw03/2aojb60dRFLx+Pz2bxuI7PvzU5fYSZzcGtnW6fWQUuHG6VRRFwVnsZd1vmVgMOib1a0qR28er3/yO2aDj5nY1cXl8PLBoFya9jjWPdw3qnI8VuJmxch+zh7TE5fEx4ZNfsBpVptx4GS6Pj+dWH8BqVPnXtZfg9vp5bd3vWAwq93evj9vrZ9H3R4iNstKvWSwen58VO1MxGVT6tozH6/Oz/rdMjHodXRvH4PVr7Dici1Gv0Kp2BD6/xsFjhRh0OurHWvFrGscK3Oh1CtE2I5qm4fZpqDoFvS70w3PLQnq+m692pnBft/oMuLwGNSLMGNTTD5Z7fUQ7Iq2GQFlDwfk/dc7n830+DPqSmNxePzajit1ccnueWeDh17QCom0lcf2RUciHW5O5okEU/VsnsC81n8c+2kvH+pG8MrQVPx7OZcyHuwPl3cl5vLDmAG3rRvBI70YcynSybEcql8TZ6NMynswCN/tSC0gIN9EozobX58fr1zDpyzexFJVXhSWGiYmJDB48mN69e7Nu3TomTpzIwoULS9XZsGEDb7/9Nnq9nv379zN48GC2bNmCyWRi9erVHDp0iFdffZXi4mKuv/56OnbsSO3atSvqFIQQFcxfWEjG+CdoMPnUYXUieLIO5MVv8eLF7N+/n7fffhuAxx57jA0bNtCrV6+g9xETExZ03SNZztP2GHw4+sqgtldVHY7osu9LqBEfXqr8yA0RpcoLHrgi8H8H8P0z1wTK9WpG8t2kXjjdXhyOMMK9Pl4b3haPz4/DYcdZ7OW+qxuiAQ6HnbwiDx0aRqPqSs4nGB6fxs9H83E47OQ6PWw8kEW4RY/DYSen0M2q3WnYzXqeH9aGXKeHD7YmYzfreermluQ6Pby+/g/CLXru7t6AXKeH6asPYDfrGd7zEnKdHsZ/shG7Wc+uf19LXpGHUR9sKFW+fXrp8g3/Ll2+YsIXpcpXJn5FlNXApkm9yHd5uPa5b4mwGvj8X90ocHkZPOc7Iqx6PhzdicJiL/ct2Ea4xcDrI9rhLPaSuGo/doue6be2osjtY+qynwkz6xnX7zJcbh+vfHmAMJOeB3o1otjr4/2Nh7AY9QztXBe318+Kn1KwGFT6XF4Dj8/Phn0ZmI0qdWOC++woAHoVh8PObQ47t3Ur+eyeTqTVwG2vbjnr/s73832mY/2vKJsRR7SVvg47fTvWDTw728tuZnntSIx6HQ6HnQaKyvj+l1IjwoLDYad2oZeezeJoVisch8OOOd2JxagSHW7G4bDjPlrAvtQC6jpsOBx2fkot5P3vj9C7RTx39byEHWlp/N+SPfRsFseCezvw1d407pm/jV7N45h/Twe+T8pk+op9dGkSy9g+TfntaD6fbEtmYPta59UOVYHDUTV6/cs7zgpJDIMdJvPOO++g05X8ca1duzZOp5P8/HxMJhPLli3j6quvBsBkMtGxY0dWrVrF/fffXxGnIIQIgbxF71O0
7hv8OWOC3EJD87hRDJVzwo2QkXUgK7WaNWvidDopLi7GZDLhdrvJy8ujVq3gbr4Avv76a7p06RK4hnbo0IFFixadV2KYmVmAP8gHsE63aP358Pn8ZGTkX9A+ypLDYSczswAVsCsEYmtfsyRZPlG+p1PtUuVXh7QAwOk790yVADE2IzNvaU5GRj4en58Xb2ke2F+x18/TNzRBpyhkZOTj8vj4Z6+GqMfLRW4fwzvXIcJuJiMjH6fbx8DLEzDodWRk5FNQ7KVn01iMJ8ouL+3qRmA2qmRk5JNX5OEShw3L8XJukQdHmBHr8XKO04NRVTDolEC5yO3DqJ4sH81xUVjsDZR/Sckj3KIPlDfvzwyUVYuJz3YeJdyi57GeDchxeli0+TDhFj0jr6xNjtPDq2sPEG7RM6hVHDlOD1OW/UK4Wc81jaPIcXoY+8FOws16OtQKI8fpYeT8bYSb9Sx/tEtQ7R0XbuK+znVKfdYu9LMbjBOf72CPdbbfhwRzye90RkY+dR12BraIC5Rr2/RMv/HSQLlVnJX1Y7vg1zQyMvJpEmVm4fA2mAwln4log8KYqxtQM7LkM+QvdtOpYRSNYixkZOSTnlmIQVXQU7L9b3/msONQDnE2AxkZ+fzwazpvfJ3Eta2CG8lQ2X7Pz8ThsFebOHU65axfAlZIYhjsMJkTFzSAdevWcc011xAbGwtAcnIyMTExgfdjYmI4cuRIRYQvhKhgvrxcdPZwIu4eiS/zGLr4hJKk5aw0POkZpL/4FHGzXpLkUFQZUVFRdOnShQ0bNtC7d282b95M27ZtiY+P58svv+SKK67Abj/7t8T169cnKSkpUD548CAJCQnlHfopcpyeUr0qCRFmFE5NNk16HVTD6fxPDOsEMKglQz5PMOl13NDq5M/MbFC5/YqTo6IsRpVRPRoEbg6tRpXxfZoE3g8z6Zk+qNnJslnP68NaB8rhFgMf3NsuUI6wGFj18MmfVaTVwMYnuv7lfT3rx3bBd/zLArtZz/LRHQOTt4SZ9bz3l6G9NpPKK7e1LFWeeuOl6I4PSTQbdDx+7SWBYaomvY77u9VDf7y31aAqDOlQC+PxYZV6ncL1LeIwHy+rOoUujaIxG4KfN/F0I2JNeh3rJnQv9Zqq6vD6ynbCl/T8Ysw2U5nuM1gn2txu1tOs5sm/HfVirNzZ6WQPXtu6kbStGxko977MQe/LHIHJmjo1jGL+nZdjM5VMItMozsaD3etjMQQ3qYzXp/HbsQIax9lkaGoVUSknn0lJSWHJkiXMnDmzTPd7PsNkqqKq0g1emUibnb/ybjPXwd/59dahxN42mFqP/R9xM6cHtV3xkWR+HT4SX2Eh1tTDhLVtU65xno9Qf86KXTlB1VNVJeSxnlBZ4qgokyZNYurUqXz77bekp6fz7LPPAvDSSy+RmJhI+/bt8fv9TJkyhZ07d2KxWJgxY0ZgyYrRo0eTmJjIxIkTURSFzMxMEhMTK/w8Hnx7e6nyholXYzpdb1o1TAqrGkVRsBhPJgCqTiEhwhwo63UKTRNO3lcZVB0dG0QFyiaDyjXN4gJls0Hl5nY1A2WLUWXkVScngrGZ9Dx6TaNAOcysZ3L/SwNlu1nPrFuP99BewHmpXt8pE6I4oq0cyii4gL2W5vPDqEW7eHpgc74c1+2cz2xWti9KTsQbYTHQqvbJ5y4bOWw0cthwBtkTWuD2MWzBj/yzV0Nuv6Kklzjcog8krqLyqZDE8HyGySQnJzNt2jRmzJhBVNTJPzC1atUiMzMzUM7MzKR+/frnFcf5DJOpaqpKN3hlIm12/iqizZzbduE+epTML77GMOROlGDXYTOF43hjHv6cbIrqXELBgT/RhdlR1NBOl10ZPme2IL8J9/m0kMcKF95m5xoqUxnVrl2b119//ZTXV65cGfi/Tqfj6aefPu32ERERzJo1q9ziE0IET6eAI8zIcyt+4dWhrbD+ZVKb06pESWFZUhWIthq4smHJ/fxTn/7CwWNOnh/U
jBa1ws+xtQiFCkkMgx0mc/jwYZ5//nmmTp1KZGQkn332GQkJCbRt25b+/fuzatUqbr31VoqLi9m6dSujR4+uiPDFRcjmK6L4z5xz3zDLZBwVxrVzJ4Z69bBe3ZO4Oa9hbtc2+KTwONNllwHgzcgg7d67MTZvQeyzU1F0smSrEKL8nG544pnqXaxJQHWQ4/Sw6emeuE8zIy9QMmj6+MQxc+5qS0G+q9RMp9WN3axn1cNXolPA4/OTkusi2+mhTrQFTdO4592d1I+x8M9eDQm3VN92qkwqbChpMMNk7rnnHrKzs+nbty8ALpeL1157DYDrr7+eXbt28cQTT5Cfn8+oUaNOWe5CVG6VamZEl4tdMhlHpVH0/RbSx4zC0KQpCfMWYO0a3NTvZ+JNScF79CjoVPwF+ajhEefeSAAEZsMTQgTvdMMTT0uSwirtwbe3882EHlw9bd05666b0L1aJ4UnqMeHpRpUhY8f6MDhrCIiLAb+OOZkd3Ief2YVMb5PE4q9fv79+W9cdUkMvS6NletQiFRYYhjMMJkvvvjijNsrinLaRX1FFSIzI4ozMNRvgC4mBmPjxiimC39Y39y6NfFvzsNQrz5qeATuAwcwNGokF5ogFH62irwPFhEx8h5sPYOf0VIIISqK9NCGVjDtr6o69Aql2l9RFOodX2qkfqyVxfe2IznHhV6nsOVgFp/tTudAeiG9L3OQWeDm+9+zueqSaOlNrECVcvIZIcqddnE+a1rV5C35D4a69bBceSU1Fy1GFx1dZsmb+fKSyWec33xN+thHCb/tdqIeGyvJ4TkUrlmNe/cufMeOAeBcvw50KpZOnVD0cskQQoSe9NCGVjDt74i2nvNZ8YYOGw0dNgCaxofxcM8GgV7Wb349xvNrDtC9SQwv3NycYwVuFCAmTGYcL09ylReVkr+gAH9hIWpUFIrRSPG+X/Dn5GBq0RJdWBiFX32JLzUV23XXo8bEkPvOQryHDxNxzz3oa9Qkc9oU3Pv3EztpMob69Ul78H6Kf95LwrwFGJs0xXcsI9SnWO0VbdpI1pRnUKxWaq/6HDUmtnwOpCigaWg+b/ns/yLjeO4FnF9/haVrVzRNI/ulmXiSkoib8yrW7j3w5eSgRkaee0fiohZsj43VqOIrkptzUaSpoScAACAASURBVPkF+5kW5cNhNzHsypOPiMWHm+hQP5Jel5bcG3y49QjvbznCI70bclvH2nh9/sBSJ6LsSGIoKqXUe+/GvXcvNRZ9iKllKzKfScS9Zw813v8QU6tW5C6Yj3vPbowtWqLGxFD4xRrcu3dh6z8AfY2auH/+meJdO/Hl5mDgeKKZnY3feXySa0X+mISK5i95aN/cqTO2G/pjbtu2/JJCwNrjamou/ghD48agaTg3bsDatVu5Ha9SMpsD60BqHjeK3lCSMJ+mnk61ENb3huN1Pdj63EDRhvVYOndB87hJHtAPfa1axL/xpjy7WY0F22MTHWYio8hd7vEIcaGC/UxfyFIZInhdG8eUWuezyO1Dryo0jS+ZcTpxxa/8mV3E49deQvOaMsNpWZHEUFRKanQMamxsIIkwNW+BzmpDMZc8f2brfQ2mli1Ro0p6LiLuuBNfTg76miVrJEU/Pg5/sQtjw5I1kRwzX0LRKegiSuqrseWXiIgz07xeMidPQhceTtTYx4mdOq1ChnYam5QsAn1s8iQKln5M1Nh/EXHn8HI/bmVRqFpwbv0OU+vLUaPjg95OMRiIvPc+Iu+9D4Difb+guYvRXC509nB8uTlkTZtK2IAbsXTuUl7hCyGEqOYev64xo65ugMWg4vNr/Hg4l2MFbiKPP3+YuGIfNcLN3NaxljyTeAEkMRSVUvyrpScqinlyYqlyxN0jS5Vt111fqmxq1apUWe9wlGF04u9y/7qPglUrUfQG7ENuw1CnboUe39ymLYWrP8fY5NJzV76I+DIzSX/s/1D0emp/+Q1q+N/7dtV06WXU+Xo93qMpKIpC4eefU/j5Z/hy
c7B07oI3IwN/QT7GBg3L+AyEEEJUd2Gmk2nL0gc7sOtIHrWiLGTkF/PZ7nRMeh13dKqDX9OYv+EQVzaMpmUtu8wtcB4kMRRClDt/YSGaqwhT8xY4nnsB1eGo8KQQKOnZuqorakwM3rQ0XNu3Edanb4XHUdH8BQUlPXqq+reTwhN0FkugJ97a42r8+fkYLy1JtPM//IDc+XOJHDWayAdGXXDcQgghxOmYDSodG0QBEGU1MHtIC45ku7AaVfYk5zF/42GW70xlxZgrcLp97E3Jo03dSPQ6SRLPRhJDIUS58mVnkzb6ATS3m4S33sF2zT9CGo8aE4Pf6SR15HC8hw+jGI3Yel8T0pjKm6FePeJfeQ3NV7aTgOgTEgLDTAFQQLHZMLfvAEDmlGfw5+UR+eBoDA0alOmxhRBCCAC9quPKhtGBcoTFwJAOtYiyGlAUhc1JWUz45Bc6N4ripVtb4nT70OsUjHqZb+J/SWIoqiezmVZbN+HznWPZCrO5YuK5mGka/oJC8Hjw5+dfcI9VWdBZrdhvHkzh6s8DSczFquiHrRStX4998GAMdeuV67GiHvonEffch2IyobndFKxaiVZYSOTohwDI//i/mNq2DQw1tfmKwOU6ZT/Frhxsf/3dNJspVGVtUyFE6MkaipVfnWgLj17TKFDWNI260RY61C/pYVz6YwoLNh7mwR71Gdy+VqjCrJQkMRQV5/jMiJrHU7Ie2pnGfFdAMlaoWnAkxJVaY0fTNApXLMe1fTsxTz2FYpC1ci6E59AhvEf+xNLlKhLmzgOdij4uLtRhBUQMH0H40NtRjEYKv/oS/P6Q92aWh/wPFuH86kt04eFE3nd/uR9PZzmewBmN1Pr4U4q2fIehXj28qalkPpMIej11129EsdnA6WRvl3PPENv8u41gk8RQCBF6soZi1XNNszh6X+bgxPeNSRlOCt0+Ymwl93nTV/zCviO5jLyqLpcm2EMYaehJYigqhL+4mD9ffpnwYXcGZg6tbBRFIfftBXiSkggbMABz23ahDqnK8qalcfSuO9AKC0h4ayGmlq3OvVEIKEYj7t9+I2Pso6AoGBo0xHjJJaEOq0xFjLwHXXgEYQNvqvBj62vWxH7TIAA0n4+wG28CNHRhYRTv3YPXKM96CCGEKH+KoqA/fsmZ1K8pI6+qS2yYEU3TWP5jCkdzXAzvXLKO4rwNh4gNM3JNM0epCW+qg+p1tiJk8hd/SN777+HasYOaHywOdTinODGkrfGbr6FpWkmvR2H2qRVlSFtQ1Lg4rL164T1yBEOjRufeIIQMjRsTPuxOFIOh0sd6vjSvF1OLlphatAx1KBhq1SJ28jOBcvGOHdDx8hBGJIQQorqqHXXyXm7pP7uw4ofDNKthp7DYyzubD+PxaXRtHE2YSc+q3Wm0rxdBfPjF/3iRJIaiQlh79cL9y8/Yji+cXem4XOztdNU5q8mQtrNzbviWok0biX5iPDETngK/r9IPyVUUhahHHysp+P0cmzYF2z+uxXLFlaEN7AJpPh/JA/tjatmKmPFPorNXruEx4bcPQ5+XEeowhBBCVHMJkWZuvLwGAKpO4fHrGvPHMSexYSaOZBcxecWvhJlUvnikEygKydlF1IsJakBxlSOJoSh3vpwcDLXr4Jj+fKhDEeXIl5dLxhP/QisowNyufcnzeqoa6rCCcmKNo/zlyyj4aAnOr76i9mer0Vmr7h/+4j178B4+DJpW8jxfZaTKJUgIIUTlYTao9G+dECi7vX56XhpLuFmPXtWx7Y8cRn2wi25NYphxc3N8fg2dwkWzVqJclc/iTDPmnUKGF56RNyOD5H59sF3zD2ImTS6ZdEZcdPxOJ2p4BI7nXqD4x+1Yq+jyD2EDbsS97xds116HzmpF07Qq+8fe3Lo1tVatxpeaiqKTKbmFEEKI89XQYWP6Tc0C5bT8YsJMKg1jS744/nxPGm9tOsxdneow4HivY1Umd+lnI8MLL1jx9m1objf+/PyLIyn0+dD8
frnRPk7TNHLnzaVgxXJqLHwXa9duWLuee5bJykrR6YgZ/yQAnuRkMh59hJhJkzE1a3aOLSsXX2Ymfmchhjp1MdSuHepwhBBCiItC35bx/KOZg2KPH4Dvf8/mSLYLr79kytMl25I5lFnEoLY1aOiopKN1zkLubkW5sl13PbWWrSBq7OOhDqVMeNPSONzlSvxFRWiaRv5/P8L1049ofn+oQwsJrbiYwrVr8B4+hGv7tlCHU6by3l2I+5efyZ79UqhDOW95Hy4iue/15CyYF+pQhBBCiIuKQdURZi7p7EjsdylvDGtFr0sdAHy6I5WPtqdwNLcYgJW7Utl0IAu3t2rcJ14EXTiissp9ZyGWzl0wNm4c6lDKjKLToUZEorNY8GZkkPlMIjp7OHU2bsbvdJIx/glMzZoRef+DaJoGfj9KEM/Zldew5fLar+b1Urx3D+bWl5PwxlyKd+3CenXPoLevCqLH/gudLYyIEXcD4He50FXAGptlQXO7UUwmzK3bhDoUIYQQ4qKl6hTa1o0MlJ/q04T1vx2jQ/1IvH6Nl786SG6Rlw/vbUcjh40fD+VwWQ07FmPlnINBEsMy4M/LA1sUxXt24/zqK0yXX461ew98mZn4Mo+hxsejRkSee0flJBTPShbv3UP2iy+Q89or1Pnym0o3I+LfpSYkUPPjT0oKHg9hA24EvR5FUXAnHaDom6/xHjlC5P0P4j10iJTBN2Pu1In4l+fgLyzE/csvGJo0Rg2PKL3j8hq2XA771bxeMv71GM7164h7+RWsXbtedEkhgGIwEvXwPwEoWLWS7JdnkTDvLQz16oU4snOLfnQsEffci84eHupQzs5sLvns/Q9VVfCdWIn4eD0hhBCismtW006zmiX3vE63j1vb1+LXtAIaxlrJcXoY/cEuzEaVzx++EpNeR2GxL9D7WBlUnkiqMq2ke7h49y5yF8zDPvhWrN17UPjVl2RNeYawQTcTO2kyBatWkjt/LmH9BhBx90iKd+2iaMt3mNq0wdKhI77sbPwF+aixjpJ19MpKCJ6V1CckYB9yGzq7/aJJCk/QHZ/hUV+zJrHPTg28rq9Tl9jnXghMVuL5/SCaqwi8XqDk85F23z2YLr+cGu8uwvPHHxQs+xRz+/bYLm9e8Sfyd6kq+tp10Fms6MIvrp/t6WiaRsGyT/GlpuJc9w0Rdw0PdUhnlfv2W5jatsPUqlWlnzinULWc9m+Ow2EnIyM/BBEJIYQQZcNqVLmn68kvkzML3FxW006YSY/ZoHIgvZA73/qRa5o5mNz/0lO29+lViv8yBPVIlhMfp17XTXodqtdXJjFLYlgGdGElN8emVq2JHPMQxstKJqrQmUwYGjVCf3zyB29yMp6kJPz5eQC4tm0l55XZhN81AkuHjhQsX0b2iy8QPuwOoh8fR/6yT8j/z2LCbryJ8MG34tq5k+JdOzFffjmmlq3w5eWCx4MuhL2Rp+N3OlFjYkvWsatG1MhIwq7vEyhbr+5JnQ2b8OcXlLzg92Ns0QJTq9YAuHb8RO6CeXiTjxBbBRJDX3Y2ye/Ox3D7CKIefYzw24air1kz1GGVO0VRiHvpZZxr1xI24EY0rxd/bg5qTGyoQzuF5/Ahsme9iGIyUfurdajhlbzHUAghhKgmGsXZeOuuNnh8JcneL0fz8WsaRn3JlC+bk7J4Z/Of3NyuJtc0c1Ds9dNj2vpz7nfdhO6U1eJaFZYYpqSkMGXKFGJjY0lPT+fJJ5+kTp06p9Tbvn07iYmJdO/enbFjxwZeLywsZOLEiVitVnQ6HR6Ph8mTJ2M0VoLFs48/Q2Zq3gJT8xaBl8MG3Fgy1PC48NuGYu1xNTp7WEn9VpcTMfIeTO3aA6AYDKg1a6LGl6yf4v3jEO49e/B37wFA0cZvyX3zDSLufxBTy1bkf7CInNdeJWLkvcRNmUj+x/+l8LNVhN00iLC+N+DauQPP/v1Ye3cvl9M+/RBVDV9GBqhqyfBZVa0ay3mcYUjb6eqdDzUi
MjCM2NK5C5bOXQLvGZteSsR992Ns0vS89hkKmqaRNmYU7t27iMh3ETV6TLVICk/QWW0lSaHHQ8b4J3Dv+4WEt99F73CEOrRSFIuV8BF3g88nSaEQQghRCRnUkkSwX+sErrokOtAr+PW+Y/z0Zy4dG5TcN7o8FT9hTYUlhomJiQwePJjevXuzbt06Jk6cyMKFC0vVSUpKYseOHTRteuqN8pIlS8jPz2fmzJkA3HvvvXz00UfcfvvtFRF+mdDZ7Rj/MqzS3L495vbtA+XwobcTPvTk+YQPG4alRw/U2JKbT1Or1tiH3o65zfEJJRQFXVQU6vGbU/eB/bh+2Iqle0ki6Fy7lrx3FxLX9fPgAvT78OXlBX9DeREt53GmIW3lyXTZZZguu+x4ANkVeuzzcWItv6hH/o/82bOw3zI41CGFjOZy4T18GF9WFr7Uo5UuMdQ7HET/32OhDkMIIYQQQYiynezgeqRXQ9rXj6RlrZJcodBdNsNDz0eFLFeRnZ3Nxo0b6dq1KwCdO3dm27ZtpKWllarXqFEjRo4cif40693FxcWRnZ2N3+/H7/eTnV15b6TLihoTi7n15Rhq1QLA2rUbMeMmBHqdIu9/kLrrNwaSyYg77iJ+7nysPXsDYGrVirCBN6EYgutV9efn8+dVnchb/AEABcuXkTXjBYp/3guALycHf3FxmZ6jKD9+pxNfdtYF7aN43y8cvf02vKmpWDp0pNlny9HHxZVRhFWPzm4n/s15JCxYiKllKzxHjuDLyQl1WAAUrv6ctAfvp+iHraEORZynlJQURo0axdNPP80DDzzAn3/+edp627dvp1+/fsyYMeOU97777jsSExOZMmUKd999N6tXry7vsIUQQpShMLOe65rHUSuypKPCFoKZSyukxzAlJQWr1YrJZALAaDQSHh5OcnIy8fHxQe2jb9++7N27lxEjRqAoCs2aNeOWW24pz7DLbXhhedHXrFlqeJ/tH9di+8e1KMH2RmmgmC3ojw9ldX7zNc6vvsTUogWmZs3Jmj6Nws9WEfvcC4Rd3wfNWVgepyEukP/48F7FYgHfhX3blDP7Zdx7dpM7fy4xTz2NopOlT9WoKNSoKDyHD5E68m7U6Gji5y0I+dDN/KUf49ryHZYeV2Pp0DGksYjzc6EjatLT01myZAmzZs0CICsri9zc3IoIXQghRDkxGyr+nqvKTD6zePFi9u/fz9tvvw3AY489xoYNG+jVq1fQ+4iJCTvPowY/42JZPfR5IRyO08db7AquR8MQE0W7g/sCa+8Z7xlOYce2RHe7ArPDTq7ip1BViW1cH7vDTsHhoqD2q6rKGWMLtcoSV7A/o3O1Zd7GTfz+6OPE3nwTtR5/DFdB9GnrXbLgTdTIk5MW6TQ/4X+JQfP5UFSViFdmkjZvAbX+9Ri641/sVJY2CzW3L5ZjFhMGuw1HXDjqWWbfrYg2i1rwBseW/BfH7UPOGktVUV0+ZydG1MyZMwcoGVEzZswY0tLSSn1x2qhRIxo1asS4ceNO2cd//vMf6taty8yZM3E6nTRo0IChQ4dW2DkIIYS4OFRIYlizZk2cTifFxcWYTCbcbjd5eXnUOj5EMhhff/01Xbp0QXe8x6JDhw4sWrTovBLDzMwC/H7t3BWroLNN727zBXfOPp/GsWMFJ19o3gZD8zbkA/kZ+UT8ewbhz3opAlwZ+VittqD3Wxmnnq9MU+Kfz8/odDH7CwrQhYXhcnpx//knx9Z+jeHOkYTpTj8MQY2M5NdBt57zeM2/24j5/ofIzHMD7krVZiGn2nDMexud3U6WC5xrV2Fu1x6dtfTXRBXRZt60NNS4ONRBQ8hyAa6q/TO60DbT6ZS/8UVgaJTFiJqkpCR+/fVXli5ditlsZtSoUXg8HoYPH16OkQshhLjYVEhiGBUVRZcuXdiwYQO9e/dm8+bNtG3blvj4eL788kuuuOIK7Of4hrt+/fokJSUFygcPHiQhIaG8Qxf/Q/nL859KWa61WN39ddiy
z4s3LQ1FVQMz1P613l9pPh9ZM56ncMVyan6yHHObtsS/OQ9zh44ox2eEPd1waJ1W8Q80X4z0x2/cC1at5NiEcZg7dCT+jbmlfk/Km9/lIuWmG9HXrk3C/LcuunVDxbkVFhbSrVs3LMf/Jvft25f33nvvvBPD8kqmq0rvb1WJE6pOrFUlTqg6sVaVOEFivVBHspxB1VNVHY7oshm7WGF3L5MmTWLq1Kl8++23pKen8+yzzwLw0ksvkZiYSPv27fH7/UyZMoWdO3disViYMWNGYMmK0aNHk5iYyMSJE1EUhczMTBITEysq/Kqtij0rWR39dVZUv7OQ3NVfoYuMIuKOO09bX/P70YqL0VkseFNS8BcU4PpuM2H9B2Dp1Pm0+/0rWyWeBbUqMrVoiRobi7lTpwpNCgE8SQdA1YGqSlJYBZXFiJqEhITAaBoAg8FA8d+YKKw8RtVUlVEGVSVOqDqxVpU4oerEWlXiBIm1LJxuMfvT1vP5g47/XCNqKuwOpnbt2rz++uunvL5y5crA/3U6HU8//fRpt4+IiAg8WC/OTyiWYhB/n85qI+qhf57xffev+zg26WlMLVsS8+REoh8fh//BUZguvawCoxR/ZahXj5qfLEMNj0DzuMl7/33Chw0LekbgC2Fq3oLaa7/Gl5Fe7scSZa8sRtT06dOHF198MbC0zLZt2+jSpctZtxFCCFG5mfQ61k04uRa5qurw+U5d29Ck14G3bEaCVZnJZ4S4mNl8RXB8NlEoWToEzY8uPBxOfGOk+cFixa3qcf+6D19ONlFFRceXMwm+d0GUDzU8AoBjiZMoXLEczx+/Ezv52XI9puePP/Ac+gPLVV0x1K5TrscS5edCR9R06tSJvn37Mm7cOGw2Gx6Ph0cffTSUpySEEOICqV5fqcktHdHW0/cMllFSCJIYigshQ1TLjsvF3k5XnbNa8w3rMF5yCXEvz8HcvgM6ec6z0gm/fRjFO37Cfutt5X6svPffJX/Jf4gYeS9R/3yk3I8nyseFjqgBGDFiRLnEJoQQovqQxFD8bTJEteL5i4ogAqzdup+7sggJU7Pm1Fq2EkWvx/P77/z5xgpM940plzUgDZc0Rl+3LrY+fct830IIIYSoXiQxFKIKkclFqgZFr0fzeEgb/SDeI38SZYs840RCFyJ8yG3Ybx2CogT3gLoQQgghxJmU/VfYQgghUAwGYp5OJOKaXtgH3Vym+9Y0jYynJpC/9GPweMp030IIIYSonqTHUIjqSJ4PrRCWK6+kbr9ryMjIJ/+/S/AcOULUP//vgnv43Hv3ULh8GUXr12Hre0OQE1oLIYQQQpyZJIZCVEPyfGjF8iQnkzltKni9WLv3wNym7QXtz3BJY2KnTMPvdKIzmcooSiGEEEJUZ5IYCiFEOTPUqoVj+vP4jh3D3KYtms+Hoqp/a1+ax41iMBDWf0AZRymEEEKI6kyeMRRCiApg+8e1hA+9HX9xMekPjyH3nYV/az/5H/+XI32vo+Dzz8o2QCGEEEJUa5IYCiFEBXJt/Z6iDd+S+9Z8fDk557190caN+FJS/naPoxBCCCHE6chQUiEqA5kMptqwdu1GzDNTMDVvgRoZib+wEJ3NFvT2cbNfwfXdZswdO5ZjlEIIIYSobs6rx3Dnzp2sX78er9dLdnZ2ecUkRLVTqFootEWd+58qE8ZcDOw3DsTYuDHu334juV8fCpYvC2o717ZtaMXFWLpchWIwlnOU4nzJNVIIIURVFlRiePDgQa677jpGjhzJtGnTcLlcjBw5kvXr15d3fEIIcdFybf0e37FjFK7+HE3TzlrXl5ND6gP3cuQfvfAXFFRQhCIYco0UQghxMQgqMXz22WcZP34827ZtIy4ujrCwMN5//33mz59f3vEJIcRFK3zYHcQ+9wKOWS+jKArelJQz1vVlZGBs0hRTi5bowsIqMEpxLnKNFEIIcTEIKjH0er10794dILAws9VqLb+ohBCimgi7vg86k4nc997lSL8+ONev
O209Y+PG1PxgMY4XZ1VsgOKc5BophBDiYhDU5DOaprF161Y6/mWyg927d5dbUEJURZqmkZ2dgdvtAs4+LLCqSk/X4ff7L2gfqqonLCwSiyX4CVeqA19aGng8eFNTT3nPtXMnxdt/IGzAQNSYmBBEJ85GrpFCCFExyuNeqyzubSrC+cWpYDSaiYpyBL6wDEZQieG4ceO49957sdls5OTk0K9fP7Kzs5k7d27QBxLiYldQkIuiKMTH10ZRLs6VYPR6HV7v3//jqWkaHo+bnJwMAEkO/yLqsbFYe/UiptVlUJCF5vUEJpgxx4Shdb0SnacAXeHxz5bZLJMRVRJyjRRCiIpRHvdaF3pvU1HOJ05N85OTc4yCglzs9sjgjxFMpRYtWrB27Vq++eYbUlNTqVGjBj169CBMnnMRIqCoqIDo6PiLNiksC4qiYDSaiIx0kJt7TBLDv1AUBXObtlCQxd7OXc9Zv/l3G8EmiWFlINdIIYSoGHKvFRxF0WG3R5GVlVb2iSFAWFgY/fr1K/Xa+vXrA89VCFHd+f0+VFWWBg2GwWDE5/OGOozKKfgRH6ISkWukEEKUP7nXCp6q6vH7fee1TVAt++mnn5729blz58pFT4i/OJ9x3NWZtNPZSNtUNXKNFEKIiiP3EMH5O+0UVGI4depULr300kA5Pz+fQ4cO0aJFi6APlJKSwpQpU4iNjSU9PZ0nn3ySOnXqnFJv+/btJCYm0r17d8aOHVvqve+++441a9ag1+s5ePAggwcP5rrrrgs6BiFE8MaMuY+7776Ptm3bn7GO3+9nyJCBzJ//HuHh4RUYnRCVR1lcI4UQQlQ/53OvtXDhIqzW8n1EIajE8M477+Shhx4q9dqhQ4f46KOPgj5QYmIigwcPpnfv3qxbt46JEyeycOHCUnWSkpLYsWMHTZs2PWX79PR0lixZwqxZJVO1Z2VlkZubG/TxhRBlT6fT8dJLr0lSKKq1srhGCiGEEKfz13ut8p4kJ6jE8H8veAD16tVj27ZtQR0kOzubjRs3MmfOHAA6d+7MmDFjSEtLIz4+PlCvUaNGNGrUiHHjxp2yj//85z/UrVuXmTNn4nQ6adCgAUOHDg3q+EJUN5988l8WLpxP797XkpaWSlLSfh544CH27fuZHTu2Y7OFMX36THw+L6+/Pgej0URhYQEJCTW4444RrFnzGUeO/MlHHy1m3bqvuOOOETz99HgyMtLp27c/W7Zsxu0uZuDAW3jrrbk8/fSztG3bnqNHU3jjjTnExyeQmppK3br1uOeeB0LdHEKUqwu9RgohhKh6Kvpea/LkqbRu3bZc77WCSgxfeeWVUmWPx8Nvv/0W9EFSUlKwWq2YTCYAjEYj4eHhJCcnl0oMzyYpKYlff/2VpUuXYjabGTVqFB6Ph+HDhwcdR0zMxT1DnMNhD3UIVU5Ztll6ug69vnLMknXLLYP55Ze95OfnMn36C/zww/c88cRjLFy4iNGjH+K+++5m584f2bHjR6KjY7j77nsAuO++u2nZsiV9+97AypXLGDJkKO3alQxvmDx5KrfcMoDu3Xtw773389///oebbhrE2rWrUdWSc3/mmYncccdddOvWA4/Hw4QJj5+xTXQ6XbX5zJ7PeRa7coKqp6rKRd1+VencLvQaKYQQouoZOPBmfv55D7m5OUyZ8hzbtm1l/PixLFjwHvffP5oHHxzJTz9tY8eOn4iMjGL48JJ7rQcfHEmzZi249to+rFjxKbfcMiQwlPTpp59lyJCBXHVVN0aMuJePP15C//4D+eKLzwPHfeaZpxg69E66di2515o48YkyO6egEsPFixfTtevJ6dMNBgNt2rRh0KBBZRbIuRQWFtKtWzcslpLp2fv27ct77713XolhZmYBfv/FufC4w2EnIyM/1GFUKWXdZn6/v1Ktg6NpGs2bt8Tr9RMfXxOLxULNmnXwev3UrFmL9PR0Nm/eRFRUFP/+9xQAzGYzKSlH8Xr9aJqGz3fynHw+P5GRUTRs
2Biv18+NN95Sql5eXj67d++kWbNWeL1+FEXl3/9+8Yxt4vf7q8Vn9nw/ZzZfcH+jfD7tom2/C/3d1OmUCv0isDJc2d0iwwAAIABJREFUI4UQQoRGixatAKhZsxZWq4W6desBUKtWbY4dO8aWLZuJiorihRemAWCxWEhLSz3j/iIjI2ncuOSxukGDBpd6z+ks5P/ZO/PwqKrz8X/uvbNlJjvZN5RFUBZlF1CxQq0L/sTdLloVVGy1tWq/4gKkFVrbooIbVqvSuqGV0iouICAKghsCgggIuARCSCDLZJn13vv7Y5JJhpkkk2QmmQnn8zw8TM4995z3nlnOec/7nvfdvv1Lhg07DfDNNw8++HDEniUsxXDGjBkdUsCOJS8vj4aGBlwuF2azGbfbjd1uJz8/P+w2cnJykOVmy4PRaMTlcnVaJoHgeMBk8iVIlyQJY2Oy9Ka/dd2ngJx//lQmTz4XAK/Xi6a1rtwajcYoSisQxCddnSMFAoFAEL9Efq1lavVatAnL7621CW/evHlhdZKWlsbEiRNZv349ABs3bmTkyJFkZ2ezevVqamvb3xm+4IIL+Oyzz/wD/PnnnzNx4sSw+hcIBKEZN248n376sf/vxx9fyK5dOwEwmcxomsauXV+zb9/edtuyWm0MG3Yq27dvBcDlcjFnzj3REbw3Y7EwZNOGdv9hsfS0pIJGujpHCgQCgaD3Ek9rrVYthtdee227N+/atYv7778/rI7mzp3L/Pnz+fDDDykvL+eBBx4AYOHChRQXFzN69Gg0TWPevHls27aNhIQEFixY4E9ZMX78eC688EJmzZqFzWbD4/Fwxx13hNW3QHC88dFH69m5cwfl5eUMG3Yq//zns9jtdl577RUKCwv912699XaWL/83f/nLPMxmM1lZOQwf7nNPOOecybz88gsYDAZ+//t7+Pvfn8But/Pww3/hN7+5E4PBwBtvLPcfnC4sLGL27D/y1FOPs3XrF9TU1HD55Vf18EjEH/VKAtgSeloMQTtEco6MRDon8KXJmDp1KpdffnnIgDgCgUAgiBzdvdZ67bVXyMsriOpaS9KbTHDHcP7553PTTTe1eqOu6zzzzDO88847rdaJNcQZQ0FLIj1mZWXfk5PTN2LtxSIGgxyxc5THw3iB+G52hng4YxjJOfKmm24KSOe0ZMmSkOmc1q1bx9dff01OTk5IxXDu3Lns3r2biRMndkoxjMYcGS+f/3iRE+JH1niRE+JH1niRE6InazTWDpFc20STzsh57Hi1Nz+2ajH8/e9/zznnnNNmZ6mpqR0STiAQCASC3kCk5shIpHMCWLt2LX379hVn7wUCgUDQaVo9Y9jehAfw0ksvRVQYgUAgEAjigUjNkW2lcwqXqqoqli9fLgLgCAQCgaBLhBWVdN++fcyZM4edO3fidDqjLZNAIBAIBHFDT8+Rf/3rX7nzzjsDInd3hmi538ZLTsp4kRPiR9Z4kRPiR9Z4kROiI2u0ckbHSh7q9uionB3NGR2WYjhv3jx++9vf8tBDD/Hwww/j8XhYv359h3Y0BQKBQCDojXRljuxqOqejR49y6NAhnnvuOQC2bNnCnj17qKioYM6cORgMYU3zjW2JM4bxQLzIGi9yQvzIGi9yQvRkjUbO6N58xvDYnNGdPmN4LGPHjsVoNPonqxNOOIGZM2d2SDiBQCAQCHojnZ0jW6ZzmjJlSlA6p3HjxpGU1Ppub58+fQIC1cyaNYv8/HwRlVQgEAgEHSYse6SmaXg8HsxmM++99x4Oh4OPP/6Yb775JtryCQQCgUAQ03R1jpw7dy7Lli1jzpw5LF26NCCd0+7du/19/PGPf2Tbtm1s3LiRBQsWBLXz8MMPs23bNj744AP+8Y9/RO4BBQKBQHBcEJbF8MILL2T58uXccsstzJw5k/r6ehRFYfbs2dGWTyAQRBiv18trr73Cc8/9nWeffZG+fU/wX6utrWXBgj9hsyVSUVHO9Ok3M3jwKf5rDz30Z6xWW9A1
geB4pqtzZEFBAYsXLw4qX7Fihf+1LMvMmTOnzXbuuOMOkd9XIBAIYoCurLVau9YdhKUYTp48mT59+gCwbt069u/fT35+vr9MIBDED2+99QbDhg0PGSTj6aefZMiQ4Vx55U/Zv38v9977f7zyyjIkSeLpp59k2LDhXHbZ1UHXBILjGTFHCgQCgaAlXVlrtXatOwjLlfSaa66hpKQEgMTERIYPHy4mPIEgApgqS0jZ+S5p25aTsvNdTJUlEWl3+fLXufji81i06CHuvff3XH31pWzY8AEAF198KcOGnRryvlWr3ub00ycA0K/fALxeD199td1/bfz4iSGvCQTHM2KOFAgEgtjF7tb4zq6yt0blO7uK3R2ZQDPRWmu1dq07CEsxlGWZl19+mdtuu42lS5dSWxsfEZEEgljGVFmC7cAWFI8DCVA8DmwHtkREObzkkssZO/Z0amvt/OlPf+OOO/6PF15Y0uY9dnsN9fX1pKWl+8vS0tIpLS1t85pAcLwj5kiBQCCITexujQqHjrcx4LJXhwqHHhHlsDeutcJyJV20aBH9+/dH0zTWr1/Pn/70JzRN47zzzuNHP/pRtGUUCOIOU+UPmCu/b7OOoaESSQ/8YZJ0FduBLzBXftfqfa70vrjTi8KSY+jQ4QDk5xdQWVkZ1j0CgaBjiDlSIBAIuh+7W8PubjvFjlMNLtOBcoeO3e27KEkauh7YTrJJItkUXs7A3rTWCuuJFUXxVZZl0tLSsFgsrFu3LmRUNIFAECZ6K7tVrZV3ApPJBPi+u3o77SYnp2C12qiqav5Rq6qqJDc3t81rAsHxjpgjBQKB4PilN621wrIY3nHHHZx33nksX76c6upqLrzwQp555hmGDx8ebfkEgrjEnV7UrlUvZee7KB5HULlmTKB2wJnREq1Nzj33fD7+eCNFRX3Zv38fiqIwZMgw/7VNmz5qDD4TeE0gOJ4Rc6RAIBB0P8kmmWRT23W+s6t+N9KWGCQoSPRt6nV3gvv21lqtXesOwlIM9+zZQ0FBAXfeeSdnn302BkNYtwkEgjZw5AzBdmALkt7s56BLCo6cIV1u+6OP1rNz5w7Ky8sZMWIU//rX89jtdl577RVOOWUIq1a9A8A///ksZ589mbPOOhuAm266hb/97c98991+yssPM3fuPGRZ9l976KEH2bdvX9A1geB4RsyRAoFAEJukWyQqHDotdUOpsbyrRGut1dq17kDSj3WqDcETTzzBr3/96+6QJ6ocPVqHprX7uHFJZmYSFRUi4EFHiPSYlZV9T05O3w7dY6osIaHsK2SPA82YgCNnCO70wojJFGkiuavWmfGKR8R3s+N0dcxkWaJPn8QIStQ2Yo5snXj5/MeLnBA/ssaLnBA/ssaLnBA9WTuzdrC7NSqdvgA0BsmnFLY8P9jdFsPO0hk5jx2v9ubHsLY1e8OEJxDEIu70wphWBAUCQfuIOVIgEAhil3BcTgU+hB+YQCAQCAQCgUAgEBznCMVQIBAIBAKBQCAQCI5zhGIoEAgEAoFAIBAIBMc5YSmG7777Lr/85S/ZsmULAF999RW33XYbhw8fjqpwAoFAIBDEOmKOFAgEAkFvICzF8KWXXuKuu+5ixIgRAAwZMoTrrruOOXPmhN1RaWkpv/rVr5gzZw4zZ86kpKQkZL3Nmzdz0UUXtZoYuLa2lkmTJvHYY4+F3bdAIBAIBNEiEnOkQCAQCAQ9TViKoSzLDBsWmFxx1KhROJ3OsDsqLi7m0ksv5Y9//CNXX301s2fPDqqzb98+tm7dyqBBg1ptZ8GCBeTm5obdr0AgEAgE0SQSc6RAIBAIBD1NWOkq3G43JSUlFBY2h9UvKSnB7XaH1UlVVRUbNmzwW/kmTJjArbfeyuHDh8nOzvbX69+/P/3792fWrFkh21m7di19+/bF5XKF1a9AIGjmxReX8O23+0lNTeOHH77j8suvZty48QDous7ixY9SWVlJfX09Z545iQsuuKiHJRYI4oOuzpECgUAg
6B3E+1orLMXw1ltvZdq0aQwbNoz09HQqKyvZsWMHjz76aFidlJaWYrVaMZvNAJhMJpKTkzl48GCAYtgWVVVVLF++nEWLFnHvvfeGdc+xdGfC454gMzOpp0WIOyI5ZuXlMgZDx+I5Kft3YH7zeRw3zEauOOB/rSelRkyuJj799GMeffRJDAYD+/fvY/r0a3n33bWYzWbWrHmPgwcP8Je/PITL5eKqqy5l9Ogx5OXlBbXT0WdsDVmWOzX+ekMdkjWx3bJYQnw3O048jVlX50iBQCAQRI9vq528s7+aa4ZmcqTB43+dZFIi3tcnn2zikUee8K+1br75OlasWI3ZbOb999dQUlLCn/+8AJfLxc9/fjkjRowiNzd4rdVThKUYTpw4kf/+97+89dZblJWVMWjQIObNm0dBQUG05fPz17/+lTvvvBNZ7vyi9OjROjRNj6BUsUNmZhIVFbU9LUZcEekx0zQNr1cLu77h269IeOkvoHoxv7oIw4FvQPViWPNvHFOnd0mW5ctfZ8mSf3DuuedTWnqALVu+4O677wNkvF6NrKxcHA4HNTV20tP78PbbK5gw4Uy8Xg1FMTJixChWrnyHa665PlBmg9yhZ2wLTdM6NP6KIpGiN8Da5XgnX4FdNYYsizXEd7PjdHXMZFnq1o3AWJgjBQKBQBDMt9VOXt55FK+ms2x3JQdr3Xg1nQ9+sDN1QFqX2m5trWUw+NSrvLx8HA4H9fV1mM1mVq58iwkTzgTAbDYzYsQoVq9eGbTW6knCUgwBCgsLmTlzZkDZsmXLuOyyy9q9Ny8vj4aGBlwuF2azGbfbjd1uJz8/P6y+jx49yqFDh3juuecA2LJlC3v27KGiooI5c+b43wCBIJZIfP4PIcvrrp8LgO3VR8DjRgIM3+1E0n2bFqbtG3FMnY5pyzpMWz9o9f62uOSSy9m5cweHDpUyf/7f2L59GxaLxX9948YNnHXWj0hP7wNAWdkh0tLS/dfT0tIpLS0N+1mjTZMCqD96H1SUogwcjqXfqVg9dQFlpqKhuNWellZwPNKVOVIgEAgEneP5L8tDll8/PAuA13YdxdNoFPq+xkWTeWhHRQNTB6Sx5XA928ob0HU95P1t0dvWWtCGYvjZZ58xZswYAB5//PGQdZYvXx7WpJeWlsbEiRNZv349U6ZMYePGjYwcOZLs7GxWr17NuHHjSEpq3W2oT58+LFmyxP/3rFmzyM/P57bbbmu3b4EgVnGfPAbTzk/A5fArhbok4TxzWsT6GD16LADDhp3qLysrK+ONN/7DH/7wp4j1E20SFRVWLYeKxh/QFx7BetP96C8/1ly28lWsNw/GrYqNIkH0ieQc2RtRFAnFbApZ5nG40Hun845AIIgxBvdJ4OsjDlyq7lcKJeDMwsgdV+gtay1oQzF85plnOOWUU7DZbCxdupQzzzwzqE5HgsDMnTuX+fPn8+GHH1JeXs4DDzwAwMKFCykuLmb06NFomsa8efPYtm0bCQkJLFiwgLvuuiugnYcffpht27axd+9eEhISmDFjRtgyCATdSXuWPc/wMzBv/8ivFAIgK8hVvt0v94izcY84u0symEyBC7OyskMsWvQQc+fOIyWl+RxjTk4uVVWV/r+rqiopLCzqUt+RxO6RSZp8BcqAYfDiQnDUoy+6p7lCZh7SjHuxa0ZArDhbQ1EkEhUVu0f2L8xDlQnaJ9JzZG9CUSR0k5FVeys5f3BGUNnkfqmoTk8PSykQCHoD7Vn2hmda2VHhCFgZyBJUOX3uRSOybYzJT+rSMZnestaCNhTDp59+2v/6hhtu4IYbbgiq8+yzz4bdUUFBAYsXLw4qX7Fihf+1LMvt5n264447uOOOO8LuVyCIVaxvLwHVC4BuMIGuIaleTDs/xnFR184YhuLgwQM88cQi7rlnNsnJKaxZs4qsrGyGDTuVc8+9gNWrV3LxxZficrnYsmUz119/Y8Rl6Cy6DnVl5SQf/Bbl2t+h/X1ewHVp+izssg01Qucf
eyP+85irlpN07BnNFmWC8Ij0HNlbaFIAH9tYQkW9h5OybJxoM+CRZX/ZwAwrRVYDqlf4fQsEgujyzv5qvI2upAbZt55Qddh5xNHlM4ahiOe1FoBSXFxc3F6ln/3sZzgcDiZOnBhQPnLkyGjJFRUcDnev3RG32cw0NIjQ6B0h0mNWV1dDYmL40UTdJ49FcjuRqytouPhmdIsNubqC+itvR0tr37e9LT76aD1vv/0mhw4dIikpkaKiE7jppl/y7bf7+d///sMrr7zIhx++z49+NIXc3DxOPLEfX3+9k/fee5f33nuH//f/LmHUqDFB7cqyFLEATh0dL9mgYM3IQHvlCWg4JjBJ9REMQ0bh0iMfYayrxMJ3M+CM5lefIw8YApm5JKn1AWVaWhZqDPxGdnXMJEnCajW1XzFCiDmyGWOCmdX7qthZXg/AzsN1nJSdyLOflVJR77MSHqn3MKYoFc0TO4phLHxPwyVeZI0XOSF+ZI0XOSF6snZ07TC4TwJuVafaqfL/BqZhMchUO1WuGJxOmsVnH+vs2iZaa63W6Iycx45Xe/OjpB972jIE11xzDS+88EJQud1uJzk5uUMC9iQiKqmgJZmZSVRW1kXMta6s7HtycvpGR9gYIZJRScMZL7myDF0xIKdnBgSfCcmN91PfN/aCz8TCdzPFpKGsWor+/hu+ggQb0rFnNPsORL95LtWenj+jGW9RScUc2YwkgWw28oPdzYtbDuHwBP5eZNqM3Hx6AYcrG7AqEslmBUWSutRnJIiF72m4xIus8SInxI+s8SInRE/WaKy1Irm2iSadkfPY8Wpvfgwr98PZZ5/Nxx9/HFR+6623dkg4gSDWSNEbUFYtJUn27WQ3WVZalgl6AF3HtHktyYvvxvafJ0mUvbC2RfCZzDykWY/CsLHN96x6FavsRaqr6RmZYxi7R8Y7+Qq46X5IsDWf0Ww5njPupVYTrqSd4XieIxVFwmQ106Tb6Trg8XJimoVrRwXn5vrZiFz+tfkQz2w5zKLPy5i/8SCLPj/ECzsqWLG3iiqn119X7aUbuQKBQBCrhLU1/NJLL1FRUYHVaiUx0adl6rrO0aNHoyqcQBAtFEVCrzoi0h/EIFJdDdY3nsa0ezMAui0Ze4PXF3xm4HB4dynSjfdhl21Yr7oNZfxuX9mMe6mtqiX58btwn3oWjh//HEQqG8C3WLerRt9n/Ia70Z8IPMstXXcXdj1BLMQ7SSTmyNLSUubNm0dGRgbl5eXcd999FBYWBtXbvHkzxcXFTJo0KSA427Jly9i4cSNZWVl8++23nHfeeUybFrkIx6EIFVBGUSRUkwmnR+M/2w8H3bP6m6NMHZzBWzvLqXSq1LpVqpxqYyAIF+Pzm3eyn9p6GIdHI81iIM1iID3BQJpFIb3xb5tRRooBa6NAIBD0FsJaNSUnJ/Pggw8GlOm6zp///OeoCCUQRJtERUVbtUykP4gxjLs+x/rG08j1djSLlYYLb8AzbCJIEnYVTEVDsc4sxq4ZUb0adowBZfJ3W5BcTiwfv4PhwF7qrvgtempGTz9WTKAokm/j47Wngi++/TLJY8+h1qHiOTn88w4CH5GYI4uLi7nyyiuZMmUK69atY/bs2QFpmgD27dvH1q1bGTRoUND9K1eu5OGHHyYxMZHq6momTZrEmDFjws4X3FGODTIzMMNKvyQjTknG4VF5/rOD/jOFLfnqcD0T+qYyY2QuqlfFo+pUubxUOb1UObykmH2/t7quU+tScao6dR43JbXBZ5V+NybHX//DEjsJBtmnQFoMpJgVFFkojYLWaZk+JVRZW0dJQtVr696O1hcIeoqwVrx//OMfGT58eEDZ4cOHeeihh6IilEAQbez1XvpMuQx14HB44RGR/qCncTmxvvsvzF+sBcBz4hDqL7kFPSVQqXOrNCrqesgydegEalMySfz3QgwHviH5qVnUX/ZrvANHdOPDxB4BwWdCnNHUv/oceeQZKH2y8S/lvV5hcQ2Trs6RVVVV
bNiwgcceewyACRMmcOutt3L48GGys7P99fr370///v2ZNWtWUBtPPfUUsuw7HZKamorVauXIkSPRUwzNJlbtrfQrfy9uOcT1o/OQJIkdZXX+8kybkRvG5PPWrgp2lPkC0qzcc5Sbx+ajelWMikSW1UiWNdCNWZIk/u/0POwu1ac0OlUqnT4FstLppdalkmTyBZtSNZ33v7cH/FJLQIpZabQyGji/XyqGRkXRrWqYlLBO0nR8XIQC0K10dryPtXa3LNv4Qw0TTkjFU+8M2aYsh7aUt5aOpTXLukjfIohFwvplfO6554LKnnnmGf8kJhDENB43yoG9mD9dhfW/T5H05P+RMv+XqM/9FUe/U5Fuuj/oFn/6g1gI0XgcoFSUYNryPrrBSMNPrqHu2vuClMJwUQsHYp/5IJ6BpyE76kh68S9YVi8F9fj1CU5U1HbPaGofvoUlv9F1UVVJfvL3WP/zBErZ9z0gcXzR1TmytLQUq9WK2WwGfDmxkpOTOXjwYNgyNCmFANu3byc3NzdIWY0kHoeLyf1SuXFsPglGGYdH48lNB3jus4MUplq4ZlQuRakWfntmX5LQuGpoFjeOzaco1cKMMXlorvajFcqSRKrFwImpFkbm2JhyQgpXDO7Dzadlc9e4PORGN1JV1/lR32RGZFvpm2wi2aSgA9Uulf3VLnZUNKA0Gg/dqsafNpXyt09K+ce2cpbtruT972v45IdqfrC7qO3C2YGWi33ZbGy1TBAZOjveLa3d6/ZX8YPdjabp6CYjL20tY3B2Iqv2HEWxmILaNFhMQfeazYagMsWgtNpXW/UFgp4mrO3gysrKoLL777+fq6++OuICCQRdQtOgcYEk1Rwh8ZUFKIdLkLTAyV6XZUhJ97nWvRy8eNPfehHrVbdhR0zkUUPHb5VSCwbSMHU63sJBaNnB56o63LQ1ibqf/R+WDf/DsvY1Etb/F9l+lIZLf911ueMQu0cmZeQk5BNPQV/9eqtnNJss5ErpPuTKMsxHD2Heth5Pv6E4J0zFO+BUEGe6goilOdJut/PII4+waNGiDp+/60wk11SbmZvNCgvX/wCAw6Px4heHmDOlH0MnJJJoNkCC73c0xWZmQIbVV4a5w321xWU5KQF/e1SNo/UejjS4cXg0srJ80WEP17owyBL1Ho16j5sDTS6qJb7oiUlmhQcvOKnxWVTe3nWETJuRDJuJDJuJPlZjqy6qVQ4Pj67/3p+/cVh2IjUub0DZ0OzEiLi4ZmYmdbmN7iCacnZ2vBvcKm99XRFg7b7JpPDu7iNcdEoW//zc5wZ9UqaVYTlJAW1O6p/OB/sCLeU3jivglS3NrtMr9xzl1xMKSTQbQvbVVv1wiJf3HqIja3m5jMEQeYt/NNqMBh2VU5blDr0PbX4KzznnHCRJ4siRI0yePDngmtPpZODAgR0STiCA0OkgOpUiQlVRKg6glO5HKd2PoXQ/cnUFNXc9BbKMbktFKT8AuoaaVYA3rx9qXj+8ef0h/0TSTCrqwntCpz/Y/inK+N29KvjM22+/yWOPPYLR6FukZWVl849//AvwnedZvPhRKisrqa+v58wzJ3HBBRdFTxiPG6muGsuOjTh//FMA3KOnRLYPWcZ51iV4C0/CtvxJXGPPjWz78URtDTx2LwwcBtPvoUayhjyj2WQhVwtPwv7bRZg/fgfzF2sx7t+Bcf8O1KwCnOMvxD38DDCITZNIzZF5eXk0NDTgcrkwm8243W7sdnuH3UBramq4//77mTNnTsjANe3R0XQVTdaQl744FHTtPzsOc9XQLDAbgkLWOzosWedQgGwFUGS/DDJw7/i8FkFvfK6pDbrEoWoHNqPir3uw1s3avYFKf5OLaprFQGGyiXP6+hRSY4KJ1fuq/Iv9FzaXcuPYfF7ZWuYve3fXEfomm/E0uOgK8ZKyIJpymqxmVrdwZW5tvPOTTOw7ZKfGpVLt9FLjUhmQZmFyv1QGZlj9KVUWbfiBm8YV8FlJTXObXxzixrFy
QJsvbznEL0bkUJRm4d9f+oIjPbrhh+ZnthmZMSYPd72TCruOJBHUV2v1vzvq5bsaF6lmX5ClBENwcKV4ee8herJqmhbx1BKRTFcRzbVWZ+TUNC3gfWgvXUWbiuGDDz7oP0B/7733Blyz2WwMHjy4Q8IJBE1nnVi1nKTJV2BXjSHLWkXXSVj5AoYD36Ac+g7JG+yXL1dXoKVng8FA7Y0PoKbngtkSUCfFpKGtei3AtU7+6a/R1iyHrz73lfXC4DPz5/+VkSNHB5W///4aSkpK+POfF+Byufj5zy9nxIhR5OYGh5vvErqOVG9Hth9FUr2Ydn6Mc9KlYIqs9aAl3hOHUPObRc3n5bxeTNs+xD3ibL91ubdjWf9fJLcLl0ejHhsts9iHOrcJoKVl4Tj/lzjPvgzT52uwfPIuSvkBbP/7O3JVOc7JV3XzU8QekZoj09LSmDhxIuvXr2fKlCls3LiRkSNHkp2dzerVqxk3bhxJSW3v+FZWVjJnzhzuvvtuCgsL+eKLLygrK+OCCy7o9PO1xbHBZ45lR1k944vcpNii993uLLIkkWI2kGI2cEKKT75Qi1ibUWZy3+SA8401LpXqxn8t2V9eR0GqhWtG5vL69kaF4aMS//UmBUBzuTlc7+GrIw2typdlNTI00woQsq6twkl9o3LZXt2OtBvpui3ljLQMRkXi1PwUTkhP4NVtZSHH+8pTc3g8xOdTB4Zm+oIlTR+Tz+Mbffe98EUplw3LZsbYfF5qUuKOafPiIVk8vrGEU7ITuWZkHk9/ciCg7RvG5CN7vHgbf2N1HVSnh35JRq4fnceTmwLr/3xErr9+aZ2Hf+9q3ogwKRKpZoVUi4FUc+NGRKPlR9d1EZE3hunxtVYXaHPFO3as7/zJI488woknntgtAgl6L8cGwGg1RUThELzlh/xWQKV0Pw2X/AotLQskCcO3X2FoPPekpmX5rYBqfj+8uSeCxervU80N/bm1e2T6nP9TX/CZxvQH6r8eQTn9HPQJ56KvfC3qwWd++NFZaCE6uIhiAAAgAElEQVTC2ct9+lD0/oddanv58tdZsuQfnHvu+ZSWHmDLli+47rrpvPXWp2zcuAGXy8W0aZfRv/8AAFaufIsJE84EwGw2M2LEKFavXsk111zfJTkCUL3IVRVILt/krpvM2G/+c1SVQj8tgqgkvPcSlo/fwfT1p9Rf+mt0a/y45XQGqeYo5s9XA+A858oO368nJOI682Jc4y/E9NVGzB+/i2tUs3XM9OUGvAUD0NJzIiZzvBDJOXLu3LnMnz+fDz/8kPLych544AEAFi5cSHFxMaNHj0bTNObNm8e2bdtISEhgwYIF/pQVd955J9u2bfO7r3o8npBBaiLFscFnWgsyMyDD2lYzMU2qxcCZhckBZV5Np7pRSTS0cFGsdan8Z08FE09IDakw/GJkswJwxOHhw5LWLSmn9EnwK0Sh69Z2oG5H2o103eB7IinDmu/sTOqXFnK8f3paLv/+sgyPV6Nvssm3EWBRSDEr5NhMKIqER5Z5dVvzOd4mN+irhmcHKIxNXDMyj4+/rWR0biKj+6by1MeBfQK8tavCZylvYdVRFIkGJP79ZYj0LXuP+uubFYlB6RaqnSpVLi9uVae8wUt5gy+3Z70ngXMa7/v6qJM3vqlsVBoblUeL0mht9JWZ48Q1Mtqct2gTlSE2r9JtRt797fgutR2Ta60uEpYp5MQTT+TNN9/kf//7H7qu88gjj7B48WJuv/12/2F5gaA9EhUVVi1vN0VE0rTr0R4L3H1XSvf7FEPAMfkqUAyouf3QrR0/EwO+XTwpMZn6Fm50CV4V6YWFOG+Zj/kY17poEEopbKu8I1xyyeXs3LmDQ4dKmT//b2zfvg2j0cSQIcMZMmQoBw8eYObMG3j++ZfIyMikrOwQaWnp/vvT0tIpLQ3hYttJJEcdcvUR0FSQFbTUTPSaIwFKfHfhGTAc07b1GL/ZSvJTs6i74nbUwt7rFp/wwX+QvB7c
Q05HzT2h8w0ZDLhPPQv3qWf5i6Taaqz/fQo0Fc/gMTgnXIhaFJxKobcTiTmyoKCAxYsXB5WvWLHC/1qWZebMmRNUB+D555/vnPCdpCn4zMAMKyv3HGXGmDxkj5erhmYxvsjtL0s0G7rNdbQ7MMgSGVYjGcdEUT0lI4EhWQXoZiNPHKNQAKzac5Srh/kUgEyrkbOLkoPqNJFpNbR4HVzXZjNRX+8Oq25H2o103ZZyRkOGRLPCqYWpQUohwNp9R5kxKjdktM+2rN2ZNiOFaQn88/PgwE+rvjnCT4dlo0K7lvIiqwHVq/ryehqNPLWp/fpFyWaKTvH9Xui6jlPVqXJ6qXb63GDTE5rHo8blxanqlNV7KAvR7mlZVqad5JvTv69xsavSQZq5UXlsVBwjEZm3raiwsUIopbCt8o4Qa2utSBCWYvj444+zYcMGzjvvPFasWEFycjIDBgxgzpw5/OUvf4m2jIJegt0jNycpbyVFhHLlTNQlD6Elp7c4E9gPtaB54e49aWTEZApIdZB7AsZvv0LftZXqnP501VJ46IbrSLz4YpIuviTk687em/vckrBlGD3aZ9EYNuzUgPL8/AIGDBjIRx+t5+KLL+3ws3UIZwNypW+nVLdY0VIzQTFAzZHo9tsK3oEjsM98sDGlxV6SnivGce7PcZ1+fq8LrCIfLfNFe5UkHJ2wFraLpuIefgamL9dj+vpTTF9/irdgIM6JU/EMHnPcuOoej3Nkk4tckdXAzWPz0Vxun/ucVwsoi3SQmVilSdkI5boIvvyNP/T1KQBZVsgqCu+MbpbVGFS3tbNboep2pN1I123vjFlXZAjXlblJQWtJKGv39LEFrP+2kvF90/zBZ0K16R4KH+yvCjsdi2I2sSYMy3pT/SYkSSLBIJGQaCIvxP736XmJDM+yUtWoNFa7mv73KZIZLZTq72pcbDpYF9SG1SiTZlYYlZPIyBwb4FM4PapOisWAsZ0gSe2l4Th/cPfkEJ754jamDs9m6vCckK/DuXfayLyge5/6xalt3tuSmFhrRYiwFMNNmzbx0ksvoSgKa9asAeCyyy7jjTfeiKpwgt6FroNdNWI5cTi26+5CW/yHgOvytXdQV1OP65ez0ZPSul0+9ynj0FIy8fQb2u19RwuTqXnX7ocfvqeoqK//b6PRiMvlO/+Rk5NLVVXz2YaqqkoKC4siI4Q5Ad1iQ7dYfW6bMaB86akZ1F5f7Hcrtb77Lww/7KL+4pk9YsWMFpZ1ryNpGq4RZ6NlRP4Mg57Sh4ZpM3FMvgrzJysxf/4ehgPfkPjqI6hpWTh+/DM8Q06PeL+xxvE8R6peNWjhHaqstxOua+2xCoCgc3RlvENZu/skmrlwcCar9hxts82Xt5bxy1F5/nuvH53H9xV1TDs5k/FFqf72NJcbXdfDtqyHk76lJZIkYTMq2IwKBUltW+f6p1mQJV/6liYLZI3LS4NHo8GjcUoLt9fPDtWz4YBPmU80yf5AOKmN1sa+KWYyEoxBivnADN+ZTY8s+8tOyrJRaFGOi897TKy1IkRYiqGqqiiKL8dK02FXTdNwOp3Rk0zQK1EUCaujCu31Z4Kuaatew3TVbTjbCj4TRdTCk1ALT4pYey0te6FeH50dnD8x3Hs7w8KFf6O4+E8kJyfjcDjYtetrbrzxFgDOPfcCVq9eycUXX4rL5WLLls1cf/2NnetI05Bqq8BkQU+wgST5ggHFgEIYgMGA4/xf4u07GNt/n8K081M8/YbhHvPjnpYsMug6utmKbjT7AvxEs6ukNJxTrsZ51jTMW9Zh3vQ2SlV5YHCoFqlkehtijhRESwEQhKYr4x3K2g1mPPVOpvRPY2CmlZW7Q7d5zYgcdKfLf+9H+yt5e28VZxclM7lfmr89p0fjlZ1HGJ1j45QMa7uW9WgeWylIMgUpj5quU+fWqHZ5STY151A0yr6ANzUulTq3Rp3bzYEWRt8L+qX6FEOziZV7A9N23DA6n9e+PBgQFbY7NkJaWvZC
vf7jij2dvrczdNtaK0qEpRiOHDmS6667jmnTpmG321m5ciUrVqzwH7wXCMJBUSRStHr0xX84blJE9BQffbSenTt3UF5eTnJyMmecMYnx489g3ry5FBYWUlp6kFtuuY2BA33nwc45Zwo7d+5g3ry51NfXcd11M8jL61iofAA8LuSqciSP23cO1JzgUwZiTSlsgeeUcdiz+2LevCbyKTN6EknCMfUGnJOvRE/o3FncDmOy4Bp3Hq4x52Lc8wWegaf5ynWdpGfnombk4hp/IWpO37bbiTPEHCkI17U2mgrA8UQkxvtYy7aug9fhpighnDZ99zoaFyuqpvvb03Sd/+2pZH+1i6MOLwPTEvz12+q/O5EliWSzQrJZCSifVJTMpKJkVF2ntikCb2OwpWqXSm6ib+Pe43BxcqaVwlQLyxqj8D6xKTCC63S/0t176bG1VhSRdL39rHFer5dnnnmG//73v5SVlZGTk8Oll17K9OnTMRjiJ5R/R3M0xRPxkNsmxaShrHwFfd2bvoLMPKTps9DfehG2f+or6zsQ/ea5VHui/7kKNWamz1Zj2v05jkmXdth6WFb2PTkdWPBGMypptGg1h46uI9XVINdW+ixVBqMvWJDJEly3kY6OV3di3PU5xp2f0jD1hjafIRx65Lup6zGljMuHS0he/H9IjdONp/8wnBOm4u0/HCQpKI9pZmYSlZV1Hc9t2tRfO3maIo2YI1snHuYmiB85IX5kjRc5ofOyfnSglve+q2F8XiI/6ZcKwJrvalh/oBaTInHD8ExybJELxBJLY1rj8qKYDDg0KUApBPjNxCKsJgWLx2c9fO+7GjRNpzDZRGGymSSTEqrJsOjo2iGcqKSRzGMYTToj57Hj1aU8hs2CGLjlllu45ZZbAsrtdjvJya1HlOotRCwh+3FO/fffk1LQD+ma29E+WIF0433YZRvWq25DGb/blzYiyiki2sNw6FuM32zF029YRN1KQxGryl+H8XqQqyuQXL7Yg5otGT25T/y6DXo9WN96DtleieHQfuqu/B1aZmzt6LWHbenDaGlZOCdd0n3WwjbQsgux/2Yh5o/fwbzlfYz7tmPctx1vdhGeyVdgHTwU1gTmMQ07t2kMcLzPkQLB8YiiSIzqm8qGg7V4GxeCuyqdnJyfwubD9Vx6UnpElcJYI91qRDcZ+WeIKLxr9h7lmlF5OD0edF1nS1k9DV6NTY3OYqlmxa8kFiaZyLYZkaO0mdnVlBTHG2Gv3DRNo6KigtLSUg4dOkRpaSk33XRTNGWLCZpy7ymrlpIke1otiwcURSLFpAUYEkKVRQVVJeF/f0d7cRGuBhf6zGJqJCter4ZdNVJfNNRf1pOuNlr/Icg3349SWeYv67YxaupPCn7+UGUx0a7LgVJxwKcUKgbUPrnoqZnxqxQCGIzUXnMPakYeSvkBkp++F+OXG3paqrAxfPc1pl2fYf5iLbG0a6WlZ+O44DpqfvcEjslXoyWlYdC8JObnoT92H/r7b6D8sBuLWUGvOoL+aHNZFzaXu43jdY4UCI5HmoKvbCqxc9mwbFQNDtZ5KMhMZHdFPbecns+AtK55m8Qy7UWF/epwPfsrHSgGBR24dFA6kwqT6JdqxqRIVLtUtlc4eHtfNX/fWo6rxbpvf7UTRxxY73orYVkMX375ZRYsWEBDQ0NAuRRDrkrRIOyE7HFwJq7pWVruwIcqixbGb7ZgOPQdakoGdYPHgceXIqKJlmkjegpFkbCePAx93f8wDx9NA6HHLaoyoIG9CjkpDa1x3yZUWay0i8EEkoyekICWkglKHKzgw0DLKsR+05+wvfk0pu0bSVz2OK7vd9Fw3rVgjOEdYF3HsmYpAM7xF/qiwMYYujUR51nTcE64kFStFm3D24G5TWfcg7r0yYDcptabBzf+PsQmx+scKRAcjxyrFP1iZC55aRZSUxJY0pjq4qQMK6khUmX0FsKJCtsUfEb2qgxIs/gVZU3XKW/wUGJ3U2J30+DVSDD41iC1
bpV/7fClssqyGvwWxcJkE+kWg/hN7QbCmmn/+c9/8uKLLzJo0CB/5DWA2267LeyOSktLmTdvHhkZGZSXl3PfffdRWFgYVG/z5s0UFxczadIk7rrrLn/5smXL2LhxI1lZWXz77becd955TJs2Lez+O0O4CdljfdESCwquZ/Bo6n72e3SDsctntqKBf4yenAsVpchFA7AYdKyao9vGSEGD8oPgcSOZrUjmBGT9mDJTQodV54i362wAg9H3T1FQM/NBVmLqTFtEMFuov+w2PH1PxvrOPzF/vhrl4D7qrr4TPbV78jN1FMPebRh/2I1mTcI5/oKeFqdtDEZqpHSSfnwVyqDTmnObPtYiWm9mXo+7l4dDJOZIgUAQHxyrFC3bfphfjsrzK4XQ+1OThBMV9qbTC9AcrqB7ZUkix2Yix2ZiTG7gtQaPRmGSidI6N+UNXsobvGxuVDStBpmTMxIY0/OnI3o1YWkzgwcP5pRTTgkqnzFjRtgdFRcXc+WVVzJlyhTWrVvH7NmzWbJkSUCdffv2sXXrVgYNGhR0/8qVK3n44YdJTEykurqaSZMmMWbMGPLzo3f2J5yE7PJVv8L73F9IrqnEffJYnJOvAsCwfzvmT1aCYvApQwYjumIEgwHdYELN64fn5DEAyEcPYSjZ03jdiG4wgMGErvj+12zJ6Cl9fH16PaB6QfEtysNZjMeKgusZNCpqbXeVY8dIe/0ZrDMy0Zc+0aEx0nW9UztaiqSDvQo8jRG8jh5GzsyFyvLmMnslcmYeqh5++xFtV9PQK4+g1Nagmyy+vHiS5EtW30F0XQPiQJGUJNxjfoya3x/bawuRGmpjcmMDAF0nYc2rADjP+H9xkY/Rn9u036m+36SWv6+ANH0WdtmGGuNuRZGYIwUCQXzQUil6ccshHB6Npz4+4L+eaTP2+tQk4USFTU0wUlHXsZQ92TYj00/NwqvpHKrzWRRLat38YHdR79FwNs4Fuq7jVnXsbhWTImGSJUyKjCLHwbqiGwkjvmgQYUUl/f7773n22WcZNGgQNpvNX/7000/z9ttvt9tJVVUVEydOZMuWLZjNZtxuNyNHjmTNmjVkZ2cH1Z81axYZGRkBFkNN05BbnFsaP348Tz31FKeeGn6ukc5GXLOYFayH9gYtWuTfzEf77/Pww14AXCPPoeFi35kS0+drsL0ZnKuvCdfIH9Fw8c2+upvXYnvj6dbrjjibhmkzQ9bVDUZ0gxHZaEKVDbiHTcT5458CYNizBcumt8CahPHMnyA56tBfXQyO+sAOMvOQfjM/Kuf7DHu3oVQexjV6SsydO2sZ3UuSIEn2oPyw278JEFg5D/nGe3F88iG6vTqoLfeISRw2W7FYrCRCYP62Y9CtiX7lQqq3+xQ0SUK22kBTobLC939LjCbIykfthMunjIbkdsHRw51v1+1ErqpA8vpk1ZLS0RNTOmwl1HUdVfVSW1uFrkN6elYHn6bnkBz1SLWVaFk+Twep3o5usbarGHdXFDnjzk9IfPURtKQ0an67KLZdXltwrEdDAMPGol51W4dduLs7KmlX58hYQUQljX05IX5kjRc5oXOyms0GDjlVHv0oMPjK3WefQBJaVKJc9vYxbQ1d16lyqaiajtRwBIvFimawUecJHGNFBqMiY5IlzAYJY5jrzt4YlVTXderr7TidDWRkNJtmIxKV9PHHH2fjxo3s3bs3wE3myJEjYQlXWlqK1WrFbDYDYDKZSE5O5uDBgyEVw1C0VAq3b99Obm4uw4cPD+veJjq7UNCrjqC+/Fhw+fv/Q7l5ts+K53FjtVixpfvO9OhjJ6DnZvuued2NdTz+ugkF/bBl+upqJ56APvacxrqN9T0e9Ma/E/ILmusmWdDMFl9bmork9fiUEGcDCmCVPCQ11f26Fm3/Dp+wOzbBpKnI1/4O7e/zAp5DufEepD5ZpHdqdNoYN7cT9e3n4OhhEvukIJ8eezniMjMDz2DpqWPRp89Ce3x2QLl8ze1oLy7E3LgJ
cCzWoaeSNGQIJSUllFccBnew+4SfxORmq1NdTWBdc4JvQV9XE3hPSrrP9bNT6OD1gq5BbUfb1cHR0KwoKwokpoCj1vevExgMCmlpaWRkZAR8r2OfJCAHAN3rQX1uDhgMKDfcjZTWtmvpsZ+zSKNrKuoHrwNguOBqMvP6RLW/SKJXHUFdFEIpBNj+KcYJe8gYOgYphs+vdnWOFAgE8YWiSHhkmVe2Bs+fb+2q4KqhWRAHika8IEkS6RafyqKaM6mqqsDjqUbVdVRNx6v5ckm23NayGCQsjWcXvZqOroMiS4QyKsqyjKbF/vvVUTkNBhNpaZkd6iMsxXDXrl2sW7cOozFw1/aJJ57oUGeRwG6388gjj7Bo0aIOu+x1dDe0zZ1sQN/+Ker4c6kvGoobQAX8uyMJkD+07Q6a6qb3gwvbiV7XVPekCXDvBN9rTQOvB0n10CfFTOXhKnSjGb3JCpY3BOXae5G8HmSTEVtRX7TFfwhqWn3zBbQLfo7jm924Tx4HJnPbsoSJ5b1XSDh6GG9OX2pPGN1ibGKDY3ez/O/3q4uD6mqr/o1+xS04vt4Rsi1PYi5atZOkpEzSDpei1LS+IPRk5PrcMAGDvdJfV7YlYh3QD+2pByJjOXE5MW95H8uezRgvvR51yUPB7U48D2nqL6jyGIJSsdQdPoL19UcxHPApw87xF2K7egZHqttQejvA0aP17VeKUeQjpSRVHUG2V+KZfyv1l9/my8kXgm7Z4dU0TGPPx7RtPXUnTYy571prpJg0lNXLmj+XjblNeesl9O2fAKC9uxS976AO5TbtbothLM2RAoEgurQXkXNHWT3ji9wU9eLgMz2JohgCLGBNaLpORYOXkloXJXY3Y/rYyEnyrWdf+/ooO4/6UmplJBh8qTKSzBQmm8hIMJCVlRwXltjuWE+ENdOOGDECt9sdNOn16RPernReXh4NDQ24XC6/K6ndbu/w+cCamhruv/9+5syZEzJwTaQJOpsXKiF7TwafkWUwmdExI6UkobkD3x89pQ/elD4oikRSGwou2z/FMOIMEnd/ivr2EtzDJuIeeQ5q3omdF628BMvGFeiSRMNFM2I+WmV7mwDs+Axpwk/QJ/yk3eAz3gGn4h0QnotzU92m/rU23iNl/O7wgt943Fg++A/mz99DTkxBue7O0EphZh7KhB+jrXyVlB9fSbU3MFKtbeIFyKX70VL6UD/tFrz9hpJoNAGRUQzjGS0jD/vMB7Etewzjvu0kvvBnnJMuxTnpsp5xmZZl3KPOwT3qnO7vuwsEnON+d6k/t2nqtbejfvNVTOQ2DYeuzpECgSB+CCciZ28PPhOLyJJEts1Its3I6JzAa3lJRuo8Kgdr3RxxeDni8LLlsC+KdIJB5lcTDDT59Wi6HrWcivFAWNpMXV0dU6dOZdiwYQHnJ9avX8/VV1/d7v1paWlMnDiR9evXM2XKFDZu3MjIkSPJzs5m9erVjBs3jqSktl2tKisrmTNnDnfffTeFhYV88cUXlJWVccEF0Yu819qiJZYSsodDOAqu9sGbyJffhPzQ77F89h6Wz97Dm3MC7pE/wj18YseSZGsa1jf/gaSpuEZPQS0YGIWniiw9vQkQ0f4VA6avP0V21CPdeC/aZ+uC213/NsqEH/sVRuWk4Vj6nYbVZUd/fLY/Cqvzpj/gTM1DT7C12eXxiG5Lpu4X92D5cDmWda+TsG4Zhh/2UH/Zrb7zl92EVFuFnpgal1Fhm4LPmIqGYp1ZjF0zono1pLRk6luW9WBu03Do6hwpEAjih3Aicvb24DPxxhkFyZxRkNwc1Ka2MbCN3UWdRyMz0YTT7rMoPrXlMAZZ8lsUC5NNpJhjN/NApAkr+Mw555zDpZdeGlT+n//8h7Vr14bV0YEDB5g/fz6ZmZmUl5dzzz330LdvX6ZOnUpxcTGjR49G0zTmzZvHpk2bSEhIYMKECf4ANNdffz3btm0jISEBAI/Hw6xZs0LK1RqdPVhvUsAqe6ltsUAJ
VdaTtGVeDgis0lLB1V0BZTWSFb30e8xfvI9p23pkRx2AL5jFHU+0aglpcj20e2R0HUxfvI9to0/RrEotRLPEplLRavCZdsYoGu93p/vXdQzf78L80Zu4R52DZ/BoAAz7tqMbTWh9B4VsN1l2w8pX0de96WsnweZTGF9d3KxE9h2IfvPcABe+eDr43p0Y9m3Htuwx5Ho7amoG9lsf9gd/ieqYedykLPotanoO9Vf9Dt2WHJ1+upmujll3u5JGYo6MBUTwmdiXE+JH1niREzonq2JQkE0GNJfbPy+HKutpOXuKeJBV132RTQcUpFFRUYvTq/GXj0uDTD3JZsWfT3FMTmKPRT+NxJi2Nz+GpRi+9tprXHnllUHlK1asYOrUqV0SsDuJxqQXK4TzYemQgutxY9z9OebN76NmFeA4/5cAKIe+xfjNNlynnYWenN7serh2Od7JV1Brd5D8yl8xXn0L2mfr8E65MupJ4TtLqDHr6U0AkwJW3OhL/oZeU4X91gWh+9c0jF9/iuWjNzEc3AeA58Qh1F03u/V2W7QhSZCieFC+34X2wsKwI9XGww99TyHZK7H9exGek8fimnChvzyaY2b+aAXWVS/izTmB2pv/FHORfztLvCmGYo5snXj5zYgXOSF+ZI0XOSF+ZI0XOSF+ZXV5NQ76U2X4ziu6GtdBNqPMXWNzkSQJt6qxvqSWwmQTBUkmrMboH5nqDsUwLNtoqAkPiKsJTwBulUY3RL3NMgCMJjxDJ+AZOsEX5KYR82fvYd68Fsv7r+EdMxnTTy5HX/wHv+uhuWgIhmvvQH16XqOL4qlRTQofaTo0RlFARUKXJOSD+31Raf1lsm/B5nZi3rIO86a3UarKAdCsSbjGnotrzLmttnvsM+g6VHuNPvfRG+/1na1sQbzkj4sl9OR06q6bE6CcmbZ+gD5hUnQ6dDmwbPgfAI7JV/UapTAeEXOkQCAQ9A7MBpl+qRb6pfqix2u6zpHGoDZeDX/gy9I6D+sPNCtpxwa16ZNgiMuzimIlIWifFgtO95DxuE8ZC0iYTxsHH7zZ7Hr4wiNYK75Da1QKAd+5ONnb/TLHIU3WV+W9fyNfNgPJ40ZB85WtWkoSLlIW/hbr20tQqspR07Opv/AGan73OM4fXdHhc22KImH11KG/Ehw5UX/rRay6CDLTYRTFf9bP+PVn2JYvRv3zb1AarbqRxLLpbeSGWryFJ+EdeFrE2xcIBAKB4HhHliSybEZG5SQyLq/Z0pZokplYkERRsgmDjD+gzRt7q3jii8O8+vVRf12XV8OtxsdG+/FzmlIQEbz9h+HtPwyprgbTzk3YBgxFLhqA9voz4KhHX3RPc+XMvLgIzhMLHBsVVbrmd0iTppKi1aM/PtdvkfVOmor25ac4J17kO0/YSStRu1FYOxIFVRASNacv3rx+GEr3k/TsXBw/uQbX2HMjEiRGaqjDsnEF0GgtjMNdSYFAIBAI4pWMBCM/PsG3Ie/VdMrqPZTYXY2BbVxk25qPUW0rb+Cd/dXkJBqbg9okmUgxK2Gl3lMUCcVsClnmcbho/1Bg+AiLoaBT6IkpuMaeR2XhqTjyBiJf//ugOn53xBgIzhPrJCoqrG2OSqq9/jTS4NPQn5wbYH1VJp5H7Y0P4DllbJdcB4/tj8w8pFmPwrCxzZWEtbdLaGlZ1E7/A9KkqUiqF+vbz2N7/VFwObrctuWjN5BcDjz9h+E9cUgEpBUIBAKBQNAZDLJEQZKJ8flJXDm4D3eMyWVSYXMwOHvjDvuhOg+fHqpj2e5KFn5exsOflfHarqN8V9O6h1ZT3sxVeyupc3mDymRzZON4KMXFxcURbTGGcTjcEdWqYwmbzUxDQ/eHRlYMMokm0J//GzQccyC25iiGIaNw6bGZw7CnxiwUbk3C2H8w8oAhsHOzz/q6+cPmMc3MQ7qlGDuWiHyGA/qrKEX6VTF2SzqGIaOay268L6i/WKgMZtUAACAASURBVBqzuEBWSBw7Ebs1A+Pe
bRgOfYdp56d4Tjil8yktnA0kvr4ISVWpv/w36MnpERU5Fujq50ySJKxWU/sVBQFEY46Ml9+MeJET4kfWeJET4kfWeJETjm9ZJUkKOF/YL9XC+PxE+qea6ZNgwCBLNHhUGrwaFQ1eBqZbyLL6FLxNB2vZVenEq+okWQwYEkw8trGEneX19OtjJSPBgGow+MsGZNhIMxvQwwwc1t78KBTDXkJPfAHbdUcsP4g8YAhaWhaxaDSMtR8tl65AZi6mQUPhkzUB16TfzMduSY+o9dWlK2hpWRjHnIUdC6qqhyxrSayNWTxgs5mpTczGfcpYDN9/jaHiIHJDLZ4hp3euQYMRz8DT0FIy8AybGFlhYwShGPYMQjGMfTkhfmSNFzkhfmSNFzlByHosBlkizWKgb4qZ4VlWJhQkMSTTSo7NRP9UCybF5wX25t5qdh11sOOIg1PyUvjsgJ2d5b7I8TsP13FSdiLPflZKRb0vQOGReg9jilLRPOGd+2lvfhSupIJOI9wRI4s/GMzLjwVdi1YwGLcK1R5DgAIYqkzQdbSMPGpnPIBz4kU0TJ3RpbbU3BNxnnVJhCQTCAQCgUDQnciSRJbVyKgcG4mmZs+6KSckc0ZBEn2TTbyy5RCFqRZ+MTKXBKOMw6Px6EclfqUw02Zkxpg8NFfklFqhGAo6jd0j4518Bdx0PxQN8Fm1knJQr7qtuWzGvdRqsZnHMJYIKxjMD7sxxaZXriBcTGYc5/4c3eqLbCZXlZP4/B+Qj4R4z0Ng2PMFuJ3RlFAgEAgEAkEPcVJ6AlNOSOH64VncPjqHBFXFpsB1o/OC6t4wJh/Z443oRr5QDAWdRtfBrhqpLxqKPrOYGsmK16sFlQnLU/sI6+vxScLqpRi/+5rkp+/D+NXHbdaVj5SS+MoCUh67A9wilUhvo7S0lF/96lfMmTOHmTNnUlJSErLe5s2bueiii1iwYEHQteeff57bb7+d3/3ud/z973+PtsgCgUAgiCKKLFGUaiY/NYHXvzwcdP2tXRXohshaDES6CkGX6emk8L0Bu0cmafIVKAOHw7tLfYFfZBvWq25DGb/bVyZSf/Q66i+aAbqG6auPSXxtIc5x5+E49xdgCP5pTnj/dSRNwz3gNDCZe0BaQTQpLi7myiuvZMqUKaxbt47Zs2ezZMmSgDr79u1j69atDBo0KOj+L7/8kjfffJPXX38dSZK44oorGDVqFKNHj+6mJxAIBAJBJGmKPvrYxmb30ZbsKKtnfJGbIqsB1RuZ3GLCYigQxADC+nqcYrFSf8VvabjgOnRFwfLJuyQ9Nxe5usLnXmzSkCRQyr7HtGMjek4hxmm/FGkLexlVVVVs2LCBM888E4AJEybw+eefc/hw4A5x//79mT59OoYQGwdvvPEGZ5xxBrIsI0kSkyZN4o033ugW+QUCgUAQeRSzibX7qgLOFN599gkMzbH566zccxTZFDk7n1AMBYIYQgSDOQ6RJFzjzqP2hj+gpmZgOLiPpNcXkuKpQVm1lCTZg2Xta5CZhzJ9Fsr7y0iSg3cOBfFLaWkpVqsVs9lnCTaZTCQnJ3Pw4MGw2zhw4ADp6c2pS/r06cOBAwciLqtAIBAIugePw8XkfqncODafolQLvz2zL0loXDU0y18W6eAzwpVUIBAIYgC1YAC1Nz+I7f2lmCedj/ZkMVSUYug/BENOLtL/+zna0/OhohRl4HBMRUNxR8ZzRCAAoE+fxKi0m5mZFJV2I028yAnxI2u8yAnxI2u8yAlC1kiRYjMzIMNKotkACcbgMiJ3vEQohgKBQBAj6NZElGnXo6961R+ISH/pUeRrf4e2ZEFzcKKVr2K9eXDjOV5BvJOXl0dDQwMulwuz2Yz7/7d379FRl3fix9/Pd+6XJOQKCeEWBJQii+Ct4q1iq9aWbbvV1lNtz661dnvq7lIVq1TkIO6K1epqW9u6Z7Xn6G9rradearcK9QIstRaqILUFQW5JyP02mfvM9/n9McmQhMxkAslcks/r
HE4mk+9MPvPMQ77z+T7P83kiEXp6epg+fXrGz1FbW0tHR0fy+/b29lE9PvGYXswMN0nOVGVlEa2tvjF9zvFQKHFC4cRaKHFC4cRaKHGCxDoevMPEGRzlcxiGSnsRUKaSCiFEHumJWYhd3rcNjMsDQT/mTzcMrlgr28BMKKWlpSxfvpytW7cCsH37dpYuXcrUqVPZvHkzPt/IH1hWrlzJtm3bME0TrTVvvfUWK1euHO/QhRBCTCCSGAohRB7pL0QUrPs71De+d8LP1Y3fpcfwyJrTCeaee+7h+eefZ+3atfziF7/g3nvvBeCRRx5h7969AJimyfr169m1axfbt28ftGXF4sWLufrqq1m1ahWrVq1ixYoVnHPOOTl5LUIIIQqTzEMSQog8Y7Eo3NFe9P977ISf6Veexv2lW+hBRgwnktraWh5//PET7v/Nb36TvG0YBmvXrk35HDfeeOO4xCaEEGJykBFDIUTBUApcDoNSt6LUrSl1K1wOY0Jt32CxKEp0AP3omuPTRwd6/x0sR/ZiH9s9bYUQQggxyUliKIQoCBZLIiF0NbyP+uNLqD+8gPrjS7ga3qfUrbBYJsafM68lDq//evCawu8+Cmeee/ygV5/FbcRyE6AQQgghJiSZSiqEyHtKQYlTw85XIew//gMzBg37oK2BkmVX0BlIrNErZD1Rg6IV12CZtxh+9wvUTWvoMTy4v3QLlo/vTdz39bvoMW1Agb9YIYQQQuSNrCWGjY2NbNiwgYqKClpaWlizZg0zZsw44bidO3eybt06LrnkEm677bZBP3vyySfZtWsXSilOP/10br755myFL4TIIafdgMPvD04KBwr74cgeHDVnEgqb2Q1ujPUXn7HPXIT7m+voMW3EYyY9DLlPis8IIYQQYgxlLTFct24d1157LZdffjlvvvkmd999N0899dSgYw4cOMB7773HggULTnj87t27efnll/nVr36FUoprrrmGZcuWcfbZZ2fpFYiJSKlE0uG0aMAEDEJxRShiFvzI00TitGg49lH6gxo/wjVjEaHshDTuInH69inUae8bDenvQgghhEglK4tyOjs72bZtGxdddBEAF1xwATt27KC5uXnQcXPnzuXGG2/Eaj0xX33ppZe48MILMQwDpRSXXHIJL730UjbCFxPYZFizNhEobSamjaZjxpCplalNljWaQgghhDg5WRkxbGxsxO1243A4ALDb7RQXF9PQ0MDUqVMzeo76+nrOP//85Pfl5eW8++67o4qjvNw7quMLTWVlUa5DKCg6FEi7Zq30nCtRTmnTobLZz3QogD70PmrqLEzDmj45NKxYzBgVU9womyNrMWYiH/5v6lAA80+/K5j+ng9tJoQQQkwmk6r4THt7L6Y5MUcUKiuLaG315TqMguFyGLgb96Rds2Yeep/ABFizNpay1c9UJIirZR+OjkOJ0UKlYNocaPww9WOmzUE37MNsPECofDahytPQNte4xzqSfPi/6XIYuBrSr9HMp/5+qm1mGGrCXwjMd0qB3WElEjPxFLlQCqKRKJFwTKYtizHT389sdhtaM6p+NvCx4aiJt9g1bn30VOIUIpuykhjW1NQQCAQIh8M4HA4ikQg9PT1Mnz494+eora2lo6Mj+X17e/uoHi9EkhnHaWh044H0x02wNWuFwIgEcLbsw9FxGKVNNBAumU7YXUtxeTG0Nw6f3Dg86FkfI/rBH7GZMVyt+3G2HSBSOpNg5TzMPBoJy4XJuEZT5I7FYuDyOGjsjtDW0oOpwVBQUWSnpsRF0B8mHs/9BQhR2Ab3s95R9bNTeWw24xQi27KyqKS0tJTly5ezdetWALZv387SpUuZOnUqmzdvxucb+crwypUr2bZtG6ZporXmrbfeYuXKleMduhgH475JeTyKJdiDracJR9tHuBr34Dn0DsUfvsmUv/yWsvdfwhINyJq1PGJEArjr36Pkb5twth8EbRKeMp2eBSvwzz6XmLOI7pCCZVdA7Xww+q5pGdbE
98uuoDtswTf7fLrnXUq4ZDpojaPjMCV7N+M99DYWf0f6ICa0zNZoKjM28nFCDKAUOJxWvMUuPEUuvMUu3F4nHxzz09IToX+SjqmhpSfCB8f8uDyOsft7n2eGaw+H0zphX2+uKAUuj+Ok+tmpPDabcQqRC1mbSnrPPfdw3333sWXLFlpaWrj33nsBeOSRR1i3bh1nn302pmmyYcMGdu3ahcvl4sEHH0xuWbF48WKuvvpqVq1ahVKKFStWcM4552QrfDFGLBYjsR/d4fcTIxhmDAwrruo6XLMW0R1S6a+caY2KRzAiQYxoACMSwBJJfO3/3ohH08agUWhUIqkYYc0aKCQ5HD+JEcK9fSOEOjFCOKWW4NQFmM7iQcfG4yadAXDUnIlrxiIS74siGFeEA4kLRgBxdyn+2ecSDPfibPkQR+cR7N3HsHcfI+qpIFQ1n2hRFZPrTGxk1N+NaJDSPb8n5i4jWlRJzFtJzF0KSgrTiBMNNxIyrcSOqSESG/7vZiSmOdYdocJlJRyaWBchZGQoe+wOK43dkZPqZyM9Nm5qesNxiopdmObw0z4znRp6KnEWgnTtIAqT0nryzG6WNYa5pVSiCugJBV/6OTyw7Aq6ukOosD+R8PUne323LZEAyoyn/T1aGZh2N6bNTdzuTty2uzBtfbdtTlxOK+7GPej6vamfqHZ+3qy5yhdj1c+MSABn814cnccTwkiKhPBUqWgIZ+t+HO0HMfoSo5izhFDVPCJTpo970pMP/zcTawx3Q0PqNZrUzMN0eFAH32NgyqwNK1FvBVFvJTFvFXFn0bgn1bLGMDdGc45UCjxFLj445h/0oXdhjYe/HfOT7mkMBWfUeDjQ5EepxOU3BSil+r4OvC/F10yPH3CbvscM51T7XKr26Ge3KhZWe/D7gqe8pmws/6b0fwTUye8H3O77Pnm778awt/uea+DzeLxOentDp/w8w8Uzr9rDXxtH7menVyf640D99w33WJfdYG6Vm5aeCG2+yJDk3k7QH04c138BIMUx/RcAvMUudtf3jhjnwhoP9a0BDJX43qLAUGrI91BVWURbW2/qJ8uiQRdChmkHl8Oa83NfpvLhPJ2JsYhzpPPjpCo+I3Iro03KD71PqWGkTdi0YT2e8Nn6kj57XxJoc6OtjhE/uIYiJp7ZZ6Jb61MnqTMXEQ5IUjiWjIgfZ3PfGkL6Rwhn9CWE47MOUNucBGsWEZq6AEf7QZyt+7GGuvEe2UG86QNClfMIl808Pj11AjKbDmLUnIbZlnqNJrPPpDug4WPTsfa2Yuv7Zwn3Yu9pwt7TlHguq4OotzKRKBZVYdrdWX41Ih+kGglRSqX9EAz0/VwROuEa32gzptFnWKkSyGMhH7FYfOTkM0USWuy2ZTQyVGS34vNHTikR62ryEwqbGSVQmSRi46U1FBzHZ8+snw3XH1P1UYsBc6vc7GsanNz3T/vsCkRZWO0BBR80pj+m/wKA1mQUp0bRFRl64IkPPNjTjRqQKCaSRoUBJySR/YnlwGONMbqoN3CKbKp2WFRbhFJIcZ0CM3E/CYm8k1EBjKaDsOQyYu3Ng5K9/uTPtLnRFtspj1hoDcrpTqxZO7IHGo9Pa6W6DvqmtU6iAfVxZYT9fVNGjxxPCEtnEKwav4RwKG2xEaqaT6hiLo7OIzhbPsQS8eNp2IWr+a+EKuYSLq9DW+1ZiSdbbF0NuI/8GbN9CuqsFej6vw3u7zV1MHNAf7faiU6ZTnRKoriXigSx9bYkEkVfK0YshKOrHkdXPQBxuyeRKBZVEfNWJC7MiAnPZrfR1nLiyIXWGkOl/zDc/2G1xmMMTmD6Eh09IHFJ3h76NdPjB9xO/p7+GwO+RsKpZqJkdg6oLLVxsCPFRc8+rb4I5dUemtvCGT1nKr5o+uUSJ6s/4UWluM2Q0VeOn4r7k+iBz+Ny2QiFoiM/zwnPqQZ9P1w81r5EZ6R+ZlUwp2jwrJBUj60sstPSkz657w3H8YXiGU8NVRnGaShN
uTORsJoa4hpMrQfc7k8gh0s2Uz35ifcrBiaJqUcmDfoSziH397/HmUyRbegIFewU2clMEkORRZkVwDAdHnrmfyIrEXUGdHLNmopHMGJhIr29+AI6J0mhUomRVadFAyZgEIorQhGzIK+6GWE/rpa92IcmhFMXYDpyVCnUsBAun0O4bDa27kZcLfuwBrtwN/0VV8s+QuVzCFWchrbnfquLU2XtbcV7ZAcKCBRVEw5ZRlyjOZS2u4iUzSJSNgu0xgj7kkmitbcNS8SPpcOPs+MQkJimm1ifWEXUUw6WkU8zQ/u9DgdwOYyC7feTQaqRkA5/lIq+D9ipVBbZiUejuK1DL/CNzWhGKslEkhMTyClT3HR0Bob8LPXxQ+8zMhwpNZTCaxucQGWciPXdLi5y4fMFT0jERvs8yceO09Twykr3uE3Pi0Uz62exaBSLMfj1pXpsmcd2wrTToexWg1Zf+pHQVl+E6hIv4VCMcDizOM1ojFLHyMsayiu8NLf4hk0gTQ1xjieRptaDksp4X3+N990+bvSJ5eleJ22+kS+E9LeDKBySGIosyqwABlks+KI1hMImIcDi91Gy/y2U3YM+Y1pWfv9Ap1yYJ48Y4V5czXuxdx4dkhCejunIk7VfSiVGxkpqsPa24WrZh6235fhWF1NmEKyaN+ZrHrPFEuii6ODbKG0SKq8jVLUABvT3hFFOKFMK01lM2FlMuGIuaBNLsAubLzHt1Opvxxrqxhrqhtb9aKUShWz6pp0OV8hmuH5vFmi/n0xSjYS0+iKcUeOlKxBNudauusSOf4QP1+Nh4EjU0BzU67QSPIVE1WpkOIJlwDT3qa1rriy2Q/jURh0LXSQco6bEdVL9LNVjM5kGnelUaa0hEtccbQ8zr3rs/j8YSmE1Tq6f9l/oGJxUnphAJn6mMYc5tj+xzHQqb8wEX8TEYVHYjPG7CCHGjiSGGZhoozi5EuntwTFtNjTuT31QTR3BePYSw4Hi7ilow4ol4kdFg1ndHF0pEh+OhxbmMWPQsA/aGihZdgWdgZObr5+tPnxiQqgIl87sGyHMk4RwKKWIFVXiK6rEEujC2boPe1cDjs4jODqPECmuJlQ1j5inPNeRZswI91J0cDvKjBGeMp3A9MWnPP16WMog7i4j7i4jNHUBmHGs/vbj6xMDndj87dj87dD8t0QhG085saJKot4qTFfxuPZ7MX6ikeFHQuImHGgJMH+ah9aeCK0DilJUFtmp7ivOMdHez1TtMVBlkZ1oZHymgU42WkPQH2ZhtYdj3aPrZ6kem8k06EynSmug3m9iajjaFjipOMda/4WRE/LKUSaWcX18aulI7WBqTXMwcQFSAXYLOCwKh5H4areM3bpHMTYkMRzBRBrFyWWCa+1tw374HYyzLsNsP5afBV+UQbR4Ko6iIkqL7GhDk602yqgwz5E9OE6iSmo2+nAiIfwb9s76wkkIhxF3T8E/61yC03oTlUw7DmPvOYa95xhRT3nfVhdTxyfJGiMqGqLoo+0YsTBRbyX+GcuyF69hSYwMFlURBFQ8grW3rW9EsSVRyMbXjN3XnDh+5kJojY5LvxfjK92ITTBisr/Zz8JqL7VlTmIxM1nGfiyqcuajUxnBEicnHjfx+4JUuKxUl3gHbZcwUj8b+ljDMEBrKovsNKdJ7iMxM3mMxUgkd2UeG0ol1mh3+BOJf0tfAui2Qrmdk44znwxMLOMZTuUNhqJ4rBCOQ0wnvoaT81gTX219SaLDAg4jkSyeOCoqskUSwzTGexQnm3KZ4FpCPXgPvY2KRwk2HsExXMGXoQUwcsBiMXCcvgyOfAB/+m1is+/xbiMzjhEL43S5Ri7M0/gRrhmLBkwDHNl492Ej7BswQpi4Jhgum0Wwan5BJYRDmQ4vgdolBKeejrPtAI62g4mRr4N/IOYsJlQ1PytbXYyWikcp+mg7loifmGsKvtnngWHJWTzaYidaUkO0pCYRXzSYnHZq623BWlGL+d7r6Z/kJPq9GH+Z
jNj4e0OUlXno7ky/FmkiOJURLHHytCaxlu8k1rENfGxlZRFtbT6qS1x0pknuvQ4LXqeFYDTOzHIXLT2R5NYX/Vs1TC918GGTnxK7osKp+pLGk48zH2VyIaS2zElPl59qT+IcFNeacDwxxbY/QYyYEDUhamp6o9CfLFoUiUTRohIjiwYyFTVLJDFMYzxHcbIplwmuEQng/Wg7RjxKpKQaf/k8AgMKvmRaAGO89beR3vn7U28jM44RC6GiYYxYOHE7FsaIDrgdCyd+biauLhrLPoWZQWGe0U6xHa8+bIR8iaIyAxLCUNlMQlULMB2eUcWYz7TNSbD6YwSr5uNsP9S31UVPYquLYx8QqjyNcPms/NjqwozjPfg21lA3cbsHX90FYLHlOqpBtM1FpGwmkbKZoDVldp1RQapcTC0XIzuVEZuJSNqjsGWa3CuV2NbiLw29KbdqOL3aQ6AnwHgXVMqVTNrKbjUG9XmLUritDCo6ZepEcjgwWQzHE9NVAzEIxI6vg1cMTBbB3jclVZLFsZUHn2byV0bbKzR+hKdqNs49b4CyoA0DlIFWFjAMtDJOuD9527D03WeAYRnm2IH3Dzh+mMelS6hyleCqeATvwe1YokGi7jJ6Z54DfVfOTqkAxjjIqI0O78FdOZdId8fxxC4W6rsd6ksCw4mRxgxpFNrmQKnxKcyTaR/OdERG+7vxHP4z9q6BCeEsQlPnY9onTkJ4AouNUNU8QhV12DuP4mr9EEu4F0/jblzNf0tsdVExJ3dbNWiN98gObP42TKsT39zl+b9thFJoZaDyrCCVGJ2JNhJyqqQ9Clsmyb3DaaWhM5x2q4bm7ggVLtuE7gcjtZUrg8rjhlI4LeC0HE/utNZETYiY/Yni8WQxFIfQkKmodmNIsmhJJKHi5EhimFZm2ysoBUY8CuRuUbkJlCYTyL7EtO+2bdFy9DhMU0wfUP/ohY+4o4jeOR/P6ZS2kWSUQB37CNfUWTgO/THtYVopTKsTbXVgWh2JrzZn320npu34/dpiB6VwGQau6rrE6GQqGRbmUdFQ33YCLRh1C9FjMCJjhHpwNe/F7KrH0fcaQ6WTICEcyrAQKZ9NpGxWYquL1g+xBjpxN/8VV+s+wmWzCVWelt1N37XGXf8e9u5GTIsNX90FBfOehOJqzPq9EEKMhZGS+1R7eA40WbZqGI8LIUol1hnaLSS3dgGImZpIHMLm8dHF/gQyYmp8A6aiWodORbUk7pPRxZFJYphWZqM4cbuHro99GrSJMk3Q8b6vJmrIbcy+r8ljB95vghlPfNUmyoyf+PNhn+/4v8TjBoeotJlRcmDEwtg7m4m5ShJ7zGX4H2i4ojZm2zEsoZ7E6EXdBQWwaXhmFwG0YSFSPO14wtefACYTPwfaYht1sY9QxMQ1axF0NqPKq1GVM5ODJLr1CLr9WMrCPCoWwepvS67dsoSP7xul9OnodH3YYkPVnIYyFKXuEwsS9SeE9q76vr2xjONTRrOZ/OSbgVtd+NtwtuzD7mvpW4/4EZHS7G114Wr+K86OQ2hl0Dv7fOKuknH/nWMl2e/bGvKzIJUQQgyRag/Pgfq3rBBjx2oorAa4GTwVNblusW+EMdJX6CYWA/+AqaiGIlkNtX900S5TUU8giWEamV/NNpLTtnL1d6CiwktbS/ewyWWxzZ3RdC0Vj+A9sgMAbViIOUuIu6YQc5UQd08h7igGY+Q9yDCsWKfNhqWfJBCIYVoLIYHI7CKAaXMnRj/HmNbQGzUoPmsF+tD7iYIcfW2pqudgnLWCnoiB1nGIx7AG2o8ngsGuQasYtGEh6qkg5q3EMG04U/VhzxSMhR9HN+5H/+GlwcV2Zn6MyMG/YGval8hPlSJUNhv3GWcR8MnZLkkpYt5Ker2VWIJdOFs+xN5VP2Cri2lo2xJgfLY+cbQewNW8F42id9a5xLwV4/J7xovW0B1SlORpQSohhBgq1R6eAxkqr4tXTxiGUris4LIOnop6wrpFM/F+BeMQ
HDAVtX8LjR4dQEfM5FTUybyFhiSGaRTS1WylVGK9IRaw2AYlqCEzg2mK1XWEgyF0cTWWYBeWaBBboANboCN5iFYGcWdxIlF0TSHuLqW4smzYoja6cT+0H8O77Ao6Azrvr5zlekqbUuC1mZg7hmnLhg/RbY0ULVlB5OAfsfpaUAMaVCtFtH8TcW8lMXdZMoFXgHO4PmyxYSz8OObuLSmL7TgXX0y87TChkhpCVfMx7W48Ti/4jo9IiuPirin4Z51DcNpCnK0f9m110YS543cUecoJVc0jWjRtzD4t2DvrcTfuBsBfu4RoSfWYPG+2xeMmnQEGFaSyWC30hnVOC1IJIcRwZM/K/KaUSk4j7V/lqLUmpklMRR2wbrF/C43j7+WJW2jYjcTXybKFhiSGaUyUq9kZJbizFtEb0Og5iREHFQtjCXZjDXZhCXZhDXRhifixBruwBruAw6gZp6MP1Rd81VbI/UWAjIrfHP0r9uIpmD3NxFxTkolg1FMOluH/K6fqw6rmtETynub3mQ37CS2+gmA8f9eG5iPT4enb6uIMnG0HcHX0b3XRTsxZRKhyPpHS2lPa6sLqa8FzdAcKCExbSKR89pjFnwtDC1JVVroJ9UyuCxCNjY1s2LCBiooKWlpaWLNmDTNmzBh0jNaa73//+7S3t9Pb28uKFSv4whe+AEBLSwtr166lpqaG3t5eysvLWb16tUyTEmKMyZ6VhUcphU0lEj7PgHWLcZ2Yempz2WnvDssWGkhiOKLhrmbnw/YKo3EyCa62OpIbVSfFo1iD3YlEMdiFs2o2vLs5/S8vkD3Icn0RIJPiN7rpIPqcT9NVctqo1mwO14eVodB/eCn9A5sO4px1JsFA/vfxfKRtDoLVC/EsXErv3vdxtu3HGvLhPbqTeNMHjD3tbAAAFGNJREFUhCrnES6blTKpT8US6KTo0NsorQlVzCVUNX+cXoHIpnXr1nHttddy+eWX8+abb3L33Xfz1FNPDTrmd7/7HYcPH+ZHP/oR4XCYq666inPPPZfa2lp+9rOfMW3aNNauXQvApz/9ac477zwuvfTS7L8YISYw2bNy4rD0TUWtnOLEGk2M8PZPRQ1P0i00JDHMQD5urzBaY5LgWmzEvBXEvBWEAbtdj7w1QwHtQZbbiwCZFb/BsKCtox/BG9qHS93mhHrv8pmy9m91MRd711FcLfsGbXURrqgjVFF3wvYSwxV1Cocj2A//CWXGCZfOIFBzpixkmQA6OzvZtm0bjz32GAAXXHAB3/72t2lubmbq1KnJ41588UU+8YlPAOBwODj33HN55ZVXuPnmm6mqqmLPnj0AhEIhent7C/rDiRD5TPasnLgGTkXtp7UmZpIscJPtLTSUArvDSiRm4ilyJftaJBwb874mieEkMvYJ7vjsvZdLubsIkO22nHjvXd4zDCJls4iUzsTWcwxXyz6sgU5czX/D2frhoK0uUhV1ck6bjfF3FxM+vA//tEWSFE4QjY2NuN1uHI7ExQG73U5xcTENDQ2DEsOGhgbKy8uT35eXl1NfXw/A17/+dW6//Xb++Z//mY6ODv7hH/6BSy65JLsvRIhJRPasnDyUUtgsYMvBFhoWi4HL46CxO0JbS09ydLqiyE5NiYugP0w8PnbLnCQxFCct1wVbJpJst6W8dzmkFNGSGqLF1Vj97X1bXTQf3+qifBaO+WfBztdOLAzUuB+z/Ri2pVeggvlf1Elkz8MPP4zH4+Ghhx4iEolw8803s3v3bhYvXpzxc5SXe8cltsrKkTe6zgeFEicUTqyFEicUTqyFEidMzljjpiYQieMPJ/4FwnH8kfiwW2hYDYXbYeBxWBL/7BZcdmNQshiJmbxf7xu0ntXUiYI5XYEoi2qLcFhPvmbBUJIYipOW64ItE0m221LeuzygFDFvBb3eCizB7kSC2NWA0+2Ew++j0xYiKoyiTiIzNTU1BAIBwuEwDoeDSCRCT08P06dPH3Tc9OnTaW9vT37f3t7O7NmzAXj9
9ddZtWoVkBhxXLhwIc8999yoEsP29l7MkTZoG6XKyiJaW/O/kFChxAmFE2uhxAmFE2uhxAkSqwUoVlDsBO0wiA5ct9g3whgzNT3BGD3B46PO/VtoOAxFZbENX5RhixxB4v6GjhAVLpXxyLVhqLQXAccuxRSTTn/BFpZdAbXz+6YekvhaOx+WXdFXsCW3cRaCbLelvHf5Je4qwT/rHLpP/yR62lx006H0D2j8CJdF3pyJorS0lOXLl7N161YAtm/fztKlS5k6dSqbN2/G17dFzMqVK9myZQsA4XCYd955h6uvvhqA2bNns3///uRzHjhwgGnTpmX5lQghhBhKKYXdoiiyG1S4DKZ7LMwpMphdZFDtNihzKDzWxNRSTWILjZ6oxuGw0eZLvS0KQKsvgs1uG7NYszZiKKW4J6aJULU1X2S7LeW9yz+mw4O2TqyiTiIz99xzD/fddx9btmyhpaWFe++9F4BHHnmEdevWcfbZZ3PVVVexe/du7rjjDnw+H9/61reS59G77rqLe++9l/Xr1+P3+ykqKuIf//Efc/mShBBCpKCUwqrAmmILjXBcYyjFSJM4TM2YXsTPWmIopbgnrolQtTVfZLst5b3LR1IYaDKqra3l8ccfP+H+3/zmN8nbSim++93vDvv46dOn85Of/GTc4hNCCDH++rfQcFkVViNRaCZdcmiosa1Dl5WppP2luC+66CIgUYp7x44dNDc3DzruxRdf5OKLLwYGl+IGqKqqoqOjA5BS3EKIiSsUV1Bdl/6gZGEgIYQQQkxE0UiUiqL0+1ZXFtmJRqJj9juzMmKYL6W4x6viWr4opOpP+ULabPSkzUZvtG2m7Ysx0xQGMmYvpsjpZiK/E9LPhBBCTGaRcIyaEhddgeiwBWjsVkV1iR2/Lzhmv7NgqpKORSnu8ai4li8KqfpTvpA2Gz1ps9E7mTazWAxKll0BR/ZA4/F9DKmpg5mL6Axo4r6J+z6caj8bqeqaEEIIke+0hqA/zMJqD8e6I7T6Isl9DCuL7FSX2An6w4W3xjBfSnELIUQhkMJAQgghhIjHTfy+IBUuK7VlxcRiJkolppn6fcExrx6flTWGUopbCCFGp78wUGdA0xmAzoAmFDZlCxEhhBBiEtEawqEYdquB3xektydIOBQbl88DWZtKKqW4hRBCCCGEECI/ZS0xzIdS3IYxsav4TfTXNx6kzUZP2mz0pM1G71TaTNr75IxXuxXK+1EocULhxFoocULhxFoocYLEOh5ONc6RHq+0LFYRQgghhBBCiEktK2sMhRBCCCGEEELkL0kMhRBCCCGEEGKSk8RQCCGEEEIIISY5SQyFEEIIIYQQYpKTxFAIIYQQQgghJjlJDIUQQgghhBBikpPEUAghhBBCCCEmOUkMhRBCCCGEEGKSk8RQCCGEEEIIISY5a64DEKOzadMmfvnLX1JXV8fhw4e5/vrrufDCC2lsbGTDhg1UVFTQ0tLCmjVrmDFjRq7DzQup2uxrX/sa+/fvTx73T//0T9x44405jDS/bNmyheeee47a2lqampr45je/yYIFC+jp6eGee+6hqKiIpqYmbrnlFs4888xch5sXUrXZd7/7XbZu3Zo87qqrruJ73/teDiPNnVgsxs9//nMee+wxnn/+eebOnQuQtl9Jn8t/9fX1fPazn8Xtdifv6+7u5q233mLTpk28+eabzJo1i0OHDnHLLbewaNGinMVaKOfR4eKcN28eDz74IKWlpYTDYbq6urjnnnsoKyvLWZypYr3wwguTP//v//5vNm7cyN69e3MYZUKqWEOhEI899hixWIyenh6ampp48skn8y7Okdo6Fwrl88JwcdbW1nL//fdjs9kwDIP6+nruvPNOZs2albM4U8W6YMGC5M9fffVV/uVf/oXf//731NbWjt0v1qJgmKaply1bpnft2qW11nrXrl36/PPP11prfdNNN+lNmzZprbV+44039Ne+9rVchZlX0rXZHXfckcvQ8lpnZ6desmSJ
bmpq0lprffjwYb1ixQodi8X0unXr9FNPPaW11nrv3r36U5/6lDZNM5fh5oV0bSZ97bhf/OIXeufOnXr+/Pl6//79yfvT9Svpc/mvsbFR/+xnP0t+39bWpr/xjW/ojo4OvWDBAt3W1qa11vqVV17RX/jCF3IVZsGcR1PF+fbbb+uHH344edz999+v77rrrlyFqbVO36Zaa71//35900036fnz5+cqxKR0sd533316z549yWN37tyZkxi1Th3nSG2dC4XyeSFVnIcPH9a33XZb8rinn34655+h07Wp1lq3t7frf/u3f9Pz58/XR48eHdPfLVNJC4hSioqKCtra2gBoa2tDKUVnZyfbtm3joosuAuCCCy5gx44dNDc35zLcvJCqzQACgQAbN27k/vvv59FHHyUYDOYy1Lxy9OhRLBYLU6dOBWDmzJk0Nzeze/duXnrpJS6++GIA5s+fTzQa5b333stluHkhXZsBPPTQQ2zcuJEHHniAzs7OXIaaU1/60pdYunTpCfen61fS5/JfdXU1N910U/L7X/7yl1xzzTU4HA5KSkqSf4Pb29uTf4NzoVDOo6niPPfcc/nXf/3X5HG1tbU5P9enO8/G43EefvhhvvOd7+QyxKRUsYZCId58800++OADHnroIdavX095eXnexZmurXOlUD4vpIqzvb2dBx54IHlcPvyfGunzxMaNG7n11lvH5XfLVNIC88Mf/pDVq1ezefNm9uzZw6OPPkpjYyNutxuHwwGA3W6nuLiYhoaGZKeazIZrM4DLLruMT37yk3g8Hn7yk59wxx13JH822dXV1WGz2Xj//fc588wzeffdd4lEIhw7doze3t5BJ8zy8nLq6+s566yzchhx7qVrs8suu4yzzjqLyspKXn75Zb7xjW/w7LPPYhhybQ6gq6srZb+aM2eO9LkCE4/H2bp1KzfffDOGYfDYY49xxx13cMYZZ7B3716+//3v5zS+QjmPDhfn0CRgy5YtfPnLX85JfAOlOs8+8cQTXHvttXi93hxHeNxwsTY0NHD48GGUUtx6663s2bOHr371q/zv//7voOnRuY4z3f25UiifF9LFOTCeN998k+uuuy7r8Q2ULtajR4+yZMmSsZ0+OoAkhgUkFApx88038+CDD3LWWWexZ88eNm7cyO23357r0PJWqjZ78skn+dznPpc87vOf/zz/+Z//STgcTn4wmMw8Hg9PPfUUzzzzDJs2baKiooK6ujri8XiuQ8tbqdrM6/Umr5gCfOYzn+Guu+7i8OHDzJkzJ4cRCzE+Xn/9dS699FIMw6C1tZXvfOc7/M///A8zZsxg06ZN/PjHP+ahhx7KSWyFch5Nd+6yWhMf3Z577jnmzZvH5ZdfnpexrlmzhubmZr75zW9SX1+f0xj7jfT+X3nllQAsWrQIp9PJzp07k6PI+RDn448/PmK/yLZC+byQ7hzdb8uWLfT29nLDDTfkMNL0bfr666/zyCOPjNvvlsSwgOzbt4/u7u7klY1FixZx4MABotEogUAgmdREIhF6enqYPn16jiPOvVRt9uc//5na2lpqamoAsNlsmKYpieEACxYsYP369QCYpsmPfvQjlixZgsfjob29neLiYiAxNUz6WsJwbTZnzhwOHjyYTAKVUlitVsLhcC5DzStTpkxJ2a/S/Uzkp1/96lf8x3/8BwA7duygvLw8WcTloosu4tvf/jarV6/OyUhcoZxHU8W5e/duli5dyq9//Wvq6+u57bbbchLfQKli3bx5M36/n7Vr1+L3+wFYu3YtF198cc6S2VSxxmIxgEGzOOx2e87+TqeK84MPPkjbL3KlUD4vpDpHA2zbto3XXnuN+++/P+fTc2H4WEOhEJD4f9Tv4Ycf5vzzz+eaa64Zk98r85gKSG1tLZFIhKamJiAxt9zn8zFt2jSWL1+erHq4fft2li5dKtNISd1mkJij3e/tt9/mYx/7WPKPl4ANGzZgmiYAb7zxBmeffTYzZsxg5cqVbNmyBYAP
P/wQi8XCkiVLchlq3kjVZgNHI3bv3o3X66Wuri5XYealdP1K+lzhOHDgAKWlpckqmXPmzOHYsWMEAoHkz10uF6WlpTmJr1DOo6ninDp1Ks8++ywNDQ2sWrUKSPzdyaVUsX7+85/ngQceYP369clY169fn9MRznTtumzZMt55553k/a2trTmbrp4qzunTp6eMP5cK5fNCqjjfeOMNfv/737N+/XosFkvO/0/B8LFec801PPLII6xfvz6ZNK5atWrMkkIApbXWY/ZsYtz99re/5de//jVz5szho48+4sorr+SLX/wi9fX13HfffVRWVtLS0pIXpXbzxXBtduWVV7JmzRrcbjcej4fGxkZuv/12mdo3wK233ko4HKayspJQKMTq1aspLS1NlkcvKSnh2LFj3HLLLSxevDjX4eaFVG125513EolEKC8v58iRI3zrW9+atG327rvv8vLLL/PMM8/w2c9+liuvvJLLL788bb+SPlc4NmzYwMqVKwe9P08//TTbtm1j5syZ7N+/nxtuuIFPfOITOYuxUM6jw8U5e/ZsbrjhhkHbU3i9Xl599dWcxQmp2xQSo8bPPfccL7zwAl/5yle47rrrmDdvXt7FWl9fzwMPPEB1dTWNjY18+ctfZvny5XkXZ7q2zpVC+bwwXJy9vb1cddVVFBcXJ0cKfT5fstBLPsXaf0HtwIEDPPPMMzzzzDN87nOf44tf/CLnnHPOmPxeSQyFEEIIIYQQYpKTqaRCCCGEEEIIMclJYiiEEEIIIYQQk5wkhkIIIYQQQggxyUliKIQQQgghhBCTnCSGQgghhBBCCDHJSWIoRAH5v//7P/7+7/+eBQsWcP3119PV1ZXrkIQQQoick/OjEKdOtqsQosD88Y9/5Ktf/Sp/+ctfsFqtuQ5HCCGEyAtyfhTi1MiIoRBCCCGEEEJMcnI5RYgJZPfu3TzwwANorVFKsXr1ahYvXgzAD3/4Q7Zu3Yrdbqe8vJy77rqLqqoqnn32WZ5//nlcLhdOp5PVq1czd+7cHL8SIYQQYuzI+VGIkUliKMQE4fP5uOmmm3j00Uc577zz2LFjBzfddBObNm2ipaWF3/72t7zyyisopfj3f/93Dh48iMfj4Qc/+EHyhPjzn/+cXbt2yYlPCCHEhCHnRyEyI1NJhZgg3njjDbxeL+eddx4AZ599NiUlJbz++ut4PB7a2tp47bXXiEaj3HbbbSxbtgyLxQLACy+8QDAY5Ctf+Qqf+cxncvkyhBBCiDEl50chMiOJoRATRFNTE2VlZYPuKysro6mpierqan7605/y4osvcumll/KDH/yAaDSK0+nk6aef5p133mHFihWsXbuW3t7eHL0CIYQQYuzJ+VGIzEhiKMQE4PP5mDZtGh0dHYPu7+joYNq0aQSDQU477TR+/OMf88ILL/Dee+/xxBNPEI1GKS8v58EHH+TVV1+lu7ubjRs35uhVCCGEEGNLzo9CZE4SQyEmgO7ubl577TX8fj9/+tOfANi5cyfd3d1cdtll7N69m0cffRSAyspK5syZQzwep7m5mbvvvhuAoqIizjjjDOLxeM5ehxBCCDGW5PwoROak+IwQBWTHjh089thjAKxatQqlFADBYJDy8nL+67/+i40bN2KaJkopnnjiCYqLi6mrq6OlpYXrr7+eWCxGRUUFd955J1arlZKSEq677joMw8DhcLBhw4ZcvkQhhBBi1OT8KMSpkw3uhRBCCCGEEGKSk6mkQgghhBBCCDHJSWIohBBCCCGEEJOcJIZCCCGEEEIIMclJYiiEEEIIIYQQk5wkhkIIIYQQQggxyUliKIQQQgghhBCTnCSGQgghhBBCCDHJSWIohBBCCCGEEJOcJIZCCCGEEEIIMcn9f3v9tDqYwuksAAAAAElFTkSuQmCC
" />
</div>
</div>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>Figure 4</strong>. NDCG@100 (n100), Recall@20 (r20) and Recall@50 (r50) plotted against the loss for all the experiments I ran</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p>Before I wrap up this exercise, I want to emphasize a result that I have already discussed in the past (see <a href="https://github.com/jrzaurin/RecoTour/blob/master/Amazon/neural_graph_cf/Chapter06_results_summary.ipynb">here</a>) and that is illustrated in Fig 4.</p>
<p>Fig 4 shows that, in general, the best ranking metrics do not correspond to the best loss values. Even though the reconstruction of the input matrix of clicks might be worse, the ranking metrics might still improve. This is an important and not uncommon result, and something one has to bear in mind when building real-world recommendation systems. When building recommendation algorithms we are not interested in achieving the best classification/regression loss, but in producing the best recommendations, which is more closely related to information retrieval effectiveness, and therefore to ranking metrics. For more information on this and many other aspects of recommendation systems, I recommend this <a href="https://www.amazon.co.uk/Recommender-Systems-Textbook-Charu-Aggarwal/dp/3319296574/ref=sr_1_1?crid=2SK7PGNMA59FW&keywords=recommender+systems&qid=1559762483&s=gateway&sprefix=recommender+syste%2Caps%2C153&sr=8-1">fantastic book</a>. Chapter 7 of that book focuses on evaluation metrics.</p>
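<p>To make the distinction concrete, the ranking metrics in Fig 4 are computed per user from the ranked list of item scores and the set of held-out clicks, then averaged over users. Below is a minimal, self-contained sketch (toy data, not the code used in these experiments) of binary-relevance Recall@k and NDCG@k, where Recall@k is normalized by min(k, number of relevant items), following the convention in [1]:</p>

```python
import math


def recall_at_k(ranked_items, relevant_items, k):
    """Fraction of the held-out (relevant) items retrieved in the top-k,
    normalized by min(k, |relevant|)."""
    hits = len(set(ranked_items[:k]) & relevant_items)
    return hits / min(k, len(relevant_items))


def ndcg_at_k(ranked_items, relevant_items, k):
    """Binary-relevance NDCG@k: discounted gain of the actual ranking
    divided by the gain of the ideal ranking (all hits at the top)."""
    dcg = sum(
        1.0 / math.log2(pos + 2)  # positions are 0-indexed, so +2
        for pos, item in enumerate(ranked_items[:k])
        if item in relevant_items
    )
    idcg = sum(1.0 / math.log2(pos + 2) for pos in range(min(k, len(relevant_items))))
    return dcg / idcg


# Toy example: 10 items ranked by predicted score; items 1, 5 and 8
# are the user's held-out clicks
ranked = [3, 1, 7, 5, 2, 9, 0, 4, 6, 8]
relevant = {1, 5, 8}
print(recall_at_k(ranked, relevant, 5))  # 2 of 3 held-out items in the top-5 -> 0.666...
print(ndcg_at_k(ranked, relevant, 5))
```

<p>Note that two rankings with the same Recall@k can have different NDCG@k, since NDCG also rewards placing the hits higher up, which is one reason these metrics can move independently of the reconstruction loss.</p>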
<p>And with this, I conclude my experimentation around the <code>Mult-VAE</code> with <code>Pytorch</code> and <code>Mxnet</code>.</p>
<p>The next, most immediate projects I want to add to the repo are:</p>
<ol>
<li><a href="https://arxiv.org/pdf/1811.09975.pdf">Sequential Variational Autoencoders for Collaborative Filtering</a> [7]</li>
<li><a href="https://arxiv.org/pdf/2002.02126.pdf">LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation</a> [8]</li>
</ol>
<p>But before that, I will be revisiting the nlp-stuff repo and also updating the pytorch-widedeep package.</p>
<p>If you managed to read all of that, I hope you found it useful.</p>
</div>
</div>
</div>
<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>References</strong>:</p>
<p>[1] Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, Tony Jebara, 2018. Variational Autoencoders for Collaborative Filtering: <a href="https://arxiv.org/pdf/1802.05814.pdf">arXiv:1802.05814v1</a></p>
<p>[2] Diederik P Kingma, Max Welling, 2014. Auto-Encoding Variational Bayes: <a href="https://arxiv.org/pdf/1312.6114.pdf">arXiv:1312.6114v10</a></p>
<p>[3] Maurizio Ferrari Dacrema, Paolo Cremonesi, Dietmar Jannach. Are We Really Making Much Progress? A Worrying Analysis of Recent Neural Recommendation Approaches: <a href="https://arxiv.org/pdf/1907.06902.pdf">arXiv:1907.06902v3</a></p>
<p>[4] J. McAuley, C. Targett, J. Shi, A. van den Hengel. 2015. Image-based recommendations on styles and substitutes. <a href="https://arxiv.org/pdf/1506.04757.pdf">arXiv:1506.04757v1</a></p>
<p>[5] R. He, J. McAuley, 2016. Modeling the visual evolution of fashion trends with one-class collaborative filtering. <a href="https://arxiv.org/pdf/1602.01585.pdf">arXiv:1602.01585v1</a></p>
<p>[6] Samuel R. Bowman, Luke Vilnis, Oriol Vinyals, Andrew M. Dai, Rafal Jozefowicz, Samy Bengio, 2016. Generating Sentences from a Continuous Space: <a href="https://arxiv.org/pdf/1511.06349.pdf">arXiv:1511.06349v4</a></p>
<p>[7] Noveen Sachdeva, Giuseppe Manco, Ettore Ritacco, Vikram Pudi, 2018. Sequential Variational Autoencoders for Collaborative Filtering: <a href="https://arxiv.org/pdf/1811.09975.pdf">arXiv:1811.09975v1</a></p>
<p>[8] Xiangnan He, Kuan Deng, Xiang Wang, Yan Li, Yongdong Zhang, Meng Wang, 2020. LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation: <a href="https://arxiv.org/pdf/2002.02126.pdf">arXiv:2002.02126v2</a></p>
</div>
</div>
</div>
</div>Javier Rodriguez