11. Introduction. Pretrained models.#

PyTorch offers a good number of pretrained models in the TorchVision module, ready to be used with the weights they were assigned during training. In this module we will look at the ones PyTorch provides for computer vision, but above all we will focus on two of them.

These pretrained image models are based on a large collection of images that can be found at this link. ImageNet is composed of a total of fourteen million images and is maintained by Stanford University. All the images are labeled with a hierarchy of nouns similar to the one used by WordNet (https://wordnet.princeton.edu/).

The pretrained models can be found in torchvision.models, so we proceed to load that module:

import torch
torch.__version__
'2.0.1+cu117'
from torchvision import models

import torch.fx

We can see the list of available pretrained models with the following instruction:

dir(models)
['AlexNet',
 'AlexNet_Weights',
 'ConvNeXt',
 'ConvNeXt_Base_Weights',
 'ConvNeXt_Large_Weights',
 'ConvNeXt_Small_Weights',
 'ConvNeXt_Tiny_Weights',
 'DenseNet',
 'DenseNet121_Weights',
 'DenseNet161_Weights',
 'DenseNet169_Weights',
 'DenseNet201_Weights',
 'EfficientNet',
 'EfficientNet_B0_Weights',
 'EfficientNet_B1_Weights',
 'EfficientNet_B2_Weights',
 'EfficientNet_B3_Weights',
 'EfficientNet_B4_Weights',
 'EfficientNet_B5_Weights',
 'EfficientNet_B6_Weights',
 'EfficientNet_B7_Weights',
 'EfficientNet_V2_L_Weights',
 'EfficientNet_V2_M_Weights',
 'EfficientNet_V2_S_Weights',
 'GoogLeNet',
 'GoogLeNetOutputs',
 'GoogLeNet_Weights',
 'Inception3',
 'InceptionOutputs',
 'Inception_V3_Weights',
 'MNASNet',
 'MNASNet0_5_Weights',
 'MNASNet0_75_Weights',
 'MNASNet1_0_Weights',
 'MNASNet1_3_Weights',
 'MaxVit',
 'MaxVit_T_Weights',
 'MobileNetV2',
 'MobileNetV3',
 'MobileNet_V2_Weights',
 'MobileNet_V3_Large_Weights',
 'MobileNet_V3_Small_Weights',
 'RegNet',
 'RegNet_X_16GF_Weights',
 'RegNet_X_1_6GF_Weights',
 'RegNet_X_32GF_Weights',
 'RegNet_X_3_2GF_Weights',
 'RegNet_X_400MF_Weights',
 'RegNet_X_800MF_Weights',
 'RegNet_X_8GF_Weights',
 'RegNet_Y_128GF_Weights',
 'RegNet_Y_16GF_Weights',
 'RegNet_Y_1_6GF_Weights',
 'RegNet_Y_32GF_Weights',
 'RegNet_Y_3_2GF_Weights',
 'RegNet_Y_400MF_Weights',
 'RegNet_Y_800MF_Weights',
 'RegNet_Y_8GF_Weights',
 'ResNeXt101_32X8D_Weights',
 'ResNeXt101_64X4D_Weights',
 'ResNeXt50_32X4D_Weights',
 'ResNet',
 'ResNet101_Weights',
 'ResNet152_Weights',
 'ResNet18_Weights',
 'ResNet34_Weights',
 'ResNet50_Weights',
 'ShuffleNetV2',
 'ShuffleNet_V2_X0_5_Weights',
 'ShuffleNet_V2_X1_0_Weights',
 'ShuffleNet_V2_X1_5_Weights',
 'ShuffleNet_V2_X2_0_Weights',
 'SqueezeNet',
 'SqueezeNet1_0_Weights',
 'SqueezeNet1_1_Weights',
 'SwinTransformer',
 'Swin_B_Weights',
 'Swin_S_Weights',
 'Swin_T_Weights',
 'Swin_V2_B_Weights',
 'Swin_V2_S_Weights',
 'Swin_V2_T_Weights',
 'VGG',
 'VGG11_BN_Weights',
 'VGG11_Weights',
 'VGG13_BN_Weights',
 'VGG13_Weights',
 'VGG16_BN_Weights',
 'VGG16_Weights',
 'VGG19_BN_Weights',
 'VGG19_Weights',
 'ViT_B_16_Weights',
 'ViT_B_32_Weights',
 'ViT_H_14_Weights',
 'ViT_L_16_Weights',
 'ViT_L_32_Weights',
 'VisionTransformer',
 'Weights',
 'WeightsEnum',
 'Wide_ResNet101_2_Weights',
 'Wide_ResNet50_2_Weights',
 '_GoogLeNetOutputs',
 '_InceptionOutputs',
 '__builtins__',
 '__cached__',
 '__doc__',
 '__file__',
 '__loader__',
 '__name__',
 '__package__',
 '__path__',
 '__spec__',
 '_api',
 '_meta',
 '_utils',
 'alexnet',
 'convnext',
 'convnext_base',
 'convnext_large',
 'convnext_small',
 'convnext_tiny',
 'densenet',
 'densenet121',
 'densenet161',
 'densenet169',
 'densenet201',
 'detection',
 'efficientnet',
 'efficientnet_b0',
 'efficientnet_b1',
 'efficientnet_b2',
 'efficientnet_b3',
 'efficientnet_b4',
 'efficientnet_b5',
 'efficientnet_b6',
 'efficientnet_b7',
 'efficientnet_v2_l',
 'efficientnet_v2_m',
 'efficientnet_v2_s',
 'get_model',
 'get_model_builder',
 'get_model_weights',
 'get_weight',
 'googlenet',
 'inception',
 'inception_v3',
 'list_models',
 'maxvit',
 'maxvit_t',
 'mnasnet',
 'mnasnet0_5',
 'mnasnet0_75',
 'mnasnet1_0',
 'mnasnet1_3',
 'mobilenet',
 'mobilenet_v2',
 'mobilenet_v3_large',
 'mobilenet_v3_small',
 'mobilenetv2',
 'mobilenetv3',
 'optical_flow',
 'quantization',
 'regnet',
 'regnet_x_16gf',
 'regnet_x_1_6gf',
 'regnet_x_32gf',
 'regnet_x_3_2gf',
 'regnet_x_400mf',
 'regnet_x_800mf',
 'regnet_x_8gf',
 'regnet_y_128gf',
 'regnet_y_16gf',
 'regnet_y_1_6gf',
 'regnet_y_32gf',
 'regnet_y_3_2gf',
 'regnet_y_400mf',
 'regnet_y_800mf',
 'regnet_y_8gf',
 'resnet',
 'resnet101',
 'resnet152',
 'resnet18',
 'resnet34',
 'resnet50',
 'resnext101_32x8d',
 'resnext101_64x4d',
 'resnext50_32x4d',
 'segmentation',
 'shufflenet_v2_x0_5',
 'shufflenet_v2_x1_0',
 'shufflenet_v2_x1_5',
 'shufflenet_v2_x2_0',
 'shufflenetv2',
 'squeezenet',
 'squeezenet1_0',
 'squeezenet1_1',
 'swin_b',
 'swin_s',
 'swin_t',
 'swin_transformer',
 'swin_v2_b',
 'swin_v2_s',
 'swin_v2_t',
 'vgg',
 'vgg11',
 'vgg11_bn',
 'vgg13',
 'vgg13_bn',
 'vgg16',
 'vgg16_bn',
 'vgg19',
 'vgg19_bn',
 'video',
 'vision_transformer',
 'vit_b_16',
 'vit_b_32',
 'vit_h_14',
 'vit_l_16',
 'vit_l_32',
 'wide_resnet101_2',
 'wide_resnet50_2']
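
The listing above also includes helper functions such as list_models, get_model and get_model_weights. As a minimal sketch (assuming torchvision 0.14 or later, which matches the versions used here), the same catalogue can be queried as a plain list of names:

from torchvision import models

# list_models() returns the names of all registered model builders
# as lowercase strings
all_models = models.list_models()
print(len(all_models))   # number of registered builders
print(all_models[:5])    # first few names, alphabetically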

Among these models, we will focus on AlexNet and ResNet.

11.1. AlexNet.#

The architecture of this model can be seen in the following figure.

In this figure, the input images enter from the left and pass through five stacks of filters, each of which produces a set of output images. After each filter, the images are reduced in size, as indicated. The images produced by the last stack of filters are flattened into a 1D vector of 4,096 elements and classified to produce 1,000 output probabilities, one for each output class.

To load this model we create an instance of the class:

alexnet = models.AlexNet()

In this way we would have the AlexNet model up and running and ready to use (note, though, that instantiating the class directly initializes it with random weights, not with the pretrained ones).
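
To check that the network behaves as described, a minimal sketch (the 224×224 input size and the random batch are illustrative assumptions) can push a dummy tensor through the model and inspect the output shape:

import torch
from torchvision import models

alexnet = models.AlexNet()          # architecture only: weights are randomly initialized
x = torch.randn(1, 3, 224, 224)     # dummy batch: one 3-channel 224x224 image
with torch.no_grad():
    out = alexnet(x)
print(out.shape)                    # torch.Size([1, 1000]): one score per output class

The 1,000-element output matches the 1,000 output classes mentioned above.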

11.2. ResNet.#

Using the resnet101 function, we will now instantiate a 101-layer convolutional neural network. To put things in perspective, before the arrival of residual networks in 2015, achieving stable training at such depths was considered extremely difficult. Residual networks made it possible, and in doing so they beat several benchmarks in a single sweep that year.
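
The key idea behind these networks is the skip connection: each block learns a residual that is added back to its input, which lets gradients flow through very deep stacks. Below is a minimal sketch of such a block (a simplified illustration, not the exact torchvision implementation):

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Simplified residual block: output = relu(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        residual = self.relu(self.conv1(x))
        residual = self.conv2(residual)
        return self.relu(residual + x)   # the skip connection adds the input back

block = ResidualBlock(64)
y = block(torch.randn(1, 64, 56, 56))    # shape is preserved: (1, 64, 56, 56)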

To get this type of neural network up and running, we only need to run the following code:

resnet = models.resnet101(pretrained=True)
D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.
  warnings.warn(
D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet101_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet101_Weights.DEFAULT` to get the most up-to-date weights.
  warnings.warn(msg)
Downloading: "https://download.pytorch.org/models/resnet101-63fe2227.pth" to C:\Users\Francisco/.cache\torch\hub\checkpoints\resnet101-63fe2227.pth
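
As the warnings above indicate, the pretrained argument has been deprecated since torchvision 0.13. A minimal sketch of the equivalent call, using the weights enum that the warning itself suggests:

from torchvision.models import resnet101, ResNet101_Weights

# Equivalent to the old pretrained=True behavior, per the warning above
resnet = resnet101(weights=ResNet101_Weights.IMAGENET1K_V1)

# Or, to get the most up-to-date weights available:
# resnet = resnet101(weights=ResNet101_Weights.DEFAULT)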
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.2%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.3%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.4%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.5%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.6%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.7%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.8%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
42.9%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.0%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.1%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.2%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.3%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.4%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.5%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.6%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.7%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.8%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
43.9%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.0%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.1%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.2%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.3%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.4%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.5%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.6%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.7%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.8%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
44.9%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.0%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.1%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.2%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.3%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.4%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.5%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.6%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.7%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.8%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
45.9%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.0%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.1%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.2%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.3%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.4%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.5%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.6%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.7%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.8%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
46.9%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.0%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.1%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.2%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.3%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.4%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.5%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.6%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.7%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.8%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
47.9%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.0%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.1%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.2%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.3%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.4%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.5%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.6%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.7%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.8%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
48.9%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.0%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.1%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.2%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.3%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.4%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.5%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.6%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.7%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.8%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
49.9%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.0%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.1%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.2%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.3%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.4%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.5%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.6%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.7%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.8%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
50.9%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.0%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.1%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.2%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.3%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.4%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.5%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.6%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.7%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.8%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
51.9%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.0%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.1%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.2%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.3%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.4%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.5%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.6%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.7%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.8%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
52.9%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.0%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.1%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.2%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.3%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.4%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.5%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.6%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.7%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.8%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
53.9%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.0%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.1%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.2%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.3%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.4%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.5%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.6%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.7%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.8%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
54.9%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.0%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.1%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.2%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.3%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.4%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.5%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.6%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.7%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.8%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
55.9%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.0%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.1%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.2%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.3%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.4%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.5%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.6%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.7%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.8%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
56.9%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.0%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.1%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.2%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.3%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.4%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.5%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.6%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.7%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.8%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
57.9%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.0%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.1%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.2%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.3%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.4%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.5%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.6%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.7%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.8%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
58.9%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.0%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.1%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.2%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.3%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.4%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.5%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.6%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.7%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.8%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
59.9%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.0%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.1%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.2%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.3%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.4%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.5%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.6%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.7%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.8%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
60.9%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.0%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.1%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.2%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.3%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.4%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.5%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.6%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.7%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.8%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
61.9%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.0%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.1%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.2%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.3%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.4%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.5%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.6%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.7%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.8%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
62.9%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.0%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.1%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.2%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.3%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.4%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.5%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.6%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.7%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.8%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
63.9%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.0%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.1%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.2%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.3%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.4%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.5%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.6%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.7%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.8%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
64.9%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.0%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.1%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.2%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.3%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.4%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.5%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.6%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.7%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.8%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
65.9%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.0%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.1%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.2%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.3%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.4%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.5%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.6%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.7%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.8%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
66.9%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.0%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.1%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.2%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.3%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.4%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.5%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.6%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.7%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.8%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
67.9%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.0%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.1%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.2%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.3%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.4%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.5%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.6%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.7%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.8%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
68.9%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.0%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.1%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.2%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.3%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.4%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.5%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.6%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.7%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.8%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
69.9%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.0%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.1%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.2%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.3%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.4%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.5%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.6%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.7%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.8%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
70.9%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.0%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.1%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.2%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.3%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.4%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.5%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.6%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.7%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.8%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
71.9%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.0%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.1%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.2%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.3%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.4%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.5%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.6%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.7%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.8%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
72.9%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.0%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.1%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.2%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.3%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.4%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.5%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.6%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.7%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.8%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
73.9%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.0%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.1%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.2%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.3%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.4%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.5%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.6%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.7%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.8%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
74.9%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.0%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.1%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.2%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.3%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.4%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.5%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.6%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.7%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.8%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
75.9%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.0%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.1%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.2%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.3%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.4%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.5%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.6%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.7%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.8%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
76.9%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.0%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.1%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.2%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.3%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.4%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.5%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.6%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.7%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.8%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
77.9%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.0%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.1%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.2%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.3%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.4%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.5%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.6%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.7%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.8%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
78.9%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.0%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.1%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.2%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.3%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.4%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.5%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.6%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.7%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.8%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
79.9%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.0%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.1%
80.2%
80.2%
80.2%
80.2%
80.2%
80.2%
80.2%
[... weight-download progress output elided (80.2% ... 93.7%); the download was interrupted by hand, producing the traceback below ...]

---------------------------------------------------------------------------
KeyboardInterrupt                         Traceback (most recent call last)
Cell In[5], line 1
----> 1 resnet = models.resnet101(pretrained=True)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\_utils.py:142, in kwonly_to_pos_or_kw.<locals>.wrapper(*args, **kwargs)
    135     warnings.warn(
    136         f"Using {sequence_to_str(tuple(keyword_only_kwargs.keys()), separate_last='and ')} as positional "
    137         f"parameter(s) is deprecated since 0.13 and may be removed in the future. Please use keyword parameter(s) "
    138         f"instead."
    139     )
    140     kwargs.update(keyword_only_kwargs)
--> 142 return fn(*args, **kwargs)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\_utils.py:228, in handle_legacy_interface.<locals>.outer_wrapper.<locals>.inner_wrapper(*args, **kwargs)
    225     del kwargs[pretrained_param]
    226     kwargs[weights_param] = default_weights_arg
--> 228 return builder(*args, **kwargs)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\resnet.py:795, in resnet101(weights, progress, **kwargs)
    769 """ResNet-101 from `Deep Residual Learning for Image Recognition <https://arxiv.org/pdf/1512.03385.pdf>`__.
    770 
    771 .. note::
   (...)
    791     :members:
    792 """
    793 weights = ResNet101_Weights.verify(weights)
--> 795 return _resnet(Bottleneck, [3, 4, 23, 3], weights, progress, **kwargs)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\resnet.py:301, in _resnet(block, layers, weights, progress, **kwargs)
    298 model = ResNet(block, layers, **kwargs)
    300 if weights is not None:
--> 301     model.load_state_dict(weights.get_state_dict(progress=progress))
    303 return model

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torchvision\models\_api.py:89, in WeightsEnum.get_state_dict(self, progress)
     88 def get_state_dict(self, progress: bool) -> Mapping[str, Any]:
---> 89     return load_state_dict_from_url(self.url, progress=progress)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torch\hub.py:746, in load_state_dict_from_url(url, model_dir, map_location, progress, check_hash, file_name)
    744         r = HASH_REGEX.search(filename)  # r is Optional[Match[str]]
    745         hash_prefix = r.group(1) if r else None
--> 746     download_url_to_file(url, cached_file, hash_prefix, progress=progress)
    748 if _is_legacy_zip_format(cached_file):
    749     return _legacy_zip_load(cached_file, model_dir, map_location)

File D:\MisTrabajos\Pytorch\pytorch\lib\site-packages\torch\hub.py:633, in download_url_to_file(url, dst, hash_prefix, progress)
    630 with tqdm(total=file_size, disable=not progress,
    631           unit='B', unit_scale=True, unit_divisor=1024) as pbar:
    632     while True:
--> 633         buffer = u.read(8192)
    634         if len(buffer) == 0:
    635             break

File D:\programas\Anaconda\envs\py310\lib\http\client.py:466, in HTTPResponse.read(self, amt)
    463 if self.length is not None and amt > self.length:
    464     # clip the read to the "end of response"
    465     amt = self.length
--> 466 s = self.fp.read(amt)
    467 if not s and amt:
    468     # Ideally, we would raise IncompleteRead if the content-length
    469     # wasn't satisfied, but it might break compatibility.
    470     self._close_conn()

File D:\programas\Anaconda\envs\py310\lib\socket.py:705, in SocketIO.readinto(self, b)
    703 while True:
    704     try:
--> 705         return self._sock.recv_into(b)
    706     except timeout:
    707         self._timeout_occurred = True

File D:\programas\Anaconda\envs\py310\lib\ssl.py:1274, in SSLSocket.recv_into(self, buffer, nbytes, flags)
   1270     if flags != 0:
   1271         raise ValueError(
   1272           "non-zero flags not allowed in calls to recv_into() on %s" %
   1273           self.__class__)
-> 1274     return self.read(nbytes, buffer)
   1275 else:
   1276     return super().recv_into(buffer, nbytes, flags)

File D:\programas\Anaconda\envs\py310\lib\ssl.py:1130, in SSLSocket.read(self, len, buffer)
   1128 try:
   1129     if buffer is not None:
-> 1130         return self._sslobj.read(len, buffer)
   1131     else:
   1132         return self._sslobj.read(len)

KeyboardInterrupt: 

As we can see, this network takes a while to load; bear in mind that it carries no fewer than 44.5 million parameters.
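Note in passing that the pretrained=True argument has been deprecated since torchvision 0.13 (the traceback above even routes through the handle_legacy_interface wrapper); the current API takes an explicit weights enum. A minimal sketch of the modern call, together with a quick way to verify the parameter count, might be:

from torchvision import models

# Current API (torchvision >= 0.13): pass an explicit weights enum
resnet = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V1)

# Count the learnable parameters: about 44.5 million for ResNet-101
n_params = sum(p.numel() for p in resnet.parameters())
print(f"{n_params / 1e6:.1f}M parameters")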

Let's now look at the structure of this network.

resnet
ResNet(
  (conv1): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (relu): ReLU(inplace=True)
  (maxpool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
  (layer1): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer2): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (3): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer3): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (3): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (4): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (5): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (6): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (7): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (8): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (9): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (10): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (11): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (12): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (13): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (14): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (15): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (16): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (17): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (18): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (19): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (20): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (21): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (22): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer4): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (avgpool): AdaptiveAvgPool2d(output_size=(1, 1))
  (fc): Linear(in_features=2048, out_features=1000, bias=True)
)

What we see here are modules, one per line. Note that they have nothing in common with Python modules: they are individual operations, the building blocks of a neural network. In other deep learning frameworks they are also called layers.

If we scroll down, we see a long run of Bottleneck blocks repeated one after another (33 in all; together with the stem convolution and the final fully connected layer, their 99 convolutions make up the 101 weight layers that give the network its name), each containing convolutions and other modules. That is the anatomy of a typical deep neural network for computer vision: a more or less sequential cascade of filters and nonlinearities, ending in a layer (fc) that produces a score for each of the 1,000 output classes (out_features).
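As a quick check (a minimal sketch assuming the resnet object loaded above), we can count those blocks programmatically; the builder call visible in the traceback, _resnet(Bottleneck, [3, 4, 23, 3], ...), already hints at the answer:

from torchvision.models.resnet import Bottleneck

# 3 + 4 + 23 + 3 Bottleneck blocks across the four layer groups
n_blocks = sum(isinstance(m, Bottleneck) for m in resnet.modules())
print(n_blocks)  # 33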

PyTorch lets us define image transformations through the transforms module.

from torchvision import transforms
preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(
            mean=[0.485, 0.456, 0.406],
            std=[0.229, 0.224, 0.225]
        )])
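The mean and std values here are the per-channel statistics of the ImageNet training set, and Normalize applies (x - mean) / std to each channel. As a sanity check (a minimal sketch using a made-up solid-color image), the pipeline turns any RGB image into a normalized 3 x 224 x 224 float tensor:

from PIL import Image

# Hypothetical solid-color image, just to exercise the pipeline
dummy = Image.new("RGB", (640, 480), color=(124, 116, 104))
x = preprocess(dummy)
print(x.shape)              # torch.Size([3, 224, 224])
print(x.mean(dim=(1, 2)))   # close to zero: this color sits near the ImageNet mean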

Next, let's load an image using the Python package PIL.

from PIL import Image
img = Image.open("figuras/bobby.jpg")
img
../_images/7bc4b7c87f1e26201c1f8af20cebd5e9b749b31904da2c8289a9d97c1c434674.png

Now we apply to the image the transformation pipeline we defined above with transforms.Compose().

img_t = preprocess(img)
img_t
tensor([[[-0.6281, -0.6452, -0.6623,  ...,  0.0056, -0.0287, -0.0629],
         [-0.7137, -0.6965, -0.6965,  ...,  0.0227,  0.0227,  0.0056],
         [-0.7137, -0.7137, -0.7137,  ...,  0.0398,  0.0741,  0.0569],
         ...,
         [ 1.4440,  1.4269,  1.4783,  ...,  0.6049,  0.6221,  0.6906],
         [ 1.4269,  1.4440,  1.4783,  ...,  0.6906,  0.6734,  0.7077],
         [ 1.4612,  1.4783,  1.5297,  ...,  0.6906,  0.7248,  0.7591]],

        [[-1.2829, -1.2829, -1.2829,  ..., -0.6352, -0.6702, -0.7052],
         [-1.2654, -1.2654, -1.2654,  ..., -0.6176, -0.6527, -0.7052],
         [-1.2479, -1.2479, -1.2654,  ..., -0.6176, -0.6001, -0.6527],
         ...,
         [ 0.7829,  0.8004,  0.8704,  ..., -0.2850, -0.2675, -0.2150],
         [ 0.7654,  0.8354,  0.9055,  ..., -0.2150, -0.2150, -0.1625],
         [ 0.8004,  0.8529,  0.9230,  ..., -0.1800, -0.1275, -0.0749]],

        [[-1.4907, -1.4559, -1.4210,  ..., -1.0376, -1.0898, -1.1421],
         [-1.5081, -1.4559, -1.4210,  ..., -1.0376, -1.0898, -1.1421],
         [-1.4907, -1.4733, -1.4559,  ..., -1.0376, -1.0550, -1.0898],
         ...,
         [-0.5321, -0.5147, -0.4624,  ..., -1.3687, -1.2816, -1.1596],
         [-0.5321, -0.4798, -0.4275,  ..., -1.2816, -1.2293, -1.1073],
         [-0.4624, -0.4101, -0.3578,  ..., -1.2467, -1.1421, -1.0550]]])
# The image now looks like this
import matplotlib.pyplot as plt

plt.imshow(  img_t.permute(1, 2, 0)  )
Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).
<matplotlib.image.AxesImage at 0x284d112c2e0>
../_images/a056f357b03f92e230119f8a9bbbfcc924b178099b6e61c1dda932ea648ff773.png
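The colors look washed out because imshow clips the normalized values, which extend well outside [0, 1], back into the displayable range, as the warning above notes. To show the image with its original colors we can undo the normalization first; a small sketch:

import torch

mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)
plt.imshow((img_t * std + mean).permute(1, 2, 0))  # de-normalized, back in [0, 1]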

Next, we arrange the input tensor into the format the network expects.

import torch
batch_t = torch.unsqueeze(img_t, 0)
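The extra dimension added by torch.unsqueeze is the batch dimension: PyTorch networks operate on batches, so a single image must become a batch of one. A quick look at the shapes makes this explicit:

print(img_t.shape)    # torch.Size([3, 224, 224])     C x H x W
print(batch_t.shape)  # torch.Size([1, 3, 224, 224])  N x C x H x W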

And with that we are ready to run the model.

The process of running a trained model on new data is called inference in deep learning circles. To perform inference, we first need to put the network in evaluation mode.
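Evaluation mode matters because layers such as BatchNorm2d (and Dropout, in models that have it) behave differently during training and inference; eval() switches them to their inference behavior. The usual pattern combines it with torch.no_grad(), which disables the gradient tracking we do not need here. A minimal sketch, assuming the resnet and batch_t defined above:

resnet.eval()            # BatchNorm uses its running statistics from here on
with torch.no_grad():    # no gradients needed for inference
    out = resnet(batch_t)
print(out.shape)         # torch.Size([1, 1000]), one score per ImageNet class

The cell below makes the eval() call on its own; since eval() returns the model itself, the notebook echoes the full module structure once more.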

resnet.eval()
ResNet(
  (conv1): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False)
  (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
  (relu): ReLU(inplace=True)
  (maxpool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
  (layer1): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
        (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer2): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (3): Bottleneck(
      (conv1): Conv2d(512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer3): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (3): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (4): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (5): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (6): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (7): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (8): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (9): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (10): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (11): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (12): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (13): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (14): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (15): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (16): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (17): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (18): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (19): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (20): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (21): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (22): Bottleneck(
      (conv1): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (layer4): Sequential(
    (0): Bottleneck(
      (conv1): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
      (downsample): Sequential(
        (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      )
    )
    (1): Bottleneck(
      (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
    (2): Bottleneck(
      (conv1): Conv2d(2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv2): Conv2d(512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
      (bn2): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (conv3): Conv2d(512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
      (bn3): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
      (relu): ReLU(inplace=True)
    )
  )
  (avgpool): AdaptiveAvgPool2d(output_size=(1, 1))
  (fc): Linear(in_features=2048, out_features=1000, bias=True)
)

If we forget to do this, some layers inside pretrained models, such as batch normalization and dropout, will not produce meaningful results, simply because of the way they behave in training mode. The short sketch below illustrates the point.
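A minimal standalone illustration (hypothetical, not part of our classification pipeline): in training mode a Dropout layer randomly zeroes activations and rescales the survivors, while in eval mode it is simply the identity.

drop = torch.nn.Dropout(p=0.5)
x = torch.ones(1, 4)
drop.train()   # training mode: each activation is zeroed with probability 0.5
drop(x)        # e.g. tensor([[2., 0., 0., 2.]]) -- survivors are scaled by 1/(1-p)
drop.eval()    # inference mode: dropout becomes a no-op
drop(x)        # tensor([[1., 1., 1., 1.]])

Now that eval mode has been set, we are ready for inference: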

out = resnet(batch_t)
out
tensor([[-3.4997e+00, -1.6490e+00, -2.4391e+00, -3.2243e+00, -3.2465e+00,
         -1.3218e+00, -2.0395e+00, -2.5405e+00, -1.3043e+00, -2.8827e+00,
         -1.6696e+00, -1.2838e+00, -2.6184e+00, -2.9750e+00, -2.4380e+00,
         -2.8256e+00, -3.3083e+00, -7.9667e-01, -6.7075e-01, -1.2162e+00,
         -3.0311e+00, -3.9593e+00, -2.2631e+00, -1.0843e+00, -9.7915e-01,
         -1.0742e+00, -3.0908e+00, -2.4751e+00, -2.2153e+00, -3.1932e+00,
         -3.2964e+00, -1.8507e+00, -2.0642e+00, -2.1202e+00, -1.8665e+00,
         -3.2375e+00, -1.1210e+00, -1.1321e+00, -1.1657e+00, -9.0362e-01,
         -4.5209e-01, -1.4986e+00,  1.4366e+00,  1.2994e-01, -1.8379e+00,
         -1.4815e+00,  9.7278e-01, -9.3662e-01, -3.0276e+00, -2.7341e+00,
         -2.5960e+00, -2.0591e+00, -1.8170e+00, -1.9437e+00, -1.7875e+00,
         -1.3029e+00, -4.5200e-01, -2.0560e+00, -3.2882e+00, -4.7583e-01,
         -3.6261e-01, -1.1650e+00, -7.3942e-01, -1.4489e+00, -1.5039e+00,
         -2.1096e+00, -1.7095e+00, -4.1315e-01, -1.9146e+00, -1.5095e+00,
         -1.2012e+00, -1.3111e+00, -1.0662e+00,  1.2968e-01, -3.8648e-01,
         -2.4670e-01, -8.7028e-01,  5.9567e-01, -8.8329e-01,  1.4535e+00,
         -2.6384e+00, -3.6589e+00, -2.3378e-01, -4.9410e-02, -2.2602e+00,
         -2.3605e+00, -1.4038e+00,  2.4098e-01, -1.0531e+00, -2.9106e+00,
         -2.5321e+00, -2.2253e+00,  4.4062e-01, -1.3269e+00, -2.0447e-02,
         -2.8696e+00, -5.5842e-01, -1.3959e+00, -2.9274e+00, -1.9038e+00,
         -4.2433e+00, -2.9701e+00, -2.0552e+00, -2.4290e+00, -2.7691e+00,
         -4.0207e+00, -3.6511e+00, -4.8903e-01, -9.8899e-01, -1.8995e+00,
         -3.5459e+00, -1.4606e+00,  1.1054e+00, -6.9230e-01, -7.4877e-01,
         -2.1745e+00, -2.2471e+00, -5.3973e-01, -1.5121e+00, -2.5998e+00,
         -3.9889e+00, -9.7558e-01, -5.5630e-01, -8.6530e-01, -1.2110e+00,
         -7.3269e-01, -1.0847e+00, -2.0944e+00, -4.0080e+00, -3.3433e-01,
         -2.6919e+00, -2.9908e+00, -1.8793e+00, -1.8126e+00, -1.2953e+00,
         -2.2093e+00, -2.0246e+00, -3.1878e+00, -3.2739e+00, -2.7969e+00,
         -4.1270e-01, -3.7697e+00, -2.4744e+00, -2.6133e+00, -2.7514e+00,
         -2.6512e+00, -3.2751e+00, -4.3150e+00, -4.2214e+00, -3.5920e+00,
         -1.2186e+00,  2.3577e+00,  1.9648e+00,  2.2958e+00,  5.0050e+00,
          1.0229e+00,  2.9914e+00,  8.8864e-01,  2.2502e+00,  6.1697e+00,
          4.2833e+00,  4.5931e+00,  7.4686e+00,  4.3448e+00,  4.8805e+00,
          5.8076e+00,  3.9792e+00,  3.5555e+00,  9.5253e+00,  1.1172e+00,
          3.3158e+00,  1.9011e+00, -5.1524e-01,  1.3985e+00,  1.8150e+00,
          5.6135e+00,  5.7316e+00,  2.1489e+00,  2.4986e+00,  5.5508e-01,
          1.8658e+00,  1.4871e+00,  3.4364e+00,  4.4125e-01,  4.2860e+00,
          4.3832e+00,  1.3132e+00,  1.6830e+00,  1.1566e+00,  2.1403e+00,
          9.1246e-01,  3.0220e+00,  3.5398e+00,  2.8193e+00,  1.9450e+00,
          8.4404e-02,  2.0383e+00,  2.7105e+00,  1.0817e+00,  1.9388e+00,
          3.5678e+00,  2.3330e+00,  2.5193e+00,  1.3185e+00,  3.6052e+00,
          7.7896e+00,  4.4019e+00,  1.5813e+01,  1.2201e+01,  5.1687e+00,
          1.9371e+00,  5.5298e+00,  6.2411e+00,  7.5443e+00,  5.9197e+00,
          7.0106e+00,  5.7829e+00,  2.6940e+00,  5.3224e+00,  9.9329e+00,
          6.4925e+00,  2.4706e+00,  5.6329e+00,  3.4480e+00,  2.0878e+00,
          1.2667e+00,  2.5974e+00,  5.6932e+00,  7.1867e-01,  1.0250e+00,
          4.8133e+00,  6.0352e+00,  3.9970e+00, -1.7816e+00,  3.6096e+00,
          2.8417e+00,  2.8397e+00,  1.2274e+00,  4.2220e+00,  4.0891e+00,
          3.3476e+00,  2.3510e+00,  2.6175e-01,  8.3823e-01,  5.1252e+00,
         -7.6088e-01,  1.9809e+00,  2.8381e+00,  2.4766e+00,  1.8687e+00,
          5.8744e-01,  3.4939e+00, -5.0017e-01,  1.8486e+00,  4.4583e-01,
          2.4148e+00,  3.4823e+00,  5.1915e+00,  1.8092e+00,  3.9559e+00,
          4.3692e+00,  3.1688e-01, -4.8611e-01,  5.6470e+00,  3.6534e+00,
          2.4436e+00,  3.6937e+00,  3.3958e+00, -9.6209e-01, -2.2416e-01,
          1.2344e-01,  1.1682e+00, -5.7350e-02,  4.4999e+00, -1.8748e+00,
         -2.6818e+00, -1.6847e+00, -6.7984e-01, -1.5588e+00, -2.1798e+00,
         -1.0431e+00,  1.5370e+00,  7.6586e-01, -1.7835e+00,  6.3795e-01,
          8.8728e-01, -2.1297e+00, -2.1428e+00, -1.5652e+00, -4.8034e+00,
         -1.0672e+00, -9.5017e-01, -2.0649e+00,  2.0019e+00, -6.4177e-01,
         -2.5378e+00, -1.5008e+00, -3.1975e+00, -1.9334e+00, -5.8521e-01,
         -1.0363e+00,  1.3329e+00, -1.4872e+00, -2.0102e+00,  1.3183e+00,
          4.2174e-01, -3.5856e-01,  7.4732e-01, -2.7741e-01, -7.5281e-01,
         -5.5096e-01,  1.0833e+00,  1.3690e-01,  5.3099e-02,  1.2991e+00,
         -2.0118e+00, -1.7402e+00,  1.1788e+00, -4.2505e-01,  3.0964e-01,
         -1.4589e+00, -2.6225e+00, -2.4953e+00, -2.1100e-01, -2.3407e+00,
         -1.5377e+00, -2.4802e+00, -8.0411e-01,  2.1876e-01, -2.8007e+00,
         -1.9041e+00, -9.3844e-01, -3.4383e-01, -4.4379e-01, -7.5068e-01,
         -2.5060e+00, -1.9128e+00, -2.1313e+00, -5.4917e-01,  1.7571e-01,
         -2.0437e+00, -1.7883e+00, -2.4830e+00, -3.8768e+00, -4.4253e+00,
         -2.1819e+00, -2.0485e+00, -3.7339e+00, -1.9185e+00, -3.4523e+00,
         -2.0103e+00, -2.2686e+00, -7.9842e-01, -5.0350e-01, -2.9496e+00,
         -1.6959e+00, -2.9049e-01, -2.8050e+00, -1.8296e+00,  9.0891e-02,
         -2.1747e+00, -1.5452e+00, -1.2558e+00, -9.5389e-01, -2.5265e+00,
         -1.3665e+00, -3.8682e+00, -4.3242e+00, -1.5148e+00, -1.9113e+00,
         -2.9872e+00, -2.9385e+00, -4.8581e+00, -2.4930e+00, -1.3708e+00,
         -2.8970e+00, -1.2814e+00, -5.6305e-01, -2.8289e+00, -1.8666e+00,
         -2.3429e+00, -3.4331e+00, -2.5701e+00, -3.8991e+00, -4.4677e+00,
         -3.5612e+00, -9.3197e-01, -2.3963e+00, -2.0673e+00, -1.8581e+00,
         -3.2819e+00, -3.5219e+00, -2.8122e+00, -1.9661e+00, -3.7761e+00,
         -1.1609e+00, -2.7743e+00,  2.5780e-01, -1.7330e-01, -3.0291e-01,
          1.6415e-01,  1.7316e+00,  1.8974e+00, -3.1201e+00, -3.7529e+00,
         -2.6532e+00, -6.6325e-01, -8.5835e-01, -1.3166e+00,  4.9872e-01,
         -2.3256e+00,  4.7837e+00,  1.4947e+00, -1.6190e+00,  3.3725e+00,
          3.0711e+00,  1.5550e+00,  7.3323e-01,  1.2925e+00,  1.6615e+00,
          9.5884e-02,  1.3348e+00,  2.9194e-02,  9.8533e-01, -5.1703e-01,
         -1.3542e+00,  5.3265e-01,  2.7115e-01,  2.8676e+00,  2.2308e+00,
          1.8468e+00,  1.3861e+00,  1.1031e+00,  2.3657e+00,  6.4129e+00,
          1.8896e+00,  5.8361e-01, -1.5098e+00, -1.8949e+00, -2.6363e+00,
          1.5000e+00, -5.3598e-01, -6.9934e-02,  2.0223e+00, -1.7648e+00,
          4.1311e-01,  1.6696e+00,  7.3901e-01,  2.1839e+00, -1.4280e+00,
         -2.4708e+00,  4.8153e-01,  3.3817e+00, -2.9651e-01,  6.8333e-02,
          1.1068e+00, -1.3973e+00,  1.5518e+00, -2.2464e+00, -3.6723e-01,
         -1.5056e+00, -2.1660e+00,  2.7747e+00,  2.5229e+00,  1.8796e-01,
          4.6500e-01, -1.9905e+00, -1.6224e+00, -7.7251e-01, -1.1583e+00,
         -3.0849e-01, -1.4386e+00,  2.2714e+00, -4.5280e-01, -8.7969e-01,
          4.1363e-01, -1.4095e+00, -2.8906e-01,  3.4809e+00,  4.3324e-01,
          4.2387e-01, -1.1204e+00,  1.9781e-01, -2.6528e+00, -2.8554e+00,
         -6.6075e-01, -3.1026e-01,  7.8641e-01,  1.8965e-01, -4.5897e-01,
         -5.5900e-01, -2.8286e-01,  1.6559e+00,  7.1156e-02, -4.2351e-01,
          1.1153e+00,  2.2649e+00, -1.0259e+00,  9.7924e-01,  1.0194e+00,
         -2.6048e+00, -5.5008e-01,  2.1353e+00, -9.3641e-01,  4.3959e-02,
         -3.9618e-01,  1.5395e+00, -1.0598e+00,  1.8685e+00,  1.4945e+00,
         -4.4445e+00, -1.7147e-01, -1.6854e-01, -7.5243e-01,  1.8241e+00,
          3.3237e+00,  7.4655e-01, -7.6160e-01,  1.6335e+00,  1.9438e+00,
          1.2983e+00, -1.5951e+00,  2.2410e+00,  9.3765e-01, -6.2485e-01,
         -1.3969e+00,  1.6010e+00,  2.9322e-02,  8.8535e-01,  9.6981e-01,
          5.0879e-01,  3.9120e-01,  1.0257e+00,  6.5950e-01,  3.3687e+00,
         -1.0271e+00, -2.9353e+00, -7.5909e-01, -7.9742e-01,  7.0959e+00,
         -3.3062e+00,  1.3640e+00,  1.7516e+00,  3.7723e+00, -1.1675e+00,
          5.5042e-01, -1.8410e+00, -1.3717e+00,  4.1994e-01,  2.6614e+00,
          1.4409e+00,  8.2754e-01,  4.9228e+00,  1.5346e+00, -3.6179e+00,
         -2.2328e+00, -1.3824e+00, -8.8731e-01,  2.2056e+00,  2.0819e+00,
         -9.1448e-01, -1.7883e+00, -1.4364e+00,  1.0453e+00, -7.3781e-01,
         -1.4398e+00,  4.6885e-01,  8.8505e-01,  6.7727e-01, -1.1915e+00,
         -1.4081e+00, -8.5758e-01,  1.1674e+00, -1.8481e+00,  2.5000e+00,
          3.4197e-01, -1.7550e+00, -8.9500e-01,  2.9997e+00,  2.2094e+00,
         -8.3813e-01, -2.0126e+00,  9.3872e-01, -2.1452e+00,  3.8366e-02,
          2.4887e-01, -1.0520e-01, -6.5314e-01,  8.0483e-01,  4.0207e+00,
         -1.2178e+00,  2.6536e+00, -1.8085e+00,  7.5238e-01,  5.0293e-01,
          1.1507e-02, -7.7166e-01,  4.5107e-01,  1.3278e+00,  8.2390e-01,
         -6.1212e-01,  3.0906e-02,  2.6237e-01, -2.1468e+00, -1.4071e+00,
         -9.9405e-02, -4.3217e-01,  4.8178e-01,  1.2984e+00,  7.5574e-01,
          8.5152e-02, -1.3871e-01, -8.5174e-01,  8.1902e-01,  7.0772e-01,
          2.1356e+00,  7.1527e-01, -1.9824e-01,  2.4378e+00,  1.8625e+00,
          1.0683e+00,  1.0118e+00,  1.4432e+00, -1.9516e-02, -1.6767e-01,
         -1.6930e+00, -1.2930e+00, -1.3439e+00, -3.4781e+00,  5.9320e-01,
          1.0796e+00, -1.8918e-02, -3.9900e-01, -4.6973e-01, -3.7134e-01,
         -1.5430e+00,  3.1286e-01, -1.8760e-01,  3.2949e+00,  1.8577e+00,
         -1.5585e+00,  2.9257e+00, -7.3537e-01,  6.2360e-02, -4.3240e-02,
         -1.0193e+00,  1.1614e+00,  6.1005e-01,  1.4297e-01, -3.4956e+00,
          1.3337e+00,  3.8960e-01, -1.2900e+00, -1.2823e+00, -1.6482e+00,
          1.9285e+00,  2.7967e-02, -9.3298e-01,  6.6934e-01,  2.9179e+00,
          1.7228e+00,  2.1640e-01, -8.5777e-01, -2.0391e+00,  1.6510e+00,
         -2.3326e-01,  7.4310e-01,  1.5400e+00, -1.4460e+00, -1.2114e+00,
          5.2273e-01, -6.4955e-01,  4.8662e-02,  2.6569e+00, -1.6518e+00,
         -3.3896e+00,  2.1350e+00,  9.0318e-01,  1.3904e+00, -3.6350e-01,
          2.0223e-01,  1.4554e+00, -1.1211e+00,  1.8399e+00, -1.2923e+00,
         -2.1249e+00, -8.6377e-01, -5.1445e-01, -1.2872e+00,  1.1431e+00,
         -1.4240e+00, -7.6690e-01, -3.5563e-01,  2.1923e+00, -1.9831e-01,
         -1.8438e+00,  9.6278e-02,  1.7458e+00,  1.7894e-01,  8.0167e-01,
          4.7614e+00, -7.3425e-01, -1.6478e+00, -8.1854e-01, -1.5260e+00,
          3.4332e+00,  4.9594e-01, -3.4153e-01, -5.1844e-01,  3.1711e-01,
          1.8911e+00, -1.2033e+00, -2.0005e+00, -1.2113e-01, -7.0711e-01,
         -2.5450e+00, -6.7446e-02,  8.0279e-01, -1.8256e+00,  2.9197e-01,
         -1.1033e+00,  1.5367e+00,  2.9367e+00,  5.3938e-01, -7.6932e-01,
         -9.2092e-01, -1.7367e+00, -1.0012e+00,  9.3510e-01, -4.1550e-02,
         -2.3282e+00,  2.2907e+00,  9.3231e-01,  2.0218e+00, -1.7313e+00,
          2.6369e+00,  3.8262e+00,  6.9461e-01, -2.5067e-03, -9.9190e-01,
          2.1992e+00,  1.4213e+00,  4.3468e-01, -1.3524e+00, -8.5725e-01,
          5.7249e-01, -6.5436e-01,  5.3128e-01,  1.1928e+00,  9.9297e-01,
          2.8021e+00, -3.0989e+00,  3.5913e+00, -1.7487e-01, -7.0675e-02,
         -2.0895e+00, -1.4625e+00,  3.0381e+00, -4.9794e-01,  2.0560e-01,
         -1.4915e-01,  2.2701e+00,  3.2218e-01,  9.6720e-01, -1.1802e+00,
          3.3260e+00, -1.8322e+00,  3.2782e+00, -1.7115e-01,  1.3883e+00,
          6.0959e-01,  4.1543e-01, -7.9037e-01,  3.6867e-01,  1.9525e+00,
          4.2260e-01,  2.4431e+00, -1.9236e+00,  1.2817e-02,  2.2662e+00,
         -1.6735e+00, -7.1518e-01,  1.3959e+00, -2.5441e-02,  7.7251e-01,
          3.0006e+00, -9.7998e-01, -1.3561e+00,  2.3763e-01,  2.3640e+00,
          1.9144e+00,  1.8060e+00,  8.6761e-01,  4.2975e+00,  5.3407e-01,
         -1.4330e+00, -5.0711e-01,  3.0123e+00, -8.6910e-01,  3.2884e+00,
         -6.6958e-01, -2.6305e+00, -3.1957e+00, -2.9512e+00,  1.2077e+00,
          5.6051e+00,  2.7645e-01, -3.1766e+00,  2.5924e+00, -3.3277e-01,
          1.2678e-01,  2.3057e+00, -1.4350e+00,  3.3605e+00, -3.3686e+00,
         -6.4941e-01,  1.1768e+00, -2.6352e+00,  1.6955e+00, -1.4452e+00,
         -2.6586e+00, -3.6629e+00, -1.7268e+00,  1.1226e+00,  1.9893e+00,
         -1.6089e-01,  1.2605e+00,  1.2145e+00,  2.6831e+00, -1.8129e+00,
         -1.8627e+00,  1.5719e+00, -2.0970e+00, -2.3202e+00,  9.8824e-01,
         -3.8493e+00, -1.1166e+00,  8.9900e-01, -3.4761e-01, -3.5048e+00,
          2.0611e+00,  5.2646e-01,  6.1806e-01, -1.1572e-01,  8.0079e-01,
          8.0897e-01,  1.0604e+00, -1.8259e+00,  2.6638e-01, -1.1470e+00,
          8.3008e-01,  1.0902e+00,  9.0343e+00, -9.1355e-01, -9.0900e-01,
          1.1300e+00, -3.8219e-01, -1.7321e+00,  1.0250e-01,  2.0110e-01,
         -1.4002e+00,  8.6910e-01, -4.4545e-01, -7.9011e-01, -3.1371e+00,
         -3.6089e-01,  1.9827e-01, -2.1941e+00,  2.8923e+00,  4.7277e-01,
          1.4522e+00, -3.4306e+00,  1.4684e+00, -2.5527e+00, -2.9418e+00,
         -7.8795e-01,  2.8671e+00, -1.2269e+00, -6.5231e-01,  3.6328e+00,
          1.1509e+00,  1.1695e+00,  3.3913e+00,  9.2258e-01, -1.1167e+00,
          1.0877e+00,  1.9697e-01, -7.4067e-01, -2.9546e+00, -1.7774e-01,
          1.5686e+00, -4.4219e-01,  1.6109e+00,  1.0286e+00, -1.2968e+00,
         -1.7997e+00, -4.9459e-01,  5.5377e-01,  2.3711e+00,  5.1920e-01,
         -1.3707e+00, -2.3571e+00, -7.6039e-01,  1.3218e+00,  2.9741e+00,
         -2.6011e-01, -6.0607e-01, -4.2340e-02, -7.4892e-01, -7.3591e-02,
          2.7876e+00,  3.2539e-01, -1.1395e+00, -1.4875e+00, -4.3684e+00,
         -3.6234e-01, -3.4596e-01, -2.0751e+00,  3.6143e-01, -1.4748e+00,
         -6.3914e-01, -2.0747e+00, -9.1273e-01, -4.0377e-01, -3.3568e-01,
         -1.8499e+00, -1.7760e+00, -6.6116e-01,  6.3051e-01,  3.7449e+00,
          2.1652e+00,  4.1343e+00,  1.6970e+00,  8.3944e-01,  1.2930e+00,
          1.0517e+00,  1.4064e+00,  1.3186e+00, -7.6195e-01,  2.2661e+00,
          1.4491e-01, -7.7404e-01,  1.0131e+00,  8.7297e-01, -6.4795e-01,
         -3.4213e-01, -1.5297e+00,  2.0686e+00,  2.5062e+00,  2.1082e-01,
          7.5025e-01,  1.8742e-02, -1.7822e+00,  1.9568e+00,  4.3805e-01,
          1.2335e+00,  7.7087e-01, -1.0329e+00,  1.5594e-01, -3.6626e-01,
          6.3099e-02,  3.2490e+00,  6.4807e-01, -1.5173e-01,  9.3229e-01,
         -6.0559e-02, -1.1985e+00,  1.4659e-01,  3.9304e-01,  8.8496e-01,
         -1.9458e+00,  1.2217e-01, -1.5016e+00, -1.8235e+00, -3.8595e+00,
          1.8626e-01, -2.9675e+00,  5.4217e-01, -7.8127e-01, -2.6196e+00,
         -4.4958e+00,  5.7119e-01, -1.3398e+00, -3.8117e+00, -7.8630e-01,
         -5.6788e-01, -2.5453e+00,  2.4054e+00,  6.5602e-01,  4.7648e-01,
          2.8749e+00, -3.7451e+00,  1.5124e+00, -3.2777e+00, -2.4971e+00,
         -3.2263e-01,  1.2816e-01, -1.1752e+00,  3.4434e+00,  4.4534e+00]],
       grad_fn=<AddmmBackward>)

It now remains to classify the image. To do so, we load the file that contains the possible class labels:

with open('textosimagenes/imagenet_classes.txt') as f:
    labels = [line.strip() for line in f.readlines()]
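
As a quick sanity check (assuming the standard ImageNet label file, which holds one class name per line):

len(labels)   # 1000, one entry per ImageNet class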

Now we need the index corresponding to the highest score, which we can obtain like this:

_, index = torch.max(out, 1)
index[0]
tensor(207)
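
Equivalently, torch.argmax returns the index directly (a small alternative sketch):

index = torch.argmax(out, dim=1)   # same result as torch.max(out, 1)[1]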

We can now use the index to look up the label. Note that the index is not a plain Python number but a one-dimensional tensor with a single element (specifically, tensor([207])), so we extract the actual numeric value to index into our list of labels with index[0]. We also use torch.nn.functional.softmax (http://mng.bz/BYnq) to normalize the outputs: each score is exponentiated and divided by the sum of the exponentials, so the results lie in the range [0, 1] and add up to 1. This gives us something roughly resembling the confidence the model has in its prediction. In this case, the model is about 96% sure that what it is looking at is a golden retriever:

percentage = torch.nn.functional.softmax(out, dim=1)[0] * 100
labels[index[0]], percentage[index[0]].item()
('golden retriever', 96.57185363769531)
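
For reference, the same normalization can be written out by hand (a minimal sketch of what softmax computes):

probs = out.exp() / out.exp().sum(dim=1, keepdim=True)   # softmax, computed manually
(probs[0] * 100)[index[0]]                               # matches percentage[index[0]] above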

Since the model produces scores, we can also find out the second-best guess, the third-best, and so on. To do this we can use the sort function, which orders the values in ascending or descending order and also returns the indices those values had in the original tensor:

_, indices = torch.sort(out, descending=True)
[(labels[idx], percentage[idx].item()) for idx in indices[0][:5]]
[('golden retriever', 96.57185363769531),
 ('Labrador retriever', 2.6082706451416016),
 ('cocker spaniel, English cocker spaniel, cocker', 0.2699621915817261),
 ('redbone', 0.17958936095237732),
 ('tennis ball', 0.10991999506950378)]
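
When only the top k entries are needed, torch.topk avoids sorting the full output (a small alternative sketch):

_, idxs = torch.topk(out, 5)   # indices of the five highest scores, in descending order
[(labels[idx], percentage[idx].item()) for idx in idxs[0]]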

We see that the first four are dogs (redbone is a breed; who knew?), after which things start to get funny. The fifth answer, "tennis ball", probably shows up because there are enough photos of tennis balls next to dogs that the model is in effect saying: "There is a 0.1% chance that I have completely misunderstood what a tennis ball is." This is a great example of the fundamental differences in how humans and neural networks see the world, and of how easy it is for strange, subtle biases to sneak into our data.