Part 1: The Prediction Challenge¶

As stated in the Challenge Statement, we are given a dataset from the "Urban Typologies" project, where we could find 65 indicators that relate to demographics, mobility, economy city form. We are expected to predict the 'CO2 Emissions per Capita(metric tonnes)' for each city, conditioned on any other variables that we choose except the 'Pollution Index'.

We start by importing the packages we need, reading in the dataset and doing some initial processing of the data. We will set the city ID as the index of the data frame and drop unecessary variables such as, 'City', 'Typology', 'Pollution Index'.

In [1]:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline


pd.set_option('display.max_columns', None)
pd.set_option('display.max_rows', None)


df = pd.read_excel('Cities.xls', index_col=0, skipinitialspace=True) # Read with excel index. 
df.head()
Out[1]:
City cityID clusterID Typology Country Car Modeshare (%) Public Transit Modeshare (%) Bicycle Modeshare (%) Walking Modeshare (%) Gasoline Pump Price (USD/liter) Road Deaths Rate (per 1000) Subway Length (km) Subway Length Density (per km) Subway Stations per Hundred Thousand Subway Ridership per Capita Subway Age (years) BRT Length (km) BRT System Length Density (per km) BRT Stations per Hundred Thousand Persons BRT Fleet per Hundred Thousand Persons BRT Annual Ridership per Capita BRT Age (years) Bikeshare Stations Bikeshare Stations per Hundred Thousand Persons Bikeshare Number of Bikes Bikeshare Bicycles per Hundred Thousand Persons Bikeshare Age (years) Congestion (%) Congestion AM Peak (%) Congestion PM Peak (%) Traffic Index Travel Time Index Inefficiency Index Population Land Area (sq. km) Population Density (per sq. km) Population Change 1990 – 2000 Population Change 2000 – 2010 Population Change 2010 – 2020 Population Change 2020 – 2025 Urbanization Rate 2015 (%) Urbanization Rate Change 2015 – 2025 (pp) GDP per Capita (USD) Unemployment Rate (%) Cost of Living Index Rent Index Grocery Index Restaurant Price Index Local Purchasing Power Index Gini Coefficient Poverty Rate (%) Life Expectancy (years) Safety Index Internet Penetration Digital Penetration Innovation Index Smartphone Penetration (%) CO2 Emissions per Capita (metric tonnes) Pollution Index Street length total (m) Street Length Density (m/sq. km) Street Length Average (m) Intersection Count Intersection Density (per sq. km) Degree Average Streets per Node Circuity Self-Loop Proportion Highway Proportion Metro Propensity Factor BRT Propensity Factor BikeShare Propensity Factor Development Factor Sustainability Factor Population Factor Congestion Factor Sprawl Factor Network Density Factor
284 Baltimore(MD) 285 7 Auto Sprawl United States 85.0 6.1 0.3 2.6 0.66 8.5 24.9 0.013409 0.615385 6.417582 34 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 2.197802 NaN 0.00000 2.0 19.0 33.0 46.0 148.97 36.90 150.22 2275000 1857 1200 233673 332204 399059 195708 81.6 1.7 58789.0 7.20 77.33 48.58 76.48 78.28 150.69 0.443 22.9 78.8 31.19 81.0 0.78 45.0 72.0 14.300000 NaN 7468900.983 7.604833e+09 148.013337 28660.0 1018.199702 5.021972 2.869906 1.067736 0.007910 0.041018 0.160848 0.176867 0.360637 0.796264 0.355964 0.081956 0.180085 0.722163 0.425187
9 Melbourne 10 8 Auto Innovative Australia 80.0 14.0 2 4.0 1.11 5.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 1.264223 600.0 15.17067 2.0 33.0 55.0 58.0 143.12 35.57 138.17 3955000 2543 1500 316060 462816 715525 350883 89.4 1.2 39358.0 5.50 79.04 44.30 72.93 76.07 139.62 NaN NaN 82.0 60.23 86.9 0.74 50.0 77.0 10.200000 26.77 8636838.530 8.653670e+09 107.503591 48571.0 1001.948856 4.948413 2.876305 1.036985 0.001626 0.014489 0.060387 0.168335 0.363675 0.786174 0.397894 0.082267 0.333173 0.539355 0.558910
185 Niamey 186 1 Congested Emerging Niger NaN 9.0 2 60.0 1.02 26.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN NaN NaN NaN 1435000 130 11100 248392 541978 960996 741379 18.7 3.5 427.4 NaN NaN NaN NaN NaN NaN NaN 18.6 61.8 NaN 2.4 0.04 NaN NaN 0.106861 NaN 2134329.200 3.496995e+09 97.860119 13033.0 1638.451450 6.161297 3.187450 1.019423 0.000095 0.000000 0.036220 0.010915 0.343161 0.000000 0.273646 0.248398 0.655464 0.275605 0.410312
327 Hanoi 328 12 MetroBike Emerging Vietnam 8.0 10.0 2 NaN 0.90 24.5 0.0 0.000000 0.000000 0.000000 0 14.5 NaN 0.143000 NaN 0.149000 1.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN 160.60 40.78 189.18 7445000 868 8600 520495 1149423 1391900 608994 33.6 6.3 3425.0 6.84 41.73 17.15 37.25 22.72 23.47 0.340 5.4 76.0 54.64 50.1 0.31 35.0 35.0 1.700368 89.66 8079366.816 1.156634e+09 300.370541 15881.0 143.158938 5.205795 2.870473 1.097533 0.000921 0.022938 0.078028 0.084511 0.358868 0.157676 0.509057 0.292564 0.624623 0.666710 0.239113
66 Urumqi 67 12 MetroBike Emerging China 21.7 54.7 NaN NaN 1.16 18.8 0.0 0.000000 0.000000 0.000000 0 66.0 0.134146 2.537764 9.123867 34.441088 7.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN NaN NaN NaN 3310000 492 6700 556511 1249115 1610288 491541 55.6 9.8 12189.0 3.40 NaN NaN NaN NaN NaN 0.320 NaN 76.1 NaN 50.1 0.43 NaN 58.0 7.550916 NaN 8891135.865 2.169530e+08 546.944874 9567.0 24.401041 4.736409 2.680313 1.064157 0.000348 0.060128 0.103334 0.447261 0.392551 0.287235 0.801464 0.322960 0.586019 0.791822 0.075617
  • Since there are already ID numbers assigned to each city, we can get rid of the additional indexing which provides the same information anyway. Since 'cityID' provides the exact information as the 'City' column, and 'Typology' provides the exact information as the 'clusterID', we drop those. On top of all, since we are restricted to use 'Pollution Index', we also drop it out of our data set.
In [2]:
# Skip all white-spaces.
df = df.set_index('cityID') # Remove the indices and use cityID as index.

# Drop unnecessary columns (that explicitly imply each other)
# cityID implies City
# clusterID implies Typology
# Pollution Index is restricted to be used in prediction.
col2drop = ['City', 'Typology', 'Pollution Index ']



df.drop(col2drop, inplace=True, axis=1) 
  • We are going to use our dataset in regression, so we don't want any non-numerical data in our dataset. By applying the operations above we are still left with a string type of column, which is 'Country'. It is an important features that is possibly related to our target column (assumption), therefore we don't want to drop it out. So we decide to encode the country column into numbers.
In [3]:
from sklearn.preprocessing import LabelEncoder

def multi_label_encoder(df, col):
        encoder = LabelEncoder()
        df[col+'(Encoded)'] = encoder.fit_transform(df[col])
        decode_dict = df[['Country', 'Country(Encoded)']].set_index('Country(Encoded)')['Country'].to_dict()
        df[col] = df[col+'(Encoded)']
        return df.drop(col+'(Encoded)', axis=1), decode_dict
    
df, decode_dict = multi_label_encoder(df, 'Country')
  • Now since we encoded our 'Country' column into numbers, and saved a dictionary for possible back-conversion later, let's see how our dataset now looks like:
In [4]:
df.head()
Out[4]:
clusterID Country Car Modeshare (%) Public Transit Modeshare (%) Bicycle Modeshare (%) Walking Modeshare (%) Gasoline Pump Price (USD/liter) Road Deaths Rate (per 1000) Subway Length (km) Subway Length Density (per km) Subway Stations per Hundred Thousand Subway Ridership per Capita Subway Age (years) BRT Length (km) BRT System Length Density (per km) BRT Stations per Hundred Thousand Persons BRT Fleet per Hundred Thousand Persons BRT Annual Ridership per Capita BRT Age (years) Bikeshare Stations Bikeshare Stations per Hundred Thousand Persons Bikeshare Number of Bikes Bikeshare Bicycles per Hundred Thousand Persons Bikeshare Age (years) Congestion (%) Congestion AM Peak (%) Congestion PM Peak (%) Traffic Index Travel Time Index Inefficiency Index Population Land Area (sq. km) Population Density (per sq. km) Population Change 1990 – 2000 Population Change 2000 – 2010 Population Change 2010 – 2020 Population Change 2020 – 2025 Urbanization Rate 2015 (%) Urbanization Rate Change 2015 – 2025 (pp) GDP per Capita (USD) Unemployment Rate (%) Cost of Living Index Rent Index Grocery Index Restaurant Price Index Local Purchasing Power Index Gini Coefficient Poverty Rate (%) Life Expectancy (years) Safety Index Internet Penetration Digital Penetration Innovation Index Smartphone Penetration (%) CO2 Emissions per Capita (metric tonnes) Street length total (m) Street Length Density (m/sq. km) Street Length Average (m) Intersection Count Intersection Density (per sq. km) Degree Average Streets per Node Circuity Self-Loop Proportion Highway Proportion Metro Propensity Factor BRT Propensity Factor BikeShare Propensity Factor Development Factor Sustainability Factor Population Factor Congestion Factor Sprawl Factor Network Density Factor
cityID
285 7 116 85.0 6.1 0.3 2.6 0.66 8.5 24.9 0.013409 0.615385 6.417582 34 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 2.197802 NaN 0.00000 2.0 19.0 33.0 46.0 148.97 36.90 150.22 2275000 1857 1200 233673 332204 399059 195708 81.6 1.7 58789.0 7.20 77.33 48.58 76.48 78.28 150.69 0.443 22.9 78.8 31.19 81.0 0.78 45.0 72.0 14.300000 7468900.983 7.604833e+09 148.013337 28660.0 1018.199702 5.021972 2.869906 1.067736 0.007910 0.041018 0.160848 0.176867 0.360637 0.796264 0.355964 0.081956 0.180085 0.722163 0.425187
10 8 5 80.0 14.0 2 4.0 1.11 5.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 1.264223 600.0 15.17067 2.0 33.0 55.0 58.0 143.12 35.57 138.17 3955000 2543 1500 316060 462816 715525 350883 89.4 1.2 39358.0 5.50 79.04 44.30 72.93 76.07 139.62 NaN NaN 82.0 60.23 86.9 0.74 50.0 77.0 10.200000 8636838.530 8.653670e+09 107.503591 48571.0 1001.948856 4.948413 2.876305 1.036985 0.001626 0.014489 0.060387 0.168335 0.363675 0.786174 0.397894 0.082267 0.333173 0.539355 0.558910
186 1 77 NaN 9.0 2 60.0 1.02 26.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN NaN NaN NaN 1435000 130 11100 248392 541978 960996 741379 18.7 3.5 427.4 NaN NaN NaN NaN NaN NaN NaN 18.6 61.8 NaN 2.4 0.04 NaN NaN 0.106861 2134329.200 3.496995e+09 97.860119 13033.0 1638.451450 6.161297 3.187450 1.019423 0.000095 0.000000 0.036220 0.010915 0.343161 0.000000 0.273646 0.248398 0.655464 0.275605 0.410312
328 12 120 8.0 10.0 2 NaN 0.90 24.5 0.0 0.000000 0.000000 0.000000 0 14.5 NaN 0.143000 NaN 0.149000 1.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN 160.60 40.78 189.18 7445000 868 8600 520495 1149423 1391900 608994 33.6 6.3 3425.0 6.84 41.73 17.15 37.25 22.72 23.47 0.340 5.4 76.0 54.64 50.1 0.31 35.0 35.0 1.700368 8079366.816 1.156634e+09 300.370541 15881.0 143.158938 5.205795 2.870473 1.097533 0.000921 0.022938 0.078028 0.084511 0.358868 0.157676 0.509057 0.292564 0.624623 0.666710 0.239113
67 12 21 21.7 54.7 NaN NaN 1.16 18.8 0.0 0.000000 0.000000 0.000000 0 66.0 0.134146 2.537764 9.123867 34.441088 7.0 0 0.000000 0.0 0.00000 0.0 NaN NaN NaN NaN NaN NaN 3310000 492 6700 556511 1249115 1610288 491541 55.6 9.8 12189.0 3.40 NaN NaN NaN NaN NaN 0.320 NaN 76.1 NaN 50.1 0.43 NaN 58.0 7.550916 8891135.865 2.169530e+08 546.944874 9567.0 24.401041 4.736409 2.680313 1.064157 0.000348 0.060128 0.103334 0.447261 0.392551 0.287235 0.801464 0.322960 0.586019 0.791822 0.075617
In [5]:
df.describe()
Out[5]:
clusterID Country Car Modeshare (%) Public Transit Modeshare (%) Walking Modeshare (%) Gasoline Pump Price (USD/liter) Road Deaths Rate (per 1000) Subway Length (km) Subway Length Density (per km) Subway Stations per Hundred Thousand Subway Ridership per Capita Subway Age (years) BRT Length (km) BRT System Length Density (per km) BRT Stations per Hundred Thousand Persons BRT Fleet per Hundred Thousand Persons BRT Annual Ridership per Capita BRT Age (years) Bikeshare Stations per Hundred Thousand Persons Bikeshare Number of Bikes Bikeshare Bicycles per Hundred Thousand Persons Bikeshare Age (years) Congestion (%) Congestion AM Peak (%) Congestion PM Peak (%) Traffic Index Travel Time Index Inefficiency Index Population Land Area (sq. km) Population Density (per sq. km) Population Change 1990 – 2000 Population Change 2000 – 2010 Population Change 2010 – 2020 Population Change 2020 – 2025 Urbanization Rate 2015 (%) Urbanization Rate Change 2015 – 2025 (pp) GDP per Capita (USD) Unemployment Rate (%) Cost of Living Index Rent Index Grocery Index Restaurant Price Index Local Purchasing Power Index Gini Coefficient Poverty Rate (%) Life Expectancy (years) Safety Index Internet Penetration Digital Penetration Innovation Index Smartphone Penetration (%) CO2 Emissions per Capita (metric tonnes) Street length total (m) Street Length Density (m/sq. km) Street Length Average (m) Intersection Count Intersection Density (per sq. km) Degree Average Streets per Node Circuity Self-Loop Proportion Highway Proportion Metro Propensity Factor BRT Propensity Factor BikeShare Propensity Factor Development Factor Sustainability Factor Population Factor Congestion Factor Sprawl Factor Network Density Factor
count 331.000000 331.000000 224.000000 227.000000 199.000000 331.000000 330.000000 331.000000 331.000000 331.000000 331.000000 331.000000 330.000000 329.000000 321.000000 300.000000 330.000000 316.000000 330.000000 313.000000 316.000000 328.000000 165.000000 165.000000 165.000000 142.000000 142.000000 142.000000 3.310000e+02 331.000000 331.000000 3.310000e+02 3.310000e+02 3.310000e+02 3.310000e+02 328.000000 328.000000 331.000000 193.000000 223.000000 223.000000 223.000000 223.000000 223.000000 226.000000 194.000000 331.000000 207.000000 330.000000 324.000000 258.000000 219.000000 331.000000 3.170000e+02 3.170000e+02 317.000000 317.000000 317.000000 317.000000 317.000000 317.000000 317.000000 317.000000 331.000000 331.000000 331.000000 331.000000 331.000000 331.000000 331.000000 331.000000 331.000000
mean 5.567976 67.126888 47.688694 25.999218 18.270352 1.053988 14.707576 38.872356 0.037598 0.729446 25.634767 16.199396 10.533455 0.013815 0.413424 1.690402 4.282701 3.126582 1.717524 1507.434505 28.744784 1.917683 29.387879 46.654545 57.539394 170.634718 40.236268 181.698099 4.343458e+06 1048.232628 6107.362538 5.650781e+05 6.583977e+05 8.284136e+05 4.069141e+05 67.545732 3.545732 25398.480846 7.807409 58.721121 28.269910 53.744439 52.883767 83.786637 0.427235 21.290825 74.402523 52.420483 57.426667 0.524599 39.678295 53.452055 7.397879 7.116418e+06 5.322874e+09 157.060505 28596.160883 1742.295843 5.112775 2.877538 1.066236 0.002914 0.016953 0.195855 0.173344 0.398780 0.416785 0.381920 0.171709 0.491645 0.492904 0.412400
std 3.591750 38.901689 28.588289 20.319709 15.845304 0.424823 8.718870 77.291871 0.068359 1.224042 48.264312 27.430951 25.093831 0.034030 1.074162 7.928495 12.671217 7.171579 4.722804 7587.973979 117.482423 3.358670 10.346785 17.249127 17.637093 55.197757 8.999045 75.456951 5.175587e+06 1303.503455 5264.683635 8.691368e+05 8.745002e+05 1.033509e+06 5.137592e+05 19.183544 2.667355 22422.648624 6.235819 20.934314 20.337176 21.468003 24.713496 39.246045 0.090711 12.611445 7.206536 15.514069 24.497672 0.220449 7.598970 19.906461 6.538615 1.005218e+07 3.538891e+09 85.743576 32794.221029 2998.022258 0.471003 0.208571 0.031631 0.003780 0.016285 0.195584 0.148059 0.078614 0.266073 0.153003 0.159347 0.189305 0.227212 0.151951
min 1.000000 0.000000 0.000000 0.400000 0.000000 0.010000 0.600000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 9.000000 11.000000 22.000000 64.000000 22.270000 36.900000 5.900000e+05 85.000000 500.000000 -6.265350e+05 -2.744360e+05 -1.909910e+05 -4.604500e+04 16.100000 -0.100000 352.600000 0.200000 23.770000 4.430000 19.010000 14.430000 2.710000 0.220000 0.150000 50.100000 14.050000 2.100000 0.040000 17.000000 4.000000 0.038311 4.524791e+05 4.277235e+07 56.758212 1574.000000 0.938526 3.728737 2.374583 1.009882 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000
25% 2.000000 30.500000 22.475000 9.000000 3.200000 0.705000 7.500000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 22.000000 33.000000 46.000000 136.147500 34.392500 119.612500 1.440000e+06 311.000000 2550.000000 1.087960e+05 1.289045e+05 1.899600e+05 9.665700e+04 55.600000 1.700000 5695.700000 4.000000 40.445000 13.380000 35.360000 31.330000 47.830000 0.362500 13.000000 70.700000 42.115000 43.000000 0.397500 34.000000 39.000000 1.894512 2.786606e+06 2.733555e+09 108.101444 10326.000000 782.733352 4.765528 2.735450 1.046901 0.000677 0.000756 0.067857 0.078734 0.358541 0.213589 0.281133 0.055895 0.366844 0.309768 0.319600
50% 6.000000 68.000000 43.000000 22.200000 16.000000 1.050000 13.950000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 0.000000 29.000000 48.000000 56.000000 161.570000 39.235000 172.795000 2.540000e+06 630.000000 4700.000000 2.905760e+05 3.407720e+05 4.640620e+05 2.247350e+05 74.000000 2.400000 17979.000000 6.400000 60.140000 23.690000 52.470000 49.630000 86.220000 0.440000 18.700000 76.100000 52.150000 59.900000 0.500000 40.000000 58.000000 6.200000 4.514029e+06 4.735599e+09 135.975848 19693.000000 911.440096 5.119624 2.885415 1.060536 0.001626 0.014996 0.106605 0.139255 0.390862 0.341371 0.341087 0.120180 0.533182 0.473091 0.394053
75% 8.000000 110.500000 75.650000 39.650000 27.000000 1.315000 20.450000 42.550000 0.052714 1.137465 25.739660 25.000000 7.750000 0.007384 0.138593 0.000000 0.824889 1.250000 0.927193 300.000000 8.743201 2.000000 36.000000 58.000000 68.000000 199.452500 44.625000 233.672500 4.727500e+06 1291.000000 8350.000000 6.463495e+05 8.743945e+05 1.062863e+06 4.813000e+05 81.600000 4.700000 43633.000000 8.800000 74.625000 37.155000 70.590000 72.460000 117.065000 0.471000 28.097500 79.600000 63.615000 76.200000 0.770000 44.000000 72.000000 11.200000 7.556040e+06 7.118912e+09 169.714325 32620.000000 1423.975084 5.443809 3.030951 1.079913 0.003602 0.026980 0.277798 0.190673 0.423609 0.706330 0.450211 0.248221 0.640847 0.679550 0.474617
max 12.000000 123.000000 94.800000 82.500000 78.000000 2.120000 37.200000 588.000000 0.612982 9.797980 326.159420 127.000000 207.000000 0.234286 8.818182 92.673267 105.678013 46.000000 38.987508 90000.000000 1181.102362 18.000000 66.000000 96.000000 118.000000 359.430000 71.050000 444.440000 3.775000e+07 11642.000000 44100.000000 6.135953e+06 6.202838e+06 7.338635e+06 3.661236e+06 100.000000 10.000000 86830.000000 38.000000 147.990000 144.320000 131.280000 135.800000 173.640000 0.750000 61.600000 85.000000 85.430000 95.000000 0.850000 60.000000 88.000000 44.100000 9.487584e+07 1.975170e+10 910.880917 261120.000000 27693.013740 6.650233 3.497378 1.215631 0.026391 0.082444 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000
  • We see that some columns seem to miss values looking at the 'count' row in the pandas.DataFrame.describe() output. Now, we check in-depth what is the situation with missing values.
In [6]:
# Checking the % of missing values for each column.
missData = pd.DataFrame(df.isnull().sum(axis=0), columns=['Missing Values (%)'])/df.count()[0]*100 
missData = missData.sort_values(by=missData.columns[0], ascending=False)
pd.set_option('display.max_columns', 1000)
missData.T.head()
Out[6]:
Traffic Index Inefficiency Index Travel Time Index Congestion PM Peak (%) Congestion (%) Congestion AM Peak (%) Bicycle Modeshare (%) Unemployment Rate (%) Poverty Rate (%) Walking Modeshare (%) Safety Index Smartphone Penetration (%) Local Purchasing Power Index Restaurant Price Index Grocery Index Rent Index Cost of Living Index Car Modeshare (%) Gini Coefficient Public Transit Modeshare (%) Innovation Index BRT Fleet per Hundred Thousand Persons Bikeshare Number of Bikes Bikeshare Bicycles per Hundred Thousand Persons BRT Age (years) Intersection Density (per sq. km) Intersection Count Degree Average Self-Loop Proportion Street Length Average (m) Streets per Node Circuity Street Length Density (m/sq. km) Highway Proportion Street length total (m) BRT Stations per Hundred Thousand Persons Digital Penetration Bikeshare Stations Urbanization Rate 2015 (%) Urbanization Rate Change 2015 – 2025 (pp) Bikeshare Age (years) BRT System Length Density (per km) Bikeshare Stations per Hundred Thousand Persons BRT Annual Ridership per Capita Internet Penetration Road Deaths Rate (per 1000) BRT Length (km) Metro Propensity Factor BikeShare Propensity Factor Development Factor Sustainability Factor Population Factor Congestion Factor BRT Propensity Factor Sprawl Factor clusterID CO2 Emissions per Capita (metric tonnes) Life Expectancy (years) Gasoline Pump Price (USD/liter) Subway Length (km) Subway Length Density (per km) Subway Stations per Hundred Thousand Subway Ridership per Capita Subway Age (years) Population Land Area (sq. km) Population Density (per sq. km) Population Change 1990 – 2000 Population Change 2000 – 2010 Population Change 2010 – 2020 Population Change 2020 – 2025 Country GDP per Capita (USD) Network Density Factor
Missing Values (%) 57.099698 57.099698 57.099698 50.151057 50.151057 50.151057 44.108761 41.691843 41.389728 39.879154 37.462236 33.836858 32.628399 32.628399 32.628399 32.628399 32.628399 32.326284 31.722054 31.41994 22.054381 9.365559 5.438066 4.531722 4.531722 4.229607 4.229607 4.229607 4.229607 4.229607 4.229607 4.229607 4.229607 4.229607 4.229607 3.021148 2.114804 0.906344 0.906344 0.906344 0.906344 0.60423 0.302115 0.302115 0.302115 0.302115 0.302115 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
  • Taking a look at the outputs above, we notice that there are many values missing in many of the columns. It is not feasible to impute the values if there are too many values missing. It's not likely that the existing values would be sufficient enough to generalize the column, so trying to impute the missing data could corrupt the column instead of making it useful. But on the other hand, it's not really feasible to drop all the columns if the number of missing values in the column is relatively small - because it might be possible to somewhat reliably impute the missing values.

There's no golden rule to perform such operation, so we decide to drop columns that are missing more than 25% of their values.

In [7]:
# missData is transposed only during display above. normally the dataframe has 1 column e.g. Missing Values (%).
cols2drop = missData[missData['Missing Values (%)']>=25].index
df = df.drop(cols2drop, axis=1)
df.head()
Out[7]:
clusterID Country Gasoline Pump Price (USD/liter) Road Deaths Rate (per 1000) Subway Length (km) Subway Length Density (per km) Subway Stations per Hundred Thousand Subway Ridership per Capita Subway Age (years) BRT Length (km) BRT System Length Density (per km) BRT Stations per Hundred Thousand Persons BRT Fleet per Hundred Thousand Persons BRT Annual Ridership per Capita BRT Age (years) Bikeshare Stations Bikeshare Stations per Hundred Thousand Persons Bikeshare Number of Bikes Bikeshare Bicycles per Hundred Thousand Persons Bikeshare Age (years) Population Land Area (sq. km) Population Density (per sq. km) Population Change 1990 – 2000 Population Change 2000 – 2010 Population Change 2010 – 2020 Population Change 2020 – 2025 Urbanization Rate 2015 (%) Urbanization Rate Change 2015 – 2025 (pp) GDP per Capita (USD) Life Expectancy (years) Internet Penetration Digital Penetration Innovation Index CO2 Emissions per Capita (metric tonnes) Street length total (m) Street Length Density (m/sq. km) Street Length Average (m) Intersection Count Intersection Density (per sq. km) Degree Average Streets per Node Circuity Self-Loop Proportion Highway Proportion Metro Propensity Factor BRT Propensity Factor BikeShare Propensity Factor Development Factor Sustainability Factor Population Factor Congestion Factor Sprawl Factor Network Density Factor
cityID
285 7 116 0.66 8.5 24.9 0.013409 0.615385 6.417582 34 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 2.197802 NaN 0.00000 2.0 2275000 1857 1200 233673 332204 399059 195708 81.6 1.7 58789.0 78.8 81.0 0.78 45.0 14.300000 7468900.983 7.604833e+09 148.013337 28660.0 1018.199702 5.021972 2.869906 1.067736 0.007910 0.041018 0.160848 0.176867 0.360637 0.796264 0.355964 0.081956 0.180085 0.722163 0.425187
10 8 5 1.11 5.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 50 1.264223 600.0 15.17067 2.0 3955000 2543 1500 316060 462816 715525 350883 89.4 1.2 39358.0 82.0 86.9 0.74 50.0 10.200000 8636838.530 8.653670e+09 107.503591 48571.0 1001.948856 4.948413 2.876305 1.036985 0.001626 0.014489 0.060387 0.168335 0.363675 0.786174 0.397894 0.082267 0.333173 0.539355 0.558910
186 1 77 1.02 26.4 0.0 0.000000 0.000000 0.000000 0 0.0 0.000000 0.000000 0.000000 0.000000 0.0 0 0.000000 0.0 0.00000 0.0 1435000 130 11100 248392 541978 960996 741379 18.7 3.5 427.4 61.8 2.4 0.04 NaN 0.106861 2134329.200 3.496995e+09 97.860119 13033.0 1638.451450 6.161297 3.187450 1.019423 0.000095 0.000000 0.036220 0.010915 0.343161 0.000000 0.273646 0.248398 0.655464 0.275605 0.410312
328 12 120 0.90 24.5 0.0 0.000000 0.000000 0.000000 0 14.5 NaN 0.143000 NaN 0.149000 1.0 0 0.000000 0.0 0.00000 0.0 7445000 868 8600 520495 1149423 1391900 608994 33.6 6.3 3425.0 76.0 50.1 0.31 35.0 1.700368 8079366.816 1.156634e+09 300.370541 15881.0 143.158938 5.205795 2.870473 1.097533 0.000921 0.022938 0.078028 0.084511 0.358868 0.157676 0.509057 0.292564 0.624623 0.666710 0.239113
67 12 21 1.16 18.8 0.0 0.000000 0.000000 0.000000 0 66.0 0.134146 2.537764 9.123867 34.441088 7.0 0 0.000000 0.0 0.00000 0.0 3310000 492 6700 556511 1249115 1610288 491541 55.6 9.8 12189.0 76.1 50.1 0.43 NaN 7.550916 8891135.865 2.169530e+08 546.944874 9567.0 24.401041 4.736409 2.680313 1.064157 0.000348 0.060128 0.103334 0.447261 0.392551 0.287235 0.801464 0.322960 0.586019 0.791822 0.075617
  • Since we applied our threshold filtering on the missing data, now we need to replace the remaining empty cells in some way. Here, we mainly considered two methods, namely, to predict the values of each column one by one or to replace the values in each column with the mean(if float)/ mode(if binary). We didn't include the results with the predictions (we tried decision trees) for the sake of simplicity but as a result it turned out that predicting the missing values gave worse results than simply replacing the missing values with the columns mean/mode.

Below we implement a function that fills in the missing values by their means and modes depending on if they are integers or floats (e.g. since we cannot have a population with a floating numbers and alike).

In [8]:
def replace_missing_val(df):
    
    df = df.replace(r'^\s*$', np.NaN, regex=True)
    # print("Initial dataframe shape [rows, columns]:", df.shape)
    n_NaN = df.isna().sum().sum()
    # print(n_NaN) # As you can see the total number of NaNs in the database 3199, so we need to impute.

    # Impute all NaNs over the database.

    df_int = df.select_dtypes(include='integer')
    df_float = df.select_dtypes(include='float')

    df_int = df_int.fillna(df.mode().iloc[0]) # Try using mean/mod for different columns.
    df_float = df_float.fillna(df.mean().iloc[0]) # Try using mean/mod for different columns.

    df[df_int.columns.values.tolist()] = df_int # Use mod for integers,
    df[df_float.columns.values.tolist()] = df_float # Use mean for floats

    # Drop taget variable
    # df_targets = df['CO2 Emissions per Capita (metric tonnes)']
    # df_inputs = df.drop(columns=['CO2 Emissions per Capita (metric tonnes)'])
    # return df_inputs, df_targets
    return df
In [9]:
df = replace_missing_val(df)
print('The number of missing values in the dataset: {}'.format(df.isna().sum().sum()))
print('The size of the cleaned dataset: {}'.format(df.shape))
The number of missing values in the dataset: 0
The size of the cleaned dataset: (331, 54)

As you can see above, now we have no missing values in the remaining dataset. Now the actual manipulation and analysis of the data can take place with our cleaned data that is ready for action.

An inital prediction was attempted without any processing of the data, but as expected a relativly low R^2 score was attained (almost 0.4). The initial idea to improve the score would be to try multiple prediction algorithms and to reduce the dimensionality of the dataset.

Random forest regression, support vector regression and multilayer perceptron regression were all tested to predict the target variable. It turned out to be the multilayer perceptron model from the sklearn neural_network package which performed the best. As for dimensionality reduction, PCA analysis was used as well as utilizing the co-correlation and co-variance within the data set. It turned out to be that dropping columns of low variance (under a given threshold) produced the best result. Using the variance threshold method, a total of 44 columns were dropped, which means (as seen in the code outputs below) using 30 variables in the final model was enough to predict the target.

Feature Selection:¶

In this part we will discuss which columns that we are going to use for prediction and why, or what other techniques that we are going to use before training a model.

  • First, let's have a look at the correlation matrix of the dataset.
In [10]:
import seaborn as sns
corr = df.corr().abs()

fig, ax = plt.subplots(figsize=(15,15))
sns.heatmap(corr, 
        xticklabels=corr.columns,
        yticklabels=corr.columns, ax=ax)
ax.set_title('Absolute Correlation Matrix Heatmap');
  • Our dataset consists of too many columns, so it's really hard to see and analyze the features that are correlated with each other one by one. So instead, we have plotted a heatmap of the absolute values of the correlation matrix to see if there are many number of columns that are correlated with each other.
  • From the heatmap, we see that we have many correlated variables that are highly correlated. Correlated variables generally don't improve the prediction models and they might actually impact the performance of the models in a bad way. So, it seems feasible to consider a dimensionality reduction technique right away from the start.

For dimensionality reduction, we will consider two approaches: PCA and correlation thresholding.

Dimensionality Reduction - Approach 1:¶

Filtering Out 1 of Each Highly Correlated Feature Couples:¶

Principle component analysis, itself, takes care of the correlations already so each of the resulting components are orthogonal to each other. But we also wanted to try implementing a function for filtering out highly correlated columns.

In [11]:
def corr_filter(df_in, thresh):
    corr = df_in.corr().abs()
    up_tri = corr.where(np.triu(np.ones(corr.shape), k=1).astype(np.bool)) # Get the upper triangle.
    to_drop = [column for column in up_tri.columns if any(up_tri[column] > thresh)]
    df_in = df_in.drop(to_drop, axis=1)
    return df_in
In [12]:
filtered = corr_filter(df.copy().drop('CO2 Emissions per Capita (metric tonnes)', axis=1), 0.5)

filtered.info()

import seaborn as sns
corr = filtered.corr().abs()

fig, ax = plt.subplots(figsize=(10,10))
sns.heatmap(corr, 
        xticklabels=corr.columns,
        yticklabels=corr.columns, ax=ax)
ax.set_title('Filtered Dataset\'s Absolute Correlation Matrix Heatmap');
<class 'pandas.core.frame.DataFrame'>
Int64Index: 331 entries, 285 to 225
Data columns (total 16 columns):
 #   Column                                     Non-Null Count  Dtype  
---  ------                                     --------------  -----  
 0   clusterID                                  331 non-null    int64  
 1   Country                                    331 non-null    int32  
 2   Gasoline Pump Price (USD/liter)            331 non-null    float64
 3   Subway Length (km)                         331 non-null    float64
 4   BRT Length (km)                            331 non-null    float64
 5   BRT System Length Density (per km)         331 non-null    float64
 6   BRT Stations per Hundred Thousand Persons  331 non-null    float64
 7   BRT Fleet per Hundred Thousand Persons     331 non-null    float64
 8   Bikeshare Stations                         331 non-null    float64
 9   Bikeshare Age (years)                      331 non-null    float64
 10  Population Density (per sq. km)            331 non-null    int64  
 11  Digital Penetration                        331 non-null    float64
 12  Street length total (m)                    331 non-null    float64
 13  Street Length Density (m/sq. km)           331 non-null    float64
 14  Degree Average                             331 non-null    float64
 15  Streets per Node                           331 non-null    float64
dtypes: float64(13), int32(1), int64(2)
memory usage: 42.7 KB
  • Again, there's no rule golden rule for deciding what should be the threshold to accept a correlation to be high, we decided get rid of one of the columns that are correlated with each other with a score higher than 0.5. Please note that we didn't include the target column in correlation filtering

As you can see by the output above there are no highly correlated features left in the dataset. The remaining number of columns is 15 (from 53).

Dimensionality Reduction: Approach 2:¶

Principle Component Analysis:¶

For PCA, the inputs must have unit variance and zero mean, otherwise PCA results will be biased over variables that has larger scaling, which is technically faulty. So we use standardization on the dataset that is going to be used with PCA, but not yet on the filtered one.

In [13]:
# Define the standardization function.
def standardize_dataframe(df_in):
    return (df_in-df_in.mean())/df_in.std()
In [14]:
# PCA inputs without filtering:
df_target = df['CO2 Emissions per Capita (metric tonnes)']
df_inputs = standardize_dataframe(df.drop('CO2 Emissions per Capita (metric tonnes)', axis=1))
  • In the following cells we create train and test splits with correlation filtered and non-filtered datasets, seperately.
In [15]:
# Perform PCA
from sklearn.decomposition import PCA

pca = PCA(n_components=.999999)

pca.fit(df_inputs)
expl=pca.explained_variance_ratio_
cdf=[sum(expl[:i+1]) for i in range(len(expl))]

fig, ax = plt.subplots(nrows=1, ncols=2, figsize=(15,5))

ax[0].bar(range(len(expl)),pca.explained_variance_ratio_)
ax[0].set_xlabel('Principle Component')
ax[0].set_ylabel('Total Variance Explained (%)')
ax[0].set_title('Total Variance Explained (%) By Each Principle Component')


ax[1].plot(range(len(expl)), cdf, marker='o', color='r');
ax[1].set_xlabel('The First $n$ Principle Components')
ax[1].set_ylabel('Variance Explained (%)')
ax[1].set_title('Cumulative Variance Explained (%)')
Out[15]:
Text(0.5, 1.0, 'Cumulative Variance Explained (%)')

By the plots above you can see which principle component explains how much of the total variance in the dataset and the cumulative explained variance by choosing the first $n$ principle components.

  • Trying out a few alternatives we decided to use the first 18 principle components as it's on the elbow of the plot, and still explains 90% of the total variation. In the cell below, we perform the transformations on our standardized inputs according to the first 18 principle components.
In [16]:
print('The dimensions of the input dataset before PC transformation: {}'.format(df_inputs.shape))
pca18 = PCA(n_components=18)
df_inputs_PCA = pca18.fit_transform(df_inputs)
print('The dimensions of the input dataset after dimensionality reduction: {}'.format(df_inputs_PCA.shape))
The dimensions of the input dataset before PC transformation: (331, 53)
The dimensions of the input dataset after dimensionality reduction: (331, 18)

As you can see above the number of features to be used decreased from 53 to 18.

The Prediction Models¶

1. Forming the Test and the Training Data:¶

In the following cell the test and train splits are created for both the PCA and the correlation filtering cases.

We make sure that the dataset is not shuffled. Setting the shuffle parameter in train_test_split() ensures that we don't shuffle the data, and by default it takes the training portion from the top rows of the dataset first then uses the bottom remaining ones as the test split, so the splits here are formed correctly as wanted in the challenge document.

  • We will also apply min/max scaling(because unlike in PCA case, we don't need to sustain unit variance and zero mean here) on the correlation filtered dataset as an extra case to see if it will provide any better results.
In [17]:
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import minmax_scale

# PCA outputs are splitted:
tr_inputs, ts_inputs, tr_target, ts_target = \
                        train_test_split(df_inputs_PCA, df_target, test_size=0.25, \
                                         random_state=42, shuffle=False)

# Creating splits where the dimensionality reduction is performed using correlation filtering:
fil_df_target = df['CO2 Emissions per Capita (metric tonnes)']
fil_df_inputs = filtered
fil_tr_inputs, fil_ts_inputs, fil_tr_target, fil_ts_target = \
                        train_test_split(fil_df_inputs, fil_df_target, test_size=0.25,\
                                         random_state=42, shuffle=False)

sc_fil_df_target = df['CO2 Emissions per Capita (metric tonnes)']
sc_fil_df_inputs = minmax_scale(filtered)

sc_fil_tr_inputs, sc_fil_ts_inputs, sc_fil_tr_target, sc_fil_ts_target = \
                        train_test_split(sc_fil_df_inputs, sc_fil_df_target, test_size=0.25,\
                                         random_state=42, shuffle=False)
  • Since both of our datasets have so many dimensions, it's not possible to visualize them as they are. So we directly start training different types of models.

2. Training Different Models and Hyperparameter Tuning¶

In this section, we try training different models to see if we could hit the goal ($R^2$>=0.55) or how good results that we can get with each model.

In the cell below we will train Linear Regression models as baselines for:

  • PCA performed data
  • Non-Scaled Correlation filtered data

and we calculate $R^2$-scores for the models found for each. Please note that Linear Regression is not susceptible against non-standardized data, so we don't standardize or normalize the correlation filtered dataset.

After the next cell, we will proceed with training other models. At the end, we will plot all the results on the same figure for performance comparison.

In [18]:
from sklearn.model_selection import GridSearchCV
from sklearn.linear_model import LinearRegression, Ridge

R2_lin_PCA = LinearRegression().fit(tr_inputs, tr_target).score(ts_inputs, ts_target)
R2_lin_Corr = LinearRegression().fit(fil_tr_inputs, fil_tr_target).score(fil_ts_inputs, fil_ts_target)

In the cell below we carry out hyperparameter tuning via GridSearchCV() on:

  • Random Forest Regression with PCA performed data
  • Random Forest Regression with Non-Scaled Correlation filtered data

and we calculate $R^2$-scores for the best models found for each.

Please note since Random Forests are robust against non-standardized data, so we don't need to train a new model for the standardized or the normalized version of the correlation filtered dataset.

In [19]:
from sklearn.ensemble import RandomForestRegressor

# # Parameters to perform search on.
# params = {
#     'bootstrap': [False, True],
#     'max_depth': [5, 15, 25, 35],
#     'min_samples_leaf': [2, 4, 8, 15],
#     'min_samples_split': [5, 10, 15, 20],
#     'n_estimators': [50, 100, 150, 200]
# }
# regrPCA = RandomForestRegressor()
# regrCor = RandomForestRegressor()

# gridPCA_rf = GridSearchCV(estimator = regrPCA, param_grid = params, 
#                           cv = 3, n_jobs = -1, verbose = 2)
# gridCor_rf = GridSearchCV(estimator = regrCor, param_grid = params, 
#                           cv = 3, n_jobs = -1, verbose = 2)

# gridPCA_rf.fit(tr_inputs, tr_target)
# gridCor_rf.fit(fil_tr_inputs, fil_tr_target)

# R2_rf_PCA = gridPCA_rf.best_estimator_.score(ts_inputs, ts_target) # R2 score of the best model for PCA
# R2_rf_Cor = gridCor_rf.best_estimator_.score(fil_ts_inputs, fil_ts_target) # R2 score of the best model for correlation filtered dataset.

params = {'bootstrap': True,
 'criterion': 'mse',
 'max_depth': None,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 1,
 'min_samples_split': 5,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 100,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': 42,
 'verbose': 0,
 'warm_start': False}

# Parameters to perform search on.
R2_rf_PCA = RandomForestRegressor(**params)\
                .fit(tr_inputs, tr_target).score(ts_inputs, ts_target)

params = {'bootstrap': True,
 'criterion': 'mse',
 'max_depth': None,
 'max_features': 'auto',
 'max_leaf_nodes': None,
 'min_impurity_decrease': 0.0,
 'min_impurity_split': None,
 'min_samples_leaf': 5,
 'min_samples_split': 2,
 'min_weight_fraction_leaf': 0.0,
 'n_estimators': 150,
 'n_jobs': -1,
 'oob_score': False,
 'random_state': 42,
 'verbose': 0,
 'warm_start': False}
R2_rf_Corr = RandomForestRegressor(**params)\
                .fit(fil_tr_inputs, fil_tr_target).score(fil_ts_inputs, fil_ts_target)
  • We tried many different combinations of parameters and also ran grid search(which take a lot of time so we commented out the code here). We couldn't exceed an $R^2$-score of 0.55 so we gave up.

In the cell below we carry out hyperparameter tuning for Both Linear/Non-linear Support Vector Regression via GridSearchCV() on:

  • PCA performed data
  • Standardized and normalized correlation filtered data

Since SVR required the inputs to be standardized, we didn't create a model with the non-standardized version of the correlation filtered dataset.

In [20]:
from sklearn.svm import SVR

# params = {'kernel':['linear', 'rbf' , 'sigmoid'],
#           'C':[0.001, 0.01, 0.1, 1, 10],
#           'gamma':['scale', 'auto']}

# regrSVR_PCA = SVR()
# regrSVR_Cor = SVR()
# regrSVR_Cor_scale = SVR()

# gridPCA_SVR = GridSearchCV(estimator = regrSVR_PCA, param_grid = params, 
#                           cv = 3, n_jobs = -1, verbose = 2)
# gridCor_SVR = GridSearchCV(estimator = regrSVR_Cor, param_grid = params, 
#                           cv = 3, n_jobs = -1, verbose = 2)
# gridCor_scaled_SVR = GridSearchCV(estimator = regrSVR_Cor_scale, param_grid = params, 
#                           cv = 3, n_jobs = -1, verbose = 2)

# gridPCA_SVR.fit(tr_inputs, tr_target)
# gridCor_SVR.fit(fil_tr_inputs, fil_tr_target)
# gridCor_scaled_SVR.fit(sc_fil_tr_inputs, sc_fil_tr_target)

# R2_svr_PCA = gridPCA_SVR.best_estimator_.score(ts_inputs, ts_target)
# R2_svr_Cor = gridCor_SVR.best_estimator_.score(fil_ts_inputs, fil_ts_target)
# R2_svr_Cor_scaled = gridCor_scaled_SVR.best_estimator_.score(sc_fil_ts_inputs, sc_fil_ts_target)

params = {'kernel':'linear',
          'C': 1,
          'gamma': 'auto',
          'epsilon': 3.12}
R2_svr_PCA = SVR(**params).fit(tr_inputs, tr_target).score(ts_inputs, ts_target)


params = {'kernel':'linear',
          'C': 1,
          'gamma': 'auto',
          'epsilon': 2.5}
R2_svr_Cor_scaled = SVR(**params).fit(sc_fil_tr_inputs, sc_fil_tr_target).score(sc_fil_ts_inputs, sc_fil_ts_target)

Linear kernel seemed to perform better. But again, we tried many different combinations of parameters and also ran grid search. But we again couldn't manage to exceed an 𝑅2-score of 0.55 with SVR so we gave up.

In the cell below we defined and tuned (manually) a Neural Network on:

  • PCA performed data

Apparently we managed to exceed the goal value with an $R^2$-score=~0.62 which is bigger than 0.55 so we didn't train another model further on.

In [21]:
# Perform predictions

from sklearn.model_selection import RandomizedSearchCV
from sklearn.metrics import r2_score
from keras.models import Sequential
from keras.layers import Dense, Dropout, BatchNormalization
from keras.optimizers import Adam

# define the keras model
modelPCA = Sequential()
modelPCA.add(Dense(128, input_dim=np.shape(tr_inputs)[1], activation='relu'))
modelPCA.add(Dense(128, activation='relu'))
modelPCA.add(Dense(128, activation='relu'))

modelPCA.add(Dense(1, activation='linear')) # Output layer.


# compile the keras model
opt = Adam(learning_rate=2e-5)
modelPCA.compile(loss='mean_squared_error', optimizer=opt, metrics=['mean_squared_error'])
modelPCA.fit(tr_inputs, tr_target, epochs=900, batch_size=24, verbose=1) # Do not print the progress since 3500 epochs.
Epoch 1/900
11/11 [==============================] - 0s 974us/step - loss: 89.0216 - mean_squared_error: 89.0216
Epoch 2/900
11/11 [==============================] - 0s 980us/step - loss: 88.0679 - mean_squared_error: 88.0679
Epoch 3/900
11/11 [==============================] - 0s 902us/step - loss: 87.1243 - mean_squared_error: 87.1243
Epoch 4/900
11/11 [==============================] - 0s 1ms/step - loss: 86.1839 - mean_squared_error: 86.1839
Epoch 5/900
11/11 [==============================] - 0s 1ms/step - loss: 85.2370 - mean_squared_error: 85.2370
Epoch 6/900
11/11 [==============================] - 0s 1ms/step - loss: 84.3196 - mean_squared_error: 84.3196
Epoch 7/900
11/11 [==============================] - 0s 1ms/step - loss: 83.4016 - mean_squared_error: 83.4016
Epoch 8/900
11/11 [==============================] - 0s 2ms/step - loss: 82.4843 - mean_squared_error: 82.4843
Epoch 9/900
11/11 [==============================] - 0s 960us/step - loss: 81.5891 - mean_squared_error: 81.5891
Epoch 10/900
11/11 [==============================] - 0s 1ms/step - loss: 80.6338 - mean_squared_error: 80.6338
Epoch 11/900
11/11 [==============================] - 0s 1ms/step - loss: 79.7399 - mean_squared_error: 79.7399
Epoch 12/900
11/11 [==============================] - 0s 978us/step - loss: 78.8079 - mean_squared_error: 78.8079
Epoch 13/900
11/11 [==============================] - 0s 1ms/step - loss: 77.8818 - mean_squared_error: 77.8818
Epoch 14/900
11/11 [==============================] - 0s 1ms/step - loss: 76.9372 - mean_squared_error: 76.9372
Epoch 15/900
11/11 [==============================] - 0s 1ms/step - loss: 75.9911 - mean_squared_error: 75.9911
Epoch 16/900
11/11 [==============================] - 0s 1ms/step - loss: 75.0606 - mean_squared_error: 75.0606
Epoch 17/900
11/11 [==============================] - 0s 1ms/step - loss: 74.1104 - mean_squared_error: 74.1104
Epoch 18/900
11/11 [==============================] - 0s 1ms/step - loss: 73.1656 - mean_squared_error: 73.1656
Epoch 19/900
11/11 [==============================] - 0s 993us/step - loss: 72.2071 - mean_squared_error: 72.2071
Epoch 20/900
11/11 [==============================] - 0s 1ms/step - loss: 71.2779 - mean_squared_error: 71.2779
Epoch 21/900
11/11 [==============================] - 0s 1ms/step - loss: 70.3087 - mean_squared_error: 70.3087
Epoch 22/900
11/11 [==============================] - 0s 1ms/step - loss: 69.3846 - mean_squared_error: 69.3846
Epoch 23/900
11/11 [==============================] - 0s 1ms/step - loss: 68.4717 - mean_squared_error: 68.4717
Epoch 24/900
11/11 [==============================] - 0s 996us/step - loss: 67.5113 - mean_squared_error: 67.5113
Epoch 25/900
11/11 [==============================] - 0s 1ms/step - loss: 66.5757 - mean_squared_error: 66.5757
Epoch 26/900
11/11 [==============================] - 0s 1ms/step - loss: 65.6432 - mean_squared_error: 65.6432
Epoch 27/900
11/11 [==============================] - 0s 1ms/step - loss: 64.6748 - mean_squared_error: 64.6748
Epoch 28/900
11/11 [==============================] - 0s 1ms/step - loss: 63.7454 - mean_squared_error: 63.7454
Epoch 29/900
11/11 [==============================] - 0s 1ms/step - loss: 62.8353 - mean_squared_error: 62.8353
Epoch 30/900
11/11 [==============================] - 0s 1ms/step - loss: 61.8613 - mean_squared_error: 61.8613
Epoch 31/900
11/11 [==============================] - 0s 1ms/step - loss: 60.9213 - mean_squared_error: 60.9213
Epoch 32/900
11/11 [==============================] - 0s 1ms/step - loss: 59.9895 - mean_squared_error: 59.9895
Epoch 33/900
11/11 [==============================] - 0s 1ms/step - loss: 59.1051 - mean_squared_error: 59.1051
Epoch 34/900
11/11 [==============================] - 0s 1ms/step - loss: 58.2009 - mean_squared_error: 58.2009
Epoch 35/900
11/11 [==============================] - 0s 1ms/step - loss: 57.2895 - mean_squared_error: 57.2895
Epoch 36/900
11/11 [==============================] - 0s 1ms/step - loss: 56.3806 - mean_squared_error: 56.3806
Epoch 37/900
11/11 [==============================] - 0s 1ms/step - loss: 55.4911 - mean_squared_error: 55.4911
Epoch 38/900
11/11 [==============================] - 0s 997us/step - loss: 54.5737 - mean_squared_error: 54.5737
Epoch 39/900
11/11 [==============================] - 0s 1ms/step - loss: 53.7199 - mean_squared_error: 53.7199
Epoch 40/900
11/11 [==============================] - 0s 1ms/step - loss: 52.8602 - mean_squared_error: 52.8602
Epoch 41/900
11/11 [==============================] - 0s 1ms/step - loss: 51.9767 - mean_squared_error: 51.9767
Epoch 42/900
11/11 [==============================] - 0s 1ms/step - loss: 51.2062 - mean_squared_error: 51.2062
Epoch 43/900
11/11 [==============================] - 0s 1ms/step - loss: 50.3391 - mean_squared_error: 50.3391
Epoch 44/900
11/11 [==============================] - 0s 1ms/step - loss: 49.4482 - mean_squared_error: 49.4482
Epoch 45/900
11/11 [==============================] - 0s 994us/step - loss: 48.6774 - mean_squared_error: 48.6774
Epoch 46/900
11/11 [==============================] - 0s 973us/step - loss: 47.8209 - mean_squared_error: 47.8209
Epoch 47/900
11/11 [==============================] - 0s 1ms/step - loss: 46.9915 - mean_squared_error: 46.9915
Epoch 48/900
11/11 [==============================] - 0s 970us/step - loss: 46.1952 - mean_squared_error: 46.1952
Epoch 49/900
11/11 [==============================] - 0s 816us/step - loss: 45.3826 - mean_squared_error: 45.3826
Epoch 50/900
11/11 [==============================] - 0s 1ms/step - loss: 44.5765 - mean_squared_error: 44.5765
Epoch 51/900
11/11 [==============================] - 0s 1ms/step - loss: 43.7521 - mean_squared_error: 43.7521
Epoch 52/900
11/11 [==============================] - 0s 1ms/step - loss: 42.9908 - mean_squared_error: 42.9908
Epoch 53/900
11/11 [==============================] - 0s 907us/step - loss: 42.1772 - mean_squared_error: 42.1772
Epoch 54/900
11/11 [==============================] - 0s 1ms/step - loss: 41.3900 - mean_squared_error: 41.3900
Epoch 55/900
11/11 [==============================] - 0s 816us/step - loss: 40.6250 - mean_squared_error: 40.6250
Epoch 56/900
11/11 [==============================] - 0s 888us/step - loss: 39.8734 - mean_squared_error: 39.8734
Epoch 57/900
11/11 [==============================] - 0s 684us/step - loss: 39.1050 - mean_squared_error: 39.1050
Epoch 58/900
11/11 [==============================] - 0s 786us/step - loss: 38.4019 - mean_squared_error: 38.4019
Epoch 59/900
11/11 [==============================] - 0s 872us/step - loss: 37.6638 - mean_squared_error: 37.6638
Epoch 60/900
11/11 [==============================] - 0s 852us/step - loss: 36.9613 - mean_squared_error: 36.9613
Epoch 61/900
11/11 [==============================] - 0s 725us/step - loss: 36.1958 - mean_squared_error: 36.1958
Epoch 62/900
11/11 [==============================] - 0s 765us/step - loss: 35.4883 - mean_squared_error: 35.4883
Epoch 63/900
11/11 [==============================] - 0s 801us/step - loss: 34.7967 - mean_squared_error: 34.7967
Epoch 64/900
11/11 [==============================] - 0s 726us/step - loss: 34.1114 - mean_squared_error: 34.1114
Epoch 65/900
11/11 [==============================] - 0s 816us/step - loss: 33.4345 - mean_squared_error: 33.4345
Epoch 66/900
11/11 [==============================] - 0s 805us/step - loss: 32.7650 - mean_squared_error: 32.7650
Epoch 67/900
11/11 [==============================] - 0s 816us/step - loss: 32.0852 - mean_squared_error: 32.0852
Epoch 68/900
11/11 [==============================] - 0s 782us/step - loss: 31.4421 - mean_squared_error: 31.4421
Epoch 69/900
11/11 [==============================] - 0s 725us/step - loss: 30.7721 - mean_squared_error: 30.7721
Epoch 70/900
11/11 [==============================] - 0s 790us/step - loss: 30.1441 - mean_squared_error: 30.1441
Epoch 71/900
11/11 [==============================] - 0s 786us/step - loss: 29.5527 - mean_squared_error: 29.5527
Epoch 72/900
11/11 [==============================] - 0s 816us/step - loss: 28.9242 - mean_squared_error: 28.9242
Epoch 73/900
11/11 [==============================] - 0s 725us/step - loss: 28.3634 - mean_squared_error: 28.3634
Epoch 74/900
11/11 [==============================] - 0s 808us/step - loss: 27.7748 - mean_squared_error: 27.7748
Epoch 75/900
11/11 [==============================] - 0s 725us/step - loss: 27.1735 - mean_squared_error: 27.1735
Epoch 76/900
11/11 [==============================] - 0s 790us/step - loss: 26.6333 - mean_squared_error: 26.6333
Epoch 77/900
11/11 [==============================] - 0s 712us/step - loss: 26.0508 - mean_squared_error: 26.0508
Epoch 78/900
11/11 [==============================] - 0s 725us/step - loss: 25.5028 - mean_squared_error: 25.5028
Epoch 79/900
11/11 [==============================] - 0s 855us/step - loss: 24.9572 - mean_squared_error: 24.9572
Epoch 80/900
11/11 [==============================] - 0s 867us/step - loss: 24.4399 - mean_squared_error: 24.4399
Epoch 81/900
11/11 [==============================] - 0s 783us/step - loss: 23.9302 - mean_squared_error: 23.9302
Epoch 82/900
11/11 [==============================] - 0s 907us/step - loss: 23.4646 - mean_squared_error: 23.4646
Epoch 83/900
11/11 [==============================] - 0s 816us/step - loss: 22.9914 - mean_squared_error: 22.9914
Epoch 84/900
11/11 [==============================] - 0s 854us/step - loss: 22.5134 - mean_squared_error: 22.5134
Epoch 85/900
11/11 [==============================] - 0s 821us/step - loss: 22.0841 - mean_squared_error: 22.0841
Epoch 86/900
11/11 [==============================] - 0s 793us/step - loss: 21.6299 - mean_squared_error: 21.6299
Epoch 87/900
11/11 [==============================] - 0s 816us/step - loss: 21.2162 - mean_squared_error: 21.2162
Epoch 88/900
11/11 [==============================] - 0s 725us/step - loss: 20.7966 - mean_squared_error: 20.7966
Epoch 89/900
11/11 [==============================] - 0s 816us/step - loss: 20.4380 - mean_squared_error: 20.4380
Epoch 90/900
11/11 [==============================] - 0s 767us/step - loss: 20.0431 - mean_squared_error: 20.0431
Epoch 91/900
11/11 [==============================] - 0s 693us/step - loss: 19.6868 - mean_squared_error: 19.6868
Epoch 92/900
11/11 [==============================] - 0s 907us/step - loss: 19.3508 - mean_squared_error: 19.3508
Epoch 93/900
11/11 [==============================] - 0s 725us/step - loss: 19.0252 - mean_squared_error: 19.0252
Epoch 94/900
11/11 [==============================] - 0s 720us/step - loss: 18.7215 - mean_squared_error: 18.7215
Epoch 95/900
11/11 [==============================] - 0s 758us/step - loss: 18.4034 - mean_squared_error: 18.4034
Epoch 96/900
11/11 [==============================] - 0s 635us/step - loss: 18.1236 - mean_squared_error: 18.1236
Epoch 97/900
11/11 [==============================] - 0s 816us/step - loss: 17.8374 - mean_squared_error: 17.8374
Epoch 98/900
11/11 [==============================] - 0s 883us/step - loss: 17.5596 - mean_squared_error: 17.5596
Epoch 99/900
11/11 [==============================] - 0s 714us/step - loss: 17.3208 - mean_squared_error: 17.3208
Epoch 100/900
11/11 [==============================] - 0s 725us/step - loss: 17.0640 - mean_squared_error: 17.0640
Epoch 101/900
11/11 [==============================] - 0s 816us/step - loss: 16.8272 - mean_squared_error: 16.8272
Epoch 102/900
11/11 [==============================] - 0s 787us/step - loss: 16.5976 - mean_squared_error: 16.5976
Epoch 103/900
11/11 [==============================] - 0s 684us/step - loss: 16.3803 - mean_squared_error: 16.3803
Epoch 104/900
11/11 [==============================] - 0s 725us/step - loss: 16.1911 - mean_squared_error: 16.1911
Epoch 105/900
11/11 [==============================] - 0s 897us/step - loss: 15.9864 - mean_squared_error: 15.9864
Epoch 106/900
11/11 [==============================] - 0s 884us/step - loss: 15.7887 - mean_squared_error: 15.7887
Epoch 107/900
11/11 [==============================] - 0s 800us/step - loss: 15.6122 - mean_squared_error: 15.6122
Epoch 108/900
11/11 [==============================] - 0s 899us/step - loss: 15.4240 - mean_squared_error: 15.4240
Epoch 109/900
11/11 [==============================] - 0s 985us/step - loss: 15.2525 - mean_squared_error: 15.2525
Epoch 110/900
11/11 [==============================] - 0s 1ms/step - loss: 15.1045 - mean_squared_error: 15.1045
Epoch 111/900
11/11 [==============================] - 0s 923us/step - loss: 14.9427 - mean_squared_error: 14.9427
Epoch 112/900
11/11 [==============================] - 0s 994us/step - loss: 14.8009 - mean_squared_error: 14.8009
Epoch 113/900
11/11 [==============================] - 0s 807us/step - loss: 14.6542 - mean_squared_error: 14.6542
Epoch 114/900
11/11 [==============================] - 0s 960us/step - loss: 14.5244 - mean_squared_error: 14.5244
Epoch 115/900
11/11 [==============================] - 0s 948us/step - loss: 14.4095 - mean_squared_error: 14.4095
Epoch 116/900
11/11 [==============================] - 0s 817us/step - loss: 14.2794 - mean_squared_error: 14.2794
Epoch 117/900
11/11 [==============================] - 0s 813us/step - loss: 14.1656 - mean_squared_error: 14.1656
Epoch 118/900
11/11 [==============================] - 0s 880us/step - loss: 14.0617 - mean_squared_error: 14.0617
Epoch 119/900
11/11 [==============================] - 0s 875us/step - loss: 13.9695 - mean_squared_error: 13.9695
Epoch 120/900
11/11 [==============================] - 0s 907us/step - loss: 13.8697 - mean_squared_error: 13.8697
Epoch 121/900
11/11 [==============================] - 0s 953us/step - loss: 13.7751 - mean_squared_error: 13.7751
Epoch 122/900
11/11 [==============================] - 0s 781us/step - loss: 13.7027 - mean_squared_error: 13.7027
Epoch 123/900
11/11 [==============================] - 0s 907us/step - loss: 13.6174 - mean_squared_error: 13.6174
Epoch 124/900
11/11 [==============================] - 0s 938us/step - loss: 13.5347 - mean_squared_error: 13.5347
Epoch 125/900
11/11 [==============================] - 0s 977us/step - loss: 13.4534 - mean_squared_error: 13.4534
Epoch 126/900
11/11 [==============================] - 0s 816us/step - loss: 13.3816 - mean_squared_error: 13.3816
Epoch 127/900
11/11 [==============================] - 0s 897us/step - loss: 13.3152 - mean_squared_error: 13.3152
Epoch 128/900
11/11 [==============================] - 0s 896us/step - loss: 13.2533 - mean_squared_error: 13.2533
Epoch 129/900
11/11 [==============================] - 0s 987us/step - loss: 13.1870 - mean_squared_error: 13.1870
Epoch 130/900
11/11 [==============================] - 0s 836us/step - loss: 13.1289 - mean_squared_error: 13.1289
Epoch 131/900
11/11 [==============================] - 0s 907us/step - loss: 13.0676 - mean_squared_error: 13.0676
Epoch 132/900
11/11 [==============================] - 0s 887us/step - loss: 13.0152 - mean_squared_error: 13.0152
Epoch 133/900
11/11 [==============================] - 0s 812us/step - loss: 12.9645 - mean_squared_error: 12.9645
Epoch 134/900
11/11 [==============================] - 0s 725us/step - loss: 12.9223 - mean_squared_error: 12.9223
Epoch 135/900
11/11 [==============================] - 0s 802us/step - loss: 12.8677 - mean_squared_error: 12.8677
Epoch 136/900
11/11 [==============================] - 0s 816us/step - loss: 12.8187 - mean_squared_error: 12.8187
Epoch 137/900
11/11 [==============================] - 0s 904us/step - loss: 12.7702 - mean_squared_error: 12.7702
Epoch 138/900
11/11 [==============================] - 0s 889us/step - loss: 12.7297 - mean_squared_error: 12.7297
Epoch 139/900
11/11 [==============================] - 0s 998us/step - loss: 12.6843 - mean_squared_error: 12.6843
Epoch 140/900
11/11 [==============================] - 0s 993us/step - loss: 12.6471 - mean_squared_error: 12.6471
Epoch 141/900
11/11 [==============================] - 0s 997us/step - loss: 12.6034 - mean_squared_error: 12.6034
Epoch 142/900
11/11 [==============================] - 0s 939us/step - loss: 12.5727 - mean_squared_error: 12.5727
Epoch 143/900
11/11 [==============================] - 0s 862us/step - loss: 12.5212 - mean_squared_error: 12.5212
Epoch 144/900
11/11 [==============================] - 0s 886us/step - loss: 12.4886 - mean_squared_error: 12.4886
Epoch 145/900
11/11 [==============================] - 0s 866us/step - loss: 12.4530 - mean_squared_error: 12.4530
Epoch 146/900
11/11 [==============================] - 0s 880us/step - loss: 12.4325 - mean_squared_error: 12.4325
Epoch 147/900
11/11 [==============================] - 0s 868us/step - loss: 12.3854 - mean_squared_error: 12.3854
Epoch 148/900
11/11 [==============================] - 0s 779us/step - loss: 12.3562 - mean_squared_error: 12.3562
Epoch 149/900
11/11 [==============================] - 0s 879us/step - loss: 12.3299 - mean_squared_error: 12.3299
Epoch 150/900
11/11 [==============================] - 0s 917us/step - loss: 12.2966 - mean_squared_error: 12.2966
Epoch 151/900
11/11 [==============================] - 0s 907us/step - loss: 12.2699 - mean_squared_error: 12.2699
Epoch 152/900
11/11 [==============================] - 0s 898us/step - loss: 12.2470 - mean_squared_error: 12.2470
Epoch 153/900
11/11 [==============================] - 0s 907us/step - loss: 12.2120 - mean_squared_error: 12.2120
Epoch 154/900
11/11 [==============================] - 0s 721us/step - loss: 12.1883 - mean_squared_error: 12.1883
Epoch 155/900
11/11 [==============================] - 0s 816us/step - loss: 12.1562 - mean_squared_error: 12.1562
Epoch 156/900
11/11 [==============================] - 0s 816us/step - loss: 12.1253 - mean_squared_error: 12.1253
Epoch 157/900
11/11 [==============================] - 0s 972us/step - loss: 12.0976 - mean_squared_error: 12.0976
Epoch 158/900
11/11 [==============================] - 0s 882us/step - loss: 12.0767 - mean_squared_error: 12.0767
Epoch 159/900
11/11 [==============================] - 0s 910us/step - loss: 12.0519 - mean_squared_error: 12.0519
Epoch 160/900
11/11 [==============================] - 0s 907us/step - loss: 12.0305 - mean_squared_error: 12.0305
Epoch 161/900
11/11 [==============================] - 0s 810us/step - loss: 12.0048 - mean_squared_error: 12.0048
Epoch 162/900
11/11 [==============================] - 0s 907us/step - loss: 11.9811 - mean_squared_error: 11.9811
Epoch 163/900
11/11 [==============================] - 0s 816us/step - loss: 11.9589 - mean_squared_error: 11.9589
Epoch 164/900
11/11 [==============================] - 0s 726us/step - loss: 11.9422 - mean_squared_error: 11.9422
Epoch 165/900
11/11 [==============================] - 0s 791us/step - loss: 11.9174 - mean_squared_error: 11.9174
Epoch 166/900
11/11 [==============================] - 0s 905us/step - loss: 11.8949 - mean_squared_error: 11.8949
Epoch 167/900
11/11 [==============================] - 0s 846us/step - loss: 11.8795 - mean_squared_error: 11.8795
Epoch 168/900
11/11 [==============================] - 0s 901us/step - loss: 11.8555 - mean_squared_error: 11.8555
Epoch 169/900
11/11 [==============================] - 0s 800us/step - loss: 11.8470 - mean_squared_error: 11.8470
Epoch 170/900
11/11 [==============================] - 0s 872us/step - loss: 11.8137 - mean_squared_error: 11.8137
Epoch 171/900
11/11 [==============================] - 0s 897us/step - loss: 11.7905 - mean_squared_error: 11.7905
Epoch 172/900
11/11 [==============================] - 0s 904us/step - loss: 11.7747 - mean_squared_error: 11.7747
Epoch 173/900
11/11 [==============================] - 0s 984us/step - loss: 11.7539 - mean_squared_error: 11.7539
Epoch 174/900
11/11 [==============================] - 0s 997us/step - loss: 11.7321 - mean_squared_error: 11.7321
Epoch 175/900
11/11 [==============================] - 0s 996us/step - loss: 11.7224 - mean_squared_error: 11.7224
Epoch 176/900
11/11 [==============================] - 0s 895us/step - loss: 11.6993 - mean_squared_error: 11.6993
Epoch 177/900
11/11 [==============================] - 0s 819us/step - loss: 11.6785 - mean_squared_error: 11.6785
Epoch 178/900
11/11 [==============================] - 0s 935us/step - loss: 11.6635 - mean_squared_error: 11.6635
Epoch 179/900
11/11 [==============================] - 0s 867us/step - loss: 11.6499 - mean_squared_error: 11.6499
Epoch 180/900
11/11 [==============================] - 0s 907us/step - loss: 11.6297 - mean_squared_error: 11.6297
Epoch 181/900
11/11 [==============================] - 0s 742us/step - loss: 11.6056 - mean_squared_error: 11.6056
Epoch 182/900
11/11 [==============================] - 0s 887us/step - loss: 11.5908 - mean_squared_error: 11.5908
Epoch 183/900
11/11 [==============================] - 0s 884us/step - loss: 11.5727 - mean_squared_error: 11.5727
Epoch 184/900
11/11 [==============================] - 0s 885us/step - loss: 11.5592 - mean_squared_error: 11.5592
Epoch 185/900
11/11 [==============================] - 0s 910us/step - loss: 11.5373 - mean_squared_error: 11.5373
Epoch 186/900
11/11 [==============================] - 0s 905us/step - loss: 11.5206 - mean_squared_error: 11.5206
Epoch 187/900
11/11 [==============================] - 0s 997us/step - loss: 11.5047 - mean_squared_error: 11.5047
Epoch 188/900
11/11 [==============================] - 0s 997us/step - loss: 11.4883 - mean_squared_error: 11.4883
Epoch 189/900
11/11 [==============================] - 0s 808us/step - loss: 11.4710 - mean_squared_error: 11.4710
Epoch 190/900
11/11 [==============================] - 0s 907us/step - loss: 11.4585 - mean_squared_error: 11.4585
Epoch 191/900
11/11 [==============================] - ETA: 0s - loss: 16.0966 - mean_squared_error: 16.09 - 0s 908us/step - loss: 11.4396 - mean_squared_error: 11.4396
Epoch 192/900
11/11 [==============================] - 0s 881us/step - loss: 11.4233 - mean_squared_error: 11.4233
Epoch 193/900
11/11 [==============================] - 0s 822us/step - loss: 11.4182 - mean_squared_error: 11.4182
Epoch 194/900
11/11 [==============================] - 0s 895us/step - loss: 11.3938 - mean_squared_error: 11.3938
Epoch 195/900
11/11 [==============================] - 0s 811us/step - loss: 11.3730 - mean_squared_error: 11.3730
Epoch 196/900
11/11 [==============================] - 0s 868us/step - loss: 11.3594 - mean_squared_error: 11.3594
Epoch 197/900
11/11 [==============================] - 0s 964us/step - loss: 11.3416 - mean_squared_error: 11.3416
Epoch 198/900
11/11 [==============================] - 0s 819us/step - loss: 11.3245 - mean_squared_error: 11.3245
Epoch 199/900
11/11 [==============================] - 0s 879us/step - loss: 11.3075 - mean_squared_error: 11.3075
Epoch 200/900
11/11 [==============================] - 0s 816us/step - loss: 11.2964 - mean_squared_error: 11.2964
Epoch 201/900
11/11 [==============================] - 0s 763us/step - loss: 11.2793 - mean_squared_error: 11.2793
Epoch 202/900
11/11 [==============================] - 0s 871us/step - loss: 11.2686 - mean_squared_error: 11.2686
Epoch 203/900
11/11 [==============================] - 0s 816us/step - loss: 11.2548 - mean_squared_error: 11.2548
Epoch 204/900
11/11 [==============================] - 0s 815us/step - loss: 11.2353 - mean_squared_error: 11.2353
Epoch 205/900
11/11 [==============================] - 0s 911us/step - loss: 11.2210 - mean_squared_error: 11.2210
Epoch 206/900
11/11 [==============================] - 0s 869us/step - loss: 11.2315 - mean_squared_error: 11.2315
Epoch 207/900
11/11 [==============================] - 0s 720us/step - loss: 11.2010 - mean_squared_error: 11.2010
Epoch 208/900
11/11 [==============================] - 0s 890us/step - loss: 11.1825 - mean_squared_error: 11.1825
Epoch 209/900
11/11 [==============================] - 0s 819us/step - loss: 11.1678 - mean_squared_error: 11.1678
Epoch 210/900
11/11 [==============================] - 0s 879us/step - loss: 11.1561 - mean_squared_error: 11.1561
Epoch 211/900
11/11 [==============================] - 0s 997us/step - loss: 11.1398 - mean_squared_error: 11.1398
Epoch 212/900
11/11 [==============================] - 0s 903us/step - loss: 11.1275 - mean_squared_error: 11.1275
Epoch 213/900
11/11 [==============================] - 0s 997us/step - loss: 11.1113 - mean_squared_error: 11.1113
Epoch 214/900
11/11 [==============================] - 0s 816us/step - loss: 11.1015 - mean_squared_error: 11.1015
Epoch 215/900
11/11 [==============================] - 0s 907us/step - loss: 11.0844 - mean_squared_error: 11.0844
Epoch 216/900
11/11 [==============================] - 0s 725us/step - loss: 11.0740 - mean_squared_error: 11.0740
Epoch 217/900
11/11 [==============================] - 0s 949us/step - loss: 11.0627 - mean_squared_error: 11.0627
Epoch 218/900
11/11 [==============================] - 0s 894us/step - loss: 11.0608 - mean_squared_error: 11.0608
Epoch 219/900
11/11 [==============================] - 0s 939us/step - loss: 11.0335 - mean_squared_error: 11.0335
Epoch 220/900
11/11 [==============================] - 0s 958us/step - loss: 11.0212 - mean_squared_error: 11.0212
Epoch 221/900
11/11 [==============================] - 0s 818us/step - loss: 11.0066 - mean_squared_error: 11.0066
Epoch 222/900
11/11 [==============================] - 0s 821us/step - loss: 10.9901 - mean_squared_error: 10.9901
Epoch 223/900
11/11 [==============================] - 0s 817us/step - loss: 10.9773 - mean_squared_error: 10.9773
Epoch 224/900
11/11 [==============================] - 0s 862us/step - loss: 10.9663 - mean_squared_error: 10.9663
Epoch 225/900
11/11 [==============================] - 0s 943us/step - loss: 10.9574 - mean_squared_error: 10.9574
Epoch 226/900
11/11 [==============================] - 0s 866us/step - loss: 10.9422 - mean_squared_error: 10.9422
Epoch 227/900
11/11 [==============================] - 0s 896us/step - loss: 10.9275 - mean_squared_error: 10.9275
Epoch 228/900
11/11 [==============================] - 0s 962us/step - loss: 10.9160 - mean_squared_error: 10.9160
Epoch 229/900
11/11 [==============================] - ETA: 0s - loss: 7.1007 - mean_squared_error: 7.10 - 0s 971us/step - loss: 10.9054 - mean_squared_error: 10.9054
Epoch 230/900
11/11 [==============================] - 0s 1ms/step - loss: 10.8925 - mean_squared_error: 10.8925
Epoch 231/900
11/11 [==============================] - 0s 1ms/step - loss: 10.8799 - mean_squared_error: 10.8799
Epoch 232/900
11/11 [==============================] - 0s 965us/step - loss: 10.8616 - mean_squared_error: 10.8616
Epoch 233/900
11/11 [==============================] - 0s 956us/step - loss: 10.8620 - mean_squared_error: 10.8620
Epoch 234/900
11/11 [==============================] - 0s 908us/step - loss: 10.8309 - mean_squared_error: 10.8309
Epoch 235/900
11/11 [==============================] - 0s 870us/step - loss: 10.8311 - mean_squared_error: 10.8311
Epoch 236/900
11/11 [==============================] - 0s 781us/step - loss: 10.8175 - mean_squared_error: 10.8175
Epoch 237/900
11/11 [==============================] - 0s 816us/step - loss: 10.8091 - mean_squared_error: 10.8091
Epoch 238/900
11/11 [==============================] - 0s 817us/step - loss: 10.7899 - mean_squared_error: 10.7899
Epoch 239/900
11/11 [==============================] - 0s 997us/step - loss: 10.7795 - mean_squared_error: 10.7795
Epoch 240/900
11/11 [==============================] - 0s 878us/step - loss: 10.7633 - mean_squared_error: 10.7633
Epoch 241/900
11/11 [==============================] - 0s 785us/step - loss: 10.7544 - mean_squared_error: 10.7544
Epoch 242/900
11/11 [==============================] - 0s 816us/step - loss: 10.7397 - mean_squared_error: 10.7397
Epoch 243/900
11/11 [==============================] - 0s 904us/step - loss: 10.7264 - mean_squared_error: 10.7264
Epoch 244/900
11/11 [==============================] - 0s 990us/step - loss: 10.7128 - mean_squared_error: 10.7128
Epoch 245/900
11/11 [==============================] - 0s 956us/step - loss: 10.7026 - mean_squared_error: 10.7026
Epoch 246/900
11/11 [==============================] - 0s 905us/step - loss: 10.6907 - mean_squared_error: 10.6907
Epoch 247/900
11/11 [==============================] - 0s 882us/step - loss: 10.6787 - mean_squared_error: 10.6787
Epoch 248/900
11/11 [==============================] - 0s 872us/step - loss: 10.6668 - mean_squared_error: 10.6668
Epoch 249/900
11/11 [==============================] - 0s 1ms/step - loss: 10.6556 - mean_squared_error: 10.6556
Epoch 250/900
11/11 [==============================] - 0s 963us/step - loss: 10.6439 - mean_squared_error: 10.6439
Epoch 251/900
11/11 [==============================] - 0s 907us/step - loss: 10.6303 - mean_squared_error: 10.6303
Epoch 252/900
11/11 [==============================] - 0s 815us/step - loss: 10.6156 - mean_squared_error: 10.6156
Epoch 253/900
11/11 [==============================] - 0s 896us/step - loss: 10.6096 - mean_squared_error: 10.6096
Epoch 254/900
11/11 [==============================] - 0s 870us/step - loss: 10.5953 - mean_squared_error: 10.5953
Epoch 255/900
11/11 [==============================] - 0s 1ms/step - loss: 10.5824 - mean_squared_error: 10.5824
Epoch 256/900
11/11 [==============================] - 0s 907us/step - loss: 10.5689 - mean_squared_error: 10.5689
Epoch 257/900
11/11 [==============================] - 0s 907us/step - loss: 10.5593 - mean_squared_error: 10.5593
Epoch 258/900
11/11 [==============================] - 0s 907us/step - loss: 10.5460 - mean_squared_error: 10.5460
Epoch 259/900
11/11 [==============================] - 0s 816us/step - loss: 10.5335 - mean_squared_error: 10.5335
Epoch 260/900
11/11 [==============================] - 0s 725us/step - loss: 10.5262 - mean_squared_error: 10.5262
Epoch 261/900
11/11 [==============================] - 0s 786us/step - loss: 10.5104 - mean_squared_error: 10.5104
Epoch 262/900
11/11 [==============================] - 0s 816us/step - loss: 10.5003 - mean_squared_error: 10.5003
Epoch 263/900
11/11 [==============================] - 0s 956us/step - loss: 10.4858 - mean_squared_error: 10.4858
Epoch 264/900
11/11 [==============================] - 0s 1ms/step - loss: 10.4834 - mean_squared_error: 10.4834
Epoch 265/900
11/11 [==============================] - 0s 997us/step - loss: 10.4676 - mean_squared_error: 10.4676
Epoch 266/900
11/11 [==============================] - 0s 968us/step - loss: 10.4625 - mean_squared_error: 10.4625
Epoch 267/900
11/11 [==============================] - 0s 907us/step - loss: 10.4418 - mean_squared_error: 10.4418
Epoch 268/900
11/11 [==============================] - 0s 816us/step - loss: 10.4280 - mean_squared_error: 10.4280
Epoch 269/900
11/11 [==============================] - 0s 883us/step - loss: 10.4222 - mean_squared_error: 10.4222
Epoch 270/900
11/11 [==============================] - 0s 741us/step - loss: 10.4096 - mean_squared_error: 10.4096
Epoch 271/900
11/11 [==============================] - 0s 984us/step - loss: 10.3963 - mean_squared_error: 10.3963
Epoch 272/900
11/11 [==============================] - 0s 997us/step - loss: 10.3829 - mean_squared_error: 10.3829
Epoch 273/900
11/11 [==============================] - 0s 987us/step - loss: 10.3734 - mean_squared_error: 10.3734
Epoch 274/900
11/11 [==============================] - 0s 905us/step - loss: 10.3603 - mean_squared_error: 10.3603
Epoch 275/900
11/11 [==============================] - 0s 879us/step - loss: 10.3524 - mean_squared_error: 10.3524
Epoch 276/900
11/11 [==============================] - 0s 907us/step - loss: 10.3435 - mean_squared_error: 10.3435
Epoch 277/900
11/11 [==============================] - 0s 916us/step - loss: 10.3309 - mean_squared_error: 10.3309
Epoch 278/900
11/11 [==============================] - 0s 798us/step - loss: 10.3266 - mean_squared_error: 10.3266
Epoch 279/900
11/11 [==============================] - 0s 897us/step - loss: 10.3058 - mean_squared_error: 10.3058
Epoch 280/900
11/11 [==============================] - 0s 779us/step - loss: 10.2979 - mean_squared_error: 10.2979
Epoch 281/900
11/11 [==============================] - 0s 991us/step - loss: 10.2964 - mean_squared_error: 10.2964
Epoch 282/900
11/11 [==============================] - 0s 915us/step - loss: 10.2737 - mean_squared_error: 10.2737
Epoch 283/900
11/11 [==============================] - 0s 819us/step - loss: 10.2639 - mean_squared_error: 10.2639
Epoch 284/900
11/11 [==============================] - 0s 908us/step - loss: 10.2495 - mean_squared_error: 10.2495
Epoch 285/900
11/11 [==============================] - 0s 997us/step - loss: 10.2454 - mean_squared_error: 10.2454
Epoch 286/900
11/11 [==============================] - 0s 907us/step - loss: 10.2271 - mean_squared_error: 10.2271
Epoch 287/900
11/11 [==============================] - 0s 982us/step - loss: 10.2210 - mean_squared_error: 10.2210
Epoch 288/900
11/11 [==============================] - 0s 996us/step - loss: 10.2088 - mean_squared_error: 10.2088
Epoch 289/900
11/11 [==============================] - 0s 786us/step - loss: 10.1987 - mean_squared_error: 10.1987
Epoch 290/900
11/11 [==============================] - 0s 936us/step - loss: 10.1858 - mean_squared_error: 10.1858
Epoch 291/900
11/11 [==============================] - 0s 893us/step - loss: 10.1722 - mean_squared_error: 10.1722
Epoch 292/900
11/11 [==============================] - 0s 989us/step - loss: 10.1614 - mean_squared_error: 10.1614
Epoch 293/900
11/11 [==============================] - 0s 907us/step - loss: 10.1511 - mean_squared_error: 10.1511
Epoch 294/900
11/11 [==============================] - 0s 963us/step - loss: 10.1493 - mean_squared_error: 10.1493
Epoch 295/900
11/11 [==============================] - 0s 868us/step - loss: 10.1417 - mean_squared_error: 10.1417
Epoch 296/900
11/11 [==============================] - 0s 777us/step - loss: 10.1213 - mean_squared_error: 10.1213
Epoch 297/900
11/11 [==============================] - 0s 787us/step - loss: 10.1091 - mean_squared_error: 10.1091
Epoch 298/900
11/11 [==============================] - 0s 997us/step - loss: 10.0993 - mean_squared_error: 10.0993
Epoch 299/900
11/11 [==============================] - 0s 808us/step - loss: 10.0867 - mean_squared_error: 10.0867
Epoch 300/900
11/11 [==============================] - 0s 725us/step - loss: 10.0748 - mean_squared_error: 10.0748
Epoch 301/900
11/11 [==============================] - 0s 905us/step - loss: 10.0816 - mean_squared_error: 10.0816
Epoch 302/900
11/11 [==============================] - 0s 802us/step - loss: 10.0732 - mean_squared_error: 10.0732
Epoch 303/900
11/11 [==============================] - 0s 1ms/step - loss: 10.0475 - mean_squared_error: 10.0475
Epoch 304/900
11/11 [==============================] - 0s 816us/step - loss: 10.0403 - mean_squared_error: 10.0403
Epoch 305/900
11/11 [==============================] - 0s 904us/step - loss: 10.0285 - mean_squared_error: 10.0285
Epoch 306/900
11/11 [==============================] - 0s 726us/step - loss: 10.0110 - mean_squared_error: 10.0110
Epoch 307/900
11/11 [==============================] - ETA: 0s - loss: 7.4252 - mean_squared_error: 7.42 - 0s 896us/step - loss: 10.0052 - mean_squared_error: 10.0052
Epoch 308/900
11/11 [==============================] - 0s 962us/step - loss: 9.9953 - mean_squared_error: 9.9953
Epoch 309/900
11/11 [==============================] - 0s 907us/step - loss: 9.9789 - mean_squared_error: 9.9789
Epoch 310/900
11/11 [==============================] - 0s 726us/step - loss: 9.9765 - mean_squared_error: 9.9765
Epoch 311/900
11/11 [==============================] - 0s 997us/step - loss: 9.9582 - mean_squared_error: 9.9582
Epoch 312/900
11/11 [==============================] - ETA: 0s - loss: 8.9645 - mean_squared_error: 8.96 - 0s 970us/step - loss: 9.9436 - mean_squared_error: 9.9436
Epoch 313/900
11/11 [==============================] - 0s 1ms/step - loss: 9.9363 - mean_squared_error: 9.9363
Epoch 314/900
11/11 [==============================] - 0s 1ms/step - loss: 9.9311 - mean_squared_error: 9.9311
Epoch 315/900
11/11 [==============================] - 0s 991us/step - loss: 9.9222 - mean_squared_error: 9.9222
Epoch 316/900
11/11 [==============================] - 0s 898us/step - loss: 9.9081 - mean_squared_error: 9.9081
Epoch 317/900
11/11 [==============================] - 0s 911us/step - loss: 9.8964 - mean_squared_error: 9.8964
Epoch 318/900
11/11 [==============================] - 0s 796us/step - loss: 9.8855 - mean_squared_error: 9.8855
Epoch 319/900
11/11 [==============================] - 0s 829us/step - loss: 9.8888 - mean_squared_error: 9.8888
Epoch 320/900
11/11 [==============================] - 0s 805us/step - loss: 9.8766 - mean_squared_error: 9.8766
Epoch 321/900
11/11 [==============================] - 0s 724us/step - loss: 9.8538 - mean_squared_error: 9.8538
Epoch 322/900
11/11 [==============================] - 0s 865us/step - loss: 9.8449 - mean_squared_error: 9.8449
Epoch 323/900
11/11 [==============================] - 0s 808us/step - loss: 9.8291 - mean_squared_error: 9.8291
Epoch 324/900
11/11 [==============================] - 0s 875us/step - loss: 9.8254 - mean_squared_error: 9.8254
Epoch 325/900
11/11 [==============================] - 0s 880us/step - loss: 9.8128 - mean_squared_error: 9.8128
Epoch 326/900
11/11 [==============================] - 0s 907us/step - loss: 9.8022 - mean_squared_error: 9.8022
Epoch 327/900
11/11 [==============================] - 0s 801us/step - loss: 9.8029 - mean_squared_error: 9.8029
Epoch 328/900
11/11 [==============================] - 0s 826us/step - loss: 9.7905 - mean_squared_error: 9.7905
Epoch 329/900
11/11 [==============================] - 0s 805us/step - loss: 9.7695 - mean_squared_error: 9.7695
Epoch 330/900
11/11 [==============================] - 0s 913us/step - loss: 9.7628 - mean_squared_error: 9.7628
Epoch 331/900
11/11 [==============================] - ETA: 0s - loss: 23.0689 - mean_squared_error: 23.06 - 0s 896us/step - loss: 9.7534 - mean_squared_error: 9.7534
Epoch 332/900
11/11 [==============================] - 0s 868us/step - loss: 9.7424 - mean_squared_error: 9.7424
Epoch 333/900
11/11 [==============================] - 0s 911us/step - loss: 9.7312 - mean_squared_error: 9.7312
Epoch 334/900
11/11 [==============================] - 0s 816us/step - loss: 9.7213 - mean_squared_error: 9.7213
Epoch 335/900
11/11 [==============================] - 0s 861us/step - loss: 9.7137 - mean_squared_error: 9.7137
Epoch 336/900
11/11 [==============================] - 0s 1ms/step - loss: 9.6999 - mean_squared_error: 9.6999
Epoch 337/900
11/11 [==============================] - 0s 1ms/step - loss: 9.6913 - mean_squared_error: 9.6913
Epoch 338/900
11/11 [==============================] - 0s 995us/step - loss: 9.6898 - mean_squared_error: 9.6898
Epoch 339/900
11/11 [==============================] - 0s 997us/step - loss: 9.6756 - mean_squared_error: 9.6756
Epoch 340/900
11/11 [==============================] - 0s 877us/step - loss: 9.6582 - mean_squared_error: 9.6582
Epoch 341/900
11/11 [==============================] - 0s 997us/step - loss: 9.6471 - mean_squared_error: 9.6471
Epoch 342/900
11/11 [==============================] - 0s 816us/step - loss: 9.6425 - mean_squared_error: 9.6425
Epoch 343/900
11/11 [==============================] - 0s 960us/step - loss: 9.6263 - mean_squared_error: 9.6263
Epoch 344/900
11/11 [==============================] - 0s 819us/step - loss: 9.6170 - mean_squared_error: 9.6170
Epoch 345/900
11/11 [==============================] - 0s 895us/step - loss: 9.6047 - mean_squared_error: 9.6047
Epoch 346/900
11/11 [==============================] - 0s 815us/step - loss: 9.6021 - mean_squared_error: 9.6021
Epoch 347/900
11/11 [==============================] - 0s 702us/step - loss: 9.5896 - mean_squared_error: 9.5896
Epoch 348/900
11/11 [==============================] - 0s 717us/step - loss: 9.5805 - mean_squared_error: 9.5805
Epoch 349/900
11/11 [==============================] - 0s 686us/step - loss: 9.5710 - mean_squared_error: 9.5710
Epoch 350/900
11/11 [==============================] - 0s 816us/step - loss: 9.5612 - mean_squared_error: 9.5612
Epoch 351/900
11/11 [==============================] - 0s 906us/step - loss: 9.5460 - mean_squared_error: 9.5460
Epoch 352/900
11/11 [==============================] - 0s 907us/step - loss: 9.5374 - mean_squared_error: 9.5374
Epoch 353/900
11/11 [==============================] - 0s 787us/step - loss: 9.5338 - mean_squared_error: 9.5338
Epoch 354/900
11/11 [==============================] - 0s 779us/step - loss: 9.5183 - mean_squared_error: 9.5183
Epoch 355/900
11/11 [==============================] - 0s 887us/step - loss: 9.5102 - mean_squared_error: 9.5102
Epoch 356/900
11/11 [==============================] - 0s 715us/step - loss: 9.5015 - mean_squared_error: 9.5015
Epoch 357/900
11/11 [==============================] - 0s 890us/step - loss: 9.4837 - mean_squared_error: 9.4837
Epoch 358/900
11/11 [==============================] - 0s 859us/step - loss: 9.4744 - mean_squared_error: 9.4744
Epoch 359/900
11/11 [==============================] - 0s 816us/step - loss: 9.4687 - mean_squared_error: 9.4687
Epoch 360/900
11/11 [==============================] - 0s 815us/step - loss: 9.4629 - mean_squared_error: 9.4629
Epoch 361/900
11/11 [==============================] - 0s 871us/step - loss: 9.4439 - mean_squared_error: 9.4439
Epoch 362/900
11/11 [==============================] - 0s 960us/step - loss: 9.4382 - mean_squared_error: 9.4382
Epoch 363/900
11/11 [==============================] - 0s 906us/step - loss: 9.4307 - mean_squared_error: 9.4307
Epoch 364/900
11/11 [==============================] - 0s 967us/step - loss: 9.4226 - mean_squared_error: 9.4226
Epoch 365/900
11/11 [==============================] - 0s 871us/step - loss: 9.4049 - mean_squared_error: 9.4049
Epoch 366/900
11/11 [==============================] - 0s 802us/step - loss: 9.3934 - mean_squared_error: 9.3934
Epoch 367/900
11/11 [==============================] - 0s 810us/step - loss: 9.3808 - mean_squared_error: 9.3808
Epoch 368/900
11/11 [==============================] - 0s 907us/step - loss: 9.3769 - mean_squared_error: 9.3769
Epoch 369/900
11/11 [==============================] - 0s 820us/step - loss: 9.3617 - mean_squared_error: 9.3617
Epoch 370/900
11/11 [==============================] - 0s 775us/step - loss: 9.3521 - mean_squared_error: 9.3521
Epoch 371/900
11/11 [==============================] - 0s 849us/step - loss: 9.3435 - mean_squared_error: 9.3435
Epoch 372/900
11/11 [==============================] - 0s 919us/step - loss: 9.3341 - mean_squared_error: 9.3341
Epoch 373/900
11/11 [==============================] - 0s 1ms/step - loss: 9.3241 - mean_squared_error: 9.3241
Epoch 374/900
11/11 [==============================] - 0s 806us/step - loss: 9.3125 - mean_squared_error: 9.3125
Epoch 375/900
11/11 [==============================] - 0s 965us/step - loss: 9.3061 - mean_squared_error: 9.3061
Epoch 376/900
11/11 [==============================] - 0s 861us/step - loss: 9.2942 - mean_squared_error: 9.2942
Epoch 377/900
11/11 [==============================] - 0s 976us/step - loss: 9.2875 - mean_squared_error: 9.2875
Epoch 378/900
11/11 [==============================] - 0s 973us/step - loss: 9.2764 - mean_squared_error: 9.2764
Epoch 379/900
11/11 [==============================] - 0s 1ms/step - loss: 9.2815 - mean_squared_error: 9.2815
Epoch 380/900
11/11 [==============================] - 0s 897us/step - loss: 9.2531 - mean_squared_error: 9.2531
Epoch 381/900
11/11 [==============================] - 0s 816us/step - loss: 9.2450 - mean_squared_error: 9.2450
Epoch 382/900
11/11 [==============================] - 0s 771us/step - loss: 9.2371 - mean_squared_error: 9.2371
Epoch 383/900
11/11 [==============================] - 0s 910us/step - loss: 9.2227 - mean_squared_error: 9.2227
Epoch 384/900
11/11 [==============================] - 0s 907us/step - loss: 9.2137 - mean_squared_error: 9.2137
Epoch 385/900
11/11 [==============================] - 0s 716us/step - loss: 9.2086 - mean_squared_error: 9.2086
Epoch 386/900
11/11 [==============================] - 0s 816us/step - loss: 9.1920 - mean_squared_error: 9.1920
Epoch 387/900
11/11 [==============================] - 0s 1ms/step - loss: 9.1844 - mean_squared_error: 9.1844
Epoch 388/900
11/11 [==============================] - 0s 907us/step - loss: 9.1743 - mean_squared_error: 9.1743
Epoch 389/900
11/11 [==============================] - 0s 816us/step - loss: 9.1738 - mean_squared_error: 9.1738
Epoch 390/900
11/11 [==============================] - 0s 910us/step - loss: 9.1547 - mean_squared_error: 9.1547
Epoch 391/900
11/11 [==============================] - 0s 907us/step - loss: 9.1440 - mean_squared_error: 9.1440
Epoch 392/900
11/11 [==============================] - 0s 813us/step - loss: 9.1380 - mean_squared_error: 9.1380
Epoch 393/900
11/11 [==============================] - 0s 878us/step - loss: 9.1250 - mean_squared_error: 9.1250
Epoch 394/900
11/11 [==============================] - 0s 864us/step - loss: 9.1164 - mean_squared_error: 9.1164
Epoch 395/900
11/11 [==============================] - 0s 1ms/step - loss: 9.1061 - mean_squared_error: 9.1061
Epoch 396/900
11/11 [==============================] - 0s 1ms/step - loss: 9.0963 - mean_squared_error: 9.0963
Epoch 397/900
11/11 [==============================] - 0s 905us/step - loss: 9.0847 - mean_squared_error: 9.0847
Epoch 398/900
11/11 [==============================] - 0s 908us/step - loss: 9.0769 - mean_squared_error: 9.0769
Epoch 399/900
11/11 [==============================] - 0s 1ms/step - loss: 9.0730 - mean_squared_error: 9.0730
Epoch 400/900
11/11 [==============================] - 0s 984us/step - loss: 9.0575 - mean_squared_error: 9.0575
Epoch 401/900
11/11 [==============================] - 0s 887us/step - loss: 9.0474 - mean_squared_error: 9.0474
Epoch 402/900
11/11 [==============================] - 0s 1ms/step - loss: 9.0390 - mean_squared_error: 9.0390
Epoch 403/900
11/11 [==============================] - 0s 1ms/step - loss: 9.0277 - mean_squared_error: 9.0277
Epoch 404/900
11/11 [==============================] - 0s 816us/step - loss: 9.0169 - mean_squared_error: 9.0169
Epoch 405/900
11/11 [==============================] - 0s 865us/step - loss: 9.0059 - mean_squared_error: 9.0059
Epoch 406/900
11/11 [==============================] - 0s 817us/step - loss: 8.9958 - mean_squared_error: 8.9958
Epoch 407/900
11/11 [==============================] - 0s 963us/step - loss: 8.9885 - mean_squared_error: 8.9885
Epoch 408/900
11/11 [==============================] - 0s 874us/step - loss: 8.9898 - mean_squared_error: 8.9898
Epoch 409/900
11/11 [==============================] - 0s 956us/step - loss: 8.9751 - mean_squared_error: 8.9751
Epoch 410/900
11/11 [==============================] - 0s 836us/step - loss: 8.9702 - mean_squared_error: 8.9702
Epoch 411/900
11/11 [==============================] - 0s 810us/step - loss: 8.9542 - mean_squared_error: 8.9542
Epoch 412/900
11/11 [==============================] - 0s 906us/step - loss: 8.9427 - mean_squared_error: 8.9427
Epoch 413/900
11/11 [==============================] - ETA: 0s - loss: 17.6725 - mean_squared_error: 17.67 - 0s 908us/step - loss: 8.9353 - mean_squared_error: 8.9353
Epoch 414/900
11/11 [==============================] - 0s 997us/step - loss: 8.9206 - mean_squared_error: 8.9206
Epoch 415/900
11/11 [==============================] - 0s 952us/step - loss: 8.9133 - mean_squared_error: 8.9133
Epoch 416/900
11/11 [==============================] - 0s 990us/step - loss: 8.9045 - mean_squared_error: 8.9045
Epoch 417/900
11/11 [==============================] - 0s 867us/step - loss: 8.9191 - mean_squared_error: 8.9191
Epoch 418/900
11/11 [==============================] - 0s 907us/step - loss: 8.8830 - mean_squared_error: 8.8830
Epoch 419/900
11/11 [==============================] - 0s 968us/step - loss: 8.8771 - mean_squared_error: 8.8771
Epoch 420/900
11/11 [==============================] - 0s 956us/step - loss: 8.8612 - mean_squared_error: 8.8612
Epoch 421/900
11/11 [==============================] - ETA: 0s - loss: 11.9735 - mean_squared_error: 11.97 - 0s 997us/step - loss: 8.8564 - mean_squared_error: 8.8564
Epoch 422/900
11/11 [==============================] - 0s 1ms/step - loss: 8.8547 - mean_squared_error: 8.8547
Epoch 423/900
11/11 [==============================] - 0s 993us/step - loss: 8.8428 - mean_squared_error: 8.8428
Epoch 424/900
11/11 [==============================] - 0s 903us/step - loss: 8.8252 - mean_squared_error: 8.8252
Epoch 425/900
11/11 [==============================] - 0s 801us/step - loss: 8.8138 - mean_squared_error: 8.8138
Epoch 426/900
11/11 [==============================] - 0s 897us/step - loss: 8.8090 - mean_squared_error: 8.8090
Epoch 427/900
11/11 [==============================] - 0s 904us/step - loss: 8.8061 - mean_squared_error: 8.8061
Epoch 428/900
11/11 [==============================] - 0s 780us/step - loss: 8.7858 - mean_squared_error: 8.7858
Epoch 429/900
11/11 [==============================] - 0s 898us/step - loss: 8.7762 - mean_squared_error: 8.7762
Epoch 430/900
11/11 [==============================] - 0s 897us/step - loss: 8.7683 - mean_squared_error: 8.7683
Epoch 431/900
11/11 [==============================] - 0s 997us/step - loss: 8.7561 - mean_squared_error: 8.7561
Epoch 432/900
11/11 [==============================] - 0s 1ms/step - loss: 8.7518 - mean_squared_error: 8.7518
Epoch 433/900
11/11 [==============================] - 0s 1ms/step - loss: 8.7423 - mean_squared_error: 8.7423
Epoch 434/900
11/11 [==============================] - 0s 999us/step - loss: 8.7277 - mean_squared_error: 8.7277
Epoch 435/900
11/11 [==============================] - 0s 986us/step - loss: 8.7200 - mean_squared_error: 8.7200
Epoch 436/900
11/11 [==============================] - 0s 816us/step - loss: 8.7131 - mean_squared_error: 8.7131
Epoch 437/900
11/11 [==============================] - 0s 874us/step - loss: 8.7007 - mean_squared_error: 8.7007
Epoch 438/900
11/11 [==============================] - 0s 865us/step - loss: 8.6899 - mean_squared_error: 8.6899
Epoch 439/900
11/11 [==============================] - 0s 816us/step - loss: 8.6846 - mean_squared_error: 8.6846
Epoch 440/900
11/11 [==============================] - 0s 905us/step - loss: 8.6705 - mean_squared_error: 8.6705
Epoch 441/900
11/11 [==============================] - 0s 817us/step - loss: 8.6652 - mean_squared_error: 8.6652
Epoch 442/900
11/11 [==============================] - 0s 796us/step - loss: 8.6548 - mean_squared_error: 8.6548
Epoch 443/900
11/11 [==============================] - 0s 985us/step - loss: 8.6420 - mean_squared_error: 8.6420
Epoch 444/900
11/11 [==============================] - 0s 1ms/step - loss: 8.6385 - mean_squared_error: 8.6385
Epoch 445/900
11/11 [==============================] - 0s 906us/step - loss: 8.6256 - mean_squared_error: 8.6256
Epoch 446/900
11/11 [==============================] - 0s 900us/step - loss: 8.6155 - mean_squared_error: 8.6155
Epoch 447/900
11/11 [==============================] - 0s 972us/step - loss: 8.6083 - mean_squared_error: 8.6083
Epoch 448/900
11/11 [==============================] - 0s 956us/step - loss: 8.6002 - mean_squared_error: 8.6002
Epoch 449/900
11/11 [==============================] - 0s 907us/step - loss: 8.5880 - mean_squared_error: 8.5880
Epoch 450/900
11/11 [==============================] - 0s 998us/step - loss: 8.5828 - mean_squared_error: 8.5828
Epoch 451/900
11/11 [==============================] - 0s 1ms/step - loss: 8.5704 - mean_squared_error: 8.5704
Epoch 452/900
11/11 [==============================] - 0s 911us/step - loss: 8.5694 - mean_squared_error: 8.5694
Epoch 453/900
11/11 [==============================] - 0s 1ms/step - loss: 8.5515 - mean_squared_error: 8.5515
Epoch 454/900
11/11 [==============================] - 0s 997us/step - loss: 8.5436 - mean_squared_error: 8.5436
Epoch 455/900
11/11 [==============================] - 0s 1ms/step - loss: 8.5305 - mean_squared_error: 8.5305
Epoch 456/900
11/11 [==============================] - 0s 1ms/step - loss: 8.5224 - mean_squared_error: 8.5224
Epoch 457/900
11/11 [==============================] - 0s 1ms/step - loss: 8.5124 - mean_squared_error: 8.5124
Epoch 458/900
11/11 [==============================] - 0s 995us/step - loss: 8.5034 - mean_squared_error: 8.5034
Epoch 459/900
11/11 [==============================] - 0s 983us/step - loss: 8.4964 - mean_squared_error: 8.4964
Epoch 460/900
11/11 [==============================] - 0s 997us/step - loss: 8.4866 - mean_squared_error: 8.4866
Epoch 461/900
11/11 [==============================] - 0s 805us/step - loss: 8.4771 - mean_squared_error: 8.4771
Epoch 462/900
11/11 [==============================] - 0s 816us/step - loss: 8.4675 - mean_squared_error: 8.4675
Epoch 463/900
11/11 [==============================] - 0s 816us/step - loss: 8.4580 - mean_squared_error: 8.4580
Epoch 464/900
11/11 [==============================] - 0s 787us/step - loss: 8.4570 - mean_squared_error: 8.4570
Epoch 465/900
11/11 [==============================] - 0s 817us/step - loss: 8.4392 - mean_squared_error: 8.4392
Epoch 466/900
11/11 [==============================] - 0s 725us/step - loss: 8.4320 - mean_squared_error: 8.4320
Epoch 467/900
11/11 [==============================] - 0s 870us/step - loss: 8.4184 - mean_squared_error: 8.4184
Epoch 468/900
11/11 [==============================] - 0s 975us/step - loss: 8.4147 - mean_squared_error: 8.4147
Epoch 469/900
11/11 [==============================] - 0s 816us/step - loss: 8.4026 - mean_squared_error: 8.4026
Epoch 470/900
11/11 [==============================] - 0s 901us/step - loss: 8.3927 - mean_squared_error: 8.3927
Epoch 471/900
11/11 [==============================] - 0s 1ms/step - loss: 8.3804 - mean_squared_error: 8.3804
Epoch 472/900
11/11 [==============================] - 0s 1ms/step - loss: 8.3749 - mean_squared_error: 8.3749
Epoch 473/900
11/11 [==============================] - 0s 1ms/step - loss: 8.3634 - mean_squared_error: 8.3634
Epoch 474/900
11/11 [==============================] - 0s 989us/step - loss: 8.3566 - mean_squared_error: 8.3566
Epoch 475/900
11/11 [==============================] - 0s 804us/step - loss: 8.3458 - mean_squared_error: 8.3458
Epoch 476/900
11/11 [==============================] - 0s 816us/step - loss: 8.3426 - mean_squared_error: 8.3426
Epoch 477/900
11/11 [==============================] - 0s 872us/step - loss: 8.3341 - mean_squared_error: 8.3341
Epoch 478/900
11/11 [==============================] - 0s 775us/step - loss: 8.3174 - mean_squared_error: 8.3174
Epoch 479/900
11/11 [==============================] - 0s 957us/step - loss: 8.3229 - mean_squared_error: 8.3229
Epoch 480/900
11/11 [==============================] - 0s 957us/step - loss: 8.2933 - mean_squared_error: 8.2933
Epoch 481/900
11/11 [==============================] - 0s 1ms/step - loss: 8.2890 - mean_squared_error: 8.2890
Epoch 482/900
11/11 [==============================] - 0s 940us/step - loss: 8.2841 - mean_squared_error: 8.2841
Epoch 483/900
11/11 [==============================] - 0s 998us/step - loss: 8.2740 - mean_squared_error: 8.2740
Epoch 484/900
11/11 [==============================] - 0s 968us/step - loss: 8.2645 - mean_squared_error: 8.2645
Epoch 485/900
11/11 [==============================] - 0s 904us/step - loss: 8.2521 - mean_squared_error: 8.2521
Epoch 486/900
11/11 [==============================] - 0s 997us/step - loss: 8.2439 - mean_squared_error: 8.2439
Epoch 487/900
11/11 [==============================] - 0s 908us/step - loss: 8.2321 - mean_squared_error: 8.2321
Epoch 488/900
11/11 [==============================] - 0s 998us/step - loss: 8.2286 - mean_squared_error: 8.2286
Epoch 489/900
11/11 [==============================] - 0s 1000us/step - loss: 8.2153 - mean_squared_error: 8.2153
Epoch 490/900
11/11 [==============================] - 0s 993us/step - loss: 8.2102 - mean_squared_error: 8.2102
Epoch 491/900
11/11 [==============================] - 0s 816us/step - loss: 8.1991 - mean_squared_error: 8.1991
Epoch 492/900
11/11 [==============================] - 0s 851us/step - loss: 8.1828 - mean_squared_error: 8.1828
Epoch 493/900
11/11 [==============================] - 0s 984us/step - loss: 8.1839 - mean_squared_error: 8.1839
Epoch 494/900
11/11 [==============================] - 0s 1ms/step - loss: 8.1731 - mean_squared_error: 8.1731
Epoch 495/900
11/11 [==============================] - 0s 962us/step - loss: 8.1602 - mean_squared_error: 8.1602
Epoch 496/900
11/11 [==============================] - 0s 1ms/step - loss: 8.1527 - mean_squared_error: 8.1527
Epoch 497/900
11/11 [==============================] - 0s 901us/step - loss: 8.1377 - mean_squared_error: 8.1377
Epoch 498/900
11/11 [==============================] - 0s 872us/step - loss: 8.1435 - mean_squared_error: 8.1435
Epoch 499/900
11/11 [==============================] - 0s 977us/step - loss: 8.1339 - mean_squared_error: 8.1339
Epoch 500/900
11/11 [==============================] - 0s 791us/step - loss: 8.1177 - mean_squared_error: 8.1177
Epoch 501/900
11/11 [==============================] - 0s 718us/step - loss: 8.1132 - mean_squared_error: 8.1132
Epoch 502/900
11/11 [==============================] - 0s 898us/step - loss: 8.0994 - mean_squared_error: 8.0994
Epoch 503/900
11/11 [==============================] - 0s 987us/step - loss: 8.0940 - mean_squared_error: 8.0940
Epoch 504/900
11/11 [==============================] - 0s 784us/step - loss: 8.0811 - mean_squared_error: 8.0811
Epoch 505/900
11/11 [==============================] - 0s 907us/step - loss: 8.0771 - mean_squared_error: 8.0771
Epoch 506/900
11/11 [==============================] - 0s 816us/step - loss: 8.0636 - mean_squared_error: 8.0636
Epoch 507/900
11/11 [==============================] - 0s 1ms/step - loss: 8.0559 - mean_squared_error: 8.0559
Epoch 508/900
11/11 [==============================] - 0s 815us/step - loss: 8.0415 - mean_squared_error: 8.0415
Epoch 509/900
11/11 [==============================] - 0s 908us/step - loss: 8.0400 - mean_squared_error: 8.0400
Epoch 510/900
11/11 [==============================] - 0s 995us/step - loss: 8.0231 - mean_squared_error: 8.0231
Epoch 511/900
11/11 [==============================] - 0s 1ms/step - loss: 8.0143 - mean_squared_error: 8.0143
Epoch 512/900
11/11 [==============================] - 0s 997us/step - loss: 8.0058 - mean_squared_error: 8.0058
Epoch 513/900
11/11 [==============================] - 0s 906us/step - loss: 7.9992 - mean_squared_error: 7.9992
Epoch 514/900
11/11 [==============================] - 0s 955us/step - loss: 8.0195 - mean_squared_error: 8.0195
Epoch 515/900
11/11 [==============================] - 0s 908us/step - loss: 7.9769 - mean_squared_error: 7.9769
Epoch 516/900
11/11 [==============================] - 0s 904us/step - loss: 7.9884 - mean_squared_error: 7.9884
Epoch 517/900
11/11 [==============================] - 0s 816us/step - loss: 7.9673 - mean_squared_error: 7.9673
Epoch 518/900
11/11 [==============================] - 0s 907us/step - loss: 7.9561 - mean_squared_error: 7.9561
Epoch 519/900
11/11 [==============================] - 0s 985us/step - loss: 7.9410 - mean_squared_error: 7.9410
Epoch 520/900
11/11 [==============================] - 0s 1ms/step - loss: 7.9331 - mean_squared_error: 7.9331
Epoch 521/900
11/11 [==============================] - 0s 1ms/step - loss: 7.9215 - mean_squared_error: 7.9215
Epoch 522/900
11/11 [==============================] - 0s 831us/step - loss: 7.9125 - mean_squared_error: 7.9125
Epoch 523/900
11/11 [==============================] - 0s 906us/step - loss: 7.9043 - mean_squared_error: 7.9043
Epoch 524/900
11/11 [==============================] - 0s 872us/step - loss: 7.8941 - mean_squared_error: 7.8941
Epoch 525/900
11/11 [==============================] - 0s 816us/step - loss: 7.8890 - mean_squared_error: 7.8890
Epoch 526/900
11/11 [==============================] - 0s 890us/step - loss: 7.8759 - mean_squared_error: 7.8759
Epoch 527/900
11/11 [==============================] - 0s 896us/step - loss: 7.8735 - mean_squared_error: 7.8735
Epoch 528/900
11/11 [==============================] - 0s 998us/step - loss: 7.8582 - mean_squared_error: 7.8582
Epoch 529/900
11/11 [==============================] - 0s 867us/step - loss: 7.8494 - mean_squared_error: 7.8494
Epoch 530/900
11/11 [==============================] - 0s 761us/step - loss: 7.8471 - mean_squared_error: 7.8471
Epoch 531/900
11/11 [==============================] - 0s 816us/step - loss: 7.8280 - mean_squared_error: 7.8280
Epoch 532/900
11/11 [==============================] - 0s 965us/step - loss: 7.8255 - mean_squared_error: 7.8255
Epoch 533/900
11/11 [==============================] - 0s 875us/step - loss: 7.8173 - mean_squared_error: 7.8173
Epoch 534/900
11/11 [==============================] - ETA: 0s - loss: 5.5240 - mean_squared_error: 5.52 - 0s 864us/step - loss: 7.8031 - mean_squared_error: 7.8031
Epoch 535/900
11/11 [==============================] - 0s 868us/step - loss: 7.8022 - mean_squared_error: 7.8022
Epoch 536/900
11/11 [==============================] - 0s 816us/step - loss: 7.7866 - mean_squared_error: 7.7866
Epoch 537/900
11/11 [==============================] - 0s 816us/step - loss: 7.7779 - mean_squared_error: 7.7779
Epoch 538/900
11/11 [==============================] - 0s 802us/step - loss: 7.7724 - mean_squared_error: 7.7724
Epoch 539/900
11/11 [==============================] - 0s 1ms/step - loss: 7.7626 - mean_squared_error: 7.7626
Epoch 540/900
11/11 [==============================] - 0s 906us/step - loss: 7.7523 - mean_squared_error: 7.7523
Epoch 541/900
11/11 [==============================] - 0s 816us/step - loss: 7.7391 - mean_squared_error: 7.7391
Epoch 542/900
11/11 [==============================] - 0s 808us/step - loss: 7.7340 - mean_squared_error: 7.7340
Epoch 543/900
11/11 [==============================] - 0s 815us/step - loss: 7.7213 - mean_squared_error: 7.7213
Epoch 544/900
11/11 [==============================] - 0s 964us/step - loss: 7.7098 - mean_squared_error: 7.7098
Epoch 545/900
11/11 [==============================] - 0s 888us/step - loss: 7.7025 - mean_squared_error: 7.7025
Epoch 546/900
11/11 [==============================] - 0s 888us/step - loss: 7.6937 - mean_squared_error: 7.6937
Epoch 547/900
11/11 [==============================] - 0s 877us/step - loss: 7.6892 - mean_squared_error: 7.6892
Epoch 548/900
11/11 [==============================] - 0s 1ms/step - loss: 7.6757 - mean_squared_error: 7.6757
Epoch 549/900
11/11 [==============================] - 0s 871us/step - loss: 7.6649 - mean_squared_error: 7.6649
Epoch 550/900
11/11 [==============================] - 0s 1ms/step - loss: 7.6583 - mean_squared_error: 7.6583
Epoch 551/900
11/11 [==============================] - 0s 875us/step - loss: 7.6491 - mean_squared_error: 7.6491
Epoch 552/900
11/11 [==============================] - 0s 842us/step - loss: 7.6389 - mean_squared_error: 7.6389
Epoch 553/900
11/11 [==============================] - 0s 896us/step - loss: 7.6431 - mean_squared_error: 7.6431
Epoch 554/900
11/11 [==============================] - 0s 1ms/step - loss: 7.6276 - mean_squared_error: 7.6276
Epoch 555/900
11/11 [==============================] - 0s 997us/step - loss: 7.6193 - mean_squared_error: 7.6193
Epoch 556/900
11/11 [==============================] - 0s 979us/step - loss: 7.6110 - mean_squared_error: 7.6110
Epoch 557/900
11/11 [==============================] - 0s 896us/step - loss: 7.6018 - mean_squared_error: 7.6018
Epoch 558/900
11/11 [==============================] - 0s 997us/step - loss: 7.5950 - mean_squared_error: 7.5950
Epoch 559/900
11/11 [==============================] - 0s 1ms/step - loss: 7.5742 - mean_squared_error: 7.5742
Epoch 560/900
11/11 [==============================] - 0s 816us/step - loss: 7.5682 - mean_squared_error: 7.5682
Epoch 561/900
11/11 [==============================] - 0s 795us/step - loss: 7.5625 - mean_squared_error: 7.5625
Epoch 562/900
11/11 [==============================] - 0s 864us/step - loss: 7.5557 - mean_squared_error: 7.5557
Epoch 563/900
11/11 [==============================] - 0s 962us/step - loss: 7.5431 - mean_squared_error: 7.5431
Epoch 564/900
11/11 [==============================] - 0s 995us/step - loss: 7.5320 - mean_squared_error: 7.5320
Epoch 565/900
11/11 [==============================] - 0s 945us/step - loss: 7.5226 - mean_squared_error: 7.5226
Epoch 566/900
11/11 [==============================] - 0s 953us/step - loss: 7.5131 - mean_squared_error: 7.5131
Epoch 567/900
11/11 [==============================] - 0s 992us/step - loss: 7.5043 - mean_squared_error: 7.5043
Epoch 568/900
11/11 [==============================] - 0s 907us/step - loss: 7.4957 - mean_squared_error: 7.4957
Epoch 569/900
11/11 [==============================] - 0s 871us/step - loss: 7.4917 - mean_squared_error: 7.4917
Epoch 570/900
11/11 [==============================] - 0s 805us/step - loss: 7.4795 - mean_squared_error: 7.4795
Epoch 571/900
11/11 [==============================] - 0s 826us/step - loss: 7.4697 - mean_squared_error: 7.4697
Epoch 572/900
11/11 [==============================] - 0s 803us/step - loss: 7.4654 - mean_squared_error: 7.4654
Epoch 573/900
11/11 [==============================] - 0s 857us/step - loss: 7.4525 - mean_squared_error: 7.4525
Epoch 574/900
11/11 [==============================] - 0s 896us/step - loss: 7.4452 - mean_squared_error: 7.4452
Epoch 575/900
11/11 [==============================] - 0s 716us/step - loss: 7.4452 - mean_squared_error: 7.4452
Epoch 576/900
11/11 [==============================] - 0s 997us/step - loss: 7.4275 - mean_squared_error: 7.4275
Epoch 577/900
11/11 [==============================] - 0s 998us/step - loss: 7.4164 - mean_squared_error: 7.4164
Epoch 578/900
11/11 [==============================] - 0s 1ms/step - loss: 7.4132 - mean_squared_error: 7.4132
Epoch 579/900
11/11 [==============================] - 0s 1ms/step - loss: 7.3972 - mean_squared_error: 7.3972
Epoch 580/900
11/11 [==============================] - 0s 997us/step - loss: 7.3977 - mean_squared_error: 7.3977
Epoch 581/900
11/11 [==============================] - 0s 1ms/step - loss: 7.3821 - mean_squared_error: 7.3821
Epoch 582/900
11/11 [==============================] - 0s 998us/step - loss: 7.3804 - mean_squared_error: 7.3804
Epoch 583/900
11/11 [==============================] - 0s 907us/step - loss: 7.3603 - mean_squared_error: 7.3603
Epoch 584/900
11/11 [==============================] - 0s 892us/step - loss: 7.3553 - mean_squared_error: 7.3553
Epoch 585/900
11/11 [==============================] - 0s 999us/step - loss: 7.3477 - mean_squared_error: 7.3477
Epoch 586/900
11/11 [==============================] - 0s 964us/step - loss: 7.3456 - mean_squared_error: 7.3456
Epoch 587/900
11/11 [==============================] - 0s 871us/step - loss: 7.3274 - mean_squared_error: 7.3274
Epoch 588/900
11/11 [==============================] - 0s 905us/step - loss: 7.3160 - mean_squared_error: 7.3160
Epoch 589/900
11/11 [==============================] - 0s 814us/step - loss: 7.3144 - mean_squared_error: 7.3144
Epoch 590/900
11/11 [==============================] - 0s 925us/step - loss: 7.3043 - mean_squared_error: 7.3043
Epoch 591/900
11/11 [==============================] - 0s 966us/step - loss: 7.2960 - mean_squared_error: 7.2960
Epoch 592/900
11/11 [==============================] - 0s 908us/step - loss: 7.2849 - mean_squared_error: 7.2849
Epoch 593/900
11/11 [==============================] - 0s 987us/step - loss: 7.2737 - mean_squared_error: 7.2737
Epoch 594/900
11/11 [==============================] - 0s 997us/step - loss: 7.2723 - mean_squared_error: 7.2723
Epoch 595/900
11/11 [==============================] - 0s 952us/step - loss: 7.2537 - mean_squared_error: 7.2537
Epoch 596/900
11/11 [==============================] - 0s 882us/step - loss: 7.2496 - mean_squared_error: 7.2496
Epoch 597/900
11/11 [==============================] - 0s 725us/step - loss: 7.2376 - mean_squared_error: 7.2376
Epoch 598/900
11/11 [==============================] - 0s 816us/step - loss: 7.2320 - mean_squared_error: 7.2320
Epoch 599/900
11/11 [==============================] - 0s 903us/step - loss: 7.2185 - mean_squared_error: 7.2185
Epoch 600/900
11/11 [==============================] - 0s 816us/step - loss: 7.2111 - mean_squared_error: 7.2111
Epoch 601/900
11/11 [==============================] - 0s 792us/step - loss: 7.2027 - mean_squared_error: 7.2027
Epoch 602/900
11/11 [==============================] - 0s 769us/step - loss: 7.1940 - mean_squared_error: 7.1940
Epoch 603/900
11/11 [==============================] - 0s 725us/step - loss: 7.1882 - mean_squared_error: 7.1882
Epoch 604/900
11/11 [==============================] - 0s 806us/step - loss: 7.1788 - mean_squared_error: 7.1788
Epoch 605/900
11/11 [==============================] - 0s 986us/step - loss: 7.1687 - mean_squared_error: 7.1687
Epoch 606/900
11/11 [==============================] - 0s 804us/step - loss: 7.1560 - mean_squared_error: 7.1560
Epoch 607/900
11/11 [==============================] - 0s 715us/step - loss: 7.1552 - mean_squared_error: 7.1552
Epoch 608/900
11/11 [==============================] - 0s 812us/step - loss: 7.1470 - mean_squared_error: 7.1470
Epoch 609/900
11/11 [==============================] - 0s 997us/step - loss: 7.1351 - mean_squared_error: 7.1351
Epoch 610/900
11/11 [==============================] - 0s 919us/step - loss: 7.1241 - mean_squared_error: 7.1241
Epoch 611/900
11/11 [==============================] - 0s 781us/step - loss: 7.1140 - mean_squared_error: 7.1140
Epoch 612/900
11/11 [==============================] - 0s 883us/step - loss: 7.1042 - mean_squared_error: 7.1042
Epoch 613/900
11/11 [==============================] - 0s 892us/step - loss: 7.1028 - mean_squared_error: 7.1028
Epoch 614/900
11/11 [==============================] - 0s 816us/step - loss: 7.0884 - mean_squared_error: 7.0884
Epoch 615/900
11/11 [==============================] - 0s 900us/step - loss: 7.0824 - mean_squared_error: 7.0824
Epoch 616/900
11/11 [==============================] - 0s 998us/step - loss: 7.0750 - mean_squared_error: 7.0750
Epoch 617/900
11/11 [==============================] - 0s 810us/step - loss: 7.0689 - mean_squared_error: 7.0689
Epoch 618/900
11/11 [==============================] - 0s 903us/step - loss: 7.0563 - mean_squared_error: 7.0563
Epoch 619/900
11/11 [==============================] - 0s 997us/step - loss: 7.0448 - mean_squared_error: 7.0448
Epoch 620/900
11/11 [==============================] - 0s 993us/step - loss: 7.0372 - mean_squared_error: 7.0372
Epoch 621/900
11/11 [==============================] - 0s 998us/step - loss: 7.0233 - mean_squared_error: 7.0233
Epoch 622/900
11/11 [==============================] - 0s 963us/step - loss: 7.0451 - mean_squared_error: 7.0451
Epoch 623/900
11/11 [==============================] - 0s 960us/step - loss: 7.0186 - mean_squared_error: 7.0186
Epoch 624/900
11/11 [==============================] - 0s 872us/step - loss: 7.0034 - mean_squared_error: 7.0034
Epoch 625/900
11/11 [==============================] - 0s 952us/step - loss: 6.9958 - mean_squared_error: 6.9958
Epoch 626/900
11/11 [==============================] - 0s 862us/step - loss: 6.9847 - mean_squared_error: 6.9847
Epoch 627/900
11/11 [==============================] - 0s 908us/step - loss: 6.9763 - mean_squared_error: 6.9763
Epoch 628/900
11/11 [==============================] - 0s 816us/step - loss: 6.9774 - mean_squared_error: 6.9774
Epoch 629/900
11/11 [==============================] - 0s 871us/step - loss: 6.9599 - mean_squared_error: 6.9599
Epoch 630/900
11/11 [==============================] - 0s 974us/step - loss: 6.9512 - mean_squared_error: 6.9512
Epoch 631/900
11/11 [==============================] - 0s 997us/step - loss: 6.9511 - mean_squared_error: 6.9511
Epoch 632/900
11/11 [==============================] - 0s 1ms/step - loss: 6.9345 - mean_squared_error: 6.9345
Epoch 633/900
11/11 [==============================] - 0s 915us/step - loss: 6.9441 - mean_squared_error: 6.9441
Epoch 634/900
11/11 [==============================] - 0s 815us/step - loss: 6.9237 - mean_squared_error: 6.9237
Epoch 635/900
11/11 [==============================] - 0s 969us/step - loss: 6.9061 - mean_squared_error: 6.9061
Epoch 636/900
11/11 [==============================] - 0s 1ms/step - loss: 6.9033 - mean_squared_error: 6.9033
Epoch 637/900
11/11 [==============================] - 0s 966us/step - loss: 6.8875 - mean_squared_error: 6.8875
Epoch 638/900
11/11 [==============================] - 0s 997us/step - loss: 6.8801 - mean_squared_error: 6.8801
Epoch 639/900
11/11 [==============================] - 0s 897us/step - loss: 6.8729 - mean_squared_error: 6.8729
Epoch 640/900
11/11 [==============================] - 0s 957us/step - loss: 6.8636 - mean_squared_error: 6.8636
Epoch 641/900
11/11 [==============================] - 0s 1ms/step - loss: 6.8523 - mean_squared_error: 6.8523
Epoch 642/900
11/11 [==============================] - 0s 1ms/step - loss: 6.8478 - mean_squared_error: 6.8478
Epoch 643/900
11/11 [==============================] - 0s 907us/step - loss: 6.8415 - mean_squared_error: 6.8415
Epoch 644/900
11/11 [==============================] - 0s 907us/step - loss: 6.8305 - mean_squared_error: 6.8305
Epoch 645/900
11/11 [==============================] - 0s 911us/step - loss: 6.8284 - mean_squared_error: 6.8284
Epoch 646/900
11/11 [==============================] - 0s 899us/step - loss: 6.8176 - mean_squared_error: 6.8176
Epoch 647/900
11/11 [==============================] - 0s 1ms/step - loss: 6.8077 - mean_squared_error: 6.8077
Epoch 648/900
11/11 [==============================] - 0s 986us/step - loss: 6.7966 - mean_squared_error: 6.7966
Epoch 649/900
11/11 [==============================] - 0s 912us/step - loss: 6.7913 - mean_squared_error: 6.7913
Epoch 650/900
11/11 [==============================] - 0s 1ms/step - loss: 6.7826 - mean_squared_error: 6.7826
Epoch 651/900
11/11 [==============================] - 0s 987us/step - loss: 6.7727 - mean_squared_error: 6.7727
Epoch 652/900
11/11 [==============================] - 0s 993us/step - loss: 6.7705 - mean_squared_error: 6.7705
Epoch 653/900
11/11 [==============================] - 0s 907us/step - loss: 6.7801 - mean_squared_error: 6.7801
Epoch 654/900
11/11 [==============================] - 0s 1ms/step - loss: 6.7482 - mean_squared_error: 6.7482
Epoch 655/900
11/11 [==============================] - 0s 908us/step - loss: 6.7411 - mean_squared_error: 6.7411
Epoch 656/900
11/11 [==============================] - 0s 896us/step - loss: 6.7325 - mean_squared_error: 6.7325
Epoch 657/900
11/11 [==============================] - 0s 818us/step - loss: 6.7228 - mean_squared_error: 6.7228
Epoch 658/900
11/11 [==============================] - 0s 860us/step - loss: 6.7203 - mean_squared_error: 6.7203
Epoch 659/900
11/11 [==============================] - 0s 773us/step - loss: 6.7058 - mean_squared_error: 6.7058
Epoch 660/900
11/11 [==============================] - 0s 870us/step - loss: 6.6974 - mean_squared_error: 6.6974
Epoch 661/900
11/11 [==============================] - 0s 866us/step - loss: 6.6883 - mean_squared_error: 6.6883
Epoch 662/900
11/11 [==============================] - 0s 974us/step - loss: 6.6822 - mean_squared_error: 6.6822
Epoch 663/900
11/11 [==============================] - 0s 889us/step - loss: 6.6850 - mean_squared_error: 6.6850
Epoch 664/900
11/11 [==============================] - 0s 792us/step - loss: 6.6610 - mean_squared_error: 6.6610
Epoch 665/900
11/11 [==============================] - 0s 947us/step - loss: 6.6572 - mean_squared_error: 6.6572
Epoch 666/900
11/11 [==============================] - 0s 907us/step - loss: 6.6498 - mean_squared_error: 6.6498
Epoch 667/900
11/11 [==============================] - 0s 907us/step - loss: 6.6370 - mean_squared_error: 6.6370
Epoch 668/900
11/11 [==============================] - 0s 815us/step - loss: 6.6410 - mean_squared_error: 6.6410
Epoch 669/900
11/11 [==============================] - 0s 807us/step - loss: 6.6211 - mean_squared_error: 6.6211
Epoch 670/900
11/11 [==============================] - 0s 801us/step - loss: 6.6091 - mean_squared_error: 6.6091
Epoch 671/900
11/11 [==============================] - 0s 906us/step - loss: 6.6103 - mean_squared_error: 6.6103
Epoch 672/900
11/11 [==============================] - 0s 796us/step - loss: 6.5934 - mean_squared_error: 6.5934
Epoch 673/900
11/11 [==============================] - 0s 907us/step - loss: 6.5916 - mean_squared_error: 6.5916
Epoch 674/900
11/11 [==============================] - 0s 772us/step - loss: 6.5834 - mean_squared_error: 6.5834
Epoch 675/900
11/11 [==============================] - 0s 874us/step - loss: 6.5768 - mean_squared_error: 6.5768
Epoch 676/900
11/11 [==============================] - 0s 980us/step - loss: 6.5596 - mean_squared_error: 6.5596
Epoch 677/900
11/11 [==============================] - 0s 890us/step - loss: 6.5524 - mean_squared_error: 6.5524
Epoch 678/900
11/11 [==============================] - 0s 822us/step - loss: 6.5421 - mean_squared_error: 6.5421
Epoch 679/900
11/11 [==============================] - 0s 814us/step - loss: 6.5364 - mean_squared_error: 6.5364
Epoch 680/900
11/11 [==============================] - 0s 704us/step - loss: 6.5269 - mean_squared_error: 6.5269
Epoch 681/900
11/11 [==============================] - 0s 817us/step - loss: 6.5179 - mean_squared_error: 6.5179
Epoch 682/900
11/11 [==============================] - 0s 804us/step - loss: 6.5138 - mean_squared_error: 6.5138
Epoch 683/900
11/11 [==============================] - 0s 907us/step - loss: 6.5015 - mean_squared_error: 6.5015
Epoch 684/900
11/11 [==============================] - 0s 816us/step - loss: 6.4929 - mean_squared_error: 6.4929
Epoch 685/900
11/11 [==============================] - 0s 954us/step - loss: 6.4833 - mean_squared_error: 6.4833
Epoch 686/900
11/11 [==============================] - 0s 906us/step - loss: 6.4784 - mean_squared_error: 6.4784
Epoch 687/900
11/11 [==============================] - 0s 865us/step - loss: 6.4703 - mean_squared_error: 6.4703
Epoch 688/900
11/11 [==============================] - 0s 867us/step - loss: 6.4574 - mean_squared_error: 6.4574
Epoch 689/900
11/11 [==============================] - 0s 816us/step - loss: 6.4469 - mean_squared_error: 6.4469
Epoch 690/900
11/11 [==============================] - 0s 802us/step - loss: 6.4399 - mean_squared_error: 6.4399
Epoch 691/900
11/11 [==============================] - 0s 868us/step - loss: 6.4338 - mean_squared_error: 6.4338
Epoch 692/900
11/11 [==============================] - 0s 905us/step - loss: 6.4256 - mean_squared_error: 6.4256
Epoch 693/900
11/11 [==============================] - 0s 918us/step - loss: 6.4145 - mean_squared_error: 6.4145
Epoch 694/900
11/11 [==============================] - 0s 905us/step - loss: 6.3996 - mean_squared_error: 6.3996
Epoch 695/900
11/11 [==============================] - 0s 817us/step - loss: 6.3995 - mean_squared_error: 6.3995
Epoch 696/900
11/11 [==============================] - 0s 897us/step - loss: 6.3886 - mean_squared_error: 6.3886
Epoch 697/900
11/11 [==============================] - 0s 933us/step - loss: 6.3819 - mean_squared_error: 6.3819
Epoch 698/900
11/11 [==============================] - 0s 1ms/step - loss: 6.3717 - mean_squared_error: 6.3717
Epoch 699/900
11/11 [==============================] - 0s 907us/step - loss: 6.3612 - mean_squared_error: 6.3612
Epoch 700/900
11/11 [==============================] - 0s 964us/step - loss: 6.3526 - mean_squared_error: 6.3526
Epoch 701/900
11/11 [==============================] - 0s 932us/step - loss: 6.3441 - mean_squared_error: 6.3441
Epoch 702/900
11/11 [==============================] - 0s 816us/step - loss: 6.3366 - mean_squared_error: 6.3366
Epoch 703/900
11/11 [==============================] - 0s 907us/step - loss: 6.3262 - mean_squared_error: 6.3262
Epoch 704/900
11/11 [==============================] - 0s 903us/step - loss: 6.3184 - mean_squared_error: 6.3184
Epoch 705/900
11/11 [==============================] - 0s 816us/step - loss: 6.3112 - mean_squared_error: 6.3112
Epoch 706/900
11/11 [==============================] - 0s 795us/step - loss: 6.3155 - mean_squared_error: 6.3155
Epoch 707/900
11/11 [==============================] - 0s 802us/step - loss: 6.2963 - mean_squared_error: 6.2963
Epoch 708/900
11/11 [==============================] - 0s 879us/step - loss: 6.2904 - mean_squared_error: 6.2904
Epoch 709/900
11/11 [==============================] - 0s 903us/step - loss: 6.2779 - mean_squared_error: 6.2779
Epoch 710/900
11/11 [==============================] - 0s 961us/step - loss: 6.2716 - mean_squared_error: 6.2716
Epoch 711/900
11/11 [==============================] - 0s 905us/step - loss: 6.2646 - mean_squared_error: 6.2646
Epoch 712/900
11/11 [==============================] - 0s 891us/step - loss: 6.2524 - mean_squared_error: 6.2524
Epoch 713/900
11/11 [==============================] - 0s 881us/step - loss: 6.2439 - mean_squared_error: 6.2439
Epoch 714/900
11/11 [==============================] - 0s 920us/step - loss: 6.2362 - mean_squared_error: 6.2362
Epoch 715/900
11/11 [==============================] - 0s 809us/step - loss: 6.2253 - mean_squared_error: 6.2253
Epoch 716/900
11/11 [==============================] - 0s 985us/step - loss: 6.2196 - mean_squared_error: 6.2196
Epoch 717/900
11/11 [==============================] - 0s 906us/step - loss: 6.2109 - mean_squared_error: 6.2109
Epoch 718/900
11/11 [==============================] - 0s 816us/step - loss: 6.2035 - mean_squared_error: 6.2035
Epoch 719/900
11/11 [==============================] - 0s 904us/step - loss: 6.1849 - mean_squared_error: 6.1849
Epoch 720/900
11/11 [==============================] - 0s 907us/step - loss: 6.1862 - mean_squared_error: 6.1862
Epoch 721/900
11/11 [==============================] - 0s 960us/step - loss: 6.1763 - mean_squared_error: 6.1763
Epoch 722/900
11/11 [==============================] - 0s 888us/step - loss: 6.1692 - mean_squared_error: 6.1692
Epoch 723/900
11/11 [==============================] - 0s 1ms/step - loss: 6.1572 - mean_squared_error: 6.1572
Epoch 724/900
11/11 [==============================] - 0s 813us/step - loss: 6.1508 - mean_squared_error: 6.1508
Epoch 725/900
11/11 [==============================] - 0s 807us/step - loss: 6.1406 - mean_squared_error: 6.1406
Epoch 726/900
11/11 [==============================] - 0s 997us/step - loss: 6.1349 - mean_squared_error: 6.1349
Epoch 727/900
11/11 [==============================] - 0s 1ms/step - loss: 6.1264 - mean_squared_error: 6.1264
Epoch 728/900
11/11 [==============================] - 0s 903us/step - loss: 6.1137 - mean_squared_error: 6.1137
Epoch 729/900
11/11 [==============================] - 0s 884us/step - loss: 6.1035 - mean_squared_error: 6.1035
Epoch 730/900
11/11 [==============================] - 0s 886us/step - loss: 6.1054 - mean_squared_error: 6.1054
Epoch 731/900
11/11 [==============================] - 0s 907us/step - loss: 6.0945 - mean_squared_error: 6.0945
Epoch 732/900
11/11 [==============================] - 0s 815us/step - loss: 6.0798 - mean_squared_error: 6.0798
Epoch 733/900
11/11 [==============================] - 0s 907us/step - loss: 6.0795 - mean_squared_error: 6.0795
Epoch 734/900
11/11 [==============================] - 0s 976us/step - loss: 6.0605 - mean_squared_error: 6.0605
Epoch 735/900
11/11 [==============================] - 0s 1ms/step - loss: 6.0529 - mean_squared_error: 6.0529
Epoch 736/900
11/11 [==============================] - 0s 1ms/step - loss: 6.0488 - mean_squared_error: 6.0488
Epoch 737/900
11/11 [==============================] - 0s 811us/step - loss: 6.0382 - mean_squared_error: 6.0382
Epoch 738/900
11/11 [==============================] - 0s 946us/step - loss: 6.0286 - mean_squared_error: 6.0286
Epoch 739/900
11/11 [==============================] - 0s 864us/step - loss: 6.0214 - mean_squared_error: 6.0214
Epoch 740/900
11/11 [==============================] - 0s 866us/step - loss: 6.0126 - mean_squared_error: 6.0126
Epoch 741/900
11/11 [==============================] - 0s 816us/step - loss: 6.0033 - mean_squared_error: 6.0033
Epoch 742/900
11/11 [==============================] - 0s 879us/step - loss: 5.9927 - mean_squared_error: 5.9927
Epoch 743/900
11/11 [==============================] - 0s 813us/step - loss: 5.9885 - mean_squared_error: 5.9885
Epoch 744/900
11/11 [==============================] - 0s 970us/step - loss: 5.9834 - mean_squared_error: 5.9834
Epoch 745/900
11/11 [==============================] - 0s 907us/step - loss: 5.9696 - mean_squared_error: 5.9696
Epoch 746/900
11/11 [==============================] - 0s 997us/step - loss: 5.9626 - mean_squared_error: 5.9626
Epoch 747/900
11/11 [==============================] - 0s 787us/step - loss: 5.9513 - mean_squared_error: 5.9513
Epoch 748/900
11/11 [==============================] - 0s 706us/step - loss: 5.9500 - mean_squared_error: 5.9500
Epoch 749/900
11/11 [==============================] - 0s 1ms/step - loss: 5.9313 - mean_squared_error: 5.9313
Epoch 750/900
11/11 [==============================] - 0s 902us/step - loss: 5.9390 - mean_squared_error: 5.9390
Epoch 751/900
11/11 [==============================] - 0s 899us/step - loss: 5.9335 - mean_squared_error: 5.9335
Epoch 752/900
11/11 [==============================] - 0s 895us/step - loss: 5.9136 - mean_squared_error: 5.9136
Epoch 753/900
11/11 [==============================] - 0s 802us/step - loss: 5.9021 - mean_squared_error: 5.9021
Epoch 754/900
11/11 [==============================] - 0s 896us/step - loss: 5.8924 - mean_squared_error: 5.8924
Epoch 755/900
11/11 [==============================] - 0s 907us/step - loss: 5.8922 - mean_squared_error: 5.8922
Epoch 756/900
11/11 [==============================] - 0s 808us/step - loss: 5.8762 - mean_squared_error: 5.8762
Epoch 757/900
11/11 [==============================] - 0s 816us/step - loss: 5.8700 - mean_squared_error: 5.8700
Epoch 758/900
11/11 [==============================] - 0s 922us/step - loss: 5.8651 - mean_squared_error: 5.8651
Epoch 759/900
11/11 [==============================] - 0s 986us/step - loss: 5.8541 - mean_squared_error: 5.8541
Epoch 760/900
11/11 [==============================] - 0s 787us/step - loss: 5.8480 - mean_squared_error: 5.8480
Epoch 761/900
11/11 [==============================] - 0s 910us/step - loss: 5.8378 - mean_squared_error: 5.8378
Epoch 762/900
11/11 [==============================] - 0s 816us/step - loss: 5.8308 - mean_squared_error: 5.8308
Epoch 763/900
11/11 [==============================] - 0s 908us/step - loss: 5.8238 - mean_squared_error: 5.8238
Epoch 764/900
11/11 [==============================] - 0s 816us/step - loss: 5.8159 - mean_squared_error: 5.8159
Epoch 765/900
11/11 [==============================] - 0s 990us/step - loss: 5.8100 - mean_squared_error: 5.8100
Epoch 766/900
11/11 [==============================] - 0s 827us/step - loss: 5.8107 - mean_squared_error: 5.8107
Epoch 767/900
11/11 [==============================] - 0s 906us/step - loss: 5.7898 - mean_squared_error: 5.7898
Epoch 768/900
11/11 [==============================] - 0s 969us/step - loss: 5.7821 - mean_squared_error: 5.7821
Epoch 769/900
11/11 [==============================] - 0s 906us/step - loss: 5.7709 - mean_squared_error: 5.7709
Epoch 770/900
11/11 [==============================] - 0s 773us/step - loss: 5.7649 - mean_squared_error: 5.7649
Epoch 771/900
11/11 [==============================] - 0s 870us/step - loss: 5.7543 - mean_squared_error: 5.7543
Epoch 772/900
11/11 [==============================] - 0s 906us/step - loss: 5.7496 - mean_squared_error: 5.7496
Epoch 773/900
11/11 [==============================] - 0s 993us/step - loss: 5.7428 - mean_squared_error: 5.7428
Epoch 774/900
11/11 [==============================] - 0s 907us/step - loss: 5.7307 - mean_squared_error: 5.7307
Epoch 775/900
11/11 [==============================] - 0s 816us/step - loss: 5.7265 - mean_squared_error: 5.7265
Epoch 776/900
11/11 [==============================] - 0s 832us/step - loss: 5.7112 - mean_squared_error: 5.7112
Epoch 777/900
11/11 [==============================] - 0s 910us/step - loss: 5.7053 - mean_squared_error: 5.7053
Epoch 778/900
11/11 [==============================] - 0s 862us/step - loss: 5.6949 - mean_squared_error: 5.6949
Epoch 779/900
11/11 [==============================] - 0s 967us/step - loss: 5.6848 - mean_squared_error: 5.6848
Epoch 780/900
11/11 [==============================] - 0s 907us/step - loss: 5.6905 - mean_squared_error: 5.6905
Epoch 781/900
11/11 [==============================] - 0s 802us/step - loss: 5.6828 - mean_squared_error: 5.6828
Epoch 782/900
11/11 [==============================] - 0s 997us/step - loss: 5.6709 - mean_squared_error: 5.6709
Epoch 783/900
11/11 [==============================] - 0s 957us/step - loss: 5.6580 - mean_squared_error: 5.6580
Epoch 784/900
11/11 [==============================] - 0s 800us/step - loss: 5.6561 - mean_squared_error: 5.6561
Epoch 785/900
11/11 [==============================] - 0s 873us/step - loss: 5.6411 - mean_squared_error: 5.6411
Epoch 786/900
11/11 [==============================] - 0s 880us/step - loss: 5.6363 - mean_squared_error: 5.6363
Epoch 787/900
11/11 [==============================] - 0s 907us/step - loss: 5.6256 - mean_squared_error: 5.6256
Epoch 788/900
11/11 [==============================] - 0s 997us/step - loss: 5.6173 - mean_squared_error: 5.6173
Epoch 789/900
11/11 [==============================] - 0s 985us/step - loss: 5.6143 - mean_squared_error: 5.6143
Epoch 790/900
11/11 [==============================] - 0s 990us/step - loss: 5.5988 - mean_squared_error: 5.5988
Epoch 791/900
11/11 [==============================] - 0s 948us/step - loss: 5.5905 - mean_squared_error: 5.5905
Epoch 792/900
11/11 [==============================] - 0s 996us/step - loss: 5.5852 - mean_squared_error: 5.5852
Epoch 793/900
11/11 [==============================] - 0s 1ms/step - loss: 5.5775 - mean_squared_error: 5.5775
Epoch 794/900
11/11 [==============================] - 0s 960us/step - loss: 5.5740 - mean_squared_error: 5.5740
Epoch 795/900
11/11 [==============================] - 0s 816us/step - loss: 5.5594 - mean_squared_error: 5.5594
Epoch 796/900
11/11 [==============================] - 0s 996us/step - loss: 5.5617 - mean_squared_error: 5.5617
Epoch 797/900
11/11 [==============================] - 0s 905us/step - loss: 5.5487 - mean_squared_error: 5.5487
Epoch 798/900
11/11 [==============================] - 0s 816us/step - loss: 5.5365 - mean_squared_error: 5.5365
Epoch 799/900
11/11 [==============================] - 0s 878us/step - loss: 5.5336 - mean_squared_error: 5.5336
Epoch 800/900
11/11 [==============================] - 0s 888us/step - loss: 5.5205 - mean_squared_error: 5.5205
Epoch 801/900
11/11 [==============================] - 0s 897us/step - loss: 5.5130 - mean_squared_error: 5.5130
Epoch 802/900
11/11 [==============================] - 0s 806us/step - loss: 5.5057 - mean_squared_error: 5.5057
Epoch 803/900
11/11 [==============================] - 0s 907us/step - loss: 5.4969 - mean_squared_error: 5.4969
Epoch 804/900
11/11 [==============================] - 0s 808us/step - loss: 5.4930 - mean_squared_error: 5.4930
Epoch 805/900
11/11 [==============================] - 0s 907us/step - loss: 5.4929 - mean_squared_error: 5.4929
Epoch 806/900
11/11 [==============================] - 0s 967us/step - loss: 5.4805 - mean_squared_error: 5.4805
Epoch 807/900
11/11 [==============================] - 0s 880us/step - loss: 5.4671 - mean_squared_error: 5.4671
Epoch 808/900
11/11 [==============================] - 0s 870us/step - loss: 5.4575 - mean_squared_error: 5.4575
Epoch 809/900
11/11 [==============================] - 0s 982us/step - loss: 5.4518 - mean_squared_error: 5.4518
Epoch 810/900
11/11 [==============================] - 0s 872us/step - loss: 5.4493 - mean_squared_error: 5.4493
Epoch 811/900
11/11 [==============================] - 0s 872us/step - loss: 5.4373 - mean_squared_error: 5.4373
Epoch 812/900
11/11 [==============================] - 0s 874us/step - loss: 5.4295 - mean_squared_error: 5.4295
Epoch 813/900
11/11 [==============================] - 0s 1ms/step - loss: 5.4175 - mean_squared_error: 5.4175
Epoch 814/900
11/11 [==============================] - 0s 882us/step - loss: 5.4118 - mean_squared_error: 5.4118
Epoch 815/900
11/11 [==============================] - 0s 906us/step - loss: 5.4109 - mean_squared_error: 5.4109
Epoch 816/900
11/11 [==============================] - 0s 910us/step - loss: 5.3999 - mean_squared_error: 5.3999
Epoch 817/900
11/11 [==============================] - 0s 907us/step - loss: 5.3858 - mean_squared_error: 5.3858
Epoch 818/900
11/11 [==============================] - 0s 816us/step - loss: 5.4087 - mean_squared_error: 5.4087
Epoch 819/900
11/11 [==============================] - 0s 879us/step - loss: 5.3705 - mean_squared_error: 5.3705
Epoch 820/900
11/11 [==============================] - 0s 946us/step - loss: 5.3742 - mean_squared_error: 5.3742
Epoch 821/900
11/11 [==============================] - 0s 997us/step - loss: 5.3535 - mean_squared_error: 5.3535
Epoch 822/900
11/11 [==============================] - 0s 816us/step - loss: 5.3490 - mean_squared_error: 5.3490
Epoch 823/900
11/11 [==============================] - 0s 816us/step - loss: 5.3426 - mean_squared_error: 5.3426
Epoch 824/900
11/11 [==============================] - 0s 907us/step - loss: 5.3339 - mean_squared_error: 5.3339
Epoch 825/900
11/11 [==============================] - 0s 960us/step - loss: 5.3268 - mean_squared_error: 5.3268
Epoch 826/900
11/11 [==============================] - 0s 828us/step - loss: 5.3205 - mean_squared_error: 5.3205
Epoch 827/900
11/11 [==============================] - 0s 816us/step - loss: 5.3098 - mean_squared_error: 5.3098
Epoch 828/900
11/11 [==============================] - 0s 803us/step - loss: 5.3038 - mean_squared_error: 5.3038
Epoch 829/900
11/11 [==============================] - 0s 760us/step - loss: 5.2936 - mean_squared_error: 5.2936
Epoch 830/900
11/11 [==============================] - 0s 716us/step - loss: 5.2878 - mean_squared_error: 5.2878
Epoch 831/900
11/11 [==============================] - 0s 992us/step - loss: 5.2901 - mean_squared_error: 5.2901
Epoch 832/900
11/11 [==============================] - 0s 907us/step - loss: 5.2736 - mean_squared_error: 5.2736
Epoch 833/900
11/11 [==============================] - 0s 1ms/step - loss: 5.2657 - mean_squared_error: 5.2657
Epoch 834/900
11/11 [==============================] - 0s 920us/step - loss: 5.2592 - mean_squared_error: 5.2592
Epoch 835/900
11/11 [==============================] - 0s 998us/step - loss: 5.2491 - mean_squared_error: 5.2491
Epoch 836/900
11/11 [==============================] - 0s 900us/step - loss: 5.2458 - mean_squared_error: 5.2458
Epoch 837/900
11/11 [==============================] - 0s 896us/step - loss: 5.2301 - mean_squared_error: 5.2301
Epoch 838/900
11/11 [==============================] - 0s 775us/step - loss: 5.2267 - mean_squared_error: 5.2267
Epoch 839/900
11/11 [==============================] - 0s 816us/step - loss: 5.2215 - mean_squared_error: 5.2215
Epoch 840/900
11/11 [==============================] - 0s 816us/step - loss: 5.2135 - mean_squared_error: 5.2135
Epoch 841/900
11/11 [==============================] - 0s 999us/step - loss: 5.2032 - mean_squared_error: 5.2032
Epoch 842/900
11/11 [==============================] - 0s 907us/step - loss: 5.1996 - mean_squared_error: 5.1996
Epoch 843/900
11/11 [==============================] - 0s 816us/step - loss: 5.1914 - mean_squared_error: 5.1914
Epoch 844/900
11/11 [==============================] - 0s 816us/step - loss: 5.1883 - mean_squared_error: 5.1883
Epoch 845/900
11/11 [==============================] - 0s 1ms/step - loss: 5.1800 - mean_squared_error: 5.1801
Epoch 846/900
11/11 [==============================] - 0s 907us/step - loss: 5.1813 - mean_squared_error: 5.1813
Epoch 847/900
11/11 [==============================] - 0s 1ms/step - loss: 5.1757 - mean_squared_error: 5.1757
Epoch 848/900
11/11 [==============================] - 0s 904us/step - loss: 5.1581 - mean_squared_error: 5.1581
Epoch 849/900
11/11 [==============================] - 0s 953us/step - loss: 5.1476 - mean_squared_error: 5.1476
Epoch 850/900
11/11 [==============================] - 0s 911us/step - loss: 5.1396 - mean_squared_error: 5.1396
Epoch 851/900
11/11 [==============================] - 0s 907us/step - loss: 5.1340 - mean_squared_error: 5.1340
Epoch 852/900
11/11 [==============================] - 0s 795us/step - loss: 5.1248 - mean_squared_error: 5.1248
Epoch 853/900
11/11 [==============================] - 0s 988us/step - loss: 5.1142 - mean_squared_error: 5.1142
Epoch 854/900
11/11 [==============================] - 0s 947us/step - loss: 5.1193 - mean_squared_error: 5.1193
Epoch 855/900
11/11 [==============================] - 0s 878us/step - loss: 5.1020 - mean_squared_error: 5.1020
Epoch 856/900
11/11 [==============================] - 0s 866us/step - loss: 5.0922 - mean_squared_error: 5.0922
Epoch 857/900
11/11 [==============================] - 0s 819us/step - loss: 5.0881 - mean_squared_error: 5.0881
Epoch 858/900
11/11 [==============================] - 0s 852us/step - loss: 5.0751 - mean_squared_error: 5.0751
Epoch 859/900
11/11 [==============================] - 0s 725us/step - loss: 5.0710 - mean_squared_error: 5.0710
Epoch 860/900
11/11 [==============================] - 0s 800us/step - loss: 5.0639 - mean_squared_error: 5.0639
Epoch 861/900
11/11 [==============================] - 0s 816us/step - loss: 5.0524 - mean_squared_error: 5.0524
Epoch 862/900
11/11 [==============================] - 0s 772us/step - loss: 5.0449 - mean_squared_error: 5.0449
Epoch 863/900
11/11 [==============================] - 0s 866us/step - loss: 5.0381 - mean_squared_error: 5.0381
Epoch 864/900
11/11 [==============================] - 0s 816us/step - loss: 5.0425 - mean_squared_error: 5.0425
Epoch 865/900
11/11 [==============================] - 0s 815us/step - loss: 5.0265 - mean_squared_error: 5.0265
Epoch 866/900
11/11 [==============================] - 0s 816us/step - loss: 5.0203 - mean_squared_error: 5.0203
Epoch 867/900
11/11 [==============================] - 0s 710us/step - loss: 5.0054 - mean_squared_error: 5.0054
Epoch 868/900
11/11 [==============================] - 0s 865us/step - loss: 5.0034 - mean_squared_error: 5.0034
Epoch 869/900
11/11 [==============================] - 0s 708us/step - loss: 5.0009 - mean_squared_error: 5.0009
Epoch 870/900
11/11 [==============================] - 0s 816us/step - loss: 4.9874 - mean_squared_error: 4.9874
Epoch 871/900
11/11 [==============================] - 0s 890us/step - loss: 4.9815 - mean_squared_error: 4.9815
Epoch 872/900
11/11 [==============================] - 0s 997us/step - loss: 4.9763 - mean_squared_error: 4.9763
Epoch 873/900
11/11 [==============================] - 0s 881us/step - loss: 4.9676 - mean_squared_error: 4.9676
Epoch 874/900
11/11 [==============================] - 0s 790us/step - loss: 4.9569 - mean_squared_error: 4.9569
Epoch 875/900
11/11 [==============================] - 0s 814us/step - loss: 4.9502 - mean_squared_error: 4.9502
Epoch 876/900
11/11 [==============================] - 0s 900us/step - loss: 4.9394 - mean_squared_error: 4.9394
Epoch 877/900
11/11 [==============================] - 0s 907us/step - loss: 4.9368 - mean_squared_error: 4.9368
Epoch 878/900
11/11 [==============================] - 0s 889us/step - loss: 4.9339 - mean_squared_error: 4.9339
Epoch 879/900
11/11 [==============================] - 0s 857us/step - loss: 4.9258 - mean_squared_error: 4.9258
Epoch 880/900
11/11 [==============================] - 0s 904us/step - loss: 4.9152 - mean_squared_error: 4.9152
Epoch 881/900
11/11 [==============================] - 0s 980us/step - loss: 4.9184 - mean_squared_error: 4.9184
Epoch 882/900
11/11 [==============================] - 0s 975us/step - loss: 4.9046 - mean_squared_error: 4.9046
Epoch 883/900
11/11 [==============================] - 0s 887us/step - loss: 4.8961 - mean_squared_error: 4.8961
Epoch 884/900
11/11 [==============================] - 0s 884us/step - loss: 4.8954 - mean_squared_error: 4.8954
Epoch 885/900
11/11 [==============================] - 0s 816us/step - loss: 4.8799 - mean_squared_error: 4.8799
Epoch 886/900
11/11 [==============================] - 0s 902us/step - loss: 4.8732 - mean_squared_error: 4.8732
Epoch 887/900
11/11 [==============================] - 0s 907us/step - loss: 4.8623 - mean_squared_error: 4.8623
Epoch 888/900
11/11 [==============================] - 0s 911us/step - loss: 4.8622 - mean_squared_error: 4.8622
Epoch 889/900
11/11 [==============================] - 0s 896us/step - loss: 4.8501 - mean_squared_error: 4.8501
Epoch 890/900
11/11 [==============================] - 0s 785us/step - loss: 4.8412 - mean_squared_error: 4.8412
Epoch 891/900
11/11 [==============================] - 0s 907us/step - loss: 4.8367 - mean_squared_error: 4.8367
Epoch 892/900
11/11 [==============================] - 0s 964us/step - loss: 4.8302 - mean_squared_error: 4.8302
Epoch 893/900
11/11 [==============================] - 0s 816us/step - loss: 4.8192 - mean_squared_error: 4.8192
Epoch 894/900
11/11 [==============================] - 0s 781us/step - loss: 4.8171 - mean_squared_error: 4.8171
Epoch 895/900
11/11 [==============================] - 0s 816us/step - loss: 4.8295 - mean_squared_error: 4.8295
Epoch 896/900
11/11 [==============================] - 0s 725us/step - loss: 4.7873 - mean_squared_error: 4.7873
Epoch 897/900
11/11 [==============================] - 0s 791us/step - loss: 4.7886 - mean_squared_error: 4.7886
Epoch 898/900
11/11 [==============================] - 0s 824us/step - loss: 4.7878 - mean_squared_error: 4.7878
Epoch 899/900
11/11 [==============================] - 0s 815us/step - loss: 4.7817 - mean_squared_error: 4.7817
Epoch 900/900
11/11 [==============================] - 0s 724us/step - loss: 4.7747 - mean_squared_error: 4.7747
Out[21]:
<tensorflow.python.keras.callbacks.History at 0x2310be53940>

In the cell below predictions are made on the test data using all the rows and R2 scores are printed. It can be seen that an R2 score of 0.578332 is attained.

In [22]:
est_out = modelPCA.predict(ts_inputs)

R2_NN_PCA = r2_score(ts_target, est_out)

r2vec = [R2_lin_PCA, R2_lin_Corr, R2_rf_PCA, R2_rf_Corr, R2_svr_PCA, R2_svr_Cor_scaled, R2_NN_PCA]
r2_labels = ['Linear Regression with PCA',
            'Linear Regression with Correlation Filtering',
            'Random Forest Regression with PCA',
            'Random Forest Regression with Correlation Filtering',
            'Support Vector Regression with PCA',
            'Support Vector Regression with Correlation Filtering (Normalized)',
            'Neural Network with PCA']

fig, ax = plt.subplots(figsize=(5, 5))
ax.barh(r2_labels, r2vec, color='orange')
ax.set_title('$R^2$-scores obtained by each method');

for idx,label in enumerate(r2_labels):
    print('R^2-score obtained with '+label+' is {}'.format(r2vec[idx]))
R^2-score obtained with Linear Regression with PCA is 0.5434994705319338
R^2-score obtained with Linear Regression with Correlation Filtering is 0.3390603164540187
R^2-score obtained with Random Forest Regression with PCA is 0.5164346834115915
R^2-score obtained with Random Forest Regression with Correlation Filtering is 0.5094058766945282
R^2-score obtained with Support Vector Regression with PCA is 0.5232724838930993
R^2-score obtained with Support Vector Regression with Correlation Filtering (Normalized) is 0.28155897490458315
R^2-score obtained with Neural Network with PCA is 0.646610556801686

Part 2:¶

In Part 2 of the Prediction Challenge, we basically need to do the same prediction but with new test/train splits. In this case, the training will be performed without any American country, but all the testing will be performed on American countries (North or South). So firstly, we need to get the list of countries that lie either on North or South America (or in the continent of America). To do this, we basically merge our dataset with a region dataset.

In [23]:
regions = pd.read_csv('country_region.csv')[['Country', 'Region']]
cities = pd.read_excel('Cities.xls')
cities = pd.merge(cities, regions)

# Just to print the resulted merge. Only the shown columns are sliced.
america_cities = cities[cities['Region'].str.contains('America')][['City', 'Country', 'Region']]
pd.set_option('display.max_rows', 1000)
america_cities.head(3)
Out[23]:
City Country Region
0 Baltimore(MD) United States North America
1 Milwaukee(WI) United States North America
2 Austin(TX) United States North America

As you can see above, we got the list of countries that lie either on North or South America.

  • Since the procedure until training a model could basically be the same we will use our already prepared dataset df. However it has it's 'Country' column is encoded by integers. So we will use the dictionary that we created during the encoding of the country to decode the country list we obtained during the merge above so we could slice out the countries that lie within North or South America from our df.
In [24]:
# Get DataFrame for only South/ North America
country_list = cities[cities['Region'].str.contains('America')]['Country'].unique()

# Encode the country names if they exist in our dataset.
tmp = []
for country in country_list:
    try: 
        tmp.append(list(decode_dict.keys())[list(decode_dict.values()).index(country)])
    except:
        print('Error finding the country in the dataset. We possible removed it during cleaning.')
        continue
country_list = tmp

# Get cityID's since they are not getting standardized.
north_south_america = df[df.Country.isin(country_list)]

Since we are still using the same data (before splitting), we decided that we should use PCA + Neural Network approach here too because it gave the best results for us in the first part. For PCA part, there's not any reason to change why and how we picked our principle components, so we can use the already existing PCA object and transform our new splits with it.

In [25]:
ts_inputs = pca18.transform(df_inputs[df_inputs.index.isin(north_south_america.index)])
tr_inputs = pca18.transform(df_inputs[~df_inputs.index.isin(north_south_america.index)])

ts_target = df_target[df_target.index.isin(north_south_america.index)]
tr_target = df_target[~df_target.index.isin(north_south_america.index)]
  • Now we need to train a new model and optimize it using our new test and train splits:
In [26]:
from keras.models import Sequential
from keras.layers import Dense, Dropout, BatchNormalization
from keras.optimizers import Adam

# define the keras model
model = Sequential()
model.add(Dense(64, input_dim=np.shape(tr_inputs)[1], activation='tanh'))
model.add(Dense(64, activation='tanh'))

model.add(Dense(1, activation='linear')) # Output layer.


# compile the keras model
opt = Adam(learning_rate=1e-3)
model.compile(loss='mean_squared_error', optimizer=opt, metrics=['mean_squared_error'])
model.fit(tr_inputs, tr_target, epochs=90, batch_size=50, verbose=1) # Do not print the progress since 3500 epochs.
Epoch 1/90
5/5 [==============================] - 0s 804us/step - loss: 67.7026 - mean_squared_error: 67.7026
Epoch 2/90
5/5 [==============================] - 0s 998us/step - loss: 62.9678 - mean_squared_error: 62.9678
Epoch 3/90
5/5 [==============================] - 0s 803us/step - loss: 59.0603 - mean_squared_error: 59.0603
Epoch 4/90
5/5 [==============================] - 0s 1ms/step - loss: 56.2658 - mean_squared_error: 56.2658
Epoch 5/90
5/5 [==============================] - 0s 798us/step - loss: 53.6002 - mean_squared_error: 53.6002
Epoch 6/90
5/5 [==============================] - 0s 1ms/step - loss: 51.3559 - mean_squared_error: 51.3559
Epoch 7/90
5/5 [==============================] - 0s 796us/step - loss: 49.1077 - mean_squared_error: 49.1077
Epoch 8/90
5/5 [==============================] - 0s 997us/step - loss: 47.0035 - mean_squared_error: 47.0035
Epoch 9/90
5/5 [==============================] - 0s 1ms/step - loss: 44.8918 - mean_squared_error: 44.8918
Epoch 10/90
5/5 [==============================] - 0s 997us/step - loss: 42.7892 - mean_squared_error: 42.7892
Epoch 11/90
5/5 [==============================] - 0s 798us/step - loss: 40.7543 - mean_squared_error: 40.7543
Epoch 12/90
5/5 [==============================] - 0s 798us/step - loss: 38.7429 - mean_squared_error: 38.7429
Epoch 13/90
5/5 [==============================] - 0s 1ms/step - loss: 36.7251 - mean_squared_error: 36.7251
Epoch 14/90
5/5 [==============================] - 0s 1ms/step - loss: 34.7970 - mean_squared_error: 34.7970
Epoch 15/90
5/5 [==============================] - 0s 1ms/step - loss: 32.7470 - mean_squared_error: 32.7470
Epoch 16/90
5/5 [==============================] - 0s 926us/step - loss: 30.8569 - mean_squared_error: 30.8569
Epoch 17/90
5/5 [==============================] - 0s 997us/step - loss: 28.9436 - mean_squared_error: 28.9436
Epoch 18/90
5/5 [==============================] - 0s 1ms/step - loss: 27.0093 - mean_squared_error: 27.0093
Epoch 19/90
5/5 [==============================] - 0s 599us/step - loss: 25.2957 - mean_squared_error: 25.2957
Epoch 20/90
5/5 [==============================] - 0s 1ms/step - loss: 23.4162 - mean_squared_error: 23.4162
Epoch 21/90
5/5 [==============================] - 0s 796us/step - loss: 21.7668 - mean_squared_error: 21.7668
Epoch 22/90
5/5 [==============================] - 0s 1ms/step - loss: 20.1145 - mean_squared_error: 20.1145
Epoch 23/90
5/5 [==============================] - 0s 992us/step - loss: 18.5127 - mean_squared_error: 18.5127
Epoch 24/90
5/5 [==============================] - 0s 1ms/step - loss: 17.1276 - mean_squared_error: 17.1276
Epoch 25/90
5/5 [==============================] - 0s 798us/step - loss: 15.7748 - mean_squared_error: 15.7748
Epoch 26/90
5/5 [==============================] - 0s 998us/step - loss: 14.6015 - mean_squared_error: 14.6015
Epoch 27/90
5/5 [==============================] - 0s 1ms/step - loss: 13.6337 - mean_squared_error: 13.6337
Epoch 28/90
5/5 [==============================] - 0s 1ms/step - loss: 12.8211 - mean_squared_error: 12.8211
Epoch 29/90
5/5 [==============================] - 0s 1ms/step - loss: 11.9998 - mean_squared_error: 11.9998
Epoch 30/90
5/5 [==============================] - 0s 1ms/step - loss: 11.3029 - mean_squared_error: 11.3029
Epoch 31/90
5/5 [==============================] - 0s 2ms/step - loss: 10.8391 - mean_squared_error: 10.8391
Epoch 32/90
5/5 [==============================] - 0s 1ms/step - loss: 10.2759 - mean_squared_error: 10.2759
Epoch 33/90
5/5 [==============================] - 0s 1ms/step - loss: 9.9255 - mean_squared_error: 9.9255
Epoch 34/90
5/5 [==============================] - 0s 1ms/step - loss: 9.5981 - mean_squared_error: 9.5981
Epoch 35/90
5/5 [==============================] - 0s 798us/step - loss: 9.3513 - mean_squared_error: 9.3513
Epoch 36/90
5/5 [==============================] - 0s 598us/step - loss: 9.1021 - mean_squared_error: 9.1021
Epoch 37/90
5/5 [==============================] - 0s 799us/step - loss: 8.8863 - mean_squared_error: 8.8863
Epoch 38/90
5/5 [==============================] - 0s 1ms/step - loss: 8.6711 - mean_squared_error: 8.6711
Epoch 39/90
5/5 [==============================] - 0s 997us/step - loss: 8.5316 - mean_squared_error: 8.5316
Epoch 40/90
5/5 [==============================] - 0s 794us/step - loss: 8.3696 - mean_squared_error: 8.3696
Epoch 41/90
5/5 [==============================] - 0s 1ms/step - loss: 8.2249 - mean_squared_error: 8.2249
Epoch 42/90
5/5 [==============================] - 0s 1ms/step - loss: 8.0482 - mean_squared_error: 8.0482
Epoch 43/90
5/5 [==============================] - 0s 1ms/step - loss: 7.9215 - mean_squared_error: 7.9215
Epoch 44/90
5/5 [==============================] - 0s 998us/step - loss: 7.7595 - mean_squared_error: 7.7595
Epoch 45/90
5/5 [==============================] - 0s 1ms/step - loss: 7.6231 - mean_squared_error: 7.6231
Epoch 46/90
5/5 [==============================] - 0s 798us/step - loss: 7.4760 - mean_squared_error: 7.4760
Epoch 47/90
5/5 [==============================] - 0s 1ms/step - loss: 7.3813 - mean_squared_error: 7.3813
Epoch 48/90
5/5 [==============================] - 0s 798us/step - loss: 7.2180 - mean_squared_error: 7.2180
Epoch 49/90
5/5 [==============================] - 0s 899us/step - loss: 7.1167 - mean_squared_error: 7.1167
Epoch 50/90
5/5 [==============================] - 0s 1ms/step - loss: 6.9884 - mean_squared_error: 6.9884
Epoch 51/90
5/5 [==============================] - 0s 939us/step - loss: 6.8792 - mean_squared_error: 6.8792
Epoch 52/90
5/5 [==============================] - 0s 1ms/step - loss: 6.7553 - mean_squared_error: 6.7553
Epoch 53/90
5/5 [==============================] - 0s 1ms/step - loss: 6.6141 - mean_squared_error: 6.6141
Epoch 54/90
5/5 [==============================] - 0s 743us/step - loss: 6.5215 - mean_squared_error: 6.5215
Epoch 55/90
5/5 [==============================] - 0s 798us/step - loss: 6.3913 - mean_squared_error: 6.3913
Epoch 56/90
5/5 [==============================] - 0s 997us/step - loss: 6.2921 - mean_squared_error: 6.2921
Epoch 57/90
5/5 [==============================] - 0s 601us/step - loss: 6.1863 - mean_squared_error: 6.1863
Epoch 58/90
5/5 [==============================] - 0s 798us/step - loss: 6.1056 - mean_squared_error: 6.1056
Epoch 59/90
5/5 [==============================] - 0s 798us/step - loss: 6.0062 - mean_squared_error: 6.0062
Epoch 60/90
5/5 [==============================] - 0s 798us/step - loss: 5.9189 - mean_squared_error: 5.9189
Epoch 61/90
5/5 [==============================] - 0s 709us/step - loss: 5.8266 - mean_squared_error: 5.8266
Epoch 62/90
5/5 [==============================] - 0s 598us/step - loss: 5.7112 - mean_squared_error: 5.7112
Epoch 63/90
5/5 [==============================] - 0s 798us/step - loss: 5.6293 - mean_squared_error: 5.6293
Epoch 64/90
5/5 [==============================] - 0s 768us/step - loss: 5.5110 - mean_squared_error: 5.5110
Epoch 65/90
5/5 [==============================] - 0s 801us/step - loss: 5.4424 - mean_squared_error: 5.4424
Epoch 66/90
5/5 [==============================] - 0s 764us/step - loss: 5.3398 - mean_squared_error: 5.3398
Epoch 67/90
5/5 [==============================] - 0s 598us/step - loss: 5.2688 - mean_squared_error: 5.2688
Epoch 68/90
5/5 [==============================] - 0s 797us/step - loss: 5.1829 - mean_squared_error: 5.1829
Epoch 69/90
5/5 [==============================] - 0s 760us/step - loss: 5.1046 - mean_squared_error: 5.1046
Epoch 70/90
5/5 [==============================] - 0s 601us/step - loss: 5.0102 - mean_squared_error: 5.0102
Epoch 71/90
5/5 [==============================] - 0s 741us/step - loss: 4.9249 - mean_squared_error: 4.9249
Epoch 72/90
5/5 [==============================] - 0s 798us/step - loss: 4.8572 - mean_squared_error: 4.8572
Epoch 73/90
5/5 [==============================] - 0s 599us/step - loss: 4.7732 - mean_squared_error: 4.7732
Epoch 74/90
5/5 [==============================] - 0s 798us/step - loss: 4.7124 - mean_squared_error: 4.7124
Epoch 75/90
5/5 [==============================] - 0s 798us/step - loss: 4.6421 - mean_squared_error: 4.6421
Epoch 76/90
5/5 [==============================] - 0s 752us/step - loss: 4.5624 - mean_squared_error: 4.5624
Epoch 77/90
5/5 [==============================] - 0s 761us/step - loss: 4.4965 - mean_squared_error: 4.4965
Epoch 78/90
5/5 [==============================] - 0s 798us/step - loss: 4.4284 - mean_squared_error: 4.4284
Epoch 79/90
5/5 [==============================] - 0s 751us/step - loss: 4.3700 - mean_squared_error: 4.3700
Epoch 80/90
5/5 [==============================] - 0s 598us/step - loss: 4.3000 - mean_squared_error: 4.3000
Epoch 81/90
5/5 [==============================] - 0s 798us/step - loss: 4.2243 - mean_squared_error: 4.2243
Epoch 82/90
5/5 [==============================] - 0s 598us/step - loss: 4.1689 - mean_squared_error: 4.1689
Epoch 83/90
5/5 [==============================] - 0s 798us/step - loss: 4.1093 - mean_squared_error: 4.1093
Epoch 84/90
5/5 [==============================] - 0s 598us/step - loss: 4.0493 - mean_squared_error: 4.0493
Epoch 85/90
5/5 [==============================] - 0s 798us/step - loss: 4.0074 - mean_squared_error: 4.0074
Epoch 86/90
5/5 [==============================] - 0s 598us/step - loss: 3.9332 - mean_squared_error: 3.9332
Epoch 87/90
5/5 [==============================] - 0s 694us/step - loss: 3.8766 - mean_squared_error: 3.8766
Epoch 88/90
5/5 [==============================] - 0s 716us/step - loss: 3.8362 - mean_squared_error: 3.8362
Epoch 89/90
5/5 [==============================] - 0s 798us/step - loss: 3.7924 - mean_squared_error: 3.7924
Epoch 90/90
5/5 [==============================] - 0s 798us/step - loss: 3.7349 - mean_squared_error: 3.7349
Out[26]:
<tensorflow.python.keras.callbacks.History at 0x2310e8f7f40>
  • In the cell below predictions are made on the data using only the rows from countries within North and South America and the R2 score is printed. It can be seen that the R2 score obtained is 0.416 (above the goal).
In [28]:
est_out = model.predict(ts_inputs)
R_2 = r2_score(ts_target, est_out)
print('The obtained R^2-score is {}'.format(R_2))
The obtained R^2-score is 0.41638630585990144

It's quite natural that the performance of the model we could get is poorer than what we could get in the first part, because we had no American countries in our training set but we performed all the testing on the cities that are located on the continent of America.

For a model to be transferable, the model needs to be trained on a dataset that covers enough number of different combination of feature values that could define the target behaviour. In our case, it is seen that the model is still able to predict the target up to an extent with this new data, but losing score and accuracy.