The dataset contains real-life driving data from two Volkswagen e-Golf cars, manufactured in 2014 and 2016 respectively. The data is available on the Spritmonitor website.
Volkswagen e-Golf, year 2014, 85 kW (116 PS): https://www.spritmonitor.de/en/detail/679341.html?page=8
Volkswagen e-Golf, year 2016, 85kW (116 PS): https://www.spritmonitor.de/en/detail/786327.html
The data was scraped using a Python crawler (vehicle_crawler.py) available at: https://github.com/armiro/crawlers/tree/master/SpritMonitor-Crawler
The file includes data on 3615 trips with a total travel distance of around 152,167 kilometers.
It is important to restrict the dataset to a small number of cars. If data for the same model, owned by different people around the world, were pooled into one dataset, the widely varying driving conditions would destabilize model training. Hence, the dataset should not mix cars driven under too varied conditions.
Note: Some of the features were trimmed from the original CSV file before being loaded in the notebook.
import pandas as pd
data = pd.read_csv('/content/drive/MyDrive/Colab Notebooks/Projects/EV Range Prediction/Dataset/Volks.csv')
data.head()
Manufacturer | Power(kW) | Quantity(kWh) | Tire_Type | City | Highway | Country_Roads | Driving_Style | A/C | Park_heating | Avg_Speed(km/h) | Trip_Distance(km) | |
---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | Volkswagen | 85 | 2.41 | Summer tires | 1 | 0 | 1 | Normal | 0 | 0 | 44.0 | 21.0 |
1 | Volkswagen | 85 | 13.17 | Summer tires | 1 | 1 | 1 | Normal | 0 | 0 | 57.0 | 108.0 |
2 | Volkswagen | 85 | 8.89 | Summer tires | 1 | 1 | 1 | Normal | 0 | 0 | 52.0 | 76.0 |
3 | Volkswagen | 85 | 12.46 | Summer tires | 1 | 1 | 1 | Normal | 0 | 0 | 63.0 | 103.0 |
4 | Volkswagen | 85 | 8.97 | Summer tires | 1 | 0 | 1 | Normal | 0 | 0 | 38.0 | 78.0 |
Check which features in the dataset are categorical and which are numerical.
data.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 3615 entries, 0 to 3614 Data columns (total 12 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Manufacturer 3615 non-null object 1 Power(kW) 3615 non-null int64 2 Quantity(kWh) 3615 non-null float64 3 Tire_Type 3615 non-null object 4 City 3615 non-null int64 5 Highway 3615 non-null int64 6 Country_Roads 3615 non-null int64 7 Driving_Style 3615 non-null object 8 A/C 3615 non-null int64 9 Park_heating 3615 non-null int64 10 Avg_Speed(km/h) 3585 non-null float64 11 Trip_Distance(km) 3615 non-null float64 dtypes: float64(3), int64(6), object(3) memory usage: 339.0+ KB
Since the dataset covers only one car model, the Volkswagen e-Golf, the Manufacturer and battery Power(kW) are constant. They can therefore be safely dropped from the dataset without any loss of information.
data.drop(['Manufacturer','Power(kW)'], axis=1, inplace=True)
All rows with missing values of Trip_Distance(km) must be removed. These values cannot be imputed, because Trip_Distance(km) is the target variable.
data = data[data['Trip_Distance(km)'].notnull()]
data.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 3615 entries, 0 to 3614 Data columns (total 10 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Quantity(kWh) 3615 non-null float64 1 Tire_Type 3615 non-null object 2 City 3615 non-null int64 3 Highway 3615 non-null int64 4 Country_Roads 3615 non-null int64 5 Driving_Style 3615 non-null object 6 A/C 3615 non-null int64 7 Park_heating 3615 non-null int64 8 Avg_Speed(km/h) 3585 non-null float64 9 Trip_Distance(km) 3615 non-null float64 dtypes: float64(3), int64(5), object(2) memory usage: 310.7+ KB
The categorical variables cannot be fed to the model directly. They have to be converted into an integer (1/0) indicator representation.
data.select_dtypes('object')
Tire_Type | Driving_Style | |
---|---|---|
0 | Summer tires | Normal |
1 | Summer tires | Normal |
2 | Summer tires | Normal |
3 | Summer tires | Normal |
4 | Summer tires | Normal |
... | ... | ... |
3610 | Winter tires | Normal |
3611 | Winter tires | Normal |
3612 | Winter tires | Normal |
3613 | Winter tires | Fast |
3614 | Winter tires | Normal |
3615 rows × 2 columns
print('Unique tire types:', data['Tire_Type'].unique())
print('Unique driving styles:', data['Driving_Style'].unique())
Unique tire types: ['Summer tires' 'Winter tires'] Unique driving styles: ['Normal' 'Moderate' 'Fast']
# One-hot encode Tire_Type: one 1/0 indicator column per unique value
for each in data['Tire_Type'].unique():
    data['Tire_Type ' + each] = (data['Tire_Type'] == each).astype(int)
data.drop(['Tire_Type'], axis=1, inplace=True)
data.head()
Quantity(kWh) | City | Highway | Country_Roads | Driving_Style | A/C | Park_heating | Avg_Speed(km/h) | Trip_Distance(km) | Tire_Type Summer tires | Tire_Type Winter tires | |
---|---|---|---|---|---|---|---|---|---|---|---|
0 | 2.41 | 1 | 0 | 1 | Normal | 0 | 0 | 44.0 | 21.0 | 1 | 0 |
1 | 13.17 | 1 | 1 | 1 | Normal | 0 | 0 | 57.0 | 108.0 | 1 | 0 |
2 | 8.89 | 1 | 1 | 1 | Normal | 0 | 0 | 52.0 | 76.0 | 1 | 0 |
3 | 12.46 | 1 | 1 | 1 | Normal | 0 | 0 | 63.0 | 103.0 | 1 | 0 |
4 | 8.97 | 1 | 0 | 1 | Normal | 0 | 0 | 38.0 | 78.0 | 1 | 0 |
# One-hot encode Driving_Style in the same way
for each in data['Driving_Style'].unique():
    data['Driving_Style ' + each] = (data['Driving_Style'] == each).astype(int)
data.drop(['Driving_Style'], axis=1, inplace=True)
data.head()
Quantity(kWh) | City | Highway | Country_Roads | A/C | Park_heating | Avg_Speed(km/h) | Trip_Distance(km) | Tire_Type Summer tires | Tire_Type Winter tires | Driving_Style Normal | Driving_Style Moderate | Driving_Style Fast | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0 | 2.41 | 1 | 0 | 1 | 0 | 0 | 44.0 | 21.0 | 1 | 0 | 1 | 0 | 0 |
1 | 13.17 | 1 | 1 | 1 | 0 | 0 | 57.0 | 108.0 | 1 | 0 | 1 | 0 | 0 |
2 | 8.89 | 1 | 1 | 1 | 0 | 0 | 52.0 | 76.0 | 1 | 0 | 1 | 0 | 0 |
3 | 12.46 | 1 | 1 | 1 | 0 | 0 | 63.0 | 103.0 | 1 | 0 | 1 | 0 | 0 |
4 | 8.97 | 1 | 0 | 1 | 0 | 0 | 38.0 | 78.0 | 1 | 0 | 1 | 0 | 0 |
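The manual indicator loops above can equivalently be written with pandas' built-in `pd.get_dummies`. A minimal sketch on toy values (illustrative, not taken from the dataset):

```python
import pandas as pd

# Toy frame mimicking the two categorical columns
toy = pd.DataFrame({
    'Tire_Type': ['Summer tires', 'Winter tires', 'Summer tires'],
    'Driving_Style': ['Normal', 'Fast', 'Moderate'],
})

# One-hot encode both columns at once; prefix_sep=' ' and dtype=int
# reproduce the 'Tire_Type Summer tires'-style 1/0 columns built above
encoded = pd.get_dummies(toy, columns=['Tire_Type', 'Driving_Style'],
                         prefix_sep=' ', dtype=int)
print(encoded.columns.tolist())
```

One practical difference: `get_dummies` orders the indicator columns by sorted category, while the manual loop follows order of first appearance.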
Checking the data types of the encoded features
data.info()
<class 'pandas.core.frame.DataFrame'> Int64Index: 3615 entries, 0 to 3614 Data columns (total 13 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Quantity(kWh) 3615 non-null float64 1 City 3615 non-null int64 2 Highway 3615 non-null int64 3 Country_Roads 3615 non-null int64 4 A/C 3615 non-null int64 5 Park_heating 3615 non-null int64 6 Avg_Speed(km/h) 3585 non-null float64 7 Trip_Distance(km) 3615 non-null float64 8 Tire_Type Summer tires 3615 non-null int64 9 Tire_Type Winter tires 3615 non-null int64 10 Driving_Style Normal 3615 non-null int64 11 Driving_Style Moderate 3615 non-null int64 12 Driving_Style Fast 3615 non-null int64 dtypes: float64(3), int64(10) memory usage: 395.4 KB
data = data.sample(frac=1)  # shuffle the rows before splitting
Y = data['Trip_Distance(km)']
X = data.drop(['Trip_Distance(km)'], axis=1)
X.reset_index(drop=True, inplace=True)
Y.reset_index(drop=True, inplace=True)
from sklearn.model_selection import train_test_split
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, train_size=0.8, random_state=1)
X_train.reset_index(drop=True, inplace=True)
X_test.reset_index(drop=True, inplace=True)
Y_train.reset_index(drop=True, inplace=True)
Y_test.reset_index(drop=True, inplace=True)
print(X_train.shape, Y_train.shape, X_test.shape, Y_test.shape)
(2892, 12) (2892,) (723, 12) (723,)
Missing values in numerical (non-categorical) features are imputed using the rest of the data in the same feature.
X_train.isnull().any()
Quantity(kWh) False City False Highway False Country_Roads False A/C False Park_heating False Avg_Speed(km/h) True Tire_Type Summer tires False Tire_Type Winter tires False Driving_Style Normal False Driving_Style Moderate False Driving_Style Fast False dtype: bool
Here, we use a SimpleImputer with the 'mean' strategy, i.e. missing values are replaced by the mean of the non-missing values in that feature.
from sklearn.impute import SimpleImputer
imputer = SimpleImputer(strategy='mean')
cols_with_missing_data = [cols for cols in X_train.columns
if X_train[cols].isnull().any()]
X_train_imputed_data_columns = pd.DataFrame(imputer.fit_transform(X_train[cols_with_missing_data]))
X_test_imputed_data_columns = pd.DataFrame(imputer.transform(X_test[cols_with_missing_data]))
X_train_imputed_data_columns.columns = cols_with_missing_data
X_test_imputed_data_columns.columns = cols_with_missing_data
X_train = pd.concat([X_train.drop(cols_with_missing_data, axis=1, inplace=False), X_train_imputed_data_columns], axis=1)
X_test = pd.concat([X_test.drop(cols_with_missing_data, axis=1, inplace=False), X_test_imputed_data_columns], axis=1)
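As a small self-contained illustration of the mean strategy (toy numbers, not the notebook's data): the imputer learns the column mean on the training split and reuses that same mean on the test split, which avoids leaking test statistics into preprocessing.

```python
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer

train = pd.DataFrame({'Avg_Speed(km/h)': [40.0, 60.0, np.nan, 50.0]})
test = pd.DataFrame({'Avg_Speed(km/h)': [np.nan, 70.0]})

imp = SimpleImputer(strategy='mean')
train_filled = imp.fit_transform(train)  # fits mean = (40 + 60 + 50) / 3 = 50
test_filled = imp.transform(test)        # reuses the TRAIN mean for the test gap
print(train_filled.ravel(), test_filled.ravel())
```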
Checking the number of non-null values in each column of the dataset
X_train.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 2892 entries, 0 to 2891 Data columns (total 12 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Quantity(kWh) 2892 non-null float64 1 City 2892 non-null int64 2 Highway 2892 non-null int64 3 Country_Roads 2892 non-null int64 4 A/C 2892 non-null int64 5 Park_heating 2892 non-null int64 6 Tire_Type Summer tires 2892 non-null int64 7 Tire_Type Winter tires 2892 non-null int64 8 Driving_Style Normal 2892 non-null int64 9 Driving_Style Moderate 2892 non-null int64 10 Driving_Style Fast 2892 non-null int64 11 Avg_Speed(km/h) 2892 non-null float64 dtypes: float64(2), int64(10) memory usage: 271.2 KB
X_test.info()
<class 'pandas.core.frame.DataFrame'> RangeIndex: 723 entries, 0 to 722 Data columns (total 12 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 Quantity(kWh) 723 non-null float64 1 City 723 non-null int64 2 Highway 723 non-null int64 3 Country_Roads 723 non-null int64 4 A/C 723 non-null int64 5 Park_heating 723 non-null int64 6 Tire_Type Summer tires 723 non-null int64 7 Tire_Type Winter tires 723 non-null int64 8 Driving_Style Normal 723 non-null int64 9 Driving_Style Moderate 723 non-null int64 10 Driving_Style Fast 723 non-null int64 11 Avg_Speed(km/h) 723 non-null float64 dtypes: float64(2), int64(10) memory usage: 67.9 KB
This analysis is performed on continuous features, assuming they follow a Gaussian distribution. Data points with a z-score above 3 or below -3 are considered outliers and removed from the dataset.
import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt
%matplotlib inline
# Outlier analysis is applied to the continuous features
# In our dataset, Quantity(kWh) & Avg_Speed(km/h) are the only continuous features apart from the target variable
plt.figure(figsize=(6,6))
plt.hist(X_train['Avg_Speed(km/h)'], bins=30, rwidth=0.8)
plt.xlabel('Avg_Speed(km/h)')
plt.ylabel('Count')
plt.show()
plt.figure(figsize=(6,6))
rng = np.arange(X_train['Avg_Speed(km/h)'].min(), X_train['Avg_Speed(km/h)'].max(), 0.1)
plt.plot(rng, norm.pdf(rng, X_train['Avg_Speed(km/h)'].mean(), X_train['Avg_Speed(km/h)'].std()))
plt.show()
# The roughly bell-shaped histogram above suggests average speed is approximately normally distributed
# Thus removing outliers based on the z-score
X_train['Speed_z_score'] = (X_train['Avg_Speed(km/h)'] - X_train['Avg_Speed(km/h)'].mean())/X_train['Avg_Speed(km/h)'].std()
X_test['Speed_z_score'] = (X_test['Avg_Speed(km/h)'] - X_test['Avg_Speed(km/h)'].mean())/X_test['Avg_Speed(km/h)'].std()
# Removing rows with a z-score of more than 3 or less than -3 for average speed
# (note the & — an | here would be true for every row and remove nothing)
to_be_included_train = (X_train['Speed_z_score'] < 3) & (X_train['Speed_z_score'] > -3)
to_be_included_test = (X_test['Speed_z_score'] < 3) & (X_test['Speed_z_score'] > -3)
X_train = X_train[to_be_included_train]
Y_train = Y_train[to_be_included_train]
X_test = X_test[to_be_included_test]
Y_test = Y_test[to_be_included_test]
X_train.drop(['Speed_z_score'], axis=1, inplace=True)
X_test.drop(['Speed_z_score'], axis=1, inplace=True)
print(X_train.shape, Y_train.shape, X_test.shape, Y_test.shape)
(2892, 12) (2892,) (723, 12) (723,)
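The z-score rule keeps a row only when its score lies strictly between -3 and 3. A self-contained sketch of the same rule on toy speeds (illustrative values only):

```python
import pandas as pd

# Fourteen plausible speeds plus one extreme value
speeds = pd.Series([50.0] * 10 + [48.0, 52.0, 49.0, 51.0, 500.0])
z = (speeds - speeds.mean()) / speeds.std()

# Keep only rows with |z| < 3; the 500 km/h entry is filtered out
kept = speeds[(z > -3) & (z < 3)]
print(len(kept))
```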
plt.figure(figsize=(6,6))
plt.hist(X_train['Quantity(kWh)'], bins=30, rwidth=0.8)
plt.xlabel('Quantity(kWh)')
plt.ylabel('Count')
plt.show()
plt.figure(figsize=(6,6))
rng = np.arange(X_train['Quantity(kWh)'].min(), X_train['Quantity(kWh)'].max(), 0.1)
plt.plot(rng, norm.pdf(rng, X_train['Quantity(kWh)'].mean(), X_train['Quantity(kWh)'].std()))
plt.show()
# From the curve it is concluded that Quantity(kWh) is not normally distributed
# Hence, removing all rows with Quantity(kWh) greater than 65 kWh
to_be_included_train = X_train['Quantity(kWh)']<=65
to_be_included_test = X_test['Quantity(kWh)']<=65
X_train = X_train[to_be_included_train]
Y_train = Y_train[to_be_included_train]
X_test = X_test[to_be_included_test]
Y_test = Y_test[to_be_included_test]
print(X_train.shape, Y_train.shape, X_test.shape, Y_test.shape)
(2884, 12) (2884,) (723, 12) (723,)
X_train.reset_index(drop=True, inplace=True)
X_test.reset_index(drop=True, inplace=True)
Y_train.reset_index(drop=True, inplace=True)
Y_test.reset_index(drop=True, inplace=True)
import seaborn as sns
import matplotlib.pyplot as plt
plt.figure(figsize=(10,10))
sns.heatmap(X_train.corr(method='pearson'), annot=True)
plt.show()
As the heatmap shows, Driving_Style Normal and Driving_Style Moderate have a strong negative correlation, so one of them should be dropped. The same applies to the Summer and Winter tire types, which are exactly complementary.
X_train = X_train.drop(['Driving_Style Normal', 'Tire_Type Summer tires'], axis=1)
X_test = X_test.drop(['Driving_Style Normal', 'Tire_Type Summer tires'], axis=1)
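The reason one dummy per pair can be dropped is exact for a two-level category: the two indicator columns are complements of each other, so their Pearson correlation is always -1 and one of them carries no extra information. A quick check on toy indicators:

```python
import numpy as np

summer = np.array([1, 0, 1, 1, 0])
winter = 1 - summer  # the complementary dummy from the same two-level category

r = np.corrcoef(summer, winter)[0, 1]
print(r)  # exactly -1
```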
import tensorflow as tf
import tensorflowjs
import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from keras import regularizers
cols_to_be_normalized = ['Quantity(kWh)', 'Avg_Speed(km/h)']
for each in cols_to_be_normalized:
mean = X_train[each].mean()
std = X_train[each].std()
X_train[each] = (X_train[each] - mean)/std
X_test[each] = (X_test[each] - mean)/std
X_train
Quantity(kWh) | City | Highway | Country_Roads | A/C | Park_heating | Tire_Type Winter tires | Driving_Style Moderate | Driving_Style Fast | Avg_Speed(km/h) | |
---|---|---|---|---|---|---|---|---|---|---|
0 | 0.262074 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0.132081 |
1 | -0.706020 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | -0.965159 |
2 | -0.608280 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | -1.200282 |
3 | -0.273170 | 1 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0.288829 |
4 | -0.384874 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | -0.338165 |
... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
2879 | -0.432968 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | -1.121908 |
2880 | 1.034687 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0.759075 |
2881 | -0.571046 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | -0.416539 |
2882 | -0.577251 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | -1.357031 |
2883 | -0.308853 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | -0.024668 |
2884 rows × 10 columns
X_test
Quantity(kWh) | City | Highway | Country_Roads | A/C | Park_heating | Tire_Type Winter tires | Driving_Style Moderate | Driving_Style Fast | Avg_Speed(km/h) | |
---|---|---|---|---|---|---|---|---|---|---|
0 | -0.319713 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0.445578 |
1 | -0.468651 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | -1.670528 |
2 | -0.840995 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | -1.670528 |
3 | -0.329022 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0.367204 |
4 | -0.856509 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | -1.748902 |
... | ... | ... | ... | ... | ... | ... | ... | ... | ... | ... |
718 | 0.893507 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 1.307695 |
719 | -0.712226 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | -1.278656 |
720 | -0.360051 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0.288829 |
721 | 1.372900 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | -0.024668 |
722 | 0.994350 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | -0.259791 |
723 rows × 10 columns
model = Sequential()
model.add(Dense(units=32, activation='relu', input_dim=len(X_train.columns)))
model.add(Dense(units=32, activation='relu'))
model.add(Dense(units=1, activation='linear'))
batch_size = 16 # batch size for model fitting
epochs = 1000 # number of epochs for model fitting
STEPS_PER_EPOCH = X_train.shape[0]/batch_size
es = keras.callbacks.EarlyStopping(monitor='val_loss', patience=50, verbose=0) # Early stopping to prevent overfitting
lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay( # Learning rate schedule that decays smoothly, scaled so one decay period spans 100 epochs
0.001,
decay_steps=STEPS_PER_EPOCH*100,
decay_rate=0.98,
staircase=False)
opt = keras.optimizers.RMSprop(lr_schedule) # Using RMSprop optimizer with the learning rate schedule as mentioned above
model.compile(loss='mae', optimizer=opt, metrics=['mae'])
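Per the tf.keras documentation, `InverseTimeDecay` computes lr(step) = initial_lr / (1 + decay_rate * step / decay_steps); with decay_steps set to 100 epochs' worth of batches, the learning rate falls to roughly half its initial value after the first 100 epochs. A pure-Python check of that claim (no TensorFlow needed):

```python
def inverse_time_decay(initial_lr, step, decay_steps, decay_rate):
    # lr(step) = initial_lr / (1 + decay_rate * step / decay_steps)
    return initial_lr / (1 + decay_rate * step / decay_steps)

batch_size = 16
steps_per_epoch = 2884 / batch_size        # matches X_train.shape[0] / batch_size
decay_steps = steps_per_epoch * 100

# After 100 epochs the schedule gives 0.001 / (1 + 0.98) ≈ 0.000505
lr_100_epochs = inverse_time_decay(0.001, steps_per_epoch * 100, decay_steps, 0.98)
print(lr_100_epochs)
```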
model.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense (Dense) (None, 32) 352 _________________________________________________________________ dense_1 (Dense) (None, 32) 1056 _________________________________________________________________ dense_2 (Dense) (None, 1) 33 ================================================================= Total params: 1,441 Trainable params: 1,441 Non-trainable params: 0 _________________________________________________________________
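The parameter counts in the summary can be verified by hand: a Dense layer has (inputs × units) weights plus one bias per unit, and the first layer sees 10 input features after the column drops.

```python
def dense_params(n_in, n_units):
    # weights (n_in * n_units) plus one bias per unit
    return n_in * n_units + n_units

layer1 = dense_params(10, 32)  # 10*32 + 32 = 352
layer2 = dense_params(32, 32)  # 32*32 + 32 = 1056
layer3 = dense_params(32, 1)   # 32*1 + 1 = 33
print(layer1, layer2, layer3, layer1 + layer2 + layer3)  # 352 1056 33 1441
```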
model.fit(X_train, Y_train, epochs=epochs, batch_size=batch_size, callbacks=[es], validation_split=0.2, verbose=1)
Epoch 1/1000 145/145 [==============================] - 1s 3ms/step - loss: 36.3070 - mae: 36.3070 - val_loss: 26.2210 - val_mae: 26.2210 Epoch 2/1000 145/145 [==============================] - 0s 2ms/step - loss: 24.8942 - mae: 24.8942 - val_loss: 20.6493 - val_mae: 20.6493 Epoch 3/1000 145/145 [==============================] - 0s 2ms/step - loss: 18.1672 - mae: 18.1672 - val_loss: 13.9419 - val_mae: 13.9419 Epoch 4/1000 145/145 [==============================] - 0s 2ms/step - loss: 11.5387 - mae: 11.5387 - val_loss: 9.4910 - val_mae: 9.4910 Epoch 5/1000 145/145 [==============================] - 0s 2ms/step - loss: 8.2225 - mae: 8.2225 - val_loss: 7.7160 - val_mae: 7.7160 Epoch 6/1000 145/145 [==============================] - 0s 2ms/step - loss: 6.9653 - mae: 6.9653 - val_loss: 6.7490 - val_mae: 6.7490 Epoch 7/1000 145/145 [==============================] - 0s 2ms/step - loss: 6.3281 - mae: 6.3281 - val_loss: 6.1742 - val_mae: 6.1742 Epoch 8/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.9244 - mae: 5.9244 - val_loss: 5.8284 - val_mae: 5.8284 Epoch 9/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.6816 - mae: 5.6816 - val_loss: 5.5616 - val_mae: 5.5616 Epoch 10/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.5277 - mae: 5.5277 - val_loss: 5.4544 - val_mae: 5.4544 Epoch 11/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.4023 - mae: 5.4023 - val_loss: 5.2028 - val_mae: 5.2028 Epoch 12/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.2984 - mae: 5.2984 - val_loss: 5.1732 - val_mae: 5.1732 Epoch 13/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.2131 - mae: 5.2131 - val_loss: 5.0341 - val_mae: 5.0341 Epoch 14/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.1185 - mae: 5.1185 - val_loss: 5.0084 - val_mae: 5.0084 Epoch 15/1000 145/145 [==============================] - 0s 2ms/step - loss: 5.0599 - mae: 5.0599 - 
val_loss: 4.9771 - val_mae: 4.9771 Epoch 16/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.9893 - mae: 4.9893 - val_loss: 4.8390 - val_mae: 4.8390 Epoch 17/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.9543 - mae: 4.9543 - val_loss: 4.8807 - val_mae: 4.8807 Epoch 18/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.8955 - mae: 4.8955 - val_loss: 4.8536 - val_mae: 4.8536 Epoch 19/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.8286 - mae: 4.8286 - val_loss: 4.7917 - val_mae: 4.7917 Epoch 20/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.8554 - mae: 4.8554 - val_loss: 4.6966 - val_mae: 4.6966 Epoch 21/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.7699 - mae: 4.7699 - val_loss: 4.6495 - val_mae: 4.6495 Epoch 22/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.7503 - mae: 4.7503 - val_loss: 4.6067 - val_mae: 4.6067 Epoch 23/1000 145/145 [==============================] - 0s 1ms/step - loss: 4.7161 - mae: 4.7161 - val_loss: 4.5587 - val_mae: 4.5587 Epoch 24/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.6858 - mae: 4.6858 - val_loss: 4.5182 - val_mae: 4.5182 Epoch 25/1000 145/145 [==============================] - 0s 1ms/step - loss: 4.6579 - mae: 4.6579 - val_loss: 4.5521 - val_mae: 4.5521 Epoch 26/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.6220 - mae: 4.6220 - val_loss: 4.5094 - val_mae: 4.5094 Epoch 27/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.6009 - mae: 4.6009 - val_loss: 4.4898 - val_mae: 4.4898 Epoch 28/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.5689 - mae: 4.5689 - val_loss: 4.4618 - val_mae: 4.4618 Epoch 29/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.5276 - mae: 4.5276 - val_loss: 4.4460 - val_mae: 4.4460 Epoch 30/1000 145/145 [==============================] - 0s 2ms/step - 
loss: 4.4899 - mae: 4.4899 - val_loss: 4.3060 - val_mae: 4.3060 Epoch 31/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.4865 - mae: 4.4865 - val_loss: 4.3314 - val_mae: 4.3314 Epoch 32/1000 145/145 [==============================] - 0s 1ms/step - loss: 4.4520 - mae: 4.4520 - val_loss: 4.3761 - val_mae: 4.3761 Epoch 33/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.4171 - mae: 4.4171 - val_loss: 4.4257 - val_mae: 4.4257 Epoch 34/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3950 - mae: 4.3950 - val_loss: 4.5638 - val_mae: 4.5638 Epoch 35/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3984 - mae: 4.3984 - val_loss: 4.3336 - val_mae: 4.3336 Epoch 36/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3606 - mae: 4.3606 - val_loss: 4.4266 - val_mae: 4.4266 Epoch 37/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3516 - mae: 4.3516 - val_loss: 4.3104 - val_mae: 4.3104 Epoch 38/1000 145/145 [==============================] - 0s 1ms/step - loss: 4.3465 - mae: 4.3465 - val_loss: 4.2789 - val_mae: 4.2789 Epoch 39/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3190 - mae: 4.3190 - val_loss: 4.2410 - val_mae: 4.2410 Epoch 40/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.3011 - mae: 4.3011 - val_loss: 4.1979 - val_mae: 4.1979 Epoch 41/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2750 - mae: 4.2750 - val_loss: 4.3659 - val_mae: 4.3659 Epoch 42/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2556 - mae: 4.2556 - val_loss: 4.1541 - val_mae: 4.1541 Epoch 43/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2562 - mae: 4.2562 - val_loss: 4.1957 - val_mae: 4.1957 Epoch 44/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2331 - mae: 4.2331 - val_loss: 4.2357 - val_mae: 4.2357 Epoch 45/1000 145/145 
[==============================] - 0s 2ms/step - loss: 4.2332 - mae: 4.2332 - val_loss: 4.0993 - val_mae: 4.0993 Epoch 46/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2307 - mae: 4.2307 - val_loss: 4.1179 - val_mae: 4.1179 Epoch 47/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2083 - mae: 4.2083 - val_loss: 4.1171 - val_mae: 4.1171 Epoch 48/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.2009 - mae: 4.2009 - val_loss: 4.0515 - val_mae: 4.0515 Epoch 49/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1872 - mae: 4.1872 - val_loss: 4.3476 - val_mae: 4.3476 Epoch 50/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1791 - mae: 4.1791 - val_loss: 4.3400 - val_mae: 4.3400 Epoch 51/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1632 - mae: 4.1632 - val_loss: 4.0830 - val_mae: 4.0830 Epoch 52/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1761 - mae: 4.1761 - val_loss: 4.1936 - val_mae: 4.1936 Epoch 53/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1399 - mae: 4.1399 - val_loss: 4.0399 - val_mae: 4.0399 Epoch 54/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1412 - mae: 4.1412 - val_loss: 4.1157 - val_mae: 4.1157 Epoch 55/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1368 - mae: 4.1368 - val_loss: 4.2025 - val_mae: 4.2025 Epoch 56/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1138 - mae: 4.1138 - val_loss: 4.1260 - val_mae: 4.1260 Epoch 57/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1161 - mae: 4.1161 - val_loss: 4.1705 - val_mae: 4.1705 Epoch 58/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1041 - mae: 4.1041 - val_loss: 4.0954 - val_mae: 4.0954 Epoch 59/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.1094 - mae: 4.1094 - val_loss: 4.0748 - val_mae: 
4.0748 Epoch 60/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0903 - mae: 4.0903 - val_loss: 3.9924 - val_mae: 3.9924 Epoch 61/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0944 - mae: 4.0944 - val_loss: 4.1145 - val_mae: 4.1145 Epoch 62/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0823 - mae: 4.0823 - val_loss: 4.0102 - val_mae: 4.0102 Epoch 63/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0745 - mae: 4.0745 - val_loss: 4.2580 - val_mae: 4.2580 Epoch 64/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0693 - mae: 4.0693 - val_loss: 4.0097 - val_mae: 4.0097 Epoch 65/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0590 - mae: 4.0590 - val_loss: 3.9932 - val_mae: 3.9932 Epoch 66/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0483 - mae: 4.0483 - val_loss: 4.0399 - val_mae: 4.0399 Epoch 67/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0526 - mae: 4.0526 - val_loss: 4.3833 - val_mae: 4.3833 Epoch 68/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0502 - mae: 4.0502 - val_loss: 3.9731 - val_mae: 3.9731 Epoch 69/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0453 - mae: 4.0453 - val_loss: 4.0275 - val_mae: 4.0275 Epoch 70/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0338 - mae: 4.0338 - val_loss: 4.1954 - val_mae: 4.1954 Epoch 71/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0233 - mae: 4.0233 - val_loss: 4.1872 - val_mae: 4.1872 Epoch 72/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0305 - mae: 4.0305 - val_loss: 4.0546 - val_mae: 4.0546 Epoch 73/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0097 - mae: 4.0097 - val_loss: 4.1004 - val_mae: 4.1004 Epoch 74/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0134 - mae: 4.0134 - 
val_loss: 3.9416 - val_mae: 3.9416 Epoch 75/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0079 - mae: 4.0079 - val_loss: 4.0411 - val_mae: 4.0411 Epoch 76/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0123 - mae: 4.0123 - val_loss: 4.1289 - val_mae: 4.1289 Epoch 77/1000 145/145 [==============================] - 0s 2ms/step - loss: 4.0037 - mae: 4.0037 - val_loss: 3.9946 - val_mae: 3.9946 Epoch 78/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9976 - mae: 3.9976 - val_loss: 3.9889 - val_mae: 3.9889 Epoch 79/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9857 - mae: 3.9857 - val_loss: 4.0041 - val_mae: 4.0041 Epoch 80/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9913 - mae: 3.9913 - val_loss: 4.0290 - val_mae: 4.0290 Epoch 81/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9844 - mae: 3.9844 - val_loss: 3.9782 - val_mae: 3.9782 Epoch 82/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9849 - mae: 3.9849 - val_loss: 3.9352 - val_mae: 3.9352 Epoch 83/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9749 - mae: 3.9749 - val_loss: 4.0014 - val_mae: 4.0014 Epoch 84/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9577 - mae: 3.9577 - val_loss: 3.9358 - val_mae: 3.9358 Epoch 85/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9860 - mae: 3.9860 - val_loss: 3.9882 - val_mae: 3.9882 Epoch 86/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9619 - mae: 3.9619 - val_loss: 3.9481 - val_mae: 3.9481 Epoch 87/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9639 - mae: 3.9639 - val_loss: 4.2011 - val_mae: 4.2011 Epoch 88/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9623 - mae: 3.9623 - val_loss: 3.9198 - val_mae: 3.9198 Epoch 89/1000 145/145 [==============================] - 0s 2ms/step - 
loss: 3.9474 - mae: 3.9474 - val_loss: 3.8865 - val_mae: 3.8865 Epoch 90/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9563 - mae: 3.9563 - val_loss: 3.9250 - val_mae: 3.9250 Epoch 91/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9519 - mae: 3.9519 - val_loss: 3.8471 - val_mae: 3.8471 Epoch 92/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9431 - mae: 3.9431 - val_loss: 3.9301 - val_mae: 3.9301 Epoch 93/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9447 - mae: 3.9447 - val_loss: 3.8970 - val_mae: 3.8970 Epoch 94/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9399 - mae: 3.9399 - val_loss: 3.8583 - val_mae: 3.8583 Epoch 95/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9305 - mae: 3.9305 - val_loss: 4.1568 - val_mae: 4.1568 Epoch 96/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9275 - mae: 3.9275 - val_loss: 3.9955 - val_mae: 3.9955 Epoch 97/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9212 - mae: 3.9212 - val_loss: 4.0612 - val_mae: 4.0612 Epoch 98/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9259 - mae: 3.9259 - val_loss: 3.8839 - val_mae: 3.8839 Epoch 99/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9169 - mae: 3.9169 - val_loss: 3.9754 - val_mae: 3.9754 Epoch 100/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9028 - mae: 3.9028 - val_loss: 3.8813 - val_mae: 3.8813 Epoch 101/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9064 - mae: 3.9064 - val_loss: 3.9027 - val_mae: 3.9027 Epoch 102/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9133 - mae: 3.9133 - val_loss: 3.8762 - val_mae: 3.8762 Epoch 103/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.9082 - mae: 3.9082 - val_loss: 3.8534 - val_mae: 3.8534 Epoch 104/1000 145/145 
[==============================] - 0s 2ms/step - loss: 3.9013 - mae: 3.9013 - val_loss: 3.9671 - val_mae: 3.9671
[... epochs 105–374 omitted: training loss/MAE decreased steadily from ~3.89 to ~3.62 while validation MAE plateaued around 3.7 ...]
Epoch 375/1000 145/145 [==============================] - 0s 2ms/step - loss: 3.6168 - mae: 3.6168 - val_loss: 3.7086 - val_mae: 3.7086
<tensorflow.python.keras.callbacks.History at 0x7fc4b20b1e10>
from sklearn.metrics import mean_absolute_error

# Evaluate the trained model on the held-out test set
predictions = model.predict(X_test)
print(mean_absolute_error(Y_test, predictions))
3.1801840226844797
compare = pd.concat([pd.DataFrame(predictions), pd.DataFrame(Y_test)], axis=1)
compare.columns = ['Predictions', 'Actual Output']
compare
| | Predictions | Actual Output |
|---|---|---|
| 0 | 19.853456 | 20.0 |
| 1 | 24.822134 | 21.0 |
| 2 | 1.678609 | 2.0 |
| 3 | 19.744425 | 20.0 |
| 4 | 1.367931 | 1.0 |
| ... | ... | ... |
| 718 | 74.930794 | 78.0 |
| 719 | 8.039471 | 6.0 |
| 720 | 19.390198 | 20.0 |
| 721 | 126.380241 | 134.0 |
| 722 | 71.318306 | 65.0 |
723 rows × 2 columns
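One caveat with the comparison above: `pd.concat(..., axis=1)` aligns on the index, so if `Y_test` still carries the row labels from the original dataset while `predictions` has a fresh 0-based index, the columns can misalign. A minimal sketch (using a few hypothetical rows mirroring the table above) that sidesteps this by building the frame from raw values and then computing per-row absolute errors:

```python
import pandas as pd

# Hypothetical sample of predictions vs. actuals, mirroring the table above
compare = pd.DataFrame({
    'Predictions': [19.853456, 24.822134, 1.678609],
    'Actual Output': [20.0, 21.0, 2.0],
})

# Row-wise absolute error; its mean over the full test set is the MAE
compare['Abs_Error'] = (compare['Predictions'] - compare['Actual Output']).abs()
print(compare['Abs_Error'].mean())
```

Constructing the DataFrame from plain lists (or calling `Y_test.reset_index(drop=True)` before the concat) guarantees the two columns line up row by row.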
Y_test.std()
43.84078892739657
As per the research paper (https://sci-hub.se/10.1109/ICSPIS48872.2019.9066042), a regression model whose MAE is less than 10% of the standard deviation of the label, i.e. about 4.384 km in our case, is considered an excellent model. The MAE of our model on the test data is approximately 3.18 km, so it can be considered a reliable model.
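The 10%-of-standard-deviation criterion above can be checked directly. A small sketch, plugging in the MAE and label standard deviation printed earlier in the notebook:

```python
# Values taken from the notebook outputs above
mae = 3.1801840226844797        # test-set MAE
label_std = 43.84078892739657   # Y_test.std()

# Rule of thumb: MAE below 10% of the label's standard deviation
# indicates an excellent regression model
threshold = 0.1 * label_std
print(f"threshold = {threshold:.3f} km, MAE = {mae:.3f} km")
print("excellent" if mae < threshold else "needs improvement")
```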
import tensorflowjs

# Save the trained model in HDF5 format, then also export it to the
# TensorFlow.js layers format for use in a web application
model.save('/content/drive/MyDrive/Colab Notebooks/Projects/EV Range Prediction/Model (Final)/range_prediction_model.h5')
tensorflowjs.converters.save_keras_model(model, '/content/drive/MyDrive/Colab Notebooks/Projects/EV Range Prediction/Model (Final)/JSON')