Ensemble model 2 (many:many) - an ensemble model that takes 2 inputs and produces 3 outputs (a 2:3 model).
- In practice you would rarely build a model like this, but it is a possible configuration.
- 2 inputs, x1 and x2, each holding 200 values
- 3 targets, y1, y2 and y3, each holding 200 values (the construction is sketched below)
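The data-construction code is not shown in the capture; here is a minimal sketch that reproduces the shape printout that follows. The x value ranges are assumptions, while the y ranges are implied by the test values listed at the end of this section.

```python
import numpy as np

# Two inputs and three targets, each stored as 2 rows of 100 values (200 values per array).
# The x ranges here are assumptions; the y ranges are implied by the test values
# shown near the end of this section.
x1 = np.array([range(1, 101), range(101, 201)])
x2 = np.array([range(301, 401), range(401, 501)])  # assumed range
y1 = np.array([range(1, 101), range(101, 201)])
y2 = np.array([range(501, 601), range(601, 701)])
y3 = np.array([range(701, 801), range(801, 901)])

for a in (x1, x2, y1, y2, y3):
    print(a.shape)            # (2, 100) each

# Transpose so that rows are samples and columns are features.
x1, x2 = np.transpose(x1), np.transpose(x2)
y1, y2, y3 = np.transpose(y1), np.transpose(y2), np.transpose(y3)

for a in (x1, x2, y1, y2, y3):
    print(a.shape)            # (100, 2) each
```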
Shapes before the transpose (x1, x2, y1, y2, y3):
(2, 100)
(2, 100)
(2, 100)
(2, 100)
(2, 100)
Shapes after the transpose:
(100, 2)
(100, 2)
(100, 2)
(100, 2)
(100, 2)
Splitting into train, validation and test sets (80/10/10):
y3_train.shape : (80, 2)
y3_val.shape : (10, 2)
y3_test.shape : (10, 2)
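One way to obtain this 80/10/10 split is scikit-learn's train_test_split. A sketch, assuming shuffle=False (consistent with the test sets shown later being the last 10 samples of each array):

```python
from sklearn.model_selection import train_test_split

# First carve off 20% of the samples, then split that remainder half-and-half
# into validation and test sets. Passing all five arrays at once keeps the
# rows of the inputs and targets aligned.
(x1_train, x1_rest, x2_train, x2_rest,
 y1_train, y1_rest, y2_train, y2_rest,
 y3_train, y3_rest) = train_test_split(
    x1, x2, y1, y2, y3, train_size=0.8, shuffle=False)

(x1_val, x1_test, x2_val, x2_test,
 y1_val, y1_test, y2_val, y2_test,
 y3_val, y3_test) = train_test_split(
    x1_rest, x2_rest, y1_rest, y2_rest, y3_rest,
    train_size=0.5, shuffle=False)

print(y3_train.shape, y3_val.shape, y3_test.shape)  # (80, 2) (10, 2) (10, 2)
```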
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 2)] 0
__________________________________________________________________________________________________
input_2 (InputLayer) [(None, 2)] 0
__________________________________________________________________________________________________
dense (Dense) (None, 100) 300 input_1[0][0]
__________________________________________________________________________________________________
dense_3 (Dense) (None, 50) 150 input_2[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 30) 3030 dense[0][0]
__________________________________________________________________________________________________
dense_4 (Dense) (None, 30) 1530 dense_3[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 7) 217 dense_1[0][0]
__________________________________________________________________________________________________
dense_5 (Dense) (None, 7) 217 dense_4[0][0]
__________________________________________________________________________________________________
concatenate (Concatenate) (None, 14) 0 dense_2[0][0]
dense_5[0][0]
__________________________________________________________________________________________________
dense_6 (Dense) (None, 10) 150 concatenate[0][0]
__________________________________________________________________________________________________
dense_7 (Dense) (None, 5) 55 dense_6[0][0]
__________________________________________________________________________________________________
dense_8 (Dense) (None, 30) 180 dense_7[0][0]
__________________________________________________________________________________________________
dense_9 (Dense) (None, 30) 930 dense_8[0][0]
__________________________________________________________________________________________________
dense_12 (Dense) (None, 20) 620 dense_8[0][0]
__________________________________________________________________________________________________
dense_15 (Dense) (None, 25) 775 dense_8[0][0]
__________________________________________________________________________________________________
dense_10 (Dense) (None, 7) 217 dense_9[0][0]
__________________________________________________________________________________________________
dense_13 (Dense) (None, 70) 1470 dense_12[0][0]
__________________________________________________________________________________________________
dense_16 (Dense) (None, 5) 130 dense_15[0][0]
__________________________________________________________________________________________________
dense_11 (Dense) (None, 2) 16 dense_10[0][0]
__________________________________________________________________________________________________
dense_14 (Dense) (None, 2) 142 dense_13[0][0]
__________________________________________________________________________________________________
dense_17 (Dense) (None, 2) 12 dense_16[0][0]
==================================================================================================
Total params: 10,141
Trainable params: 10,141
Non-trainable params: 0
__________________________________________________________________________________________________
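The summary above pins down the wiring exactly, so the model can be reconstructed with the functional API. Layer sizes and connections come from the table; the activations, optimizer, and batch size are assumptions, since neither the summary nor the log records them.

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Dense, concatenate

# Two input branches (relu activations are an assumption).
input1 = Input(shape=(2,))
a = Dense(100, activation='relu')(input1)   # dense
a = Dense(30, activation='relu')(a)         # dense_1
out_a = Dense(7, activation='relu')(a)      # dense_2

input2 = Input(shape=(2,))
b = Dense(50, activation='relu')(input2)    # dense_3
b = Dense(30, activation='relu')(b)         # dense_4
out_b = Dense(7, activation='relu')(b)      # dense_5

# Merge the two branches, then run a shared trunk.
merged = concatenate([out_a, out_b])        # concatenate -> (None, 14)
m = Dense(10, activation='relu')(merged)    # dense_6
m = Dense(5, activation='relu')(m)          # dense_7
m = Dense(30, activation='relu')(m)         # dense_8

# Three output branches, each ending in a 2-unit linear layer.
c = Dense(30, activation='relu')(m)         # dense_9
c = Dense(7, activation='relu')(c)          # dense_10
output1 = Dense(2)(c)                       # dense_11

d = Dense(20, activation='relu')(m)         # dense_12
d = Dense(70, activation='relu')(d)         # dense_13
output2 = Dense(2)(d)                       # dense_14

e = Dense(25, activation='relu')(m)         # dense_15
e = Dense(5, activation='relu')(e)          # dense_16
output3 = Dense(2)(e)                       # dense_17

model = Model(inputs=[input1, input2],
              outputs=[output1, output2, output3])
model.summary()

# Compile and train; optimizer and batch_size are assumptions consistent
# with the per-output loss/mse entries in the training log below.
model.compile(loss='mse', optimizer='adam', metrics=['mse'])
model.fit([x1_train, x2_train], [y1_train, y2_train, y3_train],
          epochs=100, batch_size=1,
          validation_data=([x1_val, x2_val], [y1_val, y2_val, y3_val]))
```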
Train on 80 samples, validate on 10 samples
Epoch 1/100
80/80 [==============================] - 1s 16ms/sample - loss: 230105.7874 - dense_11_loss: 4486.0205 - dense_14_loss: 74777.2266 - dense_17_loss: 150842.5938 - dense_11_mse: 4486.0205 - dense_14_mse: 74777.2266 - dense_17_mse: 150842.5938 - val_loss: 2316.5406 - val_dense_11_loss: 817.8552 - val_dense_14_loss: 96.5040 - val_dense_17_loss: 1402.1815 - val_dense_11_mse: 817.8552 - val_dense_14_mse: 96.5040 - val_dense_17_mse: 1402.1815
Epoch 2/100
80/80 [==============================] - 0s 3ms/sample - loss: 716.4842 - dense_11_loss: 180.8671 - dense_14_loss: 54.2707 - dense_17_loss: 481.3465 - dense_11_mse: 180.8671 - dense_14_mse: 54.2707 - dense_17_mse: 481.3465 - val_loss: 3150.8110 - val_dense_11_loss: 1068.2441 - val_dense_14_loss: 61.3816 - val_dense_17_loss: 2021.1852 - val_dense_11_mse: 1068.2441 - val_dense_14_mse: 61.3816 - val_dense_17_mse: 2021.1852
...
Epoch 99/100
80/80 [==============================] - 0s 3ms/sample - loss: 2.7323 - dense_11_loss: 0.1585 - dense_14_loss: 1.2893 - dense_17_loss: 1.2845 - dense_11_mse: 0.1585 - dense_14_mse: 1.2893 - dense_17_mse: 1.2845 - val_loss: 0.7659 - val_dense_11_loss: 0.1389 - val_dense_14_loss: 0.0636 - val_dense_17_loss: 0.5634 - val_dense_11_mse: 0.1389 - val_dense_14_mse: 0.0636 - val_dense_17_mse: 0.5634
Epoch 100/100
80/80 [==============================] - 0s 3ms/sample - loss: 0.4230 - dense_11_loss: 0.0772 - dense_14_loss: 0.1695 - dense_17_loss: 0.1763 - dense_11_mse: 0.0772 - dense_14_mse: 0.1695 - dense_17_mse: 0.1763 - val_loss: 0.6731 - val_dense_11_loss: 0.1647 - val_dense_14_loss: 0.0353 - val_dense_17_loss: 0.4730 - val_dense_11_mse: 0.1647 - val_dense_14_mse: 0.0353 - val_dense_17_mse: 0.4730
10/10 [==============================] - 0s 3ms/sample - loss: 6.4008 - dense_11_loss: 1.3801 - dense_14_loss: 0.4249 - dense_17_loss: 4.5958 - dense_11_mse: 1.3801 - dense_14_mse: 0.4249 - dense_17_mse: 4.5958
mse : [6.4007875442504885, 1.3800805, 0.42485866, 4.5958486, 1.3800805, 0.42485866, 4.5958486]
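The seven numbers are the return value of model.evaluate: the combined loss first, then one loss per output, then one mse metric per output (the last six pair up because the loss itself is mse). A sketch of the call, with batch_size assumed:

```python
# Returns [total_loss, dense_11_loss, dense_14_loss, dense_17_loss,
#          dense_11_mse, dense_14_mse, dense_17_mse]
mse = model.evaluate([x1_test, x2_test], [y1_test, y2_test, y3_test],
                     batch_size=1)
print('mse :', mse)
```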
- y1_test : 91 to 100, 191 to 200
- y2_test : 591 to 600, 691 to 700
- y3_test : 791 to 800, 891 to 900
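model.predict returns one array per output branch, which yields the three blocks below:

```python
# Each prediction has shape (10, 2), matching y1_test, y2_test, y3_test.
y1_pred, y2_pred, y3_pred = model.predict([x1_test, x2_test])
print('y1 predictions :\n', y1_pred)
print('y2 predictions :\n', y2_pred)
print('y3 predictions :\n', y3_pred)
```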
y1 predictions :
[[ 90.26873 190.21713]
[ 91.17966 191.13255]
[ 92.09063 192.04803]
[ 93.00153 192.96333]
[ 93.91245 193.87877]
[ 94.82329 194.79411]
[ 95.73423 195.70949]
[ 96.64515 196.62488]
[ 97.55602 197.54033]
[ 98.46696 198.45572]]
y2 predictions :
[[591.2135 691.50714]
[592.24646 692.5766 ]
[593.2792 693.6459 ]
[594.3119 694.71515]
[595.34467 695.7844 ]
[596.3774 696.85376]
[597.41003 697.923 ]
[598.4429 698.9924 ]
[599.4757 700.06165]
[600.5083 701.1309 ]]
y3 predictions :
[[791.9705 892.63965]
[793.0878 893.8419 ]
[794.205 895.044 ]
[795.3222 896.24603]
[796.4394 897.448 ]
[797.5565 898.6501 ]
[798.67365 899.8522 ]
[799.79095 901.0542 ]
[800.9082 902.2565 ]
[802.02515 903.4582 ]]