Ensemble model (many-to-many): two input models are merged into one network that produces two outputs
- two input arrays, x1 and x2, each holding 100 samples of 2 features
- two target arrays, y1 and y2, with the same shape
x1.shape : (2, 100)
x2.shape : (2, 100)
y1.shape : (2, 100)
y2.shape : (2, 100)
- after transposing each array:
x1.shape : (100, 2)
x2.shape : (100, 2)
y1.shape : (100, 2)
y2.shape : (100, 2)
train / validation / test split:
x2_train.shape : (80, 2)
x2_val.shape : (10, 2)
x2_test.shape : (10, 2)
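The data preparation and the sequential 80/10/10 split could be sketched as follows. The exact value ranges of x1/x2 are not shown in the log and are assumed here; the y1/y2 ranges are implied by the test slices quoted with the predictions (91-100 / 191-200 and 591-600 / 691-700).

```python
import numpy as np

# Each array is built as (2, 100) and transposed to (100, 2):
# 100 samples of 2 features (or 2 targets).
x1 = np.array([range(1, 101), range(101, 201)]).T    # assumed ranges
x2 = np.array([range(501, 601), range(601, 701)]).T  # assumed ranges
y1 = np.array([range(1, 101), range(101, 201)]).T    # test rows end at 100 / 200
y2 = np.array([range(501, 601), range(601, 701)]).T  # test rows end at 600 / 700

# Sequential 80/10/10 split (train / validation / test); keeping the last
# 10 rows as the test set makes y1_test cover 91-100 and 191-200.
def split(a):
    return a[:80], a[80:90], a[90:]

x1_train, x1_val, x1_test = split(x1)
x2_train, x2_val, x2_test = split(x2)
y1_train, y1_val, y1_test = split(y1)
y2_train, y2_val, y2_test = split(y2)

print(x2_train.shape)  # (80, 2)
print(x2_val.shape)    # (10, 2)
print(x2_test.shape)   # (10, 2)
```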
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer) [(None, 2)] 0
__________________________________________________________________________________________________
input_2 (InputLayer) [(None, 2)] 0
__________________________________________________________________________________________________
dense (Dense) (None, 100) 300 input_1[0][0]
__________________________________________________________________________________________________
dense_3 (Dense) (None, 50) 150 input_2[0][0]
__________________________________________________________________________________________________
dense_1 (Dense) (None, 30) 3030 dense[0][0]
__________________________________________________________________________________________________
dense_4 (Dense) (None, 30) 1530 dense_3[0][0]
__________________________________________________________________________________________________
dense_2 (Dense) (None, 7) 217 dense_1[0][0]
__________________________________________________________________________________________________
dense_5 (Dense) (None, 7) 217 dense_4[0][0]
__________________________________________________________________________________________________
concatenate (Concatenate) (None, 14) 0 dense_2[0][0]
dense_5[0][0]
__________________________________________________________________________________________________
dense_6 (Dense) (None, 10) 150 concatenate[0][0]
__________________________________________________________________________________________________
dense_7 (Dense) (None, 5) 55 dense_6[0][0]
__________________________________________________________________________________________________
dense_8 (Dense) (None, 30) 180 dense_7[0][0]
__________________________________________________________________________________________________
dense_9 (Dense) (None, 30) 930 dense_8[0][0]
__________________________________________________________________________________________________
dense_12 (Dense) (None, 20) 620 dense_8[0][0]
__________________________________________________________________________________________________
dense_10 (Dense) (None, 7) 217 dense_9[0][0]
__________________________________________________________________________________________________
dense_13 (Dense) (None, 70) 1470 dense_12[0][0]
__________________________________________________________________________________________________
dense_11 (Dense) (None, 2) 16 dense_10[0][0]
__________________________________________________________________________________________________
dense_14 (Dense) (None, 2) 142 dense_13[0][0]
==================================================================================================
Total params: 9,224
Trainable params: 9,224
Non-trainable params: 0
__________________________________________________________________________________________________
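A functional-API model that reproduces the summary above can be sketched like this. The summary does not record activations, so ReLU hidden layers and linear output layers are assumed; the layer widths and wiring follow the "Connected to" column exactly.

```python
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input, concatenate

# Input branch 1: input_1 -> dense(100) -> dense_1(30) -> dense_2(7)
input1 = Input(shape=(2,))
a = Dense(100, activation='relu')(input1)
a = Dense(30, activation='relu')(a)
a = Dense(7, activation='relu')(a)

# Input branch 2: input_2 -> dense_3(50) -> dense_4(30) -> dense_5(7)
input2 = Input(shape=(2,))
b = Dense(50, activation='relu')(input2)
b = Dense(30, activation='relu')(b)
b = Dense(7, activation='relu')(b)

# Merge to (None, 14), then a shared trunk down to dense_8(30)
merged = concatenate([a, b])
t = Dense(10, activation='relu')(merged)
t = Dense(5, activation='relu')(t)
t = Dense(30, activation='relu')(t)  # dense_8: both output branches split here

# Output branch 1: dense_9(30) -> dense_10(7) -> dense_11(2)
o1 = Dense(30, activation='relu')(t)
o1 = Dense(7, activation='relu')(o1)
output1 = Dense(2)(o1)

# Output branch 2: dense_12(20) -> dense_13(70) -> dense_14(2)
o2 = Dense(20, activation='relu')(t)
o2 = Dense(70, activation='relu')(o2)
output2 = Dense(2)(o2)

model = Model(inputs=[input1, input2], outputs=[output1, output2])
model.summary()
print(model.count_params())  # 9,224, matching the summary
```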
Train on 80 samples, validate on 10 samples
Epoch 1/100
80/80 [==============================] - 1s 14ms/sample - loss: 34903.4876 - dense_11_loss: 2219.0393 - dense_14_loss: 32684.4473 - dense_11_mse: 2219.0393 - dense_14_mse: 32684.4473 - val_loss: 1057.6422 - val_dense_11_loss: 860.5211 - val_dense_14_loss: 197.1210 - val_dense_11_mse: 860.5211 - val_dense_14_mse: 197.1210
Epoch 2/100
80/80 [==============================] - 0s 3ms/sample - loss: 608.2382 - dense_11_loss: 530.3654 - dense_14_loss: 77.8727 - dense_11_mse: 530.3654 - dense_14_mse: 77.8727 - val_loss: 2607.1814 - val_dense_11_loss: 2336.1191 - val_dense_14_loss: 271.0622 - val_dense_11_mse: 2336.1191 - val_dense_14_mse: 271.0622
...
Epoch 99/100
80/80 [==============================] - 0s 3ms/sample - loss: 5.5020 - dense_11_loss: 0.4499 - dense_14_loss: 5.0521 - dense_11_mse: 0.4499 - dense_14_mse: 5.0521 - val_loss: 0.9372 - val_dense_11_loss: 0.0410 - val_dense_14_loss: 0.8962 - val_dense_11_mse: 0.0410 - val_dense_14_mse: 0.8962
Epoch 100/100
80/80 [==============================] - 0s 3ms/sample - loss: 1.6134 - dense_11_loss: 0.4837 - dense_14_loss: 1.1297 - dense_11_mse: 0.4837 - dense_14_mse: 1.1297 - val_loss: 0.4074 - val_dense_11_loss: 0.0829 - val_dense_14_loss: 0.3245 - val_dense_11_mse: 0.0829 - val_dense_14_mse: 0.3245
10/10 [==============================] - 0s 2ms/sample - loss: 0.9674 - dense_11_loss: 0.3178 - dense_14_loss: 0.6496 - dense_11_mse: 0.3178 - dense_14_mse: 0.6496
mse : [0.9674274861812592, 0.3178359, 0.64959157, 0.3178359, 0.64959157]
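The training and evaluation log above is consistent with a compile/fit/evaluate sequence like the sketch below. The optimizer and batch size are not recorded, so 'adam' and the Keras defaults are assumed; loss='mse' with metrics=['mse'] reproduces the per-output loss and mse entries, and evaluate on a two-output model returns five values: total loss, each output's loss, and each output's mse.

```python
import numpy as np
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Dense, Input, concatenate

# Data and 80/10/10 split as before (x ranges assumed)
x1 = np.array([range(1, 101), range(101, 201)]).T
x2 = np.array([range(501, 601), range(601, 701)]).T
y1 = np.array([range(1, 101), range(101, 201)]).T
y2 = np.array([range(501, 601), range(601, 701)]).T
split = lambda a: (a[:80], a[80:90], a[90:])
x1_train, x1_val, x1_test = split(x1)
x2_train, x2_val, x2_test = split(x2)
y1_train, y1_val, y1_test = split(y1)
y2_train, y2_val, y2_test = split(y2)

# Compact version of the same architecture (ReLU assumed for hidden layers)
def stack(h, units):
    for u in units:
        h = Dense(u, activation='relu')(h)
    return h

in1, in2 = Input(shape=(2,)), Input(shape=(2,))
merged = concatenate([stack(in1, [100, 30, 7]), stack(in2, [50, 30, 7])])
trunk = stack(merged, [10, 5, 30])
out1 = Dense(2)(stack(trunk, [30, 7]))
out2 = Dense(2)(stack(trunk, [20, 70]))
model = Model([in1, in2], [out1, out2])

model.compile(optimizer='adam', loss='mse', metrics=['mse'])
model.fit([x1_train, x2_train], [y1_train, y2_train],
          epochs=100, verbose=0,
          validation_data=([x1_val, x2_val], [y1_val, y2_val]))

# [total_loss, out1_loss, out2_loss, out1_mse, out2_mse]
mse = model.evaluate([x1_test, x2_test], [y1_test, y2_test], verbose=0)
print('mse :', mse)

# Each prediction array is (10, 2), as in the printouts below
y1_pred, y2_pred = model.predict([x1_test, x2_test])
```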
- y1_test : 91 to 100 and 191 to 200
- y2_test : 591 to 600 and 691 to 700
y1 predictions:
[[ 91.02833 190.49353 ]
[ 92.00755 191.4657 ]
[ 92.97272 192.42719 ]
[ 93.9117 193.36871 ]
[ 94.846695 194.30725 ]
[ 95.78053 195.24481 ]
[ 96.71423 196.18234 ]
[ 97.648026 197.11989 ]
[ 98.581795 198.0574 ]
[ 99.515594 198.99501 ]]
y2 predictions:
[[591.2311 691.949 ]
[592.22375 692.9734 ]
[593.21783 694.002 ]
[594.21454 695.03845]
[595.2164 696.08093]
[596.2199 697.12524]
[597.2235 698.1698 ]
[598.227 699.2142 ]
[599.23047 700.2586 ]
[600.2342 701.3031 ]]